Friday, August 30, 2013

The History of the GPU

Once upon a time, or rather in the year 1946, the first computer as we know it was created. This electronic wonder was called ENIAC. Compared to the modern-day computer, our "little" ENIAC has more in common with a modern calculator. A calculator you had to reprogram every time you cut the power, that is.

Much water has flowed under the bridge since then, and modern technology keeps evolving at a steady pace. One of the evolutionary steps the computer has taken over the years is the graphics card.

One of the first personal computers released with dedicated graphics hardware was the Commodore Amiga. Finally, the computer had a part dedicated to showing graphics. You plug the display cable directly into the graphics hardware, and it sends the final video signal to the screen. Before that, the CPU had to draw the display and send the video signal itself.

The first real graphics card was created by Intel: the iSBX 275. It could output a resolution of 256 x 256 and display eight unique colors. This was quite revolutionary, and this magnificent little board is one of the most important steps in the evolution of graphics, which also makes it an important step in the evolution of gaming. I may not find it the most important step gaming-wise, but it was very important nonetheless.
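
Just for fun, here is a quick back-of-the-envelope calculation in Python (my own illustration, not something from Intel's documentation) showing roughly how much memory a single 256 x 256 frame with eight colors takes up, assuming the simplest possible encoding of 3 bits per pixel:

# Rough sketch: memory needed for one 256 x 256 frame with 8 colors.
# The 3-bits-per-pixel encoding is my own assumption for illustration,
# not how the iSBX 275 actually laid out its framebuffer.
import math

width, height = 256, 256
colors = 8

bits_per_pixel = math.ceil(math.log2(colors))   # 8 colors -> 3 bits per pixel
total_bits = width * height * bits_per_pixel    # 196,608 bits
total_kib = total_bits / 8 / 1024               # 24 KiB

print(f"{bits_per_pixel} bits per pixel, {total_bits} bits, about {total_kib:.0f} KiB per frame")

In other words, even that tiny eight-color screen needs tens of thousands of bytes for a single frame, which gives an idea of why dedicated graphics hardware started to make sense.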


But enough about how it came to be; let's take a look at an actual GPU, namely the one that sits in our old-school computers.
We had to tear it apart, much to my pleasure and entertainment, in order to discover what made it tick.
And this is what we found.


The simple way to explain how it works is that it sends data to the screen. What kind of data? Binary data, of course! But what is binary? To put it simply, it's a whole bunch of 1's and 0's put together to form a language of its own. And when I say a whole bunch, I mean millions or billions of digits. As I said, the GPU sends this data to your screen. It works out what the binary code means and sends the resulting output. But why do you need a separate component for this task? The simple answer I can give is performance. If you made the CPU handle all the work the GPU does, just like in the old days, the CPU would simply grind to a halt. It cannot handle that amount of data on its own.
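
To make the "whole bunch of 1's and 0's" a little more concrete, here is a tiny toy example in Python. It is purely my own illustration, not how a real GPU or video signal works: it packs a few pixel colors into a stream of bits and then reads them back out, the way a display ultimately has to interpret whatever stream it receives.

# Toy example: pixel colors as plain 1's and 0's.
# My own illustration only; a real GPU and video cable do NOT work like this.

# With an 8-color palette, every pixel fits in 3 bits.
PALETTE = ["black", "red", "green", "blue", "yellow", "magenta", "cyan", "white"]

def encode(pixels):
    # Turn a list of palette indices into one long string of bits.
    return "".join(format(p, "03b") for p in pixels)

def decode(bits):
    # Read the bit string back, 3 bits at a time, into color names.
    return [PALETTE[int(bits[i:i + 3], 2)] for i in range(0, len(bits), 3)]

frame = [0, 1, 3, 7]            # four pixels: black, red, blue, white
stream = encode(frame)          # -> "000001011111"
print(stream)
print(decode(stream))           # -> ['black', 'red', 'blue', 'white']

The little 256 x 256 frame from before is already 196,608 of those digits, and at, say, 60 frames per second that adds up to more than 11 million digits every second, so "millions or billions" is no exaggeration.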

The graphics card was invented to handle the processing of screen data. The GPU is like the part of the brain that handles your eyesight. If you gave that task to the other parts of the brain, you would get stressed and eventually shut down.

Be warned, this blog post might not be 100% true. I just described the GPU the way I understand it.
