A Brief History of Modern Graphics Processors

Every gamer knows that the graphics processing unit (GPU) is one of the most important components in a gaming PC. After all, a good GPU is often the difference between smooth, immersive gameplay and a frustratingly laggy experience. The higher your FPS target and the more demanding a title's base graphics, the more advanced your GPU needs to be to run it. Take a trip down memory lane as we explore the history of modern graphics processors.

It's no surprise, then, that GPU innovations mark many of the milestones in video game history. So, let's take a closer look.

The Sole Support of the MDA

Before GPUs were created, PCs had little more than the Monochrome Display Adapter (MDA). Developed by International Business Machines (IBM) in 1981, it could display only sharp text and symbols in an 80 x 25 character grid. It didn't support graphics of any kind; the farthest it went in terms of visual flair was blinking green text.

(Image credit: Boffy b, under a Creative Commons license)

One year later, Hercules Computer Technology released the Hercules Graphics Card (HGC), which was basically an MDA with a bitmapped graphics mode added. The HGC's display was still monochrome green, but its graphics mode let programmers create some of the earliest PC games in history.

It was popular for a while, but people eventually began looking for more.

Full-color Display with the CGA

In the early '80s, IBM introduced what is widely regarded as the first "graphics card" (though it still wasn't called a GPU at the time). The Color Graphics Adapter (CGA) came with 16 kB of video memory, two text modes, several low-resolution graphics modes, and the ability to connect to either a CRT monitor or an NTSC-compatible television. A few years later, in 1984, IBM released the Enhanced Graphics Adapter (EGA), which could display 16 colors at once at 640 x 350 pixels.

The EGA was the standard for about three years before IBM improved upon its own invention again with the first Video Graphics Array (VGA) in 1987, which pushed resolution and color depth even further: up to 640 x 480 with 16 colors, or 256 colors at 320 x 200.

3dfx Interactive and the end of the 2D era

Eventually, IBM faded into the background, and other tech companies began developing their own cards with better resolutions and more color options. In 1996, 3dfx Interactive shocked the industry with the Voodoo graphics chip, a dedicated 3D accelerator that worked alongside a regular 2D card and made real-time 3D rendering practical on consumer hardware. It was also used in early arcade games, such as the first versions of Home Run Derby.

Within a year, 3dfx Interactive surprised consumers again with the Voodoo2. It delivered the same kind of leap as the original Voodoo, except that it now supported running two cards in parallel within one PC.

NVIDIA and the Birth of the GPU

NVIDIA has been around since 1993, initially creating not-so-notable cards like the NV1 (which showed off real 3D graphics rendered with quadratic surfaces, an approach whose quality and compatibility weren't enough to catch on). But in 1997, the company debuted the NV3, better known as the RIVA 128 (Real-time Interactive Video and Animation accelerator). It combined 2D and 3D acceleration on a single chip and rendered conventional triangle-based 3D graphics, so it could run 3D titles without the system crashing. Built with steadily improving drivers, the NV3 clocked in at 100 MHz.

In 1999, NVIDIA defined the term “GPU” with the GeForce 256. According to NVIDIA’s official definition, the GPU is a “single-chip processor with integrated transform, lighting, triangle setup/clipping, and rendering engines that is capable of processing a minimum of 10 million polygons per second.”

The GeForce 256 could do everything the NV3 did, but with transform and lighting handled in hardware on the chip itself, its performance improved immensely.
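To make the jargon in that definition a little more concrete, here is a minimal, purely illustrative sketch in Python (not tied to any real graphics API) of what the "transform and lighting" stage does for a single vertex: multiply it by a transformation matrix, then compute a simple diffuse lighting term. All of the values are made up for the example.

```python
import numpy as np

def transform_and_light(vertex, normal, model_view, light_dir):
    """Toy version of the fixed-function 'transform and lighting' (T&L)
    work that the GeForce 256 moved from the CPU onto the GPU."""
    # Transform: move the vertex into view space with a 4x4 matrix
    # (homogeneous coordinates, hence the appended 1.0).
    v = model_view @ np.append(vertex, 1.0)

    # Lighting: simple Lambertian (diffuse) term, clamped at zero,
    # based on the angle between the surface normal and the light.
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    diffuse = max(float(n @ l), 0.0)

    return v[:3], diffuse

# Hypothetical example values, purely for illustration.
model_view = np.eye(4)                 # identity matrix: no movement
vertex = np.array([1.0, 2.0, 3.0])
normal = np.array([0.0, 0.0, 1.0])
light_dir = np.array([0.0, 0.0, 1.0])
print(transform_and_light(vertex, normal, model_view, light_dir))
```

A real GPU runs this kind of calculation for millions of vertices per second, which is exactly what the "10 million polygons per second" part of NVIDIA's definition refers to.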

GeForce vs. Radeon

In 2000, a competitor stepped in to challenge NVIDIA's innovations: ATI Technologies. That was the year the Radeon DDR launched at retail. Paired with 32 MB of DDR memory and a 128-bit interface, the Radeon DDR was able to catch up to NVIDIA's card of the moment, the GeForce 2.

In 2002, ATI released the R300, the chip behind the Radeon 9700 series. It was a notable step forward, as plenty of future Radeon GPUs would use the R300 as their foundation.

It just so happened that NVIDIA released the GeForce 4 in that same year. This kicked off a race to see who could come up with sleeker, faster GPUs. Even AMD's acquisition of ATI in 2006 wasn't enough to stop this fierce rivalry, which continues to this day.

In fact, AMD's extra resources have only fueled Radeon's innovations, and NVIDIA has no intention of losing ground either. Smarter designs and better materials have thinned PCBs and optimized component layouts, and each new generation adds more processing cores, packs in more transistors, and pushes clock speeds higher, making modern GPUs better than ever before.

Now, the top spots are contested between the GeForce GTX and Radeon RX series. With both already delivering a consistent 60 FPS at 1080p along with other performance bonuses, it's hard to imagine how much more powerful GPUs can grow.

The future of GPUs

The need for better graphics becomes less apparent with every GPU generation, but graphics aren't the only area where GPUs can improve. NVIDIA has hinted, for example, that its current GPUs could be made 75% faster, which would give GeForce cards a huge performance uplift. That's good news for PC gamers.

Meanwhile, AMD is working on the next generation of Radeon DNA (RDNA) GPUs. Fun fact: RDNA is the foundation of the next generation of console gaming, the PlayStation 5 and the Xbox Series X. The PlayStation 5's GPU, for instance, is capable of 10.28 TFLOPs at clock speeds of up to 2.23 GHz. The new architecture also supports 8K output and hardware-accelerated ray tracing, both firsts for a game console.
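For anyone curious where a number like 10.28 TFLOPs comes from, it is simple arithmetic: total shader ALUs, times two floating-point operations per clock (a fused multiply-add), times clock speed. The short Python sketch below uses the commonly cited PlayStation 5 configuration of 36 compute units with 64 shader ALUs each; treat those unit counts as illustrative assumptions rather than an official breakdown.

```python
def peak_tflops(compute_units, alus_per_cu, clock_ghz, flops_per_clock=2):
    """Peak single-precision throughput in TFLOPs:
    total shader ALUs x FLOPs per clock (FMA counts as 2) x clock rate (GHz)."""
    total_alus = compute_units * alus_per_cu
    return total_alus * flops_per_clock * clock_ghz / 1000.0

# Commonly cited PlayStation 5 GPU figures: 36 CUs x 64 ALUs at up to 2.23 GHz.
print(f"{peak_tflops(36, 64, 2.23):.2f} TFLOPs")  # prints 10.28 TFLOPs
```

The same formula explains why the Xbox Series X, with more compute units at a lower clock, advertises a higher TFLOPs figure than the PS5 despite both using RDNA-based GPUs.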

Every year brings new surprises for GPUs. Whether through new applications or new designs, the graphics card will continue to make strides for as long as games continue to evolve.
