Back in the distant past of the early 80s, the basic IBM-built PC didn’t have a lot of colors to play with. It was a business machine first; everything else, games included, was secondary, even though a fair number did make their way over to it. Compared to the Commodore 64 or the Apple II line, IBM’s boxes were pretty boring on the visual front.
Though a number of display standards emerged from the early 80s on through the 90s, three big ones came to dominate games written for the IBM PC and compatibles.
The first graphics adapter to bring at least a little life to those screens was the Color Graphics Adapter, or CGA, considered IBM’s first color graphics card, which set the standard for the IBM PC line way back in 1981. It was a step up from the MDA, IBM’s Monochrome Display Adapter — green text against black. Those monitors in Fallout 3 and New Vegas? In that universe, MDA apparently never died.
CGA had a whopping 16 KB of memory, connectors for a monitor or a television (and in those days, the kind of monitor — RGB or composite — also made an important difference in visual quality), and could technically display up to 16 different colors. Its maximum resolution was 640×200. It wasn’t pretty compared to the competition, but it got the job done. In 320×200 mode, only 4 colors could be displayed at once; in 640×200 mode, that dropped to 2.
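Those color limits fall straight out of the 16 KB of video memory. As a back-of-the-envelope check (this little helper is purely illustrative, not anything from the era's hardware or BIOS), both graphics modes land on the same framebuffer size:

```python
# Rough framebuffer arithmetic for CGA's 16 KB of video memory.
# 4 colors need 2 bits per pixel; 2 colors need 1 bit per pixel.
def framebuffer_bytes(width, height, colors):
    bits_per_pixel = (colors - 1).bit_length()  # colors == 2**bpp
    return width * height * bits_per_pixel // 8

print(framebuffer_bytes(320, 200, 4))   # 16000 bytes -> fits in 16 KB (16384)
print(framebuffer_bytes(640, 200, 2))   # 16000 bytes -> same budget, fewer colors
```

Either way you spend it, 16 KB buys you 16,000 bytes of pixels: double the horizontal resolution and you halve the bits available per pixel.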
Surprisingly, CGA had competition from Hercules Computer Technology with their Hercules Graphics Card. It was cheaper than IBM’s CGA cards and, while it didn’t do color, it was compatible with MDA’s text mode and added a sharp 720×348 monochrome graphics mode. In the business world, the lack of color didn’t matter too much, and it became extremely popular.
Next up after CGA was EGA, or the Enhanced Graphics Adapter, in 1984. This time, it could display 16 colors at the same time from a total palette of 64 and cranked the maximum resolution up to an eye-blistering 640×350. It could also run CGA modes, though the transition wasn’t always smooth. The base card was packed with 64 KB of memory.
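Where does that palette of 64 come from? EGA effectively gives each of red, green, and blue two bits of intensity, so there are 4 × 4 × 4 combinations to choose from. A quick sketch (the 8-bit scaling values here are the conventional expansion, used for illustration):

```python
# EGA's 64-color master palette: 2 bits per RGB channel -> 4 levels each,
# 4**3 = 64 combinations. The 16 on-screen colors are palette registers
# that each point at one of these 64 entries.
def ega_palette():
    levels = (0x00, 0x55, 0xAA, 0xFF)  # 2-bit channel expanded to 8-bit
    return [(levels[r], levels[g], levels[b])
            for r in range(4) for g in range(4) for b in range(4)]

palette = ega_palette()
print(len(palette))    # 64 possible colors, 16 visible at once
```

So a game didn’t get 64 colors on screen; it got to pick which 16 of the 64 its palette registers pointed at.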
EGA didn’t live long either, quickly put out to pasture by IBM itself with the arrival of VGA in 1987. VGA, or Video Graphics Array, was introduced with IBM’s PS/2 computers and came to represent a number of key developments, such as the famous 640×480 resolution and the 15-pin VGA monitor connector that still sees quite a bit of use today depending on where you look.
But VGA would also be a key factor in allowing IBM PC clones to expand quickly into the gaming space and ultimately supersede platforms such as the Commodore 64, the Amiga, and the aging Apple II line when it came to visual candy, especially once third-party manufacturers, from Hercules to Diamond, leapt onto the bandwagon with their own adapters.
So what made it so great? Both CGA and EGA boards were built from multiple chips, but VGA needed only one, which could be integrated onto a motherboard. VGA could also display 256 colors at once, picked from a palette of 262,144, a big leap over EGA’s palette of 64. At 640×480 it could display 16 colors from that massive palette, a resolution that would not only become a high-end standard for gaming but would also resurrect itself years later as mobile devices such as MP4 players gained popularity.
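Both of those headline numbers are easy to sanity-check. VGA’s DAC stores 6 bits per RGB channel, which is where the 262,144-color palette comes from, and its famous 256-color mode (mode 13h, at 320×200) uses one byte per pixel, which is why it fits so neatly into a single 64 KB memory segment:

```python
# VGA's DAC holds 6 bits per red/green/blue channel:
dac_colors = (2 ** 6) ** 3          # 262,144 possible colors
# ...of which 256 can be loaded into the palette at once.

# Mode 13h: 320x200, one byte per pixel, chunky and simple to program:
mode13h_bytes = 320 * 200           # 64,000 bytes
print(dac_colors, mode13h_bytes)    # 262144 64000 -- under the 65,536-byte segment limit
```

That one-byte-per-pixel layout, with no bit-plane juggling, is a big part of why mode 13h became the go-to mode for DOS game programmers.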
The VGA standard would go on to see a number of other enhancements, most notably “Super” VGA, which built on IBM’s VGA standard. SVGA topped off VGA with resolutions such as 800×600 at 16 colors (4 bits per pixel), later extended to 1024×768 at 256 colors (8 bits per pixel).
There was also the XGA standard in 1990, which IBM tried to introduce as the official follow-up to VGA, raising the resolution to 1024×768 with 256 colors. However, SVGA eventually pushed it aside as third-party manufacturers and clone builders heavily promoted their own standard, leaving end users confused over whether or not they needed to upgrade their RGB monitors to new SVGA-compatible ones.
From the early 90s, VGA and its derivatives would prove to be the de facto base standard for anyone who wanted their games to look good, especially when it was demonstrated with games such as Origin’s Wing Commander. Although you could play it in EGA or MCGA modes (think of MCGA as a cut-down version of VGA), people were awestruck by how it looked in full 256-color VGA. I know I was!
And with the number of IBM PC clones arriving in those years sporting sugary eye candy and sound, the machines muscled their way to the top of the heap, reaching the position a few rivals had held when they dominated the gaming space in the 80s. The business behemoth had finally joined the party. Few game developers in the early-to-late 90s didn’t want to find a way to boast of VGA or SVGA support, spurring a silicon arms race, especially when 3D gaming began to accelerate.
By the middle-to-late 90s, with the rise of 3D graphics technology, conventions such as “VGA compatible” gave way to requirement labels listing Microsoft’s DirectX, Windows, Linux, MacOS, and Pentiums: arguably an easier set of standards for both developers and gamers to wrap their wallets around, without worrying whether their monitor was SVGA compatible or not.