If you used a PC throughout any part of the DOS era, then unless you were the monochrome type, you're probably familiar with the default set of 16 colors. CGA represented them as RGBI: one bit each for Red, Green, Blue and Intensity. Those were the only colors that (digital) CGA was capable of, but later standards retained them as defaults for text mode and for 4/16-color graphics.
Those later standards mapped them to higher bit-depth RGB, and the resulting palette of the 16 CGA colors is what most of us are familiar with. But is it a good match for the output of IBM's original CGA monitor, the 5153 Personal Computer Color Display?
If it was, I wouldn't be typing this up, so let's find out what the deal is.
The Standard (Canonical) CGA Palette
This is the common mapping of CGA's 16 RGBI colors to RGB - it reflects how they're rendered on practically all post-CGA systems, and by current PC emulators. The baseline standard these days is 24-bit RGB (8 bits per channel), so the common version has a range of 0..255 for each component, shown here as hex triplets:
The "Canonical" RGBI-to-rgb24 CGA palette
The idea is pretty straightforward. CGA color numbers are 4-bit values: I, R, G and B (most to least significant). This palette assigns a level of 2/3 to the original R,G,B signals, and treats I as additional white, by adding 1/3 to all three channels. The exception is color #6: CGA modifies it from a dim yellow to brown, using a circuit in the monitor to detect the bit combination '0110' and tone down its G component.1 The above palette represents that by cutting the G level from 2/3 to 1/3.
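The rule above can be sketched in a few lines of Python (a hypothetical helper for illustration, not IBM's actual method - the real translation happens in hardware):

```python
# Sketch of the canonical RGBI-to-RGB mapping: a level of 2/3 per set
# R/G/B bit, plus 1/3 of white for the intensity bit, with color #6
# special-cased to brown by cutting its green level in half.

def canonical_rgb(n: int) -> tuple[int, int, int]:
    """Map a 4-bit IRGB color number (0-15) to a 24-bit RGB triplet."""
    i = (n >> 3) & 1          # intensity bit (most significant)
    r = (n >> 2) & 1
    g = (n >> 1) & 1
    b = n & 1
    # base level 2/3 per set channel, plus 1/3 for intensity
    rgb = [(2 * c + i) * 0x55 for c in (r, g, b)]   # 0x55 = 255/3
    if n == 6:                # the 'brown fix': green from 2/3 to 1/3
        rgb[1] = 0x55
    return tuple(rgb)

palette = [canonical_rgb(n) for n in range(16)]
```

Color #6 comes out as (0xAA, 0x55, 0x00) - brown rather than dim yellow - while every other entry follows the plain "rule of thirds".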
But here's the catch: this palette doesn't necessarily reflect the appearance of a CGA monitor like the 5153. Nothing in the RGBI color model calls for this precise mapping, and IBM's CGA specs don't explicitly define one at all. Rather, it's derived from how EGA (and VGA) rendered those 16 colors for backward compatibility.
EGA moved from 4-bit RGBI to 6-bit RGB (2 bits per color channel). Each component could take 4 discrete levels, 0 to 3. That's where the canonical CGA palette gets its "0, 1/3, 2/3, 3/3" quantization from: the 16 CGA-derived colors had to be represented by this "rule of thirds". VGA took it further to 18-bit RGB (6 bpc), and 'truecolor' gave us 24-bit (8 bpc), so the same values were normalized to the higher bit depths.
This palette really is canonical - after all, it's based on IBM's own method of translating CGA colors to later RGB standards. But I've had my 5153 for a while now, and it's pretty clear that these colors don't do a great job of representing its actual CGA output, no matter what the knobs are set to.
A More Accurate Model?
A better representation would be difficult without some hard data. Fortunately an unofficial source exists: electronics engineer and CRT enthusiast Dr. Hugo Holden has made some very informative investigations into the IBM 5153's innards, and one of them involved measuring the voltages that drive the gun amplifiers for each RGBI color.
These levels were measured with the contrast control at the maximum position; see later for why this is important. Here is the relevant data, reproduced from the original document:2
IBM 5153 CGA Monitor Color Processing - drive level: volts peak above black level to gun amplifiers
These values deviate from what the canonical CGA-to-RGB palette might lead you to expect:
- There is no quantization of voltage levels to a 'rule of thirds' (0, 1/3, 2/3, 1)
- For color #6, the "brown fix" (transistor Q206) applies a green reduction of only 36%, not 50%
- Intensity does not add a uniform value to all three channels
- For the intensified colors, the maximum level additionally depends on the count(!) of primary color components
Those are direct consequences of IBM's circuit design, which Hugo Holden's referenced write-up explains in much greater detail. All in all, this gives us a much better starting point if we want to improve our model of the 5153's color output.
By taking the measured voltage range of 0 to 1.30 (above black level) and normalizing it to the RGB24 range of 0..255, we get this:
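For illustration, that normalization might look like this in Python - note that the 0.83 V input below is a made-up placeholder, not one of the measured values:

```python
# Linear normalization of a measured gun-drive voltage (0..1.30 V
# above black level, per Holden's measurements) to the RGB24 range.

V_MAX = 1.30  # maximum measured drive level, in volts

def volts_to_rgb24(v: float) -> int:
    return round(v / V_MAX * 255)

print(volts_to_rgb24(1.30))  # full drive -> 255
print(volts_to_rgb24(0.83))  # hypothetical mid-level reading
```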
A revised CGA palette for the IBM 5153, based on gun drive level measurements
At this point, you might raise an important question: Hold it! A linear conversion of voltages to RGB levels?! Who said you could do that? What about gamma correction? After all, the transfer function of a CRT is not a linear input-to-output mapping; it's best approximated by a power law, AKA the gamma function.3 The above translation can't be valid - unless the IBM 5153 CRT has the same gamma function as the display showing our RGB colors (probably not a CRT at all). And how likely is that?
Pretty likely, actually. Regardless of display type, virtually all current video systems are standardized for a gamma value of 2.2; for historical reasons this was chosen to reflect the gamma transfer function inherent in cathode ray tube technology. This value does not vary much across properly adjusted CRT monitors, modulo minor corrections for viewing conditions (the nominal value is somewhere between 2.25 and 2.5).4
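As a quick sketch of that power law, assuming the nominal gamma of 2.2 discussed above (the exact exponent of any given 5153 is, of course, unknown):

```python
# The CRT transfer function as a simple power law:
# light_out = signal_in ** gamma, with both normalized to 0..1.

GAMMA = 2.2  # the standardized value; real CRTs fall near 2.25-2.5

def crt_light_output(signal: float) -> float:
    """Normalized light output for a normalized input signal (0..1)."""
    return signal ** GAMMA

def gamma_encode(linear: float) -> float:
    """Inverse: the pre-correction a source applies so that the
    display's power law cancels out."""
    return linear ** (1 / GAMMA)
```

A mid-level signal of 0.5 thus produces only about 22% of full light output - which is why two displays sharing the same gamma can exchange signal levels one-for-one, as assumed by the revised palette.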
So although IBM never published a gamma value for the 5153 (to my knowledge), we can make an educated guess that it's close enough, so this revised palette is more or less valid. But more importantly, it really is a good match for the 5153's visible output... as I will attempt to show below.
Here's a camera still of the 16 CGA colors as shown on my IBM 5153 (with the contrast knob at the maximum position), right next to our two CGA-to-RGB palettes: the 'canonical' one, and the revised one based on Holden's voltage measurements.
The photo was white-balanced so that color #7 is grey - apparently the color correction isn't 100% perfect (e.g. green and magenta), but it's as good as I could get it without introducing other errors. Aging electronics can affect the voltage bias on the three gun amplifiers and make them drift slightly out of adjustment, so it's not too unexpected.
The photo is still close enough to the perceptual result when looking at the monitor, so you should be able to judge which palette is more like the real thing:
At first glance the revised palette seems to have one immediate point in its favor: the difference between the normal (0-7) and intensified (8-15) colors. The revised version has less of a brightness difference between the two groups, quite like in the photo. But that difference is precisely what the 5153's contrast knob modifies (see below), so it's not an independent variable. Our new palette is simply based on measurements made with that knob at maximum.
On the other hand, there are other aspects which make the revised palette a better match:
- The greater difference between red (4) and brown (6) - in the canonical palette they're often hard to distinguish.
- The brightness levels aren't so uniform across the intensified colors.
- In particular, there's more separation between colors 10 and 11, between 12 and 13, and between 14 and 15.
...And Let's Contrast
The 5153 has an interesting little quirk compared to typical CRT monitors. Its brightness control behaves pretty much as you'd expect, modifying the levels uniformly by applying a fixed DC offset to the video signals. But the contrast control does something a little more unusual.
Contrast in most CRTs adjusts the amplifier's static AC gain, to control the difference between black level and maximum brightness across the entire image. But here, it affects only the non-intensified colors: whenever the intensity signal is high, the contrast pot is effectively bypassed and gain is not attenuated.5
In other words, contrast is applied dynamically across the raster depending on the contents of the image. Turning the knob down makes the normal colors (0-7) dimmer, but the intensified colors (8-15) are completely unaffected:
To simulate this oddity when mapping the RGBI colors to RGB, it's not enough to apply a static levels adjustment to the entire palette: you'd have to tune the values for colors 0-7 only. It's a bit useless to give precise RGB values here because they vary smoothly across the range of the adjustment, but all 3 color components have to be reduced proportionally, to preserve hues. I say "reduced" because our revised palette has the contrast knob at maximum, so from here you can only go down.
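A rough Python sketch of that adjustment, assuming a 16-entry palette of (r, g, b) tuples captured with the knob at maximum:

```python
# Simulating the 5153's contrast knob on an RGB palette: only the
# non-intensified colors (0-7) are attenuated; 8-15 bypass the pot.

def apply_contrast(palette, level: float):
    """level = 1.0 means the knob at maximum; lower values dim
    colors 0-7 proportionally on all three channels, preserving hue."""
    out = []
    for n, (r, g, b) in enumerate(palette):
        if n < 8:  # contrast only affects the normal colors
            r, g, b = (round(c * level) for c in (r, g, b))
        out.append((r, g, b))
    return out
```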
One more thing I've noticed is that the contrast control has a subtle effect on color #6 (brown). At max, it looks close enough to the one in our new palette; but when you start dialing down the contrast, its hue changes slightly and becomes a bit less reddish and more greenish. This change isn't sustained: it seems to happen around a specific point a little below maximum contrast, but then it stays like that the rest of the way down. Could be a quirk of circuitry design coupled with the special treatment of color #6 for chroma correction, or maybe just another little symptom of age.
Yes, But Why Does it Even Matter?
The benefit here isn't very obvious - we're splitting hairs about one particular monitor, and a CGA one at that. 99% of the time, viewers aren't going to notice or care as long as the palette is kinda-sorta in the ballpark, and even then they'll probably wish it wasn't. So where's the practical value in all this OCD color nitpicking?
OK then: imagine you're working on a little project with the IBM 5153 as a possible target. You're doing the visuals on your modern PC, so you simulate CGA using the 16 canonical colors. Now, imagine you're trying to make a simple color gradient in text mode. In time-honored ANSI art tradition, you use the half-shade ("▒") character and tweak the foreground and background colors until it all looks nice and smooth. But then you check the result on your 5153, and it looks like stripy vomit:
Working with the canonical palette (top) leads you to expect a passable color ramp, reducing in brightness from left to right. But on the 5153, the differences in color relationships mess up the perceptual effect, so you get ugly discontinuities (center). The revised 5153 palette (bottom) would have shown you something much closer to the real thing; at which point you'd probably want to make a better gradient, but that's a whole other can of worms.
Even with the revised palette, the RGB rendering still deviates from what we see on the CRT screen. That's partly down to the camera settings, but this monitor isn't quite done with its surprises.
Dithering and Color Mixing Oddities
Looking at the idealized RGB rendering vs. the 5153's actual output, you'll naturally notice that the real monitor blends pixels gradually in the horizontal direction; there are no sharp edges. But that's not all: the dithered "colors" you get with the alternating-dot trick are darker than expected - even though the true, solid colors are not.
If you generate the same color ramp in 80-column mode instead of 40, you increase the resolution from 320 to 640 dots per line. The smaller dots blend together even more smoothly, so the dither patterns are pretty much invisible. But the darkening effect is stronger, too:
Those blended colors that were darkened in 40-column mode are darker still at 80 columns. Meanwhile the solid colors are still unaffected. In the sharp RGB rendering, both resolutions have the same apparent brightness - but on the 5153, finer dithering seems to amplify the darkening effect.
By looking at further examples we can start to discern a certain logic:
- Colors generated by blending pixels together are darker on the 5153, vs. the simple crisp rendering on a modern display.
- Fine patterns (with horizontally-shorter dots) are darkened more than coarse patterns composed of longer dots.
- The bigger the difference between the two dot colors, the stronger the darkening effect.
With two different dot-pattern chars (0xB0 ░ and 0xB1 ▒), we can illustrate the effect on pixel sequences of varying lengths:
The cause of all this is most likely the bandwidth of the CRT gun amplifier - which determines how fast it can modulate the voltages at the CRT cathodes. These voltages control the color being displayed as the beam scans, so the time they take to change defines the sharpness of the horizontal boundaries between pixels, and the maximum resolution that can be clearly displayed.6 We can see that the IBM 5153's video amplifier isn't terribly high-bandwidth, at least not after all these years, since it has trouble achieving full separation at 640 H-dots (80 columns).
That's what blurs the dot edges; but why do they also appear darker? That depends on the gun voltage's rise and fall times. If it rises more slowly than it falls, total output for repeated oscillations is reduced. Or possibly the rise/fall curves are asymmetric in some other way; either way, the average level drops whenever the signal has to swing back and forth.
Since this only occurs when colors actually change between pixels, we can see why shorter dot stretches are darkened to a greater degree, making the effect stronger for finer patterns. Also: because the brightness lost during each slowed transition scales with the size of the voltage swing, bigger differences between the two dot colors mean a stronger darkening effect.
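To make the hypothesis concrete, here's a toy Python model: a first-order low-pass filter whose rise is slower than its fall. The time constants are arbitrary, purely for illustration - this isn't a measured model of the 5153's amplifier:

```python
# Toy model: a first-order low-pass with direction-dependent speed.
# Slow rise + fast fall means repeated transitions lose brightness,
# while a solid (constant) signal eventually reaches full level.

def slew(signal, alpha_rise=0.25, alpha_fall=1.0):
    """Filter a sequence of 0..1 samples; alpha values are made up."""
    y, out = 0.0, []
    for x in signal:
        a = alpha_rise if x > y else alpha_fall
        y += a * (x - y)
        out.append(y)
    return out

def mean(xs):
    return sum(xs) / len(xs)

solid  = [1.0] * 400                    # solid color
fine   = [1.0, 0.0] * 200               # fine dither (1-dot runs)
coarse = ([1.0] * 4 + [0.0] * 4) * 50   # coarse dither (4-dot runs)

# solid stays near full brightness; coarse darkens; fine darkens most
print(mean(slew(solid)[200:]), mean(slew(coarse)), mean(slew(fine)))
```

The ordering matches the three observations above: solid colors unaffected, blended ones darkened, and finer patterns darkened more than coarse ones.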
Here are a couple of test patterns I made to see how various color mixtures are affected. If you compare the 40 and 80 column versions of each screen (say in two browser tabs), you'll see it:
Maybe this sort of thing simply happens when monitors get old. I've had friends test this on their own 5153s, to verify that mine isn't just trying to be special, and the results were pretty much the same. Whether or not they had better resolution when they were new, it's probably safe enough to expect the average 5153 to be like this today.
Tying it All Together
If that's the case, how can we simulate the fuzzy edges and the darkening effect in a plain RGB rendering?
One method that seems to work is applying horizontal motion blur to act as a low-pass filter - but without performing any gamma correction. In almost any other situation, that's a paddlin'. But if my above hunch is correct, we may actually have a good reason to do that here: in the real monitor, the voltage swings are smoothed out at the amps, before the tube de-linearizes the color space. By blending pixel values as if they were linear and displaying them as sRGB, we simulate just that.
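A minimal Python sketch of that idea - a horizontal box blur applied directly to the stored sRGB values, deliberately skipping the usual decode-to-linear step:

```python
# Horizontal box blur over one channel of a scanline (0..255 values),
# treating the sRGB codes as if they were linear - on purpose, per the
# reasoning above - with edge clamping at the scanline boundaries.

def hblur_row(row, radius=1):
    out = []
    for i in range(len(row)):
        lo, hi = max(0, i - radius), min(len(row), i + radius + 1)
        window = row[lo:hi]
        out.append(round(sum(window) / len(window)))
    return out

# An alternating bright/dark pattern blends toward the numeric midpoint
# of the codes; the display's gamma then renders that midpoint darker
# than a true 50% light mix, mimicking the darkening seen on the 5153.
print(hblur_row([255, 0] * 8))
```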
By massaging the above examples this way, we get:
Ahhh, much better! (Duke Nukem voice and flushing water sound are optional.) There are still a few mismatches, but honestly if I consider all the factors that could (and probably did) introduce errors between the 5153's video circuits and the photos on my PC, even getting somewhat close at all is satisfying enough. This gives me a pretty good indication of what a given image will look like on the 5153, and that's better than none at all.
What About Other CGA Monitors?
This bears repeating: all of this really only applies to the IBM 5153. Since nothing about CGA's specifications mandates this exact interpretation of the RGBI signals, other CGA CRTs don't necessarily follow it all that closely, especially with those special little twists (brown, intensity, contrast). In reality, some of them are different enough. Let's start with IBM's other CGA-capable monitors.
5154 Enhanced Color Display: this is the original EGA monitor, and as already noted, EGA gave us the "canonical" CGA palette by adapting those 16 colors to its 6-bit RGB color model. However, the 5154 can also handle RGBI video directly, either from a CGA card or from EGA in 200-line (15.7 kHz) mode.
You might suppose that its RGBI signal processing would follow the 5153's, but no: the 4 input bits are internally translated to the 6-bit RGB model, giving exactly the same 16 values we get with native EGA video;7 this mapping is performed by a ROM chip in the input stage.8 So on the 5154, even true CGA video yields the 16 EGA-like canonical colors.
4863 PCjr Color Display: this was manufactured by Mitsubishi, not by Tatung like the 5153, and the one schematic I can find makes it look rather different internally (too bad it's not very readable). Although I don't own this one, enough photos exist to show that the brown color behaves about the same as on the 5153, but I can't be sure about the rest.
5515 12" Color Display (for the IBM JX): photos of this one tell us that there's no brown as such, and color #6 is clearly shown as dark yellow, so evidently it doesn't have the "chroma correction" circuitry. You can see three of these at the top here. That's the third variation (at least!) on the CGA color set just from IBM themselves.
There's also the 5145 PC Convertible Color Display, from a different manufacturer yet again, this time in Korea (probably Samsung, which also made the 5154). But this one seems to be the rarest of IBM's consumer-market CGA CRTs, and info/schematics/photos are just as scarce.
As for non-IBM ones, they're all over the place - I have a later Samsung/Samtron monitor where brown is extremely close to red, and others have said the same about Tandy models; Philips monitors reportedly have their own quirks, too.
Moral of the story: there's no such thing as the One True CGA-to-RGB Palette, not when the guys who invented the standard couldn't decide on one. The canonical mapping isn't a particularly bad model for "generic" CGA displays, but to represent the output of a particular monitor, you have to pay attention to its characteristics. The revised 5153 palette (and other tweaks described here) seem to do the trick for this one.
- IBM 5153 - Dark yellow to brown circuitry at MinusZeroDegrees [return]
- Fitting an EGA Card to an IBM 5155, H. Holden (2016), pp. 21-24 [return]
- Linearity and Gamma in Learning Modern 3D Graphics Programming, Jason L. McKesson (2012) [return]
- An Accurate Characterization of CRT Monitor (I) Verifications of Past Studies and Clarifications of Gamma, Naoya Katoh, Tatsuya Deguchi and Roy S. Berns (2001) [return]
- Fitting an EGA Card to an IBM 5155, H. Holden (2016), pp. 17, 19 [return]
- Understanding the Operation of a CRT Monitor, National Semiconductor (1989), p. 2 [return]
- IBM Enhanced Color Display in Technical Reference Options and Adapters Vol. 1, IBM (1984), p. 81 (05-4) [return]
- IBM 5154 - Colour mapping via ROM at MinusZeroDegrees [return]