Why is the red primary color oriented at 104° on a vectorscope?
It can't be arbitrary! All the vectorscopes I've ever seen, and I've seen at least two or three, have used the same graticule orientation. The red primary always falls at roughly 104°.
But this can't be right! I was sure vectorscopes were displaying colors as they exist in HSV. I mean, it's a circle, and it shows Hue and Saturation? Say no more. But then, a few weeks ago, while looking at one of these vectorscopes I've been so casually using for years, I mistakenly let an awful, pointless, time-stealing question bubble up: if it is HSV, then why are the colors in the wrong spot?
HSV is always oriented such that the red primary (i.e. the pure, 100% saturated hue) falls along the 0° line. Even Wikipedia says so! If it's not HSV, then what the heck is it?
What color space is a vectorscope actually showing?
The shortest answer is that it's not any color space at all! Vectorscopes don't know one single thing about colors, or color spaces, or how to represent any of it. They're just brute analog machines that plot the relationship between two aspects of a signal: its phase and its amplitude. This fact means that my original question was way too squishy. A refined question is:
What the hell does it mean for the color red to have a phase angle of 104°!?
The shortest answer to this version is that the angle is how far out of phase one signal ends up, relative to a baseline, when you modulate it with another signal carrying the encoded color information. But that's not actually a useful explanation, because it just yields more "wait... why is that like that?" questions (which is why this was such a time sink!). To understand how color is encoded, and why it's encoded that way, and what is getting modulated, and where these "phase angles" are coming from, you've got to go all the way back to the very first NTSC standard for encoding black and white TV. Because the whole thing is best summed up as "tech debt".
How black and white TV was encoded
Back in prehistory, television was black and white. It consisted of just one signal called Luminance ("Luma" for short, or just "Y" if you really need to save time). This signal tracked how dark or bright any given portion of the screen should be. Since there was just this one piece of information to track, a super elegant and simple standard for encoding video information emerged. It looked a bit like this:
This bunch of squiggles is what one line of TV looked like behind the scenes[0] – just some voltage going up and down on a wire (which blows my mind). Repeat these blocks about 480 more times and you've got yourself a frame of analog TV. With the exception of the bookkeeping bits at the beginning, everything else is video information. The higher the amplitude of the video signal, the brighter that part of the image; the lower, the darker.
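To make that concrete, here's a minimal Python sketch of the idea, assuming the usual NTSC convention of measuring amplitude in IRE units (sync tip at -40, blanking at 0, black at 7.5, peak white at 100) and only roughly approximating the timing of the bookkeeping bits. The `scanline_ire` helper is just something I made up for illustration:

```python
import numpy as np

def scanline_ire(brightness, samples_per_line=910):
    """Map one row of brightness values (0.0 to 1.0) onto a line of IRE levels."""
    line = np.zeros(samples_per_line)

    # The bookkeeping bits at the start of the line: sync pulse, then blanking.
    sync_end = int(samples_per_line * 0.075)   # roughly 4.7 us of a ~63.5 us line
    blank_end = int(samples_per_line * 0.17)
    line[:sync_end] = -40.0                    # sync tip sits below blanking
    line[sync_end:blank_end] = 0.0             # blanking level

    # Everything else is picture: brighter pixel -> higher voltage.
    active = np.interp(
        np.linspace(0, 1, samples_per_line - blank_end),
        np.linspace(0, 1, len(brightness)),
        brightness,
    )
    line[blank_end:] = 7.5 + active * (100.0 - 7.5)
    return line

# One line of a simple dark-to-bright ramp. Repeat ~480 times for a frame.
ramp_line = scanline_ire(np.linspace(0.0, 1.0, 640))
```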
This is important to the story, because this humble, bare-bones encoding scheme was standardized by the FCC in 1941, and color TV eventually had to be built on top of it.
The existing black and white standard made introducing color challenging
Within the decade, engineers began figuring out a bunch of different ways to display color information. Some bolted a motor and giant RGB wheel to the front of the TV and spun it really, really fast (seriously).
Others just started transmitting their own proprietary color signals, existing TVs be damned![1]
The Federal Communications Commission was having none of this fragmentation. In their 1949 session, they set out two hard-line requirements for the development of a new color transmission standard. It must:
- Be backwards compatible with existing black and white receivers
- Have a minimum of redundancy and still fit in the 6 MHz allotted to B&W transmission
These were brutal requirements. The first meant that engineers couldn't just swap over to some "simple" strategy like shipping each red, green, and blue channel independently – existing TVs wouldn't know how to decode it. The second meant that they similarly couldn't just send all the RGB channels in addition to the required luminance one. Far too much redundant information and not enough bandwidth.
The only real path forward was encoding just the RGB information that wasn't already captured as part of the luminance signal. Meaning, you'd subtract it out like this: $$\begin{aligned}R' &= R - Y \\ G' &= G - Y \\ B' &= B - Y\end{aligned}$$ where luminance (Y) is a sum of the RGB values weighted by how our eyes perceive their relative brightness [2] $$Y = 0.30R + 0.59G + 0.11B$$ These color difference signals are called Chrominance.
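In code form (a toy sketch with my own made-up function names, assuming R, G, and B are normalized to the 0 to 1 range), that subtraction looks like this:

```python
def luminance(r, g, b):
    """Luma: R, G, B weighted by how bright each channel looks to the eye."""
    return 0.30 * r + 0.59 * g + 0.11 * b

def color_differences(r, g, b):
    """Chrominance: whatever color information luma didn't already capture."""
    y = luminance(r, g, b)
    return r - y, g - y, b - y

# Pure red:
luminance(1.0, 0.0, 0.0)          # 0.30
color_differences(1.0, 0.0, 0.0)  # (0.70, -0.30, -0.30)
```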
This color difference thing solved the problem of isolating color information while also keeping luminance around for existing TVs. However, it did not solve the redundancy or bandwidth problems. There were still four individual channels of information that needed to be encoded and fit into that squiggly standard.
Reducing chrominance from three channels down to two
These engineers were crazy clever. They realized that these three chrominance channels still contained redundant information. It turns out that you only need two of the color difference signals; the third can be reproduced entirely with just a little algebra. So, the $G - Y$ signal was dropped, and the canonical color difference components became $B - Y$, which is called $U$, and $R - Y$, called $V$. Now the entire color picture can be encoded in just two chrominance channels and one luminance channel: Y, U, and V.
It looks like this: (when $Y = 0.5$)
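Here's that redundancy as a sketch in code, reusing the luma weights from above: because the weighted color differences always cancel out, the dropped $G - Y$ falls straight out of the other two.

```python
def recover_g_minus_y(u, v):
    """Rebuild the dropped G-Y from U (B-Y) and V (R-Y).

    Because Y = 0.30R + 0.59G + 0.11B, the weighted color differences
    always cancel: 0.30(R-Y) + 0.59(G-Y) + 0.11(B-Y) = 0.
    Solving that for G-Y gives the line below.
    """
    return -(0.30 * v + 0.11 * u) / 0.59

# Pure red again: U = B-Y = -0.30, V = R-Y = 0.70
recover_g_minus_y(u=-0.30, v=0.70)   # -0.30, exactly the value that was dropped
```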
But you can actually do better than $YUV$! Engineers at RCA ran a ton of tests and found that humans are actually way more discerning when looking at oranges and blues than when looking at purples and greens. Because of this, you can compress the axis representing the less important colors into just 0.6 MHz without any noticeable loss in color reproduction, and then devote a chunky 1.5 MHz to the more important colors! Their experiments led to a new pair of axes on the same plane, called $I$ and $Q$, shifted 33° from $UV$.
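As a sketch, the change of axes is nothing more than a 33° rotation of the same plane. (The real standard also scales the color difference signals before this step; I'm skipping that here to keep the geometry obvious, so treat this as the idea only.)

```python
import math

def uv_to_iq(u, v, tilt_deg=33.0):
    """Read the same (U, V) point off axes rotated by 33 degrees.

    I points along the orange/blue direction our eyes are pickiest about,
    Q along the purple/green direction that tolerates less bandwidth.
    """
    a = math.radians(tilt_deg)
    i = v * math.cos(a) - u * math.sin(a)
    q = v * math.sin(a) + u * math.cos(a)
    return i, q
```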
Note that the color space itself is the same, only the meaning of the coordinates changes.
This color space is the core piece of the puzzle. It is the relationship between coordinates on this space that determines the phase of the encoded signal, and thus the angle of the color on the vectorscope!
This is where things get really cool.
You can compress the remaining two chrominance channels into one!
There's a crazy property of sine waves: if you combine two of them of the same frequency in "quadrature", or 90° out of phase with each other, you get back a new pure sine wave with its own amplitude and phase. But what's even crazier is that you can get back the original components that went into the signal by just performing the modulation action again!
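Here's a quick numerical sketch of that property, with nothing NTSC-specific about it: combine two made-up components in quadrature, note that the result is a single sinusoid with its own amplitude and phase, then multiply by the same carriers again and average to recover the components.

```python
import numpy as np

fc = 1_000.0                             # arbitrary carrier frequency, Hz
t = np.arange(0, 0.05, 1 / 1_000_000)    # 50 ms sampled at 1 MHz
a, b = 0.8, 0.3                          # two arbitrary component amplitudes

# Combine the components in quadrature: one rides a cosine, the other a sine.
signal = a * np.cos(2 * np.pi * fc * t) + b * np.sin(2 * np.pi * fc * t)

# The sum is a single sinusoid with this amplitude and this phase offset
# relative to the plain cosine carrier:
amplitude = np.hypot(a, b)                   # ~0.854
phase_deg = np.degrees(np.arctan2(b, a))     # ~20.6 degrees

# "Perform the modulation again" to get the pieces back: multiply by the same
# carriers and average. The double-frequency terms average to zero over whole
# cycles, leaving half of each original component.
a_back = 2 * np.mean(signal * np.cos(2 * np.pi * fc * t))   # ~0.8
b_back = 2 * np.mean(signal * np.sin(2 * np.pi * fc * t))   # ~0.3
```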
This modulation strategy is where the $I$ ("In phase") and $Q$ ("Quadrature") components get their names, and it's how the individual color values are compressed into a single instantaneous voltage carrying crazy amounts of information.
The phase of the new sine wave produced from the combined $I$ and $Q$ components is exactly what leads to a color having a certain phase angle!
Now for something even cooler: how do you know what the phase of this resultant sine wave will be? Because the $I$ and $Q$ values are in quadrature, and thus at 90° to each other, they form the legs of a right triangle! And with that, you only need "soh" "cah" "toa" level trig skills to figure out the phase angle the two components will have when combined!
Which is awesome, because it gives us multiple tools to check our work. We can compute $\arctan(Q/I)$ to see what we expect the phase angle to be, and then verify that same information in the encoded signal itself.
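As a trivial sketch (the function name is mine), the triangle math is just:

```python
import math

def expected_phase_and_amplitude(i, q):
    """I and Q are the two legs of a right triangle, so the combined signal's
    phase is the angle whose tangent is Q/I ("toa") and its amplitude is the
    hypotenuse."""
    return math.degrees(math.atan2(q, i)), math.hypot(i, q)

# Sanity check: equal parts I and Q should land exactly between the two axes.
expected_phase_and_amplitude(0.5, 0.5)   # (45.0, ~0.707)
```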
For instance, this is an arbitrary carrier frequency modulated with a certain suspect red color.
Subtracting the modulated signal's phase from the unmodulated carrier's where they both cross zero gives you their phase difference.
Compare those numbers to the ones computed from the trig relationships and they'll match one to one!
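If you'd rather not eyeball the plot, here's a rough numerical version of that check. The `first_downward_zero` helper and the component values are just mine for illustration (the same made-up 0.8 and 0.3 from the quadrature sketch above):

```python
import numpy as np

fc = 1_000.0                              # arbitrary carrier frequency, Hz
t = np.arange(0, 0.01, 1 / 1_000_000)     # 10 ms sampled at 1 MHz
i_val, q_val = 0.8, 0.3                   # the components riding the carrier

carrier = np.cos(2 * np.pi * fc * t)
modulated = i_val * np.cos(2 * np.pi * fc * t) + q_val * np.sin(2 * np.pi * fc * t)

def first_downward_zero(x, times):
    """Time of the first zero crossing where the signal goes positive -> negative."""
    idx = np.where((x[:-1] > 0) & (x[1:] <= 0))[0][0]
    frac = x[idx] / (x[idx] - x[idx + 1])          # interpolate between samples
    return times[idx] + frac * (times[idx + 1] - times[idx])

delta_t = first_downward_zero(modulated, t) - first_downward_zero(carrier, t)
phase_deg = delta_t * fc * 360.0    # ~20.6 degrees, same as arctan(Q/I) above
```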
But! There's actually one final bit of subtlety to understand. If we do all of the above work for the red primary color (255, 0, 0), we'll actually end up with 19.3° – not the 104° that was promised! This can be really trippy at first. However, the thing to realize is that the 19.3° is measured from the $I$ axis, and the $IQ$ axes are themselves rotated 33° away from $UV$. So, your starting point, $I$, is actually located at 123° ($90° + 33° = 123°$).
It's from this initial angle that you subtract the 19.3° ($123° - 19.3°$), thus giving the final, much sought after, life-stealing number of precisely 103.7° (rounded up to a nice clean 104°)!
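Putting the whole chain together for pure red, as a sketch (I'm taking the published YIQ weights for the red channel and rounding them to two decimals, so $I \approx 0.60$ and $Q \approx 0.21$; keeping more decimals only moves the answer by a fraction of a degree):

```python
import math

# Pure red's chrominance components, assuming the published YIQ weights for
# the red channel rounded to two decimals.
i, q = 0.60, 0.21

phase_from_i_axis = math.degrees(math.atan2(q, i))     # ~19.3 degrees
i_axis_angle = 90.0 + 33.0                             # the I axis sits at 123 degrees
vectorscope_angle = i_axis_angle - phase_from_i_axis   # ~103.7 -> the 104 on the graticule
```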
That's why vectorscopes are the way they are.
The question is now answered. Red shows up at ~104° because that's the phase angle you get when the color's coordinates are converted to YIQ, combined together in quadrature, and measured against a reference signal.
If you're at all interested in this, the topic goes crazy deep. A must-read is the RCA Review from 1953. Completely ignored for brevity are carrier modulation, subcarriers, color bursts, and how audio, luminance, and color all get squished together. Even though this is effectively a dead technology, it's still fascinating and inspiring to read about all the super clever engineering that went into it.
Footnotes and useful links in no particular order
- [0] Ignoring carrier modulation
- [1] This "fear not consumer!" message clipped from a book excerpt found here
- [2] Page 186 of RCA Review 1953
- https://en.wikipedia.org/wiki/Vectorscope
- https://en.wikipedia.org/wiki/SMPTE_color_bars
- https://en.wikipedia.org/wiki/Colorburst
- https://bkpmedia.s3.amazonaws.com/downloads/manuals/en-us/1249B_manual.pdf
  - This is where I first found reference to the exact angles of the hues.
- https://www.image-engineering.de/content/library/diplomathesis/borysgoliksoftwareinterface.pdf
- https://pysdr.org/content/sampling.html#receiver-side
- https://techpubs.jurassic.nl/manuals/0640/admin/CombinerUG/sgihtml/go01.html
- https://web.archive.org/web/20001019092616/http://www.ee.washington.edu/conselec/CE/kuhn/ntsc/95x4.htm
- http://www.kolumbus.fi/pami1/video/pal_ntsc.html
- https://en.wikipedia.org/wiki/IRE_(unit)
- https://www.testmart.com/webdata/mfr_pdfs/KEN/cg.pdf
- https://transition.fcc.gov/omd/history/tv/documents/76years_tv.pdf
- https://www.law.cornell.edu/cfr/text/47/73.699
- https://www.law.cornell.edu/cfr/text/47/73.682