This confuses me because the vertical rate is not 60 Hz, it's 3579545 Hz / (525/2 * 455/2) = 59.94 Hz. In other words, it's odd that they chose to be compatible with black and white instead of color: with the 59.94 Hz color rate it would have been 44.056 kHz instead of 44.1 kHz.
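To sanity-check those numbers, here's a quick Python sketch. The 3-samples-per-line, 245-usable-lines-per-field layout is the figure commonly cited for the video-based PCM adaptors of the era, so treat that part as an assumption:

    # NTSC color timing: 3579545 Hz subcarrier, 227.5 subcarrier cycles
    # per line, 262.5 lines per field.
    subcarrier = 3579545.0
    color_field_rate = subcarrier / (525 / 2.0 * 455 / 2.0)  # ~59.94 Hz
    print(color_field_rate)                                   # 59.9400599...

    # Assumed PCM-adaptor layout: 3 samples per line, 245 usable lines per field.
    samples_per_field = 3 * 245
    print(samples_per_field * 60)                # 44100 with the 60 Hz B&W rate
    print(samples_per_field * color_field_rate)  # ~44056 with the 59.94 Hz color rate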
Edit:
Well it turns out that 44.056 kHz was used for the "EIAJ digital-audio-on-videotape standard": http://recordingtheworld.infopop.cc/eve/forums/a/tpc/f/22260...
Sony was originally proposing 44.056 kHz (NTSC, popular in Japan) with 16 bits, while Philips was pushing for 44.1 kHz (PAL, popular in Europe) with 14 bits. The two reconciled their differences at the 4th Red Book meeting in 1980[1]. Sony was further ahead in developing the CD players, but Philips was supposedly in the lead when it came to making the CDs[2]. Sony insisted on 16 bits while Philips was pushing for 14. As a compromise they may have gone with the 44.1 kHz Philips was proposing and the 16 bits Sony was proposing because it would be easier to remember. Posts [1] and [2] are in direct conflict with each other on this point. There was further tension over what size of disc to use[2].
The CD was one meeting away from launching another format war in the spirit of VHS vs Betamax or Blu-ray vs HD-DVD
Of course 44.056 kHz products did make it into the field for professional audio engineers. Anecdotally this made for some trouble: http://www.realhd-audio.com/?p=2197
❝Of course, lots of CDs were released with the original 44.056 kHz rate simple reclocked at 44.1 kHz. This resulted in a very slight speed increase AND a pitch shift of less than a quartertone.❞
While technically true (less than a quartertone), it's much, much, much less than a quartertone: about 1.7 cents (1.7 percent of a half-tone, or roughly 1/30th of a quartertone). If you accidentally mix up 48 kHz and 44.1 kHz, it's a much more noticeable ~1.5 half-tones. I doubt that this slight detuning is so blatantly obvious that it "freaks out" even a very well trained and very hot tempered classical violinist.
If you want to check the math: there are 12 semitones in an octave. One octave doubles the frequency, so the frequency ratio between two adjacent semitones (e.g. from any key on your piano to the adjacent white or black key) is the twelfth root of two, ~1.0595. When tuning your guitar, your tuner might display the deviation from the true pitch in cents; one cent is 1% of the interval between two semitones, i.e. (Python) math.pow(2, 1.0/1200) -> 1.0005777895065548.
The frequency ratio between 44.1 kHz and 44.056 kHz is 1.0009987.
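A minimal Python check of those figures, using nothing beyond the sample rates and the standard 1200-cents-per-octave definition:

    import math

    def cents(ratio):
        # 1200 cents per octave, so cents = 1200 * log2(ratio)
        return 1200 * math.log(ratio, 2)

    print(cents(44100.0 / 44056.0))  # ~1.73 cents, about 1/29 of a 50-cent quartertone
    print(cents(48000.0 / 44100.0))  # ~146.7 cents, i.e. roughly 1.5 semitones
    print(math.pow(2, 1.0 / 1200))   # one cent as a frequency ratio: ~1.000578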
Quite a lot of early CD players were actually 14 bits. Many early CDs were probably mastered with 14 bits in mind too, as they have a noticeably low average mix level.
The original American TV standard and TV recorders ran at 60 Hz. When color was introduced, the field rate was shifted to 59.94 Hz to avoid interference between the color signal and the sound signal.
(Color was encoded as a high-frequency sine wave on top of the black-and-white signal, which is mostly invisible on a black-and-white set and so allows for backwards compatibility. The phase of that fuzz indicates hue, and its amplitude indicates color intensity. This is why, in the old system, if someone wore a shirt with vertical stripes on TV, viewers would see a rainbow of color over the shirt.)
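For the curious, here's a rough sketch of how those numbers relate, using the published System M constants (a 4.5 MHz sound intercarrier on the 286th harmonic of the color line rate, a color subcarrier at 227.5 times the line rate, and 262.5 lines per field):

    # The color line rate was chosen so the 4.5 MHz sound carrier lands on
    # its 286th harmonic, which keeps the color subcarrier (at 227.5x the
    # line rate) interleaved away from both the luma spectrum and the sound.
    sound_carrier = 4.5e6
    line_rate = sound_carrier / 286       # ~15734.27 Hz (was 15750 Hz in B&W)
    color_subcarrier = 227.5 * line_rate  # ~3579545.45 Hz
    field_rate = line_rate / 262.5        # ~59.94 Hz (was exactly 60 Hz in B&W)
    print(line_rate, color_subcarrier, field_rate)
    print(60 * 1000 / 1001.0)             # the same 59.94, written the usual way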
I believe all NTSC equipment is required to support black-and-white System M signals, which are exactly 60 Hz[1]. It probably made their equipment much simpler to forget about colour encoding entirely. (And it made the 44,100 Hz fit too.)