In 1940, the United States Federal Communications Commission convened a group of engineers and executives to solve a crisis that threatened to fracture the American television industry. This group, the National Television System Committee, was formed to resolve bitter conflicts between competing companies over the introduction of a nationwide analog television system. The stakes were high: without a unified standard, consumers might be left with incompatible sets from different manufacturers, stalling the entire industry before it truly began. The committee issued its first technical standard for black-and-white television in March 1941, basing it on a 1936 recommendation by the Radio Manufacturers Association. The decision was not made lightly; it required a delicate compromise between RCA's 441-line standard, already in use by its NBC television network, and the demands of Philco and DuMont, who wanted to increase the number of scan lines to between 605 and 800. The final selection of 525 scan lines was a middle ground, allowing higher resolution than RCA's initial proposal while remaining technically feasible for the electronics of the time. The standard also established a frame rate of 30 frames per second, with each frame consisting of two interlaced fields of 262.5 lines, for 60 fields per second. Other critical parameters included a 4:3 aspect ratio and frequency modulation of the sound signal, setting the foundation for American television for decades to come.
The Color War And The Compromise
By January 1950, the committee was reconstituted to tackle the even more complex challenge of standardizing color television. In October 1950, the Federal Communications Commission had briefly approved a 405-line field-sequential color television standard developed by CBS, but the system was incompatible with existing black-and-white sets. The CBS system used a rotating color wheel, reduced the number of scan lines from 525 to 405, and increased the field rate from 60 to 144 fields per second, for an effective frame rate of 24 frames per second. Legal action by rival RCA kept commercial use of the system off the air until June 1951, and regular broadcasts lasted only a few months before the manufacture of all color sets was banned by the Office of Defense Mobilization in October 1951, ostensibly due to the Korean War. A variant of the CBS system was later used by NASA to broadcast pictures of astronauts in space, but the commercial failure of the CBS approach paved the way for a new solution. In December 1953, the FCC unanimously approved what became the NTSC color-television standard, later defined as RS-170a. This standard retained backward compatibility with existing black-and-white sets, a crucial requirement for mass adoption. Color information was added to the black-and-white image by introducing a color subcarrier at 3.579545 MHz, a frequency chosen so that the horizontal line-rate modulation components of the chrominance signal fell between those of the luminance signal. This allowed the chrominance signal to be easily filtered out of the luminance signal on new sets while remaining minimally visible on existing ones. To accommodate the color subcarrier, the horizontal line rate was reduced from 15,750 to 15,734 lines per second, and the frame rate was reduced to 30/1.001, approximately 29.970 frames per second.
These reductions amounted to only about 0.1 percent, small enough to be tolerated by existing TV sets, allowing the transition to color to proceed without rendering millions of black-and-white sets obsolete.
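The relationships among these numbers can be checked exactly. Here is a short sketch using Python's rational arithmetic, based on the standard's defining ratios (the color subcarrier is 455/2 times the horizontal line rate, and the line rate is the 4.5 MHz sound-carrier offset divided by 286):

```python
from fractions import Fraction

# Line rate: the 4.5 MHz audio-carrier offset divided by 286
# (reduced from the black-and-white standard's 15,750 Hz).
line_rate = Fraction(4_500_000, 286)        # ~15,734.266 lines per second

# Color subcarrier: an odd multiple (455) of half the line rate, which
# interleaves chrominance components between the luminance components.
subcarrier = line_rate * Fraction(455, 2)   # ~3,579,545.45 Hz

# 525 lines per frame gives the familiar "29.97" frame rate, exactly 30/1.001.
frame_rate = line_rate / 525
assert frame_rate == Fraction(30_000, 1_001)

print(float(line_rate))    # ~15734.2657
print(float(subcarrier))   # ~3579545.4545
print(float(frame_rate))   # ~29.97003
```

The exact subcarrier value, 3,579,545.45 Hz, is conventionally quoted rounded as 3.579545 MHz, the figure used throughout this article.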
The first publicly announced network television broadcast of a program using the NTSC-compatible color system was an episode of NBC's Kukla, Fran and Ollie on August 30, 1953, viewable in color only at NBC headquarters. The first nationwide viewing of NTSC color occurred on the following January 1 with the coast-to-coast broadcast of the Tournament of Roses Parade, viewable on prototype color receivers at special presentations nationwide. The first color NTSC television camera was the RCA TK-40, used for experimental broadcasts in 1953; an improved version, the TK-40A, introduced in March 1954, became the first commercially available color-television camera. Later that year, the improved TK-41 became the standard camera and was used through much of the 1960s. The NTSC standard was adopted by other countries, including Japan and several in the Americas, creating a vast network of compatible systems.
The Ghost In The Machine
Despite its technical ingenuity, the NTSC system was plagued by reception problems that could degrade the color accuracy of the picture. Ghosting could change the phase of the colorburst, altering a signal's color balance, and the vacuum-tube electronics used in televisions through the 1960s led to technical problems that required the inclusion of a tint control on almost every set. Hue controls are still found on NTSC TVs, but color drifting generally ceased to be a problem by the 1970s. Compared to PAL in particular, NTSC color accuracy and consistency were sometimes considered inferior, leading video professionals and television engineers to jokingly expand NTSC as Never The Same Color, Never Twice the Same Color, or No True Skin Colors. The system used a luminance-chrominance encoding scheme in which the red, green, and blue primary color signals were weighted and summed into a single luma signal, designated Y′, which replaced the original monochrome signal. The color-difference information was encoded into the chrominance signal, which carried only the color information, allowing black-and-white receivers to display NTSC color signals simply by ignoring the chrominance. Some black-and-white TVs sold in the U.S. after the introduction of color broadcasting in 1953 were designed to filter chroma out, but early sets did not, and chrominance could be seen as a crawling dot pattern in areas of the picture with saturated colors. To derive signals carrying only color information, the difference between each color primary and the summed luma was taken, and these difference signals were combined into two new color signals, known as I and Q, which were transmitted together using quadrature amplitude modulation (QAM). The I and Q axes were rotated relative to the color-difference axes, with orange-blue color information transmitted on the I signal at 1.3 MHz bandwidth, and purple-green color information encoded on the Q signal at 0.4 MHz bandwidth.
This allowed the chrominance signal to use less overall bandwidth without noticeable color degradation. Each of the two signals amplitude-modulated a 3.58 MHz carrier, the two carriers being 90 degrees out of phase with each other, and the result was their sum with the carriers themselves suppressed. The result could be viewed as a single sine wave of varying phase and amplitude relative to a reference carrier: the phase represented the instantaneous color hue captured by a TV camera, and the amplitude represented the color saturation. This 3.58 MHz subcarrier was added to the luminance to form the composite color signal, which modulated the video-signal carrier.
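As an illustration of this encoding chain, here is a minimal Python sketch. The matrix coefficients are the standard (rounded) NTSC Y′IQ values, but the function names are ours and the code is illustrative rather than broadcast-accurate:

```python
import math

def rgb_to_yiq(r, g, b):
    """Gamma-corrected R', G', B' values in [0, 1] -> (Y', I, Q)."""
    y = 0.299 * r + 0.587 * g + 0.114 * b   # weighted luma sum
    i = 0.596 * r - 0.274 * g - 0.322 * b   # orange-blue axis (1.3 MHz)
    q = 0.211 * r - 0.523 * g + 0.312 * b   # purple-green axis (0.4 MHz)
    return y, i, q

def chroma(i, q, t, f_sc=3.579545e6):
    """Instantaneous chrominance: two subcarriers 90 degrees apart, summed."""
    w = 2.0 * math.pi * f_sc * t
    return i * math.cos(w) + q * math.sin(w)

# The (I, Q) pair is equivalent to one sine wave whose phase encodes hue
# and whose amplitude encodes saturation.
y, i, q = rgb_to_yiq(1.0, 0.0, 0.0)          # pure red
saturation = math.hypot(i, q)
hue_degrees = math.degrees(math.atan2(q, i))
```

Note that a pure gray input (r = g = b) yields I and Q of essentially zero, so the chrominance vanishes and only the luma remains, which is exactly why black-and-white receivers could display a color signal by ignoring the chrominance.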
The Digital Transition
With the advent of digital television, analog broadcasts were largely phased out, marking the end of an era for NTSC. NTSC broadcasters in the U.S. were required by the FCC to shut down their analog transmitters by February 17, 2009, though the deadline was later moved to June 12 of that year. Low-power and Class A stations and translators were required to shut down by 2015, although an FCC extension allowed some stations operating on Channel 6 to continue until July 13, 2021. Canadian analog TV transmitters in markets not subject to the mandatory 2011 transition were to be shut down by January 14, 2022, under a 2017 schedule from Innovation, Science and Economic Development Canada. Most countries using the NTSC standard, like those using other analog television standards, have switched to newer digital television standards, of which at least four are in use worldwide. North America, parts of Central America, and South Korea are adopting or have adopted the ATSC standards, while other countries, such as Japan, have adopted standards other than ATSC. Most over-the-air NTSC transmissions in the United States ended on June 12, 2009, and those in Canada and most other NTSC markets ended by August 31, 2011. Since the introduction of digital sources such as DVDs, the term NTSC has referred to digital formats with 480 to 487 active lines at a frame rate of 30 or 29.97 frames per second, serving as a digital shorthand for System M. The NTSC-Film standard has a digital resolution of 720 by 480 pixels for DVD-Video, 480 by 480 pixels for Super Video CD, and 352 by 480 pixels for Video CD. The digital video camcorder and digital television equivalents of NTSC likewise use 720 by 480 pixels.
The Technical Architecture
A transmitted NTSC television channel occupies a total bandwidth of 6 MHz, with the actual video signal, which is amplitude-modulated, transmitted between 500 kHz and 5.45 MHz above the lower end of the channel. The video carrier is 1.25 MHz above the lower end of the channel, and like most AM signals, it generates two sidebands, one above the carrier and one below, each 4.2 MHz wide; the entire upper sideband is transmitted, but only 1.25 MHz of the lower sideband, known as a vestigial sideband, is transmitted. The color subcarrier, 3.579545 MHz above the video carrier, is quadrature-amplitude-modulated with a suppressed carrier. The audio signal is frequency-modulated with a maximum frequency deviation of 25 kHz, less than the 75 kHz deviation used on the FM broadcast band. The main audio carrier is 4.5 MHz above the video carrier, 250 kHz below the top of the channel. A channel may also contain an MTS signal, which offers more than one audio signal by adding one or two subcarriers to the audio signal, normally when stereo audio or second-audio-program signals are used.
An NTSC frame has two fields, F1 and F2, with field dominance depending on a combination of factors, including decisions by equipment manufacturers and historical conventions. Most professional equipment can switch between a dominant upper or dominant lower field. Field dominance matters when editing NTSC video, as incorrect interpretation of field order causes a shuddering effect in which moving objects appear to jump backward and forward between successive fields. Field order is likewise important when interlaced NTSC is transcoded to a format with a different field dominance, and three-two pulldown, which converts 24 frames per second to 30, also produces unacceptable results with an incorrect field order. A standard NTSC video image contains invisible lines, lines 1 to 21 of each field, known as the vertical blanking interval, or VBI, with lines 1 to 9 used for the vertical-sync and equalizing pulses.
The remaining lines were blanked in the original NTSC specification to provide time for the electron beam on CRT screens to return to the top of the display. VIR, vertical interval reference, adopted during the 1980s, attempts to correct some NTSC color problems by adding studio-inserted reference data for luminance and chrominance levels on line 19. The remaining VBI lines are typically used for datacasting or ancillary data such as video editing timestamps, test data on lines 17 to 18, a network source code on line 20 and closed captioning, XDS, and V-chip data on line 21. Some stations transmitted TV Guide On Screen data on VBI lines 11 to 18, 20, and 22, but TVGOS was discontinued in 2013 and 2016, ending OTA program-guide services for compatible devices.
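The channel layout described above can be sketched numerically. Here is a small Python example, using the 54 MHz lower edge of VHF channel 2 purely as an illustrative input:

```python
def ntsc_channel_layout(lower_edge_hz):
    """Carrier positions within a 6 MHz NTSC channel, per the offsets above."""
    video = lower_edge_hz + 1.25e6        # video carrier
    chroma = video + 3.579545e6           # color subcarrier
    audio = video + 4.5e6                 # main audio carrier
    upper_edge = lower_edge_hz + 6.0e6
    # The audio carrier sits 250 kHz below the top of the channel.
    assert upper_edge - audio == 0.25e6
    return {"video": video, "chroma": chroma, "audio": audio}

layout = ntsc_channel_layout(54.0e6)      # VHF channel 2 occupies 54-60 MHz
# video carrier at 55.25 MHz, color subcarrier near 58.83 MHz,
# audio carrier at 59.75 MHz
```

The built-in assertion checks the internal consistency of the figures given above: a 1.25 MHz video-carrier offset plus the 4.5 MHz audio-carrier spacing leaves exactly 250 kHz to the 6 MHz channel edge.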