Apollo TV camera

Apollo Lunar Television Camera, as it was mounted on the side of the Apollo 11 Lunar Module when it telecast Armstrong's "One small step". Note that it is stowed upside-down, resting on its top, which was the camera's only flat surface.[1]

The Apollo TV Camera refers to several television cameras used in the Apollo program's space missions, and on the later Skylab and Apollo-Soyuz Test Project missions, in the late 1960s and 1970s. These cameras varied in design, with image quality improving significantly with each successive model. Two companies made these various camera systems: RCA and Westinghouse. Originally, these slow-scan television (SSTV) cameras, running at 10 frames per second (fps), produced only black-and-white pictures and first flew on the Apollo 7 mission in October 1968. A color camera, using a field-sequential color system, flew on the Apollo 10 mission in May 1969, and on every mission after that. The Color Camera ran at the North American standard 30 fps. The cameras all used image pickup tubes that were initially fragile; one was irreparably damaged during the live broadcast of the Apollo 12 mission's first moonwalk. Starting with the Apollo 15 mission, a more robust, damage-resistant camera was used on the lunar surface. All of these cameras required signal processing back on Earth to make the frame rate and color encoding compatible with analog broadcast television standards.

Starting with Apollo 7, a camera was carried on every Apollo Command Module (CM) except Apollo 9. For each lunar landing mission, a camera was also placed inside the Lunar Module (LM) Descent Stage's Modularized Equipment Stowage Assembly (MESA). Positioning the camera in the MESA made it possible to telecast the astronauts' first steps as they climbed down the LM's ladder at the start of a mission's first moonwalk/EVA. Afterwards, the camera would be detached from its mount in the MESA and either set on a tripod and carried away from the LM to show the EVA's progress, or mounted on a Lunar Roving Vehicle (LRV), where it could be remotely controlled from Mission Control on Earth.

RCA Command Module TV camera

Development

Apollo 7 slow-scan TV, transmitted by the RCA Command Module TV Camera.

NASA decided on initial specifications for TV on the Apollo Command Module (CM) in 1962.[2][Note 1] Both analog and digital transmission techniques were studied, but the early digital systems still required more bandwidth than an analog approach: 20 MHz for the digital system, compared to 500 kHz for the analog system.[2] The analog video standard chosen for the Block I CM, and thus for early Apollo missions, was a monochrome signal with 320 active scan lines, progressively scanned at 10 frames per second (fps). RCA was given the contract to manufacture such a camera.[2] It was understood at the time that motion fidelity from such a slow-scan television (SSTV) system would be less than that of standard commercial television, but this was deemed sufficient because astronauts would not be moving quickly in orbit or on the lunar surface.[5]
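As a rough, illustrative check (not taken from the cited sources), a 320-line, 10 fps signal is consistent with an allocation of roughly 500 kHz. The sketch below assumes about 200 TV lines of horizontal resolution, a 4:3 picture, and the usual rule of thumb that one cycle of video bandwidth carries two picture elements; all of these figures are assumptions, not NASA's actual calculation.

    # Rough video-bandwidth estimate for the Block I slow-scan standard.
    # Assumptions (not from the cited sources): ~200 TV lines of horizontal
    # resolution, a 4:3 aspect ratio, and two picture elements per cycle.

    active_lines = 320          # vertical scan lines per frame
    frame_rate = 10             # frames per second (progressive)
    aspect_ratio = 4 / 3
    horizontal_tv_lines = 200   # assumed limiting resolution (per picture height)

    pixels_per_line = horizontal_tv_lines * aspect_ratio   # ~267 picture elements
    pixel_rate = pixels_per_line * active_lines * frame_rate
    bandwidth_hz = pixel_rate / 2                           # two pixels per cycle

    print(f"approximate video bandwidth: {bandwidth_hz / 1e3:.0f} kHz")
    # -> roughly 430 kHz, comfortably inside the ~500 kHz analog allocation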

Video signal processing

Since the camera's scan rate was much lower than the approximately 30 fps of NTSC video,[Note 2] the television standard used in North America at the time, real-time scan conversion was needed to show its images on a regular TV set. NASA selected a scan converter manufactured by RCA to convert the black-and-white SSTV signals from the Apollo 7, 8, 9 and 11 missions.[6]

When the Apollo TV camera radioed its images, the ground stations received its raw unconverted SSTV signal and split it into two branches. One signal branch was sent unprocessed to a fourteen-track analog data tape recorder where it was recorded onto fourteen-inch diameter reels of one-inch-wide analog magnetic data tapes at 3.04 meters per second.[7] The other raw SSTV signal branch was sent to the RCA scan converter where it would be processed into an NTSC broadcast television signal.[7]

The conversion process started when the signal was sent to the RCA converter's high-quality 10-inch video monitor, where a conventional RCA TK-22 television camera, using the NTSC broadcast standard of 525 scanned lines interlaced at 30 fps, simply re-photographed its screen. The monitor had persistent phosphors that acted as a primitive framebuffer.[8] An analog disk recorder, based on the Ampex HS-100 model, was used to record the first field from the camera.[8] It then fed that field, and an appropriately time-delayed copy of it, to the NTSC Field Interlace Switch (encoder). The combined original and copied fields created the first full 525-line interlaced frame, and the signal was then sent to Houston.[8] This sequence was repeated five more times, until the system imaged the next SSTV frame.[8] The whole process was then repeated with each new frame downloaded from space in real time.[9] In this way, the chain produced the extra 20 frames per second needed to present flicker-free images to the world's television broadcasters.[6]
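In digital terms, the conversion amounts to holding each 10 fps slow-scan frame and reading it out as three 30 fps interlaced frames (six fields). The sketch below models only that rate conversion, with frames as plain 2-D arrays; the optical re-photography and the 525-line resampling of the real converter are deliberately ignored.

    # Minimal sketch of the 10 fps -> 30 fps rate conversion performed by the
    # RCA scan converter, modeled digitally.  Each slow-scan frame is held
    # (the persistent-phosphor monitor and disk recorder acting as a frame
    # buffer) and read out as three interlaced NTSC frames, i.e. six fields.
    # Frames are 2-D lists of pixel values; optical stages are omitted.

    def sstv_to_ntsc_fields(sstv_frames, repeats=3):
        """Yield (field_parity, field) pairs at six fields per slow-scan frame."""
        for frame in sstv_frames:
            odd_field = frame[0::2]    # odd-numbered scan lines
            even_field = frame[1::2]   # even-numbered scan lines
            for _ in range(repeats):   # each frame is reused for 3 NTSC frames
                yield ("odd", odd_field)
                yield ("even", even_field)

    # Example: one dummy 320-line frame becomes 6 fields (3 interlaced frames).
    dummy_frame = [[0] * 320 for _ in range(320)]
    fields = list(sstv_to_ntsc_fields([dummy_frame]))
    print(len(fields))   # -> 6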

This live conversion was crude compared to early 21st-century electronic digital conversion techniques. Image degradation was unavoidable with this system as the monitor and camera's optical limitations significantly lowered the original SSTV signal's contrast, brightness and resolution. The video seen on home television sets was further degraded by the very long and noisy analog transmission path.[10] The converted signal was sent by satellite from the receiving ground stations to Houston, Texas. Then the network pool feed was sent by microwave relay to New York, where it was broadcast live to the United States and the world.[11]

Operational history

Earth seen during the Apollo 8 live TV transmission on 23 December 1968 using the 100 mm telephoto lens on the RCA Command Module TV Camera.

Apollo 7 and Apollo 8 used an RCA slow-scan, black-and-white camera.[12] On Apollo 7, the camera could be fitted with either a wide-angle 160-degree lens or a telephoto lens with a 9-degree angle of view.[13] The camera did not have a viewfinder or a monitor, so astronauts needed help from Mission Control when aiming the camera in telephoto mode.[Note 3]

Specifications

The camera used interchangeable lenses, including a wide-angle lens with a 160 degree field-of-view, and a 100 mm telephoto lens.[16]

Camera[Note 4]

Camera name Command Module Television Camera, Block I
Supplier RCA
Sensor Vidicon Tube
Sensor size one-inch tube
Scan type progressive scan
Frame rate 10 fps
Frame size 320 scan lines
Resolution 200 lines
Color encoder monochrome
Aspect ratio 4:3
Bandwidth 500 kHz
Power Consumption 6.5 watts @ 28 volts DC
Weight 2,041 grams (72.0 oz)
Dimensions 210 mm × 95 mm × 76 mm (8.3 in × 3.7 in × 3.0 in) LxHxW
Lens mount type Bayonet

Westinghouse Apollo Lunar Television Camera

Development

Lunar Module training mockup, showing relative position of deployed camera

In October 1964, NASA awarded Westinghouse the contract for the Lunar TV Camera.[19] Stan Lebar, the Program Manager for the Apollo Lunar TV Camera, headed the team at Westinghouse that developed the camera that brought pictures from the Moon's surface.

The camera had to be designed to survive extreme temperature differences on the lunar surface, ranging from 121 °C (250 °F) in daylight to −157 °C (−251 °F) in the shade.[10] Another requirement was to keep power consumption to approximately 7 watts and to fit the signal into the narrow bandwidth of the LM's S-band antenna, which was much smaller and less powerful than the Service Module's antenna.[20][Note 5]

Operational history

The camera was first tested in space during the Apollo 9 mission in March 1969.[21] The camera was stowed in the LM and used the LM's communications systems, allowing their performance to be evaluated before lunar operations began.[22] This meant that the CM did not carry a video camera for this mission.[23] It was next used on Apollo 11, carried in the LM's descent stage in the quad 4 Modularized Equipment Stowage Assembly (MESA). It was from the MESA that it captured humanity's first step on another celestial body on 21 July 1969.[21] Apollo 11 was the only mission on which this camera was used on the lunar surface; however, it flew as a backup camera on Apollo 13 through Apollo 16, in case the color cameras suffered a fate similar to that of the Apollo 12 camera.[1]

Specifications

The camera measured 269 mm × 165 mm × 86 mm (10.6 in × 6.5 in × 3.4 in), weighed 3.29 kilograms (7.3 lb), and consumed 6.5 watts of power. Its bayonet lens mount allowed quick changes between the two interchangeable lenses used on Apollo 11: a wide-angle lens and a lunar day lens.[24][Note 6]

Camera

NASA Component No. SEB16101081-701[26]
Supplier Westinghouse[1]
Sensor Westinghouse WL30691 Secondary Electron Conduction Tube (SEC)[27]
Sensor size 1/2 inch tube[28]
Scan type progressive scan
Frame rate 10 fps at 320 lines, 0.625 fps at 1280 lines[29]
Frame size 320 scan lines (10 fps) and 1280 scan lines (0.625 fps)[29]
Resolution 200 lines (10 fps),[30] 500 lines (0.625 fps)[31]
Color encoder monochrome[1]
Aspect ratio 4:3[29]
Bandwidth 500 kHz[29]
Power Consumption 6.5 watts @ 24–31.5 volts DC[32]
Weight 3.29 kilograms (7.3 lb)[24]
Dimensions 269 mm × 165 mm × 86 mm (10.6 in × 6.5 in × 3.4 in) LxHxW[24]
Lens mount type Bayonet[24]

Lenses[Note 7]

Wide Angle Lens (Westinghouse Part No. 578R159-1): supplied by Fairchild; 80-degree field-of-view; no zoom; aperture F 4; light transmission T 4.8; weight 100 grams (3.5 oz); 33 mm (1.3 in) long; bayonet mount
100 mm Lens (Westinghouse Part No. 578R159-2): supplied by Fairchild; 9.3-degree field-of-view; no zoom; aperture F 4; light transmission T 60; weight 417 grams (14.7 oz); 126 mm (5.0 in) long; bayonet mount
Lunar Day Lens (Westinghouse Part No. 578R159-3): supplied by Fairchild; 35-degree field-of-view; no zoom; aperture F 4; light transmission T 60; weight 100 grams (3.5 oz); 39 mm (1.5 in) long; bayonet mount
Lunar Night Lens (Westinghouse Part No. 578R159-4): supplied by Fairchild; 35-degree field-of-view; no zoom; aperture F 1; light transmission T 1.15; weight 200 grams (7.1 oz); 53 mm (2.1 in) long; bayonet mount

Westinghouse Lunar Color Camera

Choosing a color process

Stan Lebar, the project manager for Westinghouse's Apollo Television Cameras, shows the Field-Sequential Color Camera on the left and the Monochrome Lunar Surface Camera on the right.

Color broadcast studio television cameras of the 1960s, such as the RCA TK-41, were large, heavy and power-hungry. They used three imaging tubes to generate red, green and blue (RGB) video signals, which were combined to produce a composite color picture. These cameras required complex optics to keep the tubes aligned. Since temperature variations and vibration could easily put a three-tube system out of alignment, a more robust system was needed for lunar surface operations.[34]

In the 1940s, CBS Laboratories invented an early color system in which a wheel containing six color filters rotated in front of a single video camera tube to generate the RGB signal.[35] Called a field-sequential color system, it used interlaced video, with sequentially alternating color video fields producing one complete video frame: the first field was red, the second blue, and the third green, matching the order of the color filters on the wheel.[35] This system was simpler, more reliable and more power-efficient than a standard three-tube color camera.[34]

The camera

Lebar and his Westinghouse team wanted to add color to their camera as early as 1967, and they knew that the CBS system would likely be the best system to study.[36] The Westinghouse Lunar Color Camera used a modified version of CBS's field-sequential color system.[35] A color wheel, with six filter segments, was placed behind the lens mount. It rotated at 9.99 revolutions per second, producing a scan rate of 59.94 fields per second, the same as NTSC video. Synchronization between the color wheel and pickup tube's scan rate was provided by a magnet on the wheel, which controlled the sync pulse generator that governed the tube's timing.
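The timing arithmetic can be sketched in a few lines. The snippet below simply illustrates the relationships described above (six filter segments, 9.99 revolutions per second, a red-blue-green field order); it is not flight or ground software, and the implied two-filters-per-color layout is an inference from those figures.

    # Field-sequential color timing for the Westinghouse Lunar Color Camera.
    # Six filter segments pass the lens each revolution, exposing one field
    # per segment; the field colors cycle red, blue, green.

    segments_per_rev = 6
    wheel_rev_per_sec = 9.99

    field_rate = segments_per_rev * wheel_rev_per_sec
    print(f"{field_rate:.2f} fields per second")   # -> 59.94, matching NTSC

    # Field color sequence as described above: red, blue, green, repeating.
    color_order = ["red", "blue", "green"]
    for field_index in range(6):
        print(field_index, color_order[field_index % 3])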

The Color Camera used the same SEC video imaging tube as the monochrome Lunar Camera flown on Apollo 9. The camera was larger, measuring 430 millimetres (17 in) long, including the new zoom lens. The zoom lens had a focal length variable from 25 mm to 150 mm, with a zoom ratio rated at 6:1. At its widest angle, it had a 43-degree field of view, while in its extreme telephoto mode, it had a 7-degree field of view. The aperture ranged from F4 to F44, with a T5 light transmittance rating.[27]

Color decoding & signal processing

Signal processing was needed at the Earth receiving ground stations to compensate for the Doppler effect caused by the spacecraft moving away from or towards the Earth. The Doppler effect would distort the color, so a system employing two videotape recorders (VTRs), with a tape-loop delay to compensate for the effect, was developed.[35] The cleaned signal was then transmitted to Houston as NTSC-compatible black and white.[Note 8]
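Conceptually, the two-VTR arrangement acted as a time-base corrector: fields arrived with slightly irregular, Doppler-shifted timing, sat briefly in a buffer (the tape loop), and were read back out on a steady house-sync clock. The sketch below models that idea only; the real system was entirely analog, and the names and timings here are illustrative assumptions.

    # Conceptual sketch of time-base correction: incoming fields carry slightly
    # irregular (Doppler-shifted) timestamps; a short buffer (the tape loop
    # between the two VTRs) lets them be re-clocked to a steady house sync.
    # This models the idea only; the real system was entirely analog.

    from collections import deque

    HOUSE_FIELD_PERIOD = 1 / 59.94          # steady output clock, in seconds

    def retime_fields(jittery_fields, start_time=0.0):
        """jittery_fields: list of (arrival_time, field_data) with uneven spacing.
        Returns the same fields re-stamped on a uniform house-sync clock."""
        buffer = deque(jittery_fields)       # the "tape loop" delay
        retimed = []
        t_out = start_time
        while buffer:
            _, field = buffer.popleft()      # discard the unstable arrival time
            retimed.append((t_out, field))   # emit on the steady clock
            t_out += HOUSE_FIELD_PERIOD
        return retimed

    # Example: three fields arriving with Doppler-induced timing jitter.
    incoming = [(0.0000, "f0"), (0.0169, "f1"), (0.0332, "f2")]
    for t, f in retime_fields(incoming):
        print(f"{t:.4f}s {f}")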

Unlike the CBS system, which required a special mechanical receiver on the TV set to decode the color, this signal was decoded in Houston's Mission Control Center, in real time. The decoder separately recorded each red, blue and green field onto an analog magnetic disk recorder. Acting as a framebuffer, the recorder then sent the coordinated color information to an encoder, which produced an NTSC color video signal that was released to the broadcast pool feed.[34] Once the color was decoded, scan conversion was not necessary, because the color camera ran at the same 60-fields-per-second interlace rate as the NTSC standard.[36]
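In digital terms, the decoder's job was to tag each incoming monochrome field with its wheel color and recombine every three consecutive fields into one color frame. Below is a minimal sketch under those assumptions, with fields modeled as arrays; the NTSC encoding step and the interlace timing are omitted.

    # Minimal sketch of field-sequential color decoding as done on the ground:
    # consecutive monochrome fields are tagged red, blue, green (the wheel's
    # order) and grouped into color frames.  The analog disk recorder acted
    # as the frame buffer; here a plain list plays that role.

    COLOR_ORDER = ("red", "blue", "green")

    def decode_field_sequential(fields):
        """fields: sequence of monochrome field arrays, in wheel order.
        Returns a list of dicts, one color frame per three fields."""
        frames = []
        for i in range(0, len(fields) - 2, 3):
            frame = {color: fields[i + k] for k, color in enumerate(COLOR_ORDER)}
            frames.append(frame)
        return frames

    # Example: six dummy fields -> two color frames.
    dummy_fields = [[n] * 4 for n in range(6)]
    for frame in decode_field_sequential(dummy_fields):
        print(frame)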

Operational history

The color camera was first used on the Apollo 10 mission. It used the Command Module's extra S-band channel and large S-band antenna to accommodate its larger bandwidth, and it was used in the Lunar Module only while the LM was docked to the Command Module. Unlike the earlier cameras, it came with a portable video monitor that could either be attached directly to the camera or float separately. Combined with the new zoom lens, this gave the astronauts better precision in framing their shots.[35]

Apollo 12 was the first mission to use the color camera on the lunar surface. About 42 minutes into the telecast of the first EVA, astronaut Alan Bean inadvertently pointed the camera at the Sun while preparing to mount it on the tripod. The Sun's extreme brightness burned out the video pickup tube, rendering the camera useless. When the camera was returned to Earth, it was shipped to Westinghouse, where engineers were able to get an image from the undamaged section of the tube.[38] Procedures were rewritten to prevent such damage in the future, including the addition of a lens cap to protect the tube when the camera was repositioned off the MESA.

Apollo 14 EVA frame demonstrating the "blooming" issue with the Color Camera.

The color camera successfully covered lunar operations during the Apollo 14 mission in 1971, but image quality issues appeared: the camera's automatic gain control (AGC) had trouble setting the proper exposure in high-contrast lighting, causing the white spacesuits to be overexposed, or "bloom". The camera also lacked a gamma correction circuit, so the image's mid-tones lost detail.[39]
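The role of the missing gamma correction can be illustrated with the usual power-law display model. The sketch below uses illustrative values (a display exponent of 2.2), not measurements of the Westinghouse camera, to show how uncorrected mid-tones come out compressed on a home CRT.

    # Illustration of why a missing gamma-correction circuit crushes mid-tones.
    # A CRT display is roughly a power law (exponent ~2.2); cameras normally
    # pre-correct with the inverse exponent.  Values are illustrative only.

    DISPLAY_GAMMA = 2.2

    def displayed(scene_luminance, camera_gamma_correction=True):
        if camera_gamma_correction:
            signal = scene_luminance ** (1 / DISPLAY_GAMMA)  # camera pre-corrects
        else:
            signal = scene_luminance                         # no gamma circuit
        return signal ** DISPLAY_GAMMA                       # what the home CRT shows

    for scene in (0.1, 0.3, 0.5, 0.7):      # normalized scene brightness
        print(f"scene {scene:.1f}: corrected {displayed(scene, True):.2f}, "
              f"uncorrected {displayed(scene, False):.2f}")
    # Uncorrected mid-tones (e.g. 0.5 -> 0.22) come out much darker, losing detail.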

After Apollo 14, it was only used in the Command Module, as the new RCA-built camera replaced it for lunar surface operations. The Westinghouse Color Camera continued to be used throughout the 1970s on all three Skylab missions and the Apollo–Soyuz Test Project.

The 1969–1970 Emmy Awards for Outstanding Achievement in Technical/Engineering Development were awarded to NASA for the conceptual aspects of the color Apollo television camera and to Westinghouse Electric Corporation for the development of the camera.[40]

Specifications

Camera

NASA Component No. SEB16101081-701[26]
Supplier Westinghouse
Sensor Westinghouse WL30691 Secondary Electron Conduction Tube (SEC)[41]
Resolution more than 200 TV lines (SEC sensor: 350 TV lines in the vertical dimension)
Field rate 59.94 fields per second, monochrome (color filters alternated between fields)[42]
Frame rate 29.97 frames per second[41]
Frame size 525 lines
Color encoder Field-sequential color system[43]
Bandwidth 2 MHz to 3 MHz (Unified S-band bandwidth restrictions)
Power Consumption 17.5 watts @ 28 volts DC[44]
Weight 5 kg (11 lb)[43][44]
Dimensions 287 mm × 170 mm × 115 mm (11.3 in × 6.7 in × 4.5 in) LxHxW, with handle folded[45]
Lens mount type C mount[46]

Lens

NASA Component No. SEB16101081-703[26]
Supplier Angénieux[45]
Focal length 25 mm to 150 mm[47]
Zoom ratio 6:1[47]
Aperture F4 to F44[47]
Light transmission T5[48]
Weight 590 g (21 oz)[44]
Dimensions 145 mm (5.7 in) long, 58.9 mm (2.32 in) lens diameter [45]
Lens mount type C mount ANSI 1000-32NS-2A Thread[46]

RCA J-Series Ground-Commanded Television Assembly (GCTA)

Due to the Apollo 12 camera failure, a new contract was awarded to the RCA Astro division in Hightstown, New Jersey. The RCA system used a new, more sensitive and durable TV camera tube: the newly developed silicon-intensifier target (SIT) pickup tube. The design team was headed by Robert G. Horner. The improvement in image quality was obvious to the public, with the RCA camera showing better tonal detail in the mid-range and none of the blooming that was apparent on the previous missions.

The system was composed of the Color Television Camera (CTV) and the Television Control Unit (TCU). These were connected to the Lunar Communications Relay Unit (LCRU) when mounted on the Lunar Roving Vehicle (LRV). Like the Westinghouse Color Camera, it used the field-sequential color system, and used the same ground-station signal processing and color decoding techniques to produce a broadcast NTSC color video signal.

On Apollo 15 the camera produced live images from the LM's MESA, just as on the previous missions. It was then repositioned from the MESA onto a tripod, from which it televised the deployment of the LRV. Once the LRV was fully deployed, the camera was mounted on it and controlled by commands from the ground to tilt, pan, and zoom in and out. This was the last mission to have live video of the crew's first steps via the MESA, as on the following flights the camera was stowed with the LRV.

Usage

Cameras used on each mission (CM = Command Module, LM = Lunar Module).

Notes

  1. NASA decided to go with a new communications system for the Apollo program that routed all communications signals simultaneously through the Unified S-Band (USB) system. All communication between the spacecraft and the ground was handled by the USB, transmitting at 2287.5 MHz for the CM and 2282.5 MHz for the LM. It had a 3 MHz allotment for all communications, divided into seven components: voice, telemetry, television, biomedical data, ranging, emergency voice and emergency key.[3] The video signal had to be compressed into such a narrow bandwidth because of the way the signals were allocated: after allocating 1.25 MHz to voice and 1.024 MHz to telemetry, only about 700 kHz was available for all other communication signals (a worked example of this budget appears after these notes). In order to produce a clean frequency-modulated (FM) transmission for video from the LM on the lunar surface, the ranging signal was omitted. The Block II CM actually had a second 3 MHz USB channel that could have allowed better resolution and scan rates, but that was not supported until the Apollo 10 mission in 1969.[4]
  2. For the purposes of clarity and simplicity in this article, 60 fields and 30 frames per second are used. NTSC actually runs at 59.94 fields per second, and 29.97 frames per second. Two interlaced fields create one complete video frame.
  3. The camera's lack of either a viewfinder or a monitor became apparent when the Apollo 8 crew tried to frame the Earth during their second broadcast from space. The Earth bounced around, often out of view, and Mission Control had to direct the astronauts to move the camera to bring it back into frame.[14] Apollo 8 astronaut William Anders said during the second telecast, "I hope the next camera has a sight on it," referring to the RCA camera's lack of a sighting device.[15]
  4. All specifications for the RCA Command Module TV Camera are found in Coan's Apollo Experience Report — Television Systems, except its weight, which is found in Goodwin's Apollo 7: The Mission Reports.[17][18]
  5. Since digital video compression techniques were not practical at the time (though studied by NASA as a possibility in 1965 in document NASA-CR-65508), the signal was "compressed" by simple analog means: omitting color, reducing the image resolution from the NTSC standard 525 lines to 320 lines, and reducing the frame rate from 30 fps to 10 fps. In this way, the Lunar TV camera shrank the video signal by about 95 percent compared to a standard NTSC one. After Apollo 11, a larger S-band antenna was deployed by astronauts during their first EVA, eventually allowing better video from the lunar surface.[20]
  6. There were actually four lenses developed for this camera, including the lunar day lens and the wide-angle lens. The other two lenses were the lunar night lens and a 100 mm telephoto lens.[25]
  7. All specifications for the Westinghouse Lunar Surface TV Camera are found in Lebar's Apollo Lunar Television Camera Operations Manual, pages 2-24 and A-11.[33]
  8. The unprocessed signal from the Moon, with its fluctuating TV synchronization signals, was sent to the first VTR and recorded on 2-inch tape. The tape was not spooled on that machine but instead ran on to the second VTR, which played it back using the steady house sync signal, fixing the synchronization issues caused by the Doppler effect (since the mid-1970s this time-base correction has been accomplished by digital methods).[37]
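As a worked check of the Unified S-band channel budget described in Note 1, the sketch below simply carries out the subtraction using the figures quoted above; the rounding to "about 700 kHz" is the source's.

    # Worked Unified S-band channel budget from Note 1 (illustrative arithmetic).
    total_usb_mhz = 3.0
    voice_mhz = 1.25
    telemetry_mhz = 1.024

    remaining_mhz = total_usb_mhz - voice_mhz - telemetry_mhz
    print(f"remaining for video and other services: {remaining_mhz * 1000:.0f} kHz")
    # -> 726 kHz, i.e. the "about 700 kHz" cited above, which is why the
    #    lunar-surface video was limited to a 500 kHz slow-scan signal.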

