The Video Horizon


"One of the things I've been using the camcorders for is to record images of lightning storms from space. While your eyes are much more sensitive than the cameras, the camcorders do a surprisingly good job with nighttime shots." —While living aboard the International Space Station, NASA Astronaut and ISS Science Officer Ed Lu wrote about his experiences using video in space.
All Photos courtesy of NASA

Look up in the sky. It's a spacecraft; it's an astronaut — it's a videographer? It's all three. As NASA continues to explore space and build the International Space Station (ISS), video is there documenting it all. And NASA Television is helping by allowing us all to watch some of our tax dollars at work 24 hours a day, seven days a week.

NASA TV was established in the early 1980s to provide the agency's Space Shuttle Program managers and engineers with realtime video of space shuttle operations and liftoff-to-landing coverage of missions. More recently, programming is also provided to the media and U.S. television networks. And while NASA TV is not a public or commercial TV station, it provides informational and educational programming on space exploration, space science, earth science, and aeronautic research that rivals that of any traditional network.

As is the case with most video operations today, the new digital and high-definition formats are making inroads while traditional analog video and newer DV/DVCAM equipment continues to be used as NASA TV pursues its mission of documentation and education.

“NASA Television exists to provide the agency with coverage of missions from human space flight to expendable launch vehicles,” says NASA TV Executive Producer Fred Brown. His team provides the news media with feeds (B-roll, soundbites, etc.), coverage of NASA news conferences and press briefings, 3D animations to support and help explain missions, and produced videos of various programs such as the Hubble Space Telescope and Mars Rovers. NASA TV also runs educational programming produced by NASA's Education Division for schools across the country.

NASA TV is carried on numerous television cable systems around the country and on the major satellite systems. Programming is also carried on cable TV in Canada and in parts of Europe and Asia. For more than two years, it has also been available on the Internet as streaming video through NASA TV's website. In addition to all the internal uses at the agency, NASA TV distributes three digital video channels by satellite: one for the general public, one geared toward education, and one for the media. The channels are carried by MPEG-2 digital C-band signal to two satellites (AMC 6-17C and AMC 7-18C). (If the programs are not carried by the local cable company, a digital video broadcast [DVB]-compliant integrated receiver decoder [IRD] and a satellite dish are needed for reception.)

The Broadcast Center for NASA TV is located in Washington, D.C., at NASA Headquarters (HQ). All NASA Centers (Johnson, Kennedy, Goddard, Ames, etc.) have television units that are producing live events and various video products for television. There are 10 NASA Centers, and each has its own television infrastructure that supports missions, programs, and activities at its location. All centers also provide material for use on NASA TV. The number of video specialists at each location varies, but there are more than 100 throughout the agency.

Television Master Control at Goddard Space Flight Center in Maryland.

“Content originates at the centers and is routed to NASA headquarters via the NASA wide-area network. At headquarters, the content can either be recorded for playback later or routed in realtime to be multiplexed with other content for other channels. The multiplexed signal is then routed to Goddard Space Flight Center [GSFC], where it is modulated and uplinked to a commercial satellite [currently AMC 6-17C],” says Rodney Grubbs, chair of the NASA DTV Working Group and based at the Marshall Space Flight Center in Huntsville, Ala.

The system starts with the output from a center's video production switcher. Some centers still have legacy NTSC switching; some can produce entirely in SD SDI; and now, two centers use HD SDI. Each center has at least one Harmonic 4:2:2 SD MPEG-2 encoder and another low-latency SD MPEG-2 encoder (also Harmonic).

“We typically encode at about 7.5Mbps 4:2:2 long GOP,” Grubbs says. “The video stream can be recorded to tape or routed to the multiplexer. Standard configuration is to have four channels [all MPEG-2 4:2:0 long GOP] multiplexed together.”
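The 4:2:2 and 4:2:0 labels Grubbs uses describe how much chroma information each frame carries relative to luma. As a rough illustration only (this is not NASA's encoder internals; the NTSC-style 720x486 frame size is an assumption), the per-frame sample counts work out like this:

```python
# Illustrative chroma-subsampling arithmetic. The frame size and the
# subsampling schemes are assumptions for the sketch, not NASA TV specs.

def luma_chroma_samples(width, height, subsampling):
    """Return (luma, chroma) sample counts per frame for a subsampling scheme."""
    luma = width * height
    # Fraction of luma resolution kept per chroma plane
    factors = {"4:4:4": 1.0, "4:2:2": 0.5, "4:2:0": 0.25}
    chroma = int(2 * luma * factors[subsampling])  # two chroma planes (Cb, Cr)
    return luma, chroma

# A 720x486 frame in the two formats the article mentions:
for fmt in ("4:2:2", "4:2:0"):
    y, c = luma_chroma_samples(720, 486, fmt)
    print(fmt, "luma:", y, "chroma:", c, "total:", y + c)
```

The point of the comparison: dropping from 4:2:2 to 4:2:0 removes half the chroma samples, which is why the multiplexed distribution channels can run 4:2:0 while contribution encodes stay 4:2:2.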

Channel 101 is the NASA Public Services Channel, which is the channel most people know as NASA TV; that is what's placed on cable systems and the digital satellite providers. Channel 102 is the Education Channel typically programmed with content from a playback server at headquarters. Channel 103 is the Media Services Channel, and it sometimes mirrors what is on the Public Channel, but it is also used for raw feeds to the media of live shots and video files. Channel 104 is the Space Operations Channel, which is used internally for NASA space flight operations. “Often, content from this channel is also simulcast on the media channel and public channel,” Grubbs says.

As expected, when it comes to technology, NASA and NASA TV can rival any broadcast operation in the world. In August, the STS-118 Space Shuttle Endeavour mission put even more new technology to the test. “We enabled multicast routing within the NASA wide-area network, so instead of routing video from a center just to the headquarters in D.C., we can multicast it to other NASA centers around the U.S.A.,” Grubbs says. “The centers can decode it with the same IRDs they use for satellite reception. This gives us a lot of capability within the agency for contingencies and for sharing video in realtime. We also fully tested an HD channel. The HD stream was encoded at the Kennedy Space Center [12Mbps 4:2:0 MPEG-2 long GOP] with a Harmonic HD encoder. We slightly reduced the bit rate on channel 101, reduced 102 to just a graphic suggesting folks tune to 101, and added a fifth channel, 105, which simulcasted the launch in HDTV.”
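The channel juggling for STS-118 is, at bottom, a bandwidth budget against the roughly 36Mbps transponder ceiling Grubbs cites elsewhere. A hedged sketch of that arithmetic (the 7.5Mbps and 12Mbps figures are from the article; the other per-channel rates are illustrative guesses):

```python
# Transponder budget sketch. Only the 36Mbps ceiling, the ~7.5Mbps SD rate,
# and the 12Mbps HD rate come from the article; trimmed rates are guesses.

TRANSPONDER_MBPS = 36.0

def fits(channels):
    """Return (total Mbps, whether the lineup fits the transponder)."""
    total = sum(channels.values())
    return total, total <= TRANSPONDER_MBPS

# Standard four-channel SD multiplex
standard = {"101 Public": 7.5, "102 Education": 7.5,
            "103 Media": 7.5, "104 Space Ops": 7.5}
print(fits(standard))

# STS-118 test: trim 101, drop 102 to a static graphic, add 105 in HD
sts118 = {"101 Public": 6.5, "102 Graphic": 1.0, "103 Media": 7.5,
          "104 Space Ops": 7.5, "105 HD": 12.0}
print(fits(sts118))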

Astronaut Barbara R. Morgan, mission specialist and teacher aboard the Space Shuttle Endeavour, uses a flight model Sony DSR-PD100 DVCAM during a farewell ceremony in the Zvezda Service Module of the International Space Station before returning to Earth.

Grubbs says he expects NASA TV to continue to provide high-definition video feeds via satellite in the future. And other enhancements are also on the horizon. “Looking into the future, we're considering changing the way we modulate the aggregate multiplexed signal so we can get additional bandwidth on the C-band transponder. Right now, we're limited to about 36Mbps, and by making the change, we'd be able to add the HD channel and not reduce any of the other standard-definition channels,” he says. “Later in the decade, we're considering transitioning from C-band to Ku and from DVB to DVB-S2 with a mix of MPEG-2 and MPEG-4. That way, we could add another HD channel and get up to 70Mbps on a single transponder.”

Such a complex collection of media outlets is not without its challenges. In addition to the transition to digital and HDTV, one design gotcha with a network this large with this architecture is signal latency, according to Grubbs. “Interactive events such as internal NASA Q&A sessions with the administrator or press conferences or live shots are adversely affected by the latency of this architecture,” Grubbs says. “Doing an encode at a center, decoding at headquarters, re-encoding and multiplexing, then uplink at GSFC [Goddard Space Flight Center], with all the IP packetizing and depacketizing for routing adds up to a pretty hefty latency. To lower the latency, we put in place some encoders optimized for latency performance, encode 4:2:0, and then a ‘pass-through' at HQ going straight into the multiplexer [with no decode/encode cycle]. This cuts the latency in half but still leaves it at slightly more than two seconds.

“The shuttle program has its own leased satellite transponder for program support. By digitizing [making the signal digital] all of the shuttle services, we'll be able to use some of the bandwidth for a direct low-latency MPEG-2 stream for live shots [Center X uplinks direct to satellite, and Station Y receives it].” Grubbs says switching all Space Shuttle satellite services to digital video feeds should be finished by the time you read this. To read about NASA TV Master Control, go to
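The latency problem Grubbs describes is additive: every encode, decode, and packetizing step in the chain contributes its share. The per-stage figures below are hypothetical (only the roughly-half improvement, to just over two seconds, comes from the article), but they show the shape of the arithmetic:

```python
# Hypothetical per-stage latencies in seconds; the real figures aren't
# published. Only the ~2x improvement is sourced from the article.

def total_latency(stages):
    """Sum the latency contributions of each stage in the chain."""
    return sum(stages.values())

# Original chain: encode at a center, decode + re-encode at HQ, mux, uplink
full_chain = {"center encode": 1.2, "HQ decode": 0.6, "HQ re-encode": 1.2,
              "mux + IP transport": 0.6, "GSFC uplink": 0.4}

# Optimized chain: low-latency encoder, pass-through at HQ straight to the mux
pass_through = {"low-latency center encode": 0.8,
                "HQ pass-through to mux": 0.2,
                "mux + IP transport": 0.6, "GSFC uplink": 0.4}

print(round(total_latency(full_chain), 1))
print(round(total_latency(pass_through), 1))
```

Eliminating the decode/re-encode cycle at HQ is where most of the savings come from, which matches Grubbs' description of the pass-through configuration.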

With such a comprehensive distribution network, NASA's production approach is also multifaceted. More than 100 videographers at the 10 NASA Centers gather footage for both internal and public uses such as NASA TV. The production pipelines vary with the content and the location, but there is some commonality.

Throughout NASA, handheld HD cameras are predominantly Panasonic Varicams and, more recently, the Panasonic AJ-HDX900. The handheld SD cameras are a mix of Panasonic, Sony, and Thomson Grass Valley DV-level cameras. For flight, NASA missions use Sony DSR-PD100 DVCAM camcorders.

Johnson Space Center in Houston uses a Euphonix System 5b to feed audio to NASA TV.

“For HD, NASA is a 720p shop, so the industry standard for that format has been DVCPRO. The new AVC Intra format by Panasonic supports native 720p, and Grass Valley's new Infinity camera system can record native 720p as well,” Grubbs says. “Sony is coming out with a new variant of XDCAM, which claims to be able to record native 720p, but their format is MPEG-2 long-GOP-based, and the NASA standard for HD acquisition currently requires no GOP, so we're going to have to test that to see if we need to mod the standard. For standard def, our videographers have lots of options, but HDV is discouraged and not used for anything meant to be distributed for production or broadcast.”

New gear and formats are constantly being used and tested. Grubbs indicates his testing of Panasonic's AVC Intra codec is encouraging. “[AVC Intra] is the best native recording camcorder format we've ever had in the lab. It looks very promising, but we just got our first camera in last week and haven't taken it out into the real world yet,” he says.

The bulk of the programming is live. For postproduced shows, most NASA editors use the suite of tools they have on their computers for live sweetening (Apple Final Cut Pro or Avids are most common). Additionally, most NASA Centers have two to four video and audio postproduction workstations attached to a SAN fiber-channel network. “I think most of the Final Cut Pro folks are also using Apple's Xsan for storage,” Grubbs says. NASA videographers don't do 5.1 mixes — instead, they capture with discrete dual mono audio channels so that producers can do their own 5.1 mix later.

When it comes to live feeds for NASA TV, the video and audio is distributed from the source (usually a studio or an auditorium) through a technical control room, where the video is routed to a production switcher and the audio is routed to the audio control room. Both feeds are then combined, after inserting whatever audio delay is necessary depending on the source of the audio, back in the technical control room. The combined audio/video program is then routed to the encoder for final distribution across the NASA WAN or through the broadcast chain.

Contributing Writer and Reviewer Tom Patrick McAuliffe is a journalist and media creator who has been writing for DCP and related Penton publications for more than 12 years.

Visual Science Storytelling

Wade Sisler is the executive television producer at Goddard Space Flight Center in Maryland, and he has also worked at NASA HQ and the Ames Research Center in California. Trained in journalism at Baylor University and Scientific and Technical Still Photography at the Rochester Institute of Technology, he began working at NASA Ames in the mid ‘80s while finishing up his degree at RIT, and he says he never looked back. These days, Sisler is heavily involved in what he calls “Visual Science Storytelling.”

Sisler and other NASA Center employees around the nation use the discipline of television and video graphics to tell the story of projects, research, and missions created and managed by their particular center. The video, animation, and multimedia products they produce are for a variety of audiences, both public and internal to the agency, and some of this content is also broadcast on NASA TV.

DCP: What brought you to NASA, and can you tell me a little about the background of video production at Goddard?

Sisler: For me, NASA was a great place because every time you turned over a rock, a mind-blowing story and often wonderful visual opportunity would crawl out. I liked that there were many new challenges and that many of the things I was to document had never been captured before. By the late ‘80s, I was dabbling in emerging multimedia, digital photography, and video, and while I hated the quality of the video image, I loved being able to go deeper into a story. Eventually, painfully, I made the shift to video and television just as the tools became affordable to small groups like the one we had at Ames. We felt lucky to be shooting on 3/4in. tape and were thrilled to eventually upgrade to Beta and then BetacamSP.

I transferred to NASA HQ in 1994 and then came to Goddard in 1997. At HQ, I worked on the IMAX films Mission to Mir and [Space Station 3D], and I also worked on projects with NASA TV.

How is Goddard different when it comes to the kinds of things you document with video?

When I came to Goddard, I found my true niche in scientific storytelling. Working here is a curious person's dream come true. The 9,000-plus scientists and engineers are literally changing the way humans see the universe and the world we live in. NASA science provides insights into some of the most pressing problems and biggest questions of the day. Communicating the results of our missions is now woven into the DNA of our agency, and I think our team feels lucky to be working with an organization so passionate about sharing their story with the widest possible audience.

What are the main aspects of what you do?

There are really four main areas of challenge:

  1. Visual science storytelling — translating complex stories with pictures, sound, and video

  2. Creating or capturing absolutely compelling core content

  3. Making that content widely available in multiple formats and multiple distribution channels

  4. Doing all of the above very efficiently.

You've seen a lot of changes in the visual tools you use.

Sure. These days, the quality of the image is not an issue, of course. We now have end-to-end HD and shoot on Panasonic P2 and Varicam. A great deal of our work these days involves working with and directing animation and data visualization. Most of our important images are no longer shot with cameras, but are captured by satellites or are rendered in our visualizers' minds.

Interesting. And how do you share that content?

The biggest challenge we see is the fragmentation of the production/media world. We consider our customers to be a continuous spectrum of traditional print and broadcast media, web media portals, educators and students, museums, scientists, stakeholders — and, of course, the general public. The user community is fragmenting as the new media world carves up distribution channels into narrower and narrower slices. This fragmentation means that there are many more users creating many more products with our core content.

Can you describe the process of distribution?

Let's say we're producing material to illustrate the NASA mission objectives of a new kind of climate-observing satellite. Our work plan would usually call for creation of an animation illustrating the satellite at work. We would show it in action and illustrate how it works. We might also create contextual animation to help folks visualize the science behind the mission. Our producer will make sure to capture a few signature sequences that define a project.

These days, momentum has shifted to creating two- to three-minute reporter packages that can run on NASA TV and web portals or be distributed via iTunes. The second part of our strategy is actively producing resource collections, which can be obtained via our fulfillment house or, increasingly, directly via online download.

Have HD and Internet streaming made inroads at Goddard?

HD has more than made inroads. Everything is HD. Even satellites are beginning to deliver HD. We've been shooting almost all HD for the past two years. It has been a little reach, but because we have such a high rate of reusing previous footage, it's been worth it. When the Solar Dynamics Observatory is launched next year, it will be sending down an HD image of the sun every second. Here comes the sun! We'll see all of the incoming space weather as never before. As far as web streaming goes, the new NASA portal will stream content and allow users to pull it down on demand. To get the uncompressed satellite footage and animations, producers will still need to go to the home centers like Goddard and JPL.

Can you tell me anything about Goddard's work with stereo video?

We are working with stereo video, but not with traditional cameras, for the most part. We do some work with the stereo pair of solar observatories. They produce essentially right- and left-eye images, and we conducted our first press conference using the 3D images last April.

When NASA TV wants/needs programming from Goddard, is the footage sent via the WAN or via tape or hard drive, or another way?

We can send it via the WAN or directly via fiber. Goddard, like HQ and some of the other centers, is very connected to the various backbones. We conduct interviews with the networks and cable news outlets directly via the Bell Atlantic AVOC [a dedicated satellite two-way feed].

What can you tell me about the Scientific Visualization Studio at NASA's Goddard Space Flight Center?

The Scientific Visualization Studio [SVS] turns raw satellite data into images. But this is much more than translating numbers to pixels. Frequently, these folks combine data from many satellites and sensors into a single comprehensive story. The mission of the SVS is to facilitate scientific inquiry and outreach within NASA programs through visualization. All the visualizations created by the SVS [currently totaling more than 2,700] are accessible to everyone through the website. More recent animations are provided as MPEG-1s and MPEG-2s. Some animations are available in high definition as well as standard NTSC format. Where possible, the original digital images used to make these animations have also been made accessible. Lastly, high- and low-resolution stills, created from the visualizations, are included, with previews for selective downloading.

Eric de Jong at NASA's Jet Propulsion Lab is probably the unofficial leader on 3D within the agency. He has done quite a bit of 3D camera data viz work. Visit him at

If there was one thing you'd like to share about digital multimedia content creation at Goddard, what would it be?

Our goal, and our mini slogan: One message, in many formats, through many channels, for many users!


The View from the International Space Station

DCP: Why are DVCAM cameras used on the Shuttle and International Space Station (ISS) versus HD cameras?

Rodney Grubbs, chair of the NASA DTV Working Group and based at the Marshall Space Flight Center in Huntsville, Ala.: HD cameras are susceptible to radiation damage. Presumably, this is due in part to the density of HDTV CCDs. As for the act of shooting in space: on Earth, we're used to holding cameras on our shoulders or propping them with our hands and elbows and looking through eyepieces pushed up against our eyes. In weightless space, astronauts find it awkward to hold a camera's eyepiece up to their eye, so they prefer having a large LCD to view what they're shooting. Focus and light balancing are challenging because HDTV is far less forgiving of errors than analog cameras were.

What does the future of HD and video at NASA look like?

I expect we'll slowly roll out a NASA HD channel starting this fall. Most centers are still unable to produce a live HD program, and NASA HQ, where prerecorded video content is programmed for playback on the various NASA TV channels, is not able to program a NASA HD channel full time. As for technology, solid-state recording offers significant improvements in a variety of applications. For example, currently we have to fly videotapes to and from the ISS, so we have to wait for another shuttle flight to bring back videotape that was shot on the ISS. Carting videotape to and from the ISS, maintaining VTRs on orbit, and keeping up with tapes would all go away if we could fly solid-state media and leave it on orbit. Files could be downlinked via existing laptops so video could be shared with the public more efficiently or reviewed by NASA program managers quickly. What is not known is whether the current solid-state media used by camera manufacturers can itself survive the radiation environment. Densities on small chips could prove to be a problem, just as they have been for HD CCDs. We plan on testing the latest high-capacity hard-disk media soon.
— T.P.M.

Online Resources

NASA TV HQ Equipment List

  • Nvision 5600 SDI 128x128 routing switcher with frames for AES audio, SDI video, and four master control modules
  • Two Evertz 80884 AD caption encoders
  • 14 Wegener 4600 SDI IRDs with ASI inputs from WAN and SDI outputs to router, and composite outputs to LCD monitors
  • Six Wegener iPumps
  • 50 Harmonic MV50 encoders (40 for transmission and 10 for backup)
  • Two Harmonic MN20 multiplexers (one online and one backup)
  • Two Harmonic BNG (broadcast network gateways) attached to WAN Cisco switch
  • One Ventura frame with SDI-to-ASI cards for fiber transmission from Master Control to Goddard TV for uplink
  • Marshall LCD monitors for preview, program, and viewing of all inbound and outbound signals
  • Tektronix 601 signal analysis
  • Multiple LCDs with Ethernet connections to hardware mounted in another room to afford control while minimizing equipment footprints
  • Two DVCPRO50 (50Mbps) VTRs
  • Four-channel “Play-to Air” Sundance automation system integrated to Leitch VR440 broadcast server with 4.6TB RAID array
  • Three Sundance “Prep Stations” for content preparation
  • One Sundance “Sat Recorder” for timed records.

NASA's Multichannel Digital Television network (MCDTV) takes video content originated at places such as the Kennedy Space Center in Florida and JPL in California and routes it to NASA HQ via the NASA TV wide-area network (WAN) and NASA Integrated Services Network (NISN), and then up to a satellite for distribution to cable head ends, schools, and other receivers.

The first live HD broadcast from space happened last year and is seen here being displayed on the NASDAQ Sony Jumbotron in New York's Times Square.

Troy Cryder rolls tape to a bank of Panasonic DVCPro HD AK-HC1400 VTRs documenting activities leading up to the launch of STS-118 and the Space Shuttle Endeavour. This was the first launch captured live in high definition.

The recent launch of Space Shuttle Endeavour on STS-118 was the first ever shot live in HD. The box cameras used for launch coverage are a mix of Panasonic AK-HC900 and AK-HC1500 720p60 HD cameras. Public Affairs also uses several camcorders at various locations prior to and during launch. They are a mix of Panasonic AJ-HDC27 Varicam and AJ-HDX900 camcorders.

NASA Videographer Rodney Grubbs gets ready for the launch of STS-118 with long-lens video and photo cameras that track and capture shuttle liftoff and follow the spacecraft for miles after launch from the Kennedy Space Center in Florida.

In support of the recent Shuttle mission STS-118, astronauts underwent weightless training at NASA's Neutral Buoyancy Lab at Johnson Space Center in Texas. Here, an NBL videographer uses a Hydroflex underwater HD camera to capture high-definition video in 1080i. A version of the captured footage may later be shown on NASA TV.

Like any other television network, NASA TV has hosts for programs that help explain various missions and technology. Here are some from programs produced at the Goddard Space Flight Center in Maryland.

Goddard's video department, like those at the other NASA centers around the country, makes great use of green- and bluescreens so show hosts can be virtually anywhere.

NASA Mission Control at the Johnson Space Center is DTV- and HD-ready with new 16x9 monitors and a main display.

As recently as five to six years ago, astronauts on board the Space Shuttle were shooting with Hi-8 cameras. Here, a mission specialist on STS-83 uses a Canon camcorder.

An astronaut onboard the International Space Station anchors her DVCAM camera as she gets ready to document a weightlessness experiment.

Astronauts on Shuttle mission STS-105 prepare for a daily video broadcast.