By Zack Bacon
Chief Broadcast Engineer
12th Man Productions
COLLEGE STATION, Texas—12th Man Productions is the broadcast and post facility for Texas A&M University, one of the major players in college athletics in the U.S. Across our College Station campus, we have seven major venues, which between them host as many as 110 events a year.
For those events, we provide the infrastructure for broadcasters like ESPN, and we create the fan experience through big screen shows at each venue. Those big screen shows include sponsorship fulfillment within the venues. We also provide separate video feeds to each venue, for applications like official replays, concourse feeds, talent monitors and team recordings.
All in all, it is a complex environment. When the main venue, Kyle Field (A&M’s football stadium), was rebuilt back in 2014, 12th Man Productions took the opportunity to build a central video facility, including multiple studio spaces and four control rooms, which could be used by any venue.
Because the other venues are some distance away (the farthest is no more than 10 kilometers), fiber was installed. At that time, the fibers were used as CWDM single-mode circuits, each carrying no more than eight HD video signals in each direction, which limited the options we could provide and required the connectivity to be reconfigured for each event, a time-consuming process and a potential point of failure.
As part of our rolling program of refurbishment, we had been looking at moving toward IP connectivity, particularly with the standardization and growing popularity of SMPTE ST 2110. We decided that we could greatly increase our flexibility and resilience by moving to a networked approach to connect all the venues.
We already had fiber in place, so we could migrate quite easily by implementing an enterprise-class Cisco Nexus Media Fabric. That means all of the venues and the central production facility are now connected by a routed local area network.
The challenge then was to integrate the legacy HD-SDI elements into the IP connectivity. We evaluated the options available, and we identified the Selenio Network Processor (SNP) from Imagine Communications. Each 1RU FPGA-powered device is capable of 32 channels of HD processing, which can be reconfigured just by switching software settings.
That meant that we could eliminate a lot of equipment. As well as being gateways into and out of the SMPTE ST 2110 world, SNPs can replace everything from DAs to frame syncs to up/down/cross converters. All of that processing power is networked. And because the SNP is only 1RU high for 32 channels, we can put them anywhere we need them without running out of rack space.
So instead of carrying a few signals, each fiber can now carry as many as 32 bidirectional HD signals with no impact on latency and no loss of quality, because we are running uncompressed SMPTE ST 2110. Capacity between venues is no longer an issue, which increases flexibility.
Sports like basketball require maybe eight manned cameras, plus more fixed POV cameras. Along with all the return feeds for officials, big screens and other displays around the venue, confidence feeds for commentators and more, the need for capacity quickly builds up. By moving to IP, we have at minimum doubled what we were able to do in terms of signal capacity between venues on the same amount of fiber.
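As a rough illustration of why uncompressed IP transport removes the capacity ceiling, the arithmetic can be sketched as follows. The per-flow rate and trunk speed below are illustrative assumptions (nominal HD rates and a 100 GbE link), not measured figures from our facility:

```python
# Back-of-the-envelope capacity check (illustrative figures only).
# An uncompressed HD flow over SMPTE ST 2110-20 needs on the order of
# the nominal HD-SDI rate, roughly 1.5 Gb/s, so 32 flows fit easily
# on a single 100 GbE trunk between venues.

HD_FLOW_GBPS = 1.5           # approximate uncompressed HD flow rate (assumption)
FLOWS_PER_LINK = 32          # HD channels per fiber in the new design
LINK_CAPACITY_GBPS = 100.0   # assumed 100 GbE trunk speed

total = HD_FLOW_GBPS * FLOWS_PER_LINK        # aggregate video bandwidth
headroom = LINK_CAPACITY_GBPS - total        # capacity left for growth

print(f"32 HD flows ≈ {total:.0f} Gb/s, leaving {headroom:.0f} Gb/s headroom")
```

Under those assumptions, 32 uncompressed HD flows consume roughly half of one link, compared with the hard eight-signal ceiling of the old CWDM circuits.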
It also reduces our workload. In the past we would need to configure the system based on seasons: as one sport wound down, we would reallocate core equipment to another stadium. Now we can leave all the core routing and processing set all year round, which reduces errors as well as saving us a lot of time.
This ability to have much of the setup static means that, to date, we do not need a full-featured management system to control it. Where we do need dynamic allocation and control, the basic functionality in the SNP Manager brings all of the processor chains into one place, where we can very simply set what we need for each event.
The SNPs are able to perform a lot of tasks: it just depends on which licenses you have. If we need different functionality, we just take out the subscription for it. We have yet to hit a resource limit on the new networked installation, and I really do not know of an alternative that offers the flexibility we need.
This project is an important part of the migration to a full IP production environment. IP is not a goal in itself: we have a rolling program for hardware refresh, and as we update, we will look to IP-native technology. The wide availability of SMPTE ST 2110 equipment is critical to that.
We are great believers in SMPTE ST 2110 and the benefits of standardization, which will allow us to continue to buy equipment that offers the best balance of price and performance for us, without limiting interoperability. As we go forward, we will invest in the products best suited to our needs, if necessary adding SNP gateways until we reach the tipping point when the majority of kit is IP native. Then we will redeploy the SNPs for other tasks.
At Texas A&M, we like to be leaders: we like to stay at the forefront of what’s new and what’s happening. We know we have moved the industry forward, but we could not have done what we did—and what we will do in the future—without the support of key technology partners like Imagine and Cisco.
We pride ourselves on providing the best student athlete experience and fan experience: that is one of our core principles. But with projects like this, we want to help other organizations—broadcasters and production companies—to benefit from our experience, so we all benefit.