Yahoo went to great lengths to ensure that the more than 15 million unique viewers in 185 countries who watched the first live, free, worldwide Web stream of a regular-season NFL football game on October 25 experienced video quality better than TV. The platform Yahoo created streamed live video not only to PCs and TV sets, but also to mobile devices and video game consoles globally.
In a blog post, P.P.S. Narayan, Yahoo’s VP of engineering, goes into some detail in describing exactly what took place “under the purple rug.”
In addition to making use of its own globe-spanning content delivery network (CDN) and network infrastructure, Yahoo partnered with six outside CDNs and Internet Service Providers (ISPs).
Yahoo sustained some 5 terabits per second (Tbps) of video traffic across all seven CDNs during the game, with delivery to viewers globally peaking at 7 Tbps.
Free NFL Game Streaming
To kick things off, an HD video signal was transmitted from Wembley Stadium in London, where the game took place, to Yahoo sites in Dallas and Sunnyvale, where it was encoded for Internet video streaming.
This encoded signal was then transcoded – compressed to make for more efficient network transmission – into nine bit rates ranging from 300 kbps to 6 Mbps, at frame rates of 30 and 60 frames per second (fps). Audio commentary from Yahoo Sports experts was then layered on top, and the resulting audiovisual streams were sent out across the Web to viewers worldwide, Narayan recounts.
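A multi-rate ladder like the one described lets each player pick the highest rendition its measured bandwidth can sustain. The following sketch is illustrative only: the 300 kbps and 6 Mbps endpoints come from the article, but the intermediate rungs and the headroom factor are assumptions, not Yahoo's published configuration.

```python
# Illustrative encoding ladder; only the 300 kbps floor and 6 Mbps
# ceiling are from the article, the rest is assumed for the sketch.
LADDER = [
    {"bitrate_kbps": 300,  "fps": 30},
    {"bitrate_kbps": 600,  "fps": 30},
    {"bitrate_kbps": 1200, "fps": 30},
    {"bitrate_kbps": 2500, "fps": 30},
    {"bitrate_kbps": 4500, "fps": 60},
    {"bitrate_kbps": 6000, "fps": 60},
]

def pick_rendition(measured_bandwidth_kbps, headroom=0.8):
    """Pick the highest rendition the viewer's bandwidth can sustain,
    leaving headroom so playback does not stall on small dips."""
    budget = measured_bandwidth_kbps * headroom
    candidates = [r for r in LADDER if r["bitrate_kbps"] <= budget]
    return candidates[-1] if candidates else LADDER[0]
```

A viewer measuring 8,000 kbps of throughput would get the top 6 Mbps / 60 fps rendition, while one on a 1 Mbps link would fall back to a 600 kbps stream.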
Yahoo made extensive use of pre-existing and newly created network diagnostics, metrics and tools both before and during the global Webcast. These proved critical to ensuring a viewing experience that exceeded broadcast and cable TV, Narayan highlights.
In essence, this live Internet video streaming “toolkit” enabled the network architecture Yahoo created to adjust on the fly to changing conditions and points of weakness or failure, from the level of CDNs down to local ISPs and even home networks and devices. All this proved critical to ensuring the HD-quality streaming Yahoo desired, Narayan points out.
For instance, he explained: “Policy rules to routes were adjusted based on CDN performance and geographies. For example, we were able to reduce the international traffic to one under-performing CDN during the game and the changes were propagated in under six seconds across viewers. Such capabilities delivered a near flawless viewing experience.”
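Weight-based traffic steering of this kind can be sketched as follows. This is a hypothetical model, not Yahoo's actual policy engine: the CDN names, thresholds, and the idea of halving an under-performer's share are all assumptions made for illustration.

```python
# Hypothetical weight-based CDN routing policy; names and numbers
# are illustrative, not Yahoo's actual configuration.
def route_weights(cdns, penalty=0.5):
    """Return normalized traffic weights per CDN, cutting the share
    of any CDN whose rebuffering ratio exceeds its threshold."""
    raw = {}
    for name, stats in cdns.items():
        weight = stats["capacity"]
        if stats["rebuffer_ratio"] > stats["threshold"]:
            weight *= penalty  # down-weight the under-performer
        raw[name] = weight
    total = sum(raw.values())
    return {name: w / total for name, w in raw.items()}

cdns = {
    "cdn_a": {"capacity": 1.0, "rebuffer_ratio": 0.01, "threshold": 0.02},
    "cdn_b": {"capacity": 1.0, "rebuffer_ratio": 0.05, "threshold": 0.02},
}
weights = route_weights(cdns)  # cdn_b's share of traffic is halved
```

Recomputing and propagating such weights quickly is what allows a change like the one Narayan describes to reach viewers in seconds.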
One of the new “tools” Yahoo developed was a network and device simulation framework dubbed “Adaptive Playground” that enabled Yahoo to take a more scientific approach in testing and measuring video streaming performance.
Another, called “Stream Monitor,” constantly checked all CDN streams to verify that they were valid and correct, and to identify ingestion or delivery problems, Narayan continues. During the NFL live stream, this tool pinpointed problems precisely so that corrective action could be taken.
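One basic check such a monitor can perform is validating each CDN's live manifest. The sketch below assumes HLS-style manifests; the article does not specify which streaming protocol or checks Yahoo's Stream Monitor actually used.

```python
# Sketch of a per-CDN manifest health check in the spirit of
# "Stream Monitor"; the specific checks are assumptions.
def check_manifest(text):
    """Validate a live HLS manifest: well-formed, has segments,
    and has not been marked as ended. Returns (ok, reason)."""
    if not text.startswith("#EXTM3U"):
        return False, "not a valid HLS manifest"
    if "#EXTINF" not in text:
        return False, "no media segments listed"
    if "#EXT-X-ENDLIST" in text:
        return False, "live stream has ended"
    return True, "ok"
```

Run against every CDN's manifest on a short interval, a check like this catches ingestion failures (empty or malformed manifests) as well as a live stream that has wrongly been closed out.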
Building redundancy into the globe-spanning live video streaming architecture so that viewers’ experience was uninterrupted was another key focal point for Yahoo Engineering. When a player hit problems or failures on its primary CDN, an automatically activated recovery mechanism reconnected to Yahoo’s back-end API servers to fetch content from another CDN. In one severe case, this mechanism automatically switched CDNs for as many as 100,000 viewers in less than 30 seconds, keeping their streams seamless.
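The client side of that failover loop can be sketched as a player that, after repeated playback errors, asks for an alternate CDN. This is a minimal model under assumed names; the real mechanism consulted Yahoo's back-end API servers, whose interface is not public.

```python
# Minimal sketch of client-side CDN failover; class and method
# names are assumptions made for illustration.
def next_cdn(current, available):
    """Return the first available CDN other than the failed one."""
    for cdn in available:
        if cdn != current:
            return cdn
    return current  # nothing else to try; stay on the current CDN

class Player:
    def __init__(self, cdns):
        self.cdns = cdns
        self.active = cdns[0]
        self.errors = 0

    def on_playback_error(self, max_errors=3):
        """After consecutive failures, switch to another CDN and
        reset the error counter, mimicking the automatic recovery
        the article describes."""
        self.errors += 1
        if self.errors >= max_errors:
            self.active = next_cdn(self.active, self.cdns)
            self.errors = 0
```

Because each player recovers independently, a CDN-wide outage shows up as many such switches happening in parallel, which is how 100,000 viewers can be moved in under half a minute.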
The Yahoo video players on end-user devices tracked everything that took place during video playback. Data streams measuring parameters such as first-video-frame latency, bit rate, bandwidth, buffering and frame drops were continuously sent to Yahoo data centers for analysis and possible corrective action.
These so-called “beacons” were processed in real time to assess key performance indicators (KPIs), including the number of concurrent viewers, total video starts, and the re-buffering ratio across different dimensions (CDN, device, OS and geography). Yahoo operations teams used this data to make key decisions regarding routing policies and switching CDNs in real time.
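Aggregating beacons into a per-dimension KPI is straightforward to sketch. The field names below are assumptions; the re-buffering ratio itself is computed in the usual way, as time spent buffering divided by total watch time.

```python
# Sketch of beacon aggregation into a per-dimension KPI;
# beacon field names are assumptions made for illustration.
from collections import defaultdict

def rebuffer_ratio_by(beacons, dimension):
    """Re-buffering ratio = buffering time / total watch time,
    grouped by a dimension such as 'cdn', 'device', or 'geo'."""
    buffering = defaultdict(float)
    watched = defaultdict(float)
    for b in beacons:
        key = b[dimension]
        buffering[key] += b["buffer_secs"]
        watched[key] += b["watch_secs"]
    return {k: buffering[k] / watched[k]
            for k in watched if watched[k] > 0}
```

Sliced by CDN, a ratio like this is exactly the kind of signal that would justify down-weighting an under-performing CDN in the routing policy described earlier.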
During the game, Yahoo’s beacon servers handled a peak of more than 225,000 events per second. In total, they processed some 2 billion events, equal to about 4 TB of data.
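A quick back-of-envelope check on those figures: 4 TB spread over 2 billion events works out to roughly 2 KB per beacon event (using decimal TB, an assumption on my part since the article does not specify).

```python
# Back-of-envelope arithmetic on the reported beacon volume.
events = 2_000_000_000            # ~2 billion events in total
total_bytes = 4 * 10**12          # ~4 TB, read as decimal terabytes
bytes_per_event = total_bytes / events  # roughly 2 KB per event
```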