The Greening of Streaming
There is some hope, though. We reached out to Limelight Networks, one of the world's leading CDNs. "CDNs rely on extensive data center networks to deliver services," says Ersin Galioglu, the company's VP of strategic initiatives. "And as providers look to enhance their services, they'll typically deploy more data centers. But as more data centers are deployed, carbon emissions increase due to power utilization. If innovative solutions aren't put in place to curtail the influx of power utilization, this could certainly create environmental concerns."
Galioglu goes on to note that a CDN doesn't need to sacrifice service or choose between innovation and efficiency to reduce environmental impact. "At Limelight, we've increased our network egress capacity by 70% in an environmentally sensitive manner," he says. "With software innovations, we increased the average amount of data delivered per unit of power (megabits per second per watt) by almost 80%, providing increased capacity and reliability for customers. And by evaluating our data center deployment process, we've found new ways to improve power utilization, such as proactively selecting data center locations based on [their] access to renewable energy. We've also devoted entire teams to analyze and correct our energy exertion across all data centers."
Server Hardware
A subset of the data center issue is the power consumption of server hardware. Rather than using AC-powered servers, the industry might consider a transition to DC-powered servers. Anyone with a telco background knows about the efficiencies gained (in both cooling reduction and overall "five nines" reliability) in telco switch rooms even 40 years ago. One of the main reasons was the use of DC-powered equipment, which avoids the conversion losses (and the resulting heat) of per-device AC power supplies.
Given that a number of renewable energy sources (e.g., solar, some hydro) begin their journey as DC, it seems shortsighted to expend major effort converting DC to AC, only to have an internal power supply in each server convert that AC back to DC so that the server mainboard's chipsets can accomplish their tasks.
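To make the scale of that double conversion concrete, here is a minimal back-of-envelope sketch. The server load, rack density, and conversion efficiencies below are illustrative assumptions (real figures vary widely by equipment), not measurements from any vendor mentioned in this article.

```python
# Back-of-envelope sketch of the DC -> AC -> DC double-conversion losses described above.
# All load and efficiency figures are illustrative assumptions, not measured values.

SERVER_LOAD_W = 300        # assumed DC load of one server, in watts
SERVERS_PER_RACK = 40      # assumed rack density

INVERTER_EFF = 0.95        # DC source (e.g., solar array) -> AC distribution
SERVER_PSU_EFF = 0.92      # AC -> DC inside each server's own power supply
DC_DC_EFF = 0.97           # single DC-DC stage if AC distribution is skipped entirely

def upstream_power(load_w: float, *efficiencies: float) -> float:
    """Power drawn upstream to deliver load_w through a chain of conversion stages."""
    power = load_w
    for eff in efficiencies:
        power /= eff
    return power

rack_load = SERVER_LOAD_W * SERVERS_PER_RACK

ac_path = upstream_power(rack_load, SERVER_PSU_EFF, INVERTER_EFF)  # DC -> AC -> DC
dc_path = upstream_power(rack_load, DC_DC_EFF)                     # DC stays DC

print(f"Rack load:          {rack_load:,.0f} W")
print(f"AC-fed rack draws:  {ac_path:,.0f} W")
print(f"DC-fed rack draws:  {dc_path:,.0f} W")
print(f"Avoided losses:     {ac_path - dc_path:,.0f} W "
      f"({100 * (ac_path - dc_path) / ac_path:.1f}% of the AC-path draw)")
```

With these assumed efficiencies, a 40-server rack wastes roughly 10% of its upstream draw on the extra conversion stage; the exact number depends entirely on the equipment, but the shape of the argument holds.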
Fortunately, there is movement in this space, including efforts by electrical engineers who have left large server companies to branch out on their own, exploring ways to eliminate the power supplies in "pizza box" rack-mounted servers. Also, a number of small, portable computers (such as the Intel NUC series) have integrated wide-voltage power supplies that can function in a range from 12 volts up to almost 20 volts.
Peering Points
Peering itself essentially means connecting two routers at the borders of ISPs' autonomous systems. The routing fabrics involved are extremely expensive, high-end technologies, and they are not regularly replaced. This means that even when a power efficiency arrives as routing interfaces jump in speed, it may be some years before an ISP replaces a stable, trusted core router, so it takes considerable time for those efficiencies to percolate through. However, power in hosting facilities generally comes at a premium, and that economic driver helps accelerate the chase for efficiency.
One thing to note: Optical networking that provides fiber-based dense wavelength division multiplexing (DWDM) connectivity can be surprisingly power-efficient. We spoke to Michal Naor Tanhum, VP of business development at PacketLight Networks, to get an understanding of the power requirements for optical networking. Linking 20x10Gbps channels over a point-to-point fiber requires something like 250W at each end and can reach more than 80 km. Compared to copper, this is incredibly efficient. And while there are many challenges to establishing an apples-to-apples comparison with wireless links such as satellite and microwave, optical networking is clearly one of the few areas in trunk networking that was born with power efficiency as a first principle.
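Those figures translate into the same megabits-per-second-per-watt metric Galioglu cites, with some simple arithmetic. The sketch below assumes the link's power is just the two terminal figures combined and ignores amplifiers and ancillary equipment, so treat the result as an optimistic estimate.

```python
# Rough throughput-per-watt estimate for the DWDM link described above,
# using the article's figures: 20 x 10Gbps channels and roughly 250W per end.
# Amplifier and ancillary power are ignored, so this is an upper bound.

channels = 20
gbps_per_channel = 10
watts_per_end = 250
ends = 2

total_gbps = channels * gbps_per_channel           # 200 Gbps of capacity
total_watts = watts_per_end * ends                 # ~500 W for the point-to-point link

mbps_per_watt = total_gbps * 1000 / total_watts    # megabits per second per watt

print(f"Link capacity: {total_gbps} Gbps over >80 km")
print(f"Link power:    ~{total_watts} W (both ends)")
print(f"Efficiency:    ~{mbps_per_watt:.0f} Mbps per watt")
```

That works out to roughly 400Mbps delivered per watt across the whole link, which helps explain why fiber trunks compare so favorably with copper and wireless alternatives.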
Caching
Caching means different things to streaming professionals and to consumers. The concept with which most of us are familiar is caching at peering points, CDNs, and data centers. Caching in the home, aka DVR or PVR, is likely to be familiar to consumers. In reality, there is not a lot of difference between the two, beyond where in the network the hard drive storing the cache is located and the number of users who are able to access that cached content.
Obviously, the CDN's greatest internal aim is to avoid "cache misses," in which an end user requests a piece of content that is not cached, forcing the cache to fetch that content from an upstream origin. In a CDN whose content is limited to a fixed library (for example, a CDN serving Netflix or Spotify), it may well be possible to simply store a copy of all of the available content in every edge cache location. This would ensure that there are no cache misses and that all users get the highest performance possible. However, it would also be hugely inefficient: most of the content would rarely be requested, so edge resources would be tied up hosting it for no reason. Were there only a few edges, that might be an excusable waste of power, but CDNs treat the sheer number of caches they operate as a key competitive advantage, as Akamai's marketing boast of "a quarter of a million edge servers" highlights.
However, it would be inaccurate to say that, for instance, the entire Disney+ library is cached in all CDN locations. Caches are fixed in size and usually host only actively in-demand content. The key challenge is that, in order to achieve faster startup times for video, CDNs pre-position content in caches. As those caches fill up, new caches need to be brought online, resulting in increased energy consumption. If performance (how quickly a piece of content is delivered to a customer) were not the primary KPI by which CDNs compete, then we might, as consumers, all be prepared to wait a little longer for videos to start. That might, in turn, completely remove the need for edge caching, and many thousands of ~100W computers could be turned off.
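To illustrate the hit/miss behavior described in the last two paragraphs, here is a minimal sketch of a fixed-size edge cache with least-recently-used (LRU) eviction. This is not how any particular CDN implements its caches; the capacity, titles, and origin fetch are all stand-ins.

```python
# Minimal sketch of a fixed-capacity edge cache with LRU eviction, illustrating
# cache hits, cache misses (origin fetches), and why only actively in-demand
# titles stay resident. Titles and capacity are made up for illustration.

from collections import OrderedDict

class EdgeCache:
    def __init__(self, capacity: int):
        self.capacity = capacity          # number of titles this edge can hold
        self.store = OrderedDict()        # title -> content, ordered by recency
        self.hits = 0
        self.misses = 0

    def fetch_from_origin(self, title: str) -> str:
        # Placeholder for the expensive upstream request a cache miss triggers.
        return f"<bytes of {title}>"

    def get(self, title: str) -> str:
        if title in self.store:
            self.hits += 1
            self.store.move_to_end(title)         # mark as most recently used
            return self.store[title]
        self.misses += 1
        content = self.fetch_from_origin(title)   # cache miss: go upstream
        self.store[title] = content
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)        # evict the least recently used title
        return content

cache = EdgeCache(capacity=2)
for title in ["title_a", "title_b", "title_a", "title_c", "title_b"]:
    cache.get(title)
print(f"hits={cache.hits}, misses={cache.misses}")   # hits=1, misses=4
```

Scaled up, every miss is an upstream fetch and every extra slot of resident capacity is powered storage, which is exactly the trade-off between startup performance and energy consumption described above.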
First Steps to a Solution
As we wrap up this introduction to the greening of streaming, let's take a look at a radical hypothetical.
The internet is a shared resource used by multiple services. Even if it is provisioned for one intended purpose (for example, ecommerce or file transfer), it can end up delivering a wide range of other functions, such as voice and video. Untangling streaming from the rest of the internet to deterministically establish how much power and environmental demand the various streaming technologies create is a near-impossible task.
Nonetheless, one extreme way to explore the energy demand driven by streaming video is to imagine an internet without streaming. Even a layperson can see that the resources in a laptop or smartphone would be dramatically reduced. Significant amounts of battery power are used to connect to, decode, and display video. In the last mile, entire "generation" rollouts (3G, 4G, etc.) have been underpinned by providing bandwidth to stream watchable video. For nearly all of the other services provided on a smartphone (information access/ecommerce/voice calls/photo messaging), 2G was adequate and 3G was sufficient. So, as extreme as it first sounds, it may be reasonable to ask: Is it sensible to argue that all of the energy, fabrication pollution, and distribution costs of digging, ducting, installing, commissioning, and providing consumer access to 4G (and soon 5G) are purely (or at least significantly) a byproduct of our demand for the one service that requires a constant high-bandwidth network—streaming video?
Even gaming (not esports viewing) is "bursty" and only requires high bandwidth for short periods of time. And a well-written game can actually manage low bitrates, since much of the bandwidth-demanding scenery and so on can be delivered in advance. So, arguably, the main reason we crave high bandwidth is for video streaming. Could we contend that any bandwidth above (for example) half a megabit per second is purely provisioned for streaming video? If so, then imagine a world where the video-free internet was all we needed. How much infrastructure could we roll back?
Smartphones would not need GPUs, nor would they need such large lithium polymer batteries or such high-resolution displays (although their dual purpose as a still camera complicates the argument). The "churn rate" of this technology would thus fall, since consumers would feel less compelled to upgrade their phones so frequently, and the landfill, mining, and fabrication demands would shrink significantly too. Then, the cellular networks might focus on rolling out 3G/4G ubiquitously, rather than simply selling bandwidth for the sake of bandwidth to a video-less market. This would, in turn, cut down the need for backhaul, satellites, microwave, and fiber. We simply wouldn't need most of the infrastructure we have rolled out in the past 10, perhaps 15, years. And all of that is before we think about the production resources, core CDN capabilities, and the DVR and caching for performance mentioned earlier.
Of course, that's not going to happen. The video streaming genie is out of the bottle. But it is absolutely crucial that we start to comprehend how much demand video streaming is placing on the environment.
Remember, just a couple of months ago, only 4% of the U.S. workforce worked from home, but by the first week of April, that number had risen to 34%, according to Recode. The net effect of COVID-19 has proven not only that it is possible to radically change our behavior when we face and take ownership of existential challenges, but also that there can be better outcomes, such as lower air pollution and more time at home.
This should be a lesson for us all, even as our business continues to accelerate. We need, collectively, to ensure that streaming is sustainable, or there will come a time when the unsustainability of streaming at scale forces us to roll back some of the practicalities and capabilities we all take for granted.
As we said early in the article, we encourage a widespread industry commitment to the greening of streaming. If you want to join us, please email greeningofstreaming@streamingmedia.com.