The Signal in the Noise: Signal Processing Principles and the Engineering Imperative for Sustainable Systems
By Mark R. P. Thomas & Natasha Tuck
I approached this editorial as an electronic engineer, skilled in the art of signal processing, but with very little understanding of sustainability as a discipline in its own right. The more I read, the more familiar it felt: the Earth’s natural ecosystem is a massive adaptive filter that, by engineering measures, is the most sophisticated signal processing system ever assembled. A multivariate cost function, optimized by vast numbers of control loops, has resided in a near-converged state about a global optimum for millennia. Anyone who has implemented an adaptive filter, such as an echo or noise canceler, knows the importance of control logic for handling corner cases, addressing issues like underdamped step responses or runaway misconvergence. But the logic is only as good as the conditions used to create it.
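To make the adaptive-filter analogy concrete, here is a minimal normalized LMS (NLMS) echo-canceler sketch in Python. The signals, the echo path, and the step-size guard are toy assumptions chosen for illustration, not any production design:

```python
import random

def nlms_filter(x, d, num_taps=8, mu=0.5, eps=1e-8):
    """Normalized LMS adaptive filter.
    x: reference signal (e.g. far-end audio); d: observed signal (e.g.
    microphone with echo). Returns the error signal e = d - y and the
    learned tap weights."""
    w = [0.0] * num_taps
    errors = []
    for n in range(len(x)):
        # Most recent num_taps input samples, zero-padded at the start.
        frame = [x[n - k] if n - k >= 0 else 0.0 for k in range(num_taps)]
        y = sum(wk * xk for wk, xk in zip(w, frame))   # filter output
        e = d[n] - y                                   # cancellation error
        energy = sum(xk * xk for xk in frame)
        # Control logic: normalize the step by input energy and freeze
        # adaptation when the input is silent, to avoid misconvergence.
        step = mu * e / (energy + eps) if energy > 0.0 else 0.0
        w = [wk + step * xk for wk, xk in zip(w, frame)]
        errors.append(e)
    return errors, w

# Toy echo path: the observed signal is a delayed, attenuated copy of x.
random.seed(0)
x = [random.uniform(-1.0, 1.0) for _ in range(2000)]
d = [0.6 * x[n - 3] if n >= 3 else 0.0 for n in range(len(x))]
errors, w = nlms_filter(x, d)
residual = sum(e * e for e in errors[-200:]) / 200
```

After convergence the tap at delay 3 approaches the true echo gain of 0.6 and the residual error collapses toward zero; real cancelers wrap this core in far more elaborate control logic (double-talk detection, reconvergence guards) for exactly the corner cases mentioned above.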
The problem we face today is that human-designed systems — and digital infrastructure in particular — are introducing disturbance into this reference system faster than it can adapt. We are, in the language of control theory, driving a highly optimized system outside its stable operating range. The question for engineers is not whether this is happening. The science is unambiguous. The question is what we intend to do about it.
Drivers for efficiency in on-device signal processing algorithms typically include MIPS, memory, bandwidth, thermal headroom, and battery life. Across multiple industries, AI in particular, we have seen processing progressively offloaded from the device to the edge or the cloud, where the drivers for efficiency and sustainability can look very different. I caught up with Natasha Tuck, Dolby’s Director of Sustainability, to learn more about signal processing for sustainable systems.
Out of Sight, Out of Mind: The Hidden Infrastructure of the Digital Age
The cloud is one of the most successful abstractions in the history of computing. It has made enormous computational power available to billions of people with nothing more than a browser or a smartphone. But abstractions, by design, hide complexity — and in this case, they have hidden something important. The seamlessness of digital experiences has distanced us from the physical reality of the infrastructure required to support them.
When a user asks a generative AI system a question, streams a film in high definition, or stores a lifetime of photographs, they are drawing on vast physical systems: warehouses filled with servers, cooling infrastructure, power distribution networks, and the supply chains that built every component. That infrastructure is growing rapidly, and its resource demands are no longer negligible.
Global data center electricity demand is estimated to have more than doubled since 2019[i]. By 2030, data centers are expected to consume over 9% of US electricity, more than double their current share of approximately 4%[ii]. These projections reflect a consensus that now spans energy analysts, infrastructure planners, and climate scientists alike, driven by the compounding growth of data generation and the rising compute intensity of modern workloads, particularly artificial intelligence.
The scale of data itself is difficult to fully comprehend. By 2035, global data is projected to reach 2,000 zettabytes[iii]. To put that in physical terms: printing just one zettabyte of data would require ten trillion trees[iv] — nearly three times the number of trees that currently exist on Earth. Storing even 10% of projected 2035 data would require more than one billion servers[v]. The abstraction of the cloud makes none of this visible to the people generating the demand. That invisibility is a design problem.
The emissions profile of the tech industry has two primary sources. The first is the electricity required to power the code running in data centers — a direct function of compute intensity and the carbon content of the grid supplying power. The second is embodied carbon: the emissions generated during the manufacture of the devices — laptops, smartphones, networking equipment — that form the user-facing layer of digital infrastructure. Embodied carbon also encompasses the growing problem of electronic waste, as device lifecycles shorten and recycling infrastructure fails to keep pace. Taken together, these sources represent a material and growing share of global greenhouse gas emissions, with AI-driven workloads positioned to accelerate that trajectory significantly.
Generative AI deserves particular attention here. Unlike traditional software, which executes deterministic instructions at known computational cost, generative AI inference is resource-intensive by nature. Integrating large language models into routine applications — search engines, customer service tools, productivity software — has the potential to make every online interaction substantially more compute-heavy than its predecessor. The efficiency gains that the industry has achieved through decades of algorithmic refinement and hardware optimization risk being offset by a step change in per-query resource consumption.
A Familiar Problem in an Unfamiliar Domain
Signal processing engineers have always worked under constraint. Bandwidth is finite. Power budgets are real. Hardware has limits. The discipline was built on the necessity of achieving more with less — transmitting more information across a channel with less power, reconstructing a signal from fewer samples, filtering noise without distorting the signal of interest. These constraints produced some of the most elegant and durable results in applied mathematics: the Nyquist-Shannon sampling theorem, the Fast Fourier Transform, and the principles of optimal filtering and data compression.
Sustainability is, at its core, the same kind of problem. It is the discipline of operating within constraints — energy, material, atmospheric — that are real, quantifiable, and non-negotiable. The natural ecosystem has been solving this problem for billions of years. Human industrial systems are only beginning to reckon with it.
The conceptual tools of signal processing map onto sustainable systems design with surprising fidelity. Consider feedback loops: control theory is built on the principle of measuring output, comparing it to a desired state, and applying corrective action. This is precisely the logic of sustainability management — measure emissions, compare to targets, adjust operations. The challenge is that most corporate sustainability programs still operate at very low sampling rates, collecting data annually rather than continuously, which means they are flying blind between measurements. High-frequency sustainability data — smart meters, supply chain sensors, satellite-based emissions monitoring — provide the resolving power to track the system unambiguously, yielding a picture that is accurate and timely rather than aliased.
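The sampling-rate point can be illustrated with a toy simulation (all numbers here are illustrative assumptions, not measured data): the same proportional controller holds a drifting quantity near its target when it measures daily, but loses it entirely when it measures once a year.

```python
def run_control_loop(target, drift, gain, sample_every, steps=365):
    """Toy feedback loop: the controlled quantity drifts upward every
    day, but corrective action is applied only on measurement days."""
    state = target
    worst_error = 0.0
    for day in range(steps):
        state += drift  # uncontrolled growth between measurements
        if day % sample_every == 0:
            state -= gain * (state - target)  # measure, compare, correct
        worst_error = max(worst_error, abs(state - target))
    return worst_error

# Same system, same controller; only the sampling rate differs.
daily = run_control_loop(target=100.0, drift=0.5, gain=0.8, sample_every=1)
annual = run_control_loop(target=100.0, drift=0.5, gain=0.8, sample_every=365)
```

With daily measurement the error settles to a small steady-state offset; with annual measurement the system drifts uncorrected for 364 days between samples — the "flying blind" failure mode in miniature.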
Consider filtering: the most basic signal processing operation is the separation of useful signal from unwanted noise. Sustainable systems design is an exercise in the same discrimination — identifying genuine value in a resource flow and eliminating waste. An organization that treats compute resources, energy, and materials with engineering discipline, optimizing utilization and minimizing idle consumption, is applying a filter to its operations. The output is the same product or service, delivered with less waste.
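A minimal version of that discrimination, on toy data of my own construction (a slow ramp standing in for the useful trend, uniform noise for the waste): a simple moving-average FIR filter recovers the trend with far less error than the raw measurement.

```python
import random

def moving_average(x, window):
    """Causal moving-average FIR filter: passes slow trends (signal),
    attenuates fast fluctuations (noise)."""
    out = []
    for n in range(len(x)):
        lo = max(0, n - window + 1)
        out.append(sum(x[lo:n + 1]) / (n + 1 - lo))
    return out

random.seed(1)
clean = [n / 100.0 for n in range(200)]                 # slow ramp = signal
noisy = [c + random.uniform(-0.2, 0.2) for c in clean]  # signal + noise
smooth = moving_average(noisy, window=10)

def mse(a, b):
    return sum((u - v) ** 2 for u, v in zip(a, b)) / len(a)

err_raw = mse(noisy, clean)
err_filtered = mse(smooth, clean)
```

The filtered estimate trades a small lag bias for a large reduction in noise power, which is the same trade an organization makes when it smooths operational data to act on the trend rather than the fluctuation.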
And consider efficiency under constraint as a design principle rather than an afterthought. The most efficient signal processing systems were not made efficient after the fact — efficiency was a first-order design requirement from the beginning. The same shift is needed in how we design digital infrastructure and the software that runs on it. Green software engineering — designing applications to minimize compute cycles, data transfer, and storage — is not a compromise on functionality. It is good engineering practice applied to a newly visible constraint.
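One small instance of that practice, memoization, is sketched below; `render_thumbnail` is a hypothetical stand-in for any expensive, repeatable computation. Caching its results means a thousand identical requests cost one computation's worth of cycles rather than a thousand.

```python
from functools import lru_cache

call_count = 0  # tracks how many times the expensive work actually runs

@lru_cache(maxsize=None)
def render_thumbnail(image_id: str) -> str:
    """Hypothetical expensive operation; results are cached so that
    repeated identical requests trigger no new compute."""
    global call_count
    call_count += 1
    return f"thumb-{image_id}"

# A thousand identical requests: the expensive work runs exactly once.
for _ in range(1000):
    render_thumbnail("cat.jpg")
```

The same principle scales up: content delivery caches, reused query plans, and deduplicated storage are all memoization applied at infrastructure scale.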
Closing the Loop: Responsibilities and Responses
The path forward requires clear thinking about where responsibility lies and what actions are available at each level of the system.
Cloud providers bear responsibility for the sustainability of the cloud itself: the efficiency of their data centers, the carbon intensity of their power procurement, the design of their hardware, and the transparency of their emissions reporting. Progress is being made — the major providers have made significant investments in renewable energy and cooling efficiency — but the pace of efficiency improvement is being challenged by the pace of demand growth. The industry cannot efficiency-engineer its way out of a demand curve that is accelerating faster than algorithmic improvements can offset.
For organizations that consume cloud services, the responsibility lies with the cloud consumer to manage their environment within the cloud: optimizing resource utilization, eliminating idle compute, right-sizing workloads, and treating cloud resources with the same discipline applied to any other business input. Just as no competent engineering organization would tolerate systematic waste in a manufacturing process, no organization should accept it in its digital operations. The invisible nature of cloud consumption is not an excuse — it is a gap in instrumentation that needs to be closed.
For engineers and technologists individually, the moment calls for the application of professional skills to the problems the world is actually facing. The engineering community has the analytical tools, the systems thinking, and the technical credibility to contribute meaningfully to the transition to sustainable infrastructure. This means designing for efficiency from the first line of code, advocating within organizations for measurement and accountability, and engaging with the policy conversations that will shape the regulatory environment for digital infrastructure.
Beyond professional practice, individual agency matters. Engineers, like all citizens, make choices about where they work, what they buy, how they invest, and who they vote for. An ethical commitment to a sustainable future is not separate from technical identity — for a generation of engineers who will spend their careers building the infrastructure of the digital age, it is inseparable from it.
The Engineering Imperative
The natural ecosystem — the reference system against which all human engineering should be measured — is a signal processing system under stress. It is being asked to filter and absorb disturbances at a rate that exceeds its adaptive capacity. The feedback loops that have maintained atmospheric and biological equilibrium for millions of years are being pushed toward instability.
Engineers understand what happens when a stable system is driven past its operating limits. We understand the value of feedback, the importance of measurement, the necessity of designing within constraints. We have spent careers building systems that are efficient, robust, and self-correcting. Those skills are needed now — not in a narrow professional sense, but in the broadest possible application of the engineering mindset to the challenge of our time.
The signal is clear. The question is whether we have the discipline to act on it.
Further Reading
Sustainability practices have become increasingly integral to modern organizations. IEEE Technology for a Sustainable Climate[vi] has undertaken a variety of initiatives, including publications such as the ICASSP 2025 Sustainability Report[vii]. Articles on sustainability appear frequently in IEEE Spectrum and IEEE Signal Processing Magazine, including pieces addressing the growing concern over the environmental impact of international conferences[viii]. Sustainability at Dolby[ix] is one example of industry efforts toward sustainable practices, with annual sustainability reports[x] made available to the public.

Mark R.P. Thomas is an Editor for IEEE SPS Industry Signals and Principal Researcher at Dolby Laboratories. His research background spans all things audio, from DSP to UX; he leads a research group working on the capture, creation, coding, transport, perception, and rendering of spatial audio for both professional and consumer applications. Dr. Thomas received an MEng degree in Electrical and Electronic Engineering from Imperial College London in 2006 and a PhD in Glottal-Synchronous Speech Processing from the same institution in 2010.

Natasha Tuck is the Director of Sustainability and Environmental, Social & Governance (ESG) at Dolby. She is responsible for measuring and monitoring Dolby’s overall environmental impact related to its operations and supply chain, as well as achieving Dolby’s climate goals. She is also responsible for Dolby’s voluntary and mandatory ESG reporting. She is passionate about embedding sustainability across operations, to drive efficiency and attain excellence. Natasha graduated from Presidio Graduate School with her MBA in Sustainability in 2009 and lives and works in San Francisco.
[i] International Energy Agency (IEA), Digitalization & Energy, 2022. [Online]. Available: https://www.iea.org/reports/digitalization-and-energy
[ii] U.S. Department of Energy (DOE), Projected Data Center Energy Consumption, 2020. [Online]. Available: https://www.energy.gov/eere/articles/new-forecasts-show-growing-energy-use-data-centers
[iii] International Data Corporation (IDC), The Digitization of the World: 2021 Summary and Recommendations, 2021. [Online]. Available: https://www.idc.com/getdoc.jsp?containerId=prUS47664421
[iv] Scientific American, “The Environmental Cost of Big Data,” 2018. [Online]. Available: https://www.scientificamerican.com/article/the-environmental-cost-of-big-data/
[v] U.S. Department of Energy, Advances in Data Infrastructure Research, 2020. [Online]. Available: https://www.energy.gov/articles/advanced-computing-and-data-centers
[vi] IEEE Technology for a Sustainable Climate. [Online]. Available: https://www.ieee.org/advancing-technology/building-better-world/technology-sustainable-climate
[vii] ICASSP 2025 Sustainability Report. [Online]. Available: https://signalprocessingsociety.org/sites/default/files/2025-12/IEEE_Signal_Processing_Society_-_2025_Sustainability_Report.pdf
[viii] A. I. Perez-Neira, “Going for Sustainable Conferences,” IEEE Signal Processing Magazine, vol. 41, no. 1, Apr. 2024. [Online]. Available: https://ieeexplore.ieee.org/document/10502014
[ix] Dolby Environmental Commitment. [Online]. Available: https://www.dolby.com/about/corporate/sis/environmental-commitment/
[x] Dolby 2024 Sustainability Report. [Online]. Available: https://www.dolby.com/siteassets/about/corporate/sustainability-at-dolby/past-reports-page/final-2024-sustainability-report.pdf

