#89 The Carbon Conundrum
AI, Big Data, and Our Looming Environmental Apocalypse; What on Earth is Open Cloud Compute?
Today, Arindam Goswami tries to answer a contentious question: Is AI going to reduce or amplify our carbon emissions?
Satya Sahu provides a brief explainer on People+AI’s Open Cloud Compute project, and what it might mean for desired governance outcomes in the compute segment of the AI value chain.
If you like the newsletter, you will love to read our in-depth research and analysis at https://takshashila.org.in/high-tech-geopolitics.
Also, we are hiring! If you are passionate about working on emerging areas of contention at the intersection of technology and international relations, check out the Staff Research Analyst position with Takshashila’s High-Tech Geopolitics programme here.
Technomachy: AI, Big Data, and Our Looming Environmental Apocalypse
— Arindam Goswami
Picture this: the sky is an ominous shade of orange, the air thick with the acrid smell of smoke. Trees stand bare against the desolate landscape, their branches like skeletal fingers reaching out in desperation. The once-bustling cities are now ghost towns, eerily silent except for the occasional gust of hot, dry wind. It's a scene straight out of a doomsday movie, but this is not fiction. This is our not-so-distant future if we don't get a grip on our carbon emissions.
Enter Artificial Intelligence (AI), the white knight—or perhaps the dark horse—in our battle against climate change. Recent research from Nature suggests that AI systems like GPT-3 and BLOOM might just be the unlikely heroes we need to curb our carbon footprint. But is this too good to be true?
AI: The Unlikely Savior
First, let’s talk numbers. Training GPT-3, one of the more advanced AI systems, generates a whopping 552 metric tons of CO2e. Sounds bad, right? But when you spread this out over the millions of tasks it performs, each query only emits about 2.2 grams of CO2e. BLOOM, another AI system, is even more efficient, with per-query emissions of around 1.6 grams of CO2e.
Compare this to your average human. A US-based writer, for example, cranks out around 1400 grams of CO2e per page. Even an Indian writer, who might be typing away in a less energy-intensive environment, produces about 180 grams of CO2e per page. That’s like comparing a Prius to a gas-guzzling SUV. The AI wins hands down in the emissions department.
The research finds that AI systems emit 130 to 1,500 times less CO2e per page of text generated than human writers, while AI illustration systems emit 310 to 2,900 times less CO2e per image than their human counterparts. It is important to note, however, that these emissions analyses do not capture broader social implications such as job displacement, legal questions, and potential rebound effects, and AI cannot fully substitute for all human tasks. Nonetheless, the current evidence suggests that AI can carry out several important tasks at a fraction of the emissions generated by the equivalent human activity.
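These per-page figures can be sanity-checked with some back-of-the-envelope arithmetic. The sketch below uses only the numbers cited above and treats one query as producing roughly one page of output, which is a simplifying assumption:

```python
# Rough emissions comparison using the figures cited above
# (per-query AI emissions vs. per-page human-writer emissions).

GPT3_PER_QUERY_CO2E_G = 2.2     # grams CO2e per GPT-3 query
BLOOM_PER_QUERY_CO2E_G = 1.6    # grams CO2e per BLOOM query

HUMAN_US_PER_PAGE_CO2E_G = 1400  # US-based writer, grams CO2e per page
HUMAN_IN_PER_PAGE_CO2E_G = 180   # India-based writer, grams CO2e per page

# How many times more CO2e a human emits per page than the AI per query,
# assuming one query yields roughly one page of text.
ratio_us_vs_gpt3 = HUMAN_US_PER_PAGE_CO2E_G / GPT3_PER_QUERY_CO2E_G
ratio_in_vs_bloom = HUMAN_IN_PER_PAGE_CO2E_G / BLOOM_PER_QUERY_CO2E_G

print(f"US writer vs GPT-3:     {ratio_us_vs_gpt3:.0f}x")
print(f"Indian writer vs BLOOM: {ratio_in_vs_bloom:.0f}x")
```

Both ratios fall comfortably inside the 130x-1,500x range the research reports, which is reassuring: the headline multiples follow directly from the per-query and per-page figures.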
This matters for the UN Sustainable Development Goals, specifically Goal 12 ("Ensure sustainable consumption and production patterns") and Goal 13 ("Take urgent action to combat climate change and its impacts"). Given how pervasive writing and illustration are across corporate, government, educational, and other domains, even minor changes in their environmental impact add up. And these findings point not to a marginal change but to a potential hundred- or even thousand-fold reduction in the environmental impact of common human activities like writing.
But Wait, There's More
Let's not forget about illustrations. AI-generated art has a significantly lower carbon footprint compared to human-created artwork. So, in the realm of creativity, AI seems to be a greener alternative. And this isn’t just pie-in-the-sky theory. AI's efficiency and scalability mean that, despite the initial high emissions from training, the overall environmental impact is significantly reduced when these systems are deployed on a large scale.
The Devil in the Details
Before we crown AI as our climate saviour, let's dig a bit deeper. Although the embodied energy and end-of-life recycling costs of the hardware used for AI are lower than the energy costs associated with human activities over time, they still present a challenge. It's like switching from coal to natural gas; it’s better, but it’s not perfect.
Training GPT-3, one of the most powerful and widely used AI systems to date, produces carbon emissions equivalent to the lifetime emissions of roughly five cars.
Moreover, while AI can theoretically reduce global carbon emissions, this assumes we use it responsibly. Remember, these are the same technologies powering massive data centres—veritable behemoths of energy consumption. According to a Wall Street Journal report, these data centres are slowing the shift to clean energy because of their enormous power demands. So, while AI could help, it’s also part of the problem.
The Data Centre Dilemma
Data centres, the backbone of our digital world, have become giant energy hogs. The Wall Street Journal lays it bare: their insatiable appetite for power is holding back utilities' transition away from fossil fuels.
The proliferation of hyperscale data centres, particularly in regions like Northern Virginia, USA, has disrupted electric utilities' efforts to reduce reliance on fossil fuels. In some areas, coal will consequently be burned for longer than initially projected. The epicentre of the struggle is Northern Virginia's renowned "Data Center Alley," through which roughly 70% of the world's internet traffic passes.
Northern Virginia's status as the global hub for data centres traces back decades to the inception of the internet. The foundational infrastructure laid during that period attracted titans of the dot-com and telecommunications industries, including AOL, Yahoo, and WorldCom. The emergence of ChatGPT and analogous large-language AI models, which demand substantial computing resources, significantly accelerated the demand for data centre infrastructure.
To address increasing demand, many utilities opt to extend the operational lifespan of coal-fired power plants while integrating natural gas power plants to stabilize fluctuations resulting from renewable energy expansion. A notable achievement in the U.S. energy shift has been the consistent reduction of coal-generated power. Over the past decade, approximately 10 gigawatts of coal power capacity have been retired annually. However, due to heightened demand, projections from S&P Global Commodity Insights suggest this rate will decrease to around 6 gigawatts per year until 2030.
Big tech companies like Google and Amazon tout their commitments to renewable energy, but the reality is far more complex. While they are investing in wind and solar power, the sheer scale of their data operations means they still rely heavily on traditional power sources. This isn't just a hiccup; it's a seismic problem that threatens to derail global efforts to combat climate change.
A Cautionary Tale
As we march towards a tech-driven future, we need to be mindful of the double-edged sword we wield. AI might be our best bet for reducing emissions in certain areas, but it is no silver bullet. On the one hand, AI holds immense potential as a tool for optimising energy efficiency, improving predictive modelling, and informing evidence-based policy interventions to mitigate climate change. On the other hand, the energy required to develop, deploy, and operate AI systems works directly against sustainability objectives, demanding careful consideration of trade-offs and synergies within a holistic framework.
AI algorithms, particularly those used in deep learning and in training large models, require significant computational resources, and this demand for computing power could drive a surge in energy consumption.

The production of AI hardware components, such as semiconductors and data servers, requires the extraction of rare earth metals and other resources. Unregulated mining can cause habitat destruction, biodiversity loss, and water pollution, further straining ecosystems already under pressure from climate change. The rapid turnover of AI hardware and the short lifecycle of many devices also add to the growing e-waste problem.

Finally, AI systems are susceptible to biases and errors that can have unintended consequences for climate-related decision-making. Flawed algorithms may produce inaccurate climate projections or ineffective policy recommendations, leading to misguided action or exacerbating existing environmental problems.
Addressing these potential impacts requires careful regulation, ethical guidelines, sustainable practices and innovation in the development and deployment of AI technologies.
Need for Innovation
Consider one example of how innovation can help. A collaborative effort between Chinese and Swiss researchers has produced an energy-efficient neuromorphic chip that mimics the functionality of neurons and synapses in the human brain. Known as "Speck," this asynchronous chip, built by scientists from the Institute of Automation at the Chinese Academy of Sciences and SynSense AG Corporation in Switzerland, has a resting power consumption of just 0.42 milliwatts, meaning it draws almost no power when idle. For comparison, the human brain, renowned for its ability to process complex neural networks, runs on a mere 20 watts, far less than current AI systems. Neuromorphic computing therefore holds significant promise for energy-efficient machine intelligence: the Speck framework adapts to the dynamic computational demands of different algorithms, achieving real-time power consumption as low as 0.70 milliwatts. This groundbreaking work offers AI applications a brain-inspired solution characterised by exceptional energy efficiency, minimal latency, and reduced power consumption.
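To put these power budgets side by side, here is a rough illustrative calculation. The ~400 W figure for a typical data-centre GPU is our own assumption for the sake of comparison, not a number from the research cited:

```python
# Power-budget comparison: neuromorphic chip vs. brain vs. conventional hardware.
# Speck and brain figures are from the passage above; the GPU figure is an
# assumed, illustrative value for a modern data-centre accelerator.

SPECK_IDLE_W = 0.42e-3    # Speck resting power: 0.42 mW
SPECK_ACTIVE_W = 0.70e-3  # Speck real-time power: 0.70 mW
BRAIN_W = 20              # human brain, ~20 W
GPU_W = 400               # assumed data-centre GPU draw (not from the article)

print(f"Brain vs Speck (active): {BRAIN_W / SPECK_ACTIVE_W:,.0f}x more power")
print(f"Assumed GPU vs brain:    {GPU_W / BRAIN_W:.0f}x more power")
```

The orders of magnitude are the point: even the brain, itself remarkably efficient, uses tens of thousands of times more power than the Speck chip, while conventional accelerators sit well above the brain.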
Conclusion
As we navigate the intricate nexus of AI, energy consumption dynamics, and climate change, it becomes imperative to adopt a transdisciplinary approach that transcends traditional disciplinary boundaries and fosters collaboration among diverse stakeholders. By harnessing the power of technological innovation, policy reform, and collective action, we can forge a path toward a more equitable, resilient, and sustainable future for generations to come.
India is set to host the Quad Leaders' Summit in 2024. Subscribe to Takshashila's Quad Bulletin, a fortnightly newsletter that tracks the Quad's activities through the Indo-Pacific.
Your weekly dose of All Things China, with an upcoming particular focus on Chinese discourses on defence, foreign policy, tech, and India, awaits you in the Eye on China newsletter!
The Takshashila Geospatial Bulletin is a monthly dispatch of Geospatial insights for India’s strategic affairs. Subscribe now!
Cyberpolitik: What on Earth is Open Cloud Compute?
— Satya Sahu
Late last year, in a Takshashila Discussion document, “A Pathway to AI Governance,” we touched upon the criticality of the compute segment of the AI value chain for the development and deployment of AI systems. Rehashing the definition, “compute” as an umbrella term encompasses the infrastructure and resources (processors, memory, peripherals, etc.) required to process data, train models, and run AI applications.
In the document, we emphasise the importance of fostering a competitive and diverse market for cloud service providers and briefly examine the need for and potential avenues for creating sovereign computing resources and promoting investments in domestic capabilities.
Interestingly, in what has been hailed as a “UPI Moment” for cloud compute in India, People+AI (an initiative of the non-profit EkStep Foundation) launched its Open Cloud Compute (OCC) project at its Adbhut India event in Bangalore this past May.
The OCC, in simple terms, is a digital public infrastructure approach to compute that aims to create an open and decentralised micro-cloud computing grid/network. This network would bring together independent providers (of apparently disparate compute resources) on a single availability marketplace, improving discoverability and access to computing power and related services from any provider. The project seeks to democratise access to cloud computing infrastructure, making it more resilient, interoperable, and adaptable for innovation. Think of it as an ONDC for micro data centres across India, whose services can be chosen by a client on the basis of required processing power, latency needs, geographical proximity, etc. So far, a 24-member consortium, including giants like AMD, Oracle, and Dell alongside startups like Von Neumann AI and Vigyan Labs, has signed up to partner on the project.
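Since the OCC's technical details are not yet public, the marketplace idea can only be illustrated hypothetically. The sketch below, with entirely invented provider names and fields, shows how a client might filter a registry of micro data centres by compute, latency, and region, and then pick the cheapest qualifying match:

```python
# Hypothetical sketch of OCC-style provider discovery. All names, fields,
# and numbers are invented for illustration; the real OCC protocol is unknown.
from dataclasses import dataclass

@dataclass
class Provider:
    name: str
    vcpus: int           # available compute capacity
    latency_ms: float    # network latency to the client
    region: str
    price_per_hour: float

def select_provider(registry, min_vcpus, max_latency_ms, region=None):
    """Return the cheapest provider meeting the client's requirements, or None."""
    candidates = [
        p for p in registry
        if p.vcpus >= min_vcpus
        and p.latency_ms <= max_latency_ms
        and (region is None or p.region == region)
    ]
    return min(candidates, key=lambda p: p.price_per_hour, default=None)

# A toy registry of micro data centres (invented).
registry = [
    Provider("micro-dc-blr", 64, 8.0, "south", 1.20),
    Provider("micro-dc-hyd", 32, 15.0, "south", 0.90),
    Provider("micro-dc-del", 128, 40.0, "north", 1.00),
]

best = select_provider(registry, min_vcpus=32, max_latency_ms=20, region="south")
print(best.name if best else "no match")  # micro-dc-hyd (cheapest qualifying)
```

The OCC's actual discovery and selection mechanism may look nothing like this; the point is only that interoperable, standardised metadata about disparate providers is what makes such marketplace-style matchmaking possible at all.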
While we are yet to learn the technical, administrative, and implementation details of the OCC, we can hazard a guess at some of the ways in which this project can effectuate or affect the desired outcomes we envision for the compute segment.
First, the OCC and its concomitant DPI approach align broadly with the objective of promoting fair competition and reducing entry barriers. The presence of a shared, decentralised platform means that the dominance of a few large cloud service providers can be checked by a “network of smaller, interoperable micro-players that collectively behave like a mega player.” Alongside the reduction of entry barriers, this may also mitigate the risk of vendor lock-in.
Second, it is arguably a more cost-effective way to develop India’s domestic computing capabilities by integrating local providers into the global compute network. While this cannot completely supplant the need for some sovereign computing resources essential for military and governance needs, it has the potential to reduce a near-complete dependency on foreign-based cloud service providers.
Third, the OCC’s emphasis on “open” standards and interoperability may strengthen data localisation and data protection norms, reducing the risk of foreign interference or cyberattacks. Decentralised networks are generally more resilient to security and data breaches, further strengthening a push towards strategic autonomy.
Aside from this, a couple of challenges also come to mind, although these are just guesstimates at this point.
In a bid to counter market concentration risks in the compute segment of the AI value chain, the OCC runs the risk of being administered the way the NPCI manages the UPI. The NPCI has a monopoly over the UPI architecture and often intervenes in the ecosystem to “prevent the dominance and mitigate the systemic risk of failure of any single player”, for instance by dictating limits on the number of transactions third-party apps can process. A monopoly over a decentralised network means that well-intentioned attempts at efficient resource allocation and management can produce misguided policies like these, distorting market incentives. Even so, if the OCC comes to be dominated by a few large cloud players (in a segment where economies of scale and existing customer bases matter a great deal), market concentration risks may resurface, much as large providers like Google Pay and PhonePe dominate the UPI platform.
While still in its early stages, the OCC is apparently designed for global use and will not be limited to Indian customers. Questions abound: how will the project be fiscally sustained, who will maintain the infrastructure, and can incumbents like AWS and Google also plug into this grid and offer their existing solutions and pricing? These were not answered at the Adbhut India event, but we will be keeping an eye on future developments.
What We're Reading (or Listening to)
[Opinion] Agnikul’s first test flight: Time for private sector to be wings of Indian space industry, by Ashwin Prasad
[Opinion] Dr. Jagdish Chandra Asthana Could Have Been a Good Policy Analyst, by Pranay Kotasthane and Madhav Kanchiraju
[Article] Chinese AI chip firms downgrading designs to secure TSMC production (Business Standard)
[Opinion] China’s Energy Intensity and Carbon Intensity Targets Are All But Unachievable, by Rakshith Shetty