The AI Energy Crisis: Can Photonics and Quantum Computing Prevent a Technological Collision?
Tanvir Khan’s stark warning highlights that the current trajectory of AI expansion is colliding with physical and environmental limits. Today’s energy-intensive AI models demand unsustainable amounts of electricity and water for data centers, threatening to “burn the planet” if scaled according to current projections.
This crisis is forcing a fundamental technological pivot, with innovations like NTT’s photonic-based IOWN technology—aiming for a 100-fold reduction in power consumption—and India’s investments in quantum computing emerging as critical pathways to break the dependence on conventional electronics. The solution requires a dual approach of improving existing efficiency while accelerating collaborative global research into these paradigm-shifting alternatives to ensure AI’s growth does not come at the expense of planetary sustainability.

“If we build all that compute over the next 20-30 years, we will burn the planet down.” — Tanvir Khan, Executive Vice President of NTT Data North America
The statement by Tanvir Khan, Executive Vice President of NTT Data North America, cuts through the optimistic buzz surrounding artificial intelligence with a sobering dose of reality. As companies race to deploy and scale AI, Khan warns that current projections are “not possible by the laws of physics” and would be environmentally catastrophic. This stark warning highlights a fundamental collision course between unfettered AI expansion and planetary sustainability—one that demands immediate attention and innovative solutions.
The Staggering Environmental Cost of AI Ambition
Artificial intelligence systems, particularly the generative models that have captured global attention, are extraordinarily resource-intensive. They consume vast amounts of electricity and require significant water resources for cooling the powerful computing infrastructure they run on. As data centers multiply to meet demand, their environmental footprint expands dramatically.
In India, where Mumbai is emerging as a significant data center hub, the water consumption for data centers is projected to surge from 150 billion liters to a staggering 358 billion liters by 2030. This alarming increase underscores how the AI revolution, if pursued with current technologies, could place unsustainable pressure on essential resources.
The fundamental issue lies in the exponential growth trajectory of AI. As models become more sophisticated and their applications more widespread, the computational requirements grow accordingly. This creates a paradox: the very technology hailed as a solution to complex problems may itself become an environmental crisis if its foundational infrastructure remains unchanged.
Industry Realities: A Disconnect Between Aspiration and Implementation
Despite the breakneck pace of AI development, most organizations remain in the early stages of implementation. According to McKinsey’s 2025 Global Survey on the state of AI, nearly two-thirds of organizations have not yet begun scaling AI across the enterprise. This gap between aspiration and implementation may be providing a crucial grace period during which more sustainable approaches can be developed and adopted.
The survey reveals telling patterns in how companies are approaching AI:
Table: Current State of AI Implementation Across Organizations
| Implementation Phase | Percentage of Organizations | Key Characteristics |
| --- | --- | --- |
| Experimenting/Piloting | Approximately 65% | Testing AI in limited contexts, not yet enterprise-wide |
| Beginning to Scale | Approximately 33% | Expanding deployment across some business functions |
| AI High Performers | Approximately 6% | Achieving significant EBIT impact (5%+) from AI use |
Interestingly, organizations seeing the most value from AI are those with ambitious transformation goals. High performers are three times more likely than others to say their organization intends to use AI to bring about transformative change to their businesses. Yet even these forward-thinking companies face the same physical constraints that Khan identifies—constraints that may ultimately limit how far and how fast AI can progress with current technology.
The Physical Limits: When Exponential Growth Meets Finite Resources
The challenge Khan describes is rooted in fundamental physics. Conventional computing technologies based on electronics are approaching physical limits in terms of power efficiency, heat dissipation, and transmission speeds. As computational demands increase exponentially, these limitations become increasingly problematic.
Research organizations like Epoch AI track scaling trends and expected limits, while discussions in technical communities acknowledge that “physics dictates the speed of light and the quantum and heat dissipation limits for transistors”. These are not engineering challenges that can be solved with incremental improvements—they represent fundamental barriers that require paradigm-shifting approaches.
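A concrete example of such a barrier is Landauer’s principle: erasing one bit of information dissipates at least kT·ln 2 of heat, no matter how clever the engineering. The minimal calculation below illustrates that floor at room temperature; it is a physics illustration added here for context, not a figure from the article’s sources.
```python
import math

# Landauer's principle: the minimum energy to erase one bit is k*T*ln(2),
# a thermodynamic floor that no transistor design can engineer away.
BOLTZMANN_J_PER_K = 1.380649e-23   # Boltzmann constant, J/K
T_KELVIN = 300                     # roughly room temperature

landauer_j = BOLTZMANN_J_PER_K * T_KELVIN * math.log(2)
print(f"Landauer limit at {T_KELVIN} K: ~{landauer_j:.2e} J per bit erased")
# Landauer limit at 300 K: ~2.87e-21 J per bit erased
```
Today’s hardware dissipates many orders of magnitude more than this floor per operation, so there is still headroom for efficiency gains, but the floor itself is immovable.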
The semiconductor industry has already encountered similar constraints. As discussions of scaling laws note, when Moore’s law “broke” in the 2010s, transistor counts stopped scaling as predicted, yet related metrics, such as energy use per computation and cost per transistor, continued to improve. This historical precedent suggests that when one metric of progress hits fundamental limits, the focus must shift to alternative metrics and approaches.
For AI, this means that simply building more data centers with current technology may not be a viable path forward. The energy requirements alone would be staggering. Stanford’s AI Index Report 2025 notes that while AI has become more efficient—with inference costs dropping over 280-fold for systems performing at GPT-3.5 level between 2022 and 2024—training compute continues to double approximately every five months. This relentless growth in computational demand collides with the finite capacity of our energy infrastructure and cooling resources.
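To see how quickly a five-month doubling compounds, consider the minimal sketch below; it uses only the doubling period cited above, and the time horizons are illustrative.
```python
# Compounding of the training-compute trend cited above (Stanford AI Index
# 2025: doubling roughly every five months). Horizons are illustrative.
DOUBLING_PERIOD_MONTHS = 5

def compute_growth(years: float) -> float:
    """Multiplicative growth in training compute after `years`."""
    return 2 ** (years * 12 / DOUBLING_PERIOD_MONTHS)

for years in (1, 2, 5, 10):
    print(f"{years:>2} years -> ~{compute_growth(years):,.0f}x training compute")
# Output:
#  1 years -> ~5x training compute
#  2 years -> ~28x training compute
#  5 years -> ~4,096x training compute
# 10 years -> ~16,777,216x training compute
```
A decade of five-month doublings implies a roughly sixteen-million-fold increase in training compute, the kind of exponential that collides head-on with finite energy and cooling capacity.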
IOWN: A Photonic Pathway to Sustainable AI
In response to these challenges, NTT is developing what Khan describes as “a fundamentally different know-how to scale” AI. Their Innovative Optical and Wireless Network (IOWN) initiative represents a paradigm shift from electronics to photonics—using light rather than electricity to transmit and process information.
Table: IOWN’s Performance Targets Compared to Current Technology
| Performance Metric | IOWN Target Improvement | Potential Impact |
| --- | --- | --- |
| Power Consumption | 1/100 of current levels by 2032 | Dramatically reduced energy demands for data centers |
| Transmission Capacity | 125-fold increase | Ability to handle massive AI datasets efficiently |
| End-to-End Latency | 1/200 of current levels | Enables real-time applications previously impossible |
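To give the power target a rough sense of scale, the sketch below applies the 1/100 factor to a hypothetical facility; the 100 MW baseline is an assumed, illustrative figure, not an NTT number.
```python
# Illustrative only: applies IOWN's 1/100 power-consumption target to an
# assumed 100 MW facility. The baseline is a placeholder, not an NTT figure.
BASELINE_POWER_MW = 100          # assumed facility power draw
IOWN_POWER_FACTOR = 1 / 100      # IOWN target: 1/100 of current levels
HOURS_PER_YEAR = 24 * 365

target_power_mw = BASELINE_POWER_MW * IOWN_POWER_FACTOR
saved_gwh = (BASELINE_POWER_MW - target_power_mw) * HOURS_PER_YEAR / 1000
print(f"Target draw: {target_power_mw:.0f} MW "
      f"(~{saved_gwh:,.0f} GWh/year avoided at the assumed baseline)")
# Target draw: 1 MW (~867 GWh/year avoided at the assumed baseline)
```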
IOWN is built on three core technological pillars:
- All-Photonics Network (APN): Applies optical technology across the entire network infrastructure, from end devices to core systems. Commercially available in Japan since March 2023, APN reduces end-to-end latency to 1/200 of that of conventional networks while eliminating the latency fluctuations that disrupt real-time applications.
- Photonic-Electronic Convergence (PEC): Shifts functions traditionally handled by electricity to photonics. This technology is being developed in four generations, progressing from optical interconnects between data centers (already commercialized) to optical wiring within individual chips.
- Data-Centric Infrastructure (DCI): A next-generation data processing platform that disaggregates computing resources and activates only the components a given workload needs. By extending the reach of resource interconnection through optical wiring, DCI enables optimization across large-scale systems; a toy sketch of the idea follows this list.
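To make the DCI idea tangible, here is a toy sketch of disaggregated resource pooling: a workload activates only the devices it needs from shared pools instead of claiming a whole server. All names here (ResourcePool, allocate) are hypothetical illustrations, not NTT’s actual DCI interface.
```python
from dataclasses import dataclass

@dataclass
class ResourcePool:
    """A pool of identical devices (e.g., GPUs) reachable over optical links."""
    kind: str
    free: int

    def allocate(self, count: int) -> int:
        """Activate `count` devices if available; the rest stay powered down."""
        if count > self.free:
            raise RuntimeError(f"not enough free {self.kind} devices")
        self.free -= count
        return count

gpus = ResourcePool(kind="gpu", free=512)
nics = ResourcePool(kind="nic", free=128)

# An inference job activates exactly what it needs, nothing more.
job = {"gpu": gpus.allocate(4), "nic": nics.allocate(1)}
print(job, "| remaining gpus:", gpus.free)
# {'gpu': 4, 'nic': 1} | remaining gpus: 508
```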
The breakthrough that made IOWN possible came in 2019 when NTT developed the world’s first ultra-low-power optical modulator and transistor using photonic crystal structures, achieving a 94% reduction in power consumption. This technological leap shifted optical computing from theoretical concept to practical possibility.
The Quantum Alternative: India’s Parallel Approach
While NTT pursues photonic solutions, India is placing significant bets on quantum computing as another pathway to overcoming conventional computing limits. The country is establishing itself as a hub for quantum technology development, with ambitious initiatives like the Amaravati Quantum Valley (AQV) in Andhra Pradesh.
This emerging hub has already secured South Asia’s first quantum computer—a 133-qubit system slated to be India’s most advanced. The initiative aims to create a complete quantum ecosystem encompassing hardware manufacturing, software development, talent nurturing, and research excellence, with the goal of positioning Amaravati among the world’s top five quantum hubs by 2030.
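For a sense of why qubit counts matter, note that an n-qubit register is described by 2^n complex amplitudes, so even storing the state of a 133-qubit system classically is hopeless. The arithmetic below illustrates that scaling; it is not a claim about the Amaravati machine’s workloads, and the 16 bytes per amplitude assumes two 64-bit floats.
```python
# Illustrative arithmetic: classical memory needed to store an n-qubit state.
# Assumes 16 bytes per complex amplitude (two 64-bit floats).
n_qubits = 133
amplitudes = 2 ** n_qubits
bytes_needed = amplitudes * 16

print(f"2^{n_qubits} = {amplitudes:.3e} amplitudes")
print(f"~{bytes_needed / 1e24:.1e} yottabytes to store classically")
# 2^133 = 1.089e+40 amplitudes
# ~1.7e+17 yottabytes to store classically
```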
NITI Aayog, the Indian government’s policy think tank, recognizes that “quantum technologies still in their formative stages globally present India with a rare opportunity to shape the trajectory of a foundational technology”. Unlike previous technological waves where India played catch-up, the country now has the chance to establish leadership from the outset.
Telangana has become the first Indian state to launch a dedicated quantum strategy, with Deputy Chief Minister Mallu Bhatti Vikramarka stating that “Hyderabad will emerge as future leader in quantum economy”. These parallel initiatives in photonics and quantum computing reflect a growing recognition that overcoming AI’s environmental constraints will require fundamentally new approaches to computation.
Practical Applications and Industry Collaboration
The transition to sustainable AI infrastructure isn’t merely theoretical—it’s already underway in practical applications. IOWN technologies are being tested in scenarios ranging from long-distance data center interconnects to remote operation of industrial robots.
NTT DATA has conducted demonstration experiments connecting data centers via APN in countries including the UK and the US, achieving latency of less than one millisecond between sites approximately 100 kilometers apart. This capability enables the creation of distributed data center networks that can operate as single virtual data centers while optimizing for energy efficiency and resource utilization.
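That sub-millisecond figure is consistent with basic fiber physics: light in silica fiber travels at roughly c/1.5, about 200,000 km/s, so 100 km of fiber contributes about 0.5 ms of one-way propagation delay, leaving only a slim budget for switching overhead. A quick sanity check:
```python
# Sanity check on the sub-millisecond claim: propagation delay in fiber.
C_VACUUM_KM_S = 300_000   # speed of light in vacuum, km/s
FIBER_INDEX = 1.5         # typical refractive index of silica fiber

fiber_speed_km_s = C_VACUUM_KM_S / FIBER_INDEX   # ~200,000 km/s
distance_km = 100

delay_ms = distance_km / fiber_speed_km_s * 1000
print(f"One-way propagation over {distance_km} km: ~{delay_ms:.2f} ms")
# One-way propagation over 100 km: ~0.50 ms
```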
In manufacturing, NTT DATA is testing remotely operated robots for smart factory maintenance, supported by real-time video transmission and AI analysis over IOWN’s APN. These applications demonstrate how sustainable computing infrastructure can enable new capabilities rather than merely reducing the environmental impact of existing ones.
Recognizing that no single company can drive such a fundamental shift alone, NTT established the IOWN Global Forum in January 2020 in collaboration with Intel and Sony. This consortium has grown to include over 160 organizations, including technology leaders like Microsoft, NVIDIA, Cisco, and Nokia. The forum works to promote cross-industry cooperation, standardization, and knowledge sharing—essential elements for achieving the scale needed to make photonic computing a viable alternative to conventional approaches.
The Path Forward: Balancing Innovation and Sustainability
As AI continues its rapid expansion—with 88% of organizations now reporting regular AI use in at least one business function according to McKinsey—the need for sustainable approaches becomes increasingly urgent. The collision course that Khan describes represents one of the most significant challenges in the history of information technology.
The solutions emerging fall into two broad categories:
- Evolutionary improvements to existing technologies, reflected in the steady efficiency gains documented in Stanford’s AI Index Report
- Revolutionary approaches like photonics and quantum computing that offer fundamentally different paradigms
The most likely path forward involves both trajectories operating in parallel. While photonic networks and quantum processors develop toward commercial viability, traditional computing will continue to see incremental efficiency improvements. This dual-track approach acknowledges that the transition to sustainable computing infrastructure will be gradual rather than sudden.
For business leaders and policymakers, several actionable insights emerge:
- Invest in both efficiency and innovation: Support incremental improvements to existing infrastructure while simultaneously funding research into paradigm-shifting alternatives.
- Develop holistic metrics: Move beyond traditional performance benchmarks to include energy efficiency, water usage, and total environmental impact in technology evaluation frameworks (a worked example follows this list).
- Foster cross-industry collaboration: Follow the IOWN Global Forum model of bringing together diverse stakeholders to address challenges that transcend individual organizations.
- Align technological development with sustainability goals: Ensure that AI roadmaps explicitly address resource constraints and environmental impact from the outset.
- Support workforce development: Build educational programs that prepare talent for both current AI implementations and emerging sustainable computing paradigms.
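On the holistic-metrics point above, two widely used data-center measures are power usage effectiveness (PUE, total facility energy divided by IT equipment energy) and water usage effectiveness (WUE, liters of water per kWh of IT energy), both defined by The Green Grid. The sketch below computes both from assumed, illustrative inputs.
```python
# Holistic data-center metrics: PUE and WUE (The Green Grid definitions).
# All input figures are assumed for illustration, not measurements.
facility_energy_kwh = 1_500_000   # total annual facility energy (assumed)
it_energy_kwh = 1_000_000         # annual IT equipment energy (assumed)
water_liters = 1_800_000          # annual water consumption (assumed)

pue = facility_energy_kwh / it_energy_kwh   # 1.0 is the theoretical ideal
wue = water_liters / it_energy_kwh          # liters per IT kWh

print(f"PUE: {pue:.2f} (lower is better, 1.0 is ideal)")
print(f"WUE: {wue:.2f} L/kWh")
# PUE: 1.50 (lower is better, 1.0 is ideal)
# WUE: 1.80 L/kWh
```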
The warning from NTT’s Tanvir Khan serves as a crucial reality check at a pivotal moment in technological history. As we stand at the intersection of unprecedented AI capabilities and growing environmental awareness, the decisions made today will determine whether artificial intelligence becomes a force that elevates human potential without compromising planetary health. The race is not merely to build more powerful AI systems, but to build AI systems that power a sustainable future.