IBM chief executive officer Arvind Krishna has warned that the current pace of spending on AI and large-scale data centers cannot be sustained over the long term, citing rising infrastructure and energy costs that are straining the economics of the industry. Krishna said the investments hyperscalers are making to support artificial-general-intelligence-level computing and large AI workloads are reaching levels that may not yield profitable returns. His comments come as global cloud providers announce ambitious expansion plans to meet growing demand for AI training and inference capacity.
Krishna said that the industry’s rapid shift toward gigawatt-scale data centers, which require massive capital expenditure and operating budgets, is raising questions about long-term financial viability. He noted that AI training and model development consume enormous amounts of power, making energy availability and cost a critical constraint. According to him, even the largest technology companies may find it difficult to generate profits from such large infrastructure commitments unless there are major breakthroughs in energy efficiency and semiconductor technology.
Analysts have observed a significant increase in capital allocation toward AI compute infrastructure over the last two years. Major players in the cloud industry have announced multibillion-dollar investments in new data center regions, chip development, and AI hardware procurement. Krishna suggested that the economic model behind these investments does not currently balance the revenue expected from AI services against the rising cost of building and maintaining the necessary infrastructure. He added that without major improvements in system efficiency, the cost of AI compute will continue to exceed what enterprises are willing or able to pay.
The IBM CEO also raised concerns about the sustainability of trillion-dollar projections for the AI infrastructure market. He said that the industry is operating on assumptions that may not hold if energy costs rise or hardware supply constraints intensify. Many cloud providers are exploring advanced cooling systems, renewable energy partnerships, and custom silicon designs to manage these challenges, but Krishna argued that the gap between investment and returns remains large.
Krishna pointed out that hyperscalers such as Amazon, Google, and Microsoft, along with other major cloud companies, are already facing pressure to justify continued spending on AI data centers. He said that the profitability of gigawatt-scale facilities is particularly difficult to achieve because of their enormous power requirements and the need to procure specialised hardware in high volumes. While demand for AI services is growing, Krishna said that monetisation models are still evolving and may not be able to offset infrastructure costs in the near term.
Experts in the data center industry have highlighted similar concerns. They note that the acceleration of AI adoption has pushed operators toward infrastructure that is expensive to build and maintain. In several regions, power availability has become a limiting factor for data center expansion. Governments and energy regulators are increasingly examining the environmental impact of large data facilities, adding another layer of complexity for companies planning rapid growth.
Krishna said that the industry needs to invest in innovation to improve energy efficiency and reduce the operational burden of AI infrastructure. He pointed to advances in chip design, cooling technologies, and software optimisation as necessary steps to make AI deployment sustainable. IBM has been expanding its own AI and hybrid cloud offerings while emphasising efficiency and responsible use of compute resources.
The CEO also commented on the growing debate around artificial general intelligence and the expectations surrounding its capabilities. He said that much of the current investment is driven by speculation about AGI, yet the amount of compute needed to achieve such systems could far exceed what is financially practical. Krishna argued that companies should focus on commercially valuable AI applications rather than pursuing models that require exponentially increasing computational resources.
Industry observers said Krishna’s remarks reflect rising caution among technology leaders who are increasingly aware of the economic and environmental costs associated with AI growth. While demand for generative AI continues to surge, companies are evaluating whether the infrastructure required to support it can deliver sustainable returns. Some analysts predict that the market will eventually shift toward more efficient models, including small and specialised AI systems that use fewer resources and operate at lower cost.
The concerns around data center economics also come at a time when enterprises are scrutinising cloud expenses. Many organisations have reported rising costs linked to AI workloads, prompting them to optimise usage or explore hybrid and on-premises alternatives. Krishna said that the industry must prioritise cost-effective AI development to ensure long-term adoption and profitability.
As AI continues to advance, the balance between innovation and sustainability is emerging as a critical issue for technology companies. Krishna’s comments highlight the need for breakthroughs in infrastructure efficiency and the importance of aligning investment with realistic economic returns. The discussion is expected to influence how cloud providers and AI companies plan their long term strategies as they navigate the complexities of scaling AI infrastructure.