Akamai Plans AI Inference Cloud in India

Akamai Technologies, one of the world’s leading content delivery and cloud infrastructure companies, is expanding its footprint in India with plans to establish an AI inference cloud designed to run artificial intelligence workloads closer to end users. The initiative underscores Akamai’s strategic focus on India as a fast-growing market for cloud, edge computing, and AI adoption.

Speaking about the company’s vision, Akamai CEO Tom Leighton said that India is emerging as a critical hub for AI innovation and infrastructure growth. The upcoming AI inference cloud will help enterprises deploy and scale AI models efficiently by leveraging Akamai’s extensive edge network. The move aligns with Akamai’s strategy of combining distributed cloud capabilities with AI workloads, enabling faster, localized, and cost-effective data processing.

According to Leighton, Akamai is prioritizing medium-sized AI models rather than large, resource-intensive ones. He explained that while massive models like GPT-4 dominate headlines, medium-sized models are better suited to practical enterprise applications where efficiency, customization, and speed are crucial. These models can run inference quickly without the scale of computing power the largest models demand.

The company believes that running these AI models on its distributed edge network will help businesses achieve lower latency and improved data privacy. Akamai’s global edge infrastructure, which already spans more than 4,000 locations, provides a unique foundation for building decentralized AI capabilities. This setup allows data and applications to stay closer to users, reducing costs and energy consumption while improving performance.

Akamai’s new AI inference cloud in India is expected to cater to multiple sectors, including e-commerce, finance, manufacturing, media, and logistics. These industries are increasingly looking to integrate AI into their operations for tasks such as real-time personalization, fraud detection, predictive maintenance, and automated decision-making. By hosting AI inference at the edge, companies can ensure faster responses and greater reliability for mission-critical applications.

Leighton highlighted that India’s expanding digital economy, supported by strong developer communities and growing AI adoption, makes it an ideal environment for Akamai’s next phase of innovation. He pointed out that Indian enterprises are rapidly embracing AI to enhance customer experience, streamline operations, and unlock new business opportunities. The company’s upcoming investments aim to provide local businesses with the infrastructure needed to accelerate these transformations.

Akamai’s focus on India comes at a time when the country is witnessing increased demand for localized data processing and storage. With the rise of AI regulation and the need for data sovereignty, organizations are shifting toward in-country infrastructure solutions. Akamai’s decentralized approach fits well into this framework, offering compliance-ready solutions that balance global scalability with regional autonomy.

The CEO also emphasized the importance of algorithmic innovation in optimizing AI performance. He noted that Akamai’s research teams are working on new algorithms to make model inference more energy-efficient and computationally economical. These advancements are expected to reduce costs for customers while maintaining high accuracy and reliability in AI-driven processes.

The AI inference cloud initiative is part of Akamai’s broader connected cloud vision, announced earlier this year. The company has been expanding its capabilities beyond content delivery networks (CDNs) into a full-scale distributed cloud platform. This includes integrating storage, compute, and security services into a unified infrastructure that supports edge-to-cloud workloads. The Indian expansion marks a key milestone in executing this global strategy.

Industry analysts view Akamai’s entry into AI infrastructure as a logical evolution for the company, which already manages one of the most distributed networks in the world. By integrating AI processing capabilities into its edge servers, Akamai can offer a differentiated value proposition for businesses that require fast, reliable, and secure AI execution.

The company’s decision to focus on medium-sized AI models also reflects a pragmatic understanding of enterprise needs. While large models dominate research and consumer-facing applications, most businesses require smaller, domain-specific AI systems that can operate efficiently within existing hardware constraints. Akamai’s infrastructure is optimized for such models, providing scalability without the massive carbon footprint associated with hyperscale AI data centers.

Leighton reaffirmed that Akamai is committed to investing in local talent and partnerships to support the rollout of its AI services in India. The company plans to collaborate with Indian enterprises, research institutions, and cloud-native startups to drive adoption and co-create industry-specific AI solutions. This collaborative model aims to nurture an ecosystem where innovation and infrastructure grow hand in hand.

Akamai’s leadership believes that the future of cloud computing will be decentralized, with workloads distributed across multiple environments to balance efficiency, security, and cost. By embedding AI at the edge, the company hopes to redefine how organizations handle real-time data and decision-making. The planned inference cloud will enable Indian enterprises to execute complex AI operations close to users, reducing dependency on centralized hyperscale clouds.

As global competition intensifies in the AI infrastructure space, Akamai’s distributed model could serve as a sustainable alternative to traditional cloud architectures. By focusing on medium-sized models and real-time inference, the company is carving out a niche that aligns with the operational realities of most enterprises.

Leighton expressed confidence that India will play a pivotal role in Akamai’s global AI roadmap. With its rapidly evolving digital landscape, skilled workforce, and government initiatives promoting AI adoption, the country offers both scale and innovation potential. Akamai’s upcoming AI inference cloud is expected to be operational within the next year, with additional expansions planned across Asia-Pacific and Europe.

The company’s approach reflects a broader industry shift toward intelligent edge computing, where data processing and AI execution occur closer to the source of data generation. For Akamai, this evolution represents the convergence of two decades of expertise in content delivery with the emerging demands of the AI era. By combining network reach, algorithmic efficiency, and localized infrastructure, Akamai aims to empower enterprises to harness the full potential of artificial intelligence in a distributed world.