Perplexity CEO Warns On-Device AI Could Upend the Data Centre Boom
As the world’s biggest technology companies pour billions into building massive AI data centres — with some even exploring space-based facilities — Perplexity AI CEO Aravind Srinivas believes the industry may be heading toward an unexpected disruption. According to Srinivas, the future of artificial intelligence could shift away from energy-intensive server farms and move directly into personal devices.
Speaking on a podcast with Prakhar Gupta on YouTube, Srinivas offered a contrasting perspective to the infrastructure-heavy strategies being pursued by companies led by Elon Musk, Sundar Pichai, and others. While these firms are betting big on centralised computing power, Srinivas argued that advances in on-device AI pose the greatest long-term risk to large data centres.
He explained that if sophisticated AI models can be compressed enough to run efficiently on chips inside smartphones, laptops, or other personal devices, reliance on cloud-based inference could drop significantly. “The biggest threat to a data centre is if the intelligence can be packed locally on a chip that's running on the device and then there's no need to inference all of it on like one centralized data center,” Srinivas said.
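To make the idea concrete, the sketch below shows what on-device inference looks like in practice today, using the open-source llama-cpp-python bindings to run a quantised model entirely on local hardware. The model file name and prompt are placeholders chosen for illustration, and the example is not tied to Perplexity or any specific product.

```python
# Minimal sketch: running a quantised LLM entirely on a local device,
# with no round trip to a cloud data centre. Assumes the open-source
# llama-cpp-python package is installed and a GGUF model file has
# already been downloaded; the path below is a placeholder.
from llama_cpp import Llama

# Load the compressed model from local storage.
# n_ctx sets the context window; n_threads keeps the work on the local CPU.
llm = Llama(model_path="models/example-7b-q4.gguf", n_ctx=2048, n_threads=4)

# Inference happens on-device: the prompt never leaves the machine,
# which is the privacy and latency advantage Srinivas points to.
response = llm(
    "Summarise the benefits of on-device AI in one sentence.",
    max_tokens=64,
)

print(response["choices"][0]["text"])
```

Today this works well for small, heavily quantised models; Srinivas' point is about what happens if models powerful enough for complex tasks can be squeezed into the same footprint.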
At present, popular AI platforms such as ChatGPT, Gemini, and Perplexity depend heavily on vast data centres to process user requests. These facilities consume enormous amounts of electricity, require advanced cooling systems, and often rely on large quantities of water to prevent overheating. Srinivas questioned whether this model is sustainable in the long term if AI workloads can be handled locally on user devices.
Beyond infrastructure and energy costs, Srinivas highlighted privacy as a key advantage of on-device AI. When processing happens locally, sensitive user information does not need to be transmitted to remote servers, reducing exposure to data misuse and breaches. He also pointed out that local AI systems can respond faster, avoiding latency caused by communication with distant data centres.
Srinivas described this vision as a deeply personal form of artificial intelligence — one that belongs entirely to the user. “That way you don't have to repeat it. That's your intelligence. You own it. It's your brain,” he said. In his view, such a shift could fundamentally alter the economics of AI, raising serious questions about whether it makes sense to invest hundreds of billions — or even trillions — of dollars into centralised data centre infrastructure.
However, he acknowledged that this future is not yet a reality. Srinivas noted that no company has so far released an on-device AI model that is both powerful and efficient enough to handle complex tasks reliably without cloud support. “That's not yet happened,” he said, adding that a true breakthrough in this area could reshape the entire AI ecosystem.
For now, data centres remain at the heart of AI development. But Srinivas’ comments suggest that the industry’s long-term trajectory may depend less on where data centres are built — even in space — and more on how much intelligence can be brought directly into users’ hands.