OpenAI is giving the UK mixed signals.
The ink had barely dried on its decision to pause the Stargate UK data centre project with Nvidia and Nscale when it announced its first permanent UK office.
While blaming high energy costs and unclear regulations for the former, OpenAI signed a lease on 88,500 square feet in King’s Cross, London, with room for more than 500 employees, which would more than double its UK staff.
Both moves are part of the same strategy, founders tell Tech Funding News. OpenAI has chosen to bolster research in the UK but run its data centres elsewhere.
Britain works well for attracting talent, but not for building compute infrastructure. Until its energy costs fall, the UK will keep gaining new jobs while missing out on big tech builds.
Why Stargate stalled
UK industrial electricity prices are soaring. They are four times higher than in the US and almost double those in France, according to research from the Institute of Economic Affairs. For a data centre running thousands of GPUs nonstop, such costs can make a project unviable, no matter the ambition or political backing.
Issues accessing the grid are another spanner in the works. Building a data centre takes 18 to 24 months, but connecting it to the UK power grid can take three to eight years. The government’s AI Growth Zones policy, launched in November 2025, failed to address many issues faced by data centre developers.
The UK did, however, replace its first-come, first-served system for grid connections with a first-ready, first-connected policy, allowing it to prioritise strategic projects that are ready to proceed. But the change clearly came too late to help Stargate UK.
OpenAI told CNBC the project would restart “when the right conditions, such as regulation and the cost of energy, enable long-term infrastructure investment.” No timeline was given.
For Ben Peters, CEO and co-founder of industrial efficiency company Cogna, the pause was unsurprising. “If a project of this scale cannot stack up in the UK, that can very simply tell you that the underlying costs and constraints are still too high,” he tells Tech Funding News.
“Nothing material has changed, and that is exactly the problem. The UK keeps asking businesses to invest in a system that still makes power expensive and certainty hard to come by,” Peters says.
The UK’s energy capacity has long been deemed insufficient. “I remember a professor telling me 25 years ago that we were already two decades too late building new nuclear plants. The consequences compound every day through higher costs and weaker investment incentives,” Peters adds.
The London bet: Talent over compute
The King’s Cross office is right in the heart of what Chatham House calls London’s main tech hub. This area is also home to Google DeepMind, Meta, the Alan Turing Institute, and the Francis Crick Institute. London is ranked second in the world for tech talent, just behind San Francisco.
OpenAI now has about 200 employees in London working in research, engineering, policy, and business. The new office aims to make London its biggest research centre outside San Francisco, according to Reuters.
“DeepMind made this move a decade ago, and what followed was world-class research, serious commercial outcomes, and a blueprint others are now following. London, Paris, and Berlin are where serious AI work is happening now,” says Hakob Astabatsyan, CEO of Synthflow AI, a platform that builds AI-powered voice agents for businesses.
OpenAI is not the only company taking this approach. Anthropic, a competitor, also announced plans to expand to London and has secured space for 800 employees.
“The concentration of expertise in London, Oxford, and Cambridge creates a unique feedback loop where proximity to world-leading AI labs directly benefits those of us building at the application layer,” notes Ilia Drozdov, CEO of AI lettings platform Dwelly.
“London’s economy runs on the kind of repetitive, high-stakes knowledge work that AI was built to automate,” he adds.
Where AI is built vs where it runs
The Stargate UK pause shows a bigger, global shift. As AI models get stronger, the infrastructure needed to run them often does not fit well with the cities where their creators want to live and work.
“AI is becoming infrastructure, and once you hit that point, decisions are driven as much by energy grids, planning laws, and geopolitics as they are by engineering,” Rich Pleeth, CEO of AI logistics platform Finmile, tells TFN.
The result, he says, is a clear split. “Put talent in cities where you can hire fast, and place heavy infrastructure where energy is cheaper, more abundant, and politically supported,” continues Pleeth.
This is exactly what is happening in the United States. Stargate sites are moving ahead, including a large campus in Texas, where companies like Crusoe are building facilities powered by flared natural gas rather than the main grid. Increasingly, new data centres are built wherever power is available.
“High-leverage areas cluster around talent and iteration speed. Infrastructure follows a different logic: compute availability, energy, and local policy create a disconnection that AI companies are now managing as separate organisational strata,” adds Edward Tian, CEO of GPTZero, a platform that detects AI-generated content.
As the UK tries to establish itself as an AI powerhouse and build out sovereign AI capacity, its sky-high energy costs, grid limitations, and shaky regulations prove too great a barrier for companies that do want to break ground and build in the country. Instead, it may well become a place where AI is researched and sold, but not actually run, forcing it to depend on infrastructure elsewhere.
As Peters says: “Policy should fix that, because without it, the country will keep losing investment and the chance to build for the long term.”