What are AI data centres, and why are they suddenly everywhere?
The AI boom is driving a massive data centre construction wave across Britain. Here is what these facilities actually are and why they matter.
AI data centres are large facilities built to house the computers that power every AI service you use today. Each time you ask an AI assistant a question, that request travels to a data centre to be processed. These buildings have become some of the most debated infrastructure in Britain. Billions of pounds of investment have been announced across the country in the last two years.
The reason is straightforward. AI services need enormous amounts of computing power. That power has to live somewhere physical, in a building with electricity and cooling. The bigger the AI system, the more computing capacity it needs to run.
What an AI data centre actually is
A data centre is a building designed to house computers at scale. It needs power, cooling, and fast network links. The computers sit in metal frames called racks. They run non-stop, drawing electricity and generating heat that has to be removed at all times.
In an ordinary corporate data centre, machines handle business tasks, store files, and run company software. An AI data centre is different. Its core hardware is the GPU, or graphics processing unit, a chip originally designed to render video-game graphics.
GPUs turned out to be ideal for the maths behind machine learning, which involves vast numbers of simple calculations performed in parallel. That made them the standard hardware for AI work, and made Nvidia, which makes the most widely used AI chips in the world, central to the whole industry.
AI workloads are different from normal computing tasks. Standard software sits idle much of the time. AI systems at scale are busy non-stop, processing millions of user requests every day of the year.
The chip supply chain matters here. Nvidia’s most advanced chips are in short supply. Many AI projects have been delayed because of this.
Companies that design their own chips can sidestep this bottleneck, which is why Google, Amazon, and Meta have all invested heavily in custom silicon. Their goal is to reduce dependence on a single supplier.
Why demand for AI data centres has grown so fast
Training a large AI model is a demanding process: a single model can need thousands of GPUs running for weeks. The big technology firms are investing accordingly, on a scale that dwarfs their previous spending.
Microsoft alone announced plans to spend around $80 billion on data centres in a single year. That figure reflects how seriously technology firms treat AI capacity. The race to build has no obvious endpoint. Each year the targets grow larger.
The AI tools people use daily are only the visible part. Behind every AI product is a chain of training runs and inference workloads. Inference is the process of running a trained model to answer a query. Every query adds to the load on these facilities.
To understand how many AI tools now draw on this infrastructure, it helps to know how the systems differ. Our guide to multimodal AI and what it means for text, images, and voice explains how AI is expanding beyond simple text tasks. Each new capability adds to the data centre workload. More capability means more computation.
Britain’s position in global AI data centre expansion
Britain is positioning itself as a major hub in this global network. In 2024 and 2025, Microsoft, Google, and specialist operators announced large UK investments. The government published its AI Opportunities Action Plan in early 2025. It promised to expand AI data centre capacity as part of a wider national AI strategy.
The plan identified AI growth zones where planning and grid links would be faster. The government has also classified large AI data centres as nationally significant in certain cases. This means planning decisions for the biggest facilities go to ministers, not local councils.
The reasoning is that Britain needs to keep pace with the United States, Germany, and Ireland. All three are building out AI infrastructure at speed. You can read the full plan on the GOV.UK AI Opportunities Action Plan page.
The electricity and water challenge
The UK grid is already under pressure as homes switch to heat pumps and cars shift to electric power. AI data centres add a large new category of always-on demand. Over a year, a site drawing 100 megawatts around the clock uses roughly as much electricity as a small city. And the facilities being planned now are larger than anything built before.
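To put a 100 megawatt draw in context, here is a minimal back-of-the-envelope sketch. The household figure of 2,700 kWh a year is an assumption of ours (roughly the value UK regulators cite for a typical home), not a number from this article:

```python
# Rough annual energy use of a data centre drawing a constant 100 MW,
# compared with an ASSUMED typical UK household consumption figure.

SITE_POWER_MW = 100
HOURS_PER_YEAR = 24 * 365           # 8,760 hours
HOUSEHOLD_KWH_PER_YEAR = 2_700      # assumed typical UK home

annual_gwh = SITE_POWER_MW * HOURS_PER_YEAR / 1_000
homes_equivalent = SITE_POWER_MW * 1_000 * HOURS_PER_YEAR / HOUSEHOLD_KWH_PER_YEAR

print(f"Annual use: {annual_gwh:.0f} GWh")            # 876 GWh
print(f"Equivalent homes: {homes_equivalent:,.0f}")   # ~324,444
```

On these assumptions, a single 100 MW site consumes as much electricity in a year as several hundred thousand homes, which is why grid connections dominate planning discussions.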
Getting a large facility connected to the grid takes time, because the network often needs upgrading first. The queue of projects waiting for a connection has grown substantially, and some operators have waited several years.
National Grid forecasts that data centre demand could add several gigawatts to UK electricity use by 2030, a substantial increase for a grid already under strain. The upgrades needed to support it are one of the less-discussed parts of the AI boom.
Cooling is the other main energy cost. Large sites use either air-based or water-based cooling. Air cooling uses more electricity. Water cooling uses less power but draws on local water supplies.
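The industry often summarises cooling overhead with a metric called PUE, or power usage effectiveness: total facility power divided by the power that actually reaches the computers. The sketch below uses illustrative PUE values and an assumed IT load, not figures from any real site:

```python
# PUE (power usage effectiveness) = total facility power / IT power.
# A PUE of 1.5 means half as much again is spent on cooling and overheads.
# All numbers below are illustrative assumptions.

def total_power_mw(it_load_mw: float, pue: float) -> float:
    """Total facility draw for a given IT load and PUE."""
    return it_load_mw * pue

IT_LOAD_MW = 80  # assumed IT load of a large site

air_cooled = total_power_mw(IT_LOAD_MW, pue=1.5)    # air cooling: more overhead
water_cooled = total_power_mw(IT_LOAD_MW, pue=1.2)  # water cooling: less electricity

print(f"Air-cooled draw:   {air_cooled:.0f} MW")    # 120 MW
print(f"Water-cooled draw: {water_cooled:.0f} MW")  # 96 MW
```

The trade-off the article describes shows up directly: the water-cooled site draws tens of megawatts less from the grid, but shifts part of the burden onto local water supplies.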
In parts of England, recent droughts have raised concerns about water use. Consumption by large data centre sites now appears in planning debates alongside electricity concerns. Many operators have promised to run AI data centres on renewable energy. The sector has a net zero target of 2030.
The planning debate in UK communities
The nationally significant infrastructure label has caused friction in some areas. Hertsmere in Hertfordshire hosts one of the densest data centre clusters in Europe. Residents and local planners have questioned whether the label is being applied too broadly. Similar debates have emerged around West London, the Thames Valley, and parts of Wales.
The argument is largely about process: where facilities are sited and how communities are consulted. Some argue the nationally significant label is used to bypass legitimate local concerns. These are governance questions as much as planning ones.
The debate also touches on what AI systems are for. Our explainer on the difference between chatbots, copilots, and AI agents shows how different AI tools place different demands on the underlying hardware. Each type of AI service has a different footprint. Those footprints are all growing.
What this means going forward
The AI systems being built now are expected to grow substantially over the next few years. Each step up in capability needs a step up in hardware. Facilities that seem large today will look modest by the end of the decade.
AI data centres are not controversial in themselves. AI services need computation. Computation needs hardware. Hardware needs physical sites with power and cooling.
The questions are practical ones: where these facilities go, how they connect to the grid, and what say local communities have. These choices are being made now. The outcomes will affect energy bills, local landscapes, and the shape of Britain’s digital economy for years to come.
These buildings are a permanent part of modern life. The services people rely on daily depend on them completely. The buildings on business parks that most people drive past are now at the centre of one of Britain’s most significant infrastructure debates. How it is resolved will shape the country’s energy and technology landscape for decades.