Organisations running AI workloads – banks training fraud detection models, hospitals testing diagnostic tools, manufacturers using predictive analytics – all face the same problem: hosting these workloads is costly and resource-intensive.
They require dedicated GPUs running non-stop, vast amounts of data moving in and out, and far more power and cooling than a typical IT system.
For businesses scaling up AI, that creates a clear decision point. Should these workloads sit in the public cloud or in a colocation facility? Each offers a credible path, but the trade-offs around cost, performance, and compliance are substantial.
In this blog, the team at Pulsant offers their insights.
Why AI workloads push infrastructure harder
Analysts forecast that by 2030, global data centre demand could reach 171 to 219 gigawatts (GW), against a current demand of around 60 GW, with AI-ready data centres accounting for about 70 per cent of that demand.
Unlike ordinary business software, AI doesn’t tick over in the background. Training a model means weeks of continuous GPU usage, with high-bandwidth connections to feed in data. Even after training is complete, inference still needs reliable performance to deliver results in real time.
That level of intensity has forced organisations to re-examine whether cloud or colocation offers the better foundation.
Cloud: easy to start, harder to sustain
Public cloud has obvious appeal, as it allows teams to provision GPUs fast, scale up or down, and test different model configurations without buying hardware. For pilots or short-term projects, this flexibility is valuable.
The challenges appear when workloads become permanent. Running high-end GPUs for long periods is costly, and moving large volumes of data in and out of the cloud can add unexpected bills through egress charges (fees providers charge for moving data out of their networks). Once public cloud costs go into the stratosphere, financial directors pull the plug.
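To make the egress point concrete, here is a minimal sketch of how a monthly cloud bill for a permanent training workload might be estimated. All figures are hypothetical placeholders for illustration, not quotes from any provider:

```python
# Illustrative only: both rates below are assumed placeholder prices,
# not real quotes from any cloud provider.
GPU_HOURLY_RATE = 3.00      # assumed on-demand cost per high-end GPU hour
EGRESS_RATE_PER_GB = 0.09   # assumed fee per GB moved out of the provider's network

def monthly_cloud_cost(gpus: int, hours: float, egress_gb: float) -> float:
    """Rough monthly bill: continuous GPU usage plus data egress charges."""
    return gpus * hours * GPU_HOURLY_RATE + egress_gb * EGRESS_RATE_PER_GB

# A hypothetical 8-GPU cluster running 24/7 (~730 hours/month)
# and moving 20 TB of data out of the cloud:
cost = monthly_cloud_cost(gpus=8, hours=730, egress_gb=20_000)
print(f"£{cost:,.0f}")  # roughly £19,320 at these assumed rates
```

Even at these modest placeholder rates, the egress line alone adds thousands per month, which is exactly the kind of line item that surprises finance teams.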
In shared cloud environments, demand for high-performance GPUs has at times exceeded available capacity, making it difficult for some organisations to secure resources at short notice. In colocation or privately owned infrastructure, access to specialised hardware is governed by procurement rather than platform availability.
For sectors dealing with sensitive records – such as banks, insurers, or healthcare providers – the cloud also raises questions about data sovereignty and auditability.
Colocation: built for high-performance AI
Colocation allows organisations to house their own GPU infrastructure in facilities designed to handle workloads that ordinary IT environments can’t. These data centres are designed to keep high-performance clusters running around the clock, alongside resilient connectivity into major cloud platforms.
That combination lets businesses run demanding training workloads on stable infrastructure while retaining the option to link into the cloud for additional capacity. Recent survey data shows why AI workloads are moving this way. High-density power and cooling (54%), direct connections to cloud providers (51%), and support for high-performance computing infrastructure (49%) were ranked as the top three capabilities IT leaders look for when hosting AI workloads.
Stephen Spittal, Technology Director at Pulsant, says: “AI puts far more strain on infrastructure than traditional IT. Once you move past the pilot stage, the demand for power, cooling, and connectivity is constant.
“Colocation gives organisations the capacity to run those workloads without interruption in sites specifically designed to be efficient – and the confidence that performance will hold up as projects grow.
“AI is moving to the Edge – inference needs to move closer to consumers. Our latest research indicates 87% of UK businesses plan to migrate partially or fully from public cloud in the next two years.”
Cost, performance, and compliance compared
For businesses running AI projects at scale, the financial model dictates what is practical. Cloud removes the barrier of buying hardware, but once training runs 24/7, usage bills become unpredictable and difficult to defend in budgets. Colocation asks for investment upfront but delivers stable operating costs that boards and finance teams can plan against with confidence.
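The trade-off between upfront investment and unpredictable usage bills can be framed as a simple break-even calculation. The sketch below uses hypothetical figures purely to show the shape of the comparison:

```python
def breakeven_months(capex: float, colo_monthly: float, cloud_monthly: float) -> float:
    """Months after which cumulative colocation spend (upfront hardware
    plus monthly hosting) drops below cumulative cloud spend."""
    if cloud_monthly <= colo_monthly:
        return float("inf")  # cloud never overtakes colocation on cost
    return capex / (cloud_monthly - colo_monthly)

# Hypothetical figures: £200k of owned GPU hardware plus £5k/month
# colocation fees, versus a £20k/month cloud bill for the same workload.
months = breakeven_months(capex=200_000, colo_monthly=5_000, cloud_monthly=20_000)
print(f"{months:.1f} months")  # ~13.3 months at these assumed figures
```

The real numbers will vary widely by workload, but the structure is the same: the longer a workload runs continuously, the more the stable colocation cost model pays back the initial outlay.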
When it comes to performance, cloud capacity is convenient for pilots, but organisations relying on the newest GPUs have found availability uneven. In colocation, the infrastructure is dedicated, so processing power is available when needed and throughput is consistent, making it a safeguard for projects that cannot afford interruption.
Compliance is often the deciding factor, especially in regulated industries. Cloud platforms meet high digital security standards, but data residency and audit requirements remain outside the customer’s direct control. In colocation, location and oversight are set by the business itself, allowing sensitive workloads to run within chosen jurisdictions under strict physical security.
Making the choice count
Colocation is becoming the steady base for many AI projects, with facilities that can deliver dense power, advanced cooling, and direct links into cloud platforms when extra flexibility is needed.
For organisations running intensive AI workloads, the difference often comes down to how well infrastructure decisions are planned and which partner they choose to support them. A trusted colocation partner makes that possible, providing the environment, resilience, and connections needed to keep projects sustainable over the long term.