Global survey explores networking needs for AI era

Data center experts predict at least a 6X increase in DCI bandwidth demand over the next five years, with 43% of new data center facilities expected to be dedicated to AI workloads.

The rapid growth of AI workloads is driving a major transformation in data center network infrastructure, with global data center experts anticipating a significant increase in interconnect bandwidth needs over the next five years, according to a study commissioned by Ciena.

The survey, conducted in partnership with Censuswide, queried more than 1,300 data center decision makers across 13 countries. More than half (53%) of respondents believe AI workloads will place the biggest demand on data center interconnect (DCI) infrastructure over the next 2-3 years, surpassing cloud computing (51%) and big data analytics (44%).

To meet surging AI demands, 43% of new data center facilities are expected to be dedicated to AI workloads. With AI model training and inference requiring unprecedented data movement, data center experts predict a massive leap in bandwidth needs. When asked about the fiber optic capacity required for DCI, 87% of participants believe they will need 800 Gb/s or higher per wavelength.

"AI workloads are reshaping the entire data center landscape, from infrastructure builds to bandwidth demand," said Jürgen Hatheier, Chief Technology Officer, International, Ciena. "Historically, network traffic has grown at a rate of 20-30% per year. AI is set to accelerate this growth significantly, meaning operators are rethinking their architectures and planning for how they can meet this demand sustainably."
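The scale of the shift implied by these figures can be checked with a quick calculation: a 6X increase over five years corresponds to a compound annual growth rate of roughly 43%, well above the historical 20-30% per year cited above. A minimal sketch (illustrative only; the multiplier and horizon are taken from the survey headline, not from a disclosed methodology):

```python
# Implied compound annual growth rate (CAGR) from "at least 6X over 5 years".
multiplier = 6.0   # predicted growth factor in DCI bandwidth demand
years = 5

cagr = multiplier ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 43% per year

# Compare with the historical range Hatheier cites (20-30% per year):
for historical in (0.20, 0.30):
    factor = (1 + historical) ** years
    print(f"{historical:.0%}/year over {years} years -> {factor:.1f}X")
```

At 20-30% per year, traffic would grow only about 2.5X to 3.7X over five years, so a 6X forecast implies demand growth well beyond the historical trend.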

Creating More Sustainable AI-Driven Networks

Survey respondents confirm there is a growing opportunity for pluggable optics to support bandwidth demands and address power and space challenges. According to the survey, 98% of data center experts believe pluggable optics are important for reducing power consumption and the physical footprint of their network infrastructure.

Distributed Computing

The survey found that, as requirements for AI compute continue to increase, the training of Large Language Models (LLMs) will become more distributed across different AI data centers. According to the survey, 81% of respondents believe LLM training will take place across some level of distributed data center facilities, which will require those facilities to be interconnected with DCI solutions. When asked about the key factors shaping where AI inference will be deployed, respondents ranked the following priorities:

· AI resource utilization over time (63%)

· Reducing latency by placing inference compute closer to users at the edge (56%)

· Data sovereignty requirements (54%)

· Offering strategic locations for key customers (54%)

Rather than deploying dark fiber, the majority (67%) of respondents expect to use Managed Optical Fiber Networks (MOFN), which utilize carrier-operated high-capacity networks for long-haul data center connectivity.

"The AI revolution is not just about compute—it’s about connectivity," added Hatheier. "Without the right network foundation, AI’s full potential can’t be realized. Operators must ensure their DCI infrastructure is ready for a future where AI-driven traffic dominates."
