Edge infrastructure models – solving the architectural challenges of 5G and Edge Computing

It’s no secret that edge computing and 5G are intrinsically linked. 5G networks can be up to 500% faster than 4G and support a 100x increase in traffic capacity, but edge computing is central to realising this promise, providing the compute and storage power that eliminates the backhaul latency inherent in relying on a central data centre.

By Jon Abbot, EMEA Telecom Strategic Clients Director for Vertiv

However, despite this clear need, there are significant challenges to deploying these sites quickly and economically. Operators need hundreds – in some cases thousands – of new edge sites to fully realise the potential of their 5G networks, but differing geography, climate, compute loads, power demands and a myriad of local and regional regulations and guidelines present distinct architectural and engineering challenges at each site.

A new set of infrastructure models

To help solve these pressing issues, Vertiv has introduced a new set of edge infrastructure models designed to streamline and standardise the design and deployment of various edge sites, including those supporting 5G networks. Drawing on in-depth insight from practitioners across industry sectors, including telecom operators, Vertiv has defined four distinct models:

1. Device Edge: In this model, compute sits at the end device. It is either built into the device itself (for example, a smart video camera with artificial intelligence capabilities) or is an “add-on edge”: a stand-alone form factor that attaches directly to the device. When the compute is built in, the IT hardware is fully enclosed within the device and does not need to be designed to endure harsh environments; when it is attached externally, for example to the outside of a camera, it must be ruggedised.

2. Micro Edge: A small, standalone solution that ranges in size from one or two servers up to four racks. It is often deployed at an enterprise’s own site but can also be situated at a telco site if required. The Micro Edge can be deployed in both conditioned and unconditioned environments. In conditioned environments, such as an IT closet, it does not need advanced cooling and filtration because external factors like temperature and air quality are stable. In unconditioned environments, such as a factory shop floor, it requires specialised cooling and filtration to cope with harsher conditions.

3. Distributed Edge Data Centre: Perhaps the most commonly referenced approach to edge infrastructure, this model refers to a small, sub-20-rack data centre situated at an enterprise’s site, telco network facilities or a regional site – for example in modern factories or large commercial properties.

4. Regional Edge Data Centre: A data centre facility located outside of core data centre hubs. As it is typically purpose-built to host compute infrastructure, it shares many features with hyperscale data centres: a conditioned and controlled environment, high security and high reliability.
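For readers who find a structured view helpful, the four models can be summarised as a simple data structure. The Python sketch below is purely illustrative: the attribute names and values are paraphrased from the descriptions above and are assumptions made for illustration, not a formal Vertiv specification.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass(frozen=True)
class EdgeModel:
    name: str                 # model name as defined above
    max_racks: Optional[int]  # approximate upper rack count, None where the article gives none
    typical_location: str     # placement, paraphrased from the descriptions above
    environment: str          # conditioning requirements, paraphrased

# Illustrative values only, paraphrased from the article text.
EDGE_MODELS = [
    EdgeModel("Device Edge", None,
              "built into or attached directly to the end device",
              "enclosed in the device, or ruggedised if attached externally"),
    EdgeModel("Micro Edge", 4,
              "enterprise site, or a telco site if required",
              "conditioned or unconditioned"),
    EdgeModel("Distributed Edge Data Centre", 20,
              "enterprise site, telco network facility or regional site",
              "conditioned"),
    EdgeModel("Regional Edge Data Centre", None,
              "purpose-built facility outside core data centre hubs",
              "conditioned and controlled"),
]

if __name__ == "__main__":
    for model in EDGE_MODELS:
        size = f"up to ~{model.max_racks} racks" if model.max_racks else "size varies"
        print(f"{model.name}: {size}; {model.typical_location}; {model.environment}")
```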

Putting the models into practice

So, there is plenty to consider. But when it comes to identifying the appropriate edge infrastructure model, any decision will ultimately depend on the use case being deployed.

For example, the lower the required latency, the closer the edge infrastructure must be to the end device. It’s for this reason that what we call “Life Critical” use cases (those that directly impact human health and safety and which demand speed and reliability) often need to be hosted at the Device Edge.

Data Intensive use cases, such as high-definition content delivery, require the edge to be close to the source of data to avoid high bandwidth costs, so on-premises deployments are desirable. In this case, a Micro Edge provides a good balance between short data transmission distances (which limit bandwidth costs) and greater compute capability than a Device Edge.

The latency requirements of Machine-to-Machine Latency Sensitive applications, which include smart grid technologies, are met by the Device Edge. However, there will be a move to the Micro Edge as enterprise edge adoption becomes more widespread, particularly for machine-to-machine devices that are too small or too low cost to justify a Device Edge.
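This mapping from use case to model can also be expressed as a short decision sketch. The category names and the widespread_enterprise_edge flag below are assumptions introduced for illustration rather than terms defined by Vertiv; the logic simply restates the guidance above.

```python
# A minimal sketch of the use-case-to-model mapping described above.
# Category names and the widespread_enterprise_edge flag are illustrative
# assumptions, not terminology from the article.

def suggest_edge_model(use_case: str, widespread_enterprise_edge: bool = False) -> str:
    """Suggest an edge infrastructure model for an example use-case category."""
    if use_case == "life_critical":
        # Lowest latency and highest reliability: host at the Device Edge,
        # as close to the end device as possible.
        return "Device Edge"
    if use_case == "data_intensive":
        # Keep data transmission distances short to limit bandwidth costs,
        # while offering more compute than a Device Edge.
        return "Micro Edge"
    if use_case == "m2m_latency_sensitive":
        # Served by the Device Edge today; expected to shift to the Micro Edge
        # as enterprise edge adoption becomes more widespread.
        return "Micro Edge" if widespread_enterprise_edge else "Device Edge"
    raise ValueError(f"Unknown use case category: {use_case}")

print(suggest_edge_model("data_intensive"))         # Micro Edge
print(suggest_edge_model("m2m_latency_sensitive"))  # Device Edge
```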

Making the right choices

Physical infrastructure is key in any edge computing strategy. The power, cooling and enclosure equipment, as well as the compute it supports, provides the foundation on which applications can run and enables countless edge use cases.

Making the right physical infrastructure choice is even more important at the edge given that many deployments are in locations where additional support and protection is required.

Navigating edge infrastructure is also made more challenging by the broad and varied definitions of edge. Fortunately, there is an ecosystem of suppliers, system integrators and other channel partners with experience and expertise in edge deployments to provide support, so the future for 5G and edge infrastructure looks bright.
