Business Reporter

Computing life on the Edge: change is coming


Bruce Kornfeld at StorMagic explains why organisations with edge sites should embrace hyperconverged infrastructure


The prevailing cloud services landscape is being forced to change as businesses require more flexibility to accommodate edge computing and distributed organisations. Some would argue it’s high time: businesses have put up with expensive, under-performing options that continually fall short of expectations.

While cloud is often perceived as a prerequisite for agility and efficiency, this certainly hasn’t been the case for smaller businesses or for large enterprises with remote locations. Although cloud offers manifold benefits for organisations with predominantly centralised functions, it can pose significant challenges for those with scattered premises and remote outlets.

The cloud dilemma

Despite advances in cloud technology, processing and protecting data at the edge remains complex, whether an organisation is a large corporation with many edge locations or an SMB. Some have opted for a cloud-first strategy with no onsite IT systems at all. As a result, many are saddled with cloud contracts that are far too costly, deliver unreliable service and suffer performance issues for the real-time, mission-critical applications running at these smaller locations.

Customers face a dilemma: whether to take back responsibility for key parts of their infrastructure and manage their servers and software onsite. This could help ensure 100% uptime, but it isn’t necessarily cost-effective or straightforward to administer, especially as space for hardware is constrained by the physical size of the smaller sites typically found at the edge of the enterprise or at small businesses. Realistically, there might not be the room, power or cooling for a robust data centre infrastructure.

Furthermore, the availability of local IT expertise is often limited, and putting trained staff onsite to perform administration and security tasks for these mission-critical systems would push up costs.


Nor should it be forgotten that most edge sites will still need cloud or corporate data centre connectivity to access other important applications and storage. This leaves data managers with yet more headaches as they try to decide which data should remain at the edge, be stored in the cloud, be backed up to a data centre or be deleted completely.

It’s hardly surprising that IT decision-makers can find themselves going round in circles, frustrated that there is no feasible alternative to the status quo. What’s needed are flexible solutions that address the specific constraints of edge data.

The elongating edge

In the past, these issues were mainly confined to geographically dispersed industries, predominantly retail, manufacturing, energy and healthcare. Thriving on time-sensitive data for real-time decision-making, efficiency and supply-chain optimisation, these edge sites have become accustomed to unreliable cloud performance and latency.

However, the combination of digital transformation and the growth of AI analytics is galvanising change on a sweeping scale. IoT devices such as patient health monitors in hospitals, smart shelves and self-checkouts in retail, and digital twins in manufacturing plants are generating huge amounts of data, straining the capabilities of data centres and cloud computing services.

The data generated at these edge sites is becoming so time-sensitive that AI engines need to be deployed locally, so decision-making is instant: there simply isn’t time to send all the data to the cloud for AI processing. Enterprises ultimately require reliable and affordable processing to deliver fast, responsive services to their customers, and as a result the edge is elongating to accommodate this new demand.

Research from IDC suggests that worldwide spending on edge computing is expected to reach $232 billion in 2024, an increase of 15% over 2023, with that figure rising to nearly $350 billion by 2027. This market is here to stay – and expand.


The overprovisioning bugbear

This is where including full-stack hyperconverged infrastructure (HCI) at the edge as part of a cloud strategy can have a substantial positive impact. HCI consolidates computing, networking and storage resources into a single, streamlined data centre architecture.

Where a traditional architecture requires specialist hardware and software for each designated function, the modern approach uses virtualisation to reduce server requirements without diminishing performance, effectively delivering the equivalent of an enterprise-class infrastructure.

It runs applications and stores data securely at remote sites, while still providing connections to the cloud and data centre as needed. All of this is supplied with flexible, modular features, replacing the overengineered offerings that used to be the norm. 


What makes the big difference in performance is that modern HCI solutions are purpose-built, designed and geared specifically for smaller sites. They also make it easy to connect edge deployments back to the cloud or the data centre, as the infrastructures do not need to match.

A lightweight, enterprise-class alternative

These flexible, lightweight infrastructures require only two servers for high availability, instead of three or more. This lowers costs without compromising reliability or uptime: failover often completes in as little as thirty seconds, preserving data integrity and keeping operations running.

Minimising server requirements also reduces the physical footprint needed for high performance, making HCI ideal for space-constrained environments. Further cost savings come from lower power and cooling consumption, fewer spare parts and minimal onsite maintenance.

Another major advantage is simplicity. HCI providers have worked hard to radically reduce complexity by designing solutions for easy remote set-up and management. Installations can be managed by IT generalists instead of requiring dedicated expertise. Systems can usually be deployed in under an hour, avoiding disruption to day-to-day operations, getting new sites or applications up and running without delay.


As edge deployments are likely to grow over time, HCI options accommodate easy scaling, enabling the infrastructure to adapt to changing data demands without complex reconfiguration. Centralised management tools let administrators remotely manage and secure all edge sites from a single console, while the system automatically balances and allocates computing and storage resources in real time, optimising hardware usage for greatest efficiency. This avoids unnecessary and costly overprovisioning, once the bugbear of edge deployments.

Today’s HCI offerings provide a compelling alternative for computing life on the edge, enabling seamless integration with existing cloud and data centre technologies. By optimising resource allocation and enabling rapid deployment in remote locations, they have been engineered to meet the exacting performance and availability requirements of distributed environments – a simple solution for those living life on the edge.


Bruce Kornfeld is chief product officer at StorMagic 


Main image courtesy of iStockPhoto.com and TU IS


23-29 Hendon Lane, London, N3 1RT


020 8349 4363

© 2024, Lyonsdown Limited. Business Reporter® is a registered trademark of Lyonsdown Ltd. VAT registration number: 830519543
