Edge computing is in many ways a natural extension of other distributed architectures, especially hybrid cloud environments.
The potential benefits are immense – which explains rising interest and adoption – and they can be further enhanced when cloud and edge are paired together.
“Cloud compute services will be enhanced and complemented with the use of edge compute, edge security, and the right network,” says Ron Howell, managing enterprise network architect, Capgemini Americas.
That upside doesn’t come free: The very nature of distributed IT environments creates inherent complexity. That’s not a reason to eschew the trend; it’s just motivation to plan and then plan some more.
Let’s look at four key challenges that edge computing poses – and insights on how to tackle them.
Challenge #1: Managing highly distributed environments
Managing any large-scale edge architecture is kind of like managing hundreds or even thousands of smaller IT environments. And you can’t very well send out a help desk pro every time something requires attention.
“You’re probably going to have a lot of devices out on the edge and there probably isn’t much in the way of local IT staff there,” Gordon Haff, technology evangelist, Red Hat, told us recently.
If you try to manually handle things like configuration management, OS patches and other software updates, and a host of other common operational requirements, you’re likely in for some pain.
The same applies to standardization – or lack thereof – in terms of OS and other software, tooling, and so forth.
“Deploying and operating large-scale distributed infrastructures is challenging enough without throwing randomness and silos into the mix,” Haff says.
Teams managing hybrid cloud or multi-cloud systems – especially with containerized applications in the mix – probably already “get it,” but it’s always a good reminder: You need a practical plan for Day 2 operations and beyond.
[ Also read Edge computing: 4 considerations for success ]
Beating this challenge: Automation and consistency are pillars of edge operations:
“Automation and management are essential for tasks like mass configuration, taking actions in response to events, and centralized application updates,” Haff says. “A corollary to this requirement is maximizing consistency from the datacenter out to the edge. Standardized operating system configurations and cluster orchestration are your friends.”
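What that consistency-plus-automation approach can look like in miniature: the hedged sketch below compares each edge node’s reported configuration against a standardized baseline and flags drift for automated remediation. The node names, baseline fields, and report format are illustrative placeholders, not any specific product’s API.

```python
# Illustrative only: compare each edge node's reported configuration against a
# standardized baseline and flag drift for automated remediation. Hostnames,
# fields, and the report format are hypothetical placeholders.

BASELINE = {
    "os_version": "9.4",
    "agent_version": "2.7.1",
    "selinux": "enforcing",
}

# In practice these reports would come from your management/automation tooling;
# here they are hard-coded stand-ins.
node_reports = {
    "edge-store-0142": {"os_version": "9.4", "agent_version": "2.7.1", "selinux": "enforcing"},
    "edge-store-0307": {"os_version": "9.2", "agent_version": "2.6.0", "selinux": "permissive"},
}

def find_drift(report: dict) -> dict:
    """Return the keys where a node's config differs from the baseline."""
    return {k: (report.get(k), v) for k, v in BASELINE.items() if report.get(k) != v}

for node, report in node_reports.items():
    drift = find_drift(report)
    if drift:
        # A real pipeline would queue a remediation job (patch, re-image, etc.)
        # rather than just printing.
        print(f"{node}: drift detected -> {drift}")
    else:
        print(f"{node}: matches baseline")
```

The point is less the code than the shape of it: one baseline, applied everywhere, with deviations surfaced and corrected centrally rather than node by node.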
Challenge #2: Finding the right problem-solution fits
Realizing the benefits of edge computing depends on a thoughtful strategy and careful evaluation of your use cases, in part to ensure that the upside will dwarf the natural complexity of edge environments. (See Challenge #1.)
“CIOs shouldn’t adopt or force edge computing just because it’s the trendy thing – there are real problems that it’s intended to solve, and not all scenarios have those problems,” says Jeremy Linden, senior director of product management at Asimily.
Part of the intrinsic challenge here is that one of edge computing’s biggest problem-solution fits – latency – has sweeping appeal. Not many IT leaders are pining for slower applications. But that doesn’t mean it’s a good idea (or even feasible) to move everything out of your datacenter or cloud to the edge.
“So for example, an autonomous car may have some of the workload in the cloud, but it inherently needs to react to events very quickly (to avoid danger) and do so in situations where internet connectivity may not be available,” Linden says. “This is a scenario where edge computing makes sense.”
In Linden’s own work – Asimily does IoT security for healthcare and medical devices – optimizing the cost-benefit evaluation requires a granular look at workloads.
“It would require unrealistically high bandwidth – and create data security concerns – to send all the network traffic we gather on IoT devices up to the cloud, so our edge devices process it on location and send extracted insights back to the cloud,” Linden says. “But much of our application, such as calculating risk scores for devices, can be easily done in a normal cloud model.”
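The pattern Linden describes – reduce raw data at the edge, ship only the extracted insights upstream – can be sketched roughly as follows. This is an illustrative outline, not Asimily’s actual pipeline; the summary fields and the upstream endpoint are assumptions.

```python
# Illustrative sketch of the "process locally, send extracted insights upstream"
# pattern -- not Asimily's actual pipeline. The summary fields and the upstream
# endpoint are hypothetical.
import json
from collections import Counter
from urllib import request

def summarize_traffic(records: list[dict]) -> dict:
    """Reduce raw per-packet records to a compact summary on the edge device."""
    return {
        "total_packets": len(records),
        "bytes": sum(r.get("bytes", 0) for r in records),
        "top_destinations": Counter(r["dst"] for r in records).most_common(3),
    }

def send_to_cloud(summary: dict, url: str = "https://example.invalid/api/insights"):
    """Ship only the compact summary instead of the raw capture."""
    body = json.dumps(summary).encode()
    req = request.Request(url, data=body, headers={"Content-Type": "application/json"})
    return request.urlopen(req, timeout=5)  # would need a reachable endpoint

raw = [
    {"dst": "10.0.0.12", "bytes": 1400},
    {"dst": "10.0.0.12", "bytes": 900},
    {"dst": "10.0.0.40", "bytes": 300},
]
print(summarize_traffic(raw))
```

The raw capture stays on site; only a few hundred bytes of derived insight cross the WAN – which is what makes the bandwidth and data-security math work.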
[ Related read: Edge computing: 4 pillars for CIOs and IT leaders. ]
Beating this challenge: Conduct an honest internal cost-benefit assessment of potential edge use cases – and then focus on those where performance/latency (or other goals) are really critical. Remember: As with cloud, you don’t have to take an all-or-nothing approach.
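A rough, hedged way to start that assessment is simple arithmetic: how long would it take to backhaul a site’s raw data over its uplink, and can a cloud round trip fit inside the application’s response budget? Every number below is an illustrative placeholder to be replaced with your own measurements.

```python
# Back-of-envelope helper for the kind of cost-benefit check described above.
# Every number here is an illustrative placeholder -- substitute your own
# measurements for traffic volume, link capacity, and round-trip latency.

def backhaul_hours_per_day(gb_per_day: float, uplink_mbps: float) -> float:
    """Hours of uplink time needed to ship a day's raw data to the cloud."""
    seconds = (gb_per_day * 8_000) / uplink_mbps  # GB -> megabits
    return seconds / 3600

def meets_latency_budget(cloud_rtt_ms: float, processing_ms: float, budget_ms: float) -> bool:
    """Can a cloud round trip plus processing fit inside the response budget?"""
    return cloud_rtt_ms + processing_ms <= budget_ms

# Hypothetical site: 200 GB/day of sensor data over a 50 Mbps uplink,
# and a control loop that must respond within 20 ms.
print(f"Backhaul time: {backhaul_hours_per_day(200, 50):.1f} h/day")
print("Cloud fits latency budget:", meets_latency_budget(cloud_rtt_ms=60, processing_ms=5, budget_ms=20))
```

If the numbers clear comfortably, the workload may belong in the cloud; if they don’t, you’ve found an edge candidate worth the added operational complexity.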
Challenge #3: Building in a still-maturing ecosystem
From a software perspective, the same or similar set of fundamental challenges applies everywhere – whether at an edge node, in a cloud, or on traditional endpoints, according to François Baldassari, CEO of Memfault:
- How do we deploy it and track its versions?
- How do we monitor performance, and what is our observability strategy?
- How do we track defects?
- How do we detect and mitigate security risks?
For cloud and cloud-native applications, there’s already a robust set of answers to those kinds of questions. Baldassari points to the growth of Site Reliability Engineering as a field that essentially exists to tackle those and other challenges in modern software and infrastructure. DevOps, DevSecOps, GitOps, and other disciplines likewise offer overlapping approaches to these types of challenges.
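At the edge, one minimal answer to the “track versions and monitor performance” questions above is a heartbeat agent: each node periodically reports its software version and basic health to a central collector. The sketch below is illustrative only; the collector URL, payload fields, and version string are assumptions.

```python
# Minimal sketch of one answer to the "track versions and monitor performance"
# questions above: each edge node periodically emits a heartbeat carrying its
# software version and basic health data to a central collector. The endpoint,
# payload fields, and interval are assumptions for illustration.
import json
import platform
import time
from urllib import request

AGENT_VERSION = "1.4.2"                          # hypothetical deployed version
COLLECTOR = "https://example.invalid/heartbeat"  # placeholder collector URL

def heartbeat() -> dict:
    return {
        "node": platform.node(),
        "agent_version": AGENT_VERSION,
        "timestamp": time.time(),
        # A real agent would add CPU, disk, queue depth, last-error info, etc.
    }

def report_once():
    body = json.dumps(heartbeat()).encode()
    req = request.Request(COLLECTOR, data=body, headers={"Content-Type": "application/json"})
    try:
        request.urlopen(req, timeout=5)
    except OSError as exc:
        # Edge links drop; buffer locally and retry rather than crashing.
        print(f"heartbeat deferred: {exc}")

if __name__ == "__main__":
    report_once()
```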
While those practices can extend to edge deployments, the edge ecosystem itself is still in its relative infancy compared with the cloud.
“The ecosystem is still immature, and there’s more or less a patchwork of incomplete solutions currently in the market,” Baldassari says.
Just like cloud computing before it, the edge computing ecosystem will mature over time. In the meantime, it behooves IT leaders to do some extra due diligence when it comes to choosing platforms, tooling, and other requirements.
Beating this challenge: “CIOs and other IT leaders must work with their suppliers to understand what their reliability strategy is, what SLAs they offer, and what tools they make available to IT teams to monitor edge devices on their corporate network,” Baldassari says. “If all else fails, appropriate network policies must be used to tightly control and restrict what edge devices can do on the network.”
Challenge #4: Approaching security in edge infrastructure and applications
Just as security became a major subdomain of cloud computing, so will it be a major priority in edge computing. By definition, edge expands your risk surface.
“Edge computing brings fantastic benefits to the user experience but comes at the cost of introducing fundamental security concerns,” says Christopher Sestito, CEO of HiddenLayer.
Also like cloud computing: edge isn’t “less secure” than a centralized environment, per se. It just requires executive-backed priority – and some updated strategies – to build a strong posture.
Sestito and other security pros say a granular approach – at the user or node level – is critical in edge environments, for example. “Access management is even more critical and multi-factor authentication should be employed whenever possible,” Sestito says.
Security automation – Sestito points to XDR and EDR tools as examples – is big here, as is careful consideration of how data is exposed or moved across network devices.
If you’re not already using a Zero Trust model, an edge initiative could be the reason to start.
“The use of Zero Trust Security design principles is fast becoming the trusted standard of choice for well-segmented and well-secured company resources,” Howell told us recently.
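As a toy illustration of the Zero Trust mindset – never trust a request just because it originates “inside” the network – the sketch below verifies a short-lived, signed token on every call. A production deployment would rely on an identity provider, mTLS, and standard token formats rather than this hand-rolled HMAC; the secret, device ID, and TTL here are placeholders.

```python
# A toy illustration of one Zero Trust principle: never trust a request just
# because it arrives from "inside" the network -- verify an expiring, signed
# token on every call. Real deployments would use an identity provider, mTLS,
# and standard token formats (e.g., OIDC/JWT) rather than this hand-rolled HMAC.
import hashlib
import hmac
import time

SECRET = b"demo-only-secret"      # placeholder; never hard-code real secrets
TOKEN_TTL_SECONDS = 300           # short-lived credentials limit replay risk

def issue_token(device_id: str, now: float | None = None) -> str:
    ts = str(int(now if now is not None else time.time()))
    sig = hmac.new(SECRET, f"{device_id}|{ts}".encode(), hashlib.sha256).hexdigest()
    return f"{device_id}|{ts}|{sig}"

def verify_token(token: str, now: float | None = None) -> bool:
    try:
        device_id, ts, sig = token.split("|")
    except ValueError:
        return False
    expected = hmac.new(SECRET, f"{device_id}|{ts}".encode(), hashlib.sha256).hexdigest()
    fresh = (now if now is not None else time.time()) - int(ts) <= TOKEN_TTL_SECONDS
    return hmac.compare_digest(sig, expected) and fresh

token = issue_token("edge-cam-17")
print(verify_token(token))                         # True: valid and fresh
print(verify_token(token, now=time.time() + 600))  # False: expired
```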
Beating this challenge: Properly prioritized, edge security can actually bolster an organization’s overall posture in our more distributed IT reality. Leaders and teams that are already taking a layered approach to cloud and container security should feel right at home.
“Industry leaders are coming to know that not all IT and business network requirements can be solved using only a cloud-centered enterprise architecture,” Howell says. “Cloud compute services will be enhanced and complemented with the use of edge compute, edge security, and the right network.”
For more insights, check out our related article: Edge computing: 4 key security issues for CIOs to prioritize.