
PanelCast 2019 Predictions: The Rise of Edge Computing

Edge computing appears to be the next big area for growth. How do you think it will play a role in enterprise architecture in 2019?

This blog post was created following ActualTech Media’s inaugural PanelCast event, held in December 2018. The event addressed 2019 enterprise IT predictions in a discussion moderated by Scott D. Lowe and featuring four industry experts: Sirish Raghuram of Platform9, Theresa Miller of Cohesity, Mike Wronski of Nutanix, and Jeff Ready of Scale Computing.

If you’d like to watch our very first PanelCast, please visit https://youtu.be/lbFfTbdztn8


Panelist responses to this audience question:

Loosely speaking, edge computing refers to any non-cloud IT system that exists outside the four walls of the data center. “Ultimately, when you talk about flexibility, it’s all about running the application in the way that the application makes sense and where the application makes sense,” says panelist Jeff Ready (Scale Computing). Putting compute closer to the data opens up a world of new opportunities, but this new model comes with its fair share of challenges as well. New systems demand people to run them, and most organizations aren’t sending staff out to these edge locations.

Mike Wronski (Nutanix) adds that, in addition to making sense, the edge needs to be simple. “When you don’t have skilled IT in the field, and you have thousands of retail locations, you really want that to be as simple as possible for a non-skilled worker to do something as basic as reset, power on, power off.”

Security is a factor, too, and it’s an area where edge computing might offer an advantage. “From a security perspective, there is value in keeping that stuff at the edge and not bringing it back to the data center,” says Theresa Miller (Cohesity).

“It’s about pushing new experiences to customers at new speeds,” says Sirish Raghuram (Platform9), who agrees that creating simplicity around the complexity of the edge is key to its success. “People want a way to give developers the freedom to be able to innovate without getting locked into the specifics of edge locations vs. data center locations vs. public cloud locations.”

Scott's Take

On its own, the “edge” is almost the antithesis of the cloud: a highly distributed physical presence. Like the cloud, though, it has incredible potential. We used to talk about “the cloud” and “on-premises.” On-premises could, technically, include edge devices, but the term is more commonly applied to systems inside the formal data center, while the edge describes systems outside it. Edge services can encompass the various Internet of Things (IoT) devices strewn about the enterprise, but the term more accurately describes some level of formal computing services and data generation.

Edge computing has become a significant force for a number of reasons. First and foremost is latency: organizations continue to generate growing volumes of data in real time, and as data generation grows, it becomes less palatable to introduce latency by shipping that data to a central data center or to the cloud before acting on it.

There are a number of use cases that demonstrate this, but I’ll focus on just two here.

First, consider a retail environment with hundreds or thousands of locations. Each location generates a massive volume of data every day. For a period of time, that data needs to be close to the user, so it makes sense to handle it locally. After a transaction is complete, however, it makes sense to push the data to the central data center for long-term analytics and archival. These edge sites don’t always have stable, redundant network connectivity; they generally lack dedicated IT personnel; and they don’t have hardened data centers. Operating a locally robust edge stack that can serve the site’s needs for a time and then transfer data as needed to a more robust HQ environment or to the cloud makes perfect operational sense.
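This store-and-forward pattern can be sketched in a few lines. The sketch below is illustrative only: `EdgeTransactionBuffer` and its `forward_fn` hook are hypothetical names, and it assumes a simple in-memory queue rather than any specific edge platform’s API.

```python
import time
from collections import deque


class EdgeTransactionBuffer:
    """Minimal store-and-forward buffer (hypothetical): handle transactions
    locally at the edge site, then ship them to a central data center or
    cloud whenever the intermittent network link is up."""

    def __init__(self, forward_fn):
        self._pending = deque()      # transactions awaiting transfer to HQ
        self._forward = forward_fn   # callable that ships a batch centrally

    def record(self, transaction: dict) -> None:
        # Serve the transaction locally first (low latency), then queue it
        # for eventual central archival and long-term analytics.
        transaction["recorded_at"] = time.time()
        self._pending.append(transaction)

    def flush(self, link_up: bool) -> int:
        # Forward everything queued only when the link is available;
        # otherwise the data simply waits at the edge.
        if not link_up or not self._pending:
            return 0
        batch = [self._pending.popleft() for _ in range(len(self._pending))]
        self._forward(batch)
        return len(batch)


# Usage: "forwarding" here just collects batches so we can inspect them.
shipped = []
buf = EdgeTransactionBuffer(forward_fn=shipped.append)
buf.record({"sku": "A-100", "qty": 2})
buf.record({"sku": "B-200", "qty": 1})
buf.flush(link_up=False)        # link down: nothing leaves the site
sent = buf.flush(link_up=True)  # link up: both queued transactions ship
```

The key design point is that local operation never blocks on the wide-area link; transfer to the center is opportunistic and batched.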

The second use case revolves around mobility, but not in the mobile-device sense. Think autonomous vehicles. The computing needs in such vehicles are intense: all kinds of recognition systems help the vehicle differentiate between a tree, a traffic light, and a human. These vehicles generate massive amounts of data that must be processed instantly so that an appropriate decision can be made. Can you imagine what would happen if the vehicle could make a decision only after uploading gigabytes of data to the cloud to be analyzed there? At the same time, though, data centers and cloud environments are critical components of an autonomous vehicle’s network: what each vehicle “sees” and learns is essential information for other autonomous vehicles. As such, the combination of edge- and cloud-centric operating environments is crucial to the success and safety of these fleets.