
Edge Computing in Analytics

Mar 27, 2019 9:05:00 AM

We’ve been hearing for years now that the future of computing is all about the cloud. Cloud computing, we’re told, will have numerous advantages for enterprises large and small:

  • Eliminate the need for local data centers
  • Provide more secure, available, and reliable server resources
  • Reduce the need for “big iron” user workstations
  • Manage costs more effectively by paying only for resources that are actually used, when they are used
  • Increase IT staff productivity (or decrease costs) by eliminating the need for resources to install, configure, manage, and troubleshoot physical computing infrastructure

…and more!

All of these advantages are undoubtedly true, and many businesses, from mom-and-pop shops that never had more than one computer, to multi-billion-dollar international corporations, are migrating to the cloud in one way or another.

However, in many cases, this silver cloud has a dark lining.

The Cloud’s Dark Side

It turns out that the cloud computing model greatly depends on having reliable, high-bandwidth connections to the internet.

Seems obvious, right? And in many cases, such as in office buildings in large cities, that requirement is easily met. We take it for granted, like electricity, oxygen, or coffee in the break room.

However, there are many settings in which you can’t count on always-on, fast network connections to all devices at all times. Consider:

  • Outdoor settings, such as farms, forests, quarries, and construction sites where wired connectivity may not be practical and wireless may be spotty, slow, or unreliable
  • Factories, warehouses, or other industrial settings where wires are easily damaged and wireless connectivity can be degraded by metal structures and other interference
  • Offshore oil rigs, wind farms, and ships
  • Rural locations that have lower-bandwidth internet connections

In these cases, complete reliance on cloud computing ranges from annoying and frustrating to impractical to downright impossible.

To solve this problem, businesses are turning (or returning, depending on your perspective) to what observers are calling edge computing.

Computing At the Edge

The idea of edge computing is that, where warranted, computing and storage resources should be close to the devices that are generating data—that is, at the edge of the network, and not in some central data center or in the cloud.

That sounds like what we started out with, before all this buzz about moving to the cloud, right? It is, sort of. But unlike the conventional approach, in which server hardware collects, analyzes, and reports the data all on the same premises as the data sources, edge computing is intended to complement cloud computing and make it more efficient.

How does this work?

Consider an industrial internet of things (IIoT) setting, where there may be hundreds or thousands of network-connected sensors, all chatting at more or less the same time on the local network. Some report temperatures; others report pressure, flow rate, tank levels, or other metrics. Some might be scanning barcodes to track materials as they make their way through the process. Other devices, such as valves, dampers, and motor controllers, may also be connected to the network, actuating on command from whatever system is watching the incoming data.
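To make that concrete, here is a minimal sketch of what a single sensor message might look like on such a network. The schema, field names, and sensor ID are illustrative assumptions, not any particular IIoT standard:

```python
from dataclasses import dataclass, asdict
import json
import time

@dataclass
class SensorReading:
    """One reading from an IIoT sensor (illustrative schema, not a standard)."""
    sensor_id: str    # hypothetical ID, e.g., "boiler-3/temp-A"
    metric: str       # "temperature", "pressure", "flow_rate", ...
    value: float
    unit: str
    timestamp: float  # seconds since the epoch

# A hypothetical temperature reading, serialized the way a sensor
# might publish it on the local message bus.
reading = SensorReading("boiler-3/temp-A", "temperature", 212.4, "F", time.time())
print(json.dumps(asdict(reading)))
```

Multiply that little payload by thousands of sensors, several times per second, and the volume of traffic becomes clear.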

Even if all those devices had ultra-reliable network connections, if all that data had to traverse the wide-area network to the cloud, the network would soon become highly congested. Messages to actuators might come late or not at all, resulting in undesirable or even dangerous conditions in the plant.

With edge computing, the responsibility for analyzing and acting on the incoming data streams is closer to the sources of those streams. To the extent that a cloud connection is needed, the data can be sent in summarized form, rather than all the raw data streams at once, or cloud-bound messages can be limited to fault conditions or alerts that need to be brought to the attention of local or remote human operators.
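Here is a minimal sketch of that summarize-at-the-edge pattern. The window contents, the fault threshold, and the send_to_cloud stub are all assumptions for illustration; a real system would publish to an actual cloud ingestion endpoint:

```python
import statistics
from collections import defaultdict

ALERT_THRESHOLD_F = 250.0  # hypothetical fault threshold

def summarize_window(readings):
    """Collapse a window of raw (sensor_id, value) readings, collected
    locally over (say) one minute, into one summary per sensor."""
    by_sensor = defaultdict(list)
    for sensor_id, value in readings:
        by_sensor[sensor_id].append(value)

    summaries, alerts = [], []
    for sensor_id, values in by_sensor.items():
        summary = {
            "sensor_id": sensor_id,
            "count": len(values),
            "min": min(values),
            "max": max(values),
            "mean": statistics.mean(values),
        }
        summaries.append(summary)
        if summary["max"] > ALERT_THRESHOLD_F:
            alerts.append({"sensor_id": sensor_id, "max": summary["max"]})
    return summaries, alerts

def send_to_cloud(summaries, alerts):
    """Stub: a real edge node would POST these to a cloud endpoint."""
    print(f"uploading {len(summaries)} summaries, {len(alerts)} alerts")

# A minute of raw readings stays on the edge; only the summaries
# and any alerts cross the wide-area network.
window = [("temp-A", 212.4), ("temp-A", 213.1), ("temp-B", 251.7)]
send_to_cloud(*summarize_window(window))
```

The raw streams never leave the local network; the WAN carries a few summary records per minute instead of a firehose of readings.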

Advantages of Edge Computing

This architecture offers a couple of important advantages:

  • Reduced network latency, which is vitally important for mission-critical systems
  • Reduced need for high-bandwidth wide-area network connections, which could be impractical or prohibitively expensive

Further, thanks to advances in computing power, much edge computing can happen on machines such as purpose-made IIoT appliances, laptops, and workstations, rather than servers in a local data center. Thus, one can have the best of both worlds: the reduced data center footprint offered by cloud computing, plus the reliability and low latency of locally connected computing resources.

Security At the Edge

One area of concern with edge computing, as it is for any other kind of computing these days, is security. It’s easy to think that edge computing is inherently more secure than cloud computing, because all of a company’s data resides within its own local network and doesn’t leave the “four walls.”

However, this kind of thinking can lull operators into a false sense of security, leading to lax cybersecurity measures. As many companies have learned the hard way, hackers are clever, adept, and persistent, and will find a way in if they think you have something worth stealing. Edge computing resources need to be hardened against cyberattacks just as much as cloud servers do, and the data—both “at rest” on disk drives and in transit across the local network—needs to be encrypted and protected.
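For illustration, here is one minimal way to encrypt edge payloads, using the third-party cryptography package’s Fernet recipe. Key management is deliberately simplified here; a real deployment would source the key from a secrets manager or hardware security module rather than generating it in process:

```python
# pip install cryptography
from cryptography.fernet import Fernet

# Assumption for illustration only: in production the key comes from
# a secrets manager or HSM, not generated inline like this.
key = Fernet.generate_key()
fernet = Fernet(key)

# Encrypt a summary payload before writing it to local disk
# ("at rest") or sending it across the network ("in transit").
plaintext = b'{"sensor_id": "temp-A", "mean": 212.7}'
token = fernet.encrypt(plaintext)

# The receiving side (or the same node, on read-back) decrypts.
assert fernet.decrypt(token) == plaintext
```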

In other words, calling something “edge computing” does not relieve a company of its duty to protect its data assets.

Edge Computing and Analytics

Edge computing holds special promise in the field of analytics.

Analytics, of course, is the process of mining mountains of data for actionable information through the calculation and presentation of key performance indicators (KPIs). In an IIoT setting, with data streaming from multiple sensors in real time, it would be difficult even for a high-powered cloud server to process it all in a meaningful way, especially when timing and latency issues come into play. A cloud-based analysis system might misrepresent what’s really going on, or trigger unneeded alerts, if some of the data streams go “silent” even for a few seconds.

With edge computing, much of the analytical processing can happen in close proximity to the data sources, where the chances of having missing data are greatly reduced. The analysis results can then be sent to a cloud-based system, which can compile the summarized results with those from other such edge systems to generate a complete picture. If management needs to drill down from there to see the details, the cloud system can easily query the edge resources as needed.
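A minimal sketch of how an edge node might guard a KPI against those “silent” streams follows; the five-second staleness timeout and the dictionary-based state are assumptions for illustration:

```python
import time

STALE_AFTER_S = 5.0  # hypothetical: a stream is "silent" after 5 s without data

last_seen = {}  # sensor_id -> timestamp of most recent reading
latest = {}     # sensor_id -> most recent value

def ingest(sensor_id, value, now=None):
    """Record a reading as it arrives from the local network."""
    now = now or time.time()
    last_seen[sensor_id] = now
    latest[sensor_id] = value

def kpi_snapshot(now=None):
    """Average only the fresh streams; report stale ones separately
    instead of letting their absence skew the KPI."""
    now = now or time.time()
    fresh = {s: v for s, v in latest.items() if now - last_seen[s] <= STALE_AFTER_S}
    stale = [s for s in latest if s not in fresh]
    mean = sum(fresh.values()) / len(fresh) if fresh else None
    return {"mean": mean, "fresh_count": len(fresh), "stale_sensors": stale}

ingest("temp-A", 212.4)
ingest("temp-B", 215.0, now=time.time() - 60)  # simulate a silent stream
print(kpi_snapshot())
```

Because this bookkeeping happens next to the sensors, a stale stream is flagged honestly rather than quietly distorting the numbers the cloud system reports.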

This is an advantageous situation for analytics: the cloud system doesn’t have to rely on iffy data streams, yet can still present complete, accurate KPIs to inform management decisions.

Thus, edge computing, particularly in IIoT and other real-world situations, may be the sweet spot between old-school, on-premises computing and a 100% cloud solution. Look for edge computing systems to increase in popularity in the near future.



Written by Chris DeProfio

Chris runs the “engine room” of AndPlus’ world-class engineering team that solves problems using a myriad of technologies. He is responsible for all aspects of product engineering and quality assurance, and often works closely with clients. He also manages the AndPlus employee professional development program, mentoring and guiding employees in their technical, business, and management skills development. Chris received a BA in Computer Science from Clark University, and is a certified Scrum Master.
