Deciphering Edge Computing

Oct 12, 2022

If it seems like everyone is talking about IoT and edge computing, it’s because they are. More and more companies, particularly those with large, dispersed operations that generate massive amounts of data, are moving to the edge because it can be faster, cheaper, and more efficient than cloud-based data processing. This post explores different types of edge devices/hardware and the scenarios in which they are most effective.

First, what’s the difference between cloud-based computing and edge-based computing? It’s actually pretty simple. Cloud-based computing relies on services like Amazon Web Services (AWS) or Azure: data is collected from various points in your IT ecosystem and then sent to the cloud for processing. This certainly has many advantages over the server farm model of yore, but it is not without challenges, specifically downtime, latency, and connectivity in remote environments.

Edge-based computing collects and processes data at the edge. What’s the edge? It could be different factories, oilfields, substations, water containment facilities, or any combination of remote sites. In a simplified definition, edge-based computing collects data from various systems and/or devices. The key benefits of edge computing include decreased latency, faster data transmission rates, and higher reliability. This is because data is collected closer to where the instrumentation/sensors are; that data is processed on premises and sent out over a lightweight protocol, which reduces stress on the communications infrastructure.
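To make that pattern concrete, here is a minimal sketch in Python using the open-source paho-mqtt client: an edge gateway samples a local sensor and publishes the reading over MQTT. The broker address, topic, and read_pressure() function are illustrative assumptions, not a reference implementation.

```python
# Minimal edge-gateway sketch: sample a local sensor and publish the
# reading over MQTT. Requires the paho-mqtt package (pip install paho-mqtt).
# The broker address, topic, and read_pressure() are hypothetical.
import random
import time

import paho.mqtt.client as mqtt

BROKER = "broker.local"             # hypothetical on-site MQTT broker
TOPIC = "site7/pump3/pressure_psi"  # hypothetical topic naming scheme

def read_pressure() -> float:
    """Stand-in for a real sensor read (e.g., over Modbus or GPIO)."""
    return 100.0 + random.uniform(-1.0, 1.0)

client = mqtt.Client()
client.connect(BROKER, 1883)
client.loop_start()                 # handle network I/O in the background

while True:
    client.publish(TOPIC, f"{read_pressure():.2f}", qos=1)
    time.sleep(5)                   # sample every 5 seconds
```

Because the payload is a few bytes rather than a raw data stream, even a low-bandwidth or intermittent link back to the office can carry it.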

Depending on where your remote sites are, what kind of data they collect, and how much data they collect, you’ll need to select hardware that not only meets your needs but can withstand the environment. There are three basic categories of edge-compute devices.

  1. Raspberry Pi (99% Reliability) is a series of small single-board computers (SBCs). It was originally created as a small, inexpensive option for teaching computer science in schools and developing countries. A Raspberry Pi can run small, lightweight apps (typically Linux-based, though Windows IoT is also an option). The devices can be easily deployed, board only, and run simple apps like a historian and some basic automation. Examples include control and data gathering, such as pressure, flow, and discrete values (e.g., is a valve open or closed?), and initiating simple tasks like turning on a light, sending an alert, or switching something on/off. And though not nearly as robust as other edge devices, Raspberry Pi machines can be run in some harsh environments.
  2. Industrial PCs (IPCs) (99.5% Reliability) are a very popular option now. Depending on configuration, they run from $1,000 to $2,500. They have a more robust processor, more RAM, and a bigger hard drive (usually in the 512 GB to 1 TB range). IPCs run Linux or Windows operating systems and can run different containerized applications on a single device, such as historian, video, and SCADA apps. However, IPCs are not fault-tolerant or highly available. Onsite, they are typically DIN-rail mounted inside a panel. They do have higher temperature specifications, typically -10 to 60 degrees Celsius. IPCs are more functional than Raspberry Pi devices: they can run complex data sets, act as a SCADA node, collect data at the edge, analyze that data, and send it back via a lightweight protocol such as MQTT, serial communications, or Ethernet.

     Additionally, IPCs let you hook up a monitor and keyboard and use the machine as a remote workstation. These peripherals are built to be hardier and more robust for industrial environments, and the machine itself can run multiple apps simultaneously. Thanks to this functionality and flexibility, IPCs are becoming extremely popular in oil and gas, manufacturing, and other industrial environments.
  3. Mini Data Center (99%-99.9999% reliability) is appropriate for scenarios where large data sets are being collected and/or processed from different data points/systems, and in areas where a robust communications infrastructure might not exist. Due to its infrastructure requirements (power, internet/networking capability, the ability to run many sophisticated applications, etc.), this is the most complex yet most reliable option.

     In the “olden days,” this complex communication and heavy processing load would have required constructing cell towers to talk to remote sites, a costly endeavor. Data transmission would also rely on a sometimes-unreliable cellular network. Further, data processing was done at the central location, which could and often did result in lags between when data was generated and when it was processed. In situations like processing transactions or responding to an outage, these lags proved costly.

     With a mini data center, data is collected and processed onsite, and smaller data packets are sent back via MQTT (Message Queuing Telemetry Transport): while the onsite systems are always collecting and processing data, they only communicate back when a value has changed. No packet is sent until it’s needed, and the system is not constantly polling for data; when something changes, it sends that information. This is much more efficient, reliable, and cost-effective, as sketched below.
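That “only send when a value changes” behavior is often called report by exception. Here is a minimal sketch of the idea, again in Python with paho-mqtt; the deadband threshold, broker, topic, and read_level() function are assumptions for illustration only.

```python
# Report-by-exception sketch: publish only when a reading moves more than
# a deadband away from the last value published. The broker, topic,
# deadband, and read_level() are hypothetical.
import random
import time

import paho.mqtt.client as mqtt

DEADBAND = 0.5                  # ignore changes smaller than this (illustrative)
TOPIC = "site7/tank1/level_ft"  # hypothetical topic

def read_level() -> float:
    """Stand-in for a real tank-level sensor read."""
    return 12.0 + random.uniform(-1.0, 1.0)

client = mqtt.Client()
client.connect("broker.local", 1883)  # hypothetical on-site broker
client.loop_start()

last_published = None
while True:
    value = read_level()
    if last_published is None or abs(value - last_published) > DEADBAND:
        # retain=True lets late subscribers see the most recent value
        client.publish(TOPIC, f"{value:.2f}", qos=1, retain=True)
        last_published = value
    time.sleep(1)               # sample every second; publish only on change
```

The sensor is still sampled continuously onsite; the communications link only carries the exceptions, which is what keeps the traffic light even when thousands of points are being monitored.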

Edge computing is literally and figuratively changing the landscape of remote operations by giving operators/companies more timely, accurate, reliable insights into remote sites – virtual ‘eyes and ears on the ground.’

Twin Eagle Solutions is a leader in edge communications, compute, and implementation. Learn how we can help you operate more efficiently from the office to the edge.