Edge Computing and Fogging
As cloud computing matures, the technology continues to take on some rather extraordinary capabilities, including blockchain, edge computing, and something called “fog computing,” or simply “fogging.”
WHAT IS EDGE COMPUTING?
Edge and fog computing represent significant architectural shifts in the way the cloud is used. A recent prediction in Forbes promises, “Edge computing is all set to become the most preferred architecture for running data-driven, intelligent applications. Though edge computing is ideal for IoT (Internet of Things) solutions, it offers tremendous value for the departmental and traditional line of business applications.”
Reduced to the simplest terms, both are ways to move the computing layer away from the center (the cloud) and out closer to the data sources (the edge). It might sound counterintuitive, but before we get to the reasons for doing this, let’s begin with a few definitions.
Microsoft defines cloud computing this way: “Simply put, cloud computing is the delivery of computing services—servers, storage, databases, networking, software, analytics, and more—over the Internet (the cloud).” According to Webopedia, “Fog computing is a term created by Cisco that refers to extending cloud computing out to the edge of an enterprise’s network. Also known as edge computing, or fogging, fog computing facilitates the operation of compute, storage, and networking services between end devices and cloud computing data centers.”
Notice the metaphors used. Clouds are remote, drifting in areas we think of as vast. Fog is localized, on the ground, right around us. Edge computing creates data processing power at the edge of the network, but it isn’t intended to replace cloud computing.
Although often used interchangeably, the terms “edge” and “fog” define networks with similar goals but with a difference. Fog networks connect edge devices that speak to each other (e.g., IoT gateways), while edge computing focuses on the devices and technologies that are attached to things such as industrial machines. In our limited space here, we’ll restrict our focus to edge computing.
WHY USE THE EDGES?
So, why would anyone want to add another computing layer, especially out on the edges of their network? Latency, that’s why. The volume of data generated today at the edges by machines, smartphones, IoT devices, and ATMs is becoming overwhelming.
Mary Shacklett of TechRepublic writes that by 2020, the number of smart sensors and other IoT devices feeding networks will reach 5,635 million (more than 5.6 billion). These sources will generate more than 507.5 zettabytes of data to be sent out to the cloud, processed, and then returned in ever-slowing cycles (1 zettabyte = 1 trillion gigabytes). Latency.
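Taking those projections at face value, a quick back-of-the-envelope calculation (the numbers below simply restate the figures cited above) shows how much data each device would contribute on average:

```python
# Back-of-the-envelope check on the projections cited above.
devices = 5_635_000_000            # 5,635 million IoT devices and sensors
total_zb = 507.5                   # zettabytes of data generated
total_bytes = total_zb * 10**21    # 1 ZB = 10^21 bytes (1 trillion GB)

per_device = total_bytes / devices
print(f"~{per_device / 10**12:.0f} TB per device")  # ~90 TB per device
```

Roughly 90 terabytes per device, on average, hauled to the cloud and back: a vivid illustration of why shipping every raw byte over the network stops being practical.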
If you want an example of where that will be not only inconvenient, but downright dangerous, consider the autonomous automobiles with their hundreds of on-vehicle sensors. Intel estimates that for every eight hours of driving, each of these cars will generate 40TB of data. The chip-maker says, “It is unsafe, unnecessary, and impractical to send all that data to the cloud.”
What’s needed is real-time, low-latency, local processing to keep the passengers safe.
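Intel’s estimate of 40TB per eight hours of driving works out to a striking sustained data rate (the arithmetic below uses decimal terabytes):

```python
# Data rate implied by Intel's estimate: 40 TB per 8 hours of driving.
total_bytes = 40 * 10**12        # 40 TB, decimal (10^12 bytes each)
seconds = 8 * 3600               # 8 hours of driving
rate = total_bytes / seconds
print(f"{rate / 10**9:.2f} GB/s sustained")  # 1.39 GB/s sustained
```

Streaming nearly 1.4 gigabytes every second from every car on the road makes the case for on-vehicle processing almost by itself.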
Shacklett explains that by 2023, “The global IoT market is expected to top $724.2 billion. That accumulation of IoT data and the need to process it at local collection points is what’s driving edge computing.”
And don’t think this explosive growth has escaped the notice of commercial software vendors. They, too, are constantly finding new ways for their software to exploit these devices and the streams of IoT data they generate.
WHAT ARE THE ADVANTAGES?
The need for a different processing model has created a new edge middleware layer capable of very fast local processing. That includes feedback to the devices, enabling smart follow-up actions that aren’t possible with the inherent latency of the longer round trips into the cloud.
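As a hypothetical sketch of that middleware pattern (the function names, threshold, and readings below are invented for illustration, not drawn from any vendor’s API), an edge node might act on raw sensor readings immediately and forward only a compact summary to the cloud:

```python
from statistics import mean

# Illustrative edge-node pattern: respond to raw readings locally,
# and send only a small aggregate upstream instead of every sample.

ALERT_THRESHOLD = 90.0  # e.g., a temperature limit in Celsius (illustrative)

def act_locally(reading):
    """Immediate, low-latency response at the edge (e.g., shut a valve)."""
    return f"local alert: {reading:.1f} exceeds {ALERT_THRESHOLD}"

def summarize(readings):
    """Compact summary shipped to the cloud in place of the raw stream."""
    return {"count": len(readings),
            "mean": mean(readings),
            "max": max(readings)}

def process_batch(readings):
    """Handle one batch of sensor data entirely at the edge."""
    alerts = [act_locally(r) for r in readings if r > ALERT_THRESHOLD]
    return alerts, summarize(readings)

alerts, summary = process_batch([72.4, 80.1, 95.3, 68.9])
print(len(alerts), summary["count"])  # 1 4
```

The design choice this sketch captures is the one the paragraph describes: the time-critical decision (the alert) never leaves the edge, while the cloud receives four readings’ worth of information in a single small record.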
Benjamin Rousseau of TechGenix media company offers three additional advantages:
- There are greater efficiencies in industrial applications that require decision making based on recently generated data.
- It’s less expensive than 100% cloud storage and computing models.
- And because the edge computing model is stand-alone, it will continue to operate even if one device suffers downtime.
In 2016, Tech Pro Research surveyed companies ranging from large enterprises to very small businesses and found that more than half were planning to implement IoT within the next 12 months. The researchers concluded that many will use edge computing in their IoT strategies.