Rapid advancement in technology, coupled with fierce competition and the need to survive, is pushing more and more companies to adopt the Internet of Things (IoT). As per a survey by Gartner, it is projected that there will be anywhere between 25 and 50 billion “things” connected to the IoT. To put that in proper perspective, that is about 7 connected devices per person in this world. The data generated by all these devices is expected to grow to around 2.3 zettabytes by 2020. Go google how many zeros after ‘1’ that is; we cannot even fathom the amount of such data! Before we tackle the issues related to processing this humongous amount of data, let us first revise the basic concepts of IoT.
What is IoT?
In its simplest definition, the Internet of Things (IoT) refers to the billions of physical devices around the world that are connected together and share data. A more technological definition of IoT would be that it is a network of physical objects (“things”) embedded with sensors, actuators and software that connect and share data with other devices and systems over the internet. Connecting all these different objects and putting sensors in them adds a level of digital intelligence to devices that would otherwise be dumb. With IoT in place, these devices become intelligent, able to communicate real-time data without involving a human being. Have you ever operated, or heard about, a light bulb that can be turned on or off using an app on your mobile device? That is IoT at work for you.
Real World IoT
Of course, real-world IoT is not as simple as turning on a bulb with a cell phone. Plants and machinery are complex structures, and often each depends on the other for seamless functioning. The gadgets and gizmos in a typical plant are embedded with countless sensors and actuators that keep churning out a huge volume of data. To manage IoT, companies invariably turn to cloud hosting, especially if the physical plants are located far away from each other.
A typical IoT architecture consists of the device (“thing”) with its sensors and actuators. These in turn are handled by the controller, a hardware or software component that interacts electrically or logically with the device’s sensors and actuators. The software agent, or gateway, is an embedded program that runs on or near the IoT device. It is the task of this gateway to report the status of a component or the state of the system to the controlling software, something like the PTC ThingWorx platform. The agent acts as a bridge between the controller and the cloud: it determines which data to send to the cloud and which data to discard. The process is two-way; the agent also processes and responds to cloud-based commands and updates provided by the central IoT software.
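To make the agent’s filtering role concrete, here is a minimal sketch in Python. The `EdgeAgent` class, the reporting threshold and the message format are hypothetical illustrations, not part of any specific platform’s API:

```python
import json
import time

# Hypothetical threshold: only forward readings that differ from the last
# reported value by more than this amount (illustrative, not a real setpoint).
REPORT_DELTA = 0.5

class EdgeAgent:
    """Toy agent that decides which sensor readings to forward to the cloud."""

    def __init__(self):
        self.last_reported = None

    def filter_reading(self, value):
        # Discard readings that barely differ from the last reported one;
        # this is the "which data to send, which to discard" decision.
        if self.last_reported is not None and abs(value - self.last_reported) < REPORT_DELTA:
            return None
        self.last_reported = value
        # Package the reading as a message for the cloud platform.
        return json.dumps({"sensor": "pressure", "value": value, "ts": time.time()})
```

A real agent would also handle the reverse direction, applying commands and configuration pushed down from the central IoT software.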
Limitations of Cloud-Based IoT
As we have seen, IoT consists of interconnected things that share data. The software that controls the IoT system assimilates the data and decides what the next course of action is. While this is all good in theory, there are practical limitations in the real world.
Bandwidth: IoT systems generate a huge amount of data (Big Data) that is bound to challenge existing bandwidth capacity. No matter what the connectivity, IoT devices strain bandwidth. As an example, an IIoT-connected chemical reactor, with all its sensors and actuators, can generate hundreds of megabytes of data per day. Handling this data intelligently with a software platform like PTC ThingWorx / Kepware is a challenge in itself. If your IoT cloud storage is located at a distance, the bandwidth challenges only increase.
Data Latency: Even though it is digital, IoT data has to travel from one point to another. While bandwidth is the rate of data transfer over a fixed period of time, latency is the amount of time it takes for the data to cover this distance. With so many routers involved in the transfer, the journey from source to destination is anything but straightforward, and each hop adds to the delay. This delay is called latency.
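The relationship between bandwidth and latency can be sketched with simple arithmetic. The numbers below are illustrative assumptions, not measurements from any real network:

```python
def transfer_time_seconds(payload_bytes, bandwidth_bps, latency_s):
    """Rough one-way delivery time: propagation latency plus the time to
    serialize the payload onto the link. Real networks add queuing,
    retransmissions and protocol overhead on top of this."""
    return latency_s + (payload_bytes * 8) / bandwidth_bps

# Example: a 1 MB sensor batch over a 10 Mbit/s link with 150 ms latency.
t = transfer_time_seconds(1_000_000, 10_000_000, 0.150)  # 0.95 seconds
```

Notice that even with ample bandwidth, the latency term never goes away; that fixed cost is exactly what edge processing avoids.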
Connectivity: ISPs are known to be fickle. Power outages are frequent in some countries. If data flow is interrupted for any reason, it creates issues with the IoT system.
Security: Data security is one of the biggest threats that IoT systems face. With cyber attacks becoming all too frequent, protecting IoT data as it travels through the cloud is another crucial challenge.
Let’s see an example to make the above limitations clear. Say an oil company in India manages an oil well in Latin America or the Far East. For oil wells, maintaining the pressure at which oil is pumped is crucial; if the pressure rises or falls for any reason, it can create havoc downstream, where the oil is distributed and processed. Suppose the company has implemented an efficient IoT system, and its data is deployed on the cloud in Europe. If there is any fluctuation in the oil pressure, the company cannot address the emergency instantaneously because of the data-transfer limitations.
The Need for Edge Computing
So far in this article, we have barely mentioned what Edge computing is. Let us do so now.
As we have seen, while IoT is rewarding to enterprises, it brings its own set of challenges, especially when deployed on the cloud. Edge computing refers to computing infrastructure that performs data processing at the edge of the network, near the source of the data.

Let us make this clear with an example involving the oil well mentioned above. If, instead of routing the data to the cloud, an edge IoT solution is deployed at the pressure-valve sensor, it can immediately take an intelligent call and apply remedial action. Say the pressure of the oil has increased drastically due to seismic activity. Rather than routing all the data to a cloud that may be located in Europe, the edge node itself actuates a mechanism whereby the excess oil is immediately diverted to a flare. The flaring gets rid of the unwanted or excess gases and liquids released in such situations, protecting the oil well and the entire downstream infrastructure.

Edge computing in IoT therefore bypasses the issues related to bandwidth constraints, latency and connectivity. It is a way of providing compute and storage services more immediately and closer to an organization’s physical devices, that is, at the edge of the cloud network. Edge computing is a decentralized computing infrastructure in which computing resources and application services can be distributed along the communication path from the data source to the cloud. In other words, computational needs can be satisfied “at the edge,” where the data is collected or where the user performs certain actions.

A fundamental part of Edge computing is the robust and seamless integration between IoT and the cloud; between the physical world and the world of computation. It is essential to understand that the Edge is a logical layer rather than a specific physical divide, so “where” the edge is remains open to individual opinion and interpretation.
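The flare scenario can be sketched as a local control decision that never waits on the cloud. The pressure limit, state handling and event names below are hypothetical illustrations, not actual oil-field setpoints:

```python
# Hypothetical trip point; a real well would use calibrated, certified setpoints.
MAX_PRESSURE_PSI = 900.0

def control_step(pressure_psi, flare_open):
    """One iteration of the edge control loop.

    Decides locally whether the flare valve should be open, and returns
    (new_flare_state, event), where event is a summary worth reporting to
    the cloud afterwards, or None if nothing changed.
    """
    if pressure_psi > MAX_PRESSURE_PSI and not flare_open:
        # Act immediately at the edge; do not round-trip to a distant cloud.
        return True, {"event": "flare_opened", "pressure": pressure_psi}
    if pressure_psi <= MAX_PRESSURE_PSI and flare_open:
        return False, {"event": "flare_closed", "pressure": pressure_psi}
    return flare_open, None
```

The decision completes on the device itself; only the resulting event, not the raw sensor stream, ever needs the network.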
Here are the advantages of Edge computing in IoT:

Reduced bandwidth usage: Only meaningful data travels to the cloud; routine readings are processed and discarded locally.

Lower latency: Decisions are made right next to the device, so responses are near-instantaneous.

Resilience to connectivity loss: The edge node can keep operating even when the link to the cloud is interrupted.

Improved security: Less data in transit means a smaller attack surface as data travels through the cloud.
Cloud vs. Edge – Where to Host IoT Data
While Edge computing is seen as an advanced solution, it is not always the de facto choice. Cloud hosting provides data scalability to IoT solutions. For big organizations, a central repository is extremely useful for deriving meaningful insights about the performance of different plants / production units. Cloud solutions are best suited for supporting industrial IoT platforms like ThingWorx that provide speed, security, and scalability.
Edge computing is useful for operations that require instant data analysis, where latency is not acceptable. For this reason, Edge computing is the choice for operations that need to process and respond to data in real time, without relying on a distant centralized server farm. However, Edge computing is costlier than cloud-hosted IoT because of the challenges involved in maintaining and upgrading distributed hardware.