Five Challenges to IoT Analytics Success

[Editor's Note: This is a cross-post from the author's blog.]

 

The Internet of Things (IoT) is an ecosystem of ever-increasing complexity; it’s the next wave of innovation that will humanize every object in our life. IoT is bringing more and more devices (things) into the digital fold every day, which will likely make IoT a multi-trillion-dollar market in the near future. To understand the scale of interest in IoT, just look at how many conferences, articles, and studies have been conducted about it recently, including a recent, well-written article about the future of IoT by SAP listing the insights of 21 experts. This interest hit a fever pitch last year, as many companies see a big opportunity and believe that IoT holds the promise to expand and improve business processes and accelerate growth.

However, the rapid evolution of the IoT market has caused an explosion in the number and variety of #IoT solutions, which has created real challenges as the industry evolves, chief among them the urgent need for a reliable IoT model to perform common tasks such as sensing, processing, storage, and communication. Developing that model will never be an easy task by any stretch of the imagination; there are many hurdles and challenges facing a truly reliable IoT model.

One of the crucial functions of using IoT solutions is to take advantage of IoT analytics to exploit the information collected by "things" in many ways — for example, to understand behavior, to deliver services, to improve products, and to identify and intercept business moments. IoT demands new analytic approaches: as data volumes increase to astronomical levels through 2021, the needs of IoT analytics may diverge further from those of traditional analytics.

There are many challenges facing IoT analytics, including: Data Structures, Combining Multiple Data Formats, The Need to Balance Scale and Speed, IoT Analytics at the Edge, and IoT Analytics and AI.

Data Structures

Most sensors send out data with a time stamp, and most of the data is "boring," with nothing happening for much of the time. However, once in a while something serious happens and needs to be attended to. While static alerts based on thresholds are a good starting point for analyzing this data, they cannot help us advance to the diagnostic, predictive, or prescriptive phases. There may also be relationships between data points collected at specific time intervals; in other words, classic time-series challenges.
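To make the contrast concrete, here is a minimal sketch (in Python with pandas; not from the original article) of both ideas: a static threshold alert alongside a simple rolling-baseline check on time-stamped readings. The column names, thresholds, and window size are illustrative assumptions.

```python
import pandas as pd

# Illustrative time-stamped sensor readings: mostly "boring" values with one spike.
readings = pd.DataFrame({
    "ts": pd.date_range("2024-01-01 00:00", periods=12, freq="5min"),
    "temp": [21.0, 21.1, 21.0, 20.9, 21.2, 35.5, 21.1, 21.0, 21.2, 21.1, 21.3, 21.0],
}).set_index("ts")

# Static alert: flag anything above a fixed threshold.
readings["static_alert"] = readings["temp"] > 30.0

# Time-series view: flag readings that deviate sharply from a rolling baseline,
# a first small step toward diagnostic and predictive analysis.
baseline = readings["temp"].rolling("30min").median()
readings["deviation_alert"] = (readings["temp"] - baseline).abs() > 5.0

print(readings[readings["static_alert"] | readings["deviation_alert"]])
```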

Combining Multiple Data Formats

While there are established techniques and processes for handling time-series data, the insights that really matter cannot come from sensor data alone. There are usually strong correlations between sensor data and other unstructured data. For example, a series of control unit fault codes may result in a specific service action that is recorded by a mechanic. Similarly, a set of temperature readings may be accompanied by a sudden change in the macroscopic shape of a part, captured in an image, or a change in the audible frequency of a spinning shaft. We need to develop techniques in which structured data is effectively combined with unstructured data, or what we call Dark Data.
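As a hedged sketch of what that combination might look like (assuming both record types carry a timestamp; the data, column names, and two-hour matching window are illustrative), structured fault codes can be joined to the nearest free-text service note:

```python
import pandas as pd

# Structured records: control unit fault codes with time stamps.
fault_codes = pd.DataFrame({
    "ts": pd.to_datetime(["2024-03-01 08:05", "2024-03-01 09:40", "2024-03-02 14:10"]),
    "code": ["P0301", "P0171", "P0301"],
}).sort_values("ts")

# Unstructured records: the mechanic's free-text service notes.
service_notes = pd.DataFrame({
    "ts": pd.to_datetime(["2024-03-01 10:00", "2024-03-02 15:00"]),
    "note": ["replaced ignition coil on cylinder 1", "cleaned mass airflow sensor"],
}).sort_values("ts")

# Attach the next service note (within two hours) to each fault code so the
# structured and unstructured records can be analyzed together; a real pipeline
# would also extract features from the note text itself.
combined = pd.merge_asof(fault_codes, service_notes, on="ts",
                         direction="forward", tolerance=pd.Timedelta("2h"))
print(combined)
```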

The Need to Balance Scale and Speed

Most of the serious analysis for IoT will happen in the cloud, a data center, or, more likely, a hybrid cloud and server-based environment. That is because, despite the elasticity and scalability of the cloud, it may not be suited for scenarios requiring large amounts of data to be processed in real time. For example, moving 1 terabyte over a 10Gbps network takes 13 minutes, which is fine for batch processing and management of historical data but is not practical for analyzing real-time event streams. A recent example is the data transmitted by autonomous cars, especially in critical situations that require a split-second decision.

At the same time, because different aspects of IoT analytics may need to scale more than others, the analysis algorithm should be implemented flexibly enough to be deployed at the edge, in the data center, or in the cloud.

IoT Analytics at the Edge

IoT sensors, devices, and gateways are distributed across different manufacturing floors, homes, retail stores, and farm fields, to name just a few locations. Yet moving one terabyte of data over a 10Mbps broadband network will take nine days. So we need to plan how to address the projected 40% of IoT data that will be processed at the edge in just a few years’ time. This is particularly true for large IoT deployments where billions of events may stream through each second, but systems only need to know an average over time or be alerted when a trend falls outside established parameters.
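The two transfer-time figures above follow from simple arithmetic; a quick sanity check in Python (assuming ideal link utilization, with no protocol overhead):

```python
# One terabyte expressed in bits (10^12 bytes * 8 bits per byte).
TB_BITS = 1e12 * 8

for label, bits_per_second in [("10 Gbps data-center link", 10e9),
                               ("10 Mbps broadband link", 10e6)]:
    seconds = TB_BITS / bits_per_second
    print(f"{label}: {seconds / 60:.0f} minutes ({seconds / 86400:.1f} days)")

# 10 Gbps -> roughly 13 minutes; 10 Mbps -> roughly 9 days.
```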

The answer is to conduct some analytics on IoT devices or gateways at the edge and send aggregated results to the central system. Through such edge analytics, organizations can ensure the timely detection of important trends or aberrations while significantly reducing network traffic to improve performance.
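A minimal sketch of that idea, assuming a low-power gateway that keeps a small window of readings, forwards periodic averages, and raises an immediate alert when a reading falls outside established bounds (the window size, limits, and uplink stub are illustrative assumptions):

```python
from collections import deque

WINDOW = 60             # forward one aggregate per 60 readings
LOW, HIGH = 10.0, 40.0  # established parameters for "normal" readings

buffer = deque(maxlen=WINDOW)

def send_to_central(payload):
    # Stand-in for whatever uplink the deployment actually uses (MQTT, HTTPS, ...).
    print("uplink:", payload)

def on_reading(value):
    """Called by the gateway for every raw sensor reading."""
    if not LOW <= value <= HIGH:
        # Aberration: report immediately instead of waiting for the aggregate.
        send_to_central({"type": "alert", "value": value})
    buffer.append(value)
    if len(buffer) == WINDOW:
        # Only the aggregate crosses the network, not the raw stream.
        send_to_central({"type": "aggregate", "avg": sum(buffer) / WINDOW})
        buffer.clear()
```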

Performing edge analytics requires very lightweight software, since IoT nodes and gateways are low-power devices with limited capacity for query processing. To deal with this challenge, fog computing is the champion.

Fog computing allows computing, decision-making, and action-taking to happen via IoT devices and pushes only relevant data to the cloud. Cisco coined the term “fog computing” and gave a brilliant definition for #FogComputing: “The fog extends the cloud to be closer to the things that produce and act on IoT data. These devices, called fog nodes, can be deployed anywhere with a network connection: on a factory floor, on top of a power pole, alongside a railway track, in a vehicle, or on an oil rig. Any device with computing, storage, and network connectivity can be a fog node. Examples include industrial controllers, switches, routers, servers, and video surveillance cameras.” The major benefits of using fog computing are that it minimizes latency, conserves network bandwidth, and addresses concerns at all levels of the network. In addition, it operates reliably with quick decisions, collects and secures a wide range of data, moves data to the best place for processing, lowers expenses by using high computing power only when needed and less bandwidth, and gives better analysis of and insights into local data.

Keep in mind that fog computing is not a replacement for cloud computing by any measure; it works in conjunction with cloud computing, optimizing the use of available resources. It was the product of a need to address several challenges: processing and acting on incoming data in real time, and the limitations of resources like bandwidth and computing power. Another factor helping fog computing is that it takes advantage of the distributed nature of today’s virtualized IT resources. This improvement to the data-path hierarchy is enabled by the increased compute functionality that manufacturers are building into their edge routers and switches.

IoT Analytics and AI

The greatest—and as yet largely untapped—power of IoT analysis is to go beyond reacting to issues and opportunities in real time and instead prepare for them beforehand. That is why prediction is central to many IoT analytics strategies, whether to project demand, anticipate maintenance, detect fraud, predict churn, or segment customers.

Artificial Intelligence (#AI) uses and improves current statistical models for handling prediction. AI automatically learns the underlying rules, providing an attractive alternative to rules-only systems, which require professionals to author rules and evaluate their performance. When AI is applied, it provides valuable and actionable insights.
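As a hedged illustration of that contrast (not the article's own method), the sketch below compares a hand-authored threshold rule with a model that learns what "normal" looks like from historical sensor data; scikit-learn's IsolationForest is used only as one convenient example of such a learner, and the data are synthetic:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Synthetic history: (temperature, vibration) pairs for normal operation,
# plus a couple of unusual readings appended at the end.
normal = rng.normal(loc=[21.0, 1.2], scale=[0.5, 0.1], size=(500, 2))
faulty = np.array([[24.0, 2.5], [19.0, 3.0]])
readings = np.vstack([normal, faulty])

# Rules-only system: an expert writes and maintains the threshold by hand.
rule_flags = readings[:, 1] > 2.0

# Learned system: the model infers the boundary of normal behavior from history.
model = IsolationForest(contamination=0.01, random_state=0).fit(normal)
learned_flags = model.predict(readings) == -1   # -1 marks anomalies

print("rule flagged:", rule_flags.sum(), "| model flagged:", learned_flags.sum())
```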

There are six types of IoT Data Analysis where AI can help:

1.    Data Preparation: Defining pools of data and cleaning them, which takes us to concepts like Dark Data and Data Lakes.

2.    Data Discovery: Finding useful data in the defined pools of data

3.    Visualization of Streaming Data: Dealing with streaming data on the fly by defining it, discovering it, and visualizing it in smart ways that make it easy for decisions to be made without delay.

4.    Time Series Accuracy of Data: Keeping the level of confidence in the collected data high, with high accuracy and integrity.

5.    Predictive and Advanced Analytics: A very important step where decisions can be made based on the data collected, discovered, and analyzed.

6.    Real-Time Geospatial and Location (Logistical) Data: Keeping the flow of data smooth and under control.

But it's not all "nice and rosy, comfy and cozy": there are challenges facing the use of AI in IoT, including compatibility, complexity, privacy/security/safety, ethical and legal issues, and artificial stupidity.

Many IoT ecosystems will emerge, and commercial and technical battles between these ecosystems will dominate areas such as the smart home, the smart city, and financials. But the real winners will be the ecosystems with better, more reliable, faster, and smarter IoT analytics tools. After all, what matters is how we can turn data into insights, insights into actions, and actions into profit.
