To the list of the world's greatest inventions, such as the wheel, the compass, the steam engine, concrete, the automobile, railways, and the airplane, the 21st century adds its own offering: the Internet of Things, or IoT. Gartner estimates the total economic value-add from IoT will reach US$1.9 trillion worldwide in 2020. Other reports claim that close to half the companies in sectors like oil, gas, and manufacturing are already using instrumented devices capable of providing valuable data.
The major benefit of IoT is the ability to measure, monitor, and manage any asset in any location, from anywhere, at any time. However, companies implementing IoT face some real challenges. At its heart, IoT involves dealing with data streams from a wide variety of sensors. A conventional car, for example, has roughly 30,000 parts, and the new generation of autonomous cars will likely have even more, and smarter, parts. These assets, either by themselves or through connected instrumentation, can generate about a gigabyte of data per second in operation; that works out to 3,600 GB of data per hour. All this data must be taken through an intelligent lifecycle, from capture to archive, and used throughout to support data-driven decision making.
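As a sanity check on the arithmetic above, a few lines of Python make the scale concrete. The 1 GB-per-second rate is the article's illustrative figure, not a measurement:

```python
# Data volume from a single instrumented asset, using the article's
# illustrative rate of ~1 GB per second of operation.
GB_PER_SECOND = 1

per_hour = GB_PER_SECOND * 3600          # 3,600 GB per hour
per_day_tb = per_hour * 24 / 1000        # 86.4 TB per day

print(f"{per_hour} GB/hour, {per_day_tb} TB/day")
```

Multiply that by a fleet of thousands of assets and it becomes clear why the data lifecycle, not the sensors, is where the real engineering effort lies.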
How do you make sense of this? Where do you begin? How do you create an environment that learns by itself which of the insights generated from this data to use and which to discard? Is it about accuracy, precision, or both? What impact does it have on real-time operations and decision making?
The primary challenge is handling the massive volume of data. This includes securing the data and the network, as well as performing the analytics required to derive usable business intelligence from it. IoT technologies have to support this entire process: sensing, transformation, networking, analysis, and action. Even in a homogeneous environment, dealing with assets of a single class or type, this would be a scalability challenge. With many asset types, and the assets in each class multiplying, the complexity compounds across every layer from edge to core. Add to this the lack of interoperability standards and a market crowded with siloed solutions and products.
Apart from this, an IoT implementation faces issues like:
- Network reliability: yes, it is 2017, but network outages still occur even in the most advanced nations, putting the very concept of in-stream data analytics at risk.
- Interoperability: the number of different systems connected through IoT continues to create interoperability challenges.
- Competing standards: different IoT vendors are pushing their own standards, creating a veritable Tower of Babel. This will take time to sort out.
A successful IoT implementation needs to address three issues:
- Scalability: IoT is not a one-size-fits-all solution. The key is to find a scalable platform that integrates all aspects of the business to ensure seamless information flow across the enterprise.
- Data management: sensors will throw up a huge amount of data, and there needs to be sanity across all layers of the implementation. The overall platform architecture needs a robust data management and distributed analytics framework to create actionable insights where it matters.
- Security: hackers and the loss of sensitive data pose a major risk and can interrupt operations. It is therefore a business imperative to have a solution that identifies each and every asset, locks it down at every tier using authentication and authorization, and enables logging, blacklisting, and encryption of data at rest and in transit.
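To make "actionable insights where it matters" concrete, here is a minimal sketch of edge-side aggregation: collapsing a raw per-second sensor stream into periodic summaries before anything crosses the network. The function name, window size, and readings are illustrative assumptions, not any particular product's API:

```python
# Hypothetical sketch: summarize a raw sensor stream at the edge so only
# compact records travel to the core, not every individual reading.
from statistics import mean

def aggregate_window(readings, window=60):
    """Collapse every `window` raw readings into one summary record."""
    summaries = []
    for i in range(0, len(readings), window):
        chunk = readings[i:i + window]
        summaries.append({
            "min": min(chunk),
            "max": max(chunk),
            "mean": round(mean(chunk), 2),
            "samples": len(chunk),
        })
    return summaries

# Two minutes of one-per-second readings become just two summary records.
raw = [20 + (i % 7) * 0.1 for i in range(120)]
print(aggregate_window(raw))
```

With a 60-sample window, the 3,600 GB-per-hour firehose described earlier shrinks by roughly the window factor before it ever reaches the network, which is the basic design choice behind most edge analytics architectures.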
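The per-asset identification, authentication, and blacklisting described above can be sketched with Python's standard library. The device IDs, keys, and payloads here are hypothetical, and a production deployment would layer on transport encryption (e.g. TLS) and proper key provisioning:

```python
# Hypothetical sketch: per-device message authentication with HMAC.
# Each asset gets its own provisioned key; messages from unknown or
# blacklisted devices, or with tampered payloads, are rejected.
import hashlib
import hmac
import json

DEVICE_KEYS = {"pump-001": b"secret-key-for-pump-001"}  # per-asset keys

def sign(device_id: str, payload: dict) -> str:
    """Compute an HMAC-SHA256 signature over a canonical JSON payload."""
    body = json.dumps(payload, sort_keys=True).encode()
    return hmac.new(DEVICE_KEYS[device_id], body, hashlib.sha256).hexdigest()

def verify(device_id: str, payload: dict, signature: str) -> bool:
    """Accept only known devices whose signature matches the payload."""
    if device_id not in DEVICE_KEYS:          # unprovisioned / blacklisted
        return False
    return hmac.compare_digest(sign(device_id, payload), signature)

msg = {"temp_c": 71.5, "ts": 1490000000}
sig = sign("pump-001", msg)
print(verify("pump-001", msg, sig))                              # genuine
print(verify("pump-001", {"temp_c": 99.9, "ts": 1490000000}, sig))  # tampered
```

The point of the sketch is the pattern, not the code: every message is attributable to exactly one asset, and anything the platform cannot attribute is dropped and logged.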
The benefits that accrue from a smart enterprise far outweigh the risks and complexities involved in its implementation. Customers have a real need to drive new product innovation, eliminate waste in their processes, improve product quality, reduce operational costs, and create new consumption-led business models.
So the bottom line is not whether you need to make your business smarter, but how soon you can do so. Delaying the process simply hands your competition an unassailable advantage.