Is the interoperability of things a model for the IoT?

Information and Communication Technology

With around 2 billion devices already connected to the Internet of Things, it's important to design them so they can actually "talk" to each other.

Interoperability should be an integral part of IoT design, says Barry Strauss, head of marketing at Talksum, a specialist in high-speed data processing and management. Moreover, that interoperability should be seamless: processing large volumes of disparate data arriving at high speed not only requires vast computing resources, it also takes time.

In the transport sector, vehicles currently carry an average of 60 to 100 sensors. As cars become more intelligent, that number is projected to surpass 200 per vehicle, driving the need for Intelligent Transportation Systems (ITS). Recent reports estimate that by 2020 about 22 billion sensors will enter the automotive industry each year.

With that in mind, it's critical for sensors to interoperate both within the vehicle and with entities outside it. An extreme example is a collision between two cars, although in theory sensors would help prevent that as well. After a collision, sensors may fire notifications to:

- Emergency response
- Roadside assistance
- Traffic control
- The manufacturer
- Insurance providers
- Home notification systems
- Fleet management
- Train tracks and signals
- Hospitals
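The fan-out described above can be sketched as a simple publish-style dispatch: one collision event is delivered to every registered downstream consumer. This is a minimal illustration, not Talksum's implementation; the subscriber names and event fields are hypothetical.

```python
# Hypothetical sketch: fanning out one collision event to several
# downstream services. All names and fields are illustrative only.

COLLISION_SUBSCRIBERS = {
    "emergency_response": lambda e: f"dispatch units to {e['location']}",
    "insurance": lambda e: f"open claim for VIN {e['vin']}",
    "traffic_control": lambda e: f"reroute traffic around {e['location']}",
}

def fan_out(event, subscribers):
    """Deliver a single event to every registered consumer and
    collect each consumer's resulting action."""
    return {name: handler(event) for name, handler in subscribers.items()}

event = {"type": "collision", "vin": "1HGCM82633A004352", "location": "I-80 mile 42"}
results = fan_out(event, COLLISION_SUBSCRIBERS)
```

In a real system each handler would be an asynchronous message to a separate service rather than a local function call, but the routing decision is the same.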

Today, protocols, formats, regulations, policies, and standards can vary widely across sensors and devices. It's important to connect data sources arriving from multiple systems in different schemas, protocols, and formats. At the same time, the IoT design should let a system extract, transform, and asynchronously load the data into different business-intelligence tools, analytics platforms, dashboards, and storage systems, providing interoperability to interconnected systems that would otherwise live in silos.
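One way to bridge those differing schemas is to normalize every incoming reading into a single canonical record before it reaches any downstream tool. The sketch below, assuming two hypothetical wire formats (a JSON payload and a comma-separated one), shows the idea; the field names are invented for illustration.

```python
import json

# Canonical record: {"sensor_id": str, "metric": str, "value": float}

def from_json(raw):
    """Parse a JSON-formatted sensor payload into the canonical schema."""
    d = json.loads(raw)
    return {"sensor_id": d["id"], "metric": d["metric"], "value": float(d["value"])}

def from_csv(raw):
    """Parse a comma-separated sensor payload into the canonical schema."""
    sensor_id, metric, value = raw.split(",")
    return {"sensor_id": sensor_id, "metric": metric, "value": float(value)}

PARSERS = {"json": from_json, "csv": from_csv}

def normalize(fmt, raw):
    """Dispatch to the right parser so every source yields the same shape."""
    return PARSERS[fmt](raw)

records = [
    normalize("json", '{"id": "s1", "metric": "temp", "value": "21.5"}'),
    normalize("csv", "s2,speed,88"),
]
```

Once everything shares one shape, the same analytics, dashboards, and storage paths can consume data from any source.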

Traditional approaches try to solve this by making storage bigger and faster (improving traditional databases and building new storage solutions such as Hadoop, in-memory stores, and others) and by layering better analytics on top. But this brings complexity and implementation expense, along with scalability and stability concerns.

A different, and necessary, design element is to first understand the data and act on it in real time, before storing it. Business logic can then be applied early in the process, determining what must be acted on immediately and what should be routed to the appropriate downstream systems. This make-sense-of-it-first-then-store-it approach lets enterprises efficiently manage, distribute, and track real-time IoT data, in effect providing the interoperability of things for the Internet of Things.
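The process-before-store idea can be reduced to a single routing decision per record: apply the business rules first, act immediately on anything urgent, and only send the rest to storage. A minimal sketch, assuming an invented brake-temperature rule purely for illustration:

```python
def route(record, rules, act_now, store):
    """Apply business logic before storage: records matching any rule
    are handled in real time; everything else is routed downstream."""
    if any(rule(record) for rule in rules):
        act_now(record)
    else:
        store(record)

urgent, archived = [], []

# Hypothetical rule: brake temperature above 500 needs immediate action.
rules = [lambda r: r["metric"] == "brake_temp" and r["value"] > 500]

stream = [
    {"metric": "brake_temp", "value": 650},
    {"metric": "speed", "value": 60},
]
for record in stream:
    route(record, rules, urgent.append, archived.append)
```

The inversion is the point: storage becomes one destination among several, chosen after the data is understood, rather than the mandatory first stop.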
