It’s good to talk

Andy Pye asks if society is becoming divisive as technology and industry become more connected.

It is ironic that as industry steams towards a connected future, our wider society is recoiling from the idea. Isolationist policies appear to be the order of the day: as Trump pulls away from global markets, the UK Government prefers to go it alone in the world, rather than in partnership with its closest neighbours.

What, if anything, can we learn from our automated systems? These, by contrast, seem determined to forge ever closer links with their neighbours, as unified standards make it increasingly feasible for them to speak the same language. As we scale to a world of billions of intelligent, connected sensors and other devices, mission-critical data must be transferred reliably, in real time and in the right order of priority. Manufacturing operations require tight coordination of sensing and actuation to perform closed-loop control safely and efficiently.

Industrial automation has traditionally been hampered by a patchwork of incompatible, non-interoperable standards for communication between devices – in effect, a barrier of language and culture. As a result, users have often found themselves locked into proprietary systems, while vendors have had to develop multiple versions of essentially the same product.

Machines and their designers have learned that it’s good to talk, and how hard it becomes when everyone retreats into their own proprietary dialects.

But what about all the data that these connected sensors generate? It streams in at unprecedented speed and must be assimilated, interpreted and acted upon in a timely manner. RFID tags, sensors and smart metering are driving the need to handle torrents of data: extremely large data sets that must be analysed computationally to reveal patterns, trends and associations.

And yet, while machines are heading in that direction, human society once again appears to be heading in the opposite one. A divergence in levels of trust in statistics has opened up in western liberal democracies. Shortly before the November presidential election, a study in the US found that 68% of Trump supporters distrusted the economic data published by the federal government. In the UK, a research project by Cambridge University and YouGov discovered that 55% of the population believes that the government “is hiding the truth about the number of immigrants living here”.

Antipathy to statistics has become one of the hallmarks of the populist right, with statisticians and economists chief among the “experts” ostensibly rejected by voters in 2016. Perhaps the answer lies in educating ourselves in how to analyse data objectively, in the way that we have taught our machines to do it for us?

Has the age of the machine truly arrived?

Andy Pye