In my previous article I discussed whether the world is actually ready for the Internet of Things and looked at the types of connection ecosystems that might be required to support the ever-growing diversity of edge-based IoT devices. Since that article was written, John Chambers, CEO of Cisco, has talked about 2015 being the year of inflexion for the growth of IoT. Gartner is also forecasting a 40% increase in IoT devices within smart cities between 2015 and 2016, further adding to the problem of device proliferation and the challenge of managing it.
The explosion of “things” is starting to happen, and we are seeing devices used in all sorts of distributed applications: the rise of smart cities, fitness equipment, mobile monitoring of cargo, as well as individual “app-based” applications such as being able to see inside your fridge while you are at the shop, or checking in on the babysitter remotely. The real enabler for the Internet of Things is an inherent ability to distribute the monitoring and control of individual machines without the requirement for a traditional closed control network. We apparently no longer require a specific connectivity design to allow our special-purpose devices to be controlled or maintained. I use the word “apparently” for the very specific reasons I discussed in my previous post, as there seems to be an underlying belief that simply using the Internet is good enough.
Whilst this might be true today, my thoughts extend beyond the needs of the few million devices which today enjoy early adoption on the current Internet. What will this world require when there are multiple billions of things out there? I’m conscious that the current providers claim the data sets are so small that it doesn’t matter. However, we are already seeing things which go beyond simple tasks such as reporting the status of a car parking spot (full or empty) with a latency requirement of a few tens of seconds: things which can control entire ecosystems and have the ability to report themselves faulty automagically. The fact is that autonomous things require highly complex systems for communication and management. Add to this the need for in-situ upgrade of those things (more complexity means more bugs and more opportunity to add features) and the communications requirements start to grow exponentially.
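The idea of a thing that reports itself faulty automagically can be sketched in a few lines: a device that tracks its own sensor activity and flags a fault when the sensor goes quiet, without ever being polled. This is a minimal illustration only; the class name, the silence threshold, and the report format are assumptions for the sketch, not any real device API.

```python
import time

class Thing:
    """Illustrative autonomous thing: it diagnoses its own health and
    reports a fault on its own initiative, rather than being polled.
    Names and thresholds here are assumptions, not a real device API."""

    def __init__(self, device_id, max_silence_s=30.0):
        self.device_id = device_id
        self.max_silence_s = max_silence_s  # allowed gap between sensor readings
        self.last_reading_at = time.time()

    def record_reading(self):
        # Called whenever the physical sensor produces a value.
        self.last_reading_at = time.time()

    def health_report(self, now=None):
        # Self-diagnosis: if our sensor has gone quiet too long, flag ourselves.
        now = time.time() if now is None else now
        silent_for = now - self.last_reading_at
        status = "faulty" if silent_for > self.max_silence_s else "ok"
        return {"device": self.device_id, "status": status}

spot = Thing("parking-spot-17", max_silence_s=30.0)
spot.record_reading()
print(spot.health_report())                        # status: ok
print(spot.health_report(now=time.time() + 60.0))  # status: faulty
```

Even this toy version hints at the management burden: every autonomous thing carries its own state, its own thresholds, and its own reporting logic, all of which must be upgradable in the field.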
In order to create sustainable growth in the Internet of Things we will require a layered architectural approach, much like the custom-designed industrial control systems of today, but updated to encompass the full capabilities of the Internet and the cloud.
This layering, of course, starts with connectivity to individual devices. Over the last few metres there are multiple connectivity standards, all of which connect the thing to the Internet at some point, whether WiFi, 3G/4G or any of the others. Once the thing has Internet connectivity, the potential future problems start to emerge. The current favourite place to “connect” the thing to is some kind of collection machine, usually based in a cloud compute service such as AWS or Azure, and of course this scales with the number of things connected. But if you are not careful, what gets built is a massive machine handling millions of inputs.
The cloud providers are already starting to exploit a per-message billing model on top of the compute requirements, seeing a way to monetise the communications system. This is exactly the model employed by the global voice carriers over the last 100 years: the users rent some equipment (both in the home and at the exchange) and then pay per call. Architectures which employ intermediate layers for the concentration of messages and thing control, giving the things a degree of independence, will avoid this free-fall into the ways of the old-style carrier (now suffering from the failure of that model thanks to the over-the-top services provided via the Internet) and allow control to be retained by the users of the things, rather than the providers of the infrastructure.
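The concentration layer described above can be sketched as a gateway that buffers readings from many local things and forwards them upstream as one batched payload, so that under a per-message billing model you pay once per batch rather than once per reading. This is a minimal sketch under stated assumptions; the class, batch size, and payload format are illustrative, not any particular vendor's API.

```python
import json
import time

class GatewayConcentrator:
    """Illustrative intermediate layer: collects readings from many
    local things and forwards them upstream as one batched message,
    instead of one billable cloud message per reading.
    The design here is an assumption for the sketch, not a real product."""

    def __init__(self, batch_size=10):
        self.batch_size = batch_size
        self.buffer = []
        self.forwarded_batches = 0

    def ingest(self, device_id, reading):
        # A local thing reports in; no upstream message is sent yet.
        self.buffer.append({"device": device_id,
                            "reading": reading,
                            "ts": time.time()})
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self):
        # One upstream message carries the whole batch.
        if not self.buffer:
            return None
        payload = json.dumps(self.buffer)
        self.buffer = []
        self.forwarded_batches += 1
        return payload  # in practice: send to the cloud collector

gw = GatewayConcentrator(batch_size=5)
for i in range(25):
    gw.ingest(f"sensor-{i % 5}", {"state": "ok"})
print(gw.forwarded_batches)  # 25 readings become 5 upstream messages
```

The design point is that the gateway, not the cloud provider, decides when and how often upstream messages are sent, which is precisely the control the article argues users of things should retain.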
This implies a need for a smart way to conceive of the global ecosystem of things, and it is fully in line with the needs of the modern enterprise, which wants the freedom to deploy distributed technology for business purposes while retaining the control, management and security of its enterprise network. Thankfully, this intermediate system for the Internet of Things is beginning to emerge, and several start-ups are starting to see an opportunity in this platform space, but these companies still do not necessarily understand the full needs of the enterprise for the connectivity of their things. An example is the belief that encrypting data in transit is good enough and that secure connectivity is not required (ask an enterprise IT person if encryption alone is enough). I therefore believe partnerships need to be formed with the major carriers (even though they are slow and old) to fully realise a flexible, extendable future for billions of things.
What do you think we need to do to ready the world for the Internet of Things? Leave a comment below.