Perfect Technology Storm
Where we came from
According to historical timelines, millions of years passed between the time humans made the first tools and the time the first boats were constructed, and thousands more years went by before the invention of the wheel. Static electricity was observed as early as 600 BCE, but it wasn’t until around 1660 that Otto von Guericke devised a machine that could produce it, and we had to wait almost another century before Ben Franklin conducted his famous go-fly-a-kite experiment during a thunderstorm and collected an electrical charge. A few decades later, Alessandro Volta built the first batteries, and finally in 1882 – well over a hundred years after Franklin’s experiment – the electric utility industry got its start with Thomas Edison’s Pearl Street power station in lower Manhattan.
Where we are
Now let’s look at our more recent history and what we’ve accomplished in less than a century. The first theoretical computing machine was conceived by Alan Turing in 1936. The first real computer was the 1,800 square foot, 30 ton ENIAC, completed in 1946. The first commercial personal computers appeared in the 1970s. Today, a mid-level smart phone has orders of magnitude more computing power than either of them.
Computers and networking have undergone profound progress in a short time. We’ve gone from desktop towers to laptops to tablets and phones and hybrid devices that combine elements of all of the above. When it comes to communicating information from one computer to another, we’ve moved from sneakernet to Ethernet to Wi-Fi to 4G, getting faster and more reliable with every new technology. We’ve lived through peer-to-peer networking, client-server networking, domain-based networking, and gone from server room to datacenter to the cloud.
Halfway into this second decade of the twenty-first century, we’re standing on the threshold of even more amazing changes. Not only have computers gotten much faster and more powerful, but the way that we interacted with them for decades – via keyboards and pointing devices – is giving way to far more intuitive “human interface” methods that integrate better into our everyday lives.
We can input our commands and queries to our computers via touch screens or voice, and there are some who say that in the not-too-distant future our machines will be able to connect directly to our brains so that we can simply “think” to the computers – an idea that is fascinating and scary at the same time. A proliferation of sensors can gather information without our having to explicitly input it at all: GPS tracks our location and, along with movement sensors, measures our steps; heart rate sensors provide health and fitness data to our apps; proximity sensors can detect that we’re moving too far away from our equipment and alert us to prevent loss or theft. Remember when you had to adjust the brightness on your screen manually? Now ambient light sensors can measure the room lighting and set your display accordingly, modifying it as conditions change.
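That ambient-light behavior boils down to a simple mapping from a lux reading to a brightness level. Here is a minimal sketch in Python; the breakpoints and the function itself are invented for illustration, not taken from any real device API:

```python
def brightness_for_lux(lux: float) -> int:
    """Map an ambient-light reading (in lux) to a display brightness percentage.

    Illustrative breakpoints only: roughly 0 lux (darkness) keeps a dim
    screen, 10,000+ lux (direct daylight) pushes brightness to maximum.
    """
    if lux <= 0:
        return 10                      # never go fully dark
    if lux >= 10_000:
        return 100                     # direct daylight: maximum brightness
    # Scale linearly between the two extremes.
    return 10 + round(90 * lux / 10_000)

# As the room gets brighter, the display follows automatically.
for reading in (0, 500, 5_000, 10_000):
    print(reading, "lux ->", brightness_for_lux(reading), "%")
```

A real operating system would also smooth the readings over time so the screen doesn’t flicker when a cloud passes, but the core idea is this lookup.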
Where we’re going
In the next few years, it’s almost inevitable that we will move from an Internet of computers to an Internet of Things. That transition has already begun, with the IP camera market overtaking that of CCTV surveillance, the quiet home invasion of Nest thermostats, and the growing popularity of smart phones and smart watches and clever wearables.
The future is here now, but it hasn’t quite taken hold yet. The vast majority of us still own refrigerators that can’t notice when we’re out of eggs and order up a dozen for us. Most of us still aren’t able to log on to the Internet from work and tell our ovens to please start preheating at precisely 4:47 p.m. so dinner will be ready when we get home. Very few of us are currently pausing and starting our washers’ and dryers’ cycles remotely via our smart phones. Not many of us are using an app to dispense food to our pets in small portions throughout the day. Only a small proportion of us can lock or unlock our front doors from upstairs or all the way across the country by pressing a touchscreen.
But one day – probably sooner than you think – these scenarios and many more will be a matter of routine, just everyday, unremarkable activities that we perform without thinking much about how incredibly powerful our technology has become. The Internet of Things is coming at us like a freight train, even if it’s not quite traveling at the top speed of the Shanghai Maglev. Gartner predicted last year that there will be almost 21 billion devices connected to the Internet by the year 2020 – and that’s a short four years away.
When technologies converge
The IoT, by itself, would be disruptive enough. However, it’s when we look at some of the underlying technologies that the IoT will rely upon that things get really interesting. As mentioned earlier, many of the devices that currently connect to the Internet, and even more of those that will do so in the future, are based on the premise of collecting sensor data. Lots and lots of sensor data. In other words, the IoT is set to have a profound impact on another rapidly growing technology phenomenon: Big Data.
With so many Internet-connected devices, many of them containing multiple sensors that are generating huge amounts of data, companies will be faced with a challenge: how do you sort through those petabytes of data to separate what’s relevant from what’s not, and then use the relevant data in a useful and productive way?
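A common first step in that sorting is filtering at (or near) the source, so that only readings that actually changed survive for downstream analysis. A minimal sketch in Python, where the record shape (`sensor`/`value`) and the threshold are hypothetical:

```python
def relevant(readings, min_delta=0.5):
    """Yield only readings that differ meaningfully from the previous
    reading for the same sensor; near-duplicate noise is dropped.

    The record shape ({'sensor': ..., 'value': ...}) is invented for
    this example.
    """
    last = {}
    for r in readings:
        prev = last.get(r["sensor"])
        if prev is None or abs(r["value"] - prev) >= min_delta:
            last[r["sensor"]] = r["value"]
            yield r

stream = [
    {"sensor": "thermostat-1", "value": 20.0},
    {"sensor": "thermostat-1", "value": 20.1},   # noise: discarded
    {"sensor": "thermostat-1", "value": 22.0},   # real change: kept
]
print(list(relevant(stream)))
```

At IoT scale this kind of filter would run on the device or at an edge gateway, shrinking petabytes of raw telemetry before it ever reaches central storage.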
I’m reminded of a popular quote that’s usually attributed to Clifford Stoll (famous for his investigation of KGB-affiliated hacker Markus Hess in one of the first digital forensics cases when Stoll was a sys admin at Lawrence Berkeley National Laboratory):
“Data is not information, information is not knowledge, knowledge is not understanding, understanding is not wisdom.”
Our goal in collecting big data from all the “things” on the Internet is to turn those mounds of data into – at the very least – knowledge, if not wisdom. That means in addition to big data, we need big data analytics. Despite the ever-falling price of storage, the huge capacity of drives in servers and NAS systems both on premises and in the cloud, and the free, open source nature of Hadoop – one of the most popular big data software frameworks – a properly implemented big data infrastructure still doesn’t come cheap. And the infrastructure is only part of the story.
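Hadoop’s core programming model, MapReduce, can be shown in miniature with plain Python: map each record to key/value pairs, group by key, then reduce each group. Hadoop distributes these same steps across a cluster of machines; the single-machine sketch and the sample records below are ours, purely for illustration:

```python
from collections import defaultdict

def map_reduce(records, mapper, reducer):
    """A single-machine sketch of the MapReduce pattern."""
    groups = defaultdict(list)
    for record in records:
        for key, value in mapper(record):       # map phase
            groups[key].append(value)           # shuffle: group by key
    return {key: reducer(values) for key, values in groups.items()}  # reduce

# Example: average temperature per city from raw sensor records.
records = [("nyc", 21.0), ("nyc", 23.0), ("sf", 17.0)]
averages = map_reduce(
    records,
    mapper=lambda rec: [(rec[0], rec[1])],
    reducer=lambda vals: sum(vals) / len(vals),
)
print(averages)   # {'nyc': 22.0, 'sf': 17.0}
```

The value of the real framework is not the pattern itself but the machinery around it: partitioning the data, scheduling the work across nodes, and recovering when individual machines fail.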
Effective analysis of big data requires more resources. As is so often the case, a major expense is that of qualified expert personnel to design, implement and maintain the system. Depending on your use case, that can mean developers, engineers, database managers and/or big data scientists as well as specialists in your particular field who are experienced in integrating big data into particular environments. All of this adds up fast. However, it’s going to be a necessary expense because the overwhelming amount of data that will be gathered by the enormous number of devices in the IoT will be useless without a means for storing, sorting and making sense of it.
Of course, because of this sheer volume of data, there is no way that you can hire enough data scientists to go through it and review it all manually – and even if you could, that would be tremendously inefficient. However, gaining usable business insights from massive quantities of data that comes from disparate sources is beyond the capabilities of standard computer algorithms. That’s where artificial intelligence (AI) comes in.
Rather than just follow explicitly programmed instructions, AI – through machine learning – can identify patterns and correlations in data and use them to predict future behaviors and outcomes and make decisions based on those predictions. AI can improve both the speed and the accuracy of big data analysis so that the insights from that analysis can be applied in real time or at least in a timely manner.
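At its simplest, that “learn a pattern, then predict” loop is just fitting a model to past observations, for instance an ordinary least-squares line. The toy scenario below (machine runtime vs. observed temperature) is invented; real IoT analytics would use far richer models and many more features:

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x: learn a and b from past data."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) \
        / sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

# Hypothetical history: hours of machine runtime vs. observed temperature.
hours = [1, 2, 3, 4]
temps = [30.0, 32.0, 34.0, 36.0]

a, b = fit_line(hours, temps)

def predict(x):
    """Apply the learned pattern to a new, unseen input."""
    return a + b * x

print(predict(5))   # 38.0: the trend extrapolated one hour ahead
```

The difference between this sketch and machine learning in practice is one of scale and sophistication, not of kind: the system still infers parameters from historical data and uses them to score new inputs, fast enough that the predictions can drive decisions in real time.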
What could possibly go wrong?
Most technology futurists are in agreement that the IoT is, at this point, unstoppable (short of an apocalyptic event), that it will generate far too much data for human analysts to handle, and that artificial intelligence is the solution that will unite IoT and big data in a marriage of convenience, if not of wedded bliss. But as with all marriages, the matchmaking is the easiest part. It’s making it work happily ever after that can be difficult.
Integrating AI with IoT has huge potential – and also presents a number of significant challenges. Fear is up there at the top of the list. Many people are uneasy with the idea of computers attempting to emulate human processes; this is due in part to the many science fiction tales of intelligent machines rebelling and turning against their human “masters.” This also raises legal and ethical issues concerning whether machines have “rights.”
At a less dramatic level, there are safety issues that have to be considered. This has been one of the big obstacles to acceptance of the driverless automobile. We’ve all experienced computer error in one form or another. A machine’s error in judgment could have ramifications that range from merely inconvenient to life-threatening, depending on the tasks for which the device is responsible and the environment in which it operates.
Privacy and security concerns are legitimate worries; the patterns detected in IoT-collected data can reveal a great deal about individuals, and some of that information is sensitive and personal. Many feel it’s bad enough that, given the slow demise of the cash economy and our reliance on credit cards and electronic payments, detailed records of our purchase histories exist. AI adds a brand new dimension to those concerns, as intelligent machines could look at that data and decide a person is at greater than average risk of heart attack or liver disease based on the food or amount of alcohol he buys – and if his health insurance company gets that information, his rates might go up or his policy might even be cancelled.
Even if we could quell all these fears, there are other problems that must be overcome before the merger of IoT/big data/AI becomes a widespread reality, much less provides a real return on our investment. The actual implementation – getting it all to work in real life – may involve a lot of weeping and wailing and gnashing of teeth, on several different levels.
It’s not just public acceptance that will get in the way. There are also more practical problems. The complexity of such a system means that many different components using different protocols and processes must be capable of interacting with one another. Compatibility issues are sure to arise.
Tomorrow’s technologies, if we could travel ten or twenty years into the future and see them now, would undoubtedly boggle our minds in the same way today’s smart phones and ubiquitous connectivity would have astonished our grandparents in their youth. The road between here and there, however, is likely to be a rocky one. The combination of IoT, big data and AI is the ultimate trifecta, but some won’t want to take the gamble, and those who do may have to endure some ups and downs before it pays off.