A new year is upon us, and as at the start of every year, tech journalists like me are publishing their predictions for the months ahead. I will be the first to admit that year-end prediction pieces are fun to write, but this time around I decided to do something a little different. Rather than telling you what I expect for the next year, I want to give you my tech predictions for the next five years.
I don’t claim to have a crystal ball or a psychic on speed dial, so I can’t tell you with any certainty what the future holds. However, there is one major trend underway right now that I think will shape IT for years to come. No, I’m not talking about the cloud, although the cloud does play into it in some ways. Let me give you the back story.
Maybe four or five years ago, one of the big tech stories was the so-called consumerization of IT. Back then, most large companies required employees to work from corporate-issued devices that were configured in such a way as to make the IT security guy, the HR department, and the legal department happy. Eventually, however, there was a revolt and employees demanded to use personal and often nontraditional computing devices to access corporate resources. That is what most people mean when they speak of the consumerization of IT.
In my mind, the consumerization of IT was much bigger than that. In fact, I see the consumerization of IT as a two-way street. Yes, consumer electronics were brought into the workplace, but enterprise technology also made it into the home. Consumer-grade WiFi routers, for example, include advanced networking features that were found only in corporate environments not all that long ago. Likewise, I am writing this article from a desktop computer that is equipped with a 1TB SSD. Sure, it’s a consumer-grade drive, but five years ago a terabyte of solid-state storage would have been almost unheard of.
My point is that there was more to the consumerization of IT than just the whiny boss who wanted to work from an iPad. The consumerization of IT shaped the way that we did things both in the office and at home. And as significant as this transition was, I believe that we are only at the beginning of it. Let’s call the next phase IT 2.0. The trends that I am predicting for the next five years stem from a further blurring of consumer and enterprise technology.
One of the huge trends that will shape IT in the coming years is artificial intelligence. Microsoft spent a significant amount of time discussing and demonstrating their advancements in artificial intelligence at this year’s Ignite conference. Artificial intelligence is also becoming popular in the home, thanks to devices such as Google Home and the Amazon Echo.
My point is that companies such as Google, Microsoft, and Amazon have invested heavily in artificial intelligence and in speech recognition capabilities. The thing that I find interesting is that all three of those companies serve both the enterprise and the consumer market. Since these companies will want to receive the maximum return on their investment in artificial intelligence and speech recognition, it stands to reason that these technologies are going to become much more prevalent in enterprise environments. Imagine for example how artificial intelligence and speech recognition might shape business intelligence. A user could conceivably use a front-end application to verbally perform natural language queries against backend database applications.
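To make the idea concrete, here is a minimal sketch of such a front end, written as a toy: the phrase pattern, the `sales` table, and the column names are all hypothetical, and a real system would hand the transcribed question to a far more capable natural-language layer rather than a regular expression.

```python
import re
import sqlite3

def question_to_sql(question: str) -> str:
    """Map a narrow set of natural-language questions to SQL (illustrative only)."""
    q = question.lower()
    match = re.search(r"total sales in (\w+)", q)
    if match:
        # Build a query against the hypothetical backend schema below.
        return ("SELECT SUM(amount) FROM sales "
                f"WHERE region = '{match.group(1).capitalize()}'")
    raise ValueError("question not understood")

# A toy in-memory database standing in for a real BI data source.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("East", 100.0), ("East", 50.0), ("West", 75.0)])

# In the scenario described above, this string would come from speech recognition.
sql = question_to_sql("What were the total sales in east?")
total = conn.execute(sql).fetchone()[0]
print(total)  # 150.0
```

The interesting part is not the parsing, which is deliberately trivial here, but the pipeline shape: speech recognition produces text, text is translated to a query, and the query runs against an ordinary backend database.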
One of the most significant IT developments of the ’90s was USB. Before that, there was no real standard for connecting peripherals to a computer. Many devices, such as modems and the digital cameras of the time, were connected through a nine-pin RS-232 serial port. Printers were commonly connected through a parallel port, and scanners and external disks often relied on a SCSI port. USB made it possible to plug almost anything into a universal port.
I think that within the next five years we are going to see a similar concept applied to wireless connectivity. Right now, there are a ridiculous number of wireless devices both in the home and in the office. Some of these devices make sense, but I have also seen really strange devices such as a WiFi-enabled curling iron or a WiFi-enabled baby spoon.
Although we are being inundated with wireless devices, I think that the number of devices on our networks is set to explode as part of the Internet of Things (IoT) trend. This device explosion, however, presents a number of challenges. Security, manageability, and network bandwidth all come to mind, but what about connectivity standards? Within my own home, I have WiFi devices, Z-Wave devices, Bluetooth devices, IR devices, and there is no telling what else I’m not thinking of.
My guess is that we are eventually going to see a universal standard emerge for wireless devices. Before that happens, though, I think we are going to see universal routers enter the marketplace that support all of the major wireless connectivity methods. If that sounds far-fetched, consider how your smartphone works. It connects to Bluetooth, but it also connects to WiFi and to cellular networks. That one device provides three different types of wireless connectivity. So what’s so crazy about a universal access point?
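The concept can be sketched in a few lines of code. This is purely illustrative: the protocol names and the handler callables are hypothetical stand-ins for real radio driver stacks, but they show the basic idea of one device fronting several kinds of wireless connectivity.

```python
class UniversalAccessPoint:
    """A toy model of one access point that fronts multiple radio backends."""

    def __init__(self):
        self.radios = {}

    def register_radio(self, protocol: str, handler) -> None:
        # Each protocol (WiFi, Bluetooth, Z-Wave, ...) gets its own backend.
        self.radios[protocol] = handler

    def connect(self, protocol: str, device: str) -> str:
        # Dispatch the connection request to the matching radio backend.
        if protocol not in self.radios:
            raise ValueError(f"no radio for {protocol}")
        return self.radios[protocol](device)

ap = UniversalAccessPoint()
for proto in ["wifi", "bluetooth", "zwave"]:
    ap.register_radio(proto, lambda dev, p=proto: f"{dev} joined via {p}")

print(ap.connect("zwave", "door-lock"))  # door-lock joined via zwave
```

A real universal access point would obviously need per-protocol hardware and firmware, but from the network administrator’s point of view the appeal is exactly this: a single device, a single management surface, many radio types behind it.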
The third technology I believe will rapidly evolve over the next five years is display screens. We have already reached the point at which LCD and OLED screens have become pervasive, but there are two trends that I am watching. First, I expect small form factor screens to become far less expensive. Think about it for a moment. A few years ago, it became possible to record a voice message in a greeting card. I think that screen technology will evolve to the point that you will soon be able to purchase greeting cards that allow you to record a video message.
The other thing that I am expecting is that screens will become thinner and lighter, will consume less power, and, more importantly, will support much higher resolutions. So why am I predicting this? Because the market will demand it. 4K televisions are already commonplace, and soon consumers will expect to be able to watch (and record) 4K content on their mobile devices. Furthermore, the IoT device explosion that I am predicting will drive demand for small form factor screens, bringing down the price.
I recently attended a tech lecture in which the speaker said something truly profound. I wish that I could remember where I heard this so that I could give proper credit, but the speaker said that right now is a unique time in the history of the world. It has only been within the last 50 years or so that it has been “normal” to view information two dimensionally. Prior to the invention of television and the computer, information was always consumed in some sort of three-dimensional format. The speaker went on to explain that 2D is not natural, and that there will be a tendency toward 3D computing as technology improves.
I think that we are already seeing this trend in action. Almost every Blu-ray player supports 3D video, and computing devices such as Microsoft HoloLens create 3D computing environments.
My guess is that as technology improves, 3D video will be widely adopted by gamers, but will eventually spread to more business-oriented applications such as architecture, medicine, and engineering.
Many years ago, I saw a technology demonstration in which the presenter stood in front of a projected display and verbally instructed the computer to draw a circle and color it red. He then used his hands to tell the computer to make it “this big” and to “put it over there.” This was the first demonstration that I had ever seen of gesture control.
Today, gesture control is becoming more common. Technologies such as the Xbox One’s Kinect sensor allow the user to use gestures to interact with the virtual world. Similar technology is also used by Windows 10’s Windows Hello feature for authentication purposes: a sensor array similar to the one found in the Kinect performs facial recognition to verify the user’s identity. I think that this proves that the technology can be put to work in business environments.
Right now, gesture recognition is still somewhat clunky, but I think that the technology will get better, and will eventually be used to interact with business applications. For instance, it may eventually be possible to use gestures to control a PowerPoint presentation or to interact with Google Earth or AutoCAD.
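A slide-deck controller is a simple way to picture this. The sketch below is a hedged illustration, not any vendor’s API: the gesture names and the `PresentationController` class are hypothetical, and a real system would receive events from a sensor SDK (such as a depth camera) and send keystrokes to the presentation software.

```python
class PresentationController:
    """Toy model: translate recognized gestures into slide navigation."""

    def __init__(self, slide_count: int):
        self.slide_count = slide_count
        self.current = 1  # slides are numbered starting at 1

    def handle_gesture(self, gesture: str) -> int:
        if gesture == "swipe_left" and self.current < self.slide_count:
            self.current += 1   # advance to the next slide
        elif gesture == "swipe_right" and self.current > 1:
            self.current -= 1   # go back one slide
        return self.current     # unrecognized gestures leave the deck alone

controller = PresentationController(slide_count=10)
# In practice these events would stream in from a gesture-recognition sensor.
for gesture in ["swipe_left", "swipe_left", "swipe_right"]:
    controller.handle_gesture(gesture)
print(controller.current)  # 2
```

The hard problem, of course, is the recognition itself; once a sensor can reliably emit events like these, wiring them to PowerPoint, Google Earth, or AutoCAD is the easy part.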
The future is (almost) now
None of us really knows what the future holds, but I envision a world in which the lines between enterprise and consumer technology are blurred to a much greater extent than they are today. In fact, as enterprises outsource more and more workloads to the cloud, it may become increasingly common for the workplace to be completely dominated by consumer or prosumer electronics.