A look at the emerging technology landscape of the future tells us that the role of the IT professional is changing rapidly. According to Gartner researchers and analysts, top trends for 2016 point in the direction of more and more connected devices that go far beyond the traditional computer/tablet/smartphone triad.
This Internet of Things is poised to permeate our lives and change our lifestyles in ways that no one but sci-fi writers could have imagined a decade or two ago. But it’s about much more than just controlling your washing machine or turning the lights on and off over the Internet. The next big push is to take these machines to a whole new level, to make them capable of learning and acting autonomously. In the best-case scenario, this would free us from many mundane tasks and allow us to work more productively and have more time to play – but the road to utopia is likely to be a rocky one.
Cars that do the driving for us, “smart” virtual personal assistants to organize our schedules, robots that perform surgery, mechanical soldiers to man the battlefield in place of vulnerable humans: What could possibly go wrong? Let’s examine both the promise and the potential dangers of a world where our machines can both “think” and act on those thoughts – and what it means to those of us whose job it is to oversee those systems.
I wrote recently, in an article for WindowsNetworking.com, about the perfect technological storm that is brewing over the convergence of IoT, AI and Big Data, and how it will change our lifestyles. Here I want to take it a little further and look at it from the perspective of the IT professionals who will be faced with learning entirely new skill sets and will have to reassess how we do our jobs.
Intelligent machines: this changes everything
Science fiction books and movies have seized upon the potentials of artificial intelligence as a plot device, usually focusing on the most disastrous unintended consequences that the scriptwriters can imagine. This has instilled a fear of AI in the minds of members of the general public, who associate it with rebellious robots who will leverage their superior capabilities and physical indestructibility to enslave or destroy mankind.
Such dramatic outcomes aside, there are more realistic concerns; studies by such reputable sources as Bank of America and Oxford University have predicted that smart robots could render many millions of people unemployed in the coming years, replaced by automated machines that don’t demand raises (or paychecks of any size), never take sick days, and don’t mind working twenty-four hours a day, seven days a week.
While it might seem that more computerized machines would be a recipe for job security for the computer industry, forward-thinking IT pros may realize that many of the tasks they perform to keep the servers and networks running are, in fact, repetitive, routine, and ripe for takeover by automated systems. The concept of autonomic computing, which some say hails the end of the artisan IT worker, has already been around for quite a while, first introduced by an IBM initiative a decade and a half ago. The goal is to develop computers that can manage themselves, performing the maintenance, troubleshooting and repair that human network admins have traditionally taken care of.
Modern software has been steadily moving in this direction for years. Many of the tasks that once upon a time had to be done manually – things such as running virus and malware scans, installing system and application updates, keeping hard disks defragmented, repairing corrupted files, and more – are now done in the background without the necessity for human intervention.
Autonomic computing systems are built on the premise of self-configuration, self-optimization, self-protection, self-monitoring, self-management, and self-healing. Autonomic systems also need to be self-learning and adaptive so they can change their configurations and functions when they detect changes in the operational environment, without waiting for human interaction. Cloud services providers rely heavily on autonomic computing to run their gargantuan data centers, and research into autonomic applications is the subject of such industry-wide events as the International Conference on Autonomic Computing. The ICAC boasts such supporters as Microsoft, IBM, Google, Hewlett-Packard, and SAP, and is in its thirteenth year as of 2016.
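The self-monitoring and self-healing behavior described above is commonly modeled as a control loop: observe the system, decide whether it has drifted out of a healthy range, and apply a corrective change without waiting for an administrator. Here is a minimal, hypothetical sketch of such a loop in Python; the metric (`cpu_load`), the threshold, and the "add a worker halves the load" assumption are all illustrative inventions, not any vendor's API.

```python
# Minimal sketch of an autonomic control loop: monitor a metric,
# analyze it against a threshold, plan a corrective action, execute it.
# All names and numbers here are illustrative assumptions.

def monitor(system):
    """Collect a health metric from the managed system."""
    return system["cpu_load"]

def analyze(load, threshold=0.8):
    """Decide whether the system has drifted out of its healthy range."""
    return load > threshold

def plan(system):
    """Choose a corrective action -- here, always scale out by one worker."""
    return {"action": "add_worker"}

def execute(system, change):
    """Apply the change; a real executor would call an orchestration API."""
    if change["action"] == "add_worker":
        system["workers"] += 1
        system["cpu_load"] /= 2  # illustrative: assume load halves per worker

def autonomic_loop(system, max_iterations=5):
    """Run the self-managing loop until the system is healthy again."""
    for _ in range(max_iterations):
        load = monitor(system)
        if analyze(load):
            execute(system, plan(system))
        else:
            break
    return system

system = {"cpu_load": 1.6, "workers": 2}
autonomic_loop(system)
```

Starting overloaded at a load of 1.6 with two workers, the loop adds one worker, sees the load fall back into range, and stops – no human intervention required. Real autonomic managers follow the same monitor-analyze-plan-execute shape, just against live telemetry and real orchestration APIs.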
The next logical progression of autonomic computing is the concept of autonomic networking, which is built on a compartmentalized structure that allows admins to define policies and rules, which are then carried out by the self-managing systems through software that runs on top of the operating system. The intelligent machines that make up autonomic networks interoperate to simplify the administration of today’s increasingly complex networks, both the physical networks that exist within an on-premises data center and the virtual networks that exist in the cloud.
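In a policy-driven model like the one just described, the admin's job shifts from performing actions to declaring rules; the self-managing layer then matches observed conditions against those rules and carries out the actions itself. A toy sketch, with entirely hypothetical metric names and action labels:

```python
# Sketch of policy-driven autonomic networking: the admin declares
# high-level rules; the management layer maps live metrics to actions.
# Metric names, thresholds, and action labels are illustrative only.

policies = [
    {"condition": lambda m: m["packet_loss"] > 0.05,     "action": "reroute_traffic"},
    {"condition": lambda m: m["latency_ms"] > 200,       "action": "add_capacity"},
    {"condition": lambda m: m["intrusion_score"] > 0.9,  "action": "isolate_segment"},
]

def evaluate(metrics):
    """Return every action whose policy condition matches the metrics."""
    return [p["action"] for p in policies if p["condition"](metrics)]

# A link showing heavy packet loss but acceptable latency and no intrusion:
actions = evaluate({"packet_loss": 0.08, "latency_ms": 120, "intrusion_score": 0.2})
```

The point is the division of labor: humans write and refine the policies; the software decides, continuously and at machine speed, when each one applies.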
What does this mean for IT?
If computers and networks are capable of configuring and reconfiguring themselves, detecting security vulnerabilities and threats, and patching or neutralizing them, diagnosing problems with connectivity, performance, functionality, etc. and repairing them, you might be wondering “What do they need us for?” If you think about it, though, technological advancement has always resulted in more routine tasks being done automatically, which means some jobs that involved doing those tasks may go away. On the other end, however, more complex technology requires more highly skilled people who understand that technology, which results in the creation of new and different jobs with different skill sets.
As Russell Roberts wrote five years ago in the Wall Street Journal, automation has resulted in better efficiencies for businesses, which in turn has brought about a higher standard of living and shorter work hours for everyone, with more total jobs that include many job descriptions that didn’t even exist twenty years ago. In general, these jobs are less physically strenuous and less physically dangerous as machines have taken on the kind of manual labor and exposure to risks that often damaged workers’ health and shortened their lives.
In the IT industry, as in the job market in general, job duties are shifting from mundane tasks to more intellectually stimulating knowledge-based positions that pay more. It’s true that those who aren’t able or willing to learn, grow, and change may be left behind. The good news is that those who are ready to reinvent themselves and plunge into new and more challenging (and more gratifying) types of work have the opportunity to advance as they never have before.
After decades of HR-driven emphasis on checklist requirements such as a degree in the “right” field of study or a hard-coded number of years of experience in a particular job title, companies seem to be recognizing that the candidates who appear to fill the bill on paper are not always the best ones for the job. Last year, large professional services companies Ernst & Young and PricewaterhouseCoopers announced that they were de-emphasizing the college degree requirement in the recruitment process.
Around the same time, Forbes reported that liberal arts degrees, formerly considered “useless” in the tech industry, are now being sought after by some software companies. A number of technology companies’ CEOs have come out in favor of hiring liberal arts and humanities majors – considering them to be more creative thinkers – to balance out the emphasis on logic and reason that is the focus of engineering and other technically-trained employees. After all, creativity is vital to innovation, and innovation is what the future of IT is all about.
Future-proofing your career in IT
Information technology isn’t going away anytime soon (short of a worldwide electromagnetic pulse that renders all our electronics useless). That means IT professionals will be around for a long time, too – but just as the job of today’s accountant would be almost unrecognizable to the nineteenth century bookkeeper with his ledgers and pens, tomorrow’s IT jobs may involve very different duties than those of the typical network admin of today.
What are some of the positions that we can expect to take off in the next decade? Specialists in robotics, artificial intelligence, and virtual reality design are likely to be in demand, along with 3D printing technicians, machine learning scientists and digital anthropologists. Some current jobs that will probably still be around and experience growth include cloud architects, data scientists and – because no matter how good automated threat detection and protection get, hackers will always stay one step ahead – cybersecurity experts.
As IT becomes more and more about IoT (the Internet of Things), we’ll need more people who are trained to program and integrate devices, from self-driving cars to personal and commercial drones, seamlessly into both our work environments and our off-duty lives. Of course, if you’re a coder, there will be plenty of opportunities over on the developer side, as well. All of those smart cars, smart home appliances, smart watches, and other components that are getting smart will run on software programs that will need to be written and regularly updated.
Then there’s the biotech industry and all the job roles that would be made possible in a world where computer chips are implanted into the human body. Although the idea of implanted microchips sounds like something out of a dystopian sci-fi movie and makes many people uneasy, the technology already exists and is in use. Yet another hot tech field for those who are looking ahead is that of nanotechnology, which is being developed for everything from manufacturing to medicine.
Many long-time IT professionals seem to be at a crossroads now, old enough to have a major time and learning investment in the tech industry and too young to retire as the job roles they’ve held for years are being phased out. Many are looking at retraining for new opportunities. The machines that will do much of the work in the years ahead may indeed make some IT jobs obsolete, but they will also open up new and exciting doors that can lead to career success.