A staple of science fiction from Steven Spielberg to Isaac Asimov, artificial intelligence has been a dream for scientists and engineers since the term was coined in 1956. (Spielberg’s film “A.I.” was about as bad as “Jurassic World” — actually harder to watch — but that’s another topic.) Now the technology is finally becoming a reality, and hopefully it won’t be misused by some fool like Vincent Moore in “Chappie,” who really went off the deep end. And though it’ll still be a while before you have the “Bicentennial Man” walking around your living room, businesses, especially in the IT industry, have already started using the technology for growth and efficiency. Picture MU/TH/UR 6000 from the “Alien” franchise (minus the propensity for killing, of course), and you’ll have a fairly accurate idea of what we’re talking about. Artificial intelligence has touched some businesses more than others, but none more so than datacenters. Worldwide, datacenters have started relying on AI to reduce operational costs and improve performance. Let’s see what AI and datacenters can do together.
Given the rate at which information is moving to the cloud, AI and datacenters will form the backbone of every successful business. And the more datacenters there are, the higher the operating costs will be. Interestingly, a large chunk of those operating expenses goes toward keeping the servers a datacenter houses cool. Artificial intelligence can provide a more practical solution, as Google has shown.
DeepMind Technologies, a British AI company acquired by Google in 2014, has begun applying its machine learning to datacenters. And so far the results have far exceeded anybody’s expectations.
Not only did DeepMind manage to slash the energy used for cooling the Google datacenters by up to 40 percent, the company also clearly demonstrated the potential of AI to optimize systems and address serious issues associated with climate change. The thing is, Google’s datacenters were already pretty efficient, which made these newfound gains all the more exciting.
If you’re wondering whether what works for Google will work for other datacenters as well, engineering experts believe that other large-scale, energy-hungry facilities could see similar improvements, which makes the introduction of AI a huge step forward. Firms that run on Google’s cloud will also benefit from the energy efficiency these advances bring.
Did you know that, depending on their size, datacenters can use more than a million gallons of water annually to supply their cooling systems and keep their servers operational? Under normal circumstances, this is achieved with big industrial equipment such as chillers, cooling towers, and pumps.
But the datacenter environment is complex, and the many possible combinations of cooling equipment and settings make optimal operation difficult. The equipment interacts with its environment in a nonlinear, complicated fashion that traditional human intuition and formula-based engineering cannot fully capture. Moreover, the system is slow to adapt to external and internal changes, because engineers cannot develop heuristics and guidelines for every scenario.
So, considering the fact that datacenters have unique environments and architecture, this is not a one-solution-fits-all problem. Instead, a general intelligence framework may be required to figure out the interactions of the data center. And this is exactly where artificial intelligence plays a significant role.
DeepMind Technologies, for example, improves datacenter efficiency through machine learning, a branch of artificial intelligence in which systems learn from data and make calculations and predictions based on it. The benefits don’t stop there: pairing AI with datacenters can also considerably improve the efficiency of the overarching datacenter system.
Google’s approach to using artificial intelligence was, thus, an ingenious one. The company built a system of neural networks and trained them on the various operating scenarios and parameters of its datacenters. The outcome was a highly adaptive framework, one that could optimize efficiency and understand datacenter dynamics better than ever before.
Though the process sounds simple when described this way, it is more complex than you might think. Data first had to be collected from the numerous sensors in the datacenter, monitoring pump speeds, set-points, temperatures, power draw, and more, and then used to train deep neural networks.
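The shape of that pipeline, sensor readings in, an efficiency prediction out, can be sketched in a few lines. The sketch below is a deliberate simplification: a single linear model trained by gradient descent on synthetic, invented sensor data, standing in for the deep neural network ensembles DeepMind actually used. The feature names and the hidden relationship generating the target are assumptions for illustration only.

```python
import random

random.seed(0)

# Synthetic stand-ins for sensor readings: (pump speed, setpoint, load).
# In the real system these would come from thousands of datacenter sensors.
def make_sample():
    pump = random.uniform(0.3, 1.0)
    setpoint = random.uniform(18.0, 27.0)
    load = random.uniform(0.4, 1.0)
    # Invented "true" relationship producing the efficiency target (a PUE-like number).
    pue = 1.05 + 0.10 * load + 0.02 * (setpoint - 18.0) / 9.0 - 0.05 * pump
    return [pump, setpoint / 27.0, load], pue

data = [make_sample() for _ in range(500)]

# A single linear neuron trained by stochastic gradient descent --
# a toy stand-in for DeepMind's deep network ensembles.
w = [0.0, 0.0, 0.0]
b = 0.0
lr = 0.05
for _ in range(1000):
    for x, y in data:
        pred = sum(wi * xi for wi, xi in zip(w, x)) + b
        err = pred - y
        for i in range(3):
            w[i] -= lr * err * x[i]
        b -= lr * err

# Evaluate: mean absolute error of the trained predictor.
mae = sum(abs(sum(wi * xi for wi, xi in zip(w, x)) + b - y)
          for x, y in data) / len(data)
print(f"mean abs error: {mae:.4f}")
```

Once a model can predict efficiency from the sensor state, an optimizer can search over controllable settings (pump speeds, setpoints) for the combination the model scores best, which is the part human intuition struggles with.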
AI also plays a significant role in automating datacenter maintenance and security tasks. In previous setups, in the event of a security attack, datacenter workers had to secure the servers manually. Likewise, maintenance tasks, like fixing faulty storage devices or replacing problematic servers, could get overwhelming because of their manual nature. Throw robotics into the mix, and datacenters could become more self-reliant. In the future, AI might guide robots through maintenance work, ensuring a higher quality of service (QoS) at a reduced operational cost.
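A small, concrete piece of that automation story is fault detection: deciding which devices need attention before dispatching a technician (or, someday, a robot). The check below is a hypothetical sketch of my own, not any vendor's actual pipeline; it simply flags drives whose error counts are outliers relative to the rest of the fleet.

```python
# Hypothetical sketch: flag drives whose error counts are outliers
# relative to the fleet -- the kind of check an automated maintenance
# pipeline might run before scheduling a replacement.
from statistics import median

def flag_suspect_drives(error_counts, factor=5, floor=10):
    """Return drive IDs whose error count exceeds `factor` times the
    fleet median (with a small floor so an all-zero fleet stays quiet)."""
    baseline = max(median(error_counts.values()), 1)
    threshold = max(baseline * factor, floor)
    return sorted(d for d, n in error_counts.items() if n > threshold)

fleet = {"sda": 2, "sdb": 0, "sdc": 1, "sdd": 120, "sde": 3}
print(flag_suspect_drives(fleet))  # -> ['sdd']
```

In practice a production system would use learned models over richer telemetry, but the principle is the same: turn raw device metrics into a short, actionable work queue.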
The AI system Google uses in its datacenters monitors and manages more than a hundred variables, including fans, cooling systems, and even windows.
If all this sounds astounding, that’s because it is. But there is always room for improvement, and the full potential of AI has yet to be explored. According to the researchers, much comes down to how many sensors are placed inside the datacenter. Now that they have a clearer idea of how it all works, they are likely to experiment further and push the boundaries, expecting even better outcomes.
Since the algorithm works great when it comes to figuring out complex dynamics, it can be applied to various other challenges within the datacenter environment in the near future. The applications of AI can have a positive impact on other industries as well.
Artificial intelligence may be a new addition to the world of datacenters, but its implementation opens up several new avenues.
If properly explored, AI technology may later be used to improve the conversion efficiency of power plants, help manufacturing units raise throughput, and cut the energy used in semiconductor manufacturing. For the time being, the benefits of AI in datacenters have given engineers and researchers the impetus to explore every way artificial intelligence can help them.