Oliver Goodman, Head of Engineering at Telehouse, explains the impact AI is having on data centre security and energy efficiency

Demand for data centre (DC) services has been rising steadily every year, but since the beginning of the COVID-19 pandemic it has skyrocketed, with people and businesses more reliant on these services than ever before. Despite some operators scrambling to overcome capacity shortages, the sector has coped well with the increased demand and has even achieved greater recognition, with the UK government giving the sector a voice on COVID-related matters and DC workers being granted key worker status.

When demand is rising this rapidly, relying on human monitoring and intervention alone makes it hard to stay on top of three of a DC’s biggest challenges – energy efficiency, electricity costs and cybersecurity. This is where AI can help.

Maximising energy efficiency and minimising cost

It’s no secret that DC facilities are power hungry, so it would be easy to assume the sector has a negative environmental impact. However, this simply isn’t the case. A recent survey of UK commercial operators revealed that 76.5% of the electricity they purchase is 100% renewable, 6.5% is between 0% and 50% renewable, 7% is between 50% and 99% renewable and 10% is purchased according to customer demand. But that doesn’t mean DC operators aren’t going further to improve energy efficiency, and this is one area where AI can help.

The load (the amount of energy consumed by servers and network equipment in server halls) can vary at any given time depending on network demand, and accommodating the load efficiently is challenging without the intervention of AI. For example, if the load suddenly goes up in one server hall, additional chilling is required to keep the servers cool and running efficiently. Energy efficiency gains can be made by knowing exactly when to switch that additional chiller on and when to switch it off.

By collecting, aggregating and analysing operational data, AI can set certain trigger points and execute actions – such as switching the chiller on or off – at exactly the right moment. Machine learning can also be deployed to understand load patterns and predict when fluctuations in load will occur, allowing DC operations teams to react efficiently. In an uninterruptible power supply (UPS), AI can switch between efficiency modes automatically in response to changing load levels, ensuring the system runs as close as possible to optimum efficiency for the load at any given time.
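
To make the idea of trigger points concrete, here is a minimal sketch of the kind of rule an AI-driven control layer could automate for supplementary cooling. The thresholds, hall-load readings and chiller interface are hypothetical, and a real facility would layer load prediction and far richer telemetry on top of a rule like this.

# Minimal sketch of a trigger-point rule for supplementary cooling.
# Thresholds, telemetry values and the chiller interface are hypothetical.

ON_THRESHOLD_KW = 450   # bring the extra chiller online above this hall load
OFF_THRESHOLD_KW = 380  # switch it off below this; the gap avoids rapid cycling

def control_chiller(hall_load_kw: float, chiller_is_on: bool) -> bool:
    """Return the desired chiller state for the current hall load."""
    if not chiller_is_on and hall_load_kw >= ON_THRESHOLD_KW:
        return True   # load has risen: add cooling to keep servers efficient
    if chiller_is_on and hall_load_kw <= OFF_THRESHOLD_KW:
        return False  # load has fallen back: shut the chiller down to save energy
    return chiller_is_on  # otherwise keep the current state

# Example: feed readings from a building-management system into the rule.
chiller_on = False
for reading_kw in [360, 410, 455, 470, 400, 375]:
    chiller_on = control_chiller(reading_kw, chiller_on)
    print(f"load={reading_kw} kW -> chiller {'on' if chiller_on else 'off'}")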

This can also be applied to reducing electricity overheads. Balancing energy efficiency with the cost of electricity is a constant struggle for DC operators. With loads increasing every year, operators are faced with growing electricity bills. Attempts to keep electricity costs low can impinge on the energy efficiency of the facility. For example, running chillers at 10% of their capacity is one way to minimise electricity costs, but it means the chillers will run inefficiently.

IT Programmer Working in Data Center System Control Room.

AI can be used very effectively in control systems to help operators balance cost and efficiency. This is improving over time but there is an onus on the manufacturers to make these developments faster so that operators can build greater levels of automation on top of those systems to help strike the right balance.

Robust cyber security measures

Increasing cyber security in DCs largely comes down to understanding behavioural patterns in the IT infrastructure and reacting immediately when a typical pattern is disrupted by an atypical behavioural event. This is very similar to the way cyber security works in a conventional office-based business. Each company device has its typical usage pattern, and AI can learn how individual devices normally interact with the network. A device logging on to the network outside of regular working hours and extracting data from the system would be an unusual behavioural event; AI can recognise this, disable the device’s network access and notify the business of a possible attempted security breach.

In the context of a DC, AI will monitor the behavioural pattern of every server and react accordingly to any event that diverges from the typical pattern. These capabilities can be leveraged at an extremely granular level to further enhance security – for example, flagging a server whose behaviour suddenly changes after somebody has been present in its server hall. This kind of granularity offers huge potential for DCs from a cyber security perspective and will continue to improve security as demand for their services grows.
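
To illustrate the behavioural-baseline idea, here is a minimal sketch that flags a server whose outbound traffic deviates sharply from its learned pattern. The traffic figures and the z-score threshold are assumptions for illustration; a production system would model far richer behaviour across many metrics.

import statistics

# Minimal sketch of behavioural anomaly detection for a single server.
# Baseline traffic figures and the z-score threshold are hypothetical.

baseline_mb_per_min = [52, 48, 55, 50, 47, 53, 49, 51, 54, 50]  # learned "normal" outbound traffic
mean = statistics.mean(baseline_mb_per_min)
stdev = statistics.stdev(baseline_mb_per_min)

def is_anomalous(current_mb_per_min: float, z_threshold: float = 4.0) -> bool:
    """Flag readings that sit far outside the server's usual range."""
    z = (current_mb_per_min - mean) / stdev
    return abs(z) > z_threshold

for reading in [51, 49, 220]:  # the last reading mimics a sudden data-extraction spike
    if is_anomalous(reading):
        print(f"{reading} MB/min: anomaly - isolate the server and raise an alert")
    else:
        print(f"{reading} MB/min: within normal behaviour")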

Where humans would typically struggle to make data-informed, split-second decisions that could cut energy costs, improve efficiency or stop a data breach, AI is helping the DC sector to evolve. It’s an exciting time for the sector, and we can expect decision-making to become more intelligent and autonomous as AI-driven solutions continue to evolve.

Learn more about emerging trends across the tech panorama in the latest issue of Interface

Experts have been predicting for some time that the automation technologies applied in factories worldwide would eventually be applied to datacentres – not only to improve their efficiency, but to help gather business insights from ever-increasing pools of data. The truth is that we’re rapidly advancing this possibility with the application of Robotic Process Automation (RPA) and machine learning in the datacentre environment. But why is this so important?

At the centre of digital transformation is data and, thus, the datacentre. As we enter this new revolution in how businesses operate, it’s essential that every piece of data is handled and used appropriately to optimise its value. This is where the datacentre becomes crucial as the central repository for data. Not only are datacentres required to manage increasing amounts of data and ever more complex machines and infrastructures, but we also want them to generate improved information about our data more quickly.

In this article, Matthew Beale, Modern Datacentre Architect at automation and infrastructure service provider Ultima, explains how RPA and machine learning are today paving the way for the autonomous datacentre.

The legacy datacentre

Currently, businesses spend too much time and energy dealing with upgrades, patches, fixes and the monitoring of their datacentres. While some may run adequately, most suffer from three critical issues:

• Lack of consistent support – for example, humans make errors when updating patches or maintaining networks, leading to compliance issues.

• Lack of visibility for the business – for example, multiple IT staff look after multiple apps or different parts of the network with little coordination around what the business needs.

• Lack of speed when it comes to increasing capacity, migrating data or updating apps.

Human error is by far the most significant cause of network downtime, followed by hardware failures and breakdowns. With little to no oversight of how equipment is performing, action can only be taken once downtime has already occurred. The cost impact is then much higher: focus is pulled away from other work to manage the cause of the issue, on top of the impact of the downtime itself. Stability, cost and time management must be tightened to provide a more efficient datacentre. Automation can help achieve this.

‘Cobots’ make humans six times more productive

Automation provides ‘cobots’ to work alongside humans, with wide-ranging benefits. The precisely structured environment of the datacentre is the perfect setting to deploy these software robots. There are many menial, repetitive and time-intensive tasks that can be taken away from users and given to a software robot, boosting both consistency and speed.

Ultima calculates that the productivity ratio of ‘cobot’ to human is 6:1. Once the processes worth automating have been identified, software robots can be programmed to perform them and, once verified, will repeat them the same way every time. Whatever the process, robotics ensures it is consistent and accurate, meaning every task will be much more efficient. This empowers teams to intervene only to make decisions in exceptional circumstances.

The self-healing datacentre

Automation minimises the amount of human maintenance the datacentre requires. Robotics and machine learning restructure and optimise traditional processes, meaning that humans are no longer needed to perform patches to servers at 3 am. Issues can be identified and flagged by machines before they occur, eliminating downtime.

Re-distribution of resources and capacity management

As the lifecycle of an app across the business changes, resources need to be redeployed accordingly. With limited visibility, it’s extremely difficult, if not impossible, for humans to distribute resources effectively without the use of machines and robotics. For example, automation can scale resources down towards the end of an app’s life so they can be put to better use elsewhere. Ongoing capacity management also evaluates resources across multiple cloud platforms for optimised utilisation. When the workload is effectively balanced, not only does this offer productivity and cost savings, it also allows for predictive analytics.
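
As a rough illustration of the kind of rule an automated capacity manager might apply, the sketch below reclaims or adds capacity based on utilisation thresholds. The app names, utilisation figures and thresholds are hypothetical, and a real system would draw on the richer, ML-driven evaluation described above.

# Minimal sketch of capacity rebalancing across apps.
# App names, utilisation figures and thresholds are hypothetical.

apps = {
    "legacy-billing": {"allocated_vcpus": 32, "avg_utilisation": 0.12},  # app nearing end of life
    "customer-portal": {"allocated_vcpus": 24, "avg_utilisation": 0.83},
    "reporting": {"allocated_vcpus": 16, "avg_utilisation": 0.45},
}

SCALE_DOWN_BELOW = 0.25  # reclaim capacity from under-used apps
SCALE_UP_ABOVE = 0.75    # add capacity where demand is growing

for name, stats in apps.items():
    if stats["avg_utilisation"] < SCALE_DOWN_BELOW:
        print(f"{name}: reclaim unused vCPUs for redeployment elsewhere")
    elif stats["avg_utilisation"] > SCALE_UP_ABOVE:
        print(f"{name}: allocate additional vCPUs to keep headroom")
    else:
        print(f"{name}: leave allocation unchanged")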

The art of automation

These new, consumable automation functions are the result of what Ultima has already been doing for the last year, when it found itself solving similar problems for three of its customers. It was moving those customers off the end-of-life 5.5 version of VMware and recognised that it would be helpful to be able to migrate them to the updated version automatically, so it developed a solution to do this. Where once it would have taken 40 days to migrate workloads, the business cut that in half, resulting in a 33 per cent cost saving for those companies. It then moved on to other processes to automate, with the ambition of taking its customers on a journey to full datacentre automation.

Using discovery tools and automated scripts to capture all the data required to design and migrate infrastructure to the automated datacentre, Ultima defines infrastructure as code to create repeatable deployments customised for customer environments. These datacentre deployments can then scale where needed without manual intervention.

The journey to a fully automated datacentre

The first level of automation presents information to administrators in a user-friendly, consumable way. The next level provides recommendations, based on usage trends, for administrators to approve. From there, automation leads to a system that automatically takes remediation actions and raises tickets based on smart alerts. Finally, you reach a fully autonomous datacentre utilising AI and machine learning, which determines the appropriate steps and can self-learn and adjust thresholds.
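
The later stages of that journey can be pictured as an alerting loop that not only remediates automatically but also learns its own thresholds. The sketch below is a deliberately simplified illustration: the disk-utilisation metric, headroom margin, learning rate and remediation stand-in are all assumptions, not features of Ultima’s platform.

# Minimal sketch of automated remediation with a self-adjusting alert threshold.
# The metric, margin, learning rate and remediation action are all hypothetical.

HEADROOM = 10.0      # keep the alert threshold this far above typical utilisation
LEARNING_RATE = 0.2  # how quickly the baseline tracks observed behaviour

baseline = 70.0      # running estimate of "normal" disk utilisation (%)

def remediate(value: float) -> None:
    # stand-in for raising a ticket and expanding the volume automatically
    print(f"  -> utilisation {value}%: remediating and raising a ticket")

for value in [72, 75, 78, 91, 88, 76]:
    threshold = baseline + HEADROOM
    if value > threshold:
        remediate(value)
    # self-learn: move the baseline towards what the workload actually does
    baseline += LEARNING_RATE * (value - baseline)
    print(f"utilisation {value}% (threshold {threshold:.1f}%), new baseline {baseline:.1f}%")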

AI-driven operations start with automation

Businesses are adopting modern ways of consuming applications as well as modern ways of working. Over 80 per cent of organisations are either using or adopting DevOps methodologies, and it is critical to the success of these initiatives that the platforms in place can support these ways of working while still keeping efficiency and utilisation high.

In the not-too-distant future there will be a central platform supporting traditional and next-generation workloads, automated in a self-healing, optimal way at all times. This means that when it comes to migration, maintenance, upgrades, capacity changes, auditing, back-up and monitoring, the datacentre takes the majority of actions itself with little or no human intervention required. As with autonomous vehicles, the possibilities for automation are never-ending; it’s always possible to continually improve the way work is carried out.

Matthew Beale is Modern Datacentre Architect, Ultima, an automation and transformation partner. You can contact him at matthew.beale@ultima.com and visit Ultima at www.ultima.com

Microsoft has developed a fully automated system that stores digital data as DNA, in an attempt to reduce the physical space that stored data occupies.

A proof-of-concept, conducted by the software giant and the University of Washington, successfully encoded the word “hello” into snippets of fabricated DNA and converted it back to digital data using a fully automated end-to-end system.

Microsoft is looking to address capacity issues in modern data centres by encoding digital information in synthetic DNA molecules, which occupy significantly less space than the storage media data centres currently use.

Microsoft believes that through molecular computing technologies and algorithms, the DNA system could fit all the information currently stored in a warehouse-sized datacentre into a space “roughly the size of a few board game dice”.

The automated DNA data storage system uses Microsoft software, developed with the UW team, that converts the ones and zeros of digital data into the As, Ts, Cs and Gs that make up the building blocks of DNA. To retrieve the data, liquids and chemicals are used to read the DNA sequence back into a form that computers can understand.
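
As a rough illustration of the digital-to-DNA conversion, the sketch below maps each pair of bits to one of the four bases and back again. The real Microsoft and UW pipeline adds error correction, addressing and the wet-lab steps described above, so this two-bits-per-base mapping is only a simplified assumption.

# Simplified sketch of converting bits to DNA bases and back.
# This 2-bits-per-base mapping is an illustrative assumption,
# not Microsoft's actual encoding scheme.

BITS_TO_BASE = {"00": "A", "01": "C", "10": "G", "11": "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(text: str) -> str:
    bits = "".join(f"{byte:08b}" for byte in text.encode("utf-8"))
    return "".join(BITS_TO_BASE[bits[i:i + 2]] for i in range(0, len(bits), 2))

def decode(strand: str) -> str:
    bits = "".join(BASE_TO_BITS[base] for base in strand)
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

strand = encode("hello")   # the same word used in the proof-of-concept
print(strand)              # CGGACGCCCGTACGTACGTT
print(decode(strand))      # -> "hello"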

Microsoft principal researcher Karin Strauss commented: “Our ultimate goal is to put a system into production that, to the end user, looks very much like any other cloud storage service — bits are sent to a data centre and stored there and then they just appear when the customer wants them. To do that, we needed to prove that this is practical from an automation perspective.”