
Energy Efficiency and Reliability in Modern Data Centres

Introduction

Data centres are rapidly becoming the powerhouses of the modern world. With the rise of digital industries, virtually all business operations now rely in some way on the transfer of data. As data transfer rates increase in tandem with the explosion in mobile communication, the demands on data centre infrastructure are ever increasing.

It is estimated that by 2018 global data traffic will exceed 8,500 exabytes, a 32% compound annual growth rate.

Data centres provide the infrastructure to support the transfer and hosting of data. They are often classified into four tiers, with Tier 4 providing the highest levels of redundancy, security and efficiency. For example, a Tier 4 data centre is required to have an uptime of 99.995%, equivalent to less than 27 minutes of downtime per year! Tier 4 sites have fully redundant systems, power supplies and biometric security. Zero downtime is the ideal, as the costs incurred via end-user penalties can be huge.
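The quoted downtime figure follows directly from the uptime percentage; a quick sketch of the arithmetic:

```python
# Convert a tier's uptime guarantee into allowable downtime per year.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def max_downtime_minutes(uptime_percent: float) -> float:
    """Maximum annual downtime (minutes) for a given uptime percentage."""
    return MINUTES_PER_YEAR * (1 - uptime_percent / 100)

print(round(max_downtime_minutes(99.995), 1))  # Tier 4: ~26.3 minutes/year
```

So a Tier 4 site at 99.995% uptime is indeed allowed just over 26 minutes of downtime per year, under the quoted 27-minute figure.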


Why the need to measure temperature, humidity and differential pressure?

Data centres must be maintained at specific environmental conditions to ensure the performance and longevity of the hardware installed. As standard, temperature must be held at 18–27 °C, dew point at 5–15 °C dp and relative humidity no higher than 60 %rh. This keeps the hardware at a suitable temperature, avoids condensation and reduces the chance of static build-up (caused by low humidity).
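The dew-point limit can be checked from a temperature and relative humidity reading. A minimal sketch using the Magnus approximation (the coefficients below are one common parameterisation, not a Rotronic formula):

```python
import math

def dew_point(temp_c: float, rh_percent: float) -> float:
    """Approximate dew point (degC) from air temperature and relative
    humidity using the Magnus formula (valid roughly -45..60 degC)."""
    a, b = 17.62, 243.12
    gamma = math.log(rh_percent / 100) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# A reading of 22 degC at 50 %rh sits comfortably in the 5-15 degC dp window.
print(round(dew_point(22.0, 50.0), 1))  # ~11.1 degC
```

A reading outside the 5–15 °C dp band would flag either a condensation risk (too high) or a static-electricity risk (too low).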

A control range of 9 °C may seem relatively broad; however, essentially 100% of the energy supplied to server hardware is converted to heat. In most data centres, if the cooling system fails and servers are not shut down, heat levels will rise above a critical 35 °C within minutes or even seconds. Left unchecked, rising temperatures cause hardware damage and can result in electrical fires.
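The "minutes or even seconds" claim can be sanity-checked with a back-of-the-envelope heat balance. This sketch assumes all server power goes into the room air with no heat escaping; the room volume and IT load are illustrative figures, not from the article:

```python
# Rough warming rate of a sealed server hall after cooling failure.
AIR_DENSITY = 1.2   # kg/m^3, air at roughly room conditions
AIR_CP = 1005.0     # J/(kg*K), specific heat capacity of air

def warming_rate_k_per_s(it_load_w: float, room_volume_m3: float) -> float:
    """Kelvin per second of temperature rise, assuming adiabatic heating."""
    air_mass = AIR_DENSITY * room_volume_m3
    return it_load_w / (air_mass * AIR_CP)

# Assumed example: a 200 kW IT load in a 500 m^3 hall.
rate = warming_rate_k_per_s(200_000, 500)
print(round(rate, 2), "K/s")  # ~0.33 K/s
```

At roughly 0.33 K/s, air at the top of the 18–27 °C band would pass 35 °C in well under a minute, consistent with the text. In practice thermal mass in racks and walls slows this, but the order of magnitude stands.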

Achieving the specified control range requires precision sensors and advanced control systems. Typically, modern data centres are designed using computational fluid dynamics to ensure the very highest efficiency. Despite this, it is estimated that around 5% of US electrical energy consumption goes to data centre cooling.


Since 100% of the electricity utilised by servers is converted to heat, a cooling system removing all of that heat could in theory consume a comparable amount of energy again. Efficiency is measured by comparing total facility energy use with IT equipment energy use; this ratio is called Power Usage Effectiveness (PUE). Theoretically PUE can be as low as 1, but typically reported values are above 2. By utilising precision measurement and design, modern data centres achieve PUEs of ~1.1!

An improvement of 0.5 in a data centre’s PUE equates to an energy saving of ~£2.2 M and ~12,000 tonnes of CO2 over 5 years (for a site with a 1 MW load).
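The quoted savings are easy to reproduce. The electricity price and grid carbon intensity below are assumptions chosen to match the article's figures, not stated values:

```python
# Sanity-check the savings from a 0.5 PUE improvement at a 1 MW IT load.
IT_LOAD_KW = 1000
HOURS_PER_YEAR = 8760
PRICE_GBP_PER_KWH = 0.10   # assumed electricity tariff
CO2_KG_PER_KWH = 0.55      # assumed grid carbon intensity

def five_year_savings(pue_delta: float) -> tuple[float, float]:
    """Return (GBP saved, tonnes CO2 avoided) over 5 years."""
    kwh_saved = pue_delta * IT_LOAD_KW * HOURS_PER_YEAR * 5
    return kwh_saved * PRICE_GBP_PER_KWH, kwh_saved * CO2_KG_PER_KWH / 1000

gbp, tonnes = five_year_savings(0.5)
print(f"~£{gbp / 1e6:.1f} M, ~{tonnes:,.0f} t CO2")
```

Every 0.5 of PUE at a 1 MW IT load is 500 kW of overhead, or about 21.9 GWh over five years, which lands at roughly £2.2 M and 12,000 tonnes of CO2 under these assumptions.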


What solutions can Rotronic offer?

Rotronic provides a range of instruments for environmental monitoring and control. Reliable and precise outside air sensors and weather shields enable natural cooling to be utilised where possible.

Inside the data centres, Rotronic interchangeable HC2-S probes can provide a combination of precise, fast response temperature and humidity measurements with ease of calibration. Our latest PF4 differential pressure transmitters provide precision low drift measurements.

With both digital and a range of analogue outputs available as well as several probe mounting options, products can be selected for all applications.

Importantly though we aim to understand your needs and build a relationship with the goal of providing an appropriate solution, combining instruments, training, calibration and ongoing support.

Dr Jeremy Wingate
Rotronic UK


Temperature and Humidity Monitoring in Data Centres

Over the years there has been a rapid increase in large stand-alone data centres housing computer systems, hosting cloud computing servers and supporting telecommunications equipment. These facilities are crucial to the IT operations of companies around the world.

It is paramount for manufacturers of information technology equipment (ITE) to increase computing capability and improve computing efficiency. With an influx of data centres required to house large numbers of servers, they have become significant power consumers. All the stakeholders, including ITE manufacturers, physical infrastructure manufacturers, and data centre designers and operators, have been focusing on reducing power consumption from the non-computing part of the overall power load; one major cost is the cooling infrastructure that supports the ITE.

Data Centre Modelling

Too much or too little humidity can make us uncomfortable, and computer hardware does not like extreme conditions any more than we do. With too much humidity condensation can occur; with too little, static electricity can build up. Both can have a significant impact and can cause damage to computers and equipment in data centres.

It is therefore essential to maintain and control ideal environmental conditions, with precise humidity and temperature measurement, thus increasing energy efficiency whilst reducing energy costs in data centres. The ASHRAE Thermal Guidelines for Data Processing Environments have helped create a framework for the industry to follow and better understand the implications of cooling ITE components.

Rotronic’s high-precision, fast-responding and long-term-stable temperature and humidity sensors are regularly specified for monitoring and controlling conditions in data centres.

Why measure temperature and humidity?

Maintaining temperature and humidity levels in the data center can reduce unplanned downtime caused by environmental conditions and can save companies thousands or even millions of dollars per year. A recent whitepaper from The Green Grid (“Updated Air-Side Free Cooling Maps: The Impact of ASHRAE 2011 Allowable Ranges”) discusses the new ASHRAE recommended and allowable ranges in the context of free cooling.

Humidity varies to some extent with temperature; however, in a data centre the humidity ratio should never fall below 0.006 kg of water per kg of dry air, nor exceed 0.011 kg/kg.

Maintaining the temperature between 20 °C and 24 °C is optimal for system reliability. This range provides a safe buffer for equipment to operate in the event of air-conditioning or HVAC failure, while making it easier to maintain a safe relative humidity level. In general, ITE should not be operated in a data centre where the ambient room temperature exceeds 30 °C. Maintaining ambient relative humidity between 45% and 55% is recommended.
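The thresholds above translate naturally into monitoring logic. A minimal sketch of a range check against the quoted limits (the function name and message strings are illustrative, not from any Rotronic product):

```python
def classify_reading(temp_c: float, rh_percent: float) -> str:
    """Classify a temperature/humidity reading against the quoted limits:
    20-24 degC optimal, 30 degC absolute ceiling, 45-55 %rh recommended."""
    if temp_c > 30:
        return "critical: over-temperature"
    issues = []
    if not 20 <= temp_c <= 24:
        issues.append("temperature outside optimal band")
    if not 45 <= rh_percent <= 55:
        issues.append("humidity outside recommended band")
    return "; ".join(issues) if issues else "ok"

print(classify_reading(22.5, 50))  # ok
print(classify_reading(26.0, 40))  # both bands breached, but not critical
```

In a real deployment the same logic would feed an alarm system rather than a print statement.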

Additionally, data centre managers need to be alerted to changes in temperature and humidity levels.

Rotronic temperature and humidity probes, combined with suitable transmitters or loggers, are well suited to monitoring and controlling conditions in data centres thanks to their high precision, fast response and long-term stability.

With Rotronic HW4 software, a separate monitoring system can be implemented. This enables data centre managers to view measured values and automatically save the measured data. Alarms via email and SMS, along with printed reports, help guarantee data integrity at all times.

Dr Jeremy Wingate
Rotronic UK