Introduction
Data centres are rapidly becoming the powerhouses of the modern world. With the rise of digital industries, virtually all business operations now rely in some way on the transfer of data. As data transfer rates increase in tandem with an explosion in mobile communication, the demands on data centre infrastructure are ever increasing.
It is estimated that by 2018 global data traffic will exceed 8,500 exabytes (a 32% compound annual growth rate).
Data centres provide the infrastructure to support the transfer and hosting of data. They are often classified into four tiers, with Tier 4 providing the highest levels of redundancy, security and efficiency. For example, a Tier 4 data centre is required to have an uptime of 99.995%, equivalent to less than 27 minutes of downtime per year! Tier 4 sites have fully redundant systems, power supplies and biometric security. Zero downtime is the ideal, as the costs incurred via end-user penalties can be huge.
Why the need to measure temperature, humidity and differential pressure?
Data centres must be maintained at specific environmental conditions to ensure the performance and longevity of the hardware installed. As standard, temperature must be 18-27 °C, dew point 5-15 °C dp and humidity no higher than 60 %rh. This ensures the hardware is at a suitable temperature, condensation is avoided and the chance of static build-up (caused by low humidity) is reduced.
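The limits above can be expressed as a simple threshold check. The sketch below is purely illustrative (the function name and interface are hypothetical, not a Rotronic API), showing how a monitoring script might validate a single sensor reading against the quoted ranges:

```python
# Hypothetical helper: check one sensor reading against the control
# limits quoted above (18-27 °C, 5-15 °C dew point, max 60 %rh).

def within_limits(temp_c, dew_point_c, humidity_rh):
    """Return True if all three readings fall inside the quoted ranges."""
    return (18.0 <= temp_c <= 27.0
            and 5.0 <= dew_point_c <= 15.0
            and humidity_rh <= 60.0)

print(within_limits(22.5, 10.0, 45.0))  # typical in-range reading -> True
print(within_limits(29.0, 10.0, 45.0))  # over-temperature -> False
```

In practice a control system would act on trends and alarm bands rather than a single pass/fail check, but the thresholds themselves are as quoted.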
A 9 °C control range may seem relatively broad; however, 100% of the energy supplied to server hardware is converted to heat. In most data centres, if the cooling system fails and servers are not shut down, heat levels will rise above a critical 35 °C within minutes or even seconds. If unchecked, rising temperatures will damage hardware and can result in electrical fires.
Achieving the specified control range requires precision sensors and advanced control systems. Modern data centres are typically designed using computational fluid dynamics to ensure the very highest efficiency. Despite this, it is estimated that around 5% of US electrical energy use goes to data centre cooling.
Since 100% of the electricity utilised by servers is converted to heat, a 100% efficient cooling system would theoretically require an equal amount of energy. Efficiency is measured by comparing total facility energy use with IT equipment energy use; this ratio is called Power Usage Effectiveness (PUE). Theoretically PUE can be 1, but typically reported values are above 2. By utilising precision measurements and design, modern data centres achieve PUEs of ~1.1!
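The ratio described above is straightforward to compute. A minimal sketch (illustrative figures, not measurements from any specific site):

```python
# PUE = total facility energy / IT equipment energy.
# A PUE of 1.0 would mean every watt goes to IT, with zero
# cooling or power-distribution overhead.

def pue(total_facility_kwh, it_equipment_kwh):
    return total_facility_kwh / it_equipment_kwh

print(pue(2200.0, 1000.0))  # typical older facility -> 2.2
print(pue(1100.0, 1000.0))  # modern precision-controlled site -> 1.1
```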
An improvement of 0.5 in a data centre's PUE equates to an energy saving of ~£2.2 M and ~12,000 tonnes of CO2 over 5 years (for a site with a 1 MW load).
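The quoted figures can be roughly reconstructed under two assumptions not stated in the article: a tariff of ~£0.10 per kWh and a grid carbon intensity of ~0.55 kg CO2 per kWh.

```python
# Rough reconstruction of the quoted 5-year saving for a 1 MW IT load.
# Tariff and carbon intensity below are assumptions, not article figures.

IT_LOAD_KW = 1000.0           # 1 MW IT load
PUE_IMPROVEMENT = 0.5         # e.g. PUE falling from 1.6 to 1.1
HOURS_5_YEARS = 24 * 365 * 5  # continuous operation over 5 years

saved_kwh = IT_LOAD_KW * PUE_IMPROVEMENT * HOURS_5_YEARS
saved_pounds = saved_kwh * 0.10               # assumed £/kWh tariff
saved_tonnes_co2 = saved_kwh * 0.55 / 1000.0  # assumed kg CO2/kWh

print(f"{saved_kwh:,.0f} kWh, £{saved_pounds:,.0f}, {saved_tonnes_co2:,.0f} t CO2")
# -> 21,900,000 kWh, £2,190,000, 12,045 t CO2
```

Under these assumptions the arithmetic lands close to the article's ~£2.2 M and ~12,000 tonnes.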
What solutions can Rotronic offer?
Rotronic provides a range of instruments for environmental monitoring and control. Reliable and precise outside air sensors and weather shields enable natural cooling to be utilised where possible.
Inside the data centres, Rotronic interchangeable HC2-S probes can provide a combination of precise, fast response temperature and humidity measurements with ease of calibration. Our latest PF4 differential pressure transmitters provide precision low drift measurements.
With both digital and a range of analogue outputs available, as well as several probe mounting options, products can be selected for all applications.
Importantly, though, we aim to understand your needs and build a relationship, with the goal of providing an appropriate solution combining instruments, training, calibration and ongoing support.
Dr Jeremy Wingate
Rotronic UK