For a long time, people have tried to predict weather conditions using the hydrologic climate cycle.
In the early 1920s scientists were able to compile a six-hour forecast. Back then it took six weeks to calculate by hand the weather data collected at two points in Europe and to create a useful illustrative model.
Today, supercomputers are used to predict the weather for a period of several weeks. The complex modelling programs require several million data points for parameters such as temperature, humidity, pressure and vertical & horizontal wind velocity, each with time stamps and absolute coordinates. To create a correlation between the data and the environment, scientists “slice” the atmosphere virtually into smaller horizontal & vertical parts – this process is called discretization. This model makes it practical to compute how the parameters change over time.
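The idea of discretization can be illustrated with a toy model: the atmosphere is divided into grid cells, and the change of a parameter over time is computed cell by cell. The sketch below advects a temperature anomaly along a 1-D row of cells with a simple upwind finite-difference step; the grid spacing, wind speed and time step are illustrative assumptions, not values from any real forecast model.

```python
# Toy discretization: temperature on a 1-D grid of cells, advected by a
# constant wind using a simple upwind finite-difference step.
# Grid size, wind speed and time step are arbitrary illustrative values.

def advect_step(temps, wind_ms, dx_m, dt_s):
    """Advance the temperature field one time step (upwind scheme, wind > 0)."""
    c = wind_ms * dt_s / dx_m          # Courant number; must stay below 1
    assert 0 < c <= 1, "time step too large for stable advection"
    new = temps[:]
    for i in range(1, len(temps)):
        new[i] = temps[i] - c * (temps[i] - temps[i - 1])
    return new

# 10 grid cells, 20 km apart; warm anomaly in cell 2; 10 m/s wind; 10-min steps
field = [15.0] * 10
field[2] = 25.0
for _ in range(6):                     # one hour of simulated time
    field = advect_step(field, wind_ms=10.0, dx_m=20_000.0, dt_s=600.0)
```

Running the loop moves the warm anomaly downwind, cell by cell, which is exactly what a real model does in three dimensions with millions of cells and many coupled parameters.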
Meteorological events that are too “small” for the model grid, such as a single thunderhead, layer clouds or smaller turbulence, are parameterised through variables. This parameterisation is a science of its own that aims to reduce the resulting uncertainties as far as possible.
Every forecast calculation starts with the current weather conditions. The quality of this input is crucial for the accuracy of the final forecast. Meteorologists link yesterday’s forecast with the actual measured parameters. Only large data centres are capable of computing this data assimilation. The overall result is the best possible calculation basis for predicting the weather for the next day. If this groundwork is flawed the forecast may be incorrect, for example it could place rain at the wrong location.
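In its simplest scalar form, data assimilation blends the previous forecast (the “background”) with a new observation, weighted by how much each is trusted. The sketch below shows this inverse-variance weighting; the error variances are illustrative assumptions, not figures from any operational system.

```python
# Minimal sketch of data assimilation: blend yesterday's forecast (the
# "background") with a new observation, weighting each by its error
# variance. This is the scalar form of optimal interpolation; the
# variances below are purely illustrative.

def assimilate(background, obs, bg_var, obs_var):
    """Return the analysis value and its error variance."""
    gain = bg_var / (bg_var + obs_var)       # weight given to the observation
    analysis = background + gain * (obs - background)
    analysis_var = (1.0 - gain) * bg_var     # analysis beats either input alone
    return analysis, analysis_var

# Forecast said 18.0 °C (variance 4.0); the station measured 20.0 °C (variance 1.0)
value, var = assimilate(18.0, 20.0, bg_var=4.0, obs_var=1.0)
# gain = 0.8, so the analysis lands at 19.6 °C with variance 0.8
```

Note that the analysis variance is smaller than either input variance: combining forecast and measurement always yields a better starting point than either alone, which is why this step is worth the computing effort.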
Today’s meteorological mathematicians also take into account parameters that change extremely slowly compared to the other factors. The growth and reduction of polar ice, or the temperature of the oceans, are summarised as boundary values.
After a model is run using all the available data, meteorologists process and customise reports for a wide range of target groups such as public authorities, flight control centres, energy producers and many more, and issue specific warnings where necessary.
Facts & figures:
17.8 cm is the diameter of the largest hailstone ever recorded.
Sukkur City in Pakistan is one of the most humid places in the world with 30 °C dew point & a felt air temperature of 65 °C.
A study showed that a small thunderstorm system holds more than 10 million tons of water.
No two weather patterns are completely alike.
Some weather models assimilate data obtained from more than 25,000 weather stations.
Why the Need to Measure Humidity?
As described above, the daily weather forecast relies on the precise measurement of weather parameters. The science of numerical weather prediction aims to describe the daily hydrologic cycle in numbers – humidity plays an important role in this – and data errors will multiply during calculations.
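How quickly small data errors multiply can be demonstrated with the logistic map, a standard textbook example of a chaotic iteration. Two inputs differing by one part in a million end up far apart after a few dozen steps; the map itself is only illustrative, but real forecast models show the same sensitive dependence on initial data.

```python
# "Data errors will multiply during calculations": two nearly identical
# inputs to a chaotic iteration (the logistic map, r = 3.9) diverge until
# their difference is of the same order as the values themselves.

def diverge(x, y, steps, r=3.9):
    """Iterate both inputs and return the largest difference seen."""
    worst = abs(x - y)
    for _ in range(steps):
        x = r * x * (1.0 - x)
        y = r * y * (1.0 - y)
        worst = max(worst, abs(x - y))
    return worst

# A "measurement error" of one part in a million in the starting value
worst = diverge(0.500000, 0.500001, 50)
# After 50 steps the trajectories have diverged by a large fraction of
# their full range, despite the tiny initial difference.
```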
Humidity values influence weather calculations e.g. through the water vapor balance equation – this formula expresses the influence of humidity on rain & condensation, and vice versa.
Incorrect or incomplete humidity data directly leads to wrong predictions of a huge number of weather phenomena, including the condensation altitude of clouds, the locations of rainfall (hyetal) regions, fog layers and storms.
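The condensation altitude of clouds is a good example of a directly humidity-dependent prediction. A widely used rule of thumb (Espy's approximation) puts the lifted condensation level at roughly 125 m per degree of spread between air temperature and dew point; the figures below are illustrative.

```python
# Approximate cloud-base (lifted condensation level) estimate using the
# common ~125 m per degree rule of thumb. Illustrative values only.

def cloud_base_m(temp_c, dew_point_c):
    """Approximate cloud-base height above ground in metres."""
    spread = temp_c - dew_point_c
    if spread < 0:
        raise ValueError("dew point cannot exceed air temperature")
    return 125.0 * spread

# 24 °C air with a 16 °C dew point puts the cloud base near 1000 m
base = cloud_base_m(24.0, 16.0)
```

A 1 °C error in the measured dew point shifts the predicted cloud base by about 125 m, which is exactly why humidity measurement accuracy matters for forecasts.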
In 1999, incorrect data sent by a weather station in Nova Scotia, Canada led to an incorrect forecast for the winter storm Lothar two days before it hit Central Europe. Authorities were insufficiently prepared to alert people in time.
What is the Rotronic Solution?
Rotronic products are used in weather stations around the globe. They provide temperature & humidity data continuously with high accuracy even in demanding environments.
Rotronic manufactures a range of meteorological probes and weather shields to meet the standards required by meteorological organizations.
Rotronic is pleased to announce the introduction of its smallest ever temperature and humidity logger. The HL-1D measures only 90 x 60 x 23 mm, offers good accuracy, is durable and has high ingress protection against dust and water (IP67). HW4-Lite validated software for programming, data download and analysis is included. The logger is available now at a competitive price.
The Rotronic HL-1D logger is very suitable for monitoring and recording conditions for a wide range of applications across all industries, in commerce and for research organisations. The compact logger is particularly suitable for monitoring high value products of all types during transportation to ensure that quality is maintained.
Significant features of the HL-1D logger include:
• High measurement accuracy of ±3 %rh and ±0.3 °C
• Slim, compact and durable. High ingress protection rating of IP67
• Clear LC display with configurable visual alarms
• Large data recording capacity: 32,000 data points
• Ranges: -20…70 °C, 0…100 %rh. Logging interval from 30 s
• Min / Max / Average statistical function on logger display
• Package includes HW4-Lite validated PC software with data analysis
• Three year battery life (with five minute logging interval)
• FDA 21CFR Part 11 / GAMP5 conformity
• Temperature only logger available (product code TL-1D)
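The memory and interval figures above translate directly into recording duration. A quick sanity check on the published specification (simple arithmetic, no assumptions beyond the listed 32,000-point capacity):

```python
# How long can the HL-1D record before its 32,000-point memory fills,
# at a given logging interval? Plain arithmetic on the published figures.

def recording_days(points=32_000, interval_s=300):
    """Days of recording before the memory is full."""
    return points * interval_s / 86_400     # 86,400 seconds per day

# At the five-minute interval used for the battery-life figure:
days = recording_days(interval_s=300)       # roughly 111 days
# At the fastest 30 s logging interval:
fast_days = recording_days(interval_s=30)   # roughly 11 days
```

At the five-minute interval the memory fills in under four months, comfortably within the three-year battery life, so memory rather than battery is the practical limit on unattended deployments.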
HL-1D / TL-1D Logger Technical Datasheet – click here
Contact us now for logger pricing and additional information
Companies across many industries needing to perform regular monitoring and calibration have never faced a more challenging environment. Stricter compliance requirements mean companies are under greater pressure to deliver accurate and reliable data, whilst internal budget restrictions demand the most cost effective and efficient solutions.
Can modern measurement & calibration techniques help your business operations?
It is well known that accurate measurements reduce energy use and improve product consistency. Instrument users, calibration laboratories and manufacturers are constantly looking for smarter ways of operating and are responding with innovations that are transforming the measurement and calibration industry.
New ways of working
Industrial environments are now more automated and interconnected than ever before and companies need to ensure that their infrastructure and processes have the ability to respond and adapt to industry changes. With the introduction of newer, more complex instrumentation, organisations can often be slow to recognise the additional business benefits that can be achieved by replacing a traditional method (that offers a short-term result) with a more modern method (that delivers a longer-term, sustainable solution). Implementing a new approach can also help re-position the calibration process from being viewed simply as a cost to the business to one that helps deliver improved process and energy efficiencies with a return on investment.
Historically, in-situ calibration has been the standard approach; however, advances in technology mean that there is now a viable alternative that still meets the growing demand for on-site services. With the market moving away from analogue to digital signal processing, interchangeable digital sensors are proving to be a more practical solution for large and small organisations alike. As businesses look for greater automation and productivity, modern interchangeable digital sensors allow calibration to be achieved much more quickly without the costly implications of operational downtime and on-site maintenance.
Why calibrate? – The only way to confirm performance
In unsettled economic times it can be tempting to simply extend the intervals between calibration cycles or to forgo calibration altogether. However, neglecting system maintenance and calibration will result in reduced performance and a loss of measurement confidence, ultimately leading to a failure to meet compliance standards. Measurement drift over time negatively impacts on processes and quality. Regular, accredited calibration demonstrates compliance, but equally importantly, sends a message to customers that quality is taken seriously and that they can be confident in both the process and the final product.
Traditional In-Situ Sensor Calibration
Until recently most humidity calibrations were performed on-site in-situ. Larger organisations with multiple instruments generally found it more convenient to have their own in-house calibration instruments with dedicated technicians working on-site. Smaller organisations unwilling or unable to invest in on-site calibration equipment had the option to engage the services of a commercial calibration provider.
In most cases, trained instrument technicians are required for in-situ calibration work; the equipment is brought to the probes and generally only one probe can be calibrated at a time. One of the main disadvantages of this process is the impact that it has on production downtime, as typically a salt or chamber based calibration can take more than three hours. Moreover, as the processes or control systems are interrupted during calibration, the actual conditions can be unknown.
Modern Ex-Situ Sensor Calibration
Companies keen to avoid the impacts of in-situ calibration and/or operational downtime caused by the replacement of failed hard wired instruments are opting instead for the flexibility and convenience of interchangeable sensors and modern portable calibration generators. Instead of bringing in equipment to calibrate in-situ, the technician brings pre-calibrated probes directly from the laboratory (on-site or external). Using interchangeable digital sensors, the pre-calibrated probes can be exchanged with the in-situ probes in seconds (known as hot swaps), saving time and avoiding operational disruption. If a wider system loop calibration is required, digital simulators are applied and provide any fixed values exactly and instantly. The old probes are then taken back to a calibration laboratory and calibrated accordingly. This adds the benefit that an external accredited laboratory can be used without issue.
Improved accuracy and traceability?
By ensuring that all calibrations are performed within dedicated laboratories as opposed to ad-hoc locations, better procedures and instrumentation can be utilised. In addition, time pressures are usually reduced as processes and monitoring systems are unaffected during calibration. As such, calibrations are typically performed to a higher standard, leading to lower associated measurement uncertainty (every calibration has an uncertainty associated with it – whether it is defined or not). Overall, in most circumstances these methods deliver greater reliability and improved traceability and, importantly, reduce on-site workload and limit operational downtime.
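The uncertainty mentioned above is conventionally built up from independent contributions combined in quadrature (root sum of squares), as described in the GUM. The contribution values in this sketch are illustrative assumptions, not Rotronic laboratory figures:

```python
# Combining independent calibration uncertainty contributions in
# quadrature (root sum of squares). Contribution values are illustrative.
import math

def combined_uncertainty(contributions):
    """Combined standard uncertainty from independent contributions."""
    return math.sqrt(sum(u * u for u in contributions))

# e.g. reference instrument, chamber uniformity and probe repeatability (%rh)
u = combined_uncertainty([0.5, 0.3, 0.2])
expanded = 2.0 * u    # k = 2 coverage factor, roughly 95 % confidence
```

Because the contributions add in quadrature, the largest one dominates: halving the biggest term does far more for the final uncertainty than eliminating a small one, which is one reason laboratory conditions beat ad-hoc on-site setups.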
CASE STUDY – Meeting the demands at the National Physical Laboratory, London.
When the National Physical Laboratory (NPL) in London needed to replace their entire building management system (BMS), they turned to Rotronic Instruments (UK) for an integrated solution to the sensors and calibration. The NPL was looking for both a complete range of temperature and humidity sensors and instrumentation, and the fulfilment of the calibration and commissioning needs of these instruments. Working closely with the project stakeholders, the Rotronic Instruments (UK) team developed a tailored solution, matching the instruments and service to the project requirements.
The decision by the NPL to replace the BMS was brought about by the need for tighter control, greater reliability and easier calibration. One of the key elements in achieving these objectives was the use of interchangeable probes. This immediately limited time-consuming and disruptive on-site sensor calibration to a minimum. Every probe’s digital output was calibrated in Rotronic Instruments’ (UK) UKAS accredited laboratory, and each transmitter’s analogue output was calibrated using a simulated digital input. To resolve any in-situ measurement errors between the calibrated sensors and the uncalibrated BMS, each installed high accuracy instrument was loop calibrated and adjusted. Typical installation errors corrected to date on the brand new BMS are ±0.5 %rh and ±0.25 °C; a significant result for labs requiring tolerances of better than 1 %rh and 0.1 °C.
Whilst the use of high performance instruments was essential, not every sensor location or application could justify this approach. However, mindful of the NPL’s long term objectives, even the lowest specification thermistor products were customised to provide long-term performance and low drift. Additionally, a robust commissioning procedure and training for key personnel was developed to enable ongoing commitment to delivering quality measurements. Finally, it was effective communication and regular on-site interaction with all the stakeholders that helped deliver a successful outcome to this substantial project.
All companies that need to perform regular monitoring and instrument calibration should be constantly reviewing their processes and questioning whether their operations and procedures are delivering the maximum return for their business. As increased regulatory compliance and demands for improved energy efficiencies continue to grow, traditional processes may no longer offer the optimum solution. An organisational mindset change may be needed to move calibration from being seen as a fixed cost to a process that can help deliver business objectives through ongoing cost and energy efficiencies.
With the advent of calibration methods that can significantly reduce in-situ disruption, downtime is minimised, labour costs are reduced and productivity improved. Using interchangeable digital systems increases the accuracy and traceability of calibrations, resulting in higher quality product.
Choosing the right calibration methodology may require new thinking and a different approach, but those companies that get it right will end up with a modern, flexible system that both achieves compliance and delivers long term cost and energy efficiencies to their business.
For more information on the NPL case study or how your business can develop innovative and efficient monitoring solutions please contact us.
There has been a rapid increase in large stand-alone data centres housing computer systems, hosting cloud computing servers and supporting telecommunications equipment; they are crucial to company IT operations around the world. Data centres must be extremely reliable and secure; many are wholly remote facilities.
Air conditioning is essential to maintain temperature and humidity levels within tight defined tolerances, thus ensuring the longest possible service life of the installed hardware.
Precise temperature and humidity measurement with fast-reacting sensors is an absolute requirement. This increases energy efficiency whilst reducing energy costs. Additionally, data centre managers need to be alerted to even a small change in temperature and humidity levels, so a separate monitoring system with networked alarms, using fast-reacting temperature and humidity sensors, is typically installed.
Rotronic ‘standard’ HC2-S interchangeable temperature and humidity probes are regularly specified for monitoring & controlling conditions in data centres due to their high precision and fast response with long-term stability. Used with a HygroFlex5 measurement transmitter, analogue (scalable) or digital outputs are available exactly as required for interfacing with control systems. The loop can be validated electrically in minutes, saving a significant amount of time, and probes can be exchanged rapidly when service work or periodic calibration checks are required.
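A scalable analogue output simply maps the measured value linearly onto a current or voltage range, and validating the loop electrically means feeding a known signal and checking the reading. The sketch below assumes a typical 4…20 mA loop carrying 0…100 %rh; the ranges are common conventions, not a specification of the HygroFlex5.

```python
# Linear mapping for a scalable analogue output: converting a 4...20 mA
# loop current back to relative humidity. Ranges are typical assumptions.

def current_to_rh(current_ma, lo_ma=4.0, hi_ma=20.0, lo_rh=0.0, hi_rh=100.0):
    """Convert a loop current to relative humidity (%rh)."""
    if not lo_ma <= current_ma <= hi_ma:
        raise ValueError("current outside loop range - check wiring")
    return lo_rh + (current_ma - lo_ma) / (hi_ma - lo_ma) * (hi_rh - lo_rh)

# Validating the loop electrically: inject a known current, check the reading
reading = current_to_rh(12.0)   # mid-scale current should read 50.0 %rh
```

Injecting a handful of fixed currents across the range and comparing readings against this mapping is what makes an electrical loop check a matter of minutes rather than hours.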