Thorne and Derrick have written a great blog post on the importance of quality measurement and control in cheese manufacturing (a key industry for Rotronic sensors due to its reliance on the high humidity conditions of cheese maturing!)
Most of us have a never-ending choice of the most delicious breads, cakes and pastries to please both the palate and the eyes. We have become so used to this diverse range of bread and baked products, but do you know how bread originally came into existence?
The interesting history of what is now called the “staff of life”, bread, and the making of it, started in comparatively recent times.
At the very beginning of recorded history there was the discovery of fire making and thus along with light, heat could be generated. Then it was found that different grasses and their seeds could be prepared for nourishment.
Later, with the combination of grain, water and heat, it was possible to prepare a kind of dense broth. Hot stones were covered with this broth or the broth was roasted on embers and “hey presto” the first unsoured flat bread was created. This ability to prepare stable food radically changed the eating habits and lifestyles of our early ancestors. They progressed from being hunters to settlers.
Facts & figures:
Records show that as early as 2600-2100 B.C. bread was baked by Egyptians, who it is believed had learned the skill from the Babylonians.
On average, every American consumes around 53 lb (24 kg) of bread per year.
The “pocket” in pita bread is made by steam. The steam puffs up the dough and, as the bread cools and flattens, a pocket is left in the middle.
US farmers receive just 5 cents (or less) for each loaf of bread sold.
Why the need to measure humidity?
The production of baked goods such as bread, cakes, biscuits and pastries requires a number of processing steps in which humidity and temperature play an important role.
After mixing, it is typical to divide the dough into pieces and allow it to rest for a few minutes so that the gluten network in the dough can relax, allowing easier moulding, which is the next step.
If the dough is too warm at that stage it will be too sticky and cannot easily be processed further; if it is too cold it can be damaged during moulding, which leads to holes forming in the bread. If the humidity level prior to moulding was too low, a skin of dry dough can form on the surface. This makes it harder for the dough to increase its volume during the next process step, called proving.
Proving is the professional term for the final dough-rise step before baking, during which 90% of the bread's volume is achieved. To achieve consistently good dough-rising results, special proving chambers are used; these can maintain the ideal environment for the yeast to grow. Depending on the yeast and flour used, temperatures of 38…42 °C and humidity levels of 70…80 %rh are considered ideal.
In summary, the use of quality ingredients and careful handling throughout the various stages of production will not result in a quality product unless the dough temperature, and the temperature and humidity of the bakery are carefully regulated. Modern day bakeries use custom ventilation systems that are controlled by precision humidity and temperature sensors.
So once again the behavior of the humble water molecule is to blame! In this case for the stricken faces of The Great British Bake Off contestants as they stress about the quality of their crust and whether the dough will be cooked through to perfection!
Rotronic is pleased to announce the introduction of its smallest ever temperature and humidity logger. The HL-1D measures only 90 x 60 x 23 mm, offers good accuracy, a durable housing and high ingress protection against dust and water (IP67). HW4-Lite validated software for programming, data download and analysis is included. The logger is available now at a competitive price.
The Rotronic HL-1D logger is very suitable for monitoring and recording conditions for a wide range of applications across all industries, in commerce and for research organisations. The compact logger is particularly suitable for monitoring high value products of all types during transportation to ensure that quality is maintained.
The significant features of the inexpensive HL-1D logger include:
• High measurement accuracy of ±3 %rh and ±0.3 °C
• Slim, compact and durable. High ingress protection rating of IP67
• Clear LC display with configurable visual alarms
• Large data recording capacity: 32,000 data points
• Ranges: -20…70 °C, 0…100 %rh. Logging interval from 30 s
• Min / Max / Average statistical function on logger display
• Package includes HW4-Lite validated PC software with data analysis
• Three year battery life (with five minute logging interval)
• FDA 21CFR Part 11 / GAMP5 conformity
• Temperature only logger available (product code TL-1D)
HL-1D / TL-1D Logger Technical Datasheet – click here
Contact us now for logger pricing and additional information
Companies across many industries needing to perform regular monitoring and calibration have never faced a more challenging environment. Stricter compliance requirements mean companies are under greater pressure to deliver accurate and reliable data, whilst internal budget restrictions demand the most cost effective and efficient solutions.
Can modern measurement & calibration techniques help your business operations?
It is well known that accurate measurements reduce energy use and improve product consistency. Instrument users, calibration laboratories and manufacturers are constantly looking for smarter ways of operating and are responding with innovations that are transforming the measurement and calibration industry.
New ways of working
Industrial environments are now more automated and interconnected than ever before, and companies need to ensure that their infrastructure and processes can respond and adapt to industry changes. With the introduction of newer, more complex instrumentation, organisations can be slow to recognise the additional business benefits of replacing a traditional method (that offers a short-term result) with a more modern method (that delivers a longer-term, sustainable solution). Implementing a new approach can also help re-position the calibration process from being viewed simply as a cost to the business to one that helps deliver improved process and energy efficiencies with a return on investment.
Historically, in-situ calibration has been the standard approach; however, advances in technology mean that there is now a viable alternative that still meets the growing demand for on-site services. With the market moving away from analogue to digital signal processing, interchangeable digital sensors are proving to be a more practical solution for large and small organisations alike. As businesses look for greater automation and productivity, modern interchangeable digital sensors allow calibration to be achieved much more quickly without the costly implications of operational downtime and on-site maintenance.
Why calibrate? – The only way to confirm performance
In unsettled economic times it can be tempting to simply extend the intervals between calibration cycles or to forgo calibration altogether. However, neglecting system maintenance and calibration will result in reduced performance and a loss of measurement confidence, ultimately leading to a failure to meet compliance standards. Measurement drift over time negatively impacts on processes and quality. Regular, accredited calibration demonstrates compliance, but equally importantly, sends a message to customers that quality is taken seriously and that they can be confident in both the process and the final product.
Traditional In-Situ Sensor Calibration
Until recently most humidity calibrations were performed on-site in-situ. Larger organisations with multiple instruments generally found it more convenient to have their own in-house calibration instruments with dedicated technicians working on-site. Smaller organisations unwilling or unable to invest in on-site calibration equipment had the option to engage the services of a commercial calibration provider.
In most cases, trained instrument technicians are required for in-situ calibration work; the equipment is brought to the probes and generally only one probe can be calibrated at a time. One of the main disadvantages of this process is the impact that it has on production downtime, as typically a salt or chamber based calibration can take more than three hours. Moreover, as the processes or control systems are interrupted during calibration, the actual conditions can be unknown.
Modern Ex-Situ Sensor Calibration
Companies keen to avoid the impacts of in-situ calibration and/or operational downtime caused by the replacement of failed hard wired instruments are opting instead for the flexibility and convenience of interchangeable sensors and modern portable calibration generators. Instead of bringing in equipment to calibrate in-situ, the technician brings pre-calibrated probes directly from the laboratory (on-site or external). Using interchangeable digital sensors, the pre-calibrated probes can be exchanged with the in-situ probes in seconds (known as hot swaps), saving time and avoiding operational disruption. If a wider system loop calibration is required, digital simulators are applied and provide any fixed values exactly and instantly. The old probes are then taken back to a calibration laboratory and calibrated accordingly. This adds the benefit that an external accredited laboratory can be used without issue.
Improved accuracy and traceability?
By ensuring that all calibrations are performed within dedicated laboratories as opposed to ad-hoc locations, better procedures and instrumentation can be utilised. In addition, time pressures are usually reduced, as processes and monitoring systems are unaffected during calibration. As a result, calibrations are typically performed to a higher standard, leading to lower associated measurement uncertainty (every calibration has an uncertainty associated with it, whether it is defined or not). Overall, in most circumstances these methods deliver greater reliability and improved traceability and, importantly, reduce on-site workload and limit operational downtime.
CASE STUDY – Meeting the demands at the National Physical Laboratory, London.
When the National Physical Laboratory (NPL) in London needed to replace their entire building management system (BMS), they turned to Rotronic Instruments (UK) for an integrated solution covering both the sensors and their calibration. The NPL was looking for a complete range of temperature and humidity sensors and instrumentation, together with the fulfilment of the calibration and commissioning needs of these instruments. Working closely with the project stakeholders, the Rotronic Instruments (UK) team developed a tailored solution, matching the instruments and service to the project requirements.
The decision by the NPL to replace the BMS was brought about by the need for tighter control, greater reliability and easier calibration. One of the key elements in achieving these objectives was the use of interchangeable probes. This immediately limited time-consuming and disruptive on-site sensor calibration to a minimum. Every probe’s digital output was calibrated in Rotronic Instruments’ (UK) UKAS accredited laboratory, and each transmitter’s analogue output was calibrated using a simulated digital input. To resolve any measurement errors in-situ between the calibrated sensors and uncalibrated BMS, each installed high accuracy instrument was loop calibrated and adjusted. Typical installation errors corrected to date on the brand new BMS are ±0.5 %rh and ±0.25°C; a significant result for labs requiring tolerances of better than 1 %rh and 0.1°C.
Whilst the use of high performance instruments was essential, not every sensor location or application could justify this approach. However, mindful of the NPL’s long term objectives, even the lowest specification thermistor products were customised to provide long-term performance and low drift. Additionally, a robust commissioning procedure and training for key personnel was developed to enable ongoing commitment to delivering quality measurements. Finally, it was effective communication and regular on-site interaction with all the stakeholders that helped deliver a successful outcome to this substantial project.
All companies that need to perform regular monitoring and instrument calibration should be constantly reviewing their processes and questioning whether their operations and procedures are delivering the maximum return for their business. As increased regulatory compliance and demands for improved energy efficiencies continue to grow, traditional processes may no longer offer the optimum solution. An organisational mindset change may be needed to move calibration from being seen as a fixed cost to a process that can help deliver business objectives through ongoing cost and energy efficiencies.
With the advent of calibration methods that can significantly reduce in-situ disruption, downtime is minimised, labour costs are reduced and productivity improved. Using interchangeable digital systems increases the accuracy and traceability of calibrations, resulting in higher quality products.
Choosing the right calibration methodology may require new thinking and a different approach, but those companies that get it right will end up with a modern, flexible system that both achieves compliance and delivers long term cost and energy efficiencies to their business.
For more information on the NPL case study or how your business can develop innovative and efficient monitoring solutions please contact us.
The future is very encouraging for wind power. The technology is growing rapidly due to the current power crisis and the ongoing discussions about nuclear power plants. Wind turbines are becoming more efficient and are able to produce more electricity under the same wind conditions.
Converting wind power into electrical power:
A wind turbine converts the kinetic energy of wind into rotational mechanical energy. This energy is directly converted, by a generator, into electrical energy. Large wind turbines, as shown in the picture, typically have a generator installed on top of the tower. Commonly, there is also a gearbox to adapt the speed. Various sensors for wind speed, humidity and temperature measurement are placed inside and outside to monitor the climate. A controller unit analyses the data and adjusts the yaw and pitch drives to the correct positions. See the schematic below.
The formula for wind power:
W = C x d x A x V3 where:
d: defines the density of the air, typically 1.225 kg/m3. This value can vary depending on air pressure, temperature and humidity.
A: defines the swept area of the turbine blades, which is proportional to the blade diameter squared. This squared relationship makes rotor size highly effective: the larger a wind turbine is, the more energy can be harnessed.
V3: defines the velocity of the wind, cubed. Wind speed is the most effective variable because of this cubed relationship. In reality, the wind never blows at a constant speed, and a wind turbine is only efficient within a certain range of wind speeds, usually 10 mph (16 km/h) or greater. At very high wind speeds the turbine can be damaged, so output is held constant above the rated wind speed.
C: defines the constant, which is normally 0.5 for metric values. This is actually a combination of two or more constants depending on the specific variables and the system of units that is used.
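As an illustrative sketch (the function name and the 100 m rotor / 8 m/s example are assumptions, not Rotronic figures), the formula above can be evaluated in a few lines of Python:

```python
import math

def wind_power(diameter_m, wind_speed_ms, air_density=1.225):
    """Available wind power W = 0.5 * d * A * V^3 for a rotor of the
    given diameter, using the swept area A = pi * D^2 / 4."""
    swept_area = math.pi * diameter_m ** 2 / 4.0
    return 0.5 * air_density * swept_area * wind_speed_ms ** 3

# Example: a 100 m rotor in an 8 m/s wind
p = wind_power(100, 8)
print(f"{p / 1e6:.2f} MW available")  # prints "2.46 MW available"
```

Because of the cubed wind-speed term, doubling the wind speed multiplies the available power by eight, which is why small siting differences matter so much during wind mapping.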
Why measure the local climate?
Forecasting the power of the wind over a few hours or days is not an easy task.
Wind farms can extend over miles of land or offshore areas where the climate and the wind speed can vary substantially, especially in hilly areas. Positioning towers only slightly to the left or right can make a significant difference, because the wind velocity can be increased by the topography. Therefore, wind mapping has to be performed to determine whether a location is suitable for the wind farm. Such wind maps are usually produced with Doppler radars equipped with stationary temperature and humidity sensors; these sensors improve the overall accuracy.
Once wind mapping has been carried out over different seasons, wind turbine positions can be determined. Each turbine will be equipped with sensors for wind direction and speed, temperature and humidity. Using all these parameters, the turbine characteristics plus the weather forecast, a power prediction can be made using complex mathematics.
The final power value is calculated in watts and supplied into power grids (see schematic on the right). This green energy can then power many homes and factories.
Why measure inside a wind turbine?
Wind farms are normally installed in areas with harsh environments where strong winds are common. Salty air, high humidity and condensation are daily issues for wind turbines.
Normal ventilation is not sufficient to ensure continuous operation. The inside climate has to be monitored and dehumidified by desiccant to protect the electrical components against short circuits and the machinery against corrosion. These measurements are required to ensure continuous operation and reduce maintenance costs.
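To see why condensation is a daily issue, the dew point inside a turbine can be estimated from the measured temperature and relative humidity. A minimal sketch using the common Magnus approximation (the 15 °C / 90 %rh figures are illustrative assumptions, not measured values):

```python
import math

def dew_point(temp_c, rh_percent):
    """Approximate dew point in degC via the Magnus formula
    (coefficients a=17.62, b=243.12 degC, valid roughly -45..60 degC)."""
    a, b = 17.62, 243.12
    gamma = math.log(rh_percent / 100.0) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# At 15 degC and 90 %rh inside the tower, the dew point is about 13.4 degC:
# any surface cooler than that (e.g. the steel wall at night) will condense.
print(round(dew_point(15, 90), 1))  # prints 13.4
```

Keeping the internal air dry enough that no surface falls below the dew point is exactly what the desiccant dehumidification described above achieves.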
What solutions can Rotronic offer?
Rotronic offers sensors with exceptional accuracy and a wide range of products for meteorological applications and for monitoring internal conditions.
Low sensor drift and long-term stability are ideal for wind energy applications, where reduced maintenance lowers operational costs.
The wide range of networking possibilities including RS-485, USB, LAN and probe extension cables up to 100 m allows measurements in remote or hard to reach places. Validated Rotronic HW4 software makes it easy to analyse the data or it can be exported into MS Excel for reporting and further processing.
The ability to calibrate accurately using humidity standards and portable generators on site ensures continued sensor performance!
Over the years there has been a rapid increase in large stand-alone data centres housing computer systems, hosting cloud computing servers and supporting telecommunications equipment. These facilities are crucial to the IT operations of companies around the world.
It is paramount for manufacturers of information technology equipment (ITE) to increase computing capability and improve computing efficiency. With an influx of data centers required to house large numbers of servers, they have become significant power consumers. All the stakeholders, including ITE manufacturers, physical infrastructure manufacturers, data center designers and operators, have been focusing on reducing power consumption from the non-computing part of the overall power load: one major cost is the cooling infrastructure that supports the ITE.
Too much or too little humidity can make one uncomfortable, and computer hardware does not like extreme conditions any more than we do. With too much humidity, condensation can occur; with too little, static electricity can build up. Both can have a significant impact and can cause damage to computers and equipment in data centers.
It is therefore essential to maintain and control ideal environmental conditions, with precise humidity and temperature measurement, thus increasing energy efficiency whilst reducing energy costs in data centers. The ASHRAE Thermal Guidelines for Data Processing Environments have helped create a framework for the industry to follow and a better understanding of the implications for ITE cooling components.
Rotronic’s high precision, fast responding and long-term stability temperature and humidity sensors are regularly specified for monitoring and controlling conditions in data centres.
Why measure temperature and humidity?
Maintaining temperature and humidity levels in the data center can reduce unplanned downtime caused by environmental conditions and can save companies thousands or even millions of dollars per year. A recent whitepaper from The Green Grid (“Updated Air-Side Free Cooling Maps: The Impact of ASHRAE 2011 Allowable Ranges”) discusses the new ASHRAE recommended and allowable ranges in the context of free cooling.
The humidity varies to some extent with temperature; however, in a data center, the humidity ratio should never fall below 0.006 kg of water vapour per kg of dry air, nor should it ever exceed 0.011 kg/kg.
Maintaining the temperature between 20 and 24 °C is optimal for system reliability. This temperature range provides a safe buffer for equipment to operate in during an air conditioning or HVAC equipment failure, while making it easier to maintain a safe relative humidity level. In general, ITE should not be operated in a data center where the ambient room temperature exceeds 30°C. Maintaining ambient relative humidity between 45% and 55% is recommended.
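These bands can be cross-checked with a short calculation of the humidity ratio from temperature and relative humidity. A sketch using the Magnus approximation for saturation vapour pressure (the function name, coefficients and example conditions are assumptions for illustration, not taken from the guidelines):

```python
import math

def humidity_ratio(temp_c, rh_percent, pressure_pa=101325.0):
    """Humidity ratio (kg water vapour per kg dry air) from temperature
    and relative humidity; saturation pressure via the Magnus formula."""
    p_ws = 611.2 * math.exp(17.62 * temp_c / (243.12 + temp_c))  # Pa
    p_w = rh_percent / 100.0 * p_ws          # partial vapour pressure
    return 0.622 * p_w / (pressure_pa - p_w)

w = humidity_ratio(22, 50)   # mid-range recommended conditions
print(f"{w:.4f} kg/kg")      # prints "0.0082 kg/kg"
```

At 22 °C and 50 %rh the result sits comfortably inside the 0.006–0.011 kg/kg band, confirming that the recommended temperature and relative humidity ranges are mutually consistent.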
Additionally, data centre managers need to be alerted to changes in temperature and humidity levels.
Rotronic temperature and humidity probes with suitable transmitters or loggers are most suitable for monitoring & controlling conditions in data centres due to their high precision and fast response with long-term stability.
With Rotronic HW4 software a separate monitoring system can be implemented. This enables data center managers to view measured values and automatically save the measured data. Alarms via email and SMS, together with report printouts, help guarantee data integrity at all times.
Dr Jeremy Wingate Rotronic UK
News, views and interesting things from the world of metrology