No matter what type of analytics you do, or whether you work with structured or unstructured data, one thing is certain: the data analytics space is moving quickly, very quickly. If your efforts to keep current make you feel like you're shooting at a moving target, you are not alone. We've all heard of big data, and many of us are working in big data analytics. But while many analytics professionals focus on web data, social data, and enterprise business data, another type of data deserves its fair share of attention and respect: sensor data.
What are sensors?
Sensors cheaply and automatically collect huge volumes of observational data on any type of natural or man-made system, from automated flood water level readings and soil temperature measurements to the near-infrared image sensors currently mapping the moon.
Sensors are not new; what is new is the very low price at which sensor technologies can now be deployed. Sensors are really just cheap, automated, accurate data-makers, and the data they generate lets us analyze and understand the operating conditions of almost any physical system. More importantly, analytics can be built around sensor data to protect and improve the operation and design of the critical systems that our built environments depend upon. All of this boils down to dollars saved, jobs created, and even lives spared (in systems that rely on sensor data to initiate life-saving tasks).
Alquist's new Celsius Range product is a great example of the business value of sensor data analytics.
In a novel application of fiber optic sensor technology, Celsius Range uses fiber optic sensor cables to monitor temperatures and automate the cooling of cabinets and servers in large data centers. The cables run along the racks of the data center and take real-time temperature measurements at each location in the center. Proprietary algorithms then turn these readings into real-time geospatial heat maps.
These automated heat maps can then be integrated with the building management system, so that each cabinet and server automatically receives the individual cooling it needs, when it needs it. Traditionally, servers were kept cool by chilling the entire data center to a temperature at which no server would operate too close to the threshold of overheating. By comparison, Alquist's segmented, individualized approach to temperature detection and cooling is likely to deliver huge savings in its clients' data center cooling costs in the very near future.
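To make the idea of individualized cooling concrete, here is a minimal sketch of how per-location readings might drive per-rack cooling decisions. Everything here (function names, rack IDs, the threshold value) is an illustrative assumption, not Alquist's proprietary algorithm:

```python
# Illustrative sketch only: decide which racks need extra cooling based on
# their latest sensor readings, instead of chilling the whole data center.
# The 27 °C threshold and rack names are assumptions for illustration.

def racks_needing_cooling(readings, threshold_c=27.0):
    """Return the sorted IDs of racks whose latest temperature exceeds the threshold.

    readings: dict mapping rack ID -> temperature in degrees Celsius.
    """
    return sorted(rack for rack, temp in readings.items() if temp > threshold_c)

readings = {"rack-01": 24.5, "rack-02": 29.1, "rack-03": 26.8, "rack-04": 31.0}
print(racks_needing_cooling(readings))  # only these racks receive targeted cooling
```

The point of the sketch is the design choice: cooling is directed only at the locations that need it, rather than at the room as a whole.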
This example matters to data analytics professionals because it demonstrates how data analytics can create real value in an organization by eliminating unnecessary spending.
Not only does this solution save money; it also protects the environment by reducing wasted energy. To build it, data analysts had to work with copious amounts of sensor data to develop algorithms that map temperatures across the center in real time. They then built trigger and alert functionality into those algorithms so that building managers could be notified of problems as they arise. There is, and will continue to be, a definite need for data analysts and analytics professionals who are experienced in making sense of the huge volumes of sensor data being generated today.
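The trigger-and-alert functionality described above might look something like the following sketch: an alert fires only when a temperature stays above its threshold for several consecutive readings, so a single transient spike does not page the building manager. All names and parameters here are assumptions for illustration, not the actual Celsius Range logic:

```python
# Illustrative sketch: raise an alert only when a sensor exceeds its threshold
# for several consecutive readings, filtering out transient spikes.
# Threshold and streak length are assumed values for the example.

def sustained_alerts(samples, threshold_c=30.0, min_consecutive=3):
    """Yield (index, temperature) for the reading that confirms a sustained breach."""
    streak = 0
    for i, temp in enumerate(samples):
        streak = streak + 1 if temp > threshold_c else 0
        if streak == min_consecutive:
            yield i, temp

samples = [28.0, 31.0, 29.5, 30.5, 31.2, 32.0, 29.0]
for idx, temp in sustained_alerts(samples):
    print(f"alert at sample {idx}: {temp} °C")  # fires once the breach is sustained
```

With these sample readings, the lone spike at index 1 is ignored, and an alert fires only at index 5, after three consecutive readings above 30 °C.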
What about you? Do you have experience working with sensor data? If you have built algorithms for sensor data analysis, what resources did you use when forming your design?