Disaster management using edge-of-field computing and low-cost sensors

One thing climate change has made abundantly clear is that the magnitude and frequency of natural disasters have increased dramatically in the last five years. The US Office of Management and Budget conducted a preliminary analysis this year and concluded that six major types of natural disasters (coastal disasters, flooding, crop failures, climate-related health disasters, wildfires, and building failures) “were likely to result in annual expenditures of approximately $134 billion and could result in as much as $2 trillion in lost revenue by the end of the century”.


As a result, there has been considerable interest in technologies that can aid responses in the aftermath of disasters, as well as those that can detect disasters early and so mitigate the damage they cause.


Recently, scientists from Oak Ridge National Laboratory helped develop a prototype system to detect utility poles damaged by hurricanes and other disasters. The system pairs hardware, in the form of drones and sensors, with machine learning algorithms, and can be deployed quickly and efficiently in locations where local infrastructure has been damaged or destroyed. The interesting aspect of this system is that it uses an efficient “edge-of-field” computing approach that allows it to function independently, a real boon in conditions where internet access is unreliable.


Edge-of-field computing, or on-device computing, typically means that machine learning algorithms are deployed directly on sensors and/or devices and do not connect to the internet. The advantage of these systems is that they are small, easily deployed, and more robust in conditions with limited resources. They are often more secure as well. The disadvantage is that, without the computational power of typical cloud resources, algorithms must be designed to function within tight limits on compute, memory, and data.
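To see why those limits matter, here is a minimal sketch (the parameter count is purely illustrative, not from the Oak Ridge system) of the memory savings from storing a small model's weights as 8-bit integers instead of 32-bit floats, the kind of reduction that makes inference feasible on a low-cost sensor:

```python
import numpy as np

# Hypothetical parameter count for a small image-classification model;
# a number chosen for illustration, not taken from the article.
n_params = 250_000

weights_f32 = np.zeros(n_params, dtype=np.float32)  # typical cloud format
weights_i8 = np.zeros(n_params, dtype=np.int8)      # quantized edge format

print(weights_f32.nbytes // 1024, "KiB as float32")
print(weights_i8.nbytes // 1024, "KiB as int8")
```

The 4x reduction in footprint (and the shift to integer arithmetic) is a common reason quantized models are favored on constrained devices.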


To get past that limitation, the team at Oak Ridge started by training the algorithms in the cloud using as much data as possible. The trained algorithm was then deployed on the sensors, minimizing on-device computational effort while maintaining accuracy. Improvements in sensor hardware as well as new algorithms helped this effort. As one of the scientists said: “The image analytics capabilities of the smaller, more affordable sensors that we’re making allow for a lot of things that were previously impossible because of price and resolution limitations”.
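The train-in-the-cloud, deploy-on-the-edge pattern can be sketched in a few lines. This is a toy illustration, not the Oak Ridge pipeline: the "cloud" stage trains a logistic regression on synthetic two-class data (standing in for damaged vs. intact poles), and the "edge" stage quantizes the learned weights to int8 and checks that accuracy survives:

```python
import numpy as np

# Toy stand-in data: two Gaussian clusters of 4-dim "image features".
# Entirely synthetic; real systems would train on labeled imagery.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 1.0, (100, 4)), rng.normal(1.0, 1.0, (100, 4))])
y = np.array([0] * 100 + [1] * 100)

# "Cloud" stage: train a logistic regression with plain gradient descent,
# where compute is cheap and data is plentiful.
w, b = np.zeros(4), 0.0
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(X @ w + b)))
    w -= 0.1 * X.T @ (p - y) / len(y)
    b -= 0.1 * np.mean(p - y)

# "Edge" stage: quantize the trained weights to int8 so the model fits
# the sensor's memory budget.
scale = np.abs(w).max() / 127.0
w_q = np.round(w / scale).astype(np.int8)

def edge_predict(x):
    # Dequantize on the fly for clarity; a real deployment would keep
    # the arithmetic in integers end to end.
    return (x @ (w_q.astype(np.float32) * scale) + b) > 0

acc = np.mean(edge_predict(X) == y)
print(f"on-device accuracy: {acc:.2f}")
```

The key design choice is that all the expensive work (training on large datasets) happens once, centrally; the artifact shipped to the sensor is small and cheap to evaluate, which is what lets the system keep working without an internet connection.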


These low-cost systems are useful not just for disaster response; they could also be deployed in communities and settings that have limited resources.

What our community is reading

Moonshots, Models, IoT and Machine Learning in Agriculture

Our online community space is now live!

How much water should an email consume? Data centers and water use