Will AI Transform Water, Agriculture, Energy And Other Clean Technology Sectors?

Will AI transform water, energy, agriculture, climate and all the other clean tech sectors? Can it?

Some version of these questions gets asked at almost every meeting or conference in clean technology. Part of that is the sheer hype around AI and the whole “software is eating the world” narrative from a few years ago. But part of it is also that these tools are so powerful that professionals working in these sectors can see the potential - they just aren’t sure if it’s applicable to their sector yet.


So, let’s start by asking a couple of fundamental questions. Why do we need AI at all? Or any models, for that matter?


Models are used to understand the world - to estimate the impacts of changes in a system and to predict what will happen in the future. The approaches used to build models typically fall into three broad categories: physical or mechanistic approaches, statistical approaches and, more recently, machine learning.


Physical or mechanistic models are built on an understanding of the system - the laws governing how the system works are represented by equations, and those equations can be solved to see how different conditions affect the results. Let’s say, for example, that we’re interested in understanding how excess rainfall impacts the flow of a large river like the Mississippi and leads to flooding in cities and towns along the river. In a classic physical model like SWAT (the Soil and Water Assessment Tool), this question is represented through a series of equations - rainfall amounts linked to water levels in the river, slope of the land and vegetation linked to water flow, and all of these combined to determine how much water will move through a city near the river and how fast. This relatively simple series of equations can be made more complex as we include additional processes and systems. Weather models can predict rainfall amounts based on wind, cloud conditions, temperature and so on. If a storm is going to impact a large area, all the smaller streams, canals and rivers can be modeled to estimate water flow and volume. Since these systems feed into a large river like the Mississippi, their results can be fed into the Mississippi model to make its predictions more accurate.
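To make that concrete, here’s a minimal sketch in Python of the kind of conservation equation such a model chains together. This is a toy “linear reservoir” step - not anything from SWAT itself - and all the numbers are made up:

```python
# A toy "linear reservoir" step: storage change = inflow - outflow,
# with outflow proportional to how much water is stored. This is NOT
# the actual SWAT formulation, just the flavor of equation such models chain.

def simulate_river_stage(rainfall_mm, k=0.2, dt=1.0, storage=0.0):
    """Step a toy linear-reservoir model through a storm.

    rainfall_mm -- rainfall depth per time step (mm)
    k           -- outflow coefficient (fraction of storage released per step)
    Returns a stand-in for river discharge at each step.
    """
    outflows = []
    for rain in rainfall_mm:
        storage += rain * dt        # mass balance: water entering the reach
        outflow = k * storage       # release proportional to storage
        storage -= outflow * dt
        outflows.append(outflow)
    return outflows

# 8 hours of a hypothetical storm, in mm per hour
storm = [5, 20, 40, 60, 40, 20, 10, 5]
print(simulate_river_stage(storm))
```

A real model like SWAT chains hundreds of such physical statements - infiltration, evapotranspiration, channel routing - but each one is a law-of-nature equation in the same spirit as this one.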


The advantages of physical models are that the results are easily interpretable, the models can be made as complex or as simple as needed, and the results rarely violate natural laws. It’s also easier to represent the uncertainty in the system, because the models can be run over a range of different conditions. For example, if we expect a storm to dump between 10 and 20 inches of rain on an area over a period of 8 hours, we can estimate a range for how much the water level is likely to rise and how long it will take for the water to drain out.
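A quick sketch of what that looks like in code: sample the 10 - 20 inch storm total many times, push each draw through the model, and report the spread. Here a made-up 0.3 feet-per-inch stage response stands in for the full physical model:

```python
# Express model uncertainty by running the same model over a range of
# plausible inputs (a tiny Monte Carlo ensemble). The 0.3 ft-per-inch
# stage response is a placeholder for the full physical model.
import random

def toy_peak_stage_rise(storm_total_inches):
    return 0.3 * storm_total_inches

rises = sorted(toy_peak_stage_rise(random.uniform(10, 20)) for _ in range(1000))
print(f"90% interval for peak stage rise: {rises[50]:.1f} - {rises[950]:.1f} ft")
```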


Of course, the challenge of physical models is that as they grow more and more complex to capture large, interconnected systems, they become increasingly difficult to build. Most physical models require parameters that are hard to derive from first principles - and as the models get more complex, the number of required parameters grows even larger. Running these models is also time-consuming and computationally expensive. Add the issues of biases in estimating model parameters, calibrating models accurately, performing sensitivity analyses and estimating model uncertainty - and you can see how challenging it gets.


That’s where statistical approaches and, more recently, machine learning approaches come into play. Both focus primarily on the data: what do the data tell us, can we infer anything from them, and can we use the data alone to make predictions about the system? Machine learning, however, is a relative newcomer compared to traditional statistical methods and comes with an entirely different set of assumptions and challenges. So, when do we use each of these approaches, and why?


Statistical approaches - including spatial and temporal statistics - have been the go-to choice when a problem can be solved from the data alone, or when we need to understand the patterns inherent in the data to deduce something about the system itself. To see when statistical methods apply, let’s go back to our example of predicting flooding in a city near the Mississippi.


Let’s say that, as scientists and engineers, we’ve built a sufficiently detailed physical model to tell us which areas are most at risk from rising water levels in the Mississippi during a storm. In addition to our models, we also have a series of sensors along the river to monitor what’s happening in real time and to check that our model results are sufficiently accurate. However, as the storm progresses, we notice that the sensor data are not following the model’s predictions. Why would that happen? Either the sensor data are faulty, or something is happening that we did not account for in our model.


Now, this is where being able to quickly verify and “ground truth” data is so essential, as we discussed last time! Using classical statistical modeling (modeling the distribution, identifying outliers, etc.), we can determine whether the data from some of the sensors are faulty - in which case the physical model results still stand - or whether enough sensors disagree with the model (using a hypothesis test, for example) that something genuinely different is happening in our system.
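Here’s a hedged sketch of that check in Python - comparing each sensor’s reading against the model’s prediction and flagging residuals too large to be noise. The gauge names, the 0.15 m noise level and the 3-sigma threshold are all illustrative:

```python
# Flag sensors whose model-vs-observation residual is too large to be noise.
def classify_residuals(predicted, observed, sigma, z_crit=3.0):
    """Return sensor ids whose residual exceeds z_crit standard deviations."""
    flagged = []
    for sensor_id in predicted:
        z = (observed[sensor_id] - predicted[sensor_id]) / sigma
        if abs(z) > z_crit:
            flagged.append(sensor_id)
    return flagged

predicted = {"gauge_01": 4.2, "gauge_02": 4.5, "gauge_03": 4.1}  # model output (m)
observed  = {"gauge_01": 4.3, "gauge_02": 6.9, "gauge_03": 4.0}  # sensor data (m)

# A few isolated flags suggest faulty hardware; many neighboring sensors
# disagreeing with the model suggest the model is missing something.
print(classify_residuals(predicted, observed, sigma=0.15))
```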


Why do we care about something so seemingly esoteric? Because the answers to these questions are what help us predict which areas will be impacted and for how long, where to send resources and how best to help people during an emergency. And that’s a typical example of how a combination of physical models and statistical approaches has been used in clean technology.


Given that we already have tools like physical models and statistical models, are machine learning and AI really useful and necessary? Machine learning models also work primarily with data, but they add several tools beyond classical statistics and come into their own when we have “big data” - data with high velocity, high volume and high variability. Many machine learning models (neural networks, SVMs, k-means clustering) are based on equations where we set certain constraints on the problem and let the model learn which parameters give the best answer. Getting good results from a machine learning model means having sufficient high-quality data to train and test it.
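As a generic illustration of that train-and-test discipline - using scikit-learn’s SVM on synthetic data, nothing specific to flooding:

```python
# Fit an SVM on one slice of the data and measure it on a held-out slice.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic stand-in for real sensor-derived features and labels
X, y = make_classification(n_samples=500, n_features=8, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0)

model = SVC().fit(X_train, y_train)
print(f"Held-out accuracy: {model.score(X_test, y_test):.2f}")
```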


To go back to our earlier example of flooding on the Mississippi - let’s see how machine learning can help.


Now, if we had only about 30 sensors along our river sending data every hour or so, we would have a relatively small data set that traditional statistics can process quickly. But let’s say we had 3,000 sensors sending data every 10 seconds - something that is becoming more and more common these days! How would we process all of that and determine whether the sensor data were faulty or whether something unexpected really was happening?


This is where machine learning models come into play! Given this volume of data, it would be more practical to build a machine learning model that analyzes the time series from each sensor to find anomalies, decides on sensor reliability using a threshold from a hypothesis test or an expert rule, and thereby identifies whether the sensors are at fault or the physical model is inaccurate.
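A minimal sketch of what that per-sensor screening might look like, assuming readings stream in as (sensor_id, value) pairs. The window size, the 3-sigma rule and the gauge id are illustrative, and a real deployment would sit on a proper streaming framework:

```python
# Rolling z-score anomaly screen, one window of recent history per sensor.
from collections import defaultdict, deque

WINDOW = 360   # 360 readings at 10-second cadence = the last hour
Z_CRIT = 3.0   # stand-in for a hypothesis-test threshold or expert rule

history = defaultdict(lambda: deque(maxlen=WINDOW))

def is_anomalous(sensor_id, value):
    """True if the reading sits more than Z_CRIT sigmas from the sensor's recent mean."""
    window = history[sensor_id]
    anomalous = False
    if len(window) >= 30:  # wait for some history before judging
        mean = sum(window) / len(window)
        std = (sum((v - mean) ** 2 for v in window) / len(window)) ** 0.5
        anomalous = abs(value - mean) / (std or 1e-9) > Z_CRIT
    window.append(value)
    return anomalous

print(is_anomalous("gauge_1042", 4.7))  # hypothetical sensor id and stage (m)
```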


Given sufficient high-quality data, either statistical or machine learning approaches can deliver results faster and more efficiently than physical models. They also let us uncover surprises in the data, and relationships between parameters and systems that may not have been obvious.


However, because these approaches focus only on the data, errors that violate natural laws can creep in. It’s also harder to understand exactly what is driving the results than with a physical model. Many machine learning models - neural networks, for example - are “black boxes”, and teasing out the relationships between different features can be difficult in complex Earth systems. Additionally, it’s difficult to represent the inherent uncertainty in our understanding of these complex systems using statistical or machine learning models.


There are challenges with every one of these approaches, and choosing one over another is usually done for good reason - but ultimately they are all complementary. And that’s where a lot of the current work on machine learning and AI in clean technology is happening!


We use mechanistic models to better understand the system, forecast future changes and examine linkages between different aspects of the system. We can use machine learning and statistical models to better estimate the large number of parameters these models need, and to uncover unexpected relationships between parameters. And in cases where the system is well understood and physical constraints can be built into machine learning models, we can replace sub-models of a large, complex physical model with a machine learning or statistical surrogate. That makes the model faster and more efficient - which means less time to get useful results!
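To sketch that surrogate idea, here’s a toy version: label samples by running a (pretend-slow) physical sub-model, fit a fast regressor on the input/output pairs, and query the regressor instead. The toy physics and the choice of a random forest are illustrative assumptions:

```python
# Train a fast ML surrogate on input/output pairs from a physical sub-model.
import random
from sklearn.ensemble import RandomForestRegressor

def slow_physical_submodel(rain_inches, slope):
    # Stand-in for an expensive simulation: some nonlinear storm response
    return (rain_inches ** 1.5) * (1.0 + slope) * 0.1

# 1. Sample the input space and label it by running the physical model
X = [[random.uniform(5, 30), random.uniform(0.0, 0.2)] for _ in range(1000)]
y = [slow_physical_submodel(rain, slope) for rain, slope in X]

# 2. Fit the surrogate on those pairs
surrogate = RandomForestRegressor(n_estimators=100, random_state=0).fit(X, y)

# 3. Queries are now cheap relative to re-running the physics
print(surrogate.predict([[15.0, 0.05]]))
```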


As you can see, figuring all this out is pretty complicated! Next time, we’ll talk about how and where these skills can be learned - the degrees and university programs that teach them, and what to expect.

What our community is reading

Moonshots, Models, IoT and Machine Learning in Agriculture

Our online community space is now live!

How much water should an email consume? Data centers and water use