Virtual Reality (VR) and Augmented Reality (AR) in Clean Technology

What do the terms Virtual Reality (VR) and Augmented Reality (AR) bring to mind? Hollywood movies like “Black Panther,” with their wild action sequences where cars and airplanes are piloted remotely from a laboratory; games like World of Warcraft, where your character moves through all sorts of different locations; Pokémon Go, with prizes hunted down in actual physical locations; Star Trek holodecks, where you could explore completely different planets and surfaces… the list goes on and on. The one thing all these examples have in common, though, is that they come from the entertainment industry.


So far, VR and AR have mostly been used for playing games, having fun, and making movies more realistic. However, as devices like Google Cardboard, the Oculus Rift and HTC’s Vive become more widely available and affordable, VR and AR have begun making their way into fields beyond entertainment. In clean tech in particular, there’s been growing interest in how these technologies can be deployed to improve outcomes in the field, expand access to education and raise awareness of the environment.


But first of all, what’s the difference between VR and AR, other than a letter? 


Augmented Reality (AR) is when additional information, either from a simulation or from a digital repository like a GIS dataset, is overlaid onto an existing environment. For example, let’s say you have smart glasses like Google Glass, or a smartphone camera, and are looking at the city around you. If you were playing a game like Pokémon Go, a Pokémon could pop up in a location near you. If you were a technician from an energy company trying to find a buried utility line, a map of the lines near your location could be fed to your headset so that you could figure out where to dig.
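To make that idea concrete, here’s a minimal sketch (in Python, with made-up coordinates, field of view and screen width; a real AR toolkit such as ARCore or ARKit would handle the camera pose and rendering for you) of the core calculation: work out the compass bearing from the technician’s GPS fix to a buried asset pulled from a GIS dataset, then map that bearing onto a horizontal screen position so a marker can be drawn over the camera feed.

```python
import math

# Hypothetical GIS record for a buried valve (WGS84 latitude/longitude in degrees).
ASSET_LAT, ASSET_LON = 37.4405, -122.1600

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial compass bearing from point 1 to point 2, in degrees (0 = north)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dlon = math.radians(lon2 - lon1)
    x = math.sin(dlon) * math.cos(phi2)
    y = math.cos(phi1) * math.sin(phi2) - math.sin(phi1) * math.cos(phi2) * math.cos(dlon)
    return (math.degrees(math.atan2(x, y)) + 360) % 360

def marker_screen_x(user_lat, user_lon, heading_deg, fov_deg=60, screen_px=1080):
    """Horizontal pixel where the asset marker should be drawn over the camera view,
    or None if the asset is outside the camera's field of view."""
    rel = bearing_deg(user_lat, user_lon, ASSET_LAT, ASSET_LON) - heading_deg
    rel = (rel + 180) % 360 - 180          # wrap to [-180, 180)
    if abs(rel) > fov_deg / 2:
        return None                        # asset is off-screen
    return round((rel / fov_deg + 0.5) * screen_px)

# Technician standing nearby, facing roughly north-east (hypothetical values).
print(marker_screen_x(37.4400, -122.1610, heading_deg=45))
```

In practice the GIS layer would contain whole line segments rather than a single point, but the principle is the same: digital records are projected into the user’s current view of the world.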


AR typically requires smart glasses, or cameras and sensors paired with an app on a phone or computer. It’s another way to display information, much like Google Maps with a traffic layer, and it generally assumes that the user stays aware of their surroundings.


Virtual Reality (VR), on the other hand, is when the user is transported virtually to somewhere completely different from their current location. For example, putting on a VR headset could place you in the middle of the ocean, swimming with dolphins. You would experience the sights and the movement in a way that very closely mimics what would happen if you were actually swimming in the ocean. VR headsets like the Oculus Rift are opaque to their surroundings, so if you were to put one on while it was switched off, you would be practically blind to what’s going on around you. Switch it on, and you would be in the middle of whatever simulation or system you wanted to interact with, like swimming in the ocean.


A VR application needs to be tied to a VR headset and to whatever system supplies the computing power for the simulation, whether that’s a gaming console, a PC or a standalone mobile headset.


A VR experience can be thought of as complete immersion in a different environment, while an AR experience adds a layer of information to the environment you are already in.


The two technologies, while closely related, have very different applications in the clean tech field. At present, AR is used more widely, as it’s a relatively easier technology to implement and find use cases for. The energy sector, in particular, is at the forefront of using AR to improve outcomes in the field. EPRI, the Electric Power Research Institute in Palo Alto, California, is partnering with several utilities in the United States to see if AR systems can help technicians in the field. One use case they are investigating is feeding a GIS layer of transmission and distribution lines to technicians in real time so that they can locate faults more easily. Another would be adding fault reports to that GIS feed, so that repair crews could prioritize which sector to start work in closer to real time. Or, if repair crews wore sensors that monitored natural gas concentrations, they could identify leaks in natural gas pipelines before major disasters occurred.
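As a rough illustration of the fault-report idea (this is not EPRI’s actual system; the data structures and the priority rule below are hypothetical), a dispatcher-side sketch might aggregate incoming reports by grid sector and rank where crews should start:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Hypothetical fault reports as they might arrive alongside a GIS feed.
@dataclass
class FaultReport:
    sector: str                # grid sector / feeder the fault sits on
    customers_affected: int    # estimated customers without power
    reported_at: datetime      # when the fault was first reported

reports = [
    FaultReport("feeder-12", 340, datetime(2019, 3, 4, 14, 5, tzinfo=timezone.utc)),
    FaultReport("feeder-07", 1200, datetime(2019, 3, 4, 14, 20, tzinfo=timezone.utc)),
    FaultReport("feeder-12", 90, datetime(2019, 3, 4, 14, 32, tzinfo=timezone.utc)),
]

def prioritize(reports):
    """Rank sectors by total customers affected, oldest outage first as a tie-breaker."""
    by_sector = {}
    for r in reports:
        entry = by_sector.setdefault(r.sector, {"customers": 0, "oldest": r.reported_at})
        entry["customers"] += r.customers_affected
        entry["oldest"] = min(entry["oldest"], r.reported_at)
    return sorted(by_sector.items(),
                  key=lambda kv: (-kv[1]["customers"], kv[1]["oldest"]))

for sector, info in prioritize(reports):
    print(sector, "-", info["customers"], "customers affected since", info["oldest"].isoformat())
```

Pushed to a headset in the field, a ranking like this is what would let a crew see, in roughly real time, which sector to drive to first.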


Another area where AR is beginning to be used widely is in making design decisions. Some companies have begun adding an AR component to their design documents. For example, let’s say a company wants to build a wind or solar farm in an area with a nature preserve close by. It could create an AR application where the wind turbines or solar panels are added to the actual location as designed. This would let people see exactly how the natural beauty of the area would change, if at all, or whether the design would negatively impact bird flight paths. Something like this would help immensely in minimizing conflicts between companies and communities over how an area should be developed.


These are just a couple of examples, but the potential for using AR is almost limitless. If you’re building a 3-D simulation of something, or creating a visualization using a map, chances are it can translate into an AR system. Think about visualizing well placement to watch groundwater recharge or extraction. Or watching how contaminants move through the atmosphere near a factory. Or seeing how quickly a forest could recover after a fire and where recovery efforts should be focused.


VR, at present, is being used mostly as a training and awareness-building tool in the clean tech field. While it may well overtake AR in the future, its utility today is still more limited. Utility companies are using VR systems to help train technicians to make complicated repairs; in this role it acts as an aid to existing training methods, letting technicians get a feel for the actual system before going out to make the repairs. It’s also being used as an educational tool in universities. Stanford’s Virtual Human Interaction Lab, for example, recently ran studies where students used VR systems to watch what happens as the oceans acidify. As students watched coral reefs bleach and the environment change, they reported both an increased understanding of how ocean systems work and a stronger desire to protect the environment.


In short, both VR and AR have a bright future ahead of them in the clean tech field! Next time, we’ll look at the technology and data science needed to make these systems work.
