Posts

What's new in 2020 at Ecoformatics?

Here's wishing all our readers a very happy new year! Last year, we started our journey toward making data science accessible for people interested in clean technology and in solving the problems facing our planet. We had the pleasure of conducting several workshops and online webinars on different aspects of data science in clean technology. Our in-person workshops covered a wide range of topics: data sources in different clean tech sectors, how to build effective algorithms and models (including deep learning), uncertainty analyses, and business use cases. In the latter half of the year, we also began conducting online sessions introducing folks to careers and tools at the intersection of data science and clean tech. As part of our expansion plans in 2020, we're creating an online education platform that focuses on applying data science effectively in clean tech sectors. We're in beta this month ...

When Data Science Fails Clean Technology

Is data science infallible? If all we had to go on were the breathlessly excited articles published in business magazines and the highly polished press releases from startups and large tech companies, it would certainly seem so. Think of the articles published this year with titles like “Artificial Intelligence (AI) to replace all jobs by such and such year”, “Machine learning solves problem faster than humans”, “Data science shows promise to end world hunger soon”, and so on. Data science is a relatively new field, but one that combines elements from disciplines that have been around for a while - computer science, statistics, and algebra, for example. The difference right now is the sheer power and availability of computational resources like the cloud, which allow people to build and run different models and experiment on a scale that we haven’t seen before. And in high-tech companies, we’re also seeing an explosion in the...

Data and Smart Cities

If you had to pick a buzzword for 2019 in clean technology and data science, it would be “Smart Cities”! This year, we’ve heard about Alphabet’s Sidewalk Labs and their efforts to design one in Toronto; India and China have announced plans to redesign over 100 cities into “Smart Cities”; European countries like Norway and Finland point out that much of what is touted as a “Smart City” already exists in their systems; and plenty of people in Silicon Valley have their own ideas of what Smart Cities should be like and what they should do. A couple of themes stand out across the many conferences and presentations about Smart Cities: 1) opportunities abound, with an estimated market size of $237 billion by 2025 according to one study, and 2) there’s a wide range of interpretations of what exactly makes a city “Smart”. The most conservative definition, and the one that governments and city organizations highlight, is “where...

Upcoming conference on Smart Cities in Silicon Valley

There's an interesting conference coming up in Sunnyvale, California on October 8th. The topic is "Smart Cities Innovation Day: A future outlook on urban life". The location is one of the Valley's incubators, Plug and Play, and the agenda promises to be quite interesting! Several startups are presenting, there's a panel discussion with officials from regional and city governments, and a keynote from the head of IoT at the World Economic Forum. If you're curious, the link to register is here.

Wildfires and the limitations of data science

Let's get back to talking about wildfires! This is the second post in our two-post series on wildfires, data science, and what we can do about them. In the first post, we talked about satellite data and how it is used to track wildfires - especially large ones like those in the Amazon this year. The wildfires burning in the Amazon have slipped off the front page of most newspapers, but they're still burning. And let's not forget the opposite end of the globe, where wildfires in Indonesia are also burning out of control! The interesting thing about wildfires, and natural disasters in general, is that most of the attention and resources go toward dealing with them as they happen, and toward figuring out what resources and changes are needed after it's all over. So we see a lot of effort focused on satellite imagery, understanding wildfire extents after they have started, and building apps and websites for people to access resources and tools during and afte...

Live event announcement: Feeding the world with data science in Sacramento, CA

I'm excited to have been invited by the organization "Women in Big Data" to talk about data science, agriculture, the environment, and all the associated challenges. The symposium is on Wednesday, September 18th in Sacramento, CA, and you can expect a lively discussion with me and three other experts on the panel! If you're interested, event details are at https://www.eventbrite.com/e/symposium-using-data-to-sustainably-feed-the-world-tickets-72188915991 . For those of you who are curious, Women in Big Data is a grassroots organization that started in Silicon Valley, with women from different organizations who work in data science coming together to talk about technical challenges, career pathways, entrepreneurship, and funding, among other fun topics. The group started with 5 women and now has close to 14,000 members all over the world - an amazing growth rate over the last 3 years! They hold conferences, regular meetups and training sessions, and the one...

Wildfire monitoring from satellite data

We had a great time hosting our "Getting started with Data Science for clean technology professionals" webinar - a big thank you to everyone who registered and asked all those interesting questions! If you're interested in getting the recording, it's now available for download on our website. And now, back to our regular posts on how data science is used in different clean tech fields! There's been a lot of news about the wildfires in the Amazon and their consequences for the planet, so let's talk about wildfire detection and how it's done. Wildfires, and in particular the Amazon fires, are detected using data from a wide range of satellites - NASA's array, the EU's Copernicus, Brazil's Terra satellites, Japan's Himawari-8, and CubeSats, among others. But what exactly do these satellites see, and how can you identify a wildfire from the data? Typically, satellites carry multi-spectral cameras or sensors on board. As the satellite passes o...
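The excerpt is cut off here, but to give a feel for how a wildfire or burn scar can be pulled out of multi-spectral imagery, here's a minimal sketch using the Normalized Burn Ratio (NBR), one widely used index computed from near-infrared (NIR) and shortwave-infrared (SWIR) bands. This isn't necessarily the exact method the post goes on to describe, and the band values and the dNBR threshold below are illustrative, made-up numbers.

import numpy as np

def normalized_burn_ratio(nir, swir):
    # NBR = (NIR - SWIR) / (NIR + SWIR), computed per pixel.
    # Healthy vegetation reflects strongly in the NIR band, while freshly
    # burned areas reflect strongly in the SWIR band, so NBR drops over burn scars.
    nir = nir.astype(float)
    swir = swir.astype(float)
    return (nir - swir) / (nir + swir + 1e-9)  # small epsilon avoids divide-by-zero

# Hypothetical 2x2 reflectance grids for the same area on two satellite passes
pre_fire_nbr = normalized_burn_ratio(np.array([[0.45, 0.42], [0.44, 0.40]]),
                                     np.array([[0.18, 0.20], [0.19, 0.21]]))
post_fire_nbr = normalized_burn_ratio(np.array([[0.20, 0.18], [0.43, 0.41]]),
                                      np.array([[0.35, 0.33], [0.20, 0.22]]))

# A large drop in NBR between passes (dNBR) flags pixels that likely burned
dnbr = pre_fire_nbr - post_fire_nbr
likely_burned = dnbr > 0.27  # illustrative threshold; real analyses calibrate this
print(likely_burned)

The same before-and-after comparison idea underlies a lot of satellite-based change detection; the choice of bands and thresholds is what changes from one sensor (and one application) to another.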

Webinar announcement: Getting started with data science

We've been getting a lot of questions recently from folks who are already working in clean tech fields - water, energy, agriculture, environmental consulting, climate change - about data science and what you need to get started. Some of the questions we get asked a lot: Is all this AI and data science stuff hype? Is it really useful? How is data science different from the traditional statistical methods we've used? I'm confused about what a data scientist does. Is that different from a data analyst? I can see the potential, but I don't know where to start or which options are most useful to me at this stage. If this sounds like something you've been thinking about, join us for a free webinar this Friday, August 16th at 11.30 am Pacific Time!

Snippets: Monitoring crop diseases, infrastructure health and wildlife

  "If you can't measure it, can you fix it?"   One of the greatest challenges faced by almost everyone working in a clean technology field - water, agriculture, energy, climate, forestry, wildlife, soils, corporate sustainability, smart cities - is the challenge of monitoring. At its essence,this is the challenge of   what needs to be measured, how often and how accurately can it be done . Traditional methods of monitoring have involved sensors (of different levels of accuracy) placed in specific locations and the data removed and processed off-site by engineers and field analysts at specific time intervals. This is a time-consuming process, with data that isn't as frequent or as spatially dense or with as many parameters as decision makers and scientists would like - but, until recently that's been the best that we've had. The advent of smartphones, high-frequency and high resolution satellite data and the whole Internet of Things (IoT) is changing this parad...

Data Science For Water In California

I was at the Open Water for California conference in Sacramento last week - a conference dedicated to data science for water, with a focus on California. The conference was well attended, with people from many different sectors - academia, government, non-profits, and community members. What was particularly interesting is that it had an entire track devoted to hackathons and building new tools to solve some of the pressing problems facing water today - as well as the more traditional sessions with talks on the latest research and tools for the water sector. The hackathons focused on understanding trash movement into water sources (especially plastic), building consumer confidence reports on water safety, and building tools to better understand drinking water sources. Understandably, there was a lot of interest in water safety, with Flint still fresh in our minds and the recent focus on water quality issues in California communities affected ...

Coding, Databases, GIS and other tools for a clean tech data scientist

As we saw in the last post, a data scientist's role requires the ability to capture, process, analyze, and visualize data. While there are some off-the-shelf software tools, most applications in the clean tech and data science space require knowledge of a programming language to perform many of the tasks effectively. The popular choices for a clean tech data scientist are: 1. Python: Python is probably the single most critical element in the data scientist's toolkit. It's a flexible, easily learned language that is powerful because of the large stack of libraries that have been developed for it. Do you need to figure out how to get data from a website - or train a machine learning algorithm? Chances are there's an existing Python library that can be plugged into your code. The main libraries needed for almost any data science use case are scipy, numpy, statsmodels, and pandas. These can be used fo...
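The excerpt is cut off here, but to make that toolkit concrete, here's a minimal sketch of numpy, pandas, and statsmodels working together on a made-up example - hypothetical hourly turbidity readings from a single water-quality sensor, with the column name and numbers invented purely for illustration.

import numpy as np
import pandas as pd
import statsmodels.api as sm

np.random.seed(42)  # make the fake data reproducible

# Hypothetical hourly water-quality sensor readings for one week
index = pd.date_range("2019-06-01", periods=24 * 7, freq="H")
readings = pd.DataFrame({
    "turbidity_ntu": 5 + 0.01 * np.arange(len(index)) + np.random.normal(0, 0.5, len(index))
}, index=index)

# pandas: resample the noisy hourly data down to daily means
daily = readings.resample("D").mean()

# statsmodels: fit a simple linear trend to the daily means
X = sm.add_constant(np.arange(len(daily)))
model = sm.OLS(daily["turbidity_ntu"], X).fit()

print(daily.round(2))
print(f"Estimated trend: {model.params.iloc[1]:.3f} NTU per day")

The point isn't this particular analysis - it's that a few lines of pandas handle the capture-and-process steps, and statsmodels (or scipy) handles the statistical modeling, without ever leaving Python.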