Project Description:
Despite all the love that Mother Nature provides us on a daily basis, humans remain not very obedient children: in roughly 90% of cases, they light the first fuse for wildfires, which then become truly wild creatures, spreading as fast as 10.8 kilometers per hour in forests and 22 kilometers per hour in grasslands (Wikipedia). However, significant effort and research has been devoted to detecting the early signs of fire, aiming to prevent what could otherwise lead to a huge loss of ecological systems and, in turn, a huge impact on our planet.
Contributing to this noble effort, we used the technology already in our hands to build a system that also involves community members in this cause. PROJECT NAME is a mobile app that allows users to share their observations of wildfires happening (or about to happen) in their region.
These insights include photos, video recordings, and text (e.g. coordinates).
How does it work?
Detect & Verify Step:
The mobile app receives fire detection data from our server, which stays in sync with NASA's Fire Information for Resource Management System (FIRMS). FIRMS provides wildfire detections from two satellite image-processing instruments, MODIS (aboard the Terra and Aqua satellites) and VIIRS, and reports two types of confidence levels: numeric values (0% to 100%) for the first and classes (low, nominal, high) for the second. We chose to work with the numeric values and defined thresholds that map them to three classes: low (potential starting wildfire), nominal (fire at early stages), and high (fire already spreading). See: https://earthdata.nasa.gov/faq/firms-faq#ed-confidence
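The threshold mapping described above can be sketched as a small helper. The cut-off values (30 and 80) below are illustrative assumptions, not the project's actual thresholds:

```python
def classify_confidence(confidence: int) -> str:
    """Map a numeric FIRMS confidence value (0-100) to the app's classes.

    The 30/80 cut-offs are placeholder assumptions for illustration; the
    project defines its own thresholds from the FIRMS documentation.
    """
    if confidence < 30:
        return "low"      # potential starting wildfire
    elif confidence < 80:
        return "nominal"  # fire at early stages
    else:
        return "high"     # fire already spreading
```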
Users receive an alarm for nearby potential wildfire locations, each with its assigned class (discussed above). Additionally, a user can scan a wider region for potential wildfire locations and view the class assigned to each one. The user then reacts by capturing information on the wildfire; this information (mostly photos) is checked by a machine learning model that we trained using IBM Watson (currently it can distinguish the nominal and high stages of wildfire).
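Deciding which detections are "close" to a user comes down to a distance check against their position. A minimal sketch, assuming detections arrive as dictionaries with `lat`/`lon` keys and using a standard haversine great-circle distance (the 25 km radius is an illustrative assumption):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two (lat, lon) points."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearby_detections(user_pos, detections, radius_km=25.0):
    """Return the detections within radius_km of the user's position."""
    return [d for d in detections
            if haversine_km(user_pos[0], user_pos[1], d["lat"], d["lon"]) <= radius_km]
```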
Alarm Step:
If the user's photo passes the check, an alarm is sent to all connected users in the same region, calling on them to react to the wildfire and help the community with their data. We also aim to alert the authorities, especially the firefighting and road traffic departments, so they have up-to-date insight on the incident and can, if needed, direct traffic away from the potentially impacted areas.
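On the server side, the broadcast step reduces to selecting the recipients. A minimal sketch, assuming each user record carries a region, a connection flag, and an id (all field names are illustrative), and skipping the user who reported the fire:

```python
def users_to_alert(users, region, exclude_id=None):
    """Select connected users in the affected region, skipping the reporter.

    `users` is assumed to be a list of dicts with "id", "region", and
    "connected" keys; the actual data model may differ.
    """
    return [u for u in users
            if u["region"] == region and u["connected"] and u["id"] != exclude_id]
```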
React Step:
Despite the phenomenal efforts and the advanced level of AI used, the algorithms (MODIS or VIIRS) can fail for several reasons related to satellite overpasses or cloud and heavy smoke cover (e.g. the California wildfires): https://earthdata.nasa.gov/faq/firms-faq#ed-not-detected
So we are obliged to have a plan that minimizes human suffering in case everything gets out of control. The app allows facility owners (warehouses, schools, campuses, restaurants, hypermarkets, and even households) to register in the app as Community Heroes. Thanks to those heroes, people affected by the wildfire (or, yes, you thought of it, any other type of natural disaster) can use an offline, pre-loaded map (frequently updated as new heroes join) to be directed to the closest shelter! In the end, we, children of nature, need to come together when Mother Nature gets mad at us. Those heroes will be rewarded first of all by being true real-life heroes, and also with a certain amount of tax relief or other incentives (everybody needs a tangible motivation!)
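The offline nearest-shelter lookup can be sketched as a search over the pre-loaded list of Community Heroes. This sketch assumes each shelter record has illustrative `name`/`lat`/`lon` fields and uses an equirectangular approximation, which is adequate for the short distances involved in reaching a nearby shelter:

```python
import math

def nearest_shelter(user_lat, user_lon, shelters):
    """Pick the closest Community Hero from a pre-loaded offline list.

    Uses an equirectangular distance approximation; fine for the short
    distances relevant when directing someone to a nearby shelter.
    """
    def approx_km(s):
        dx = math.radians(s["lon"] - user_lon) * math.cos(math.radians(user_lat))
        dy = math.radians(s["lat"] - user_lat)
        return 6371.0 * math.hypot(dx, dy)
    return min(shelters, key=approx_km)
```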
All the code supporting the proof of concept for the project is available in our GitHub repositories under an open-source license.
Image Server => https://github.com/mohsenmgr/imageServer.git
Mobile App => https://github.com/mohsenmgr/mobileSpaceApp.git
Website => https://github.com/mohsenmgr/SpaceWebsite.git
SpaceApps is a NASA incubator innovation program.