The project is an iOS application that uses crowdsourcing to let any user report a fire. The application applies machine learning for image recognition, enabling early, near-instantaneous wildfire detection.
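The detection step described above can be sketched as follows. The app's actual model runs on the IBM Watson Developer Cloud; since its classes and thresholds are not given in the source, a stub classifier stands in here and every name is illustrative.

```python
# Hypothetical sketch of FireWatch's detection step. The real app sends the
# photo to a Watson image-recognition model; a stub classifier stands in so
# the flow is runnable. All names and thresholds are assumptions.
from dataclasses import dataclass

@dataclass
class Classification:
    label: str
    confidence: float  # 0.0 - 1.0

def classify_photo(image_bytes: bytes) -> Classification:
    """Stand-in for the Watson call; a trained model would score the image."""
    looks_like_fire = b"flame" in image_bytes  # toy heuristic, demo only
    return Classification("wildfire" if looks_like_fire else "no_fire",
                          0.92 if looks_like_fire else 0.12)

def should_alert(result: Classification, threshold: float = 0.8) -> bool:
    """A report only fans out to users and authorities above a cutoff."""
    return result.label == "wildfire" and result.confidence >= threshold

print(should_alert(classify_photo(b"...flame...")))  # True
```

Gating alerts on a confidence threshold is one way an automated pipeline can avoid spamming users with false positives.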
On average, it takes two to three hours for a satellite to recognize, confirm, and report a wildfire before response units can act. FireWatch dramatically reduces reporting and response time by letting users photograph a wildfire, alerting every other user in the vicinity along with the appropriate emergency units. Furthermore, the manpower needed to put out a fire can be estimated by inspecting the images.
Why FireWatch, why not just call?
Usually, reporting a wildfire follows this timeline: a user makes a phone call; an emergency call center answers and reports the fire; the authorities are alerted and respond accordingly; and once rescuers arrive at the scene, local citizens begin to evacuate.
FireWatch, however, merges two of these steps into one. Because a report simultaneously alerts the authorities and nearby users, evacuation, once the fire is confirmed, can begin earlier and proceed faster.
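The merged step can be sketched as a single fan-out: one report notifies the authorities and every user within some radius in the same call. The radius, user list, and function names below are assumptions, not FireWatch's actual code.

```python
# Illustrative sketch (all names hypothetical) of the merged reporting step:
# one report simultaneously notifies authorities and every user in range.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two coordinates, in kilometres."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def report_fire(lat, lon, users, radius_km=25.0):
    """Return the notifications a single report fans out to."""
    alerts = ["authorities"]  # dispatched in the same step, not afterwards
    alerts += [name for name, (ulat, ulon) in users.items()
               if haversine_km(lat, lon, ulat, ulon) <= radius_km]
    return alerts

users = {"alice": (34.05, -118.25), "bob": (36.17, -115.14)}
print(report_fire(34.10, -118.20, users))  # ['authorities', 'alice']
```

Here a user roughly 7 km from the report is alerted while one hundreds of kilometres away is not, mirroring the "every other user within the vicinity" behaviour described above.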
The source code and screenshots from the project were uploaded to GitHub and can be accessed at https://github.com/AliKelkawi/FireWatch. Note that Firebase was used as the database to store and retrieve information about wildfire locations.
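One way to persist such a report is Firebase's Realtime Database REST endpoint, which accepts JSON POSTed to `https://<project>.firebaseio.com/<path>.json`. The project name, path, and report schema below are assumptions for illustration; the request is prepared but not sent.

```python
# Hedged sketch of storing a wildfire report in Firebase. The app's exact
# schema is not in the source; the project name, path, and fields are
# placeholders. Uses the Realtime Database REST endpoint shape.
import json
from urllib import request

FIREBASE_URL = "https://firewatch-demo.firebaseio.com/reports.json"  # placeholder

def build_report(lat: float, lon: float, image_url: str, confidence: float) -> bytes:
    """Serialise one wildfire report as the REST endpoint expects."""
    return json.dumps({
        "lat": lat,
        "lon": lon,
        "image": image_url,
        "confidence": confidence,
    }).encode("utf-8")

def upload(report: bytes) -> request.Request:
    """Prepare (but do not send) the POST that would persist the report."""
    return request.Request(FIREBASE_URL, data=report,
                           headers={"Content-Type": "application/json"},
                           method="POST")

req = upload(build_report(34.1, -118.2, "https://example.com/fire.jpg", 0.92))
print(req.get_method(), req.full_url)  # POST https://firewatch-demo.firebaseio.com/reports.json
```

Posting under a single `/reports` path keeps all clients reading from one list, which matches the shared database all users access through the app.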
Reports are uploaded to a database that all users can access through the application. Users may view the photos of fires at pinned locations on the map, but they cannot manipulate the data, since the pipeline is automated. The application shows pictures of active fires as they are uploaded. Colours indicate the number of wildfires within a region, and as a user zooms into the map the fire pins de-cluster, revealing the specific locations of reported wildfires.
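The zoom-dependent clustering described above can be sketched with a simple grid scheme: pins are bucketed into cells whose size shrinks as the zoom level grows, so merged clusters split apart when the user zooms in. This is an assumed implementation, not FireWatch's actual code, and the colour thresholds are invented.

```python
# Minimal grid-clustering sketch (assumed, not FireWatch's actual code):
# pins are bucketed into grid cells whose size halves per zoom level, so
# clusters "de-cluster" as the user zooms in.
from collections import defaultdict

def cluster_pins(pins, zoom):
    """Group (lat, lon) pins; higher zoom -> smaller cells -> more clusters."""
    cell = 10.0 / (2 ** zoom)  # cell size in degrees
    buckets = defaultdict(list)
    for lat, lon in pins:
        buckets[(int(lat // cell), int(lon // cell))].append((lat, lon))
    return buckets

def cluster_colour(count):
    """Colour encodes how many fires a cluster holds (thresholds assumed)."""
    return "red" if count >= 10 else "orange" if count >= 3 else "yellow"

pins = [(34.05, -118.25), (34.06, -118.24), (36.17, -115.14)]
far = cluster_pins(pins, zoom=0)   # coarse cells: nearby pins merge
near = cluster_pins(pins, zoom=8)  # fine cells: pins separate
print(len(far), len(near))
```

At the coarse zoom all three pins collapse into one coloured cluster; at the fine zoom each pin occupies its own cell, matching the de-clustering behaviour users see on the map.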
The project was developed in Xcode for the iOS application, with Python and the IBM Watson Developer Cloud for the machine-learning model, and Adobe Illustrator for the artwork and design.
As future work, the team would like to deploy smart sensors in both populated and unpopulated danger areas to collect continuous data and stream it to data centers, which would automatically process it and alert the authorities in a crisis. The team would also like to incorporate weather and traffic APIs to warn users of potential danger areas and suggest routes that avoid them.
SpaceApps is a NASA incubator innovation program.