App repository: https://github.com/MasonMcV/Ember-Watch
The front end of the app was developed using Flutter, Google's framework for building cross-platform apps. It is written in the Dart programming language and follows Google's Material Design guidelines.
Image recognition uses Python with the TensorFlow library (patched for Python 3 compatibility) and runs in an AWS Lambda function. After the model verifies that an image contains a wildfire, the report is checked against the NASA FIRMS data set to further validate the user's input.
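The FIRMS cross-check can be sketched as a simple proximity test: the user's report is considered corroborated if a FIRMS-detected hotspot lies within some distance of the reported location. This is a minimal stdlib-only sketch; the threshold, the `near_firms_hotspot` helper, and the sample coordinates are illustrative assumptions, not the app's actual code.

```python
from math import radians, sin, cos, asin, sqrt

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    dlat = radians(lat2 - lat1)
    dlon = radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * EARTH_RADIUS_KM * asin(sqrt(a))

def near_firms_hotspot(lat, lon, hotspots, max_km=20.0):
    """Return True if any FIRMS hotspot (lat, lon) lies within max_km of the report."""
    return any(haversine_km(lat, lon, h_lat, h_lon) <= max_km for h_lat, h_lon in hotspots)

# Hypothetical hotspot rows, as they might be parsed from a FIRMS CSV download.
hotspots = [(37.05, -122.02), (40.0, -100.0)]
print(near_firms_hotspot(37.0, -122.0, hotspots))  # a hotspot ~6 km away -> True
print(near_firms_hotspot(52.0, 13.0, hotspots))    # no hotspot nearby -> False
```

A report that passes both the TensorFlow classifier and a check like this one would then be treated as validated.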
The output of this function is written to Google Cloud Firebase and downloaded by the front end, where it is rendered as a list of widgets.
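The hand-off between the Lambda function and Firebase could look like the sketch below: the function assembles a small JSON document and stores it, and the Flutter front end maps each stored document to a widget in its list. The field names, collection name, and helper function here are illustrative assumptions, not the actual Ember-Watch schema.

```python
import json
import time

def build_detection_record(lat, lon, image_url, firms_confirmed):
    """Assemble a hypothetical document the Lambda function could write to Firebase.

    Field names are illustrative; the real Ember-Watch schema may differ.
    """
    return {
        "lat": lat,
        "lon": lon,
        "imageUrl": image_url,
        "firmsConfirmed": firms_confirmed,
        "reportedAt": int(time.time()),
    }

record = build_detection_record(37.0, -122.0, "https://example.com/fire.jpg", True)
print(json.dumps(record, indent=2))

# With the firebase_admin SDK, the record could then be stored, e.g.:
#   firestore.client().collection("reports").add(record)
# The front end listens to that collection and builds one list widget per document.
```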
These components communicate with one another through API calls, keeping the pipeline fast end to end.
Competing on the College/Professional Level
SpaceApps is a NASA incubator innovation program.