Inspiration
Despite global efforts toward the UN's Zero Hunger goal for 2030, water scarcity and soil degradation continue to worsen. At current rates, the situation will only grow more dire within a decade.
This makes it imperative to iterate on policy quickly and to identify crises early enough to act on them.
Satellite data is coarse and oriented toward large areas, so it is often ineffective for regional applications. The alternative, manual surveying, typically takes longer than the window available for designing optimal policies. Effective on-the-ground data sourcing is therefore crucial to addressing this problem.
What it does
Landmark is a data-collection network of low-power, low-cost sensor arrays. The nodes are spread evenly across the land and periodically report temperature, humidity, pH, and salinity. The data travels over The Things Network (TTN) to a gateway, which writes it to a MongoDB database. For auditability, each entry is given a unique fingerprint recorded on Solana.
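One way to derive such a per-entry fingerprint is to hash a canonical serialization of the reading before anchoring it on-chain. The sketch below is a minimal illustration; the field names and the hashing scheme are assumptions, not the actual Landmark schema.

```python
import hashlib
import json

def fingerprint(reading: dict) -> str:
    """Deterministic SHA-256 fingerprint of one sensor reading.

    Keys are sorted so the same reading always hashes to the same digest;
    the hex digest is what would be recorded on Solana for later audits.
    (Field names below are illustrative, not the production schema.)
    """
    canonical = json.dumps(reading, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

reading = {"node": "n-01", "ts": 1700000000, "temp_c": 24.1, "humidity": 61.0}
print(fingerprint(reading))
```

Because the serialization is canonical, an auditor can recompute the digest from the database entry and compare it against the on-chain value to detect tampering.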
Based on this data, we smooth out anomalies using a Kalman filter and infer general trends over longer periods of time.
By also combining weather APIs with local data on terrain and soil type, we can draw effective correlative inferences from the readings. Given the limited data available at launch, we will use a continual-learning ML model.
How we built it
We used an STM32F-series Nucleo board as the foundation for the sensor node, and we are considering STM32L-series boards for the actual product. The prototype uses a temperature and humidity sensor; pH and salinity sensors could be integrated into the final product (we did not have them on hand, or in the hardware chest). Because the board lacks radio capabilities, we used an ESP32 to send API requests over Wi-Fi and update the data in MongoDB. (STM32CubeIDE, C, Assembly)
We built the frontend in React with TypeScript.
For the data processing, we implemented a Kalman filter to remove anomalies, and based on the general trends observed, we make an LLM call to draw inferences from the data. (Python)
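The trend-to-LLM step can be sketched as follows: fit a least-squares slope over the smoothed readings, then summarize it into a prompt. The threshold-free slope computation is standard; the sample values and prompt wording are illustrative assumptions, not the production pipeline.

```python
def trend_slope(values):
    """Least-squares slope of evenly spaced readings (units per step)."""
    n = len(values)
    mx, my = (n - 1) / 2, sum(values) / n
    num = sum((x - mx) * (y - my) for x, y in enumerate(values))
    den = sum((x - mx) ** 2 for x in range(n))
    return num / den

# Hypothetical weekly pH readings after Kalman smoothing
weekly_ph = [6.8, 6.7, 6.5, 6.4, 6.2]
slope = trend_slope(weekly_ph)
direction = "falling" if slope < 0 else "rising"

# This string would be passed to the LLM call for inference
prompt = f"Soil pH is {direction} at {abs(slope):.2f} per week; suggest likely causes."
print(prompt)
```

Summarizing the series into a compact numeric trend before the LLM call keeps the prompt small and grounds the model's inference in an actual statistic rather than raw noisy readings.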
Challenges we ran into
Representing the data on the map using APIs turned out to be a lot harder than expected. We ended up implementing a gradient-based visualisation, which proved much clearer than the standard approach of plotting individual nodes on the map.
Designing the hardware turned out to be even harder: balancing the ground between the STM32 and the ESP32 was a bigger hassle than the software side. We finally sorted it out after spending three hours on that single problem (woohoo!).
Accomplishments that we're proud of
- Intuitive and low-friction UI
- Data integrity through blockchain
- Hardware-Software Integration
- Exploring and implementing in a different domain with IoT
What we learned
- How to use time-series analysis in agricultural systems
- Working with embedded systems
What's next for Landmark
- Ethical data sourcing
- Providing better inferential data (identifying causalities rather than only correlations)
Built With
- assembly
- c
- embedded
- machine-learning
- mapbox
- mongodb
- next
- python
- solana
- stm32
- typescript