Inspiration

Wearable tech is really good at collecting all sorts of health data. From Apple Watches to Fitbits to Oura Rings, these devices are an untapped source of insight that we wanted to explore. While brainstorming, we realized that every member of our team owned some form of this wearable tech.

Although the data is super insightful, the way it’s conveyed isn’t that engaging. To us, data should tell a story, and what better medium for a story than art? This year’s HackGT theme is “Midnight at a Museum,” after all.

So we thought: what if we took our raw health data and created beautiful pieces of AI art, called them Heartifacts, and made them all accessible in an iOS app?

Boom.

What it does

Heartifacts transforms your wearable’s raw health data into works of AI art. For example, if your Apple Watch logged eight hours of sleep with no interruptions, DALL-E could generate a flowery, colorful masterpiece that represents a good night’s rest. If you also crushed your step goal that day, you’d get a dynamic, energetic piece of art to match.

The app weighs the individual variables from your sleep patterns (REM, deep, core sleep), movement data (steps, energy burned, exercise), and mindfulness/stress levels (HRV, resting heart rate) to develop ‘scores’ for each category. Those scores are then passed through a prompt pipeline that first chooses an art style, adjusts the prompt based on your performance, and finally asks DALL-E to generate your unique Heartifact: a piece of art that directly reflects your health performance throughout the day.
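
Here’s a rough sketch of what that scoring and descriptor step looks like for the sleep category; the weights, targets, and phrasing below are illustrative placeholders, not our exact production values:

```swift
// Illustrative sketch of the sleep-category scoring step.
// Weights, targets, and descriptor phrasing are placeholders.
struct SleepSample {
    let remHours: Double
    let deepHours: Double
    let coreHours: Double
}

/// Collapse raw sleep stages into a single 0...1 score,
/// weighting restorative stages more heavily than core sleep.
func sleepScore(_ s: SleepSample) -> Double {
    let raw = 0.4 * (s.remHours / 2.0) +
              0.4 * (s.deepHours / 1.5) +
              0.2 * (s.coreHours / 4.5)
    return min(max(raw, 0), 1)
}

/// Map the score to the descriptive language fed into the prompt.
func sleepDescriptor(for score: Double) -> String {
    switch score {
    case 0.8...:    return "a serene, flowery landscape in soft morning light"
    case 0.5..<0.8: return "a calm but slightly overcast garden"
    default:        return "a fragmented, restless collage of muted colors"
    }
}
```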

How we built it

Heartifacts uses SwiftUI and the Apple HealthKit framework to query your personal health data related to sleep, movement, and stress. It then uses OpenAI’s DALL-E API to generate personalized artwork based on these metrics. Because we don’t have access to Apple’s proprietary sleep/stress/exercise scores that are calculated for the Health app, we developed our own algorithm to calculate quality of sleep, stress, and exercise from the raw data available to us (e.g. for stress: HRV, resting heart rate, and mindful minutes).
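
As an example, pulling resting heart rate out of HealthKit looks roughly like this (a minimal sketch; the authorization request and error handling are trimmed):

```swift
import HealthKit

let healthStore = HKHealthStore()

// Fetch today's resting heart rate samples in beats per minute.
// Sketch only: assumes read authorization for .restingHeartRate
// has already been requested and granted.
func fetchRestingHeartRate(completion: @escaping ([Double]) -> Void) {
    guard let type = HKQuantityType.quantityType(forIdentifier: .restingHeartRate) else { return }
    let startOfDay = Calendar.current.startOfDay(for: Date())
    let predicate = HKQuery.predicateForSamples(withStart: startOfDay, end: Date())
    let query = HKSampleQuery(sampleType: type, predicate: predicate,
                              limit: HKObjectQueryNoLimit, sortDescriptors: nil) { _, samples, _ in
        let bpm = (samples as? [HKQuantitySample])?.map {
            $0.quantity.doubleValue(for: HKUnit.count().unitDivided(by: .minute()))
        } ?? []
        completion(bpm)
    }
    healthStore.execute(query)
}
```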

The app features two tabs: the first is a color-coded dashboard that lets the user see the raw health data that contributes to their unique ‘Heartifact’ generation. The second shows the piece of art generated by DALL-E 3 that represents the quality of the user’s sleep, stress, and movement.
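
The tab structure itself is plain SwiftUI; the view names here are illustrative stand-ins for our actual screens:

```swift
import SwiftUI

// Sketch of the two-tab layout; the view bodies are stubs.
struct DashboardView: View {
    var body: some View { Text("Color-coded raw health metrics") }
}

struct HeartifactView: View {
    var body: some View { Text("Today's generated artwork") }
}

struct ContentView: View {
    var body: some View {
        TabView {
            DashboardView()
                .tabItem { Label("Dashboard", systemImage: "heart.text.square") }
            HeartifactView()
                .tabItem { Label("Heartifact", systemImage: "photo.artframe") }
        }
    }
}
```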

Challenges we ran into

None of us had experience with Swift or building an iOS app. We were learning a new language and development environment together, and it was really fun (albeit sometimes a little frustrating). Regardless, it didn’t prevent us from executing on our idea.

Another roadblock came from data privacy compliance: Apple HealthKit doesn’t allow you to sync with real Health accounts while the app is in development. To work around this, we generated synthetic health data and used it to drive our DALL-E Heartifact generation.
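
The synthetic data was along these lines; the ranges are rough placeholders we picked to look plausible, not clinical reference values:

```swift
// Synthetic stand-in for a day of HealthKit data, used during
// development. Ranges are rough placeholders, not reference values.
struct SyntheticDay {
    let steps: Int
    let sleepHours: Double
    let hrvMs: Double        // heart rate variability, ms
    let restingHR: Double    // beats per minute
}

func makeSyntheticDay() -> SyntheticDay {
    SyntheticDay(
        steps: Int.random(in: 2_000...15_000),
        sleepHours: Double.random(in: 4.5...9.0),
        hrvMs: Double.random(in: 20...90),
        restingHR: Double.random(in: 50...80)
    )
}
```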

Accomplishments that we're proud of

We were super ambitious in building with tools none of us had experience with. We had never developed an iOS app before, so we all had to learn the fundamentals of Swift and all of the weird restrictions that Xcode brings along with it. We were also extremely proud of how we handled our class design! We were able to separate the development of the health data collection, AI processing, and UI presentation, then combine them at the end into one cohesive package (see the sketch below). Finally, as a small touch, we had a lot of fun building the museum-themed interface, including the parallax effect when swiping between the artifacts.
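
Roughly, the boundaries between those three workstreams looked like this (the protocol names are illustrative, not our actual class names):

```swift
// Illustrative boundaries between the three workstreams; our real
// class names differed, but the separation was the same idea.
protocol HealthDataProviding {
    /// Collapses a day's raw HealthKit samples into category scores.
    func dailyScores() async throws -> (sleep: Double, movement: Double, stress: Double)
}

protocol ArtGenerating {
    /// Turns category scores into a prompt and returns the image URL.
    func generateHeartifact(sleep: Double, movement: Double, stress: Double) async throws -> URL
}
```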

What we learned

Prompt engineering: We took inspiration from Midjourney in designing prompts for AI image generation. This included taking prompt weighting, multi-prompt structures, and negative prompts into account when crafting the prompt for our final image.
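
Since DALL-E 3 takes plain natural language rather than Midjourney’s `::` weight syntax, those ideas translate into emphasis and avoidance clauses. A sketch, with purely illustrative phrasing:

```swift
// Translate Midjourney-style concepts (weighting, negative prompts)
// into the natural-language prompt DALL-E 3 expects.
// All phrasing here is an illustrative placeholder.
func buildPrompt(style: String, descriptors: [String], avoid: [String]) -> String {
    let subject = descriptors.joined(separator: ", ")
    let negatives = avoid.isEmpty ? "" : " Avoid: \(avoid.joined(separator: ", "))."
    return "A \(style) artwork depicting \(subject), with a museum-quality composition.\(negatives)"
}

// Example: buildPrompt(style: "impressionist",
//                      descriptors: ["a serene flowering garden at dawn"],
//                      avoid: ["text", "watermarks", "human faces"])
```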

DALL-E 3 integration: We learned to work with OpenAI’s image generation API, handling authentication, error management (our prompts kept violating the terms of service), and asynchronous image processing.
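
The request itself boils down to one POST to OpenAI’s images endpoint. A minimal sketch using URLSession (retry logic for content-policy rejections and response decoding are omitted):

```swift
import Foundation

// Minimal sketch of a DALL-E 3 call against OpenAI's
// /v1/images/generations endpoint.
struct ImageRequest: Encodable {
    let model = "dall-e-3"
    let prompt: String
    let n = 1
    let size = "1024x1024"
}

func generateImage(prompt: String, apiKey: String) async throws -> Data {
    var request = URLRequest(url: URL(string: "https://api.openai.com/v1/images/generations")!)
    request.httpMethod = "POST"
    request.addValue("Bearer \(apiKey)", forHTTPHeaderField: "Authorization")
    request.addValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpBody = try JSONEncoder().encode(ImageRequest(prompt: prompt))
    let (data, _) = try await URLSession.shared.data(for: request)
    return data // JSON payload containing the generated image's URL
}
```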

We also learned how to split a complex project into pieces we could build concurrently, since each of us had to learn and develop a different part of the app before bringing it all together at the end.

What's next for Heartifacts

There are a lot of features we thought of and wish we’d had more time to build! The first is integrating health data APIs from companies like Fitbit and Garmin, which would expand the user base and bring in metrics unique to those devices. We’d also love an archive feature that saves each day’s Heartifact, so users can look back at their activity history alongside the cool piece of art that goes with it. Finally, we considered the possibility of an MCP server to expedite the fetching process instead of manually sending queries to the API.

Built With

  • healthkit
  • openai
  • swift
  • swiftui