Inspiration

During a hackathon, organisers are swamped keeping events running smoothly. It becomes tricky to make sure hacker teams are having a great hacking experience, especially newer hackers.

Often, organisers only find out about poor experiences and areas to improve through post-hack surveys. If only they could support the teams that need the most help and guidance during the hack. Additionally, hacker teams struggle to gauge how their project is going, how well they are communicating, and whether they are making the most of the hackathon's events.

Two important problem statements arise:

  1. How can we empower organisers to understand how each team is doing during a hackathon, and not only after it?
  2. How can we incentivise better project, communication, and participation outcomes for teams during a hackathon?

What it does

TeamTree is a reflective visualisation of a hacker team.

It starts as a sapling and grows in complexity and beauty as the team develops the project, communication, and participation aspects of their hack. Each aspect grows its own tree branch.

During a hackathon:

  1. Organisers can see a real-time forest growing: an intuitive visualisation of how teams are doing. This makes it easy to identify, and then guide, teams that are struggling in some aspect.

  2. Hackers are incentivised:

  • Better Project Outcomes: GitHub based – committing more frequently, using multiple languages/frameworks/tools, and learning proper development practices.
  • Deeper Communication: Discord based – chatting more frequently and using more positive, collaborative language.
  • Broader Participation: Google Forms based – checking into hackathon events and sponsor showcases for a more fun hackathon experience.
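As a rough sketch of the idea (the function names, weights, and caps here are illustrative assumptions, not TeamTree's actual code), each aspect's raw event count can be mapped to a branch length that saturates, so no single activity dominates the tree:

```python
# Hypothetical sketch: map raw activity counts to branch lengths.
# Caps and lengths are illustrative, not TeamTree's real tuning.

def branch_length(events: int, cap: int = 50, max_len: float = 10.0) -> float:
    """Map a raw event count (commits, messages, check-ins) to a
    branch length, saturating at `cap` so one aspect can't dominate."""
    return max_len * min(events, cap) / cap

def tree_state(commits: int, messages: int, checkins: int) -> dict:
    # One branch per aspect: project, communication, participation.
    return {
        "project": branch_length(commits),
        "communication": branch_length(messages),
        "participation": branch_length(checkins, cap=10),
    }

state = tree_state(commits=32, messages=120, checkins=4)
print(state)  # {'project': 6.4, 'communication': 10.0, 'participation': 4.0}
```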

After the hackathon, hacker teams can flex their TeamTree through an AR model.

How we built it

We wanted to pull live data from multiple tools popular at hackathons into a single AR output.

  • Tied in Azure Cosmos DB and Azure Blob Storage as centralised storage solutions.
  • Used the GitHub API to extract interesting data from RealityHacks 2020 repos.
  • Made a Discord bot (FlowerBot) for super simple integration that pulls the team communication data we need.
  • Created a bee simulation for during-hack visualisation, helping organisers understand mentor attention across teams and so allocate support more equitably. It uses mathematical frameworks from ant-colony behaviour, a decay model and Brownian motion with headings, tied to two simple behaviours: if I don't have food, go find food; if I have food, take it home and tell my fellow hive mates by leaving information (pollen particulates) on the ground.
  • Used L-systems in Unity to rapidly generate unique tree models from data input. An L-system is a rewriting system used mainly to model the development of plants.
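The two-behaviour bee model above can be sketched roughly as follows; the world size, decay rate, and turn limit are illustrative assumptions, not the parameters of our actual simulation:

```python
import math
import random

# Minimal sketch of the two-behaviour bee model: wander (Brownian motion
# with heading) until food is found, then head home dropping pheromone,
# which decays each tick. All constants here are illustrative.

DECAY = 0.95   # pheromone ("pollen") decay factor per tick
TURN = 0.3     # max random heading change per tick (radians)
SIZE = 20      # square world size

class Bee:
    def __init__(self):
        self.x, self.y = SIZE / 2, SIZE / 2
        self.heading = random.uniform(0, 2 * math.pi)
        self.has_food = False

    def step(self, pheromone, food_cells, home=(SIZE / 2, SIZE / 2)):
        if not self.has_food:
            # Behaviour 1: no food -> go find food (random walk).
            self.heading += random.uniform(-TURN, TURN)
            self.x = min(max(self.x + math.cos(self.heading), 0), SIZE - 1)
            self.y = min(max(self.y + math.sin(self.heading), 0), SIZE - 1)
            if (int(self.x), int(self.y)) in food_cells:
                self.has_food = True
        else:
            # Behaviour 2: has food -> head home, leaving pheromone behind.
            pheromone[(int(self.x), int(self.y))] = 1.0
            dx, dy = home[0] - self.x, home[1] - self.y
            dist = math.hypot(dx, dy)
            if dist < 1.0:
                self.has_food = False  # food delivered at the hive
            else:
                self.x += dx / dist
                self.y += dy / dist

def tick(bees, pheromone, food_cells):
    # Decay model: pheromone fades and eventually evaporates.
    for cell in list(pheromone):
        pheromone[cell] *= DECAY
        if pheromone[cell] < 0.01:
            del pheromone[cell]
    for bee in bees:
        bee.step(pheromone, food_cells)
```

Reading the pheromone map over time is what gives organisers a picture of where attention is concentrating and where it is fading.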
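To illustrate the L-system idea, here is a minimal string rewriter with a classic plant-like grammar (the rules are a textbook example, not the exact grammar our Unity generator uses):

```python
# Minimal L-system: repeatedly apply production rules to an axiom string.
# The grammar below is a classic plant example, not our Unity grammar.

def rewrite(axiom: str, rules: dict, iterations: int) -> str:
    for _ in range(iterations):
        axiom = "".join(rules.get(ch, ch) for ch in axiom)
    return axiom

# F = draw forward, X = growth point, +/- = turn, [ ] = push/pop a branch.
RULES = {
    "X": "F+[[X]-X]-F[-FX]+X",
    "F": "FF",
}

print(rewrite("X", RULES, 1))
# -> F+[[X]-X]-F[-FX]+X
```

Interpreting the resulting string as turtle-graphics drawing commands in Unity yields a branching tree; feeding team data into the iteration count or rules is what makes each TeamTree unique.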

Challenges we ran into

  • Sustaining communication in smaller groups.
  • Working across 4+ time zones, including teammates 12 hours ahead.
  • Collaborating as a 9-member team and ensuring everyone's input is taken on board.
  • Hacking through unforeseen circumstances like illness (including COVID) and work commitments.
  • Getting access to the centralised GitHub repos late into the hackathon.

Accomplishments that we're proud of

  • Connecting so many disparate parts into a single working pipeline.

What we learned

  • Constant communication is critical.
  • Time boxing is key to refocus.

What's next for TeamTrees

  • Develop an API for organisers to pull data for their own analytics/processing.
  • Richer mapping of the collected and processed data, with sub-classes translated visually.
  • Capacity to watch the TGE simulation unfold, infer new meaning from it, and act on the simulation (co-creation, cross-pollination, adding connections and communication to the ecosystem).
  • Real-time visualisation of the model connected in (web)AR, (web)VR, and immersive spaces.
  • Visitors in those spaces treated as an external class of "bees", also able to add data and interact in real time with a master visualisation model (interoperability) hosted online.
  • Spatial sounds representing processed data.
  • Visualisations generated by AI systems / non-human agents based on the type and frequency of teams' keywords, composing new visual interpretations.

Built With

  • a-frame
  • ant-colony-modelling
  • azure
  • blender
  • discord-bot
  • discord-scraping
  • figma
  • github-api
  • l-system
  • sketchpad
  • unity
  • zappar