Inspiration

Saving and making notes while studying on the web requires a lot of context switching, which keeps you out of the flow state. SecondBrain makes this experience frictionless: you use only hand gestures, with no need to change tabs or even touch your keyboard.

What it does

Initialise the program and start browsing in Chrome. To save a page to your 'second brain', make a grabbing gesture with your hand; the current web page or PDF is saved and can be queried later.

Whenever you want to query your notes, you can either use our web page to chat with your saved pages, or raise your finger and ask a question using our speech-to-text feature.

How we built it

Vision model: MediaPipe is used to detect hand gestures; the supported gestures are closing your fist and raising a finger. A browser extension then scrapes the text of the page or PDF, and OpenAI's text-embedding-3-small generates embeddings (vectors) of the web page. These are stored in Redis Cloud, and a vector search retrieves relevant context at query time. A language model then summarises that context and answers the question.
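The gesture layer can be illustrated with a small dispatch table. This is a sketch rather than our actual code: `Closed_Fist` and `Pointing_Up` are canned gesture labels that MediaPipe's Gesture Recognizer emits, while the action names and the confidence threshold are assumptions for illustration.

```python
# Sketch: map MediaPipe Gesture Recognizer labels to SecondBrain actions.
# "Closed_Fist" and "Pointing_Up" are canned gestures in MediaPipe's model;
# the action names and threshold here are hypothetical.

GESTURE_ACTIONS = {
    "Closed_Fist": "save_page",    # grab gesture -> save current page/PDF
    "Pointing_Up": "voice_query",  # raised finger -> speech-to-text question
}

def dispatch(gesture_label: str, score: float, threshold: float = 0.7):
    """Return the action for a recognized gesture, or None if the
    recognizer's confidence score is below the threshold."""
    if score < threshold:
        return None
    return GESTURE_ACTIONS.get(gesture_label)
```

Filtering on the recognizer's score keeps incidental hand movements from accidentally saving pages.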
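Redis Cloud performs the nearest-neighbour lookup for us at scale, but the underlying idea, ranking stored page embeddings by cosine similarity to the query embedding, can be sketched with a toy in-memory store (plain Python, no Redis; the data layout is an assumption):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def top_k(query_vec, store, k=2):
    """store: list of (page_text, embedding) pairs.
    Returns the k page texts most similar to the query embedding."""
    ranked = sorted(store, key=lambda item: cosine(query_vec, item[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]
```

In the real app the embeddings come from text-embedding-3-small and the retrieved page texts are passed to the language model as context for the answer.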

Taipy powers the front end of the app, alongside the hands-free interaction driven by MediaPipe.

What's next for SecondBrain

Adding support for creating flashcards tailored to the content the user has been browsing.

Create your second brain today!

Built With

  • llm
  • redis
  • taipy
  • vision-model