Inspiration

One of the biggest roadblocks in music production is choosing the right sample for your song. It's tedious and inhibits the creative process. What if there was a way to quickly and easily find samples that fit into your mix?

What it does

SoundSift indexes your entire sample library, and you can find samples by either uploading a clip of your song or searching by text. For example, searching "sad guitar melody" returns the samples that most closely match your query.

How we built it

We used two vector embedding models based on CLAP (Contrastive Language-Audio Pretraining): one audio-to-audio and one text-to-audio. We wrapped these models in an API and ran a FAISS similarity search to find the top-matching samples in the user's sample library. The frontend was built in Svelte as an Electron desktop app.
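The retrieval step can be sketched in miniature. This is not the project's code: random vectors stand in for CLAP embeddings, the function names are hypothetical, and a brute-force cosine similarity replaces FAISS (which does the same ranking, just much faster over large libraries):

```python
import math
import random

random.seed(0)

EMBED_DIM = 512  # CLAP embeds both audio and text into a shared 512-d space

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def top_k(query, library, k=3):
    """Rank sample embeddings by similarity to the query embedding."""
    ranked = sorted(library.items(), key=lambda kv: cosine(query, kv[1]), reverse=True)
    return [name for name, _ in ranked[:k]]

# Stand-in library: in SoundSift these vectors would come from the CLAP
# audio encoder, one per sample file.
library = {f"sample_{i}.wav": [random.gauss(0, 1) for _ in range(EMBED_DIM)]
           for i in range(10)}
# Stand-in query: in practice, the CLAP text encoder's embedding of a
# search like "sad guitar melody", or the audio encoder's embedding of a clip.
query = [random.gauss(0, 1) for _ in range(EMBED_DIM)]
print(top_k(query, library))
```

Because both encoders map into the same embedding space, the same `top_k` ranking serves text-to-audio and audio-to-audio search alike.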

Challenges we ran into

Getting the models to work was a challenge, as the Hugging Face API did not work for us. We eventually realized these models were relatively lightweight and could be run locally. Wiring our API to the frontend was also a challenge, as we had to store the embeddings on the user's file system and perform the similarity search on the frontend.
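Persisting the embeddings so the frontend can search without re-encoding every sample might look like the sketch below. The cache layout and function names are assumptions, not SoundSift's actual format; JSON keeps the sketch dependency-free:

```python
import json
import os
import tempfile

# Hypothetical cache layout: one JSON file on the user's file system
# mapping each sample path to its precomputed embedding vector.

def save_embeddings(cache_path, embeddings):
    """Write the sample-path -> embedding map to disk."""
    with open(cache_path, "w") as f:
        json.dump(embeddings, f)

def load_embeddings(cache_path):
    """Load the cached map; an empty dict means nothing is cached yet."""
    if not os.path.exists(cache_path):
        return {}
    with open(cache_path) as f:
        return json.load(f)

cache = os.path.join(tempfile.mkdtemp(), "embeddings.json")
save_embeddings(cache, {"kick.wav": [0.1, 0.2], "snare.wav": [0.3, 0.4]})
restored = load_embeddings(cache)
print(sorted(restored))
```

With a cache like this, only new or changed samples need to be run through the encoder on startup; the similarity search then operates on the loaded map.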

Accomplishments that we're proud of

We are proud of making a genuinely useful app that gives music producers real value. It lets producers spend their time on what's important: making art. We are also proud of our sleek, elegant UI.

What we learned

We learned a lot about how to deploy and host these kinds of embedding models, and how to build a Svelte app with Electron.
