
We ❤️ Open Source

A community education resource

How vector databases unlock semantic search and AI workflows

From Star Wars to semantic search: Understanding vector databases.

Vector databases might sound like another tech buzzword, but they are quietly powering many of today’s generative AI applications. In her lightning talk at All Things Open, Jessica Garson from Elastic shares insights on how vector databases change the way we think about search and why developers should pay attention.



Jessica begins by comparing traditional keyword-based search with semantic search. Classic search engines rely on tokenization, synonyms, and inverted indexes, which are fast and scalable, but often fail to capture user intent or handle complex queries. They also struggle with image, audio, and video searches. Vector databases step in to bridge these gaps, enabling search based on meaning rather than literal matches.
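The limits of keyword matching are easy to demonstrate. The sketch below (a toy, not Elastic's implementation; the documents and queries are hypothetical) builds a minimal inverted index and shows how a paraphrased query that shares no tokens with a document returns nothing, even when the intent matches:

```python
from collections import defaultdict

# Toy documents to index (hypothetical examples).
docs = {
    1: "how to reset my password",
    2: "password reset instructions",
    3: "photos of a sunset over the ocean",
}

# Build an inverted index: token -> set of document IDs containing it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for token in text.lower().split():
        index[token].add(doc_id)

def keyword_search(query):
    """Return IDs of documents containing every query token."""
    tokens = query.lower().split()
    if not tokens:
        return set()
    results = index[tokens[0]].copy()
    for token in tokens[1:]:
        results &= index[token]
    return results

# A literal token match works:
print(keyword_search("password reset"))   # {1, 2}
# But a paraphrase with no shared tokens finds nothing,
# even though the user's intent is the same:
print(keyword_search("forgot my login"))  # set()
```

Real engines layer on tokenization rules and synonym lists, but the core weakness stands: without shared tokens (or hand-curated synonyms), there is no match.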

She explains that vectors are mathematical representations of data, like words, images, or audio, encoded as lists of numbers. By using vector embeddings, developers can transform content into numeric representations that capture context and relationships. This unlocks similarity searches across multiple data types and enables applications to understand meaning in a more human-like way.
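To make this concrete, here is a minimal sketch of similarity over embeddings. The three-dimensional vectors are invented for illustration; real embedding models produce hundreds or thousands of dimensions, but the comparison works the same way:

```python
import math

# Hypothetical hand-made embeddings; a real model would generate these.
# Words with related meanings sit close together in the vector space.
embeddings = {
    "dog":   [0.90, 0.80, 0.10],
    "puppy": [0.85, 0.75, 0.20],
    "car":   [0.10, 0.20, 0.95],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# "dog" vs. "puppy" scores far higher than "dog" vs. "car",
# even though the strings share no characters in common.
print(cosine_similarity(embeddings["dog"], embeddings["puppy"]))
print(cosine_similarity(embeddings["dog"], embeddings["car"]))
```

Because the comparison happens in vector space rather than on raw tokens, the same machinery extends to images and audio once they are embedded.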

For developers, vector databases serve as a foundation for retrieval-augmented generation (RAG) and other machine learning workflows. By pairing embeddings with approximate nearest neighbor searches, teams can build applications that surface more relevant results and deliver richer experiences. Jessica notes that her slides and example code are available for anyone ready to explore further and apply these concepts in practice.
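The retrieval half of a RAG pipeline can be sketched in a few lines. This toy version uses brute-force cosine similarity over a handful of hypothetical documents; a production vector database would use an approximate nearest neighbor index (such as HNSW) to keep this fast at scale, and the embeddings here are invented placeholders:

```python
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

# Hypothetical store of (text, embedding) pairs; a real system would
# embed each document with the same model used for queries.
documents = [
    ("Vector databases store embeddings.",       [1.0, 0.0, 0.0]),
    ("Keyword search uses inverted indexes.",    [0.0, 1.0, 0.0]),
    ("RAG pairs retrieval with generation.",     [0.9, 0.1, 0.0]),
]

def retrieve(query_embedding, k=2):
    """Brute-force nearest neighbors: rank all documents by similarity."""
    ranked = sorted(documents,
                    key=lambda d: cosine_similarity(query_embedding, d[1]),
                    reverse=True)
    return [text for text, _ in ranked[:k]]

def build_prompt(question, query_embedding):
    """Stuff the top-k retrieved passages into a prompt for an LLM."""
    context = "\n".join(retrieve(query_embedding))
    return f"Answer using only this context:\n{context}\n\nQuestion: {question}"

print(build_prompt("What is a vector database?", [1.0, 0.0, 0.0]))
```

The generation step (sending the prompt to a model) is omitted; the point is that retrieval quality, and therefore answer quality, hinges on the vector search underneath.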

Key takeaways

  • Traditional keyword search is fast and scalable, but it struggles with intent, multimedia data, and complex queries.
  • Vector databases enable semantic search by encoding data as vectors, allowing similarity searches across text, images, and audio.
  • Developers can use vector databases to power RAG applications and machine learning workflows that deliver more contextual and relevant results.

Conclusion

Vector databases are more than just hype; they are a practical tool for building smarter AI-driven applications. Jessica's talk highlights how the move from token-based search to semantic vector search opens new opportunities for developers to create applications that feel more natural and intuitive.
