A Spring Boot application built with GraalVM that demonstrates semantic search using Oracle Vector Store and OpenAI embeddings.
GitHub: alina-yur/oracle-database-vector-search
Full details are covered in the accompanying blog post.
- GraalVM, or install it with SDKMAN!:

  ```shell
  sdk install java 25.0.1-graal
  ```

- Oracle Database with Vector support (for example, Autonomous Database 26ai)
- OpenAI API key
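Before going further, it can help to confirm that a GraalVM JDK with the `native-image` tool is on your `PATH`. A quick sanity-check sketch (the fallback message is just a reminder, not part of the project):

```shell
# Recent GraalVM JDK distributions bundle the native-image tool.
if command -v native-image >/dev/null 2>&1; then
  native-image --version
else
  echo "native-image not found; install GraalVM, e.g. 'sdk install java 25.0.1-graal'"
fi
```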
By default, this project uses Oracle Autonomous Database via TLS connection.
Required environment variables:
```shell
export OPENAI_API_KEY=
export DB_PASSWORD=
export ORACLE_JDBC_URL='jdbc:oracle:thin:@(description=(address=(protocol=tcps)(port=1522)(host=<your-adb-host>))(connect_data=(service_name=<your-service-name>))(security=(ssl_server_dn_match=yes)))'
```

Optional environment variable:

```shell
export DB_USERNAME=ADMIN
```

If you start with only the `(description=...)` descriptor body, prepend `jdbc:oracle:thin:@`.
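As a sketch of that last step, with a hypothetical descriptor body held in a shell variable (the host and service name below are illustrative placeholders), the prefix is simply prepended:

```shell
# DESC stands in for the (description=...) body copied from your
# database's connection settings; the values here are hypothetical.
DESC='(description=(address=(protocol=tcps)(port=1522)(host=myhost.example.com))(connect_data=(service_name=mydb_high)))'
export ORACLE_JDBC_URL="jdbc:oracle:thin:@${DESC}"
echo "$ORACLE_JDBC_URL"
```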
```shell
mvn spring-boot:run
```

The application automatically loads sample pet store inventory data on startup.
Now let's build it with Native Image:
```shell
mvn -Pnative native:compile
```

Check the resulting executable:

```shell
➜ eza -l ./target | grep "store"
... 107M ... vector-pet-store
```

Run it:

```shell
./target/vector-pet-store
```

You can search the pet store items using natural language:
```shell
curl "http://localhost:8080/petstore/search?query=Treats%20for%20small%20loud%20dogs"

# or with httpie
http localhost:8080/petstore/search query=="find cat food with tuna"
```

Example requests:
- "Find dog toys" -> finds "Labrador Bark Control Chews", "Heavy Duty Rope for Large Breeds"
- "Find cat food" -> finds "Gourmet Tuna Souffle for Cats", "Gourmet Chicken Soup for Cats"
The API returns semantically similar products using vector similarity search.
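When crafting ad-hoc queries from the shell, spaces in the query string need to be percent-encoded. A minimal sketch that handles spaces only (a real client should fully URL-encode):

```shell
QUERY='Treats for small loud dogs'
# Naive encoding: replace each space with %20 (sufficient for simple queries)
ENCODED=$(printf '%s' "$QUERY" | sed 's/ /%20/g')
echo "http://localhost:8080/petstore/search?query=${ENCODED}"
```

Alternatively, curl's `--get --data-urlencode` options perform full URL encoding for you.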
Using Docker or Podman:
```shell
podman run -d -p 1521:1521 --name oracle-free \
  -e ORACLE_PASSWORD=mypassword \
  -e APP_USER=appuser -e APP_USER_PASSWORD=mypassword \
  -v oracle-data:/opt/oracle/oradata \
  gvenzl/oracle-free:latest
```

Point `ORACLE_JDBC_URL` at the local database and run the app.
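For the local container, the connection settings might look like this (a sketch assuming the image's default `FREEPDB1` pluggable database and the credentials passed to `podman run` above):

```shell
# Local Oracle Database Free container: plain TCP, default FREEPDB1 service
export ORACLE_JDBC_URL='jdbc:oracle:thin:@//localhost:1521/FREEPDB1'
export DB_USERNAME=appuser
export DB_PASSWORD=mypassword
```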
- Startup: ~1.5 seconds
- Package size: ~107 MB standalone executable
- Memory: significantly lower than running the same application on the JVM
Native Image compiles your Spring Boot application ahead of time into a self-contained executable with faster startup (typically 20-30x), a lower memory footprint (typically 3-5x), and compact deployment.