A high-performance, reactive Java library for semantic search using vector embeddings with Qdrant and Ollama.
```java
public void usage() {
    // Create service with configuration
    VectorDbService vectorDbService = new VectorDbService();

    // Use config with authentication tokens if services require them
    VectorStoreConfig config = VectorStoreConfig.create(
            "http://localhost:11434/api/embeddings", // Embedding service URL
            "nomic-embed-text",                      // Embedding model
            "embedding_token",                       // Token for Ollama (if required by reverse proxy)
            "http://localhost:6333",                 // Qdrant URL
            "qdrant_token",                          // Token for Qdrant (if required by reverse proxy)
            "default",                               // Namespace
            "vector_store"                           // Collection name
    );

    // Add a record with attributes
    VectorDbInput record = new VectorDbInput(
            1,                         // ID
            "Sales by Region",         // Name
            List.of(                   // Attributes
                    new AttributeGroup("dimensions", List.of("Region", "Date")),
                    new AttributeGroup("measures", List.of("Sales Amount", "Quantity"))
            ),
            System.currentTimeMillis() // Updated timestamp
    );

    // Store the record, auto-initializing the collection if needed
    vectorDbService.upsertOrLoadRecords(
            config,
            List.of(record),
            () -> Mono.just(List.of(record)) // Supplies all records if the collection is empty
    ).block();

    // Search for similar records
    String query = "sales performance by geographical area";
    int limit = 5;
    VectorDbQuery searchQuery = VectorDbQuery.basic(query, limit);
    List<VectorDbSearchResult> results = vectorDbService
            .findRelevantRecords(config, searchQuery)
            .block();

    // Process results
    results.forEach(result -> {
        System.out.println("ID: " + result.id() + ", Score: " + result.score());
        System.out.println("Name: " + result.name());
    });
}
```
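The `score()` on each search result reflects how similar the stored record's embedding is to the query embedding. As a standalone illustration (not part of this library's API), the sketch below shows cosine similarity, a common metric for comparing embedding vectors, ranking one document vector above another; the class name and vectors are invented for the example:

```java
public class CosineSimilarity {

    // Cosine similarity: dot(a, b) / (|a| * |b|).
    // Returns 1.0 for identical directions, 0.0 for orthogonal vectors.
    public static double cosine(double[] a, double[] b) {
        double dot = 0, normA = 0, normB = 0;
        for (int i = 0; i < a.length; i++) {
            dot += a[i] * b[i];
            normA += a[i] * a[i];
            normB += b[i] * b[i];
        }
        return dot / (Math.sqrt(normA) * Math.sqrt(normB));
    }

    public static void main(String[] args) {
        double[] query = {0.2, 0.8, 0.1};  // toy "query" embedding
        double[] doc1  = {0.25, 0.7, 0.05}; // similar direction -> high score
        double[] doc2  = {0.9, 0.1, 0.4};   // different direction -> low score
        System.out.println("doc1 score: " + cosine(query, doc1));
        System.out.println("doc2 score: " + cosine(query, doc2));
    }
}
```

Real embeddings from `nomic-embed-text` have hundreds of dimensions, but the ranking principle is the same: higher scores indicate closer semantic matches.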
The embedding service uses the `nomic-embed-text` model. For local development, you can run the required services using Docker Compose:
```yaml
services:
  ollama:
    image: mingzilla/ollama-nomic-embed:latest
    container_name: ollama-service
    ports:
      - "11434:11434"
    environment:
      - OLLAMA_HOST=0.0.0.0
    restart: unless-stopped

  qdrant:
    image: qdrant/qdrant:latest
    container_name: qdrant-service
    ports:
      - "6333:6333" # REST API
      - "6334:6334" # gRPC API
    environment:
      - QDRANT_ALLOW_RECOVERY_MODE=false
    restart: unless-stopped
```
Start the services:

```shell
docker-compose up -d
```
The services will be available at:

- Ollama: http://localhost:11434
- Qdrant: http://localhost:6333 (REST), http://localhost:6334 (gRPC)
Note: Both services can be placed behind a reverse proxy for token verification if needed in secure environments. The library supports sending authentication tokens through `VectorStoreConfig` for both services when required.
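To illustrate what such token handling looks like at the HTTP level, the sketch below builds (but does not send) a request to the Ollama embeddings endpoint with a bearer token attached, using only the JDK's `java.net.http` client. The payload shape (`model` plus `prompt`) follows Ollama's `/api/embeddings` API; the class name and token value are hypothetical, and the library's actual internals may differ:

```java
import java.net.URI;
import java.net.http.HttpRequest;

public class EmbeddingRequestSketch {

    // Builds a POST request against Ollama's /api/embeddings endpoint with a
    // bearer token, as a reverse proxy doing token verification might expect.
    // URL, model, and token mirror the VectorStoreConfig values shown earlier.
    public static HttpRequest build(String prompt) {
        String json = "{\"model\":\"nomic-embed-text\",\"prompt\":\"" + prompt + "\"}";
        return HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:11434/api/embeddings"))
                .header("Authorization", "Bearer embedding_token") // hypothetical token
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(json))
                .build();
    }

    public static void main(String[] args) {
        HttpRequest request = build("sales performance by geographical area");
        System.out.println(request.method() + " " + request.uri());
    }
}
```

The same pattern applies to Qdrant: the proxy validates the `Authorization` header before forwarding the request to the service.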
The `mingzilla/ollama-nomic-embed` image comes with the embedding model pre-installed, eliminating the need for model downloading and maintenance.

The library consists of these main components: