Micronaut LangChain4j

Integration between Micronaut and LangChain4j

Version: 0.0.1

1 Introduction

This module provides integration between Micronaut and LangChain4j.

This module is considered experimental and subject to change, since the underlying technology (LangChain4j) has not yet reached 1.0.0 and is not regarded as stable.

Various modules are provided that automatically configure common LangChain4j types such as ChatLanguageModel, ImageModel, etc. Refer to the sections below for the supported LangChain4j extensions.

2 Quick Start

Add the following annotation processor dependency:

Gradle:
annotationProcessor("io.micronaut.langchain4j:micronaut-langchain4j-processor")

Maven:
<annotationProcessorPaths>
    <path>
        <groupId>io.micronaut.langchain4j</groupId>
        <artifactId>micronaut-langchain4j-processor</artifactId>
    </path>
</annotationProcessorPaths>

Then the core module:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-core")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-core</artifactId>
</dependency>

You are now ready to configure one of the Chat Language Models; for this quick start we will use Ollama:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-ollama")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-ollama</artifactId>
</dependency>

To test the integration, add the Test Resources support to your Gradle or Maven build:

Gradle:
testResourcesService("io.micronaut.langchain4j:micronaut-langchain4j-ollama-testresources")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-ollama-testresources</artifactId>
    <scope>testResourcesService</scope>
</dependency>

Add configuration to specify the model name you want to use:

Configuring the Model Name

Properties:
langchain4j.ollama.model-name=orca-mini

YAML:
langchain4j.ollama.model-name: orca-mini

TOML:
"langchain4j.ollama.model-name"="orca-mini"

Groovy:
langchain4j.ollama.modelName = "orca-mini"

HOCON:
{
  "langchain4j.ollama.model-name" = "orca-mini"
}

JSON:
{
  "langchain4j.ollama.model-name": "orca-mini"
}

Now you can inject an instance of ChatLanguageModel into any bean.
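
For example, here is a minimal sketch of a bean that uses the configured model (the class name is illustrative; generate(String) is LangChain4j's simple text-in/text-out call):

Injecting the ChatLanguageModel
package example;

import dev.langchain4j.model.chat.ChatLanguageModel;
import jakarta.inject.Singleton;

@Singleton // hypothetical example class, not part of the module
public class ChatService {

    private final ChatLanguageModel chatLanguageModel;

    ChatService(ChatLanguageModel chatLanguageModel) { // the configured model is injected
        this.chatLanguageModel = chatLanguageModel;
    }

    public String ask(String question) {
        // Sends a single user message and returns the model's reply as plain text
        return chatLanguageModel.generate(question);
    }
}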

You can also define new AI services:

Defining @AiService interfaces
package example;

import dev.langchain4j.service.SystemMessage;
import io.micronaut.langchain4j.annotation.AiService;

@AiService // (1)
public interface Friend {

    @SystemMessage("You are a good friend of mine. Answer using slang.") // (2)
    String chat(String userMessage);
}
1 Define an interface annotated with @AiService
2 Use LangChain4j annotations like @SystemMessage

You can now inject the @AiService definition into any Micronaut component including tests:

Calling @AiService definitions
package example;

import static org.junit.jupiter.api.Assertions.assertNotNull;

import dev.langchain4j.model.chat.ChatLanguageModel;
import io.micronaut.test.extensions.junit5.annotation.MicronautTest;
import org.junit.jupiter.api.Test;

@MicronautTest
public class AiServiceTest {
    @Test
    void testAiService(Friend friend, ChatLanguageModel languageModel) {
        String result = friend.chat("Hello");

        assertNotNull(result);
        assertNotNull(languageModel);
    }
}
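
Outside of tests, the same interface can be injected into any other bean, for example a controller. The following is a hypothetical sketch (the route and class name are illustrative, not part of the module):

Calling an @AiService from a controller
package example;

import io.micronaut.http.annotation.Controller;
import io.micronaut.http.annotation.Get;
import io.micronaut.http.annotation.QueryValue;

@Controller("/friend") // hypothetical endpoint for demonstration
public class FriendController {

    private final Friend friend; // implementation generated from the @AiService interface

    FriendController(Friend friend) {
        this.friend = friend;
    }

    @Get("/chat")
    public String chat(@QueryValue String message) {
        return friend.chat(message);
    }
}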

3 Chat Language Models

The following modules provide integration with LangChain4j chat language models.

Each module configures one or more ChatLanguageModel beans, making them available for dependency injection based on configuration.
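
Besides the simple String-based call shown in the quick start, an injected ChatLanguageModel can also be invoked with structured messages. A minimal sketch (the class name is illustrative):

Calling a ChatLanguageModel with structured messages
package example;

import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.SystemMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.ChatLanguageModel;
import dev.langchain4j.model.output.Response;
import jakarta.inject.Singleton;

@Singleton // hypothetical example class
public class TranslationService {

    private final ChatLanguageModel model;

    TranslationService(ChatLanguageModel model) {
        this.model = model;
    }

    public String translateToFrench(String text) {
        // A system message steers the model; the user message carries the input text
        Response<AiMessage> response = model.generate(
                SystemMessage.from("You are a translator. Translate the user's text to French."),
                UserMessage.from(text));
        return response.content().text();
    }
}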

3.1 Anthropic

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-anthropic")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-anthropic</artifactId>
</dependency>

Then add the necessary configuration.

Example Configuration

Properties:
langchain4j.anthropic.api-key=YOUR_KEY

YAML:
langchain4j.anthropic.api-key: YOUR_KEY

TOML:
"langchain4j.anthropic.api-key"="YOUR_KEY"

Groovy:
langchain4j.anthropic.apiKey = "YOUR_KEY"

HOCON:
{
  "langchain4j.anthropic.api-key" = "YOUR_KEY"
}

JSON:
{
  "langchain4j.anthropic.api-key": "YOUR_KEY"
}

3.2 Azure

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-azure")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-azure</artifactId>
</dependency>

Then add the necessary configuration.

Example Configuration

Properties:
langchain4j.azure-open-ai.api-key=YOUR_KEY
langchain4j.azure-open-ai.endpoint=YOUR_ENDPOINT

YAML:
langchain4j.azure-open-ai.api-key: YOUR_KEY
langchain4j.azure-open-ai.endpoint: YOUR_ENDPOINT

TOML:
"langchain4j.azure-open-ai.api-key"="YOUR_KEY"
"langchain4j.azure-open-ai.endpoint"="YOUR_ENDPOINT"

Groovy:
langchain4j.azureOpenAi.apiKey = "YOUR_KEY"
langchain4j.azureOpenAi.endpoint = "YOUR_ENDPOINT"

HOCON:
{
  "langchain4j.azure-open-ai.api-key" = "YOUR_KEY"
  "langchain4j.azure-open-ai.endpoint" = "YOUR_ENDPOINT"
}

JSON:
{
  "langchain4j.azure-open-ai.api-key": "YOUR_KEY",
  "langchain4j.azure-open-ai.endpoint": "YOUR_ENDPOINT"
}

You will additionally need to define a bean of type TokenCredential.

One way to do this is to include the Azure SDK module.
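
Alternatively, a TokenCredential bean can be registered manually. A minimal sketch using azure-identity's DefaultAzureCredentialBuilder (the factory class name is illustrative):

Providing a TokenCredential bean
package example;

import com.azure.core.credential.TokenCredential;
import com.azure.identity.DefaultAzureCredentialBuilder;
import io.micronaut.context.annotation.Factory;
import jakarta.inject.Singleton;

@Factory // hypothetical factory, shown as one possible way to supply the credential
public class AzureCredentialFactory {

    @Singleton
    TokenCredential tokenCredential() {
        // Resolves credentials from the environment, a managed identity, the Azure CLI, etc.
        return new DefaultAzureCredentialBuilder().build();
    }
}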

3.3 Bedrock

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-bedrock")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-bedrock</artifactId>
</dependency>

Then add the necessary configuration.

Example Configuration

Properties:
langchain4j.bedrock-llama.api-key=YOUR_KEY

YAML:
langchain4j.bedrock-llama.api-key: YOUR_KEY

TOML:
"langchain4j.bedrock-llama.api-key"="YOUR_KEY"

Groovy:
langchain4j.bedrockLlama.apiKey = "YOUR_KEY"

HOCON:
{
  "langchain4j.bedrock-llama.api-key" = "YOUR_KEY"
}

JSON:
{
  "langchain4j.bedrock-llama.api-key": "YOUR_KEY"
}

You will additionally need to define a bean of type AwsCredentialsProvider.

One way to do this is to include the AWS SDK module.
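
Alternatively, a bean can be registered manually. A minimal sketch using the AWS SDK v2 default credentials chain (the factory class name is illustrative):

Providing an AwsCredentialsProvider bean
package example;

import io.micronaut.context.annotation.Factory;
import jakarta.inject.Singleton;
import software.amazon.awssdk.auth.credentials.AwsCredentialsProvider;
import software.amazon.awssdk.auth.credentials.DefaultCredentialsProvider;

@Factory // hypothetical factory, shown as one possible way to supply the credentials
public class AwsCredentialsFactory {

    @Singleton
    AwsCredentialsProvider awsCredentialsProvider() {
        // Resolves credentials from environment variables, system properties, profiles, etc.
        return DefaultCredentialsProvider.create();
    }
}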

3.4 HuggingFace

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-hugging-face")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-hugging-face</artifactId>
</dependency>

Then add the necessary configuration.

Example Configuration

Properties:
langchain4j.hugging-face.access-token=YOUR_ACCESS_TOKEN

YAML:
langchain4j.hugging-face.access-token: YOUR_ACCESS_TOKEN

TOML:
"langchain4j.hugging-face.access-token"="YOUR_ACCESS_TOKEN"

Groovy:
langchain4j.huggingFace.accessToken = "YOUR_ACCESS_TOKEN"

HOCON:
{
  "langchain4j.hugging-face.access-token" = "YOUR_ACCESS_TOKEN"
}

JSON:
{
  "langchain4j.hugging-face.access-token": "YOUR_ACCESS_TOKEN"
}

3.5 MistralAi

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-mistralai")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-mistralai</artifactId>
</dependency>

Then add the necessary configuration.

Example Configuration

Properties:
langchain4j.mistral-ai.api-key=YOUR_KEY

YAML:
langchain4j.mistral-ai.api-key: YOUR_KEY

TOML:
"langchain4j.mistral-ai.api-key"="YOUR_KEY"

Groovy:
langchain4j.mistralAi.apiKey = "YOUR_KEY"

HOCON:
{
  "langchain4j.mistral-ai.api-key" = "YOUR_KEY"
}

JSON:
{
  "langchain4j.mistral-ai.api-key": "YOUR_KEY"
}

3.6 Ollama

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-ollama")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-ollama</artifactId>
</dependency>

Then add the necessary configuration.

Example Configuration

Properties:
langchain4j.ollama.base-url=YOUR_URL

YAML:
langchain4j.ollama.base-url: YOUR_URL

TOML:
"langchain4j.ollama.base-url"="YOUR_URL"

Groovy:
langchain4j.ollama.baseUrl = "YOUR_URL"

HOCON:
{
  "langchain4j.ollama.base-url" = "YOUR_URL"
}

JSON:
{
  "langchain4j.ollama.base-url": "YOUR_URL"
}

3.7 OpenAi

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-openai")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-openai</artifactId>
</dependency>

Then add the necessary configuration.

Example Configuration

Properties:
langchain4j.open-ai.api-key=YOUR_KEY

YAML:
langchain4j.open-ai.api-key: YOUR_KEY

TOML:
"langchain4j.open-ai.api-key"="YOUR_KEY"

Groovy:
langchain4j.openAi.apiKey = "YOUR_KEY"

HOCON:
{
  "langchain4j.open-ai.api-key" = "YOUR_KEY"
}

JSON:
{
  "langchain4j.open-ai.api-key": "YOUR_KEY"
}

3.8 Google AI Gemini

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-googleai-gemini")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-googleai-gemini</artifactId>
</dependency>

Then add the necessary configuration.

Example Configuration

Properties:
langchain4j.google-ai-gemini.api-key=YOUR_API_KEY

YAML:
langchain4j.google-ai-gemini.api-key: YOUR_API_KEY

TOML:
"langchain4j.google-ai-gemini.api-key"="YOUR_API_KEY"

Groovy:
langchain4j.googleAiGemini.apiKey = "YOUR_API_KEY"

HOCON:
{
  "langchain4j.google-ai-gemini.api-key" = "YOUR_API_KEY"
}

JSON:
{
  "langchain4j.google-ai-gemini.api-key": "YOUR_API_KEY"
}

3.9 VertexAi

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-vertexai")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-vertexai</artifactId>
</dependency>

Then add the necessary configuration.

Example Configuration

Properties:
langchain4j.vertex-ai.endpoint=YOUR_ENDPOINT
langchain4j.vertex-ai.model-name=YOUR_MODEL
langchain4j.vertex-ai.project=YOUR_PROJECT
langchain4j.vertex-ai.location=YOUR_LOCATION
langchain4j.vertex-ai.publisher=YOUR_PUBLISHER

YAML:
langchain4j.vertex-ai.endpoint: YOUR_ENDPOINT
langchain4j.vertex-ai.model-name: YOUR_MODEL
langchain4j.vertex-ai.project: YOUR_PROJECT
langchain4j.vertex-ai.location: YOUR_LOCATION
langchain4j.vertex-ai.publisher: YOUR_PUBLISHER

TOML:
"langchain4j.vertex-ai.endpoint"="YOUR_ENDPOINT"
"langchain4j.vertex-ai.model-name"="YOUR_MODEL"
"langchain4j.vertex-ai.project"="YOUR_PROJECT"
"langchain4j.vertex-ai.location"="YOUR_LOCATION"
"langchain4j.vertex-ai.publisher"="YOUR_PUBLISHER"

Groovy:
langchain4j.vertexAi.endpoint = "YOUR_ENDPOINT"
langchain4j.vertexAi.modelName = "YOUR_MODEL"
langchain4j.vertexAi.project = "YOUR_PROJECT"
langchain4j.vertexAi.location = "YOUR_LOCATION"
langchain4j.vertexAi.publisher = "YOUR_PUBLISHER"

HOCON:
{
  "langchain4j.vertex-ai.endpoint" = "YOUR_ENDPOINT"
  "langchain4j.vertex-ai.model-name" = "YOUR_MODEL"
  "langchain4j.vertex-ai.project" = "YOUR_PROJECT"
  "langchain4j.vertex-ai.location" = "YOUR_LOCATION"
  "langchain4j.vertex-ai.publisher" = "YOUR_PUBLISHER"
}

JSON:
{
  "langchain4j.vertex-ai.endpoint": "YOUR_ENDPOINT",
  "langchain4j.vertex-ai.model-name": "YOUR_MODEL",
  "langchain4j.vertex-ai.project": "YOUR_PROJECT",
  "langchain4j.vertex-ai.location": "YOUR_LOCATION",
  "langchain4j.vertex-ai.publisher": "YOUR_PUBLISHER"
}

3.10 VertexAi Gemini

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-vertexai-gemini")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-vertexai-gemini</artifactId>
</dependency>

Then add the necessary configuration.

Example Configuration

Properties:
langchain4j.vertex-ai-gemini.model-name=YOUR_MODEL
langchain4j.vertex-ai-gemini.project=YOUR_PROJECT
langchain4j.vertex-ai-gemini.location=YOUR_LOCATION

YAML:
langchain4j.vertex-ai-gemini.model-name: YOUR_MODEL
langchain4j.vertex-ai-gemini.project: YOUR_PROJECT
langchain4j.vertex-ai-gemini.location: YOUR_LOCATION

TOML:
"langchain4j.vertex-ai-gemini.model-name"="YOUR_MODEL"
"langchain4j.vertex-ai-gemini.project"="YOUR_PROJECT"
"langchain4j.vertex-ai-gemini.location"="YOUR_LOCATION"

Groovy:
langchain4j.vertexAiGemini.modelName = "YOUR_MODEL"
langchain4j.vertexAiGemini.project = "YOUR_PROJECT"
langchain4j.vertexAiGemini.location = "YOUR_LOCATION"

HOCON:
{
  "langchain4j.vertex-ai-gemini.model-name" = "YOUR_MODEL"
  "langchain4j.vertex-ai-gemini.project" = "YOUR_PROJECT"
  "langchain4j.vertex-ai-gemini.location" = "YOUR_LOCATION"
}

JSON:
{
  "langchain4j.vertex-ai-gemini.model-name": "YOUR_MODEL",
  "langchain4j.vertex-ai-gemini.project": "YOUR_PROJECT",
  "langchain4j.vertex-ai-gemini.location": "YOUR_LOCATION"
}

4 Embedding Stores
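
The modules in this section configure LangChain4j EmbeddingStore beans based on the configuration shown. As an illustration, the following sketch uses an injected store to index text; it assumes an EmbeddingModel bean is also available (for example from one of the model integrations above), and the class name is illustrative:

Using an injected EmbeddingStore
package example;

import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.store.embedding.EmbeddingStore;
import jakarta.inject.Singleton;

@Singleton // hypothetical example class, not part of the module
public class DocumentIndexer {

    private final EmbeddingStore<TextSegment> embeddingStore;
    private final EmbeddingModel embeddingModel; // assumed to be provided separately

    DocumentIndexer(EmbeddingStore<TextSegment> embeddingStore, EmbeddingModel embeddingModel) {
        this.embeddingStore = embeddingStore;
        this.embeddingModel = embeddingModel;
    }

    public void index(String text) {
        TextSegment segment = TextSegment.from(text);
        // embed(...) returns a Response<Embedding>; content() unwraps the embedding vector
        Embedding embedding = embeddingModel.embed(segment).content();
        embeddingStore.add(embedding, segment);
    }
}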

4.1 Elasticsearch

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-elasticsearch")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-store-elasticsearch</artifactId>
</dependency>

Example Configuration

Properties:
elasticsearch.httpHosts=http://localhost:9200,http://127.0.0.2:9200
langchain4j.elasticsearch.embedding-stores.default.dimension=384

YAML:
elasticsearch.httpHosts: "http://localhost:9200,http://127.0.0.2:9200"
langchain4j.elasticsearch.embedding-stores.default.dimension: 384

TOML:
"elasticsearch.httpHosts"="http://localhost:9200,http://127.0.0.2:9200"
"langchain4j.elasticsearch.embedding-stores.default.dimension"=384

Groovy:
elasticsearch.httpHosts = "http://localhost:9200,http://127.0.0.2:9200"
langchain4j.elasticsearch.embeddingStores.default.dimension = 384

HOCON:
{
  "elasticsearch.httpHosts" = "http://localhost:9200,http://127.0.0.2:9200"
  "langchain4j.elasticsearch.embedding-stores.default.dimension" = 384
}

JSON:
{
  "elasticsearch.httpHosts": "http://localhost:9200,http://127.0.0.2:9200",
  "langchain4j.elasticsearch.embedding-stores.default.dimension": 384
}

4.2 MongoDB

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-mongodb-atlas")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-store-mongodb-atlas</artifactId>
</dependency>

Configuring a MongoDB server (YAML):
mongodb.servers.default.uri: mongodb://username:password@localhost:27017/databaseName

Example Configuration

Properties:
langchain4j.mongodb-atlas.embedding-stores.default.database-name=testdb
langchain4j.mongodb-atlas.embedding-stores.default.collection-name=testcol
langchain4j.mongodb-atlas.embedding-stores.default.index-name=testindex

YAML:
langchain4j.mongodb-atlas.embedding-stores.default.database-name: testdb
langchain4j.mongodb-atlas.embedding-stores.default.collection-name: testcol
langchain4j.mongodb-atlas.embedding-stores.default.index-name: testindex

TOML:
"langchain4j.mongodb-atlas.embedding-stores.default.database-name"="testdb"
"langchain4j.mongodb-atlas.embedding-stores.default.collection-name"="testcol"
"langchain4j.mongodb-atlas.embedding-stores.default.index-name"="testindex"

Groovy:
langchain4j.mongodbAtlas.embeddingStores.default.databaseName = "testdb"
langchain4j.mongodbAtlas.embeddingStores.default.collectionName = "testcol"
langchain4j.mongodbAtlas.embeddingStores.default.indexName = "testindex"

HOCON:
{
  "langchain4j.mongodb-atlas.embedding-stores.default.database-name" = "testdb"
  "langchain4j.mongodb-atlas.embedding-stores.default.collection-name" = "testcol"
  "langchain4j.mongodb-atlas.embedding-stores.default.index-name" = "testindex"
}

JSON:
{
  "langchain4j.mongodb-atlas.embedding-stores.default.database-name": "testdb",
  "langchain4j.mongodb-atlas.embedding-stores.default.collection-name": "testcol",
  "langchain4j.mongodb-atlas.embedding-stores.default.index-name": "testindex"
}

4.3 Neo4j

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-neo4j")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-store-neo4j</artifactId>
</dependency>

Example Configuration

Properties:
neo4j.uri=bolt://localhost
langchain4j.neo4j.embedding-stores.default.dimension=384

YAML:
neo4j.uri: bolt://localhost
langchain4j.neo4j.embedding-stores.default.dimension: 384

TOML:
"neo4j.uri"="bolt://localhost"
"langchain4j.neo4j.embedding-stores.default.dimension"=384

Groovy:
neo4j.uri = "bolt://localhost"
langchain4j.neo4j.embeddingStores.default.dimension = 384

HOCON:
{
  "neo4j.uri" = "bolt://localhost"
  "langchain4j.neo4j.embedding-stores.default.dimension" = 384
}

JSON:
{
  "neo4j.uri": "bolt://localhost",
  "langchain4j.neo4j.embedding-stores.default.dimension": 384
}

4.4 Oracle

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-oracle")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-store-oracle</artifactId>
</dependency>

Then add one of the supported JDBC connection pools, for example Hikari:

Gradle:
runtimeOnly("io.micronaut.sql:micronaut-jdbc-hikari")

Maven:
<dependency>
    <groupId>io.micronaut.sql</groupId>
    <artifactId>micronaut-jdbc-hikari</artifactId>
    <scope>runtime</scope>
</dependency>

Example Configuration

Properties:
datasources.default.dialect=oracle
langchain4j.oracle.embedding-stores.default.table=test
langchain4j.oracle.embedding-stores.default.table.create-option=create_if_not_exists

YAML:
datasources.default.dialect: oracle
langchain4j.oracle.embedding-stores.default.table: test
langchain4j.oracle.embedding-stores.default.table.create-option: create_if_not_exists

TOML:
"datasources.default.dialect"="oracle"
"langchain4j.oracle.embedding-stores.default.table"="test"
"langchain4j.oracle.embedding-stores.default.table.create-option"="create_if_not_exists"

Groovy:
datasources.default.dialect = "oracle"
langchain4j.oracle.embeddingStores.default.table = "test"
langchain4j.oracle.embeddingStores.default.table.createOption = "create_if_not_exists"

HOCON:
{
  "datasources.default.dialect" = "oracle"
  "langchain4j.oracle.embedding-stores.default.table" = "test"
  "langchain4j.oracle.embedding-stores.default.table.create-option" = "create_if_not_exists"
}

JSON:
{
  "datasources.default.dialect": "oracle",
  "langchain4j.oracle.embedding-stores.default.table": "test",
  "langchain4j.oracle.embedding-stores.default.table.create-option": "create_if_not_exists"
}

4.5 OpenSearch

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-opensearch")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-store-opensearch</artifactId>
</dependency>

Example Configuration

Properties:
micronaut.opensearch.rest-client.http-hosts=http://localhost:9200,http://127.0.0.2:9200
langchain4j.opensearch.embedding-stores.default.dimension=384

YAML:
micronaut.opensearch.rest-client.http-hosts: "http://localhost:9200,http://127.0.0.2:9200"
langchain4j.opensearch.embedding-stores.default.dimension: 384

TOML:
"micronaut.opensearch.rest-client.http-hosts"="http://localhost:9200,http://127.0.0.2:9200"
"langchain4j.opensearch.embedding-stores.default.dimension"=384

Groovy:
micronaut.opensearch.restClient.httpHosts = "http://localhost:9200,http://127.0.0.2:9200"
langchain4j.opensearch.embeddingStores.default.dimension = 384

HOCON:
{
  "micronaut.opensearch.rest-client.http-hosts" = "http://localhost:9200,http://127.0.0.2:9200"
  "langchain4j.opensearch.embedding-stores.default.dimension" = 384
}

JSON:
{
  "micronaut.opensearch.rest-client.http-hosts": "http://localhost:9200,http://127.0.0.2:9200",
  "langchain4j.opensearch.embedding-stores.default.dimension": 384
}

4.6 PGVector

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-pgvector")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-store-pgvector</artifactId>
</dependency>

Then add one of the supported JDBC connection pools, for example Hikari:

Gradle:
runtimeOnly("io.micronaut.sql:micronaut-jdbc-hikari")

Maven:
<dependency>
    <groupId>io.micronaut.sql</groupId>
    <artifactId>micronaut-jdbc-hikari</artifactId>
    <scope>runtime</scope>
</dependency>

Example Configuration

Properties:
datasources.default.dialect=postgres
langchain4j.pgvector.embedding-stores.default.table=mytable
langchain4j.pgvector.embedding-stores.default.dimension=384
test-resources.containers.postgres.image-name=pgvector/pgvector:pg16

YAML:
datasources.default.dialect: postgres
langchain4j.pgvector.embedding-stores.default.table: "mytable"
langchain4j.pgvector.embedding-stores.default.dimension: 384

# Add this if you plan to use Test Resources
test-resources.containers.postgres.image-name: pgvector/pgvector:pg16

TOML:
"datasources.default.dialect"="postgres"
"langchain4j.pgvector.embedding-stores.default.table"="mytable"
"langchain4j.pgvector.embedding-stores.default.dimension"=384
"test-resources.containers.postgres.image-name"="pgvector/pgvector:pg16"

Groovy:
datasources.default.dialect = "postgres"
langchain4j.pgvector.embeddingStores.default.table = "mytable"
langchain4j.pgvector.embeddingStores.default.dimension = 384
testResources.containers.postgres.imageName = "pgvector/pgvector:pg16"

HOCON:
{
  "datasources.default.dialect" = "postgres"
  "langchain4j.pgvector.embedding-stores.default.table" = "mytable"
  "langchain4j.pgvector.embedding-stores.default.dimension" = 384
  "test-resources.containers.postgres.image-name" = "pgvector/pgvector:pg16"
}

JSON:
{
  "datasources.default.dialect": "postgres",
  "langchain4j.pgvector.embedding-stores.default.table": "mytable",
  "langchain4j.pgvector.embedding-stores.default.dimension": 384,
  "test-resources.containers.postgres.image-name": "pgvector/pgvector:pg16"
}

4.7 Redis

TODO

4.8 Qdrant

Add the following dependency:

Gradle:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-qdrant")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-store-qdrant</artifactId>
</dependency>

To use Testcontainers & Test Resources, add the following dependency:

Gradle:
testResourcesService("io.micronaut.langchain4j:micronaut-langchain4j-qdrant-testresource")

Maven:
<dependency>
    <groupId>io.micronaut.langchain4j</groupId>
    <artifactId>micronaut-langchain4j-qdrant-testresource</artifactId>
    <scope>testResourcesService</scope>
</dependency>

Example Configuration

Properties:
langchain4j.qdrant.embedding-store.host=localhost
langchain4j.qdrant.embedding-store.port=6334
langchain4j.qdrant.embedding-store.collection-name=mycollection

YAML:
# Omit the following 2 properties if you use Test Resources
langchain4j.qdrant.embedding-store.host: localhost
langchain4j.qdrant.embedding-store.port: 6334

# Minimal configuration required for Test Resources
langchain4j.qdrant.embedding-store.collection-name: mycollection

TOML:
"langchain4j.qdrant.embedding-store.host"="localhost"
"langchain4j.qdrant.embedding-store.port"=6334
"langchain4j.qdrant.embedding-store.collection-name"="mycollection"

Groovy:
langchain4j.qdrant.embeddingStore.host = "localhost"
langchain4j.qdrant.embeddingStore.port = 6334
langchain4j.qdrant.embeddingStore.collectionName = "mycollection"

HOCON:
{
  "langchain4j.qdrant.embedding-store.host" = "localhost"
  "langchain4j.qdrant.embedding-store.port" = 6334
  "langchain4j.qdrant.embedding-store.collection-name" = "mycollection"
}

JSON:
{
  "langchain4j.qdrant.embedding-store.host": "localhost",
  "langchain4j.qdrant.embedding-store.port": 6334,
  "langchain4j.qdrant.embedding-store.collection-name": "mycollection"
}

5 Repository

You can find the source code of this project in this repository:

6 Release History

For this project, you can find a list of releases (with release notes) here: