Micronaut LangChain4j
Integration between Micronaut and Langchain4j
Version: 1.2.0-SNAPSHOT
1 Introduction
This module provides integration between Micronaut and Langchain4j.
This module is regarded as experimental and subject to change, since the underlying technology (AI) is still evolving rapidly.
Various modules are provided that automatically configure common Langchain4j types such as ChatModel, ImageModel, etc. Refer to the sections below for the supported Langchain4j extensions.
2 Quick Start
Add the following annotation processor dependency:
annotationProcessor("io.micronaut.langchain4j:micronaut-langchain4j-processor")
<annotationProcessorPaths>
<path>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-processor</artifactId>
</path>
</annotationProcessorPaths>
Then the core module:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-core")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-core</artifactId>
</dependency>
You are now ready to configure one of the Chat Language Models. For this quick start we will use Ollama:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-ollama")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-ollama</artifactId>
</dependency>
To test the integration, add the Test Resources integration to your Maven or Gradle build.
testResourcesService("io.micronaut.langchain4j:micronaut-langchain4j-ollama-testresource")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-ollama-testresource</artifactId>
<scope>testResourcesService</scope>
</dependency>
Add the necessary configuration to configure the model name you want to use:
langchain4j.ollama.model-name=orca-mini
langchain4j.ollama.model-name: orca-mini
"langchain4j.ollama.model-name"="orca-mini"
langchain4j.ollama.modelName = "orca-mini"
{
"langchain4j.ollama.model-name" = "orca-mini"
}
{
"langchain4j.ollama.model-name": "orca-mini"
}
3 AI Service
You can also define new AI services:
package example.micronaut.aiservice;
import dev.langchain4j.service.SystemMessage;
import io.micronaut.langchain4j.annotation.AiService;
@AiService // (1)
public interface Friend {
@SystemMessage("You are a good friend of mine. Answer using slang.") // (2)
String chat(String userMessage);
}
package example.micronaut.aiservice
import dev.langchain4j.service.SystemMessage
import io.micronaut.langchain4j.annotation.AiService
@AiService // (1)
interface Friend {
@SystemMessage("You are a good friend of mine. Answer using slang.") // (2)
String chat(String userMessage)
}
package example.micronaut.aiservice
import dev.langchain4j.service.SystemMessage
import io.micronaut.langchain4j.annotation.AiService
@AiService // (1)
interface Friend {
@SystemMessage("You are a good friend of mine. Answer using slang.") // (2)
fun chat(userMessage: String): String
}
1 | Define an interface annotated with @AiService |
2 | Use Langchain4j annotations like @SystemMessage |
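To build intuition for what the annotation does, the following is a simplified, framework-free sketch of the idea behind an @AiService: a dynamic proxy intercepts the interface call, prepends the system message, and forwards the combined prompt to a model (stubbed here as a plain function). The actual Micronaut/LangChain4j implementation is more sophisticated; this is only an illustration.

```java
import java.lang.reflect.Proxy;
import java.util.function.Function;

public class AiServiceSketch {

    // Mirrors the Friend interface from the example above.
    interface Friend {
        String chat(String userMessage);
    }

    // Builds a proxy that prepends the system prompt before delegating
    // to a model, stubbed as a String -> String function.
    static Friend friend(Function<String, String> model, String systemPrompt) {
        return (Friend) Proxy.newProxyInstance(
                Friend.class.getClassLoader(),
                new Class<?>[]{Friend.class},
                (proxy, method, args) -> model.apply(systemPrompt + "\n" + args[0]));
    }

    public static void main(String[] args) {
        // A stub model that echoes its prompt, so the wiring is visible.
        Friend friend = friend(prompt -> "echo: " + prompt, "Answer using slang.");
        System.out.println(friend.chat("Hello"));
    }
}
```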
You can now inject the @AiService definition into any Micronaut component, including tests:
package example.micronaut.aiservice;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import dev.langchain4j.model.chat.ChatModel;
import io.micronaut.langchain4j.testutils.OllamaTestPropertyProvider;
import io.micronaut.test.extensions.junit5.annotation.MicronautTest;
import org.junit.jupiter.api.Test;
import org.junit.jupiter.api.TestInstance;
import org.testcontainers.junit.jupiter.Testcontainers;
@Testcontainers(disabledWithoutDocker = true)
@MicronautTest(startApplication = false)
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
class AiServiceTest implements OllamaTestPropertyProvider {
@Test
void testAiService(Friend friend, ChatModel languageModel) {
String result = friend.chat("Hello");
assertNotNull(result);
assertNotNull(languageModel);
}
}
package example.micronaut.aiservice
import dev.langchain4j.model.chat.ChatModel
import io.micronaut.langchain4j.testutils.OllamaTestPropertyProvider
import io.micronaut.test.extensions.junit5.annotation.MicronautTest
import org.junit.jupiter.api.Test
import org.junit.jupiter.api.TestInstance
import org.testcontainers.junit.jupiter.Testcontainers
import static org.junit.jupiter.api.Assertions.assertNotNull
@Testcontainers(disabledWithoutDocker = true)
@MicronautTest(startApplication = false)
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
class AiServiceTest implements OllamaTestPropertyProvider {
@Test
void testAiService(Friend friend, ChatModel languageModel) {
String result = friend.chat("Hello")
assertNotNull(result)
assertNotNull(languageModel)
}
}
package example.micronaut.aiservice
import dev.langchain4j.model.chat.ChatModel
import io.micronaut.langchain4j.testutils.OllamaTestPropertyProvider
import io.micronaut.test.extensions.junit5.annotation.MicronautTest
import org.junit.jupiter.api.Assertions
import org.junit.jupiter.api.Test
import org.junit.jupiter.api.TestInstance
import org.testcontainers.junit.jupiter.Testcontainers
@Testcontainers(disabledWithoutDocker = true)
@MicronautTest(startApplication = false)
@TestInstance(TestInstance.Lifecycle.PER_CLASS)
internal class AiServiceTest : OllamaTestPropertyProvider {
@Test
fun testAiService(friend: Friend, languageModel: ChatModel) {
val result: String = friend.chat("Hello")
Assertions.assertNotNull(result)
Assertions.assertNotNull(languageModel)
}
}
4 Tools
Tools allow AI models to request specific actions that extend beyond their built-in capabilities.
package example.micronaut.aiservice.tools;
import dev.langchain4j.agent.tool.Tool;
import jakarta.inject.Singleton;
import java.time.LocalDate;
@Singleton // (1)
public class LegalDocumentTools {
@Tool("Returns the last time the PRIVACY document was updated") // (2)
public LocalDate lastUpdatePrivacy() {
return LocalDate.of(2013, 3, 9); // (3)
}
}
package example.micronaut.aiservice.tools
import dev.langchain4j.agent.tool.Tool
import groovy.transform.CompileStatic
import jakarta.inject.Singleton
import java.time.LocalDate
@Singleton // (1)
@CompileStatic
class LegalDocumentTools {
@Tool("Returns the last time the PRIVACY document was updated") // (2)
LocalDate lastUpdatePrivacy() {
return LocalDate.of(2013, 3, 9) // (3)
}
}
package example.micronaut.aiservice.tools
import dev.langchain4j.agent.tool.Tool
import jakarta.inject.Singleton
import java.time.LocalDate
@Singleton // (1)
class LegalDocumentTools {
@Tool("Returns the last time the PRIVACY document was updated") // (2)
fun lastUpdatePrivacy(): LocalDate = LocalDate.of(2013, 3, 9) // (3)
}
1 | Annotate classes that contain @Tool methods with @Singleton . |
2 | The @Tool value specifies the description of the tool. The lastUpdatePrivacy() method returns the date when the company PRIVACY document was last updated.
A model has no way to know that date, which makes it a good candidate for a tool. |
3 | The dates are hardcoded for the purposes of this example, but they could have been retrieved from a database or an external API. |
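Conceptually, tool execution is a dispatch step: the model replies with a request naming a tool, the framework invokes the matching method, and the result is sent back to the model. The following framework-free sketch illustrates that loop with a hand-built registry; in the real integration, LangChain4j derives the registry from the @Tool-annotated methods, so nothing here reflects its actual API.

```java
import java.time.LocalDate;
import java.util.Map;
import java.util.function.Supplier;

public class ToolDispatchSketch {

    // Hand-built registry of tool name -> action. LangChain4j builds the
    // equivalent automatically from @Tool methods and their descriptions.
    static final Map<String, Supplier<Object>> TOOLS = Map.of(
            "lastUpdatePrivacy", () -> LocalDate.of(2013, 3, 9));

    // Invoked when the model's response requests a tool execution.
    static Object invoke(String toolName) {
        Supplier<Object> tool = TOOLS.get(toolName);
        if (tool == null) {
            throw new IllegalArgumentException("Unknown tool: " + toolName);
        }
        return tool.get();
    }

    public static void main(String[] args) {
        // The tool result would be fed back to the model as a new message.
        System.out.println(invoke("lastUpdatePrivacy")); // 2013-03-09
    }
}
```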
You can supply the tools to use to an @AiService:
package example.micronaut.aiservice.tools;
import io.micronaut.langchain4j.annotation.AiService;
@AiService(tools = LegalDocumentTools.class)
public interface CompanyBot {
String ask(String question);
}
package example.micronaut.aiservice.tools
import io.micronaut.langchain4j.annotation.AiService
@AiService(tools = LegalDocumentTools.class)
interface CompanyBot {
String ask(String question)
}
package example.micronaut.aiservice.tools
import io.micronaut.langchain4j.annotation.AiService
@AiService(tools = [LegalDocumentTools::class])
interface CompanyBot {
fun ask(question: String): String
}
5 Chat Language Models
The following modules provide integration with Langchain4j Language Models.
Each module configures one or more ChatModel beans, making them available for dependency injection based on configuration.
5.1 ChatModel Example
This example asks a chat model to generate the list of the top 3 albums of a Jazz musician.
package example.micronaut;
import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.SystemMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import jakarta.inject.Singleton;
import java.util.List;
@Singleton
public class MusicianAssistant {
private static final SystemMessage SYSTEM_MSG = SystemMessage.from("""
You are an expert in Jazz music.
Reply with only the names of the artists, albums, etc.
Be very concise.
If a list is given, separate the items with commas.""");
private final ChatModel model;
public MusicianAssistant(ChatModel model) { // (1)
this.model = model;
}
public Musician generateTopThreeAlbums(String name) {
List<ChatMessage> messages = generateTopThreeAlbumsMessages(name);
ChatResponse albums = model.chat(messages);
String topThreeAlbums = albums.aiMessage().text();
return new Musician(name, topThreeAlbums);
}
private static List<ChatMessage> generateTopThreeAlbumsMessages(String name) {
return List.of(SYSTEM_MSG, UserMessage.from(
String.format("Only list the top 3 albums of %s", name)
));
}
}
package example.micronaut
import dev.langchain4j.data.message.ChatMessage
import dev.langchain4j.data.message.SystemMessage
import dev.langchain4j.data.message.UserMessage
import dev.langchain4j.model.chat.ChatModel
import dev.langchain4j.model.chat.response.ChatResponse
import jakarta.inject.Singleton
import groovy.transform.CompileStatic
@CompileStatic
@Singleton
class MusicianAssistant {
private static final SystemMessage SYSTEM_MSG = SystemMessage.from("""
You are an expert in Jazz music.
Reply with only the names of the artists, albums, etc.
Be very concise.
If a list is given, separate the items with commas.""")
private final ChatModel model
MusicianAssistant(ChatModel model) { // (1)
this.model = model
}
Musician generateTopThreeAlbums(String name) {
List<ChatMessage> messages = generateTopThreeAlbumsMessages(name)
ChatResponse albums = model.chat(messages)
String topThreeAlbums = albums.aiMessage().text()
new Musician(name: name, albums: topThreeAlbums)
}
private static List<ChatMessage> generateTopThreeAlbumsMessages(String name) {
[
SYSTEM_MSG,
UserMessage.from(String.format("Only list the top 3 albums of %s", name))
]
}
}
package example.micronaut
import dev.langchain4j.data.message.ChatMessage
import dev.langchain4j.data.message.SystemMessage
import dev.langchain4j.data.message.UserMessage
import dev.langchain4j.model.chat.ChatModel
import jakarta.inject.Singleton
@Singleton
class MusicianAssistant(private val model: ChatModel) { // (1)
fun generateTopThreeAlbums(name: String): Musician {
val messages = generateTopThreeAlbumsMessages(name)
val albums = model.chat(messages)
val topThreeAlbums = albums.aiMessage().text()
return Musician(name, topThreeAlbums)
}
private fun generateTopThreeAlbumsMessages(name: String): List<ChatMessage> {
return listOf(
SYSTEM_MSG, UserMessage.from(
String.format("Only list the top 3 albums of %s", name)
)
)
}
companion object {
private val SYSTEM_MSG: SystemMessage = SystemMessage.from(
"""
You are an expert in Jazz music.
Reply with only the names of the artists, albums, etc.
Be very concise.
If a list is given, separate the items with commas.
""".trimIndent()
)
}
}
1 | Inject via constructor injection a bean of type ChatModel . |
package example.micronaut;
public record Musician(String name, String albums) {
}
package example.micronaut
import groovy.transform.CompileStatic
@CompileStatic
class Musician {
String name
String albums
}
package example.micronaut
data class Musician(val name: String, val albums: String)
You can configure the chat model via configuration properties. For example, you may want to configure OpenAI on the main classpath:
micronaut.application.name=micronaut-guide
langchain4j.open-ai.chat-model.log-requests=true
langchain4j.open-ai.chat-model.log-responses=true
langchain4j.open-ai.chat-model.timeout=60s
langchain4j.open-ai.chat-model.temperature=0.3
langchain4j.open-ai.chat-model.model-name=gpt-4.1
And a local SLM (Small Language Model), such as Ollama, on the test classpath:
langchain4j.open-ai.enabled=false
langchain4j.ollama.model-name=tinyllama
langchain4j.ollama.chat-model.timeout=5m
langchain4j.ollama.chat-model.log-requests=true
langchain4j.ollama.chat-model.log-responses=true
Moreover, you can register a bean of type BeanCreatedEventListener to configure the chat model builder programmatically when configuration properties are not enough.
package example.micronaut;
import dev.langchain4j.model.ollama.OllamaChatModel;
import io.micronaut.context.event.BeanCreatedEvent;
import io.micronaut.context.event.BeanCreatedEventListener;
import io.micronaut.core.annotation.NonNull;
import jakarta.inject.Singleton;
@Singleton
class OllamaChatModelBuilderListener
implements BeanCreatedEventListener<OllamaChatModel.OllamaChatModelBuilder> {
@Override
public OllamaChatModel.OllamaChatModelBuilder onCreated(
@NonNull BeanCreatedEvent<OllamaChatModel.OllamaChatModelBuilder> event) {
OllamaChatModel.OllamaChatModelBuilder builder = event.getBean();
builder.temperature(0.0);
return builder;
}
}
package example.micronaut
import dev.langchain4j.model.ollama.OllamaChatModel
import io.micronaut.context.event.BeanCreatedEvent
import io.micronaut.context.event.BeanCreatedEventListener
import io.micronaut.core.annotation.NonNull
import jakarta.inject.Singleton
@Singleton
class OllamaChatModelBuilderListener
implements BeanCreatedEventListener<OllamaChatModel.OllamaChatModelBuilder> {
@Override
OllamaChatModel.OllamaChatModelBuilder onCreated(
@NonNull BeanCreatedEvent<OllamaChatModel.OllamaChatModelBuilder> event) {
OllamaChatModel.OllamaChatModelBuilder builder = event.getBean()
builder.temperature(0.0)
builder
}
}
package example.micronaut
import dev.langchain4j.model.ollama.OllamaChatModel.OllamaChatModelBuilder
import io.micronaut.context.event.BeanCreatedEvent
import io.micronaut.context.event.BeanCreatedEventListener
import io.micronaut.core.annotation.NonNull
import jakarta.inject.Singleton
@Singleton
class OllamaChatModelBuilderListener : BeanCreatedEventListener<OllamaChatModelBuilder> {
override fun onCreated(event: @NonNull BeanCreatedEvent<OllamaChatModelBuilder>): OllamaChatModelBuilder {
val builder = event.bean
builder.temperature(0.0)
return builder
}
}
5.2 Chat Memory
Models are stateless by design. Chat memory serves as a container for previous messages, helping you maintain context in a conversation. The model itself is not aware of this memory; it relies on you to include the relevant messages in each request to get coherent and contextually relevant responses.
Langchain4j provides the ChatMemory API to help you manage chat memory. You can provide your own implementation or use one of the provided implementations.
The default store, dev.langchain4j.store.memory.chat.InMemoryChatMemoryStore, keeps ChatMessage instances in memory.
To use the Redis implementation, dev.langchain4j.community.store.memory.chat.redis.RedisChatMemoryStore, add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-redis")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-store-redis</artifactId>
</dependency>
To use the Neo4j implementation, dev.langchain4j.community.store.memory.chat.neo4j.Neo4jChatMemoryStore, add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-neo4j")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-store-neo4j</artifactId>
</dependency>
To use the Cassandra implementation, dev.langchain4j.store.memory.chat.cassandra.CassandraChatMemoryStore, add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-cassandra")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-cassandra</artifactId>
</dependency>
The following example shows how to use the ChatMemory:
package example.micronaut;
import dev.langchain4j.data.message.AiMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.memory.ChatMemory;
import dev.langchain4j.memory.chat.MessageWindowChatMemory;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.response.ChatResponse;
import jakarta.inject.Singleton;
import java.util.Map;
import java.util.UUID;
import java.util.concurrent.ConcurrentHashMap;
@Singleton
public class AssistantWithMemory {
private final Map<String, ChatMemory> conversations = new ConcurrentHashMap<>();
private final ChatModel model;
private final MessageWindowChatMemory.Builder messageWindowChatMemoryBuilder;
public AssistantWithMemory(MessageWindowChatMemory.Builder messageWindowChatMemoryBuilder,
ChatModel model) {
this.messageWindowChatMemoryBuilder = messageWindowChatMemoryBuilder;
this.model = model;
}
public MemoryIdAndResponse chat(String conversationId, String message) {
ChatMemory chatMemory = conversations.get(conversationId);
if (chatMemory == null) {
throw new IllegalArgumentException("Unknown conversation: " + conversationId);
}
chatMemory.add(UserMessage.from(message));
ChatResponse chatResponse = model.chat(chatMemory.messages());
AiMessage answer = chatResponse.aiMessage();
chatMemory.add(answer);
return new MemoryIdAndResponse(conversationId, answer.text());
}
public MemoryIdAndResponse chat(String message) {
String conversationId = startConversation();
return chat(conversationId, message);
}
private String startConversation() {
String memoryId = generateChatMemoryId();
ChatMemory chatMemory = generateChatMemory(memoryId);
conversations.putIfAbsent(memoryId, chatMemory);
return memoryId;
}
private String generateChatMemoryId() {
return UUID.randomUUID().toString();
}
private ChatMemory generateChatMemory(String memoryId) {
return messageWindowChatMemoryBuilder
.id(memoryId)
.build();
}
}
package example.micronaut
import dev.langchain4j.data.message.AiMessage
import dev.langchain4j.data.message.UserMessage
import dev.langchain4j.memory.ChatMemory
import dev.langchain4j.memory.chat.MessageWindowChatMemory
import dev.langchain4j.model.chat.ChatModel
import dev.langchain4j.model.chat.response.ChatResponse
import jakarta.inject.Singleton
import java.util.concurrent.ConcurrentHashMap
@Singleton
class AssistantWithMemory {
private final Map<String, ChatMemory> conversations = new ConcurrentHashMap<>()
private final MessageWindowChatMemory.Builder messageWindowChatMemoryBuilder
private final ChatModel model
AssistantWithMemory(MessageWindowChatMemory.Builder messageWindowChatMemoryBuilder,
ChatModel model) {
this.model = model
this.messageWindowChatMemoryBuilder = messageWindowChatMemoryBuilder
}
private String startConversation() {
String memoryId = generateChatMemoryId()
ChatMemory chatMemory = generateChatMemory(memoryId)
conversations.putIfAbsent(memoryId, chatMemory)
memoryId
}
private String generateChatMemoryId() {
UUID.randomUUID().toString()
}
private ChatMemory generateChatMemory(String memoryId) {
messageWindowChatMemoryBuilder
.id(memoryId)
.build()
}
MemoryIdAndResponse chat(String memoryId, String message) {
ChatMemory chatMemory = conversations.get(memoryId)
if (chatMemory == null) {
throw new IllegalArgumentException("Unknown conversation: " + memoryId)
}
chatMemory.add(UserMessage.from(message))
ChatResponse chatResponse = model.chat(chatMemory.messages())
AiMessage aiMessage = chatResponse.aiMessage()
chatMemory.add(aiMessage)
new MemoryIdAndResponse(memoryId: memoryId, response: aiMessage.text())
}
MemoryIdAndResponse chat(String message) {
String conversationId = startConversation()
chat(conversationId, message)
}
}
package example.micronaut
import dev.langchain4j.data.message.UserMessage
import dev.langchain4j.memory.ChatMemory
import dev.langchain4j.memory.chat.MessageWindowChatMemory
import dev.langchain4j.model.chat.ChatModel
import jakarta.inject.Singleton
import java.util.*
import java.util.concurrent.ConcurrentHashMap
@Singleton
class AssistantWithMemory(
val messageWindowChatMemoryBuilder: MessageWindowChatMemory.Builder,
val model: ChatModel) {
private val conversations: MutableMap<String, ChatMemory> = ConcurrentHashMap<String, ChatMemory>()
fun chat(conversationId: String, message: String): MemoryIdAndResponse {
val chatMemory = requireNotNull(this.conversations[conversationId]) {
"Unknown conversation: $conversationId"
}
chatMemory.add(UserMessage.from(message))
val chatResponse = model.chat(chatMemory.messages())
val aiMessage = chatResponse.aiMessage()
chatMemory.add(aiMessage)
return MemoryIdAndResponse(conversationId, aiMessage.text())
}
fun chat(message: String): MemoryIdAndResponse {
val conversationId = startConversation()
return chat(conversationId, message)
}
private fun startConversation(): String {
val memoryId = generateChatMemoryId()
val chatMemory = generateChatMemory(memoryId)
conversations.putIfAbsent(memoryId, chatMemory)
return memoryId
}
private fun generateChatMemoryId(): String {
return UUID.randomUUID().toString()
}
private fun generateChatMemory(memoryId: String): ChatMemory {
return messageWindowChatMemoryBuilder
.id(memoryId)
.build()
}
}
You could invoke the previous class as illustrated in the following test:
@Test
void chatWithMemory(AssistantWithMemory assistant) {
MemoryIdAndResponse johnConversation = assistant.chat("Let me introduce myself. My name is John");
String johnConversationId = johnConversation.memoryId();
assertNotNull(johnConversationId);
MemoryIdAndResponse danConversation = assistant.chat("Let me introduce myself. My name is Dan");
String danConversationId = danConversation.memoryId();
assertNotNull(danConversationId);
MemoryIdAndResponse answer = assistant.chat(johnConversationId, "What's my name?");
assertTrue(answer.response().toLowerCase().contains("john"), answer.response());
answer = assistant.chat(danConversationId, "What's my name?");
assertTrue(answer.response().toLowerCase().contains("dan"), answer.response());
}
@Test
void chatWithMemory(AssistantWithMemory assistant) {
MemoryIdAndResponse johnConversation = assistant.chat("Let me introduce myself. My name is John")
String johnConversationId = johnConversation.memoryId
assertNotNull(johnConversationId)
MemoryIdAndResponse danConversation = assistant.chat("Let me introduce myself. My name is Dan")
String danConversationId = danConversation.memoryId
assertNotNull(danConversationId)
MemoryIdAndResponse answer = assistant.chat(johnConversationId, "What's my name?")
assertTrue(answer.response.toLowerCase().contains("john"), answer.response)
answer = assistant.chat(danConversationId, "What's my name?")
assertTrue(answer.response.toLowerCase().contains("dan"), answer.response)
}
@Test
fun chatWithMemory(assistant: AssistantWithMemory) {
val johnConversation = assistant.chat("Let me introduce myself. My name is John")
val johnConversationId = johnConversation.memoryId
assertNotNull(johnConversationId)
val danConversation = assistant.chat("Let me introduce myself. My name is Dan")
val danConversationId = danConversation.memoryId
assertNotNull(danConversationId)
var answer = assistant.chat(johnConversationId, "What's my name?")
assertTrue(answer.response.lowercase().contains("john"), answer.response)
answer = assistant.chat(danConversationId, "What's my name?")
assertTrue(answer.response.lowercase().contains("dan"), answer.response)
}
5.3 Anthropic
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-anthropic")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-anthropic</artifactId>
</dependency>
Then add the necessary configuration.
langchain4j.anthropic.api-key=YOUR_KEY
langchain4j.anthropic.api-key: YOUR_KEY
"langchain4j.anthropic.api-key"="YOUR_KEY"
langchain4j.anthropic.apiKey = "YOUR_KEY"
{
"langchain4j.anthropic.api-key" = "YOUR_KEY"
}
{
"langchain4j.anthropic.api-key": "YOUR_KEY"
}
5.4 Azure
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-azure")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-azure</artifactId>
</dependency>
Then add the necessary configuration.
langchain4j.azure-open-ai.api-key=YOUR_KEY
langchain4j.azure-open-ai.endpoint=YOUR_ENDPOINT
langchain4j.azure-open-ai.api-key: YOUR_KEY
langchain4j.azure-open-ai.endpoint: YOUR_ENDPOINT
"langchain4j.azure-open-ai.api-key"="YOUR_KEY"
"langchain4j.azure-open-ai.endpoint"="YOUR_ENDPOINT"
langchain4j.azureOpenAi.apiKey = "YOUR_KEY"
langchain4j.azureOpenAi.endpoint = "YOUR_ENDPOINT"
{
"langchain4j.azure-open-ai.api-key" = "YOUR_KEY"
"langchain4j.azure-open-ai.endpoint" = "YOUR_ENDPOINT"
}
{
"langchain4j.azure-open-ai.api-key": "YOUR_KEY",
"langchain4j.azure-open-ai.endpoint": "YOUR_ENDPOINT"
}
You will additionally need to define a bean of type TokenCredential.
One way to do this is to include the Azure SDK module.
5.5 Bedrock
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-bedrock")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-bedrock</artifactId>
</dependency>
Then add the necessary configuration.
langchain4j.bedrock-llama.api-key=YOUR_KEY
langchain4j.bedrock-llama.api-key: YOUR_KEY
"langchain4j.bedrock-llama.api-key"="YOUR_KEY"
langchain4j.bedrockLlama.apiKey = "YOUR_KEY"
{
"langchain4j.bedrock-llama.api-key" = "YOUR_KEY"
}
{
"langchain4j.bedrock-llama.api-key": "YOUR_KEY"
}
You will additionally need to define a bean of type AwsCredentialsProvider.
One way to do this is to include the AWS SDK module.
5.6 HuggingFace
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-hugging-face")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-hugging-face</artifactId>
</dependency>
Then add the necessary configuration.
langchain4j.hugging-face.access-token=YOUR_ACCESS_TOKEN
langchain4j.hugging-face.access-token: YOUR_ACCESS_TOKEN
"langchain4j.hugging-face.access-token"="YOUR_ACCESS_TOKEN"
langchain4j.huggingFace.accessToken = "YOUR_ACCESS_TOKEN"
{
"langchain4j.hugging-face.access-token" = "YOUR_ACCESS_TOKEN"
}
{
"langchain4j.hugging-face.access-token": "YOUR_ACCESS_TOKEN"
}
5.7 MistralAi
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-mistralai")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-mistralai</artifactId>
</dependency>
Then add the necessary configuration.
langchain4j.mistral-ai.api-key=YOUR_KEY
langchain4j.mistral-ai.api-key: YOUR_KEY
"langchain4j.mistral-ai.api-key"="YOUR_KEY"
langchain4j.mistralAi.apiKey = "YOUR_KEY"
{
"langchain4j.mistral-ai.api-key" = "YOUR_KEY"
}
{
"langchain4j.mistral-ai.api-key": "YOUR_KEY"
}
5.8 Ollama
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-ollama")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-ollama</artifactId>
</dependency>
Then add the necessary configuration.
langchain4j.ollama.base-url=YOUR_URL
langchain4j.ollama.base-url: YOUR_URL
"langchain4j.ollama.base-url"="YOUR_URL"
langchain4j.ollama.baseUrl = "YOUR_URL"
{
"langchain4j.ollama.base-url" = "YOUR_URL"
}
{
"langchain4j.ollama.base-url": "YOUR_URL"
}
5.9 Oracle Cloud GenAI
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-oci-genai")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-oci-genai</artifactId>
</dependency>
Setup a supported OCI authentication method.
Then add the necessary configuration to configure a chat model.
langchain4j.oci-gen-ai.chat-model.model-name=orca-mini
langchain4j.oci-gen-ai.compartment-id=your-compartment
langchain4j.oci-gen-ai.chat-model.model-name: orca-mini
langchain4j.oci-gen-ai.compartment-id: your-compartment
"langchain4j.oci-gen-ai.chat-model.model-name"="orca-mini"
"langchain4j.oci-gen-ai.compartment-id"="your-compartment"
langchain4j.ociGenAi.chatModel.modelName = "orca-mini"
langchain4j.ociGenAi.compartmentId = "your-compartment"
{
"langchain4j.oci-gen-ai.chat-model.model-name" = "orca-mini"
"langchain4j.oci-gen-ai.compartment-id" = "your-compartment"
}
{
"langchain4j.oci-gen-ai.chat-model.model-name": "orca-mini",
"langchain4j.oci-gen-ai.compartment-id": "your-compartment"
}
5.10 OpenAi
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-openai")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-openai</artifactId>
</dependency>
Then add the necessary configuration.
langchain4j.open-ai.api-key=YOUR_KEY
langchain4j.open-ai.api-key: YOUR_KEY
"langchain4j.open-ai.api-key"="YOUR_KEY"
langchain4j.openAi.apiKey = "YOUR_KEY"
{
"langchain4j.open-ai.api-key" = "YOUR_KEY"
}
{
"langchain4j.open-ai.api-key": "YOUR_KEY"
}
5.11 Google AI Gemini
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-googleai-gemini")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-googleai-gemini</artifactId>
</dependency>
Then add the necessary configuration.
langchain4j.google-ai-gemini.api-key=YOUR_API_KEY
langchain4j.google-ai-gemini.api-key: YOUR_API_KEY
"langchain4j.google-ai-gemini.api-key"="YOUR_API_KEY"
langchain4j.googleAiGemini.apiKey = "YOUR_API_KEY"
{
"langchain4j.google-ai-gemini.api-key" = "YOUR_API_KEY"
}
{
"langchain4j.google-ai-gemini.api-key": "YOUR_API_KEY"
}
5.12 VertexAi
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-vertexai")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-vertexai</artifactId>
</dependency>
Then add the necessary configuration.
langchain4j.vertex-ai.endpoint=YOUR_ENDPOINT
langchain4j.vertex-ai.model-name=YOUR_MODEL
langchain4j.vertex-ai.project=YOUR_PROJECT
langchain4j.vertex-ai.location=YOUR_LOCATION
langchain4j.vertex-ai.publisher=YOUR_PUBLISHER
langchain4j.vertex-ai.endpoint: YOUR_ENDPOINT
langchain4j.vertex-ai.model-name: YOUR_MODEL
langchain4j.vertex-ai.project: YOUR_PROJECT
langchain4j.vertex-ai.location: YOUR_LOCATION
langchain4j.vertex-ai.publisher: YOUR_PUBLISHER
"langchain4j.vertex-ai.endpoint"="YOUR_ENDPOINT"
"langchain4j.vertex-ai.model-name"="YOUR_MODEL"
"langchain4j.vertex-ai.project"="YOUR_PROJECT"
"langchain4j.vertex-ai.location"="YOUR_LOCATION"
"langchain4j.vertex-ai.publisher"="YOUR_PUBLISHER"
langchain4j.vertexAi.endpoint = "YOUR_ENDPOINT"
langchain4j.vertexAi.modelName = "YOUR_MODEL"
langchain4j.vertexAi.project = "YOUR_PROJECT"
langchain4j.vertexAi.location = "YOUR_LOCATION"
langchain4j.vertexAi.publisher = "YOUR_PUBLISHER"
{
"langchain4j.vertex-ai.endpoint" = "YOUR_ENDPOINT"
"langchain4j.vertex-ai.model-name" = "YOUR_MODEL"
"langchain4j.vertex-ai.project" = "YOUR_PROJECT"
"langchain4j.vertex-ai.location" = "YOUR_LOCATION"
"langchain4j.vertex-ai.publisher" = "YOUR_PUBLISHER"
}
{
"langchain4j.vertex-ai.endpoint": "YOUR_ENDPOINT",
"langchain4j.vertex-ai.model-name": "YOUR_MODEL",
"langchain4j.vertex-ai.project": "YOUR_PROJECT",
"langchain4j.vertex-ai.location": "YOUR_LOCATION",
"langchain4j.vertex-ai.publisher": "YOUR_PUBLISHER"
}
5.13 VertexAi Gemini
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-vertexai-gemini")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-vertexai-gemini</artifactId>
</dependency>
Then add the necessary configuration.
langchain4j.vertex-ai-gemini.model-name=YOUR_MODEL
langchain4j.vertex-ai-gemini.project=YOUR_PROJECT
langchain4j.vertex-ai-gemini.location=YOUR_LOCATION
langchain4j.vertex-ai-gemini.model-name: YOUR_MODEL
langchain4j.vertex-ai-gemini.project: YOUR_PROJECT
langchain4j.vertex-ai-gemini.location: YOUR_LOCATION
"langchain4j.vertex-ai-gemini.model-name"="YOUR_MODEL"
"langchain4j.vertex-ai-gemini.project"="YOUR_PROJECT"
"langchain4j.vertex-ai-gemini.location"="YOUR_LOCATION"
langchain4j.vertexAiGemini.modelName = "YOUR_MODEL"
langchain4j.vertexAiGemini.project = "YOUR_PROJECT"
langchain4j.vertexAiGemini.location = "YOUR_LOCATION"
{
"langchain4j.vertex-ai-gemini.model-name" = "YOUR_MODEL"
"langchain4j.vertex-ai-gemini.project" = "YOUR_PROJECT"
"langchain4j.vertex-ai-gemini.location" = "YOUR_LOCATION"
}
{
"langchain4j.vertex-ai-gemini.model-name": "YOUR_MODEL",
"langchain4j.vertex-ai-gemini.project": "YOUR_PROJECT",
"langchain4j.vertex-ai-gemini.location": "YOUR_LOCATION"
}
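Once configured, the auto-configured chat model can be injected like any other Micronaut bean. A minimal sketch (the `ChatAssistant` class name is illustrative, and the example assumes a LangChain4j version where `ChatModel.chat(String)` returns the model's reply as a `String`):

```java
import dev.langchain4j.model.chat.ChatModel;
import jakarta.inject.Singleton;

@Singleton
public class ChatAssistant {

    // Auto-configured from the langchain4j.vertex-ai-gemini.* properties above
    private final ChatModel chatModel;

    public ChatAssistant(ChatModel chatModel) { // constructor injection
        this.chatModel = chatModel;
    }

    public String answer(String question) {
        return chatModel.chat(question);
    }
}
```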
6 Embedding Stores
6.1 Elasticsearch
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-elasticsearch")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-store-elasticsearch</artifactId>
</dependency>
elasticsearch.httpHosts=http://localhost:9200,http://127.0.0.2:9200
langchain4j.elasticsearch.embedding-stores.default.dimension=384
elasticsearch.httpHosts: "http://localhost:9200,http://127.0.0.2:9200"
langchain4j.elasticsearch.embedding-stores.default.dimension: 384
"elasticsearch.httpHosts"="http://localhost:9200,http://127.0.0.2:9200"
"langchain4j.elasticsearch.embedding-stores.default.dimension"=384
elasticsearch.httpHosts = "http://localhost:9200,http://127.0.0.2:9200"
langchain4j.elasticsearch.embeddingStores.default.dimension = 384
{
"elasticsearch.httpHosts" = "http://localhost:9200,http://127.0.0.2:9200"
"langchain4j.elasticsearch.embedding-stores.default.dimension" = 384
}
{
"elasticsearch.httpHosts": "http://localhost:9200,http://127.0.0.2:9200",
"langchain4j.elasticsearch.embedding-stores.default.dimension": 384
}
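With the store configured, an `EmbeddingStore<TextSegment>` bean becomes injectable. A minimal sketch of indexing a piece of text (the `DocumentIndexer` name is illustrative; an `EmbeddingModel` bean, e.g. contributed by one of the model modules above, is assumed to be present in the context):

```java
import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.store.embedding.EmbeddingStore;
import jakarta.inject.Singleton;

@Singleton
public class DocumentIndexer {

    private final EmbeddingStore<TextSegment> store;
    private final EmbeddingModel embeddingModel;

    public DocumentIndexer(EmbeddingStore<TextSegment> store, EmbeddingModel embeddingModel) {
        this.store = store;
        this.embeddingModel = embeddingModel;
    }

    public void index(String text) {
        TextSegment segment = TextSegment.from(text);
        Embedding embedding = embeddingModel.embed(segment).content();
        store.add(embedding, segment); // persists the vector in the configured index
    }
}
```

Note the embedding model's output dimension must match the `dimension` configured for the store.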
6.2 MongoDB
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-mongodb-atlas")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-store-mongodb-atlas</artifactId>
</dependency>
mongodb.servers.default.uri=mongodb://username:password@localhost:27017/databaseName
langchain4j.mongodb-atlas.embedding-stores.default.database-name=testdb
langchain4j.mongodb-atlas.embedding-stores.default.collection-name=testcol
langchain4j.mongodb-atlas.embedding-stores.default.index-name=testindex
langchain4j.mongodb-atlas.embedding-stores.default.database-name: testdb
langchain4j.mongodb-atlas.embedding-stores.default.collection-name: testcol
langchain4j.mongodb-atlas.embedding-stores.default.index-name: testindex
"langchain4j.mongodb-atlas.embedding-stores.default.database-name"="testdb"
"langchain4j.mongodb-atlas.embedding-stores.default.collection-name"="testcol"
"langchain4j.mongodb-atlas.embedding-stores.default.index-name"="testindex"
langchain4j.mongodbAtlas.embeddingStores.default.databaseName = "testdb"
langchain4j.mongodbAtlas.embeddingStores.default.collectionName = "testcol"
langchain4j.mongodbAtlas.embeddingStores.default.indexName = "testindex"
{
"langchain4j.mongodb-atlas.embedding-stores.default.database-name" = "testdb"
"langchain4j.mongodb-atlas.embedding-stores.default.collection-name" = "testcol"
"langchain4j.mongodb-atlas.embedding-stores.default.index-name" = "testindex"
}
{
"langchain4j.mongodb-atlas.embedding-stores.default.database-name": "testdb",
"langchain4j.mongodb-atlas.embedding-stores.default.collection-name": "testcol",
"langchain4j.mongodb-atlas.embedding-stores.default.index-name": "testindex"
}
6.3 Neo4j
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-neo4j")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-store-neo4j</artifactId>
</dependency>
neo4j.uri=bolt://localhost
langchain4j.neo4j.embedding-stores.default.dimension=384
neo4j.uri: bolt://localhost
langchain4j.neo4j.embedding-stores.default.dimension: 384
"neo4j.uri"="bolt://localhost"
"langchain4j.neo4j.embedding-stores.default.dimension"=384
neo4j.uri = "bolt://localhost"
langchain4j.neo4j.embeddingStores.default.dimension = 384
{
"neo4j.uri" = "bolt://localhost"
"langchain4j.neo4j.embedding-stores.default.dimension" = 384
}
{
"neo4j.uri": "bolt://localhost",
"langchain4j.neo4j.embedding-stores.default.dimension": 384
}
6.4 Oracle
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-oracle")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-store-oracle</artifactId>
</dependency>
Then add one of the supported JDBC connection pools, for example Hikari:
runtimeOnly("io.micronaut.sql:micronaut-jdbc-hikari")
<dependency>
<groupId>io.micronaut.sql</groupId>
<artifactId>micronaut-jdbc-hikari</artifactId>
<scope>runtime</scope>
</dependency>
datasources.default.dialect=oracle
langchain4j.oracle.embedding-stores.default.table=test
langchain4j.oracle.embedding-stores.default.table.create-option=create_if_not_exists
datasources.default.dialect: oracle
langchain4j.oracle.embedding-stores.default.table: test
langchain4j.oracle.embedding-stores.default.table.create-option: create_if_not_exists
"datasources.default.dialect"="oracle"
"langchain4j.oracle.embedding-stores.default.table"="test"
"langchain4j.oracle.embedding-stores.default.table.create-option"="create_if_not_exists"
datasources.default.dialect = "oracle"
langchain4j.oracle.embeddingStores.default.table = "test"
langchain4j.oracle.embeddingStores.default.table.createOption = "create_if_not_exists"
{
"datasources.default.dialect" = "oracle"
"langchain4j.oracle.embedding-stores.default.table" = "test"
"langchain4j.oracle.embedding-stores.default.table.create-option" = "create_if_not_exists"
}
{
"datasources.default.dialect": "oracle",
"langchain4j.oracle.embedding-stores.default.table": "test",
"langchain4j.oracle.embedding-stores.default.table.create-option": "create_if_not_exists"
}
6.5 OpenSearch
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-opensearch")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-store-opensearch</artifactId>
</dependency>
micronaut.opensearch.rest-client.http-hosts=http://localhost:9200,http://127.0.0.2:9200
langchain4j.opensearch.embedding-stores.default.dimension=384
micronaut.opensearch.rest-client.http-hosts: "http://localhost:9200,http://127.0.0.2:9200"
langchain4j.opensearch.embedding-stores.default.dimension: 384
"micronaut.opensearch.rest-client.http-hosts"="http://localhost:9200,http://127.0.0.2:9200"
"langchain4j.opensearch.embedding-stores.default.dimension"=384
micronaut.opensearch.restClient.httpHosts = "http://localhost:9200,http://127.0.0.2:9200"
langchain4j.opensearch.embeddingStores.default.dimension = 384
{
"micronaut.opensearch.rest-client.http-hosts" = "http://localhost:9200,http://127.0.0.2:9200"
"langchain4j.opensearch.embedding-stores.default.dimension" = 384
}
{
"micronaut.opensearch.rest-client.http-hosts": "http://localhost:9200,http://127.0.0.2:9200",
"langchain4j.opensearch.embedding-stores.default.dimension": 384
}
6.6 PGVector
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-pgvector")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-store-pgvector</artifactId>
</dependency>
Then add one of the supported JDBC connection pools, for example Hikari:
runtimeOnly("io.micronaut.sql:micronaut-jdbc-hikari")
<dependency>
<groupId>io.micronaut.sql</groupId>
<artifactId>micronaut-jdbc-hikari</artifactId>
<scope>runtime</scope>
</dependency>
datasources.default.dialect=postgres
langchain4j.pgvector.embedding-stores.default.table=mytable
langchain4j.pgvector.embedding-stores.default.dimension=384
test-resources.containers.postgres.image-name=pgvector/pgvector:pg16
datasources.default.dialect: postgres
langchain4j.pgvector.embedding-stores.default.table: "mytable"
langchain4j.pgvector.embedding-stores.default.dimension: 384
# Add this if you plan to use Test Resources
test-resources.containers.postgres.image-name: pgvector/pgvector:pg16
"datasources.default.dialect"="postgres"
"langchain4j.pgvector.embedding-stores.default.table"="mytable"
"langchain4j.pgvector.embedding-stores.default.dimension"=384
"test-resources.containers.postgres.image-name"="pgvector/pgvector:pg16"
datasources.default.dialect = "postgres"
langchain4j.pgvector.embeddingStores.default.table = "mytable"
langchain4j.pgvector.embeddingStores.default.dimension = 384
testResources.containers.postgres.imageName = "pgvector/pgvector:pg16"
{
"datasources.default.dialect" = "postgres"
"langchain4j.pgvector.embedding-stores.default.table" = "mytable"
"langchain4j.pgvector.embedding-stores.default.dimension" = 384
"test-resources.containers.postgres.image-name" = "pgvector/pgvector:pg16"
}
{
"datasources.default.dialect": "postgres",
"langchain4j.pgvector.embedding-stores.default.table": "mytable",
"langchain4j.pgvector.embedding-stores.default.dimension": 384,
"test-resources.containers.postgres.image-name": "pgvector/pgvector:pg16"
}
6.7 Redis
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-redis")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-store-redis</artifactId>
</dependency>
langchain4j.redis.embedding-store.host=localhost
langchain4j.redis.embedding-store.port=6379
langchain4j.redis.embedding-stores.default.dimension=384
langchain4j.redis.embedding-store.host: localhost
langchain4j.redis.embedding-store.port: 6379
langchain4j.redis.embedding-stores.default.dimension: 384
"langchain4j.redis.embedding-store.host"="localhost"
"langchain4j.redis.embedding-store.port"=6379
"langchain4j.redis.embedding-stores.default.dimension"=384
langchain4j.redis.embeddingStore.host = "localhost"
langchain4j.redis.embeddingStore.port = 6379
langchain4j.redis.embeddingStores.default.dimension = 384
{
"langchain4j.redis.embedding-store.host" = "localhost"
"langchain4j.redis.embedding-store.port" = 6379
"langchain4j.redis.embedding-stores.default.dimension" = 384
}
{
"langchain4j.redis.embedding-store.host": "localhost",
"langchain4j.redis.embedding-store.port": 6379,
"langchain4j.redis.embedding-stores.default.dimension": 384
}
6.8 Qdrant
Add the following dependency:
implementation("io.micronaut.langchain4j:micronaut-langchain4j-store-qdrant")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-store-qdrant</artifactId>
</dependency>
To use Testcontainers & Test Resources, add the following dependency:
testResourcesService("io.micronaut.langchain4j:micronaut-langchain4j-qdrant-testresource")
<dependency>
<groupId>io.micronaut.langchain4j</groupId>
<artifactId>micronaut-langchain4j-qdrant-testresource</artifactId>
<scope>testResourcesService</scope>
</dependency>
langchain4j.qdrant.embedding-store.host=localhost
langchain4j.qdrant.embedding-store.port=6334
langchain4j.qdrant.embedding-store.collection-name=mycollection
# Omit the following 2 properties if you use Test Resources
langchain4j.qdrant.embedding-store.host: localhost
langchain4j.qdrant.embedding-store.port: 6334
# Minimal configuration required for Test resources
langchain4j.qdrant.embedding-store.collection-name: mycollection
"langchain4j.qdrant.embedding-store.host"="localhost"
"langchain4j.qdrant.embedding-store.port"=6334
"langchain4j.qdrant.embedding-store.collection-name"="mycollection"
langchain4j.qdrant.embeddingStore.host = "localhost"
langchain4j.qdrant.embeddingStore.port = 6334
langchain4j.qdrant.embeddingStore.collectionName = "mycollection"
{
"langchain4j.qdrant.embedding-store.host" = "localhost"
"langchain4j.qdrant.embedding-store.port" = 6334
"langchain4j.qdrant.embedding-store.collection-name" = "mycollection"
}
{
"langchain4j.qdrant.embedding-store.host": "localhost",
"langchain4j.qdrant.embedding-store.port": 6334,
"langchain4j.qdrant.embedding-store.collection-name": "mycollection"
}
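Whichever store backs it, retrieval goes through the same `EmbeddingStore` API. A hedged sketch of a similarity search (the `SimilaritySearch` name is illustrative; it assumes an `EmbeddingModel` bean is available and that the installed LangChain4j version provides `EmbeddingSearchRequest`):

```java
import dev.langchain4j.data.embedding.Embedding;
import dev.langchain4j.data.segment.TextSegment;
import dev.langchain4j.model.embedding.EmbeddingModel;
import dev.langchain4j.store.embedding.EmbeddingMatch;
import dev.langchain4j.store.embedding.EmbeddingSearchRequest;
import dev.langchain4j.store.embedding.EmbeddingStore;
import jakarta.inject.Singleton;
import java.util.List;

@Singleton
public class SimilaritySearch {

    private final EmbeddingStore<TextSegment> store;
    private final EmbeddingModel embeddingModel;

    public SimilaritySearch(EmbeddingStore<TextSegment> store, EmbeddingModel embeddingModel) {
        this.store = store;
        this.embeddingModel = embeddingModel;
    }

    public List<EmbeddingMatch<TextSegment>> topMatches(String query, int maxResults) {
        Embedding queryEmbedding = embeddingModel.embed(query).content();
        EmbeddingSearchRequest request = EmbeddingSearchRequest.builder()
                .queryEmbedding(queryEmbedding)
                .maxResults(maxResults)
                .build();
        return store.search(request).matches(); // matches ordered by similarity score
    }
}
```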
7 Repository
You can find the source code of this project in this repository:
8 Release History
For this project, you can find a list of releases (with release notes) here: