Package dev.langchain4j.model.mistralai
Class MistralAiChatModel
java.lang.Object
dev.langchain4j.model.mistralai.MistralAiChatModel
- All Implemented Interfaces:
dev.langchain4j.model.chat.ChatLanguageModel
public class MistralAiChatModel
extends Object
implements dev.langchain4j.model.chat.ChatLanguageModel
Represents a Mistral AI chat model with a chat completion interface, such as open-mistral-7b and open-mixtral-8x7b.
This model generates chat completions synchronously from a list of chat messages.
You can find a description of the parameters
here.
-
Nested Class Summary
Nested Classes -
Constructor Summary
Constructors:
MistralAiChatModel(String baseUrl, String apiKey, String modelName, Double temperature, Double topP, Integer maxTokens, Boolean safePrompt, Integer randomSeed, String responseFormat, Duration timeout, Boolean logRequests, Boolean logResponses, Integer maxRetries)
Constructs a MistralAiChatModel with the specified parameters.
Method Summary
Methods inherited from class java.lang.Object:
clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface dev.langchain4j.model.chat.ChatLanguageModel:
chat, chat, chat, defaultRequestParameters, doChat, listeners, supportedCapabilities
-
Constructor Details
-
MistralAiChatModel
public MistralAiChatModel(String baseUrl, String apiKey, String modelName, Double temperature, Double topP, Integer maxTokens, Boolean safePrompt, Integer randomSeed, String responseFormat, Duration timeout, Boolean logRequests, Boolean logResponses, Integer maxRetries)
Constructs a MistralAiChatModel with the specified parameters.
Parameters:
baseUrl - the base URL of the Mistral AI API; uses the default value if not specified
apiKey - the API key for authentication
modelName - the name of the Mistral AI model to use
temperature - the temperature parameter for generating chat responses
topP - the top-p parameter for generating chat responses
maxTokens - the maximum number of new tokens to generate in a chat response
safePrompt - a flag indicating whether to use a safe prompt for generating chat responses
randomSeed - the random seed for generating chat responses
responseFormat - the response format for generating chat responses; currently supported values are "text" and "json_object"
timeout - the timeout duration for API requests; the default value is 60 seconds
logRequests - a flag indicating whether to log API requests
logResponses - a flag indicating whether to log API responses
maxRetries - the maximum number of retries for API requests; uses the default value 3 if not specified
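The constructor parameters above can be sketched in use as follows. This is a minimal, hedged example, not the canonical setup: the model name, sampling values, and the MISTRAL_AI_API_KEY environment variable are illustrative assumptions, and passing null to fall back to the documented defaults for baseUrl and randomSeed is assumed to be accepted.

```java
import java.time.Duration;

import dev.langchain4j.model.mistralai.MistralAiChatModel;

public class MistralAiChatModelConstruction {

    public static void main(String[] args) {
        MistralAiChatModel model = new MistralAiChatModel(
                null,                                 // baseUrl: null assumed to mean the default endpoint
                System.getenv("MISTRAL_AI_API_KEY"),  // apiKey: read from the environment (illustrative)
                "open-mistral-7b",                    // modelName
                0.7,                                  // temperature
                1.0,                                  // topP
                512,                                  // maxTokens: cap on new tokens per response
                false,                                // safePrompt
                null,                                 // randomSeed: null for non-deterministic sampling
                "text",                               // responseFormat: "text" or "json_object"
                Duration.ofSeconds(60),               // timeout: matches the documented 60s default
                false,                                // logRequests
                false,                                // logResponses
                3                                     // maxRetries: matches the documented default of 3
        );
    }
}
```

In practice the builder (see the builder entry in this class) is usually more convenient than the thirteen-argument constructor, since unset builder fields take their defaults automatically.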
-
-
Method Details
-
chat
public dev.langchain4j.model.chat.response.ChatResponse chat(dev.langchain4j.model.chat.request.ChatRequest chatRequest)
Specified by:
chat in interface dev.langchain4j.model.chat.ChatLanguageModel
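A minimal sketch of calling chat, assuming the ChatRequest builder and UserMessage factory from the dev.langchain4j packages named in the signature above, and a builder() factory on MistralAiChatModel; running it for real requires a valid API key and network access, so no output is shown.

```java
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.request.ChatRequest;
import dev.langchain4j.model.chat.response.ChatResponse;
import dev.langchain4j.model.mistralai.MistralAiChatModel;

public class MistralAiChatExample {

    public static void main(String[] args) {
        // Assumption: builder() with apiKey/modelName setters; unset fields use defaults.
        MistralAiChatModel model = MistralAiChatModel.builder()
                .apiKey(System.getenv("MISTRAL_AI_API_KEY"))
                .modelName("open-mistral-7b")
                .build();

        // Wrap a single user message in a ChatRequest and send it synchronously.
        ChatRequest request = ChatRequest.builder()
                .messages(UserMessage.from("What is the capital of France?"))
                .build();

        ChatResponse response = model.chat(request);
        System.out.println(response.aiMessage().text());
    }
}
```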
-
provider
public dev.langchain4j.model.ModelProvider provider()
Specified by:
provider in interface dev.langchain4j.model.chat.ChatLanguageModel
-
builder
-