Mistral Large
Approved Data Classifications
Description
Mistral Large is an advanced language model developed by Mistral AI, recognized for its strong reasoning capabilities and designed for complex multilingual tasks. Launched in February 2024, it offers a 32,000-token context window, allowing it to recall and work with information drawn from long documents. The model is fluent in English, French, Spanish, German, and Italian, with a nuanced grasp of grammar and cultural context. It can also be used in retrieval-augmented generation (RAG) pipelines to draw on external knowledge bases and improve accuracy. Strong instruction following and coding performance make it well suited to applications that require high-level reasoning and language understanding across a range of domains.
Capabilities
| Model | Training Data | Input | Output | Context Length | Cost (per 1M tokens) |
| --- | --- | --- | --- | --- | --- |
| mistral-large | February 2024 | Text | Text | 32,000 | $4.00 input / $12.00 output |
All prices are per 1 million (1M) tokens.
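As a quick sanity check on what those rates mean per request, here is a minimal Python sketch that estimates the cost of a single call at the prices in the table above. The token counts used in the example are made-up illustrative values, not measurements.

```python
# Rates from the pricing table above (USD per 1M tokens).
INPUT_PRICE_PER_M = 4.00
OUTPUT_PRICE_PER_M = 12.00

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Estimated USD cost of one request at the listed rates."""
    return (input_tokens / 1_000_000) * INPUT_PRICE_PER_M + \
           (output_tokens / 1_000_000) * OUTPUT_PRICE_PER_M

# Example: a 2,000-token prompt that produces a 500-token completion.
print(f"${estimate_cost(2_000, 500):.4f}")  # -> $0.0140
```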
Availability
Cloud Provider
Usage
Example requests are shown below using curl, Python, and JavaScript.
```bash
curl -X POST https://api.ai.it.ufl.edu/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <API_TOKEN>" \
  -d '{
    "model": "mistral-large",
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant."
      },
      {
        "role": "user",
        "content": "Write a haiku about an Alligator."
      }
    ]
  }'
```
```python
from openai import OpenAI

client = OpenAI(
    api_key="your_api_key",                   # your API token
    base_url="https://api.ai.it.ufl.edu/v1"   # route requests through the proxy
)

response = client.chat.completions.create(
    model="mistral-large",  # model to send to the proxy
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write a haiku about an Alligator."}
    ]
)

print(response.choices[0].message)
```
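The generated text itself is available as `response.choices[0].message.content` on the returned object.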
```javascript
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: 'your_api_key',                   // your API token
  baseURL: 'https://api.ai.it.ufl.edu/v1'   // route requests through the proxy
});

const completion = await openai.chat.completions.create({
  model: "mistral-large",  // model to send to the proxy
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Write a haiku about an Alligator." },
  ],
});

console.log(completion.choices[0].message);
```
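Because the endpoint is OpenAI-compatible, streamed responses may also work through the same Python client. Whether streaming is enabled on this proxy is an assumption to verify against the service documentation; the sketch below shows the general pattern with the standard `stream=True` option.

```python
from openai import OpenAI

client = OpenAI(
    api_key="your_api_key",
    base_url="https://api.ai.it.ufl.edu/v1"
)

# Request a streamed completion (assumes the proxy passes streaming through).
stream = client.chat.completions.create(
    model="mistral-large",
    messages=[{"role": "user", "content": "Write a haiku about an Alligator."}],
    stream=True,
)

for chunk in stream:
    # Each chunk carries a small delta of the assistant's reply.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()
```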