Mistral Small 3.1
Approved Data Classifications
Description
Mistral Small 3.1 is an advanced, open‑source language model from Mistral AI designed to deliver best‑in‑class performance within its 24‑billion‑parameter weight class. It was released on March 11, 2025 under an Apache 2.0 license, and features a 128,000‑token context window and multimodal support for text and images. The model excels across dozens of languages—including English, French, German, Spanish, Italian, Japanese, and Korean—showcasing robust multilingual proficiency. Optimized for efficiency, Mistral Small 3.1 achieves inference speeds of up to 150 tokens per second, excelling at instruction following, conversational AI, code generation, mathematical reasoning, and long‑document comprehension. Its lightweight design, low latency, and permissive licensing make it an ideal choice for developers building fast, accurate, and flexible AI applications across diverse domains.
Capabilities
| Model | Release Date | Input | Output | Context Length | Cost (per 1M tokens) |
|---|---|---|---|---|---|
| mistral-small-3.1 | Mar 11, 2025 | Image, Text | Text | 128,000 | $0.10 input / $0.30 output |

1M = 1 million tokens; all listed prices are per 1 million tokens.
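Given the per-token rates in the table, the cost of a request can be estimated from its input and output token counts. A minimal sketch (the function name and example token counts are illustrative, not part of the API):

```python
# Estimate request cost for mistral-small-3.1 at the listed rates:
# $0.10 per 1M input tokens, $0.30 per 1M output tokens.
INPUT_RATE = 0.10 / 1_000_000   # dollars per input token
OUTPUT_RATE = 0.30 / 1_000_000  # dollars per output token

def estimate_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the estimated dollar cost of a single request."""
    return input_tokens * INPUT_RATE + output_tokens * OUTPUT_RATE

# e.g. a 2,000-token prompt that produces a 500-token reply:
print(f"${estimate_cost(2000, 500):.6f}")  # $0.000350
```

Actual billed token counts are reported in the API response's `usage` field.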
Availability
Cloud Provider
Usage
```shell
curl -X POST https://api.ai.it.ufl.edu/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <API_TOKEN>" \
  -d '{
    "model": "mistral-small-3.1",
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant."
      },
      {
        "role": "user",
        "content": "Write a haiku about an Alligator."
      }
    ]
  }'
```
```python
from openai import OpenAI

client = OpenAI(
    api_key="your_api_key",
    base_url="https://api.ai.it.ufl.edu/v1"
)

response = client.chat.completions.create(
    model="mistral-small-3.1",  # model to send to the proxy
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write a haiku about an Alligator."}
    ]
)
print(response.choices[0].message)
```
```javascript
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: 'your_api_key',
  baseURL: 'https://api.ai.it.ufl.edu/v1'
});

const completion = await openai.chat.completions.create({
  model: "mistral-small-3.1",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Write a haiku about an Alligator." },
  ],
});

console.log(completion.choices[0].message);
```
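Since the model accepts image input, a vision request can be built with the OpenAI-style multimodal message format, pairing a text part with an inline base64 `image_url` part. A minimal sketch, assuming the proxy passes through OpenAI-compatible image messages (the helper function name and placeholder image bytes are illustrative):

```python
import base64

def image_message(prompt: str, image_bytes: bytes, mime: str = "image/png") -> dict:
    """Build an OpenAI-style user message pairing text with an inline image."""
    b64 = base64.b64encode(image_bytes).decode("ascii")
    return {
        "role": "user",
        "content": [
            {"type": "text", "text": prompt},
            {"type": "image_url",
             "image_url": {"url": f"data:{mime};base64,{b64}"}},
        ],
    }

# Placeholder bytes stand in for a real image file read with open(path, "rb").
msg = image_message("Describe this image.", b"\x89PNG placeholder")
# Pass [msg] as the `messages` list in client.chat.completions.create(...).
```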
References
- Mistral AI: https://mistral.ai/
- LLM Stats: https://llm-stats.com
- Artificial Analysis: https://artificialanalysis.ai