Ministral 8B
Approved Data Classifications
Description
Ministral-8B is a language model from Mistral AI designed for on-device and edge computing applications. Released in October 2024, the model has 8 billion parameters and supports a context length of up to 128,000 tokens, allowing it to handle large inputs efficiently. Its interleaved sliding-window attention pattern improves memory efficiency and inference speed, which helps it perform well in resource-constrained environments. Ministral-8B is optimized for privacy-first applications such as on-device translation, local analytics, and autonomous robotics, where data security and low-latency processing matter. It can also act as an intermediary in multi-step workflows, making it a versatile component for developers adding AI capabilities to a variety of applications.
Capabilities
| Model | Training Data | Input | Output | Context Length | Cost (per 1 million tokens) |
|---|---|---|---|---|---|
| ministral-8b-instruct | June 2024 | Text | Text | 128,000 | $0.10/1M input, $0.10/1M output |
1M represents 1 million tokens; all prices listed are per 1 million tokens.
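As a quick illustration of the rates above, here is a minimal sketch that estimates the cost of one request; the token counts are hypothetical examples, and the per-million-token rates are taken from the table.

# Estimate the cost of one ministral-8b-instruct request at the listed rates.
INPUT_RATE_PER_M = 0.10   # USD per 1M input tokens (from the table above)
OUTPUT_RATE_PER_M = 0.10  # USD per 1M output tokens (from the table above)

def estimate_cost(input_tokens, output_tokens):
    """Return the approximate USD cost for a single request."""
    return (input_tokens / 1_000_000) * INPUT_RATE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_RATE_PER_M

# Hypothetical request: 2,000 prompt tokens and 500 completion tokens.
print(f"${estimate_cost(2000, 500):.6f}")  # -> $0.000250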
Availability
Cloud Provider
Usage
- curl
- python
- javascript
curl -X POST https://api.ai.it.ufl.edu/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer <API_TOKEN>" \
  -d '{
    "model": "ministral-8b-instruct",
    "messages": [
      {
        "role": "system",
        "content": "You are a helpful assistant."
      },
      {
        "role": "user",
        "content": "Write a haiku about an Alligator."
      }
    ]
  }'
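Replace <API_TOKEN> (and your_api_key in the examples below) with the API key issued for the gateway. The same request can be made with the OpenAI Python SDK by pointing base_url at the gateway: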
from openai import OpenAI

client = OpenAI(
    api_key="your_api_key",
    base_url="https://api.ai.it.ufl.edu/v1"
)

response = client.chat.completions.create(
    model="ministral-8b-instruct",  # model to send to the proxy
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Write a haiku about an Alligator."}
    ]
)

print(response.choices[0].message)
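If the gateway forwards OpenAI-style streaming (an assumption; check the gateway documentation), the response can be printed as it arrives. A minimal sketch reusing the client from the example above:

# Streaming sketch -- assumes the proxy supports OpenAI-style streaming.
stream = client.chat.completions.create(
    model="ministral-8b-instruct",
    messages=[{"role": "user", "content": "Write a haiku about an Alligator."}],
    stream=True,
)
for chunk in stream:
    # Some chunks (for example, the final one) may carry no text content.
    if chunk.choices and chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="", flush=True)
print()

The same request in JavaScript, using the openai Node package: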
import OpenAI from 'openai';

const openai = new OpenAI({
  apiKey: 'your_api_key',
  baseURL: 'https://api.ai.it.ufl.edu/v1'
});

const completion = await openai.chat.completions.create({
  model: "ministral-8b-instruct",
  messages: [
    { role: "system", content: "You are a helpful assistant." },
    { role: "user", content: "Write a haiku about an Alligator." }
  ],
});

console.log(completion.choices[0].message);
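Note: top-level await in the JavaScript example requires an ES module (for example, a .mjs file or "type": "module" in package.json); otherwise, wrap the call in an async function.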