Ministral 3B is a 3B-parameter model optimized for on-device and edge computing. It excels at knowledge, commonsense reasoning, and function-calling, outperforming larger models such as Mistral 7B on most benchmarks. With support for a context length of up to 128k tokens, it is well suited to orchestrating agentic workflows and specialist tasks with efficient inference.
by Mistral AI | 33K context | $0.04/M input tokens | $0.04/M output tokens
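As a sketch of how the function-calling capability mentioned above is typically exercised, the model can be sent an OpenAI-style chat-completions payload that declares the tools it may call. The model slug, endpoint shape, and `get_weather` tool below are illustrative assumptions, not details taken from this page; consult your provider's documentation for the exact values.

```python
import json

# Assumed model slug; verify against your provider's model list.
MODEL = "mistralai/ministral-3b"

def build_function_call_request(user_message: str) -> dict:
    """Build an OpenAI-style chat-completions payload that exposes one
    tool to the model, allowing it to emit a structured function call."""
    return {
        "model": MODEL,
        "messages": [{"role": "user", "content": user_message}],
        "tools": [
            {
                "type": "function",
                "function": {
                    # Hypothetical tool for illustration only.
                    "name": "get_weather",
                    "description": "Look up the current weather for a city.",
                    "parameters": {
                        "type": "object",
                        "properties": {"city": {"type": "string"}},
                        "required": ["city"],
                    },
                },
            }
        ],
    }

payload = build_function_call_request("What's the weather in Paris?")
print(json.dumps(payload, indent=2))
```

The payload would then be POSTed to the provider's chat-completions endpoint with an API key; if the model decides to call the tool, the response contains a structured `tool_calls` entry instead of plain text.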
Endpoints
Available providers for this model, with details on pricing, context limits, and real-time health metrics.