Mixtral 8x7B Instruct
Mixtral 8x7B Instruct is a pretrained generative Sparse Mixture-of-Experts model by Mistral AI, tuned for chat and instruction use. It incorporates 8 experts (feed-forward networks) per layer, for a total of roughly 47 billion parameters, of which only a fraction are active for any given token.
Instruct model fine-tuned by Mistral. #moe
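The Sparse Mixture-of-Experts design described above can be sketched in a few lines: a router scores each expert for a token, only the top-k experts run, and their outputs are combined by the router's softmax weights. This is an illustrative toy (random weights, top-2 routing as in Mixtral), not Mistral's implementation:

```python
import numpy as np

def moe_layer(x, experts, gate_w, top_k=2):
    """Route a token's hidden state x to its top-k experts.

    x: (d,) vector; experts: list of (W1, W2) feed-forward weight pairs;
    gate_w: (num_experts, d) router weights. Illustrative sketch only.
    """
    logits = gate_w @ x                        # router score per expert
    top = np.argsort(logits)[-top_k:]          # indices of the top-k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                   # softmax over selected experts only
    out = np.zeros_like(x)
    for w, i in zip(weights, top):
        W1, W2 = experts[i]
        out += w * (W2 @ np.maximum(W1 @ x, 0.0))  # weighted ReLU-FFN expert output
    return out

rng = np.random.default_rng(0)
d, hidden, n_experts = 8, 16, 8
experts = [(rng.normal(size=(hidden, d)), rng.normal(size=(d, hidden)))
           for _ in range(n_experts)]
gate_w = rng.normal(size=(n_experts, d))
y = moe_layer(rng.normal(size=d), experts, gate_w)
print(y.shape)  # (8,)
```

Because only 2 of the 8 experts run per token, inference cost tracks the active parameters rather than the full 47B total.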
by Mistral AI | 33K context | $0.08/M input tokens | $0.24/M output tokens
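The listed rates are per million tokens, so the cost of a request is a simple weighted sum of input and output token counts. A minimal helper using the prices shown above (the function name and token counts are illustrative):

```python
def estimate_cost(input_tokens, output_tokens,
                  in_rate=0.08, out_rate=0.24):
    """USD cost at per-million-token rates ($0.08/M in, $0.24/M out)."""
    return input_tokens / 1e6 * in_rate + output_tokens / 1e6 * out_rate

# e.g. a 10k-token prompt with a 2k-token completion:
print(round(estimate_cost(10_000, 2_000), 5))  # 0.00128
```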
Endpoints
Available providers for this model, with details on pricing, context limits, and real-time health metrics.