Mixtral 8x7B

Mistral

Sparse Mixture of Experts (MoE) model with 8 experts per layer; a router activates 2 experts per token, so only about 12.9B of the 46.7B total parameters are used for any given token.
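The routing scheme described above can be sketched in plain Python. This is a toy illustration, not Mixtral's implementation: the gate logits, scalar "experts", and function names are all hypothetical stand-ins for the real per-layer feed-forward blocks.

```python
import math

def softmax(xs):
    # Numerically stable softmax over a list of floats.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

def top2_route(gate_logits):
    """Pick the 2 highest-scoring experts and normalize their gate weights."""
    top2 = sorted(range(len(gate_logits)),
                  key=lambda i: gate_logits[i], reverse=True)[:2]
    weights = softmax([gate_logits[i] for i in top2])
    return list(zip(top2, weights))

# Toy "experts": scalar functions standing in for the 8 FFN blocks.
experts = [lambda x, k=k: x * (k + 1) for k in range(8)]

def moe_layer(x, gate_logits):
    # Only the 2 routed experts run; the other 6 stay inactive,
    # which is why per-token compute is far below the total size.
    return sum(w * experts[idx](x) for idx, w in top2_route(gate_logits))
```

Because only 2 of 8 experts execute per token, per-token compute scales with the active parameter count rather than the full 46.7B.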

Tags: Text Generation, Local, Latest, Mixtral Family
Parameters: 46.7B
Context Window: 32K tokens
Max Output: -
Input Price (per 1M tokens): -
Output Price (per 1M tokens): -

Mixtral Family Timeline (2 versions)

Mixtral 8x7B: current latest version, local. Released Dec 2023.

Capabilities

👁️ Vision
Function Calling
📋 JSON Mode
🌊 Streaming
💬 System Prompt
🖥️ Code Execution
🔍 Web Search
🔌 MCP Support

Local Model Specs

Quantization: Q4_K_M
Runtime: ollama
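Since the listed runtime is ollama, the model can be queried over ollama's local HTTP API. A minimal sketch, assuming an ollama server on its default port 11434 and that the model has been pulled under the registry tag `mixtral:8x7b` (the Model ID listed below, mixtral-8x7b, may be named differently in other registries):

```python
import json
import urllib.request

# Request body for ollama's /api/generate endpoint.
payload = {
    "model": "mixtral:8x7b",   # assumed ollama tag for this model
    "prompt": "Explain mixture-of-experts routing in one sentence.",
    "stream": False,           # return a single JSON response
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)

# With a running ollama server, uncomment to send the request:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["response"])
```

The Q4_K_M quantization listed above is what makes the 46.7B-parameter model practical to serve on a single local machine.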

Details

Release Date: December 11, 2023
Knowledge Cutoff: December 1, 2023
Source: Local
License: Apache 2.0
Model ID: mixtral-8x7b
Last updated: November 26, 2025