Magistral Small is a 24B-parameter instruction-tuned model based on Mistral-Small-3.1 (2503), enhanced through supervised fine-tuning on traces from Magistral Medium and further refined via reinforcement learning. It is optimized for reasoning and offers broad multilingual support, covering more than 20 languages.
Recent activity on Magistral Small 2506 (total usage per day on OpenRouter):

Prompt: 3.56M tokens
Completion: 1.63M tokens
Reasoning: 0 tokens
Prompt tokens measure input size; completion tokens reflect total output length; reasoning tokens count the model's internal thinking before a response.
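When calling models through an OpenAI-compatible API such as OpenRouter's, these three counts typically arrive in a `usage` object on each response. The sketch below shows one way to tally them; the field names (`prompt_tokens`, `completion_tokens`, and a nested `reasoning_tokens` under `completion_tokens_details`) follow the common OpenAI-style schema and are assumptions, not taken from this page.

```python
def summarize_usage(usage: dict) -> dict:
    """Tally the three token counts from an OpenAI-style usage object.

    Field names here are assumed from the common OpenAI-compatible schema;
    a given provider may report reasoning tokens differently or not at all.
    """
    prompt = usage.get("prompt_tokens", 0)
    completion = usage.get("completion_tokens", 0)
    # Reasoning tokens, when reported, are often nested under a details object.
    reasoning = usage.get("completion_tokens_details", {}).get("reasoning_tokens", 0)
    return {
        "prompt": prompt,
        "completion": completion,
        "reasoning": reasoning,
        "total": prompt + completion,
    }

# Illustrative usage object (values are made up, shaped like the stats above).
example = {
    "prompt_tokens": 120,
    "completion_tokens": 55,
    "completion_tokens_details": {"reasoning_tokens": 0},
}
print(summarize_usage(example))
```

A reasoning value of 0, as in the daily stats above, simply means no internal-thinking tokens were billed separately from the completion.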