Mistral: Mixtral 8x22B Instruct
Reliability: 20%
mistralai/mixtral-8x22b-instruct
Mistral's official instruct fine-tuned version of [Mixtral 8x22B](/models/mistralai/mixtral-8x22b). It uses 39B active parameters out of 141B, offering unparalleled cost efficiency for its size. Its strengths include:

- strong math, coding, and reasoning
- large context length (64k)
- fluency in Eng...
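As a sketch of how a hosted model like this is typically queried, the snippet below builds a chat-completion request payload. The field names follow the widely used OpenAI-compatible convention; that convention, and every value except the model id, are assumptions, not part of this listing.

```python
import json

# Hypothetical OpenAI-compatible chat-completions payload for this model.
# Field names and values other than "model" are assumptions; check your
# provider's API reference before sending this anywhere.
payload = {
    "model": "mistralai/mixtral-8x22b-instruct",
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Solve 12 * 17 step by step."},
    ],
    # The listing reports a 65,536-token context window; max_tokens bounds
    # only the completion, it does not extend the context.
    "max_tokens": 512,
    "temperature": 0.7,
}

body = json.dumps(payload)
print(body[:60])
```

Serializing with `json.dumps` mirrors what an HTTP client would send as the request body.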
🧠 Intelligence & Data
Knowledge Cutoff:
2024-01-31
Tokenizer:
Mistral
Moderation:
No
📅 Lifecycle
Added on:
17/04/2024
Specifications
- Provider & Modality: mistralai, text->text
- Context window: 65,536 tokens
- Max output tokens: 0
- Tool support: ✔️ Function Calling
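The tool-support line means the model can emit structured function calls. A minimal sketch of a tool definition in the common JSON-schema style follows; the `"type"`/`"function"` envelope assumes the OpenAI-compatible convention, and `get_weather` is a hypothetical example function, neither of which this listing specifies.

```python
import json

# Hypothetical tool schema; the envelope shape is an assumption based on
# the common OpenAI-compatible function-calling convention.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical example function
            "description": "Get current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {
                    "city": {"type": "string", "description": "City name"},
                },
                "required": ["city"],
            },
        },
    }
]

request = {
    "model": "mistralai/mixtral-8x22b-instruct",
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": tools,
    "tool_choice": "auto",
}
print(json.dumps(request, indent=2)[:80])
```

With `tool_choice` set to `"auto"`, the model decides per turn whether to answer directly or return a structured call to one of the declared tools.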
🔍 Similar models
| Model | Provider | Input | Output | Context |
|---|---|---|---|---|
| Mistral: Mistral Small 4 (`mistralai/mistral-small-2...`) | mistralai | $0.1500 | $0.6000 | 262,144 |
| Google: Nano Banana 2 (Gemini 3.1 Flash Image Preview) (`google/gemini-3.1-flash-i...`) | google | $0.5000 | $3.0000 | 65,536 |
| LiquidAI: LFM2-24B-A2B (`liquid/lfm-2-24b-a2b`) | liquid | $0.0300 | $0.1200 | 32,768 |
| Z.ai: GLM 5 (`z-ai/glm-5`) | z-ai | $0.7200 | $2.3000 | 80,000 |
| MiniMax: MiniMax M2-her (`minimax/minimax-m2-her`) | minimax | $0.3000 | $1.2000 | 65,536 |