Mistral: Saba
Reliability
20%
mistralai/mistral-saba
Mistral Saba is a 24B-parameter language model specifically designed for the Middle East and South Asia, delivering accurate and contextually relevant responses while maintaining efficient performance. Trained on curated regional datasets, it supports multiple Indian-origin languages—including Tamil...
🧠 Intelligence & Data
Knowledge Cutoff:
2024-09-30
Tokenizer:
Mistral
Moderation:
No
📅 Lifecycle
Added on:
February 17, 2025
Specifications
- Provider & Modality: mistralai, text->text
- Context window: 32,768 tokens
- Max Output Tokens: 0
- Tool Support: ✔️ Function Calling
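Since the model advertises function calling, a request would typically include a `tools` array alongside the messages. The sketch below builds such a request body for `mistralai/mistral-saba` in the common OpenAI-compatible format; the `get_weather` tool and the exact payload shape are illustrative assumptions, so check your provider's API reference before use.

```python
import json

MODEL_SLUG = "mistralai/mistral-saba"

def build_tool_call_request(user_message: str) -> dict:
    """Build a hypothetical function-calling request body for this model."""
    return {
        "model": MODEL_SLUG,
        "max_tokens": 1024,  # response budget; prompt + output must fit the 32,768-token context
        "messages": [{"role": "user", "content": user_message}],
        "tools": [{
            "type": "function",
            "function": {
                "name": "get_weather",  # hypothetical tool, for illustration only
                "description": "Get the current weather for a city.",
                "parameters": {
                    "type": "object",
                    "properties": {"city": {"type": "string"}},
                    "required": ["city"],
                },
            },
        }],
    }

# Inspect the JSON that would be POSTed to the chat-completions endpoint.
payload = build_tool_call_request("What is the weather in Chennai?")
print(json.dumps(payload, indent=2))
```

The payload is only constructed here, not sent; wiring it to an actual endpoint (URL, auth header) is provider-specific.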
🔍 Similar models
| Model | Provider | Input | Output | Context |
|---|---|---|---|---|
| Reka Edge (`rekaai/reka-edge`) | rekaai | $0.1000 | $0.1000 | 16,384 |
| Mistral: Mistral Small 4 (`mistralai/mistral-small-2...`) | mistralai | $0.1500 | $0.6000 | 262,144 |
| LiquidAI: LFM2-24B-A2B (`liquid/lfm-2-24b-a2b`) | liquid | $0.0300 | $0.1200 | 32,768 |
| LiquidAI: LFM2.5-1.2B-Thinking (free) (`liquid/lfm-2.5-1.2b-think...`) | liquid | $0.0000 | $0.0000 | 32,768 |
| LiquidAI: LFM2.5-1.2B-Instruct (free) (`liquid/lfm-2.5-1.2b-instr...`) | liquid | $0.0000 | $0.0000 | 32,768 |