Arcee AI: Coder Large
Reliability
20%
arcee-ai/coder-large
Coder‑Large is a 32 B‑parameter offspring of Qwen 2.5‑Instruct that has been further trained on permissively‑licensed GitHub, CodeSearchNet and synthetic bug‑fix corpora. It supports a 32k context window, enabling multi‑file refactoring or long diff review in a single call, and understands 30‑plus p...
🧠 Intelligence & Data
Knowledge Cutoff:
2025-03-31
Tokenizer:
Other
Moderation:
No
📅 Lifecycle
Added on:
05/05/2025
Specifications
- Provider & Modality: arcee-ai, text->text
- Context window: 32,768 tokens
- Max output tokens: 0
- Tool support (Tools): Not supported
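The specifications above can be reflected in a request sketch. The snippet below builds a chat-completions-style payload for an OpenAI-compatible endpoint; the model slug and context window come from this page, while the field names (`messages`, `max_tokens`) follow the common chat-completions schema and are an assumption about the hosting API, not something this page documents.

```python
# Minimal sketch: build a request payload for arcee-ai/coder-large,
# assuming an OpenAI-compatible chat-completions schema (an assumption;
# this page does not specify the API shape).

CONTEXT_WINDOW = 32_768  # tokens, per the spec sheet above


def build_request(prompt: str, max_tokens: int = 1024) -> dict:
    """Build a chat-completions payload for arcee-ai/coder-large."""
    if max_tokens >= CONTEXT_WINDOW:
        raise ValueError(
            f"max_tokens must leave room for the prompt "
            f"within the {CONTEXT_WINDOW}-token window"
        )
    return {
        "model": "arcee-ai/coder-large",
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        # No "tools" field: the spec sheet lists tool calling as unsupported.
    }


payload = build_request("Refactor this recursive function to be iterative: ...")
print(payload["model"])  # arcee-ai/coder-large
```

Since the model exposes no tool-calling support, any function-calling workflow would need to be emulated in the prompt itself rather than via a `tools` parameter.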
🔍 Similar models
| Model | Provider | Input | Output | Context |
|---|---|---|---|---|
| Arcee AI: Trinity Large Thinking (`arcee-ai/trinity-large-th...`) | arcee-ai | $0.2200 | $0.8500 | 262,144 |
| Reka Edge (`rekaai/reka-edge`) | rekaai | $0.1000 | $0.1000 | 16,384 |
| LiquidAI: LFM2-24B-A2B (`liquid/lfm-2-24b-a2b`) | liquid | $0.0300 | $0.1200 | 32,768 |
| Arcee AI: Trinity Large Preview (free) (`arcee-ai/trinity-large-pr...`) | arcee-ai | $0.0000 | $0.0000 | 131,000 |
| LiquidAI: LFM2.5-1.2B-Thinking (free) (`liquid/lfm-2.5-1.2b-think...`) | liquid | $0.0000 | $0.0000 | 32,768 |