Inception: Mercury Coder

Reliability: 20%

inception/mercury-coder

Mercury Coder is the first diffusion large language model (dLLM). Using a breakthrough discrete diffusion approach, the model runs 5-10x faster than even speed-optimized models like Claude 3.5 Haiku and GPT-4o Mini while matching their performance. Mercury Coder's speed means that developers can...

🧠 Intelligence & Data
Knowledge Cutoff: 2025-01-31
Tokenizer: Other
Moderation: No
📅 Lifecycle
Added: 2025-04-30
Specifications
  • Provider & Modality: inception, text->text
  • Context window: 128,000 tokens
  • Max output tokens: 32,000
  • Tool support: ✔️ Function calling
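Since the listing advertises function-calling support, a request to this model would typically carry a tool schema. Below is a minimal sketch of building such a request body, assuming the model sits behind an OpenAI-compatible chat-completions API (the field names and the `run_tests` tool are illustrative assumptions, not confirmed by this page):

```python
import json

def build_function_call_request(user_prompt: str) -> dict:
    """Assemble a hypothetical function-calling request body for Mercury Coder."""
    return {
        "model": "inception/mercury-coder",   # model slug from this page
        "max_tokens": 32000,                  # page lists 32,000 max output tokens
        "messages": [{"role": "user", "content": user_prompt}],
        "tools": [
            {
                "type": "function",
                "function": {
                    # Hypothetical tool for illustration only.
                    "name": "run_tests",
                    "description": "Run the project's test suite and return results.",
                    "parameters": {
                        "type": "object",
                        "properties": {
                            "path": {
                                "type": "string",
                                "description": "Test file or directory to run.",
                            }
                        },
                        "required": ["path"],
                    },
                },
            }
        ],
    }

body = build_function_call_request("Fix the failing test in tests/test_parser.py")
print(json.dumps(body, indent=2))
```

The body would then be POSTed to whatever chat-completions endpoint serves the model; the exact URL and auth scheme depend on the provider.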
🔍 Similar models

Model | Provider | Input | Output | Context
Inception: Mercury 2 (inception/mercury-2) | inception | $0.2500 | $0.7500 | 128,000
OpenAI: GPT-5.3 Chat (openai/gpt-5.3-chat) | openai | $1.7500 | $14.0000 | 128,000
Google: Nano Banana 2 (Gemini 3.1 Flash Image Preview) (google/gemini-3.1-flash-i...) | google | $0.5000 | $3.0000 | 65,536
AionLabs: Aion-2.0 (aion-labs/aion-2.0) | aion-labs | $0.8000 | $1.6000 | 131,072
Z.ai: GLM 5 (z-ai/glm-5) | z-ai | $0.7200 | $2.3000 | 80,000
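The Input/Output columns above look like USD prices, presumably per million tokens (the usual convention on such listings; the page does not state the unit, so that is an assumption here). A quick sketch of estimating a request's cost under that assumption:

```python
def estimate_cost_usd(input_tokens: int, output_tokens: int,
                      input_price: float, output_price: float) -> float:
    """Estimate request cost, assuming prices are USD per 1M tokens."""
    return (input_tokens / 1_000_000) * input_price \
         + (output_tokens / 1_000_000) * output_price

# Example with Mercury 2's listed rates ($0.25 input / $0.75 output):
# 100k prompt tokens and 20k completion tokens.
cost = estimate_cost_usd(100_000, 20_000, 0.25, 0.75)
print(f"${cost:.4f}")  # → $0.0400
```

If the prices turn out to be per 1K tokens instead, the divisor changes accordingly; the arithmetic is otherwise the same.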