MiniMax-M2.7 is MiniMax's open-weight agentic frontier text model, released March 18, 2026. It is a 230B-parameter Mixture-of-Experts model with only ~10B active parameters per token, making it one of the most efficient open-weight models on the market in intelligence per active parameter. The release positions MiniMax — better known for its consumer Hailuo video and music apps — squarely in the open-weight LLM frontier alongside [[DeepSeek/DeepSeek V4-Pro|DeepSeek V4-Pro]], [[Z.ai/GLM-5.1|GLM-5.1]], [[Moonshot AI/Kimi K2.6|Kimi K2.6]], and [[Alibaba Qwen/Qwen 3.5|Qwen 3.5]].
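To make the sparse-activation arithmetic concrete, the toy sketch below implements top-k Mixture-of-Experts routing in plain NumPy: only the experts the router selects are evaluated for a given token, so the active parameter count is a small fraction of the total. The expert count, top-k value, and layer sizes are invented for illustration and are not MiniMax's published configuration.

```python
# Toy sketch of Mixture-of-Experts top-k routing (invented sizes, not
# MiniMax's actual configuration): only the k experts the router selects
# are evaluated per token, so active parameters are a fraction of the total.
import numpy as np

rng = np.random.default_rng(0)

d_model, d_ff = 64, 256          # toy dimensions
n_experts, top_k = 32, 2         # route each token to 2 of 32 experts

# Each expert is a small 2-layer MLP: d_model -> d_ff -> d_model.
experts = [
    (rng.standard_normal((d_model, d_ff)) * 0.02,
     rng.standard_normal((d_ff, d_model)) * 0.02)
    for _ in range(n_experts)
]
router = rng.standard_normal((d_model, n_experts)) * 0.02

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route a single token vector through its top-k experts only."""
    logits = x @ router
    top = np.argsort(logits)[-top_k:]                 # indices of selected experts
    weights = np.exp(logits[top] - logits[top].max())
    weights /= weights.sum()                          # softmax over selected experts
    out = np.zeros_like(x)
    for w, idx in zip(weights, top):
        w1, w2 = experts[idx]
        out += w * (np.maximum(x @ w1, 0.0) @ w2)     # only these experts run
    return out

y = moe_forward(rng.standard_normal(d_model))

params_per_expert = d_model * d_ff + d_ff * d_model
total = n_experts * params_per_expert
active = top_k * params_per_expert
print(f"active fraction per token: {active / total:.1%}")   # 2/32 -> 6.2% here
```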
M2.7 is built specifically for agentic deployments: it sustains 97% skill adherence across 40 complex skills (each >2,000 tokens), supports native Agent Teams with stable role boundaries, and currently handles 30–50% of MiniMax's internal RL team workflows autonomously.
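The release notes do not document a public Agent Teams API, but the role-boundary idea maps onto a familiar orchestration pattern: each team member gets its own fixed system prompt and never sees another member's instructions. The sketch below is that generic pattern, not MiniMax's native feature; it assumes the weights are served behind an OpenAI-compatible endpoint (the usual convention for self-hosted open-weight models, e.g. via vLLM), and the endpoint URL, model id, and role prompts are placeholders.

```python
# Generic "agent team" sketch with fixed role boundaries, assuming the model
# is served behind an OpenAI-compatible chat endpoint (e.g. via vLLM).
# The base_url, model id, and role prompts are placeholders, not a MiniMax API.
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="not-needed")
MODEL = "MiniMaxAI/MiniMax-M2.7"  # placeholder id for a local deployment

ROLES = {
    "planner": "You are the planner. Break the task into numbered steps. "
               "Never write code yourself.",
    "coder": "You are the coder. Implement exactly the steps you are given. "
             "Never re-plan or change scope.",
    "reviewer": "You are the reviewer. Point out defects in the code you are "
                "given. Never rewrite it yourself.",
}

def run_role(role: str, task: str) -> str:
    """Call the model with a fixed per-role system prompt to keep role boundaries stable."""
    resp = client.chat.completions.create(
        model=MODEL,
        messages=[
            {"role": "system", "content": ROLES[role]},
            {"role": "user", "content": task},
        ],
    )
    return resp.choices[0].message.content

task = "Add a retry with exponential backoff to the HTTP client."
plan = run_role("planner", task)
code = run_role("coder", f"Task: {task}\nPlan:\n{plan}")
review = run_role("reviewer", f"Task: {task}\nCode:\n{code}")
```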
Weights are distributed via Hugging Face (MiniMaxAI/MiniMax-M2.7) and GitHub, with NVIDIA-optimized deployments; a minimal loading sketch follows the feature list below.
Self-Evolving Agent: MiniMax describes M2.7 as showing "early echoes of self-evolution" — the model self-improves through its own agentic workflow execution.
Real-World SWE Performance: 56.22% on SWE-Pro (matching GPT-5.3-Codex) and 57.0% on Terminal Bench 2 — SOTA among open-weight models on real-world software engineering benchmarks.
GDPval Leadership Among Open Weights: ELO 1495 on GDPval-AA — highest among open-weight models, surpassing GPT-5.3.
Sustained Skill Adherence: 97% adherence across 40 complex skills (each >2,000 tokens) — addresses a major weakness of long-horizon agent execution.
Native Agent Teams: Stable role boundaries across multi-agent setups; first-class support for agent-team orchestration patterns.
Production Deployment: NVIDIA technical blog highlights M2.7 as a flagship for scalable agentic workflows on NVIDIA platforms.
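As referenced above, here is a minimal loading sketch for the Hugging Face weights. It assumes the repository follows the standard transformers causal-LM layout; the generation settings are arbitrary, and at 230B parameters the checkpoint realistically needs multi-GPU sharding or a dedicated serving stack (vLLM, the NVIDIA deployment path) rather than a single-GPU load.

```python
# Minimal sketch of loading the published weights with Hugging Face transformers.
# Assumes the repo follows the standard causal-LM layout; a 230B MoE checkpoint
# will in practice need multiple GPUs (device_map="auto" shards it) or a serving
# stack such as vLLM.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "MiniMaxAI/MiniMax-M2.7"

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    trust_remote_code=True,   # custom MoE modeling code, if the repo ships any
    torch_dtype="auto",
    device_map="auto",        # shard across available GPUs
)

messages = [{"role": "user", "content": "Summarize this repo's README in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=128)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```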
The Hugging Face license restricts commercial use without prior written authorization, which complicates production adoption relative to fully MIT-licensed peers like DeepSeek V4. M2.7 is text-only at present (no native vision or audio modality). Its coding leadership also holds only within the open-weight class: closed frontier models from Anthropic, OpenAI, and Google still lead on the absolute SWE-bench Verified frontier.
May 9, 2026