MiniMax M2.7

Summary

MiniMax-M2.7 is MiniMax's open-weight agentic frontier text model, released March 18, 2026 — a 230B-parameter MoE with only ~10B active per token. It sustains 97% skill adherence across 40 complex skills and scores SOTA among open-weight models on SWE-Pro (56.22%) and GDPval-AA (ELO 1495).

Overview

MiniMax-M2.7 is MiniMax's open-weight agentic frontier text model, released March 18, 2026. It is a 230B-parameter Mixture-of-Experts model with only ~10B active parameters per token, making it one of the most "intelligence-per-parameter"-efficient open-weight models on the market. The release positions MiniMax — better known for its consumer Hailuo video and music apps — squarely in the open-weight LLM frontier alongside [[DeepSeek/DeepSeek V4-Pro|DeepSeek V4-Pro]], [[Z.ai/GLM-5.1|GLM-5.1]], [[Moonshot AI/Kimi K2.6|Kimi K2.6]], and [[Alibaba Qwen/Qwen 3.5|Qwen 3.5]].

M2.7 is built specifically for agentic deployments: it sustains 97% skill adherence across 40 complex skills (each >2,000 tokens), supports native Agent Teams with stable role boundaries, and currently handles 30–50% of MiniMax's internal RL team workflows autonomously.

Specifications

  • Developer: MiniMax
  • Release Date: March 18, 2026
  • Type: Mixture-of-Experts text-generation model (agentic, code, reasoning)
  • Total Parameters: 230B (~10B active per token)
  • Context Window: 205K tokens
  • License: Open weights on Hugging Face — commercial use prohibited without prior written authorization from MiniMax
  • Distribution: Hugging Face (MiniMaxAI/MiniMax-M2.7), GitHub, NVIDIA-optimized deployments
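The "intelligence-per-parameter" efficiency claim comes down to quick arithmetic on the figures above. A minimal sketch (parameter counts taken from the specification list; the comparison fraction is purely illustrative):

```python
# Figures from the specification list above (in billions of parameters).
TOTAL_PARAMS_B = 230   # total MoE parameters
ACTIVE_PARAMS_B = 10   # parameters active per token

# Fraction of the network that fires on any given token.
active_fraction = ACTIVE_PARAMS_B / TOTAL_PARAMS_B
print(f"Active fraction per token: {active_fraction:.1%}")  # → 4.3%
```

In other words, each forward pass pays the compute cost of a ~10B dense model while drawing on a 230B parameter pool, which is the core of the MoE efficiency argument.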

Capabilities

Self-Evolving Agent: MiniMax describes M2.7 as showing "early echoes of self-evolution" — the model self-improves through its own agentic workflow execution.

Real-World SWE Performance: 56.22% on SWE-Pro (matching GPT-5.3-Codex) and 57.0% on Terminal Bench 2 — SOTA among open-weight models on real-world software engineering benchmarks.

GDPval Leadership Among Open Weights: ELO 1495 on GDPval-AA — highest among open-weight models, surpassing GPT-5.3.

Sustained Skill Adherence: 97% adherence across 40 complex skills (each >2,000 tokens), addressing a major weakness in long-horizon agent execution.

Native Agent Teams: Stable role boundaries across multi-agent setups; first-class support for agent-team orchestration patterns.
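The role-boundary idea can be illustrated with a generic pipeline sketch. Everything below (the roles, the handoff order, and the `call_model` stub) is a hypothetical example of the pattern, not MiniMax's actual Agent Teams API:

```python
# Illustrative sketch of an agent-team pattern with fixed role boundaries.
# Roles and the call_model stub are hypothetical; a real deployment would
# replace call_model with a request to the model endpoint.
from dataclasses import dataclass

@dataclass
class Agent:
    role: str
    system_prompt: str

def call_model(agent: Agent, task: str) -> str:
    # Stand-in for a real model call; returns a deterministic stub here.
    return f"[{agent.role}] handled: {task}"

team = {
    "planner": Agent("planner", "Break the task into steps."),
    "coder": Agent("coder", "Implement each step."),
    "reviewer": Agent("reviewer", "Check the result before handoff."),
}

def run_pipeline(task: str) -> list[str]:
    # A fixed handoff order keeps each role inside its boundary:
    # no agent sees or performs another role's work out of turn.
    return [call_model(team[role], task) for role in ("planner", "coder", "reviewer")]

print(run_pipeline("fix failing unit test"))
```

The design choice worth noting is that role separation is enforced by the orchestrator's routing, not by the model alone; "stable role boundaries" means the model reliably stays in character within such a structure.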

Production Deployment: NVIDIA technical blog highlights M2.7 as a flagship for scalable agentic workflows on NVIDIA platforms.

Limitations

The Hugging Face license prohibits commercial use without prior written authorization, which complicates production adoption relative to fully MIT-licensed peers like DeepSeek V4. M2.7 is text-only at present, with no native vision or audio modality. Its coding leadership also holds only within the open-weight class: closed frontier models from Anthropic, OpenAI, and Google still lead on the absolute SWE-bench Verified frontier.

Recent Developments

  • March 18, 2026 — MiniMax-M2.7 released open-weight on Hugging Face; described as a "self-evolving agent model."
  • April 2026 — Featured in NVIDIA's developer blog for agentic-workflow scaling on NVIDIA hardware.
  • April–May 2026 — M2.7 cited alongside Z.ai GLM-5.1, Moonshot Kimi K2.6, and DeepSeek V4 as the four open-weight Chinese coding models released within a 12-day window — collectively shifting the open-weight coding frontier.

Last Updated

May 9, 2026