Chinese AI Landscape

MiniMax M2.5: Claude Opus Performance at 1/20th the Cost

Hong Kong AI Podcast / 2026-03-07 / 5 min read / MiniMax · Open Source · HKEX · Cost Efficiency · Hong Kong

Here's a sentence that would have sounded absurd two years ago: a Chinese AI model, open-sourced under a modified MIT license, matches Claude Opus on SWE-Bench — and costs roughly 1/20th as much to run. Then the company IPO'd on the Hong Kong Stock Exchange.

That's MiniMax M2.5.

The Numbers

SWE-Bench: 80.2% — this benchmark measures real-world software engineering ability by having models resolve real GitHub issues. Claude Opus scores in the same range.

Cost: Approximately 1/20th of Claude Opus pricing. For production applications processing millions of tokens per day, that's the difference between a viable business model and an unsustainable one.
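To make the "millions of tokens per day" point concrete, here is a back-of-the-envelope sketch. The per-token prices are hypothetical placeholders, not MiniMax's or Anthropic's actual pricing — only the ~1/20th ratio comes from the article:

```python
# Illustrative arithmetic only: the prices below are hypothetical
# placeholders, not real MiniMax or Anthropic pricing.
OPUS_PRICE_PER_MTOK = 15.00                     # hypothetical $ per 1M tokens
M25_PRICE_PER_MTOK = OPUS_PRICE_PER_MTOK / 20   # the article's ~1/20th ratio

tokens_per_day = 5_000_000                      # example workload: 5M tokens/day

def monthly_cost(price_per_mtok: float, tokens_per_day: int, days: int = 30) -> float:
    """Cost of a month of traffic at a flat per-million-token price."""
    return price_per_mtok * tokens_per_day / 1_000_000 * days

opus = monthly_cost(OPUS_PRICE_PER_MTOK, tokens_per_day)  # 2250.0 per month
m25 = monthly_cost(M25_PRICE_PER_MTOK, tokens_per_day)    # 112.5 per month
```

At these (made-up) numbers, the same workload drops from thousands to low hundreds of dollars a month — the kind of gap that decides whether a product's unit economics work.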

License: Modified MIT on Hugging Face. The "modified" part adds a few restrictions around harmful use cases, but for legitimate commercial applications, you're free to use, modify, and deploy.

Why MiniMax Matters for Hong Kong

You literally can't use Claude here

Anthropic blocks Hong Kong. So comparing MiniMax M2.5 to Claude Opus isn't an academic exercise — for HK developers, MiniMax M2.5 is the accessible version of that capability level. And it's cheaper and open source.

The HKEX IPO

MiniMax listed on the Hong Kong Stock Exchange on January 9, 2026 — one day after Zhipu AI (Z.ai) listed on January 8. Two major Chinese AI companies choosing HKEX within 24 hours is a signal: Hong Kong is becoming the financial hub for Chinese AI.

For local developers and startups, having these companies listed in HK means stronger local presence, easier partnerships, and likely better regional support.

Open source means no platform risk

If you build on Claude's API, Anthropic can change pricing, change terms, or expand geographic restrictions (which they did in September 2025, blocking Chinese-controlled entities worldwide). If you build on MiniMax M2.5's open weights, the model is yours. Forever.

How to Use It

API: MiniMax offers managed API access in an OpenAI-compatible format, so existing OpenAI client code typically works after swapping the base URL and key.
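Because the API is OpenAI-compatible, a request is just the standard `/v1/chat/completions` shape. The base URL and model name below are assumptions — check MiniMax's own docs for the real values; only the wire format is the point here:

```python
import json
from urllib import request

API_BASE = "https://api.minimax.io/v1"  # assumed base URL -- verify in MiniMax docs
MODEL = "MiniMax-M2.5"                  # assumed model id -- verify in MiniMax docs

def build_chat_request(prompt: str, model: str = MODEL) -> dict:
    """Build a standard OpenAI-compatible chat completion payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }

def send(payload: dict, api_key: str) -> dict:
    """POST the payload to the chat completions endpoint."""
    req = request.Request(
        f"{API_BASE}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
    with request.urlopen(req) as resp:
        return json.load(resp)

payload = build_chat_request("Write a binary search in Python.")
```

The same payload works against any OpenAI-compatible endpoint, which is what makes migrating between providers (or to a self-hosted deployment) a one-line change.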

Hugging Face: Download the full weights. Deploy on your own infrastructure using vLLM, TGI, or similar serving frameworks.
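For self-hosting, vLLM's built-in OpenAI-compatible server is the common route. This is a deployment sketch, not a verified recipe: the Hugging Face repo id and GPU count are assumptions — check the model card for the exact id and recommended hardware:

```shell
# Sketch: self-hosting M2.5 behind vLLM's OpenAI-compatible server.
# "MiniMaxAI/MiniMax-M2.5" is an assumed repo id -- verify on Hugging Face.
pip install vllm

# --tensor-parallel-size shards the (large) model across multiple GPUs
vllm serve MiniMaxAI/MiniMax-M2.5 \
    --tensor-parallel-size 8 \
    --dtype bfloat16 \
    --port 8000

# The server then exposes http://localhost:8000/v1/chat/completions,
# so the same OpenAI-compatible client code works unchanged.
```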

When to choose M2.5:

  • You need strong coding/software engineering capability
  • Cost is a significant factor in your deployment
  • You want Claude-level output quality
  • You need an open-source model for compliance or sovereignty reasons

When to look elsewhere:

  • You need vision/multimodal capabilities (M2.5 is text-only — consider Kimi K2.5 or GLM-5)
  • You need strong multilingual/Chinese capabilities (Qwen 3.5 is stronger here)
  • You need the smallest possible model (M2.5 is a large model)

The Bigger Picture

MiniMax's trajectory — from relatively unknown to HKEX-listed company with a frontier model — is becoming a pattern in Chinese AI. These aren't scrappy startups anymore. They're well-funded, publicly traded companies producing models that match or exceed their Western counterparts.

For Hong Kong, sitting at the intersection of these two AI ecosystems, the practical takeaway is clear: you have options. Good ones. Claude-level quality is available, open source, at a fraction of the cost. The AI wall that US companies built around Hong Kong has made the models on this side of it more competitive, more accessible, and more interesting than most people outside HK realize.




Following the Chinese AI model landscape? Subscribe to the Hong Kong AI Podcast for the latest on what's shipping and what's working in Hong Kong.


Something out of date or wrong? AI moves fast and we want to get it right. Let us know at contact@hongkongaipodcast.com