AI Rivalry Heats Up: Mistral Claims DeepSeek Borrowed Its Tech
The Architecture Debate Shaking Up AI Circles
The normally collaborative world of open-source AI found itself embroiled in controversy this week when Mistral CEO Arthur Mensch made surprising claims about Chinese competitor DeepSeek's technology.
"They Used Our Blueprint"
In a recent interview, Mensch - whose company has been dubbed "Europe's OpenAI" - asserted that DeepSeek-V3's architecture builds directly on Mistral's pioneering work with sparse mixture-of-experts (MoE) models. "We released our first MoE model early last year," Mensch explained. "DeepSeek adopted the same architecture."
The comments immediately sparked debate across tech forums and social media. Developers began digging through research papers published on arXiv, uncovering some intriguing details:
- Timing Trouble: Publication dates show Mixtral's paper appeared just three days before DeepSeek's MoE research - hardly enough time for substantial borrowing.
- Technical Differences: While both use sparse mixture-of-experts (SMoE) layers, DeepSeek implemented novel features such as "fine-grained expert segmentation" that are absent from Mistral's design.
- Divergent Approaches: Mixtral focused on engineering optimizations while DeepSeek rebuilt core algorithms from the ground up.
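To make the "fine-grained expert segmentation" point above concrete, here is a toy sketch of the difference between coarse top-k routing (Mixtral-style) and routing over many smaller expert segments (DeepSeekMoE-style). This is not either company's actual code; the expert counts, segmentation factor, and random logits are purely illustrative. The key idea is that splitting each expert into m smaller segments and selecting m times as many keeps total compute roughly constant while greatly multiplying the number of possible expert combinations.

```python
import numpy as np

def topk_route(logits, k):
    """Select the k highest-scoring experts and softmax-normalize their gates."""
    idx = np.argsort(logits)[-k:]                 # indices of the top-k logits
    gates = np.exp(logits[idx] - logits[idx].max())
    return idx, gates / gates.sum()

rng = np.random.default_rng(0)

# Coarse routing (Mixtral-style, illustrative): 8 large experts, pick 2.
coarse_logits = rng.normal(size=8)
coarse_idx, coarse_gates = topk_route(coarse_logits, k=2)

# Fine-grained segmentation (DeepSeekMoE-style, illustrative): each expert is
# split into m=4 smaller segments, and the router picks 4x as many of them,
# so compute per token stays similar but combinations multiply.
m = 4
fine_logits = rng.normal(size=8 * m)              # 32 smaller experts
fine_idx, fine_gates = topk_route(fine_logits, k=2 * m)

print(len(coarse_idx), len(fine_idx))  # 2 8
```

In the coarse case there are C(8, 2) = 28 possible expert combinations per token; in the fine-grained case there are C(32, 8), over ten million, which is the combinatorial-flexibility argument the DeepSeekMoE paper makes.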
The Plot Thickens
The story took an unexpected turn when analysts noticed something curious about Mistral's latest release. Their Mistral Large 3 model appears to incorporate architectural elements strikingly similar to innovations first seen in DeepSeek-V3, particularly multi-head latent attention (MLA).
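For readers unfamiliar with the MLA technology mentioned above: its headline benefit is shrinking the key-value (KV) cache during inference by storing a low-rank latent per token instead of full keys and values. The back-of-the-envelope below is a simplification (real MLA also caches a small decoupled rotary-position key, omitted here), and the dimensions are illustrative, not DeepSeek-V3's actual configuration.

```python
# Toy KV-cache arithmetic behind multi-head latent attention (MLA).
# Dimensions are illustrative placeholders, not real model hyperparameters.
d_model, d_latent, seq_len = 1024, 128, 2048

# Standard attention caches full keys AND values per token: 2 * d_model floats.
standard_cache = seq_len * 2 * d_model

# MLA caches one shared low-rank latent per token (d_latent floats) and
# reconstructs keys/values from it with learned up-projections at compute time.
mla_cache = seq_len * d_latent

print(standard_cache / mla_cache)  # 16.0
```

With these illustrative numbers the cache shrinks 16x, which is why the technique matters for long-context serving costs and why its appearance in a competitor's model drew attention.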
Some developers joked that Mistral might be trying to "rewrite history" amid shifting influence in the MoE space. Others see it as natural evolution in open-source development.
Competition Heats Up
Both companies appear to be accelerating their efforts:
- DeepSeek reportedly plans a major release timed with China's Spring Festival holiday
- Mistral continues updating its Devstral family, aiming to reclaim leadership in programming intelligence tools
The controversy raises deeper questions about attribution in an ecosystem built on shared knowledge. As one developer noted: "In open-source, every innovation stands on someone else's shoulders - but we should acknowledge whose shoulders they are."
Key Points:
- Mistral CEO claims architectural influence over Chinese rival DeepSeek
- Technical analysis reveals complex mutual influences between the models
- Debate highlights challenges of attribution in fast-moving open-source AI
- Both companies pushing forward with ambitious new releases



