
Xiaohongshu Shakes Up AI Race with Surprise Open-Source Release

Xiaohongshu's Bold Move into AI Infrastructure

On April 15, while most tech observers were watching the usual AI giants, Chinese social commerce platform Xiaohongshu quietly dropped a bombshell: it open-sourced its Relax training engine, a sophisticated system for developing multi-modal AI models.

Designed for the Future of AI

What makes Relax stand out is its native support for multiple data types - not just text, but images, audio, and video too. In today's AI landscape where systems need to process the world as humans do (through multiple senses), this capability positions Relax as a forward-looking solution.

The engine introduces two clever technical innovations:

  • Modal-aware parallelism: Think of this as giving each data type its own optimized processing lane
  • End-to-end asynchronous pipelining: This keeps the training process flowing smoothly with minimal downtime

Together, these features promise to make training complex AI models significantly more efficient.
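To make the two ideas concrete, here is a minimal toy sketch in Python. Everything in it is invented for illustration (Relax's actual API and internals are not described in the announcement): each modality gets its own preprocessing worker running in its own thread (the "dedicated lane" of modal-aware parallelism), and the workers feed a shared queue that a consumer drains as batches arrive, so a slow modality never stalls the whole step (the spirit of asynchronous pipelining).

```python
import queue
import threading

# Hypothetical illustration only -- names and structure are invented.
MODALITIES = ("text", "image", "audio")

def preprocess(modality: str, sample_id: int) -> str:
    # Stand-in for modality-specific decoding / tokenizing work.
    return f"{modality}-features-{sample_id}"

def worker(modality: str, n_samples: int, out: queue.Queue) -> None:
    # One dedicated lane per modality, running concurrently with the others.
    for i in range(n_samples):
        out.put(preprocess(modality, i))

def run_pipeline(n_samples: int = 4) -> list:
    batches: queue.Queue = queue.Queue()
    threads = [
        threading.Thread(target=worker, args=(m, n_samples, batches))
        for m in MODALITIES
    ]
    for t in threads:
        t.start()
    consumed = []
    # The "training" consumer takes batches as they become ready, rather
    # than waiting for every modality to finish a synchronized step.
    for _ in range(n_samples * len(MODALITIES)):
        consumed.append(batches.get())
    for t in threads:
        t.join()
    return consumed
```

A real training engine would replace the threads with device streams and the queue with prefetch buffers, but the shape of the optimization is the same: per-modality lanes plus an always-fed consumer.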

Why Open Source Matters

Here's what's really interesting: Xiaohongshu isn't an AI infrastructure company. They're best known for their social shopping platform. By open-sourcing Relax, they're:

  1. Showing off serious AI engineering chops
  2. Building goodwill with developers worldwide
  3. Potentially shaping the future of multi-modal AI development

"This is a classic ecosystem play," notes AI researcher Li Wei. "Rather than keeping their tech locked up, they're releasing it to attract talent and establish technical leadership."

The Bigger Picture

The AI arms race just got more interesting. While most attention focuses on American and Chinese tech giants, Xiaohongshu's move proves innovation can come from unexpected places. For developers, this means another powerful tool in their arsenal. For the industry, it signals that the multi-modal future is arriving faster than many predicted.

Key Points:

  • Xiaohongshu unexpectedly open-sourced its Relax AI training engine
  • The system specializes in handling multiple data types (text, images, audio, video)
  • Two core innovations boost training efficiency for complex models
  • Move establishes Xiaohongshu as a serious AI infrastructure player
  • Demonstrates how social platforms are expanding into fundamental AI technology

