Tencent Releases Groundbreaking AI Tool That Animates Game Characters With Words
In a move that could reshape game development, Tencent's Hunyuan team unveiled HY-Motion 1.0 on December 30, 2025: an open-source AI model that converts text descriptions into detailed 3D character animations.

How It Works
The system uses a sophisticated Diffusion Transformer architecture combined with flow matching technology. Developers simply describe an action in natural language (like "character jumps over a log" or "villager waves hello"), and the model generates corresponding skeletal animations ready for use in popular engines like Unity and Unreal.
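The flow-matching idea behind the sampler can be illustrated with a toy sketch. This is not Tencent's implementation: the velocity field below is a stand-in for the text-conditioned Diffusion Transformer, and the "motion" is random dummy data, but the sampling loop (Euler integration of an ODE from noise toward a sample) is the core of the technique.

```python
import numpy as np

# Toy flow-matching sampler: integrate dx/dt = v(x, t) from noise (t=0)
# toward a motion sample (t=1). In HY-Motion the velocity field would be
# a Diffusion Transformer conditioned on the text prompt; here it is a
# simple stand-in that pulls toward a fixed dummy "target motion".

FRAMES, JOINTS = 30, 52                         # e.g. 30 frames of a 52-joint skeleton
rng = np.random.default_rng(0)
target = rng.normal(size=(FRAMES, JOINTS, 3))   # pretend ground-truth motion

def velocity(x, t):
    # Rectified-flow-style field: on the straight path
    # x_t = (1 - t) * x0 + t * x1, the velocity is (x1 - x_t) / (1 - t).
    return (target - x) / max(1.0 - t, 1e-3)

def sample(steps=100):
    x = rng.normal(size=(FRAMES, JOINTS, 3))    # start from Gaussian noise
    dt = 1.0 / steps
    for i in range(steps):
        x = x + dt * velocity(x, i * dt)        # Euler integration step
    return x

motion = sample()
print(motion.shape)   # one generated "motion clip"
```

In the real model the velocity network sees the text embedding at every step, so different prompts steer the integration toward different motions; the ODE loop itself stays this simple.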
"What sets this apart is its training approach," explains Dr. Li Wei, an animation technology researcher at Tsinghua University. "They've layered general motion understanding with specialized refinement phases, then added human feedback tuning. It's like teaching someone first to walk, then dance, then perform stunts."
Real-World Performance
The model handles over 200 motion types across six categories, including:
- Everyday movements (walking, sitting)
- Sports and fitness actions
- Social interactions
- Game-specific combat animations
Early testing shows particularly strong results for basic to moderately complex motions. Descriptions like "character somersaults and lands crouching" produce surprisingly fluid results that would normally require hours of manual animation work.
However, the system struggles with highly specialized athletic movements: attempts to recreate Olympic ski jumps or BMX tricks often produce slightly unnatural joint transitions.
Game-Changing Potential
The immediate application shines brightest in game development:
- Rapid prototyping of NPC behaviors
- Quick iteration on character actions
- Mass generation of background animations
Indie developer Maya Chen tested the lightweight version: "For our small team, this cuts days off our production schedule. We can now focus on refining main character animations while letting the AI handle crowd movements."
The technology also shows promise for film pre-visualization and VR content creation.
Availability
Both the full HY-Motion 1.0 model and its streamlined Lite version (0.46B parameters) are now available open-source through Tencent's Hunyuan platform.
Key Points:
- 10-billion-parameter text-to-animation AI released open-source
- Generates SMPL-H skeleton format compatible with major engines
- Excels at common motions, struggles with elite athletic movements
- Could revolutionize indie game development workflows
- Lightweight version available for less demanding hardware
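The SMPL-H skeleton output mentioned above is what makes engine compatibility straightforward: each frame poses the joints as axis-angle rotations, which an importer converts to the quaternions engines like Unity and Unreal expect. A minimal sketch of that conversion step (not Tencent's tooling, and assuming the standard 52-joint SMPL-H layout):

```python
import numpy as np
from scipy.spatial.transform import Rotation

# SMPL-H poses one frame as 52 joints (body plus both hands), each an
# axis-angle rotation (3 values). Game engines typically consume
# quaternions, so a minimal import step is axis-angle -> quaternion.
NUM_JOINTS = 52

def frame_to_quaternions(pose):
    """pose: (52, 3) axis-angle per joint -> (52, 4) quaternions (x, y, z, w)."""
    assert pose.shape == (NUM_JOINTS, 3)
    return Rotation.from_rotvec(pose).as_quat()

rng = np.random.default_rng(1)
pose = rng.normal(scale=0.3, size=(NUM_JOINTS, 3))  # dummy generated frame
quats = frame_to_quaternions(pose)
print(quats.shape)
```

Running this per frame of a generated clip yields a quaternion track per joint, which maps directly onto an engine-side animation curve.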