Tweet Overview
Post by @TencentHunyuan on X/Twitter, published 1 September 2025 at 10:44 AM. This post contains 3 images.
We're excited to announce the open-source release of Hunyuan-MT-7B, our latest translation model, which just won big at WMT2025! đđ

Hunyuan-MT-7B is a lightweight 7B model that's a true powerhouse. It dominated the competition, winning 30 out of 31 language categories and outperforming much larger models under strict open-source and public-data constraints. On the widely used Flores200 benchmark, its performance rivals closed-source models like GPT-4.1. đđŦ

Why is this a game-changer?
đš Unmatched Efficiency: The 7B model delivers lightning-fast inference, processing more translation requests on the same hardware.
đš Deployment Flexibility: It's cost-effective and can be deployed on a wide range of hardware, from high-end servers to edge devices.
đš Broad Language Support: It supports 33 languages and 5 minority languages, providing comprehensive coverage.

But that's not all. We're also open-sourcing Hunyuan-MT-Chimera-7B, the industry's first open-source integrated translation model. It intelligently refines translations from multiple models to deliver more accurate and professional results for specialized use cases.

Download and deploy it for your next project!
đ Try it now:
đ Code:
đ Technical Report:
đ¤ Hugging Face:
đ§ AngelSlim: