Alibaba launches Qwen3-Embedding and Qwen3-Reranker series for multilingual text embedding

Published 06/06/2025, 14:14

Investing.com -- Alibaba (NYSE:BABA) has launched the Qwen3-Embedding and Qwen3-Reranker series, setting new benchmarks in multilingual text embedding and relevance ranking. The series, which includes models designed for text embedding, retrieval, and reranking tasks, supports 119 languages and is available in 0.6B, 4B, and 8B versions.

The Qwen3-Embedding and Qwen3-Reranker series are built on the Qwen3 foundation model, which boasts robust multilingual text understanding capabilities. These new models have achieved state-of-the-art performance across multiple benchmarks for text embedding and reranking tasks. They are open-sourced under the Apache 2.0 license on Hugging Face, GitHub, and ModelScope, and can be used via API on Alibaba Cloud.
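As a minimal sketch of how the open-sourced checkpoints can be used, the snippet below loads an embedding model from Hugging Face with the sentence-transformers library and scores a query against a document. The repository ID is an assumption based on the model names given in this article, not a confirmed detail of the release.

```python
# Minimal sketch: loading one of the open-sourced embedding checkpoints from
# Hugging Face with the sentence-transformers library.
from sentence_transformers import SentenceTransformer, util

# Repository ID assumed from the naming in this article (0.6B / 4B / 8B sizes).
model = SentenceTransformer("Qwen/Qwen3-Embedding-0.6B")

queries = ["What is the capital of France?"]
documents = ["Paris is the capital and largest city of France."]

# Encode both sides into dense vectors and score them with cosine similarity.
query_emb = model.encode(queries)
doc_emb = model.encode(documents)
print(util.cos_sim(query_emb, doc_emb))
```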

The Qwen3-Embedding series offers a range of sizes for both the embedding and reranking models, catering to use cases that balance efficiency and effectiveness. The 8B embedding model ranks No. 1 on the MTEB multilingual leaderboard as of June 5, 2025, with a score of 70.58, while the reranking models excel in text retrieval scenarios, significantly improving search relevance.

The series supports over 100 natural and programming languages, providing robust multilingual, cross-lingual, and code retrieval capabilities. The embedding models use a dual-encoder architecture and the reranking models a cross-encoder architecture, both designed to preserve and build on the text understanding capabilities of the base model.
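To illustrate the difference between the two architectures mentioned above, the sketch below contrasts a dual-encoder (query and document encoded independently, then compared) with a cross-encoder (query and document scored jointly). It uses toy PyTorch modules as stand-ins for the Qwen3 backbones and is purely illustrative, not the released models' implementation.

```python
# Illustrative sketch of the two architectures: a dual-encoder for
# embedding/retrieval versus a cross-encoder for reranking.
import torch
import torch.nn as nn
import torch.nn.functional as F

VOCAB, DIM = 1000, 64

class ToyEncoder(nn.Module):
    """Stand-in for a transformer backbone: embeds tokens and mean-pools."""
    def __init__(self):
        super().__init__()
        self.emb = nn.Embedding(VOCAB, DIM)

    def forward(self, ids):
        return self.emb(ids).mean(dim=1)  # (batch, DIM)

encoder = ToyEncoder()
query_ids = torch.randint(0, VOCAB, (1, 8))
doc_ids = torch.randint(0, VOCAB, (1, 32))

# Dual-encoder: query and document are encoded separately, so document
# vectors can be pre-computed and indexed; relevance is cosine similarity.
q_vec, d_vec = encoder(query_ids), encoder(doc_ids)
dual_score = F.cosine_similarity(q_vec, d_vec)

# Cross-encoder: the query-document pair is processed together, letting the
# model attend across the pair before producing a single relevance score.
cross_head = nn.Linear(DIM, 1)
pair_ids = torch.cat([query_ids, doc_ids], dim=1)
cross_score = cross_head(encoder(pair_ids))

print(dual_score.item(), cross_score.item())
```

The trade-off this sketch highlights is why both model families exist: the dual-encoder scales to large corpora because documents are embedded once and stored, while the cross-encoder is slower but more accurate, so it is typically applied only to rerank the top candidates.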

The training framework for the Qwen3-Embedding series follows the multi-stage training paradigm established by the GTE-Qwen series. This includes a three-stage training structure for the embedding models, while the reranking models are trained directly on high-quality labeled data in a supervised fashion, which improves training efficiency.
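The article does not spell out the training objectives, but text embedding models trained in this kind of paradigm are commonly optimized with an in-batch contrastive (InfoNCE-style) loss over query-document pairs. The sketch below shows that general idea only; it is not a description of Alibaba's exact recipe.

```python
# Generic sketch of an in-batch contrastive (InfoNCE-style) loss often used to
# train text embedding models; not Alibaba's exact training objective.
import torch
import torch.nn.functional as F

def info_nce_loss(query_emb, pos_doc_emb, temperature=0.05):
    """query_emb, pos_doc_emb: (batch, dim); row i of each is a matched pair.
    The other rows in the batch act as in-batch negatives."""
    q = F.normalize(query_emb, dim=-1)
    d = F.normalize(pos_doc_emb, dim=-1)
    logits = q @ d.T / temperature        # (batch, batch) similarity matrix
    labels = torch.arange(q.size(0))      # the diagonal holds the positives
    return F.cross_entropy(logits, labels)

# Toy usage with random vectors standing in for encoder outputs.
loss = info_nce_loss(torch.randn(8, 64), torch.randn(8, 64))
print(loss.item())
```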

As part of future work, Alibaba plans to further optimize the Qwen foundation model to improve the training efficiency of its text embedding and reranking models and enhance deployment performance across a range of scenarios. The company also plans to expand its multimodal representation system to establish cross-modal semantic understanding capabilities.

This article was generated with the support of AI and reviewed by an editor. For more information see our T&C.
