GTE-Qwen2 Embeddings
Embedding Models · Open Source · Verified
Provider
Alibaba NLP (Qwen team)
Open-source 7B-parameter text embedding model built on the Qwen2 LLM for generating high-quality vector representations. Supports a 32K-token context, 3584-dimensional embeddings, bidirectional attention, and instruction tuning. Best suited for multilingual semantic search, retrieval-augmented generation, and text-similarity tasks that require long-context support.
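A minimal sketch of the semantic-search use case described above, assuming the Hugging Face model id `Alibaba-NLP/gte-Qwen2-7B-instruct` and the `sentence-transformers` library (the `prompt_name="query"` convention follows the instruction-tuned usage pattern; exact parameters may differ by library version). Loading the ~26 GB checkpoint requires substantial hardware, so the model call is kept inside a function and only the similarity helper runs here:

```python
import numpy as np


def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))


def search(query: str, documents: list[str]) -> list[tuple[str, float]]:
    """Rank documents by embedding similarity to the query.

    Not executed here: downloads the 7B checkpoint. Model id and
    prompt_name are assumptions based on common usage, not verified.
    """
    from sentence_transformers import SentenceTransformer

    model = SentenceTransformer(
        "Alibaba-NLP/gte-Qwen2-7B-instruct", trust_remote_code=True
    )
    # Instruction tuning: queries get an instruction prompt,
    # documents are encoded as-is.
    q = model.encode([query], prompt_name="query")[0]
    doc_embs = model.encode(documents)
    scored = [(d, cosine_similarity(q, e)) for d, e in zip(documents, doc_embs)]
    return sorted(scored, key=lambda t: t[1], reverse=True)


# The similarity helper works on any vectors, e.g. toy 3-d examples:
print(round(cosine_similarity(np.array([1.0, 0.0, 0.0]),
                              np.array([1.0, 0.0, 0.0])), 2))
# → 1.0
```

Ranking by cosine similarity is the standard retrieval step for embedding models; in production the document embeddings would be precomputed and stored in a vector index rather than re-encoded per query.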
Model size
7B parameters (~26.45 GB fp32)
API
Available
Benchmarks
MTEB English avg 70.24 (56 tasks), C-MTEB Chinese avg 72.05 (35 tasks); ranked #1 on the MTEB leaderboard as of June 2024
Price
Free (open weights, self-hosted)
License: Apache-2.0