Document Type
Article
Publication Date
1-2025
Publisher
Elsevier
Abstract
Dense retrieval (DR) converts queries and documents into dense embeddings and measures their similarity in vector space. One of the major challenges in DR is the lack of domain-specific training data. While DR models can learn from large-scale public datasets such as MS MARCO through transfer learning, evidence shows that not all DR models and domains benefit from transfer learning. Recently, researchers have resorted to large language models (LLMs) to improve zero-shot and few-shot DR models. However, the hard or human-written prompts utilized in these works are suboptimal, and the generated weak queries are often sensitive to the prompts. To tackle this, we propose soft prompt tuning for augmenting DR (SPTAR): for each task, we leverage soft prompt tuning to optimize a task-specific soft prompt on limited ground-truth data and then prompt the LLMs to tag unlabeled documents with weak queries, yielding weak document–query pairs on which task-specific dense retrievers are trained. We also design a filter that selects high-quality example document–query pairs for the prompt, further improving the quality of the weakly tagged queries. To the best of our knowledge, there is no prior work utilizing soft prompt tuning to augment DR models. Moreover, unlike much of the existing work, ours is based on popular open-source LLMs to ensure reproducible and deterministic results. Our experimental results demonstrate that SPTAR outperforms both unsupervised baselines and the recently proposed LLM-based augmentation method for DR.
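For readers who want a concrete picture of the pipeline summarized above, the following is a minimal illustrative sketch, not the paper's implementation: it tunes a soft prompt with Hugging Face PEFT on a few labeled document–query pairs and then uses the prompted open-source LLM to generate weak queries for unlabeled documents. The model name, prompt template, virtual-token count, and generation settings are assumptions made only for illustration.

# Sketch: soft prompt tuning for weak query generation (assumptions noted inline).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PromptTuningConfig, PromptTuningInit, TaskType, get_peft_model

base_model = "meta-llama/Llama-2-7b-hf"  # assumption: any open-source causal LLM
tokenizer = AutoTokenizer.from_pretrained(base_model)
tokenizer.pad_token = tokenizer.eos_token

# Soft prompt: a small number of trainable virtual tokens prepended to the input;
# the base LLM's weights stay frozen.
peft_config = PromptTuningConfig(
    task_type=TaskType.CAUSAL_LM,
    num_virtual_tokens=20,  # assumed soft-prompt length
    prompt_tuning_init=PromptTuningInit.TEXT,
    prompt_tuning_init_text="Write a query that this document answers:",
    tokenizer_name_or_path=base_model,
)
model = get_peft_model(AutoModelForCausalLM.from_pretrained(base_model), peft_config)

def training_example(document: str, query: str) -> dict:
    """Build one causal-LM example from a labeled document-query pair."""
    text = f"Document: {document}\nQuery: {query}{tokenizer.eos_token}"
    enc = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
    enc["labels"] = enc["input_ids"].clone()
    return enc

# The soft prompt parameters would be optimized on the limited ground-truth pairs
# with a standard AdamW/Trainer loop over training_example outputs; afterwards:

@torch.no_grad()
def generate_weak_query(document: str) -> str:
    """Prompt the tuned LLM to tag an unlabeled document with a weak query."""
    enc = tokenizer(f"Document: {document}\nQuery:", return_tensors="pt")
    out = model.generate(**enc, max_new_tokens=32, do_sample=True, top_p=0.9)
    new_tokens = out[0, enc["input_ids"].shape[1]:]
    return tokenizer.decode(new_tokens, skip_special_tokens=True).strip()

The resulting weak document–query pairs would then be filtered for quality and used to train a task-specific dense retriever, as described in the abstract; the filtering criterion and retriever training are omitted here.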
Recommended Citation
Peng, Z., Wu, X., Wang, Q., & Fang, Y. (2025). Soft prompt tuning for augmenting dense retrieval with large language models. Knowledge-Based Systems, 309, 112758. https://doi.org/10.1016/j.knosys.2024.112758

Comments
Open Access - This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.