
Speculative Contrastive Decoding

Hongyi Yuan, Keming Lu, Fei Huang, Zheng Yuan, Chang Zhou


Abstract
Large language models (LLMs) exhibit exceptional performance on language tasks, yet their auto-regressive inference is constrained by high computational requirements and is sub-optimal due to exposure bias. Inspired by speculative decoding and contrastive decoding, we introduce Speculative Contrastive Decoding (SCD), a straightforward yet powerful decoding approach that leverages predictions from smaller language models (LMs) to achieve both decoding acceleration and quality improvement. Extensive evaluations and analyses on four diverse language tasks demonstrate the effectiveness of SCD, showing that decoding efficiency and quality can compatibly benefit from a single smaller LM.
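
The abstract describes the mechanism only at a high level: a small "amateur" LM drafts tokens cheaply (speculative decoding), while its predictions are also subtracted from the large "expert" LM's logits to form a contrastive target distribution (contrastive decoding). Below is a minimal, hedged Python sketch of that idea. The toy models, function names (`toy_lm`, `scd_step`), acceptance rule, and hyperparameters (`gamma`, `alpha`) are illustrative assumptions, not the paper's exact formulation; see the PDF linked below for the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)
VOCAB = 50  # toy vocabulary size


def toy_lm(seed_offset):
    """Stand-in LM: deterministically maps a token prefix to next-token logits."""
    def logits(prefix):
        key = (hash(tuple(int(t) for t in prefix)) + seed_offset) % (2**32)
        return np.random.default_rng(key).normal(size=VOCAB)
    return logits


expert = toy_lm(1)   # plays the large, high-quality LM
amateur = toy_lm(2)  # plays the small draft LM


def softmax(x):
    z = np.exp(x - x.max())
    return z / z.sum()


def scd_step(prefix, gamma=4, alpha=1.0):
    """One hypothetical SCD step: the amateur drafts `gamma` tokens, then each
    draft is accepted or rejected against a contrastive target distribution
    softmax(expert_logits - alpha * amateur_logits)."""
    # 1) Drafting phase (speculative decoding): sample gamma tokens cheaply.
    drafts, q_probs, ctx = [], [], list(prefix)
    for _ in range(gamma):
        q = softmax(amateur(ctx))
        t = int(rng.choice(VOCAB, p=q))
        drafts.append(t)
        q_probs.append(q)
        ctx.append(t)

    # 2) Verification phase: score drafts against the contrastive distribution.
    out, ctx = [], list(prefix)
    for t, q in zip(drafts, q_probs):
        p = softmax(expert(ctx) - alpha * amateur(ctx))  # contrastive target
        if rng.random() < min(1.0, p[t] / q[t]):  # standard speculative test
            out.append(t)
            ctx.append(t)
        else:
            # On rejection, resample from the residual distribution and stop.
            resid = np.maximum(p - q, 0.0)
            resid = resid / resid.sum() if resid.sum() > 0 else p
            out.append(int(rng.choice(VOCAB, p=resid)))
            break
    return out


print(scd_step([1, 2, 3]))  # a few verified tokens per expert call
```

In a real implementation the expert's (and amateur's) logits for all `gamma` drafted positions come from a single batched forward pass, which is where the speed-up arises; the toy version above scores positions one at a time purely for clarity.
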
Anthology ID: 2024.acl-short.5
Volume: Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month: August
Year: 2024
Address: Bangkok, Thailand
Editors: Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 56–64
URL: https://aclanthology.org/2024.acl-short.5
DOI: 10.18653/v1/2024.acl-short.5
Cite (ACL): Hongyi Yuan, Keming Lu, Fei Huang, Zheng Yuan, and Chang Zhou. 2024. Speculative Contrastive Decoding. In Proceedings of the 62nd Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 56–64, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal): Speculative Contrastive Decoding (Yuan et al., ACL 2024)
PDF: https://aclanthology.org/2024.acl-short.5.pdf