Knowledge Fusion By Evolving Weights of Language Models

Guodong Du, Jing Li, Hanting Liu, Runhua Jiang, Shuyang Yu, Yifei Guo, Sim Kuan Goh, Ho-Kin Tang


Abstract
Fine-tuning pre-trained language models, particularly large language models, demands extensive computing resources and can result in varying performance outcomes across different domains and datasets. This paper examines the approach of integrating multiple models from diverse training scenarios into a unified model. This unified model excels across various data domains and exhibits the ability to generalize well on out-of-domain data. We propose a knowledge fusion method named Evolver, inspired by evolutionary algorithms, which does not need further training or additional training data. Specifically, our method involves aggregating the weights of different language models into a population and subsequently generating offspring models through mutation and crossover operations. These offspring models are then evaluated against their parents, allowing for the preservation of those models that show enhanced performance on development datasets. Importantly, our model evolving strategy can be seamlessly integrated with existing model merging frameworks, offering a versatile tool for model enhancement. Experimental results on mainstream language models (i.e., encoder-only, decoder-only, encoder-decoder) reveal that Evolver outperforms previous state-of-the-art models by large margins.
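
The abstract outlines a population-based loop: aggregate differently fine-tuned checkpoints into a population, produce offspring via crossover and mutation of the weights, and keep candidates that improve on a development set. Below is a minimal illustrative sketch of such a loop over PyTorch state_dicts; the function names, the replace-the-worst selection rule, and all hyperparameters are assumptions made for illustration, not the authors' released implementation (see the paper and PDF linked below for the actual method).

```python
# Hypothetical sketch of an evolutionary weight-fusion loop in the spirit of
# the abstract; names and hyperparameters are illustrative assumptions.
import random
import torch

def crossover(parent_a, parent_b):
    """Uniform crossover: each parameter tensor comes from one parent at random."""
    return {name: parent_a[name] if random.random() < 0.5 else parent_b[name]
            for name in parent_a}

def mutate(state, sigma=0.01):
    """Gaussian perturbation of floating-point weights; non-float buffers pass through."""
    return {name: p + sigma * torch.randn_like(p) if p.is_floating_point() else p
            for name, p in state.items()}

def evolve(population, evaluate, generations=50):
    """population: state_dicts of models fine-tuned in different scenarios.
    evaluate: user-supplied callback scoring a candidate state_dict on dev data."""
    scores = [evaluate(s) for s in population]
    for _ in range(generations):
        pa, pb = random.sample(population, 2)
        child = mutate(crossover(pa, pb))
        child_score = evaluate(child)
        worst = min(range(len(scores)), key=scores.__getitem__)
        # Offspring survives only if it beats the weakest current member,
        # mirroring the keep-if-improved selection described in the abstract.
        if child_score > scores[worst]:
            population[worst], scores[worst] = child, child_score
    best = max(range(len(scores)), key=scores.__getitem__)
    return population[best]
```

In practice the evaluate callback would load the candidate weights into the shared model architecture and score it on the development split; no gradient-based training is involved, consistent with the training-free claim in the abstract.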
Anthology ID:
2024.findings-acl.698
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11727–11742
URL:
https://aclanthology.org/2024.findings-acl.698
DOI:
10.18653/v1/2024.findings-acl.698
Cite (ACL):
Guodong Du, Jing Li, Hanting Liu, Runhua Jiang, Shuyang Yu, Yifei Guo, Sim Kuan Goh, and Ho-Kin Tang. 2024. Knowledge Fusion By Evolving Weights of Language Models. In Findings of the Association for Computational Linguistics: ACL 2024, pages 11727–11742, Bangkok, Thailand. Association for Computational Linguistics.
Cite (Informal):
Knowledge Fusion By Evolving Weights of Language Models (Du et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.698.pdf