{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2024,7,1]],"date-time":"2024-07-01T21:10:33Z","timestamp":1719868233320},"reference-count":0,"publisher":"Association for the Advancement of Artificial Intelligence (AAAI)","issue":"6","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["AAAI"],"abstract":"Deep learning models have the ability to extract rich knowledge from large-scale datasets. However, the sharing of data has become increasingly challenging due to concerns regarding data copyright and privacy. Consequently, this hampers the effective transfer of knowledge from existing data to novel downstream tasks and concepts. Zero-shot learning (ZSL) approaches aim to recognize new classes by transferring semantic knowledge learned from base classes. However, traditional generative ZSL methods often require access to real images from base classes and rely on manually annotated attributes, which presents challenges in terms of data restrictions and model scalability. To this end, this paper tackles a challenging and practical problem dubbed as data-free zero-shot learning (DFZSL), where only the CLIP-based base classes data pre-trained classifier is available for zero-shot classification. Specifically, we propose a generic framework for DFZSL, which consists of three main components. Firstly, to recover the virtual features of the base data, we model the CLIP features of base class images as samples from a von Mises-Fisher (vMF) distribution based on the pre-trained classifier. Secondly, we leverage the text features of CLIP as low-cost semantic information and propose a feature-language prompt tuning (FLPT) method to further align the virtual image features and textual features. Thirdly, we train a conditional generative model using the well-aligned virtual image features and corresponding semantic text features, enabling the generation of new classes features and achieve better zero-shot generalization. Our framework has been evaluated on five commonly used benchmarks for generalized ZSL, as well as 11 benchmarks for the base-to-new ZSL. The results demonstrate the superiority and effectiveness of our approach. 
Our code is available in https:\/\/github.com\/ylong4\/DFZSL.<\/jats:p>","DOI":"10.1609\/aaai.v38i6.28316","type":"journal-article","created":{"date-parts":[[2024,3,25]],"date-time":"2024-03-25T09:46:52Z","timestamp":1711360012000},"page":"5108-5117","source":"Crossref","is-referenced-by-count":1,"title":["Data-Free Generalized Zero-Shot Learning"],"prefix":"10.1609","volume":"38","author":[{"given":"Bowen","family":"Tang","sequence":"first","affiliation":[]},{"given":"Jing","family":"Zhang","sequence":"additional","affiliation":[]},{"given":"Long","family":"Yan","sequence":"additional","affiliation":[]},{"given":"Qian","family":"Yu","sequence":"additional","affiliation":[]},{"given":"Lu","family":"Sheng","sequence":"additional","affiliation":[]},{"given":"Dong","family":"Xu","sequence":"additional","affiliation":[]}],"member":"9382","published-online":{"date-parts":[[2024,3,24]]},"container-title":["Proceedings of the AAAI Conference on Artificial Intelligence"],"original-title":[],"link":[{"URL":"https:\/\/ojs.aaai.org\/index.php\/AAAI\/article\/download\/28316\/28621","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/ojs.aaai.org\/index.php\/AAAI\/article\/download\/28316\/28622","content-type":"unspecified","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/ojs.aaai.org\/index.php\/AAAI\/article\/download\/28316\/28621","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,3,25]],"date-time":"2024-03-25T09:46:52Z","timestamp":1711360012000},"score":1,"resource":{"primary":{"URL":"https:\/\/ojs.aaai.org\/index.php\/AAAI\/article\/view\/28316"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2024,3,24]]},"references-count":0,"journal-issue":{"issue":"6","published-online":{"date-parts":[[2024,3,25]]}},"URL":"https:\/\/doi.org\/10.1609\/aaai.v38i6.28316","relation":{},"ISSN":["2374-3468","2159-5399"],"issn-type":[{"value":"2374-3468","type":"electronic"},{"value":"2159-5399","type":"print"}],"subject":[],"published":{"date-parts":[[2024,3,24]]}}}
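
The abstract above outlines a three-step pipeline. The sketch below illustrates only its first step under stated assumptions: it treats the pre-trained classifier's L2-normalized weight vectors as vMF mean directions and samples "virtual" base-class features with a hand-picked concentration kappa. These details are not spelled out in the abstract, and sample_vmf / recover_virtual_features are hypothetical names for illustration, not the API of the linked repository.

# Minimal sketch (not the authors' released code) of recovering virtual
# base-class features by sampling from a von Mises-Fisher (vMF) distribution.
# Assumptions: each class's normalized classifier weight serves as the vMF
# mean direction, and kappa is chosen by hand.
import numpy as np

def sample_vmf(mu, kappa, n_samples, rng=None):
    """Draw n_samples unit vectors from vMF(mu, kappa) via Wood's (1994) rejection scheme."""
    rng = np.random.default_rng() if rng is None else rng
    mu = mu / np.linalg.norm(mu)
    d = mu.shape[0]
    # Constants for the rejection sampler on the scalar component w = <x, mu>.
    b = (-2.0 * kappa + np.sqrt(4.0 * kappa**2 + (d - 1) ** 2)) / (d - 1)
    x0 = (1.0 - b) / (1.0 + b)
    c = kappa * x0 + (d - 1) * np.log(1.0 - x0**2)
    samples = np.empty((n_samples, d))
    for i in range(n_samples):
        # Rejection-sample w, the component of the sample along mu.
        while True:
            z = rng.beta((d - 1) / 2.0, (d - 1) / 2.0)
            u = rng.uniform()
            w = (1.0 - (1.0 + b) * z) / (1.0 - (1.0 - b) * z)
            if kappa * w + (d - 1) * np.log(1.0 - x0 * w) - c >= np.log(u):
                break
        # Sample a direction v uniformly from the tangent space orthogonal to mu.
        v = rng.standard_normal(d)
        v -= v.dot(mu) * mu
        v /= np.linalg.norm(v)
        samples[i] = w * mu + np.sqrt(max(1.0 - w**2, 0.0)) * v
    return samples

def recover_virtual_features(classifier_weights, kappa=80.0, per_class=200):
    """Treat each class's classifier weight as a vMF mean and sample virtual CLIP-space features."""
    feats, labels = [], []
    for cls, w in enumerate(classifier_weights):
        feats.append(sample_vmf(w, kappa, per_class))
        labels.append(np.full(per_class, cls))
    return np.concatenate(feats), np.concatenate(labels)

In the pipeline described by the abstract, features recovered this way would stand in for real base-class images when aligning image and text features with FLPT and when training the conditional generative model.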