SPD-DDPM: Denoising Diffusion Probabilistic Models in the Symmetric Positive Definite Space
DOI:
https://doi.org/10.1609/aaai.v38i12.29276
Keywords:
ML: Deep Generative Models & Autoencoders, ML: Learning with Manifolds
Abstract
Symmetric positive definite (SPD) matrices have shown important value and applications in statistics and machine learning, such as fMRI analysis and traffic prediction. Previous works on SPD matrices mostly focus on discriminative models, where E(X|y) is predicted directly, with X an SPD matrix and y a vector. However, these methods are difficult to apply to large-scale data. In this paper, inspired by the denoising diffusion probabilistic model (DDPM), we propose a novel generative model, termed SPD-DDPM, which introduces the Gaussian distribution in the SPD space to estimate E(X|y). Moreover, our model can estimate p(X) unconditionally and flexibly without a given y. On the one hand, the model conditionally learns p(X|y) and uses the mean of samples to obtain E(X|y) as a prediction. On the other hand, the model unconditionally learns the probability distribution of the data p(X) and generates samples that conform to this distribution. Furthermore, we propose a new SPD net that is much deeper than previous networks and allows for the inclusion of conditional factors. Experimental results on toy data and real taxi data demonstrate that our model effectively fits the data distribution both unconditionally and conditionally.
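As a rough illustration of the idea described in the abstract (estimating E(X|y) by averaging SPD samples drawn from a conditional generative model), here is a minimal sketch, not the authors' code. The sampler sample_spd_given_y is a hypothetical stand-in for SPD-DDPM's conditional reverse process, and the averaging uses the log-Euclidean mean, one common choice of mean on the SPD manifold.

```python
import numpy as np

def sym_logm(X):
    """Matrix logarithm of a symmetric positive definite matrix via eigendecomposition."""
    w, V = np.linalg.eigh(X)
    return V @ np.diag(np.log(w)) @ V.T

def sym_expm(S):
    """Matrix exponential of a symmetric matrix via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.exp(w)) @ V.T

def log_euclidean_mean(spd_matrices):
    """Average SPD matrices in the matrix-log domain, then map back to the SPD space."""
    logs = [sym_logm(X) for X in spd_matrices]
    return sym_expm(np.mean(logs, axis=0))

def estimate_conditional_mean(sample_spd_given_y, y, n_samples=64):
    """Monte Carlo estimate of E(X|y) from a conditional SPD sampler (hypothetical interface)."""
    samples = [sample_spd_given_y(y) for _ in range(n_samples)]
    return log_euclidean_mean(samples)

if __name__ == "__main__":
    rng = np.random.default_rng(0)

    # Toy sampler standing in for a trained SPD-DDPM: random SPD matrices
    # whose overall scale depends on the conditioning value y.
    def toy_sampler(y, dim=4):
        A = rng.normal(size=(dim, dim))
        return (1.0 + float(y)) * (A @ A.T + dim * np.eye(dim))  # SPD by construction

    X_hat = estimate_conditional_mean(toy_sampler, y=0.5)
    print("Estimated E(X|y), shape:", X_hat.shape)
```

The log-Euclidean mean is used here only for concreteness; other notions of mean on the SPD manifold (e.g., the affine-invariant Fréchet mean) could be substituted depending on the geometry adopted.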
Published
2024-03-24
How to Cite
Li, Y., Yu, Z., He, G., Shen, Y., Li, K., Sun, X., & Lin, S. (2024). SPD-DDPM: Denoising Diffusion Probabilistic Models in the Symmetric Positive Definite Space. Proceedings of the AAAI Conference on Artificial Intelligence, 38(12), 13709-13717. https://doi.org/10.1609/aaai.v38i12.29276
Section
AAAI Technical Track on Machine Learning III