We have previously proposed a cross-validation (CV) based Gaussian mixture optimization method that efficiently optimizes the model structure based on the CV likelihood. In this study, we propose aggregated cross-validation (AgCV), which introduces a bagging-like approach into the CV framework to strengthen its model selection ability. While plain CV uses a single model to evaluate each held-out subset, AgCV uses multiple models, reducing the variance of the score estimate. By integrating AgCV in place of CV in the Gaussian mixture optimization algorithm, we obtain an AgCV-likelihood-based Gaussian mixture optimization algorithm. The algorithm works efficiently by using sufficient statistics and can be applied to large models such as Gaussian mixture HMMs. The proposed algorithm is evaluated in speech recognition experiments on oral presentations, and the AgCV optimization method yields lower word error rates than the CV- and MDL-based methods.
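To make the CV-versus-AgCV distinction concrete, the following is a minimal sketch in Python of AgCV-style scoring for choosing the number of mixture components. All names (agcv_score, n_folds, n_bags) and the use of bootstrap resampling to obtain the multiple per-fold models are illustrative assumptions, not the paper's implementation; in particular, the paper's algorithm exploits sufficient statistics so that the additional models are obtained cheaply, which this sketch does not reproduce.

```python
# Hedged sketch: AgCV-style model selection for a Gaussian mixture.
# Plain CV fits one model per fold; here each held-out fold is scored
# by several models fit on resampled training sets (bagging-like),
# and their scores are averaged to reduce estimation variance.
import numpy as np
from sklearn.mixture import GaussianMixture

def agcv_score(data, n_components, n_folds=5, n_bags=4, seed=0):
    """Mean held-out log-likelihood, aggregated over n_bags models per fold."""
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(data)), n_folds)
    fold_scores = []
    for k in range(n_folds):
        held_out = data[folds[k]]
        train = data[np.concatenate(
            [folds[j] for j in range(n_folds) if j != k])]
        # AgCV step: several models per fold instead of one.
        scores = []
        for b in range(n_bags):
            boot = train[rng.integers(0, len(train), size=len(train))]
            gmm = GaussianMixture(n_components=n_components,
                                  random_state=seed + b).fit(boot)
            scores.append(gmm.score(held_out))  # mean log-likelihood
        fold_scores.append(np.mean(scores))
    return float(np.mean(fold_scores))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Synthetic two-cluster data; the true component count is 2.
    data = np.vstack([rng.normal(-2.0, 1.0, (200, 2)),
                      rng.normal(2.0, 1.0, (200, 2))])
    # Select the component count maximizing the AgCV likelihood.
    best = max(range(1, 5), key=lambda m: agcv_score(data, m))
    print("selected components:", best)
```

Under this reading, setting n_bags to 1 degenerates to ordinary CV-likelihood model selection; larger n_bags trades extra model fits for a lower-variance score, which is the effect the abstract attributes to AgCV.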