
Mixtures of Normalizing Flows

9 pages. Published: November 2, 2021


Normalizing flows fall into the category of deep generative models. They explicitly model a probability density function; as a result, such a model can learn probability distributions beyond the Gaussian one. Clustering is one of the main unsupervised machine learning tasks, and the most common probabilistic approach to clustering is via Gaussian mixture models. Although there are a few approaches for constructing mixtures of normalizing flows in the literature, we propose a direct approach and use the masked autoregressive flow as the normalizing flow. We show the results obtained on 2D datasets and then on images. The results contain density plots or tables with clustering metrics in order to quantify the quality of the obtained clusters. Although on images we usually obtain worse results than other classic models, the 2D results show that mixtures of distributions more expressive than Gaussian mixture models can indeed be learned. The code which implements this method can be found at
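The core idea in the abstract can be illustrated with a minimal sketch (not the paper's code): a mixture density p(x) = Σ_k π_k p_k(x), where each component p_k is the density of a normalizing flow obtained via the change-of-variables formula. For brevity this sketch uses simple affine flows in place of the masked autoregressive flows the paper uses; all function and variable names are illustrative assumptions.

```python
import numpy as np

def flow_log_density(x, shift, log_scale):
    """Change-of-variables density for an affine flow:
    z = (x - shift) * exp(-log_scale), base density N(0, I),
    log p_k(x) = log N(z; 0, I) + log |det dz/dx|."""
    z = (x - shift) * np.exp(-log_scale)
    log_base = -0.5 * np.sum(z**2 + np.log(2 * np.pi), axis=-1)
    log_det = -np.sum(log_scale)  # log-Jacobian of the affine map
    return log_base + log_det

def mixture_log_density(x, weights, shifts, log_scales):
    """log p(x) = logsumexp_k( log pi_k + log p_k(x) )."""
    comps = np.stack([np.log(w) + flow_log_density(x, b, s)
                      for w, b, s in zip(weights, shifts, log_scales)], axis=0)
    m = comps.max(axis=0)
    return m + np.log(np.exp(comps - m).sum(axis=0))  # numerically stable logsumexp

# Two-component mixture on 2D points: component 0 centered at the origin,
# component 1 centered at (3, 3), both with unit scale.
weights = np.array([0.3, 0.7])
shifts = [np.array([0.0, 0.0]), np.array([3.0, 3.0])]
log_scales = [np.zeros(2), np.zeros(2)]
x = np.array([[0.0, 0.0], [3.0, 3.0]])
print(mixture_log_density(x, weights, shifts, log_scales))
```

Replacing the affine map with a masked autoregressive flow changes only `flow_log_density`; the mixture construction stays the same, which is the sense in which the paper's approach is "direct".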

Keyphrases: clustering, machine learning, mixture models, mixtures of normalizing flows, normalizing flows

In: Yan Shi, Gongzhu Hu, Quan Yuan and Takaaki Goto (editors). Proceedings of ISCA 34th International Conference on Computer Applications in Industry and Engineering, vol 79, pages 82--90

BibTeX entry

@inproceedings{ciobanu2021mixtures,
  author    = {Sebastian Ciobanu},
  title     = {Mixtures of Normalizing Flows},
  booktitle = {Proceedings of ISCA 34th International Conference on Computer Applications in Industry and Engineering},
  editor    = {Yan Shi and Gongzhu Hu and Quan Yuan and Takaaki Goto},
  series    = {EPiC Series in Computing},
  volume    = {79},
  pages     = {82--90},
  year      = {2021},
  publisher = {EasyChair},
  bibsource = {EasyChair},
  issn      = {2398-7340},
  url       = {},
  doi       = {10.29007/nq4f}}