Algebraic tests of general Gaussian latent tree models

Research output: Chapter in Book/Report/Conference proceeding › Article in proceedings › Research › peer-review

Standard

Algebraic tests of general Gaussian latent tree models. / Leung, Dennis; Drton, Mathias.

Advances in Neural Information Processing Systems 31 (NIPS 2018). Neural Information Processing Systems Foundation, 2018.

Harvard

Leung, D & Drton, M 2018, Algebraic tests of general Gaussian latent tree models. in Advances in Neural Information Processing Systems 31 (NIPS 2018). Neural Information Processing Systems Foundation, Thirty-Second Annual Conference on Neural Information Processing Systems, Montréal, Canada, 02/12/2018.

APA

Leung, D., & Drton, M. (2018). Algebraic tests of general Gaussian latent tree models. In Advances in Neural Information Processing Systems 31 (NIPS 2018). Neural Information Processing Systems Foundation.

Vancouver

Leung D, Drton M. Algebraic tests of general Gaussian latent tree models. In Advances in Neural Information Processing Systems 31 (NIPS 2018). Neural Information Processing Systems Foundation. 2018

Author

Leung, Dennis ; Drton, Mathias. / Algebraic tests of general Gaussian latent tree models. Advances in Neural Information Processing Systems 31 (NIPS 2018). Neural Information Processing Systems Foundation, 2018.

Bibtex

@inproceedings{d701667a03fa4cc1a5a2d7586725329f,
title = "Algebraic tests of general Gaussian latent tree models",
abstract = "We consider general Gaussian latent tree models in which the observed variables are not restricted to be leaves of the tree. Extending related recent work, we give a full semi-algebraic description of the set of covariance matrices of any such model. In other words, we find polynomial constraints that characterize when a matrix is the covariance matrix of a distribution in a given latent tree model. However, leveraging these constraints to test a given such model is often complicated by the number of constraints being large and by singularities of individual polynomials, which may invalidate standard approximations to relevant probability distributions. Illustrating with the star tree, we propose a new testing methodology that circumvents singularity issues by trading off some statistical estimation efficiency and handles cases with many constraints through recent advances on Gaussian approximation for maxima of sums of high-dimensional random vectors. Our test avoids the need to maximize the possibly multimodal likelihood function of such models and is applicable to models with a larger number of variables. These points are illustrated in numerical experiments.",
author = "Dennis Leung and Mathias Drton",
year = "2018",
language = "English",
booktitle = "Advances in Neural Information Processing Systems 31 (NIPS 2018)",
publisher = "Neural Information Processing Systems Foundation",
note = "Conference date: 02-12-2018 Through 08-12-2018",

}

RIS

TY - GEN

T1 - Algebraic tests of general Gaussian latent tree models

AU - Leung, Dennis

AU - Drton, Mathias

N1 - Conference code: 32

PY - 2018

Y1 - 2018

N2 - We consider general Gaussian latent tree models in which the observed variables are not restricted to be leaves of the tree. Extending related recent work, we give a full semi-algebraic description of the set of covariance matrices of any such model. In other words, we find polynomial constraints that characterize when a matrix is the covariance matrix of a distribution in a given latent tree model. However, leveraging these constraints to test a given such model is often complicated by the number of constraints being large and by singularities of individual polynomials, which may invalidate standard approximations to relevant probability distributions. Illustrating with the star tree, we propose a new testing methodology that circumvents singularity issues by trading off some statistical estimation efficiency and handles cases with many constraints through recent advances on Gaussian approximation for maxima of sums of high-dimensional random vectors. Our test avoids the need to maximize the possibly multimodal likelihood function of such models and is applicable to models with a larger number of variables. These points are illustrated in numerical experiments.

AB - We consider general Gaussian latent tree models in which the observed variables are not restricted to be leaves of the tree. Extending related recent work, we give a full semi-algebraic description of the set of covariance matrices of any such model. In other words, we find polynomial constraints that characterize when a matrix is the covariance matrix of a distribution in a given latent tree model. However, leveraging these constraints to test a given such model is often complicated by the number of constraints being large and by singularities of individual polynomials, which may invalidate standard approximations to relevant probability distributions. Illustrating with the star tree, we propose a new testing methodology that circumvents singularity issues by trading off some statistical estimation efficiency and handles cases with many constraints through recent advances on Gaussian approximation for maxima of sums of high-dimensional random vectors. Our test avoids the need to maximize the possibly multimodal likelihood function of such models and is applicable to models with a larger number of variables. These points are illustrated in numerical experiments.

M3 - Article in proceedings

BT - Advances in Neural Information Processing Systems 31 (NIPS 2018)

PB - Neural Information Processing Systems Foundation

Y2 - 2 December 2018 through 8 December 2018

ER -
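
Note

As background to the abstract's mention of polynomial constraints on covariance matrices: for the special case of a star tree whose observed variables are all leaves attached to a single latent hub, the implied constraints include the classical vanishing tetrads. The following minimal Python sketch (names, parameter values, and simulation setup are illustrative assumptions, not the paper's code) simulates data from such a one-factor Gaussian model and evaluates the tetrad differences, which should be close to zero; the paper's actual test statistic, which addresses singularities and many constraints via Gaussian approximation for maxima of sums of high-dimensional random vectors, is more involved.

import numpy as np

# Illustrative sketch only: a Gaussian star tree with one latent hub H and
# observed leaves X_i = lam_i * H + eps_i implies the tetrad constraints
#   s_ij * s_kl - s_ik * s_jl = 0   for distinct i, j, k, l.

rng = np.random.default_rng(0)
n, p = 100_000, 5                      # sample size and number of observed leaves (assumed)
lam = rng.uniform(0.5, 1.5, size=p)    # edge weights, chosen arbitrarily for illustration

H = rng.normal(size=n)                            # latent variable at the hub
X = np.outer(H, lam) + rng.normal(size=(n, p))    # observed leaves with independent noise
S = np.cov(X, rowvar=False)                       # sample covariance matrix

# Evaluate one family of tetrads over distinct index quadruples i < j < k < l.
tetrads = [
    S[i, j] * S[k, l] - S[i, k] * S[j, l]
    for i in range(p) for j in range(i + 1, p)
    for k in range(j + 1, p) for l in range(k + 1, p)
]
print("max |tetrad| on data simulated from the star tree model:",
      max(abs(t) for t in tetrads))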

ID: 215135290