Local Independence Testing for Point Processes

Research output: Contribution to journal › Journal article › Research › peer-review

Standard

Local Independence Testing for Point Processes. / Thams, Nikolaj; Hansen, Niels Richard.

In: IEEE Transactions on Neural Networks and Learning Systems, 2024, p. 1-12.

Harvard

Thams, N & Hansen, NR 2024, 'Local Independence Testing for Point Processes', IEEE Transactions on Neural Networks and Learning Systems, pp. 1-12. https://doi.org/10.1109/TNNLS.2023.3335265

APA

Thams, N., & Hansen, N. R. (2024). Local Independence Testing for Point Processes. IEEE Transactions on Neural Networks and Learning Systems, 1-12. https://doi.org/10.1109/TNNLS.2023.3335265

Vancouver

Thams N, Hansen NR. Local Independence Testing for Point Processes. IEEE Transactions on Neural Networks and Learning Systems. 2024;1-12. https://doi.org/10.1109/TNNLS.2023.3335265

Author

Thams, Nikolaj ; Hansen, Niels Richard. / Local Independence Testing for Point Processes. In: IEEE Transactions on Neural Networks and Learning Systems. 2024 ; pp. 1-12.

Bibtex

@article{6c66b86413b74335b829583ca3f568f2,
title = "Local Independence Testing for Point Processes",
abstract = "Constraint-based causal structure learning for point processes require empirical tests of local independence. Existing tests require strong model assumptions, e.g., that the true data generating model is a Hawkes process with no latent confounders. Even when restricting attention to Hawkes processes, latent confounders are a major technical difficulty because a marginalized process will generally not be a Hawkes process itself. We introduce an expansion similar to Volterra expansions as a tool to represent marginalized intensities. Our main theoretical result is that such expansions can approximate the true marginalized intensity arbitrarily well. Based on this, we propose a test of local independence and investigate its properties in real and simulated data.",
keywords = "Causal discovery, Data models, Heuristic algorithms, Kernel, Learning systems, local independence, Mathematical models, Neurons, neuroscience, point processes, Testing",
author = "Nikolaj Thams and Hansen, {Niels Richard}",
note = "Publisher Copyright: IEEE",
year = "2024",
doi = "10.1109/TNNLS.2023.3335265",
language = "English",
pages = "1--12",
journal = "IEEE Transactions on Neural Networks and Learning Systems",
issn = "2162-237X",
publisher = "Institute of Electrical and Electronics Engineers",

}
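
Illustrative code sketch

As an informal illustration of the setting described in the abstract, and not the method proposed in the paper, the following Python sketch (assuming NumPy; all parameter values are hypothetical) simulates a bivariate Hawkes process with exponential kernels via Ogata's thinning algorithm. Events in process 0 excite process 1 but not vice versa, so process 0 is locally independent of process 1 while process 1 locally depends on process 0. A crude regression on binned counts then illustrates the asymmetry; the paper's actual test, based on Volterra-like expansions of marginalized intensities, is not reproduced here.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical parameters: baseline rates mu, excitation matrix alpha (row = target,
# column = source), common decay beta, observation window [0, T].
mu = np.array([0.5, 0.5])
alpha = np.array([[0.0, 0.0],    # nothing excites process 0
                  [0.8, 0.0]])   # process 0 excites process 1
beta = 1.5
T = 200.0

def intensity(t, events):
    # Conditional intensities lambda_i(t) given the event histories (exponential kernels).
    lam = mu.copy()
    for j in range(2):
        past = events[j][events[j] < t]
        if past.size:
            lam = lam + alpha[:, j] * np.exp(-beta * (t - past)).sum()
    return lam

def simulate_hawkes(T):
    # Ogata's thinning: propose points from a dominating rate, then accept/reject.
    events = [np.array([]), np.array([])]
    t = 0.0
    while t < T:
        # Upper bound on the total intensity just after t (alpha.sum() covers a jump at t).
        lam_bar = intensity(t, events).sum() + alpha.sum()
        t += rng.exponential(1.0 / lam_bar)
        if t >= T:
            break
        lam = intensity(t, events)
        if rng.uniform() * lam_bar < lam.sum():
            i = rng.choice(2, p=lam / lam.sum())   # assign the accepted event to a coordinate
            events[i] = np.append(events[i], t)
    return events

events = simulate_hawkes(T)
print("events per process:", [len(e) for e in events])

# Crude illustration of checking local (in)dependence: regress binned counts of a
# target process on exponentially decayed history features, with and without the
# candidate parent's feature, and compare residual sums of squares.
def history_feature(times, grid, decay=beta):
    # sum over past events t_k < g of exp(-decay * (g - t_k)), for each grid point g.
    diffs = grid[:, None] - times[None, :]
    decayed = np.exp(-decay * np.clip(diffs, 0.0, None))
    return (decayed * (diffs > 0)).sum(axis=1)

def binned_counts(times, grid, dt):
    return np.histogram(times, bins=np.append(grid, grid[-1] + dt))[0]

def rss(X, y):
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return np.sum((y - X @ coef) ** 2)

dt = 0.5
grid = np.arange(0.0, T, dt)
x0 = history_feature(events[0], grid)     # history of the candidate parent (process 0)
x1 = history_feature(events[1], grid)     # own history of the target (process 1)
y1 = binned_counts(events[1], grid, dt)   # binned counts of the target (process 1)
ones = np.ones_like(grid)

print("RSS without parent feature:", round(rss(np.column_stack([ones, x1]), y1), 2))
print("RSS with parent feature:   ", round(rss(np.column_stack([ones, x0, x1]), y1), 2))

With this configuration one would typically see a clearly smaller residual sum of squares for process 1 when the parent feature is included, and essentially no improvement if the roles of the two processes are swapped, reflecting the one-directional local dependence built into the simulation.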

RIS

TY - JOUR

T1 - Local Independence Testing for Point Processes

AU - Thams, Nikolaj

AU - Hansen, Niels Richard

N1 - Publisher Copyright: IEEE

PY - 2024

Y1 - 2024

N2 - Constraint-based causal structure learning for point processes requires empirical tests of local independence. Existing tests require strong model assumptions, e.g., that the true data generating model is a Hawkes process with no latent confounders. Even when restricting attention to Hawkes processes, latent confounders are a major technical difficulty because a marginalized process will generally not be a Hawkes process itself. We introduce an expansion similar to Volterra expansions as a tool to represent marginalized intensities. Our main theoretical result is that such expansions can approximate the true marginalized intensity arbitrarily well. Based on this, we propose a test of local independence and investigate its properties in real and simulated data.

AB - Constraint-based causal structure learning for point processes requires empirical tests of local independence. Existing tests require strong model assumptions, e.g., that the true data generating model is a Hawkes process with no latent confounders. Even when restricting attention to Hawkes processes, latent confounders are a major technical difficulty because a marginalized process will generally not be a Hawkes process itself. We introduce an expansion similar to Volterra expansions as a tool to represent marginalized intensities. Our main theoretical result is that such expansions can approximate the true marginalized intensity arbitrarily well. Based on this, we propose a test of local independence and investigate its properties in real and simulated data.

KW - Causal discovery

KW - Data models

KW - Heuristic algorithms

KW - Kernel

KW - Learning systems

KW - local independence

KW - Mathematical models

KW - Neurons

KW - neuroscience

KW - point processes

KW - Testing

UR - http://www.scopus.com/inward/record.url?scp=85181554691&partnerID=8YFLogxK

U2 - 10.1109/TNNLS.2023.3335265

DO - 10.1109/TNNLS.2023.3335265

M3 - Journal article

C2 - 38109252

AN - SCOPUS:85181554691

SP - 1

EP - 12

JO - IEEE Transactions on Neural Networks and Learning Systems

JF - IEEE Transactions on Neural Networks and Learning Systems

SN - 2162-237X

ER -