- Soil Geostatistics and Mapping
- Statistical Methods and Bayesian Inference
- Bayesian Methods and Mixture Models
- Spatial and Panel Data Analysis
- Gaussian Processes and Bayesian Inference
- Statistical Methods and Inference
- Markov Chains and Monte Carlo Methods
- Bayesian Modeling and Causal Inference
- Economic and Environmental Valuation
- Demographic Modeling and Climate Adaptation
- Probabilistic and Robust Engineering Design
- Remote Sensing and LiDAR Applications
- Scientific Research and Discoveries
- Reproductive tract infections research
- Survey Methodology and Nonresponse
- Statistical and numerical algorithms
- Genetic and phenotypic traits in livestock
- X-ray Diffraction in Crystallography
- Crystallization and Solubility Studies
- Species Distribution and Climate Change
- Remote Sensing in Agriculture
- Data Analysis with R
- Advanced Malware Detection Techniques
- Advanced Causal Inference Techniques
- Network Security and Intrusion Detection
Monash University
2022-2025
Tennessee Technological University
2024
University of Cambridge
2023
Museum of London Archaeology
2023
Philadelphia College of Osteopathic Medicine
2023
University of Toronto
2015-2021
University of Iceland
2020
Royal Holloway University of London
2020
Reykjavík Energy (Iceland)
2020
UNSW Sydney
2017-2018
In this paper, we introduce a new concept for constructing prior distributions. We exploit the natural nested structure inherent to many model components, which defines the model component to be a flexible extension of a base model. Proper priors are defined to penalise the complexity induced by deviating from the simpler base model, and are formulated after the input of a user-defined scaling parameter for that model component, both in the univariate and the multivariate case. These priors are invariant to reparameterisations, have a natural connection to Jeffreys' priors, and are designed to support...
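The construction described here has a simple closed form in a common special case. As an illustrative sketch (not code from the paper): for the precision τ of a Gaussian random effect, the distance from the base model (no random effect) is d(τ) = 1/√τ, and an exponential prior on that distance yields the density below, calibrated through a user-defined tail statement P(σ > u) = α.

```python
import numpy as np
from scipy.integrate import quad

def pc_prior_precision(tau, u=1.0, alpha=0.05):
    """Penalised-complexity prior density for a precision tau.

    Derived by placing an exponential prior on the distance
    d(tau) = 1 / sqrt(tau) from the base model (tau -> infinity),
    calibrated so that P(sigma > u) = alpha with sigma = 1/sqrt(tau).
    """
    lam = -np.log(alpha) / u
    return 0.5 * lam * tau ** (-1.5) * np.exp(-lam / np.sqrt(tau))

# The density integrates to one ...
total, _ = quad(pc_prior_precision, 0, np.inf)
# ... and P(sigma > u) = P(tau < 1/u^2) recovers alpha (here u = 1).
tail, _ = quad(pc_prior_precision, 0, 1.0)
```

Changing variables to σ = 1/√τ shows this is simply an exponential prior on the standard deviation, which is why the calibration statement is so direct.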
The widely recommended procedure of Bayesian model averaging is flawed in the M-open setting, in which the true data-generating process is not one of the candidate models being fit. We take the idea of stacking from the point estimation literature and generalize it to the combination of predictive distributions, extending the utility function to any proper scoring rule, and using Pareto smoothed importance sampling to efficiently compute the required leave-one-out posterior distributions and regularization to get more stability. We compare several...
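A minimal sketch of the stacking optimization described here, assuming the leave-one-out log predictive densities have already been computed (in practice via Pareto smoothed importance sampling); the softmax parameterization and the toy models are illustrative choices, not the paper's implementation.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import logsumexp, softmax

def stacking_weights(lpd):
    """Stack K predictive distributions by maximizing the summed log score
    of the weighted mixture.  lpd is an (n, K) array with
    lpd[i, k] = log p(y_i | y_{-i}, model k), the leave-one-out densities."""
    n, K = lpd.shape
    def neg_log_score(z):
        # Softmax over (K-1) free parameters keeps weights on the simplex.
        w = softmax(np.append(z, 0.0))
        return -np.sum(logsumexp(lpd + np.log(w), axis=1))
    res = minimize(neg_log_score, np.zeros(K - 1), method="BFGS")
    return softmax(np.append(res.x, 0.0))

# Toy example: model 0 is the true data model, model 1 is badly misspecified.
rng = np.random.default_rng(0)
y = rng.normal(0.0, 1.0, size=200)
lpd = np.column_stack([
    -0.5 * np.log(2 * np.pi) - 0.5 * y**2,          # N(0, 1): true model
    -0.5 * np.log(2 * np.pi) - 0.5 * (y - 5)**2,    # N(5, 1): misspecified
])
w = stacking_weights(lpd)
```

Because the score is maximized over the mixture rather than per model, stacking can put all its weight on one model or genuinely combine several, unlike posterior model probabilities in the M-open setting.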
A key sticking point of Bayesian analysis is the choice of prior distribution, and there is a vast literature on potential defaults, including uniform priors, Jeffreys' priors, reference priors, maximum entropy priors, and weakly informative priors. These methods, however, often manifest a conceptual tension in prior modeling: a model encoding true prior information should be chosen without reference to the measurement process, but almost all common prior modeling techniques are implicitly motivated by a reference likelihood. In this paper we resolve this apparent paradox...
In recent years, disease mapping studies have become a routine application within geographical epidemiology and are typically analysed within a Bayesian hierarchical model formulation. A variety of model formulations for the latent level have been proposed, but all come with inherent issues. In the classical BYM (Besag, York and Mollié) model, the spatially structured component cannot be seen independently from the unstructured component. This makes prior definitions for the hyperparameters of the two random effects challenging. There...
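The scaling issue behind this difficulty can be illustrated concretely. A hedged sketch on a toy path graph (not code from the paper): rescaling the ICAR precision matrix so that the geometric mean of its marginal variances equals one makes priors on the structured effect's precision mean the same thing on any graph.

```python
import numpy as np

# Toy adjacency matrix for a path graph of 5 regions: 0-1-2-3-4.
A = np.zeros((5, 5))
for i in range(4):
    A[i, i + 1] = A[i + 1, i] = 1

# ICAR precision matrix Q = D - A (rank deficient: constant null space).
Q = np.diag(A.sum(axis=1)) - A

# Marginal variances under the sum-to-zero constraint come from the
# Moore-Penrose generalized inverse of Q.
var = np.diag(np.linalg.pinv(Q))

# Scale Q so the geometric mean of the marginal variances is 1.
scale = np.exp(np.mean(np.log(var)))
Q_scaled = scale * Q
var_scaled = np.diag(np.linalg.pinv(Q_scaled))
```

Without this scaling, the typical marginal variance of the structured effect depends on the graph, so the same hyperprior implies different amounts of smoothing in different maps.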
Priors are important for achieving proper posteriors with physically meaningful covariance structures for Gaussian random fields (GRFs), since the likelihood typically only provides limited information about the covariance structure under in-fill asymptotics. We extend the recent penalized complexity prior framework and develop a principled joint prior for the range and the marginal variance of one-dimensional, two-dimensional, and three-dimensional Matérn GRFs with fixed smoothness. The prior is weakly informative and penalizes complexity by shrinking the range toward...
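In this framework the hyperparameters follow from two user-specified tail statements. A sketch of that calibration (function names are illustrative; the density of the range is written out so the calibration can be checked in closed form):

```python
import numpy as np

def pc_matern_params(rho0, alpha1, sigma0, alpha2, d=2):
    """Hyperparameters of a penalized-complexity prior for a Matern GRF,
    calibrated by P(range < rho0) = alpha1 and P(sd > sigma0) = alpha2.
    The prior shrinks the range toward infinity and the sd toward zero."""
    lam1 = -np.log(alpha1) * rho0 ** (d / 2)
    lam2 = -np.log(alpha2) / sigma0
    return lam1, lam2

def pc_range_density(rho, lam1, d=2):
    """pi(rho) = (d/2) lam1 rho^(-d/2 - 1) exp(-lam1 rho^(-d/2))."""
    return (d / 2) * lam1 * rho ** (-d / 2 - 1) * np.exp(-lam1 * rho ** (-d / 2))

lam1, lam2 = pc_matern_params(rho0=0.1, alpha1=0.05, sigma0=1.0, alpha2=0.05)
# Closed-form CDF check: P(range < rho0) = exp(-lam1 * rho0^(-d/2)) = alpha1.
p_small_range = np.exp(-lam1 * 0.1 ** (-1.0))
```

The marginal-variance part is an exponential prior on the standard deviation, so both tail statements translate into single rate parameters.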
Bayesian data analysis is about more than just computing a posterior distribution, and Bayesian visualization is about more than trace plots of Markov chains. Practical Bayesian data analysis, like all data analysis, is an iterative process of model building, inference, model checking and evaluation, and model expansion. Visualization is helpful in each of these stages of the workflow, and it is indispensable when drawing inferences from the types of modern, high-dimensional models that are used by applied researchers.
Air pollution is a leading global disease risk factor. Tracking progress (e.g., for the Sustainable Development Goals) requires accurate, spatially resolved, routinely updated exposure estimates. A Bayesian hierarchical model was developed to estimate annual average fine particle (PM2.5) concentrations at 0.1° × 0.1° spatial resolution globally for 2010-2016. The model incorporated spatially varying relationships between 6003 ground measurements from 117 countries, satellite-based estimates, and other predictors. Model...
Markov chain Monte Carlo is a key computational tool in Bayesian statistics, but it can be challenging to monitor the convergence of an iterative stochastic algorithm. In this paper we show that the convergence diagnostic $\widehat{R}$ of Gelman and Rubin (1992) has serious flaws. Traditional $\widehat{R}$ will fail to correctly diagnose convergence failures when chains have a heavy tail or when the variance varies across chains. We propose an alternative rank-based diagnostic that fixes these problems. We also introduce a collection of quantile-based local efficiency measures, along...
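The rank-based fix can be sketched compactly: rank-normalize all draws jointly into normal scores, then apply split-$\widehat{R}$ to the scores. A minimal illustration (not the reference implementation; the toy chains and thresholds are illustrative):

```python
import numpy as np
from scipy.stats import rankdata, norm

def split_rhat(x):
    """Split-R-hat for an (M, N) array of M chains with N draws each:
    split every chain in half, then compare between- and within-chain
    variances across the 2M resulting half-chains."""
    M, N = x.shape
    half = N // 2
    chains = np.vstack([x[:, :half], x[:, half:2 * half]])
    n = chains.shape[1]
    W = chains.var(axis=1, ddof=1).mean()          # within-chain variance
    B = n * chains.mean(axis=1).var(ddof=1)        # between-chain variance
    var_hat = (n - 1) / n * W + B / n
    return np.sqrt(var_hat / W)

def rank_normalized_rhat(x):
    """Rank-normalize all draws jointly, then compute split-R-hat;
    this is robust to heavy tails and to chains with unequal variances."""
    z = norm.ppf((rankdata(x) - 3 / 8) / (x.size + 1 / 4)).reshape(x.shape)
    return split_rhat(z)

rng = np.random.default_rng(1)
good = rng.normal(size=(4, 1000))                       # well-mixed chains
bad = good + np.array([[0.0], [0.0], [0.0], [3.0]])     # one chain stuck elsewhere
```

Because the diagnostic operates on ranks, a chain with infinite variance or a shifted location still produces a clear signal, which plain $\widehat{R}$ can miss.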
This paper introduces a new method for performing computational inference on log-Gaussian Cox processes. The likelihood is approximated directly by making use of a continuously specified Gaussian random field. We show that for sufficiently smooth Gaussian random field prior distributions, the approximation can converge with arbitrarily high order, whereas an approximation based on a counting process on a partition of the domain achieves only first-order convergence. The results improve upon the general theory of convergence for stochastic partial...
Verifying the correctness of Bayesian computation is challenging. This is especially true for complex models that are common in practice, as these require sophisticated model implementations and algorithms. In this paper we introduce \emph{simulation-based calibration} (SBC), a general procedure for validating inferences from Bayesian algorithms capable of generating posterior samples. SBC not only identifies inaccurate computation and inconsistencies in model implementations but also provides graphical summaries that can indicate the nature of the problems that arise. We...
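The core of the procedure fits in a few lines when exact posterior sampling is available. A sketch on a conjugate normal model (illustrative, not the paper's code): if the computation is correct, the rank of each prior draw among its own posterior draws is uniform.

```python
import numpy as np

# Simulation-based calibration on a model with a known exact posterior:
# theta ~ N(0, 1),  y_i | theta ~ N(theta, 1)  for i = 1, ..., n_obs.
rng = np.random.default_rng(2)
n_obs, L, n_sims = 10, 99, 1000
ranks = np.empty(n_sims, dtype=int)
for s in range(n_sims):
    theta = rng.normal()                            # draw from the prior
    y = rng.normal(theta, 1.0, size=n_obs)          # simulate data
    # Exact conjugate posterior: N(sum(y) / (n + 1), 1 / (n + 1)).
    post = rng.normal(y.sum() / (n_obs + 1),
                      np.sqrt(1.0 / (n_obs + 1)), size=L)
    ranks[s] = np.sum(post < theta)                 # rank in {0, ..., L}

# Under correct computation the L + 1 possible ranks are uniform; a
# chi-squared statistic near its degrees of freedom indicates no problem.
counts = np.bincount(ranks, minlength=L + 1)
chi2 = np.sum((counts - n_sims / (L + 1)) ** 2 / (n_sims / (L + 1)))
```

In practice the exact posterior draws would be replaced by the output of the algorithm being validated, and systematic deviations from uniformity (U-shapes, skews) diagnose the kind of error present.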
The Bayesian approach to data analysis provides a powerful way to handle uncertainty in all observations, model parameters, and model structure using probability theory. Probabilistic programming languages make it easier to specify and fit Bayesian models, but this still leaves us with many options regarding constructing, evaluating, and using these models, along with many remaining challenges in computation. Using Bayesian inference to solve real-world problems requires not only statistical skills, subject matter knowledge, and programming, but also awareness of...
Gaussian random fields (GRFs) play an important part in spatial modelling, but can be computationally infeasible for general covariance structures. An efficient approach is to specify GRFs via stochastic partial differential equations (SPDEs) and derive Gaussian Markov random field (GMRF) approximations of the solutions. We consider the construction of a class of non-stationary GRFs with varying local anisotropy, where the anisotropy is introduced by allowing the coefficients of the SPDE to vary with position. This is done by using a form of diffusion equation...
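A heavily simplified 1D sketch of the idea (finite differences rather than the finite element construction used in this line of work, and with only approximate scaling of the precision): letting the SPDE coefficient κ(s) vary with position produces a sparse precision matrix for a non-stationary field, with longer correlation and larger variance where κ is small.

```python
import numpy as np

# Discretize (kappa(s)^2 - d^2/ds^2) x(s) = W(s) on a 1D grid.
n, h = 400, 0.05
s = np.arange(n) * h
kappa = np.where(s < n * h / 2, 2.0, 0.5)   # short range left, long range right

# Operator matrix: kappa^2 on the diagonal plus a second-difference stencil.
A = np.diag(kappa**2 + 2 / h**2)
idx = np.arange(n - 1)
A[idx, idx + 1] = A[idx + 1, idx] = -1 / h**2

# Precision of the discretized solution (white-noise right-hand side);
# the factor h is an approximate discretization scaling.
Q = h * A.T @ A
var = np.diag(np.linalg.inv(Q))
```

The precision matrix is banded, so in a real implementation sparse Cholesky factorization makes inference cheap even for large grids; the varying-coefficient construction here is the 1D analogue of the locally anisotropic diffusion described in the abstract.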
A large number of statistical models are "doubly intractable": the likelihood normalising term, which is a function of the model parameters, is intractable, as is the marginal likelihood (model evidence). This means that standard inference techniques to sample from the posterior, such as Markov chain Monte Carlo (MCMC), cannot be used. Examples include, but are not confined to, massive Gaussian random fields, autologistic models, and exponential random graph models. Approximate schemes based on MCMC techniques, Approximate Bayesian...
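Of the approximate schemes mentioned, Approximate Bayesian Computation (ABC) is the simplest to sketch. A minimal rejection-ABC example on a toy model where the exact answer is known (illustrative only; genuinely doubly-intractable targets require simulating from the model itself rather than evaluating its likelihood):

```python
import numpy as np

# Rejection ABC: keep prior draws whose simulated data lie close to the
# observed data under a summary statistic, never evaluating the likelihood.
rng = np.random.default_rng(3)
n = 50
y_obs = rng.normal(1.0, 1.0, size=n)
s_obs = y_obs.mean()                          # summary statistic

theta = rng.normal(0.0, 1.0, size=200_000)    # draws from the prior
# Simulate the summary directly: mean of n N(theta, 1) draws is N(theta, 1/n).
s_sim = rng.normal(theta, np.sqrt(1.0 / n))
accepted = theta[np.abs(s_sim - s_obs) < 0.05]  # tolerance epsilon

# With a conjugate prior the exact posterior mean is n * ybar / (n + 1),
# which the ABC sample should approximate as epsilon -> 0.
exact_mean = n * y_obs.mean() / (n + 1)
```

The tolerance trades bias against acceptance rate; the sample mean here is a sufficient statistic, so the only approximation error comes from the nonzero tolerance.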
Gaussian random fields (GRFs) are the most common way of modeling structured spatial random effects in statistics. Unfortunately, their high computational cost renders the direct use of GRFs impractical for large problems, and approximations are commonly used. In this paper, we compare two approximations to GRFs with Matérn covariance functions: the kernel convolution approximation and the Gaussian Markov random field representation of an associated stochastic partial differential equation. We show that the second approach is a natural way to tackle the problem and is better than...
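For reference, the Matérn covariance family that both approximations target can be written down directly; here in the √(2ν) distance scaling (one of several parameterizations in use). The smoothness ν = 1/2 recovers the exponential covariance, a convenient closed-form check.

```python
import numpy as np
from scipy.special import kv, gamma

def matern(d, sigma2=1.0, rho=1.0, nu=1.5):
    """Matern covariance at distance d with marginal variance sigma2,
    range parameter rho, and smoothness nu, using the sqrt(2 nu)
    scaling of the distance."""
    d = np.asarray(d, dtype=float)
    x = np.sqrt(2 * nu) * d / rho
    # kv is the modified Bessel function of the second kind.
    return np.where(x > 0,
                    sigma2 * 2 ** (1 - nu) / gamma(nu) * x ** nu * kv(nu, x),
                    sigma2)

# Check: nu = 1/2 gives the exponential covariance sigma2 * exp(-d / rho).
d = np.linspace(0.01, 3, 50)
exact = np.exp(-d)
approx = matern(d, nu=0.5)
```

Evaluating this dense covariance for m locations costs O(m^2) storage and O(m^3) factorization time, which is the bottleneck both approximations in the paper are designed to avoid.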
In an earlier article in this journal, Gronau and Wagenmakers (2018) discuss some problems with leave-one-out cross-validation (LOO) for Bayesian model selection. However, the variant of LOO that they discuss is at odds with a long literature on how to use cross-validation well. In this discussion, we consider practical data analysis from the perspective of the need to abandon the idea that there is a device that will produce a single-number decision rule.
A central theme in the field of survey statistics is estimating population-level quantities through data coming from potentially non-representative samples of the population. Multilevel regression and poststratification (MRP), a model-based approach, is gaining traction against the traditional weighted approach for survey estimates. MRP estimates are susceptible to bias if there is an underlying structure that the methodology does not capture. This work aims to provide a new framework for specifying structured prior...
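The poststratification step that MRP estimates rest on can be sketched in a few lines (illustrative numbers; in MRP the cell-level estimates come from a fitted multilevel regression rather than being supplied directly):

```python
import numpy as np

# Poststratification: combine cell-level model estimates using census
# counts for each cell of the poststratification table (e.g. cells defined
# by crossing age group, education, and region).
cell_estimate = np.array([0.50, 0.30, 0.60, 0.20])  # P(outcome) per cell
cell_count = np.array([200, 300, 100, 400])         # population size per cell

population_estimate = np.sum(cell_count * cell_estimate) / cell_count.sum()
```

The weighting corrects for non-representativeness at the cell level, which is exactly where unmodeled structure in the cell estimates (the bias discussed above) propagates into the population estimate.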