- Statistical Methods and Inference
- Advanced Statistical Methods and Models
- Bayesian Methods and Mixture Models
- Statistical Methods and Bayesian Inference
- Carcinogens and Genotoxicity Assessment
- Statistical and Numerical Algorithms
- Neural Networks and Applications
- Control Systems and Identification
- Spectroscopy and Chemometric Analyses
- Probabilistic and Robust Engineering Design
- Statistical Methods in Clinical Trials
- Maternal Mental Health During Pregnancy and Postpartum
- Advanced Clustering Algorithms Research
- Soil Geostatistics and Mapping
- Radiation Effects and Dosimetry
- Advanced Causal Inference Techniques
- Data Visualization and Analytics
- Data Management and Algorithms
- Fault Detection and Control Systems
- Advanced Statistical Process Monitoring
- Optimal Experimental Design Methods
- Data Analysis with R
- Theoretical and Computational Physics
- Gene Expression and Cancer Classification
- Time Series Analysis and Forecasting
Durham University
2016-2025
Wageningen University & Research
2023
King Abdulaziz University
2019
Rensselaer Polytechnic Institute
2019
Ollscoil na Gaillimhe – University of Galway
2005-2016
Science Foundation Ireland
2007
Ludwig-Maximilians-Universität München
2003-2005
Blood lactate markers are used as summary measures of the underlying model of an athlete's blood lactate response to increasing work rate. Exercise physiologists use these endurance markers, which typically correspond to a work rate in a region of high curvature of the lactate curve, to predict and compare endurance ability. A short theoretical background of the commonly used markers is given, and algorithms are provided for their calculation. To date, no free software exists that allows the sports scientist to calculate these markers. In this paper, software developed precisely for this purpose will be introduced...
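As an illustration of the kind of computation involved, here is a minimal Python sketch (not the software described in the paper): it fits a cubic polynomial to hypothetical lactate measurements and locates the work rate of maximum curvature, one common way of defining an endurance marker. The data, the cubic model, and the curvature criterion are all illustrative assumptions.

```python
import numpy as np

# Illustrative (hypothetical) incremental-test data: work rate (W) and blood lactate (mmol/L)
work = np.array([100, 130, 160, 190, 220, 250, 280], dtype=float)
lactate = np.array([1.1, 1.2, 1.4, 1.9, 2.8, 4.6, 7.5])

# Fit a cubic polynomial as a smooth model of the lactate response
coef = np.polyfit(work, lactate, deg=3)
p = np.poly1d(coef)
d1, d2 = p.deriv(1), p.deriv(2)

# Marker: work rate of maximum curvature of the fitted curve,
# kappa(x) = |f''(x)| / (1 + f'(x)^2)^(3/2), searched on a fine grid
grid = np.linspace(work.min(), work.max(), 2001)
curvature = np.abs(d2(grid)) / (1 + d1(grid) ** 2) ** 1.5
marker = grid[np.argmax(curvature)]
print(f"Work rate at maximum curvature: {marker:.1f} W")
```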
In the event of a radiological accident or incident, the aim of biological dosimetry is to convert the yield of a specific biomarker of exposure to ionizing radiation into an absorbed dose. Since the 1980s, various tools have been used to deal with the statistical procedures needed for dosimetry, and in general those that performed several calculations for different biomarkers were based on closed-source software. Here we present a new open program, Biodose Tools, which has been developed under the umbrella of RENEB (Running the European Network...
A novel approach is proposed for analysing multilevel multivariate response data. The approach is based on identifying a one-dimensional latent variable spanning the space of responses, which then induces the correlation between upper-level units. This variable, which can be thought of as a random effect, is estimated along with the other model parameters using an EM algorithm, as seen in the tradition of the 'nonparametric maximum likelihood' estimator for two-level linear (univariate response) models. Simulations and real data examples from...
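One possible way to write such a model (the notation here is ours, not necessarily the paper's) is sketched below: a scalar latent variable shared within an upper-level unit, with an NPML-style discrete mixing distribution estimated by EM.

```latex
% Response r of lower-level unit j nested in upper-level unit i, with a
% scalar latent variable z_i shared within the upper-level unit:
\[
  y_{ijr} = \beta_r + \gamma_r z_i + \varepsilon_{ijr},
  \qquad \varepsilon_{ijr} \sim N(0, \sigma_r^2).
\]
% NPML leaves the mixing distribution of z_i unspecified and approximates it
% by mass points z_1,\dots,z_K with masses \pi_1,\dots,\pi_K, so the marginal
% likelihood becomes a finite mixture maximized by an EM algorithm:
\[
  L = \prod_i \sum_{k=1}^{K} \pi_k \prod_{j}\prod_{r}
      \phi\!\left(y_{ijr};\, \beta_r + \gamma_r z_k,\, \sigma_r^2\right).
\]
```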
Individuals with an at-risk mental state (ARMS) often experience hallucinatory-type experiences, which we refer to as unusual sensory experiences (USE). However, it is not known whether individuals want to know more about USE or to discuss these in therapy. Our preferences study asked individuals who had been referred into a treatment trial for ARMS whether they consider attention to USE important. Ninety-four service users of services within two UK National Health Service (NHS) health trusts completed the study-specific,...
Within the field of cytogenetic biodosimetry, Poisson regression is the classical approach for modeling the number of chromosome aberrations as a function of radiation dose. However, it is common to find data that exhibit overdispersion. In practice, the assumption of equidispersion may be violated due to unobserved heterogeneity in the cell population, which will render the variance of the observed aberration counts larger than their mean, and/or the frequency of zeros greater than expected under the Poisson distribution. This phenomenon is observable in both full‐...
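A minimal Python sketch of the general idea, not the paper's analysis: fit a Poisson GLM to hypothetical dose-response count data, check the Pearson dispersion, and fit a negative binomial model as one common alternative when overdispersion is present. Data, covariates, and the fixed negative binomial shape are illustrative assumptions.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical calibration-type data: dose (Gy) and aberration counts per cell group
rng = np.random.default_rng(1)
dose = np.repeat([0.0, 0.5, 1.0, 2.0, 3.0, 4.0], 50)
mu = 0.001 + 0.05 * dose + 0.06 * dose**2
# extra-Poisson variation injected via a gamma-distributed multiplicative rate
counts = rng.poisson(mu * rng.gamma(shape=2.0, scale=0.5, size=mu.size))

X = sm.add_constant(np.column_stack([dose, dose**2]))
pois = sm.GLM(counts, X, family=sm.families.Poisson()).fit()

# Simple dispersion check: Pearson chi^2 / df should be close to 1 under equidispersion
dispersion = pois.pearson_chi2 / pois.df_resid
print(f"Pearson dispersion: {dispersion:.2f}")

# If clearly > 1, a negative binomial model is one common alternative
# (alpha fixed at an illustrative value here)
nb = sm.GLM(counts, X, family=sm.families.NegativeBinomial(alpha=1.0)).fit()
print(nb.summary())
```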
For the analysis of multivariate data with an approximately one-dimensional latent structure, it is suggested to model this latent variable by a random effect, allowing for the use of mixed model methodology for dimension reduction purposes. We implement this idea through a mixture-based approach to the estimation of random effect models, hence conveniently enabling the clustering of observations along a linear subspace, and derive the estimators required for the ensuing EM algorithm under several error variance parameterizations. A simulation study...
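Below is a bare-bones numerical sketch of one such EM, written by us for illustration only (the function name and all details are ours, and it uses a single spherical error variance rather than the parameterizations studied in the paper): observations are modeled as lying on a line indexed by a discrete random effect, and the posterior mass-point weights provide a clustering along that line.

```python
import numpy as np

def em_latent_line(X, K=3, n_iter=200):
    """Minimal EM sketch: fit x_i ~ alpha + beta * z_k + noise, where z_k are
    K mass points of a discrete (NPML-style) random effect with masses pi_k.
    Posterior weights w[i, k] then cluster observations along the fitted line.
    Note: the scale split between beta and z is not separately identified here."""
    n, p = X.shape
    alpha = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - alpha, full_matrices=False)
    beta = Vt[0]                                   # initial direction: first PC
    z = np.quantile((X - alpha) @ beta, np.linspace(0.1, 0.9, K))
    pi = np.full(K, 1.0 / K)
    sigma2 = X.var()

    for _ in range(n_iter):
        # E-step: responsibilities of each mass point for each observation
        resid = X[:, None, :] - alpha - z[None, :, None] * beta     # (n, K, p)
        logw = np.log(pi) - 0.5 * (resid ** 2).sum(axis=2) / sigma2
        logw -= logw.max(axis=1, keepdims=True)
        w = np.exp(logw)
        w /= w.sum(axis=1, keepdims=True)

        # M-step: masses, mass points, intercept, loading, error variance
        pi = w.mean(axis=0)
        proj = (X - alpha) @ beta
        z = (w * proj[:, None]).sum(axis=0) / (beta @ beta * w.sum(axis=0))
        zbar = (w * z).sum() / n
        alpha = X.mean(axis=0) - zbar * beta
        num = ((w * z).sum(axis=1)[:, None] * (X - alpha)).sum(axis=0)
        den = (w * z ** 2).sum()
        beta = num / den
        resid = X[:, None, :] - alpha - z[None, :, None] * beta
        sigma2 = (w[:, :, None] * resid ** 2).sum() / (n * p)

    return alpha, beta, z, pi, w

# usage on synthetic data with a one-dimensional latent structure
rng = np.random.default_rng(1)
zt = np.repeat([-2.0, 0.0, 2.0], 100)
X = np.outer(zt, [1.0, 0.5, -0.3]) + rng.normal(scale=0.3, size=(300, 3))
alpha, beta, z, pi, w = em_latent_line(X, K=3)
labels = w.argmax(axis=1)
```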
For speed–flow data, which are intensively discussed in transportation science, common nonparametric regression models of the type y = m(x) + noise turn out to be inadequate, since a simple functional relationship cannot capture the essential relationship between predictor and response. Instead, a more general setting is required, allowing for multifunctions rather than functions. The tool proposed is conditional mode estimation which, in the form of local modes, yields several branches that correspond to the modes. A...
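A crude Python sketch of the underlying idea, under our own simplifying assumptions (Gaussian kernels, a fixed evaluation grid, synthetic two-branch data): at each predictor value, all local maxima of a kernel-weighted conditional density estimate are kept, giving several "regression" branches instead of one mean curve.

```python
import numpy as np

def conditional_modes(x, y, x_grid, y_grid, hx, hy):
    """At each x0, build a kernel-weighted density estimate of y and keep
    every local maximum, so the result can have several branches."""
    branches = []
    for x0 in x_grid:
        wx = np.exp(-0.5 * ((x - x0) / hx) ** 2)            # weights in predictor space
        dens = np.array([np.sum(wx * np.exp(-0.5 * ((y - y0) / hy) ** 2))
                         for y0 in y_grid])
        is_max = (dens[1:-1] > dens[:-2]) & (dens[1:-1] > dens[2:])
        for y0 in y_grid[1:-1][is_max]:
            branches.append((x0, y0))
    return np.array(branches)

# usage on hypothetical speed-flow-like data with two branches
rng = np.random.default_rng(0)
flow = rng.uniform(0, 2000, 800)
speed = np.where(rng.random(800) < 0.5,
                 110 - 0.01 * flow, 15 + 0.02 * flow) + rng.normal(0, 4, 800)
modes = conditional_modes(flow, speed,
                          x_grid=np.linspace(100, 1900, 40),
                          y_grid=np.linspace(0, 130, 200), hx=100.0, hy=4.0)
```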
We propose weighted repeated median filters and smoothers for robust nonparametric regression in general and for online signal extraction from time series in particular. The new methods allow us to remove outlying sequences and to preserve discontinuities (shifts) in the underlying function (the signal) in the presence of local linear trends. Suitable weighting of the observations according to their distances in the design space reduces the bias arising from nonlinearities and improves the efficiency by using larger bandwidths, while still distinguishing...
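For illustration, here is a Python sketch of the unweighted building block only (the observation weighting that gives the paper its name is deliberately omitted, and all names and data are ours): a repeated median line is fitted in a moving window and evaluated at the current time point.

```python
import numpy as np

def repeated_median_level(y, x):
    """Repeated median regression in one window: slope as a median of medians,
    level evaluated at the last design point (online signal extraction)."""
    n = len(y)
    slopes = np.empty(n)
    for i in range(n):
        dx, dy = x - x[i], y - y[i]
        mask = dx != 0
        slopes[i] = np.median(dy[mask] / dx[mask])
    beta = np.median(slopes)
    mu = np.median(y - beta * x)            # intercept via median of residual levels
    return mu + beta * x[-1]                # level at the current time point

def rm_filter(y, width=21):
    """Apply the (unweighted) repeated median window-wise to a series."""
    x = np.arange(width, dtype=float)
    out = np.full(len(y), np.nan)
    for t in range(width - 1, len(y)):
        out[t] = repeated_median_level(y[t - width + 1:t + 1], x)
    return out

# usage: noisy signal with a level shift and an outlying patch
rng = np.random.default_rng(2)
signal = np.concatenate([np.linspace(0, 5, 150), np.linspace(10, 12, 150)])
y = signal + rng.normal(0, 0.5, 300)
y[200:205] += 8
smoothed = rm_filter(y, width=21)
```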
Reliable dose estimation is an important factor in the appropriate dosimetric triage categorization of exposed individuals to support radiation emergency response. Following work done under the EU FP7 MULTIBIODOSE and RENEB projects, formal methods for defining uncertainties on biological dose estimates are compared using simulated and real data from recent exercises. The results demonstrate that a Bayesian method of uncertainty assessment is the most appropriate, even in the absence of detailed prior information. The...
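To show the flavour of a Bayesian dose uncertainty assessment, here is a small Python sketch under our own assumptions (a known linear-quadratic calibration curve with made-up coefficients, a Poisson likelihood for the total aberration count, and a flat prior on dose); it is not the method compared in the paper.

```python
import numpy as np

# Hypothetical calibration coefficients (aberrations per cell): lambda(D) = C + a*D + b*D^2
C, a, b = 0.001, 0.02, 0.06
y_obs, n_cells = 25, 500                      # observed aberrations in scored cells

dose = np.linspace(0, 6, 2001)                # grid over candidate doses (Gy)
lam = n_cells * (C + a * dose + b * dose**2)  # expected total count at each dose

# Poisson log-likelihood + flat prior on dose -> normalised grid posterior
loglik = y_obs * np.log(lam) - lam
post = np.exp(loglik - loglik.max())
post /= np.trapz(post, dose)

cdf = np.cumsum(post) * (dose[1] - dose[0])
point = dose[np.argmax(post)]
lo, hi = dose[np.searchsorted(cdf, 0.025)], dose[np.searchsorted(cdf, 0.975)]
print(f"dose ~ {point:.2f} Gy, 95% credible interval ({lo:.2f}, {hi:.2f}) Gy")
```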
The mean shift is a simple but powerful tool emerging from the computer science literature which shifts a point to the local center of mass around this point. It has been used as a building block for several nonparametric unsupervised learning techniques, such as density mode estimation, clustering, and the estimation of principal curves. Due to the localized way of averaging, it requires the specification of a window size in the form of a bandwidth (matrix). This paper proposes the use of a so-called self-coverage measure as a general device for the selection...
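The core update is easy to state in code; the following Python sketch implements only the basic mean shift iteration with a fixed, hand-picked bandwidth (the paper's self-coverage bandwidth selection is not implemented here, and the data are synthetic).

```python
import numpy as np

def mean_shift(X, x0, h, n_iter=100, tol=1e-6):
    """Basic mean shift: repeatedly move a point to the Gaussian-kernel-weighted
    mean (local centre of mass) of the data around it."""
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        w = np.exp(-0.5 * np.sum((X - x) ** 2, axis=1) / h**2)
        x_new = (w[:, None] * X).sum(axis=0) / w.sum()
        if np.linalg.norm(x_new - x) < tol:
            break
        x = x_new
    return x              # converges towards a local density mode

# usage: two Gaussian blobs; starting points drift to the nearest mode
rng = np.random.default_rng(3)
X = np.vstack([rng.normal([0, 0], 0.5, (200, 2)),
               rng.normal([4, 4], 0.5, (200, 2))])
print(mean_shift(X, x0=[1.0, 1.0], h=0.7))
print(mean_shift(X, x0=[3.0, 3.5], h=0.7))
```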
Over the last decade, the γ-H2AX focus assay, which exploits the phosphorylation of the H2AX histone following DNA double-strand breaks, has made considerable progress towards acceptance as a reliable biomarker for exposure to ionizing radiation. While the existing literature has convincingly demonstrated a dose–response effect, and has also presented approaches to dose estimation based on appropriately defined calibration curves, more widespread practical use is still hampered by a certain lack of discussion and agreement...
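As a reminder of what inverse use of a calibration curve involves, one standard delta-method uncertainty propagation is sketched below under the assumption of a linear calibration curve fitted from reference data; the paper's own treatment may differ.

```latex
% Assumption: linear calibration curve y = c + \alpha D; \hat{y} is the measured
% focus yield of the new sample, independent of the calibration fit.
\[
  \hat{D} = \frac{\hat{y} - \hat{c}}{\hat{\alpha}}, \qquad
  \operatorname{Var}(\hat{D}) \approx \frac{1}{\hat{\alpha}^{2}}
  \left[ \operatorname{Var}(\hat{y}) + \operatorname{Var}(\hat{c})
       + \hat{D}^{2}\operatorname{Var}(\hat{\alpha})
       + 2\hat{D}\operatorname{Cov}(\hat{c},\hat{\alpha}) \right]
\]
% (delta method; for curvilinear calibration curves the same expansion applies
% with the corresponding partial derivatives).
```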
While several statistical tests exist for detecting zero modification in count data regression models, these rely on asymptotic results and do not transparently distinguish between zero inflation and zero deflation. In this manuscript, a novel non-asymptotic test is introduced which makes direct use of the fact that the distribution of the number of zeros under the null hypothesis of no zero modification can be described by a Poisson-binomial distribution. The computation of critical values from this distribution requires estimation of the mean parameter...
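The Poisson-binomial idea can be illustrated in a few lines of Python. This sketch treats the fitted Poisson means as known (in the paper they must be estimated, which is exactly the complication the abstract alludes to) and simply asks how extreme the observed number of zeros is under the null.

```python
import numpy as np

def poisson_binomial_pmf(p):
    """Exact pmf of the number of successes of independent Bernoulli(p_i) trials
    (dynamic-programming convolution)."""
    pmf = np.array([1.0])
    for pi in p:
        pmf = np.convolve(pmf, [1 - pi, pi])
    return pmf

# Hypothetical Poisson means under H0, and data with some extra zeros injected
rng = np.random.default_rng(4)
lam = rng.uniform(0.2, 2.0, 150)
y = rng.poisson(lam)
y[:15] = 0                                    # illustrative zero inflation

p_zero = np.exp(-lam)                         # P(y_i = 0) under the Poisson null
pmf = poisson_binomial_pmf(p_zero)
n0 = int(np.sum(y == 0))

p_upper = pmf[n0:].sum()                      # small value -> evidence for zero inflation
p_lower = pmf[:n0 + 1].sum()                  # small value -> evidence for zero deflation
print(f"observed zeros: {n0}, P(N0 >= n0) = {p_upper:.3f}, P(N0 <= n0) = {p_lower:.3f}")
```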
Scatterplots of traffic speed versus flow have received considerable attention over the past decades due to their characteristic half-moon shape. Modelling data of this type is difficult, as neither variable is actually a function of the other in the sense of causality; rather, both are jointly generated by a third latent variable, which is monotonically related to density. We propose local principal curves (LPCs) as a tool to describe and model speed–flow data which takes this viewpoint into account. We introduce the concept of calibration to determine...
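A bare-bones Python sketch of the local principal curve idea, with our own simplifications (a single forward pass from one starting point, fixed bandwidth and step size, synthetic half-moon data); a full implementation would also follow the opposite direction from the start and handle stopping rules.

```python
import numpy as np

def local_principal_curve(X, x0, h=0.1, step=0.05, n_steps=100):
    """At the current point, take the kernel-weighted local mean, then move a
    small step along the first local principal component; eigenvector signs are
    aligned between steps so the curve keeps moving forward."""
    x = np.asarray(x0, dtype=float)
    direction = None
    curve = []
    for _ in range(n_steps):
        w = np.exp(-0.5 * np.sum((X - x) ** 2, axis=1) / h**2)
        mu = (w[:, None] * X).sum(axis=0) / w.sum()
        cov = np.cov(X.T, aweights=w)
        vals, vecs = np.linalg.eigh(cov)
        gamma = vecs[:, -1]                       # first local eigenvector
        if direction is not None and gamma @ direction < 0:
            gamma = -gamma
        direction = gamma
        x = mu + step * gamma
        curve.append(x)
    return np.array(curve)

# usage on a noisy half-moon-like pattern (stand-in for scaled speed-flow data)
rng = np.random.default_rng(5)
t = rng.uniform(0, np.pi, 600)
X = np.column_stack([np.cos(t), np.sin(t)]) + rng.normal(0, 0.05, (600, 2))
curve = local_principal_curve(X, x0=X[np.argmin(X[:, 0])], h=0.15, step=0.05)
```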
We consider principal curves and surfaces in the context of multivariate regression modelling. For predictor spaces featuring complex dependency patterns between the involved variables, the intrinsic dimensionality of the data tends to be very small due to the high redundancy induced by the dependencies. In situations of this type, it is useful to approximate the high-dimensional predictor space through a low-dimensional manifold (i.e., a curve or surface), and to use the projections onto this manifold as compressed predictors in the regression problem. In the case that the intrinsic dimension equals one, we...
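A minimal linear analogue of the compressed-predictor idea, written by us for illustration: a redundant predictor space is reduced to a single projection index (here via the first principal component; a principal curve would replace the straight line by a smooth curve and use arc-length projections), and the response is then regressed flexibly on that index.

```python
import numpy as np

rng = np.random.default_rng(7)
t = rng.normal(size=500)
X = np.column_stack([t, 2 * t + 0.05 * rng.normal(size=500),
                     -t + 0.05 * rng.normal(size=500)])   # highly redundant predictors
y = np.sin(t) + 0.1 * rng.normal(size=500)

Xc = X - X.mean(axis=0)
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
index = Xc @ Vt[0]                        # one-dimensional compressed predictor

# flexible regression of y on the single index (here: a simple polynomial fit)
coef = np.polyfit(index, y, deg=5)
y_hat = np.polyval(coef, index)
print(f"R^2 on the compressed predictor: {1 - np.var(y - y_hat) / np.var(y):.3f}")
```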
The analysis of high-dimensional data is usually challenging since many standard modelling approaches tend to break down due to the so-called "curse of dimensionality". Dimension reduction techniques, which reduce the data set (explicitly or implicitly) to a smaller number of variables, make the analysis more efficient and are furthermore useful for visualization purposes. However, most dimension reduction techniques require fixing the dimension of the intrinsic low-dimensional subspace in advance. This dimension can be estimated by fractal dimension estimation methods, which exploit...
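One well-known fractal estimator is the correlation dimension; the Python sketch below (our illustration, with an arbitrary choice of radius range) estimates it as the slope of log C(r) against log r, where C(r) is the fraction of point pairs closer than r.

```python
import numpy as np
from scipy.spatial.distance import pdist

def correlation_dimension(X, r_min=None, r_max=None, n_r=20):
    """Grassberger-Procaccia-style estimate: slope of log C(r) against log r,
    where C(r) is the fraction of point pairs closer than r."""
    d = pdist(X)
    if r_min is None:
        r_min = np.quantile(d, 0.05)
    if r_max is None:
        r_max = np.quantile(d, 0.5)
    radii = np.geomspace(r_min, r_max, n_r)
    C = np.array([(d < r).mean() for r in radii])
    slope, _ = np.polyfit(np.log(radii), np.log(C), 1)
    return slope

# usage: 3-D data lying near a 1-D curve should give an estimate close to 1
rng = np.random.default_rng(6)
t = rng.uniform(0, 1, 1000)
X = np.column_stack([t, 2 * t, np.sin(2 * t)]) + rng.normal(0, 0.01, (1000, 3))
print(f"estimated intrinsic dimension: {correlation_dimension(X):.2f}")
```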
We consider situations in which the clustering of some multivariate data is desired and in which the clustering establishes an ordering of the clusters with respect to an underlying latent variable. As our motivating example for a situation where such a technique is desirable, we consider scatterplots of traffic flow and speed, where a pattern of consecutive clusters can be thought of as being linked by a latent variable that is interpretable as traffic density. We focus on latent structures of linear or quadratic shape, and present an estimation methodology based on expectation–maximization which estimates both the subspace...
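One way to write down such a model (our notation, not necessarily the paper's) is sketched below; the quadratic variant simply adds a curvature term to the linear latent structure, and the latent levels attached to the clusters supply the ordering. The estimation mechanics are of the same EM type as in the mixture-based random effect sketch given above.

```latex
% Observations x_i in R^p lie, up to noise, on a linear or quadratic curve
% indexed by a latent level z_k attached to cluster k:
\[
  \text{linear:}\quad  x_i = \alpha + \beta z_k + \varepsilon_i,
  \qquad
  \text{quadratic:}\quad x_i = \alpha + \beta z_k + \gamma z_k^{2} + \varepsilon_i,
  \qquad \varepsilon_i \sim N(0, \Sigma),
\]
% where i belongs to cluster k; the mass points z_k, masses \pi_k and the
% parameters \alpha, \beta, \gamma, \Sigma are estimated jointly by EM, and the
% ordering of the z_k induces the ordering of the clusters.
```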
Purpose: Prenatal sub-optimal nutrition and exposure to maternal stress, anxiety and depression in pregnancy have been linked to increased postnatal morbidity and mortality. Fetal growth is most vulnerable to dietary deficiencies, such as those evident in hyperemesis gravidarum (HG), in early pregnancy. The purpose of this pilot study was to examine the effects of HG on fetal movement profiles as a measure of healthy development in the 3rd trimester of pregnancy, and to assess whether nutritional stress in the mother can be evaluated...
Purpose: The traditional workflow for biological dosimetry based on manual scoring of dicentric chromosomes is very time-consuming. Especially for large-scale scenarios or low-dose exposures, high cell numbers have to be analyzed, requiring alternative scoring strategies. Semi-automatic scoring provides an opportunity to speed up the standard workflow of dicentric dosimetry. Due to automatic metaphase and chromosome detection, the number of chromosomes counted per cell is variable. This can potentially introduce overdispersion, and the statistical methods established for conventional, manual scoring might...
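A quick way to screen per-cell counts for overdispersion is a classical index-of-dispersion test; the Python sketch below (our illustration on simulated counts, not the paper's analysis) compares the dispersion statistic against its chi-square reference distribution.

```python
import numpy as np
from scipy import stats

def dispersion_test(counts):
    """Fisher's index-of-dispersion test: under a Poisson model the statistic
    sum((y - ybar)^2) / ybar is approximately chi-square with n-1 df."""
    counts = np.asarray(counts)
    n, ybar = counts.size, counts.mean()
    stat = np.sum((counts - ybar) ** 2) / ybar
    p = stats.chi2.sf(stat, df=n - 1)
    return stat / (n - 1), p              # dispersion index estimate and p-value

# hypothetical per-cell dicentric counts from a semi-automatically scored sample
rng = np.random.default_rng(8)
counts = rng.poisson(rng.gamma(2.0, 0.05, 1000))   # counts with extra-Poisson variation
di, p = dispersion_test(counts)
print(f"dispersion index {di:.2f}, p-value {p:.4f}")
```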