- Gaussian Processes and Bayesian Inference
- Machine Learning and Data Classification
- Fault Detection and Control Systems
- Explainable Artificial Intelligence (XAI)
- Machine Learning and Algorithms
- Machine Learning in Materials Science
- Image Retrieval and Classification Techniques
- Advanced Image and Video Retrieval Techniques
- Adversarial Robustness in Machine Learning
- Forecasting Techniques and Applications
- Electrical and Bioimpedance Tomography
- Artificial Intelligence in Healthcare and Education
- Video Surveillance and Tracking Methods
- Geophysical Methods and Applications
- Domain Adaptation and Few-Shot Learning
- Machine Learning in Healthcare
- Spectroscopy and Chemometric Analyses
- Blind Source Separation Techniques
- Healthcare Technology and Patient Monitoring
- Microwave Imaging and Scattering Analysis
- Computational Drug Discovery Methods
- Statistical Methods and Inference
Technische Universität Berlin
2012-2021
Inferring causal interactions from observed data is a challenging problem, especially in the presence of measurement noise. To alleviate the problem of spurious causality, Haufe et al. (2013) proposed to contrast measures of information flow obtained on the original data against the same measures obtained on time-reversed data. They show that this procedure, time-reversed Granger causality (TRGC), robustly rejects causal interpretations of mixtures of independent signals....
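A minimal sketch of the time-reversal idea: compute a net Granger score on the original series, subtract the score on the time-reversed series, and only trust directions that survive. The AR(1) least-squares fit and the helper names below are illustrative simplifications, not the exact estimator of Haufe et al. (2013).

```python
import numpy as np

def granger_score(x, y):
    """Log-ratio of residual variances for predicting y[t] without vs. with x[t-1].
    Positive values mean past x helps predict y (x 'Granger-causes' y)."""
    Y = y[1:]
    # restricted model: y's own past only; full model: add x's past
    Xr = np.column_stack([y[:-1], np.ones(len(Y))])
    Xf = np.column_stack([y[:-1], x[:-1], np.ones(len(Y))])
    rr = Y - Xr @ np.linalg.lstsq(Xr, Y, rcond=None)[0]
    rf = Y - Xf @ np.linalg.lstsq(Xf, Y, rcond=None)[0]
    return np.log(rr.var() / rf.var())

def trgc(x, y):
    """Net x->y score on the original data minus the same score on time-reversed data.
    Genuine causation flips sign under time reversal, so the difference grows;
    for mixtures of independent noise both terms are similar and the score stays near zero."""
    net = granger_score(x, y) - granger_score(y, x)
    net_rev = granger_score(x[::-1], y[::-1]) - granger_score(y[::-1], x[::-1])
    return net - net_rev
```

On simulated data where x drives y with a one-step delay, `trgc(x, y)` comes out clearly positive, while swapping the arguments makes it negative.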
With the increasing size of today's data sets, finding the right parameter configuration in model selection via cross-validation can be an extremely time-consuming task. In this paper we propose an improved cross-validation procedure which uses nonparametric testing coupled with sequential analysis to determine the best parameter set on linearly increasing subsets of the data. By eliminating underperforming candidates quickly and keeping promising ones as long as possible, the method speeds up the computation while preserving the capability of a full cross-validation....
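The racing idea can be sketched as follows: score all candidates on a growing subset and drop any candidate whose optimistic error bound is already worse than the current best. The mean-minus-z-standard-errors criterion, the geometric subset growth, and the ridge example are illustrative stand-ins for the paper's nonparametric sequential tests.

```python
import numpy as np

def race_select(candidates, persample_loss, X, y, start=64, factor=2, z=3.0):
    """Return the index of the winning candidate, eliminating clearly worse ones
    on growing data subsets (illustrative elimination rule, not the paper's test)."""
    order = np.random.default_rng(0).permutation(len(X))
    alive = list(range(len(candidates)))
    n = start
    while True:
        idx = order[:min(n, len(X))]
        stats = {}
        for i in alive:
            losses = persample_loss(candidates[i], X[idx], y[idx])
            stats[i] = (losses.mean(), losses.std(ddof=1) / np.sqrt(len(losses)))
        best_mean = min(m for m, _ in stats.values())
        # keep a candidate only while mean - z * stderr still overlaps the best mean
        alive = [i for i in alive if stats[i][0] - z * stats[i][1] <= best_mean]
        if len(alive) == 1 or n >= len(X):
            return min(alive, key=lambda i: stats[i][0])
        n *= factor

def ridge_loss(lam, Xs, ys):
    """Example per-sample loss: fit ridge regression on one half, squared errors on the other."""
    h = len(Xs) // 2
    A, b = Xs[:h], ys[:h]
    w = np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ b)
    return (Xs[h:] @ w - ys[h:]) ** 2
```

A hopelessly over-regularized candidate is typically rejected after the first small subset, so almost no compute is spent on it.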
The use of machine learning (ML) in critical domains such as medicine poses risks and requires regulation. One requirement is that decisions of ML systems in high-risk applications should be human-understandable. The field of "explainable artificial intelligence" (XAI) seemingly addresses this need. However, in its current form, XAI is unfit to provide quality control for ML; it itself needs scrutiny. Popular methods cannot reliably answer important questions about ML models, their training data, or a given test...
The evolving landscape of explainable artificial intelligence (XAI) aims to improve the interpretability of intricate machine learning (ML) models, yet it faces challenges in formalisation and empirical validation, being an inherently unsupervised process. In this paper, we bring together various benchmark datasets and novel performance metrics in an initial benchmarking platform, the Explainable AI Comparison Toolkit (EXACT), providing a standardised foundation for evaluating XAI methods. Our platform incorporates...
Non-stationarity in data is a ubiquitous problem in signal processing. The recent stationary subspace analysis (SSA) procedure has made it possible to decompose such data into a stationary and a non-stationary part. Algorithmically, however, only weak non-stationarities could be tackled by SSA. The present paper takes the conceptual step of generalizing SSA from the use of the first and second moments to higher-order moments, thus defining the proposed higher-order SSA (HOSSA). The paper derives the novel algorithm and demonstrates it in simulations. An obvious trade-off between the necessity...
Locality sensitive hashing (LSH) is a powerful tool for sublinear-time approximate nearest neighbor search, and a variety of hashing schemes have been proposed for different dissimilarity measures. However, the hash codes significantly depend on the chosen dissimilarity, which prohibits users from adjusting it at query time. In this paper, we propose multiple purpose LSH (mp-LSH), which shares the hash codes for different dissimilarities. mp-LSH supports L2, cosine, and inner product dissimilarities, as well as their corresponding weighted sums, where the weights can be...
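The standard single-dissimilarity setting that mp-LSH generalizes can be seen in random-hyperplane hashing (SimHash) for cosine similarity: each bit is the sign of a random projection, so the Hamming distance between codes estimates the angle between vectors. This sketch shows only that baseline scheme, not the shared-code construction of mp-LSH.

```python
import numpy as np

def simhash_codes(X, n_bits=256, seed=0):
    """Random-hyperplane LSH for cosine similarity: bit j is the sign of the
    projection onto random hyperplane j. P(bit differs) = angle(u, v) / pi."""
    rng = np.random.default_rng(seed)
    planes = rng.standard_normal((X.shape[1], n_bits))
    return X @ planes > 0

def hamming(a, b):
    """Number of differing bits between two codes."""
    return int(np.count_nonzero(a != b))
```

Because the code is built for the cosine geometry, it cannot be re-weighted at query time toward, say, an L2 or inner-product criterion; that limitation is exactly what the shared codes of mp-LSH address.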
Inhomogeneities in real-world data, e.g., due to changes in the observation noise level or variations in the structural complexity of the source function, pose a unique set of challenges for statistical inference. Accounting for them can greatly improve predictive power when physical resources or computation time are limited. In this paper, we draw on recent theoretical results on the estimation of the local function complexity (LFC), derived from the domain of local polynomial smoothing (LPS), to establish a notion of local structural complexity, which is used to develop...
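For context, the LPS estimator that the LFC analysis builds on is, in its simplest local-linear form, a weighted least-squares fit around each query point. The Gaussian kernel and fixed bandwidth below are illustrative choices; the papers above are precisely about tuning such bandwidths locally.

```python
import numpy as np

def local_linear(x0, x, y, h):
    """Local linear smoother: fit intercept + slope around x0 with Gaussian
    kernel weights of bandwidth h, and return the fitted value at x0."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    X = np.column_stack([np.ones_like(x), x - x0])   # local design, centered at x0
    Xw = X * w[:, None]                              # apply kernel weights
    beta = np.linalg.solve(X.T @ Xw, Xw.T @ y)       # weighted normal equations
    return beta[0]                                   # intercept = estimate at x0
```

A useful sanity check is that a local linear fit reproduces an exactly linear function regardless of the bandwidth, since the local design spans it; the bandwidth only matters once curvature and noise enter.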
We propose a novel active learning strategy for regression, which is model-agnostic, robust against model mismatch, and interpretable. Assuming that a small number of initial samples are available, we derive the optimal training density that minimizes the generalization error of local polynomial smoothing (LPS) with its kernel bandwidth tuned locally: we adopt the mean integrated squared error (MISE) as a generalization criterion, and use the asymptotic behavior of the MISE as well as of the locally optimal bandwidths (LOB) - the bandwidth function that minimizes the MISE in the asymptotic limit. The expression of our...