- Neural Networks and Applications
- Neural Networks and Reservoir Computing
- Neural dynamics and brain function
- Face and Expression Recognition
- Advanced Memory and Neural Computing
- Data Visualization and Analytics
- Stock Market Forecasting Methods
- Metaheuristic Optimization Algorithms Research
- Complex Systems and Time Series Analysis
- Time Series Analysis and Forecasting
- Music and Audio Processing
- Machine Learning and Data Classification
- Machine Learning and Algorithms
- Evolutionary Algorithms and Applications
- Topological and Geometric Data Analysis
- Gene expression and cancer classification
- Gaussian Processes and Bayesian Inference
- Anomaly Detection Techniques and Applications
- Dementia and Cognitive Impairment Research
- Image Retrieval and Classification Techniques
- Functional Brain Connectivity Studies
- Service-Oriented Architecture and Web Services
- Imbalanced Data Classification Techniques
- Fractal and DNA sequence analysis
- Evolutionary Game Theory and Cooperation
- University of Birmingham (2016-2025)
- Shanghai Key Laboratory of Trustworthy Computing (2021)
- Southern University of Science and Technology (2021)
- SAIT Polytechnic (2021)
- Aston University (2001-2003)
- Austrian Research Institute for Artificial Intelligence (1999-2002)
- Slovak University of Technology in Bratislava (1995-2002)
- University of Vienna (1999)
- Princeton University (1996-1998)
- United States Naval Research Laboratory (1998)
It has previously been shown that gradient-descent learning algorithms for recurrent neural networks can perform poorly on tasks involving long-term dependencies, i.e. those problems for which the desired output depends on inputs presented at times far in the past. We show that the long-term dependencies problem is lessened for a class of architectures called nonlinear autoregressive models with exogenous inputs (NARX networks), which have powerful representational capabilities. It has been reported that gradient descent learning can be more effective in NARX networks than in recurrent neural network...
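A minimal sketch of the NARX idea described above, assuming a tapped-delay formulation with a linear readout standing in for the paper's nonlinear network (all names, orders, and the toy system below are illustrative, not the authors' setup):

```python
import numpy as np

def narx_features(y, u, dy=3, du=3):
    """Build tapped-delay features [y(t-1..t-dy), u(t-1..t-du)] for each time step t."""
    start = max(dy, du)
    X, targets = [], []
    for t in range(start, len(y)):
        X.append(np.concatenate([y[t - dy:t][::-1], u[t - du:t][::-1]]))
        targets.append(y[t])
    return np.array(X), np.array(targets)

rng = np.random.default_rng(0)
u = rng.standard_normal(500)                     # exogenous input
y = np.zeros(500)
for t in range(2, 500):                          # toy system with output feedback
    y[t] = 0.5 * y[t - 1] - 0.2 * y[t - 2] + np.tanh(u[t - 1])

X, target = narx_features(y, u)
w, *_ = np.linalg.lstsq(X, target, rcond=None)   # linear stand-in for the trained network
print("one-step-ahead MSE:", np.mean((X @ w - target) ** 2))
```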
Reservoir computing (RC) refers to a new class of state-space models with a fixed state transition structure (the reservoir) and an adaptable readout from the state space. The reservoir is supposed to be sufficiently complex so as to capture a large number of features of the input stream that can be exploited by the reservoir-to-output mapping. The field of RC has been growing rapidly, with many successful applications. However, RC has been criticized for not being principled enough: reservoir construction is largely driven by a series of randomized model-building...
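A compact echo state network sketch of the fixed-reservoir / trained-readout split described above: a random reservoir scaled to a spectral radius below one, with only a ridge-regression readout trained. All sizes and constants are illustrative choices, not values from the papers.

```python
import numpy as np

rng = np.random.default_rng(1)
N, T = 100, 1000                                   # reservoir size, series length
u = np.sin(np.arange(T) * 0.2) + 0.1 * rng.standard_normal(T)

W = rng.standard_normal((N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))    # spectral radius ~0.9; W stays fixed
W_in = rng.uniform(-0.5, 0.5, N)

x, states = np.zeros(N), np.zeros((T, N))
for t in range(T):
    x = np.tanh(W @ x + W_in * u[t])               # reservoir state update
    states[t] = x

washout = 100
X, y = states[washout:-1], u[washout + 1:]         # predict the next input value
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(N), X.T @ y)   # only the readout is trained
print("train MSE:", np.mean((X @ W_out - y) ** 2))
```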
Along with the great success of deep neural networks, there is also growing concern about their black-box nature. The interpretability issue affects people's trust in deep learning systems. It is also related to many ethical problems, e.g., algorithmic discrimination. Moreover, interpretability is a desired property for deep networks to become powerful tools in other research fields, such as drug discovery and genomics. In this survey, we conduct a comprehensive review of neural network interpretability research. We first clarify the definition of interpretability as it has been used...
Ensembles are a widely used and effective technique in machine learning---their success is commonly attributed to the degree of disagreement, or 'diversity', within the ensemble. For ensembles where the individual estimators output crisp class labels, this 'diversity' is not well understood and remains an open research issue. For ensembles of regression estimators, diversity can be exactly formulated in terms of the covariance between the estimator outputs, and the optimum level is expressed as a bias-variance-covariance trade-off. Despite this, most...
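A small numerical illustration of the regression-ensemble case mentioned above: for a uniformly weighted ensemble, the squared error decomposes into the average member error minus the 'ambiguity' (spread of members around the ensemble prediction), which is one concrete way the covariance between estimators enters the trade-off. Toy data only; the noise model is an assumption made to create correlated members.

```python
import numpy as np

rng = np.random.default_rng(2)
target = rng.standard_normal(200)
M = 5
# shared noise component induces covariance between the members' errors
noise = rng.standard_normal((M, 200)) + 0.5 * rng.standard_normal(200)
members = target + 0.3 * noise

ens = members.mean(axis=0)                          # uniform-weight ensemble
ens_err = np.mean((ens - target) ** 2)
avg_member_err = np.mean((members - target) ** 2)
ambiguity = np.mean((members - ens) ** 2)           # the 'diversity' term

# ambiguity decomposition: ensemble error = average member error - ambiguity
print(ens_err, avg_member_err - ambiguity)          # the two numbers coincide
```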
The emergence of large sensor networks has facilitated the collection of large amounts of real-time data to monitor and control complex engineering systems. However, in many cases the collected data may be incomplete or inconsistent, while the underlying environment may be time-varying or unformulated. In this paper, we develop an innovative cognitive fault diagnosis framework that tackles the above challenges. This framework investigates fault diagnosis in the model space instead of the signal space. Learning in the model space is implemented by fitting a series of models using segments...
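A simplified sketch of the 'learning in the model space' idea: fit a small model to each signal segment and detect faults from the fitted parameter vectors rather than from the raw signal. Here plain AR(2) models and a simple distance threshold stand in for the reservoir-based models and one-class learner used in that line of work; everything below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(3)

def fit_ar2(segment):
    """Least-squares AR(2) coefficients for one signal segment (the 'model' of the segment)."""
    X = np.column_stack([segment[1:-1], segment[:-2]])
    coef, *_ = np.linalg.lstsq(X, segment[2:], rcond=None)
    return coef

def simulate(a1, a2, T=300):
    x = np.zeros(T)
    for t in range(2, T):
        x[t] = a1 * x[t - 1] + a2 * x[t - 2] + 0.1 * rng.standard_normal()
    return x

# characterise the normal regime in model (parameter) space
normal_params = np.array([fit_ar2(simulate(0.6, -0.3)) for _ in range(20)])
centre = normal_params.mean(axis=0)
dists = np.linalg.norm(normal_params - centre, axis=1)
threshold = dists.mean() + 3 * dists.std()

for a1, a2 in [(0.6, -0.3), (0.2, 0.5)]:            # second pair mimics a faulty regime
    d = np.linalg.norm(fit_ar2(simulate(a1, a2)) - centre)
    print("fault" if d > threshold else "normal", round(d, 3))
```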
A new class of state-space models, reservoir computing models with a fixed state transition structure (the "reservoir") and an adaptable readout from the state space, has recently emerged as a way for time series processing and modeling. The echo state network (ESN) is one of the simplest, yet most powerful, reservoir models. ESN models are generally constructed in a randomized manner. In our previous study (Rodan & Tiňo, 2011), we showed that a very simple, cyclic, deterministically generated reservoir can yield performance competitive with a standard ESN. In this...
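The deterministic construction referred to above can be sketched in a few lines: a simple cycle reservoir uses a single connection weight r on a ring of units and input weights of fixed magnitude v with deterministically chosen signs. The sign pattern below is a stand-in for the digit-expansion schemes discussed in that line of work; r, v, and N are illustrative.

```python
import numpy as np

def simple_cycle_reservoir(N=50, r=0.7, v=0.5):
    """Cycle reservoir: unit i feeds unit i+1 with a single weight r; deterministic input signs."""
    W = np.zeros((N, N))
    for i in range(N):
        W[(i + 1) % N, i] = r                          # one ring of identical weights
    # deterministic sign pattern (stand-in for the digit-based schemes in the papers)
    signs = np.where(np.sin(np.arange(1, N + 1)) > 0, 1.0, -1.0)
    return W, v * signs

W, W_in = simple_cycle_reservoir()
print(W.shape, W_in[:8])
```

The state update and readout training are the same as in the ESN sketch earlier; only the construction of W and W_in changes.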
We present novel, efficient, model-based kernels for time series data rooted in the reservoir computation framework. The kernels are implemented by fitting models sharing the same fixed, deterministically constructed state transition part to the individual series. The proposed kernels can naturally handle series of different length without the need to specify a parametric model class. Compared with most time series kernels, ours are computationally efficient. We show how the distances used in the kernels can be calculated analytically or efficiently estimated. The experimental...
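A rough sketch of the model-based kernel idea under a deliberate simplification: each series is summarised by the readout weights it induces on a shared, fixed reservoir, and series are then compared with an RBF kernel on those weight vectors. The paper derives the distances more carefully; the reservoir, ridge parameter, and gamma below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)
N = 30
W = rng.standard_normal((N, N))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))        # shared fixed reservoir
W_in = rng.uniform(-0.5, 0.5, N)

def readout_weights(series, ridge=1e-4):
    """Drive the shared reservoir with one series and fit its next-step readout."""
    x, states = np.zeros(N), []
    for u in series:
        x = np.tanh(W @ x + W_in * u)
        states.append(x.copy())
    S = np.array(states[:-1])
    y = np.asarray(series[1:])
    return np.linalg.solve(S.T @ S + ridge * np.eye(N), S.T @ y)

def model_kernel(series_a, series_b, gamma=1.0):
    d = readout_weights(series_a) - readout_weights(series_b)
    return np.exp(-gamma * d @ d)                      # RBF kernel in model (readout) space

s1 = np.sin(np.arange(200) * 0.2)
s2 = np.sin(np.arange(150) * 0.2)                      # different length, same dynamics
s3 = rng.standard_normal(180)                          # different dynamics
print(model_kernel(s1, s2), model_kernel(s1, s3))
```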
Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming...
The use of primary care electronic health records for research is abundant. The benefit gained from utilising such records lies in their size, longitudinal data collection and quality. However, using them to undertake high quality epidemiological studies can lead to significant challenges, particularly in dealing with misclassification, variation in coding, and the effort required to pre-process the data into a meaningful format for statistical analysis. In this paper, we describe a methodology to aid the extraction and processing of such databases, delivered...
In this paper, we elaborate upon the claim that clustering in the recurrent layer of recurrent neural networks (RNNs) reflects meaningful information processing states even prior to training. By concentrating on activation clusters in RNNs, while not throwing away the continuous state space of the network dynamics, we extract predictive models we call neural prediction machines (NPMs). When RNNs with sigmoid activation functions are initialized with small weights (a common technique in the RNN community), the activation clusters emerging prior to training are indeed meaningful and...
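A toy version of the extraction step described above: drive a small-weight, untrained sigmoid RNN with a symbolic stream, cluster the hidden activations, and estimate a next-symbol distribution per cluster, yielding a predictive model with no weight training. The stream, network size, and cluster count below are arbitrary illustrative choices.

```python
import numpy as np
from collections import defaultdict

rng = np.random.default_rng(5)
symbols = rng.choice(2, size=2000, p=[0.7, 0.3])        # toy binary symbol stream

H = 10
W_h = 0.1 * rng.standard_normal((H, H))                 # small weights, never trained
W_x = 0.1 * rng.standard_normal((H, 2))

h, states = np.zeros(H), []
for s in symbols:
    h = 1.0 / (1.0 + np.exp(-(W_h @ h + W_x @ np.eye(2)[s])))   # sigmoid RNN update
    states.append(h.copy())
states = np.array(states)

# crude k-means clustering of the activation vectors
k = 4
centres = states[rng.choice(len(states), k, replace=False)]
for _ in range(20):
    labels = np.argmin(((states[:, None, :] - centres) ** 2).sum(-1), axis=1)
    centres = np.array([states[labels == c].mean(0) if np.any(labels == c) else centres[c]
                        for c in range(k)])

# next-symbol counts conditioned on the cluster of the current state (Laplace smoothing)
counts = defaultdict(lambda: np.ones(2))
for t in range(len(symbols) - 1):
    counts[labels[t]][symbols[t + 1]] += 1
for c in range(k):
    print(c, counts[c] / counts[c].sum())               # per-cluster predictive distribution
```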
In this paper, a sparse learning algorithm, probabilistic classification vector machines (PCVMs), is proposed. We analyze relevance vector machines (RVMs) for classification problems and observe that adopting the same prior for different classes may lead to unstable solutions. In order to tackle this problem, a signed and truncated Gaussian prior is adopted over every weight in PCVMs, where the sign of the prior is determined by the class label, i.e., ...
An ensemble is a group of learners that work together as a committee to solve a problem. Existing ensemble learning algorithms often generate unnecessarily large ensembles, which consume extra computational resources and may degrade the generalization performance. Ensemble pruning aims to find a good subset of ensemble members to constitute a small ensemble, which saves resources and performs as well as, or better than, the unpruned ensemble. This paper introduces a probabilistic ensemble pruning algorithm by choosing a set of "sparse" combination weights, most of which are...
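The pruning objective can be illustrated with a much simpler stand-in for the paper's probabilistic method: greedy forward selection of members on a validation set, which likewise ends up with a sparse set of combination weights (uniform over the chosen members, zero elsewhere). This is not the paper's algorithm, just the general pruning idea on synthetic member predictions.

```python
import numpy as np

rng = np.random.default_rng(6)
y_val = rng.standard_normal(300)
# predictions of 20 regression ensemble members on the validation set (varying quality)
preds = y_val + (0.2 + rng.uniform(0, 1, (20, 1))) * rng.standard_normal((20, 300))

selected, best_mse = [], np.inf
while True:
    candidate, candidate_mse = None, best_mse
    for m in range(len(preds)):
        if m in selected:
            continue
        avg = preds[selected + [m]].mean(axis=0)        # uniform weights over the chosen subset
        mse = np.mean((avg - y_val) ** 2)
        if mse < candidate_mse:
            candidate, candidate_mse = m, mse
    if candidate is None:                               # no remaining member improves validation error
        break
    selected.append(candidate)
    best_mse = candidate_mse

print("kept", len(selected), "of", len(preds), "members; validation MSE", round(best_mse, 4))
```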
Since estimation of distribution algorithms (EDAs) were proposed, many attempts have been made to improve EDAs' performance in the context of global optimization. So far, studies and applications of multivariate probabilistic model-based EDAs in the continuous domain are still mostly restricted to low-dimensional problems. Traditional EDAs have difficulties in solving higher dimensional problems because of the curse of dimensionality and rapidly increasing computational costs. However, scaling up continuous EDAs for large-scale optimization is...
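A bare-bones continuous EDA for reference, using a single multivariate Gaussian as the probabilistic model: select the best individuals, refit the model, sample the next population. The scalability issue discussed above shows up here directly, since estimating and sampling the full covariance grows quickly with dimension. The objective and all constants are illustrative.

```python
import numpy as np

def sphere(x):                                          # toy objective to minimise
    return np.sum(x ** 2, axis=1)

rng = np.random.default_rng(7)
dim, pop, elite, iters = 20, 200, 50, 100
mean, cov = np.zeros(dim), np.eye(dim) * 5.0

for _ in range(iters):
    X = rng.multivariate_normal(mean, cov, size=pop)    # sample population from the model
    best = X[np.argsort(sphere(X))[:elite]]             # truncation selection
    mean = best.mean(axis=0)                            # refit the Gaussian model
    cov = np.cov(best, rowvar=False) + 1e-6 * np.eye(dim)

print("best fitness:", sphere(mean[None, :])[0])
```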
Concept drift detection methods are crucial components of many online learning approaches. Accurate detections allow prompt reaction to drifts and help to maintain high performance of online models over time. Although many methods have been proposed, no attention has been given to data streams with imbalanced class distributions, which commonly exist in real-world applications, such as fault diagnosis of control systems and intrusion detection in computer networks. This paper studies the concept drift problem for online class imbalance learning. We look into...
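One way to make drift detection sensitive to the minority class, in the spirit of the work above, is to monitor a time-decayed minority-class recall and flag a drift when it falls well below its running best. The sketch below is a simplified illustration of that idea, not the detector proposed in the paper; the decay, sensitivity, and stream parameters are assumptions.

```python
import numpy as np

class MinorityRecallDetector:
    """Flags drift when time-decayed minority-class recall falls far below its best value."""
    def __init__(self, decay=0.98, sensitivity=3.0):
        self.decay, self.sensitivity = decay, sensitivity
        self.recall, self.best, self.n = 0.0, 0.0, 0

    def update(self, y_true, y_pred, minority_label=1):
        if y_true != minority_label:
            return False                                 # only minority examples update recall
        self.n += 1
        self.recall = self.decay * self.recall + (1 - self.decay) * float(y_pred == y_true)
        self.best = max(self.best, self.recall)
        # approximate standard deviation of the exponentially weighted recall estimate
        std = np.sqrt((1 - self.decay) / (1 + self.decay) * self.recall * (1 - self.recall))
        return self.n > 50 and self.recall < self.best - self.sensitivity * std

detector, rng = MinorityRecallDetector(), np.random.default_rng(8)
for t in range(6000):
    y = int(rng.random() < 0.1)                          # 10% minority class
    acc = 0.95 if t < 3000 else 0.2                      # drift at t = 3000 hurts the minority class
    pred = y if rng.random() < acc else 1 - y
    if detector.update(y, pred):
        print("drift flagged at t =", t)
        break
```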
In some pattern analysis problems, there exists expert knowledge, in addition to the original data, involved in the classification process. The vast majority of existing approaches simply ignore such auxiliary (privileged) knowledge. Recently, a new paradigm, learning using privileged information, was introduced in the framework of SVM+. This approach is formulated for binary classification and, as is typical for many kernel-based methods, can scale unfavorably with the number of training examples. While speeding up methods and extensions...
The imbalanced nature of some real-world data is one of the current challenges for machine learning researchers. One common approach oversamples the minority class through convex combinations of its patterns. We explore the general idea of synthetic oversampling in the feature space induced by a kernel function (as opposed to the input space). If the kernel matches the underlying problem, the classes will be linearly separable and the synthetically generated patterns will lie on the minority class region. Since the feature space is not directly accessible, we use the empirical feature space (EFS) (a...
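A rough sketch of oversampling in the empirical feature space: build the kernel matrix over the training set, map points into the EFS via its eigendecomposition, and generate synthetic minority samples as convex combinations of minority points in that space. The RBF kernel, the SMOTE-like combination rule, and the toy data are illustrative assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(9)
X_maj = rng.standard_normal((100, 2)) * 0.8
X_min = rng.standard_normal((15, 2)) * 0.3 + np.array([2.0, 2.0])
X = np.vstack([X_maj, X_min])
y = np.array([0] * 100 + [1] * 15)

def rbf_kernel(A, B, gamma=0.5):
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

# empirical feature space: K = P diag(lam) P^T, training point i -> row i of P diag(sqrt(lam))
K = rbf_kernel(X, X)
lam, P = np.linalg.eigh(K)
keep = lam > 1e-10
efs = P[:, keep] * np.sqrt(lam[keep])              # EFS representation of the training set

minority = efs[y == 1]
synthetic = []
for _ in range(85):                                # roughly balance the two classes
    i, j = rng.choice(len(minority), 2, replace=False)
    a = rng.random()
    synthetic.append(a * minority[i] + (1 - a) * minority[j])   # convex combination in EFS

efs_aug = np.vstack([efs, np.array(synthetic)])
y_aug = np.concatenate([y, np.ones(len(synthetic), dtype=int)])
print(efs_aug.shape, np.bincount(y_aug))
```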
The recent advanced LIGO detections of gravitational waves from merging binary black holes enhance the prospect of exploring binary evolution via gravitational-wave observations of a population of compact-object binaries. In the face of uncertainty about binary formation models, model-independent inference provides an appealing alternative to comparisons between observed and modelled populations. We describe a procedure for clustering observations in a multi-dimensional parameter space that are subject to significant measurement errors...
DNA replication initiates from multiple genomic locations called replication origins. In metazoa, the sequence elements involved in origin specification remain elusive. Here, we examine pluripotent, primary, differentiating, and immortalized human cells, and demonstrate that a class of origins, termed core origins, is shared by different cell types and hosts ~80% of all initiation events in any cell population. We detect a G-rich signature that coincides with most core origins in both human and mouse genomes. Transcription can independently associate...
Alzheimer's disease (AD) is characterised by a dynamic process of neurocognitive changes, from normal cognition to mild cognitive impairment (MCI) and progression to dementia. However, not all individuals with MCI develop dementia. Predicting whether individuals with MCI will decline (i.e. progressive MCI) or remain stable is impeded by patient heterogeneity due to comorbidities that may lead to an MCI diagnosis without progression to AD. Despite the importance of early AD diagnosis for prognosis and personalised interventions, we still lack robust tools for predicting individual...
The early stages of Alzheimer's disease (AD) involve interactions between multiple pathophysiological processes. Although these processes are well studied, we still lack robust tools to predict individualised trajectories of disease progression. Here, we employ a robust and interpretable machine learning approach to combine multimodal biological data and predict future pathological tau accumulation. In particular, we use machine learning to quantify interactions between key pathological markers (β-amyloid, medial temporal lobe atrophy, APOE ε4) at the mildly impaired...