- Target Tracking and Data Fusion in Sensor Networks
- Bayesian Methods and Mixture Models
- Bayesian Modeling and Causal Inference
- Blind Source Separation Techniques
- Fault Detection and Control Systems
- Distributed Sensor Networks and Detection Algorithms
- Neural Networks and Applications
- Markov Chains and Monte Carlo Methods
- Underwater Acoustics Research
- Control Systems and Identification
- Advanced Adaptive Filtering Techniques
- Gaussian Processes and Bayesian Inference
- Statistical Methods and Inference
- Matrix Theory and Algorithms
- Image and Signal Denoising Methods
- Advanced Wireless Communication Techniques
- Time Series Analysis and Forecasting
- Statistical Mechanics and Entropy
- Random Matrices and Applications
- Speech and Audio Processing
- Statistical Methods and Bayesian Inference
- Scientific Research and Discoveries
- Fractal and DNA Sequence Analysis
- Advanced Numerical Analysis Techniques
- Sensor Technology and Measurement Systems
Institut Polytechnique de Paris
2012-2023
Telecom SudParis
2011-2023
Laboratoire Traitement et Communication de l’Information
2004-2018
Centre National de la Recherche Scientifique
2009-2018
Orange (France)
2008-2017
Université Paris-Saclay
2016
Rice University
2015
Citigroup
2005-2006
École Normale Supérieure - PSL
1998
Institut Mines-Télécom
1992-1993
This paper proposes an extension of standard stochastic mixture models, in which the constant weights are replaced by functional weights defined using a classifier. Classifier Weighted Mixtures enable straightforward density evaluation, explicit sampling, and enhanced expressivity in variational estimation problems, without increasing the number of components nor the complexity of the components.
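The idea above can be sketched in a few lines. The following is a minimal toy illustration, not the paper's construction: the mixture weights are produced by an assumed linear-softmax "classifier" of a conditioning input x (parameters W, b are arbitrary here), so that density evaluation and sampling stay explicit.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D mixture of K Gaussians whose weights are not constant but given
# by a softmax classifier of a conditioning input x (illustrative only).
K = 3
mus = np.array([-2.0, 0.0, 2.0])
sigmas = np.array([0.5, 1.0, 0.5])
W = rng.normal(size=K)   # hypothetical classifier parameters
b = rng.normal(size=K)

def weights(x):
    """Softmax classifier output: mixture weights as a function of x."""
    logits = W * x + b
    logits -= logits.max()           # numerical stability
    e = np.exp(logits)
    return e / e.sum()

def density(y, x):
    """Straightforward density evaluation: weighted sum of Gaussian pdfs."""
    w = weights(x)
    pdfs = np.exp(-0.5 * ((y - mus) / sigmas) ** 2) / (sigmas * np.sqrt(2 * np.pi))
    return float(w @ pdfs)

def sample(x, n):
    """Explicit sampling: draw a component index, then draw from its Gaussian."""
    w = weights(x)
    ks = rng.choice(K, size=n, p=w)
    return rng.normal(mus[ks], sigmas[ks])
```

The number of components K is unchanged; only the weight map gains capacity through the classifier.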
An important problem in signal processing consists in recursively estimating an unobservable process x = {x_n}_{n in N} from an observed process y = {y_n}_{n in N}. This is classically done in the framework of hidden Markov models (HMM). In the linear Gaussian case, the classical recursive solution is given by the well-known Kalman filter. We consider pairwise models, assuming that the pair (x, y) is Markovian and Gaussian. We show that this model is strictly more general than the HMM, yet still enables Kalman-like filtering.
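For reference, the classical Kalman recursion mentioned above can be sketched as follows. This is the standard textbook predict/update step for a linear Gaussian HMM, with arbitrary illustrative matrices F, H, Q, R; it is not the pairwise extension discussed in the abstract.

```python
import numpy as np

# Linear Gaussian HMM:
#   x_{n+1} = F x_n + u_n,  u_n ~ N(0, Q)
#   y_n     = H x_n + v_n,  v_n ~ N(0, R)
# Scalar-state example with arbitrarily chosen matrices.
F, H = np.array([[1.0]]), np.array([[1.0]])
Q, R = np.array([[0.01]]), np.array([[0.25]])

def kalman_step(m, P, y):
    """One predict/update recursion; returns the new posterior mean and covariance."""
    # predict
    m_pred = F @ m
    P_pred = F @ P @ F.T + Q
    # update
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    m_new = m_pred + K @ (y - H @ m_pred)
    P_new = (np.eye(len(m)) - K @ H) @ P_pred
    return m_new, P_new

# Run on a short synthetic observation sequence.
rng = np.random.default_rng(1)
m, P = np.zeros((1, 1)), np.eye(1)
for y in rng.normal(0.0, 0.5, size=5):
    m, P = kalman_step(m, P, np.array([[y]]))
```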
Let x = {x_n}_{n in N} be a hidden process, y = {y_n}_{n in N} an observed process, and r = {r_n}_{n in N} some additional process. We assume that the triplet t = (x, r, y) is a (so-called "Triplet") vector Markov chain (TMC). We first show that the linear TMC model encompasses and generalizes, among other models, classical state-space systems with colored process and/or measurement noise(s). We next propose...
Sequential Monte Carlo algorithms, or Particle Filters, are Bayesian filtering algorithms which propagate in time a discrete and random approximation of the a posteriori distribution of interest. Such algorithms are based on Importance Sampling with a bootstrap resampling step which aims at struggling against weight degeneracy. However, in some situations (informative measurements, high-dimensional models), this mechanism can prove inefficient. In this paper, we revisit this fundamental mechanism, which leads us back to Rubin's static resampling mechanism. We...
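The bootstrap mechanism described above (propagate, weight, resample) can be sketched on a toy model. The scalar model below (x_n = 0.9 x_{n-1} + u_n, y_n = x_n + v_n) and all its parameters are assumed purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

# Minimal bootstrap particle filter: importance sampling with a
# multinomial resampling step, on a toy scalar state-space model.
N = 500
x = rng.normal(size=N)             # initial particle cloud
for y in [0.3, -0.1, 0.5]:         # short synthetic observation record
    # propagate through the transition kernel (bootstrap proposal)
    x = 0.9 * x + rng.normal(scale=0.5, size=N)
    # weight each particle by the likelihood N(y; x, 1)
    w = np.exp(-0.5 * (y - x) ** 2)
    w /= w.sum()
    # multinomial resampling: fight weight degeneracy, reset to uniform weights
    x = x[rng.choice(N, size=N, p=w)]

posterior_mean = x.mean()          # discrete approximation of E[x_n | y_{1:n}]
```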
In this note, we revisit fixed-interval Kalman-like smoothing algorithms. We have two sets of results. We first unify the family of existing algorithms by deriving them in a common Bayesian framework; as we shall see, all these algorithms stem from forward and/or backward Markovian properties of the state process, involve one (or two) out of four canonical probability density functions, and can be derived from the systematic use of some generic Gaussian variables, for which we develop a specific toolbox. On the other hand, this methodology enables us to...
Random finite sets (RFS) are recent tools for addressing the multi-object filtering problem. The probability hypothesis density (PHD) filter is an approximation of the Bayesian filter which results from the RFS formulation of the problem and has been used in many applications. In this framework, it is assumed that each target and its associated observation follow a hidden Markov chain (HMC) model. HMCs conveniently describe some physical properties of practical interest to practitioners, but they also implicitly imply...
We introduce a new sequential importance sampling (SIS) algorithm which propagates in time a Monte Carlo approximation of the posterior fixed-lag smoothing distribution of symbols transmitted over doubly-selective channels. We perform an exact evaluation of the optimal importance distribution, at reduced computational cost when compared to other solutions proposed for the same state-space model. The method is applied as a soft input-soft output (SISO) blind equalizer in a turbo receiver framework, and the simulation results obtained show...
Recurrent Neural Networks (RNN) and Hidden Markov Models (HMM) are popular models for processing sequential data and have found many applications such as speech recognition, time series prediction or machine translation. Although both have been extended in several ways (e.g. Long Short-Term Memory and Gated Recurrent Unit architectures, Variational RNN, partially observed models...), their theoretical understanding remains open. In this context, our approach consists in classifying both models from an information geometry point...
In this paper, we focus on the statistical filtering problem in dynamical models with jumps. When a particular application relies on physical properties which are modeled by linear and Gaussian probability density functions with jumps, a usual method consists in approximating the optimal Bayesian estimate (in the sense of the Minimum Mean Square Error (MMSE)) in a Jump Markov State Space System (JMSS). Practical solutions include algorithms based on numerical approximations or on Sequential Monte Carlo (SMC) methods. We propose...
The Probability Hypothesis Density (PHD) filter is a recent solution to the multi-target filtering problem. Because the PHD filter is not computable, several implementations have been proposed, including Gaussian Mixture (GM) approximations and Sequential Monte Carlo (SMC) methods. In this paper, we propose a marginalized particle implementation which improves on classical solutions when used in stochastic systems with a partially linear substructure.
Linear and Gaussian models with regime switching are popular in signal processing. In this letter, we revisit Bayesian inference for such models under the variational framework. We propose a structured but implicit distribution which can be seen as the posterior of an alternative statistical model, for which the first and second order filtering moments can be computed exactly. A critical property is that the Kullback-Leibler Divergence between this distribution and its associated linear model can also be computed exactly, at a cost that scales with the number of observations; hence, parameter estimation...
An important problem in signal processing consists in estimating an unobservable process x = {x_n}_{n in N} from an observed process y = {y_n}_{n in N}. In linear Gaussian hidden Markov chains (LGHMC), recursive solutions are given by Kalman-like Bayesian restoration algorithms. In this paper, we consider the more general framework of linear Gaussian triplet Markov chains (LGTMC), i.e. models in which the triplet (x,...
The Probability Hypothesis Density (PHD) filter is a recent solution for tracking an unknown number of targets in a multi-object environment. The PHD filter cannot be computed exactly, but popular implementations include Gaussian Mixture (GM) and Sequential Monte Carlo (SMC) based algorithms. GM implementations suffer from pruning and merging approximations, but enable the states to be extracted easily; on the other hand, SMC implementations are of interest if the discrete approximation is relevant, but are penalized by the difficulty of guiding particles towards promising...
Among Sequential Monte Carlo (SMC) methods, Sampling Importance Resampling (SIR) algorithms are based on Importance Sampling (IS) and on some resampling-based rejuvenation algorithm which aims at fighting against weight degeneracy. However, whichever resampling technique is used, this mechanism tends to be insufficient when applied to informative or high-dimensional models. In this paper we revisit the rejuvenation mechanism and propose a class of parameterized SIR-based solutions which enable the tradeoff between computational cost...
Bayesian filtering aims at estimating sequentially a hidden process from an observed one. In particular, sequential Monte Carlo (SMC) techniques propagate in time weighted trajectories which represent the posterior probability density function (pdf) of the hidden process given the available observations. On the other hand, conditional Monte Carlo (CMC) is a variance reduction technique which replaces the estimator of a moment of interest by its conditional expectation given another variable. In this paper, we show that up to some adaptations, one can make use of recursive...
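The variance reduction principle behind CMC can be shown on a toy problem (chosen here for illustration only, not taken from the paper): to estimate E[(X+Y)^2] with X, Y independent standard Gaussians, each crude sample (X+Y)^2 is replaced by its conditional expectation E[(X+Y)^2 | X] = X^2 + 1, which is computable in closed form and has strictly lower variance.

```python
import numpy as np

rng = np.random.default_rng(3)

# Conditional Monte Carlo (Rao-Blackwellization) on a toy problem:
# estimate E[(X+Y)^2] = 2 with X, Y independent N(0, 1).
n = 100_000
X = rng.normal(size=n)
Y = rng.normal(size=n)

crude = (X + Y) ** 2          # per-sample crude estimator
cmc = X ** 2 + 1.0            # per-sample conditional expectation E[. | X]

est_crude, est_cmc = crude.mean(), cmc.mean()
```

Both estimators are unbiased for the same moment, but the empirical variance of the conditioned samples is markedly smaller.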
Sequential Monte Carlo (SMC) algorithms are based on importance sampling (IS) techniques. Resampling has been introduced as a tool for fighting the weight degeneracy problem. However, for a fixed sample size N, resampled particles are dependent, are not drawn exactly from the target distribution, and are not weighted properly. In this paper, we revisit the resampling mechanism and propose a scheme where the resampled particles are (conditionally) independent. We validate our results via simulations.
Addresses the blind identification of a linear time-invariant channel using some second-order cyclostationary statistics. In contrast to other contributions, the case where the statistics of the noise and jammers are totally unknown is considered. It is shown that the channel can be identified consistently by adapting the so-called subspace method of Moulines et al. (1995). This adaptation is valid for fractionally spaced systems and, more interestingly, for general systems exhibiting the transmitter-induced cyclostationarity introduced by Tsatsanis...
We address the problem of second-order blind identification of a multiple-input multiple-output (MIMO) transfer function in the presence of additive noise. The noise is assumed to be (temporally) white, i.e., uncorrelated in time, but we do not make any assumption on its spatial correlation. The problem is thus equivalent to the MIMO noiseless case given the partial auto-covariance sequence {R_n}_{n != 0}. Our approach consists in computing the missing central covariance coefficient R_0 of this sequence. It can be described simply...