Tutorial on Particle Filters
Keith Copsey, Pattern and Information Processing Group, DERA Malvern
K.Copsey@signal.dera.gov.uk
NCAF January Meeting, Aston University, Birmingham

Outline
• Introduction to particle filters
  – Recursive Bayesian estimation
• Bayesian importance sampling
  – Sequential importance sampling (SIS)
  – Sampling importance resampling (SIR)
• Improvements to SIR
  – On-line Markov chain Monte Carlo
• Basic particle filter algorithm
• Examples
• Conclusions
• Demonstration

Particle Filters
• Sequential Monte Carlo methods for on-line learning within a Bayesian framework.
• Also known as:
  – Particle filters
  – Sequential sampling-importance resampling (SIR)
  – Bootstrap filters
  – Condensation trackers
  – Interacting particle approximations
  – Survival of the fittest

Recursive Bayesian estimation (I)
• Recursive filter:
  – System model: $x_k = f_k(x_{k-1}, v_{k-1})$, i.e. the transition density $p(x_k \mid x_{k-1})$
  – Measurement model: $y_k = h_k(x_k, n_k)$, i.e. the likelihood $p(y_k \mid x_k)$
  – Information available: $D_k = \{y_1, \ldots, y_k\}$

Recursive Bayesian estimation (II)
• Seek: $p(x_{k+i} \mid D_k)$
  – i = 0: filtering.
  – i > 0: prediction.
  – i < 0: smoothing.
• Prediction:
  $$p(x_k \mid D_{k-1}) = \int p(x_k \mid x_{k-1})\, p(x_{k-1} \mid D_{k-1})\, dx_{k-1}$$
  – since the state evolution is Markov: $p(x_k \mid x_{k-1}, D_{k-1}) = p(x_k \mid x_{k-1})$.

Recursive Bayesian estimation (III)
• Update:
  $$p(x_k \mid D_k) = \frac{p(y_k \mid x_k)\, p(x_k \mid D_{k-1})}{p(y_k \mid D_{k-1})}$$
• where:
  $$p(y_k \mid D_{k-1}) = \int p(y_k \mid x_k)\, p(x_k \mid D_{k-1})\, dx_k$$
  – since the measurement depends only on the current state: $p(y_k \mid x_k, D_{k-1}) = p(y_k \mid x_k)$.

Classical approximations
• Analytical methods:
  – Extended Kalman filter,
  – Gaussian sums… (Alspach et al. 1971)
  • Perform poorly in numerous cases of interest.
• Numerical methods:
  – point mass approximations,
  – splines. (Bucy 1971; de Figueiredo 1974…)
  • Very complex to implement, not flexible.

Perfect Monte Carlo simulation (I)
• Introduce the notation $x_{0:k} = \{x_0, \ldots, x_k\}$ for the state trajectory.
• Represent the posterior distribution using a set of samples or particles.
• Random samples $\{x_{0:k}^{(i)}\}_{i=1}^{N}$ are drawn from the posterior distribution.

Perfect Monte Carlo simulation (II)
• Easy to approximate expectations of the form:
  $$E[g(x_{0:k})] = \int g(x_{0:k})\, p(x_{0:k} \mid D_k)\, dx_{0:k}$$
  – by:
  $$E[g(x_{0:k})] \approx \frac{1}{N} \sum_{i=1}^{N} g\left(x_{0:k}^{(i)}\right)$$
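As a concrete illustration (not from the slides), a minimal Python sketch of this perfect Monte Carlo estimate, assuming a target we can sample from directly:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in "posterior" we can sample directly: Gamma(4, 1), as on the next slide.
N = 5000
samples = rng.gamma(shape=4.0, scale=1.0, size=N)

# Perfect Monte Carlo estimate of E[g(x)]: the sample mean of g over the particles.
def g(x):
    return x ** 2

estimate = np.mean(g(samples))
print(estimate)  # for Gamma(4, 1), E[x^2] = var + mean^2 = 4 + 4^2 = 20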

Random samples and the pdf (I)
• Take p(x) = Gamma(4, 1).
• Generate some random samples.
• Plot histogram and basic approximation to pdf.
[Figure: histogram and pdf approximation, 200 samples]

Random samples and the pdf (II)
[Figures: 500 samples; 1000 samples]

Random samples and the pdf (III)
[Figures: 5000 samples; 200000 samples]
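A sketch reproducing these histogram-versus-pdf plots in Python; the sample sizes follow the slides, the plotting details are my own:

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy.stats import gamma

rng = np.random.default_rng(0)
x = np.linspace(0.0, 15.0, 300)

fig, axes = plt.subplots(1, 3, figsize=(12, 3))
for ax, n in zip(axes, [200, 5000, 200000]):
    samples = rng.gamma(shape=4.0, scale=1.0, size=n)
    ax.hist(samples, bins=50, density=True, alpha=0.6)  # sample-based approximation
    ax.plot(x, gamma.pdf(x, a=4.0), "r-")               # true Gamma(4, 1) pdf
    ax.set_title(f"{n} samples")
plt.show()
```

As the slides show, the histogram approximation tightens onto the true pdf as the number of samples grows.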

Bayesian Importance Sampling (I)
• Unfortunately it is often not possible to sample directly from the posterior distribution.
• Circumvent this by drawing from a known, easy-to-sample proposal distribution $q(x_{0:k} \mid D_k)$, giving:
  $$E[g(x_{0:k})] = \int g(x_{0:k})\, \frac{p(x_{0:k} \mid D_k)}{q(x_{0:k} \mid D_k)}\, q(x_{0:k} \mid D_k)\, dx_{0:k}$$

Bayesian Importance Sampling (II)
• where $w_k(x_{0:k})$ are the unnormalised importance weights:
  $$w_k(x_{0:k}) = \frac{p(D_k \mid x_{0:k})\, p(x_{0:k})}{q(x_{0:k} \mid D_k)}$$
• Now:
  $$E[g(x_{0:k})] = \frac{\int g(x_{0:k})\, w_k(x_{0:k})\, q(x_{0:k} \mid D_k)\, dx_{0:k}}{\int w_k(x_{0:k})\, q(x_{0:k} \mid D_k)\, dx_{0:k}}$$

Bayesian Importance Sampling (III)
• Giving:
  $$E[g(x_{0:k})] \approx \sum_{i=1}^{N} \tilde{w}_k^{(i)}\, g\left(x_{0:k}^{(i)}\right)$$
• so that:
  $$p(x_{0:k} \mid D_k) \approx \sum_{i=1}^{N} \tilde{w}_k^{(i)}\, \delta\left(x_{0:k} - x_{0:k}^{(i)}\right)$$
• where
  – $\tilde{w}_k^{(i)} = w_k^{(i)} \big/ \sum_{j=1}^{N} w_k^{(j)}$ are normalised importance weights,
  – and $x_{0:k}^{(i)}$ are independent random samples from $q(x_{0:k} \mid D_k)$.
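A minimal sketch of these ideas in Python, assuming a Gamma(4, 1) target (standing in for a posterior we cannot sample) and a Gaussian proposal of my own choosing:

```python
import numpy as np
from scipy.stats import gamma, norm

rng = np.random.default_rng(0)

# Proposal q: N(4, 3^2), easy to sample; target p: Gamma(4, 1).
N = 10000
x = rng.normal(loc=4.0, scale=3.0, size=N)

# Unnormalised importance weights w = p(x) / q(x); zero wherever the target
# has no support (here, x < 0).
w = gamma.pdf(x, a=4.0) / norm.pdf(x, loc=4.0, scale=3.0)
w_tilde = w / w.sum()  # normalised importance weights

# Importance sampling estimate of E[x]; the true mean of Gamma(4, 1) is 4.
print(np.sum(w_tilde * x))
```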

Sequential Importance Sampling (I)
• Factorising the proposal distribution:
  $$q(x_{0:k} \mid D_k) = q(x_{0:k-1} \mid D_{k-1})\, q(x_k \mid x_{0:k-1}, D_k)$$
• and remembering that the state evolution is modelled as a Markov process,
• we obtain a recursive estimate of the importance weights:
  $$w_k \propto w_{k-1}\, \frac{p(y_k \mid x_k)\, p(x_k \mid x_{k-1})}{q(x_k \mid x_{0:k-1}, D_k)}$$

Derivation of SIR weights
• Since:
  $$p(x_{0:k}) = p(x_0) \prod_{j=1}^{k} p(x_j \mid x_{j-1}) \qquad \text{and} \qquad p(D_k \mid x_{0:k}) = \prod_{j=1}^{k} p(y_j \mid x_j)$$
• We have:
  $$w_k = \frac{p(D_k \mid x_{0:k})\, p(x_{0:k})}{q(x_{0:k-1} \mid D_{k-1})\, q(x_k \mid x_{0:k-1}, D_k)} = w_{k-1}\, \frac{p(y_k \mid x_k)\, p(x_k \mid x_{k-1})}{q(x_k \mid x_{0:k-1}, D_k)}$$

Sequential Importance Sampling (II)
• Choice of the proposal distribution:
  – Choose the proposal to minimise the variance of the importance weights (Doucet et al. 1999):
  $$q(x_k \mid x_{0:k-1}, D_k) = p(x_k \mid x_{k-1}, y_k)$$
  – Although this optimal proposal is often intractable, a common choice is the prior distribution:
  $$q(x_k \mid x_{0:k-1}, D_k) = p(x_k \mid x_{k-1}), \qquad \text{giving} \qquad w_k \propto w_{k-1}\, p(y_k \mid x_k)$$
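A sketch of one SIS update in Python under this prior proposal, for an assumed 1-D Gaussian random-walk model (the model and noise levels are illustrative, not from the slides):

```python
import numpy as np

def sis_step(particles, weights, y, rng, sigma_sys=1.0, sigma_obs=0.5):
    """One SIS step using the prior p(x_k | x_{k-1}) as the proposal, so the
    incremental weight is just the likelihood p(y_k | x_k)."""
    # Propose by propagating each particle through the system model.
    particles = particles + sigma_sys * rng.standard_normal(len(particles))
    # Recursive weight update: w_k = w_{k-1} * p(y_k | x_k).
    likelihood = np.exp(-0.5 * ((y - particles) / sigma_obs) ** 2)
    weights = weights * likelihood
    return particles, weights / weights.sum()
```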

Sequential Importance Sampling (III)
• Illustration of SIS: [figure]
• Degeneracy problems:
  – the variance of the importance ratios increases stochastically over time (Kong et al. 1994; Doucet et al. 1999).

SIS - why variance increase is bad
• Suppose we want to sample from the posterior
  – and choose a proposal density very close to the posterior density.
  • Then the normalised importance weights are all close to 1/N,
  • and the variance of the weights is close to 0.
• So we expect the variance to be close to 0 to obtain reasonable estimates
  – thus a variance increase has a harmful effect on accuracy.
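A small numerical illustration of this (model and numbers are my own): running the SIS recursion sketched above with no resampling, the largest normalised weight quickly approaches 1, i.e. one particle ends up carrying almost all the probability mass:

```python
import numpy as np

rng = np.random.default_rng(0)

n, steps = 500, 50
particles = rng.standard_normal(n)
weights = np.full(n, 1.0 / n)
for k in range(steps):
    particles = particles + rng.standard_normal(n)   # prior proposal
    weights = weights * np.exp(-0.5 * particles ** 2)  # likelihood for a dummy y_k = 0
    weights /= weights.sum()
    if k % 10 == 0:
        print(f"step {k:2d}: max normalised weight = {weights.max():.3f}")
```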

Sequential Importance Sampling (IV)
• Illustration of degeneracy: [figure]

Sampling-Importance Resampling
• SIS suffers from degeneracy problems, so we don't want to do that!
• Introduce a selection (resampling) step to eliminate samples with low importance ratios and multiply samples with high importance ratios.
• Resampling maps the weighted random measure $\{x_{0:k}^{(i)}, \tilde{w}_k^{(i)}\}$ onto the equally weighted random measure $\{x_{0:k}^{(j)}, N^{-1}\}$
  – by sampling uniformly with replacement from $\{x_{0:k}^{(i)}\}$ with probabilities $\{\tilde{w}_k^{(i)}\}$.
• The scheme generates $N_i$ children of particle $i$ such that $\sum_{i=1}^{N} N_i = N$ and satisfies $E[N_i] = N \tilde{w}_k^{(i)}$.
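A minimal sketch of this resampling step in Python (the multinomial, sample-with-replacement scheme described above), assuming 1-D particle arrays:

```python
import numpy as np

def multinomial_resample(particles, weights, rng):
    """Draw N indices with replacement, with probabilities given by the
    normalised weights: heavy particles are duplicated, light ones tend to
    die out. The returned particle set carries equal weights 1/N."""
    n = len(particles)
    idx = rng.choice(n, size=n, replace=True, p=weights)
    return particles[idx], np.full(n, 1.0 / n)
```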

Improvements to SIR (I)
• Variety of resampling schemes, with varying performance in terms of the variance of the particle counts $N_i$:
  – Residual sampling (Liu & Chen, 1998).
  – Systematic sampling (Carpenter et al., 1999).
  – Mixture of SIS and SIR, only resample when necessary (Liu & Chen, 1995; Doucet et al., 1999); sketches of the latter two follow this slide.
• Degeneracy may still be a problem:
  – During resampling a sample with high importance weight may be duplicated many times.
  – Samples may eventually collapse to a single point.
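Sketches of two of these refinements, assuming 1-D particle arrays: systematic resampling, and the usual effective-sample-size rule for resampling only when necessary (the N/2 threshold mentioned in the comment is a common tuning choice, not prescribed by the slides):

```python
import numpy as np

def systematic_resample(particles, weights, rng):
    """Systematic resampling: a single uniform offset, then N evenly spaced
    points swept through the cumulative weights; lower variance in the
    particle counts than multinomial resampling."""
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n
    idx = np.searchsorted(np.cumsum(weights), positions)
    idx = np.minimum(idx, n - 1)  # guard against floating-point round-off
    return particles[idx], np.full(n, 1.0 / n)

def effective_sample_size(weights):
    """N_eff = 1 / sum(w_i^2): equals N for uniform weights and 1 after total
    collapse. A common rule resamples only when N_eff drops below, say, N/2."""
    return 1.0 / np.sum(weights ** 2)
```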

Improvements to SIR (II)
• To alleviate numerical degeneracy problems, sample smoothing methods may be adopted.
  – Roughening (Gordon et al., 1993).
  • Adds an independent jitter to the resampled particles (see the sketch below).
  – Prior boosting (Gordon et al., 1993).
  • Increase the number of samples drawn from the proposal distribution to M > N,
  • but in the resampling stage only draw N particles.
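A sketch of roughening for 1-D particles, following Gordon et al.'s recipe of jitter with standard deviation K · E · N^(-1/d), where E is the sample range and d the state dimension (here d = 1; the constant K is a tuning parameter, and 0.2 is just an illustrative default):

```python
import numpy as np

def roughen(particles, rng, k=0.2):
    """Add independent Gaussian jitter to resampled particles to restore
    diversity after duplication. The jitter scale shrinks as the particle
    count grows."""
    n = len(particles)
    extent = particles.max() - particles.min()
    sigma = k * extent * n ** (-1.0)  # K * E * N^(-1/d) with d = 1
    return particles + sigma * rng.standard_normal(n)
```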

Improvements to SIR (III)
• Local Monte Carlo methods for alleviating degeneracy:
  – Local linearisation - using an EKF (Doucet, 1999; Pitt & Shephard, 1999) or UKF (Doucet et al., 2000) to estimate the importance distribution.
  – Rejection methods (Müller, 1991; Doucet, 1999; Pitt & Shephard, 1999).
  – Auxiliary particle filters (Pitt & Shephard, 1999).
  – Kernel smoothing (Gordon, 1994; Hürzeler & Künsch, 1998; Liu & West, 2000; Musso et al., 2000).
  – MCMC methods (Müller, 1992; Gordon & Whitby, 1995; Berzuini et al., 1997; Gilks & Berzuini, 1998; Andrieu et al., 1999).

Improvements to SIR (IV)
• Illustration of SIR with sample smoothing: [figure]

MCMC move step
• Improve results by introducing MCMC steps with invariant distribution $p(x_{0:k} \mid D_k)$ (see the sketch below).
  – By applying a Markov transition kernel, the total variation of the current distribution w.r.t. the invariant distribution can only decrease.
• Introduces the possibility of a variable-dimension state space through the use of reversible jump MCMC (de Freitas et al., 1999; Gilks & Berzuini, 2001).
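A minimal sketch of such a move step: one Metropolis-Hastings update per particle with a random-walk proposal, targeting an assumed toy posterior proportional to p(y_k | x_k) p(x_k). The target, step size, and noise levels are illustrative, not from the talk:

```python
import numpy as np

def mcmc_move(particles, y, rng, sigma_obs=0.5, step=0.5):
    """One Metropolis-Hastings move per particle. The kernel leaves the
    target invariant, so applying it after resampling restores particle
    diversity without changing the distribution being represented."""
    def log_target(x):
        # Toy unnormalised log posterior: Gaussian likelihood times N(0, 1) prior.
        return -0.5 * ((y - x) / sigma_obs) ** 2 - 0.5 * x ** 2

    proposal = particles + step * rng.standard_normal(len(particles))
    log_alpha = log_target(proposal) - log_target(particles)
    accept = np.log(rng.random(len(particles))) < log_alpha
    return np.where(accept, proposal, particles)
```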

Ingredients for SMC
• Importance sampling function:
  – prior (Gordon et al.)
  – optimal
  – pdf from UKF at each step (UKF proposal)
• Redistribution scheme:
  – SIR (Gordon et al.)
  – Residual (Liu & Chen)
  – Systematic (Carpenter et al.)
  – Resample when necessary (Liu & Chen; Doucet et al.)
• Careful initialisation procedure (for efficiency)

Basic Particle Filter - Schematic
• Initialisation
• Importance sampling step (on receipt of a new measurement)
• Extract estimate
• Resampling step, then back to the importance sampling step

Basic Particle Filter algorithm (I)
• Initialisation
  – For $i = 1, \ldots, N$, sample $x_0^{(i)} \sim p(x_0)$
  – and set $k = 1$.
• In practice, to avoid having to take too many samples, for the first step we may want to ensure that we have a reasonable number of particles in the region of high likelihood
  – perhaps use MCMC techniques.

Basic Particle Filter algorithm (II)
• Importance Sampling step
  – For $i = 1, \ldots, N$, sample $\hat{x}_k^{(i)} \sim q(x_k \mid x_{0:k-1}^{(i)}, D_k)$ and set $\hat{x}_{0:k}^{(i)} = (x_{0:k-1}^{(i)}, \hat{x}_k^{(i)})$.
  – For $i = 1, \ldots, N$, evaluate the importance weights $w_k^{(i)}$.
  – Normalise the importance weights: $\tilde{w}_k^{(i)} = w_k^{(i)} \big/ \sum_{j=1}^{N} w_k^{(j)}$.

Basic Particle Filter algorithm (III)
• Resampling step
  – Resample with replacement $N$ particles $x_{0:k}^{(i)}$
  – from the set $\{\hat{x}_{0:k}^{(i)}\}$
  – according to the normalised importance weights $\tilde{w}_k^{(i)}$.
• Set $k \leftarrow k + 1$ and
  – proceed to the Importance Sampling step as the next measurement arrives.
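Putting the three steps together, a minimal bootstrap (SIR) filter in Python for an assumed 1-D random-walk model with Gaussian noise; this is a sketch of the algorithm above under that assumed model, not code from the talk:

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_filter(ys, n_particles=1000, q=1.0, r=0.5):
    """Assumed model: x_k = x_{k-1} + v_k, v_k ~ N(0, q);
                      y_k = x_k + n_k,     n_k ~ N(0, r)."""
    estimates = []
    particles = rng.standard_normal(n_particles)  # initialisation: x_0 ~ p(x_0)
    for y in ys:
        # Importance sampling step: propose from the prior p(x_k | x_{k-1}),
        # so the importance weight is the likelihood p(y_k | x_k).
        particles = particles + np.sqrt(q) * rng.standard_normal(n_particles)
        w = np.exp(-0.5 * (y - particles) ** 2 / r)
        w /= w.sum()
        # Extract estimate: posterior mean.
        estimates.append(np.sum(w * particles))
        # Resampling step: multinomial, with replacement.
        particles = particles[rng.choice(n_particles, size=n_particles, p=w)]
    return np.array(estimates)

# Usage on a simulated trajectory.
truth = np.cumsum(rng.standard_normal(50))
ys = truth + np.sqrt(0.5) * rng.standard_normal(50)
print(bootstrap_filter(ys)[:5])
```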

Example
• On-line Data Fusion (Marrs, 2000).

Example - Sensor Deployment
• Aim to reduce the target standard deviation below some threshold…
• … and keep it there,
• … by placing the minimum number of sensors possible.
• Sensor positions chosen according to the particle distribution.

Example - In-situ monitoring of growing semiconductor crystal composition
• Si$_{1-x}$Ge$_x$ substrate [figure]

Book Advert (or put this in or you're fired)
• Sequential Monte Carlo Methods in Practice, Editors: Doucet, de Freitas, Gordon; Springer-Verlag (2001).
  – Theoretical foundations - plus convergence proofs.
  – Efficiency measures.
  – Applications:
  • Target tracking; missile guidance; image tracking; terrain referenced navigation; exchange rate prediction; portfolio allocation; ellipsometry; electricity load forecasting; pollution monitoring; population biology; communications and audio engineering.
• ISBN 0-387-95146-6, price $79.95.

Conclusions
• On-line Bayesian learning is a realistic proposition for many applications.
• Appropriate for complex non-linear/non-Gaussian models
  – don't bother if a KF-based solution is adequate.
• Representation of the full posterior pdf, leading to
  – estimation of moments,
  – estimation of HPD regions,
  – multi-modality being easy to deal with.
• Model order can be included in the unknowns.
• Can mix SMC and KF-based solutions.

Tracking Demo
• Illustrate a running particle filter
  – compare with Kalman Filter.
• Running as we watch - not pre-recorded.
• Pre-defined scenarios, or design your own
  – available to play with at coffee and lunch breaks.

2nd Book Advert
• Statistical Pattern Recognition
• Andrew Webb, DERA
• ISBN 0340741643, paperback, 1999: £29.99, Butterworth Heinemann
• Contents:
  – Introduction to SPR, Estimation, Density estimation, Linear discriminant analysis, Nonlinear discriminant analysis - neural networks, Nonlinear discriminant analysis - statistical methods, Classification trees, Feature selection and extraction, Clustering, Additional topics, Measures of dissimilarity, Parameter estimation, Linear algebra, Data, Probability theory.