FILTERING AND STATE ESTIMATION: BASIC CONCEPTS
Massimiliano Vasile, Aerospace Centre of Excellence, Department of Mechanical & Aerospace Engineering, University of Strathclyde
UQ – BASIC INGREDIENTS
§ The overall UQ process is made of three fundamental elements:
§ An uncertainty model
§ A propagation method
§ An inference process
utopiae_network http://utopiae.eu
UQ and State Propagation
PROBLEM
§ Dynamic process: ẋ = f(x, p, t), with state vector x and parameter vector p.
§ The vectors x and p are not completely known.
§ The initial condition vector is affected by uncertainty: x(t₀) = x₀ + δx₀.
HIDDEN MARKOV PROCESS
§ Process transition from the state at stage k−1 to the state at stage k: x_k = g(x_{k−1}, p, w_{k−1}), with process noise w_{k−1}.
§ What is the g function? If the state vector and the parameter vector are stochastic variables, what is the transition probability p(x_k | x_{k−1})?
§ Why hidden? Because we do not directly observe the state vector or the parameter vector.
MEASUREMENTS
§ Observation of the system at stage k: z_k = h(x_k, p) + ε_k, with measurement error ε_k.
§ The main problem is that observations measure a quantity that is different from the state vector or the parameter vector.
§ The measurement z_k is not exact but is affected by a measurement error.
§ The measurement error depends on our knowledge of the sensor.
PROPAGATED VS MEASURED STATE
§ Both the propagated state and the measured state at stage k need to be combined to reconstruct the state and parameter vectors.
§ The relationship between the measured quantity and the hidden quantity is given by a measurement model.
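The hidden propagation/measurement chain above can be sketched in a few lines. This is a minimal illustration, not the lecture's model: the scalar transition g, the nonlinear sensor h, and the noise levels are all hypothetical choices.

```python
import random

random.seed(1)

def g(x):
    """Hypothetical scalar transition model: x_k = g(x_{k-1}) + w_k."""
    return 0.9 * x + 1.0

def h(x):
    """Hypothetical measurement model: the sensor observes a nonlinear
    function of the state, z_k = h(x_k) + eps_k."""
    return x ** 2

def simulate(x0, n_steps, sigma_w=0.1, sigma_eps=0.5):
    """Generate the hidden states and the noisy observations we actually see."""
    states, measurements = [], []
    x = x0
    for _ in range(n_steps):
        x = g(x) + random.gauss(0.0, sigma_w)    # hidden propagation step
        z = h(x) + random.gauss(0.0, sigma_eps)  # observed quantity
        states.append(x)
        measurements.append(z)
    return states, measurements

xs, zs = simulate(x0=0.0, n_steps=50)
```

The filter's job, developed in the following slides, is to reconstruct `xs` while only ever seeing `zs`.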
PROBABILITY OF A STATE
§ Start from an a priori distribution for the state: p(x₀).
§ From the propagation model we can compute the propagated distribution p(x_k | x_{k−1}),
§ and the distribution conditional on the previously measured states (Chapman–Kolmogorov equation):
p(x_k | z_{1:k−1}) = ∫ p(x_k | x_{k−1}) p(x_{k−1} | z_{1:k−1}) dx_{k−1}
PROBABILITY OF A STATE
§ At stage k a new measurement z_k becomes available and we use Bayes' rule to combine the prior with the measured quantity and infer the posterior:
p(x_k | z_{1:k}) = p(z_k | x_k) p(x_k | z_{1:k−1}) / p(z_k | z_{1:k−1})
§ The discount term is given by:
p(z_k | z_{1:k−1}) = ∫ p(z_k | x_k) p(x_k | z_{1:k−1}) dx_k
§ with likelihood function p(z_k | x_k).
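The Bayesian update above can be checked numerically on a grid. A minimal sketch, assuming a Gaussian prior and a direct noisy observation (the numbers are illustrative, not from the lecture):

```python
import math

def gauss_pdf(x, mu, sigma):
    """Univariate normal density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Discretised prior p(x_k | z_{1:k-1}) on a grid, assumed N(0, 2^2)
dx = 0.1
grid = [i * dx for i in range(-100, 101)]
prior = [gauss_pdf(x, 0.0, 2.0) for x in grid]

# Likelihood p(z_k | x_k) for a direct observation z_k = 1.5 with sigma = 1
z_k, sigma_meas = 1.5, 1.0
likelihood = [gauss_pdf(z_k, x, sigma_meas) for x in grid]

# Discount term (evidence): p(z_k | z_{1:k-1}) = ∫ p(z_k | x) p(x | z_{1:k-1}) dx
evidence = sum(l * p for l, p in zip(likelihood, prior)) * dx

# Posterior via Bayes' rule, and its mean
posterior = [l * p / evidence for l, p in zip(likelihood, prior)]
post_mean = sum(x * p for x, p in zip(grid, posterior)) * dx
```

For this conjugate Gaussian case the posterior mean has the closed form z·σ₀²/(σ₀²+σ²) = 1.5·4/5 = 1.2, which the grid computation reproduces.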
MONTE CARLO SIMULATIONS (MC) (NOT AN UNCERTAINTY PROPAGATION METHOD)
§ Build significant statistics by collecting a sufficient number of outcomes of the simulations. Commonly used to solve multidimensional integrals.
§ By the central limit theorem, the expectation E[X] of a random variable X belongs, with probability 1−ε, to the interval:
E[X] ∈ [X̄_N − z σ/√N, X̄_N + z σ/√N], with X̄_N the sample mean, σ the standard deviation and z the quantile associated with 1−ε.
§ This does not say a) how the distribution converges and b) whether the distribution is unimodal.
§ The mean value might not 'exist'!
§ The hypothesis on the generation of the samples is very important!
§ Yields the spatial distribution and the probability distribution.
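A minimal sketch of the MC estimate and its CLT confidence interval. The quantity of interest `qoi` and the Gaussian input are hypothetical placeholders for an actual simulation:

```python
import math
import random
import statistics

random.seed(0)

def qoi(x):
    """Hypothetical quantity of interest: a nonlinear map of the random input."""
    return math.sin(x) + 0.1 * x ** 2

# Collect N i.i.d. outcomes of the 'simulation'
N = 20_000
samples = [qoi(random.gauss(0.0, 1.0)) for _ in range(N)]

mean_est = statistics.fmean(samples)
std_est = statistics.stdev(samples)

# CLT: with ~95% probability (z = 1.96) the true expectation lies in this interval
half_width = 1.96 * std_est / math.sqrt(N)
interval = (mean_est - half_width, mean_est + half_width)
```

For X ~ N(0,1) the true value is E[sin X] + 0.1·E[X²] = 0 + 0.1 = 0.1, and the interval shrinks only as 1/√N, which is why MC needs so many samples.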
POLYNOMIAL CHAOS EXPANSION
§ Response function representation of the quantity of interest: f(ξ) ≈ Σ_i c_i ψ_i(ξ).
§ Basis functions ψ_i chosen to represent the input distribution (e.g. Hermite polynomials for Gaussian inputs).
§ The coefficients can be recovered with a least-squares approach or by exploiting the orthogonality of the basis functions: c_i = E[f(ξ) ψ_i(ξ)] / E[ψ_i(ξ)²].
§ Analytical expressions of the statistical moments: E[f] = c₀, Var[f] = Σ_{i>0} c_i² E[ψ_i²].
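A minimal sketch of the orthogonality route for a single Gaussian input, using probabilists' Hermite polynomials up to degree 2; the response function `f` is a hypothetical example chosen so the exact coefficients are known:

```python
import random

random.seed(0)

# Probabilists' Hermite polynomials, orthogonal w.r.t. the standard normal
psi = [lambda u: 1.0, lambda u: u, lambda u: u ** 2 - 1.0]
norms = [1.0, 1.0, 2.0]  # E[psi_i(U)^2] for U ~ N(0, 1)

def f(u):
    """Hypothetical response: exactly 2*psi_0 + 3*psi_1 + 0.5*psi_2."""
    return 2.0 + 3.0 * u + 0.5 * (u ** 2 - 1.0)

# Projection: c_i = E[f(U) psi_i(U)] / E[psi_i(U)^2], estimated by sampling
N = 100_000
us = [random.gauss(0.0, 1.0) for _ in range(N)]
coeffs = [sum(f(u) * p(u) for u in us) / (N * n) for p, n in zip(psi, norms)]

# Statistical moments follow analytically from the coefficients
mean = coeffs[0]                                              # E[f] = c_0
variance = sum(c ** 2 * n for c, n in zip(coeffs[1:], norms[1:]))
```

The recovered coefficients approach (2, 3, 0.5), giving mean ≈ 2 and variance ≈ 3²·1 + 0.5²·2 = 9.5 without any further sampling of f.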
EXAMPLE: DISPOSAL TRAJECTORY FROM L2 TO THE MOON
Vetrisano and Vasile, ASR 2016, Analysis of Spacecraft Disposal Solutions from LPO to the Moon with High Order Polynomial Expansions
§ Trajectory from L2 of the Earth–Sun system to the Moon in a full ephemerides model.
§ Monte Carlo simulation with 1e6 samples vs. PCE of degree 6 with 26,000 samples.
GAUSSIAN MIXTURE
§ Introduced by Garmier et al. and by Terejanu et al. in 2008 for uncertainty propagation; it was then developed further by Giza et al. and DeMars et al. with specific application to space debris.
§ The idea is to represent the distribution of the quantity of interest with a weighted sum of Gaussians: p(x) ≈ Σ_i w_i N(x; μ_i, P_i), with Σ_i w_i = 1.
§ The covariance and mean value of each component are recovered from the updating step of an Unscented Kalman Filter.
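A minimal univariate sketch of the weighted-sum representation. The three components below are arbitrary illustrative numbers, not fitted to any dynamics:

```python
import math

def gauss_pdf(x, mu, sigma):
    """Univariate normal density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

# Hypothetical 3-component mixture; the weights must sum to one
weights = [0.5, 0.3, 0.2]
means = [-1.0, 0.5, 2.0]
sigmas = [0.5, 0.8, 0.3]

def mixture_pdf(x):
    """Weighted sum of Gaussians approximating a non-Gaussian distribution."""
    return sum(w * gauss_pdf(x, m, s) for w, m, s in zip(weights, means, sigmas))

# The mixture mean is simply the weighted sum of the component means
mix_mean = sum(w * m for w, m in zip(weights, means))

# Sanity check: the pdf integrates to ~1 (rectangle rule on a wide grid)
dx = 0.01
total = sum(mixture_pdf(-10.0 + i * dx) for i in range(2001)) * dx
```

Each component's (μ_i, σ_i) would, in the method above, come out of an Unscented Kalman Filter update rather than being fixed by hand as here.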
FROM GAUSSIAN MIXTURE TO KRIGING MODELS
§ One can use a weighted sum of kernels to build a surrogate of the PDF of the quantity of interest using a Kriging type of approach.
§ The hyper-parameters of the Kriging model are then derived from the solution of a maximum likelihood problem.
HDMR
§ HDMR allows for a direct, cheap reconstruction of the quantity of interest.
§ HDMR decomposes the function response, f(x), into a sum of the contributions given by each variable and each one of their interactions through the model:
f(x) = f₀ + Σ_i f_i(x_i) + Σ_{i<j} f_ij(x_i, x_j) + …
§ If one considers the contribution of each variable as a variation with respect to an anchored value f_c (anchored-HDMR), then each term is built from cuts of f through the anchor point.
§ Important point: as for PCE, the decomposition allows for the identification of the interdependency among variables and of the order of the dependency of the quantity of interest on the uncertain parameters.
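A minimal sketch of the anchored (cut-) HDMR construction in two variables. The model `f` and the anchor point are hypothetical; for a second-order function the expansion truncated at second order reconstructs f exactly:

```python
def f(x1, x2):
    """Hypothetical model response with an interaction term."""
    return x1 + 2.0 * x2 + x1 * x2

# Anchor (cut) point
c1, c2 = 0.5, 0.5

# Anchored-HDMR terms: each is a variation w.r.t. the anchored value f_c
f0 = f(c1, c2)                                          # constant term f_c
f1 = lambda x1: f(x1, c2) - f0                          # first-order term in x1
f2 = lambda x2: f(c1, x2) - f0                          # first-order term in x2
f12 = lambda x1, x2: f(x1, x2) - f1(x1) - f2(x2) - f0   # pure interaction

def hdmr(x1, x2):
    """Reconstruction: f0 + sum of first-order terms + interaction."""
    return f0 + f1(x1) + f2(x2) + f12(x1, x2)
```

A non-zero `f12` is exactly the interdependency signal the slide mentions: here f12(x1, x2) = (x1 − c1)(x2 − c2), revealing the x1·x2 coupling.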
INTRUSIVE POLYNOMIAL CHAOS EXPANSIONS
§ Embed the Polynomial Chaos Expansion x(t, ξ) ≈ Σ_i x_i(t) ψ_i(ξ) in the differential equations.
§ After embedding the expansion in the differential equations one gets an expanded residual.
§ We multiply by ψ_j and exploit the orthogonality of the basis with respect to the probability distribution. The result is n differential equations to be integrated for the coefficients x_i(t).
§ Yields the spatial distribution and the probability distribution.
STATE TRANSITION TENSOR
§ The local dynamics are described by applying a Taylor series expansion of the solution flow.
§ Φ is the solution flow from t₀ to t.
§ State transition tensors (STT) are the higher-order partials of the solution flow.
§ A set of non-linear dynamics equations gives the STT (here up to order 3).
§ Analytical expressions exist for the mean m and covariance matrix P of a Gaussian distribution.
POLYNOMIAL ALGEBRA
Kalman Filter
LINEAR MODEL WITH GAUSSIAN NOISE
§ Linear models for state and measurements: x_k = F_k x_{k−1} + w_{k−1}, z_k = H_k x_k + v_k,
§ with w_{k−1} ~ N(0, Q_k) and v_k ~ N(0, R_k).
§ All distributions are Gaussian and remain Gaussian.
LINEAR MODEL WITH GAUSSIAN NOISE
§ The Kalman gain derives from Bayesian inference: K_k = P_k⁻ H_kᵀ (H_k P_k⁻ H_kᵀ + R_k)⁻¹, with updated state x̂_k = x̂_k⁻ + K_k (z_k − H_k x̂_k⁻).
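The full predict/update cycle is easiest to see in the scalar case, where all matrices reduce to numbers. A minimal sketch, assuming illustrative values for F, H, Q, R and a constant true state:

```python
# Scalar Kalman filter. Assumed linear model (illustrative values):
#   x_k = F x_{k-1} + w,  z_k = H x_k + v,  w ~ N(0, Q),  v ~ N(0, R)
F, H, Q, R = 1.0, 1.0, 0.01, 0.25

def predict(x, P):
    """Propagate mean and covariance through the linear model."""
    return F * x, F * P * F + Q

def update(x, P, z):
    """Bayesian inference step: the Kalman gain blends prediction and data."""
    S = H * P * H + R           # innovation covariance
    K = P * H / S               # Kalman gain
    x_new = x + K * (z - H * x)
    P_new = (1.0 - K * H) * P   # posterior covariance shrinks
    return x_new, P_new

# Filter a short stream of noisy measurements of a constant state x = 1
x, P = 0.0, 1.0  # deliberately wrong prior mean, large prior variance
for z in [1.1, 0.9, 1.05, 0.98, 1.02]:
    x, P = predict(x, P)
    x, P = update(x, P, z)
```

After five measurements the estimate has moved from the wrong prior (0) close to the true value (1), and the posterior variance P has dropped well below the prior variance, exactly the Gaussian-stays-Gaussian contraction the slides describe.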
Particle Filter
NONLINEAR MODEL WITH GENERIC DISTRIBUTION
§ Consider the discrete posterior given by N_s samples: p(x_k | z_{1:k}) ≈ Σ_i w_k^i δ(x_k − x_k^i).
§ The weights give us the probability of each sample. The question is: how do I update the weights after propagation and inference?
SAMPLING IS A PROBLEM
§ Importance sampling: we can use an auxiliary distribution, q, to generate samples and then calculate the weights using the actual distribution p: w^i ∝ p(x^i) / q(x^i).
§ The problem is that the sampling process might degenerate and not be sufficient to capture the whole distribution.
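A minimal sketch of importance sampling followed by a resampling pass. The target p (standard normal, unnormalised) and the wider proposal q are hypothetical choices; the systematic resampling scheme at the end is one standard countermeasure to the degeneracy problem mentioned above:

```python
import math
import random

random.seed(0)

def p(x):
    """Target density, unnormalised (constants cancel in the weights)."""
    return math.exp(-0.5 * x * x)

def q(x):
    """Auxiliary (proposal) density, N(0, 2^2), also up to a constant."""
    return math.exp(-0.5 * (x / 2.0) ** 2) / 2.0

# Importance sampling: draw from q, weight by p/q, then normalise the weights
N = 5000
xs = [random.gauss(0.0, 2.0) for _ in range(N)]
ws = [p(x) / q(x) for x in xs]
wsum = sum(ws)
ws = [w / wsum for w in ws]

# Weighted estimate of E[X^2] under p (≈ 1 for a standard normal)
second_moment = sum(w * x * x for w, x in zip(ws, xs))

# Systematic resampling: duplicate heavy particles, drop negligible ones
u0 = random.random() / N
cum, j, resampled = ws[0], 0, []
for i in range(N):
    u = u0 + i / N
    while u > cum and j < N - 1:   # advance to the particle owning this slot
        j += 1
        cum += ws[j]
    resampled.append(xs[j])
```

After resampling all particles carry equal weight 1/N, which resets the degeneracy but introduces sample impoverishment if q was a poor match for p.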
Unscented Kalman Filter
UNSCENTED TRANSFORMATION (OR TRANSFORM)
§ Data fusion and state estimation. Prior distribution of states and measurements.
UNSCENTED TRANSFORMATION AND UQ
§ Builds the covariance matrix of states and measurements assuming a known covariance of the process noise Q and of the measurement noise R (linear Bayesian model hypothesis).
UNSCENTED TRANSFORMATION AND UQ
§ Computes the cross-correlation of states and measurements and builds the posterior estimation based on the new measurement y.
§ State estimation: posterior distribution (UNCERTAINTY).
§ Maximum estimated uncertainty on the covariance.
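The core of the unscented transform is easiest to verify in one dimension, where the sigma-point spread is just a square root of the variance. A minimal sketch using the symmetric sigma-point scheme (the tuning values alpha and kappa, and the linear test function, are illustrative; the same mean weights are reused for the covariance):

```python
import math

def unscented_transform_1d(mean, var, func, alpha=1.0, kappa=2.0):
    """Scalar unscented transform: propagate mean/variance through func
    using 2n+1 = 3 symmetric sigma points (n = 1)."""
    n = 1
    lam = alpha ** 2 * (n + kappa) - n
    spread = math.sqrt((n + lam) * var)
    sigma_pts = [mean, mean + spread, mean - spread]
    w0 = lam / (n + lam)
    wi = 1.0 / (2.0 * (n + lam))
    weights = [w0, wi, wi]          # weights sum to one
    ys = [func(x) for x in sigma_pts]
    y_mean = sum(w * y for w, y in zip(weights, ys))
    y_var = sum(w * (y - y_mean) ** 2 for w, y in zip(weights, ys))
    return y_mean, y_var

# For a linear map the transform is exact: y = 2x + 1 with x ~ N(1, 0.5)
m, v = unscented_transform_1d(1.0, 0.5, lambda x: 2.0 * x + 1.0)
```

Here the transform recovers mean 2·1 + 1 = 3 and variance 2²·0.5 = 2 exactly; for nonlinear functions it captures the posterior moments to second order without any Jacobians, which is what the UKF exploits in its update step.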
Handling the unknown at the edge of tomorrow
http://utopiae.eu
twitter.com/utopiae_network
info@utopiae.eu