Computer Vision: Non-linear Tracking
Marc Pollefeys, COMP 256
Some slides and illustrations from D. Forsyth, M. Isard, T. Darrell …
Tentative class schedule
- Jan 16/18: Introduction
- Jan 23/25: Cameras, Radiometry
- Jan 30/Feb 1: Sources & Shadows, Color
- Feb 6/8: Linear filters & edges, Texture
- Feb 13/15: Multi-View Geometry, Stereo
- Feb 20/22: Optical flow, Project proposals
- Feb 27/Mar 1: Affine SfM, Projective SfM
- Mar 6/8: Camera Calibration, Segmentation
- Mar 13/15: Spring break
- Mar 20/22: Fitting, Prob. Segmentation
- Mar 27/29: Silhouettes and Photoconsistency, Linear tracking
- Apr 3/5: Project Update, Non-linear Tracking
- Apr 10/12: Object Recognition
- Apr 17/19: Range data
- Apr 24/26: Final project
Final project presentation
- No further assignments; focus on the project
- Final presentation: presentation and/or demo (your choice, but let me know)
- Short paper due April 22 by 23:59 (preferably LaTeX, IEEE proc. style)
- Final presentation/demo April 24 and 26
Bayes Filters
Estimating the system state from noisy observations.
- System state dynamics: x_t ~ p(x_t | x_{t-1})
- Observation dynamics: z_t ~ p(z_t | x_t)
We are interested in the belief, or posterior density: Bel(x_t) = p(x_t | z_1, …, z_t)
Recall the law of total probability and Bayes' rule:
- p(x) = ∫ p(x | y) p(y) dy
- p(x | z) = p(z | x) p(x) / p(z)
From these, the two steps of the Bayes filter:
- Predict: Bel⁻(x_t) = ∫ p(x_t | x_{t-1}) Bel(x_{t-1}) dx_{t-1}
- Update: Bel(x_t) = η p(z_t | x_t) Bel⁻(x_t)
Predict: Bel⁻(x_t) = ∫ p(x_t | x_{t-1}) Bel(x_{t-1}) dx_{t-1}
Update: Bel(x_t) = η p(z_t | x_t) Bel⁻(x_t)
Assumptions (Markov process):
- the state depends only on the previous state: p(x_t | x_{1:t-1}, z_{1:t-1}) = p(x_t | x_{t-1})
- the measurement depends only on the current state: p(z_t | x_{1:t}, z_{1:t-1}) = p(z_t | x_t)
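For a discrete state space, the predict/update cycle above can be sketched in a few lines. This is a minimal sketch; the toy transition matrix and likelihood values are illustrative assumptions, not from the slides.

```python
import numpy as np

def bayes_filter_step(belief, transition, likelihood):
    """One predict/update cycle of a discrete Bayes filter.

    belief:     posterior over N discrete states, shape (N,)
    transition: motion model, transition[i, j] = p(x_t = i | x_{t-1} = j)
    likelihood: perceptual model p(z_t | x_t) for the observed z_t, shape (N,)
    """
    predicted = transition @ belief     # predict: sum over previous states
    posterior = likelihood * predicted  # update: weight by the measurement
    return posterior / posterior.sum()  # normalize (the eta factor)

# toy example: 3 states, uniform prior, observation favoring state 0
belief = np.ones(3) / 3
transition = np.array([[0.8, 0.1, 0.1],
                       [0.1, 0.8, 0.1],
                       [0.1, 0.1, 0.8]])
likelihood = np.array([0.9, 0.05, 0.05])
belief = bayes_filter_step(belief, transition, likelihood)
```

After one step with a uniform prior, the posterior simply follows the normalized likelihood; the transition matrix matters once the belief is no longer uniform.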
Bayes Filter: how do we use it? What else do we need to know?
- Motion model: p(x_t | x_{t-1})
- Perceptual model: p(z_t | x_t)
- Start from an initial belief: Bel(x_0)
Example 1
- Step 0: initialization
- Step 1: updating
Example 1 (continued)
- Step 2: predicting
- Step 3: updating
- Step 4: predicting
Several types of Bayes filters
They differ in how they represent probability densities:
- Kalman filter
- Multi-hypothesis filter
- Grid-based approach
- Topological approach
- Particle filter
Kalman Filter
Recall the general problem: estimate the state from noisy observations.
Assumptions of the Kalman filter: linear state and observation models with Gaussian noise.
The belief of the Kalman filter is therefore a unimodal Gaussian.
- Advantage: computational efficiency
- Disadvantage: the assumptions are too restrictive
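A 1-D scalar case makes the predict/update structure concrete. This is a minimal sketch under the slide's assumptions (linear dynamics, Gaussian noise); the noise variances `q` and `r` are illustrative parameters.

```python
def kalman_1d(mu, var, u, z, q, r):
    """One predict/update cycle of a 1-D Kalman filter.

    mu, var : current Gaussian belief (mean, variance)
    u       : control / expected displacement (linear dynamics x' = x + u)
    z       : new measurement
    q, r    : process and measurement noise variances (assumptions)
    """
    # predict: shift the mean, inflate the variance by the process noise
    mu_pred = mu + u
    var_pred = var + q
    # update: blend prediction and measurement via the Kalman gain
    k = var_pred / (var_pred + r)
    mu_new = mu_pred + k * (z - mu_pred)
    var_new = (1.0 - k) * var_pred
    return mu_new, var_new

mu, var = kalman_1d(0.0, 1.0, u=1.0, z=1.2, q=0.1, r=0.5)
```

Note how the update always shrinks the variance: incorporating a measurement can only make the (assumed unimodal) belief more confident.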
Multi-hypothesis Tracking
- Belief is a mixture of Gaussians
- Each Gaussian hypothesis is tracked with its own Kalman filter
- Weights are decided by how well each hypothesis predicts the sensor measurements
- Advantage: can represent multimodal distributions
- Disadvantages: computationally expensive; difficult to decide on the hypotheses
Grid-based Approaches
- Use discrete, piecewise-constant representations of the belief
- Tessellate the environment into small patches, each holding the belief that the object lies in it
- Advantage: can represent arbitrary distributions over the discrete state space
- Disadvantage: computational and space complexity required to keep the position grid in memory and update it
Topological Approaches
- A graph represents the state space: nodes represent the object's location (e.g. a room), edges represent connectivity (e.g. a hallway)
- Advantage: efficiency, because the state space is small
- Disadvantage: coarseness of representation
Particle Filters
- Also known as sequential Monte Carlo methods
- Represent the belief by a set of samples (particles) s(i) with nonnegative weights π(i), called importance factors
- The updating procedure is sequential importance sampling with re-sampling
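One full resample/predict/update cycle can be sketched for a 1-D state. This is a minimal sketch: the Gaussian motion and measurement models, the stationary target at x = 5, and all parameter values are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def pf_step(particles, weights, z, motion_std=0.5, meas_std=1.0):
    """One sequential-importance-sampling step with resampling (SIR)."""
    # resample in proportion to the importance factors
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    particles = particles[idx]
    # predict: propagate each particle through the (assumed) motion model
    particles = particles + rng.normal(0.0, motion_std, size=len(particles))
    # update: new importance factors proportional to p(z | x)
    w = np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
    return particles, w / w.sum()

# track a stationary target at x = 5, starting from a wide uniform prior
particles = rng.uniform(-10.0, 10.0, size=2000)
weights = np.full(2000, 1.0 / 2000)
for _ in range(5):
    particles, weights = pf_step(particles, weights, z=5.0)
estimate = np.sum(weights * particles)
```

After a few iterations the cloud concentrates around the target, illustrating how particles focus on high-probability regions.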
Example 2: Particle Filter
- Step 0: initialization; each particle has the same weight
- Step 1: updating weights; weights are proportional to p(z | x)
Example 2: Particle Filter (continued)
- Step 2: predicting; predict the new locations of the particles
- Step 3: updating weights; weights are proportional to p(z | x)
- Step 4: predicting; predict the new locations of the particles. Particles are more concentrated in the region where the person is more likely to be.
Compare the Particle Filter with a Bayes Filter with Known Distribution
- Updating: Example 1 vs. Example 2
- Predicting: Example 1 vs. Example 2
Comments on Particle Filters
- Advantages: can represent arbitrary densities; converge to the true posterior even for non-Gaussian, nonlinear systems; efficient in the sense that particles tend to focus on regions of high probability
- Disadvantage: worst-case complexity grows exponentially in the state dimension
Particle Filtering in CV: Initial Particle Set
- Particles at t = 0 are drawn from a wide prior because of the large initial uncertainty: a Gaussian with large covariance, or a uniform distribution
- The state includes shape & position; the prior is more constrained for shape
from MacCormick & Blake, 1998
Particle Filtering: Sampling
- Normalize the N particle weights π(1), …, π(N) so that they sum to 1
- Resample particles by picking randomly and uniformly in the [0, 1] range N times (analogous to spinning a roulette wheel with arc lengths of bins equal to the particle weights)
- Adaptively focuses on promising areas of the state space
courtesy of D. Fox
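The roulette-wheel picture maps directly onto a cumulative-sum implementation. A minimal sketch; the example weights are illustrative.

```python
import numpy as np

def roulette_resample(weights, n, rng=None):
    """Roulette-wheel resampling: n uniform draws on [0, 1) land in bins
    whose arc lengths equal the normalized weights; returns the index of
    the particle each draw selects."""
    if rng is None:
        rng = np.random.default_rng(0)
    w = np.asarray(weights, dtype=float)
    cum = np.cumsum(w / w.sum())          # wheel-bin boundaries
    return np.searchsorted(cum, rng.random(n))

idx = roulette_resample([0.7, 0.2, 0.1], n=10000)
counts = np.bincount(idx, minlength=3)    # roughly 7000 / 2000 / 1000
```

High-weight particles are duplicated many times while low-weight ones tend to die out, which is exactly the adaptive focusing the slide describes.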
Particle Filtering: Prediction
- Update each particle using the generative form of the dynamics: a deterministic component (aka "drift") plus a random component (aka "diffusion")
- Drift may be nonlinear (i.e., a different displacement for each particle)
- Each particle diffuses independently, typically modeled with a Gaussian
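The drift-plus-diffusion decomposition is two lines of code. A minimal sketch: the constant drift and the Gaussian diffusion parameters are illustrative assumptions (the slide notes drift may in general be a nonlinear function of each particle's state).

```python
import numpy as np

rng = np.random.default_rng(1)

def predict(particles, drift=1.0, diffusion_std=0.5):
    """Prediction step: deterministic drift, then independent Gaussian
    diffusion per particle."""
    moved = particles + drift  # drift: a constant displacement in this sketch
    return moved + rng.normal(0.0, diffusion_std, size=particles.shape)

new = predict(np.zeros(10000))
```

The predicted cloud shifts by the drift and spreads by the diffusion standard deviation, i.e. prediction always increases uncertainty.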
Particle Filtering: Measurement
- For each particle s(i), compute the new weight π(i) as the measurement likelihood: π(i) = P(z | s(i))
- Enforcing plausibility: particles that represent impossible configurations are given 0 likelihood, e.g. positions outside of the image
A snake measurement likelihood method, from MacCormick & Blake, 1998
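Both ideas, likelihood weighting and the plausibility constraint, fit in one small function. A minimal sketch: the Gaussian likelihood and the valid range stand in for an image-based measurement model.

```python
import numpy as np

def measurement_weights(particles, z, meas_std=1.0, lo=0.0, hi=10.0):
    """Weight each particle by an assumed Gaussian measurement likelihood
    p(z | x), then enforce plausibility by giving zero likelihood to
    particles outside the valid range [lo, hi] (e.g. outside the image)."""
    w = np.exp(-0.5 * ((z - particles) / meas_std) ** 2)
    w[(particles < lo) | (particles > hi)] = 0.0  # impossible configurations
    return w / w.sum()

w = measurement_weights(np.array([-5.0, 0.0, 1.0, 20.0]),
                        z=1.0, lo=-1.0, hi=10.0)
```

Zero-weight particles are then never selected at the next resampling step, so impossible hypotheses are pruned automatically.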
Particle Filtering Steps (aka CONDENSATION)
Figure: resample (sampling occurs here), drift, diffuse, then measure against the measurement likelihood
from Isard & Blake, 1998
Particle Filtering Visualization
1-D system; the red curve is the measurement likelihood
courtesy of M. Isard
CONDENSATION: Example State Posterior
Note how the initial distribution "sharpens"
from Isard & Blake, 1998
Example: Contour-based Head Template Tracking
courtesy of A. Blake
Example: Recovering from Distraction
from Isard & Blake, 1998
Obtaining a State Estimate
- Note that there is no explicit state estimate maintained, just a "cloud" of particles
- An estimate at a particular time can be obtained by querying the current particle set
- Some approaches: the "mean" particle (a weighted sum of the particles, with inverse variance as a confidence measure)
- What we really want is a mode finder: the mean of the tallest peak
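The "mean particle" query is a one-liner over the cloud. A minimal sketch of the weighted-sum estimate with the inverse-variance confidence mentioned above.

```python
import numpy as np

def mean_estimate(particles, weights):
    """'Mean particle' estimate: weighted sum of the particle states,
    with inverse variance as a rough confidence measure."""
    p = np.asarray(particles, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    mean = np.sum(w * p)
    var = np.sum(w * (p - mean) ** 2)
    return mean, 1.0 / var  # (estimate, confidence)

est, conf = mean_estimate([0.0, 1.0, 2.0], [0.25, 0.5, 0.25])
```

A tight cloud yields a small variance and hence a high confidence; a spread-out or multimodal cloud yields a low one.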
CONDENSATION: Estimating Target State
State samples (thickness proportional to weight); mean of weighted state samples
from Isard & Blake, 1998
More examples
Multi-Modal Posteriors
- When there are multiple peaks in the posterior, the MAP estimate is just the tallest one
- This is fine when one peak dominates, but when the peaks are of comparable heights we might sometimes pick the wrong one
- Committing to just one possibility can lead to mistracking; we want a wider sense of the posterior distribution in order to keep track of other good candidate states
adapted from [Hong, 1995]: multiple peaks in the measurement likelihood
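A simple way to find the tallest peak of the particle cloud, rather than its global mean, is to histogram the weighted particles. A minimal sketch; the histogram-based mode finder and the bimodal test cloud are illustrative, not from the slides.

```python
import numpy as np

def tallest_peak_estimate(particles, weights, bins=20):
    """Crude mode finder: histogram the weighted particles and return the
    weighted mean of the particles in (or on the edges of) the tallest
    bin, instead of the global weighted mean, which can fall between
    peaks of a multimodal posterior."""
    hist, edges = np.histogram(particles, bins=bins, weights=weights)
    k = np.argmax(hist)
    mask = (particles >= edges[k]) & (particles <= edges[k + 1])
    return np.average(particles[mask], weights=np.asarray(weights)[mask])

# bimodal cloud: a light cluster near 0 and a heavy cluster near 5
p = np.array([-0.1, 0.0, 0.1, 4.9, 5.0, 5.1])
w = np.array([0.10, 0.10, 0.10, 0.23, 0.24, 0.23])
mode = tallest_peak_estimate(p, w)
```

Here the global weighted mean lies at 3.5, between the two clusters, while the tallest-peak estimate correctly lands on the heavy cluster near 5.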
MCMC-based Particle Filter (Khan, Balch & Dellaert, PAMI 2005)
Models interaction between targets (higher-dimensional state space)
CNN video
Next class: recognition