
Lecture 10 Outline: Monte Carlo Methods
• History of methods
• Sequential random number generators
• Parallel random number generators
• Generating non-uniform random numbers
• Monte Carlo case studies

Monte Carlo Methods
• Monte Carlo is another name for statistical sampling methods of great importance to physics and computer science
• Applications of the Monte Carlo method:
  - Evaluating integrals of arbitrary functions of 6+ dimensions
  - Predicting future values of stocks
  - Solving partial differential equations
  - Sharpening satellite images
  - Modeling cell populations
  - Finding approximate solutions to NP-hard problems

An Interesting History
• In 1738, the Swiss physicist and mathematician Daniel Bernoulli published Hydrodynamica, which laid the basis for the kinetic theory of gases: great numbers of molecules move in all directions, their impact on a surface causes the gas pressure that we feel, and what we experience as heat is simply the kinetic energy of their motion.
• In 1859, the Scottish physicist James Clerk Maxwell formulated the distribution of molecular velocities, which gave the proportion of molecules having a velocity in a specific range. This was the first-ever statistical law in physics. Maxwell used a simple thought experiment: particles must move independently of any chosen coordinates, hence the only possible distribution of velocities must be normal in each coordinate.
• In 1864, Ludwig Boltzmann, a young student in Vienna, came across Maxwell's paper and was so inspired by it that he spent much of his long, distinguished, and tortured life developing the subject further.

History of the Monte Carlo Method
• Credit for inventing the Monte Carlo method is shared by Stanislaw Ulam, John von Neumann, and Nicholas Metropolis.
• Ulam, a Polish-born mathematician, worked for John von Neumann on the Manhattan Project. Ulam is known for designing the hydrogen bomb with Edward Teller in 1951. He conceived of the Monte Carlo method in 1946 while pondering the probabilities of winning a card game of solitaire.
• Ulam, von Neumann, and Metropolis developed algorithms for computer implementations, as well as exploring means of transforming non-random problems into random forms that would facilitate their solution via statistical sampling. This work transformed statistical sampling from a mathematical curiosity to a formal methodology applicable to a wide variety of problems. It was Metropolis who named the new methodology after the casinos of Monte Carlo.
• Ulam and Metropolis published a paper called "The Monte Carlo Method" in the Journal of the American Statistical Association, 44(247), 335-341, in 1949.

Solving Integration Problems via Statistical Sampling: Monte Carlo Approximation
• How to evaluate the integral of f(x)?

Integration Approximation
• Can approximate using another function g(x)

Integration Approximation
• Can approximate by taking the average or expected value

Integration Approximation
• Estimate the average by taking N samples

Monte Carlo Integration
• I_m = Monte Carlo estimate
• N = number of samples
• x_1, x_2, …, x_N are uniformly distributed random numbers between a and b
• I_m = (b - a) · (1/N) · Σ f(x_i), summing over i = 1 … N


Monte Carlo Integration

Monte Carlo Integration
• We have the definition of expected value and how to estimate it.
• Since the expected value can be expressed as an integral, the integral is also approximated by the sum.
• To simplify the integral, we can substitute g(x) = f(x)p(x).
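As a concrete illustration of the estimator above, here is a minimal Python sketch of uniform Monte Carlo integration. The test integrand sin(x) on [0, π] is my own choice (the slides leave f unspecified); its exact integral, 2, is used only as a sanity check.

```python
import math
import random

def mc_integrate(f, a, b, n):
    """Uniform Monte Carlo estimate: I_m = (b - a)/N * sum of f(x_i)."""
    total = 0.0
    for _ in range(n):
        x = a + (b - a) * random.random()   # x_i uniformly distributed on [a, b)
        total += f(x)
    return (b - a) * total / n

if __name__ == "__main__":
    # sin(x) integrates to 2 over [0, pi]; the error should shrink like 1/sqrt(N)
    for n in (100, 10_000, 1_000_000):
        print(n, mc_integrate(math.sin, 0.0, math.pi, n))
```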

Variance
• The variance describes how much the sampled values vary from each other.
• Variance is proportional to 1/N.

Variance
• The standard deviation is just the square root of the variance.
• Standard deviation is proportional to 1/sqrt(N).
• Need 4x as many samples to halve the error.

Variance
• Problem:
  - Variance (noise) decreases slowly
  - Using more samples only removes a small amount of noise

Variance Reduction
• There are several ways to reduce the variance:
  - Importance sampling
  - Stratified sampling
  - Quasi-random sampling
  - Metropolis random mutations

Importance Sampling
• Idea: use more samples in important regions of the function
• If the function is high in small areas, use more samples there

Importance Sampling
• Want g/p to have low variance
• Choose a good function p similar to g
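A small sketch of the idea, with an integrand of my own choosing (not from the slides): to estimate the integral of g(x) = x²·e^(-x) over [0, ∞), sample x from p(x) = e^(-x) and average g/p = x². The exact value is 2.

```python
import random

def importance_estimate(n=100_000):
    """Importance sampling: integral of g equals E_p[g/p] when x is drawn from p."""
    total = 0.0
    for _ in range(n):
        x = random.expovariate(1.0)   # sample x ~ p(x) = exp(-x)
        total += x * x                # g(x)/p(x) = x^2
    return total / n

print(importance_estimate())          # should be close to 2
```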

Stratified Sampling
• Partition S into smaller domains S_i
• Evaluate the integral as a sum of integrals over the S_i
• Example: jittering for pixel sampling
• Often works much better than importance sampling in practice
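A sketch of one-sample-per-stratum (jittered) sampling on an interval, analogous to jittering for pixel sampling. The interface mirrors the uniform estimator shown earlier and is illustrative only.

```python
import math
import random

def stratified_integrate(f, a, b, n):
    """Split [a, b] into n equal strata and place one jittered sample in each."""
    h = (b - a) / n
    total = 0.0
    for i in range(n):
        x = a + (i + random.random()) * h   # uniform within stratum i
        total += f(x)
    return (b - a) * total / n

print(stratified_integrate(math.sin, 0.0, math.pi, 1_000))   # exact value is 2
```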

Parallelism in Monte Carlo Methods
• Monte Carlo methods are often amenable to parallelism
• With p processes, find an estimate about p times faster, OR
• Reduce the error of the estimate by a factor of p^(1/2)

Random versus Pseudo-random
• Virtually all computers have "random number" generators
• Their operation is deterministic
• Sequences are predictable
• More accurately called "pseudo-random number" generators
• In this chapter "random" is shorthand for "pseudo-random"
• "RNG" means "random number generator"

Properties of an Ideal RNG
• Uniformly distributed
• Uncorrelated
• Never cycles
• Satisfies any statistical test for randomness
• Reproducible
• Machine-independent
• Changing the "seed" value changes the sequence
• Easily split into independent subsequences
• Fast
• Limited memory requirements

No RNG Is Ideal
• Finite precision arithmetic ⇒ finite number of states ⇒ cycles
  - Period = length of cycle
  - If period > number of values needed, effectively acyclic
• Reproducible ⇒ correlations
• Often speed versus quality trade-offs

Linear Congruential RNGs
• X_i = (a · X_{i-1} + c) mod M
  - M = modulus, c = additive constant, a = multiplier
• Sequence depends on the choice of seed, X_0

Period of a Linear Congruential RNG
• Maximum period is M
• For 32-bit integers the maximum period is 2^32, or about 4 billion
• This is too small for modern computers
• Use a generator with at least 48 bits of precision

Producing Floating-Point Numbers
• X_i, a, c, and M are all integers
• The X_i range in value from 0 to M-1
• To produce floating-point numbers in the range [0, 1), divide X_i by M
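A minimal linear congruential generator sketch covering the last three slides: the recurrence, a 48-bit choice of constants (those of the widely used drand48 generator; any parameters with at least 48 bits of precision would do), and division by M to obtain floats in [0, 1).

```python
class LCG:
    """X_i = (a * X_{i-1} + c) mod M, with X_0 = seed."""
    def __init__(self, seed=1, a=0x5DEECE66D, c=11, m=2**48):
        self.a, self.c, self.m = a, c, m
        self.x = seed % m

    def next_int(self):
        self.x = (self.a * self.x + self.c) % self.m
        return self.x                     # integer in 0 .. M-1

    def next_float(self):
        return self.next_int() / self.m   # float in [0, 1)

rng = LCG(seed=42)
print([round(rng.next_float(), 4) for _ in range(5)])
```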

Defects of Linear Congruential RNGs
• Least significant bits are correlated
  - Especially when M is a power of 2
• k-tuples of random numbers form a lattice
  - Points tend to lie on hyperplanes
  - Especially pronounced when k is large

Lagged Fibonacci RNGs
• X_i = X_{i-p} * X_{i-q}
• p and q are lags, p > q
• * is any binary arithmetic operation:
  - Addition modulo M
  - Subtraction modulo M
  - Multiplication modulo M
  - Bitwise exclusive or

Properties of Lagged Fibonacci RNGs
• Require p seed values
• Careful selection of seed values, p, and q can result in very long periods and good randomness
• For example, suppose M has b bits
• Maximum period for an additive lagged Fibonacci RNG is (2^p - 1) · 2^(b-1)
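A sketch of an additive lagged Fibonacci generator. The lags (55, 24) and the 32-bit modulus are a classic, well-studied choice rather than anything prescribed by the slides; the p seed values here are simply taken from Python's own generator.

```python
import random
from collections import deque

class LaggedFibonacci:
    """Additive lagged Fibonacci RNG: X_i = (X_{i-p} + X_{i-q}) mod 2^bits."""
    def __init__(self, seeds, p=55, q=24, bits=32):
        assert len(seeds) == p and p > q
        self.p, self.q = p, q
        self.mask = (1 << bits) - 1
        self.state = deque(seeds, maxlen=p)   # holds the last p values

    def next_int(self):
        x = (self.state[-self.p] + self.state[-self.q]) & self.mask
        self.state.append(x)                  # oldest value falls off the left
        return x

seeds = [random.getrandbits(32) for _ in range(55)]   # the p required seed values
gen = LaggedFibonacci(seeds)
print([gen.next_int() for _ in range(3)])
```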

Ideal Parallel RNGs
• All properties of sequential RNGs
• No correlations among numbers in different sequences
• Scalability
• Locality

Parallel RNG Designs
• Manager-worker
• Leapfrog
• Sequence splitting
• Independent sequences

Manager-Worker Parallel RNG
• Manager process generates random numbers
• Worker processes consume them
• If the algorithm is synchronous, may achieve the goal of consistency
• Not scalable
• Does not exhibit locality

Leapfrog Method (figure: the process with rank 1 of 4 processes)

Properties of the Leapfrog Method
• Easy to modify a linear congruential RNG to support jumping by p
• Can allow a parallel program to generate the same tuples as the sequential program
• Does not support dynamic creation of new random number streams
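A sketch of how an LCG can be modified to leapfrog, as the slide describes: with p processes, each process's stream advances p values at a time using the composed constants A = a^p mod M and C = c·(a^(p-1) + … + a + 1) mod M. The constants reuse the 48-bit LCG sketch above, and the rank/offset convention is my own.

```python
def leapfrog_stream(seed, rank, nprocs, a=0x5DEECE66D, c=11, m=2**48):
    """Yield the sub-sequence X_{rank+1}, X_{rank+1+p}, X_{rank+1+2p}, ... of an LCG,
    where p = nprocs, without generating the values in between."""
    A = pow(a, nprocs, m)                              # a^p mod M
    C = c * sum(pow(a, k, m) for k in range(nprocs)) % m
    x = seed
    for _ in range(rank + 1):                          # advance to this process's first value
        x = (a * x + c) % m
    while True:
        yield x / m
        x = (A * x + C) % m                            # jump p values ahead

# Interleaving the 4 sub-streams reproduces the first values of the sequential stream.
streams = [leapfrog_stream(seed=42, rank=r, nprocs=4) for r in range(4)]
print([round(next(s), 4) for s in streams])
```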

Sequence Splitting (figure: the process with rank 1 of 4 processes)

Properties of Sequence Splitting
• Forces each process to move ahead to its starting point
• Does not support the goal of reproducibility
• May run into long-range correlation problems
• Can be modified to support dynamic creation of new sequences

Independent Sequences
• Run a sequential RNG on each process
• Start each with different seed(s) or other parameters
• Example: linear congruential RNGs with different additive constants
• Works well with lagged Fibonacci RNGs
• Supports the goals of locality and scalability

Statistical Simulation: Metropolis Algorithm
• Metropolis algorithm [Metropolis, Rosenbluth, Teller 1953]
  - Simulates the behavior of a physical system according to principles of statistical mechanics.
  - Globally biased toward "downhill" lower-energy steps, but occasionally makes "uphill" steps to break out of local minima.
• Gibbs-Boltzmann function: the probability of finding a physical system in a state with energy E is proportional to e^(-E/(kT)), where T > 0 is the temperature and k is a constant.
  - For any temperature T > 0, this is a monotone decreasing function of the energy E.
  - The system is more likely to be in a lower energy state than a higher one.
    - T large: high and low energy states have roughly the same probability
    - T small: low energy states are much more probable

• Metropolis algorithm:
  - Given a fixed temperature T, maintain the current state S.
  - Randomly perturb the current state S to a new state S' in N(S).
  - If E(S') ≤ E(S), update the current state to S'. Otherwise, update the current state to S' with probability e^(-dE/(kT)), where dE = E(S') - E(S) > 0.
• Convergence theorem: let f_S(t) be the fraction of the first t steps in which the simulation is in state S. Then, assuming some technical conditions, with probability 1, f_S(t) converges to the Gibbs-Boltzmann probability of state S.
• Intuition: the simulation spends roughly the right amount of time in each state, according to the Gibbs-Boltzmann equation.
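A generic sketch of the acceptance rule just described. The `energy` and `neighbor` callables stand in for the physical system and its move set; they are assumptions of this example, not something defined in the lecture.

```python
import math
import random

def metropolis(energy, neighbor, s0, T, steps, k=1.0):
    """Fixed-temperature Metropolis sampling over states produced by neighbor(s)."""
    s = s0
    E = energy(s)
    for _ in range(steps):
        s_new = neighbor(s)
        E_new = energy(s_new)
        dE = E_new - E
        # Always accept downhill moves; accept uphill moves with probability e^(-dE/kT).
        if dE <= 0 or random.random() < math.exp(-dE / (k * T)):
            s, E = s_new, E_new
    return s
```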

Simulated Annealing
• Simulated annealing:
  - T large ⇒ the probability of accepting an uphill move is large.
  - T small ⇒ uphill moves are almost never accepted.
  - Idea: turn a knob to control T.
  - Cooling schedule: T = T(i) at iteration i.
• Physical analog:
  - Take a solid and raise it to high temperature; we do not expect it to maintain a nice crystal structure.
  - Take a molten solid and freeze it very abruptly; we do not expect to get a perfect crystal either.
  - Annealing: cool the material gradually from high temperature, allowing it to reach equilibrium at a succession of intermediate lower temperatures.

Other Distributions
• Analytical transformations
• Box-Muller transformation
• Rejection method

Analytical Transformation
• f(x): probability density function
• F(x): cumulative distribution function
• In probability theory, the quantile function of a distribution is the inverse of its cumulative distribution function.

Exponential Distribution
• An exponential distribution arises naturally when modeling the time between independent events that happen at a constant average rate and are memoryless.
• One of the few cases where the quantile function is known analytically.

Example 1: Produce four samples from an exponential distribution with mean 3
• Uniform sample: 0.540, 0.619, 0.452, 0.095
• Take the natural log of each value and multiply by -3
• Exponential sample: 1.850, 1.440, 2.317, 7.072
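The recipe in Example 1, written out: invert the exponential CDF, so x = -mean · ln(u) for u uniform on (0, 1]. The mean of 3 matches the slide; the specific uniform draws come from Python's generator rather than the slide's list.

```python
import math
import random

def exponential_sample(mean):
    """Inverse-transform sampling: x = -mean * ln(u), u uniform on (0, 1]."""
    u = 1.0 - random.random()     # avoid u == 0, where ln is undefined
    return -mean * math.log(u)

print([round(exponential_sample(3.0), 3) for _ in range(4)])   # four samples, mean 3
```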

Example 2:
• Simulation advances in time steps of 1 second
• The probability of an event happening is from an exponential distribution with mean 5 seconds
• What is the probability that the event will happen in the next second?
• F(1) = 1 - exp(-1/5) = 0.181269247
• Use a uniform random number to test for occurrence of the event (if u < 0.181 then 'event' else 'no event')

Normal Distributions: Box-Muller Transformation
• Cannot invert the cumulative distribution function to produce a formula yielding random numbers from a normal (Gaussian) distribution
• The Box-Muller transformation produces a pair of standard normal deviates g_1 and g_2 from a pair of uniform deviates u_1 and u_2

Box-Muller Transformation
  repeat
    v_1 <- 2·u_1 - 1
    v_2 <- 2·u_2 - 1
    r <- v_1^2 + v_2^2
  until r > 0 and r < 1
  f <- sqrt(-2 ln r / r)
  g_1 <- f·v_1
  g_2 <- f·v_2
• This is a consequence of the fact that the chi-square distribution with two degrees of freedom is an easily generated exponential random variable.
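The rejection (polar) form of the transformation above, transcribed into Python; u_1 and u_2 are fresh uniform draws on each pass of the loop.

```python
import math
import random

def box_muller_pair():
    """Return two independent standard normal deviates g1, g2."""
    while True:
        v1 = 2.0 * random.random() - 1.0
        v2 = 2.0 * random.random() - 1.0
        r = v1 * v1 + v2 * v2
        if 0.0 < r < 1.0:                 # keep only points inside the unit circle
            break
    f = math.sqrt(-2.0 * math.log(r) / r)
    return f * v1, f * v2

print(box_muller_pair())
```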

Example
• Produce four samples from a normal distribution with mean 0 and standard deviation 1

  u_1    u_2    v_1     v_2     r      f      g_1     g_2
  0.234  0.784  -0.532  0.568   0.605  1.290  -0.686  0.732
  0.824  0.039  0.648   -0.921  1.269  (rejected: r > 1)
  0.430  0.176  -0.140  -0.648  0.439  1.935  -0.271  -1.254


Rejection Method

Example
• Generate random variables from this probability density function

Example (cont.)
• So h(x) ≥ f(x) for all x

Example (cont.)

  x_i    u_i    u_i·h(x_i)  f(x_i)  Outcome
  0.860  0.975  0.689       0.681   Reject
  1.518  0.357  0.252       0.448   Accept
  0.357  0.920  0.650       0.349   Reject
  1.306  0.272  0.192       0.523   Accept

• Two samples from f(x) are 1.518 and 1.306
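A generic sketch of the accept/reject test used in the table: draw x from the envelope, draw u uniform, keep x when u·h(x) ≤ f(x). The slide's actual f and h were given graphically, so the density f(x) = 2x on [0, 1] with constant envelope h(x) = 2 below is purely illustrative.

```python
import random

def rejection_sample(f, h, sample_from_h):
    """Rejection method: requires h(x) >= f(x) for all x and the ability to sample from h."""
    while True:
        x = sample_from_h()
        u = random.random()
        if u * h(x) <= f(x):     # accept
            return x
        # otherwise reject and try again

# Illustrative target: f(x) = 2x on [0, 1], enveloped by h(x) = 2 (sampled uniformly).
samples = [rejection_sample(lambda x: 2.0 * x, lambda x: 2.0, random.random)
           for _ in range(5)]
print([round(s, 3) for s in samples])
```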

Case Studies (Topics Introduced)
• Temperature inside a 2-D plate (random walk)
• Two-dimensional Ising model (Metropolis algorithm)
• Room assignment problem (simulated annealing)
• Parking garage (Monte Carlo time)
• Traffic circle (simulating queues)

Temperature Inside a 2-D Plate: Random Walk

Example of Random Walk (figure)

NP-Hard Assignment Problems
• TSP: Find a tour of US cities that minimizes distance.

Physical Annealing
• Heat a solid until it melts
• Cool slowly to allow the material to reach a state of minimum energy
• Produces a strong, defect-free crystal with regular structure

Simulated Annealing
• Makes an analogy between physical annealing and solving a combinatorial optimization problem
• Solution to problem = state of material
• Value of objective function = energy associated with state
• Optimal solution = minimum energy state

How Simulated Annealing Works
• Iterative algorithm: slowly lower T
• Randomly change the solution to create an alternate solution
• Compute ΔE, the change in the value of the objective function
• If ΔE < 0, then jump to the alternate solution
• Otherwise, jump to the alternate solution with probability e^(-ΔE/T)
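The loop described above, as a hedged sketch: a geometric cooling schedule (see the next slide) and caller-supplied `cost` and `neighbor` functions, which are placeholders for whatever combinatorial problem is being solved.

```python
import math
import random

def simulated_annealing(cost, neighbor, s0, t0=10.0, alpha=0.999, steps=100_000):
    """Slowly lower T; accept worse solutions with probability e^(-delta/T)."""
    s, c = s0, cost(s0)
    best, best_c = s, c
    T = t0
    for _ in range(steps):
        s_new = neighbor(s)
        c_new = cost(s_new)
        delta = c_new - c
        if delta < 0 or random.random() < math.exp(-delta / T):
            s, c = s_new, c_new            # jump to the alternate solution
            if c < best_c:
                best, best_c = s, c
        T *= alpha                         # geometric cooling: T_{i+1} = alpha * T_i
    return best, best_c
```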

Performance of Simulated Annealing
• Rate of convergence depends on the initial value of T and the temperature change function
• Geometric temperature change functions are typical, e.g., T_{i+1} = 0.999 · T_i
• Not guaranteed to find the optimal solution
• The same algorithm using different random number streams can converge on different solutions
• Opportunity for parallelism

Convergence
• Starting with a higher initial temperature leads to more iterations before convergence

Parking Garage
• Parking garage has S stalls
• Car arrivals fit a Poisson distribution with mean A: exponentially distributed interarrival times
• Stay in garage fits a normal distribution with mean M and standard deviation M/S

Implementation Idea

  Times spaces are available: 101.2, 142.1, 70.3, 91.7, 223.1
  Current time: 64.2
  Car count: 15
  Cars rejected: 2
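A minimal event-driven sketch of the model just outlined, assuming the parameter names S, A, and M from the earlier slide: exponential inter-arrival times with mean A drive "Monte Carlo time" forward, stays are normal with mean M and standard deviation M/S, and a heap of departure times plays the role of the "times spaces are available" list.

```python
import heapq
import random

def parking_garage(S, A, M, horizon):
    """Return (cars seen, cars rejected) over `horizon` time units."""
    departures = []                 # times at which occupied stalls become free
    t = 0.0
    cars = rejected = 0
    while True:
        t += random.expovariate(1.0 / A)          # next arrival (mean A between cars)
        if t >= horizon:
            break
        cars += 1
        while departures and departures[0] <= t:  # release stalls whose cars have left
            heapq.heappop(departures)
        if len(departures) < S:
            stay = max(0.0, random.gauss(M, M / S))
            heapq.heappush(departures, t + stay)
        else:
            rejected += 1                         # garage full: car turned away
    return cars, rejected

print(parking_garage(S=100, A=1.0, M=120.0, horizon=10_000.0))
```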

Summary (1/3)
• Applications of Monte Carlo methods
  - Numerical integration
  - Simulation
• Random number generators
  - Linear congruential
  - Lagged Fibonacci

Summary (2/3)
• Parallel random number generators
  - Manager-worker
  - Leapfrog
  - Sequence splitting
  - Independent sequences
• Non-uniform distributions
  - Analytical transformations
  - Box-Muller transformation
  - (Rejection method)

Summary (3/3)
• Concepts revealed in case studies
  - Monte Carlo time
  - Random walk
  - Metropolis algorithm
  - Simulated annealing
  - Modeling queues