Optimality Conditions for Unconstrained Optimization: One Dimensional Optimization
Optimality Conditions for Unconstrained Optimization
• One dimensional optimization
– Necessary and sufficient conditions
• Multidimensional optimization
– Classification of stationary points
– Necessary and sufficient conditions for local optima
• Convexity and global optimality
One dimensional optimization
• We are accustomed to thinking that if f(x) has a minimum then f'(x) = 0, but the converse does not hold: f'(x) = 0 alone does not guarantee a minimum.
1-D optimization jargon
• A point with zero derivative is a stationary point.
• A stationary point (e.g. x = 5 in the slide's figure) can be:
– A minimum
– A maximum
– An inflection point
Optimality criteria for smooth functions
• Conditions at a candidate point x*:
• f'(x*) = 0 is the condition for stationarity and a necessary condition for a minimum.
• f''(x*) > 0 is sufficient for a minimum.
• f''(x*) < 0 is sufficient for a maximum.
• When f''(x*) = 0, the test is inconclusive and information from higher derivatives is needed.
• Example?
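A minimal sketch of these tests using central finite differences; the example f(x) = x⁴ is an assumption (not from the slide) chosen because f''(0) = 0 makes the second-order test inconclusive even though x = 0 is a minimum (the fourth derivative, 24 > 0, decides it):

```python
# Checking 1-D optimality conditions numerically via central differences.
# f(x) = x**4 is an assumed example where f''(0) = 0, so the sufficient
# second-order test is inconclusive, yet x = 0 is in fact a minimum.

def deriv(f, x, h=1e-4):
    """Central-difference estimate of f'(x)."""
    return (f(x + h) - f(x - h)) / (2 * h)

def deriv2(f, x, h=1e-4):
    """Central-difference estimate of f''(x)."""
    return (f(x + h) - 2 * f(x) + f(x - h)) / h**2

f = lambda x: x**4
print(abs(deriv(f, 0.0)) < 1e-8)    # stationarity: f'(0) = 0
print(abs(deriv2(f, 0.0)) < 1e-6)   # f''(0) = 0: test inconclusive
print(deriv2(lambda x: x**2, 0.0))  # by contrast, x**2 gives f'' = 2 > 0
```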
Taylor series expansion
• Expanding f about a candidate minimum x*:
f(x) ≈ f(x*) + ∇f(x*)ᵀ(x − x*) + ½ (x − x*)ᵀ ∇²f(x*) (x − x*)
• The first-order term must vanish for x* to be a minimum, so ∇f(x*) = 0 is the condition for stationarity.
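A small numerical check of the expansion; the function f(x, y) = x² + 3y² and the step d are assumed examples. At the stationary point the gradient term drops out, so the quadratic (Hessian) term governs the local behavior:

```python
# Verifying f(x* + d) = f(x*) + grad·d + 0.5 d^T H d for an assumed
# example f(x, y) = x^2 + 3y^2 at its stationary point x* = (0, 0).
import numpy as np

f = lambda v: v[0]**2 + 3 * v[1]**2
xstar = np.array([0.0, 0.0])
grad = np.array([0.0, 0.0])              # grad f = (2x, 6y) vanishes at x*
H = np.array([[2.0, 0.0], [0.0, 6.0]])   # Hessian of f (constant here)

d = np.array([0.1, -0.2])
taylor = f(xstar) + grad @ d + 0.5 * d @ H @ d
print(f(xstar + d), taylor)  # agree exactly: f is itself quadratic
```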
Conditions for minimum
• Sufficient condition for a minimum: stationarity plus the matrix of second derivatives (the Hessian) being positive definite.
• The simplest way to check positive definiteness is via eigenvalues: all eigenvalues need to be positive.
• Necessary condition: the Hessian is positive semidefinite, i.e. all eigenvalues are non-negative.
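The eigenvalue check can be sketched with NumPy; the example Hessian (from the assumed function x² + 1.5y²) is an illustration, not from the slide:

```python
# Classifying a symmetric Hessian by the signs of its eigenvalues.
import numpy as np

def classify_hessian(H, tol=1e-10):
    """Label H by its eigenvalue signs (eigvalsh returns them ascending)."""
    w = np.linalg.eigvalsh(H)
    if np.all(w > tol):
        return "positive definite (minimum)"
    if np.all(w >= -tol):
        return "positive semidefinite (possibly minimum)"
    if np.all(w < -tol):
        return "negative definite (maximum)"
    if np.all(w <= tol):
        return "negative semidefinite (possibly maximum)"
    return "indefinite (saddle point)"

H = np.array([[2.0, 0.0], [0.0, 3.0]])   # Hessian of x^2 + 1.5*y^2
print(classify_hessian(H))               # positive definite (minimum)
```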
Types of stationary points
• Positive definite: minimum
• Positive semidefinite: possibly a minimum
• Indefinite: saddle point
• Negative semidefinite: possibly a maximum
• Negative definite: maximum
• Example of an indefinite stationary point?
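One standard answer to the slide's closing question (the function is an assumption, not taken from the slide) is f(x, y) = x² − y², whose origin is stationary with an indefinite Hessian:

```python
# f(x, y) = x^2 - y^2: the origin is a stationary point (gradient
# (2x, -2y) vanishes there) but the Hessian has mixed-sign
# eigenvalues, so it is indefinite: a saddle point.
import numpy as np

grad = np.array([2 * 0.0, -2 * 0.0])       # gradient at (0, 0)
H = np.array([[2.0, 0.0], [0.0, -2.0]])    # constant Hessian of f
eigs = np.linalg.eigvalsh(H)
print(grad, eigs)   # gradient is zero; eigenvalues -2 and 2: indefinite
```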
Example
Global optimization
• The function f(x) = x + sin(2x) has infinitely many stationary points, so satisfying the local optimality conditions says nothing about global optimality.
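A sketch locating the local minima of this function on an assumed interval [−5, 5]: the minima are the roots of f'(x) = 1 + 2cos(2x) where f' changes sign from negative to positive, and there are several of them with different function values:

```python
# f(x) = x + sin(2x) has local minima wherever f'(x) = 1 + 2*cos(2x)
# crosses zero from below, i.e. at x = -pi/3 + k*pi. Scan an assumed
# interval [-5, 5] for those sign changes and refine each by bisection.
import math

f = lambda x: x + math.sin(2 * x)
df = lambda x: 1 + 2 * math.cos(2 * x)

minima = []
xs = [-5 + 0.01 * i for i in range(1001)]
for a, b in zip(xs, xs[1:]):
    if df(a) < 0 <= df(b):           # f' goes - to +: a minimum lies between
        for _ in range(60):          # bisection for the root of f'
            m = 0.5 * (a + b)
            if df(m) < 0:
                a = m
            else:
                b = m
        minima.append(0.5 * (a + b))

print([round(x, 3) for x in minima])   # three distinct local minima
```

Comparing f at these points shows the leftmost one is best on this interval, so a local search started near the others returns a suboptimal answer.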
Convex function
• A straight line (chord) connecting two points on the graph will not dip below the function graph.
• Sufficient condition: positive semidefinite Hessian everywhere.
• What does that mean geometrically?
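The chord definition can be tested directly by sampling; the functions and interval below are assumed examples (for continuous functions, the midpoint version of the chord condition is equivalent to convexity):

```python
# Testing the chord (midpoint) characterization of convexity on a grid
# of point pairs: for convex f, the graph never rises above the chord.
import math

def chord_test(f, lo, hi, n=20):
    """Check f(midpoint) <= average of endpoint values for all grid pairs."""
    pts = [lo + (hi - lo) * i / n for i in range(n + 1)]
    for a in pts:
        for b in pts:
            if f(0.5 * (a + b)) > 0.5 * (f(a) + f(b)) + 1e-12:
                return False   # graph rises above the chord: not convex
    return True

print(chord_test(lambda x: x * x, -3, 3))   # x^2 is convex
print(chord_test(math.sin, -3, 3))          # sin is not convex on [-3, 3]
```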
Reciprocal approximation
• The reciprocal approximation (linear in one over the variables, 1/xᵢ) is desirable in many cases because it captures decreasing-returns behavior.
• Linear approximation: g_L(x) = g(x⁰) + Σᵢ ∂g/∂xᵢ (xᵢ − xᵢ⁰)
• Reciprocal approximation: g_R(x) = g(x⁰) + Σᵢ ∂g/∂xᵢ (xᵢ⁰/xᵢ)(xᵢ − xᵢ⁰)
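A one-variable sketch comparing the two forms; the function g(x) = 1/x and expansion point x⁰ = 2 are assumed examples, chosen because g is itself linear in 1/x, so the reciprocal approximation reproduces it exactly while the linear one does not:

```python
# Linear vs. reciprocal approximations of g(x) = 1/x about x0 = 2.
# The reciprocal form is exact here, since g is linear in 1/x.

x0 = 2.0
g = lambda x: 1.0 / x
dg = -1.0 / x0**2                  # g'(x0)

lin = lambda x: g(x0) + dg * (x - x0)
recip = lambda x: g(x0) + dg * (x0 / x) * (x - x0)

for x in (1.0, 3.0, 4.0):
    print(x, g(x), lin(x), recip(x))   # recip matches g at every x
```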
Conservative-convex approximation
• At times we benefit from conservative approximations.
• Using the linear term for variables with positive derivative and the reciprocal term for the rest makes all second derivatives of the approximation g_C non-negative.
• Called convex linearization (CONLIN), due to Claude Fleury.
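A minimal sketch of the CONLIN mixing rule for positive variables; the example function g(x₁, x₂) = x₁ + 1/x₂ and expansion point are assumptions for illustration (CONLIN happens to be exact for it, since g is linear in x₁ and in 1/x₂):

```python
# CONLIN sketch: per variable, keep the linear term when the partial
# derivative is non-negative, otherwise switch to the reciprocal term.
# With positive variables this makes every second derivative of the
# approximation non-negative, hence convex.

def conlin(g0, grad, x0, x):
    """CONLIN approximation of g at x, expanded about x0."""
    val = g0
    for gi, xi0, xi in zip(grad, x0, x):
        if gi >= 0:
            val += gi * (xi - xi0)                 # linear term
        else:
            val += gi * (xi0 / xi) * (xi - xi0)    # reciprocal term
    return val

# Assumed example: g(x1, x2) = x1 + 1/x2 about x0 = (1, 1), grad = (1, -1).
print(conlin(2.0, (1.0, -1.0), (1.0, 1.0), (2.0, 2.0)))  # g(2, 2) = 2.5
```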