# Optimization Cheatsheet

## Handout 1 Introduction

### Basic Notions

**Optimal Value:** The greatest lower bound (infimum) of the set $\{f(x): x \in X\}$

**Global Minimizer:** A point $x^* \in X$ such that $f(x^*) \le f(x)$ for all $x \in X$

Note that the optimal value can be finite even if a global minimizer does not exist
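For example (an illustration, not from the handout): $f(x) = e^{-x}$ over $X = R$ has optimal value $0$, yet no $x^*$ attains it:

```python
import math

# f(x) = exp(-x) on R: the infimum of {f(x)} is 0, but no point attains it.
values = [math.exp(-x) for x in range(0, 60, 10)]
assert all(v > 0 for v in values)   # f(x) never equals 0
assert math.exp(-50) < 1e-20        # yet f(x) gets arbitrarily close to 0
```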

### Different Problems

**LP Problems:** $f(x)$ is a linear function and $X$ is a set defined by linear inequalities

minimize $c^Tx$ subject to $Ax \le b$

Note that equalities can be transformed to inequalities by:

$ax = b$ $\iff$ $ax \le b$ and $-ax \le -b$
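A quick numeric sketch of this rewriting (the system below is a made-up example):

```python
import numpy as np

# Rewrite Ax = b as the stacked inequalities [A; -A] x <= [b; -b].
A = np.array([[1.0, 2.0]])
b = np.array([3.0])
A_ineq = np.vstack([A, -A])
b_ineq = np.concatenate([b, -b])

x = np.array([1.0, 1.0])                      # satisfies Ax = b exactly
assert np.allclose(A @ x, b)
assert np.all(A_ineq @ x <= b_ineq + 1e-12)   # hence both inequalities hold
```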

**Quadratic Programming (QP) Problems:** $X$ is also a set defined by linear inequalities while $f(x) = x^TQx$ (we may take $Q = \frac{Q+Q^T}{2}$ to be symmetric without loss of generality)

Note that this is different from Quadratically Constrained Quadratic Programming Problems.

**Semidefinite Programming (SDP) Problems:**

inf $b^Ty$ subject to $C - \sum_{i=1}^m y_iA_i \ge 0$, $y \in R^m$

Positive Semidefinite: $Q \ge 0$ or $Q \in S^n_+$ if $x^TQx \ge 0$ for all $x \in R^n$

How to combine multiple constraints? Two constraints

$C_1 - \sum_{i=1}^m y_i A_i \ge 0$ and $C_2 - \sum_{i=1}^m y_i B_i \ge 0$

can be merged into a single block-diagonal constraint:

$\left[ \begin{matrix} C_1 & \\ & C_2 \end{matrix} \right] - \sum_{i=1}^m y_i \left[ \begin{matrix} A_i & \\ & B_i \end{matrix} \right] \ge 0$

### Properties of Semidefinite Matrices

**1)**

Let $A = \left[ \begin{matrix} A_1 & A_2 \\ A_2^T & A_3 \end{matrix} \right] \ge 0$, then $A_1 \ge 0$ and $A_3 \ge 0$
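This block property is easy to spot-check numerically; the matrices below are arbitrary examples chosen so that the full matrix $A$ is positive semidefinite:

```python
import numpy as np

A1 = np.array([[2.0, 0.0], [0.0, 1.0]])
A2 = np.array([[0.5], [0.2]])
A3 = np.array([[1.0]])
A = np.block([[A1, A2], [A2.T, A3]])   # symmetric, diagonally dominant => PSD

def is_psd(M):
    # A symmetric matrix is PSD iff all its eigenvalues are non-negative.
    return bool(np.all(np.linalg.eigvalsh(M) >= -1e-10))

assert is_psd(A) and is_psd(A1) and is_psd(A3)
```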

## Handout 2 Elements of Convex Analysis

### Affine and Convex Sets

**Affine Set:** $\alpha x + (1-\alpha)y \in S$ for all $x, y \in S$ and $\alpha \in R$

**Convex Set:** $\alpha x + (1-\alpha)y \in S$ for all $x, y \in S$ and $\alpha \in [0,1]$

Note that $\emptyset$ is convex.

**Proposition 1:** Let $S$ be non-empty. The following are equivalent:

1) $S$ is affine

2) Any affine combination of finite points in $S$ belongs to $S$

3) $S$ is the translation of some linear subspace $V \subseteq R^n$; i.e., $S$ is of the form $\{x\} + V = \{x + v \in R^n : v \in V\}$ for some $x \in R^n$

**Proposition 2:** Let $S \subseteq R^n$ be arbitrary. The following are equivalent:

1) $S$ is convex

2) Any convex combination of points in $S$ belongs to $S$

**Examples of Convex Sets:** Non-negative orthant, hyperplane, halfspaces, euclidean ball, ellipsoid, simplex, convex cone, positive semidefinite cone

**Cone:** $K$ is a cone if $\alpha x \in K$ for every $x \in K$ and $\alpha \gt 0$

Note that not every cone is convex and $S^n_+$ is a convex cone.

**Affine Hull:** The intersection of all affine subspaces containing $S$, denoted by $aff(S)$

**Convex Hull:** The intersection of all convex sets containing $S$, denoted by $conv(S)$. $conv(S) = S$ if $S$ is convex

**Proposition 3:**

1) $aff(S)$ is the set of all affine combinations of points in $S$

2) $conv(S)$ is the set of all convex combinations of points in $S$

### Convexity-Preserving Operations

**Affine Functions:** $A(\alpha x_1 + (1-\alpha)x_2) = \alpha A(x_1) + (1-\alpha)A(x_2)$

Note that translation ($A(x) = x + y$), projection ($A(x) = Ux$, $U^TU = UU^T = I$) and rotation ($A(x) = Px$) are all affine operations

**Proposition 4:** The image of a convex set under an affine function is again convex

### Projection onto Closed Convex Sets

Note that for a general set, projection points need not exist and need not be unique.

Note that every finite point set is closed: it has no limit points, so the condition that every limit point belongs to the set holds vacuously.

**Theorem 4:** If $S$ is non-empty, closed and convex, then for every $x \in R^n$ there exists a unique point $z^* \in S$ that is closest to $x$

**Projection:** $\Pi_S(x) = \arg\min_{z \in S} ||x - z||_2^2$

**Weierstrass's Theorem:** If $f$ is continuous on a compact set $T$, then it attains its maximum and minimum on $T$

**Theorem 5:** If $S$ is non-empty, closed and convex, we have $z^* = \Pi_S(x)$ iff $z^* \in S$ and $(z - z^*)^T(x - z^*) \le 0$ for all $z \in S$
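Theorem 5 can be verified numerically for a set where the projection is easy to compute; e.g., for the box $[0,1]^n$ (a closed convex set, chosen here as an illustration) the projection is coordinate-wise clipping:

```python
import numpy as np

x = np.array([1.5, -0.3, 0.4])
z_star = np.clip(x, 0.0, 1.0)        # projection of x onto the box [0,1]^3

# Check (z - z*)^T (x - z*) <= 0 for many sampled z in the box.
rng = np.random.default_rng(0)
for _ in range(200):
    z = rng.uniform(0.0, 1.0, size=3)
    assert (z - z_star) @ (x - z_star) <= 1e-12
```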

### Separation Theorems

**Theorem 6 (Point-Set Separation):** Let $S$ be non-empty, closed and convex, and let $x \in R^n \backslash S$ be arbitrary. Then there exists a $y \in R^n$ such that $\max_{z\in S} y^Tz \lt y^Tx$

**Theorem 7:** A closed convex set $S \subseteq R^n$ is the intersection of all the halfspaces containing $S$

Note that strict set-set separation $\max_{z \in S_1} y^Tz \lt \min_{z \in S_2} y^Tz$ does not always hold; an example is $S_1 = \{(x_1,x_2): x_2 \ge \frac{1}{x_1}, x_1 \gt 0\}$ and the halfplane $S_2 = R \times R_-$

**Theorem 8 (Set-Set Separation):** Let $S_1, S_2 \subseteq R^n$ be non-empty, closed and convex with $S_1 \cap S_2 = \emptyset$, and let $S_2$ be bounded. Then there exists a $y \in R^n$ such that $\max_{z \in S_1} y^Tz \lt \min_{u \in S_2} y^Tu$

### Basic Definitions and Properties of Convex Functions

**Convex Functions:** $f(\alpha x_1 + (1-\alpha)x_2) \le \alpha f(x_1) + (1-\alpha)f(x_2)$ for all $x_1, x_2$ and $\alpha \in [0,1]$

**Concave Functions:** $-f$ is convex

**Epigraph:** $epi(f) = \{(x,t) \in R^n \times R : f(x) \le t\}$

**Effective Domain:** $dom(f) = \{x \in R^n : f(x) \lt +\infty\}$

**Proposition 9:** Let $f: R^n \rightarrow R \cup \{+\infty\}$. Then $f$ is convex iff $epi(f)$ is convex

Note that $dom(f)$ is also convex if $f$ is convex

**Corollary 2: (Jensen's Inequality)** Let $f: R^n \rightarrow R \cup \{+\infty\}$. Then $f$ is convex iff $f(\sum_{i=1}^k \alpha_i x_i) \le \sum_{i=1}^k \alpha_i f(x_i)$ for any $x_1,...,x_k \in R^n$ and $\alpha_1,...,\alpha_k \in [0,1]$ such that $\sum_{i=1}^k \alpha_i = 1$
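A minimal numeric check of Jensen's inequality for the convex function $f(x) = x^2$ (the weights and points are arbitrary choices):

```python
import numpy as np

f = lambda t: t ** 2                     # convex on R
x = np.array([-1.0, 0.5, 2.0])
a = np.array([0.2, 0.3, 0.5])            # non-negative, sums to 1

assert abs(a.sum() - 1.0) < 1e-12
assert f(a @ x) <= a @ f(x) + 1e-12      # f(sum a_i x_i) <= sum a_i f(x_i)
```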

### Convexity-Preserving Transformations

**Theorem 11:**

**Non-Negative Combinations:** $f(x) = \sum_{i=1}^m \alpha_i f_i(x)$ is convex if each $f_i$ is convex and $\alpha_i \ge 0$

**Pointwise Supremum:** $f(x) = \sup_{i\in I} f_i(x)$ is convex if each $f_i$ is convex

**Affine Composition:** $f(x) = g(A(x))$ is convex if $g$ is convex and $A$ is affine

**Composition with an Increasing Convex Function:** $f(x) = h(g(x))$ is convex if $g$ is convex and $h$ is convex and increasing on $dom(h)$

**Restriction to Lines:** $f$ is convex iff $t \mapsto f(x_0 + th)$ is convex for any $x_0$ and $h$

### Differentiable Convex Functions

**Theorem 12:** Let $f$ be differentiable on an open convex set $S$. Then $f$ is convex iff $f(x) \ge f(\bar{x}) + (\nabla f(\bar{x}))^T(x-\bar{x})$ for all $x,\bar{x} \in S$

**Theorem 13:** Suppose $f$ is twice continuously differentiable on an open convex set $S \subseteq R^n$. Then $f$ is convex on $S$ iff $\nabla^2 f(x) \ge 0$ for all $x \in S$

### Non-Differentiable Convex Functions

**Subgradient:** $s$ is a subgradient of $f$ at $\bar{x}$ if $f(x) \ge f(\bar{x}) + s^T(x-\bar{x})$ for all $x$; the set of all such $s$ is called the subdifferential of $f$ at $\bar{x}$ and is denoted by $\partial f(\bar{x})$
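As a concrete illustration (not from the handout): for $f(x) = |x|$ at $\bar{x} = 0$, every $s \in [-1,1]$ satisfies the subgradient inequality, so $\partial f(0) = [-1,1]$:

```python
# Subgradient inequality f(x) >= f(0) + s*(x - 0) for f(x) = |x|.
for s in (-1.0, -0.5, 0.0, 0.5, 1.0):    # candidate subgradients in [-1, 1]
    for x in (-2.0, -0.1, 0.3, 5.0):
        assert abs(x) >= s * x           # holds because |s| <= 1
```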

**Theorem 14:**

1) The convex function $f$ is differentiable at $x \in R^n$ iff $\partial f(x) = \{\nabla f(x)\}$

2) Let $f$ be convex and $f'(x,d) = \lim_{t \downarrow 0} \frac{f(x+td) - f(x)}{t}$ be the directional derivative of $f$ at $x$ in the direction $d \in R^n \backslash \{0\}$. Then $\partial f(x)$ is a non-empty compact convex set and $f'(x,d) = \max_{s \in \partial f(x)} s^Td$ for any $d$

### Calculus and Linear Algebra Preparations

**Cauchy-Schwarz Inequality:** $(\sum_{i=1}^n x_iy_i)^2 \le (\sum_{i=1}^n x_i^2)(\sum_{i=1}^n y_i^2)$

**Vector Norm:**

1-Norm: $||x||_1 = \sum_{i=1}^n |x_i|$

2-Norm: $||x||_2 = \sqrt{\sum_{i=1}^n x_i^2}$

p-Norm: $||x||_p = (\sum_{i=1}^n |x_i|^p)^{\frac{1}{p}}$

$\infty$-Norm: $||x||_{\infty} = \max_i |x_i|$

$-\infty$-Norm: $||x||_{-\infty} = \min_i |x_i|$

0-Norm: $||x||_0 =$ number of non-zero elements of $x$ (not a true norm)
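These vector norms map directly onto `numpy.linalg.norm`, which may help when checking answers (a sketch using NumPy, not part of the handout):

```python
import numpy as np

x = np.array([3.0, -4.0, 0.0])
assert np.linalg.norm(x, 1) == 7.0            # sum of |x_i|
assert np.linalg.norm(x, 2) == 5.0            # sqrt(9 + 16)
assert np.linalg.norm(x, np.inf) == 4.0       # max |x_i|
assert np.linalg.norm(x, -np.inf) == 0.0      # min |x_i|
assert np.count_nonzero(x) == 2               # the "0-norm"
```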

**Matrix Norm:**

1-Norm: $||A||_1 = \max_j \sum_{i=1}^m |a_{ij}|$ (maximum absolute column sum)

2-Norm: $||A||_2 = \sqrt{\lambda_{\max}(A^TA)}$ (largest singular value)

$\infty$-Norm: $||A||_{\infty} = \max_i \sum_{j=1}^n |a_{ij}|$ (maximum absolute row sum)

F-Norm: $||A||_F = (\sum_{i=1}^m \sum_{j=1}^n a_{ij}^2)^{\frac{1}{2}}$

**Taylor's Formula:** $f(x) = f(a) + f'(a)(x-a) + \cdots + \frac{f^{(n)}(a)}{n!}(x-a)^n + R_n(x)$, where $R_n(x) = \frac{f^{(n+1)}(\xi)}{(n+1)!}(x-a)^{n+1}$ for some $\xi$ between $a$ and $x$

**Semidefinite (Schur Complement):**

If $A \gt 0$, then $\left[ \begin{matrix} A & B \\ B^T & C \end{matrix} \right] \ge 0$ $\iff$ $C - B^TA^{-1}B \ge 0$
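The Schur-complement equivalence can be spot-checked numerically (the matrices below are arbitrary examples with $A \gt 0$):

```python
import numpy as np

A = np.array([[2.0, 0.0], [0.0, 2.0]])   # positive definite
B = np.array([[1.0], [0.0]])
C = np.array([[1.0]])

M = np.block([[A, B], [B.T, C]])
S = C - B.T @ np.linalg.inv(A) @ B       # Schur complement, here [[0.5]]

def is_psd(X):
    return bool(np.all(np.linalg.eigvalsh(X) >= -1e-10))

assert is_psd(M) and is_psd(S)           # both sides of the equivalence hold
```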

## Handout 3 Elements of Linear Programming

### Basic Definitions and Properties

**Polyhedron:** The intersection of a finite set of halfspaces

**Polytope:** A bounded polyhedron

Note that a closed convex set is the intersection of all the halfspaces containing it, but this does not mean that every closed convex set is a polyhedron (the intersection may be infinite)

### Extremal Elements of a Polyhedron

**Active Set:** The index set $I = \{i : a_i^T\bar{x} = b_i\}$ of all constraints that hold with equality at $\bar{x}$

**Theorem 1:** The following are equivalent:

1) There exist $n$ vectors in the set $\{a_i \in R^n : i \in I\}$ that are linearly independent

2) The point $\bar{x} \in R^n$ is the unique solution to the system of linear equations $a_i^Tx = b_i$, $i \in I$

**Basic Solution:** The vector $x$ is called a basic solution if there are $n$ linearly independent active constraints at $x$; if $x$ additionally lies in the polyhedron, it is a basic feasible solution

Note that not every polyhedron has a basic feasible solution

**Line:** A polyhedron $P$ contains a line if there exists $x \in P$ and a vector $d \in R^n \backslash {0}$ such that $x+ \alpha d \in P$ for all $\alpha \in R$

**Theorem 3:** Let $P \subseteq R^n$ be a non-empty polyhedron. Then the following are equivalent:

1) $P$ has at least one vertex

2) $P$ does not contain a line

3) There exist $n$ linearly independent vectors in $\{a_i\}_{i=1}^m$

### Existence of Optimal Solutions to Linear Programs

**Theorem 4:** Consider the LP $min _{x\in P} h^Tx$. Suppose that $P$ has at least one vertex, then either the optimal value is $-\infty$ or there exists a vertex that is optimal.

Note that there could be non-vertex optimal solutions but at least one vertex optimal solution exists

**Corollary 1:** Consider the LP $min _{x\in P} h^Tx$. Suppose that $P$ is non-empty. Then either the optimal value is $-\infty$ or there exists an optimal solution.

**Standard LP:**

minimize $c^Ty$

subject to $Ay = b$, $y \ge 0$

### Theorems of Alternatives

**Theorem 5: (Farkas' Lemma)** Let $A \in R ^{m \times n}$ and $b \in R^m$ be given. Then exactly one of the following systems has a solution:

1) $Ax = b$, $x \ge 0$

2) $A^Ty \le 0$, $b^Ty \gt 0$

Note that the solution set of 2) is not a polyhedron, since polyhedra are defined by non-strict inequalities

**Corollary 2: (Gordan's Theorem)** Let $A \in R^{m \times n}$ be given. Then exactly one of the following systems has a solution:

1) $Ax \gt 0$

2) $A^Ty = 0$, $y \ge 0$, $y \neq 0$

### LP Duality Theory

**Primal Problem and Dual Problem:**

(P) minimize $c^Tx$

subject to $Ax = b$, $x \ge 0$

(D) maximize $b^Ty$

subject to $A^Ty \le c$

**Theorem 6: (LP Weak Duality)** Let $\bar{x} \in R^n$ be feasible for (P) and $\bar{y} \in R^m$ be feasible for (D). Then we have $b^T\bar{y} \le c^T \bar{x}$.

**Corollary 3:**

1) If the optimal value of (P) is $-\infty$, then (D) must be infeasible

2) If the optimal value of (D) is $+\infty$, then (P) must be infeasible

3) Let $\bar{x}$ and $\bar{y}$ be feasible for (P) and (D). Suppose that the duality gap $\Delta(\bar{x},\bar{y}) = c^T\bar{x} - b^T\bar{y} = 0$. Then $\bar{x}$ and $\bar{y}$ are optimal solutions to (P) and (D)

Note that if (P) is infeasible, it is possible that (D) is also infeasible

**Theorem 7: (LP Strong Duality)** Suppose that (P) has an optimal solution $x^* \in R^n$. Then (D) also has an optimal solution $y^* \in R^m$, and $c^Tx^* = b^Ty^*$

**Corollary 4:** Suppose both (P) and (D) are feasible. Then both (P) and (D) have optimal solutions and their respective optimal values are equal

**Theorem 8: (Complementary Slackness)** $\bar{x}$ and $\bar{y}$ are optimal for their respective problems iff $\bar{x}_i(c-A^T\bar{y})_i = 0$ for $i = 1,...,n$
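Strong duality and complementary slackness can be checked on a tiny instance with `scipy.optimize.linprog` (assuming SciPy is available; the problem data below are made up):

```python
import numpy as np
from scipy.optimize import linprog

# (P) min c^T x s.t. Ax = b, x >= 0;  (D) max b^T y s.t. A^T y <= c.
c = np.array([1.0, 2.0])
A = np.array([[1.0, 1.0]])
b = np.array([1.0])

p = linprog(c, A_eq=A, b_eq=b, bounds=[(0, None)] * 2, method="highs")
d = linprog(-b, A_ub=A.T, b_ub=c, bounds=[(None, None)], method="highs")

assert abs(p.fun - (-d.fun)) < 1e-8                  # strong duality
x, y = p.x, d.x
assert np.all(np.abs(x * (c - A.T @ y)) < 1e-8)      # complementary slackness
```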

## Handout 5 Elements of Conic Linear Programming

### Introduction

**Pointed Cone:**

1) $K$ is non-empty and closed under addition

2) $K$ is a cone

3) $K$ is pointed: if $u \in K$ and $-u \in K$, then $u = 0$

A pointed cone is automatically convex

**Examples of Pointed Cone:**

1) Non-Negative Orthant: $R^n _+$

2) Lorentz Cone/Second-Order Cone/Ice Cream Cone: $Q^{n+1} = \{(t,x) \in R \times R^n : t \ge ||x||_2\}$

3) Positive Semidefinite Cone: $S^n_+ = \{X \in S^n : u^TXu \ge 0 \text{ for all } u \in R^n\}$

Note that these three cones are all self-dual

Frobenius inner product: $X \cdot Y = tr(XY)$

**Proposition 1:** Let $E_1,...,E_n$ be finite-dimensional Euclidean spaces and let $K_i \subseteq E_i$ be closed pointed cones with non-empty interiors, where $i = 1,...,n$. Then the set $K = K_1 \times ... \times K_n = \{(x_1,...,x_n) \in E_1 \times ... \times E_n : x_i \in K_i\}$ is a closed pointed cone with non-empty interior.

### Conic Linear Programming

**Standard Form:**

$v^* _p$ = inf $c \cdot x$

subject to $a_i \cdot x = b_i$ for $i=1,...,m$

$x \ge _K 0$

**Dual:**

$v^* _d$ = sup $b^Ty$

subject to $\sum ^m _{i=1} y_ia_i + s = c$

$y \in R^m$, $s \ge_{K^*} 0$

$K^* = \{w \in E : x \cdot w \ge 0 \text{ for all } x \in K\}$

**Proposition 2:** Let $K \subseteq E$ be a non-empty set. Then the following hold:

1) The set $K^*$ is a closed convex cone

2) If $K$ is a closed convex cone, then so is $K^*$

3) If $K$ has a non-empty interior, then $K^*$ is pointed

4) If $K$ is a closed pointed cone, then $K^*$ has a non-empty interior

**Examples:**

**Linear Programming:**

(P) inf $c^Tx$

subject to $a^T_i x = b_i$ for $i=1,...,m$

$x \in R^n _+$

(D) sup $b^Ty$

subject to $\sum ^m _{i=1} y_ia_i +s =c$

$y \in R^m$, $s \in R^n _+$

**Second-Order Cone Programming (SOCP):**

(P) inf $c^Tx$

subject to $a^T_ix = b_i$ for $i = 1,...,m$

$x \in Q^{n+1}$

(D) sup $b^Ty$

subject to $(v - u^Ty, d - A^Ty) \in Q^{n+1}$

**Semidefinite Programming (SDP):**

(P) inf $C \cdot X$

subject to $A_i \cdot X = b_i$ for $i = 1,...,m$

$X \in S^n _+$

(D) sup $b^Ty$

subject to $\sum ^m _{i=1} y_iA_i + S = C$

$y \in R^m$, $S \in S^n _+$

**Theorem 1: (CLP Weak Duality)** Let $\bar{x} \in K$ be feasible for (P) and $(\bar{y}, \bar{s}) \in R^m \times K^*$ be feasible for (D). Then $b^T\bar{y} \le c \cdot \bar{x}$

**Theorem 2: (CLP Farkas Lemma)** Suppose that there exists $\bar{y} \in R^m$ such that $-\sum_{i=1}^m \bar{y}_i a_i \in int(K^*)$. Then exactly one of the following systems has a solution:

1) $a_i \cdot x = b_i$ for $i = 1,...,m$, $x \in K$

2) $-\sum_{i=1}^m y_i a_i \in K^*$, $b^Ty > 0$

**Theorem 3: (CLP Strong Duality)**

1) Suppose (P) is bounded below and strictly feasible. Then we have $v_p^* = v_d^*$ and there exists a feasible solution $(\bar{y}, \bar{s})$ to (D) such that $b^T\bar{y} = v_p^* = v_d^*$

2) Suppose (D) is bounded above and strictly feasible. Then we have $v_p^* = v_d^*$ and there exists a feasible solution $\bar{x}$ to (P) such that $c \cdot \bar{x} = v_p^* = v_d^*$

3) Suppose (P) and (D) are bounded and strictly feasible. Then given a feasible solution $\bar{x}$ to (P) and $(\bar{y}, \bar{s})$ to (D), the following are equivalent:

- They are optimal solutions
- The duality gap is zero
- We have complementary slackness: $\bar{x} \cdot \bar{s} = 0$

## Handout 6 Some Applications of Conic Linear Programming

### Quadratically Constrained Quadratic Optimization

**Original Problem:**

minimize $x^TCx$

subject to $x^TA_ix \ge b_i$ for $i = 1,...,m$

Equivalently, with $X = xx^T$:

minimize $C \cdot X$

subject to $A_i \cdot X \ge b_i$, $X \ge 0$, $rank(X) \le 1$

**Semidefinite Relaxation:**

minimize $C\cdot X$

subject to $A_i \cdot X \ge b_i$, $X \ge 0$

## Handout 7 Optimality Conditions and Lagrangian Duality

### Unconstrained Optimization Problems

**Proposition 1:** Suppose that $f: R^n \rightarrow R$ is continuously differentiable at $\bar{x} \in R^n$. If there exists a $d \in R^n$ such that $\nabla f (\bar{x})^Td < 0$, then there exists an $\alpha _0 >0$ such that $f(\bar{x} + \alpha d) < f(\bar{x})$ for all $\alpha \in (0,\alpha_0)$. In other words, $d$ is a descent direction of $f$ at $\bar{x}$.

**Corollary 1: (First Order Necessary Condition for Unconstrained Optimization)** Suppose that $f: R^n \rightarrow R$ is continuously differentiable at $\bar{x} \in R^n$. If $\bar{x}$ is a local minimum, then we have $\nabla f(\bar{x}) = 0$. In particular, we have $\{d \in R^n : \nabla f(\bar{x})^Td < 0\} = \emptyset$.

**Proposition 2:** Let $S \subseteq R^n$ be an open convex set. Suppose that $f: R^n \rightarrow R$ is convex on $S$ and continuously differentiable at $\bar{x} \in S$. Then, $\bar{x}$ is a global minimum in $S$ iff $\nabla f(\bar{x}) = 0$.

**Proposition 4: (Second Order Sufficient Condition for Unconstrained Optimization)** Suppose that $f: R^n \rightarrow R$ is twice continuously differentiable at $\bar{x} \in R^n$. If $\nabla f(\bar{x}) = 0$ and $\nabla ^2 f(\bar{x})$ is positive definite, then $\bar{x}$ is a local minimum.
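A quick check of the second order sufficient condition on a strictly convex quadratic (the choice of $Q$ is an illustrative example):

```python
import numpy as np

# f(x) = x^T Q x with Q positive definite: grad f = 2Qx, Hessian = 2Q.
Q = np.array([[2.0, 0.5], [0.5, 1.0]])
x_bar = np.zeros(2)

grad = 2 * Q @ x_bar
hess = 2 * Q
assert np.allclose(grad, 0.0)                # stationary point
assert np.all(np.linalg.eigvalsh(hess) > 0)  # Hessian positive definite
# Both conditions hold, so x_bar = 0 is a local (here global) minimum.
```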

### Constrained Optimization Problems

**Problem:**

inf $f(x)$

subject to $g_i(x) \le 0$, for $i = 1,...,m_1$

$h_j(x) = 0$, for $j = 1,...,m_2$

$x \in X$

**Theorem 2: (The Fritz John Necessary Conditions)** Let $\bar{x} \in S$ be a local minimum of the above problem. Then there exist $u \in R$, $v_1,...,v_{m_1} \in R$, and $w_1,...,w_{m_2} \in R$ such that

$u \nabla f(\bar{x}) + \sum_{i=1}^{m_1} v_i \nabla g_i(\bar{x}) + \sum_{j=1}^{m_2} w_j \nabla h_j(\bar{x}) = 0$

$u, v_i \ge 0$, for $i = 1,...,m_1$

$(u, v_1,...,v_{m_1}, w_1,...,w_{m_2}) \neq 0$

$v_ig_i(\bar{x}) = 0$, for $i = 1,...,m_1$

**Theorem 3: (The Karush-Kuhn-Tucker Necessary Conditions)** Let $\bar{x} \in S$ be a local minimum of the above problem. Let $I = \{i \in \{1,...,m_1\} : g_i(\bar{x}) = 0\}$ be the index set of active constraints. Suppose that $\bar{x}$ is regular, meaning that the set of vectors $\{\nabla g_i(\bar{x})\}_{i \in I} \cup \{\nabla h_j(\bar{x})\}_{j=1}^{m_2}$ is linearly independent. Then there exist $v_1,...,v_{m_1} \in R$ and $w_1,...,w_{m_2} \in R$ such that

$\nabla f(\bar{x}) + \sum_{i=1}^{m_1} v_i \nabla g_i(\bar{x}) + \sum_{j=1}^{m_2} w_j \nabla h_j(\bar{x}) = 0$

$v_i \ge 0$, for $i = 1,...,m_1$

$v_ig_i(\bar{x}) = 0$, for $i = 1,...,m_1$

**Theorem 4:** Suppose that $g_1,...,g_{m_1}$ and $h_1,...,h_{m_2}$ are affine. Let $\bar{x} \in S$ be a local minimum and $I = \{i \in \{1,...,m_1\} : g_i(\bar{x}) = 0\}$. Suppose that the Slater condition is satisfied, i.e., there exists an $x' \in S$ such that $g_i(x') < 0$ for $i \in I$. Then $\bar{x}$ satisfies the KKT conditions.

**Theorem 5:** Suppose that $g_1,...,g_{m_1}$ are concave and $h_1,...,h_{m_2}$ are affine. Let $\bar{x} \in S$ be a local minimum. Then $\bar{x}$ satisfies the KKT conditions.

**Theorem 6:** Suppose that $X$ is open and convex, $f, g_1,...,g_{m_1}$ are convex on $X$, and $h_1,...,h_{m_2}$ are affine. Suppose that $(\bar{x},\bar{v},\bar{w}) \in X \times R^{m_1} \times R^{m_2}$ is a solution to the following KKT conditions (which include primal feasibility). Then $\bar{x}$ is an optimal solution.

$\nabla f(\bar{x}) + \sum_{i=1}^{m_1} \bar{v}_i \nabla g_i(\bar{x}) + \sum_{j=1}^{m_2} \bar{w}_j \nabla h_j(\bar{x}) = 0$

$\bar{v}_i \ge 0$, for $i = 1,...,m_1$

$\bar{v}_ig_i(\bar{x}) = 0$, for $i = 1,...,m_1$

$g_i(\bar{x}) \le 0$, for $i = 1,...,m_1$

$h_j(\bar{x}) = 0$, for $j = 1,...,m_2$
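The KKT conditions can be verified by hand on a one-dimensional example (chosen for illustration): minimize $x^2$ subject to $g(x) = 1 - x \le 0$. The solution is $\bar{x} = 1$ with multiplier $\bar{v} = 2$:

```python
# KKT check for: min x^2 s.t. 1 - x <= 0 (convex problem, so KKT is sufficient).
x_bar, v_bar = 1.0, 2.0
grad_f = 2 * x_bar                   # f'(x) = 2x
grad_g = -1.0                        # g'(x) = -1

assert abs(grad_f + v_bar * grad_g) < 1e-12   # stationarity
assert v_bar >= 0                             # dual feasibility
assert abs(v_bar * (1 - x_bar)) < 1e-12       # complementary slackness
assert 1 - x_bar <= 0                         # primal feasibility
```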

### Lagrangian Duality

**Primal Problem:**

$v^* _p$ = inf $f(x)$

subject to $g_i(x) \le 0$, for $i = 1,...,m_1$

$h_j(x) = 0$, for $j = 1,...,m_2$

$x \in X$

**Dual Problem:**

$v^*_d = \sup_{v \in R^{m_1}_+, \; w \in R^{m_2}} \theta(v,w)$

where $\theta(v,w) = \inf_{x \in X} L(x,v,w)$ and

$L(x,v,w) = f(x) + v^TG(x) + w^TH(x)$, with $G(x) = (g_1(x),...,g_{m_1}(x))$ and $H(x) = (h_1(x),...,h_{m_2}(x))$

**Theorem 7: (Weak Duality Theorem)** Let $\bar{x} \in R^n$ be feasible for (P) and $(\bar{v},\bar{w}) \in R^{m_1} \times R^{m_2}$ be feasible for (D). Then we have $\theta(\bar{v},\bar{w}) \le f(\bar{x})$

**Theorem 8: (Strong Duality Theorem)** $(\bar{x}, \bar{v}, \bar{w})$ is a saddle point of the Lagrangian function $L$ associated with (P) iff the duality gap between (P) and (D) is zero, $\bar{x}$ is optimal for (P), $(\bar{v}, \bar{w})$ is optimal for (D)

**Saddle Point:**

1) $\bar{x} \in X$

2) $\bar{v} \ge 0$

3) For all $x \in X$ and $(v,w) \in R^{m_1}_+ \times R^{m_2}$, we have $L(\bar{x}, v, w) \le L(\bar{x}, \bar{v}, \bar{w}) \le L(x, \bar{v}, \bar{w})$
