Takeaway

In a convex problem, every local minimum is a global minimum; duality and the KKT conditions provide certificates of optimality and underpin efficient algorithms.

The problem (before → after)

  • Before: Nonconvex landscapes trap local methods.
  • After: Convex structure guarantees global optima and strong theoretical tools.

Mental model first

Think of a bowl-shaped valley; any path downhill leads to the bottom. Constraints are fences; the optimal point touches fences where forces balance.

Just-in-time concepts

  • Convex sets/functions, epigraphs.
  • Lagrangian L(x, λ, ν) and dual function g(λ, ν).
  • KKT conditions: stationarity, primal/dual feasibility, complementary slackness.
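
The objects in the list above can be written out explicitly. For the standard-form problem minimize f(x) subject to g_i(x) ≤ 0 and h_j(x) = 0, the Lagrangian and the four KKT conditions are:

```latex
L(x, \lambda, \nu) = f(x) + \sum_i \lambda_i g_i(x) + \sum_j \nu_j h_j(x)

% KKT conditions at a candidate optimum (x^*, \lambda^*, \nu^*):
\nabla f(x^*) + \sum_i \lambda_i^* \nabla g_i(x^*)
  + \sum_j \nu_j^* \nabla h_j(x^*) = 0 \quad \text{(stationarity)}
g_i(x^*) \le 0, \quad h_j(x^*) = 0 \quad \text{(primal feasibility)}
\lambda_i^* \ge 0 \quad \text{(dual feasibility)}
\lambda_i^* \, g_i(x^*) = 0 \quad \text{(complementary slackness)}
```

For a convex problem satisfying a constraint qualification, any point meeting all four conditions is globally optimal.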

First-pass solution

Formulate the problem as: minimize f(x) subject to g_i(x) ≤ 0 and h_j(x) = 0. Write the Lagrangian, derive the KKT conditions, and solve via interior-point or first-order methods.
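
As a minimal sketch of the first-order route, here is projected gradient descent on a toy instance (the problem data and function names are illustrative, not from any particular library): minimize ||x − c||² subject to x ≥ 0, where projection onto the feasible set is elementwise clipping.

```python
import numpy as np

# Toy convex problem: minimize f(x) = ||x - c||^2  subject to x >= 0.
# The feasible set {x : x >= 0} has a closed-form projection: clip at zero.
c = np.array([1.0, -2.0, 3.0])

def solve_projected_gradient(c, step=0.1, iters=500):
    x = np.zeros_like(c)
    for _ in range(iters):
        grad = 2.0 * (x - c)                   # gradient of f at x
        x = np.maximum(x - step * grad, 0.0)   # gradient step, then project
    return x

x_star = solve_projected_gradient(c)
# Optimum is max(c, 0) coordinatewise: x* = [1, 0, 3].
# KKT check: stationarity 2(x* - c) - lam = 0 gives lam_i = 2(x*_i - c_i),
# which is 0 on inactive coordinates and -2c_i >= 0 on active ones.
```

Any downhill step plus projection converges here because the objective is smooth and strongly convex and the constraint set is closed and convex, which is exactly the structure the section relies on.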

Iterative refinement

  1. Strong duality via Slater’s condition.
  2. Proximal methods and ADMM for large-scale problems.
  3. Conic formulations unify many problems (LP, QP, SDP).
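
Item 2 above can be sketched concretely. Below is a minimal proximal gradient (ISTA) loop for the lasso, minimize (1/2)||Ax − b||² + λ||x||₁, whose proximal operator is soft-thresholding; the problem data are made up for illustration.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1: shrink each entry toward zero by t.
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def ista(A, b, lam, iters=1000):
    # Step size 1/L where L = ||A||_2^2 is the Lipschitz constant of the
    # smooth part's gradient.
    step = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        grad = A.T @ (A @ x - b)                       # gradient of smooth part
        x = soft_threshold(x - step * grad, step * lam)  # prox step
    return x

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
x_true = np.array([2.0, 0.0, 0.0, -1.5, 0.0])
b = A @ x_true                 # noiseless synthetic measurements
x_hat = ista(A, b, lam=0.1)    # recovers x_true up to small shrinkage bias
```

ADMM follows the same splitting idea but alternates between two subproblems coupled by a dual update, which is what makes it attractive at large scale.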

Principles, not prescriptions

  • Choose convex relaxations when exact problems are hard.
  • Dual variables interpret constraints as prices.
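
The price interpretation can be seen in a one-variable toy problem (chosen purely for illustration): minimize x² subject to x ≥ 1. The optimal multiplier equals the marginal cost of tightening the constraint.

```latex
\min_x \; x^2 \quad \text{s.t.} \quad 1 - x \le 0
% Stationarity: 2x^* - \lambda^* = 0 with the constraint active at x^* = 1,
% so \lambda^* = 2. Tighten the constraint to x \ge 1 + \epsilon:
p^*(\epsilon) = (1 + \epsilon)^2 = 1 + 2\epsilon + \epsilon^2,
\qquad \left.\frac{d p^*}{d \epsilon}\right|_{\epsilon = 0} = 2 = \lambda^*
```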

Common pitfalls

  • Applying convex-only guarantees to nonconvex problems: without convexity, the KKT conditions are (under a constraint qualification) necessary but no longer sufficient for optimality.
  • Poor scaling: ill-conditioned problems stall first-order methods unless preconditioned.
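
The scaling pitfall is easy to demonstrate. On a quadratic with a badly scaled (made-up) diagonal Hessian, the step size is limited by the largest curvature, so the flat coordinate crawls; a diagonal preconditioner restores fast convergence.

```python
import numpy as np

# f(x) = 0.5 * x^T D x with condition number 1000.
D = np.array([1.0, 1000.0])
x0 = np.array([1.0, 1.0])

def gd(x0, precond, step, iters=100):
    x = x0.copy()
    for _ in range(iters):
        grad = D * x                     # gradient of f
        x = x - step * (precond * grad)  # preconditioned gradient step
    return x

# Plain gradient descent: step forced down to 1/D.max() by the stiff
# coordinate, so the flat coordinate barely moves in 100 iterations.
plain = gd(x0, precond=np.ones(2), step=1.0 / D.max())
# Diagonal (Newton-like) preconditioning: both coordinates converge at once.
scaled = gd(x0, precond=1.0 / D, step=1.0)
```

The same effect motivates preprocessing steps like normalizing features or equilibrating constraint matrices before handing a problem to a first-order solver.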

Connections and contrasts

  • See also: [/blog/interior-point-methods], [/blog/branch-and-bound], [/blog/optimal-transport].

Quick checks

  1. Why convex? — Every local minimum is global, and efficient algorithms with checkable optimality certificates exist.
  2. What is complementary slackness? — For each inequality, λ_i g_i(x*) = 0: a positive multiplier forces zero slack, and a slack constraint forces a zero multiplier.
  3. When does strong duality hold? — For convex problems, Slater’s condition (a strictly feasible point) is sufficient.

Further reading

  • Boyd & Vandenberghe, Convex Optimization (the book above)