Tractable optimization problems. Linear programming, quadratic programming, etc. We can solve these problems reliably and efficiently. They generally have the structure:

minimize f_{0}(x)
subject to f_{i}(x) \leq 0, i = 1, \dots, m
           g_{i}(x) = 0, i = 1, \dots, p

where x \in \mathbb{R}^{n} is the vector of variables, f_{0} is the objective function ("soft", to be minimized), f_{1}, \dots, f_{m} are the inequality constraints, and g_{1}, \dots, g_{p} are the equality constraints ("hard" requirements).
These problems also have curvature constraints: f_{0}, \dots, f_{m} are convex, i.e.

f_{i}(\theta x + (1 - \theta) y) \leq \theta f_{i}(x) + (1 - \theta) f_{i}(y)

for all x, y and \theta \in [0, 1]. That is, the f_{i} have non-negative curvature.

Additional information: sign asymmetry. FUN IMPORTANT FACT: even when a minimization problem is convex, sticking a negative sign on the objective (casting it as a maximization problem) does not necessarily give a convex problem. "The negative of things that curve down is things that curve up. This may no longer be convex." In terms of ease: convex (non-negative curvature) is easy; nonconvex (negative curvature) is hard.

DSLs for Convex Optimization

DSLs for convex optimization allow us to describe problems at a high level, close to the math; the DSL can transform the problem into a standard form and then solve it.