Category Archives: Optimisation

Useful convex analysis stuff: support functions & normal and tangent cones

This is a collection of simple yet useful results from convex analysis. We will give examples of support functions of convex sets, and of normal and tangent cones. Special focus will, of course, be given to the two most popular sets of convex analysis: balls (Euclidean balls and ellipsoids) and polyhedra.
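As a small numerical sketch of one of these examples — the support function of the Euclidean unit ball B is \sigma_B(y) = \|y\|_2 — here is a Monte-Carlo check (my own illustration, assuming NumPy; not code from the post):

```python
import numpy as np

# Support function of a set C: sigma_C(y) = sup_{x in C} <y, x>.
# For the Euclidean unit ball B, sigma_B(y) = ||y||_2, attained at x = y/||y||.
rng = np.random.default_rng(0)
y = rng.standard_normal(3)

# Monte-Carlo approximation: maximise <y, x> over many points of the unit sphere
pts = rng.standard_normal((100_000, 3))
pts /= np.linalg.norm(pts, axis=1, keepdims=True)
approx = np.max(pts @ y)

print(approx, np.linalg.norm(y))  # the two values nearly coincide
```

The supremum over the ball is attained on the sphere, which is why sampling unit vectors suffices here.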

Continue reading →

Generalized Directional Derivatives: Some Examples

Earlier we defined the directional derivative of a function f: \mathbb{R}^n \to \overline{\mathbb{R}} as

\begin{aligned}f'(x; h) = \lim_{t \downarrow 0} \frac{f(x+th)-f(x)}{t},\end{aligned}

provided that the limit exists. It turns out that all convex functions are directionally differentiable on the interior (actually, the core) of their domains and f'(x; \cdot) is sublinear. However, the sublinearity property may fail when working with nonconvex functions. This motivates the definition of generalised directional derivatives which will hopefully be accompanied by some good calculus rules.
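As a quick numerical sketch of the definition (my own illustration, not code from the post), one can approximate the one-sided limit by a finite difference and watch sublinearity fail for a nonconvex function:

```python
# Finite-difference sketch of the one-sided directional derivative
# f'(x; h) = lim_{t -> 0+} (f(x + t h) - f(x)) / t  (assumed to exist here).
def dir_deriv(f, x, h, t=1e-8):
    return (f(x + t * h) - f(x)) / t

# Convex example: f(x) = |x| at x = 0 gives f'(0; h) = |h|, sublinear in h.
f = abs
print(dir_deriv(f, 0.0, 1.0))   # ≈ 1
print(dir_deriv(f, 0.0, -2.0))  # ≈ 2

# Nonconvex example: g(x) = -|x| at x = 0 gives g'(0; h) = -|h|;
# g'(0; 1) + g'(0; -1) = -2 < 0 = g'(0; 1 + (-1)), so subadditivity fails.
g = lambda x: -abs(x)
print(dir_deriv(g, 0.0, 1.0) + dir_deriv(g, 0.0, -1.0))  # ≈ -2
```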

Continue reading →

Pointwise maximum function differentiability

These are some notes on differentiability properties of the pointwise maximum of finitely many functions, based mainly on results from the book of Borwein and Lewis and from Rockafellar and Wets’s “Variational Analysis”.
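One classical result of this kind states that the directional derivative of f = max(f_1, \ldots, f_m) is the maximum of the directional derivatives of the active functions. A small finite-difference check (my own illustration, not from the books cited above):

```python
# Check of the classical max-function formula: if f = max(f1, ..., fm), then
# f'(x; h) = max over *active* i (those with f_i(x) = f(x)) of f_i'(x; h).
def dir_deriv(f, x, h, t=1e-8):
    return (f(x + t * h) - f(x)) / t

f1 = lambda x: x        # f1'(x) = 1
f2 = lambda x: x ** 2   # f2'(x) = 2x
f = lambda x: max(f1(x), f2(x))

# At x = 1 both pieces are active (f1(1) = f2(1) = 1), so
# f'(1; h) = max(1*h, 2*h): equal to 2h for h > 0 and to h for h < 0.
print(dir_deriv(f, 1.0, 1.0))   # ≈ 2  (= max(1, 2))
print(dir_deriv(f, 1.0, -1.0))  # ≈ -1 (= max(-1, -2))
```

Note how the one-sided derivative is nonlinear in h exactly because two pieces are active at x = 1.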

Continue reading →

Notes on the Rayleigh Quotient

So here I am after a short period of absence. This will be a short blog post on the Rayleigh quotient of a symmetric matrix A, which is defined as R_A(x) = x^\top A x / \|x\|^2 for x \in \mathbb{R}^n with x \neq 0.
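A defining property of the Rayleigh quotient is that it is bounded by the extreme eigenvalues of A, with the bounds attained at the corresponding eigenvectors. A quick NumPy check (my own illustration, assuming NumPy):

```python
import numpy as np

# R_A(x) = x'Ax / ||x||^2 satisfies lambda_min(A) <= R_A(x) <= lambda_max(A),
# with equality at the corresponding eigenvectors.
rng = np.random.default_rng(1)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2  # symmetrise

def rayleigh(A, x):
    return (x @ A @ x) / (x @ x)

w, V = np.linalg.eigh(A)  # eigenvalues in ascending order, orthonormal eigenvectors
xs = rng.standard_normal((1000, 4))
vals = [rayleigh(A, x) for x in xs]

print(w[0] <= min(vals) and max(vals) <= w[-1])  # True
print(np.isclose(rayleigh(A, V[:, 0]), w[0]))    # True: attained at an eigenvector
```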

Continue reading →

Strict and strong convexity of maximum of two functions

This is a brief note on the strict and strong convexity properties of the maximum of two functions. The question is whether the maximum of two strictly/strongly convex functions is itself strictly/strongly convex. The proofs are very simple.
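A numerical spot-check (my own illustration, not a proof): the maximum of two 1-strongly convex functions satisfies the strong-convexity midpoint inequality f((x+y)/2) \leq (f(x)+f(y))/2 - (\mu/8)\|x-y\|^2 with \mu = 1.

```python
import numpy as np

# Two 1-strongly convex functions (each minus x^2/2 is affine, hence convex):
f1 = lambda x: 0.5 * x ** 2 + x
f2 = lambda x: 0.5 * x ** 2 - 3 * x + 1
f = lambda x: max(f1(x), f2(x))

# Midpoint form of mu-strong convexity, tested on random pairs of points.
mu = 1.0
rng = np.random.default_rng(2)
ok = all(
    f((x + y) / 2) <= (f(x) + f(y)) / 2 - (mu / 8) * (x - y) ** 2 + 1e-12
    for x, y in rng.standard_normal((1000, 2)) * 5
)
print(ok)  # True
```

The reason is visible in the code: subtracting \mu\|x\|^2/2 from each f_i leaves a convex function, and the maximum of convex functions is convex.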

Continue reading →

Interchangeability of infimum in risk measures

In this post we discuss the interchangeability of the infimum with (monotone) risk measures in finite probability spaces. In particular, we show that under the common monotonicity assumption (which is satisfied by all well-behaved risk measures), for a risk measure \rho:\mathbb{R}^n\to\mathbb{R} and a mapping f:\mathbb{R}^m\to\mathbb{R}^n, we have

\begin{aligned} \rho\left(\inf_x f(x)\right) = \inf_x \rho(f(x)) \end{aligned}

and \mathbf{argmin}_x f(x) \subseteq \mathbf{argmin}_x \rho(f(x)), while, under additional conditions (which are typically met in finite-dimensional spaces), we have \mathbf{argmin}_x f(x) = \mathbf{argmin}_x \rho(f(x)).

Continue reading →
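A toy illustration on a finite grid (my own example, with \rho taken as the worst-case risk measure \max over two scenarios, which is monotone; f is chosen so that a single x attains the componentwise infimum — the situation where equality holds):

```python
import numpy as np

# rho = max over scenarios (worst-case risk measure, monotone);
# f(x) = (x^2, x^2 + 1), so x = 0 minimises both components simultaneously.
rho = np.max
f = lambda x: np.array([x ** 2, x ** 2 + 1.0])

xs = np.linspace(-2.0, 2.0, 401)     # grid containing the minimiser x = 0
F = np.array([f(x) for x in xs])     # shape (401, 2)

lhs = rho(F.min(axis=0))             # rho(inf_x f(x)): componentwise infimum first
rhs = min(rho(f(x)) for x in xs)     # inf_x rho(f(x)): risk measure first
print(lhs, rhs)  # 1.0 1.0
```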

Cone programs and self-dual embeddings

This post aims at providing some intuition into cone programs from different perspectives; in particular:

  1. Equivalence of different formulations of cone programs
  2. Fenchel duality
  3. Primal-dual optimality conditions (OC)
  4. OCs as variational inequalities
  5. Homogeneous self-dual embeddings (HSDEs)
  6. OCs for HSDEs
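As a minimal sketch of the first item (my own illustration, assuming SciPy is available), a linear program is the cone program with K equal to the nonnegative orthant:

```python
import numpy as np
from scipy.optimize import linprog

# A cone program: minimise c'x subject to Ax + s = b, s in K.
# With K = nonnegative orthant this is the LP
#   min -x1 - x2  s.t.  x1 + 2 x2 <= 4,  3 x1 + x2 <= 6,  x >= 0.
c = np.array([-1.0, -1.0])
A = np.array([[1.0, 2.0], [3.0, 1.0]])
b = np.array([4.0, 6.0])

res = linprog(c, A_ub=A, b_ub=b, bounds=[(0, None), (0, None)])
x = res.x
s = b - A @ x              # conic slack variable

print(np.round(x, 6))      # optimal point
print(np.all(s >= -1e-9))  # primal feasibility: the slack lies in the cone
```

The same (A, b, c, K) data is what enters the homogeneous self-dual embedding, with the cone membership of the slack replacing the explicit inequality constraints.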

Continue reading →

Continuity of argmin

Here we ask what happens to the infima and sets of minimisers of a sequence of functions \{f_n\}_n: under what conditions do these converge? What is an appropriate notion of convergence for functions that transfers to the corresponding sequences of minima and minimisers? This poses a question of continuity for the infimum (as an operator) as well as for the set of minimisers (as a set-valued operator). We aim at characterising the continuity of these operators.

Continue reading →
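A small numerical illustration of the benign case (my own example, assuming NumPy): f_n(x) = (x - 1/n)^2 converges uniformly on compacts to f(x) = x^2, and both the infima and the minimisers converge.

```python
import numpy as np

# argmin f_n = {1/n} -> {0} = argmin f, and inf f_n = 0 -> 0 = inf f,
# illustrated on a fine grid of [-1, 1].
xs = np.linspace(-1.0, 1.0, 2001)
for n in (1, 10, 100, 1000):
    fn = (xs - 1.0 / n) ** 2
    print(n, xs[np.argmin(fn)], fn.min())  # grid minimiser drifts towards 0
```

The post is about when this kind of convergence can fail and which notion of functional convergence rules the failures out.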

Projection on epigraph via a proximal operator

A while ago I posted this article on how to project onto the epigraph of a convex function, where I derived the optimality conditions and the KKT conditions. This post comes as an addendum providing a third way to project onto an epigraph. Do read the previous article first because I use the same notation here.

Continue reading →
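For one special case the projection is explicit and makes a convenient sanity check against any prox-based construction: the epigraph of f(x) = \|x\|_2 is the second-order cone, whose projection has a well-known closed form (my own illustration, assuming NumPy):

```python
import numpy as np

# Projection onto the second-order cone {(x, t) : ||x||_2 <= t},
# i.e. the epigraph of the Euclidean norm.
def proj_soc(x, t):
    nx = np.linalg.norm(x)
    if nx <= t:
        return x, t                    # already in the epigraph
    if nx <= -t:
        return np.zeros_like(x), 0.0   # polar-cone case: projects onto the origin
    a = (nx + t) / (2 * nx)            # shrink factor of the generic case
    return a * x, (nx + t) / 2

x, t = np.array([3.0, 4.0]), 1.0       # ||x|| = 5 > t = 1
px, pt = proj_soc(x, t)
print(px, pt)  # [1.8 2.4] 3.0
```

The projected point lands on the boundary \|x\| = t, as expected for a point outside the cone and outside its polar.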

Lagrange vs Fenchel Duality

In this post we discuss the correspondence between the Lagrangian and Fenchelian duality frameworks, and we trace their common origin to the concept of convex conjugate functions and perturbed optimization problems.

Continue reading →
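The convex conjugate f^*(y) = \sup_x \{\langle y, x\rangle - f(x)\} can be approximated numerically by taking the supremum over a grid; here is a sketch (my own illustration, assuming NumPy) for f(x) = x^2/2, which is its own conjugate:

```python
import numpy as np

# Grid approximation of the convex conjugate f*(y) = sup_x { x*y - f(x) }.
# For f(x) = x^2 / 2 the supremum is attained at x = y and f*(y) = y^2 / 2.
xs = np.linspace(-10.0, 10.0, 100_001)
f = xs ** 2 / 2

def conjugate(y):
    return np.max(xs * y - f)

for y in (0.0, 1.0, -2.5):
    print(y, conjugate(y), y ** 2 / 2)  # the last two columns agree
```

The grid must contain the maximiser x = y, which is why a wide interval is used; for |y| near the grid boundary the approximation degrades.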

Let us learn Russian

Let us learn the Russian language

Almost Originality

a mathematical journal

Annoying Precision

"A good stock of examples, as large as possible, is indispensable for a thorough understanding of any concept, and when I want to learn something new, I make it my first job to build one." - Paul Halmos

Alex Sisto

Alessandro Sisto's math blog

Journey into Randomness

random stuffs, mainly related to randomness

Society Of Mathematics

Make Mathematics Great Again

What's new

Updates on my research and expository papers, discussion of open problems, and other maths-related topics. By Terence Tao

Normal Deviate

Thoughts on Statistics and Machine Learning

Research and Lecture notes

by Fabrice Baudoin

mathbabe

Exploring and venting about quantitative issues

Look at the corners!

The math blog of Dmitry Ostrovsky

The Unapologetic Mathematician

Mathematics for the interested outsider

Almost Sure

A random mathematical blog

Mathematix

Mathematix is the mathematician of Erquy - the village of Asterix
