In optimization, a self-concordant function is a function [math]\displaystyle{ f:\mathbb{R} \rightarrow \mathbb{R} }[/math] for which

[math]\displaystyle{ |f'''(x)| \leq 2 f''(x)^{3/2} }[/math]

or, equivalently, a function [math]\displaystyle{ f:\mathbb{R} \rightarrow \mathbb{R} }[/math] that, wherever [math]\displaystyle{ f''(x) \gt 0 }[/math], satisfies

[math]\displaystyle{ \left| \frac{d}{dx} \frac{1}{\sqrt{f''(x)}} \right| \leq 1 }[/math]

and which satisfies [math]\displaystyle{ f'''(x) = 0 }[/math] elsewhere.
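For example, the logarithmic barrier [math]\displaystyle{ f(x) = -\ln x }[/math] on [math]\displaystyle{ x \gt 0 }[/math] satisfies the first inequality with equality, since

[math]\displaystyle{ f''(x) = \frac{1}{x^2}, \qquad f'''(x) = -\frac{2}{x^3}, \qquad |f'''(x)| = \frac{2}{x^3} = 2 \left( \frac{1}{x^2} \right)^{3/2} = 2 f''(x)^{3/2}. }[/math]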
More generally, a multivariate function [math]\displaystyle{ f(x) : \mathbb{R}^n \rightarrow \mathbb{R} }[/math] is self-concordant if

[math]\displaystyle{ \left. \frac{d}{d\alpha} \nabla^2 f(x + \alpha y) \right|_{\alpha = 0} \preceq 2 \sqrt{y^T \nabla^2 f(x) \, y} \; \nabla^2 f(x) }[/math]

or, equivalently, if its restriction to any arbitrary line is self-concordant.[1]
As mentioned in the "Bibliography Comments"[2] of their 1994 book,[3] self-concordant functions were introduced in 1988 by Yurii Nesterov[4][5] and further developed with Arkadi Nemirovski.[6] As explained in,[7] their basic observation was that the Newton method is affine invariant, in the sense that if for a function [math]\displaystyle{ f(x) }[/math] we have Newton steps [math]\displaystyle{ x_{k+1} = x_k - [f''(x_k)]^{-1}f'(x_k) }[/math], then for the function [math]\displaystyle{ \phi(y) = f(Ay) }[/math], where [math]\displaystyle{ A }[/math] is a non-degenerate linear transformation, starting from [math]\displaystyle{ y_0 = A^{-1} x_0 }[/math] we have the Newton steps [math]\displaystyle{ y_k = A^{-1} x_k }[/math], which can be shown recursively:

[math]\displaystyle{ y_{k+1} = y_k - [\phi''(y_k)]^{-1} \phi'(y_k) = y_k - [A^T f''(Ay_k) A]^{-1} A^T f'(Ay_k) = A^{-1} \left( x_k - [f''(x_k)]^{-1} f'(x_k) \right) = A^{-1} x_{k+1}. }[/math]
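A minimal numerical sketch of this invariance, with an illustrative choice of [math]\displaystyle{ f }[/math] and [math]\displaystyle{ A }[/math] (our own, not from the source):

```python
import numpy as np

# Illustrative convex f on the positive orthant: f(x) = <c, x> - sum(log(x)),
# with closed-form gradient and Hessian (this f and A are our own choices).
c = np.array([1.0, 2.0])
grad_f = lambda x: c - 1.0 / x                    # f'(x)
hess_f = lambda x: np.diag(1.0 / x**2)            # f''(x)

A = np.array([[2.0, 1.0], [0.5, 1.5]])            # non-degenerate A
grad_phi = lambda y: A.T @ grad_f(A @ y)          # phi'(y) = A^T f'(Ay)
hess_phi = lambda y: A.T @ hess_f(A @ y) @ A      # phi''(y) = A^T f''(Ay) A

x = np.array([0.8, 0.6])
y = np.linalg.solve(A, x)                         # y_0 = A^{-1} x_0
for _ in range(5):
    x = x - np.linalg.solve(hess_f(x), grad_f(x))
    y = y - np.linalg.solve(hess_phi(y), grad_phi(y))
    assert np.allclose(y, np.linalg.solve(A, x))  # y_k = A^{-1} x_k holds
```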
However, the standard analysis of the Newton method supposes that the Hessian of [math]\displaystyle{ f }[/math] is Lipschitz continuous, that is, [math]\displaystyle{ \|f''(x) - f''(y)\| \leq M\| x-y \| }[/math] for some constant [math]\displaystyle{ M }[/math]. If we suppose that [math]\displaystyle{ f }[/math] is three times continuously differentiable, then this is equivalent to

[math]\displaystyle{ |\langle f'''(x)[u] v, v \rangle| \leq M \|u\| \, \|v\|^2 \quad \text{for all } u, v \in \mathbb{R}^n, }[/math]

where [math]\displaystyle{ f'''(x)[u] = \lim_{\alpha \to 0} \alpha^{-1} [f''(x + \alpha u) - f''(x)] }[/math]. Then the left-hand side of the above inequality is invariant under the affine transformation [math]\displaystyle{ f(x) \to \phi(y) = f(A y), u \to A^{-1} u, v \to A^{-1} v }[/math], but the right-hand side is not.
The authors note that the right-hand side can also be made invariant if we replace the Euclidean metric by the scalar product defined by the Hessian of [math]\displaystyle{ f }[/math], defined as [math]\displaystyle{ \| w \|_{f''(x)} = \langle f''(x)w, w \rangle^{1/2} }[/math] for [math]\displaystyle{ w \in \mathbb R^n }[/math]. They then arrive at the definition of a self-concordant function as

[math]\displaystyle{ |\langle f'''(x)[u] u, u \rangle| \leq M \langle f''(x)u, u \rangle^{3/2}. }[/math]
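This definition can be spot-checked numerically along lines, since the left-hand side is the third derivative of [math]\displaystyle{ g(t) = f(x + tu) }[/math] at [math]\displaystyle{ t = 0 }[/math]. A rough finite-difference sketch (the helper name and step size are our own):

```python
import numpy as np

def self_concordance_ratio(f, x, u, h=1e-3):
    """Estimate |g'''(0)| / g''(0)**1.5 for g(t) = f(x + t*u).

    For a self-concordant f with constant M, this ratio should not
    exceed M (up to finite-difference error)."""
    g = lambda t: f(x + t * u)
    g2 = (g(h) - 2 * g(0) + g(-h)) / h**2                      # g''(0)
    g3 = (g(2*h) - 2*g(h) + 2*g(-h) - g(-2*h)) / (2 * h**3)    # g'''(0)
    return abs(g3) / g2**1.5

# The log-barrier of the positive orthant attains the constant M = 2.
f = lambda x: -np.sum(np.log(x))
print(self_concordance_ratio(f, np.array([1.0, 2.0]), np.array([1.0, -1.0])))
# ~1.25, below the self-concordance constant M = 2
```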
If [math]\displaystyle{ f_1 }[/math] and [math]\displaystyle{ f_2 }[/math] are self-concordant with constants [math]\displaystyle{ M_1 }[/math] and [math]\displaystyle{ M_2 }[/math] and [math]\displaystyle{ \alpha,\beta\gt 0 }[/math], then [math]\displaystyle{ \alpha f_1 + \beta f_2 }[/math] is self-concordant with constant [math]\displaystyle{ \max(\alpha^{-1/2} M_1, \beta^{-1/2} M_2) }[/math].
If [math]\displaystyle{ f }[/math] is self-concordant with constant [math]\displaystyle{ M }[/math] and [math]\displaystyle{ Ax + b }[/math] is an affine transformation of [math]\displaystyle{ \mathbb R^n }[/math], then [math]\displaystyle{ \phi(x) = f(Ax+b) }[/math] is also self-concordant with the same constant [math]\displaystyle{ M }[/math].
If [math]\displaystyle{ f }[/math] is self-concordant, then its convex conjugate [math]\displaystyle{ f^* }[/math] is also self-concordant.[8][9]
If [math]\displaystyle{ f }[/math] is self-concordant and the domain of [math]\displaystyle{ f }[/math] contains no straight line (infinite in both directions), then [math]\displaystyle{ f'' }[/math] is non-singular.
Conversely, if for some [math]\displaystyle{ x }[/math] in the domain of [math]\displaystyle{ f }[/math] and [math]\displaystyle{ u \in \mathbb R^n, u \neq 0 }[/math] we have [math]\displaystyle{ \langle f''(x) u, u \rangle = 0 }[/math], then [math]\displaystyle{ \langle f''(x + \alpha u) u, u \rangle = 0 }[/math] for all [math]\displaystyle{ \alpha }[/math] for which [math]\displaystyle{ x + \alpha u }[/math] is in the domain of [math]\displaystyle{ f }[/math]. Then [math]\displaystyle{ f(x + \alpha u) }[/math] is linear in [math]\displaystyle{ \alpha }[/math]; since a closed convex function must blow up as the boundary of its domain is approached, and a linear function cannot, all of [math]\displaystyle{ x + \alpha u, \alpha \in \mathbb R }[/math] is in the domain of [math]\displaystyle{ f }[/math]. We note also that such an [math]\displaystyle{ f }[/math] cannot have a minimum inside its domain.
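As an illustration of the first two properties, the logarithmic barrier of a polytope, [math]\displaystyle{ f(x) = -\sum_i \ln(b_i - \langle a_i, x \rangle) }[/math], is a sum of affine pre-compositions of [math]\displaystyle{ -\ln }[/math] and is therefore self-concordant with [math]\displaystyle{ M = 2 }[/math]. A small numerical sketch (the data and test point are our own) spot-checks the defining inequality along random lines:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.normal(size=(6, 3))
b = A @ np.ones(3) + rng.uniform(1.0, 2.0, size=6)
x = np.ones(3)                       # strictly feasible: b - A @ x > 0

# For g(t) = f(x + t*u) with f(x) = -sum(log(b - A x)), one computes
# g''(0) = sum(r_i**2) and g'''(0) = 2 * sum(r_i**3),
# where r_i = <a_i, u> / (b_i - <a_i, x>).
for _ in range(1000):
    u = rng.normal(size=3)
    r = (A @ u) / (b - A @ x)
    assert abs(2.0 * np.sum(r**3)) <= 2.0 * np.sum(r**2)**1.5 + 1e-12
```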
Among other things, self-concordant functions are useful in the analysis of Newton's method. Self-concordant barrier functions are used to develop the barrier functions used in interior point methods for convex and nonlinear optimization. The usual analysis of the Newton method does not work for barrier functions, as their second derivative cannot be Lipschitz continuous: otherwise they would be bounded on any compact subset of [math]\displaystyle{ \mathbb R^n }[/math], whereas a barrier function grows without bound as the boundary of its domain is approached.
Minimizing a self-concordant function
A self-concordant function may be minimized with a modified Newton method where we have a bound on the number of steps required for convergence. We suppose here that [math]\displaystyle{ f }[/math] is a standard self-concordant function, that is, it is self-concordant with parameter [math]\displaystyle{ M = 2 }[/math].
We define the Newton decrement [math]\displaystyle{ \lambda_f(x) }[/math] of [math]\displaystyle{ f }[/math] at [math]\displaystyle{ x }[/math] as the size of the Newton step [math]\displaystyle{ [f''(x)]^{-1} f'(x) }[/math] in the local norm defined by the Hessian of [math]\displaystyle{ f }[/math] at [math]\displaystyle{ x }[/math]:

[math]\displaystyle{ \lambda_f(x) = \| [f''(x)]^{-1} f'(x) \|_{f''(x)} = \langle [f''(x)]^{-1} f'(x), f'(x) \rangle^{1/2} }[/math]
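In code, the decrement is one linear solve plus an inner product (a sketch; grad and hess are caller-supplied callables, names our own):

```python
import numpy as np

def newton_decrement(grad, hess, x):
    """lambda_f(x) = sqrt(<[f''(x)]^{-1} f'(x), f'(x)>)."""
    g = grad(x)
    step = np.linalg.solve(hess(x), g)   # Newton step [f''(x)]^{-1} f'(x)
    return float(np.sqrt(g @ step))      # its length in the local norm
```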
Then for [math]\displaystyle{ x }[/math] in the domain of [math]\displaystyle{ f }[/math], if [math]\displaystyle{ \lambda_f(x) \lt 1 }[/math] then it is possible to prove that the Newton iterate

[math]\displaystyle{ x_+ = x - [f''(x)]^{-1} f'(x) }[/math]

will also be in the domain of [math]\displaystyle{ f }[/math]. This is because, based on the self-concordance of [math]\displaystyle{ f }[/math], it is possible to give some finite bounds on the value of [math]\displaystyle{ f(x_+) }[/math]. We further have

[math]\displaystyle{ \lambda_f(x_+) \leq \Bigg( \frac{\lambda_f(x)}{1 - \lambda_f(x)} \Bigg)^2 }[/math]
Then if we have

[math]\displaystyle{ \lambda_f(x) \lt \bar\lambda = \frac{3 - \sqrt 5}{2} }[/math]

then it is also guaranteed that [math]\displaystyle{ \lambda_f(x_+) \lt \lambda_f(x) }[/math], so that we can continue to use the Newton method until convergence. Note that for [math]\displaystyle{ \lambda_f(x) \lt \beta }[/math] for some [math]\displaystyle{ \beta \in (0, \bar\lambda) }[/math] we have quadratic convergence of [math]\displaystyle{ \lambda_f }[/math] to 0, as [math]\displaystyle{ \lambda_f(x_+) \leq (1-\beta)^{-2} \lambda_f(x)^2 }[/math]. This then gives quadratic convergence of [math]\displaystyle{ f(x_k) }[/math] to [math]\displaystyle{ f(x^*) }[/math] and of [math]\displaystyle{ x }[/math] to [math]\displaystyle{ x^* }[/math], where [math]\displaystyle{ x^* = \arg\min f(x) }[/math], by the following theorem. If [math]\displaystyle{ \lambda_f(x) \lt 1 }[/math] then

[math]\displaystyle{ \omega(\lambda_f(x)) \leq f(x) - f(x^*) \leq \omega_*(\lambda_f(x)) }[/math]
[math]\displaystyle{ \omega'(\lambda_f(x)) \leq \| x - x^* \|_x \leq \omega_*'(\lambda_f(x)) }[/math]

with the following definitions

[math]\displaystyle{ \omega(t) = t - \ln(1+t) }[/math]
[math]\displaystyle{ \omega_*(t) = -t - \ln(1-t) }[/math]
[math]\displaystyle{ \| u \|_x = \langle f''(x) u, u \rangle^{1/2} }[/math]
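A quick numerical sanity check of the first pair of bounds, with an illustrative standard self-concordant [math]\displaystyle{ f }[/math] of our own choosing:

```python
import numpy as np

omega      = lambda t:  t - np.log(1 + t)
omega_star = lambda t: -t - np.log(1 - t)

# Illustrative f(x) = <c, x> - sum(log(x)): standard self-concordant (M = 2),
# with known minimizer x*_i = 1/c_i.
c  = np.array([1.0, 2.0])
f  = lambda x: c @ x - np.sum(np.log(x))
xs = 1.0 / c

x = np.array([1.2, 0.45])                    # a point with lambda_f(x) < 1
g, H = c - 1.0 / x, np.diag(1.0 / x**2)
lam = np.sqrt(g @ np.linalg.solve(H, g))     # Newton decrement, ~0.22
assert omega(lam) <= f(x) - f(xs) <= omega_star(lam)
```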
If we start the Newton method from some [math]\displaystyle{ x_0 }[/math] with [math]\displaystyle{ \lambda_f(x_0) \geq \bar\lambda }[/math] then we have to start by using a damped Newton method defined by

[math]\displaystyle{ x_{k+1} = x_k - \frac{1}{1 + \lambda_f(x_k)} [f''(x_k)]^{-1} f'(x_k) }[/math]
For this it can be shown that [math]\displaystyle{ f(x_{k+1}) \leq f(x_k) - \omega(\lambda_f(x_k)) }[/math] with [math]\displaystyle{ \omega }[/math] as defined previously. Since [math]\displaystyle{ \omega(t) }[/math] is an increasing function for [math]\displaystyle{ t \gt 0 }[/math], we have [math]\displaystyle{ \omega(t) \geq \omega(\bar\lambda) }[/math] for any [math]\displaystyle{ t \geq \bar\lambda }[/math], so the value of [math]\displaystyle{ f }[/math] is guaranteed to decrease by at least a fixed amount in each iteration, which also proves that [math]\displaystyle{ x_{k+1} }[/math] is in the domain of [math]\displaystyle{ f }[/math].
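Putting both phases together gives the full scheme: damped steps while [math]\displaystyle{ \lambda_f(x_k) \geq \bar\lambda }[/math], then pure Newton steps. A sketch on the same illustrative barrier as above (the choices of [math]\displaystyle{ f }[/math], starting point, and tolerance are ours, not prescribed by the source):

```python
import numpy as np

c = np.array([1.0, 2.0, 4.0])
grad = lambda x: c - 1.0 / x                 # f(x) = <c,x> - sum(log(x))
hess = lambda x: np.diag(1.0 / x**2)

x = np.array([5.0, 5.0, 5.0])
lam_bar = (3 - np.sqrt(5)) / 2               # threshold for the damped phase
for _ in range(100):
    g = grad(x)
    step = np.linalg.solve(hess(x), g)
    lam = np.sqrt(g @ step)                  # Newton decrement lambda_f(x)
    if lam < 1e-12:
        break
    t = 1.0 / (1.0 + lam) if lam >= lam_bar else 1.0
    x = x - t * step                         # damped, then pure, Newton step
print(x)                                     # -> [1.0, 0.5, 0.25] = 1/c
```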