The preconditioning of our generalized forward-backward splitting algorithm serves two practical purposes: first, it can accelerate convergence; second, it can simplify the computation of the resolvents of some operators in the splitting. In addition, it allows savings in storage and computation on some auxiliary variables, when some coordinates of the problem are involved in only some of the operators of the splitting.
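As an illustration of the kind of resolvent involved, the proximity operator of a weighted ℓ_{1} term (like the third term of the functional below) reduces to componentwise soft-thresholding. This is a minimal sketch of my own, not the paper's implementation; the function name and arguments are made up for the example:

```python
import numpy as np

def prox_weighted_l1(x, nu, step):
    """Proximity operator of x -> sum_v nu_v |x_v|, with step size `step`:
    componentwise soft-thresholding at threshold step * nu_v."""
    return np.sign(x) * np.maximum(np.abs(x) - step * nu, 0.0)

# Example: entries below the threshold are set to zero,
# larger entries are shrunk toward zero.
x = np.array([2.0, -0.3, 0.5])
nu = np.array([1.0, 1.0, 1.0])
print(prox_weighted_l1(x, nu, 0.5))
```

A diagonal preconditioner amounts to letting the step size vary per coordinate, which this componentwise formula accommodates directly.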

As a particular application, we provide an efficient method for minimizing the following functional, defined over the vertices of a graph *G* = (*V*, *E*):

ℝ^{V} → ℝ

*x* ↦ ∑_{v∈V} (*λ*_{v}/2) |*x*_{v} − *y*_{v}|² + ∑_{(u,v)∈E} *μ*_{(u,v)} |*x*_{u} − *x*_{v}| + ∑_{v∈V} *ν*_{v} |*x*_{v}| ,

where *y* ∈ ℝ^{V} is the vector of observations on the vertices, and *λ* ∈ ℝ^{V}, *μ* ∈ ℝ^{E}, and *ν* ∈ ℝ^{V} are weights defined over the vertices or the edges. The first term is a classical data-fidelity term, the second is akin to a weighted total variation seminorm, and the third is a weighted ℓ_{1}-norm.
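To make the functional concrete, here is a toy evaluation on a small graph. This is only an illustration of the objective itself, not of the paper's solver; the graph, weights, and observations are made-up example data:

```python
import numpy as np

def objective(x, y, lam, mu, nu, edges):
    """Evaluate the graph functional: quadratic data fidelity on vertices,
    weighted total variation over edges, and weighted l1 norm."""
    fidelity = 0.5 * np.sum(lam * (x - y) ** 2)
    tv = sum(mu_e * abs(x[u] - x[v]) for (u, v), mu_e in zip(edges, mu))
    l1 = np.sum(nu * np.abs(x))
    return fidelity + tv + l1

# Toy example: a path graph on 3 vertices (made-up data).
edges = [(0, 1), (1, 2)]
y = np.array([1.0, 0.0, -1.0])
lam = np.ones(3)
mu = np.array([0.5, 0.5])
nu = np.array([0.1, 0.1, 0.1])

# At x = 0 the TV and l1 terms vanish; only the fidelity term remains.
print(objective(np.zeros(3), y, lam, mu, nu, edges))  # -> 1.0
```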

Published in

The source code of the numerical simulations used in the paper is no longer maintained. However, I wrote a C++ version, parallelized with OpenMP and interfaced with GNU Octave or Matlab through the MEX API. It is available on my GitHub repository; the above problem is a specific instance of the composite problem combining a quadratic functional, an ℓ1 norm, bounds, and a graph total variation. Feel free to contact me with any questions.

Also, check out our new state-of-the-art approach for large-scale nondifferentiable optimization problems regularized by graph total variation.