The preconditioning of our generalized forward-backward splitting algorithm can serve two practical purposes: it might accelerate convergence, or it might simplify the computation of the resolvents of some operators in the splitting. In addition, when some coordinates of the problem are involved in only some of the operators of the splitting, we save storage and computation on the corresponding auxiliary variables.
As a particular application, we provide an efficient method for minimizing the following functional, structured on the vertices of a graph G = (V, E):

ℝ^V → ℝ : x ↦ ∑_{v ∈ V} (λ_v / 2) |x_v − y_v|² + ∑_{(u,v) ∈ E} μ_{(u,v)} |x_u − x_v| + ∑_{v ∈ V} ν_v |x_v| ,

where y ∈ ℝ^V is the vector of observations on the vertices, and λ ∈ ℝ^V, μ ∈ ℝ^E, and ν ∈ ℝ^V are weights defined over the vertices or the edges. The first term is a classical data-fidelity term, the second term is akin to a weighted total variation seminorm, and the third term is a weighted ℓ1-norm.
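For concreteness, here is a minimal NumPy sketch of how such a functional can be minimized with a plain (unpreconditioned) generalized forward-backward splitting, where the total variation term is split edge by edge so that every proximal operator is in closed form; the function and variable names are illustrative only and are not taken from our code. Note that each simple term gets its own full-size auxiliary variable, even though each edge term involves only two coordinates: this is precisely the kind of overhead that the coordinate-aware preconditioning mentioned above avoids.

    import numpy as np

    def soft_threshold(t, tau):
        # proximal operator of tau * |.| (elementwise)
        return np.sign(t) * np.maximum(np.abs(t) - tau, 0.0)

    def gfb_graph_fused_lasso(y, lam, edges, mu, nu, n_iter=1000):
        # y, lam, nu: arrays over the vertices; edges: list of (u, v); mu: array over the edges
        gamma = 1.0 / lam.max()        # step size in (0, 2/L), with L = max_v lam_v
        n = len(edges) + 1             # one simple term per edge, plus the weighted l1 term
        w = np.full(n, 1.0 / n)        # weights of the splitting, summing to one
        z = np.tile(y, (n, 1))         # one full-size auxiliary variable per simple term
        x = y.copy()
        for _ in range(n_iter):
            grad = lam * (x - y)       # gradient of the data-fidelity term
            p = 2.0 * x - gamma * grad
            for i, (u, v) in enumerate(edges):
                q = p - z[i]           # argument of the i-th proximal operator
                # prox of (gamma/w_i) * mu_i * |x_u - x_v|: only coordinates u, v move
                s = 0.5 * (q[u] + q[v])
                d = soft_threshold(q[u] - q[v], 2.0 * gamma * mu[i] / w[i])
                prox = q.copy()
                prox[u], prox[v] = s + 0.5 * d, s - 0.5 * d
                z[i] += prox - x
            q = p - z[-1]              # last term: weighted l1-norm, prox is soft-thresholding
            z[-1] += soft_threshold(q, gamma * nu / w[-1]) - x
            x = w @ z                  # weighted average of the auxiliary variables
        return x

For instance, on a small chain graph, gfb_graph_fused_lasso(y, lam, [(0, 1), (1, 2)], mu, nu) returns an approximate minimizer; for anything large, the implementation on my GitLab repository (see below) should be preferred.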
Work done in collaboration with Loïc Landrieu. Published in the SIAM Journal on Imaging Sciences: Permalink - Authors' manuscript - reference in BibTeX format. The source code of the numerical simulations used in the paper is no longer maintained. However, I wrote a version in C++, parallelized with OpenMP, and interfaced with GNU Octave or Matlab through the MEX API, or with Python. It is available on my GitLab repository; the above problem is a specific instance of the quadratic functional, ℓ1-norm, bounds, and graph total variation composite problem. Feel free to contact me with any questions. Also, check out our new state-of-the-art approach for large-scale nondifferentiable optimization problems regularized by the graph total variation.