Submitted Paper
Inserted: 6 Apr 2025
Last Updated: 20 Apr 2025
Year: 2025
Abstract:
We revisit the classical problem of denoising a one-dimensional scalar-valued function by minimizing the sum of an $L^2$ fidelity term and the total variation, scaled by a regularization parameter. We prove that both the jump set of solutions, corresponding to discontinuities or edges, and the amplitudes of the jumps are nonincreasing as the regularization parameter increases. Our results apply to input functions in $L^\infty$ with left and right approximate limits everywhere, extending beyond the traditional setting of functions of bounded variation. The proof relies on competitor constructions and on convexity properties of the taut string problem, a well-known equivalent formulation of the TV model. This monotonicity property shows that the extent to which geometric and topological features of the original signal are preserved is consistent with the amount of smoothing prescribed by the regularization parameter.
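For concreteness, here is a minimal sketch of the model described above, in notation of our own choosing (the symbols $f$ for the input, $\lambda > 0$ for the regularization parameter, $u_\lambda$ for a solution, and the interval $(0,1)$ are assumptions, not taken from the paper):
$$ u_\lambda \in \operatorname*{arg\,min}_{u} \; \frac{1}{2}\int_0^1 \bigl(u(x) - f(x)\bigr)^2 \, dx \; + \; \lambda \, \mathrm{TV}(u). $$
In the equivalent taut string formulation mentioned in the abstract, one writes $F(x) = \int_0^x f(t)\,dt$ and characterizes the primitive of $u_\lambda$ as the string of minimal length pulled taut inside the tube of radius $\lambda$ around $F$:
$$ U_\lambda \in \operatorname*{arg\,min} \Bigl\{ \int_0^1 \sqrt{1 + U'(x)^2}\,dx \;:\; \|U - F\|_\infty \le \lambda, \; U(0) = F(0), \; U(1) = F(1) \Bigr\}, \qquad u_\lambda = U_\lambda'. $$
In this notation, the monotonicity result reads: for $\lambda' \ge \lambda$, the jump sets satisfy $J_{u_{\lambda'}} \subseteq J_{u_\lambda}$, and $\bigl|u_{\lambda'}(x^+) - u_{\lambda'}(x^-)\bigr| \le \bigl|u_\lambda(x^+) - u_\lambda(x^-)\bigr|$ at every $x \in J_{u_{\lambda'}}$.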