TY - JOUR
T1 - Randomized block coordinate descent method for linear ill-posed problems
AU - Jin, Qinian
AU - Liu, Duo
N1 - Publisher Copyright:
© 2025 IOP Publishing Ltd. All rights, including for text and data mining, AI training, and similar technologies, are reserved.
PY - 2025/3
Y1 - 2025/3
N2 - Consider linear ill-posed problems of the form ∑_{i=1}^{b} A_i x_i = y, where, for each i, A_i is a bounded linear operator between two Hilbert spaces X_i and Y. When b is huge, solving the problem by an iterative method that uses the full gradient at each iteration step is both time-consuming and memory-demanding. Although the randomized block coordinate descent (RBCD) method has been shown to be an efficient method for well-posed large-scale optimization problems with a small memory footprint, a convergence analysis of the RBCD method for solving ill-posed problems is still lacking. In this paper, we investigate the convergence properties of the RBCD method with noisy data under either a priori or a posteriori stopping rules. We prove that the RBCD method combined with an a priori stopping rule yields a sequence that converges weakly to a solution of the problem almost surely. We also consider early stopping of the RBCD method and demonstrate that the discrepancy principle can terminate the iteration after finitely many steps almost surely. For a class of ill-posed problems with a special tensor product form, we obtain strong convergence results for the RBCD method. Furthermore, we consider incorporating convex regularization terms into the RBCD method to enhance the detection of solution features. To illustrate the theory and the performance of the method, numerical simulations from the imaging modalities of computed tomography and compressive temporal imaging are reported.
AB - Consider linear ill-posed problems of the form ∑_{i=1}^{b} A_i x_i = y, where, for each i, A_i is a bounded linear operator between two Hilbert spaces X_i and Y. When b is huge, solving the problem by an iterative method that uses the full gradient at each iteration step is both time-consuming and memory-demanding. Although the randomized block coordinate descent (RBCD) method has been shown to be an efficient method for well-posed large-scale optimization problems with a small memory footprint, a convergence analysis of the RBCD method for solving ill-posed problems is still lacking. In this paper, we investigate the convergence properties of the RBCD method with noisy data under either a priori or a posteriori stopping rules. We prove that the RBCD method combined with an a priori stopping rule yields a sequence that converges weakly to a solution of the problem almost surely. We also consider early stopping of the RBCD method and demonstrate that the discrepancy principle can terminate the iteration after finitely many steps almost surely. For a class of ill-posed problems with a special tensor product form, we obtain strong convergence results for the RBCD method. Furthermore, we consider incorporating convex regularization terms into the RBCD method to enhance the detection of solution features. To illustrate the theory and the performance of the method, numerical simulations from the imaging modalities of computed tomography and compressive temporal imaging are reported.
KW - convergence
KW - convex regularization term
KW - imaging
KW - linear ill-posed problems
KW - randomized block coordinate descent method
UR - http://www.scopus.com/inward/record.url?scp=86000315713&partnerID=8YFLogxK
U2 - 10.1088/1361-6420/adb780
DO - 10.1088/1361-6420/adb780
M3 - Article
AN - SCOPUS:86000315713
SN - 0266-5611
VL - 41
JO - Inverse Problems
JF - Inverse Problems
IS - 3
M1 - 035012
ER -