Splitting methods are examined for the iterative solution of the classical linear least-squares problem in Hilbert space. The convergence conditions obtained for the class of iterations studied generalize existing conditions in the literature. Throughout, the emphasis is on an organization-theoretic interpretation of the algorithm, which clarifies certain questions of decentralization of information and computation. Two examples are discussed in some detail: a matrix example and a standard optimal control problem. When such problems involve very large, sparse matrices, the analogy with the "invertible case" provides a compelling argument for further investigation of applicability.
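As a minimal sketch of the idea (not the paper's specific algorithm), the classical matrix splitting approach applies a fixed-point iteration to the normal equations \(A^{\mathsf T}A x = A^{\mathsf T}b\): writing \(A^{\mathsf T}A = M - N\), one iterates \(x_{k+1} = M^{-1}(N x_k + A^{\mathsf T}b)\), which converges when the spectral radius of \(M^{-1}N\) is below one. The matrices and the Jacobi (diagonal) splitting below are illustrative assumptions:

```python
import numpy as np

# Illustrative least-squares problem min ||Ax - b|| with a small dense A.
A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.5, 0.5]])
b = np.array([1.0, 2.0, 0.5])

G = A.T @ A          # Gram matrix of the normal equations G x = c
c = A.T @ b

# Splitting G = M - N with M the diagonal part (Jacobi splitting).
M = np.diag(np.diag(G))
N = M - G

x = np.zeros(2)
for _ in range(200):
    # Fixed-point iteration x_{k+1} = M^{-1}(N x_k + c); converges
    # because the spectral radius of M^{-1} N is below one here.
    x = np.linalg.solve(M, N @ x + c)

x_ref, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.allclose(x, x_ref))
```

Because each component of the Jacobi update uses only the previous iterate, the component updates can be computed by separate agents in parallel, which is one way to read the abstract's "decentralization of computation."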