Browsing by Author "Hnětynková Iveta"
Now showing 1 - 12 of 12
- Band generalization of the Golub–Kahan bidiagonalization, generalized Jacobi matrices, and the core problem (Society for Industrial and Applied Mathematics, 2015-01-01) Plešinger Martin; Hnětynková Iveta; Strakoš Zdeněk
- Complex wedge-shaped matrices: A generalization of Jacobi matrices (Elsevier, 2015-01-01) Plešinger Martin; Hnětynková Iveta
- The core problem within a linear approximation problem AX ≈ B with multiple right-hand sides (SIAM Publications, 2013-01-01) Hnětynková Iveta; Plešinger Martin; Strakoš Zdeněk
- Filter factors of truncated TLS regularization with multiple observations (Springer, 2017-01-01) Plešinger Martin; Žáková Jana; Hnětynková Iveta
  The total least squares (TLS) and truncated TLS (T-TLS) methods are widely known linear data fitting approaches, often used also in the context of very ill-conditioned, rank-deficient, or ill-posed problems. Regularization properties of T-TLS applied to linear approximation problems $Ax\approx b$ were analyzed by Fierro, Golub, Hansen, and O'Leary (1997) through the so-called filter factors, which allow the solution to be represented in terms of a filtered pseudoinverse of $A$ applied to $b$. This paper focuses on the situation when multiple observations $b_1,\ldots,b_d$ are available, i.e., the T-TLS method is applied to the problem $AX\approx B$, where $B=[b_1,\ldots,b_d]$ is a matrix. It is proved that the filtering representation of the T-TLS solution can be generalized to this case, and the corresponding filter factors are derived explicitly.
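The T-TLS construction this abstract builds on can be sketched with numpy; the following is a minimal illustration of the standard SVD-based formulation for a single observation (the function name `ttls` and the partitioning after the first $k$ columns of $V$ follow the classical single-right-hand-side setting, not the paper's multiple-observation derivation):

```python
import numpy as np

def ttls(A, b, k):
    """Truncated TLS solution of A x ~ b with truncation level k.

    Uses the SVD of the augmented matrix [A, b]; the solution is
    x = -V12 V22^+, where V is partitioned after its first k columns.
    """
    m, n = A.shape
    # SVD of the augmented matrix [A, b]; only V is needed
    _, _, Vt = np.linalg.svd(np.column_stack([A, b]))
    V = Vt.T
    V12 = V[:n, k:]   # upper-right block of V (n x (n+1-k))
    V22 = V[n:, k:]   # lower-right block (here a 1 x (n+1-k) row)
    # x = -V12 V22^+  (pseudoinverse of the row block V22)
    return (-V12 @ np.linalg.pinv(V22)).ravel()
```

For a compatible, full-rank problem with $k = n$, this reproduces the exact solution, since the last column of $V$ then spans the null space of $[A, b]$.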
- Modification of TLS algorithm for solving ℱ2 linear data fitting problems (Wiley Interscience, 2017-01-01) Plešinger Martin; Žáková Jana; Hnětynková Iveta
  It has been proved that the classical TLS algorithm fails to construct a TLS solution of linear data fitting problems AX ≈ B that belong to the class ℱ2. It is shown how to modify this algorithm so that it reaches a TLS solution. Such a solution is not necessarily the one of minimum 2-norm or Frobenius norm; a few ideas for decreasing its norm are briefly discussed.
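The classical TLS algorithm referred to here can be sketched as follows (a minimal numpy illustration of the standard Van Huffel–Vandewalle construction; it returns a solution only when the relevant block of $V$ is nonsingular, which is precisely the kind of assumption that fails for the class ℱ2 studied in the paper):

```python
import numpy as np

def classical_tls(A, B):
    """Classical TLS construction for A X ~ B via the SVD of [A, B].

    Returns X = -V12 V22^{-1}; raises when V22 is numerically singular,
    in which case this classical construction yields no TLS solution.
    """
    m, n = A.shape
    d = B.shape[1]
    _, _, Vt = np.linalg.svd(np.column_stack([A, B]))
    V = Vt.T
    V12 = V[:n, n:]   # n x d block
    V22 = V[n:, n:]   # d x d block; must be invertible
    if np.linalg.cond(V22) > 1e12:
        raise np.linalg.LinAlgError("V22 singular: classical construction fails")
    return -V12 @ np.linalg.inv(V22)
```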
- Noise representation in residuals of LSQR, LSMR, and CRAIG regularization (Elsevier, 2017-11-15) Plešinger Martin; Hnětynková Iveta; Kubínová Marie
  Golub–Kahan iterative bidiagonalization represents the core algorithm in several regularization methods for solving large linear noise-polluted ill-posed problems. We consider a general noise setting and derive explicit relations between (noise-contaminated) bidiagonalization vectors and the residuals of the bidiagonalization-based regularization methods LSQR, LSMR, and CRAIG. For LSQR and LSMR residuals we prove that the coefficients of the linear combination of the computed bidiagonalization vectors reflect the amount of propagated noise in each of these vectors. For CRAIG the residual is only a multiple of a particular bidiagonalization vector. We show how its size indicates the regularization effect in each iteration by expressing the CRAIG solution as the exact solution to a modified compatible problem. The validity of the results for larger two-dimensional problems and the influence of the loss of orthogonality are also discussed.
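The Golub–Kahan iterative bidiagonalization underlying LSQR, LSMR, and CRAIG can be sketched in a few lines of numpy (a minimal lower-bidiagonalization recurrence started from $b$, without the reorthogonalization that practical implementations may add; the function name and return convention are illustrative):

```python
import numpy as np

def golub_kahan(A, b, k):
    """k steps of Golub-Kahan lower bidiagonalization started from b.

    Returns U (m x (k+1)) and V (n x k) with orthonormal columns and a
    lower bidiagonal L ((k+1) x k) such that A V = U L in exact
    arithmetic. No reorthogonalization is performed.
    """
    m, n = A.shape
    U = np.zeros((m, k + 1))
    V = np.zeros((n, k))
    L = np.zeros((k + 1, k))
    U[:, 0] = b / np.linalg.norm(b)          # beta_1 u_1 = b
    for j in range(k):
        # alpha_j v_j = A^T u_j - beta_j v_{j-1}
        w = A.T @ U[:, j] - (L[j, j - 1] * V[:, j - 1] if j > 0 else 0)
        alpha = np.linalg.norm(w)
        V[:, j] = w / alpha
        L[j, j] = alpha
        # beta_{j+1} u_{j+1} = A v_j - alpha_j u_j
        w = A @ V[:, j] - alpha * U[:, j]
        beta = np.linalg.norm(w)
        U[:, j + 1] = w / beta
        L[j + 1, j] = beta
    return U, V, L
```

In floating-point arithmetic the columns of U and V gradually lose orthogonality, which is exactly the effect whose influence the paper discusses.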
- Notes on performance of bidiagonalization-based noise level estimator in image deblurring (2016-01-01) Plešinger Martin; Hnětynková Iveta; Kubínová Marie
- On TLS formulation and core reduction for data fitting with generalized models (Elsevier BV, 2019-01-01) Hnětynková Iveta; Plešinger Martin; Žáková Jana
- The regularizing effect of the Golub–Kahan iterative bidiagonalization and revealing the noise level in the data (Springer Verlag, 2009-01-01) Hnětynková Iveta; Plešinger Martin; Strakoš Zdeněk
- Solvability of the core problem with multiple right-hand sides in the TLS sense (SIAM, 2016-01-01) Plešinger Martin; Hnětynková Iveta; Sima Diana Maria
- TLS formulation and core reduction for problems with structured right-hand sides (Elsevier, 2018-01-01) Hnětynková Iveta; Plešinger Martin; Žáková Jana
  The total least squares (TLS) represents a popular data fitting approach for solving linear approximation problems $Ax\approx b$ (i.e., with a vector right-hand side) and $AX\approx B$ (i.e., with a matrix right-hand side) contaminated by errors. This paper introduces a generalization of the TLS formulation to problems with structured right-hand sides. First, we focus on the case where the right-hand side, and consequently also the solution, is a tensor. We show that whereas the basic solvability result can be obtained directly by matricization of both tensors, generalization of the core problem reduction is more complicated. The core reduction allows the problem dimensions to be reduced mathematically by removing all redundant and irrelevant data from the system matrix and the right-hand side. We prove that the core problems within the original tensor problem and its matricized counterpart are in general different. Then we concentrate on problems with even more structured right-hand sides, where the same model $A$ corresponds to a set of various tensor right-hand sides. Finally, relations between the matrix and tensor core problems are discussed.
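The matricization step mentioned in this abstract can be illustrated concisely (a minimal numpy sketch of the standard mode-1 unfolding; the helper name `matricize` and the 3-way example shape are illustrative, not taken from the paper):

```python
import numpy as np

def matricize(T):
    """Mode-1 unfolding: stack the mode-1 fibers of T as the columns
    of a matrix, turning a tensor right-hand side of shape (m, p, q)
    into an m x (p*q) matrix.
    """
    # Fortran-order reshape makes the first index vary fastest,
    # which is exactly the standard mode-1 unfolding convention.
    return T.reshape(T.shape[0], -1, order="F")
```

After this unfolding, a tensor problem with an $m \times p \times q$ right-hand side can be treated as an ordinary matrix problem $AX \approx B$ with $pq$ columns, which is how the basic solvability result transfers; the paper's point is that the core reduction does not transfer this directly.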
- The total least squares problem in AX ≈ B: A new classification with the relationship to the classical works (SIAM Publications, 2011-01-01) Hnětynková Iveta; Plešinger Martin; Sima Diana Maria; Strakoš Zdeněk; Van Huffel Sabine