Browsing by Author "Lukšan, Ladislav"
- A limited-memory optimization method using the infinitely many times repeated BNS update and conjugate directions (2019-05). Vlček, Jan; Lukšan, Ladislav. To improve the performance of the limited-memory variable metric L-BFGS method for large-scale unconstrained optimization, repeating some BFGS updates was proposed, e.g., in Al-Baali (1999, 2002). Since the repeating process can be time consuming, the extra updates need to be selected carefully. We show that for the limited-memory variable metric BNS method, matrix updating can be repeated infinitely many times under certain conditions, with only a small increase in the number of arithmetic operations. The limit matrix can be written as a block BFGS update (Vlček and Lukšan, 2018), which can be obtained by solving a low-order Lyapunov matrix equation. The resulting method can be advantageously combined with methods based on vector corrections for conjugacy, see e.g. Vlček and Lukšan (2015). Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new method.
- Application of the infinitely many times repeated BNS update and conjugate directions to limited-memory optimization methods (Institute of Mathematics, Academy of Sciences of the Czech Republic, Prague, 2019). Vlček, Jan; Lukšan, Ladislav. To improve the performance of the L-BFGS method for large-scale unconstrained optimization, repeating some BFGS updates was proposed, e.g., in [1]. Since this can be time consuming, the extra updates need to be selected carefully. We show that groups of these updates can be repeated infinitely many times under certain conditions, without a noticeable increase of the computational time; the limit update is a block BFGS update [17]. It can be obtained by solving a Lyapunov matrix equation whose order can be decreased by applying vector corrections for conjugacy [16]. Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical results indicate the efficiency of the new method.
- Primal Interior Point Method for Minimization of Generalized Minimax Functions (Kybernetika, 2010). Lukšan, Ladislav; Matonoha, Ctirad; Vlček, Jan. In this paper, we propose a primal interior-point method for large sparse generalized minimax optimization. After a short introduction, where the problem is stated, we introduce the basic equations of the Newton method applied to the KKT conditions and propose a primal interior-point method. Next we describe the basic algorithm and give more details concerning its implementation, covering numerical differentiation, variable metric updates, and the barrier parameter decrease. Using standard weak assumptions, we prove that this algorithm is globally convergent if a bounded barrier is used. Then, using stronger assumptions, we prove that it is also globally convergent for the logarithmic barrier. Finally, we present results of computational experiments confirming the efficiency of the primal interior-point method for special cases of generalized minimax problems.
- Properties of the block BFGS update and its application to the limited-memory block BNS method for unconstrained minimization (2019-03). Vlček, Jan; Lukšan, Ladislav. A block version of the Broyden–Fletcher–Goldfarb–Shanno (BFGS) variable metric update formula and its modifications are investigated. Although this formula satisfies the quasi-Newton conditions with all used difference vectors, and for quadratic objective functions the improvement of convergence is in some sense the best possible, for general functions it does not guarantee that the corresponding direction vectors are descent directions. To overcome this difficulty while still exploiting the advantageous properties of the block BFGS update, a block version of the limited-memory variable metric BNS method for large-scale unconstrained optimization is proposed. Global convergence of the algorithm is established for convex and sufficiently smooth functions. Numerical experiments demonstrate the efficiency of the new method.
- Recursive Form of General Limited Memory Variable Metric Methods (Kybernetika, 2013). Lukšan, Ladislav; Vlček, Jan. In this report we propose a new recursive matrix formulation of limited-memory variable metric methods. This approach can be used for an arbitrary update from the Broyden class (and some other updates) and also for approximating both the Hessian matrix and its inverse. The new recursive formulation requires approximately 4mn multiplications and additions per iteration, so it is comparable with other efficient limited-memory variable metric methods. Numerical experiments with Algorithm 1, proposed in this report, confirm its practical efficiency.
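The generalized minimax problems treated in the primal interior-point entry above include, as their simplest special case, the classical finite minimax problem. A sketch of that special case and of a standard primal log-barrier reformulation (the slack variable z and barrier function B_mu are illustrative notation, not necessarily the paper's own):

```latex
\min_{x\in\mathbb{R}^n} F(x), \qquad F(x) = \max_{1\le i\le m} f_i(x)
```

Introducing a slack z with constraints f_i(x) \le z, a logarithmic barrier yields the smooth subproblems

```latex
\min_{x,\,z}\; B_\mu(x,z) = z - \mu \sum_{i=1}^{m} \log\bigl(z - f_i(x)\bigr),
\qquad \mu \downarrow 0,
```

which a primal interior-point method solves by Newton steps on the corresponding KKT conditions while driving the barrier parameter \mu to zero.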
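Several of the entries above build on limited-memory variable metric recursions. For orientation, the classical L-BFGS two-loop recursion (the standard special case of such recursions; the general Broyden-class 4mn formulation of the last entry is more general) can be sketched as follows. This is a minimal illustration, assuming NumPy; the function name and interface are hypothetical:

```python
import numpy as np

def lbfgs_direction(grad, s_list, y_list):
    """Two-loop recursion: compute the search direction -H*grad from
    the stored correction pairs (s_i, y_i), oldest first, without ever
    forming the inverse Hessian approximation H explicitly."""
    q = grad.astype(float).copy()
    rhos = [1.0 / np.dot(y, s) for s, y in zip(s_list, y_list)]
    alphas = []
    # First loop: traverse pairs from newest to oldest.
    for s, y, rho in zip(reversed(s_list), reversed(y_list), reversed(rhos)):
        alpha = rho * np.dot(s, q)
        alphas.append(alpha)
        q -= alpha * y
    # Initial scaling gamma = s'y / y'y from the most recent pair.
    s, y = s_list[-1], y_list[-1]
    q *= np.dot(s, y) / np.dot(y, y)
    # Second loop: traverse pairs from oldest to newest.
    for s, y, rho, alpha in zip(s_list, y_list, rhos, reversed(alphas)):
        beta = rho * np.dot(y, q)
        q += (alpha - beta) * s
    return -q
```

For a quadratic f(x) = x'Ax/2 with correction pairs spanning the space, the recursion reproduces the Newton direction -A^{-1} grad, which is a convenient sanity check.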