Towards large scale unconstrained optimization
A large scale unconstrained optimization problem can be formulated when the dimension n is large. The notion of 'large scale' is machine dependent, and hence it can be difficult to state a priori when a problem is of large size; today, however, an unconstrained problem with 400 or more variables is usually considered a large scale problem. The main difficulty in dealing with large scale problems is the fact that algorithms which are effective for small scale problems do not necessarily remain efficient when applied to large scale problems. Therefore, in dealing with unconstrained problems in a large number of variables, modifications must be made to the standard implementations of the many existing algorithms for the small scale case. One of the most effective Newton-type methods for solving large-scale problems is the truncated Newton method. This method computes a Newton-type direction by truncating the conjugate gradient iterates (the inner iterations) as soon as a required accuracy is obtained, whereby superlinear convergence is guaranteed. Another effective approach to large-scale unconstrained optimization is the limited memory BFGS method. This method meets the requirements of large-scale problems because the storage of matrices is avoided by storing a number of vector pairs. The symmetric rank one (SR1) update is one of the simplest quasi-Newton updates for solving large-scale problems. However, a basic disadvantage is that the SR1 update may not preserve positive definiteness: updating a positive definite approximation can produce an indefinite one. A simple restart procedure for the SR1 method, using a standard line search to avoid the loss of positive definiteness, will be implemented. The matrix-storage free BFGS (MF-BFGS) method combines a restarting strategy with the BFGS method. We also construct a new matrix-storage free method which uses the SR1 update (MF-SR1). The MF-SR1 method is superior to the MF-BFGS method on some problems, but on others the MF-BFGS method is more competitive because of its rapid convergence. The matrix-storage free methods can be greatly accelerated by a simple scaling. Therefore, by simply scaling the SR1 and BFGS methods, we can improve them tremendously.
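The truncated Newton direction described in the abstract is obtained by running conjugate gradient (CG) iterations on the Newton system H d = -g and stopping them early. Below is a minimal sketch of that idea, not the lecture's implementation: the function name, the Hessian-vector product argument `hess_vec`, and the forcing-term rule min(0.5, sqrt(||g||)) are illustrative choices (the latter is one standard rule that yields superlinear convergence).

```python
import numpy as np

def truncated_newton_direction(grad, hess_vec, max_inner=None):
    """Approximately solve  H d = -g  by conjugate gradients,
    truncating the inner iteration early (truncated Newton)."""
    n = grad.shape[0]
    max_inner = n if max_inner is None else max_inner
    d = np.zeros(n)
    r = -grad.copy()              # residual of H d = -g at d = 0
    p = r.copy()
    g_norm = np.linalg.norm(grad)
    # Forcing term: the tolerance tightens as the gradient shrinks,
    # which is what gives superlinear convergence.
    stop = min(0.5, np.sqrt(g_norm)) * g_norm
    for _ in range(max_inner):
        if np.linalg.norm(r) <= stop:
            break                 # required accuracy obtained: truncate
        Hp = hess_vec(p)
        curv = p @ Hp
        if curv <= 0.0:           # negative curvature: bail out safely
            return -grad if not d.any() else d
        alpha = (r @ r) / curv
        d = d + alpha * p
        r_new = r - alpha * Hp
        p = r_new + ((r_new @ r_new) / (r @ r)) * p
        r = r_new
    return d

# Tiny usage example on a convex quadratic f(x) = 0.5 x^T A x - b^T x,
# where the exact Hessian-vector product is v -> A v:
A = np.diag(np.arange(1.0, 6.0))
b = np.ones(5)
g0 = A @ np.zeros(5) - b          # gradient at x = 0
d0 = truncated_newton_direction(g0, lambda v: A @ v)
```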
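The limited memory BFGS approach avoids storing any n-by-n matrix by keeping only a few recent vector pairs (s, y); the standard way to apply the resulting inverse-Hessian approximation is the two-loop recursion sketched below. This is the generic recursion, not the lecture's MF-BFGS method; the initial scaling gamma = (s·y)/(y·y) illustrates the kind of "simple scaling" the abstract says can greatly accelerate these methods.

```python
import numpy as np

def lbfgs_direction(grad, pairs):
    """Two-loop recursion: multiply -grad by the inverse BFGS
    approximation stored implicitly as a list of (s, y) pairs,
    ordered oldest to newest."""
    q = grad.copy()
    rhos = [1.0 / (y @ s) for (s, y) in pairs]
    alphas = []                        # filled newest pair first
    for (s, y), rho in zip(reversed(pairs), reversed(rhos)):
        a = rho * (s @ q)
        alphas.append(a)
        q = q - a * y
    if pairs:                          # standard initial scaling H_0 = gamma I
        s, y = pairs[-1]
        gamma = (s @ y) / (y @ y)
    else:
        gamma = 1.0
    r = gamma * q
    for ((s, y), rho), a in zip(zip(pairs, rhos), reversed(alphas)):
        b = rho * (y @ r)
        r = r + (a - b) * s
    return -r                          # quasi-Newton search direction
```

In a complete method one would append the newest (s, y) pair after each accepted step and discard the oldest once a fixed memory is exceeded; a restart, in the simplest reading of the lecture's strategy, just clears the stored pairs.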
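For reference, the two quasi-Newton updates the abstract compares are standard. With s_k = x_{k+1} - x_k and y_k = ∇f(x_{k+1}) - ∇f(x_k), the SR1 and BFGS updates of the Hessian approximation B_k read:

```latex
\[
s_k = x_{k+1} - x_k, \qquad y_k = \nabla f(x_{k+1}) - \nabla f(x_k),
\]
\[
B_{k+1}^{\mathrm{SR1}} = B_k
  + \frac{(y_k - B_k s_k)(y_k - B_k s_k)^{\top}}{(y_k - B_k s_k)^{\top} s_k},
\qquad
B_{k+1}^{\mathrm{BFGS}} = B_k
  - \frac{B_k s_k s_k^{\top} B_k}{s_k^{\top} B_k s_k}
  + \frac{y_k y_k^{\top}}{y_k^{\top} s_k}.
\]
```

BFGS preserves positive definiteness whenever y_k^T s_k > 0, which a Wolfe line search guarantees; the SR1 denominator (y_k - B_k s_k)^T s_k, by contrast, can vanish or change sign, so a positive definite B_k may yield an indefinite B_{k+1}. This is the drawback that motivates the restart procedure mentioned in the abstract.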
Saved in:
Main Author: | Abu Hassan, Malik |
---|---|
Format: | Inaugural Lecture |
Language: | English |
Published: | Universiti Putra Malaysia Press, 2007 |
Online Access: | http://psasir.upm.edu.my/id/eprint/41627/1/0001.pdf http://psasir.upm.edu.my/id/eprint/41627/2/0001.pdf http://psasir.upm.edu.my/id/eprint/41627/ |
id | my.upm.eprints.41627 |
---|---|
record_format | eprints |
spelling | Abu Hassan, Malik (2007) Towards large scale unconstrained optimization. [Inaugural Lecture] |
institution | Universiti Putra Malaysia |
building | UPM Library |
collection | Institutional Repository |
continent | Asia |
country | Malaysia |
content_provider | Universiti Putra Malaysia |
content_source | UPM Institutional Repository |
url_provider | http://psasir.upm.edu.my/ |
language | English |
format | Inaugural Lecture |
author | Abu Hassan, Malik |
title | Towards large scale unconstrained optimization |
publisher | Universiti Putra Malaysia Press |
publishDate | 2007 |
url | http://psasir.upm.edu.my/id/eprint/41627/1/0001.pdf http://psasir.upm.edu.my/id/eprint/41627/2/0001.pdf http://psasir.upm.edu.my/id/eprint/41627/ |