Global convergence of a new spectral conjugate gradient by using strong Wolfe line search

Bibliographic Details
Main Authors: Zahrahtul Amani Zakaria; Mustafa Mamat; Mohd Rivaie
Format: Article
Language: English
Published: HIKARI Ltd., 2015
Online Access: http://eprints.unisza.edu.my/6330/1/FH02-FIK-15-03433.jpg
http://eprints.unisza.edu.my/6330/
Description
Abstract: Unconstrained optimization problems can be solved with a few popular methods such as the Conjugate Gradient (CG) method, the Steepest Descent (SD) method, and the Broyden-Fletcher-Goldfarb-Shanno (BFGS) method. The SD method is the simplest, but nowadays the CG method is widely used because of its convergence properties. A set of unconstrained test problems with several different numbers of variables is used to establish the global convergence of a new spectral conjugate gradient method, which is compared with the five most common β_k coefficients proposed by earlier researchers, using an inexact (strong Wolfe) line search.
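To illustrate the general framework the abstract refers to, here is a minimal Python sketch of a spectral conjugate gradient iteration driven by a strong Wolfe line search. It is not the paper's new method: the Fletcher-Reeves β_k and the Barzilai-Borwein-style spectral parameter θ_k below are assumed stand-ins, and scipy.optimize.line_search is used because it returns a step length satisfying the strong Wolfe conditions.

```python
# Illustrative sketch only: Fletcher-Reeves beta_k and a Barzilai-Borwein-style
# spectral theta_k are assumptions, not the coefficients studied in the paper.
import numpy as np
from scipy.optimize import line_search, rosen, rosen_der

def spectral_cg(f, grad, x0, tol=1e-6, max_iter=500):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                   # first direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Step length satisfying the strong Wolfe conditions (c1, c2).
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:                    # line search failed: restart with SD
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0] or 1e-8
        x_new = x + alpha * d
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        beta = (g_new @ g_new) / (g @ g)                       # Fletcher-Reeves beta_k
        theta = (s @ s) / (s @ y) if s @ y > 1e-12 else 1.0    # spectral parameter theta_k
        d = -theta * g_new + beta * d                          # spectral CG direction
        if g_new @ d >= 0:                   # safeguard: keep a descent direction
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage example on the Rosenbrock function; the minimizer is (1, 1).
print(spectral_cg(rosen, rosen_der, [-1.2, 1.0]))
```

Comparing coefficients, as the abstract describes, amounts to swapping the beta formula above for each candidate β_k and running the same strong Wolfe line search over the test problems.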