Bimonthly Since 1986
ISSN 1004-9037
Publication Details
Edited by: Editorial Board of Journal of Data Acquisition and Processing
P.O. Box 2704, Beijing 100190, P.R. China
Sponsored by: Institute of Computing Technology, CAS & China Computer Federation
Undertaken by: Institute of Computing Technology, CAS
Published by: SCIENCE PRESS, BEIJING, CHINA
Distributed by:
China: All Local Post Offices
Abstract
Let f be a continuous nonlinear function on R^n; such functions arise in applications across many fields, and their minimizers may occur at points where f is not differentiable. This work focuses on the case in which the gradient and Hessian of f are difficult to compute at a given value of x. The study introduces a technique that replaces exact derivatives with finite-difference representations of the gradient and Hessian within the quasi-Newton method and derivative-free trust-region methods. If f is shown to have a unique minimizer, the iteration with the generated step length h can be shown to converge globally. Two test problems were implemented in MATLAB. The numerical results demonstrate the efficiency and robustness of the algorithms, which compare favourably with several existing methods.
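
The following is a minimal MATLAB sketch of the general idea described in the abstract, not the authors' code: a quasi-Newton (BFGS-type) iteration in which the gradient is replaced by a central finite-difference approximation with step length h. The Rosenbrock-type test objective, the starting point, the helper functions fd_gradient and backtracking, and all parameter values are illustrative assumptions rather than details taken from the paper.

function quasi_newton_fd_demo
    f  = @(x) 100*(x(2) - x(1)^2)^2 + (1 - x(1))^2;  % example test objective (assumed)
    x  = [-1.2; 1.0];          % starting point (assumed)
    H  = eye(2);               % initial inverse-Hessian approximation
    h  = 1e-6;                 % finite-difference step length
    for k = 1:200
        g = fd_gradient(f, x, h);
        if norm(g) < 1e-6, break; end
        p = -H*g;                                % quasi-Newton search direction
        t = backtracking(f, x, p, g);            % simple Armijo line search
        s = t*p;
        x_new = x + s;
        y = fd_gradient(f, x_new, h) - g;
        if s'*y > 1e-12                          % BFGS update of the inverse Hessian
            rho = 1/(s'*y);
            I = eye(2);
            H = (I - rho*(s*y'))*H*(I - rho*(y*s')) + rho*(s*s');
        end
        x = x_new;
    end
    fprintf('iterations: %d, minimizer: (%.4f, %.4f)\n', k, x(1), x(2));
end

function g = fd_gradient(f, x, h)
    % Central finite-difference approximation of the gradient.
    n = numel(x);
    g = zeros(n, 1);
    for i = 1:n
        e = zeros(n, 1); e(i) = h;
        g(i) = (f(x + e) - f(x - e)) / (2*h);
    end
end

function t = backtracking(f, x, p, g)
    % Armijo backtracking line search (illustrative parameters).
    t = 1; c = 1e-4;
    while f(x + t*p) > f(x) + c*t*(g'*p)
        t = 0.5*t;
        if t < 1e-12, break; end
    end
end

A finite-difference Hessian could be substituted in the same way inside a trust-region subproblem; this sketch shows only the gradient approximation within a quasi-Newton step.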
Keywords
Continuous functions, Differentiability, Quasi-Newton Method, Trust Region Method, Derivative-Free Methods, Optimization Problems