
A Derivative-free Two Level Random Search Method for Unconstrained Optimization

Author : Neculai Andrei
Publisher : Springer Nature
Total Pages : 126
Release : 2021-03-31
ISBN-10 : 3030685187
ISBN-13 : 9783030685188

Book Synopsis: A Derivative-free Two Level Random Search Method for Unconstrained Optimization, by Neculai Andrei

Book excerpt: The book is intended for graduate students and researchers in mathematics, computer science, and operational research. It presents a new derivative-free optimization method/algorithm based on randomly generated trial points in specified domains, where the best points are selected at each iteration according to a number of rules. The method differs from many other well-established methods in the literature and proves competitive for solving unconstrained optimization problems of different structures and complexities with a relatively large number of variables. Intensive numerical experiments with 140 unconstrained optimization problems, with up to 500 variables, have shown that this approach is efficient and robust.

The book is structured into four chapters. Chapter 1 is introductory. Chapter 2 presents a two-level derivative-free random search method for unconstrained optimization; it is assumed that the minimizing function is continuous, lower bounded, and that its minimum value is known. Chapter 3 proves the convergence of the algorithm. Chapter 4 reports the numerical performance of the algorithm on 140 unconstrained optimization problems, of which 16 are real applications; the results show that the optimization process has two phases: a reduction phase and a stalling phase. Finally, the performance of the algorithm on 30 large-scale unconstrained optimization problems with up to 500 variables is presented. These numerical results show that this two-level random search approach is able to solve a large diversity of problems with different structures and complexities.

A number of open problems remain: the selection of the number of trial points and of local trial points, the selection of the bounds of the domains in which the trial points and the local trial points are randomly generated, and a criterion for initiating the line search.
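To make the two-level idea concrete, below is a minimal Python sketch of a random search of this kind: level one samples trial points uniformly over the whole domain and keeps the best; level two samples local trial points in a box around the current best point and shrinks that box when no improvement is found. The function name two_level_random_search, the parameters (n_trial, n_local, local_radius), and the box-halving rule are illustrative assumptions for this sketch, not the exact rules specified in the book.

import numpy as np

def two_level_random_search(f, lower, upper, n_trial=50, n_local=20,
                            local_radius=0.1, max_iter=1000, f_min=None, tol=1e-6):
    # Illustrative sketch of a two-level derivative-free random search;
    # names and update rules are assumptions, not the book's exact algorithm.
    lower = np.asarray(lower, dtype=float)
    upper = np.asarray(upper, dtype=float)
    rng = np.random.default_rng(0)
    dim = lower.size

    # Level 1: sample trial points uniformly over the whole domain, keep the best.
    trials = rng.uniform(lower, upper, size=(n_trial, dim))
    values = np.array([f(x) for x in trials])
    best_x, best_f = trials[values.argmin()], values.min()

    radius = local_radius * (upper - lower)
    for _ in range(max_iter):
        # Level 2: sample local trial points in a box around the current best point.
        local = rng.uniform(best_x - radius, best_x + radius, size=(n_local, dim))
        local = np.clip(local, lower, upper)
        local_values = np.array([f(x) for x in local])
        i = local_values.argmin()
        if local_values[i] < best_f:
            best_x, best_f = local[i], local_values[i]  # reduction phase: progress made
        else:
            radius *= 0.5                               # stalling: shrink the local box
        if f_min is not None and best_f - f_min < tol:  # minimum value assumed known
            break
    return best_x, best_f

# Usage: minimize the two-variable Rosenbrock function, whose minimum value 0 is known.
rosen = lambda x: (1.0 - x[0])**2 + 100.0 * (x[1] - x[0]**2)**2
x_star, f_star = two_level_random_search(rosen, [-2.0, -2.0], [2.0, 2.0], f_min=0.0)
print(x_star, f_star)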


A Derivative-free Two Level Random Search Method for Unconstrained Optimization Related Books

A Derivative-free Two Level Random Search Method for Unconstrained Optimization
Language: en
Pages: 126
Authors: Neculai Andrei
Categories: Mathematics
Type: BOOK - Published: 2021-03-31 - Publisher: Springer Nature


The book is intended for graduate students and researchers in mathematics, computer science, and operational research. The book presents a new derivative-free optimization method based on randomly generated trial points…
Modern Numerical Nonlinear Optimization
Language: en
Pages: 824
Authors: Neculai Andrei
Categories: Mathematics
Type: BOOK - Published: 2022-10-18 - Publisher: Springer Nature


This book includes a thorough theoretical and computational analysis of unconstrained and constrained optimization algorithms…
Introduction to Derivative-Free Optimization
Language: en
Pages: 276
Authors: Andrew R. Conn
Categories: Mathematics
Type: BOOK - Published: 2009-04-16 - Publisher: SIAM


The first contemporary comprehensive treatment of optimization without derivatives. This text explains how sampling and model techniques are used in derivative-free methods…
Implicit Filtering
Language: en
Pages: 171
Authors: C. T. Kelley
Categories: Mathematics
Type: BOOK - Published: 2011-09-29 - Publisher: SIAM


A description of the implicit filtering algorithm, its convergence theory and a new MATLAB® implementation.
Derivative-Free and Blackbox Optimization
Language: en
Pages: 307
Authors: Charles Audet
Categories: Mathematics
Type: BOOK - Published: 2017-12-02 - Publisher: Springer


This book is designed as a textbook, suitable for self-learning or for teaching an upper-year university course on derivative-free and blackbox optimization.