On Derivative-Free Optimisation Methods
Master thesis
Date: 2023-05-15
Abstract
As part of the field of mathematical optimisation, derivative-free optimisation is the study of optimisation methods that are not granted full access to the derivative of the objective function. In this master's thesis, three derivative-free optimisation methods known from the literature are studied: the Nelder–Mead method, the conditional trust-region method and the discrete gradient method. For each method, besides recalling a description and a convergence statement, the focus is on providing motivation and background so that the method can be understood without prior knowledge of derivative-free optimisation. Different notions of differentiability, such as total differentiability and subdifferentiability, are recalled for general use in understanding these methods. As part of the description of the discrete gradient method, Wolfe's method for finding the minimum-norm vector in a convex set is recalled, with a modified statement for which convergence is proven. Each method is accompanied by an implementation for the MATLAB programming and numeric computing platform, or by a reference to an existing implementation. Numerical experiments were performed to compare the quality of variants of the methods.
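To give a concrete flavour of derivative-free optimisation, the following is a textbook-style sketch of the Nelder–Mead method, which maintains a simplex of candidate points and updates it using only objective-function values (no derivatives). This is an illustration in Python rather than the MATLAB implementations accompanying the thesis, using the standard reflection, expansion, contraction and shrink coefficients; it is not the thesis's own code.

```python
def nelder_mead(f, x0, step=0.5, tol=1e-8, max_iter=500):
    """Minimise f starting from x0 using only function evaluations.

    A simplified textbook Nelder-Mead sketch, not a production solver.
    """
    # Standard coefficients: reflection, expansion, contraction, shrink.
    alpha, gamma, rho, sigma = 1.0, 2.0, 0.5, 0.5
    n = len(x0)
    # Initial simplex: x0 plus n points, each perturbed along one axis.
    simplex = [list(x0)]
    for i in range(n):
        p = list(x0)
        p[i] += step
        simplex.append(p)
    for _ in range(max_iter):
        simplex.sort(key=f)  # best vertex first, worst last
        best, worst = simplex[0], simplex[-1]
        if abs(f(worst) - f(best)) < tol:
            break
        # Centroid of all vertices except the worst.
        centroid = [sum(p[i] for p in simplex[:-1]) / n for i in range(n)]
        # Reflect the worst vertex through the centroid.
        refl = [centroid[i] + alpha * (centroid[i] - worst[i]) for i in range(n)]
        if f(best) <= f(refl) < f(simplex[-2]):
            simplex[-1] = refl          # accept the reflection
        elif f(refl) < f(best):
            # Reflection is the new best: try expanding further.
            exp = [centroid[i] + gamma * (refl[i] - centroid[i]) for i in range(n)]
            simplex[-1] = exp if f(exp) < f(refl) else refl
        else:
            # Reflection did not help: contract toward the worst vertex.
            contr = [centroid[i] + rho * (worst[i] - centroid[i]) for i in range(n)]
            if f(contr) < f(worst):
                simplex[-1] = contr
            else:
                # Last resort: shrink the whole simplex toward the best vertex.
                simplex = [best] + [
                    [best[i] + sigma * (p[i] - best[i]) for i in range(n)]
                    for p in simplex[1:]
                ]
    simplex.sort(key=f)
    return simplex[0]
```

On a smooth convex problem such as `f(x, y) = (x - 1)^2 + (y - 2)^2`, this sketch converges to the minimiser `(1, 2)` from a distant starting point without ever evaluating a gradient, which is the defining trait of the methods studied in the thesis.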