
Improving the Flexibility and Robustness of Derivative-Free Optimization Solvers


If you have a question about this talk, please contact J.W.Stevens.

Classical nonlinear optimisation algorithms require gradient evaluations to construct local approximations to the objective and to test for convergence. When the objective is expensive to evaluate or noisy, computing the gradient may be too costly or too inaccurate to be useful, so we must turn to optimisation methods that do not require gradient information, so-called derivative-free optimisation (DFO). DFO has applications in areas such as climate modelling, hyperparameter tuning and generating adversarial examples in deep learning. In this talk, I will introduce DFO and discuss two DFO software packages, one for nonlinear least-squares problems and one for general minimisation problems. I will describe their novel features aimed at expensive and/or noisy problems, and show their state-of-the-art performance. Time permitting, I will also present a heuristic method that improves the ability of these methods to escape local minima, and show its favourable performance on global optimisation problems.
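To make the idea concrete, here is a minimal sketch of one of the simplest derivative-free methods, compass (coordinate pattern) search. This is purely illustrative and is not the software discussed in the talk; it shows how a DFO method makes progress using only function values, polling a few directions and shrinking the step size when no polled point improves on the current one.

```python
def compass_search(f, x0, step=1.0, tol=1e-8, max_evals=10_000):
    """Minimise f using only function evaluations (no gradients).

    Illustrative compass search: poll the 2n coordinate directions
    +/- e_i; accept any improving point, otherwise halve the step.
    """
    x = list(x0)
    fx = f(x)
    evals = 1
    n = len(x)
    while step > tol and evals < max_evals:
        improved = False
        for i in range(n):
            for sign in (+1.0, -1.0):
                y = x[:]
                y[i] += sign * step
                fy = f(y)
                evals += 1
                if fy < fx:          # accept any improving poll point
                    x, fx = y, fy
                    improved = True
        if not improved:
            step *= 0.5              # no improvement: refine the mesh
    return x, fx

# Example: minimise a smooth quadratic whose minimiser is (1, -2).
sol, val = compass_search(lambda v: (v[0] - 1.0)**2 + (v[1] + 2.0)**2,
                          [0.0, 0.0])
```

Practical DFO solvers are far more sophisticated (e.g. building interpolation models of the objective inside a trust region), but the defining feature is the same: progress is driven entirely by comparing function values, never gradients.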

This talk is part of the CMIH Hub seminar series.


© 2006-2024 Talks.cam, University of Cambridge.