Parameter optimisation with mcasopt

This section introduces the core concepts of mcasopt:

  • Defining target criteria;

  • Defining an error metric for MCAS outputs (the objective function);

  • Using optimisation methods to search for improved parameter estimates; and

  • How mcasopt ties this all together.

The mcasopt package uses a numerical optimisation method to find an approximately optimal model solution over a chosen subset of the model parameters.

This requires the user to define four things (a minimal sketch follows the list):

  • An input file that defines values for all of the model parameters;

  • A criteria file that defines the desired output;

  • A function that takes a vector of free parameters and updates the corresponding model parameters (i.e., determines the contents of a new MCAS input file); and

  • A vector of initial values for these free parameters.
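
To make these requirements concrete, the sketch below shows how the last two items might look in Python. The names here (base_params, run_mcas, targets, and the specific parameter names) are hypothetical placeholders rather than part of the mcasopt API; the real mapping between free parameters and MCAS inputs depends on your model.

    import numpy as np

    # Hypothetical stand-ins for the real MCAS interface: base_params would
    # come from the user's input file, targets from the criteria file, and
    # run_mcas would write a new input file and execute the model.
    base_params = {"recharge_rate": 0.5, "decay_constant": 0.1, "porosity": 0.3}
    targets = np.array([1.0, 2.0])

    def run_mcas(params):
        # Placeholder model: returns two synthetic "outputs".
        return np.array([params["recharge_rate"] * 2.0,
                         params["decay_constant"] + params["porosity"]])

    def update_params(x, base=base_params):
        """Map the free-parameter vector x onto a full parameter set."""
        params = dict(base)
        params["recharge_rate"] = x[0]
        params["decay_constant"] = x[1]
        return params

    def objective(x):
        """Error metric: sum of squared errors against the target criteria."""
        outputs = run_mcas(update_params(x))
        return float(np.sum((outputs - targets) ** 2))

    x0 = np.array([0.5, 0.1])  # initial values for the free parameters

The objective function here uses a sum of squared errors; any scalar error metric over the MCAS outputs would serve the same role.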

Optimisation methods

The SciPy library provides a range of optimisation methods (see the reference documentation and tutorial for details). Some of these methods only require function evaluations, some require calculating or estimating the objective function’s Jacobian, and some require calculating the objective function’s Hessian. Depending on the method, bounds constraints, linear constraints, and/or non-linear constraints may also be provided.
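
As an illustration of these differences, the sketch below minimises SciPy's built-in Rosenbrock test function with two methods: one that needs only function evaluations (Nelder-Mead) and one that uses the Jacobian (BFGS). This is a generic SciPy example, not mcasopt-specific code.

    from scipy.optimize import minimize, rosen, rosen_der

    x0 = [1.3, 0.7, 0.8, 1.9, 1.2]

    # Nelder-Mead requires only function evaluations.
    res_nm = minimize(rosen, x0, method="Nelder-Mead")

    # BFGS uses the Jacobian; passing jac is optional, since SciPy will
    # estimate it by finite differences if it is omitted.
    res_bfgs = minimize(rosen, x0, method="BFGS", jac=rosen_der)

    # Compare how many function evaluations each method needed.
    print(res_nm.nfev, res_bfgs.nfev)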

Broyden-Fletcher-Goldfarb-Shanno algorithm

Perhaps the most appropriate starting point is the Broyden-Fletcher-Goldfarb-Shanno (BFGS) algorithm, which uses the Jacobian (or estimates it using finite differences) and typically requires fewer function evaluations than the gradient-free simplex algorithm, even when the Jacobian must be estimated. A variant of this method, L-BFGS-B, is also provided and supports simple bound constraints to ensure that, e.g., parameters are non-negative.
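
For example, a bound-constrained run might look like the following; the Rosenbrock test function again stands in for a real MCAS error metric.

    from scipy.optimize import minimize, rosen

    x0 = [1.3, 0.7, 0.8, 1.9, 1.2]

    # Constrain every free parameter to be non-negative; None means no
    # upper bound.
    bounds = [(0.0, None)] * len(x0)

    result = minimize(rosen, x0, method="L-BFGS-B", bounds=bounds)
    print(result.x, result.fun)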

Note

Experimentation will likely be required to identify the most suitable minimisation method for a given problem.