Talk on JuliaSmoothOptimizers and the NLPModel API

Yesterday was the 10th anniversary of Julia and, yes, Valentine's Day as well. To celebrate, we had the opportunity to present JuliaSmoothOptimizers at the JuMP-dev seminar (the JuMP nonlinear developers call) and to talk about the API for nonlinear models: The JuliaSmoothOptimizers ecosystem.

It was a great experience to showcase several features of the organization. The API defined in NLPModels.jl has two main advantages:

  • It defines a very abstract API covering everything you need from a continuous optimization model: operator derivatives, sparse derivatives, and in-place functions (see the first sketch after this list).
  • It is also compatible with all the numeric types available in Julia. For instance, in OptimizationProblems.jl, you can access test problems defined with automatic differentiation over any type just by changing a keyword (see the second sketch after this list). Alexis Montoison, Ph.D. student at GERAD/Polytechnique Montréal, showed during the presentation how this API can also be used on GPUs.
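
As a minimal sketch of the first point, here is how a model built with ADNLPModels.jl can be queried through the NLPModels.jl API; the problem itself is just an illustrative Rosenbrock-type function, not one from the talk:

```julia
using ADNLPModels, NLPModels

# Build a model by automatic differentiation from a plain Julia function
f(x) = (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2
nlp = ADNLPModel(f, [-1.2; 1.0])

x0 = nlp.meta.x0
fx = obj(nlp, x0)          # objective value
gx = grad(nlp, x0)         # gradient (allocating)
g  = similar(x0)
grad!(nlp, x0, g)          # in-place gradient
Hop = hess_op(nlp, x0)     # Hessian as a linear operator
Hv  = Hop * gx             # Hessian-vector product without forming the matrix
```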

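For the second point, a hedged sketch of switching the element type of a test problem from OptimizationProblems.jl; the `woods` problem and the `type` keyword come from that package, but the exact keyword form (`Float32` versus `Val(Float32)`) may differ between package versions:

```julia
using OptimizationProblems, ADNLPModels, NLPModels

# Same test problem instantiated over two floating-point types
nlp64 = OptimizationProblems.ADNLPProblems.woods()                      # Float64 by default
nlp32 = OptimizationProblems.ADNLPProblems.woods(type = Val(Float32))   # single-precision variant

obj(nlp32, nlp32.meta.x0)  # the whole API now works in Float32
```
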
Everything related to the presentation is available here, and you can watch the YouTube video on The Julia Programming Language channel.