I posted on HAL a work on a numerical method for the absolute value equation, in collaboration with Lina Abdallah (Tripoli, Lebanon) and Mounir Haddou, entitled Solving Absolute Value Equation using Complementarity and Smoothing Functions. EDIT: The paper has been accepted in the Journal of Computational and Applied Mathematics (doi: 10.1016/j.cam.2017.06.019).
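For readers unfamiliar with the problem, the absolute value equation (AVE) asks for x in R^n such that Ax - |x| = b, with |x| taken componentwise. The classical complementarity reformulation behind this kind of approach (sketched here from the standard splitting; see the paper for the precise statement) reads:

\[
Ax - |x| = b
\quad\Longleftrightarrow\quad
\begin{cases}
A(x^+ - x^-) - (x^+ + x^-) = b, \\
x^+ \ge 0, \quad x^- \ge 0, \quad \langle x^+, x^- \rangle = 0,
\end{cases}
\]

where x = x^+ - x^-, so that |x| = x^+ + x^- on the complementary pair; smoothing functions then approximate the complementarity condition.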
To celebrate the new year, in January I gave a seminar talk at the Departmental Colloquium Series in Guelph, where I motivated the generalized Nash equilibrium problem (GNEP) and presented some of our findings with Monica Cojocaru. In particular, it was based on our newly accepted paper, A dynamical system approach to the generalized Nash equilibrium problem, in the open-access Journal of Nonlinear and Variational Analysis. The paper reviews the use of nonsmooth dynamical systems to find generalized Nash equilibria.
I started a two-year postdoc position in Guelph under the supervision of Monica Cojocaru. I will work on bringing optimization methods to Generalized Nash Equilibrium Problems.
I gave a talk about interior-point methods for monotone linear complementarity problems at the Journées SMAI-MODE 2016, which were organized in Toulouse on March 23-25, 2016; see the slides. You can find a preprint of this work on HAL. UPDATE Jan. 2018: the paper has been accepted in Optimization Letters (doi: 10.1007/s11590-018-1241-2).
I am very happy to announce the publication in the Journal of Open Source Software of the paper PDENLPModels.jl: An NLPModel API for optimization problems with PDE-constraints.
I’m thrilled to share two major milestones in my recent work within the Julia ecosystem. First, I presented the latest developments in optimization solvers at JuMP-dev 2024, and second, my paper on JSOSuite.jl was accepted in The Proceedings of the JuliaCon Conferences.
These two achievements highlight both the ongoing evolution of JuliaSmoothOptimizers (JSO) and its growing impact on large-scale nonlinear optimization problems.
JuMP-dev 2024: Advancing Nonlinear Optimization with JuliaSmoothOptimizers
This year’s JuMP-dev workshop, held in Montreal, independently from JuliaCon for the first time, offered a focused platform for deep dives into JuMP and its surrounding tools. In my presentation, I discussed the latest progress within the JuliaSmoothOptimizers (JSO) ecosystem; my slides and the replay are available online.
At the core of my talk was an introduction to new solvers and packages such as AdaptiveRegularization.jl, which tackles the challenges of large-scale optimization using Adaptive Regularization with Cubics (ARC). I emphasized the following key innovations (a minimal usage sketch follows the list):
Automatic Differentiation (AD) support and integration with JuMP for easier problem modeling.
Memory pre-allocation for in-place solvers, reducing runtime overhead.
Support for multi-precision solvers and GPU-based computations, essential for modern large-scale applications.
The value of factorization-free solvers, which excel in tackling large, complex problems, such as those in discretized PDE-constrained optimization.
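To make these points concrete, here is a minimal sketch of the AD-based, factorization-free workflow, assuming the standard ADNLPModels.jl and JSOSolvers.jl entry points (ADNLPModel and trunk); exact options may differ between versions:

# Minimal sketch: model a problem with automatic differentiation and
# solve it with a factorization-free solver from JSO.
using ADNLPModels, JSOSolvers

# Rosenbrock function modeled with AD; derivatives come for free.
f(x) = (x[1] - 1)^2 + 100 * (x[2] - x[1]^2)^2
nlp = ADNLPModel(f, [-1.2; 1.0])

# trunk is a trust-region Newton-CG method: it only needs
# Hessian-vector products, never the full Hessian matrix.
stats = trunk(nlp)
println(stats.status, " at x = ", stats.solution)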
For newcomers to JSO, JSOSuite.jl serves as a critical entry point, simplifying solver selection and benchmarking through automatic algorithm matching. This tool eliminates the complexity of choosing from multiple solvers by providing a user-friendly interface that adapts to the problem at hand. My talk also touched on the broader adoption and longevity of JSO, which now spans over 50 registered packages, making it one of the most comprehensive platforms for numerical optimization.
JSOSuite.jl covers a range of problem types—from unconstrained to generally-constrained and least-squares problems—and eliminates the need for users to understand the intricate details of individual solvers. Instead, the package conducts a preliminary analysis of the problem and automatically selects the most appropriate solver, offering significant advantages to both experienced practitioners and newcomers alike.
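For illustration, a call can be as simple as the sketch below, assuming the minimize entry point described in the JSOSuite.jl documentation (the exact keyword interface may vary between versions):

# Sketch of JSOSuite's one-line interface: the package analyzes the
# problem and selects an appropriate solver by itself.
using JSOSuite

# Unconstrained problem given as a plain Julia function and a start point.
stats = minimize(x -> (x[1] - 1)^2 + 4 * (x[2] + 3)^2, zeros(2))
println(stats.solution)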
This paper builds on the innovations within JSO, reinforcing its versatility and ease of use across various fields and applications. The package is a natural fit for researchers who need efficient, reliable solvers without the overhead of manually configuring them for different problem types.
Looking Forward
Both my presentation at JuMP-dev 2024 and the publication of the JSOSuite.jl paper reflect the significant strides made by the JuliaSmoothOptimizers organization over the past year. The JSO ecosystem is positioned to continue driving innovation in the field of numerical optimization.
I’m excited to see how these advancements will be applied across diverse optimization problems in the coming years and look forward to continuing this journey with the JSO community.
I am thrilled to share that the article Scalable adaptive cubic regularization methods has been published in the journal Mathematical Programming, Series A. It has been a really exciting journey with my co-authors Jean-Pierre Dussault and Dominique Orban, and I hope this work will help explore the numerical possibilities of ARC methods. The proposed implementation is a perfect fit for large-scale applications: it solves the subproblem inexactly and only requires Hessian-vector products, so there is no need to evaluate and store the Hessian matrix. As usual, the code is written in Julia and is available in the paper folder of the GitHub repository AdaptiveRegularization.jl. The full-text published version is available here, enjoy!
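To illustrate what factorization-free means in practice, here is a small sketch using the NLPModels.jl API (hprod), which is how a solver can access Hessian-vector products without ever forming the Hessian; this is an illustration, not the paper's benchmark code:

# Sketch: Hessian-vector products via the NLPModels API. The solver
# only needs products H(x)*v, so the n-by-n Hessian is never stored.
using ADNLPModels, NLPModels

nlp = ADNLPModel(x -> sum(100 * (x[i+1] - x[i]^2)^2 + (x[i] - 1)^2
                          for i in 1:length(x)-1), zeros(1000))

x = nlp.meta.x0
v = ones(nlp.meta.nvar)
Hv = hprod(nlp, x, v)   # product of the Hessian at x with v
println(length(Hv))     # a length-1000 vector, no 1000-by-1000 matrix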
In the world of numerical optimization, the quest for efficiency, accuracy, and innovation never ceases. At Polytechnique Montréal, students embarking on their educational journey have a unique opportunity to explore this dynamic field through the course MTH8408 Méthodes d’optimisation et contrôle optimal. This course, which I had the privilege of teaching during the winter session of 2023, delves into the depths of numerical methods for optimization, variational calculus, and optimal control.
I am thrilled to share my experiences from the recent JuMP-dev workshop that took place at JuliaCon 2023, held at MIT in Boston, USA. As a passionate researcher in numerical optimization, I found this year's conference particularly special, as it marked my first in-person attendance after the previous year's online edition (read about it here). You can watch the replay of my talk on YouTube.
The package FletcherPenaltySolver.jl is now a registered Julia package! I am very happy about this one because it has been a long project, and the solver works great for large problems. Besides, we always teach penalty methods at university, but efficient implementations are scarce.
I showed how to apply the JuliaSmoothOptimizers framework to PDE-constrained optimization problems modeled with PDENLPModels.jl at this year’s JuliaCon 2022/JuMP-dev 2022. The conference featured three talks on JuliaSmoothOptimizers in the JuMP-dev stream.
I taught, in French, the class MTH8408 Méthodes d’optimisation et contrôle optimal on numerical methods for optimization, variational calculus, and optimal control during the winter session at Polytechnique Montréal.
We organized a stream of sessions on “Numerical optimization and linear algebra with Julia” at Optimization Days/Journées de l’optimisation 2022, held at HEC Montréal, May 16-18, 2022. The conference, renowned for its optimization expertise and wine & cheese party, was held in person for the first time since 2019!
I am starting a new year in Montreal, and one of the main projects for this year is to finalize an ecosystem, hosted under the JuliaSmoothOptimizers umbrella, for optimization problems with partial differential equations in the constraints.
The package RandomLinearAlgebraSolvers.jl is now a registered Julia package. It contains randomized iterative methods for linear algebra and random projectors, used to solve linear systems in the sense of the Johnson–Lindenstrauss lemma. The package is still at an early stage and new contributions are very welcome; it uses Stopping.jl as a framework for iterative methods. The sketch below illustrates the underlying idea; see the package’s documentation for complete, up-to-date examples.
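A minimal plain-Julia sketch of randomized sketching in the Johnson–Lindenstrauss sense (this illustrates the idea only and does not use the package's own API):

# Compress a tall least-squares system with a random Gaussian
# projector S, then solve the small sketched problem. Illustration
# only; RandomLinearAlgebraSolvers.jl has its own API for this.
using LinearAlgebra, Random

Random.seed!(0)
m, n, k = 10_000, 50, 500            # tall system, sketch size k << m
A, b = randn(m, n), randn(m)

S = randn(k, m) / sqrt(k)            # random projector (JL embedding)
x_sketch = (S * A) \ (S * b)         # solve the compressed problem
x_exact  = A \ b

println(norm(x_sketch - x_exact) / norm(x_exact))  # small relative error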
As part of my postdoc at IVADO, I presented my work on solving PDE-constrained optimization problems in Julia during IVADO’s annual conference, IVADO Digital October. It was a 3-minute presentation, and a great challenge for such a complex problem! The video is now available on YouTube, and, as usual, there is more on the various packages on GitHub @tmigot.
I am starting a new postdoc position at Polytechnique Montréal with Dominique Orban on solvers for optimization problems involving PDE constraints in Julia, with data science applications. This new position is funded by a competitive grant from IVADO, the institute for data valorization in Montreal.
MPCC.jl: a set of tools to model mathematical programs with complementarity/switching/vanishing constraints, following the NLPModels structure, with basic tools to use the Stopping framework (the problem class is sketched after this list).
MPCCSolver.jl: a set of algorithms to solve the models from MPCC.jl.
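For reference, the problem class these packages target, the mathematical program with complementarity constraints (MPCC), has the standard form below (switching and vanishing constraints are close variants; this is the classical formulation, not code from the packages):

\[
\begin{aligned}
\min_{x \in \mathbb{R}^n} \quad & f(x) \\
\text{s.t.} \quad & g(x) \le 0, \quad h(x) = 0, \\
& 0 \le G(x) \perp H(x) \ge 0,
\end{aligned}
\]

where the last line means G_i(x) >= 0, H_i(x) >= 0, and G_i(x) H_i(x) = 0 for each i.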
I also had the pleasure to present A differential inclusion approach to mineral precipitation-dissolution reactions in geochemistry, in collaboration with Jocelyne Erhel and Bastien Hamlat, at the online Workshop Variational Methods in Nonlinear Phenomena. This work is part of Bastien’s Ph.D. thesis, which he successfully defended this month. Congrats!
The new version of Stopping.jl is now officially registered in Julia. It is the first stable version, and I will soon release more code using it for MPCC and GNEP. The main page of the project is here: Stopping.jl
I gave a talk entitled “Stopping.jl: A framework to implement iterative optimization algorithms” at the Journées de l’optimisation 2019 in Montréal, Québec, in the session Optimization in Julia, to introduce our new Julia package.
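As a taste of the framework, here is a sketch of the typical Stopping.jl loop, assuming the NLPStopping type and the update_and_start!/update_and_stop! functions from the version I used (check the documentation for the current API):

# Sketch of an iterative method instrumented with Stopping.jl: the
# Stopping object centralizes optimality and resource checks so the
# algorithm itself stays a bare loop. API names are assumptions from
# the version discussed here.
using ADNLPModels, NLPModels, Stopping

nlp = ADNLPModel(x -> (x[1] - 1)^2 + (x[2] + 2)^2, zeros(2))
stp = NLPStopping(nlp)

x = copy(nlp.meta.x0)
OK = update_and_start!(stp, x = x, gx = grad(nlp, x))
while !OK
    x -= 0.1 * grad(nlp, x)                            # placeholder gradient step
    OK = update_and_stop!(stp, x = x, gx = grad(nlp, x))  # test all criteria
end
println(status(stp))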
I am back in Sherbrooke for three months after winning a grant from the interdisciplinary FRQNT program “Programme de bourses d’excellence pour étudiants étrangers”. I will continue working on a solver for degenerate nonlinear programs in Julia in collaboration with Jean-Pierre Dussault.
I gave a talk on regularization methods for MPCC and their extension to MPVC at ParaoptXI (September 19-22) in Prague. UPDATE: you can now find a preprint on this subject here. UPDATE 2: The paper is published in Optimization (doi: 10.1080/02331934.2018.1542531).
I posted on HAL a work, in collaboration with Jean-Pierre Dussault, Mounir Haddou, and Abdeslam Kadrani, on relaxation methods for mathematical programs with complementarity constraints and the approximate resolution of their subproblems, entitled: How to Compute a Local Minimum of the MPCC. I will present part of this work at the conference EUROPT 2017 in Montréal.
I gave a talk about the butterfly relaxation method for mathematical programs with complementarity constraints at the INFORMS 2016 Annual Meeting, which was organized in Nashville (Tennessee, US), the city of music, on November 14-16, 2016; see the slides.
When I started my Ph.D. journey in numerical optimization back in 2014, something really stood out to me: despite the abundance of scientific papers discussing algorithms and their numerical results, the availability of corresponding open-source code lagged far behind.
I was on the organizing committee of the conference HJ2016: Hamilton-Jacobi Equations: new trends and applications, which took place from May 30 to June 3, 2016, in Rennes (France). I also presented a poster there on “A new relaxation method for the mathematical program with complementarity constraints.”
This project holds a special place in my heart as it touches on the very applications in geochemistry that first drew me into research. Equilibrium reactions, particularly in slow processes like the water cycle in aquifers, have always fascinated me. Moreover, this paper represents an important milestone for one of the authors, Bastien, as it was part of his Ph.D. thesis. The use of projected dynamical systems, a model I am particularly fond of, adds an additional layer of personal significance to this work.