About me
I am a computational scientist specializing in solving challenging mathematical problems and their practical applications; my primary expertise is numerical optimization. I focus on nonconvex problems, including mathematical programs with complementarity constraints, bilevel optimization, and generalized Nash games, as well as nonsmooth problems arising in sparse optimization. I use optimal control and nonsmooth dynamical systems in applications as diverse as game theory and geochemistry. An important aspect of my research is making algorithms accessible and competitive for the large-scale problems encountered in machine learning, earth sciences, and life sciences.
I am always open to new collaborations and opportunities, so feel free to reach out!
Future plans
- ICCOPT 2025, July 21-24 2025, Los Angeles, USA
- JuMP-dev 2025, November 17-20 2025, Auckland, New Zealand
News
Presenting at JuMP-dev 2024 and Publishing in JuliaCon 2023 Proceedings
Published:
I’m thrilled to share two major milestones in my recent work within the Julia ecosystem. First, I presented the latest developments in optimization solvers at JuMP-dev 2024, and second, my paper on JSOSuite.jl was accepted in The Proceedings of the JuliaCon Conferences.
These two achievements highlight both the ongoing evolution of JuliaSmoothOptimizers (JSO) and its growing impact on large-scale nonlinear optimization problems.
JuMP-dev 2024: Advancing Nonlinear Optimization with JuliaSmoothOptimizers
This year’s JuMP-dev workshop, held independently from JuliaCon for the first time in Montreal, offered a focused platform for deep dives into JuMP and its surrounding tools. In my presentation, I discussed the latest progress within the JuliaSmoothOptimizers (JSO) ecosystem; my slides and the replay are available online.
At the core of my talk was an introduction to new solvers and packages such as AdaptiveRegularization.jl, which tackles the unique challenges of large-scale optimization problems using adaptive regularization with cubics (ARC). I emphasized the following key innovations:
- Automatic Differentiation (AD) support and integration with JuMP for easier problem modeling.
- Memory pre-allocation for in-place solvers, reducing runtime overhead.
- Support for multi-precision solvers and GPU-based computations, essential for modern large-scale applications.
- The value of factorization-free solvers, which excel in tackling large, complex problems, such as those in discretized PDE-constrained optimization.
For newcomers to JSO, JSOSuite.jl serves as a critical entry point, simplifying solver selection and benchmarking through automatic algorithm matching. This tool eliminates the complexity of choosing from multiple solvers by providing a user-friendly interface that adapts to the problem at hand. My talk also touched on the broader adoption and longevity of JSO, which now spans over 50 registered packages, making it one of the most comprehensive platforms for numerical optimization.
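To give a flavor of what this looks like in practice, here is a minimal sketch of solving an unconstrained problem through JSOSuite.jl. The `minimize` entry point and the fields of the returned statistics are my recollection of the package interface, so treat the exact names as indicative and check the JSOSuite.jl documentation.

```julia
using JSOSuite

# Rosenbrock function written as a plain Julia function.
f(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
x0 = [-1.2; 1.0]

# JSOSuite analyzes the problem and picks a suitable JSO solver automatically
# (entry-point name taken from the package documentation; see the docs for details).
stats = minimize(f, x0)

println(stats.status)    # e.g. :first_order when a stationary point is found
println(stats.solution)  # approximate minimizer
```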
JuliaCon 2023: JSOSuite.jl – Simplifying Continuous Optimization
While JuMP-dev 2024 focused on recent developments, my publication in The Proceedings of the JuliaCon Conferences looks at the core philosophy and implementation behind JSOSuite.jl. Titled JSOSuite.jl: Solving Continuous Optimization Problems with JuliaSmoothOptimizers, the paper introduces JSOSuite.jl as a package designed to bring ease-of-use to complex optimization challenges.
JSOSuite.jl covers a range of problem types—from unconstrained to generally-constrained and least-squares problems—and eliminates the need for users to understand the intricate details of individual solvers. Instead, the package conducts a preliminary analysis of the problem and automatically selects the most appropriate solver, offering significant advantages to both experienced practitioners and newcomers alike.
This paper builds on the innovations within JSO, reinforcing its versatility and ease of use across various fields and applications. The package is a natural fit for researchers who need efficient, reliable solvers without the overhead of manually configuring them for different problem types.
Looking Forward
Both my presentation at JuMP-dev 2024 and the publication of the JSOSuite.jl paper reflect the significant strides made by the JuliaSmoothOptimizers organization over the past year. The JSO ecosystem is positioned to continue driving innovation in the field of numerical optimization.
I’m excited to see how these advancements will be applied across diverse optimization problems in the coming years and look forward to continuing this journey with the JSO community.
New Preprint on HAL: Exploring Projected Dynamical Systems in Geochemical Reactions
Published:
This project holds a special place in my heart as it touches on the very applications in geochemistry that first drew me into research. Equilibrium reactions, particularly in slow processes like the water cycle in aquifers, have always fascinated me. Moreover, this paper represents an important milestone for one of the authors, Bastien, as it was part of his Ph.D. thesis. The use of projected dynamical systems, a model I am particularly fond of, adds an additional layer of personal significance to this work.
Performance Profile Benchmarking Tool
Published:
The Dolan-Moré performance profile is a standard tool for comparing the performance of several algorithms across a set of test problems.
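As a quick refresher, the sketch below computes the profile curves directly from a matrix of solve times (one row per problem, one column per solver); it is a generic illustration of the Dolan-Moré definition, not the code of the benchmarking tool itself.

```julia
# Generic illustration of the Dolan-Moré performance profile:
# rho_s(tau) = fraction of problems on which solver s is within a factor tau
# of the best solver, i.e. T[p, s] <= tau * min over solvers of T[p, :].

"""
    performance_profile_curve(T, s, taus)

Given a cost matrix `T` (T[p, s] = time of solver `s` on problem `p`, `Inf`
for failures), return rho_s(tau) for each value in `taus`.
"""
function performance_profile_curve(T::AbstractMatrix, s::Integer, taus)
    np = size(T, 1)
    best = minimum(T, dims = 2)          # best cost per problem
    ratios = T[:, s] ./ vec(best)        # performance ratios r[p, s]
    return [count(ratios .<= tau) / np for tau in taus]
end

# Example: 3 problems, 2 solvers (Inf marks a failure of solver 2 on problem 3).
T = [1.0 2.0; 3.0 3.0; 2.0 Inf]
taus = 1.0:0.5:3.0
println(performance_profile_curve(T, 1, taus))  # solver 1 dominates here
println(performance_profile_curve(T, 2, taus))
```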
Empowering Research: The Vital Role of Citing Research Software for Reproducibility and Innovation
Published:
As I started my Ph.D. journey in numerical optimization back in 2014, I noticed something that really stood out to me: despite the abundance of scientific papers discussing algorithms and their numerical results, the availability of corresponding open-source codes lagged far behind.
ARCqK published in Mathematical Programming
Published:
I am thrilled to share that the article Scalable adaptive cubic regularization methods has been published in the journal Mathematical Programming, Series A. This has been a really exciting journey with my co-authors Jean-Pierre Dussault and Dominique Orban, and I hope this work will help explore the numerical possibilities of ARC methods. The proposed implementation is a perfect fit for large-scale applications as it solves the subproblem inexactly and requires only Hessian-vector products, so there is no need to evaluate and store the Hessian matrix. As usual, the code is written in Julia and is available in the folder paper of the GitHub repository AdaptiveRegularization.jl. The full-text published version is available here, enjoy!
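To illustrate the matrix-free idea, here is a small sketch using the generic NLPModels.jl API that the solver builds on: Hessian-vector products are obtained with `hprod` (or wrapped as a linear operator with `hess_op`) without ever assembling the Hessian. The Rosenbrock model below is just an illustrative stand-in; see the repository's documentation and the paper folder for the actual solver calls.

```julia
using ADNLPModels, NLPModels

# A small smooth test problem modeled with automatic differentiation.
f(x) = (1 - x[1])^2 + 100 * (x[2] - x[1]^2)^2
nlp = ADNLPModel(f, [-1.2; 1.0])

x = nlp.meta.x0
v = [1.0; -1.0]

# Hessian-vector product H(x) * v, computed without assembling H(x).
Hv = hprod(nlp, x, v)

# Equivalently, hess_op wraps the product in a linear operator,
# the form consumed by matrix-free (factorization-free) solvers.
H = hess_op(nlp, x)
println(Hv ≈ H * v)   # true
```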
Looking Back at the Winter Session
Published:
In the world of numerical optimization, the quest for efficiency, accuracy, and innovation never ceases. At Polytechnique Montréal, students embarking on their educational journey have a unique opportunity to explore this dynamic field through the course MTH8408 Méthodes d’optimisation et contrôle optimal. This course, which I had the privilege of teaching during the winter session of 2023, delves into the depths of numerical methods for optimization, variational calculus, and optimal control.
Unveiling JuliaSmoothOptimizers at JuMP-dev Workshop, JuliaCon 2023
Published:
I am thrilled to share my experiences from the recent JuMP-dev workshop that took place at JuliaCon 2023, held at MIT in Cambridge, Massachusetts, USA. As a passionate researcher in numerical optimization, I found this year’s conference particularly special, as it marked my first in-person attendance after the previous year’s online edition (read about it here). You can watch the replay of my talk on YouTube.
PDENLPModels.jl published in JOSS
Published:
I am very happy to announce the publication in the Journal of Open Source Software of the paper PDENLPModels.jl: A NLPModel API for optimization problems with PDE-constraints.
A new package: FletcherPenaltySolver.jl
Published:
The package FletcherPenaltySolver.jl is now a Julia package! I am very happy about this one because it has been a long project, and the solver works great on large problems. Besides, we always teach penalty methods at university, but efficient implementations are scarce.
JuliaSmoothOptimizers at the JuliaCon 2022
Published:
I showed how to apply the JuliaSmoothOptimizers framework to PDE-constrained optimization problems modeled with PDENLPModels.jl at this year’s JuliaCon 2022/JuMP-dev 2022. The conference featured three talks on JuliaSmoothOptimizers in the JuMP-dev stream.
Looking Back at the Winter Session
Published:
I taught the class MTH8408 Méthodes d’optimisation et contrôle optimal, in French, during the winter session at Polytechnique Montréal, covering numerical methods for optimization, variational calculus, and optimal control.
JuliaSmoothOptimizers at Optimization Days Montréal
Published:
We organized a stream of sessions on "Numerical optimization and linear algebra with Julia" at Optimization Days/Journées de l’optimisation 2022, held at HEC Montréal, May 16-18, 2022. The conference, renowned for its optimization expertise and wine & cheese party, was held in person for the first time since 2019!
Talk on JuliaSmoothOptimizers and the NLPModel API
Published:
Yesterday was… the 10th anniversary of Julia and, yes, Valentine’s Day too. To celebrate, we had the opportunity to present JuliaSmoothOptimizers and its API for nonlinear models at the JuMP-dev seminar, the JuMP nonlinear developers call: The JuliaSmoothOptimizers ecosystem.
DCISolver.jl published in JOSS
Published:
Last December, we submitted a paper related to the package DCISolver.jl to the Journal of Open Source Software, and it is now published!
Partially differentiable projects
Published:
I am starting a new year in Montreal, and one of the main projects for this year is to finalize an ecosystem for optimization problems with partial differential equations in the constraints. The ecosystem is hosted under the JuliaSmoothOptimizers umbrella and contains:
- PDENLPModels.jl, a package to model PDE-constrained optimization problems using Gridap.jl finite-element discretizations of the PDE.
- PDEOptimizationProblems, a collection of 39 problems modeled with PDENLPModels, including most of the problems from the COPS test set.
A new package: RandomLinearAlgebraSolvers.jl
Published:
The package RandomLinearAlgebraSolvers.jl is now a Julia package. It contains randomized iterative methods for linear algebra and random projectors, in the sense of the Johnson–Lindenstrauss lemma, used to solve linear systems. The package is still at an early stage, and new contributions are very welcome. An example is available in the package’s documentation. The package uses Stopping.jl as a framework for iterative methods.
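Since the documented example is not reproduced here, the snippet below is only a generic illustration of the sketch-and-solve idea behind such random projections, using plain LinearAlgebra rather than the package's own API: a Gaussian sketching matrix compresses an overdetermined least-squares problem before solving it.

```julia
using LinearAlgebra, Random

Random.seed!(0)

# Overdetermined least-squares problem: minimize ||A x - b||.
m, n = 2000, 50
A = randn(m, n)
xtrue = randn(n)
b = A * xtrue + 0.01 * randn(m)

# Gaussian random projector (Johnson–Lindenstrauss-style sketch): k ≪ m rows.
k = 200
S = randn(k, m) / sqrt(k)

# Sketch-and-solve: solve the much smaller projected problem.
xsketch = (S * A) \ (S * b)

# Compare against the full least-squares solution.
xls = A \ b
println(norm(xsketch - xls) / norm(xls))  # small relative error
```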
IVADO, Digital October
Published:
As part of my postdoc at IVADO, I presented my work on solving PDE-constrained optimization problems in Julia during IVADO’s annual conference, IVADO Digital October. It was a 3-minute presentation, quite a challenge for such a difficult problem! The video is now available on YouTube, and, as usual, there is more on the various packages on GitHub @tmigot.
Winter 2021: News and Publications
Published:
Some news following our work in Guelph: an article on the CEPS Guelph website presents some nice applications of GNEPs.
New Position in Montréal
Published:
I am starting a new postdoc position in Polytechnique Montréal with Dominique Orban on solvers for optimization problems involving PDE constraints in Julia and data science applications. This new position is funded by a competitive grant of IVADO, the institute for data valorization in Montreal.
Summer 2020 Publications
Published:
Two publications have been accepted this summer: On approximate stationary points of the regularized mathematical program with complementarity constraints in JOTA and The new butterfly relaxation method for mathematical programs with complementarity constraints in the proceedings of the Indo-French seminar. These works form the theoretical basis for computing M-stationary points of MPCCs. I also continue to update the Julia packages:
- MPCC.jl, a set of tools to model mathematical programs with complementarity/switching/vanishing constraints following the NLPModels structure, with basic tools to use the Stopping framework.
- MPCCSolver.jl, a set of algorithms to solve the models from MPCC.jl.
I also had the pleasure to present A differential inclusion approach to mineral precipitation-dissolution reactions in geochemistry in collaboration with Jocelyne Erhel and Bastien Hamlat in the online Workshop Variational Methods in Nonlinear Phenomena. This work is part of Bastien’s Ph.D. thesis that he successfully defended this month, Congrats!!!
Conference in Toronto [Update]
Published:
The workshop has now evolved into an online workshop to keep research active despite the current pandemic; see the schedule here!
Winter 2020 Seminars and a publication
Published:
To celebrate the new year, I gave a seminar talk in January at the Departmental Colloquium Series in Guelph, where I motivated the GNEP and presented some of our findings with Monica Cojocaru. In particular, it was based on our newly accepted paper, A dynamical system approach to the generalized Nash equilibrium problem, in the open-access Journal of Nonlinear and Variational Analysis. The paper reviews the use of nonsmooth dynamical systems to find generalized Nash equilibria.
Stopping v.0.2
Published:
The new version of Stopping.jl is now officially available in Julia. It is the first stable version, and I will soon release more code using it for MPCC and GNEP. The main page of the project is here: Stopping.jl
Fall 2019 Publications
Published:
I taught a new course in Guelph this Fall, and in the meantime, three papers have been accepted.
Summer 19 Talks
Published:
I gave a couple of talks over the summer: at the 2019 EURO conference in Dublin, Ireland (where I also co-organized a stream on games with 18 talks!), the 2019 World Congress on Global Optimization in Metz, France, ICCOPT 2019 in Berlin, Germany, MOPTA 2019 at Lehigh, USA, and finally AMMCS-2019 in Waterloo, Ontario (where I co-organized a special session on optimization). Now I am back in Guelph for a second year and teaching a new class this Fall.
Stopping v.0.1
Published:
I gave a talk entitled “Stopping.jl: A framework to implement iterative optimization algorithms” at the Journées de l’optimisation 2019 in Montréal, Québec, in the session Optimization in Julia, to introduce our new Julia package.
Winter 19 Publications and a New Preprint
Published:
This winter, I’m teaching a fun course in Guelph. In the meantime, several papers have been accepted for publication.
CMS Winter Meeting
Published:
I gave a talk on the KKT conditions of the GNEP during the 2018 CMS Winter Meeting in Vancouver.
New Position in Guelph
Published:
I started a two-year postdoc position in Guelph under the supervision of Monica Cojocaru. I will work on bringing optimization methods to Generalized Nash Equilibrium Problems.
New Position in Sherbrooke
Published:
I am back in Sherbrooke for three months after winning a grant from the interdisciplinary program “Programme de bourses d’excellence pour étudiants étrangers” of the FRQNT. I will continue working on a solver for degenerate nonlinear programs in Julia in collaboration with Jean-Pierre Dussault.
Spring 18 Talks
Published:
I attended two conferences this spring: LOPAL in Rabat, Morocco, from May 2nd to 4th, and a conference in Castro Urdiales, Spain, from June 3rd to 6th.
New Position in INRIA
Published:
I joined the FLUMINANCE team at INRIA Rennes for a few months to work on dynamical complementarity problems with Jocelyne Erhel.
PhD Defense
Published:
I have successfully defended my PhD thesis in Rennes. You can find the slides here and the manuscript here.
Conference
Published:
I gave a talk on regularization methods for MPCC and their extension to MPVC at ParaoptXI (September 19-22) in Prague. UPDATE: you can now find a preprint on this subject here. UPDATE 2: The paper is published in Optimization (doi:10.1080/02331934.2018.1542531).
Preprint
Published:
I posted on HAL a work in collaboration with Jean-Pierre Dussault, Mounir Haddou, and Abdeslam Kadrani on relaxation methods for mathematical programs with complementarity constraints and the approximate resolution of their sub-problems, entitled: How to Compute a Local Minimum of the MPCC. I will present part of this work during the conference EUROPT 2017 in Montréal.
Preprint
Published:
I posted on HAL a work on the butterfly relaxation method for mathematical programs with complementarity constraints in collaboration with Jean-Pierre Dussault and Mounir Haddou entitled: The New Butterfly Relaxation Methods for Mathematical Programs with Complementarity Constraints.
Talk INFORMS Annual Meeting
Published:
I gave a talk about the butterfly relaxation method for mathematical programs with complementarity constraints at the INFORMS 2016 Annual Meeting, which was organized in the music city of Nashville (Tennessee, US) from November 14 to 16, 2016; see the slides.
Conference in Rennes
Published:
I was on the organizing committee of the conference HJ2016: Hamilton-Jacobi Equations: new trends and applications, which took place from May 30 to June 3, 2016, in Rennes (France). I also presented a poster there on “A new relaxation method for the mathematical program with complementarity constraints.”
Talks and publications
Published:
I gave a talk about interior point methods for monotone linear complementarity problems at the Journées SMAI-MODE 2016, which was organized in Toulouse from March 23 to 25, 2016; see the slides. You can find a preprint of this work on HAL. UPDATE Jan. 2018: the paper has been accepted in Optimization Letters (doi: 10.1007/s11590-018-1241-2).
Publication on AVE
Published:
I posted on HAL a work on a numerical method for the absolute value equation, in collaboration with Lina Abdallah from Tripoli, Lebanon, and Mounir Haddou, entitled: Solving Absolute Value Equation using Complementarity and Smoothing Functions. EDIT: The paper has been accepted in the Journal of Computational and Applied Mathematics (doi: 10.1016/j.cam.2017.06.019).