This course offers a rigorous introduction to nonsmooth optimization, focusing on key tools from variational analysis and subdifferential calculus, as well as optimality conditions and duality theory relevant to nonsmooth contexts.

The aim is to provide a solid understanding of how to formulate, analyze, and solve optimization problems involving nondifferentiable functions, convex or nonconvex; such structures arise frequently in applied mathematics, economics, control theory, and machine learning.

Topics include:
– Introduction to nonsmooth optimization: motivations and illustrative examples; review of convex analysis and properties of convex functions
– Subdifferentials and tangent cones: definitions, properties, and calculus rules
– Optimality conditions and variational formulations of optimization problems
– Proximal operators and projection onto convex sets
– Algorithmic approaches: the proximal gradient and subgradient methods (see the sketch after this list)
– Duality theory: Rockafellar’s framework and Lagrangian duality
– Applications: L¹ regularization, variational problems, nonconvex models
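
To give a flavor of the algorithmic side, here is a minimal Python sketch of the proximal gradient method applied to L¹-regularized least squares, tying together three topics above: proximal operators, the proximal gradient algorithm, and L¹ regularization. The function names (soft_threshold, proximal_gradient_lasso) and the synthetic data are illustrative assumptions, not course material; soft-thresholding is the standard closed-form proximal operator of the L¹ norm.

```python
import numpy as np

def soft_threshold(v, tau):
    # Proximal operator of tau * ||.||_1 (soft-thresholding):
    # prox(v)_i = sign(v_i) * max(|v_i| - tau, 0)
    return np.sign(v) * np.maximum(np.abs(v) - tau, 0.0)

def proximal_gradient_lasso(A, b, lam, n_iters=500):
    # Minimize f(x) = 0.5*||Ax - b||^2 + lam*||x||_1 by alternating
    # a gradient step on the smooth term with the prox of the L1 term.
    x = np.zeros(A.shape[1])
    step = 1.0 / np.linalg.norm(A, 2) ** 2   # 1/L, with L the Lipschitz constant of the gradient
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)             # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)
    return x

# Tiny usage example on synthetic data (illustrative only)
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 20))
x_true = np.zeros(20)
x_true[:3] = [2.0, -1.0, 0.5]                # sparse ground truth
b = A @ x_true + 0.01 * rng.standard_normal(50)
print(np.round(proximal_gradient_lasso(A, b, lam=0.1), 2))
```

The step size 1/‖A‖₂² is the standard choice derived from the Lipschitz constant of the gradient of the smooth term, which is what guarantees convergence of this scheme.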

The course combines theoretical foundations with practical examples and applications, aiming to develop both analytical insight and algorithmic intuition.

Format:
● Duration: 20 hours of in-person lectures
● Schedule: 7 sessions from September to November
● Assessment: Final 3-hour written exam + continuous assessment (homework)
● Language: The course is taught in French, but will be delivered in English if non-French-speaking students are present.