Gradient descent with a general cost

| field | value |
|---|---|
| title | Gradient descent with a general cost |
| start_date | 2025/02/04 |
| schedule | 15h-16h |
| online | no |
| location_info | amphi Yvonne Choquet-Bruhat (bât. Perrin) |
| summary | In this talk I will present an approach for iteratively minimizing a given objective function using minimizing movement schemes built on general cost functions. I will introduce an explicit method, gradient descent with a general cost (GDGC), as well as an implicit, proximal-like scheme and an explicit-implicit (forward-backward) method (see the sketches after the tables below). GDGC unifies several standard gradient-descent-type methods: gradient descent, mirror descent, Newton's method, and Riemannian gradient descent. I will explain how the so-called nonnegative cross-curvature condition yields tractable conditions for proving convergence rates for GDGC. Byproducts of this framework include: (1) a new nonsmooth mirror descent, (2) global convergence rates for Newton's method, and (3) a clear picture of the type of convexity needed for convergent schemes in the Riemannian setting. |
| responsibles | Leclaire |
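
As a rough guide to the abstract, one step of the implicit, proximal-like scheme can be written as a minimizing movement problem. This is a minimal sketch under generic assumptions (an objective f, a cost function c, and notation chosen here for illustration; the talk's precise hypotheses are not reproduced):

```latex
% Sketch of one step of the implicit (proximal-like) scheme built on a
% general cost c(x, y), for an objective f:
\[
  x_{n+1} \in \operatorname*{arg\,min}_{x} \, \bigl( f(x) + c(x, x_n) \bigr).
\]
% With the quadratic cost c(x, y) = \|x - y\|^2 / (2\tau), this is the
% classical proximal-point (minimizing movement) step; other choices of
% cost give mirror/Bregman-type variants.
```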
Workflow history

| from state | to state | comment | date |
|---|---|---|---|
| submitted | published | | 2025/01/27 14:51 UTC |
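
To make the unification claim concrete, here is a small numerical sketch of two classical schemes that the abstract says GDGC specializes to: gradient descent (quadratic cost) and entropic mirror descent on the simplex (Bregman cost of the negative entropy). This illustrates those special cases only, not the general GDGC iteration from the talk; the toy objective f, the step size sigma, and the iteration counts are arbitrary choices made for this example.

```python
# Two classical special cases of gradient descent with a general cost:
# quadratic cost -> plain gradient descent; negative-entropy Bregman
# cost -> mirror descent (multiplicative updates on the simplex).
import numpy as np

def f(x):
    # Toy smooth convex objective.
    return 0.5 * np.sum(x ** 2)

def grad_f(x):
    return x

def gradient_descent(x0, sigma=0.1, iters=200):
    # Quadratic cost c(x, y) = |x - y|^2 / (2 * sigma): the explicit
    # step is x_{n+1} = x_n - sigma * grad f(x_n).
    x = x0.copy()
    for _ in range(iters):
        x = x - sigma * grad_f(x)
    return x

def mirror_descent_simplex(x0, sigma=0.1, iters=200):
    # Negative-entropy Bregman cost on the simplex: the step is the
    # multiplicative update x_{n+1} ∝ x_n * exp(-sigma * grad f(x_n)).
    x = x0.copy()
    for _ in range(iters):
        x = x * np.exp(-sigma * grad_f(x))
        x = x / x.sum()
    return x

x0 = np.array([0.5, 0.3, 0.2])
print("gradient descent :", gradient_descent(x0))        # tends to the unconstrained minimizer 0
print("mirror descent   :", mirror_descent_simplex(x0))  # stays on the simplex (tends to uniform)
```

Both routines iterate the same objective; only the cost (hence the update geometry) differs, which is the sense in which a single general-cost scheme can subsume them.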