# Optimization for Scientific Machine Learning

## Objectives

- Understand what optimization problems are.

- Understand concepts such as local minima and saddle points.

- Introduce gradient descent and its variants, such as stochastic gradient descent, momentum, AdaGrad, RMSProp, Adam, AdaMax, and AdamW.

- Introduce methods that use second-order information, such as Newton's method and L-BFGS.

- Introduce optimizers in JAX using Optax; a minimal example is sketched below.
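
As a preview, here is a minimal sketch of the Optax optimization loop that the gradient-based methods above all share. The toy quadratic loss, the learning rate, and the iteration count are illustrative choices for this sketch, not part of the lesson material; the `init`/`update`/`apply_updates` calls are Optax's standard API.

```python
import jax
import jax.numpy as jnp
import optax

# Toy objective: minimize f(x) = ||x - 1||^2, whose minimum is at x = (1, 1, 1).
def loss(params):
    return jnp.sum((params - 1.0) ** 2)

params = jnp.zeros(3)
optimizer = optax.adam(learning_rate=1e-1)  # swap in optax.sgd, optax.rmsprop, optax.adamw, ...
opt_state = optimizer.init(params)

@jax.jit
def step(params, opt_state):
    # Compute the loss and its gradient, then let the optimizer turn the
    # gradient into a parameter update.
    value, grads = jax.value_and_grad(loss)(params)
    updates, opt_state = optimizer.update(grads, opt_state, params)
    params = optax.apply_updates(params, updates)
    return params, opt_state, value

for _ in range(100):
    params, opt_state, value = step(params, opt_state)

print(params)  # approaches [1., 1., 1.]
```

Because every Optax optimizer is a `GradientTransformation` with the same `init`/`update` interface, changing the optimization algorithm in this loop is a one-line edit.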