Abstract
Automatic Differentiation (autodiff or AD) is a technique for computing derivatives that differs from both symbolic and numerical differentiation. It relies on the systematic application of the chain rule to functions composed of simpler smooth functions whose Jacobian matrices are known.
Given the implementation of a function composed of smooth operations, Automatic Differentiation enables the computation of its partial derivatives by propagating derivatives through the computational graph of the program. With this approach, we can differentiate a function that is defined only by code and has no explicit mathematical expression.
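As a minimal sketch (illustrative only; the function `f` and the numerical values below are assumptions, not taken from the talk), JAX can differentiate a function that exists only as code, here a short loop of smooth primitives:

```python
import jax
import jax.numpy as jnp

def f(x):
    # A composition of smooth primitives written as ordinary Python code;
    # no closed-form expression for the result is ever written down.
    y = x
    for _ in range(3):
        y = jnp.sin(y) + 0.5 * y ** 2
    return y

df = jax.grad(f)          # reverse-mode AD through the computational graph
print(f(1.0), df(1.0))    # value and exact derivative at x = 1.0
```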
In this work, we demonstrate the application of this technique to basic maintenance optimization problems, where the objective is to identify maximum or minimum points of a cost function approximated using Monte Carlo methods. This approach enables efficient gradient-based optimization even when the cost function is estimated through stochastic simulation. The implementation is carried out using modern automatic differentiation libraries such as JAX and Autograd.
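As a hedged illustration of this idea (the degradation model, cost parameters, and names such as `expected_cost` and `effort` are toy assumptions, not the models treated in the talk), an expected maintenance cost can be estimated by Monte Carlo over a fixed set of random draws and minimized by gradient descent on the gradient computed by JAX:

```python
import jax
import jax.numpy as jnp

key = jax.random.PRNGKey(0)
z = jax.random.normal(key, (100_000,))   # fixed draws (common random numbers)

c_effort = 1.0     # cost per unit of preventive maintenance effort (assumed)
c_failure = 20.0   # penalty per unit of degradation above the threshold (assumed)
threshold = 5.0    # degradation level at which the system fails (assumed)

def expected_cost(effort):
    # Toy model: degradation after one operating period is lognormal, with a
    # median that decreases smoothly as maintenance effort increases.
    degradation = jnp.exp(1.8 - 0.5 * effort + 0.4 * z)
    # Piecewise-smooth penalty, so the pathwise derivative is defined a.e.
    penalty = jnp.maximum(degradation - threshold, 0.0)
    return c_effort * effort + c_failure * jnp.mean(penalty)

grad_cost = jax.jit(jax.grad(expected_cost))

# Plain gradient descent on the Monte Carlo estimate of the expected cost.
effort = 0.5
for _ in range(300):
    effort -= 0.05 * grad_cost(effort)
print(effort, expected_cost(effort))
```

Keeping the random draws fixed across evaluations makes the Monte Carlo estimate a deterministic, differentiable function of the decision variable, so the gradient returned by AD is the exact gradient of the estimator rather than a noisy quantity.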
References
- Atilim Gunes Baydin, Barak A. Pearlmutter, Alexey Andreyevich Radul, and Jeffrey Mark Siskind. Automatic differentiation in machine learning: a survey, 2018.
- Waltraud Kahle, Sophie Mercier, and Christian Paroissin. Degradation Processes in Reliability. Mathematics and Statistics Series, Mathematical models and methods in reliability set, Vol. 3. June 2016.
- Charles C. Margossian. A review of automatic differentiation and its efficient implementation. WIREs Data Mining and Knowledge Discovery, 9(4), March 2019.
- Toshio Nakagawa. Maintenance Theory of Reliability. Springer Series in Reliability Engineering. Springer Science & Business Media, London, 2005.