
Lecture slides: Introduction to Adjoint Differentiation and Back-Propagation in Machine Learning and Finance

The slides are available at https://antoinesavine.files.wordpress.com/2018/11/intro2aadinmachinelearningandfinance.pdf
This is a work in progress; feedback is highly appreciated.

Contents
Introduction

Adjoint Differentiation – AD
Application: model fitting
Calibrating a financial pricing model
Training a deep learning model
Application: market risk
Differentiation
Adjoint Differentiation – AD
A brief history of AD
Automatic Adjoint Differentiation – AAD
AAD: challenges
AAD book
Overview
Demonstration: Dupire’s model (1992)
Demonstration: Results

Deep Learning

Neural networks and deep learning
Linear regression: prediction
Linear regression and classification (1)
Linear regression and classification (2)
Linear regression: training
Linear regression only captures linear functions
Basis function regression
Basis function regression: performance
Curse of dimensionality
Overfitting and the rule of ten
Basis function regression vs ANN
ANN basis functions
ANN: prediction
Choice of activation function
ANN: computation graph
ANN: training
Universal representation theorem
More on regression, basis functions and ANNs
Deep learning: composing basis functions
Why deep learning?
Deep feed-forward networks
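To give a flavour of the prediction step listed above, here is a minimal sketch (my own illustration, not code from the slides) of a one-hidden-layer feed-forward network with a tanh activation; all names are mine:

```python
import math

def ann_predict(x, W1, b1, W2, b2):
    """One-hidden-layer network: y = W2 . tanh(W1 * x + b1) + b2.

    x is a scalar input; W1, b1 are the hidden layer's weights and biases,
    W2, b2 the output layer's.  Each hidden unit is a 'basis function'
    tanh(w * x + b) whose shape is learned during training.
    """
    hidden = [math.tanh(w * x + b) for w, b in zip(W1, b1)]
    return sum(w * h for w, h in zip(W2, hidden)) + b2

# usage: two hidden units
y = ann_predict(1.0, [1.0, -2.0], [0.0, 0.5], [0.7, -0.3], 0.1)
```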

Back-Propagation in deep neural nets

Differentials of the cost function
Differentials by finite differences
FD and automatic differentiation
Computing cost differentials
Differentials by forward Jacobian propagation
Jacobian propagation: performance
Adjoint propagation: performance
Back-Propagation
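The back-propagation idea can be sketched on the smallest possible network. The code below is a hand-written illustration of mine, not the slides' code: a forward pass stores intermediates, then adjoints are propagated in the reverse order of evaluation, and the result agrees with finite differences:

```python
import math

def predict(x, w1, b1, w2, b2):
    """Forward pass of a one-unit, one-hidden-layer network."""
    return w2 * math.tanh(w1 * x + b1) + b2

def backprop(x, y_target, w1, b1, w2, b2):
    """Squared-error cost and its gradient w.r.t. all four parameters."""
    # forward pass, storing intermediates
    z = w1 * x + b1
    h = math.tanh(z)
    y = w2 * h + b2
    cost = 0.5 * (y - y_target) ** 2
    # backward pass: adjoints in reverse order of evaluation
    y_bar = y - y_target                # dC/dy
    w2_bar = y_bar * h
    b2_bar = y_bar
    h_bar = y_bar * w2
    z_bar = h_bar * (1.0 - h * h)       # d tanh(z)/dz = 1 - tanh(z)^2
    w1_bar = z_bar * x
    b1_bar = z_bar
    return cost, (w1_bar, b1_bar, w2_bar, b2_bar)
```

One backward sweep yields the full gradient for roughly the cost of one extra forward pass, which is the point of the adjoint order.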

AD through evaluation graphs

Evaluation graphs and adjoint propagation
Example: Black & Scholes
Black & Scholes: evaluation graph
Feed-forward equations
Differentiation equations
Forward and backward differentiation
Evaluation graphs: conclusion
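As a taste of the evaluation-graph material, here is a small sketch (mine, not the slides' code; helper names are mine) that prices a European call in Black & Scholes through explicit feed-forward equations, then runs the differentiation equations backward to recover the delta:

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def black_scholes_call_adjoint(S, K, r, sigma, T):
    """Price a call node by node, then back-propagate adjoints to S."""
    # feed-forward equations: one intermediate per graph node
    x = math.log(S / K)
    sq = sigma * math.sqrt(T)
    d1 = (x + (r + 0.5 * sigma * sigma) * T) / sq
    d2 = d1 - sq
    n1 = norm_cdf(d1)
    n2 = norm_cdf(d2)
    df = math.exp(-r * T)
    price = S * n1 - K * df * n2

    # differentiation equations, backward; only the path to S is needed
    price_bar = 1.0
    n1_bar = price_bar * S
    n2_bar = -price_bar * K * df
    S_bar = price_bar * n1            # direct dependence of price on S
    d1_bar = n1_bar * norm_pdf(d1)
    d2_bar = n2_bar * norm_pdf(d2)
    d1_bar += d2_bar                  # d2 = d1 - sq
    x_bar = d1_bar / sq
    S_bar += x_bar / S                # x = log(S) - log(K)
    return price, S_bar               # S_bar is the delta
```

The backward sweep reproduces the textbook delta N(d1), and the same sweep would also deliver the other sensitivities if their adjoints were accumulated.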

Recording calculations on tape

Automatic differentiation
Building evaluation graphs
Operator overloading
Recording operations
Lazy evaluation
Conventional implementation
Simplistic implementation
Record and tape data structures
Custom real number
Operator overloading
On-class operator overloading
Function overloading
Custom function overloading
Avoiding code duplication
Comparison operator overloading
Applying the recording framework
Instrumenting computation code

AAD

State of the tape after recording
Successors and adjoints
Adjoint propagation
Complexity
Adjoint propagation code
Conclusion
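The tape-recording and adjoint-propagation steps above can be summarised with a toy example. The class below is a deliberately simplistic sketch of mine, not the book's implementation: operator overloading records every operation on a tape together with its local derivatives, and a single backward sweep over the tape propagates adjoints:

```python
class Num:
    """Minimal AAD number: records operations on a shared tape."""
    tape = []  # each record: (result node, [(argument, local derivative)])

    def __init__(self, value):
        self.value = value
        self.adjoint = 0.0

    def __add__(self, other):
        res = Num(self.value + other.value)
        Num.tape.append((res, [(self, 1.0), (other, 1.0)]))
        return res

    def __mul__(self, other):
        res = Num(self.value * other.value)
        Num.tape.append((res, [(self, other.value), (other, self.value)]))
        return res

def propagate_adjoints(result):
    """Seed the result and sweep the tape backwards."""
    result.adjoint = 1.0
    for node, args in reversed(Num.tape):
        for arg, partial in args:
            arg.adjoint += node.adjoint * partial

# usage: differentiate y = x1 * x2 + x1
x1, x2 = Num(3.0), Num(4.0)
y = x1 * x2 + x1
propagate_adjoints(y)
# x1.adjoint == 5.0 (= x2 + 1), x2.adjoint == 3.0
```

The cost of the backward sweep is proportional to the length of the tape, independent of the number of inputs, which is why AAD computes thousands of sensitivities for a small constant multiple of one evaluation.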

AAD for financial simulations

Simple simulation code
Simulation code, version 1
A simplistic code
An inefficient code
Smoothing barrier options
Smooth barrier
Simulation code with smooth barrier
Simulation code
Differentiation steps
Differentiation code
Initialization
Record evaluation, propagate adjoints
Pick and return results
Testing the code
Solution in principle
Solution in code
Performance
Conclusion

