# Lecture slides: Introduction to Adjoint Differentiation and Back-Propagation in Machine Learning and Finance

The slides are available at https://antoinesavine.files.wordpress.com/2018/11/intro2aadinmachinelearningandfinance.pdf
They are a work in progress; feedback is highly appreciated.

## Contents
### Introduction

- Application: model fitting
- Calibrating a financial pricing model
- Training a deep learning model
- Application: market risk
- Differentiation
- Overview
- Demonstration: Dupire’s model (1992)
- Demonstration: Results
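
The differentiation overview contrasts analytic sensitivities with "bump-and-reprice" finite differences. A minimal sketch (function and variable names are illustrative, not taken from the slides) of central finite differences, whose cost grows linearly with the number of inputs:

```python
def finite_difference_gradient(f, x, h=1e-6):
    """Central finite differences: one pair of evaluations of f per input,
    so the full gradient costs ~2n evaluations for n inputs."""
    grad = []
    for i in range(len(x)):
        up = list(x); up[i] += h   # bump input i up
        dn = list(x); dn[i] -= h   # bump input i down
        grad.append((f(up) - f(dn)) / (2.0 * h))
    return grad

# Example: f(x, y) = x * y + x  ->  df/dx = y + 1, df/dy = x
f = lambda v: v[0] * v[1] + v[0]
g = finite_difference_gradient(f, [2.0, 3.0])
```

This linear growth in the number of inputs is exactly what adjoint differentiation, covered later in the slides, avoids.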

### Deep Learning

- Neural networks and deep learning
- Linear regression: prediction
- Linear regression and classification (1)
- Linear regression and classification (2)
- Linear regression: training
- Lin reg only captures linear functions
- Basis function regression
- Basis function regression: performance
- Curse of dimensionality
- Overfitting and the rule of ten
- Basis function regression vs ANN
- ANN basis functions
- ANN: prediction
- Choice of activation function
- ANN: computation graph
- ANN: training
- Universal representation theorem
- More on regression, basis functions and ANNs
- Deep learning: composing basis functions
- Why deep learning?
- Deep feed-forward networks
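
These slides present an ANN as a basis function regression whose basis functions are learned rather than fixed. A hedged sketch of that view for a one-hidden-layer network (the weights and shapes below are illustrative, not from the slides):

```python
import math

def ann_predict(x, W1, b1, w2, b2):
    """One hidden layer: each hidden unit is a 'learned basis function'
    phi_j(x) = activation(W1[j] . x + b1[j]); the output is then a plain
    linear regression on those basis functions."""
    hidden = [math.tanh(sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(W1, b1)]
    return sum(w * h for w, h in zip(w2, hidden)) + b2

# Two inputs, three hidden units; weights chosen arbitrarily for illustration
W1 = [[0.5, -0.2], [0.1, 0.4], [-0.3, 0.8]]
b1 = [0.0, 0.1, -0.1]
w2 = [1.0, -0.5, 0.25]
y = ann_predict([1.0, 2.0], W1, b1, w2, 0.05)
```

Training then means fitting the basis functions themselves (W1, b1) together with the regression weights (w2, b2), which is what makes the gradients of the next section necessary.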

### Back-Propagation in deep neural nets

- Differentials of the cost function
- Differentials by finite differences
- FD and automatic differentiation
- Computing cost differentials
- Differentials by forward Jacobian propagation
- Jacobian propagation: performance
- Back-Propagation
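
The point of back-propagation, as opposed to forward Jacobian propagation, is that one backward sweep yields the sensitivities to all inputs at once. A minimal hand-written sketch on a small expression (the function is illustrative, not from the slides), using the usual "bar" notation for adjoints:

```python
import math

def f_and_gradient(x1, x2):
    """y = sin(x1 * x2) + x1: forward sweep stores intermediates, then one
    backward sweep propagates adjoints (dy/d-node) from y back to both
    inputs simultaneously."""
    # forward sweep
    a = x1 * x2
    b = math.sin(a)
    y = b + x1
    # backward sweep: seed the result with adjoint 1
    y_bar = 1.0
    b_bar = y_bar                 # y = b + x1
    x1_bar = y_bar                # direct dependence of y on x1
    a_bar = b_bar * math.cos(a)   # b = sin(a)
    x1_bar += a_bar * x2          # a = x1 * x2
    x2_bar = a_bar * x1
    return y, x1_bar, x2_bar

y, d1, d2 = f_and_gradient(1.0, 2.0)
```

Here d1 = 1 + x2 cos(x1 x2) and d2 = x1 cos(x1 x2), recovered for the cost of roughly one extra evaluation, independently of the number of inputs.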

- Example: Black & Scholes
- Black & Scholes: evaluation graph
- Feed-forward equations
- Differentiation equations
- Forward and backward differentiation
- Evaluation graphs: conclusion
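
As a sketch of the Black & Scholes example (not the code from the slides; only the spot adjoint is propagated here, for brevity), a feed-forward sweep through the evaluation graph followed by a backward adjoint sweep recovering the delta:

```python
import math

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def norm_pdf(x):
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def black_scholes_call_and_delta(S, K, T, r, sigma):
    """Feed-forward sweep through the evaluation graph, then one backward
    (adjoint) sweep seeded with P_bar = 1 to recover dP/dS."""
    # feed-forward sweep
    sqrtT = math.sqrt(T)
    d1 = (math.log(S / K) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrtT)
    d2 = d1 - sigma * sqrtT
    df = math.exp(-r * T)                   # discount factor
    P = S * norm_cdf(d1) - K * df * norm_cdf(d2)
    # backward sweep (adjoints of the nodes that lead back to S)
    P_bar = 1.0
    S_bar = P_bar * norm_cdf(d1)            # direct term of P = S*N(d1) - ...
    d1_bar = P_bar * S * norm_pdf(d1)
    d2_bar = -P_bar * K * df * norm_pdf(d2)
    d1_bar += d2_bar                        # d2 = d1 - sigma*sqrtT
    S_bar += d1_bar / (S * sigma * sqrtT)   # d1 depends on S through log(S/K)
    return P, S_bar

P, delta = black_scholes_call_and_delta(100.0, 100.0, 1.0, 0.02, 0.2)
```

The adjoint delta collapses to the textbook N(d1), since S·pdf(d1) = K·df·pdf(d2) makes the extra term vanish; propagating the remaining adjoints (for sigma, r, T, K) in the same sweep would give all Greeks at once.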

### Recording calculations on tape

- Automatic differentiation
- Building evaluation graphs
- Recording operations
- Lazy evaluation
- Conventional implementation
- Simplistic implementation
- Record and tape data structures
- Custom real number
- Avoiding code duplication
- Applying the recording framework
- Instrumenting computation code

- State of the tape after recording
- Complexity
- Conclusion
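
The tape idea of this section, a custom real number whose operators record to a tape, then a backward sweep over the tape, can be sketched as follows (a deliberately simplistic illustration in the spirit of the "simplistic implementation" slide; class and field names are mine, not the slides'):

```python
class Number:
    """Custom real number: every operation records its parents and local
    partial derivatives on a global tape."""
    tape = []  # entry i: list of (parent_index, partial) for node i

    def __init__(self, value, parents=()):
        self.value = value
        self.index = len(Number.tape)
        Number.tape.append(parents)

    def __add__(self, other):
        return Number(self.value + other.value,
                      [(self.index, 1.0), (other.index, 1.0)])

    def __mul__(self, other):
        return Number(self.value * other.value,
                      [(self.index, other.value), (other.index, self.value)])

def backward(result):
    """Seed the result with adjoint 1 and sweep the tape in reverse,
    accumulating each node's adjoint into its parents."""
    adjoints = [0.0] * len(Number.tape)
    adjoints[result.index] = 1.0
    for i in range(result.index, -1, -1):
        for parent, partial in Number.tape[i]:
            adjoints[parent] += partial * adjoints[i]
    return adjoints

# y = x1 * x2 + x1  ->  dy/dx1 = x2 + 1, dy/dx2 = x1
Number.tape = []
x1, x2 = Number(2.0), Number(3.0)
y = x1 * x2 + x1
adj = backward(y)
```

Instrumenting computation code then reduces to templating it on the number type, so the same code runs on plain doubles or on recording numbers; this is the "avoiding code duplication" point above.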

- Simple simulation code
- Simulation code, version 1
- A simplistic code
- An inefficient code
- Smoothing barrier options
- Smooth barrier
- Simulation code with smooth barrier
- Simulation code
- Differentiation steps
- Differentiation code
- Initialization
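
The smoothing slides above replace the discontinuous barrier indicator, whose pathwise derivative is zero almost everywhere and breaks differentiation through Monte-Carlo, with a differentiable approximation. A hedged sketch (the smoothing width `eps` and function names are illustrative) using the classical "call spread" ramp:

```python
def hard_barrier(spot, barrier):
    """Discontinuous survival indicator 1{spot < barrier}: its derivative
    is 0 almost everywhere, so pathwise sensitivities come out as zero."""
    return 1.0 if spot < barrier else 0.0

def smooth_barrier(spot, barrier, eps):
    """Call-spread smoothing: linear ramp from 1 to 0 over
    [barrier - eps, barrier + eps]; differentiable in spot (a.e.) and
    converging to the hard indicator as eps -> 0."""
    if spot <= barrier - eps:
        return 1.0
    if spot >= barrier + eps:
        return 0.0
    return (barrier + eps - spot) / (2.0 * eps)

survival = smooth_barrier(119.0, 120.0, 5.0)
```

Inside the simulation, the smoothed indicator multiplies the payoff on each path, so the whole simulation becomes a differentiable function that the tape can record end to end.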