May. 14th, 2019

https://www.cs.purdue.edu/homes/rompf/papers/wang-preprint201811.pdf

In this paper, we take a fresh look at automatic differentiation (AD) techniques, and especially aim to demystify the reverse-mode form of AD that generalizes back-propagation in neural networks. We uncover a tight connection between reverse-mode AD and delimited continuations, which permits implementing reverse-mode AD purely via operator overloading and without managing any auxiliary data structures.
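The idea in the abstract can be sketched in a few lines: each overloaded operation runs its forward computation, calls the rest of the program as a continuation, and then, once the continuation has filled in the output's adjoint, propagates gradients back to its inputs. No tape or auxiliary data structure is needed; the call stack plays that role. The sketch below uses explicit continuation-passing style in Python rather than the paper's Scala shift/reset operators, and the names (`Num`, `grad`, `mul`, `add`) are illustrative assumptions, not the paper's API.

```python
# Minimal sketch of reverse-mode AD via operator overloading and
# continuations, in explicit CPS (Python has no shift/reset).

class Num:
    def __init__(self, x, d=0.0):
        self.x = x  # primal value, computed on the way forward
        self.d = d  # adjoint (gradient), accumulated on the way back

def mul(a, b, k):
    # Forward: compute the product. Then run the continuation k, which
    # eventually sets y.d; afterwards propagate y.d to the inputs.
    y = Num(a.x * b.x)
    k(y)
    a.d += b.x * y.d
    b.d += a.x * y.d

def add(a, b, k):
    y = Num(a.x + b.x)
    k(y)
    a.d += y.d
    b.d += y.d

def grad(f):
    # f takes an input Num and a continuation; grad(f) returns df/dx.
    def df(x):
        inp = Num(x)
        f(inp, lambda y: setattr(y, 'd', 1.0))  # seed the output adjoint
        return inp.d
    return df

# f(x) = x*x + x, so f'(x) = 2x + 1; at x = 3.0 this gives 7.0.
f = lambda x, k: mul(x, x, lambda t: add(t, x, k))
print(grad(f)(3.0))  # 7.0
```

The backward pass happens implicitly as the nested calls return: the statements after `k(y)` in each operator run in exactly the reverse of the forward order, which is what the paper's delimited-continuation formulation captures directly with shift/reset.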
