
Automatic differentiation lets you freely define the forward propagation of your neural network and get backpropagation for free. The NN library "Flow" for Julia makes great use of this; it makes defining novel layers very simple, with very little work.
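
A minimal sketch of what that looks like (using Zygote, the AD package that comes up further down the thread; the function here is just an arbitrary example):

    using Zygote   # the AD package Flux uses under the hood

    # Any forward computation written in plain Julia...
    f(x) = 3x^2 + 2x + 1

    # ...can be differentiated without writing a backward pass by hand.
    gradient(f, 4.0)   # (26.0,)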

>is it ever required for applications to implement their own differentiation for something specific?

If you want to do backpropagation you need to calculate derivatives with respect to your parameters, either manually or automatically.
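
As a rough sketch (the layer and names here are made up for illustration, not from any library):

    using Zygote

    # A hand-rolled "layer": a struct holding parameters plus a forward pass.
    struct Scale
        w::Vector{Float64}
    end
    (s::Scale)(x) = sum(s.w .* x)

    layer = Scale([2.0, 3.0])
    x = [1.0, 4.0]

    # AD gives the derivative of the output with respect to the parameters,
    # with no hand-written backward pass.
    grads, = gradient(m -> m(x), layer)
    grads.w   # [1.0, 4.0]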




Do you mean Flux?

I've had to define my own gradients in Flux before, but that was two years ago, maybe things have improved?


IIRC even two years ago you could get gradients by AD in Flux (you are completely correct about the name). Nowadays you have https://fluxml.ai/Flux.jl/stable/training/zygote to calculate gradients with AD.
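
From memory the usage is roughly this (treat it as a sketch, not the exact code on that page):

    using Flux   # re-exports Zygote's gradient

    m = Dense(2, 1)
    x = rand(Float32, 2)

    # Gradients of a loss with respect to every trainable parameter of the model.
    gs = gradient(() -> sum(m(x)), Flux.params(m))

    for p in Flux.params(m)
        @show gs[p]
    end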

In any case, AD is generally useful for NNs if you want to implement a novel layer. Of course, you could instead derive backpropagation by algebraic or manual differentiation.


It was only one weird operation I was doing that needed a gradient defined. I should have made that more clear. I was using Zygote.
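
For anyone curious, defining a gradient for a single operation looks roughly like this in Zygote (the operation here is a placeholder, not the one I actually had):

    using Zygote

    # Pretend Zygote can't differentiate this (say it wraps a C call).
    weird_op(x) = x^3

    # Supply the derivative for just this one operation; everything
    # around it is still differentiated automatically.
    Zygote.@adjoint weird_op(x) = weird_op(x), ȳ -> (3x^2 * ȳ,)

    gradient(x -> 2 * weird_op(x), 2.0)   # (24.0,)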



