Automatic differentiation lets you freely define the forward propagation of your neural network and get backpropagation for free. The Julia NN library "Flux" makes great use of this. With automatic differentiation, defining a novel layer takes very little work.
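Roughly like this (an untested sketch assuming Zygote is installed; the "Affine" struct is just an illustration, not Flux's actual layer type): you only write the forward pass, and AD hands you gradients of a loss with respect to the layer's parameters.

    # Hypothetical "Affine" layer: only the forward pass is written by hand,
    # Zygote's AD supplies the gradients.
    using Zygote

    struct Affine
        W
        b
    end
    (a::Affine)(x) = a.W * x .+ a.b          # forward pass only

    layer = Affine(randn(3, 2), randn(3))
    x, y  = randn(2), randn(3)
    loss(l) = sum(abs2, l(x) .- y)

    grads = Zygote.gradient(loss, layer)[1]  # NamedTuple with fields W and b
    grads.W                                  # gradient of the loss w.r.t. the weights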
>is it ever required for applications to implement their own differentation for something specific?
If you want to do backpropagation, you need to calculate derivatives with respect to your parameters, either manually or automatically.
IIRC even two years ago you could already get gradients via AD in Flux (you are completely correct about the name).
Nowadays you have Zygote (https://fluxml.ai/Flux.jl/stable/training/zygote) to calculate gradients with AD.
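The basic call looks roughly like this (a minimal sketch, assuming Zygote is loaded):

    using Zygote
    # Derivative of 3x^2 + 2x + 1 is 6x + 2, so at x = 5.0 this should return (32.0,)
    Zygote.gradient(x -> 3x^2 + 2x + 1, 5.0)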
In any case, AD in general is useful for NNs if you want to implement a novel layer. Of course you could instead derive the backpropagation step by algebraic or manual differentiation.
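For a toy layer y = w .* x, for example, the hand-derived gradient and the AD result should agree (again an untested sketch assuming Zygote):

    using Zygote

    x = [1.0, 2.0, 3.0]
    y = [2.0, 4.0, 6.5]
    loss(w) = sum((w .* x .- y) .^ 2)

    w = 1.5
    manual = sum(2 .* x .* (w .* x .- y))  # derived by hand: d/dw Σ (w*xᵢ - yᵢ)²
    auto   = Zygote.gradient(loss, w)[1]   # same quantity via AD
    manual ≈ auto                          # true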