
AD is important for training neural networks, or SGD (and related gradient-based methods) generally. But that's still only one field. Numerical differentiation is still important, e.g. for differential equation solvers. I don't think you can say AD is the most important or useful technique - maybe the most important for pop-culture purposes.
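As a hedged illustration of the distinction (assuming JAX is installed; the function f and step size h are made up for the example), here is the same derivative via a central finite difference and via AD:

  import jax
  import jax.numpy as jnp

  def f(x):
      return jnp.sin(x ** 2)

  def num_diff(f, x, h=1e-5):
      # numerical differentiation: central difference, approximate
      return (f(x + h) - f(x - h)) / (2 * h)

  x = 1.5
  print(num_diff(f, x))   # finite-difference estimate of f'(x)
  print(jax.grad(f)(x))   # AD: 2*x*cos(x**2), exact up to float rounding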


Then we can combine AD and numerical solvers, as is done in modern weather and climate models. I don't quite understand it, but it has something to do with sensitivity analysis and improving data assimilation. (Google "4dvar ecmwf" for more details... e.g.: https://www2.atmos.umd.edu/~dkleist/docs/da/ECMWF_DA_TUTORIA...)

I think the idea is to use the "tangent linear model" to decide how much importance to give to a particular observation of the initial state.
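A very rough sketch of that sensitivity idea (not the ECMWF code; the toy model, the observation value, and names like model_step/misfit are all made up): step a forecast forward from an initial state, measure the misfit against an observation, and let AD give the gradient of the misfit with respect to the initial state - roughly the quantity a 4D-Var scheme uses to decide how to adjust the analysis.

  import jax

  def model_step(x, dt=0.01):
      # toy nonlinear dynamics standing in for the forecast model
      return x + dt * (x - x ** 3)

  def forecast(x0, n_steps=100):
      for _ in range(n_steps):
          x0 = model_step(x0)
      return x0

  def misfit(x0, obs):
      # squared mismatch between the forecast and one observation
      return 0.5 * (forecast(x0) - obs) ** 2

  sensitivity = jax.grad(misfit)(0.5, 0.8)   # d(misfit)/d(initial state)
  print(sensitivity)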


AD has been around since at least 1957 (the oldest reference to this class of techniques I could find). When I studied it (from a CAS/PLT angle), it was considered a good middle-ground technique between symbolic methods & giving up. You can trace the result of the AD and recover a polynomial solution near the expected results (like a profile-guided Taylor expansion, I guess?). It allowed us to run Buchberger's algorithm on algorithmic objects without analytic forms, and still have a chance at getting a complete basis for the antiderivative.
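A rough sketch of the "recover a local polynomial from AD" idea (my own toy, assuming JAX; g and taylor_coeffs are hypothetical names, not from any CAS): repeatedly differentiate a black-box routine at a point and assemble a truncated Taylor expansion there.

  import math
  import jax
  import jax.numpy as jnp

  def g(x):
      # a procedure we only know how to run, not a closed-form expression
      y = x
      for _ in range(3):
          y = y * jnp.cos(y)
      return y

  def taylor_coeffs(f, x0, order=4):
      # c_k such that f(x) ~ sum_k c_k * (x - x0)**k near x0
      coeffs, d = [], f
      for k in range(order + 1):
          coeffs.append(float(d(x0)) / math.factorial(k))
          d = jax.grad(d)   # next-order derivative via AD
      return coeffs

  print(taylor_coeffs(g, 0.7))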


> Numerical differentiation is still important e.g. for differential equation solvers.

Is this a misunderstanding? You use differential equation solvers when all you know is how to calculate the derivative(s) of the function, and you want to get the function itself as the solution. Where would you need numerical differentiation in this?
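For what it's worth, a minimal solver sketch makes the point: a forward Euler loop only evaluates the user-supplied derivative f(t, y), it never differentiates anything numerically itself (toy code, not from the thread).

  def euler(f, y0, t0, t1, n=1000):
      # integrate dy/dt = f(t, y) from t0 to t1 with fixed steps
      dt = (t1 - t0) / n
      t, y = t0, y0
      for _ in range(n):
          y = y + dt * f(t, y)
          t = t + dt
      return y

  # dy/dt = -y, y(0) = 1  ->  y(1) is roughly exp(-1) ~= 0.368
  print(euler(lambda t, y: -y, 1.0, 0.0, 1.0))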



