Auto-differentiation (AD) is the magic behind most neural-net-based ML (you do get some madmen who do the task by hand). Backpropagation is basically just a special case of AD, which lets you evaluate the derivative of an arbitrary (well-enough-behaved) function with respect to any of its inputs. It's very useful for lots of optimisation applications, but ML is by far the biggest in terms of GPU compute use.
(It's not particularly magical from a mathematical point of view; it's basically just an application of the chain rule. But it's something you need either a fairly complex library or language support to implement, because you effectively have to take the function(s) you want the derivative of and transform them into a different set of calculations.)
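As a rough illustration, here's a minimal sketch of what reverse-mode AD (the flavour backprop uses) looks like under the hood. This is toy Python, not any real library's API; the `Var` class and the example function are made up for this comment:

```python
import math

# Toy reverse-mode AD: each Var records how it was computed (its parents and
# the local derivative w.r.t. each parent), then backward() walks the graph
# applying the chain rule. Real libraries do the same thing, just efficiently.
class Var:
    def __init__(self, value, parents=()):
        self.value = value        # forward value
        self.grad = 0.0           # d(output)/d(this), filled in by backward()
        self._parents = parents   # list of (parent Var, local derivative)

    def __add__(self, other):
        return Var(self.value + other.value, [(self, 1.0), (other, 1.0)])

    def __mul__(self, other):
        return Var(self.value * other.value,
                   [(self, other.value), (other, self.value)])

    def sin(self):
        return Var(math.sin(self.value), [(self, math.cos(self.value))])

    def backward(self, seed=1.0):
        # Chain rule: accumulate the incoming gradient, then pass
        # seed * local derivative on to each parent.
        self.grad += seed
        for parent, local in self._parents:
            parent.backward(seed * local)

# f(x, y) = sin(x * y) + x
x, y = Var(2.0), Var(3.0)
f = (x * y).sin() + x
f.backward()
print(x.grad, y.grad)   # y*cos(x*y) + 1 and x*cos(x*y)
```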
If a language has auto-differentiation, doing machine learning in it becomes tractable. Because Slang compiles to pretty much any shader language and so runs on any GPU, it could enable generalized acceleration of ML training on any graphics hardware, which is cool.
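To make the "tractable to do ML" point concrete: once gradients are available, a training step is just "evaluate, backward, nudge the parameters". Continuing the toy sketch above (the one-weight "model" and the numbers are arbitrary):

```python
# Toy gradient descent using the Var class above: fit w so that 3*w ≈ 10.
w = Var(0.5)
for step in range(100):
    pred = w * Var(3.0)                  # "model": pred = 3 * w
    err = pred + Var(-10.0)              # error vs target 10
    loss = err * err                     # squared error
    loss.backward()                      # fills in w.grad via the chain rule
    w = Var(w.value - 0.05 * w.grad)     # gradient-descent update
print(w.value)  # converges towards 10/3
```

That's essentially the whole loop a real framework runs, just vectorised and on the GPU.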
YouTube videos on backpropagation might explain it better than I can.