
I can't tell because I don't speak category theory, but I've been told the essence is similar to how I've implemented the neural networks for spaCy: https://github.com/explosion/thinc

If so, then it definitely works fine. On the other hand I don't think parallelism is so easy to achieve, so maybe it's not the same thing after all.



My understanding is also that it's the same idea. Your implementation drove the point home for me that it can be practical.

The dictionary is rather simple:

- a lambda λx. f(x) is an element of an "internal hom object" (the function type)

- you get a closure by using the partial application morphism (see https://en.wikipedia.org/wiki/Cartesian_closed_category)

- copying a variable is the diagonal morphism Δ : x ↦ (x, x)

- there is a contravariant functor T^\star : X ↦ T^\star X taking a manifold X to its cotangent bundle, which sends a smooth map f : X → Y to its adjoint (pullback) f^\star : T^\star Y → T^\star X

- you can then consider the product of the category of smooth manifolds with its opposite, Man × Man^{op}, and define a functor Man → Man × Man^{op} that maps a manifold X to the pair (X, T^\star X) and a morphism f : X → Y to the pair (f, f^\star).

- All you have to do to get from your implementation to the notions in the paper is to convert your prescription into point-free style and apply some jargon. (You need things like the partial application map and the evaluation map, which are defined in any cartesian closed category but translate easily to programming languages as well.)
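The partial application and evaluation maps in the dictionary above translate almost verbatim into code. A minimal sketch (names are mine, not from the paper or from thinc) of how a closure arises from the partial application morphism, with the evaluation map alongside:

```python
# partial : (A x B -> C) x A -> (B -> C)
# In a cartesian closed category this is the partial application morphism;
# in Python it simply produces a closure capturing `a`.
def partial(f, a):
    return lambda b: f(a, b)

# ev : (B -> C) x B -> C, the evaluation map: apply a function to an argument.
def ev(g, b):
    return g(b)

add = lambda a, b: a + b
inc = partial(add, 1)   # a closure: the element of the hom object capturing a = 1
result = ev(inc, 41)    # 42
```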
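The functor X ↦ (X, T^\star X), f ↦ (f, f^\star) is essentially the value-and-pullback representation used in reverse-mode autodiff. A minimal scalar Python sketch (all names hypothetical, not thinc's API): each differentiable map returns its value together with a pullback (the adjoint f^\star carrying cotangents backward), and composition chains pullbacks in reverse order.

```python
def compose(f, g):
    """(g . f) on value-and-pullback pairs: pullbacks compose contravariantly."""
    def h(x):
        y, f_pullback = f(x)
        z, g_pullback = g(y)
        # (g . f)^* = f^* . g^*  -- note the reversed order
        return z, lambda dz: f_pullback(g_pullback(dz))
    return h

def square(x):
    # f(x) = x^2, with adjoint f^*(dy) = 2*x*dy
    return x * x, lambda dy: 2 * x * dy

def add3(y):
    # g(y) = y + 3, with adjoint g^*(dz) = dz
    return y + 3, lambda dz: dz

h = compose(square, add3)   # h(x) = x^2 + 3
value, pullback = h(2.0)    # value = 7.0
grad = pullback(1.0)        # dh/dx at x = 2, i.e. 4.0
```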




