
Ok, and it is completely irrelevant. Besides, guess what enables the training of those neural networks? I'm fairly sure gradient descent has more than a bit to do with mathematics' closed-form formulas.
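The point that gradient descent is plain calculus can be made concrete. A toy sketch of my own (not from the thread): minimizing f(x) = (x - 3)^2 using its analytic derivative, exactly the mechanism that underlies neural network training at scale.

```python
# Toy gradient descent on f(x) = (x - 3)^2.
# The update rule x <- x - lr * f'(x) is the same closed-form
# calculus that backpropagation applies to every network weight.

def grad(x):
    return 2 * (x - 3)  # analytic derivative of (x - 3)^2

x = 0.0   # starting point (arbitrary)
lr = 0.1  # learning rate (arbitrary choice)
for _ in range(100):
    x -= lr * grad(x)

print(round(x, 4))  # converges toward the minimum at x = 3
```

The step size and iteration count here are arbitrary illustrative choices; the point is only that the update rule is a closed-form mathematical expression.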

Of course ML has use cases where traditional tools are less fit; my gripe is with the hype-driven, anti-intellectual nonsense that often surrounds it. These are not magic tools: the fundamental limits those giants of math/CS discovered still apply to them, and we can save ourselves a lot of pain by not trying to solve unsolvable problems.



Yeah, the hype is driven by business and marketing people who want to sell more of a new thing by pointing out what wasn't possible before, using all kinds of silly arguments. Still, there is some noticeable progress here (compared to, e.g., crypto, which outside logistics and large-scale fraud didn't bring much, despite being based on very solid number-theory concepts).



