
Well I think the point being made is an instrumental one: it’s general enough to matter, so we should use the word “general” to communicate that to laypeople.

Re: "traits we associate with general intelligence", I think the exact issue is that there is no scientific (i.e., specific and consistent) list of such traits. This is why Turing wrote his famous 1950 paper and invoked the Imitation Game: not to detail how one could test for a computer that's really thinking (or truly general), but to show why that question isn't necessary in the first place.



I still disagree: being good at a number of tasks does not make something intelligent.

Creativity is certainly missing, it has no internal motivation, and it will answer the same simple question both right and wrong depending on unknown factors. What if we reverse the framing from "it can do these tasks, therefore it must be..." to "it lacks these traits, therefore it is not yet..."?

While I do not disagree that LLMs have become advanced enough to handle a lot of automation, I do not agree that they are intelligent or actually thinking.

I'm with Yann LeCun when he says that we won't reach AGI until we move beyond transformers.


And based on the actual Imitation Game in Turing's paper, we are nowhere close, and I don't think we will be for quite some time.



