
> A requisite of common sense is understanding, and LLMs do not possess any sort of understanding.

Adding to this, the reason they lack understanding is that they lack experience. To them, the universe is limited to the very approximate symbolic representation system we invented: language. Even worse, it's just written language, which is strictly less expressive than spoken language.

They process our experience only as linguistic patterns, nothing more.

That said, for a domain-specific use case like ordering fast food, some prompting plus function calling to enforce limits on an order could have addressed this and simulated "common sense", roughly as sketched below. So it sounds a lot like a poor implementation.
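A minimal sketch of what I mean, assuming a hypothetical place_order tool; the menu, limits, and schema here are all made up for illustration, not anyone's actual API. The point is that the sanity checks live in ordinary code the model has to call through, not in the model itself:

    # Hypothetical "common sense" guardrails for an order-taking bot.
    # Menu, limits, and tool schema are illustrative only.

    MENU = {"burger": 8.99, "fries": 3.49, "soda": 1.99}
    MAX_QTY_PER_ITEM = 10       # nobody orders 200 burgers at a drive-thru
    MAX_ITEMS_PER_ORDER = 25

    # Tool/function schema the LLM is allowed to call (JSON-schema style),
    # constraining item names and quantities up front.
    PLACE_ORDER_TOOL = {
        "name": "place_order",
        "description": "Add items to the customer's order.",
        "parameters": {
            "type": "object",
            "properties": {
                "items": {
                    "type": "array",
                    "items": {
                        "type": "object",
                        "properties": {
                            "name": {"type": "string", "enum": list(MENU)},
                            "quantity": {"type": "integer", "minimum": 1,
                                         "maximum": MAX_QTY_PER_ITEM},
                        },
                        "required": ["name", "quantity"],
                    },
                }
            },
            "required": ["items"],
        },
    }

    def place_order(items: list[dict]) -> dict:
        """Validate the model's proposed order before it reaches the kitchen."""
        total_qty = sum(i["quantity"] for i in items)
        if total_qty > MAX_ITEMS_PER_ORDER:
            return {"ok": False,
                    "error": f"Order too large ({total_qty} items); ask staff to confirm."}
        for i in items:
            if i["name"] not in MENU or not (1 <= i["quantity"] <= MAX_QTY_PER_ITEM):
                return {"ok": False, "error": f"Can't add {i['quantity']} x {i['name']}."}
        total = round(sum(MENU[i["name"]] * i["quantity"] for i in items), 2)
        return {"ok": True, "total": total}

Even if the model hallucinates "add 260 McNuggets", the tool call either fails JSON-schema validation or gets rejected by place_order, and the error message goes back to the model to recover gracefully.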


