Marazan | 9 months ago | on: A non-anthropomorphized view of LLMs
Anthropomorphisation happens because humans are absolutely terrible at evaluating systems that give conversational text output.
ELIZA fooled many people into thinking it was conscious, and it wasn't even trying to do that.