
As someone deeply involved in NLP, I’ve observed the field’s evolution: from decades of word counting and statistical methods to a decade of deep learning enabling “word arithmetic.” Now, with generative AI, we’ve reached a new milestone: a universal NLP engine.

IMHO, the path to scalability often involves using GPT models for prototyping and cold starts. They are incredible at generating synthetic data, which is invaluable for bootstrapping datasets, and at labeling an existing dataset. Once a sufficient dataset is available, training a smaller transformer model becomes feasible for data-intensive applications where the cost of using GPT would be prohibitive.
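
A minimal sketch of that bootstrapping loop, assuming the OpenAI Python client; the model name, label set, and texts below are placeholders for illustration, not anything specific:

    # Label a raw corpus with GPT once, then train a small model on the result.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment
    LABELS = ["positive", "negative", "neutral"]  # hypothetical label set

    def gpt_label(text: str) -> str:
        """Ask a GPT model to pick one label, for use as training data."""
        resp = client.chat.completions.create(
            model="gpt-4o-mini",  # any chat model works; pick a cheap one for bulk labeling
            messages=[
                {"role": "system",
                 "content": f"Classify the text. Reply with one of: {', '.join(LABELS)}."},
                {"role": "user", "content": text},
            ],
            temperature=0,
        )
        return resp.choices[0].message.content.strip().lower()

    unlabeled = ["The battery lasts all week.", "It broke on day two."]  # stand-in corpus
    dataset = [(t, gpt_label(t)) for t in unlabeled]

From there the (text, label) pairs feed a standard fine-tune of a small classifier (e.g. DistilBERT via Hugging Face's Trainer), which is where the per-inference cost drops.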

GPT’s capabilities in data extraction and labeling are, to me, the killer applications: they turn unstructured text into something usable for downstream tasks.
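
To make the extraction case concrete, here is a hedged sketch under the same assumptions as above; the invoice schema and input are invented for illustration:

    # Pull structured fields out of free text as JSON.
    import json
    from openai import OpenAI

    client = OpenAI()

    def extract_invoice_fields(text: str) -> dict:
        resp = client.chat.completions.create(
            model="gpt-4o-mini",
            response_format={"type": "json_object"},  # constrains output to valid JSON
            messages=[
                {"role": "system",
                 "content": "Extract vendor, date (ISO 8601), and total as a JSON object."},
                {"role": "user", "content": text},
            ],
            temperature=0,
        )
        return json.loads(resp.choices[0].message.content)

    print(extract_invoice_fields("Acme Corp billed us $1,200.50 on March 3rd, 2024."))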

This shift signifies that NLP is transitioning from a data science problem to an engineering one, focusing on building robust, scalable systems.



Reminds me of the whole Chomsky vs. Norvig debate - https://norvig.com/chomsky.html


Thanks for the link; just read it, along with the Chomsky transcript. Chomsky wanted deep structure and Norvig bet on stats, but maybe Turing saw it coming: kids talk before they know grammar, and so did the machines. It turns out we didn’t need to understand language to automate it.



