
Do those use cases need LLMs? Probably not. But if good results can be had with a day of prompting (in addition to the work mentioned in the article, which you have to do anyway), and a smaller model like Haiku gives good results, why would you build a classifier before you have literally millions of customers?

The LLM solution will be much more flexible because prompts can change more easily than training data and input tokens are cheap.
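To make the flexibility claim concrete, here's a minimal sketch of prompt-based classification. The label set, ticket example, and model name are illustrative assumptions, and the actual API call is shown only as a comment; the point is that changing behavior means editing a prompt string rather than assembling new training data.

```python
# Sketch: classify support tickets with a small LLM via a prompt.
# Labels and prompt wording are hypothetical; swap in your own taxonomy.

LABELS = ["billing", "bug_report", "feature_request", "other"]

def build_prompt(ticket: str) -> str:
    """Build the classification prompt. Changing the task means editing
    this string, not retraining a model."""
    return (
        "Classify this support ticket into exactly one of: "
        + ", ".join(LABELS)
        + ". Reply with the label only.\n\nTicket: "
        + ticket
    )

def parse_label(response_text: str) -> str:
    """Normalize the model's reply; fall back to 'other' if the reply
    is not an exact label."""
    label = response_text.strip().lower()
    return label if label in LABELS else "other"

# The actual call (not run here) would look roughly like this with the
# Anthropic SDK, using whatever small model fits your budget:
#   client = anthropic.Anthropic()
#   msg = client.messages.create(
#       model="claude-haiku-...",  # placeholder model name
#       max_tokens=10,
#       messages=[{"role": "user", "content": build_prompt(ticket)}],
#   )
#   label = parse_label(msg.content[0].text)
```

A trained classifier would need labeled examples and a retraining cycle to add a category; here, adding one means appending a string to `LABELS` and redeploying the prompt.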




> Do those use cases need LLMs? Probably not.

One of the points of the article is the importance of gathering data to support your conclusions.

> prompts can change more easily than training data

Training data is real, and prompts are not. I don’t think this is an apples-to-apples comparison.


I don't disagree that heavily numerical tasks like revenue forecasting are not a good fit for LLMs. But nor did many data scientists concern themselves with such things (compared to business analysts and the like). Software for that has been commoditized.


