Do those use cases need LLMs? Probably not. But if good results can be had with a day of prompting (in addition to the stuff mentioned in the article, which you have to do anyway), and a smaller model like Haiku gives good results, why would you build a classifier before you have literally millions of customers?
The LLM solution will be much more flexible because prompts can change more easily than training data and input tokens are cheap.
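To make that concrete, here's a rough sketch of what the prompt-based route looks like for something like ticket classification. The labels, prompt wording, and the commented-out API call are all illustrative, not any particular product's setup:

```python
# Sketch of a prompt-based classifier. Changing behavior means editing
# the prompt string below, not collecting and relabeling training data.
LABELS = ["billing", "bug report", "feature request", "other"]

def build_prompt(ticket: str) -> str:
    """Build the classification prompt; this string IS the 'model spec'."""
    return (
        "Classify the support ticket into exactly one of: "
        + ", ".join(LABELS)
        + ".\nReply with the label only.\n\nTicket: "
        + ticket
    )

def parse_label(response_text: str) -> str:
    """Map the model's free-text reply onto a known label; default to 'other'."""
    reply = response_text.strip().lower()
    for label in LABELS:
        if label in reply:
            return label
    return "other"

# The actual call would go through an LLM chat API, roughly (hypothetical):
# msg = client.messages.create(model="...haiku...", max_tokens=10,
#     messages=[{"role": "user", "content": build_prompt(ticket)}])
# label = parse_label(msg.content[0].text)
```

The point is that the "retraining" step collapses into editing a string and re-running, which is why iteration is so much faster than with a bespoke classifier.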
I don't disagree that heavily numerical tasks like revenue forecasting are a poor fit for LLMs. But few data scientists concerned themselves with such things anyway (compared to business analysts and the like), and the software to do it has been commoditized.