The a16z analysis of LLM inference costs (https://a16z.com/llmflation-llm-inference-cost/), for example, bears this out. OpenRouter's State of AI report (https://openrouter.ai/state-of-ai) makes the same observation.