What makes you think these things aren’t subsidized? It would be very impressive if Claude was making money off of their $20/month users that hit their weekly limits.
> What makes you think these things aren’t subsidized?
You can pay Amazon or a great many other hosting providers for inference for a wide variety of models. Do you think all of these hosting providers are burning money for you, when it’s not even their model and they have no lock-in?
> It would be very impressive if Claude was making money off of their $20/month users that hit their weekly limits.
They have been adjusting their limits frequently, and the whole point of those limits is to control the cost of servicing those users.
Also:
> Unit economics of LLM APIs
> As of June 2024, OpenAI's API was very likely profitable, with surprisingly high margins. Our median estimate for gross margin (not including model training costs or employee salaries) was 75%.
> Once all traffic switches over to the new August GPT-4o model and pricing, OpenAI plausibly still will have a healthy profit margin. Our median estimate for the profit margin is 55%.
> As of June 2024, OpenAI's API was very likely profitable, with surprisingly high margins. Our median estimate for gross margin (not including model training costs or employee salaries) was 75%.
> Once all traffic switches over to the new August GPT-4o model and pricing, OpenAI plausibly still will have a healthy profit margin. Our median estimate for the profit margin is 55%.
"likely profitable", "median estimate"... that 75% gross margin is not based on hard numbers.