
The issue in the article was paying customers complaining about ads. The ads OpenAI wants to roll out would likely be for free users, since training and running these LLM systems is very expensive.

From the tweet in your linked post:

> This could help OpenAI give free users more generous usage and features, while users on paid plans stay ad free, which fits with the high costs of running ChatGPT and the revenue they expect from shopping and ad related features





You'd have to be pretty dumb to believe ads are only for the free tier. Look at literally every subscription streaming service. They all have ads on paid tiers now.

They will put ads in the paid ChatGPT tiers. That is an absolute certainty. The only question is how long they will tolerate un-advertised eyeballs on paid plans.


> They all have ads on paid tiers now.

Yup, because people who pay for subscriptions are far more valuable ad targets than people who might be too poor or too disciplined to convert on the advertised products.

And the more you pay for a subscription, and the more other purchases they can correlate you making behind the scenes once they have a fingerprint for your identity, the more valuable your eyeballs become, and the more challenging it becomes to resist selling them on the ad market.

Even if a service you subscribe to isn't placing obvious ads in front of your face today and promising they never will, they're 100% strategizing ways to either make the ads less obvious or to sell your data upstream so that the ads you see elsewhere are more convincing. Better hope you like buying stuff!


Netflix's paid-with-ads plan costs 50% less than the standard ad-free paid plan.

I could see ChatGPT search results having affiliate links for shopping stuff even for fully-paid users.

There's a lot of competition in this space, so we'll see what users tolerate. But it's going to be tough getting around the fact this stuff is expensive to run.

Things like this are only 'free' for a reason.


>this stuff is expensive to run

What's expensive is innovating on current models and building the infrastructure. My understanding is that inference is cheap and profitable. Most open-source models cost less than a dollar per million tokens, which makes me think SotA models have a similar price point, just with more profit margin.
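
To put rough numbers on that (everything below is an illustrative assumption, not an actual OpenAI price or usage figure), here's a back-of-envelope sketch of raw inference cost for one active free user:

    # Back-of-envelope inference cost for one free user.
    # All constants are assumptions for illustration, not OpenAI's real numbers.
    PRICE_PER_MILLION_TOKENS = 1.00  # assumed $/1M tokens, in line with cheap open-weight models
    MESSAGES_PER_DAY = 20            # assumed messages per day for an active free user
    TOKENS_PER_MESSAGE = 2_000       # assumed prompt + completion tokens per message

    daily_tokens = MESSAGES_PER_DAY * TOKENS_PER_MESSAGE
    monthly_cost = daily_tokens * 30 * PRICE_PER_MILLION_TOKENS / 1_000_000
    print(f"~{daily_tokens:,} tokens/day -> ~${monthly_cost:.2f}/month in raw inference")
    # ~40,000 tokens/day -> ~$1.20/month in raw inference

Whether a number like that counts as "cheap" depends on how many hundreds of millions of free users you multiply it by.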


I can assure you that inference is not profitable if the user is paying nothing.

DAU/MAU stats of free users have already carved out multi-millionaire and billionaire fortunes for employees and executives, all paid out with VC money. Plenty of people are profiting, even if the corporation is deep in the red.

Look at the data points everywhere pointing toward enshittification.

> The issue in the article was paying customers complaining about ads.

They will just introduce a cheaper, ad-supported tier, hike its price to the previous price of the ad-free tier, and slow-boil the user base.


There's no point doing that. The Responses API has to stay ad-free, unlike the ChatGPT web app, for applications to function correctly (there's no way to bake ads into responses sent to third-party services that use your language model purely as a natural language processor). And you have to keep the paid web tiers, which cost more than the same number of tokens through the Responses API, ad-free as well, because otherwise the "wrong way of payment" paradox would arise.
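
For illustration (a minimal sketch assuming the official openai Python SDK and its Responses API; the classification task and JSON schema are made up), this is the kind of third-party use where injected ad copy would simply break the application:

    # Minimal sketch assuming the openai Python SDK's Responses API.
    # The task and expected JSON schema are hypothetical.
    import json
    from openai import OpenAI

    client = OpenAI()
    resp = client.responses.create(
        model="gpt-4o-mini",
        input='Classify the sentiment of "great product, slow shipping". '
              'Reply with JSON only, e.g. {"sentiment": "positive"}.',
    )

    # Downstream code consumes the model output programmatically;
    # any ad text appended to it would just make this parse fail.
    result = json.loads(resp.output_text)
    print(result["sentiment"])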

> The ads OpenAI wants to roll out would likely be for free users

At first. The scream going through the hallways at HQ must be along the lines of: "Nonononono! Not yet!"



