Hacker News

Yes. Each query needs to generate enough revenue to pay for the cost of running the query, a proportional share of the cost to train the original model, overhead, SG&A, etc. just to break even. Few have shown a plan to do that or explained in a defensible way how they’re going to get there.
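The break-even condition described above can be sketched as a simple unit-economics calculation. Every number below is a made-up placeholder for illustration, not a claim about any real model's costs:

```python
# Per-query break-even sketch. All figures are hypothetical.

def breakeven_price_per_query(
    inference_cost: float,      # marginal compute cost to run one query
    training_cost: float,       # one-time cost to train the model
    expected_queries: float,    # queries over which training is amortized
    overhead_per_query: float,  # SG&A, support, etc. allocated per query
) -> float:
    """Minimum revenue per query needed just to break even."""
    amortized_training = training_cost / expected_queries
    return inference_cost + amortized_training + overhead_per_query

# Illustrative: $0.002 inference, $100M training spread over 50B queries,
# $0.001 overhead -> need about half a cent of revenue per query.
price = breakeven_price_per_query(0.002, 100e6, 50e9, 0.001)
print(round(price, 4))  # 0.005
```

The point of the sketch is that the break-even price is dominated by whichever term is largest, and the amortized training term depends entirely on a query-volume assumption that is itself a guess.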

A challenge at the moment is that a lot of the AI movement is led by folks who are brilliant technologists but have little to no experience running viable businesses or building a viable business plan. That was clearly part of why OpenAI had its turmoil: some were trying to be tech purists, while others knew the whole AI space will implode if it’s not a viable business. To some degree that seems to be behind a lot of the recent chaos inside these larger AI companies.



It's always fascinating to see. The business portions are much easier to solve for with an engineering mindset, but it seems to be a common issue that engineers never take them into account.

This is saying nothing about "technologists" (or, as they're starting to be derided, "wordcels": people who communicate well but cannot execute anything themselves).

It would be... not trivial, but straightforward to map out the finances on everything involved, and see if there is room from any standpoint (engineering, financial, product, etc.) to get queries to breakeven, or even profitable.

But at that point, I believe the answer will be "no, it's not possible at the moment." So it becomes a game of stalling, and burning more money until R&D finds something new that may change that answer (a big if).


> the whole AI space will implode if it’s not a viable business

if the Good Lord's willing and the creek don't rise


Inference will continue to shift toward profitability. Google and the rest are going to eat the cost to gain traction, but as TPU costs come down, GPU capacity scales up, and models become more efficient, the cost to infer will drastically fall into profitable territory.


This comment made me laugh because I thought of NVIDIA during the crypto boom and now the AI boom.

They played their cards so damn right when deep learning was taking off.


Many here know I personally despise AI, but to take off my tech critic hat for a moment, if subscription models don't work out, we still ultimately do have a bunch of pretty powerful base models to build on. If you can push inference to the edge and run on local machines, you can make your users bear the cost.



