
> They're in the "get 'em hooked" stage of the drug deal.

You're implying that people are selling inference at below cost right now. That's certainly not true for most third-party inference providers. I doubt API pricing at Anthropic or OpenAI is being sold below cost either.

The only place you get what you're talking about is the fixed-price plans OpenAI, Anthropic, Cursor, etc. sell.



OpenAI is claiming they'll post $74 billion in losses through 2028. Anthropic is on course to lose $3 billion by the end of this year, and they lost $5 billion last year.

As far as I can tell the inference provider landscape is a fucking mess, and I can't find any decent financial information on any of the ones I tried. So unless you have something showing those companies are profitable, I'm not buying it.


Their big spend isn't inference. It is the training, which they can pull back on at any time.

Inference itself will keep getting cheaper.


Even if you eliminate all of OpenAI's costs besides inference, they're still in the red. And they can't just stop training new models. That's like saying Honda could just quit designing new cars. They technically could, but it would destroy their business.

They have one method of monetization right now, and there is no clear evidence that their costs are suddenly going to decrease anytime soon. Despite claims to the contrary, no one has actually provided any evidence of a pathway to those costs magically being cut in half over the next few years.

The entire industry is being propped up by insane over investment and an obsession with growth at all costs. Investments will dry up sooner or later, and you can't grow forever.


Inference keeps getting cheaper, so "it isn't cheap enough yet" isn't an issue. Even with zero efficiency innovations from here, cost per instruction is the most deflationary commodity of all time.

So how was that ever going to be a problem?

At the beginning of a new tech cycle, the optimal choice is to run in the red on marginal costs, since those costs will naturally drop on their own. It would be a sign of gross incompetence if they were fine-tuning those costs already.

Training spend is the giant expense. Either training costs are unsustainable, in which case training spend will hit a pause, or they are sustainable and training spend will continue.

So, which is it?

Critical point: The majority of their costs are not required to serve the highest level of capability they have achieved at any given time.

That is unusual, in the sense that it is an exceptionally healthy cost-control structure. Note that not even open source offers a cost advantage, for training or inference.



