> The cost of inference -- ie $ that go to your llm api provider
This is the crux of it: when talking about "the cost of inference" for the purposes of the unit economics of the business, what's being discussed is not what they charge you. It's their COGS (cost of goods sold) -- what it costs *them* to serve the tokens.
That's not word games; it's about being precise about which quantity is under discussion.
Rising prices are worth talking about too! But that's a different thing. And what you're describing here is total spend, not individual prices going up or down -- that's a third thing again.
You can't come to agreement unless you agree on what's being discussed.
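To make the distinction concrete, here's a toy sketch with entirely hypothetical numbers. It separates the three quantities the thread keeps conflating: the provider's COGS (their unit economics), the price per token (what you're charged), and total spend (your bill). The point: per-token price can fall while total spend rises, and neither tells you the provider's margin.

```python
# All numbers below are hypothetical, for illustration only.
def unit_economics(cogs_per_m, price_per_m, tokens_m):
    """Return (provider margin per 1M tokens, customer total spend).

    cogs_per_m  -- provider's cost to serve 1M tokens (their COGS)
    price_per_m -- price charged per 1M tokens
    tokens_m    -- volume consumed, in millions of tokens
    """
    margin = price_per_m - cogs_per_m      # the provider's unit economics
    total_spend = price_per_m * tokens_m   # the customer's bill
    return margin, total_spend

# Year 1: $6/M COGS, $10/M price, 100M tokens consumed.
m1, s1 = unit_economics(6.0, 10.0, 100)
# Year 2: COGS and price both drop, but usage grows 10x.
m2, s2 = unit_economics(2.0, 5.0, 1000)

print(m1, s1)  # margin $4/M, total spend $1,000
print(m2, s2)  # margin $3/M, total spend $5,000 -- price fell, spend rose
```

In this toy scenario the price per token halved, yet total spend quintupled -- which is exactly why "inference got more expensive" is ambiguous until you say which of the three quantities you mean.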