
I'm not sure you read what I just said. Almost no one using ChatGPT would care if they were still talking to GPT-5 two years from now. If compute per watt doubles in the next two years, then the cost of serving GPT-5 just got cut in half purely on the hardware side, not to mention we are getting better at making smaller models smarter.


I don't really believe that premise in a world with competition, and the strategy it supports -- letting AI companies profit off of old models -- ignores the need for SOTA advancement and expansion by these very same companies.

In other words, yes, GPT-X might work well enough for most people, but the newer demo for ShinyNewModelZ is going to pull GPT-X's customers away regardless, even if both fulfill the customer's needs. There is a persistent need for advancement (or at least marketing that indicates as much) in order to have positive numbers at the end of the churn cycle.

I have major doubts that can be done without pushing new features or SOTA models, short of straight-up lying or deception.


People cared enough about GPT-5 not being 4o that OpenAI brought 4o back.

https://arstechnica.com/information-technology/2025/08/opena...



