necovek | 87 days ago | on: OpenAI's H1 2025: $4.3B in income, $13.5B in loss
Someone brought up an interesting point: to get the latest data (news, scientific breakthroughs...) into the model, you need to constantly retrain it.
Ianjit | 87 days ago
Incremental compute costs will scale with the incremental data added, so training costs will grow at a much slower rate than they did when training was GPU-limited.
fennecbutt | 87 days ago
Or, you know, use RAG, which is far better and more accurate than regurgitating compressed training knowledge.
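For context, RAG (retrieval-augmented generation) means fetching relevant documents at query time and placing them in the prompt, instead of baking new facts into model weights via retraining. A minimal sketch of the idea, using a toy bag-of-words similarity for retrieval (real systems use learned embeddings and a vector store; the documents and prompt format here are made up for illustration):

```python
from collections import Counter
import math

def embed(text):
    """Toy 'embedding': a bag-of-words count vector (stand-in for a real embedding model)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, docs, k=1):
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

def build_prompt(query, docs):
    """Prepend retrieved context so the model answers from fresh data,
    not from whatever was compressed into its weights at training time."""
    context = "\n".join(retrieve(query, docs))
    return f"Context:\n{context}\n\nQuestion: {query}"

# Hypothetical corpus: fresh facts the model was never trained on.
docs = [
    "OpenAI reported $4.3B revenue in H1 2025.",
    "The sky is blue.",
]
print(build_prompt("What was OpenAI's H1 2025 revenue?", docs))
```

The prompt built here would then be sent to the LLM; updating the corpus keeps answers current without any retraining run.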
gmerc | 87 days ago
Oh please