Hacker News
skeptrune on Feb 16, 2024 | on: Training LLMs to generate text with citations via ...
This is soooo much more exciting than the "put 100M tokens in the context window" idea
bugglebeetle on Feb 16, 2024
I would say that large context sizes, combined with the ability to reliably cite that context, are the best possible outcome.
amelius on Feb 16, 2024
ELIZA was so much more exciting than the "put 100B artificial neurons on a GPU" idea. ;)