
My hot (maybe just warm these days) take: the problem with voice assistants on phones is that they have to give reasonable responses to a long tail of requests, or users will learn not to use them, since the use cases aren't discoverable and the primary value is being able to talk to it like a person.

So voice assistants backed by very large LLMs over the network are going to win, even if we solve the (substantial) battery usage issue of running models on-device.

Why even bother with the text generation then? You could just make a phone call to an LLM with a TTS frontend, like directory enquiries back in the day. That can be set up about as easily as a BBS if you have the kind of home server rack Jeff Geerling makes YouTube videos about.
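A minimal sketch of that call-an-LLM loop, assuming the telephony layer has already saved the caller's turn as a WAV file and will play the reply back, and using the OpenAI Python SDK (you could swap in local speech-to-text/LLM/TTS models on the home server instead):

    # Toy phone-call pipeline: transcribe the caller, ask the model,
    # synthesize the answer. File names are placeholders for whatever
    # the telephony side (Asterisk, a hosted number, etc.) hands you.
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # 1. Speech-to-text on the caller's recording
    with open("caller.wav", "rb") as audio:
        transcript = client.audio.transcriptions.create(
            model="whisper-1",
            file=audio,
        )

    # 2. Hand the transcript to the LLM
    reply = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "You are a phone assistant. Keep answers short."},
            {"role": "user", "content": transcript.text},
        ],
    )
    answer = reply.choices[0].message.content

    # 3. TTS the answer so the telephony layer can play it back
    speech = client.audio.speech.create(model="tts-1", voice="alloy", input=answer)
    with open("reply.mp3", "wb") as out:
        out.write(speech.content)

The LLM round trip is just three API calls; wiring it into an actual phone line is the part that's genuinely BBS-style hobby work.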

1-800-CHATGPT exists; I use it often enough that I've put it on speed dial on my flip phone.


