
As risky as any other health-related self-help, plus the added risk of unreliability.

When GPT proves itself to be reliably beneficial, therapists will use it or recommend it themselves. Until then it's an experimental tool at best.



I would say self-help is quite unreliable already; a bit more unreliability doesn't make it much worse.

The authority argument is pointless. For it to apply, the therapist must value the person's wellbeing above their own continued income. In theory they should, but it would take a lot to convince me, and I would want to know what the incentive behind such a recommendation is. And to be clear, I'm not saying an LLM can be your therapist.



