Hacker News

It reminds me of the Mirror Self-Recognition test. As humans, we know that a mirror is a lifeless piece of reflective metal. All the life in the mirror comes from us.

But some of us fail the test when it comes to LLMs, mistaking the distorted reflection of humanity for a separate sentience.



Actually, I propose you're a p-zombie executing an algorithm that led you to post this content, and that you are not actually a conscious being...

That is, unless you have a well-defined means of explaining what consciousness/sentience is, without saying "I have it and X does not," that you care to share with us.


Thing is, other humans have the same biology as ourselves, so saying they're not conscious would mean we (or really just I) are special somehow in a way that isn't based on biology. Otherwise the metaphysical conclusion is solipsistic: only you (I, whoever) exist, hallucinating the entire universe.


I found Bostrom's Superintelligence to be the most boring sci-fi I have ever read.

I think it's probably possible to create a digital sentience. But LLMs ain't it.


This very much crystallized some recurring thoughts I've had about how useless a Turing-style test is, especially if the tester really doesn't care. Great comment. Thank you.


I always thought the Turing test was kind of silly because most humans would tell the interviewer to bugger off.



