
Or in my case, a prediction machine that is more often wrong than right.


Frequency of correctness is secondary to quality of prediction.

That is, even if your brain is only right some of the time, what matters is that it's right at the moments that count.

Constantly making small mistakes is the price to pay for successfully surviving when it matters.


The brain is a survival engine, not a truth detector. If lies improve fitness, the brain lies.


Exactly. As a matter of fact, it constantly lies to us, projecting a convenient reality and narrative so that we eagerly await the next day.


Lying to oneself and facing the complete truth are two alternative survival strategies.

That is why some people have brains that lie to them less than others, while other people's brains lie more. You will find different degrees of bias and delusion in the people you encounter.


At which point you run into problems of defining agency/free will.


Free will comes down to the particle level. If atoms obey strict deterministic physical rules and have no free will, and our brains are made of the same atoms, does that mean our brains are bound by the same strict physical rules?

What if atoms are not governed by deterministic rules but by probabilistic ones? Are our brains then bound by the same probabilistic rules?

Since all brains are made of the same stuff as the rest of the universe, the question of free will looks more like a physics question: what is the universe made of, and does that fundamental unit allow for free will?


I was alluding more to the issue of deciding how responsible people should be held for their actions, for societal purposes.


That's one brilliant observation - love it!


There's also subconscious lying: the brain hides the blind spot where the optic nerve attaches to the retina, switches off vision while our eyes move, and toys with our perception of time so it seems there was no gap in vision.


Is that one of its right predictions? Or one of the wrong ones?

A bias toward wrong predictions can be adaptive. If a wild animal comes across something that is 99% safe and 1% fatal, it could well be better off predicting doom and fleeing just in case.
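
A quick back-of-the-envelope sketch of that asymmetry, with purely illustrative numbers (not from any study): if fleeing needlessly costs a little and staying put when the threat is real costs a lot, the "predict doom" policy can have the lower expected cost even though it's wrong 99% of the time.

    # Illustrative expected-cost comparison; all numbers are made up.
    p_fatal = 0.01          # chance the encounter is actually dangerous
    cost_flee = 1.0         # wasted energy from fleeing needlessly
    cost_caught = 1000.0    # cost of staying when it really was dangerous

    expected_cost_always_flee = cost_flee            # 1.0
    expected_cost_stay = p_fatal * cost_caught       # 10.0

    # True: always fleeing is cheaper on average, despite being "wrong" 99% of the time.
    print(expected_cost_always_flee < expected_cost_stay)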


The fact that you've made it this far in life, and continue to exist, is pretty irrefutable proof that your brain gets it mostly right, no matter how often you feel wrong.


Unless you're bumping into walls or tripping over your own feet all the time, that's probably not true.


Yeah. If you can find your mouth with a forkful of food, then your brain is predicting well. If you can drive a car in traffic without becoming an immediate public hazard, then your brain is great at predicting.


Scale and context matter. The brain's number one job is to predict which actions will keep you alive long enough to reproduce, and it has done that pretty well. The more abstract you get, the less accurate it will be.


Which could also be a bug in the conclusion algorithm / reward function that evaluates right from wrong after the fact.


In cases where there are more than two options, that's still not necessarily a bad track record.



