
> Given that uncertainty, we can't rule out the chance of our current AI approach leading to superintelligence

I think you realise this is the weak point. You can't rule out the current AI approach leading to superintelligence, but you also can't rule out a rotting banana skin in your bin spontaneously gaining sentience. Does that mean you shouldn't risk throwing away that skin? A possibility that outlandish needs at least some positive reason to rule it in. So it goes with current AI approaches.



Isn't the problem precisely that uncertainty, though? We have many data points showing that a rotting banana skin will not spontaneously gain sentience, but no clear way to predict the future of AI. And we have no way of knowing the true chance of superintelligence arising from the current path of AI research — whether it's 1-in-100 or 1-in-1e12 is itself part of the uncertainty, and people are biased in all sorts of ways about where on that continuum the true risk lies.


>And we have no way of knowing the true chance of superintelligence arising from the current path of AI research

What makes people think that future advances in AI will continue to be linear instead of falling off and plateauing? Don't all breakthrough technologies develop quickly at the start and then fall off in improvements once all the 'easy' gains have been made? In my opinion, AI and AGI are like the car and the flying car: people saw continuous improvements in cars and assumed that rate of progress would continue indefinitely, leading to cars that could not only drive but fly as well.


We already have flying cars. They’re called airplanes and helicopters. Those are limited by the laws of physics, so we don’t have antigravity flying vehicles.

In the case of AGI we already know it is physically possible.


There are lots of data points of previous AI efforts not creating superintelligence.



