Hacker News | MrGagi's comments

Funny enough, there is a commit where he pretends to be Linus: https://github.com/jayphelps/git-blame-someone-else/commit/e...


I was learning a new framework and knew what I wanted to achieve, but had no idea how. It was easier to have a conversation with GPT than to go through the documentation. Although it sometimes gave examples from an older version :)


I would really love to hear from people who work at Tesla or SpaceX how they feel about this culture. Are there any public articles about it?


It isn't a culture. It's more hours for no more money. It is simply wage theft.


Yeah, totally agree, good point.


“It would be a real mistake to think that when you’re teaching a child, all you are doing is adjusting the weights in a network.”


This is what I think is going to hold us back. The worry, perhaps subconsciously, of discovering that consciousness is much less complex and exceptional than we might want it to be.

I have this inkling because I see a pattern: people, even incredibly smart people, explaining it away with a “but that would diminish the exceptionality of our own minds!” type of argument.

I’m struggling for the right words. Specialness, maybe? That people have this need to think that it’s special and not so easily replicated. And if it’s rather simple then it surely must not be the same because that wouldn’t be special.


Turing saw the same pattern of thought. That's why he pre-emptively refutes it in the Turing test paper. He describes the argument like so:

> The consequences of machines thinking would be too dreadful. Let us hope and believe that they cannot do so.[1]

1. https://redirect.cs.umbc.edu/courses/471/papers/turing.pdf


The idea you’re talking about is called “human exceptionalism”. It’s the idea that humans are in some way unique when it comes to being, thinking, and feeling.


Thank you. That sounds like what I’m trying to describe.


Yeah, I agree. We still don't know much about our minds, and it might turn out that we are not that special.


Or rather, that our specialness cannot be based on intelligence.


It's not replicating it that is impressive. That part is easy. Of course consciousness wouldn't be that hard. Of course it would be mostly an emergent property of various building blocks. Of course homo sapiens would eventually be able to do the same thing.

The impressive part is replicating it and then teaching it real Love. Love is a choice. Producing life that chooses to make that choice and defend that choice is impressive.


I think this is the same problem. How would you measure when a human is taught love? How would you differentiate that from measuring when one of these programs is taught love?

It feels like it always boils down to: 1) you can’t. Or 2) “trust me, it’s different.”


It's simple - observe what the candidate does when they have to show true love in order to move on to the next level. If they fail, they don't move on to the next level.


That works until the AI is crashing both trolleys at the very end to get to Meryl Streep.


Or maybe it is? How does he know?


It's crazy how persistently bots keep trying to get in; this one bot from China just won't quit. fail2ban really saves our ass.
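In case it helps anyone, the fail2ban setup for this is tiny. A minimal sshd jail looks something like the following (values are illustrative, tune to taste; local overrides go in `jail.local` rather than editing `jail.conf`):

```ini
# /etc/fail2ban/jail.local
[sshd]
enabled  = true
# Ban an IP for 1 hour after 3 failed logins within a 10-minute window.
maxretry = 3
findtime = 10m
bantime  = 1h
```

`fail2ban-client status sshd` then shows the currently banned IPs.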

