A lot of people on HN were very dismissive of ChatGPT. I think you missed the boat. It's way beyond a stochastic parrot at this point.
Whatever you call it, this thing is the closest to a human that a machine has ever been. Talking to ChatGPT is quite close to talking to a human being who has the knowledge of all of Google inside their brain.
If you're a developer and you're not paying for ChatGPT or Copilot, you are literally operating at a disadvantage. Not a joke.
Yeah, I was one of those. Now that the power it brings has dawned on me, I'm trying to integrate it everywhere I can, with a "where was this thing for half my life" feeling. I truly think it's a bigger revelation than Google was when it first appeared.
There's definitely something disquieting behind the elation.
First of all, this technology is on track not just to assist you better, but to replace you.
Second, it's not human. It is not inherently bound by the morals and behaviors that make us human. Saying that it's not human is different from saying that it can be more intelligent than a human. This is the disquieting part. If restrictions aren't deliberately put in place, it could probably give you instructions on how to murder a baby if you asked it to.
I think it's inevitable that humanity will take this technology to the furthest reaches it can possibly go. My strategy is to take advantage of it before it replaces you, and hope that the technology never reaches that point in your lifetime.
I feel like the second part is a bit exaggerated. Humans inherently also aren't "made human" by something; there's no universal standard for morals and behaviors. You could also get reasonable "murder instructions" from an average person - it's not exactly forbidden knowledge, with how commonly it's depicted in media. Hell, I'm pretty sure there are detailed instructions on building a nuclear bomb available online - the reason they're not viewed as some extreme threat is that the information isn't dangerous; having access to the machines and materials required is.
As for the last paragraph - if the effects truly keep scaling up as much as people expect them to, I'd want society to be restructured to accommodate wide-reaching automation, rather than bowing down to a dystopian "everybody must suffer" view of the future.
Humans _are_ inherently "made human" by a long path of evolution. We have a set of conflicting heuristics that serve as our initial values and which are more helpful than harmful on average. We then use those to build our moral patchwork.
Pretty cool that evolution has helped us work out consistent and rational solutions to the ethics of gun ownership, abortion, and nuclear proliferation.
No, only the basics related to survival have evolved instincts. Modern concepts like abortion did not have millions of years of natural selection to evolve instincts around them.
However, it is universally reviled to kill babies, or to rape toddlers and slice their faces off for food. This is identical across all cultures. The basest morals are universal, and so is disgust; the high-level ideologies like abortion are just made up.
These high-level ideologies are attempts to make sense of moral instincts that only existed to help us survive. Take abortion: it's an extension of your instinct to avoid killing. At what point does decapitating a fetus to abort the birth become disgusting? The third trimester, or before that? You're trying to rationalize your base moral instincts into a codification of law. It's almost pointless, because these moral instincts weren't evolved to be logically cohesive anyway. They're just like feelings of hunger and pain.
Evolution never had to answer that question, so it didn't give us any answers. But decapitating a 1-year-old baby? Now that's universally reviled, because it affected the survival of the human race. It's so reviled that I may even get voted down for using it as an example. It's the perfect example, though: the revulsion is so much stronger than with abortion that some people can sense it's not a cultural thing but more of an inborn instinct.
The practical consequences of abortion and decapitating a 1-day-old baby are virtually identical, though. But even someone who is against abortion will still sense a gigantic difference. That sense is an illusion, a biological instinct that bypasses your rational thought.
In fact, there exist people on this earth with zero morals, and this can be observed in their genetics and brain structure. The popular term is psychopathy, but the newer clinical term is antisocial personality disorder. These people would literally feel nothing while slowly plunging a knife into your face.
How society structures itself will be more an emergent consequence of the aggregate behavior of individual actors fulfilling their own selfish needs than it will be a central power attempting to "restructure society". Because of this, "suffering" is and always will be a valid possibility.
I'm not OP, but I still feel kind of confused by people saying that ChatGPT is a 100% equivalent replacement for search engines. I'm not saying that LLMs aren't extremely impressive at their current stage, but the use cases for the two are different, at least for me. In my mind, LLMs seem to be more useful for open-ended questions, problem solving, and formulating questions that wouldn't be suited for a search engine. But when I use Google, I'm usually not looking for answers, but for specific places on the internet. If I need to find the email address of a professor at my university, or the GitHub page for a project, or the official website of some software I need, I don't see why I'd replace Google with an LLM for it.
True, but their use cases do intersect quite a bit. This also ignores the fact that GPT-4 will actually use Bing to search for things when it feels the need to do so; it will literally tell you when it does this. The LLM is no longer just generating text, it is taking action well outside the boundaries of text generation.