
> If it's 2023 and on, I dust off my 90's "everything on the World Wide Web is wrong" glasses.

Because misinformation written by humans didn't exist before LLMs?



Back in the 90s, when the Internet became a thing, it was common knowledge that, because normal people made websites, you should take things with a grain of salt. There was a bit of an overreaction to this, as the general feeling at the time was to trust nothing on the Internet.

In the 00s and 10s, the quality of discoverable content improved: Reddit and Stack Exchange had experts (at a higher rate than the rest of the net, at least). It was the era when search was good, Google wasn't evil (good results, clearly separated from the ads, are entirely why they won against AskJeeves and Yahoo), and SEO was still gestating in Adam Smith's wastebasket.

Now Google and Bing are polluted with SEO-optimized content farms designed to waste your time and show you as many ads as possible. They hunger for new content (for the SEO gods demand regular posts), and the cheapest way to feed that hunger is an underpaid "author" spewing out a GPT-generated firehose of content.

SEO has ruined search, and content farms have made what few usable results there are even less trustworthy.

So yes, the Internet has fundamentally changed in the last 9 months.


Reddit never had experts. Maybe for half a minute. It became an echo chamber fast: fake internet points to be gained for saying what got upvoted last week, or to be lost for saying anything different.


If all you do is browse the (default) home or all feeds, then sure, it's just a stupid echo chamber obsessed with hating the things it's cool to hate. That's not where the value is on Reddit, and it's not what people are referring to when they say they search Reddit for answers. If you're looking for product reviews, it's not perfect, but it's tough to find anywhere better unless you happen to know exactly the right hidden gem of a forum for your particular subtopic (and the link to that hidden-gem forum is probably easier to find on the relevant subreddit than on Google).


Reddit may not have experts per se, but in the right subreddits it definitely has enthusiasts. In ages gone by, you'd find the same people on message boards or forums, talking up and comparing the minute details of this or that. There's obviously the same risk of cargo culting that there's always been, but there's genuinely useful information available from people who spend way more time than the common man on their area of interest.


I think, at least on programming language subreddits, there are people who deserve to be labeled experts. r/cpp has some frequent users who work on standards proposals or compiler features. There are also subreddits dedicated just to communicating with experts, like r/askdocs.


Two things can be true at the same time: "Reddit is prone to karma-driven bullshittery" and "Reddit content is generally significantly higher-quality than SEO content farms".

With Reddit you might get inane arguments and bandwagoning about what the best game strategy is, but you're exceedingly unlikely to read about a game mechanic that was straight-up hallucinated by an LLM.


For now. Wait till legions of bot redditors infiltrate real subs and make their own.


Some subreddits (like askscience) at least asked for a copy of your diploma (in a science field) if you wanted flair. It was actually an awesome subreddit.


Hobby subreddits and technical subreddits had a lot of experts. I frequented r/excel for work a lot, and those guys were WIZARDS.


I want to politely disagree with that. I've found it to frequently be a resource on par with Stack Exchange.


Prior to LLMs, generating plausible-sounding misinformation took actual effort - not much, but the marginal cost was meaningfully above zero. With LLMs pushing the cost of bullshit vanishingly close to free, we're going to tip into an era where uncurated LLM confabulation dominates free information.

There's "one loony had a blog" levels of wrong, and then there's "industrial scale bullshit" levels of wrong and we are not prepared for the latter.


Because hiring humans to write misinformation costs more money than $20/mo? Like what are you even trying to say?

Before LLMs, a $3000 camera had fake reviews on Amazon, and you got fake news about politicians. But you could safely assume that "bg3 silver ingot" information was real, since hiring someone to make up silver ingot lore would never make the money back.

Not any more.


The price to produce text without caring if it's true has gone down, leading to more production.


GP is literally reminding you of a time when online misinformation was rampant, but before search engines (temporarily) did a better job than overwhelmed curators.



