Regardless, the amount of false information on the internet used to be limited by the number of people producing it. Some of it is deliberately produced misinformation. Some of it is just mistakes, whether from carelessness or from willful negligence. An example of the latter is content farms, which make their money by flooding search engine result pages and pushing ads in your face when you visit, and which couldn't care less whether you got what you came for.
Despite all that, the internet has stayed useful: enough people post accurate information that the signal-to-noise ratio remains good enough.
LLMs will change all that. They have the potential to flood the internet with hallucinated falsehoods. And because bad LLMs are cheaper and easier to create and operate, they will be the ones the aforementioned content farms use.
Well, you can be sure the SEO-driven content farms will pick the path of least resistance.
They won't make up a lie when telling the truth is far easier; even when they do have to lie, it takes only slightly more effort.
But that's a moot point with LLMs ...
If/when they use LLMs, they will pick the cheapest one possible, which will produce the most garbage. They might not even bother keeping it up to date; why would they, when hallucinated BS sounds just as convincing?
Common knowledge.