For one month beginning on October 5, I ran an experiment: Every day, I asked ChatGPT 5 (more precisely, its “Extended Thinking” version) to find an error in “Today’s featured article”. In 28 of these 31 featured articles (90%), ChatGPT identified what I considered a valid error, often several. I have so far corrected 35 such errors.


No surprise.
Wikipedia ain’t the bastion of facts that lemmites make it out to be.
It’s a mess of personal fiefdoms run by people with way too much time on their hands and an ego to match.
Disagree. Wikipedia is a pretty reliable bastion of facts thanks to its editorial demands for citations, rigorous style guides, etc.
Can you point out any of these personal fiefdoms so we can see what you’re referring to?
Yeah, better to use grokpedia /s
I know this is sarcasm, but I’m flagging it in case people don’t know.
Oh Jesus Christ no. At least Wikipedia has some form of oversight from multiple sources and people.