1/31/2026 at 11:08:17 PM
> That means the article contained a plausible-sounding sentence, cited to a real, relevant-sounding source. But when you read the source it’s cited to, the information on Wikipedia does not exist in that specific source. When a claim fails verification, it’s impossible to tell whether the information is true or not.

This has always been a rampant problem on Wikipedia. I can't find any indicator that this has increased recently, because they're only investigating articles flagged as potentially AI. So what's the control baseline rate here?
Applying correct citations is actually really hard work, even when you know the material thoroughly. I just assume people write stuff they know from their field, then mostly look to add the minimum number of plausible citations after the fact; most people never check them, and everyone seems to just accept it's better than nothing. But I also suppose it depends on how niche the page is, and which field it's in.
by crazygringo
1/31/2026 at 11:54:43 PM
There was a fun example of this that happened live during a recent episode of the Changelog[1]. The hosts noted that they were incorrectly described as being "from GitHub" with a link to an episode of their podcast which didn't substantiate that claim. Their guest fixed the citation as they recorded[2].

[1]: https://changelog.com/podcast/668#transcript-265
[2]: https://en.wikipedia.org/w/index.php?title=Eugen_Rochko&diff...
by crabmusket
2/1/2026 at 2:28:23 AM
How did they know it was not LLM generated?
by chr15m
2/1/2026 at 1:26:10 PM
The false claim was added 7 Nov 2022 [1] while ChatGPT wasn't released until 30 Nov 2022.

[1] https://en.wikipedia.org/w/index.php?title=Eugen_Rochko&diff...
by michaelt
2/1/2026 at 4:14:45 PM
Not that it's likely, but there were publicly released LLMs, like GPT-J, that came out in 2021.
by mmcwilliams
2/2/2026 at 1:48:11 AM
Thanks!
by chr15m
2/1/2026 at 1:11:50 AM
The problems I've run into are both people giving fake citations (the citations don't actually justify the claim that's being made in the article), and people giving real citations that, if you dig into the source, turn out to come from a crank.

It's a big blind spot among the editors as well. When this problem was brought up here in the past, with people saying that claims on Wikipedia shouldn't be believed unless people verify the sources themselves, several Wikipedia editors came in and said this wasn't a problem and Wikipedia was trustworthy.
It's hard to see it getting fixed when so many don't see it as an issue. And framing it as a non-issue misleads users about the accuracy of the site.
by gonzobonzo
2/1/2026 at 12:51:37 PM
A common source of error is in articles for movies where it gives plot summaries. The plot summaries are very often written by people who didn't watch the movie but are trying to reassemble the plot like a jigsaw puzzle from little bits they glean from written reviews, or worse, just writing down whatever they assume to be the plot. Very often it seems like the fuck-ups came from people who either weren't watching the movie carefully, or were just listening to the dialogue while not watching the screen, or simply lacked media literacy.

Example [SPOILERS]: the page for the movie Sorcerer claims that rough terrain caused a tire to pop. The movie never says that; the movie shows the tire popping (which results in the truck's cargo detonating). The next scene reveals the cause, but only to those paying attention: the bloody corpse of a bandito lying next to a submachine gun is shown in the rubble beside the road, and more banditos are there, very upset and quite nervous, to hijack the second truck. The obvious inference is that the first truck's tire was shot by the banditos to hijack/rob the truck. The tire didn't pop from rough terrain, and the movie never says it did; it's just a conclusion you could get from not paying attention to the movie.
by mikkupikku
2/1/2026 at 3:26:27 PM
To me that sounds a bit like summaries made on the basis of written movie scripts. A long time ago, I read a few scripts for movies I had never watched, and that's exactly the outcome: you get a rough idea what it's about and even get to recognise some memorable quotes, but there's little cohesion to it, for lack of all the important visual aspects and clues that tie it all together.
by shmeeed
2/1/2026 at 4:33:03 PM
> The problems I've run into is both people giving fake citations (the citations don't actually justify the claim that's being made in the article), and people giving real citations, but if you dig into the source you realize it's coming from a crank.

Citations have become heavily weaponized across a lot of spaces on the internet. There was a period of time where we all learned that citations were correlated with higher-quality arguments, and Wikipedia’s [Citation Needed] even became a meme.
But the quacks and the agenda pushers realized that during casual internet browsing readers won’t actually read, let alone scrutinize, the citation links, so it didn’t matter what you linked to. As long as the domain and title looked relevant it would be assumed correct. Anyone who did read the links might take so much time that the comment section would be saturated with competing comments by the time they could respond with a real critique.
This has become a real problem on HN, too. Often when I see a comment with a dozen footnoted citations from PubMed, they either misrepresent what the study says or sometimes even say the opposite of what the commenter claims.
The strategy is to just quickly search PubMed or other sources for keywords and then copy those into the post with the HN footnote citation format, knowing that most people won’t read or question it.
by Aurornis
2/1/2026 at 7:45:12 AM
> but if you dig into the source you realize it's coming from a crank.

It is a dark Sunday afternoon. Bob Park is sitting on his sofa as usual, drunk as usual, when suddenly the TV reveals to him there to be something called the Paranormal (Twilight Zone music)... Instantly Bob knows there are no such things and adds a note to the incomprehensible mess of notes that one day will become his book. He downs one more Budweiser. In the distance lightning strikes a tree; Bob shouts "You don't scare me!" and shakes his fist. After a few more beers a miracle of inspiration descends and, as if channeling, in the time span of 10 minutes he writes notes about Cold Fusion, Alternative Medicine, Faith Healing, Telepathy, Homeopathy, Parapsychology, Zener cards, the tooth fairy and father xmas. With much confidence he writes that none of them are real. It's been a really productive afternoon. It reminds him of times long gone, back when he actually published many serious papers. He counts the remaining beers in his cooler and says to himself: in the next book I will need to take on god himself. The world needs to know, god is not real. I too will be the authority on that subject.
https://en.wikipedia.org/w/index.php?title=Special:WhatLinks...
by 6510
2/1/2026 at 3:19:49 PM
Curious what point you're making here. I don't know anything at all about Bob Park or whether he is a crank. But if you make your career doing the admirable work of debunking pseudo-science and nonsense theories, you would necessarily be linked to in discussions of those theories very, very frequently.

So maybe that's not a good description of him. But the link you posted is hardly dispositive.
by CPLX
2/2/2026 at 8:45:14 PM
The pseudo-science of corporate values is a contradiction in terms invented by HR ladies who drink tea for a living. People who believe such things also believe in aliens and have theories about vegetarian tigers.

You are now debunked.
(This comment is intentionally stupid, useless and the author knows nothing about the topic)
by 6510
2/1/2026 at 2:19:08 AM
LLMs can add unsubstantiated conclusions at a far higher rate than humans working without LLMs.
by chr15m
2/1/2026 at 2:26:05 AM
At some point you're forced to either believe that people have never heard of the concept of a force multiplier, or to return to Upton Sinclair's observation about getting people to believe in things that hurt their bottom line.
by EA-3167
2/1/2026 at 2:39:21 AM
I don’t see why people keep blaming cars for road safety problems; people got into buggy crashes for centuries before automobiles even existed.
by DrewADesign
2/1/2026 at 2:57:31 AM
Because a difference in scale can become a difference in category. A handful of buggy crashes can be reduced to operator error, but as the car becomes widely adopted and analysis matures, it becomes clear that the fundamental design of the machine and its available use cases have flaws that cause a higher rate of operator error than desired. Therefore, cars are redesigned to be safer, laws and regulations are put in place, licensing systems are introduced, and traffic calming and road design are considered.

Hope that helps you understand.
by nullsanity
2/1/2026 at 3:57:55 AM
Is the sarcasm really that opaque? Who would unironically equate buggy accidents and automobile accidents?
by DrewADesign
2/1/2026 at 5:25:29 AM
I’d like to introduce you to the internet. There’s a reason /s was a big thing; one person's obvious sarcasm is (almost tautologically) another person's true statement of opinion.
by obidee2
2/2/2026 at 1:25:10 AM
Thanks. I wasn’t aware of that.
by DrewADesign
2/2/2026 at 1:50:12 AM
It took me a minute to realise you were joking too! :)
by chr15m
2/1/2026 at 10:32:11 AM
How much time have you spent around developers?
by forgetfreeman
2/2/2026 at 1:21:57 AM
I got my first tech job in 1998. Some of the most sarcastic people I’ve ever met.
by DrewADesign
2/1/2026 at 12:40:15 PM
True, but humans got a 20-year head start, and I am willing to wager the overwhelming majority of extant flagrant errors are due to humans making shit up and no other human noticing and correcting it.

My go-to example was the SDI page saying that brilliant pebble interceptors were to be made out of tungsten (completely illogical hogwash that doesn't even pass a basic sniff test). This claim was added to the page in February of 2012 by a new Wikipedia user, with no edit note accompanying the change nor any change to the sources and references. It stayed in the article until October 29th, 2025. And of course this misinformation was copied by other people, and you can still find it being quoted, uncited, in other online publications. With an established track record of fact checking this poor, I honestly think LLMs are just pissing into the ocean.
by mikkupikku
2/1/2026 at 1:53:28 PM
If LLMs 10X it, as the advocates keep insisting, that means it would only take 2 years to do as much or more damage as humans alone have done in 20.
by asadotzler
2/1/2026 at 2:04:04 PM
Perhaps so. On the other hand, there's probably a lot of low-hanging fruit they can pick just by reading the article, reading the cited sources, and making corrections. Humans can do this, but rarely do because it's so tedious.

I don't know how it will turn out. I don't have very high hopes, but I'm not certain it will all get worse either.
by mikkupikku
2/1/2026 at 8:05:25 PM
The entire point of the article is that LLMs cannot produce accurate text, but ironically, your claim that LLMs can produce accurate text illustrates your point about human reliability perfectly.

I guess the conclusion is there simply are no avenues to gain knowledge.
by SiempreViernes
2/2/2026 at 12:45:35 AM
> I am willing to wager the overwhelming majority of extant flagrant errors are due to humans making shit up

In general, I agree, but I wouldn't want to ascribe malfeasance ("making shit up") as the dominant problem.
I've seen two types of problems with references.
1. The reference is dead, which means I can't verify or refute the statement in the Wikipedia article. If I see that, I simply remove both the assertion and the reference from the wiki article.
2. The reference is live and almost confirms the statement in the Wikipedia article, but whoever put it there over-interpreted the information in the reference. In that case, I correct the statement in the article, but I keep the ref.
Those are the two types of reference errors that I've come across.
And, yes, I've come across these types of errors long before LLMs.
by busyant
2/1/2026 at 12:01:07 AM
When I've checked Wikipedia citations I've found so much brazen deception - citations that obviously don't support the claim - that I don't have confidence in Wikipedia.

> Applying correct citations is actually really hard work, even when you know the material thoroughly.
Why do you find it hard? Scholarly references can be sources for fundamental claims, and review articles are a big help too.
Also, I tend to add things to Wikipedia or other wikis when I come across something valuable rather than writing something and then trying to find a source (which also is problematic for other reasons). A good thing about crowd-sourcing is that you don't have to write the article all yourself or all at once; it can be very iterative and therefore efficient.
by mmooss
2/1/2026 at 1:26:48 AM
It's not that I personally find it hard.

It's more like, a lot of stuff in Wikipedia articles is somewhat "general" knowledge in a given field, where it's not always exactly obvious how to cite it, because it's not something any specific person gets credit for "inventing". Like, if there's a particular theorem then sure, you cite who came up with it, or the main graduate-level textbook it's taught in. But often it's just a particular technique or fact that just kind of "exists" in tons of places, but there's no obvious single place to cite it from.
So it actually takes some work to find a good reference. Like you say, review articles can be a good source, survey articles or books. But it can take a surprising amount of effort to track down a place that actually says the exact thing. I literally just last week was helping a professor (leader in their field!) try to find a citation during peer review for their paper for an "obvious fact" in the field, that was in their introduction section. It was actually really challenging, like trying to produce a citation for "the sky is blue".
I remember, years ago, creating a Wikipedia article for a particular type of food in a particular country. You can buy it at literally every supermarket there. How the heck do you cite the food and facts about it? It just... is. Like... websites for manufacturers of the food aren't really citations. But nobody's describing the food in academic survey articles either. You're not going to link to Allrecipes. What do you do? It's not always obvious.
by crazygringo
2/1/2026 at 3:36:36 PM
If you can buy the food at a supermarket, can't you cite a product page? Presumably that would include a description of the product. Or is that not good enough of a citation?
by Jepacor
2/1/2026 at 8:21:57 PM
Retail product listing URLs change constantly. They're not great.

And then you usually want to describe how the food is used. E.g. suppose it's a dessert that's mainly popular at children's birthday parties. Everybody in the country knows that. But where are you going to find something written that says that? Something that's not just a random personal blog, but an actual published valid source?
Ideally you can find some kind of travel guide or book for expats or something with a food section that happens to list it, but if it's not a "top" food highly visible to tourists, then good luck.
by crazygringo
2/1/2026 at 8:04:41 PM
I found several that were contradicting the claim they were supposed to support (in popular articles). I will never regain faith in Wikipedia. Being an editor, or just verifying information from Wikipedia, makes you hate it.
by efilife
2/1/2026 at 2:01:04 AM
[dead]
by FranklinJabar
2/1/2026 at 11:41:58 AM
Link rot is one problem, and edited articles are another. You can cite all you want, but if the underlying resource changes, your foundation just melts away.
2/1/2026 at 1:06:37 PM
Pretty much every citation added to Wikipedia is passed on to the Web Archive now, either by the editor or automatically later on.

For news articles especially, the recommendation now is to use the archive snapshot and not the URL of the page.
It’s not a perfect solution, but it tries to solve the link rot issue.
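As a side note, archived snapshots live at a predictable URL scheme, so a citation with a known archive timestamp can be rebuilt without touching the dead original. A minimal sketch (the helper name is mine; the URL pattern is the Wayback Machine's public snapshot scheme):

```python
def wayback_url(original_url: str, timestamp: str) -> str:
    """Build a Wayback Machine snapshot URL from the original page URL
    and a 14-digit YYYYMMDDhhmmss timestamp, the format the archive uses."""
    return f"https://web.archive.org/web/{timestamp}/{original_url}"

# A citation archived on 7 Nov 2022 would resolve at:
print(wayback_url("https://example.com/article", "20221107000000"))
```

This is why citing the snapshot rather than the live page helps: the snapshot URL pins both the address and the point in time at which the source said what it said.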
by jayflux
2/1/2026 at 2:04:50 PM
> Applying correct citations is actually really hard work

Not disagreeing - many existing articles on Wikipedia have barely any references or citations at all, and in some cases wrong citations or wrong conclusions. Like when a cited source says water molecules behave oddly, and the Wikipedia article concludes that water molecules behave properly.
by shevy-java
2/1/2026 at 5:16:00 PM
> This has been a rampant problem on Wikipedia always. I can't seem to find any indicator that this has increased recently? Because they're only even investigating articles flagged as potentially AI. So what's the control baseline rate here?

...y'know, I don't want to be that guy, but this actually seems like something AI could check for, and then flag for human review.
by Wowfunhappy
2/1/2026 at 8:16:20 PM
You don't need an LLM to find loads of uncorroborated claims on Wikipedia. See e.g. https://en.wikipedia.org/wiki/Variational_autoencoder - most articles about tech are woefully undersourced.
by bjourne