Posted by greyface- 5 hours ago
https://en.wikipedia.org/wiki/Wikipedia:Village_pump_(techni...
https://old.reddit.com/r/wikipedia/comments/1rllcdg/megathre...
In short, a Wikimedia Foundation account was doing some sort of test which involved loading a large number of user scripts. They decided to just start loading random user scripts, instead of creating some just for this test.
The user who ran this test is a Staff Security Engineer at WMF, and naturally they decided to do this test under their highly-privileged Wikimedia Foundation staff account, which has permissions to edit the global CSS and JS that runs on every page.
One of those random scripts was a two-year-old malicious script from ruwiki. This script injects itself into the global JavaScript on every page, and then into the user scripts of any user who runs into it, so it started spreading and doing damage really fast. This triggered tons of alerts, until the decision was made to turn the wiki read-only.
That makes the fix pretty easy. Write a regex to detect the evil script, and revert every page to a historic version without the script.
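That regex-and-revert idea can be sketched roughly like this. The signature pattern and the revision shape are made up for illustration, not taken from the actual worm; a real cleanup would go through the MediaWiki API.

```javascript
// Sketch: scan a page's revisions (newest first) for a known malicious
// signature and pick the most recent clean revision to revert to.
// The regex below is a hypothetical signature, not the real worm's.
const EVIL_SIGNATURE = /basemetrika\.ru|mw\.loader\.load\(\s*['"]\/\/ru\.wikipedia/i;

function lastCleanRevision(revisions) {
  // revisions: array of { id, content }, newest first
  for (const rev of revisions) {
    if (!EVIL_SIGNATURE.test(rev.content)) return rev.id;
  }
  return null; // every stored revision is infected
}
```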
I refuse to believe that someone on the security team tested random user scripts in production on purpose.
On the other hand,
>a Staff Security Engineer at WMF, and naturally they decided to do this test under their highly-privileged Wikimedia Foundation staff account
seriously?
this is both really cool and really really insane
For the global scripts that need admin permissions to edit, it's no different from all the other code of MediaWiki itself, like the PHP.
For the user scripts, it's no worse than the fact that you can run Tampermonkey in your browser and have it modify every page from every site in whatever way you want.
- Injects itself into the MediaWiki:Common.js page to persist globally, and into the User:Common.js page to do the same as a fallback
- Uses jQuery to hide UI elements that would reveal the infection
- Vandalizes 20 random articles with a 5000px wide image and another XSS script from basemetrika.ru
- If an admin is infected, it will use the Special:Nuke page to delete 3 random articles from the global namespace, AND use the Special:Random with action=delete to delete another 20 random articles
EDIT! The Special:Nuke is really weird. It gets a default list of articles to nuke from the search field, which could be any group of articles, and rubber-stamps nuking them. It does this three times in a row.
Do keep us updated on the whole situation if anything relevant develops from your POV.
I'd suggest giving the domain to the Wikipedia team, as they might know the best use for it.
If anyone from the Russian government is reading this, get the fuck out of Ukraine. Thank you.
You have helped to bring peace closer by approximately zero nanoseconds, while doing absolutely nothing about Western countries still buying massive amounts of natural resources from Putin. Taxes on those exports are a primary source of income for the federal budget, which directly funds the military.
Good virtue signaling, though. I'm completely disillusioned with the West, this is nothing new.
By doing nothing, you are allowing a malicious actor to buy the domain. In fact I am sure they would love for everyone else to be paralyzed by purity tests for a $1 domain.
All things being equal, yeah don’t buy a .ru domain. But they are not equal.
> On 1 January 2025, Ukraine terminated all Russian gas transit through its territory, after the contract between Gazprom and Naftohaz signed in 2019 expired. [...] It is estimated that Russia will lose around €5bn a year as a result.
https://en.wikipedia.org/wiki/Russia%E2%80%93Ukraine_gas_dis...
> Namecheap is a U.S. based domain name registrar and web hosting service company headquartered in Phoenix, Arizona.
and in 2025 they were purchased by:
> CVC Capital Partners plc is a Jersey-based private equity and investment advisory firm
What should we put there, anyway?
Note that while this looks like it's trying to trigger an XSS, what it's doing is ineffective, so basemetrika.ru would never get loaded (even ignoring that the domain doesn't exist).
Of course it's very possible someone wrote it with AI help. But almost no chance it was designed by AI.
Well, the worm didn't get root -- so if Wikimedia takes snapshots or made a recent backup, it's probably not so much of a nightmare? Then the diffs can tell a fairly detailed forensic story, including indicators of motive.
Snapshotting is a very low-overhead operation, so you can make them very frequently and then expire them after some time.
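A sketch of that frequent-snapshots-with-expiry idea, using an arbitrary tiered retention policy (illustrative only, not Wikimedia's actual setup):

```javascript
// Decide which snapshot timestamps (ms since epoch) to keep under a simple
// tiered policy: keep everything from the last hour, one per hour for the
// last day, one per day beyond that. The tiers are made-up example values.
function snapshotsToKeep(timestamps, now) {
  const HOUR = 3600_000, DAY = 24 * HOUR;
  const keep = new Set();
  const seenHour = new Set(), seenDay = new Set();
  for (const t of [...timestamps].sort((a, b) => b - a)) { // newest first
    const age = now - t;
    if (age <= HOUR) {
      keep.add(t);                                         // keep all recent
    } else if (age <= DAY) {
      const bucket = Math.floor(t / HOUR);                 // one per hour
      if (!seenHour.has(bucket)) { seenHour.add(bucket); keep.add(t); }
    } else {
      const bucket = Math.floor(t / DAY);                  // one per day
      if (!seenDay.has(bucket)) { seenDay.add(bucket); keep.add(t); }
    }
  }
  return keep;
}
```

On a copy-on-write filesystem the snapshots themselves are cheap; the expiry policy is what keeps the space usage bounded.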
People usually remember what they changed yesterday and have uploaded files and such still around. It's not great, but quite possible. Maybe you need to pull a few content articles out from the broken state if they ask. No huge deal.
If you decide to roll back after a week or so, editors get really annoyed, because now they are usually forced to backtrack and reconcile the state of the knowledge base, maybe you need a current and a rolled-back system, it may have regulatory implications and it's a huge pain in the neck.
As an aside, snapshotting would have prevented a good deal of the horror stories shared by people who give AI access to the FS. Well, as long as you don't give it root...
Feels good to pat oneself on the back. Mine is sore, though. My E&O/cyber insurance likes me.
obviously you can. but, what is the actual snapshot frequency? like, what is the timestamp of the last known good snapshot? that is what matters.
in any case, the comment you are replying to is a hypothetical, which correctly points out that even a day or two of lost edits is fine (not ideal, but fine). your reply doesn't engage with their comment at all.
I did engage, by pointing out that it was neither relevant nor a realistic scenario for a competent sysadmin. (Did you read the OP?) That's a /you/ problem if you rely on infrequent backups, especially for a service with so much flux.
> what is the actual snapshot frequency? like, what is the timestamp of the last known good snapshot?
? Why would I know what their internal operations are?
>Why would I know what their internal operations are?
i mean... you must, right? you know that once-a-day snapshots is not relevant to this specific incident. you know that their sysadmins are apparently competent. i just assumed you must have some sort of insider information to be so confident.
my decade of dealing with incompetent sysadmins and broken backups (if they even exist) has given me the opposite of confidence.
but i'm glad you have had a different experience
Oh, I agree that the average bar is low. That's part of the reason I do it all myself.
The heuristic with Wikimedia is that they've been running a PHP service that accepts and stores (anonymous) input for 25 years. That longevity, given the risk exposure they have, is an indicator that they know what they are doing, and I'm sure they've learned from recovering all sorts of failures over the years.
Look at how quickly it was brought back up in this instance!
So, yeah. I don't think the initial hypothetical counterpoint holds water, and that's what I have been pointing out.
I still don't need to assume what the intent is. Troll or no troll, it works. My comments might inspire someone else to try a CoW fs. I'm also really impressed with wikimedia's technical team.
i found it off-putting that kibone replied to a hypothetical musing as if it were a counterpoint in a debate instead of a simple expansion on their comment. we had some comments back and forth and we both came out of it just fine. weird of you to add this little insult to an otherwise pretty normal exchange.
1. In 2023, vandal attacks were made against two Russian-language alternative wiki projects, Wikireality and Cyclopedia. Here https://wikireality.ru/wiki/РАОрг is an article about the organizers of these attacks.
2. In 2024, ruwiki user Ololoshka562 created a page https://ru.wikipedia.org/wiki/user:Ololoshka562/test.js containing the script used in these attacks. It remained inactive for the next 1.5 years.
3. Today, sbassett mass-loaded other users' scripts into his global.js on Meta, maybe to test global API limits: https://meta.wikimedia.org/wiki/Special:Contributions/SBasse... . In one edit, he loaded Ololoshka's script: https://meta.wikimedia.org/w/index.php?diff=prev&oldid=30167... and ran it.
I’ve always thought the fact that MediaWiki sometimes lets editors embed JavaScript could be dangerous.
It seems like the worm code/the replicated code only really attacks stuff on site. But leaking credentials (and obviously people reuse passwords across sites) could be sooo much worse.
If an attacker wanted passwords en masse, they could inject fake login forms and try to simulate focus and typing, but that chain is brittle across browsers, easy to detect, and far lower-yield than stealing session tokens or planting persistent XSS. Defenders should assume autofill will be targeted and raise the bar with HttpOnly cookies, SameSite=Strict where practical, multi-factor auth, a strict Content Security Policy plus Subresource Integrity, and client-side detection that reports unexpected DOM mutations.
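As a sketch of what a couple of those defensive headers could look like when assembled, here's a minimal example; the directive values and the cookie name are illustrative placeholders, not a recommendation for any particular site:

```javascript
// Build example hardening headers of the kind described above. The exact
// CSP directives and cookie attributes are placeholder choices.
function hardeningHeaders() {
  const csp = [
    "default-src 'self'",
    "script-src 'self'",   // blocks inline and third-party scripts
    "object-src 'none'",
    "base-uri 'self'",
  ].join('; ');
  return {
    'Content-Security-Policy': csp,
    // HttpOnly keeps the session cookie away from page JavaScript;
    // SameSite=Strict blunts cross-site request forgery.
    'Set-Cookie': 'session=<token>; HttpOnly; Secure; SameSite=Strict',
  };
}
```

A strict `script-src` like this would also break legitimate user scripts, which is exactly the tension the thread is about.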
"The incident appears to have been a cross-site scripting hack. The origin of rhe malicious scripts was a userpage on the Russian Wikipedia. The script contained Russian language text.
During the shutdown, users monitoring [https://meta.wikimedia.org/wiki/special:RecentChanges Recent changes page on Meta] could view WMF operators manually reverting what appeared to be a worm propagated in common.js
Hopefully this means they won't have to do a database rollback, i.e. no lost edits. "
Interesting to note how trivial it is today to fake something as coming "from the Russians". https://wikipediocracy.com/forum/viewtopic.php?f=8&t=14555
Apparent JS worm payload: https://ru.wikipedia.org/w/index.php?title=%D0%A3%D1%87%D0%B...
The Wikipedia community takes a cavalier attitude towards security. Any user with "interface administrator" status can change global JavaScript or CSS for all users on a given Wiki with no review. They added mandatory 2FA only a few years ago...
Prior to this, any admin had that ability, until it was taken away after English Wikipedia admins reverted Wikimedia changes to site presentation (Media Viewer).
But that's not all. Most "power users" and admins install "user scripts", which are unsandboxed JavaScript/CSS gadgets that can completely change the operation of the site. Those user scripts are often maintained by long abandoned user accounts with no 2 factor authentication.
Based on the fact user scripts are globally disabled now I'm guessing this was a vector.
The Wikimedia foundation knows this is a security nightmare. I've certainly complained about this when I was an editor.
But most editors that use the website are not professional developers and view attempts to lock down scripting as a power grab by the Wikimedia Foundation.
True, but there aren't very many interface administrators. It looks like there are only 137 right now [0], which I agree is probably more than there should be, but that's still a relatively small number compared to the total number of active users. But there are lots of bots/duplicates in that list too, so the real number is likely quite a bit smaller. Plus, most of the users in that list are employed by Wikimedia, which presumably means that they're fairly well vetted.
[0]: https://en.wikipedia.org/w/api.php?action=query&format=json&...
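Since the query URL above is truncated, here's a sketch of how such an API query could be built with `URLSearchParams`. The `list=allusers`/`augroup` parameters and the group name passed in are my assumptions about what the linked query does, not a reconstruction of the truncated URL:

```javascript
// Sketch: build a MediaWiki API query that enumerates users in one group.
// The parameter choices and the group name are assumptions for illustration.
function userGroupQueryUrl(group) {
  const params = new URLSearchParams({
    action: 'query',
    format: 'json',
    list: 'allusers',   // enumerate registered users
    augroup: group,     // filter to a single user group (assumed group name)
    aulimit: 'max',
  });
  return 'https://en.wikipedia.org/w/api.php?' + params.toString();
}
```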
Unfortunately, Wikipedia is run on insecure user scripts created by volunteers who tend to be under the age of 18.
There might be more editors trying to boost their résumés if editing Wikipedia under your real name didn't invite endless harassment.
>There are currently 15 interface administrators (including two bots).
https://en.wikipedia.org/wiki/Wikipedia:Interface_administra...
> Based on the fact user scripts are globally disabled now I'm guessing this was a vector.
Disabled at which level? Browsers still allow user scripts via tools like Tampermonkey and Greasemonkey, and that's not enforceable (and arguably not even trivially visible) by sites, including Wikipedia.
As I say that out loud, I figure there's a separate ecosystem of Wikipedia-specific user scripts, but arguably the same problem exists.
You can also upload scripts to be shared and executed by other users.
As in, a user can upload whatever they wish, and it will be shown to them and run as JS, fully privileged and all.
A certain number of "community" admins maintain that right to this day after it was realized this was a massive security hole.
Just now thought “if Wikipedia vanished, what would it mean?” … and it’s not on the level of safe drinking water, but it is a level.
That someone would need to restore some backups, and in the meantime, use mirrors.
Seriously, not that big of a deal. I don't know how many copies of Wikipedia are lying around but considering that archives are free to download, I guess a lot. And if you count text-only versions of the English Wikipedia without history and talk pages, it is literally everywhere as it is a common dataset for natural language processing tasks. It is likely to be the most resilient piece of data of that scale in existence today.
The only difficulty in the worst case scenario would be rebuilding a new central location and restarting the machinery with trusted admins, editors, etc... Any of the tech giants could probably make a Wikipedia replacement in days, with all data restored, but it won't be Wikipedia.
That's small enough to live on most people's phones. It's small enough to fit on a single Blu-ray. Maybe Wikipedia should fund some mass printings.
What you do not get however is any media. No sounds, images, videos, drawings, examples, 3D artifacts, etc etc etc. This is a huge loss on many many many topics.
It's not a high bar.
Haven't we hit that point already with bad faith (and potentially government-run) coordinated editing and voting campaigns, as both Wales and Sanger have been pointing out for a while now?
See, for example,
* Sanger: https://en.wikipedia.org/wiki/User:Larry_Sanger/Nine_Theses
* Wales: https://en.wikipedia.org/wiki/Talk:Gaza_genocide/Archive_22#...
* PirateWires: https://www.piratewires.com/p/how-wikipedia-is-becoming-a-ma...
Yes, this is a real phenomenon. See, for instance, https://en.wikipedia.org/wiki/Timeline_of_Wikipedia%E2%80%93...: the examples from 2006 are funny, and the article's subject matter just gets sadder and sadder as the chronology goes on.
> and voting campaigns
I'm not sure what you mean by this. Wikipedia is not a democracy.
> as both Wales and Sanger have been pointing out
{{fv}}. Neither of those essays makes this point. The closest either gets is Sanger's first thesis, which misunderstands the "support / oppose" mechanism. Ironically, his ninth thesis says to introduce voting, which would create the "voting campaign" vulnerability!
These are both really bad takes, which I struggle to believe are made in good faith, and I'm glad Wikipedians are mostly ignoring them. (I have not read the third link you provided, because Substack.)
https://en.wikipedia.org/wiki/Wikipedia:What_Wikipedia_is_no...