
Meet the 9 Wikipedia bots that make the world’s largest encyclopedia possible

The idea behind Wikipedia is, let’s face it, crazy. An online encyclopedia full of verifiable information, ideally with minimal bias, that can be freely edited by anyone with an internet connection is a ridiculous idea that was never going to work. Yet somehow it has.

Nineteen years old this month (it was launched in January 2001, the same month President George W. Bush took office), Wikipedia’s promise of a collaborative encyclopedia has, today, resulted in a resource consisting of more than 40 million articles in 300 different languages, catering to an audience of 500 million monthly users. The English language Wikipedia alone adds some 572 new articles per day.


For anyone who has ever browsed the comments section on a YouTube video, the fact that Wikipedia’s utopian vision of crowdsourced collaboration has been even remotely successful is kind of mind-boggling. It’s a towering achievement, showing how humans from around the globe can come together to create something that, despite its flaws, is still impressively great.

What do we have to thank for the fact that this human-centric dream of collective knowledge works? Well, as it turns out, the answer is bots. Lots and lots of bots.

Bots to the rescue

Bots emerged on Wikipedia out of necessity. The term, shorthand for "software robot," refers to an automated tool designed to carry out specific tasks. In the early days of Wikipedia, this largely meant cleaning up vandalism. The problem could be handled manually when the total number of active contributors on Wikipedia numbered in the dozens or even hundreds, but as the website experienced its first boom in popularity, that was no longer feasible. By 2007, for example, Wikipedia was receiving upward of 180 edits every minute, far too many for human editors to cope with.

“A very important thing that [Wikipedia bots were created to do] is to protect against vandalism,” Dr. Jeff Nickerson, a professor at the Stevens Institute of Technology in Hoboken, N.J., who has studied Wikipedia bots, told Digital Trends. “There’s a lot of instances where someone goes into a Wikipedia page and defaces it. It’s like graffiti. That became very annoying for the people who maintain those pages to have to go in by hand and revert the edits. So one logical kind of protection [was] to have a bot that can detect these attacks.”
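To make the idea concrete, here is a toy sketch of the kind of scoring heuristic an anti-vandalism bot might apply to an incoming edit. Real counter-vandalism bots (ClueBot NG, for instance) rely on trained machine learning models; the rules, trigger words, and threshold below are purely illustrative assumptions, not any actual bot's logic.

```python
# Hypothetical trigger words; real bots use much larger, curated lists.
BLACKLIST = {"lol", "poop", "stupid"}

def vandalism_score(old_text: str, new_text: str) -> float:
    """Return a score in [0, 1]; higher means the edit looks more like vandalism."""
    score = 0.0
    # Wiping out most of a page's content is suspicious.
    if len(new_text) < 0.2 * len(old_text):
        score += 0.5
    # Additions that are mostly capital letters look like defacement.
    added = new_text[len(old_text):] if new_text.startswith(old_text) else new_text
    letters = [c for c in added if c.isalpha()]
    if letters and sum(c.isupper() for c in letters) / len(letters) > 0.8:
        score += 0.3
    # Known trigger words appearing in the new revision.
    if any(word in new_text.lower().split() for word in BLACKLIST):
        score += 0.4
    return min(score, 1.0)

def should_revert(old_text: str, new_text: str, threshold: float = 0.6) -> bool:
    """Decide whether the bot would automatically revert this edit."""
    return vandalism_score(old_text, new_text) >= threshold
```

A production bot would, of course, also rate-limit itself, skip edits by trusted users, and log every revert for human review.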

Along with other researchers from the Stevens Institute of Technology, Nickerson recently carried out the first comprehensive analysis of all 1,601 of Wikipedia’s bots. According to that study, published in the Proceedings of the ACM on Human-Computer Interaction journal, bots account for around 10% of all activity on Wikipedia. This rises to a massive 88% of activity on Wikidata, the central storage platform for structured data used across the various Wikimedia websites.

Wikipedia Bot Roles and Associated Functions

Generator: generate redirect pages; generate pages based on other sources
Fixer: fix links, content, files, and parameters in templates, categories, and infoboxes
Connector: connect Wikipedia with other wikis and with other sites
Tagger: tag article status, article assessment, WikiProjects, and multimedia status
Clerk: update statistics, document user data, update maintenance pages, deliver article alerts
Archiver: archive content; clean up the sandbox
Protector: identify policy violations, spam, and vandals
Advisor: provide suggestions for WikiProjects and users; greet newcomers
Notifier: send user notifications

The research conducted by Nickerson and colleagues divided bot activity on Wikipedia into nine different categories. There are, as noted, “protectors,” dedicated to identifying policy violations, spam, and vandals. Then there are “fixers,” who live virtual lives revolving around the fixing of links, content, files, and anything else in need of a good tweaking. There are “taggers,” for tagging article statuses and assessments; “clerks,” for updating statistics and maintenance pages; “archivers” for archiving content; “advisors” for greeting newcomers and providing suggestions for users; “notifiers” for sending user notifications; and “generators” for creating redirection pages or generating new content based on other sources.

“Their complexity varies a lot,” said Morten Warncke-Wang, the current controller of SuggestBot, a bot which, well, suggests articles for editors to edit, based on their previous edit history. “It depends on the task that they’re sent to carry out.”
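As an illustration of the general idea rather than SuggestBot's actual pipeline, a recommender of this sort can be sketched as a similarity ranking between a user's edit history and a pool of candidate articles. The keyword sets, sample data, and Jaccard-overlap scoring below are hypothetical assumptions for demonstration only.

```python
def jaccard(a: set, b: set) -> float:
    """Jaccard similarity between two keyword sets (0 when both are empty)."""
    return len(a & b) / len(a | b) if a | b else 0.0

def suggest(edited: dict, candidates: dict, top_n: int = 2) -> list:
    """Rank candidate articles by keyword overlap with the user's edit history.

    Both arguments map article titles to sets of topic keywords.
    """
    profile = set().union(*edited.values()) if edited else set()
    ranked = sorted(candidates, key=lambda t: jaccard(profile, candidates[t]), reverse=True)
    return ranked[:top_n]

# Hypothetical edit history and candidate pool.
history = {"Apollo 11": {"space", "nasa", "history"}}
pool = {
    "Apollo 13": {"space", "nasa"},
    "Baking bread": {"food", "cooking"},
    "Sputnik": {"space", "history"},
}
```

With this toy data, the space-related candidates outrank the unrelated one, which is the whole trick: steer editors toward articles resembling what they already work on.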

A certain degree of autonomy

Nickerson agreed. A bot, he suggested, can be anything from a relatively simple algorithm to a more complex machine learning A.I. What they have in common, he said, is a degree of autonomy. A bot is something that is created and then deployed to act on its orders, a little bit like a mission objective delegated to an employee. “[A bot] can go off and make hundreds, thousands, sometimes millions of edits on its own,” Nickerson said. “This is not something that [a human editor is] just running once while you’re sitting there.” The 24 top bots on Wikipedia have made more than 1 million edits each in their lifetimes: far in excess of virtually every human editor on the website.

If the range of bot categories sounds, frankly, a bit like a medieval colony of monks — all pursuing the unified goal of dogmatic enlightenment through an assortment of seemingly menial tasks — you’re not entirely wrong. The fact that the bot world is reminiscent of a community of sorts is not at all accidental.


Despite the fact that most casual Wikipedia users will never interact with a bot, their creation is every bit as collaborative as anything on the Wikipedia front end. Bots are not implemented by Wikimedia in a top-down manner. Anyone can develop a bot, just like anyone can edit an article. They do this according to perceived problems they believe a bot might be able to assist with. To get their bot rubber-stamped, they must submit an approval request to BAG, the Bot Approvals Group. If BAG deems the bot to be a valuable addition to the collective, it will be approved for a short trial period to ensure that it operates as designed. Only after this will it be unleashed on Wikipedia as a whole.

“There’s a prosocial nature to a lot of the editors on Wikipedia,” Nickerson said. “A lot of the time people might write these bots for themselves and then make it available to the community. That’s often the way these bots emerge. Some editor’s doing a task they realize could be fixed with a fairly simple bot. They’ve got the skill to build it, and then that bot gets deployed and used by everyone.”

Like an algorithmic “bring your dog to work day,” the owner of each bot is responsible for its behavior. Fail to respond to behavioral concerns, and your bot’s approval will be revoked.

Make bots great again

Here in 2020, bots have a popular reputation that sits somewhere between venereal disease and John Wilkes Booth. They are routinely cast as human job-replacing, election-swaying tools designed to do far more bad than good. The Wikipedia example shows the flip side of this picture. Wikipedia’s bots are the site’s immune system: near-invisible tools that help provide resistance to (metaphorical) infection and toxins, while strengthening the system in the process.

As Nickerson points out, however, the bots are not entirely invisible. And that’s to their betterment. “When people don’t think they’ve received a good recommendation, they’ll regularly post about that on the bot page,” he said, describing the “advisors” and “notifiers” intended to coax human contributors to do better. “To me, that’s very interesting. I’d love to be able to affect news feeds I get [elsewhere], but I can’t. I don’t have a way of going to the companies that are selecting news for me and saying, ‘I think you’re giving me too much of this; I’d rather get more of that.’ Having control over the algorithms that are communicating with you is an important thing. And it seems to really work with Wikipedia.”


Some Wikipedia bots carry out simple text generation. The first-ever Wikipedia bot, which appeared in late 2002, was designed to add and maintain pages for every U.S. county and city. But both Nickerson and Warncke-Wang said that they couldn’t foresee Wikipedia ever handing control of the website over entirely to text-generating algorithms. “They’re rarely used to create the content,” Warncke-Wang said. “They’re much more used as tools to manage the content development.”
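That early county-and-city bot worked by pouring census data into a fixed prose template. Something in that spirit can be sketched in a few lines; the template wording and data fields below are hypothetical, not the original bot's actual output format.

```python
# A minimal template-based stub generator, in the spirit of the 2002 bot
# (commonly known as rambot) that created U.S. county and city articles.
STUB_TEMPLATE = (
    "{name} is a {kind} in {state}, United States. "
    "As of the {census_year} census, its population was {population:,}."
)

def make_stub(record: dict) -> str:
    """Render one article stub from a structured data record."""
    return STUB_TEMPLATE.format(**record)

# Hypothetical record; a real run would iterate over a whole census dataset.
print(make_stub({
    "name": "Example County",
    "kind": "county",
    "state": "Ohio",
    "census_year": 2000,
    "population": 12345,
}))
```

The appeal and the limitation are the same: every generated page reads identically, which is exactly why, as Warncke-Wang notes, such bots manage content far more often than they write it.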

At its core, Wikipedia is a deeply human effort — and the bots are there to help, not hinder. As Manfred E. Clynes and Nathan S. Kline, the two researchers who coined the term “cyborg” wrote in an influential 1960 essay: “The purpose of the [ideal collaboration between humans and machine] is to provide an organizational system in which such robot-like problems are taken care of automatically and unconsciously; leaving man free to explore, to create, to think, and to feel.”

Wikipedia bots follow in that spirit. As long as that relationship continues, long may they carry on helping us find the information we want. And stop bad actors from defacing the pages of celebrities they don’t like.

Luke Dormehl
Former Digital Trends Contributor
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…