Wikipedia is one of the most popular sites on the Web, with more than three million articles. The Wikimedia Foundation, a nonprofit group in San Francisco that oversees the user-generated encyclopedia, has approved a feature called flagged revisions, according to The New York Times.
Flagged revisions will reportedly require an experienced Wikipedia volunteer editor to review changes the public makes to articles about living people before those changes go live. The edited version will remain invisible to readers until an editor approves it.
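In concept, the feature works like a moderation queue: edits from the general public to biographies of living people are held until a trusted editor signs off, while trusted editors’ own changes publish immediately. The Python sketch below is a minimal, hypothetical model of that flow, not Wikipedia’s actual code; every class and function name here is invented for illustration.

```python
from dataclasses import dataclass, field

@dataclass
class Revision:
    """A proposed edit to an article (illustrative model, not MediaWiki's)."""
    text: str
    author: str
    approved: bool = False

@dataclass
class Article:
    title: str
    is_living_person_bio: bool
    published: str = ""                           # the version readers see
    pending: list = field(default_factory=list)   # edits awaiting review

    def submit(self, rev: Revision, trusted_editors: set):
        # Trusted editors' edits, and edits to non-biography pages, go live
        # at once; the public's edits to living-person bios are queued.
        if rev.author in trusted_editors or not self.is_living_person_bio:
            self.published = rev.text
        else:
            self.pending.append(rev)

    def approve(self, rev: Revision, reviewer: str, trusted_editors: set):
        # Only an experienced, trusted editor can flag a pending edit live.
        if reviewer in trusted_editors and rev in self.pending:
            rev.approved = True
            self.pending.remove(rev)
            self.published = rev.text

# Example: an anonymous edit stays invisible until a trusted editor approves it.
trusted = {"alice"}
bio = Article("Example Person", is_living_person_bio=True)
edit = Revision("New claim about the subject.", author="anonymous_user")
bio.submit(edit, trusted)            # queued; bio.published is unchanged
bio.approve(edit, "alice", trusted)  # now the change is live
```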
“We are no longer at the point that it is acceptable to throw things at the wall and see what sticks,” Michael Snow, a Seattle lawyer and chairman of the Wikimedia board, told the Times. “There was a time probably when the community was more forgiving of things that were inaccurate or fudged in some fashion — whether simply misunderstood or an author had some ax to grind. There is less tolerance for that sort of problem now.”
Avoiding Controversy
The policy isn’t entirely new: Wikipedia imposed flagging on its German-language edition last year. Nor is this the first time Wikimedia has adjusted its editorial rules; the site has responded to several past controversies with policy changes.
Indeed, Wikipedia biographies have drawn controversy over the years. John Seigenthaler, a former assistant to U.S. Attorney General Robert Kennedy, and Google Watch protest-site creator Daniel Brandt are among the high-profile figures who have lashed out at Wikipedia over inaccurate bios. In 2005, some even suggested libel suits.
In 2007, Wikipedia took steps to prevent posters from claiming to be someone they are not: contributors who boasted prestigious credentials were required to reveal their identities. That change came in the wake of the discovery that a poster going by the screen name Essjay, who claimed to be a professor of theology, was really a 24-year-old college dropout named Ryan Jordan.
That partial lifting of the cloak of anonymity was part of the site’s push to gain credibility with users. With its latest policy change, though, Wikipedia seems to be taking a proactive rather than a reactive approach. According to the Times, the new editorial controls will segment contributors into two tiers: experienced, trusted editors and the general public.
A Maturing Wikipedia
“By creating a two-tiered editorial control infrastructure, Wikipedia is not going to hurt the site. It’s only going to make it better,” said Brad Shimmin, an analyst at Current Analysis. “Wikipedia is not the free-for-all it was three or four years ago. What we are looking at now is a maturing of Wikipedia in terms of the breadth of content that’s up there leveling out a little bit and the control that’s imposed on that content.”
Shimmin said industry watchers have long been surprised that Wikipedia is as deep and accurate as it is, a success he credits to a large pool of interested individuals. Still, the site has acknowledged problems with people posting inaccurate and even spiteful information.
“Even for a short period of time, you can’t think it’s OK to have bad content on a source that’s trusted,” Shimmin said. “You can’t make that assumption anymore. The service is too valuable to too many people to post an article saying someone is dead when they are not dead and wait a couple of days until somebody figures it out.”