A very interesting project going by the name WikiTrust (part of the Wikipedia Quality Initiative) analyzes whether information on Wikipedia is trustworthy. To do this, WikiTrust computes the trustworthiness of both authors and individual words. The more contributions an author makes and the longer those contributions survive subsequent edits, the higher the author's reputation; likewise, the more edits a word survives, the higher it scores:
- First, we compute the reputation of each author by analyzing the author’s contributions. When an author makes a contribution that is preserved in subsequent edits, the author gains reputation. When an author makes a contribution that is undone or reverted quickly, the author loses reputation.
- The trust value of a new word is proportional to the reputation of its author. When subsequent authors edit the page, words that are left unchanged gain trust: by leaving them there, the authors implicitly agree with them. Words closer to the edit gain more trust, as the author of the edit is likely to have paid more attention to them. In contrast, text that has been rearranged (new text, text at the border of a cut-and-paste, etc.) again gets a trust value proportional to the reputation of the edit's author. A rough code sketch of both rules follows this list.
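To make the two rules a bit more concrete, here is a minimal Python sketch of how such updates could look. To be clear: the function names, the reputation scale, and the exponential distance decay are my own simplified assumptions for illustration, not WikiTrust's actual formulas.

```python
import math

# Toy model: reputation is a plain dict, word trust values live in [0, 1].

def update_author_reputation(reputation, author, survived, gain=1.0, penalty=2.0):
    """Rule 1: an author gains reputation when a contribution survives
    later edits, and loses reputation when it is quickly undone or reverted."""
    current = reputation.get(author, 0.0)
    reputation[author] = current + gain if survived else max(0.0, current - penalty)

def word_trust_after_edit(trust, edited_positions, editor_reputation,
                          max_reputation=10.0, decay=0.1):
    """Rule 2: new or rearranged words get trust proportional to the editor's
    reputation; untouched words gain trust, more so the closer they sit to
    the edit (the editor presumably paid more attention there)."""
    new_trust = []
    for i, t in enumerate(trust):
        if i in edited_positions:
            # New text (or text at a cut-and-paste border) starts over.
            new_trust.append(editor_reputation / max_reputation)
        else:
            # Untouched words are implicitly endorsed; the boost decays
            # with distance from the nearest edited position.
            dist = min(abs(i - p) for p in edited_positions) if edited_positions else len(trust)
            boost = (editor_reputation / max_reputation) * math.exp(-decay * dist)
            new_trust.append(min(1.0, t + boost))
    return new_trust

# Example: a high-reputation editor changes word 2 of a five-word sentence.
reputation = {"alice": 8.0}
print(word_trust_after_edit([0.5] * 5, {2}, reputation["alice"]))
```

In this toy version everything lives in memory; the real system of course has to track contributions across the full revision history of each article to decide what "survived" means.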
Information that seems questionable is marked in different shades of orange. The screenshot shows a strongly opinionated Wikipedia article about Italian cuisine (and how it allegedly compares to French cuisine). Note how the insulting, opinionated parts are marked dark orange:
More info on the WikiTrust Blog. This way of rating Wikipedia information based on the authors’ and the words’ credibility seems like it might work. What do you think?
(via Planblog / Tim Bonnemann)