I'd like to put together a Wikipedia Accuracy Meter - perhaps a browser plug-in or a web page that attempts to estimate the reliability of any given article at any given moment. Please chime in with discussion of the algorithm and technical hoo-ha to help me assess whether this is a weekend project or a master's thesis....
The kinds of things to be taken into account would be the age of the entry, the number of changes, the number of reversions, and locked status. Most of these would have some sort of bell curve of influence on the rank: an article near the 0% reliability mark would be a new entry with few authors, while a higher-ranked article would be older, with many revisions and a decreasing frequency of revision. I'm not sure how to treat a locked entry - the same rank as before it was locked, minus something that accounts for the fact that someone out there strongly feels the existing version is incorrect? Contributions from "trusted authors" might help as well, though that could catapult this into a more substantial AI problem. I'm kind of hoping to get a moderately useful number with little work by taking the Wikipedia methodology on its own terms, though I can see the possibility of using what I perceive as deeper weaknesses (and strengths) in the method to inform the algorithm.
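To make the idea concrete, here's a back-of-the-envelope sketch of such a score in Python. Every weight and decay constant here is made up for illustration - this is exactly the tuning problem described, not a validated formula - and the saturating exponentials stand in loosely for the "bell curve of influence" idea:

```python
import math

def reliability_score(age_days, n_revisions, n_authors, n_reverts, is_locked):
    """Toy reliability estimate in [0, 100]. All weights and decay
    constants are arbitrary placeholders that would need real tuning."""
    # Age, edit count, and author count saturate: an old, much-edited,
    # many-author article scores higher, with diminishing returns.
    age = 1 - math.exp(-age_days / 365.0)         # approaches 1 after a few years
    activity = 1 - math.exp(-n_revisions / 50.0)  # approaches 1 after ~150 edits
    authors = 1 - math.exp(-n_authors / 10.0)
    # Reversions signal contention: penalize the fraction of edits undone.
    contention = n_reverts / max(n_revisions, 1)
    score = 100 * age * activity * authors * (1 - contention)
    # Locked entry: someone strongly disputes the content, so shave a bit.
    if is_locked:
        score *= 0.8
    return score

# A brand-new stub scores near 0; an old, busy, lightly-reverted
# article scores high.
print(reliability_score(age_days=1, n_revisions=2, n_authors=1,
                        n_reverts=0, is_locked=False))
print(reliability_score(age_days=3650, n_revisions=500, n_authors=100,
                        n_reverts=10, is_locked=False))
```

The multiplicative form means any single weak signal (brand new, single author, heavily reverted) drags the whole score down, which matches the intuition that a 0%-reliability article is new with few authors.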
Technically, I think the actual programming could be easy, though the tuning could be pretty hard. I know I could stumble through a web page in PHP that takes a URL and figures out how to access the history of the page - is there any back-end API for that kind of stuff on Wikipedia? My poking around made me think it's the kind of thing they're happy to expose on the web page, though maybe not to developers.
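For what it's worth, there is a public back end: MediaWiki exposes api.php, and a query with prop=revisions returns a page's edit history as JSON - no screen-scraping needed. A minimal sketch (in Python rather than PHP for brevity; the helper names are my own):

```python
import json
from urllib.parse import urlencode
from urllib.request import urlopen

API = "https://en.wikipedia.org/w/api.php"

def history_url(title, limit=500):
    """Build a MediaWiki API URL requesting revision metadata for a page."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "ids|timestamp|user|comment",  # fields to return per revision
        "rvlimit": limit,                        # 500 is the anonymous maximum
        "format": "json",
    }
    return API + "?" + urlencode(params)

def fetch_revisions(title):
    """Fetch revision metadata for one article (makes a network call)."""
    with urlopen(history_url(title)) as resp:
        data = json.load(resp)
    # The result keys pages by numeric page ID; grab the single entry.
    page = next(iter(data["query"]["pages"].values()))
    return page.get("revisions", [])
```

From there, revision timestamps and user names give you most of the raw inputs (age, edit frequency, author count); detecting reversions would take a bit more work, e.g. scanning edit comments for "revert"/"rv" or comparing revision content.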
Elsewhere on the net someone pointed me to the WikiTrust project, but I am not sure if I can use it effectively without mirroring the entire Wikipedia. I do like the idea of contextual highlighting to flag the more and less reliable parts of an article.
Thoughts, advice, recriminations encouraged!