Internal:Scientific claim wiki

It would be good if there were a searchable location to check whether the conclusions of published scientific work have been disputed -- because the data turned out to be wrong, the interpretation turned out to be wrong, or the conclusions were overwhelmed by counter-evidence. Otto Yang, Peter Meyer, and Lane Rasberry are on a joint phone call about this prospect.

A wiki-type database could be established where experts rate papers on points such as:
 * One proposal (from Otto Yang):
 * 1. Reproducibility of data (with links to subsequent papers)
 * 2. Correctness of interpretation of data (the data could be sound but, in retrospect, misinterpreted -- later data on the topic may require the interpretation to be revised)
 * 3. Acceptance of results/conclusions in the field
 * 4. Listing of papers on the same topic that agree/disagree
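The four rating points above could be captured as a structured record per evaluation. Below is a minimal sketch in Python of what such a record might look like; the class name, fields, scoring scale, and "disputed" threshold are all invented for illustration, not an agreed design:

```python
from dataclasses import dataclass, field

@dataclass
class PaperEvaluation:
    """One expert's structured rating of a published paper (hypothetical schema)."""
    doi: str
    reproducibility: int            # point 1: 1-5 score, with links to replication attempts
    interpretation_correct: int     # point 2: 1-5; data may be sound but misread
    field_acceptance: int           # point 3: 1-5; acceptance of the conclusions in the field
    replication_links: list = field(default_factory=list)
    agreeing_papers: list = field(default_factory=list)      # point 4: papers that agree
    disagreeing_papers: list = field(default_factory=list)   # point 4: papers that disagree

    def disputed(self) -> bool:
        # One possible rule: any low score, or any disagreeing paper on record,
        # flags the work as disputed (threshold of 2 is arbitrary here)
        low_score = min(self.reproducibility,
                        self.interpretation_correct,
                        self.field_acceptance) <= 2
        return low_score or bool(self.disagreeing_papers)

# Example: a paper whose interpretation later came into question
ev = PaperEvaluation(doi="10.0000/example", reproducibility=4,
                     interpretation_correct=2, field_acceptance=3,
                     disagreeing_papers=["10.0000/rebuttal"])
print(ev.disputed())  # True
```

On a MediaWiki platform, the same record could instead live in a semantic template, with each field as a property, so that "disputed" papers become queryable.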


 * Example issues
 * Autism-vaccine claims
 * RV 144 vaccine -- see on wikipedia: https://en.wikipedia.org/wiki/RV_144
 * Reinhart and Rogoff on whether there is a threshold of government debt at 90% of GDP causing loss of trust in financial markets (they found YES, but it was later shown that they had mistakenly left out five key observations, so the conclusion was not justified)
 * Levitt on whether abortions reduced crime rates 20 years later, state by state (econ) (the methods were shown to be incorrect, apparently by mistake; the conclusion is unsettled)


 * Existing scientific/expert/dispute summary sites
 * Citizendium; has expert articles on general topics ; no known standard format for disputed claims
 * Scholarpedia ; has expert articles on general topics ; no known standard format for disputed claims
 * OpenWetWare ; MediaWiki with semantics, and many bio-related labs have pages there on a common platform ; not known whether it lists specific works and disputes in a common format
 * discoursedb -- has opinion pieces organized by their basic posture ; not oriented toward scientific publications
 * sciencegist.com -- mainly oriented to writing short summaries of scientific works in plain English ; no standard format for info on agree/disagree/dispute/debunk
 * acawiki -- has 1000 summaries of academic works ; no standard format for disputes
 * reffit.com -- has summaries of science articles, and a column for praise and criticism. GOOD EXAMPLE TO SEE. It does not seem to be a wiki and is not a MediaWiki. See http://reffit.com/about.
 * Web of Science -- has data on which scientific papers cited which other papers (but no known format for recording disputes and comments?)
 * Retraction Watch -- blog covering recent papers with known dramatic errors of some kind -- thanks User:Antony-22!
 * others listed at http://acawiki.org/AcaWiki:Similar_projects. Let's figure out whether anybody's marking what's been debunked. And why not?
 * take a look at http://www.iassistdata.org/downloads/2013/2013_e4_castro.pdf
 * http://impactstory.org


 * Issues and observations
 * observation: NONE of these other smart sites, so far as I can tell, has a simple category for "disputed" or "debunked."  I'm not sure why.  Calls for research.  The service of identifying disputed/debunked would plainly be useful. Need to summarize what these sites do and how it relates to this project/mission. Econterms (talk) 14:18, 25 March 2014 (EDT)
 * observation: A wiki doing this would not generally identify TRUTH but rather identify whether a conclusion is disputed, and why.
 * Issue: what expertise level is required to contribute? MediaWiki is not an ideal platform for careful management of access control and rights. We can manage clear misbehavior, but not content disputes, or at least not well.
 * How should participants identify their own conflicts of interest?
 * How should participants identify their own knowledge base? Can we require them to have a user page before posting an evaluation?
 * How to prevent a flood of anti-vaccine commentary for example?  One approach: develop community and technology traditions before they show up.  Community:  a few scientists and good faith-participants who are willing to defend relatively objective/scientific standards.
 * Observation/constraint: If the site is global and covers all sciences, it can earn trust, gain traction, and contribute a lot of value. Focusing on a narrow set of fields is less risky, but it may wither from lack of interest.


 * Support for project
 * Dr. Otto Yang says there is probably funding for this and identifies important examples of dispute-worthy science in vaccine realm
 * Wiki DC could help manage and work on it.
 * Platform, design, and structure are not yet clear/agreed on.
 * Where is the community to support the effort? Can we put that together?
 * Lane Rasberry, on joint phone call, supports mission, has worked on such things, refers to strengths of Wikipedia content on this already
 * Wikidata is probably going in the direction of having huge lists of scientific papers, with DOIs, URLs, etc. Wikidata's list could then be the domain/universe of sources about which disputes are relevant and displayed in a front end wiki.


 * To do
 * Peter to try out an example, perhaps on acawiki.org, for discussion.
 * Peter to post on [[meta:IdeaLab]], when we have more of an agreed-on proposal.
 * Many wiki/science experts to consult: Daniel Mietchen, Yaron Koren (expert Semantic MediaWiki programmer who made discoursedb.org and would like this project), w:User:Debivort (Harvard neuroscientist/entomologist, nice guy on-wiki; as a scientist he must be familiar with the debunking issue and be able to envision a wiki scheme that would work)
 * This proposal submitted for presentation at Wikimania is also relevant:
 * Otto to check with Applied Medical on their interest in supporting this
 * Otto to offer example of sci work with critiques, both commentary and categories of critique
 * Lane to be in touch with Dario and/or Daniel and other wikidata/science people
 * next phone call: Thursday, April 10, at 4pm EST


 * Classification systems to describe how/why academic source A is citing academic source B
 * The classification system here: http://www.bmj.com/content/348/bmj.g1585
 * Otto suggests: for B's methodology, as background, to confirm a result from A, to contradict a result from B

We discussed classification systems to describe how/why academic source A is citing academic source B
 * updates for the group
 * Otto identifies four types: for B's methodology, as background, to confirm a result from A, to contradict a result from B
 * Lane pointed us to this paper: http://www.bmj.com/content/348/bmj.g1585. Table 1 classifies scientific paper citations OF Wikipedia, using a classification system to describe why A cites B. More than four categories.
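Otto's four citation types could be modeled as a small controlled vocabulary, so each citation edge (A cites B) carries a machine-readable tag. A hedged sketch in Python -- the enum names, labels, and DOIs below are invented for illustration, not an agreed scheme:

```python
from enum import Enum

class CitationType(Enum):
    """Why paper A cites paper B -- the four types suggested (labels are invented)."""
    METHODOLOGY = "uses B's methodology"
    BACKGROUND = "cites B as background"
    CONFIRMATION = "confirms a result"
    CONTRADICTION = "contradicts a result from B"

# Tagging one citation edge (A -> B) with its type:
citation = {"citing": "10.0000/paperA",
            "cited": "10.0000/paperB",
            "type": CitationType.CONTRADICTION}
print(citation["type"].value)  # prints "contradicts a result from B"
```

Counting the CONTRADICTION edges pointing at a given paper would be one simple way to surface disputed works automatically; the BMJ paper's Table 1 suggests a finer-grained vocabulary may be needed.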

Please see http://openwetware.org/wiki/Main_Page. It has about 24,000 "content pages" (to confirm, see http://openwetware.org/wiki/Special:Statistics) and gets dozens of edits a day (http://openwetware.org/index.php?title=Special:RecentChanges&limit=500&hideminor=0).

Otto to send another example of a commentary on a paper, and/or ten cases of citations to the one he sent last time. Peter and Lane will try to wiki-fy these on acawiki.org and see what can be done about a classification system on-wiki. To see our progress, visit http://acawiki.org/Special:RecentChanges and look for edits by us.

See also that PLoS is inviting the drafting of computational bio papers on-wiki: http://topicpages.ploscompbiol.org/wiki/Main_Page