The ARTT project is to develop a software tool to aid reliability evaluation and discussion. We envision a tool that, when given a URL to a social media page or a Wikimedia page, can give useful information and advice. The tool understands the structure of what it is looking at: if it is looking at an English Wikipedia article, it knows there might be a talk page and how to interpret it; knows how to interpret wikilinks and footnotes; knows about ORES scores; and knows about the edit history of the article and how to detect and evaluate the users who have edited it.
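As a sketch of how the tool might consume ORES article-quality scores: the snippet below parses a response shaped like the ORES v3 scores endpoint's output and pulls out the predicted quality class. The revision ID and probability values here are made-up examples, not real data.

```python
def extract_article_quality(response: dict, wiki: str, rev_id: str):
    """Pull the predicted quality class and its probability out of an
    ORES v3 scores response (field names assumed from the public API)."""
    score = response[wiki]["scores"][rev_id]["articlequality"]["score"]
    prediction = score["prediction"]
    return prediction, score["probability"][prediction]

# Hypothetical response, trimmed to the fields we actually use.
sample = {
    "enwiki": {
        "scores": {
            "1099871234": {
                "articlequality": {
                    "score": {
                        "prediction": "B",
                        "probability": {"FA": 0.03, "GA": 0.12, "B": 0.55,
                                        "C": 0.20, "Start": 0.08, "Stub": 0.02},
                    }
                }
            }
        }
    }
}

prediction, confidence = extract_article_quality(sample, "enwiki", "1099871234")
```

A real client would fetch the JSON over HTTP and handle missing revisions; this only shows the extraction step.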
In this round of work we are focused on vaccine-related articles, but we want to use practical principles that would also work for articles related to politics. We can offer a list of vaccine-related articles.
The tools should be able to detect certain kinds of reliability issues -- e.g., the use of problematic "perennial sources," or certain common claims or relationships (such as vaccines-autism) that signal links to inaccurate information, misinformation, or disinformation. The tools should also be able to give advice on reliability matters, and on how to handle discussions about them with other editors (or other users on a social platform).
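A minimal sketch of the perennial-sources check described above: match the hostname of each cited URL against a ratings table. The domains and ratings below are illustrative placeholders; the real tool would load the current community-maintained list.

```python
from urllib.parse import urlparse

# Illustrative excerpt of a perennial-sources table (placeholder domains).
PERENNIAL_RATINGS = {
    "example-tabloid.com": "deprecated",
    "example-blog.net": "generally unreliable",
    "example-news.org": "generally reliable",
}

def rate_citation(url: str) -> str:
    """Return the perennial-sources rating for a cited URL, or 'unknown'."""
    host = urlparse(url).hostname or ""
    # Strip a leading "www." so www.example-tabloid.com still matches.
    if host.startswith("www."):
        host = host[4:]
    return PERENNIAL_RATINGS.get(host, "unknown")
```

Running the checker over an article's extracted citation URLs would then surface which footnotes rest on deprecated or unreliable sources.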
From April to August 2022, we are to help gather information to help design the internals, external appearance, and desirable capabilities of that tool.
Relevant design questions:
- Is this a relevant signal: "the most recent edit is by an editor who is unreliable, often in conflict, or unknown (e.g., an IP editor)"?
- Do we want to mark particular versions of an article as "more reliable" or "less reliable"? In other words, is our reliability evaluation associated with a particular version of an article? I think we should assume yes and not ask a lot in focus groups about it. Let's draw from the lit about ORES and the discussions leading up to ORES.
- Counts of footnotes and sources?
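On the first design question above (unknown editors such as IP editors): unregistered Wikipedia edits are attributed to an IP address in the revision's user field, so a minimal detection sketch using only the standard library might look like this.

```python
import ipaddress

def is_ip_editor(username: str) -> bool:
    """True if the revision's user field is an IPv4 or IPv6 address,
    i.e. the edit came from an unregistered editor."""
    try:
        ipaddress.ip_address(username)
        return True
    except ValueError:
        return False
```

This covers only the "unknown" part of the signal; "unreliable" and "often in conflict" would need per-user history, which is a separate design question.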
- Monday and Wednesday meetings
- Add to meta:Research:Projects, including the Press reference ()
- Knowledge Futures (KFG) is a possible partner. They are working on "Underlay." That might be a formal citation scheme for machines to evaluate the reliability of texts -- can't tell yet. SJ Klein and Danny Hillis and a couple dozen other people are involved. It's headquartered in Cambridge, MA. They created PubPub software; not clear what that is.
- Underlay is developed by Knowledge Futures and would be a kind of super-Wikidata
- Files tracked/stored by Underlay will use the IPFS standard for distributed file storage across a network
- ORES (Objective Revision Evaluation Service) and other assessments of articles
- IFCN = International Fact-Checking Network