Internal:Public Policy/Reducing online harassment


UPDATE 31 Dec 2017: There is a new bill called the ENOUGH Act. It has sponsors from both parties and both houses of Congress. The text looks similar to the earlier version (then called IPPA) discussed below. Our speaker and likely ally Danielle Citron supports it, and so does Facebook.[1] WMF hasn't commented. Possibly one reason for broad, speedy support is that Rep. Joe Barton was just stung by a version of this. Peter has read it. It looks very safe to support.

Sources:

Why Wikimedia DC, why now

  • We have been tracking this legislation since WikiConference USA 2015, where law professor and internet harassment expert Danielle Citron was our guest speaker, and where WMF support staff Patrick Earley and I met with members of Rep. Katherine Clark's staff. (For details see user page User:Avery Jensen/Legislation.)
  • Up until now, while there have been many legislative proposals worth supporting, all the proposals have had some major flaw: they did not have bipartisan support, they did not have support in both houses of Congress, they had a spending portion that almost guaranteed they would never get out of the appropriations committee, or they were so broadly drafted that they were an imperfect fit for the specific needs of our membership. This particular bill has been introduced in both houses of Congress, is sponsored by members of both parties, adds no spending or additional layers of bureaucracy, and is very specific to non-consensual private images. While we could wish for more and better legal protections from harassment, this bill addresses the worst and most distressing type of harassment that our members have experienced and witnessed, and which can have such a chilling effect on our corner of the internet.
  • For details of non-consensual private images in the context of harassment on Wikipedia, see the Wikimedia Harassment Survey 2015 and Forte, Andalibi, and Greenstadt (2017).[2] The latter study includes anonymous interviews with Wikimedians, who perceived the five most common types of threats to be loss of privacy, loss of employment opportunity, fear for safety of self and loved ones, harassment/intimidation, and reputation loss (p. 5). "Participants spoke of threats from private citizens more concretely. They were afraid of things like threats to their families, being doxxed, having fake information about them circulated, being beaten up, or having their heads photoshopped onto porn because they experienced these things or saw them happen to others." (p. 7)
  • Although the WMF has assisted users in the past (for instance, the Wikitravel users who formed Wikivoyage), it does not provide legal assistance to users who have non-consensual private images posted in connection with their Wikipedia activities. Such a person would probably be referred to an organization like the EFF, which mainly does impact litigation and would be unlikely to provide free legal help. The WMF would be unlikely to help a community member directly, as its lawyers can only represent the Wikimedia Foundation and would not be willing to create an attorney-client relationship with users.
  • There may be a heightened local awareness of employment security issues since the reinstatement of the Holman Rule in January 2017, which allows any legislator to amend an appropriations bill to defund individual federal employees.[2] The rule was used in July 2017 in an unsuccessful attempt to cut 89 employees from the CBO.[3]

Draft news release

Peter envisions several paragraphs, adding up to a page:

  • We support the creation of the new legal offense because this behavior is toxic, intimidating harassment. It drives away participants from creating our knowledge pool, our service to the public, and is therefore relevant to Wikimedia. It is used to humiliate and silence participants.
  • Comment on the tradeoffs: (1) We know it is a constraint on free expression.
  • Tradeoff (2): We know it puts a small amount of risk on the platform. Protection for platforms comes from Section 230 of the Communications Decency Act, a.k.a. the "safe harbor" provision, and we think that protection is sufficient. If the platform were accused of misbehavior, we could remove the offending content, fight the action in court, or support changes to the law; we do not anticipate needing to. Note that prosecutors will need to show reasonable judgment, which we expect, and if we do not see that we will support changes.

Pre-2018 discussion

  • Our chapter could publicly support the w:Intimate Privacy Protection Act
  • Text of the draft law (4 pages, clear to read)
  • The proposed law forbids sharing sexually intimate images of identifiable persons with reckless disregard for their lack of consent. It does not forbid the use of such images in reports to law enforcement, the courts, corrections officers, or intelligence services, or in other cases of (unspecified) public interest (e.g., I suppose, reports of a public health problem). The definition of "sexually explicit" is inherited from existing laws. "Reckless" will be interpreted by prosecutors. Interactive computer platform providers (e.g. WMF or Facebook) are not considered violators of the law if a user uploads something reckless, unless the platform explicitly invites such content.
  • Background: a Motherboard/Vice article showing that Google got the drafters of the law to add protections for platform providers (like WMF). It's quite interesting to read. Note that the ACLU does not favor the draft law, apparently because it sees it as an imposition on free speech.
  • Peter's judgment: The proposed law makes sense, and the risks of passing it are smaller than the risks left open by not passing it. The law has appropriate limits. If enforcement seems to go awry, we would support changing it, but that seems unlikely.
Steps before support
  • read the draft law ✓ Done (by Peter). Peter will compare the IPPA bill to the ENOUGH bill. ✓ Done: after some comparison, we can't find any substantive difference. A lot of the text has changed, especially because there is now a long section of definitions at the front, but the one-paragraph abstract, or summary, of the new ENOUGH Act is identical to that of the previously proposed IPPA.
  • consult with WMF public policy staff ✓ Done, with a nice message from Jan Gerlach
  • we will not consult with Tech Lady Mafia, EFF, WM NY, Newyorkbrad, or other partner orgs, because it would take too much time
  • we discussed extraterritoriality: a US person using a server outside the US is still committing a crime if they use that server to host such non-consensual intimate images; see page 6, paragraph (f). Good.

References