Internal:Public Policy/Reducing online harassment

UPDATE 31 Dec 2017: There is a new bill called the Enough Act. It has sponsors from both parties in both houses of Congress. The text looks similar to the earlier version (then called IPPA) discussed below. Our speaker and likely ally Danielle Citron supports it, and so does Facebook.[1] WMF hasn't commented. Possibly one reason for the broad, speedy support is that Rep. Joe Barton was just stung by a version of this himself. It looks very safe to support this proposal.

Sources:

Why Wikimedia DC, why now

  • We have been tracking this legislation for some time, since WikiConference USA 2015, where law professor and internet harassment expert Danielle Citron was our guest speaker, and WMF support staff Patrick Earley and I met with members of Rep. Katherine Clark's staff. (For details see user page User:Avery Jensen/Legislation.)
  • Up until now, while there have been many legislative proposals worth supporting, all of them have had some major flaw: they lacked bipartisan support, they lacked support in both houses of Congress, they included a spending provision that almost guaranteed they would never get out of the appropriations committee, or they were so broadly drawn that they were an imperfect fit for the specific needs of our membership. This particular bill has been introduced in both houses of Congress, is sponsored by members of both parties, adds no spending or additional layers of bureaucracy, and is specific to non-consensual private images. While we could wish for more and better legal protections from harassment, this bill addresses the worst and most distressing type of harassment that our members have experienced and witnessed, and one that can have a chilling effect on our corner of the internet.
  • For details of non-consensual private images in the context of harassment on Wikipedia, see the Wikimedia Harassment Survey 2015 and Forte, Andalibi, and Greenstadt (2017).[2] The latter study includes anonymous interviews with Wikimedians, who perceived the five most common types of threats to be loss of privacy, loss of employment opportunity, fear for safety of self and loved ones, harassment/intimidation, and reputation loss (p. 5). "Participants spoke of threats from private citizens more concretely. They were afraid of things like threats to their families, being doxxed, having fake information about them circulated, being beaten up, or having their heads photoshopped onto porn because they experienced these things or saw them happen to others." (p. 7)
  • Although the WMF has assisted users in the past, for instance the Wikitravel users who formed Wikivoyage, it does not provide legal assistance to users who have non-consensual private images posted in connection with their Wikipedia activities. Such a person would probably be referred to an organization like the EFF, which does only impact litigation and would be unlikely to provide free legal help. The WMF would be unlikely to help a community member directly, as its lawyers can represent only the Wikimedia Foundation and would not be willing to create an attorney-client relationship with users.
  • There may be heightened local awareness of employment-security issues since the reinstatement of the Holman Rule in January 2017, which allows any legislator to amend an appropriations bill to defund individual federal employees.[1] The rule was used in July 2017 in an unsuccessful attempt to cut 89 employees from the CBO.[3]

Draft news release

Peter envisions several paragraphs, adding up to a page:

Wikimedia DC supports the Enough Act and hopes it will pass both houses of Congress.

We support the creation of the new, narrowly defined crime. The behavior to be outlawed -- posting intimate images with reckless disregard for the interests of the person depicted -- has a toxic effect on online participation. It can intimidate, silence, and harass users of online platforms such as Wikipedia and the other Wikimedia sites, and it can drive participants, especially women, away from Wikimedia. That is not only unfair to them; it also delays and complicates our work of bringing their knowledge to the public. The issue is therefore directly relevant to Wikimedia. The offense is rare, but it is so toxic that it frightens even people who are not themselves attacked.

We recognize some tradeoffs in creating the new law. For one, it creates a constraint on free expression. We are confident that the behavior outlawed here is rare, that it is carefully and narrowly defined in the Act, and that it will be sensibly interpreted by law enforcement and prosecutors. The behavior is highly toxic, so it is worth accepting some cost and constraint to reduce it.

Another tradeoff is that it creates some risk for nonprofit knowledge programs such as the Wikimedia sites. The relevant legal protection for platforms comes from the "safe harbor" provision, Section 230 of the Communications Decency Act. We think that protection is sufficient. If a platform is accused of misbehavior, we can eliminate the offending content, fight the action in court, or support changes to the law. We do not anticipate needing to do so.

Law enforcement and prosecutors will need to show reasonable judgment, which we expect; if we do not see it, we will support changes. A greater danger is that too few cases will be identified. We will support further changes in the law if we think the balance is not coming out right.

Our chapter has not previously advocated changes in law, but we think this is a good and important proposal.

Pre-2018 discussion

  • Our chapter could publicly support the w:Intimate Privacy Protection Act
  • Text of the draft law (4 pages, easy to read)
  • The proposed law forbids sharing sexually intimate images of identifiable persons with reckless disregard for their lack of consent. It does not forbid the use of such images in reports to law enforcement, the courts, corrections officers, or intelligence services, or in other cases of (unspecified) public interest (e.g., I suppose, reports of a public health problem). The definition of "sexually explicit" is inherited from existing laws. "Reckless" will be interpreted by prosecutors. Interactive computer platform providers (e.g. the WMF or Facebook) are not considered violators of the law if a user uploads something reckless, unless the platform explicitly invites such content.
  • Background: a Motherboard/Vice article showing that Google got the drafters of the law to add protections for platform providers (like the WMF). It's quite interesting to read. Note that the ACLU does not favor the draft law, apparently because of the burden it places on free speech.
  • Peter's judgment: The proposed law makes sense, and the risks of passing it are less than the risks left open by not passing it. The law has appropriate limits. If enforcement seems to go awry, we would support changing it, but that seems unlikely.

Steps before support

  • read the draft law ✓ (done by Peter). Peter will compare the IPPA bill to the Enough bill. ✓ Done -- after some comparison, can't find any SUBSTANTIVE difference. There is a lot of text changed, especially because there is now a long section of definitions at the front. The one-paragraph summary of the new Enough Act is identical to that of the previously proposed IPPA.
  • consult with WMF public policy -- Done ✓ with a nice message from Jan Gerlach
  • let's not consult with Tech Lady Mafia, EFF, WM NY, Newyorkbrad, or other partner orgs, because it would take too much time
  • we discussed extraterritoriality -- a US person using a server outside the US is still committing a crime if they use that server to host such inappropriate intimate pictures; see page 6, paragraph (f). Good.

References