Twitter considers new Dehumanization Policy -- but wants your feedback first
Facing continued criticism for failing to tackle abuse on its platform, Twitter has announced a new approach. The company plans to introduce new policies, but before they become part of its official rules it will ask users for feedback.
Starting off with a new Dehumanization Policy, Twitter will invite users to give their opinion and complete surveys about proposed policy changes. The first policy to be subject to this public scrutiny says: "You may not dehumanize anyone based on membership in an identifiable group, as this speech can lead to offline harm".
Previously, Twitter policies were subject to input from its own Trust and Safety Council and a handful of experts. Now, it says, "we're trying something new by asking everyone for feedback on a policy before it's part of the Twitter Rules".
The proposed Dehumanization Policy is quite wide-ranging, and Twitter explains why it is considering introducing it:
For the last three months, we have been developing a new policy to address dehumanizing language on Twitter. Language that makes someone less than human can have repercussions off the service, including normalizing serious violence. Some of this content falls within our hateful conduct policy (which prohibits the promotion of violence against or direct attacks or threats against other people on the basis of race, ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, age, disability, or serious disease), but there are still Tweets many people consider to be abusive, even when they do not break our rules.
In a blog post explaining that users will be able to provide feedback for the next fortnight, Twitter offers up a couple of definitions for consideration:
Dehumanization: Language that treats others as less than human. Dehumanization can occur when others are denied of human qualities (animalistic dehumanization) or when others are denied of human nature (mechanistic dehumanization). Examples can include comparing groups to animals and viruses (animalistic), or reducing groups to their genitalia (mechanistic).
Identifiable group: Any group of people that can be distinguished by their shared characteristics such as their race, ethnicity, national origin, sexual orientation, gender, gender identity, religious affiliation, age, disability, serious disease, occupation, political beliefs, location, or social practices.
Users are invited to provide feedback about whether the definitions and policy are clear, if there are any improvements that could be made, and whether there are any "examples of speech that contributes to a healthy conversation, but may violate this policy" -- feedback is limited to just 280 characters.