Report: YouTube and Facebook are now using automated tools to remove extremist content


With the likes of ISIS understanding the power of social media, Facebook and other online services find themselves under increasing pressure to counter terrorist and other extremist content. A report by Reuters says that a number of online companies are using automated tools to remove videos that violate terms of use.

Such tools have previously been used to prevent the spread of copyrighted videos online, but now it seems that they have been put to a new task. While automation can do little, if anything, to prevent the initial appearance of extremist videos, social networks can use it to stem the flow of republishing.

Reuters says that Facebook and YouTube are just two of the sites to have made the switch to automation as they seek to prevent their platforms from being used to spread propaganda. The technology uses a fingerprinting technique previously employed to identify and remove copyrighted material.

The system dramatically speeds up the process of removing reposts, and takes the legwork out of doing so. But Reuters points out that "such a system would catch attempts to repost content already identified as unacceptable, but would not automatically block videos that have not been seen before".
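As a rough illustration of how that kind of fingerprint matching might work -- not how Facebook or YouTube actually implement it -- here is a minimal Python sketch. Real systems rely on perceptual or robust hashes that survive re-encoding and cropping; this example uses a plain cryptographic hash purely to show the blocklist idea.

```python
import hashlib

# Hypothetical blocklist of fingerprints for videos already flagged and removed.
# A plain SHA-256 over the file bytes stands in for the robust fingerprints
# production systems would use.
known_extremist_fingerprints = set()


def fingerprint(video_bytes: bytes) -> str:
    """Return a fingerprint for an uploaded video (illustrative exact hash)."""
    return hashlib.sha256(video_bytes).hexdigest()


def record_removed_video(video_bytes: bytes) -> None:
    """After a reviewer removes a video, remember its fingerprint so that
    automated tooling can block future reposts of the same content."""
    known_extremist_fingerprints.add(fingerprint(video_bytes))


def is_known_repost(video_bytes: bytes) -> bool:
    """True if an upload matches content that was previously removed."""
    return fingerprint(video_bytes) in known_extremist_fingerprints


# A video removed once is caught automatically when reposted,
# but a never-before-seen video still needs human review.
record_removed_video(b"previously removed propaganda clip")
print(is_known_repost(b"previously removed propaganda clip"))  # True  - repost blocked
print(is_known_repost(b"brand new, unseen video"))             # False - not caught
```

This mirrors the limitation Reuters describes: matching only works against content that has already been identified, so genuinely new videos pass through until a human flags them.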

Twitter, Facebook and other prominent companies have already taken steps to try to eradicate online hate speech, but the methods used have been rather labor-intensive.

