GDPR rules could be used by the UK to fine tech firms for 'harmful content'
The UK is looking at hitting technology companies with financial penalties if they fail to do enough to counter "harmful content" on their platforms.
Jeremy Wright, the Digital, Culture, Media and Sport (DCMS) secretary, says that tech firms need to be made to "sit up and take notice" when it comes to dealing with problematic content. While clearly aimed more at social media companies, the proposals would encompass other technology firms as well. The plans also suggest that search engines should remove links to offending websites, and that some sites could even be blocked completely.
The DCMS would like to see the creation of a watchdog to oversee technology companies and to draw up a code of practice for them to follow. Speaking to the BBC, Wright outlined what the proposals could mean: "The era of self-regulation for online companies is over. Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough".
He went on to say:
If you look at the fines available to the Information Commissioner around the GDPR rules, that could be up to 4 percent of a company's turnover. We think we should be looking at something comparable here.
The proposals are outlined in a government white paper which contains a number of ideas:
- establishing an independent regulator that can write a "code of practice" for social networks and internet companies
- giving the regulator enforcement powers including the ability to fine companies that break the rules
- considering additional enforcement powers such as the ability to fine company executives and force internet service providers to block sites that break the rules
The paper covers areas including the spread of terrorist content, child sexual abuse material, revenge porn, hate crimes, harassment, the sale of illegal goods, cyber-bullying, trolling, and the dissemination of fake news and disinformation.
Facebook has already publicly called for greater regulation, and now the company's head of UK policy, Rebecca Stimson, says:
New regulations are needed so that we have a standardised approach across platforms and private companies aren't making so many important decisions alone. New rules for the internet should protect society from harm while also supporting innovation, the digital economy and freedom of speech.
Twitter appears to be in broad agreement, with the firm's head of UK public policy, Katy Minshall, saying: "We look forward to engaging in the next steps of the process, and working to strike an appropriate balance between keeping users safe and preserving the open, free nature of the internet".