Britain appoints regulator to monitor online content.
The British government recently announced plans to give the U.K.’s telecommunications watchdog, Ofcom, the power to enforce a “duty of care” on companies such as Facebook and Twitter “to protect users from harmful and illegal terrorist and child abuse content.” If unlawful material is found on these sites, companies will be fined. Ofcom already monitors radio and television broadcasters.
Officials will need to draft new legislation in order for the new regulations to take effect. Once it is in place, the law will require platforms to remove “illegal content” quickly and “minimize the risk of it appearing.” Platforms that do not comply will be sanctioned. And, in order to safeguard freedom of expression, the rules won’t restrict individuals from accessing or posting offensive but legal content.
Online companies “will be required to explicitly state what content and behavior is acceptable on their sites in clear and accessible terms and conditions and enforce these effectively, consistently and transparently,” the government said.
“Facebook has long called for new regulations to set high standards across the internet,” said Rebecca Stimson, the social network’s head of U.K. public policy. “New rules are needed so that we have a more common approach across platforms and companies aren’t making so many important decisions alone.”
YouTube also said it looked forward to “working in partnership with the government and Ofcom to ensure a free, open and safer internet that works for everyone.”
Digital Secretary Nicky Morgan explained the new rules would be “proportionate and strong.” She said, “We have an incredible opportunity to lead the world in building a thriving digital economy, driven by groundbreaking technology, that is trusted by and protects everyone in the U.K. There are many platforms who ideally would not have wanted regulation, but I think that’s changing. I think they understand now that actually regulation is coming.”
Julian Knight, chair elect of the Digital, Culture, Media and Sport Committee, called for “a muscular approach” to regulation. He added, “That means more than a hefty fine – it means having the clout to disrupt the activities of businesses that fail to comply, and ultimately, the threat of a prison sentence for breaking the law.”
The National Society for the Prevention of Cruelty to Children also said it welcomed “a duty of care model that puts the onus on big tech to prevent online harms.”
Seyi Akiwowo set up the online abuse awareness group Glitch after she experienced sexist and racist harassment when a video of her giving a talk was posted on a neo-Nazi forum.
“When I first suffered abuse the response of the tech companies was below [what I’d hoped],” she said. “I am excited by the Online Harms Bill – it places the duty of care on these multi-billion-dollar tech companies.”
Some countries have already instituted similar laws restricting online content. Australia passed the Sharing of Abhorrent Violent Material Act in April 2019, introducing criminal penalties for social media companies. China blocks Twitter, Google and Facebook, and monitors information for politically sensitive content.