Can the UK Online Safety Bill take on misogyny?

A 2021 Refuge report found that more than one in three British women have experienced online abuse. Companies could face fines of up to 10% of their turnover if they do not remove illegal content

Reuters
Published: 30 Jan 2023, 08:41 AM
Updated: 30 Jan 2023, 08:41 AM

The arrest of internet influencer Andrew Tate on suspicion of human trafficking brought his extreme online misogyny and hate speech into the spotlight, with debate swirling around whether lawmakers and social media platforms did enough to stop its spread.

The former kickboxer, who amassed millions of followers on social media, was detained last month and is being investigated by a Romanian anti-crime unit for allegedly forming an organised crime ring to sexually exploit women.

TikTok, Facebook, Instagram and YouTube suspended Tate's official accounts last August, but video clips of Tate shared by his followers continue to circulate, and his Twitter account was reinstated in November after new owner Elon Musk took over.

Now, advocacy groups in Britain that say social media firms are failing to curb harmful content are looking to the proposed Online Safety Bill to hold big tech firms - and their bosses - accountable.

The bill, which is awaiting passage in parliament, will require online platforms to protect users from illegal content, such as terrorism, child sex abuse and revenge pornography, and from content that violates their own rules on hate speech.

It will also criminalise cyber-flashing - when a man sends a photo of his penis via a digital device - and the non-consensual sharing of manufactured intimate images (known as deepfakes).

"Women are not safe online and something really needs to be done about this," Jess Eagleton, policy manager at domestic abuse charity Refuge said.

"The government has that opportunity with the Online Safety Bill."

Companies could face fines of up to 10% of their turnover if they do not remove illegal content, and bosses of big tech firms could face jail time for ignoring the new rules.

"Platforms have never properly enforced their rules. (The bill) produces, for the first time ever, real disincentives to sitting on your hands," said Imran Ahmed, chief executive of the Centre for Countering Digital Hate (CCDH).

MORE EFFECTIVE

More than one in three British women have experienced online abuse, according to a 2021 Refuge report.

In recent years, social media platforms have beefed up their own safeguards against harassment and bullying - such as Facebook's rules against sexualised attacks on public figures and mass harassment of individuals through direct messaging.

Still, social media firms largely fail to respond to reports of misogynistic abuse: Instagram, owned by Meta, did not respond nine times out of 10 when women reported abusive direct messages on the platform, a study by CCDH last year showed.

In addition, nearly half the accounts that Twitter failed to remove for abusing women went on to post misogynist content again, CCDH found.

Tech firms are also slow to respond to domestic abuse victims' requests to take down intimate images, according to Refuge, which has pushed for better regulation of social media platforms.

"For too long, social media giants have marked their own homework - we need regulation to prioritise domestic abuse that we see many platforms just ignore," said Refuge's Eagleton, who welcomed the government's intent to criminalise coercive behaviour via social media.

But the bill has gaps, said Eagleton.

Refuge is campaigning alongside online abuse charity Glitch and other women's groups for the bill to require platforms to incorporate a Violence Against Women and Girls code of practice.

The code would require platforms to carry out risk assessments that address online abuse of women and girls and take measures to mitigate it.

FIGHTING THE ALGORITHM

In November, the government dropped a provision that protected adult users from harmful content that is not a criminal offence, replacing it with a requirement that social media firms enforce their own rules and offer users tools to filter out such content.

But the majority of online abuse women face falls below the criminal threshold, according to online abuse charity Glitch, and firms will only face fines if they fail to remove misogynistic content that is in breach of their own guidelines.

Rights groups have warned that algorithms on social media platforms can exacerbate the spread of hate speech, and that changes to the bill shift the focus onto how sites are used, and away from regulating the way platforms work.

"The spread of online misogyny is not organic," said Gwen Taylor, policy manager at Glitch. "It is being promoted by platform recommendations and algorithms that are sharing content that gets a lot of reactions."

Proposed user tools would only hide harmful content for individual users rather than reduce the level of abusive content circulating, the charity said.

An investigation by The Observer last year found that followers of Tate - who has talked about hitting and choking women and trashing their belongings - were told to flood social media with his most controversial clips to get more views and engagement.

Videos of Tate on TikTok have been viewed more than 11 billion times. But only a small portion of viral Tate content came directly from his accounts; the majority was circulated by other social media users.

TEACHER ROLE

While attention on the Online Safety Bill grows ahead of a debate in the House of Lords in the coming weeks, secondary school teachers across Britain have been taking action of their own to curb the influence of figures like Tate.

Several schools have started dedicating assemblies and classes to discussing misogyny, and are sending advice to parents on how to talk about Tate with their children.

"I'm all for stiffening the Online Safety Bill to make sure the content young people see is not warping the way they think, and leading them down a dangerous path," said Robin Macpherson, head of Robert Gordon's College in Aberdeen, which has introduced discussions on Tate and misogyny in class.

"Teachers are often seen as a solution to society's wider problems, but it needs to be a partnership with parents, community organisations and governments," she said.

"There needs to be a regulator that's got teeth to make sure that the bill is actually enforced and taken seriously."