Online safety bill ‘a missed opportunity’ to prevent child abuse, MPs warn

The sharing of some of the most insidious images of child abuse will not be prevented by a new government bill that aims to make the internet a safer place, MPs have said.

The draft online safety bill is not clear or robust enough to tackle some forms of illegal and harmful content, according to a report by the digital, culture, media and sport (DCMS) committee. The landmark bill places a duty of care on tech firms to protect users from harmful content or face substantial fines imposed by the communications regulator Ofcom.

“In its current form what should be world-leading, landmark legislation instead represents a missed opportunity,” said Julian Knight, the chair of the DCMS committee. “The online safety bill neither protects freedom of expression, nor is it clear nor robust enough to tackle illegal and harmful online content. Urgency is required to ensure that some of the most pernicious forms of child sexual abuse do not evade detection because of a failure in the online safety law.”

The report urges the government to tackle types of content that are technically legal, such as “breadcrumbing”, where child abusers leave digital signposts for fellow abusers to find abuse content, and deepfake pornography. It says neither is currently covered by the bill, although creators of deepfake images can be prosecuted for harassment. On child sexual abuse, the committee said the bill should tackle behaviour by predators designed to evade content moderation.

“One starting point should be to reframe the definition of illegal content to explicitly add the need to consider context as a factor, and include, explicitly, definitions of activity like breadcrumbing on the face of the bill,” says the report.

As currently drafted, the bill’s duty of care is split into three parts: preventing the proliferation of illegal content and activity such as child pornography, terrorist material and hate crimes; ensuring children are not exposed to harmful or inappropriate content; and, for large tech platforms such as Facebook, Twitter and YouTube, ensuring that adults are protected from legal but harmful content, a catch-all term covering issues such as cyberbullying.

The report recommends that the bill give a definition of legal but harmful content that includes undermining someone’s reputation, national security or public health. The category should also account for attempts to interfere in elections or deter the public from voting, it says. Legal but harmful content is not strictly defined in the draft bill, which instead gives the culture secretary a key role in defining it.

Reflecting concerns that the legal but harmful category will affect freedom of speech, the report recommends a “must balance” test that weighs whether freedom of speech has been protected sufficiently in decisions on content.

The report warns that the bill is “vague” about the definition of illegal content and should be redrafted to state that it applies to existing criminal offences, rather than regulatory or civil offences. It also calls for Ofcom to have the power to conduct audits of tech companies’ systems. The former information commissioner Elizabeth Denham told MPs and peers last year that Ofcom should have the power to “look under the bonnet” of tech firms and scrutinise algorithms that could steer users down dangerous content rabbit holes.

It is the second committee report to demand changes to the bill, after a joint committee of MPs and peers urged a wide range of amendments, including criminal sanctions for tech executives who fail to deal with “repeated and systemic” safety failings. However, the DCMS committee report pushes back on calls for a permanent committee to oversee the act, stating that such a move would be a “significant departure from convention” and oversight of the act should be carried out by existing cross-party select committees.

The government has already suggested “significant improvements” could be made to the draft bill, with the culture minister Chris Philp telling MPs during a debate in the Commons this month that there were a number of areas where the online safety bill could be “improved substantially”.

A DCMS spokesperson rejected the committee’s criticism of the bill, saying it set a “gold standard” for safety. They said: “We do not agree with the criticism of the committee. The bill has been recognised as setting a global gold standard for internet safety. It has strict measures including a duty of care to stamp out child sexual abuse, grooming and illegal and harmful content.”

The spokesperson added: “The bill will make the UK the safest place to go online while protecting freedom of speech.”
