The vast majority of UK tech professionals believe the government’s forthcoming Online Safety Bill (OSB) is not fit for purpose, with just 19% saying the measures would actually make the internet safer, a survey by BCS, The Chartered Institute for IT, has found.
The OSB – the passage of which was paused by the government in July 2022 following legislative timetabling issues – would place a “duty of care” on tech companies to identify and remove illegal material, as well as clarify how they would deal with content that is “legal but harmful” to adults and children.
Failure to do so could result in fines of up to 10% of their turnover, imposed by the online harms regulator, which was confirmed to be Ofcom in December 2020.
However, according to a BCS survey of around 1,300 UK tech professionals, only 14% believed the legislation was “fit for purpose”, while 46% said it was unworkable. While just under one in five thought the measures would make the internet safer, just over half felt they would not.
One of the most controversial aspects of the bill is whether and how companies should deal with “legal but harmful content”, which critics have said could stifle freedom of expression online.
While the bill will not require the removal of legal content, larger platforms that fall into “category 1” – services with the highest risk functionalities and the highest user-to-user reach – will be required to set out how “priority content that is harmful to adults”, such as suicide-related material, is dealt with by their service.
While Parliament is yet to specify the types of harmful content, service providers will be required to balance their limiting of such content with the need to protect users’ freedom of speech.
Of those polled by BCS, 58% said the legislation would have a negative effect on freedom of speech, while only 9% were confident that “legal but harmful content” could be effectively and proportionately removed.
Meanwhile, 74% said the bill would do nothing to stop the spread of disinformation and fake news.
“There is a real need to prevent online harm, but this law only goes part way to trying to achieve that. The aim should be to prevent hatred and abusive online behaviours by stopping harmful material from appearing online in the first place – and that takes a mix of both technical and societal changes,” said Rob Deri, chief executive of BCS, adding that the new prime minister should “fundamentally review” the legislation.
“The technology itself has an important part to play in keeping people safe on social media platforms. However, the bill leans too heavily on tech solutions to prevent undesirable content, which can’t be relied upon to do that well enough and could affect freedom of speech and privacy in ways that are unacceptable in a democratic society.
“The legislation should also focus on substantive programmes of digital education and advice so that young people and their parents can confidently navigate the risks of social media throughout their lives,” added Deri.
The bill has already been through a number of changes. When it was introduced in March 2022, for example, a number of criminal offences were added to make senior managers liable for destroying evidence, failing to attend or providing false information in interviews with Ofcom, and for obstructing the regulator when it enters company offices for audits or inspections.
At the same time, the government announced it would significantly reduce the two-year grace period on criminal liability for tech company executives, meaning they could be prosecuted for failure to comply with information requests from Ofcom within two months of the bill becoming law.
In July 2022, Ofcom published its Online Safety Roadmap, provisionally setting out its plans to implement the government’s internet safety regime in the first 100 days of its enactment, but noted that the plan was subject to change as the bill evolved further.
The OSB is due to return to Parliament once the new prime minister is selected on 5 September 2022.