Chris Philp is the Minister for Tech and the Digital Economy. He is also the MP for Croydon South.
Elon Musk was right when he called Twitter the “de facto public town square”. Social media is where people now go to air their views, debate political issues and discover news.
It is through these platforms’ policies, through the algorithms that select and serve us content, and through the decisions made by their moderators that much of what we see on the web is decided.
Yet all too often they ignore their own rules governing how the technology works and what users should expect.
Users on social media regularly see unwanted pornography, racism or pro-suicide content – material that is usually prohibited in the platforms’ own terms and conditions. But there are also numerous examples of tech companies removing legal speech that doesn’t breach their terms and conditions. For example, during the pandemic Facebook removed some accurate health advice alongside Covid misinformation, even though its rules prohibit only the latter.
We should be able to trust that when a platform says it does something – like tackling criminal activity, protecting children or removing harmful content – it will actually do it.
Thanks to our new legislation things are about to change for the better. I’ve been sitting on the committee of MPs scrutinising the government’s groundbreaking Online Safety Bill as it makes its way through Parliament.
We heard evidence from a wide range of people – academics, charities and think tanks – welcoming this pioneering internet safety law. The Bill will give Ofcom the powers it needs – to levy big fines, block failing sites and prosecute rogue bosses – to hold large social media firms to account and to prevent illegal content, or content that is harmful to children, from appearing online.
But contrary to what you might have heard – a point that keeps being misunderstood – these new laws won’t endanger freedom of speech. In fact, they will enhance it. The Bill will not stop adults from expressing controversial or unpopular viewpoints on social media.
Despite what is sometimes claimed, the Bill simply requires the biggest social media platforms to be transparent with their adult users about what kind of material they might encounter on their services, and to be consistent about how it is dealt with, if at all.
If platforms are happy for their users to post vulgar, offensive or tasteless content then, as long as it is legal, they will be free to say so and allow it to be posted.
Parliament will list the most obviously toxic and harmful social media behaviour which isn’t illegal (such as racist, homophobic and sexist abuse that falls short of the criminal threshold). The platforms will need to ensure these types of content are addressed in their terms and conditions and then apply these conditions consistently – but it remains up to the social media firms to set their own terms and conditions.
They may decide, as almost all of them already do in their terms of service, to forbid such content, acknowledging the hurt and harm it causes. In that case they will need to stick to their promises and stop it cropping up in people’s feeds, or tailor the design of their services so they no longer amplify harmful posts simply to drive engagement and thus revenue. They won’t be able to claim it’s forbidden but then do nothing about it.
But if they choose to allow it on their platforms, users can then decide for themselves whether they want to be on those platforms or not, on the basis of a clear understanding of what to expect.
The legal requirement for transparency and consistent application of terms and conditions will ensure that the platforms are less able to act as arbitrary censors of what is allowed online. There is also a binding legal duty in the Bill to have regard to freedom of speech – the first time that such a duty will be imposed on social media firms in the UK. There are protections for content of democratic or journalistic importance. These provisions taken together will strengthen free speech online, not undermine it.
For the first time, social media platforms will also be legally obliged to provide proper and straightforward routes of appeal, so people can challenge wrongful takedowns of their content. They will have to deal with these appeals effectively and either give good reasons why posts have been removed or reinstate them.
News publishers’ content is rightly exempt from the regulation, and journalists will get a fast-tracked right of appeal if any of their content is removed.
This Bill will be a vast improvement on the current system of self-regulation. I hope MPs across the House will support it as it passes through Parliament.