UK outlines regulations to safeguard children from harmful algorithms

Editor

The UK is taking steps to protect children online by calling on search and social media companies to rein in harmful content recommended by their algorithms. Ofcom, the UK's media regulator, has outlined more than 40 proposed requirements under its Online Safety Act rules to better protect minors online. These include age checks, content moderation, and measures to prevent exposure to harmful content relating to eating disorders, self-harm, suicide, pornography, violence, hate speech, and abuse. Companies will also need to protect children from online bullying and dangerous challenges, and allow them to give feedback on content they do not want to see.

Ofcom's chief executive, Melanie Dawes, emphasized that tech firms have a responsibility to keep children safer online and must ensure that aggressive algorithms do not push harmful content to children in personalized feeds. Platforms will soon need to block content deemed harmful, even if that means preventing children from accessing the entire site or app. Failure to comply with the new regulations could result in fines of up to £18 million or 10 percent of a company's global revenue, whichever is greater, leaving companies like Meta, Google, and TikTok facing potentially substantial penalties if they do not adhere to the rules.

Companies have until July 17th to respond to Ofcom's proposals, after which the regulator will present its codes to Parliament. A final version of the regulations is expected in spring 2025, giving platforms three months to make the changes needed to comply. Ofcom has warned that companies that fail to do so can expect enforcement action. The regulations aim to create a safer online environment for children by ensuring they are not exposed to harmful or inappropriate content on search and social media platforms.

The proposed regulations also aim to keep children from encountering content related to online bullying, dangerous challenges, and promotions for harmful activities. Platforms will need to let children flag content they do not want to see so they can better curate their own online experience, and to block content deemed harmful or inappropriate even if that means restricting children's access to parts of the platform. Companies are being urged to act now to protect children online and prevent them from being exposed to content that could damage their well-being or mental health.

Overall, the UK's Online Safety Act is designed to hold tech companies accountable for the content their platforms recommend to children. Companies will need to implement measures such as age checks and content moderation to ensure children are not exposed to harmful or inappropriate material online, and those that fall short face significant fines. By taking action against toxic algorithms and harmful content, the UK aims to create a safer online environment for children while they use search and social media platforms.
