New rules to protect children online: Ofcom’s age verification and content filtering measures

Under new Ofcom regulations, websites and apps will be legally required to check the ages of children using their services and prevent access to harmful or illegal content. These rules are designed to enhance the safety of young internet users by filtering out inappropriate material and ensuring that children are not exposed to dangerous or distressing content. Platforms will also be expected to tailor their algorithms to prevent harmful content from appearing in children’s feeds.

If companies fail to implement the 40 measures outlined in the Online Safety Act by July, they could face substantial fines of up to £18 million or 10% of their global revenue. Ofcom will also have the authority to seek court orders to ban access to websites within the UK. Additionally, the government is considering a ‘social media curfew’, which would restrict children from accessing social media platforms in the evening. The move comes after Technology Secretary Peter Kyle expressed concerns over TikTok’s approach to limiting access for users under 16 after 10pm; he is also considering offering parents more control over their children’s access to such platforms.

The Online Safety Act’s new measures include more stringent age verification methods for adult content sites, such as facial age estimation, photo ID matching, or credit card checks. Sites will need to have robust tools in place to verify users’ ages to prevent children from accessing inappropriate material. Furthermore, the legislation will require platforms to give children more control over their online experiences, allowing them to block and filter content and manage connection requests.

While the new rules are hailed as a “transformational” step by Almudena Lara, Ofcom’s child protection policy director, some experts remain sceptical about their effectiveness. Concerns about privacy, scalability, and the potential for companies to circumvent the regulations have been raised. Critics argue that without rigorous enforcement and continuous adaptation, the new measures may fail to achieve their goals. Others worry that these restrictions could drive children towards riskier, less regulated platforms, putting them in greater danger of encountering harmful content.
