
    Ofcom Finalizes Online Child Safety Rules to Protect UK’s Youngest Internet Users

    April 24, 2025


    The United Kingdom communications regulator Ofcom has finalized a comprehensive set of child safety rules under the Online Safety Act, ushering in what it calls a “reset” for how children experience the internet.

    Announced Thursday, the new regulations require over 40 practical safeguards for apps, websites, and online platforms accessed by children in the UK. These range from filtering harmful content in social feeds to robust age checks and stronger governance requirements. The measures apply to platforms in social media, gaming, and search—any online service likely to be accessed by children under 18.

    “These changes are a reset for children online,” said Dame Melanie Dawes, Ofcom’s Chief Executive. “They will mean safer social media feeds with less harmful and dangerous content, protections from being contacted by strangers and effective age checks on adult content. If companies fail to act they will face enforcement.”

    The finalized Codes of Practice are the product of consultations with over 27,000 children, 13,000 parents, civil society organizations, child protection experts, and tech companies. The rules will be enforceable from July 25, 2025.

    Algorithmic Filters, Age Assurance, and Governance

A key focus of the reforms targets personalized recommendation algorithms—often the pathway through which children are exposed to harmful content. Under the new rules, platforms that use recommender systems and pose a medium or high risk of harmful content must configure their algorithms to filter that material out of children’s feeds.

The rules also impose mandatory age assurance on the highest-risk services. Platforms must verify users’ ages with a high degree of accuracy; if they cannot, they must assume children are present and provide an age-appropriate experience. In some cases, this may mean blocking children’s access to certain content, features, or services entirely.

    In addition, all providers must maintain fast-action processes to quickly assess and remove harmful material once identified.

    “These reforms prioritize safety-by-design,” said a UK-based child safety policy expert. “The burden is finally shifting onto platforms to proactively assess and mitigate risks, rather than waiting for harm to happen.”

Child Safety Rules: More Control, Better Support for Children

Beyond content moderation, the rules require platforms to give children more control over their online environment. Required features include:

    • The ability to decline group chat invites.

    • Tools to block or mute accounts.

    • The option to disable comments on their own posts.

    • Mechanisms to flag content they do not wish to see.

Services must also provide supportive information to children who search for or encounter harmful material, including on topics such as self-harm, suicide, and eating disorders.

    Clear and accessible reporting and complaint tools are also mandatory. Ofcom requires platforms to ensure their terms of service are understandable to children and that complaints receive timely, meaningful responses.

    Accountability at the Top

    A standout requirement under the new framework is “strong governance.” Every platform must designate a named individual responsible for children’s safety, and senior leadership must annually review risk management practices related to child users.

“These aren’t just tech tweaks. This is a cultural shift in corporate responsibility,” said the child safety policy expert. “They [Ofcom] are holding leadership accountable for keeping children safe.”


    Enforcement, Deadlines, and What’s Next

Tech firms have until July 24, 2025, to finalize risk assessments for services accessed by UK children. From July 25, 2025, they must implement the measures outlined in Ofcom’s Codes—or demonstrate alternative approaches that meet the same safety standards.

    Ofcom has the authority to issue fines or apply to the courts to block access to non-compliant sites in the UK.

    The child safety measures build upon earlier rules introduced under the Online Safety Act to prevent illegal harms, such as grooming and exposure to child sexual abuse material (CSAM). They also complement new age verification requirements for pornography websites.

    More regulations are expected soon. Ofcom plans to launch a follow-up consultation on:

    • Banning accounts found to have shared CSAM.

    • Crisis response protocols for real-time harms.

    • AI tools to detect grooming and illegal content.

    • Hash matching to prevent the spread of non-consensual intimate imagery and terrorist material.

    • Tighter controls around livestreaming, which presents unique risks for children.

    “Children deserve a safer internet. This framework lays the foundation, but we’re not stopping here,” Ofcom said in a statement.

    Resources for Parents and Children

    To accompany the new regulations, Ofcom published guidance for parents, including videos and answers to common safety questions. It also launched child-friendly content explaining what changes children can expect in their favorite apps and platforms.

    As the codes go before Parliament for final approval, stakeholders across the tech ecosystem will be watching closely. For many, this marks a critical test of how well regulatory bodies can compel tech giants to prioritize child safety over engagement metrics.
