The United Kingdom government is preparing to ask Apple and Google to block nude photos and explicit images on devices unless users first verify their age, marking a significant shift in how smartphone operating systems handle sensitive content. Under the proposed strategy, tech companies would integrate nudity-detection algorithms directly into iOS and Android to prevent the taking, sharing, and viewing of explicit media without prior age confirmation.
Officials from the Home Office are expected to make the proposal public in the coming days as part of a broader push to protect children and vulnerable users from exposure to sexually explicit material. Rather than targeting individual apps, ministers want safeguards baked into device operating systems, which would stop explicit content at the point of capture, upload, or display unless the user proves they are an adult, potentially through biometric scans or official identification checks.
In the latest development, the Financial Times reports the following request made by the Home Office:
Ministers want the likes of Apple and Google to incorporate nudity-detection algorithms into their device operating systems to prevent users taking photos or sharing images of genitalia unless they are verified as adults […]
[Additionally,] the Home Office wants to see operating systems that prevent any nudity being displayed on screen unless the user has verified they are an adult through methods such as biometric checks or official ID.
The proposal builds on the UK’s Online Safety Act 2023, which already mandates robust age verification for online pornography and harmful content. However, the law targets platform-hosted material and can still be bypassed using VPNs or proxy services. The planned device-level nudity block represents a next step in policymakers’ efforts to reduce minors’ exposure to explicit material.
Proponents of the ban argue that embedding nudity-detection software at the OS level would ensure a more consistent and effective approach to child protection, since a single safeguard would cover the camera, photo gallery, messaging, and web browsing rather than relying on each individual app to police content.
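To illustrate what such a gate might look like, here is a hypothetical Swift sketch only, not anything Apple or Google has announced. It uses Apple's existing SensitiveContentAnalysis framework for on-device nudity detection, while `AgeVerificationService` and its `isVerifiedAdult()` check are invented placeholders for whatever biometric or ID-based verification an OS vendor might actually provide.

```swift
import SensitiveContentAnalysis
import CoreGraphics

// Hypothetical stand-in for an OS-provided age-verification service.
// Neither this protocol nor isVerifiedAdult() exists today; the proposal
// envisions biometric checks or official ID backing such a step.
protocol AgeVerificationService {
    func isVerifiedAdult() async -> Bool
}

enum DisplayDecision {
    case show   // not sensitive, or the viewer is a verified adult
    case blur   // sensitive and the viewer has not verified their age
}

// Decide whether a decoded image may be shown un-blurred.
// Detection uses Apple's real SCSensitivityAnalyzer (iOS 17 and later);
// the gating policy wrapped around it is purely illustrative.
func displayDecision(for image: CGImage,
                     verifier: AgeVerificationService) async throws -> DisplayDecision {
    let analyzer = SCSensitivityAnalyzer()

    // On-device nudity detection; the image never leaves the device.
    let analysis = try await analyzer.analyzeImage(image)
    guard analysis.isSensitive else { return .show }

    // Sensitive content: show it only if the user has verified their age.
    return await verifier.isVerifiedAdult() ? .show : .blur
}
```

Today, Apple's analyzer only runs when the user has opted in to Sensitive Content Warnings (or Communication Safety is enabled for a child account) and the calling app holds a dedicated entitlement; the UK proposal would, in effect, make this kind of check a built-in part of the operating system rather than an opt-in feature.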
Similar laws and proposals have emerged in the United States, Canada, and Europe, where governments are debating how to balance online child safety with individual privacy rights. Even Pakistan is actively debating how far such social media restrictions should extend.
While Apple and Google have traditionally maintained that privacy and user autonomy are core principles, the UK’s request places new pressure on the companies to assume responsibility for detecting and restricting explicit content at the operating system level.
Apple has already put some safety measures in place within the Messages app. If a child who is part of an iCloud Family group receives a sexually explicit image, it first appears blurred and is accompanied by a warning message. If the child taps the View Photo button, a pop-up explains why the message was flagged as sensitive and asks them to confirm that they want to view it. The pop-up also informs them that the parent designated as the family organizer will receive a notification.
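For developers, Apple exposes its on-device nudity classifier through the SensitiveContentAnalysis framework in iOS 17 and later. The snippet below is a minimal sketch of how a third-party messaging app might check a received image file; it assumes the app holds the framework's client entitlement, and the analyzer produces no result unless the user has switched on Sensitive Content Warnings.

```swift
import Foundation
import SensitiveContentAnalysis

// Minimal sketch: decide whether a received image file should be blurred.
// Requires the com.apple.developer.sensitivecontentanalysis.client
// entitlement; analysis is unavailable unless the user has enabled
// Sensitive Content Warnings (or Communication Safety, for a child account).
func shouldBlur(imageAt url: URL) async throws -> Bool {
    let analyzer = SCSensitivityAnalyzer()

    // .disabled means the user has not opted in, so treat the image as safe
    // here for simplicity; a real app would choose its own fallback behavior.
    guard analyzer.analysisPolicy != .disabled else { return false }

    // Analysis runs entirely on-device; the image is not uploaded anywhere.
    let analysis = try await analyzer.analyzeImage(at: url)
    return analysis.isSensitive
}
```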
Age verification is a hot topic in the tech industry these days, with major players like Meta pushing for regulation that would make Apple and Google responsible for verifying users' ages at the app store level.