- Apple has added more child protection features to FaceTime in iOS 26
- The latest one pauses calls when it detects nudity is present
- It currently affects adult accounts too, but that might be a bug
Apple has been adding parental control features that are designed to protect minors for years now, and it looks like a new one has just been found in the iOS 26 beta. Yet it’s turning out to be pretty controversial, as there are concerns that it could be something of an overreach on Apple’s part.
Specifically, the new feature has been added to the FaceTime video-calling app. When FaceTime detects that someone is undressing on the call, it pauses the call and instead displays a warning message that reads: “Audio and video are paused because you may be showing something sensitive. If you feel uncomfortable, you should end the call.” There are then buttons labeled “Resume Audio and Video,” and “End Call.”
At WWDC 2025 in June, Apple published a press release covering new ways its systems will protect children and young people online. The release included a feature that lines up with the new FaceTime behavior: “Communication Safety expands to intervene when nudity is detected in FaceTime video calls, and to blur out nudity in Shared Albums in Photos.”
The actual implementation was noted by iDeviceHelp on X. Below the post, @user_101524 added that the feature can be found in the Settings app in iOS 26 by going to Apps > FaceTime > Sensitive Content Warning.
By default, the feature is disabled, so it needs to be switched on by the user, but that hasn’t stopped it from stirring up online debate…
Generating controversy
While this new feature might seem sensible, it has actually generated a degree of controversy. That’s because right now, it seems to affect all users of iOS 26, not just those who are using a child account. This has ruffled some feathers among people who feel that Apple is potentially censoring the behavior of consenting adults.
Beyond that, some users have questioned how Apple knows what is being displayed on-screen, and whether the company has access to customer video calls. On this point, Apple has said the following:
“Communication Safety uses on-device machine learning to analyze photo and video attachments and determine if a photo or video appears to contain nudity. Because the photos and videos are analyzed on your child’s device, Apple doesn’t receive an indication that nudity was detected and doesn’t get access to the photos or videos as a result.”
As with many of Apple’s features, on-device processing means that content is never sent to Apple’s servers and is not accessible to the company. Instead, an on-device machine learning model flags video content that likely contains nudity, then censors it.
The fact that Apple’s Communication Safety features are aimed at protecting minors suggests that this latest FaceTime feature might not be intended to cover adults as well as children. Its appearance on all accounts, therefore, might be an oversight or bug. While we don’t know for sure, we should find out by September, when iOS 26 comes out of beta and is released in full to the public.