Apple’s Communication Safety feature for iPhone, designed to protect children from viewing nude images over iMessage, is being expanded to cover video content, additional communication channels, and adult users. Announced during the WWDC event on Monday, the improvements will arrive with iOS 17 later this year. All image and video processing will be performed directly on the device itself, so the content remains private even from Apple.
Communication Safety in Messages uses on-device machine learning to automatically blur nude images in iMessages before a child can see them. With iOS 17, the expanded feature will also protect kids from viewing or sharing nude photos sent via AirDrop, shown in Contact Posters, left in FaceTime video messages, or selected through the system photo picker. Beyond photos, the feature will also be able to scan video content for nudity. Apple has yet to confirm whether this will extend to live video, such as FaceTime calls.
The Communication Safety tool is currently an optional feature within Apple’s existing Family Sharing system, and it applies only to iMessage until iOS 17 rolls out in the fall. When enabled, it detects when a child sends or receives images that may contain nudity, then alerts the child and blurs the photo before it’s displayed on the minor’s device. It also gives the child the option to message an adult they trust for helpful resources and additional support.
Soon, adults will be able to enjoy similar protections against unwanted nude images. Sensitive Content Warning is a new option for devices running iOS 17 that warns users of all ages when an image or video they receive contains nudity. Flagged content is blurred behind a pop-up asking whether the user still wants to view it, along with pointers to helpful safety resources.