Instagram, one of the world’s most popular photo and short-video sharing platforms, is reportedly working on a new feature. Spotted by Alessandro Paluzzi, a software developer known for reverse-engineering apps, the feature shows Instagram stepping up its fight against unsolicited nudity.
Meta, Instagram’s parent company, has already taken significant steps against spam and abusive messages; Instagram’s filtered-words feature is the clearest example. The feature now in development works along the same lines: when a user tries to send a nude photo to another user, Instagram will partially block it.
The new feature will be opt-in
*An image of Instagram’s new feature
The Verge reached out to Meta for more details. In its official statement, the company confirmed that the feature is still under development. Emphasizing their focus on user safety, Meta officials also noted that the feature will be optional: users who leave it off will still be able to view nude images sent to them without any censorship. So how will the feature work?
According to the screenshot Paluzzi shared on Twitter, the feature taps into iOS’s nudity-detection capability, so images in Instagram messages will be analyzed by Apple’s on-device machine learning. If the analysis detects nudity, the image will be delivered to the recipient with a blur filter applied. According to Apple, the original image is never uploaded to its servers during analysis, which preserves user privacy. How the feature will work for Android users is not yet clear, and it is not known when the feature will become available.
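The delivery flow described above can be sketched roughly as follows. This is a purely illustrative mock-up, not Instagram’s actual code: the nudity score stands in for Apple’s on-device analysis, and all names and the threshold value are assumptions.

```python
from dataclasses import dataclass

# Assumed cutoff for illustration only; the real threshold is not public.
NUDITY_THRESHOLD = 0.8


@dataclass
class IncomingImage:
    data: bytes
    nudity_score: float  # produced on-device; the image never leaves the phone


def blur(data: bytes) -> bytes:
    """Placeholder for a real blur filter applied to the image bytes."""
    return b"<blurred>" + data


def deliver(image: IncomingImage, filter_enabled: bool) -> bytes:
    """Return the bytes the recipient actually sees.

    The filter is opt-in: when disabled, the recipient always sees
    the original image, uncensored, exactly as the article describes.
    """
    if filter_enabled and image.nudity_score >= NUDITY_THRESHOLD:
        return blur(image.data)
    return image.data
```

The key design point the article reports is that the classification happens on-device (Apple’s analysis) while the blur decision respects a per-user setting, so no original image needs to reach a server for the filter to work.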