An App To Delete Child Nude Photos Has Been Developed

An application to help prevent child abuse has been developed in Japan through a collaboration between a private company, a university, and a local police department. The app automatically detects nude photos taken by children and deletes them from the device. However, the fact that the app will constantly scan the gallery specifically searching for nude photos has raised privacy concerns.

Although social media platforms offer benefits such as entertainment and alternative sources of income, they also host some serious problems. The most discussed of these is their unintended role as channels for child abuse.

While social media platforms continue to update their policies to address this problem, news of an application that could help came from Japan. A research team there has developed an app that automatically deletes nude photos from the device when they are taken by individuals under the age of 18.

Parents instantly notified about 'naked selfies'

The new application, which uses artificial intelligence, was developed in response to adults contacting children on social media, and aims to prevent children from taking and sharing nude photos. The app was created in partnership with Tokyo-based Smartbooks, Fujita Health University, and the local Nakamura police station.

The application automatically detects nude photos taken with the device's camera by users under the age of 18 and deletes them from the device. After detecting such a photo, the app also sends a warning message to the child's parents. The application is expected to be available by the end of this year.
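The article does not describe the app's internals, but the reported workflow (scan each new photo, run an AI nudity classifier, delete flagged images, and alert a parent) can be sketched in broad strokes. The sketch below is a minimal, hypothetical Python illustration: `classify_nudity` stands in for the app's on-device AI model (here it is a trivial filename-based stub for demonstration), and `notify_parent` stands in for whatever messaging channel the real app uses.

```python
import os

def classify_nudity(path):
    # Hypothetical stand-in for the app's AI classifier.
    # A real implementation would run an on-device image model;
    # this demo stub simply flags files tagged "nsfw" in their name.
    return "nsfw" in os.path.basename(path).lower()

def scan_new_photos(paths, notify_parent):
    """Mimic the reported workflow: check each new gallery photo,
    collect flagged ones for deletion, and alert a parent contact.

    Returns the list of paths marked for deletion. A real app would
    also remove the files (e.g. with os.remove) at this point.
    """
    flagged = []
    for path in paths:
        if classify_nudity(path):
            flagged.append(path)
            notify_parent(f"Flagged photo removed: {os.path.basename(path)}")
    return flagged
```

For example, feeding the function two filenames and a simple callback shows one photo being flagged while the other passes untouched; the real engineering difficulty lies entirely in the accuracy of the classifier this sketch stubs out.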

Apple also developed a similar feature, but withdrew it after the backlash: Do the same concerns apply to this application as well?

A similar measure against child abuse was previously announced by Apple. Apple planned to scan photos on iPhones and report any images of child abuse it found to the authorities. But the feature sparked a major user privacy controversy, and although Apple initially defended it, the company abandoned the plan after intense backlash.

Like Apple's proposal, the Japanese application will monitor the phone's gallery: each new photo added to the gallery will be scanned, and photos containing nudity will be detected. There is no clear explanation of whether the scanned photos (including those containing nudity) will be stored by the app during this process. Moreover, the fact that even a first-party measure from a company like Apple sparked controversy suggests that this application may face similar debate.
