Meta, the parent company of Facebook and Instagram, has long been criticized on the grounds that its platforms harm young people. In response, the company has repeatedly rolled out new safety measures on both platforms, and it has now made another move in this direction.
Meta announced today that it is introducing new privacy and security features for minors on Instagram and Facebook. Accordingly, children under 16 (under 18 in some countries) who sign up for Facebook will now be given more secure settings by default.
New Facebook restrictions aim to create a safer environment for children
The company stated that the change will also reach young users already on the platform: Facebook will begin encouraging existing users under the age of 18 to adopt these more secure settings. The settings restrict who can see items such as a child's friends list, the posts they are tagged in, and the accounts they follow.
Put more plainly, the platform aims to create a safer environment by ensuring that children's activity is visible only to their friends. Notably, this move comes a year after Instagram made underage users' profiles private by default.
Meta is also testing ways to protect children from potentially malicious adults on both Instagram and Facebook.
In addition, the company is testing ways to protect children from suspicious adults they do not know and who may be malicious. A "suspicious" adult is defined as a user who has previously been blocked or reported by other child users.
Meta says suspicious adults will no longer be shown in the 'People You May Know' section on Facebook. On Instagram, the company is testing a feature that removes the message button from a child's profile when a suspicious adult views it, preventing that adult from contacting the account.
Working on a platform to prevent children’s private images from spreading online
In addition, the company stated that it is working on a global platform for young people who are concerned that their private images could be shared without their permission. Developed in partnership with the National Center for Missing and Exploited Children (NCMEC), the platform is intended to prevent such images from spreading on the Internet.
The tech giant added that other companies in the industry will also be able to use the platform, helping avert serious harm by responding to the needs of young people. According to the statements, details about the new platform will be shared in the coming weeks.