An open letter to Zoom from human rights organizations: “Give up this plan now!”

A large group of human rights organizations wrote to Zoom in an open letter demanding that it immediately abandon its much-discussed new app.

The nonprofit digital rights group Fight for the Future and 27 human rights organizations wrote an open letter to Zoom asking the company to stop exploring the use of emotion-analyzing AI on its videoconferencing platform. The groups say they wrote the letter in response to a Protocol report stating that Zoom is actively researching how to incorporate emotion AI into its product in the future.

The report is part of a broader examination of how companies are starting to use artificial intelligence to detect a potential customer’s emotional state during sales calls.

With the onset of the pandemic, video conferencing has become far more common around the world. Unable to read body language through a screen, salespeople struggle to gauge how receptive potential customers are to their products and services. Some companies have started using technology that claims to analyze people’s moods during calls, and Protocol says Zoom plans to offer the same capability.

Fight for the Future and other human rights organizations hope their call will pressure Zoom to abandon its plans. They describe the technology as “discriminatory, manipulative, potentially dangerous, and based on the assumption that all people use the same facial expressions, voice patterns, and body language.”

The groups also pointed out that the technology, like facial recognition, is inherently biased and racist. They claim that by including the feature, Zoom would discriminate against certain ethnicities and people with disabilities, and that it could be used to punish students or employees who display the “wrong” emotion.

In 2021, a project led by Cambridge University researcher Alexa Hagerty demonstrated the limits of emotion recognition AI and how easily it can be fooled. Earlier studies have also found that emotion recognition programs exhibit racial bias and have trouble reading Black faces.

The groups concluded the letter by citing Zoom’s earlier decision to cancel the rollout of face-tracking features, describing this as another opportunity to do the right thing for its users. They are now asking Zoom to commit, by May 20, 2022, to not implementing emotion AI in its product.
