AI Systems and Not Just Humans Prone to Skin Type Bias



Research has demonstrated that gender and skin type biases are significant factors in the commercial facial analysis programs of major technology companies. In these experiments, the error rate in determining the gender of darker-skinned women reached 20% in one case and about 34% in the others, while the error rate for lighter-skinned men was 0.8%. These findings raise questions about how today's neural networks, which perform computational tasks by looking for patterns in large data sets, are trained, evaluated, and analyzed.
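Disparities like these are typically surfaced by disaggregating a classifier's error rate across subgroups rather than reporting a single overall accuracy. The Python sketch below illustrates that idea only in outline; the records, labels, and subgroup names are hypothetical, not data from the studies described here.

```python
from collections import defaultdict

def disaggregated_error_rates(records):
    """Compute per-subgroup error rates.

    Each record is a tuple (subgroup, true_label, predicted_label).
    Returns a dict mapping subgroup -> error rate.
    """
    totals = defaultdict(int)
    errors = defaultdict(int)
    for subgroup, true_label, predicted in records:
        totals[subgroup] += 1
        if predicted != true_label:
            errors[subgroup] += 1
    return {group: errors[group] / totals[group] for group in totals}

# Hypothetical predictions; a real audit would use a labeled benchmark
# of photographs annotated by gender and skin type.
records = [
    ("darker-skinned women", "female", "male"),
    ("darker-skinned women", "female", "female"),
    ("darker-skinned women", "female", "male"),
    ("lighter-skinned men", "male", "male"),
    ("lighter-skinned men", "male", "male"),
]

for group, rate in disaggregated_error_rates(records).items():
    print(f"{group}: {rate:.1%} error")
```

A single aggregate accuracy over such records would hide exactly the gap the audit is meant to expose, which is why the per-subgroup breakdown matters.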

Researchers at a major US technology company reported an accuracy rate above 95% for their face recognition system, but the data set used to assess its performance was 77% male and 83% white. Facial analysis of this kind can be used to determine someone's gender, to identify a criminal suspect, or to unlock a phone, so the stakes extend well beyond computer vision itself. The three programs examined were general-purpose facial analysis systems, built both to match faces and to assess characteristics such as age and gender. All three treated gender classification as a binary choice between male and female, which made the task statistically easy and the observed error disparities all the more telling.
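A skew like the 77% male, 83% white benchmark can be caught with a simple composition audit of the evaluation set before any accuracy figure is trusted. Below is a minimal sketch of such an audit; the attribute names and metadata are illustrative assumptions, not the actual benchmark.

```python
from collections import Counter

def composition(dataset, attribute):
    """Return the share of each value of `attribute` in the dataset."""
    counts = Counter(example[attribute] for example in dataset)
    total = sum(counts.values())
    return {value: count / total for value, count in counts.items()}

# Hypothetical per-image metadata for an evaluation set.
dataset = [
    {"gender": "male", "skin_type": "lighter"},
    {"gender": "male", "skin_type": "lighter"},
    {"gender": "male", "skin_type": "lighter"},
    {"gender": "female", "skin_type": "darker"},
]

print(composition(dataset, "gender"))     # {'male': 0.75, 'female': 0.25}
print(composition(dataset, "skin_type"))  # {'lighter': 0.75, 'darker': 0.25}
```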

In one case, researchers working on an ethnically diverse project had to present their device in public, yet they were forced to rely on lighter-skinned team members for the demonstration because the system would not work for darker-skinned users.

One of the researchers, who has dark skin, first noticed the problem when she submitted photographs of herself to commercial facial recognition programs and found that they failed to recognize the photos as containing a human face at all.