SOFTWARE that recognises faces has bounded ahead in recent years, propelled by a boom in a form of artificial intelligence called deep learning. Several firms now offer face recognition as a commercial service, via their respective clouds. The ability to infer from a face such things as an individual’s sex has improved too, and this is also commercially available.
The algorithms involved have, however, long been suspected of bias. Specifically, they are alleged to be better at processing white faces than those of other people. Until now, that suspicion has been unsupported by evidence. But next week, at Fairness, Accountability and Transparency, a conference in New York, Joy Buolamwini of the Massachusetts Institute of Technology will present work which suggests it is true.
Ms Buolamwini and her colleague Timnit Gebru looked at three sex-recognition systems, those of IBM, Microsoft and Face++. They tested these on a set of 1,270 photographs of parliamentarians from around the world and found that all three classified lighter faces more accurately than darker ones. All also classified males more accurately than females. IBM’s algorithm, for example, got light male faces wrong just 0.3% of the time. That compared with 34.7% of the time for dark female faces. The other two systems had similar gulfs in their performances. Probably, this bias arises from the sets of data the firms concerned used to train their software. Ms Buolamwini and Ms Gebru could not, however, test this because those data sets are closely guarded.
IBM has responded quickly. It said that it had spent the past year retraining its system on a new data set, and that this had greatly improved its accuracy. When testing the new system on an updated version of the set of parliamentarians’ photographs Ms Buolamwini and Ms Gebru had used, the firm said it now achieved an error rate of 3.46% on dark-skinned female faces, a tenth of the rate the two researchers had found using the existing system. For light-skinned males the error rate also fell, to 0.25%.
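The comparison underlying these figures is simple: for each demographic subgroup, count the fraction of faces the classifier gets wrong. A minimal sketch of that calculation, in Python, is below; the labels are illustrative stand-ins chosen to reproduce the reported 0.3% and 34.7% error rates, not the researchers’ actual data or code.

```python
def error_rate(predictions, truths):
    """Fraction of predictions that disagree with the true labels."""
    wrong = sum(p != t for p, t in zip(predictions, truths))
    return wrong / len(truths)

# Hypothetical predictions for two subgroups of a test set.
light_male_truth = ["M"] * 1000
light_male_pred = ["M"] * 997 + ["F"] * 3        # 0.3% error
dark_female_truth = ["F"] * 1000
dark_female_pred = ["F"] * 653 + ["M"] * 347     # 34.7% error

gap = (error_rate(dark_female_pred, dark_female_truth)
       - error_rate(light_male_pred, light_male_truth))
print(f"error-rate gap: {gap:.1%}")              # 34.4%
```

Measured this way, a gap of more than 34 percentage points between the best- and worst-served subgroups is what the study reported for IBM’s original system.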