Researchers at Stanford University have demonstrated that, when shown a picture each of one gay man and one straight man, their algorithm could tell which was which 81% of the time.
Humans managed only 61%.
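That 81% is a pairwise, forced-choice measure: across randomly drawn gay-straight pairs of photos, it is the share in which the classifier scores the gay man's photo higher (statisticians call this the AUC). A minimal sketch of the evaluation, using invented scores rather than the Stanford model's output, looks like this:

```python
import itertools

# Hypothetical scores from a classifier (higher = rated "more likely gay").
# Made-up numbers for illustration; this is not the Stanford model.
scores_gay = [0.9, 0.7, 0.6, 0.4]
scores_straight = [0.8, 0.5, 0.3, 0.2]

# Pairwise (forced-choice) accuracy: across every gay-straight pair of
# photos, how often does the classifier score the gay man's photo higher?
# Ties count as half. Random guessing would score 50%.
pairs = list(itertools.product(scores_gay, scores_straight))
correct = sum(1.0 if g > s else 0.5 if g == s else 0.0 for g, s in pairs)
print(f"pairwise accuracy: {correct / len(pairs):.0%}")  # 75% with these scores
```

A score of 81% on this measure is well above chance, but it is not the same as correctly labelling 81% of a population, where straight men vastly outnumber gay ones.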
In countries where homosexuality is a crime, software which promises to infer sexuality from a face is an alarming prospect.
Less violent forms of discrimination could also become common.
Employers can already act on their prejudices to deny people a job.
But facial recognition could make such bias routine, enabling firms to filter all job applications for ethnicity and signs of intelligence and sexuality.
Nightclubs and sports grounds may face pressure to protect people by scanning entrants’ faces for the threat of violence—even though, owing to the nature of machine-learning, all facial-recognition systems inevitably deal in probabilities.
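The probabilistic point has teeth because violent entrants are rare. A minimal sketch, assuming a one-in-a-thousand prevalence of violent intent and a 99%-accurate screen (both invented numbers, not figures from any deployed system), shows that most people flagged at the door would be innocent:

```python
# Hypothetical illustration of why "dealing in probabilities" matters at
# the door. All figures below are assumptions, not data from a real system.

prevalence = 0.001    # assume 1 in 1,000 entrants genuinely poses a threat
sensitivity = 0.99    # assumed chance the system flags a genuine threat
specificity = 0.99    # assumed chance the system clears an innocent entrant

entrants = 100_000
threats = entrants * prevalence                   # 100 genuine threats
innocents = entrants - threats                    # 99,900 innocent entrants

true_positives = threats * sensitivity            # 99 threats flagged
false_positives = innocents * (1 - specificity)   # 999 innocents flagged

# Share of flagged entrants who are actually dangerous:
ppv = true_positives / (true_positives + false_positives)
print(f"flagged: {true_positives + false_positives:.0f}; "
      f"genuine threats among them: {ppv:.0%}")   # about 9%
```

With these numbers, roughly ten innocent people are turned away for every genuine threat caught; the rarer the threat, the worse that ratio becomes.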
Moreover, such systems may be biased against those who do not have white skin, since algorithms trained on data sets of mostly white faces work less well on other ethnicities.
Such biases have cropped up in automated assessments used to inform courts’ decisions about bail and sentencing.
Eventually, continuous facial recording and gadgets that paint computerised data onto the real world might change the texture of social interactions.
Dissembling helps grease the wheels of daily life.
If your partner can spot every suppressed yawn, and your boss every grimace of irritation, marriages and working relationships will be more truthful, but less harmonious.