AI gaydar can accurately determine sexuality from a photo

Facial recognition technology is usually used to identify individuals for the purposes of crime prevention, or as a biometric security method. But a paper from Stanford University researchers -- entitled simply "Deep neural networks are more accurate than humans at detecting sexual orientation from facial images" -- shows that it could also be used to determine people's sexuality.

Using deep neural networks, the researchers built an algorithm with a far better gaydar than people have. Working with a sample of more than 35,000 photographs, the system was able to correctly determine whether individuals were gay or straight with staggering accuracy -- 81 percent for men and 74 percent for women. While the results are impressive on one hand, on the other they raise ethical concerns.

The ability of the system to correctly identify gay and straight men and women is significantly better than the normal human prediction rate. The human gaydar was shown to yield a success rate of around 61 percent for men and 54 percent for women. The research by Michal Kosinski and Yilun Wang shows that AI is able to pick up on subtle differences in facial structure between gay and straight people.

The report, published in the Journal of Personality and Social Psychology and also made publicly available on the Open Science Framework, used publicly available images from a dating website -- the first of the ethical issues some may spot with the research.

A piece of off-the-shelf software called VGG-Face converted each photograph into a string of numbers representing the face, and a simple predictive model then used those numbers to classify sexuality (the predictions were compared against the sexual orientation declared on the dating website).
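
In outline, the pipeline has two stages: a pretrained face network reduces each photo to a long vector of numbers, and a simple classifier is trained on those vectors against the self-reported labels. Below is a minimal sketch of that two-stage approach in Python; the descriptors are random stand-ins for real VGG-Face output and the planted "signal" is synthetic, so this illustrates the shape of the method rather than the paper's actual code.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Stage 1 stand-in: VGG-Face would turn each photo into a long descriptor
# vector; here we fake 4,096-dimensional descriptors with random numbers.
n_photos, n_dims = 2000, 4096
X = rng.normal(size=(n_photos, n_dims))
y = rng.integers(0, 2, size=n_photos)  # self-reported label: 1 = gay, 0 = straight

# Plant a weak signal in a few dimensions so there is something to learn;
# in the study, any real signal would come from facial structure itself.
X[:, :5] += 0.5 * y[:, None]

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Stage 2: a simple linear classifier over the descriptors.
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

The Economist explains how the resulting classifier performed: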

When shown one photo each of a gay and straight man, both chosen at random, the model distinguished between them correctly 81 percent of the time. When shown five photos of each man, it attributed sexuality correctly 91 percent of the time. The model performed worse with women, telling gay and straight apart with 71 percent accuracy after looking at one photo, and 83 percent accuracy after five. In both cases the level of performance far outstrips human ability to make this distinction. Using the same images, people could tell gay from straight 61 percent of the time for men, and 54 percent of the time for women. This aligns with research which suggests humans can determine sexuality from faces at only just better than chance.
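
The pair-matching test described in that excerpt is equivalent to the standard AUC metric: the probability that a randomly chosen gay face receives a higher score from the model than a randomly chosen straight face. A small sketch with made-up scores shows the two computations agreeing:

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)

# Made-up model scores; higher means "more likely gay" according to the model.
scores_gay = rng.normal(loc=1.0, size=500)
scores_straight = rng.normal(loc=0.0, size=500)

# Pairwise reading: how often is a random gay/straight pair ranked correctly?
pairwise = (scores_gay[:, None] > scores_straight[None, :]).mean()

# The same quantity computed as ROC AUC.
labels = np.concatenate([np.ones(500), np.zeros(500)])
scores = np.concatenate([scores_gay, scores_straight])
print(f"pairwise: {pairwise:.3f}  AUC: {roc_auc_score(labels, scores):.3f}")
```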

This sounds like an impressive rate of accuracy, but the system is flawed. The high success rate is only achieved when comparing images of two men, one of whom is known to be gay. In "real world" tests, where the ratio of gay to straight people is much lower, the accuracy dropped dramatically. The system was, however, able to select with 90 percent accuracy the 10 people it was most confident were gay.
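
The drop is a consequence of base rates: when the trait being screened for is rare, even a classifier with good sensitivity produces mostly false positives. A back-of-the-envelope Bayes calculation makes the point; the numbers below are assumptions chosen for illustration, not figures from the paper.

```python
# Illustrative numbers only -- not rates reported in the paper.
base_rate = 0.07       # assumed prevalence of gay men in the screened population
sensitivity = 0.90     # assumed P(flagged | gay)
false_positive = 0.10  # assumed P(flagged | straight)

# Bayes' theorem: of everyone the system flags, what fraction is actually gay?
true_flags = sensitivity * base_rate
false_flags = false_positive * (1 - base_rate)
precision = true_flags / (true_flags + false_flags)

print(f"precision: {precision:.2f}")  # about 0.40 -- most flags are wrong
```

Under these assumptions, roughly six out of every ten people the system flags would in fact be straight.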

While the research showed that it is possible to use characteristics such as nose shape, forehead size and jaw length -- features the paper describes as "gender atypical" -- to determine sexuality with some accuracy, the authors issue a warning:

Importantly, we would like to warn our readers against misinterpreting or overinterpreting this study's findings. First, the fact that the faces of gay men and lesbians are, on average, gender atypical does not imply that all gay men are more feminine than all heterosexual men, or that there are no gay men with extremely masculine facial features (and vice versa in the case of lesbians). The differences in femininity observed in this study were subtle, spread across many facial features, and apparent only when examining averaged images of many faces. Second, our results in no way indicate that sexual orientation can be determined from faces by humans.

Then there is the concern that such technology could be used nefariously. This is something the authors are aware of, and touch on in the paper:

Some people may wonder if such findings should be made public lest they inspire the very application that we are warning against. We share this concern. However, as the governments and companies seem to be already deploying face-based classifiers aimed at detecting intimate traits (Chin & Lin, 2017; Lubin, 2016), there is an urgent need for making policymakers, the general public, and gay communities aware of the risks that they might be facing already. Delaying or abandoning the publication of these findings could deprive individuals of the chance to take preventive measures and policymakers the ability to introduce legislation to protect people. Moreover, this work does not offer any advantage to those who may be developing or deploying classification algorithms, apart from emphasizing the ethical implications of their work. We used widely available off-the-shelf tools, publicly available data, and methods well known to computer vision practitioners. We did not create a privacy-invading tool, but rather showed that basic and widely used methods pose serious privacy threats. We hope that our findings will inform the public and policymakers, and inspire them to design technologies and write policies that reduce the risks faced by homosexual communities across the world.

The authors also point out:

The results reported in this paper were shared, in advance, with several leading international LGBTQ organizations.

You can read the full study over on the Open Science Framework website.

Image credit: zetwe / depositphotos
