Scientist Defends Dangerous Gaydar Program
Last year, news sources shared the story of Michal Kosinski, who had created an AI program with its own gaydar. Now, he’s sharing why he created a program that could have dangerous repercussions.
Kosinski created the program with Stanford computer scientist Yilun Wang through a study using Stanford University’s resources.
Kosinski says the idea for the program and study came to him suddenly while he was casually scanning faces in Facebook photos.
“It suddenly struck me,” he said, “Introverts and extroverts have completely different faces. I was like, ‘Wow, maybe there’s something there.’”
From there, he and Yilun Wang obtained “off the shelf” facial recognition software that could analyze five images each from more than 35,000 white men and women (people of color, trans people, and bisexual people were not considered for the study). The program would then use facial recognition to trace similar attributes. The photos themselves were obtained from dating websites.
Their research found that white gay men and women frequently have “gender-atypical” features and “grooming styles.” This means gay men had more feminine features and gay women had more masculine looks.
The data suggests that there are differences in actual facial structure, such as white gay men having narrower jaws, larger noses, and larger foreheads than their straight peers. Meanwhile, gay women have larger jaws and smaller foreheads compared to straight women.
The results from the study also found that the program could predict if men were gay with 91% accuracy. If only one image was used, the program could still distinguish between gay and straight men 81% of the time.
As for women, the program had an 83% accuracy rate with five photos and a 74% accuracy rate with one photo.
This is much better than human judgment: the study’s records show human judges were only 61% accurate for photos of men and 54% accurate for photos of women.
Last year, the study incited great criticism and concern for the possibilities that the program offered. Many worried that if the algorithm landed in the wrong hands, countries with anti-gay laws could prosecute people based solely on their faces.
While Kosinski shares this concern, he says he felt compelled to release his data.
“This is the inherent paradox of warning people against potentially dangerous technology,” he said.
“I stumbled upon those results, and I was actually close to putting them in a drawer and not publishing – because I had a very good life without this paper being out. But then a colleague asked me if I would be able to look myself in the mirror if, one day, a company or a government deployed a similar technique to hurt people.”