A ‘Sexist’ Search Bug Says More About Us Than Facebook

The social network has been criticized for directing users to search for photos only of female friends, not male ones. But it's not all Facebook's fault.
[Photo illustration: a Facebook search of a photograph of a woman in a bikini. Lauren Joseph; Vladimir Godnik/Getty Images]

Just before Valentine’s Day last week, Belgian security researcher Inti De Ceukelaire noticed something strange on Facebook. He found the social network’s search function treated pictures of men and women in dramatically different ways. Searching for “photos of my female friends” returned a hodgepodge of images, whereas a similar search for “photos of my male friends” yielded no results. Worse, Facebook assumed “male” was simply a typo for the word “female.”

Tech blogs picked up on De Ceukelaire’s tweets, and after replicating the results, published stories with headlines like "Facebook lets you search for pictures of your female friends, but not your male ones." Even more damning were the autocomplete predictions Facebook offered: When many users typed “photos of my female friends,” the platform suggested phrases like “in bikinis” or “at the beach.”

The discovery was creepy—especially given Facebook’s recent cascade of privacy scandals. But those search results are not the product of sexist engineers. Facebook search is governed in part by an algorithm that favors the most popular queries over those made less often. Algorithms can be biased, and they can mirror prejudices that already exist in society. In this case, it turns out people are significantly more interested in using Facebook to find pictures of women in bathing suits than they are in finding pictures of men—and the algorithm reflects that.

Louise Matsakis

As with Google and other search engines, there’s a difference between Facebook search predictions and search results. The fact that “in bikinis” is a suggestion when searching for “photos of female friends” doesn’t mean Facebook is training its algorithm to catalog your swimsuit pictures. (That’s not to say Facebook’s AI is incapable of processing images—it is. The company says, for instance, that it automatically detects and removes around 96 percent of adult and nude content that violates its rules. Skimpy Speedo photos don’t get caught in that dragnet for a reason.) When I searched for “photos of my female friends in bikinis,” Facebook returned a strange array of images that did indeed feature only women, but none of them was wearing a bathing suit.

“Facebook Search predictions represent what people may be searching for on Facebook, but are not necessarily reflective of actual content on Facebook,” a spokesperson said in a statement. “We know that just because something doesn’t violate our Community Standards doesn’t necessarily mean people want to see it, so we’re constantly working to improve search to make sure predictions are relevant to people.”

Predictions are generated based on a number of factors, including the overall popularity of the search term. That means they’re essentially a window into what other users look for on Facebook, which can be illuminating. When I searched for “female friends,” for instance, some of the predictions included “who are single,” “with benefits,” and “over 50.” A search for “dating” brought up suggestions such as “for 40s 50s 60s and beyond,” as well as “singles near me.” Perhaps unsurprisingly, it looks like older people often use Facebook search to try to find dates. (The percentage of Americans over 65 who say they use Facebook has doubled since 2012, according to Pew Research Center.)
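The popularity component of prediction can be sketched as a toy autocomplete that ranks completions by how often they appear in a query log. Everything here is illustrative: the query log is made up, and nothing about this sketch reflects Facebook's actual implementation.

```python
from collections import Counter

# Hypothetical aggregate query log; in a real system this would be
# collected across many users, not hard-coded.
query_log = [
    "photos of my female friends",
    "photos of my female friends in bikinis",
    "photos of my female friends at the beach",
    "photos of my female friends in bikinis",
    "photos of my male friends",
]

def predict(prefix, log, k=3):
    """Rank completions of `prefix` by how often they appear in the log."""
    counts = Counter(q for q in log if q.startswith(prefix) and q != prefix)
    return [q for q, _ in counts.most_common(k)]

# The most frequent extension ("in bikinis") ranks first, purely because
# more people searched for it.
print(predict("photos of my female friends", query_log))
```

Note that a prefix no one extends (like “photos of my male friends” in this toy log) yields no predictions at all, which mirrors the asymmetry De Ceukelaire observed.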

For the same reason, Facebook’s search predictions also reflect major news events. When I searched for “Oakland teachers” Friday, the top prediction was “strike,” since nearly all of the California city’s 2,300 educators began protesting Thursday.

Facebook search predictions are also uniquely personalized for each user, based on the pages you have liked, the groups you have joined, the current city you provided, your connections, location data, and the News Feed posts and search results you’ve engaged with in the past. Facebook search is therefore much more personalized than Google Search—for good reason. This tailoring is why, when you begin typing a name like “Jeff,” Facebook predicts you're looking to search for someone you know rather than, say, Jeff Bezos.
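One common way to combine global popularity with personal signals is a weighted score. The sketch below is a minimal, assumed model: the candidate names, affinity values, and weight are all invented for illustration, not drawn from Facebook.

```python
# Hypothetical global popularity of candidate results (0 to 1).
global_popularity = {"Jeff Bezos": 0.9, "Jeff Smith": 0.1}

# Hypothetical per-user signal: how connected this user is to each candidate.
friend_affinity = {"Jeff Smith": 0.8, "Jeff Bezos": 0.0}

def rank_predictions(prefix, w_social=0.7):
    """Rank candidates matching `prefix` by a blend of social and global scores."""
    candidates = [name for name in global_popularity if name.startswith(prefix)]

    def score(name):
        social = friend_affinity.get(name, 0.0)
        return w_social * social + (1 - w_social) * global_popularity[name]

    return sorted(candidates, key=score, reverse=True)

# With a heavy social weight, the user's friend outranks the celebrity.
print(rank_predictions("Jeff"))
```

Raising or lowering `w_social` shifts results between a Google-style popularity ranking and the friend-first behavior the article describes.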

Search results are similarly personalized based on your social connections and your activity on Facebook. Again, if you searched for “Jeff,” Facebook would likely return people you’re already friends with and those with whom you have friends in common. This function is what allows users to easily find someone they recently met and add them on Facebook. But that doesn’t mean the social network’s algorithmically generated search results aren’t sometimes bizarre. The “photos of my female friends” search produced a smattering of seemingly random photos of women I know, some of which were uploaded years ago. It’s not necessarily clear why I was shown some images rather than others.

And in De Ceukelaire’s case, a search for “photos of my male friends” returned nothing at all. When I tried it, the search didn’t produce photos of my friends, as it had when I searched for female friends. Instead, I was shown a random assortment of memes. “I got what are presumably male dogs and two male-themed cartoons, including one cautioning men against peeing outdoors in the polar vortex,” Melissa Locker wrote in Fast Company. A Facebook spokesperson said this experience is “a bug we’re working on fixing.”

The problem was neglected because (at least until it was highlighted in news stories) the search term “photos of my male friends” was very unpopular. With limited resources, Facebook has an incentive to ensure the most common searches produce relevant content, before optimizing the results for rarer ones. This is true of Google as well. Popular queries like “How old is Jeff Bezos?” readily generate accurate answers on Google, while more unusual searches may not produce what you’re looking for at all.

Facebook’s search tool wasn’t designed to be sexist, but that doesn’t mean it’s free from issues. The algorithm that governs it remains a black box, and it’s impossible for users to know precisely why a certain prediction or result may be generated. Last month, New York Times writer John Herrman wrote that Facebook's lack of transparency encourages paranoia in its users, who are often left guessing about why things happen the way they do on the platform. Facebook's bikini search snafu, and the uneasy response it generated, is just the latest example.
