Originally posted on The Atlantic
by ALEXIS C. MADRIGAL
Online matchmaking is getting better at telling us whom we ought to like—and that's not good.
The air in Santa Cruz was warm and still as I sat among perfect roses in the backyard of the bride’s parents. At the key moment of this nontraditional Jewish wedding, the friend presiding over the ceremony took a moment to explain the Hebrew word kadosh. It’s translated as “holy,” or “the holy one,” but it also connotes the act of setting apart or elevating one thing above all other things of a type. Marriage is holy because each partner says, “You are the one person I choose out of all the people in the world.”
If only you could Google your way to The One. The search engine, in its own profane way, is a kadosh generator. Its primary goal is to find the perfect Web page for you out of all the Web pages in the world, to elevate it to No. 1.
The Santa Cruz couple had met in a time-honored way—through a friend—but the number of such encounters is decreasing. One reputable estimate suggests that 74 percent of singles looking for a mate now turn to dating sites like eHarmony, Match.com, and OkCupid, which use algorithms to pair people up based on answers to sets of questions.
But even e-yentas find prognosticating love difficult. Date-mining software needs lots of tuning to create good matches, so the services track everything would-be lovebirds do. Their romantic-data trails become grist for matchmaking improvements.
OkCupid, which according to The Boston Globe aspires to be the Google of online dating, has been particularly aggressive about tracking users. The company’s goal is to stimulate “three-ways”—a double entendre that, to someone at OkCupid, means a person sent a note, received a reply, and fired off a follow-up.
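OkCupid's internal code isn't public, but the three-way metric as described could be tallied from a message log roughly like this (the data layout and names here are invented for illustration):

```python
from collections import defaultdict

def count_three_ways(messages):
    """Count 'three-ways': conversations in which the initiator sent
    a note, received a reply, and fired off a follow-up.
    `messages` is a chronological list of (sender, recipient) pairs.
    Illustrative sketch only -- not OkCupid's actual code."""
    threads = defaultdict(list)
    for sender, recipient in messages:
        # group messages by the unordered pair of participants
        threads[frozenset((sender, recipient))].append(sender)
    three_ways = 0
    for senders in threads.values():
        # initiator -> reply from the other party -> initiator again
        if len(senders) >= 3 and senders[1] != senders[0] and senders[2] == senders[0]:
            three_ways += 1
    return three_ways

log = [("alice", "bob"), ("bob", "alice"), ("alice", "bob"),
       ("carol", "dan")]  # carol never got a reply
print(count_three_ways(log))  # -> 1
```

Only the alice–bob thread completes the note/reply/follow-up chain, so it alone counts; carol's unanswered note does not.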
“Imagine if you had a video camera at every bar in the country,” Sam Yagan, a co-founder of OkCupid, told me. “You’d have all these data that reveal things about society and predict them. This isn’t a survey. It isn’t a lab experiment. These are millions of people going about their lives. We just happen to be able to track and quantify everything about it.”
The company can quantify things you could guess but might rather not prove. For instance, all races of women respond better to white men than they should based on the men’s looks. Black women, as a group, are the least likely to have their missives returned, but they are the most likely to respond to messages.
I asked Yagan whether OkCupid might try tailoring its algorithm to surface more statistically successful racial combinations. Such a measure wasn’t out of the question, he said. “Imagine we did a lot of research, and we found that there were certain demographic or psychographic attributes that were predictors of three-ways. Hispanic men and Indian women, say,” Yagan suggested. “If we thought that drove success, we could tweak it so those matches showed up more often. Not because of a social mission, but because if it’s working, there needs to be more of it.”
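The tweak Yagan describes amounts to blending an observed demographic success rate into each candidate's match score before ranking. A minimal sketch, with all field names and numbers invented for illustration:

```python
def rerank(candidates, boost=0.2):
    """Re-rank match candidates, nudging up pairings whose demographic
    combination has historically produced more three-ways.
    Each candidate dict carries 'match_score' (0-1, from questionnaire
    answers) and 'pair_rate' (observed three-way rate for this
    demographic pairing). Purely illustrative -- not any site's
    real algorithm."""
    def score(c):
        return (1 - boost) * c["match_score"] + boost * c["pair_rate"]
    return sorted(candidates, key=score, reverse=True)

candidates = [
    {"name": "A", "match_score": 0.80, "pair_rate": 0.10},
    {"name": "B", "match_score": 0.75, "pair_rate": 0.60},
]
print([c["name"] for c in rerank(candidates)])  # -> ['B', 'A']
```

Even with a lower questionnaire score, candidate B surfaces first because the demographic pairing "drove success" historically, which is exactly the feedback loop the next paragraph worries about.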
Imagine the reverse, though, in the past or future. What if the dating sites had existed in the 1950s? How would they have dealt with interracial matches? Given the female response to white men in 2010, should white men show up more often? “We could do some really screwed-up things,” Yagan admitted. Imagine war broke out with China, causing Chinese users’ ratings to plummet: would dating Web sites start reducing the number of Chinese people showing up in other groups’ searches?
Algorithms are made to restrict the amount of information the user sees—that’s their raison d’être. By drawing on data about the world we live in, they end up reinforcing whatever societal values happen to be dominant, without our even noticing. They are normativity made into code—albeit a code that we barely understand, even as it shapes our lives.
We’re not going to stop using algorithms. They’re too useful. But we need to be more aware of the algorithmic perversity that’s creeping into our lives. The short-term fit of a dating match or a Web page doesn’t measure the long-term value it may hold. Statistically likely does not mean correct, or just, or fair. Google-generated kadosh is meretricious, offering a desiccated kind of choice. It’s when people deviate from what we predict they’ll do that they prove they are individuals, set apart from all others of the human type.