So you grow some scruff, start wearing tight jeans, occasionally throw on suspenders, and regularly don plaid. You start hanging out at that artisanal kombucha spot down the street — you know, the one with all the wicker? And then one morning — before you wax that 360-degree handlebar — when you log onto Facebook, you notice something different.
There's one ad for sock suspenders, and another one for Amish-made beard conditioner*.
At parties and on the streets, it's obvious — you're a total hipster. And now Facebook knows it too, just by the photos you've uploaded.
While this dystopian scenario is still set in an undetermined future, the idea that social media can scan your photos for contextual clues about your lifestyle has already been demonstrated. In his lab at the University of California, San Diego, Serge J. Belongie and his students have created a computer algorithm that can guess what "urban tribe" (i.e., subculture) you belong to.
Humans can do this almost without thinking. We hold stereotypes of what the word "hipster" implies: the plaid shirt, the tight jeans. But no single one of those variables completely defines the term; a "hipster" is the sum of many parts.
This concept — the ability to use contextual clues to define an object — is what artists and psychologists call gestalt. But that type of cognition is not easy for a computer, which operates on hard rules.
"That's the theme in my research," Belongie says. "Whenever I stumble across a pattern that people can see but is not immediately evident from a quantitative perspective, I'd like to probe that to figure out what's causing that."
Belongie and his team compile reference photos for each urban tribe and then "teach" the computer to recognize the patterns in colors, textures, and geometric arrangements. The computer then "combs through the database and looks for exceptional examples of each of those categories." In other words, Belongie is trying to teach the computer the gestalt it needs to navigate a high school cafeteria.
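The pipeline described above — reference photos for each tribe, pattern features, then matching — can be sketched in miniature. The code below is an illustration only, not Belongie's actual system (his features and classifier are far more sophisticated); it stands in color histograms for the "patterns in colors" and a nearest-centroid rule for the matching step. All function and tribe names here are made up for the example.

```python
import numpy as np

def color_histogram(image, bins=8):
    """Reduce an RGB image (H x W x 3, values 0-255) to a normalized
    per-channel color histogram -- a crude stand-in for 'pattern' features."""
    hist = []
    for channel in range(3):
        counts, _ = np.histogram(image[..., channel], bins=bins, range=(0, 256))
        hist.append(counts)
    hist = np.concatenate(hist).astype(float)
    return hist / hist.sum()

def train_centroids(labeled_images):
    """'Teach' the computer each tribe by averaging the feature vectors
    of that tribe's reference photos into a single centroid."""
    centroids = {}
    for tribe, images in labeled_images.items():
        feats = np.stack([color_histogram(img) for img in images])
        centroids[tribe] = feats.mean(axis=0)
    return centroids

def classify(image, centroids):
    """Guess the tribe whose centroid is nearest in feature space."""
    feat = color_histogram(image)
    return min(centroids, key=lambda t: np.linalg.norm(feat - centroids[t]))
```

A real system would use richer features (texture, geometry, detected clothing and hair) and a learned classifier, but the structure — featurize, aggregate per category, match new photos against the categories — is the same.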
Once the computer gets a feel for these patterns, it can start guessing. Belongie's program isn't perfect: the system guesses correctly 48 percent of the time, well above chance but hardly infallible. "We're just cracking open a new research area," Belongie says.
A Seismic Shift
Belongie's work goes beyond hipsters and goths. The urban-tribes project is an offshoot of his more sober research: teaching computers to recognize cancer in biopsy slides or dead coral in satellite photos.
And yet, he says, his research can't ignore the pull of social media. "In the old days," he says, "it was NSF and NIH and DARPA grants. That's what kept this field rolling." But now companies like Facebook and Google are investing heavily in artificial intelligence, and picking off his students when they graduate.
The defining story of the decade continues to be the applications of big data. Yes, the Snowden leaks were the biggest story of the year, but they speak to the larger trend of what huge institutions, such as the U.S. government, can do when they can process terabytes of data quickly into usable information. But here's where that story is starting to shift. Big data is going to gain (greater) consciousness. Or at least that's where the research money is heading.
Just recently, Facebook announced it is opening a giant artificial intelligence lab to be led by Yann LeCun, a machine learning researcher at New York University. "The set of technologies that we'll be working on is essentially anything that can make machines more intelligent," LeCun tells WIRED about the initiative.
Google, too, recently hired Ray Kurzweil, a futurist who has predicted that computer intelligence will exceed human intelligence by 2045, to work on machine learning projects. "I envision, some years from now, that the majority of search queries will be answered without you actually asking," he told Singularity Hub after he was hired. "It'll just know this is something that you're going to want to see."
Which is kind of creepy, right?
Let's put it into this context: Currently Facebook does a lot with its user data, such as running mass social experiments on Election Day and researching the rates at which women take their husbands' names when they marry (the social network makes these studies public). But if that's creepy, what's coming may be creepier still.
"There is a huge, seismic shift that is happening, that every company wants artificial intelligence in some way," Belongie explains. "They are just drowning in big data."
And an enormous amount of that data is in image form, which, up to now, hasn't fed back meaningful information about users. "These images are just sitting in these servers like dark matter," he says.
Belongie says it's part of the "natural progression," that these companies will use programs like his own "for good or for ill — but most likely just to serve up more relevant ads."
I asked Belongie if he thought his own work was "creepy."
"I've lost my objectivity," he laughs, "because I find these problems so interesting from a scientific perspective. I don't find it creepy, but I think people need to be aware of what they are sharing."
*Editor's Note: A wonderful product.
CORRECTION: This article originally understated the accuracy of the Urban Tribe program. The system guesses correctly 48 percent of the time.