Business, Finance & Economics

Many hated how Facebook manipulated personal emotions. The one group that didn't? Dictators


Credit: REUTERS/Dado Ruvic

Our relationship with Facebook is "complicated." A new study showed the company could alter the emotional state of its users by showing positive or negative posts on their newsfeed.

I'm sure by now you've heard the latest uproar involving Facebook.


(This story is based on a radio interview. Listen to the full interview.)

Here's the basic sketch: The company wanted to see if the emotional reactions of its users could be "contagious." So for a study, published in the journal Proceedings of the National Academy of Sciences, it altered the news feeds of more than 600,000 of its users.

But there was one catch: Facebook didn't tell them about the changes or the study. When people found out, ironically, many learned about it on Facebook. Many were upset, including people whose news feeds were not part of the study.

Facebook has apologized, though it insists it was within its rights.

But the story has an interesting international hook. Zeynep Tufekci brought it to our attention Wednesday during an interview. She's an assistant professor at the University of North Carolina, Chapel Hill. For The World, she's the person we turn to when we want to talk about the intersection of technology and society.

Tufekci suggests the story has implications for dictators and other authoritarian regimes. I'll get to that in a moment.

But first, it's important to understand the role Facebook plays in the world. Tufekci says the company matters greatly to parts of the world where Facebook is a fundamental part of the public sphere.

"There are parts of the Middle East where large numbers of people use Facebook as if it were the Internet," she says. "That's where they get their news. That's where they do their email. And there are a lot of people who organize protests on Facebook."

Tufekci adds that many of these users have no alternative. Facebook is the only place.

Now let's get back to the link between dictators and an academic study. Tufekci has recently been looking at the ways authoritarian regimes try to keep their supporters off Facebook.

It's pretty easy to see why an authoritarian regime wouldn't like Facebook: the regime cannot control the content. Tufekci says one tactic regimes use is to make social media websites look like awful places where horrible things happen.

It's harder than it seems. Tufekci says that in many countries, the places where Facebook is central to online life, Facebook is too popular to simply shut down. "You can't just unplug it," she says. "People would get upset."

So what the regimes do, says Tufekci, is mount a demonization campaign against Facebook. To put it simply, it's a propaganda campaign.

"Politicians around the world, from Putin in Russia to [Yudhoyono] in Indonesia, they are saying, 'Oh social media it's dangerous. It's awful. It's horrible because you have problems there.'"

Tufekci says this gets to the interesting part. Every time Facebook makes a misstep, like publishing a study in which it admitted to experimenting on its users' emotional states without telling them, it gives fuel to the authoritarian regimes' propaganda.

And it leaves her, and others who know the good that can come from the social media giant, in a tricky place.

"I find myself in the same day both criticizing Facebook's opaque algorithms and shaking my head at this intense campaign coming from authoritarian governments trying to demonize these platforms."

That's why, she says, it's important to have these conversations, and to hash them out on Facebook itself.

"This is the stuff of life in the 21st century."

And that's why our relationship with Facebook is, as Facebook would say, complicated.
