
Inside Facebook’s fight against junk news

Miles O'Brien | 04.26.18


Junk news, fake news, misinformation: whatever you choose to call it, the internet seems to be awash in fallacious stories. Top US intelligence agencies have concluded that the organized spread of this misinformation over social networks like Facebook allowed Russian state actors to exert influence over the 2016 US presidential election.

After that election, I decided to investigate the world of online misinformation. Producer Cameron Hickey and I have dug deep and managed to untangle some of these issues. Here is the first installment of our four-part series for the PBS NewsHour about how junk news spreads, who creates it, and what social networks can do about it:

More to come on this front next Wednesday; please tune in.

We’ve also been diving even deeper into the topic of junk news on recent episodes of my podcast, Miles To Go.

TRANSCRIPT

Judy Woodruff:

Now, false or misleading news, or what you might even call junk news, and the challenge of combating it.

Since the 2016 election, we have all become more aware of the problem. Initial efforts to stop it or slow it down have had mixed results, at best.

Today, France’s President Emmanuel Macron told a joint session of Congress that false news is an ever-growing virus that threatens to corrode the very spirit of democracy.

Facebook is front and center in the effort to address the issue.

And, tonight, Miles O’Brien begins a four-part series spread over the next four Wednesdays about the larger problems, including at Facebook, part of our weekly segment on the Leading Edge of technology.

Miles O’Brien:

At Facebook, the scale of everything is hard to grasp. The largest building at its headquarters in Menlo Park, California, spans a quarter-mile, and sits beneath a lush nine-acre rooftop park. It’s vast, built into the landscape, and growing fast, much like the force of human nature they curate inside on the largest open floor plan in the world.

Sara Su:

This entire area is the News Feed team.

Miles O’Brien:

Sara Su is a product manager on the News Feed team. She gave me a rare glimpse of the inner workings of the space on Facebook where users spend most of their time. These are the trenches in Facebook’s battle against misinformation, a fight that began after false news hit the 2016 presidential election like a tsunami.

Sara Su:

So, over here, we have a lot of the team that works on misinformation and working with our third-party fact-checking partners to identify and take action on potentially false news.

Miles O’Brien:

With more than two billion monthly active users uploading 300 million photos a day, sharing more than a half-million comments a minute, it is simply impossible for humans to keep up.

So, decisions here are made by machines running a complex formula, an algorithm, which Sara Su and her team are constantly tweaking, trying to teach the software what is false, or what is enticing yet unfulfilling spam, known as clickbait, and then send it to the bottom of the virtual pile.

Sara Su:

We work really hard to come up with these techniques that can identify what is likely to be clickbait and then show it lower in people’s news feeds.

If it shows up lower consistently, it gets less distribution overall, and then spammers see less of an economic incentive. And the goal is to reward the types of content that are actually leading to good experiences and disincentivize the bad.

Miles O’Brien:

Is that working, you think?

Sara Su:

It is, and we’re doubling down on it.

Miles O’Brien:

But during our visit, the team was grappling with a particularly mean-spirited piece of false news: that David Hogg, a survivor of the high school shooting in Parkland, Florida, and now a gun control advocate, is a so-called crisis actor.

The algorithm did not flag and lower the ranking of this piece of false news. In fact, it had gone viral and risen to the top of trending stories.

Research scientist Shaili Jain is trying to understand why.

I suppose the other factor is, once you kind of home in on whatever it may be, they just change their tactics, right?

Shaili Jain:

Right. It’s a bit of a cat-and-mouse game: we try to find the bad guys, they enhance their strategies.

Miles O’Brien:

The bad guys are constantly looking for ways to fool Facebook’s News Feed algorithm, which learns in great detail what we like, and then strives to give us more of the same.

Dan Zigmond:

We take all this content, and, honestly, for most people on Facebook, there’s about 2,000 stories that you would get altogether, that total inventory of things that we might show you.

Miles O’Brien:

Dan Zigmond is the director of analytics for the News Feed. He explained how it works.

That’s the average, 2,000?

Dan Zigmond:

Two thousand is the average, but most people, they’re only looking at, I don’t know, about 200. And so there’s some line here, and most people don’t scroll beyond that.

Miles O’Brien:

So every item gets a ranking based on the way you use Facebook. That makes it sound simpler than it is.

Dan Zigmond:

You could rank it so that the things you’re most likely to like are at the top. You could rank it so that the things you’re most likely to comment on are at the top, or the things where you’re most likely to spend the most time.

And each of those rankings would give you a slightly different order.

Miles O’Brien:

And so they do a little bit of each. The algorithm blends all of these criteria. What we see at the top of our feed, in theory, should be a representative sampling of all the things we are interested in.
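To make that blending concrete, here is a minimal sketch in Python of a demotion-based ranker. The Story fields, the weights, and the junk_penalty knob are hypothetical illustrations of the approach described above, not Facebook’s actual model, which is far more complex and not public:

    from dataclasses import dataclass

    @dataclass
    class Story:
        title: str
        p_like: float      # predicted probability the user will like it
        p_comment: float   # predicted probability the user will comment
        dwell_secs: float  # predicted time the user will spend on it
        junk_score: float  # 0..1 output of a clickbait/misinformation classifier

    # Hypothetical blend weights; a real system tunes these constantly.
    WEIGHTS = {"p_like": 1.0, "p_comment": 2.0, "dwell_secs": 0.05}

    def rank_feed(inventory: list[Story], junk_penalty: float = 5.0) -> list[Story]:
        """Blend engagement predictions into one score, then demote
        (not remove) likely junk so it shows up lower in the feed."""
        def score(s: Story) -> float:
            engagement = (WEIGHTS["p_like"] * s.p_like
                          + WEIGHTS["p_comment"] * s.p_comment
                          + WEIGHTS["dwell_secs"] * s.dwell_secs)
            return engagement - junk_penalty * s.junk_score
        return sorted(inventory, key=score, reverse=True)

    # An engaging junk story sinks below a staid, honest one.
    for story in rank_feed([
        Story("SHOCKING claim goes viral", 0.9, 0.8, 40.0, 0.95),
        Story("City council passes budget", 0.3, 0.1, 25.0, 0.02),
    ]):
        print(story.title)

The design choice worth noticing is demotion rather than removal: consistently lower placement means less distribution, and less distribution erodes the spammers’ economic incentive, exactly the lever Sara Su described.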

And that’s the rub. There is mounting research that suggests we are much more likely to read, like and share misinformation, because it is designed to target our emotions.

Dan Zigmond:

One of the issues we face is that misinformation is often very engaging. So, people don’t create a lot of fake boring stories. In general, if they’re going to go through the trouble of creating a fake story, it’s about something really interesting and exciting.

And so the stories can get disproportionate amounts of engagement. People might click them more than they would click a more staid, kind of honest story, a little bit like eating junk food or something like that.

Miles O’Brien:

Fiddling with this is tricky territory for Facebook, because its business model depends upon engagement. Our attention is the commodity it sells to advertisers.

Could be bad for business, couldn’t it?

Sara Su:

Well, I personally feel really strongly that there won’t be a business unless we have a place where people can have good experiences, and so that’s what I really think about first.

Miles O’Brien:

But critics are skeptical the flood of falsehoods can be dammed by simply changing formulas. They see a fundamental conflict at Facebook.

Roger McNamee:

The problem is a business model that creates incentives for you to manipulate the emotions of the user and to make them vulnerable to manipulation by third parties or bad actors.

Miles O’Brien:

Silicon Valley entrepreneur Roger McNamee was an early investor in Facebook and remains a stockholder. The social network’s key role in spreading misinformation has turned him into a vocal critic.

Roger McNamee:

Whatever provokes, whether it’s religion or anti-immigrant sentiment, whatever the issue is that gets people into that emotional state, is what Facebook is looking for.

And so what they figured out how to do was to take the News Feed, the core messages that you see, and to tailor them to provoke the outrage cycle at a predictable interval.

Miles O’Brien:

And politics is an ideal topic to spark that outrage.

The 2016 presidential election became the perfect misinformation storm, fueled by hyperpartisans, profit-seeking spammers and Russian trolls meddling with our democracy. And as worrisome as it is in the U.S., in many countries where Facebook is the Internet, authoritarian rulers have spread misinformation to incite violence against minorities and political foes.

But for all the smoke, it is hard to find the fire.

Cameron Hickey:

And so we have got to use sort of detective work, and in this case sort of Internet detective work, to find them.

Miles O’Brien:

Cameron Hickey is the producer of this series. He began his career as a computer programmer. This came in handy as we started digging.

Cameron Hickey:

It was really clear that, in order to get a handle on it, we had to do some data journalism. And so I started building this tool to understand the scale of the problem, look at the various places where it was coming from, just so we could understand it.

Miles O’Brien:

He started by looking at some existing misinformation sites and isolated telltale patterns, including the age of the pages, the way they share content, and common terms, like outrageous, shocking, and unbelievable.

And then he wrote a program to help the “NewsHour” investigation. He calls it NewsTracker, and it will soon be used by Harvard University to research misinformation online. Hunting for wily spammers is akin to a game of Whac-A-Mole. They change their domains with astonishing speed, but Cameron noticed they don’t change their Facebook pages.

After all, that is where they gather their audience. So he focused on Facebook.
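The telltale patterns he isolated lend themselves to a simple scoring heuristic. Here is a minimal sketch, assuming a hypothetical Page record; the signals, thresholds, and equal weights are illustrative guesses at the approach, not NewsTracker’s actual code:

    from dataclasses import dataclass
    from datetime import date

    SENSATIONAL_TERMS = {"outrageous", "shocking", "unbelievable"}

    @dataclass
    class Page:
        name: str
        created: date               # young pages are more suspicious
        linked_domains: set[str]    # spammers rotate domains behind one page
        recent_headlines: list[str]

    def junk_signals(page: Page, today: date) -> float:
        """Blend three telltale signals into a rough 0-to-1 suspicion score:
        page age, domain churn, and sensational headline language."""
        age_signal = 1.0 if (today - page.created).days < 365 else 0.0
        churn_signal = min(len(page.linked_domains) / 10.0, 1.0)
        hits = sum(
            any(term in headline.lower() for term in SENSATIONAL_TERMS)
            for headline in page.recent_headlines
        )
        term_signal = hits / max(len(page.recent_headlines), 1)
        # Equal weights for illustration; real weights would be fit to labeled data.
        return (age_signal + churn_signal + term_signal) / 3.0

    suspect = Page(
        name="Example Partisan Page",
        created=date(2017, 6, 1),
        linked_domains={"ex-news-1.example", "ex-news-2.example", "ex-news-3.example"},
        recent_headlines=["SHOCKING: You won't believe this", "Bake sale raises funds"],
    )
    print(round(junk_signals(suspect, date(2018, 4, 26)), 2))  # prints 0.6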

Cameron Hickey:

And so it turns out the key to finding the next source is this woman. She is my grandmother.

Miles O’Brien:

Her name is Betty Manlove. She is an 86-year-old Christian conservative Trump supporter. She lives in Indianapolis.

Unbeknownst to her, she has liked more than 1,400 partisan, hyperpartisan, spam and even Russian disinformation sites. Her News Feed is filled with junk news, as her grandson describes it.

These were the leads he was looking for. Now, after more than a year of work, NewsTracker is offering a real-time dashboard to better understand the nature of the problem of junk news.

Cameron Hickey:

One of the things that I discovered really early on in looking through all these junk news sites was this pair of sites, Truth Examiner and Truth Monitor.

Miles O’Brien:

They are virtually identical sites, one catering to liberals, the other to conservatives. He confirmed one person owns both and began watching.

Then, last summer, the conservative site abruptly changed gears, offering lifestyle stories, instead of red meat political fare.

Cameron Hickey:

And it was at that moment where I was like, I got to know what’s happened here. Why did this shift happen?

And so I actually reached out to the publisher and called them up and said, I want to understand what you’re up to. And, to my surprise, he responded.

Miles O’Brien:

And so began a long, strange trip that led us to one of the most prolific purveyors of hyperpartisan content in the world, not in Macedonia or St. Petersburg, but closer to home, in California.

In coming installments, you will meet him and Cameron’s grandmother, and see the problem of junk news through their eyes. And, finally, we will take you back inside Facebook to see how they are redesigning the News Feed, employing third-party fact-checkers, and using artificial intelligence to solve a problem that may be too hard for humans to grasp.

I’m Miles O’Brien for the “PBS NewsHour” in Menlo Park, California.

Judy Woodruff:

Fascinating.

And one other note:

For all of the anger around Facebook, its ad business has not suffered. The company reported today that it earned almost $12 billion in revenue for the first three months of the year. That is close to 50 percent more than it earned for the same period last year.

We will also have more about the NewsTracker project you just heard about, and how it came to be. That’s on our Web site.


