Inside Facebook’s race to separate news from junk

At Facebook, there are two competing goals: keep the platform free and open to a broad spectrum of ideas and opinions, while reducing the spread of misinformation. The company says it’s not in the business of making editorial judgments, so it relies on fact-checkers, artificial intelligence and its own users. Can it keep up as junk news adapts? Science correspondent Miles O’Brien reports.

This is Part 4 of our junk news series for the PBS NewsHour. Click to watch Part 1, Part 2, and Part 3, as well as to listen to all our junk news extended interview episodes of my podcast, Miles To Go.


TRANSCRIPT

Judy Woodruff:

Facebook is under pressure to crack down on false news, fake accounts and inflammatory content that can be manipulated to influence the public.

This week, the social media giant announced that it deleted 865 million posts in the first three months of this year, most of them spam. The company also quickly disabled more than half a billion fake accounts.

But that isn’t everything.

Tonight, we take a look at how Facebook tries to tackle the content it won’t delete.

Miles O’Brien has been examining the problem of junk news and Facebook.

For the record, the “NewsHour” has partnered with Facebook on projects in the past.

This is the last report in a special series. It’s part of our weekly look at the Leading Edge of science and technology.

Miles O’Brien:

Inside Facebook headquarters in Silicon Valley, they are trying to rethink the column of babble that the News Feed has become.

Man:

So, I don’t think this is actually necessarily all that bad of a design, even though it doesn’t look that great.

Miles O’Brien:

Here, they are trying to figure out how to rate the quality of the news we like and share, more clearly identify the source, offer users some context, and make sure the cream rises to the top of the feed, while the junk sits at the bottom.

Tessa Lyons:

We don’t want a false story to get distribution, so we demote it.

Miles O’Brien:

Tessa Lyons is product manager of News Feed integrity. She works with two competing goals in mind: keep the platform free and open to a broad spectrum of ideas and opinions, and reduce the spread of misinformation.

Why not just delete it?

Tessa Lyons:

Well, it’s an interesting question. And I think, look, there’s real tension here. We believe that we’re not arbiters of truth and that people don’t want us to be arbiters of truth.

We also believe that censoring and fully removing information, unless it violates our community standards, is not the expectation from our community. So we work to reduce the damage that it can do and ensure that, when it is seen, it’s surrounded by additional context.

Miles O’Brien:

Even though nearly half of all Americans now get their news from Facebook, the company insists it is a technology enterprise, not in the business of making editorial judgments.

So they are outsourcing the work. The most clear-cut problem: content that is demonstrably false. To grapple with that, they have turned to some 20 third-party fact-checkers globally, including one of the Internet’s original sites for separating fact from fiction, Snopes.com.

We caught up with managing editor Brooke Binkowski, a former newspaper and radio reporter. She works at home or at a neighborhood coffee shop in San Diego.

And is this a typical day? Busy day? Yes, OK.

Brooke Binkowski:

This is a busier-than-normal day.

Miles O’Brien:

Facebook reached out to Snopes to be among its outside fact-checkers in 2016.

Brooke Binkowski:

It’s gone from probably eight-hour days for all of us to more like 12-, 15-hour days now because there’s just so much to tackle, and we all are true believers, basically. We all think that it’s important, what we’re doing.

Miles O’Brien:

Facebook says when a story is debunked by fact-checkers, it reduces its reach by 80 percent. But Binkowski remains skeptical.

Brooke Binkowski:

I still am not convinced that it’s making a huge difference.

Miles O’Brien:

Part of the problem? Old-fashioned shoe-leather reporting takes time: making calls to sources, doing the research, sometimes even taking a trip to a real library.

Brooke Binkowski:

Mark Twain famously said, what is it, a lie can travel halfway around the world while the truth is still putting on its shoes.

I think it’s gotten so bad now that a lie can travel like three times around the world, completely change, affect the perspectives and the votes of thousands of people, and wreak havoc all over the place while the truth is still kind of getting out of bed. It’s just happened much faster, and it’s overwhelming.

Miles O’Brien:

At Facebook, they are keenly aware of this, but they see no easy fix, either from humans or machines.

Is it possible to match the rate at which people use the product?

Tessa Lyons:

If we’re always waiting for individual facts to be reviewed, that is going to be slow in each case. But by working with the fact-checkers, if we’re able to understand the pages and the domains that repeatedly spread this type of information, then there’s more work we can do upstream to stop it earlier.

Miles O’Brien:

They are also looking upstream for spammers who create content that is factually correct, but misleading, incomplete or polarizing. It’s often called clickbait.

To try and defend against this, Facebook is using an artificial intelligence technique called machine learning classification. The idea? Feed the computer reams of clickbait examples, so it can find patterns and learn how to spot this material and send it to the bottom of the News Feed.
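Facebook has not published the details of its classifier, but the general technique is a standard one. The toy Python sketch below illustrates machine learning classification in the sense described above; the headlines, labels and choice of library are illustrative assumptions, not anything from Facebook’s system.

```python
# A toy clickbait classifier, illustrating the general technique described
# above -- NOT Facebook's actual system. Requires scikit-learn.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled training examples (1 = clickbait, 0 = not clickbait).
headlines = [
    "You won't BELIEVE what this senator just said",
    "Doctors HATE her: one weird trick to melt fat",
    "Federal Reserve raises interest rates by a quarter point",
    "City council approves new transit budget",
]
labels = [1, 1, 0, 0]

# Learn word patterns from the labeled examples, then score unseen posts.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(headlines, labels)

new_post = "This one photo will restore your faith in humanity"
score = model.predict_proba([new_post])[0][1]  # probability the post is clickbait
print(f"clickbait score: {score:.2f}")

# In a ranking system, a high score would push the post toward the bottom of
# the feed rather than delete it, mirroring the "demote, don't remove" approach.
```

The key point, as Lyons explains next, is that once trained, a classifier can score every post automatically, without a person reviewing each one.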

So, with artificial intelligence, you can make the algorithm smart enough to identify what is clickbait from a spammer?

Tessa Lyons:

This is using a machine learning classifier in order to help us scale this problem, because, at two billion people, we want to ensure that our solutions don’t require individual manual review, but, rather, that they can scale across the platform.

Miles O’Brien:

Producer Cameron Hickey has been developing his own tool to identify the junk as part of our reporting. In doing so, he has seen the limits of machine learning and the persistence of the adversary.

Cameron Hickey:

Using software to recognize patterns, and then doing something based on the patterns you recognize, is only as good as the pattern remaining consistent.

And the whole point of this problem is that the people who are trying to publish content like this, they’re adaptive, right? So as soon as you shut down one avenue, they move to another avenue.

Miles O’Brien:

Historically, junk news producers have taken advantage of the fact that almost everything that appears in the Facebook News Feed looks the same, whether it’s heavily researched and vetted journalism or pure junk.

In the days when we bought newspapers and magazines at newsstands or supermarkets, it was easier to identify the difference between quality and junk. Facebook is developing ways to give its users some clues in labs like this one.

Grace Jackson:

So, what’s going to happen is, there’s going to be red dots that pop up on the screen. I just want you to follow them with your eye.

Miles O’Brien:

Grace Jackson is a quantitative user experience researcher at Facebook. She is showing me how she tracks eye movements as a user reads a News Feed.

She’s testing to see how easily I recognize visual cues that an article has been fact-checked or links are added for context.

I blew by it.

Grace Jackson:

Yes. So…

Miles O’Brien:

I think you need a little more there.

Grace Jackson:

This was our original design that we had tested and learned that a lot of people skipped right over it and never saw the entry point over here.

Miles O’Brien:

Well, it’s not obvious that’s a click point, right?

Grace Jackson:

Exactly.

Miles O’Brien:

The eye-tracking data helps product designer Jeff Smith as he ponders new ways to give users clues about the credibility of information.

Jeff Smith:

We’re in this new space and age where we’re trying to design new mechanisms that give people those credibility cues that used to be there, via the publisher, on the supermarket aisle or the newsstand.

Miles O’Brien:

He’s working on a design that more clearly identifies articles that have been debunked, provides context, related articles, and information about the publisher.

Jeff Smith:

I’m trying to give the user as much information as possible in a way that they can easily sort of digest and understand, without getting in the way.

Miles O’Brien:

The Facebook News Feed algorithm is finely tuned to hold our attention, originally without an emphasis on the quality of the content. But the company says it is trying to change that.

Alex Hardiman:

We want to make sure that the news people see is high quality. And we didn’t have that stance before. And so it’s a pretty radical departure in terms of the way that we have been thinking about news, and I think a really worthwhile one.

Miles O’Brien:

Alex Hardiman is the director of products for news. She is leading Facebook’s effort to identify news sources that are credible, trustworthy and authentic.

They’re turning to their users for the answer, asking them to rate news sources they trust, and feeding those rankings into the News Feed algorithm to determine what sources should rise to the top.
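Facebook has not disclosed how those survey ratings are weighted inside the algorithm, but the basic idea of blending a trust signal into a ranking score can be sketched. The example below is a hypothetical illustration only; the source names, weights and scoring formula are assumptions, not Facebook’s system.

```python
# A minimal sketch of folding a source-trust signal into feed ranking.
# Illustration only; Facebook's real algorithm and weights are not public.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    source: str
    engagement: float  # e.g. predicted likes/shares, normalized to 0-1

# Hypothetical aggregated trust ratings from user surveys, normalized to 0-1.
source_trust = {"broadsheet.example": 0.9, "junknews.example": 0.2}

def rank_score(post: Post, trust_weight: float = 0.5) -> float:
    """Blend predicted engagement with the source's survey-based trust rating."""
    trust = source_trust.get(post.source, 0.5)  # unknown sources get a neutral score
    return (1 - trust_weight) * post.engagement + trust_weight * trust

feed = [
    Post("Shocking photo goes viral", "junknews.example", engagement=0.95),
    Post("Budget analysis: what the new bill means", "broadsheet.example", engagement=0.6),
]
for post in sorted(feed, key=rank_score, reverse=True):
    print(f"{rank_score(post):.2f}  {post.title}")
```

In a scheme like this, a post from a low-trust source needs far more engagement to outrank one from a trusted source, which is the effect Hardiman describes.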

With so many people getting their news from Facebook, why don’t you have a newsroom?

Alex Hardiman:

Because, for us, thinking through what quality means doesn’t require us to have a newsroom. We’re really trying to make sure that we pull in great information from publishers and from the people who use Facebook to make these decisions.

Miles O’Brien:

But, if given the choice, will users of Facebook choose quality over junk? The hyperpartisan content publisher we found in California, Cyrus Massoumi, is skeptical.

Cyrus Massoumi:

They aren’t New York Times readers, necessarily. Maybe some of them are, but the majority of them just want a 250- to 350-word article which will sort of like get them a little bit fired up.

Miles O’Brien:

And the numbers back him up. His Truth Examiner page has 3.8 million fans, and the stories he publishes generate a much higher rate of likes and shares than those of The New York Times and The Washington Post.

Cyrus Massoumi:

Nobody wants to read that stuff when they’re on their phone, which is what everybody’s doing when they’re on Facebook. Like, nobody pulls out their phone and goes, aha, I would love to read this 5,000-word profile of the endangered giraffe in the Congo.

Miles O’Brien:

As long as users continue to like and share junk news, should Facebook redefine its role as a publisher?

Brooke Binkowski:

I think that they really need to come to terms with the fact that they are a media company, on top of everything else, because, right now, they keep saying they’re tech, they’re tech, they’re tech.

They’re trying to avoid it all coming crashing down when they finally say, we’re media, because then all those questions will come:

Well, why didn’t you do this? Why didn’t you do it that way? Why didn’t you listen to this?

Miles O’Brien:

Is it time for the company to take a little more responsibility for what is in the News Feed and what is in the trending stories, in an active editorial way?

Alex Hardiman:

I say absolutely yes to responsibility. As for the tactics, whether or not that means an active editorial role, I would say we don’t know. So, the first part, yes: we have a responsibility to make sure that the news people see on Facebook is high-quality.

Miles O’Brien:

The problem is far from solved, and the 2018 midterm elections are looming. Even as the political races heat up, here at Facebook, they’re running their own race, with no finish line in sight.

I’m Miles O’Brien for the “PBS NewsHour” in Menlo Park, California.

Judy Woodruff:

Fascinating.

And you can watch all of the stories in Miles’ series about junk news online at PBS.org/NewsHour.
