Inside the Filter Bubble – with Eli Pariser, the man who coined the phrase. | Miles O'Brien Productions

The Internet was supposed to provide a utopian virtual world where all of us could come together in peace, love and harmony to better understand each other and our differing viewpoints…

But we got derailed on the road to utopia, didn’t we? Eli Pariser, the man who coined the phrase “filter bubble”, knows as much about this as anyone. We featured Pariser in last week’s installment of our PBS NewsHour junk news series. I hope you enjoy this extended conversation as much as I did.

Miles O’Brien: Hello and welcome to another edition of Miles To Go. I’m Miles O’Brien.

Our deep dive for the PBS NewsHour into the world of misinformation (many of you call it fake news; we like to call it junk news, but no need to get involved in the semantics at this moment) continues, and our final piece is releasing this week. We’re going to take you back inside Facebook and look at some of the ways they’re trying to battle this issue of low quality, hyper-partisan misinformation.

Our stories were produced by Cameron Hickey. If you want to find out more about what we have aired so far, go to the website. You can see the stories there, and you can sign up for our newsletter: once a week, no spam, no junk, and we promise it will keep you connected to the world of science and the world of news and where they intersect.

So, we’ve taken you inside Facebook; we’ve introduced you to one of the most prolific purveyors of hyper-partisan content in the business. You’ve also met, if you’ve been watching, a couple of consumers of this content on both sides of the political spectrum. So, check it out and we hope you might learn a little something about social networking and the concerns that it gives rise to.

While you’re in the process of checking things out, we’d love for you to weigh in on the podcast. Let us know what you think of it, one way or another.

Now, we’re interested in the unvarnished truth, the real story–not something that we might want to hear. You know, this is an all too common phenomenon in our social media driven world, of course.

We find ourselves in a virtual tribe online, and we tend to see the world in just the same way as our tribe, sort of by definition. Because this technology is really good at filtering out all the other stuff, the people we don’t want to hear from. We end up in a virtual bubble… a filter bubble.

That’s a phrase you might have heard. It was coined by Eli Pariser. He is one of these smart people we interviewed for this series for the PBS NewsHour. He is president of a company called Good Media and they run a site called Upworthy–and that’s where we began our discussion.

Miles O’Brien: Give me a few words on the company, what you’re doing right now first before we get into the Filter Bubble and all those things. Tell me about that.

Eli Pariser: So, five years ago, I started a company called Upworthy, which was trying to make ideas that are important reach a large audience in an algorithmic age using social media. We’ve grown to be pretty big. We reach tens of millions of people a month these days, and we merged with Good to form Good Media Group last year.

Miles O’Brien: So, tell me your insight about what the Filter Bubble is. It predates the company, right?

Eli Pariser: Right.

Miles O’Brien: So tell me, how did you come upon this thesis?

Eli Pariser: I woke up one morning and logged on to Facebook, and I had just been trying to connect with a whole bunch of people who were kind of not thinking the same way that I was. I was bored of my friends and their similar opinions, and I wanted something different.

When I logged on to Facebook this particular morning, I noticed that it was just the people who I already kind of agreed with; all of the people who I had befriended but who weren’t like-minded to me had kind of fallen out of my feed. And I got interested in, “Why might that happen?” This was back in 2010, and I started looking into the dynamics of the Facebook algorithm. And what I discovered was that Facebook was looking at my behavior and deciding, “Well, you say you’re interested in these people, but really you’re interested in these ideas that reflect what you already believe.”

That was really kind of the start of my research into the Filter Bubble, where I realized that it wasn’t just Facebook; it was also Google, and a lot of other companies were embedding this kind of technology in our experience of the web.

And so I had this kind of image of a Filter Bubble as this kind of personal universe of information that follows us around wherever we go. And it filters out things that we might not want to engage with and shapes our own sort of view of the world.

Miles O’Brien: Haven’t humans always kind of existed in some kind of Filter Bubble? I mean, we hang around people we like. We tend to gravitate to people with similar views. How is it different when it’s magnified by social media?

Eli Pariser: I think there’s two ways that this is new, because we have always consumed media that reflects what we already believe. But this is passive, so we’re not choosing to pick up a magazine that has a particular point of view. And because it’s passive, because we don’t know who Google and Facebook exactly think we are, we don’t know on what basis information is being edited in, and we don’t know on what basis information is being edited out. So, we can’t conclude anything about what we’re not seeing in our Facebook feeds, that’s new.

I think the second thing that’s new is the sort of scale of these algorithms, the fact that this same sort of process is affecting now two plus billion people around the world on Facebook alone, add in however many people are viewing the world through Google. This is not one magazine or one piece of media. This is a system through which almost all of our digital media passes these days and that’s new as well.

Miles O’Brien: It’s kind of hard to overstate the power that that gives, right?

Eli Pariser: Yeah. No, I mean, we’ve never seen companies aggregate the kind of human attention that Facebook aggregates before. That’s never happened. And with small twists to these algorithms, you can create shifts that dramatically change how people all over the world are paying attention and what they’re paying attention to. That’s an enormous amount of power, and it’s power that’s largely not held accountable by any government or regulatory body.

Miles O’Brien: It’s pretty much a black box, isn’t it?

Eli Pariser: Yeah. There’s very little that we know about exactly how all of these work. Facebook offers some vague ideas about sort of what they’re prioritizing at any given time. But who they think we are, on what basis the information that comes to us is chosen, who it’s even coming from: all of that is a mystery, and it’s a mystery kind of by design.

Miles O’Brien: On the one hand, it’s a private company that is designed to make you feel good and come back each day and find something you like.

Eli Pariser: Right.

Miles O’Brien: So, in a sense, that’s a perfectly transparent mission, right? Most people are fairly content in that bubble, aren’t they?

Eli Pariser: Well, I mean increasingly we’re realizing the constraints of being in one of these bubbles and I think, you know, it feels good on a day-to-day basis.

But I think, more and more, we’re realizing that you can ignore the truth, but the truth isn’t going to ignore you and that the consequences of things that are not making it into your bubble still exist whether you see them or not. And so, I think the way that our attention is being drawn away from some of the things that really matter, I think is becoming more and more kind of a source of anxiety for a lot of people in America and around the world.

Miles O’Brien: You could sort of view Facebook as almost a quasi-governmental enterprise at this point, in some respects, another branch of government, or something to that effect. It has the ability, through its terms of service, to write law for all intents and purposes. It changes the way we think. It has tremendous power. Does it owe us more, put it that way, given the amount of power it has? It is a private corporation, after all.

Eli Pariser: Yeah. Well, I think Facebook regulates and it regulates not through laws, but through code. But that code determines how people can interact and behave on the world’s largest platform for conversation.

And so in a very real way, it’s structuring our social space from the point of view of a private company. From my point of view, Facebook may not have intended to be that place. I sometimes think of Mark Zuckerberg as someone who woke up one morning and was mayor of a small town that he never chose to be mayor of, and people are coming to him saying, “Well, the garbage isn’t being picked up and there is lead in the water,” and he’s saying, “Well, I never wanted this job in the first place, that’s not my problem.” But it is his problem because, ultimately, he has displaced all of the other places where this was happening.

People talk a lot about disruption in Silicon Valley and Facebook has disrupted the entire sort of attention economy and attention ecosystem, and co-opted a whole lot of it. And as a result, they have a responsibility for what happens there.

Miles O’Brien: How much of this is just by sheer virtue of its size, kind of the monopoly factor of it, and how much of this is the way it does business? In other words, if there were more competition, would we be having this discussion?

Eli Pariser: Well, probably less, but I think one of the things that’s fascinating about Facebook is the way that they use their size to stymie competition. There’s a whole team inside Facebook that looks at what new platforms are throwing off signals into Facebook. Facebook can then see, “Oh, this new thing is gaining traction. Let’s study it and let’s replicate that functionality inside Facebook.”

And again, in a way, it’s a private business and God bless it. But in another way, we’re increasingly facing this place where our entire kind of communication structure is mediated by a company that’s not accountable to anyone and that can’t be competed with.

Miles O’Brien: That’s a scary statement.

Eli Pariser: I think it is and I think Facebook is realizing that in many ways this is an uncomfortable position for them because people are beginning to hold them accountable for what happens on the platform. And it’s hard to be a place for good democratic discourse across 190 societies all running through one algorithm.

Miles O’Brien: Probably impossible, right, when you consider the cultures and everything else involved, language and you name it, to do that, right?

Eli Pariser: Well, I think it’s certainly very, very difficult and I think that’s what we’re seeing, with this latest set of announcements, that Facebook, in a way, wants to step back, wants to position itself as like, “We’re not part of this public conversation as much anymore. We’re just a place for friends to talk to friends, and this whole news and politics maybe should be someone else’s problem.”

I don’t think it’s that simple, but I think they would kind of like to step back from that conversation.

Miles O’Brien: I think you could probably make an argument it’s sort of late to step back at this point, right? Isn’t it? I mean, it’s not just that they dominate the discussion in the public square, they are the public square.

Eli Pariser: They’ve co-opted the public square, and now — I mean, this is why I started Upworthy. I think you can’t ignore the importance of Facebook, like it or not. There has to be a conversation there because that’s where people are paying attention, and I think our choice is merely: are people paying attention to things that are kind of substance-less, or are they also having conversations that actually matter?

Miles O’Brien: With all due respect, you have a great company and a good idea, but how can Upworthy possibly make a dent given the size of what we’re talking about here?

Eli Pariser: Well, I think we were trying to — we do reach tens of millions of people, and I think the purpose was to demonstrate that it’s not as if audiences actually don’t care about consuming important issues. At the time we started, Facebook was mainly seen as a place to share cat GIFs and people sledding off the roof. And I think we wanted to demonstrate that there is a way to reach a lot of people around substantive content in a way that they’ll engage with, across the political divide, by the way.

But certainly, we’re not the solution either. I mean I think there needs to be a whole ecosystem of public-spirited companies on the media side that are helping to engage people around this kind of content.

Miles O’Brien: How do you do this? Give us some specifics on how it works, how you try to counter the effects of the Filter Bubble.

Eli Pariser: For us, it starts with connecting with people as people and around values. We’re a deeply divided country in many ways, and certainly, if you start with any kind of partisan labels or identities, it’s game over as far as a reasonable conversation. But I think there are a lot of values that we all agree on, and when we can see each other as people, then we can often get past some of our immediate reactions.

At Upworthy, we focus a lot on telling stories that build empathy across these divides and that help people flesh out what otherwise might be a fairly thin stereotype of the other side.

What’s exciting is those really do get shared, not only a lot, millions of times, but by a very diverse group of people politically, including folks from the left, center, and right. For us, it’s a way to try to bring people back toward some sense of commonality in a moment where it feels like a lot of people are trying to drag people away from each other.

Miles O’Brien: Yeah. We’re going full tribal. You’re fighting full tribal at this point, right?

Eli Pariser: Yeah. I think it is a tribal moment, and people get tribal when they’re fearful and scared. And I think one of the best things we can do in this moment is to remind people of the power and security that we have when we work together. Those stories are not the stories that we typically hear on the news, which are stories of war and chaos and division.

But it’s true that actually this country has done great things together in the past when we’ve worked together, and if we can remember that, we can actually get out of or above our little tribal affiliations.

Miles O’Brien: The pejorative is clickbait. But truthfully, what you want to create is clickbait. Otherwise, people aren’t watching what you’re creating. Clickbait doesn’t necessarily have to be bad, is that the point?

Eli Pariser: Well, I mean, for us, it was never about the click. It was about: does it resonate with people in a way that they’ll share? And what’s made me most proud is the proof point for us is, “Okay, you watched it. Now, do you want to put your name by that with all of your friends, or is it not quite good enough to do that?”

And if we meet that mark, then I feel like we’ve done a service. I feel like the proof is in people actually being willing to put their name behind it in front of their friends.

Miles O’Brien: So, you defied a little bit of the conventional wisdom here: that there is a middle ground where people can meet and actually agree on things. That still exists?

Eli Pariser: Yeah. It does exist, because there are a lot of values that everybody in this country believes in: taking care of their family and protecting their family, and making life better for their kids. There are a bunch of these things that we do all actually believe in, and there are a bunch of stories that actually transcend these political categories that we put ourselves in.

Miles O’Brien: So, this idea that you’re kind of pushing middle ground through this filter bubble membrane.

Eli Pariser: Yeah.

Miles O’Brien: That’s not easy, right, or is it?

Eli Pariser: No, it’s not. It’s not easy. We have a social scientist on staff whose whole job is to help us understand in what contexts people come together and in what contexts they don’t, and what kinds of stories are actually going to resonate. One of the things that we found most interesting is that if you look at stories about topics like climate change, people engage with the topic and usually feel incredibly depressed by the time they get to the end of the article.

What’s interesting is that this not only inhibits their likelihood to share it with others, but it also inhibits their likelihood to do anything about it, for obvious reasons: they feel hopeless. When you can create a sense of empowerment, a sense that something could be done, whether or not you can do anything about it right now, it totally changes people’s relationship to that topic.

And all of a sudden, they actually want to engage with it because there’s a sense that they’re in the driver’s seat a little bit. And so, I think this is another way in which we try to do things a little bit differently from some other media, instead of just presenting a fairly gloomy outlook. It’s not like we try to be Pollyannaish about it, but we try to suggest there is something that can be done and that you can be part of doing it. And that activation actually reshapes people’s orientation around wanting to learn more about it in the first place.

Miles O’Brien: So, is there money to be made doing this?

Eli Pariser: We’re a for-profit business. We’re profitable and we’re growing, and I think that’s always been part of the design: if we were going to actually reach people at the kind of scale that we wanted Upworthy to reach, we needed to be able to fund it.

Miles O’Brien: So, it seems to me, and we talk so much about Facebook, that when you look at the big corporate media players, they’re part and parcel of this problem too, aren’t they? Because they’re not focused on the kind of stories that you’re looking at; they’re looking at one side or the other, often polarized content that suits their audience. Would you agree with that? I mean, it seems like there is a perception at a place like CNN that there’s a lot more money to be made in polarized content, right?

Eli Pariser: There’s a belief that making us more anxious will allow more advertising, more ability to sell products. And I think that may be true in the short term, but in the long term it makes us all kind of numb. I think what people are really looking for in this moment, where we’re very politically divided and economically divided, is some sense of hope and optimism that’s grounded in reality.

I think actually it’s kind of a miscalculation to not focus on that as well.

Miles O’Brien: So, I mean, we like to simplify problems too much. Of course, we call it fake news and we blame Facebook, and we’re done, right?

Eli Pariser: Yeah.

Miles O’Brien: That’s really way too simplistic, isn’t it? It’s a much larger ecosystem. It’s so much more nuanced a problem, isn’t it?

Eli Pariser: Yeah. The focus on fake news is probably overheated because it’s not just about — you can get rid of all of the sort of false stories about the Pope endorsing Trump and still have a pretty disengaged and pretty unhappy information ecosystem. And so, it’s not just about fake news, it’s about how do you make the truth louder and how do you make sure that it gets to everyone rather than just sort of a small group of news junkies.

Miles O’Brien: So, have we gotten to a point though where we are completely skeptical of everything and everyone and how much does that make it difficult to define what our common body of facts are anymore?

Eli Pariser: We do risk some form of truth bankruptcy. We do risk a moment where people throw up their hands and say, “I’m hearing this is true from one person, false from another. I don’t have time in my busy life to authentically figure out which is which, and I’m just going to kind of disengage.” And I think that’s the scary part of this moment: that a lot of people just tune out because they’re hearing too many jarringly different perspectives. Ultimately, there probably are ways that you can build an information ecosystem and build platforms that do actually help us make sure that the truth rises to the top, as we all kind of learned in civics class, but that’s clearly not happening today.

Miles O’Brien: Give me a couple of trend statements since you first wrote about the filter bubble. When was it published?

Eli Pariser: Yeah.

Miles O’Brien: When was that?

Eli Pariser: 2011.

Miles O’Brien: 2011, so what has happened since then?

Eli Pariser: One of the things that happened is that these companies have just gotten enormously larger. Facebook in particular was on the scene in 2011 as kind of a place to connect with your friends.

But it wasn’t the central place that controlled an enormous portion of the digital advertising spend and an enormous portion of people’s online attention. Right now, I think the number is that people spend something like four hours a month on Facebook and that may be an old number. So, it’s just an enormous amount of time, times an enormous amount of people.

Eli Pariser: Since I wrote the book, things have gotten enormously bigger. The way that we relate to social media has shifted: increasingly, people see it as their first source for news. And we’re seeing the rise of a generation, Millennials in particular, that is digitally native and isn’t going back.

And so, all of that is kind of reshaping politics as well, because whereas it used to be the case that TV news and local TV news were the primary way that Americans got informed about how they would vote, now digital media is becoming a really critical place for Millennials as well.

Miles O’Brien: Well, quick thought on the Millennials. They’re a lot more savvy about this than certainly my generation. Are they better equipped to forage for divergent viewpoints in this landscape than we are and maybe the problem in that respect might solve itself over time as this new generation takes power, or are they equally duped by the Filter Bubble?

Eli Pariser: I think the average Millennial and the average kind of digital consumer in general knows enough to be dangerous in a way because the fact is that almost everyone, other than really experienced experts, can be fooled by the sort of trappings of truth.

There’s a great study where a bunch of professional fact checkers faced off against a bunch of like very highly educated academics, journalists and other professionals to try to figure out which statements were true and which statements were false. And even these very well-educated folks fell for, you know, “Well, it looked like a very impressive website and it had really nice font, had a picture of a guy who looked like he knew what he was talking about.” We all fall for this stuff and so I think at some point you do need systems that really deeply account for the credibility of an institution in deciding whether it gets distribution rather than allowing people to make what are fairly fast judgements about who they can trust and who they can’t.

Miles O’Brien: I mean, there’s always been fake news. You just knew what it was: in the supermarket line, you know, Hillary with an alien. You knew it was there, and I was like, “Okay, I get that.” And now it’s completely a hall of mirrors, isn’t it?

Eli Pariser: Yeah. Well, I think there are many, many different flavors of gray. That’s a mixed metaphor. On the one hand, you have things that are clearly false, but then you also have kind of partisan propaganda. You have advertorials. You have things that are opinionated but roughly true, and you have straight hard journalism, which may be motivated by any number of sponsors. All of that is a landscape that’s difficult to navigate, much more so than one in which we just had fewer information sources to begin with.

Miles O’Brien: How do you find the truth in that spectrum, I wonder.

Eli Pariser: Well, I think what we’re realizing now is kind of an old learning, I think, which is that truth always was about who we trusted, but we allowed ourselves to forget that for a while because there weren’t that many choices to make about whom we trusted. And I think right now we’re seeing the consequences of a moment where we haven’t really made great decisions as a society about who to trust.

I’m relying, and many people are relying, on a set of friends to populate our newsfeeds, who are not taking that on with a sense of responsibility for my information diet. They are just taking it on with the sense of, “Oh, I’m going to post this to Facebook and hopefully someone will see it.” That’s a really big shift from a set of editors and producers carefully thinking about what should go on the front page and what shouldn’t.

We’re kind of in this weird middle ground where it feels very reliable because it’s coming from someone who you know and trust much more than you knew the editor of the New York Times, but that person does not really have either the expertise or even the intentions of programming your information diet.

Miles O’Brien: A few words on what Facebook did recently. You described Zuckerberg as a mayor of a small town, a little more than a small town, but he decides to kind of retreat from this public square role a little bit. Is that the right thing to do at this point and what’s been your experience looking at how the algorithm has changed on Facebook and what it’s doing?

Eli Pariser: There are parts of that announcement that I think are dismaying and parts that I think are potentially interesting and encouraging. I think the scary part is that Facebook takes a cannonball into the media pool, displaces a bunch of media, and then says, “Ah, we never really wanted to be in the news business to begin with, so we’re going to retreat from that.” And what was displaced doesn’t grow back. What was news distribution is replaced by people talking about their kids and their cats, and whatever else. I think that part is worrying because, whether Facebook likes it or not and whether we like it or not, Facebook at this point does have an important role to play in our information distribution and in our democracy, and they can’t shirk that. I think the part that’s interesting is they’re starting to work on some ways of thinking about which news sites you can trust, though the way that they are approaching it is inexact and going to need some refinement.

I think the idea of some kind of measure of “Does this content seem credible not only to people from one political viewpoint, but across political viewpoints?”, such that if it does, it can reach a large audience, is a way to start to encourage media and encourage people toward some commonality as opposed to away from each other. And so, I think that is exciting and interesting.

Miles O’Brien: Is there a way to identify sources that we can all equally trust?

Eli Pariser: No. There’s no perfect solution here. And it’s also not symmetrical, I think, which is the other thing that’s worth saying. Both the structure and intentions of left-wing media look really different from the structure and intentions of right-wing media. In general, what you see, if you look at the landscape, is a lot of very partisan far-right media, not much in the center-right, then a big bump in the center-left, and some very partisan left-wing media.

The distribution is not the same, and part of the challenge that we have is that folks on the far right trust almost nothing on the left, while folks on the left are willing to kind of go center-right, but there is very little there institutionally. So if each side did the same thing, in other words, you wouldn’t get the same results.

Miles O’Brien: Why doesn’t each side do the same thing?

Eli Pariser: Well, I would say there’s decades of research showing that the way that liberals and conservatives ingest information, process it, and think about the world is different. Generally speaking, conservatives are less interested in hearing multiple views as an inherent good. Fewer of them have a problem with believing that Fox News is the only place they should get information, according to a few studies.

With liberals, it’s a bit more baked in that you should be hearing from everybody. And whether or not they actually do that, it shapes their information consumption habits and means that they want to hear from people who seem unbiased. Jon Stewart is actually fairly clearly a left-wing person, but he has a kind of posture of unbiasedness that makes liberals happy.

So, each side has its own kind of way of engaging with media. Neither of which are like fully rational or perfect by any means, but which shape the kind of media that grows up around them.

Miles O’Brien: So, where are we headed as we approach the next election?

Eli Pariser: We will see both partisan sides, though I would argue probably more the right than the left, trying to use people’s confusion about the truth to score political points in various ways.

And in some ways, just to neutralize. We saw this in the election in Alabama, where there were a bunch of accusations of sexual misconduct that were fairly well-established; just creating uncertainty around whether they were true was hugely politically beneficial. That worked to a large degree, although it wasn’t ultimately effective. I think we’ll see a lot more of that: people are going to be trying to create a lot of uncertainty around the things that they don’t want us to pay attention to. And I think our job as citizens and as an electorate is going to be to see that for what it is, a smokescreen that’s being thrown up, and not let it distract us from actually getting to the bottom of what people are doing at the end of the day.

Miles O’Brien: There’s a certain amount of responsibility to being a citizen…

Eli Pariser: Yeah.

Miles O’Brien: And I think maybe we have forgotten this. Is that accurate to say, do you think, collectively?

Eli Pariser: Yeah. I mean, I think both things are true. I think it is a hard time to be a busy yet responsible citizen and try to figure out what the heck is going on. It’s maybe never been harder. There are so many choices, there are so many things flying around, and it’s all moving so quickly. At the same time, I think the core idea of democracy is that we’re all going to devote ourselves enough to that effort to be able to make a good decision and to do it on the basis of what’s actually true. So, we have to call on ourselves to do that even when it’s kind of unpleasant and —

Miles O’Brien: Do you see that happening realistically?

Eli Pariser: I think people are getting a sense of what these tactics actually do and are falling for them less than they were two years ago, because we are all adapting to a new environment in which the relationship to the truth is different than it was in 2016.

Miles O’Brien: All right, tell me a little bit about that document, “Design Solutions for Fake News.” You're all about solutions, which I like. It's good somebody's out there slugging away. Walk us through it. What are the stakes? Could something like this lead to the dissolution of our great democracy, and if so, are there solutions that —

Eli Pariser: Yeah, though I think on the worst side, this could be a prelude to civil war. When nations lose sight of their shared identity, their shared values, and their shared facts, that's an easy way to get into that situation.

On the upside, I would say tribalism and tribal epistemology, thinking in a group, isn't a new phenomenon in human history. It's as old as civilization. And repeatedly, in civilization, we've actually found ways to ameliorate that, to come together in bigger groups and to overcome it. So, I'm not at all a fatalist who believes that it's all just going to unwind. Because in a way, why hasn't it already unwound in 2,000 or 3,000 years?

And so what “Design Solutions for Fake News” was, was: let's look at this problem of fake news and think about what different ways we could set up either these algorithms or functions in networks like Facebook or Google that would help expose us to different ideas, that would help create incentives for people to be truthful or to be credible. And it would do that in a way that wasn't, “You have a White House press corps pass, so you get a checkmark next to your name,” but that actually takes advantage of some of the great things about the internet. I threw up a Google Doc that had five ideas of mine in there, tweeted it out, just expecting a few friends to join in, and it was really kind of magical. A bunch of people came in and started adding ideas. Pretty soon there were Google engineers in there, Facebook engineers, journalists, and it grew to about 160 pages of sometimes crazy but sometimes really smart conversation about how you could actually design better solutions for this.

What's most remarkable to me, and in a way it answers the question “Should we be hopeful?”, is that no one came along and just erased it all or wrote swear words in it. Somehow, even though anyone could have, it was this open place where people collaborated to come up with a better solution. So that kind of gives me hope: there are places where people can come together and actually think creatively across these divides.

Miles O’Brien: Are there some real solutions there, you think?

Eli Pariser: Yeah, actually, one of the things that Facebook did this week was referenced pretty high up in the document as one of the best solutions.

So, who knows if those two things had any relationship, but it's nice to see some of these things being tried.

Miles O’Brien: Do we make a mistake focusing too much on Facebook here or is that appropriate given its reach?

Eli Pariser: So, yes and no. It's important to remember that most Americans still get their news from local TV news. We're not all on social media yet, and although that's growing, there are many other information systems that are very powerful, including talk radio and cable news. It's easy to overplay the idea that Facebook is primary; in fact, while something like it will be, it isn't yet.

So, let's not forget about the distorting power of some of these other channels that exist to shape what people believe. That said, the future clearly looks like Facebook and not like a cable channel, and reckoning with what's coming is really important.

Miles O’Brien: So, is this something that screams out for a regulatory approach? How do you regulate a multinational corporation operating in 190 countries? Is that even possible?

Eli Pariser: The place you could start is just making Facebook a little less impossible to compete with. One thing you could do from a regulatory approach is to say, “We're going to give people data sovereignty.” What does that mean? It means that the data you put into Facebook, you ought to be able to take out and take somewhere else. If you did that, all of a sudden a whole ecosystem of startups that can't compete with Facebook today, because they don't have years of data, would be able to get that data if they were providing a good enough service.

To me, that would be a game changer. It's also very reasonable: we ought to own the data that we create and put into these systems. So you don't have to hover over Facebook's shoulder, telling it what to do and what not to do, to change the rules of the game in a way that creates better incentives.

Miles O’Brien: I guess a lot of people aren't fully aware that they don't own what's in there, are they?

Eli Pariser: People aren’t and frankly they’re not happy when they learn that, because it does seem like when I write a status update, that should be my intellectual property. It shouldn’t be something that Facebook won’t allow me to take out and move somewhere else.

Miles O’Brien: It’s your cat, goddammit.

Eli Pariser: That’s right. It’s my cat.

Eli Pariser: The difficult and interesting thing about being Facebook is that there's no decision that doesn't have enormous consequences, simply because of the scale. So you can imagine people fretting in four years because the whole political media ecosystem looks so anodyne, because nobody can tell the difference between anyone and anyone else.

And it's all sort of centrist and there are no real ideas. Obviously, we're not there at this moment, and we have moved back and forth between those two things over time. If you look at political media in the 1950s versus 1968, there's a huge shift in the kind of content that people consumed.

So, I don't think there's anything that doesn't have possible negative consequences. But I do think some of the reaction to this trust initiative that Facebook is undergoing sort of sounds like, “You should put news that people say they don't trust in front of them anyway, just to force it down their throats,” and I don't think that's a viable mechanism either. If people don't trust it, they're not going to listen to it, regardless of how loud it is.

Miles O’Brien: Yeah, you mentioned that the pendulum swings, from the ’50s to ’68. Is it possible this is just another swing, or does this feel different?

Eli Pariser: It feels different to us now because of the speed, and because we haven't lived through it before. But I'm not sure that, in the long arc of the relationship between media and democracy, this will feel really different 20 years from now. I think it's just that it hasn't happened this quickly before. The shift in behavior from three years ago to now is what's new.

Miles O’Brien: Time flies when you’re having fun.

Eli Pariser, thank you for your time, and thank you all for listening. Head over to the website if you can; you can always check out what we have going on. You can see the rest of our coverage of the phenomenon of junk news. And you'll never miss a thing if you sign up for our email newsletter–no spam, promise.

We will keep you updated on what happens when the world of science and the world of news collide. No filter bubbles, just the straight facts.

This has been Miles To Go, I’m Miles O’Brien.

Banner image credit: Sebastian Pichler | Unsplash.

