
Whose Best Interest – Can Facebook’s Business Model Be Repaired?


The Cambridge Analytica scandal has placed Mark Zuckerberg and Facebook at the center of the data privacy debate. But is Facebook’s ad-driven business model fundamentally incompatible with protecting users’ personal data? And, if so, what can be done to fix it? Miles O’Brien Productions team members Brian Truglio and Fedor Kossakovski hash it out on this special edition of Miles To Go.



TRANSCRIPT

Miles O’Brien: Hello and welcome to another edition of Miles To Go. I’m Miles O’Brien.

I’m deep in the middle of our investigative series for the PBS NewsHour on misinformation. Producer Cameron Hickey calls all of this “junk news”. It’s a pretty good term if you think about it. Our first piece, which offers up an inside look at the News Feed team at Facebook headquarters in Menlo Park, California, will air on the NewsHour on Wednesday – hope you’ll tune in for that. And if you go to our website, we’ll give you links to see it if you miss it on air.

Today we're going to take a deeper dive into the social network that has succeeded like no other company in the realm of global domination.

Now, before we get to the conversation, do me a favor: take a few minutes, go to the aforementioned website milesobrien.com, sign up for the newsletter. It’s free, we won’t sell your name or email to anybody, we won’t spam you–just one measly email a week and you’ll be on top of the world of science and technology and where it meets news. So, do that, and you’ll feel like you never have to worry about FOMO.

Once you’re done with this podcast, and why not listen to another half dozen or so while you’re at it, by all means rate and review us. We love the feedback. Good, bad… forget the indifferent, just good and bad.

This week, something a little different! We have some smart people who float around the halls here in Boston who have some interesting things to say and we do spend an awful lot of time doing our homework. In a post-fact world, I guess we are anachronisms: we do like to get things right.

So with that I give you two of the brightest stars in the MOBProd universe with a podcast we’re calling: Hash It Out.

Fedor Kossakovski: Alright, hi! Welcome to the inaugural podcast of Hash It Out. My name’s Fedor Kossakovski. I do research and some writing for Miles O’Brien.

Brian Truglio: My name is Brian Truglio. I’m senior editor for Miles O’Brien Productions.

Fedor Kossakovski: So, we’re both huge nerds and are so happy to get to work with Miles on covering science.

Brian Truglio: Before we can edit something down we really have to understand the topic in depth. And Fedor is always the guy that I turn to, he is our in-house science expert, and in the edit suite, I find we have these really interesting conversations.

Fedor Kossakovski: Yeah we hash it out, we hash it out. We try to, at least. We have all these different interviews that Miles gets to be a part of and we get to sit in on or listen and watch. And we have this wealth of information that we work on distilling for you.

But also there's so much that doesn't make it into the PBS NewsHours or the NOVAs, and we thought it'd be pretty cool to share that with you. So here we are. We're going to talk about science topics, and news topics from a science angle.

Brian Truglio: And we’ve never seen a rabbit hole that we can’t dive down. But we endeavor to stay on topic here on these hash it out episodes. And so what is our topic today?

Fedor Kossakovski: This week we are covering Facebook. Seems like everyone is on this grind right now after Zuckerberg’s recent testimony and we thought we’d take a deeper dive into Facebook, their relationship with their users, the users’ data, and the general business model of Facebook. Is it sustainable in this world where people are getting more and more worried about their data being breached?

Brian Truglio: Yeah, over the last few months we’ve been doing a lot of reporting on this. Cameron Hickey and Miles are collaborating on this story. I believe we’re going to be doing four stories for PBS NewsHour, so look out for that.

But in those upcoming stories they're going to focus on junk news and Facebook. We thought that, since Zuckerberg has been testifying over the last couple of days, there were a lot of points that came up in the interviews we conducted for the junk news series that are essential to understanding exactly what's going on with this testimony. And maybe if we can shed some light on a couple of these topics, it might be useful.

I think it's easy to underestimate the scale of what's going on with Facebook. Fedor, how big of a problem do you think this is?

Fedor Kossakovski: I think this is a big problem, mostly because it highlights how little the general public understands how the data is used and managed. Watching Zuckerberg's testimony on the Hill, I myself was learning a lot about how Facebook uses the data. Over and over, Zuckerberg was saying, we don't sell the data, we don't sell the data. And in the end that's true. They don't sell the actual data out to people, but they allow the access: if an advertiser wants to sell skis, as Zuckerberg kept saying–they want to sell skis!–Facebook will send that ad directly to the people who want skis.

Facebook is still creating these profiles using not only stuff that you've shared but also their AI, looking at what you've posted and what you might like. Google's doing the same stuff. All these big tech companies–that's how they're making money. I think the crux of the problem is that people don't quite realize what they signed up to give away. And they did. They did sign up to give it away. People don't quite understand how much of their stuff is out there and available to be used.

Brian Truglio: We did an interview with Roger McNamee. McNamee was an early and big investor. But I think more importantly, he entered Zuckerberg's life when the company was young and they were considering a buyout offer from Yahoo. And McNamee came in and said, “don't do it, don't sacrifice your vision.” And he's the one who brought Sheryl Sandberg into the mix–and he owed her a favor, actually, because she introduced him to Bono, who became a member of his…

Fedor Kossakovski: What kind of world is–I want to have, like, people that owe me favors like that. “Could you just, I need a favor, can you install me as, like, the COO of the next major tech company in the world, please?”

Brian Truglio: Anyway, he’s kind of become one of the gurus of Silicon Valley let’s say and he pointed out that you know Facebook has two point one billion members–two point one billion members–which is almost as large as the number of Christians in the world, OK? So when we’re talking about Facebook, what we think about Facebook might be your friends and family that you share things with and certainly that is what Zuckerberg is playing on and his testimony is constantly reminding you of that. However the reach is enormous. That gives Facebook an unprecedented amount of power.

A lot of people make fun of me because I don’t drink coffee. People have a hard time understanding that. OK. But when the aliens come to enslave us, the aliens will quickly realize that the coffee supply is the universal human weakness that they can exploit. They poison the coffee supply, they kill everybody off but me. And you know basically I’m left to save the human race. Needless to say it probably won’t get saved.

Facebook has now replaced coffee, you know. I did some quick searching to see, well, how many coffee users are there, just out of curiosity? Which, interestingly enough, you cannot find a number on. But what you can find a number on is that there are 2.25 billion cups of coffee consumed every day. So let's say, on average, everybody's going to have at least two cups of coffee. That would be about 1.13 billion coffee drinkers a day. Interestingly enough, fewer than Facebook users.
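[Editor's note: the back-of-envelope math here, written out as a quick sketch using the figures quoted in the episode–2.25 billion daily cups, Brian's assumed two cups per drinker, and the 2.1 billion Facebook members cited earlier by Roger McNamee.]

```python
# Figures as quoted in the episode; two cups per drinker is Brian's assumption.
cups_per_day = 2.25e9       # cups of coffee consumed daily
cups_per_drinker = 2        # assumed average cups per coffee drinker
facebook_users = 2.1e9      # Facebook members, per Roger McNamee

coffee_drinkers = cups_per_day / cups_per_drinker   # ~1.125 billion
print(f"Coffee drinkers per day: {coffee_drinkers:,.0f}")
print(f"Facebook users:          {facebook_users:,.0f}")
print("Facebook is more popular than coffee:", facebook_users > coffee_drinkers)
```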

Fedor Kossakovski: Wow. Wow.

Brian Truglio: So, yes indeed. I think with my armchair statistics here we can say that Facebook is more popular than coffee. But that just gives you the scale of the problem, and I think… I've been constantly joking with Fedor about the number of times that Zuckerberg reminds us that he started this in his dorm room.

Fedor Kossakovski: What is it now, like 18 times over two days?

Brian Truglio: As a new father, I might be a little bit worried about his… his nostalgia for his dorm room–but anyway, I'm not going to speculate on his personal life. But the whole dorm room thing, to me, is not just a fun or innocent reference, I don't think. I think he wants us to keep Facebook at that scale rather than at the larger scale. As Lindsey Graham pointed out, “you are a monopoly.” And he put it to him, he said, “if nobody likes your service, where do they go?”

Fedor Kossakovski: Right.

Brian Truglio: “Who are your, who are your top competitors?” You know, he couldn't name any. But in addition to that fact, there's nobody else sitting at the table with him. It's just him. And I think the point is that, yes, Facebook has a unique kind of monopoly. So the scale of the problem is big, the company is big, its reach is larger than most countries. And it reminds me a lot of big tobacco, or let's say even big oil.

Fedor Kossakovski: Yeah.

Brian Truglio: The difference being, again, that when big oil and big tobacco were called to testify before Congress, there was a table full of companies there. He's it.

Fedor Kossakovski: Does this make Facebook less of a tech company and more of a utility company, almost? I've been thinking about it. They talked about pipes. You know, “we're not just pipes”–the ISPs, the internet service providers, are the pipes, and Facebook is something else. But I would like your thoughts: do you think Facebook is… is it a utility? Should it be regulated like, let's say, an energy grid? Does something like this kind of have to happen? Is the business model all messed up for the consumer?

Brian Truglio: I certainly don't have an answer to that exactly, but I was surprised that Zuckerberg is basically welcoming regulation. I think that's a good sign. But really, the senators and congressmen were struggling to define exactly that question during Zuckerberg's testimony. I think the struggle that's now going to go on is to define the company. It is a unique beast of some kind. The one thing, though, in looking at McNamee's interview that I really wanted to tease out is that he gets at a core conflict that exists in Facebook. It's a specific problem that has to do with the shift that took place once Facebook transitioned from being this place where you were sharing photos and messages with friends and family to the moment they went public in 2012 and had to start earning money.

And when they did that they made the decision to go with advertising. And there is an important shift that took place at that point that I don’t think people understand and that is that they went from being the customer to being the product.

Fedor Kossakovski: The users themselves, right?

Brian Truglio: Yes, the users themselves went from being the customers of Facebook to being the product of Facebook. And we heard Zuckerberg, many times over, talk about how “we don't make money selling your private information,” and that may literally be true, but it's kind of disingenuous in other ways, because the advertising model on Facebook requires precision targeting, and that precision targeting is based on not just your private information but also your online behavior. And that opens up a whole bunch of conflicts of interest, I think, within the company.

There was one interesting thing that I learned looking at an op-ed by a guy named Jonathan Zittrain, a Harvard professor. And in that article he mentioned a guy named Jack Balkin from Yale Law School. Funnily enough, Balkin's name did come up once in Zuckerberg's testimony.

Fedor Kossakovski: That's right. Zuckerberg was kind of pro-Balkin, right? He was like, “Jack is a good writer,” or something like that.

Brian Truglio: Yeah, exactly. His point centers around this concept called fiduciary responsibility. Fiduciary sounds really complicated, sounds really legal. However, it basically boils down to the fact that in certain professions where you have access to highly sensitive personal information, the people who are receiving that information have a responsibility to act in your best interest. This covers doctors, it covers lawyers–you're familiar with attorney-client privilege.

Fedor Kossakovski: It’s dead. Client attorney privilege is dead.

Brian Truglio: Fiduciary responsibility also applies to CEOs of companies, and it is their fiduciary responsibility to act in the best interest of shareholders. When you become the product, Facebook is the receiver of your personal information, so they should be acting in your best interest. However, they are also beholden to shareholders, and they need to operate in the shareholders' best interest. And the problem is that Facebook is now being pulled in two different directions.

Fedor Kossakovski: Right. I mean, those seem like conflicting things, right? It's the same thing that they're looking at–the data is the central thing. They have a responsibility to keep it safe, but also the way that they're going to make money is by selling it. Maybe not directly, as we have found, right? They keep the data presumably secure within their servers, and then advertisers come to them and tell them what kind of person they want, and Facebook will kind of connect those two. You think those two things are not aligned, interest-wise?

Brian Truglio: In my opinion, the crux of the problem is addiction. Advertisers need eyeballs, right? Facebook is trying to maximize the number of eyeballs, but also the amount of time those eyeballs are spending on the page.

Fedor Kossakovski: Right. They want eyeball hours. They want eyeball hours.

Brian Truglio: Exactly. Which means they want to figure out how to keep you on the web page for as long as possible. It's their skill in doing that, by having all of this information about you, your likes, et cetera. The kinds of things that keep us engaged are, as McNamee says, lower-brain-stem kind of stuff: fear, excitement, anger–things that provoke us, things that provoke our passions. More often than not, though, anger is one of the best ways to do it.

Now, McNamee uses the example that he began to notice he had two Facebook friends who he wasn't particularly close with, but who had political ideas that he disagreed with and that constantly got him riled up and engaged. He realized that they constantly appeared high in his News Feed.

Fedor Kossakovski: So he was saying things like how they were up-ranking and down-ranking stuff in the News Feed. Yeah. You're seeing that happen in real time, just because that was going to engage him more?

Brian Truglio: Right. It was going to keep him on the page more, get him engaged more. And regardless of what Facebook says, he says–and it makes sense–there is editing going on in the background. They are controlling what they show you. And they've gotten so good at it that caffeine may have addicted X number of people to coffee, but the kind of fear- and anger-motivated stuff that they're putting in front of you becomes even more engaging and more addicting, and it keeps you on the site longer and longer and longer.
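[Editor's note: for readers who want the mechanics made concrete, here is a toy sketch of engagement-weighted ranking in Python. The post fields, weights, and scoring function are invented for illustration; Facebook's actual News Feed ranking is proprietary and far more complex. The point is only that when the objective is predicted engagement, provocative content naturally floats to the top.]

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    predicted_clicks: float      # model's guess at how likely you are to click
    predicted_comments: float    # how likely you are to comment or react
    predicted_dwell_secs: float  # how long you're expected to keep looking

def engagement_score(post: Post) -> float:
    # Hypothetical weights: reactions and attention are rewarded,
    # so content that provokes strong feelings scores higher.
    return (1.0 * post.predicted_clicks
            + 3.0 * post.predicted_comments
            + 0.1 * post.predicted_dwell_secs)

def rank_feed(posts):
    # Show the highest-scoring (most "engaging") posts first.
    return sorted(posts, key=engagement_score, reverse=True)

feed = rank_feed([
    Post("close friend", "vacation photos", 0.20, 0.05, 8.0),
    Post("acquaintance", "outrage-bait political post", 0.60, 0.50, 45.0),
])
print([p.author for p in feed])  # the provocative post ranks first
```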

The problem comes when this high level of manipulation is then put to use for political messages, to manipulate elections, or falls into the hands of our enemies–let's say when the Russians are using it to basically sow discord and divide us against ourselves, those kinds of things. I know that Zuckerberg spent a lot of time trying to couch everything in terms of “it improves your experience, it shows you just what you want.” OK. That's the positive side. The negative side is that the same tools can be used to very, very bad effect. And I think it's safe to say that our democracy may be under threat because of it. And it just makes no sense that nobody is standing up and saying, look, we accept it: these tools were used for this. This stuff was used in unprecedented ways to harm us. That goes against our mission. We need to step back. We need to fix it.

And I think McNamee feels like nobody at Facebook is realizing how serious this is. All they're presenting are ways to tweak… The entire building is on fire and we are focusing on putting out the flames in one room.

Now ultimately I think these kinds of things run against the mission statement of Facebook: to make the world a better place, to make you happier.

Fedor Kossakovski: Connect people. Right. It's not about connecting people to advertisers. It's supposed to be connecting people to people, right? But that's not really the way that they're making money.

It just sounds like the way that the business is structured does not allow them to have fiduciary responsibility. The way it's set up, where they have to be making money off of your data, means that they can't protect the data totally, right? Because those two things are working in opposite directions. So I think the question here is: does Facebook need to change its business model?

On one of the talk shows, when Sheryl Sandberg was doing the rounds after the Cambridge Analytica stuff broke, someone asked her, “How can you make this better? How could we make sure that users have full protection over that data?” And she just said, “We'd have to make it a paid service.” And it's interesting, because that was the only time a C-suite executive has said that, and Mark Zuckerberg won't say it. He thinks it should stay free; otherwise he can't connect the world.

And I found it interesting–I was listening to NPR, I was listening to Marketplace, and Kai Ryssdal had done a poll on Twitter of his followers: “Would you pay for Facebook?” Over 4,000 people voted and 87 percent said they would not pay, even though you're using it almost all day, every day. What are your thoughts on that? Would people be willing to pay for it if it suddenly switched over? Or should it be like a tiered system? Is that the right way?

Brian Truglio: I really don't see any other option, to be honest. One thing that disturbs me–and it has disturbed me since the beginning with Facebook–is that every time I get a window warning me about some kind of sharing of information, it doesn't matter to me how they spin that sharing of information. I know they're putting it there because there's some dubious way they want to be legally protected, because they're not using your data in a way that you would like if you found out about it. So the window might say, “We're doing this so that you can have a better experience,” but in reality it's just saying, “We're doing this because we're kind of screwing you over.”

And when you think about genuine friendships, right, you don't call me up and say, “Oh yeah, before we continue this conversation, would you mind signing this document? Because the condition of our friendship is that I might want to share some of this information with people who hate you, or whatever.” You would instantly be suspicious of what was going on, and you would probably say, “No way! Fedor, I'm not going to be your friend. OK? I don't want to talk to you again.”

Fedor Kossakovski: Aw, come on Brian. I’m just gonna share the data. I’m connecting people. Let me connect you with my best friend Expedia.

Brian Truglio: As much as I hate the cable subscription model, I don't really see any other way to go about it. The thing is that Facebook managed to create a highly engaging platform in which to share social experience online. They can certainly apply that same creativity on the business side of things and come up with other ways to make money that don't put a central conflict at the heart of their mission statement.

And even just a cable-style subscription model–either something tiered, where you've got different levels of access, or different channels on Facebook that you could pay for–I think it could be done. And they've run the same survey that Kai Ryssdal has, I'm sure, and they know that they would lose a lot of users. But if they had a free version and a paid version, and they made it clear on the free version, “we're sharing your personal data; if you don't want to see ads or have us share your personal data, pay us five to ten bucks a month”–it doesn't bother anybody for Netflix, right?

Fedor Kossakovski: Yeah. You know what this all reminds me of, actually? We’ve done some corporate work for this telecoms company–VEON.

They're a large telecommunications company; they provide mobile phone service in Pakistan, Italy, the Netherlands, Russia–mostly overseas stuff. Facebook is kind of like this big platform, and they're a utility that needs to make money somehow to run itself, right? And so they have to sell your data.

Similarly, telecoms companies provide all of the connectivity, and then people download apps using their network, and the telecoms don't get a single cent off of that, right? We were talking to the chief digital officer there, Christopher Schlaeffer, and he was one of the first people that brought the Android platform into reality, back in, I think, 2008. And he was talking about how Steve Jobs, with the iPhone, actually revolutionized not only how mobile technology works but also how phones and app stores work with the telecoms companies: when Steve Jobs was pitching the iPhone, he wanted to get a cut of every iPhone bill from the network.

He got all these concessions from what was Cingular at the time, which is AT&T now, right? And they had an exclusive deal–for five years, I think–to be the only provider with an iPhone. Apple also made money off of the actual telecoms bill, right? And the telecommunications people weren't able to negotiate a thing where they got revenue sharing from the Apple store. So they missed out on a huge ecosystem of money that is being produced using their service, using their infrastructure. And because of that, telecommunications companies are forced to also collect metadata and sell it. It's the same issue.

And so Veon is actually moving to this thing where they've set up their own app store, let's say–it's also called Veon–but basically, if you are using their phone, using their telecommunications infrastructure, you can use their app store for free and go and browse the internet for free, call your friends for free. They're setting up this whole community that they can better profit off of without having to sell ads. It's ad-free.

Brian Truglio: So how do they make money?

Fedor Kossakovski: They do revenue sharing with companies that get apps onto their platform. And so they have a music streaming service that they partnered with and you know they get a cut of the music streaming profits off of that.

This other kind of business structure shifts it back from the user being the product to actually being the consumer again. Right. And it can be a little scary that this one big company, Veon or whoever, is still collecting your data–that is definitely true. But because they don't have this incentive to sell that data, they're actually implementing sort of blockchain-type technologies that anonymize your data while still letting it interact with apps–giving Uber, or whatever, just what it actually needs to call the car and bring you here or there. It knows your credit card information, but only for that transaction and only for that amount of time. And then it just disappears, and they can't even see it.

It's like how WhatsApp is encrypted on both ends, right? That also came out in the Facebook testimony, if people didn't know about it: Facebook owns WhatsApp, and WhatsApp uses what's called end-to-end encryption, where they can't look at what your messages are. It's just the way that the software is set up; the math behind it makes it impossible.
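[Editor's note: a minimal illustration of the end-to-end idea, using the PyNaCl library as a stand-in. WhatsApp actually uses the Signal protocol, which is far more sophisticated; this sketch only shows why a server relaying the ciphertext can't read it, because the private keys never leave the two endpoints.]

```python
# pip install pynacl
from nacl.public import PrivateKey, Box

# Each device generates its own keypair; the private keys never leave the devices.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts with her private key and Bob's public key.
sending_box = Box(alice_key, bob_key.public_key)
ciphertext = sending_box.encrypt(b"meet at 6?")

# A relay server only ever sees `ciphertext`; without a private key it can't decrypt.

# Bob decrypts with his private key and Alice's public key.
receiving_box = Box(bob_key, alice_key.public_key)
print(receiving_box.decrypt(ciphertext))  # b'meet at 6?'
```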

So that's what Veon is trying to do for all of their communications, and I think that's really a way that Facebook could try to go. I mean, can they provide this service without charging you for it? Probably not, right? They won't be able to switch over to a revenue-share model with all their app providers, because most of their app stuff is free too–your integrated Facebook stuff is free as well. But that's something to think about. We shouldn't resign ourselves to the idea that this data is out there and available forever.

I'm kind of OK with all this data being out there, because I'm very careful about what I put out there. So I feel less bad when breaches like this happen, because I don't sign into stuff using Facebook, and I don't post stuff on Facebook except, mostly, work stuff. So that's fine for me, but I'm also more cognizant of it. And I think there are ways to do this–they would be monumental shifts, but they would be beneficial for everyone.

Brian Truglio: I like the idea, too, of data expiring. From day one, I basically assumed that anything that went out over the Internet was public–the Internet is public space, more or less, and I think you have to treat it like that. Someone once told me, “Don't write anything online that you wouldn't want to see published on the front page of The New York Times tomorrow.” I think that's a good rule to go by. I believe very strongly in privacy. And I think it's a generational thing, too. I don't think your generation is as concerned with privacy. And now the wake-up call is here.

In a way, we should be compensated for all of the information that they're using to profit off of. I don't think that's an unfair thing to consider. But likewise, if you value your personal data, if you value your privacy, then you also need to protect it. And if you do protect it, then it becomes valuable–possibly to the point where people will give you a cut for whatever information of yours they use.

I still think the problem can be solved. They can figure out other revenue streams. It may be a step back for them, you know. Not every way of making a profit is a legal or ethical way, and that's why regulation exists in the first place, in my opinion. I mean, the government is there to protect its people. Facebook is now essentially a multinational, and we've seen this in the past–maybe the oil and gas industry is the nearest parallel–but the only thing that you have to protect you from a giant multinational is your government. If your government isn't on your side, what chance do you have? And I think most of the hearing was kind of sounding out exactly how we should regulate it, basically. The surprise for me was that Zuckerberg admitted and welcomed regulation.

Now, of course, there are two sides to regulation, because regulation can also cement their place. And I think, as Lindsey Graham pointed out, it's hard to make an argument that they're not a monopoly. They don't really have competitors. Friendster and MySpace and all those went by the wayside a long time ago.

The funny thing is that a lot of this could be a moot point, because the European Union is bringing in a set of regulations called the GDPR, the General Data Protection Regulation. Those regulations mean that it has to be very clear to the user how their data is being used, so you can no longer submit a 5,000-page document that nobody understands. It has to be clear and simple. And then there are serious penalties as well.

Fedor Kossakovski: Fines of up to 24 or 25 million dollars per infraction. So now you add all that together: trillions of dollars.

Brian Truglio: And this applies not just to the countries where the businesses are located; it applies to every EU citizen. And that is really powerful, because on May 25th, Facebook will be implementing these new policies for all European Union users. But to be safe, they're also going to implement them across their entire platform, anywhere in the world, which means we are all going to get the benefit of the European leadership on this. Listening to this, it was sort of like: you guys are having this debate here and these hearings, but the EU is already light years ahead of you. And in the end, they might be the ones protecting everybody here.

Fedor Kossakovski: Miles brought that up on air when he was reporting on this. It’s just unclear when that’s going to be rolled out for the rest of the users.

Brian Truglio: Alright. So I do want to go back to addiction, because I do think it's another big problem that we're dealing with. And again, to go back to McNamee, he basically says their algorithms, which are driven by artificial intelligence, have gotten way too good at manipulating us. They have this kind of pipe into your brain, and they know exactly what buttons to push in order to keep you on the site. And I'm going to go back again to the tobacco industry and why these hearings remind me of, and I think have a parallel to, the old tobacco industry testimony. When those hearings started for the tobacco industry, probably 90 percent of the people in that room were smokers, right? So it was not necessarily news that they were really ready to wake up to, because they were users.

In the same way, everybody in that room yesterday was a user of Facebook–a pretty high percentage, anyway. All of us listening are users of Facebook. I mean, we are all addicted to Facebook. OK, like it or not: you wake up in the morning and you're checking Facebook before you go to the bathroom. And then after. And then on your way to work. And then while you're at work. And then on your lunch break. At some point we have to call this what it is.

This is a product that has a high level of addiction. Part of the reason I think everything was so laughy-jokey at this testimony and the reason that Zuckerberg could sit there more or less with a smile on his face the whole time even while he was admitting to massive shortcomings is that he knows we’re all addicted you know. Like, he knows we don’t have anywhere else to go.

Fedor Kossakovski: Sorry I was I was checking Facebook I’m sorry. What are you saying?

Brian Truglio: One thing that might help–and again, this was brought up yesterday–is that we might need some sort of independent auditor who is hired simply to look inside the algorithms.

These algorithms are opaque. We have no idea how the AI and the algorithms are actually taking and processing data and what they're spitting out. We need some sort of independent auditor who has complete access to all these data-gathering algorithms, who can look inside and say, OK, what is it actually doing? What are the consequences of what it's doing? Are they ethical? Are they legal? We have independent auditors financially. In the same way, we can create an algorithm-auditing department.

Fedor Kossakovski: Yeah, we brought this up with Cameron. I agree. I mean, it's very important to understand how the algorithm works–does the way it sorts data make legal or ethical sense? These AI algorithms, though, a lot of the time they change themselves.

And that's the whole idea of AI. But the way that they operate is off of the initial data sets that you give them, right? So I think another important part of this is not only how the algorithms work, but what the hell are we giving these machines to train on? Are we giving them a lot of conservative content and then saying, “Hey, this might not be great content,” and training them to block that? Are we doing the same thing for liberal stuff? How do we differentiate? Who gets to pick?

That's something that is a big discussion right now in all of AI and machine learning. And I think that needs to be regulated, because with what you're feeding these machines you can be coding in biases from these surprisingly non-diverse technology companies, where you walk in and it's mostly white dudes working on these algorithms.
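[Editor's note: a toy sketch of the training-data point, with made-up labels. The "model" below just learns whatever pattern is in the labels it's given, so if the labelers flag one side's content as junk more often, the filter inherits that skew. It is not any real company's system.]

```python
from collections import Counter

# Hypothetical training labels: suppose labelers flagged one group's posts
# as "junk" more often than the other's, regardless of actual quality.
training_data = [
    ("conservative", "junk"), ("conservative", "junk"), ("conservative", "ok"),
    ("liberal", "ok"), ("liberal", "ok"), ("liberal", "junk"),
]

def train(data):
    # A deliberately trivial "model": remember the most common label per group.
    by_group = {}
    for group, label in data:
        by_group.setdefault(group, Counter())[label] += 1
    return {group: counts.most_common(1)[0][0] for group, counts in by_group.items()}

model = train(training_data)
print(model)  # {'conservative': 'junk', 'liberal': 'ok'} -- the labelers' skew, learned verbatim
```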

Brian Truglio: And young people too so.

Fedor Kossakovski: Whoa, whoa, whoa….

Brian Truglio: I think this is a lot of territory that, yeah, they'll be talking about in these upcoming junk news reports for PBS NewsHour.

Fedor Kossakovski: Basically if you just go to milesobrien.com right now a lot of what we’re posting and looking at and talking about is this stuff and I think it’s just going to be more and more once we get this junk news series going.

So did we hash this one out?

Brian Truglio: Have we hashed it out? I think, well…

Fedor Kossakovski: We tried.

Brian Truglio: I think we raised more questions than we answered but I think that…

Fedor Kossakovski: To our one listener: tell us what you want us to try to hash out next. If you have any suggestions, if something’s on your mind, if you see something in the news and you don’t quite understand it from a science-y perspective, want to understand how it works, we’ll try.

Brian Truglio: Otherwise it’s going to be all nuclear power.

Fedor Kossakovski: Yeah, if you don’t tell us what to do it’s just going to be different types of thorium fuel cells and shit like that.

Brian Truglio: Dude you can’t say the S word in a podcast.

Fedor Kossakovski: Ahh, we’ll cut it out.

Brian Truglio: Nice work Fedor, until next time.

Fedor Kossakovski: Yeah. See ya.

Miles O’Brien: Thank you Fedor, thank you Brian, and thank you for listening.

Ahead as we continue our deep dive into the world of junk news, a conversation with Jonathan Albright of the Tow Center at Columbia University. He’s one of the smartest people studying news and stuff purporting to be news, and mapping how it ricochets around the interwebs. It’s a fascinating conversation. That’s next time on Miles To Go. I’m Miles O’Brien–thanks for listening.

Banner image credit: www.shopcatalog.com.
