After the 2016 U.S. presidential election, it became clear to a lot of people, including me, that the issue of fake news on the internet merited some deeper discussion.
As the major intelligence agencies revealed that Russian actors had tried to sway the election with an organized campaign of online misinformation, we were conducting our own investigation.
Producer Cameron Hickey–a coder long before he ventured into video journalism–built a software tool he called NewsTracker to try to understand how fake news, misinformation, and other viral content move through the internet. We began calling all of this “junk news,” an umbrella term for well-packaged, snackable content with little nutritional value. It covers everything from slightly misleading clickbait to outright lies and malicious disinformation.
Cameron identified some promising leads and we secured some funding from the Knight Foundation to continue our deep dive. We brought the idea to the PBS NewsHour for a series of reports, and were approved for four pieces.
For more than a year, we followed the stories and met some very interesting people caught up in the rapidly developing junk news ecosystem. We ended up with so much material that four NewsHour segments didn’t feel like enough, so we decided to use my podcast, Miles To Go, to get more of that content out there.
Here’s what we found.
In our first installment, we went to Facebook, the disseminator of most of this junk news. After much cajoling, we managed to get Facebook to participate, and they gave us some insights into the challenges they face in dealing with junk news on their platform.
In that first segment, we met Dan Zigmond, head of analytics for Facebook’s News Feed. Our interview with him was particularly insightful. We dove deeper with Zigmond in a Miles To Go podcast episode, in which we learned how Facebook tries to identify and downrank problematic content.
As we were putting together these junk news pieces, Facebook CEO Mark Zuckerberg was testifying to Congress following revelations of the Cambridge Analytica data breach scandal. To put this conversation into perspective, MOBProd team members Brian Truglio and Fedor Kossakovski discussed the idea that many of Facebook’s issues stem from its business model–and what can be done about it.
In our second installment, we used Cameron’s NewsTracker tool to find Cyrus Massoumi, a prolific purveyor of hyper-partisan content. His is an interesting trade: crafting clickbait for both liberals and conservatives and finding ways to game Facebook’s algorithms to reach the largest possible audience. Millions of people follow his work, which suggests that perhaps quality content isn’t what we actually look for in our Facebook News Feeds (even if we say otherwise!).
Providing context for Massoumi’s endeavor was danah boyd. As a Principal Researcher at Microsoft Research and founder of Data & Society, she has been studying the spread of misinformation online for years. Here’s our podcast episode with her.
In the third installment of our NewsHour series, we visited two of Massoumi’s readers to understand how they interact with junk news on Facebook. The conservative reader, Betty Manlove, is actually Cameron Hickey’s grandmother, whom he identified early in his NewsTracker investigation as a super-consumer of hyper-partisan fare. The liberal reader, Gabe Doran, is Cameron’s neighbor. Although they have very different ideas about politics, we found that the ways they interact with Facebook and its junk news content are quite similar.
Data journalist Jonathan Albright helped us understand how Russian propaganda spreads on the internet–helpful context when you consider how hard it is to distinguish propaganda from unbiased reporting on a platform where all content looks the same. You can listen to our full interview here:
Eli Pariser, who coined the term “filter bubble,” also appeared in that NewsHour segment. Both Manlove and Doran said their views were not swayed by their use of Facebook, only cemented, so Pariser’s insights into how we each craft a personal information landscape that reinforces our preexisting beliefs offer a useful way of approaching the issue.
For the final installment of our junk news series, we looped back to Facebook to see the approaches they are currently developing to bury spam deep in users’ News Feeds. They also wired me up for an eye-tracking experiment–the results made clear that early versions of their fact-checking labels were not very useful.
One of the people tasked with keeping our News Feeds from feeding us junk is Tessa Lyons, head of News Feed integrity for Facebook. In her interview, she led us through why Facebook is better off ranking junk low in your News Feed instead of deleting it outright. The obvious follow-up question: does that make Facebook a media company?
At this point, Snopes.com is a household name, synonymous with fact-checking and truth. Recently, they were recruited by Facebook to help identify and label misinformation and other junk news on the platform. This has been a difficult challenge and one with dubious results, as we found out in conversation with Brooke Binkowski, managing editor of Snopes.com.
As a final debrief, I sat down with series producer Cameron Hickey to discuss the whole junk news project: How and why he built the NewsTracker software, the pros and cons of interviewing your own family, and his plans for the future. He will continue investigating the phenomenon of junk news at the Shorenstein Center at Harvard University, and we couldn’t be prouder. We wish you all the best, Cameron!
Banner image credit: Cameron Hickey | PBS NewsHour, edited by Fedor Kossakovski.