
Fake News and the Post-Truth Era: List of Resources


Class Activity
Watch one or more of the videos with your students, then use the Prompt Questions to facilitate a class discussion. 

"While companies like Facebook, Twitter and Google promise they will take steps to reduce "fake news," Michael J. Casey and Oliver Luckett, authors of "The Social Organism," argue the first step is an overhaul of the companies' algorithm-based platforms to make them more transparent."
  • Drawing from this video and what you have learned in introductory biology (and other) courses, do you think the analogy of social media as a living organism is fitting? Epidemiologists and other medical experts talk about ways to contain a virus that is spreading. Do you think there’s a way to contain the spread of fake news? Personally, what can you do? What can your community do?
  • Luckett discusses the idea of training the body to recognize a pathogen as foreign and react appropriately. Using the mind/body analogy, what can we do to ‘train the brain’ to recognize fake news? You have likely heard your professors talk about “critical thinking skills.” What role can the development of these skills play in detecting fake news?
  • What are the moral and ethical ramifications of maintaining proprietary control over the algorithms currently used by profit-generating companies like Facebook? Should these algorithms be “opened up,” as Casey suggests? Is there a moral obligation to do so? Or is that placing too great a financial and legal burden on social media companies?
  • Casey cites an example from Riot Games in which an “immunotherapy approach” is used to define and delineate hate speech in order to reduce its occurrence. Do you think this was a fair way to go about reducing hate speech? What about freedom of speech? What new doors might this so-called immunotherapy approach open in the discussion of the line between hate speech and freedom of speech?
  • Toward the end, Luckett says, “Evolution is not always progress.”
    Discuss! :-)
"Ryan Holiday, the author of, "Trust Me, I'm Lying," shares a bit about how he has manipulated media to get bogus, anonymous stories to the front-page of news media outlets."
  • Holiday mentions that some blog sites have a “low threshold” for what they will and will not publish. What do you think might define a “low threshold”? How would you describe a blog (or other news source) that, in contrast, has a higher threshold? What might that higher threshold entail?
  • Holiday mentions how he has sent news media outlets “fake anonymous emails” and then watched as the resulting story spread and grew in strength. What does this say about the inherent danger of the so-called ‘anonymous tip’? Should news outlets have a blanket policy of refusing to act on such tips? What effect could that have on the flow and sharing of information in a democratized society?
  • Toward the very end, the narrator asks two very good questions, and Holiday gives his answer. What do you think about his answer?
  • "If you’re not paying for it, you’re not the customer, you’re the product.”

    Discuss! :-)

"Trevor Noah, host of the Daily Show, has told BBC Hardtalk’s Zeinab Badawi that factual accuracy is the base of his best jokes."
  • Noah says, “The best jokes are based in truth.” Drawing from what you have learned in introductory psychology or sociology (and other) courses, would you agree or disagree?
  • Do you think it’s possible, or advisable, to operate in a space that is, as Noah explains, “completely neutral, devoid of all opinion, and giving everybody an equal platform to share their views”?
"Syria still in turmoil but the other political realities turned upside down amid much talk of fake news -- and post-truth politics. In first of a series looking at how the world changed in 2016 here's our special correspondent Allan Little."
  • In the opening, the narrator describes a “…shared public reality, within which they [readers] can disagree, dispute and challenge each other.” What do you think defines a “shared public reality” today? What is (or should be) the news media’s role in shaping that reality? What is (or should be) the role of various social media platforms in shaping that reality?
  • How might the aforementioned “shared public reality” conflict with what the narrator goes on to describe as “two parallel public realities”? How do you think this split occurred?
  • “Cognitive bias” has been a buzzword for a long time, and one form of cognitive bias is confirmation bias. Drawing from this video and what you have learned in introductory psychology or sociology (and other) courses, what seems to be the relationship between cognitive bias and fake news? Have you ever struggled with cognitive bias? What are some ways we can combat the effects of cognitive bias when consuming news?
  • What is (or should be) journalism’s role in a democracy?
  • Toward the end, the narrator poses the question, “Who in the new media landscape is to police what’s valid and what’s fake, what’s true and what’s post-true?”
    Discuss! :-)
"Washington Post reporter Wesley Lowery says the social media giant isn't excused from making responsible editorial choices just because it wishes to see itself as a technology company first. Lowery's book is "They Can't Kill Us All: Ferguson, Baltimore, and a New Era in America's Racial Justice Movement"."
  • If Facebook has the ability to stop fake news from being spread on its platform, why do you think it had not exercised that ability, at least at the time this video was made?
  • Lowery mentions an “editorial infrastructure” that helps to maintain a check and balance on what’s true and what’s not. Should large social media platforms like Facebook maintain such an infrastructure? Should it be their responsibility? Or are these platforms there simply to provide a forum for communication for their users, however chaotic that forum may be?
  • Lowery says, “As soon as they [Facebook] begin playing that role at all, they now take on, I believe, a responsibility to curate this content.” Do you agree or disagree? Why? If you agree, how do you think Facebook could accomplish this? What steps can they take? If you disagree, whose responsibility, if anyone’s, should it be to curate the content that propagates via Facebook?
  • Towards the end, Lowery states, “When you choose to publish something on your platform…you are making an editorial decision to allow it to exist in a space.” How does this statement fit in with what Trevor Noah talks about in the BBC interview video?      
"This teen says the secret to creating viral hoaxes is to tell people what they want to hear — and to throw in a little Justin Trudeau."
"It's nothing new, and it didn't swing the election. "
"Do children's digital fluency allow them to distinguish between fake news and real news online? WSJ's Sue Shellenbarger has surprising results of a study of nearly 8,000 students (from grammar school through college) that tested their ability to tell news from ads and to discern websites from hate groups and mainstream professional organizations."
  • Have you ever read a story on social media that scared or shocked you? How did you react? Reflect back on that experience. Does what you read still scare you, or did you learn later on that it wasn’t anything to worry about? If your feelings about that story changed, how and why do you think they changed?
  • Shellenbarger cites a statistic from the Media Insight Project stating that by the age of 18, 88% of young adults are getting their news from Facebook. Are you one of those 88%? Be honest! What are some other, more credible and reputable news sources that you can consult to double-check a sensational story on Facebook or another social media platform? (Not sure? Check out this link: http://libguides.columbiasc.edu/fakenews/evaluating.)
  • What steps does Shellenbarger recommend that parents take to teach their children to critically evaluate sources of news? Could these recommendations be good for everyone? If you could tweak or adjust her recommendations for you and your peer group, what adjustments, if any, would you make?
"Twitter, Facebook and Google are taking steps to reduce fake news, misinformation, and harassment on the internet after users expressed concerns that false news stories and hate speech fueled divisiveness in the recent presidential election campaign."
  • Zuckerberg defended Facebook by saying, “Less than 1% of the site’s [Facebook’s] worldwide content could be classified as fake. Overall, this makes it extremely unlikely hoaxes changed the outcome of this election in one direction or the other.” It could be argued that Zuckerberg is taking a strictly quantitative approach to the problem, stressing how small the percentage actually is. But what about a qualitative approach? Considering how influential Facebook is, could that 1% of fake content have been more influential than he is giving it credit for? Or do you agree that 1% is too small a number to worry about?
  • What separates “hate speech” from “free speech”? Do you think it’s possible for computer software, such as AI (artificial intelligence) or an algorithm, to truly and accurately detect hate speech and misinformation? Or is that ultimately a task that will (or should?) fall to real people?
  • Do you agree with Google’s decision to ban fake news websites from using Google’s ad-selling system, which will likely hurt their revenue? It could be argued that these websites are, like any other company, just trying to make money. What if you were an employee of one such website, with a family to support? Would you want to see your employer’s revenue take a hit?
"Major corporate brands are, often inadvertently, placing the ads that fund the growing number of web sites that peddle fake news online. WSJ's Lee Hawkins explains."
  • Much of what Hawkins is reporting on is beyond the control of the average person. But what are some steps and actions that you can take to educate yourself and others about fake news and hate speech?
  • Why do you think the invention of a “truth filter technology” has eluded modern society thus far? What makes this such a challenging task? Do you think there will ever be such a thing? If you were put in charge of a Truth Filter Technology Task Force, how would you go about it? What actions would this filter take to identify untrue information? Once identified, should your filter then recommend corrective or punitive measures?
"Facebook says its artificial intelligence know-how could eventually be a key to stamping out the fake news that critics say has infused the social media network. WSJ's Lee Hawkins explains."
  • Scenario: You graduate with honors from Columbia College (Go you!!) and are immediately hired by a social media Artificial Intelligence company as a researcher. Your first task is to come up with an answer to the following questions: “What’s the trade-off between filtering and censorship?” and “How can filtering technology be introduced in a responsible way?” Discuss! :-) 
"Hillary Clinton calls fake news an "epidemic" that is putting lives at risk."
  • Have you ever known anyone who has been the subject of a fake news story or rumor? How did it impact that individual? Were their friends and family affected? Were they able to ‘set the story straight’? If so, how did they go about it and how long did it take?
  • What are some other, more credible and reputable news sources that you can consult to double-check a sensational story on Facebook or another social media platform? (Not sure? Check out this link: http://libguides.columbiasc.edu/fakenews/evaluating.)

  • Infographic: Fake News Is A Real Problem | Statista
  • Infographic: Fake News Stories Are a Problem - But Who's to Blame? | Statista
  • Infographic: Where People Trust The News Most And Least | Statista
  • Among Millennials, Facebook Far Exceeds Any Other Source for Political News
  • Understanding The Fake News Universe

Barclay, D. A. (2017, January 4). The challenge facing libraries in an era of fake news. The Conversation. Retrieved from https://theconversation.com/the-challenge-facing-libraries-in-an-era-of-fake-news-70828

Cellan-Jones, R. (2016, November 27). Facebook, fake news and the meaning of truth. BBC.com. Retrieved from http://www.bbc.com/news/technology-38106131

Domonoske, C. (2016, November 23). Students have 'dismaying' inability to tell fake news from real, study finds. National Public Radio. Retrieved from http://www.npr.org/sections/thetwo-way/2016/11/23/503129818/study-finds-students-have-dismaying-inability-to-tell-fake-news-from-real

Ember, S. (2017, April 3). This is not fake news (but don’t go by the headline). The New York Times. Retrieved from https://www.nytimes.com/2017/04/03/education/edlife/fake-news-and-media-literacy.html?_r=0

Herrman, J.  (2016, December 22). Facebook’s problem isn’t fake news — It’s the rest of the internet. The New York Times Magazine. Retrieved from http://www.nytimes.com/2016/12/22/magazine/facebooks-problem-isnt-fake-news-its-the-rest-of-the-internet.html

Isaac, M. (2016, December 15). Facebook mounts effort to limit tide of fake news. The New York Times. Retrieved from http://www.nytimes.com/2016/12/15/technology/facebook-fake-news.html

Leetaru, K. (2016, December 12). How data and information literacy could end fake news. Forbes. Retrieved from http://www.forbes.com/sites/kalevleetaru/2016/12/11/how-data-and-information-literacy-could-end-fake-news/#39b9619b3335

Najmabadi, S. (2016, December 12). How can students be taught to detect fake news and dubious claims? The Chronicle of Higher Education.

Najmabadi, S. (2017, February 26). Information literacy. The Chronicle of Higher Education.

Robins-Early, N. (2016, November 22). How to recognize a fake news story. The Huffington Post. Retrieved from http://www.huffingtonpost.com/entry/fake-news-guide-facebook_us_5831c6aae4b058ce7aaba169

Swaim, B. Who’s to blame for fake news? America’s real newsrooms. The Washington Post. Retrieved from https://www.washingtonpost.com/opinions/the-rise-of-fake-news-is-an-indictment-of-americas-real-newsrooms/2016/12/12/9ccd7ac2-be52-11e6-94ac-3d324840106c_story.html?utm_term=.6e407f5a65d8

Sydell, L. (2016, November 23).  We tracked down a fake-news creator in the suburbs. Here's what we learned. National Public Radio. Retrieved from http://www.npr.org/sections/alltechconsidered/2016/11/23/503146770/npr-finds-the-head-of-a-covert-fake-news-operation-in-the-suburbs

Zamudio-Suaréz, F. (2016, December 22). A professor once targeted by fake news now is helping to visualize it. The Chronicle of Higher Education.

Barthel, M., Mitchell, A., and Holcomb, J. (2016, December 15). Many Americans believe fake news is sowing confusion. [Report]. Pew Research Center. Retrieved from http://www.journalism.org/2016/12/15/many-americans-believe-fake-news-is-sowing-confusion/  

Stanford History Education Group. (2016, November 22). Evaluating information: the cornerstone of civic online reasoning. [Report]. Retrieved from https://sheg.stanford.edu/upload/V3LessonPlans/Executive%20Summary%2011.21.16.pdf

  • Snopes Field Guide to Fake News Sites and Hoax Purveyors
    Snopes.com's updated guide to the internet's clickbaiting, news-faking, social media exploiting dark side.
  • Hoaxy
    Visualize the spread of claims and fact checking.
  • FactCheck.org
    "...a nonpartisan, nonprofit “consumer advocate” for voters that aims to reduce the level of deception and confusion in U.S. politics."
  • PolitiFact
    "...a fact-checking website that rates the accuracy of claims by elected officials and others who speak up in American politics."
  • Society of Professional Journalists: Code of Ethics
    "Ethical journalism strives to ensure the free exchange of information that is accurate, fair and thorough."