What Is Misinformation? How False Information Spreads in the Digital Age

Brandon Stover
Network Manager
May 8, 2026 · 13 min read

We are living through one of the most chaotic information environments in human history. Every day, millions of people scroll through headlines, viral videos, political commentary, memes, podcasts, and breaking news — all competing for our attention, emotions, and beliefs. In that flood of information, it has become increasingly difficult to separate what is true from what is manipulative, misleading, or entirely fabricated.

Misinformation refers to false or inaccurate information that is shared without malicious intent. The person posting it may genuinely believe it is true and pass it along to others in good faith.

Disinformation, on the other hand, is false information spread deliberately to deceive, manipulate, gain power, influence political outcomes, generate outrage, or profit financially. 

While the two are different in intent, both can distort public understanding and erode trust in institutions, media, elections, science, and even each other.

The challenge is not just that false information exists. Lies, propaganda, and conspiracy theories have always existed. What has changed is the speed, scale, and sophistication with which information now spreads. Social media algorithms can guide users toward increasingly extreme content. Generative AI can create convincing fake audio, video, and articles. Political actors, influencers, niche online communities, and even entertainment platforms can all shape public perception in ways that blur the line between fact and opinion.

Yet many experts argue the deeper problem is not technology alone. It is the gradual breakdown of shared trust, civic bonds, media literacy, and our collective ability to agree on what is real in the first place.

In this article, experts from across politics, journalism, cybersecurity, technology, academia, and democracy advocacy explain how misinformation and disinformation spread, why people believe it, how social media and AI amplify it, and what individuals and institutions can do to fight back. The insights and quotes throughout this article come from podcast interviews with researchers, authors, political leaders, technologists, and media experts who study the modern information ecosystem firsthand.

Understanding misinformation is no longer optional. In a world where information itself can be weaponized, learning how to critically evaluate what we consume may be one of the most important civic skills of the modern era.


What Is Misinformation?

First, let’s get a clear picture of what misinformation is and how it differs from related phenomena such as disinformation, propaganda, and fake news. Pauline Hoffmann, also known as the Data Doyenne, is an esteemed infodemiologist and the author of Fake News, Witch Hunts, and Conspiracy Theories: An Infodemiologist’s Guide to the Truth. She explains the difference between misinformation and disinformation.

Pauline Hoffmann:

“It could essentially be the same piece of information, but the difference is the intent.
Disinformation is when someone creates a piece of information for some kind of gain — political gain, power, money, or economic advantage.
Misinformation is when you might see that piece of information and think it’s true, so you share it with your followers or with somebody else. You are not doing it for any malicious reason. You’re simply sharing it because you saw it from a trusted source and believed it to be true.”

Listen to the full episode on Future Hindsight: Understanding the Infodemic: Pauline Hoffmann

American author, journalist, and activist Jonathan Rauch further describes disinformation as a form of information warfare.

Jonathan Rauch:

“Disinformation, or propaganda, or what’s often called information warfare — or what I call epistemic warfare — is about manipulating the social, media, and information environment for political advantage.
What you’re trying to do is divide, disorient, confuse, deceive, and ultimately demoralize your target population.”

Listen to the full episode on Village Squarecast: A Defense of Truth with Jonathan Rauch

Tactics of Misinformation & Disinformation

Now let’s look at some of the tactics used to create misinformation and disinformation — or to make the information environment so untrustworthy that falsehoods become more prominent than relevant, factual information.

Firehose of Falsehood

Jonathan Rauch describes a phenomenon known as the “firehose of falsehood,” a strategy that floods the information landscape with misleading or false claims, making it difficult for people to identify reputable sources.

Jonathan Rauch:

“[The playbook is] to adapt Russian-style mass disinformation techniques, like the so-called firehose of falsehood, and apply them to U.S. politics.
What’s the firehose of falsehood? That’s where you put out so many lies, exaggerations, half-truths, and conspiracy theories through so many channels simultaneously that people can’t keep up.
The media goes nuts trying to knock these things down, but two come, then three come, then a dozen more. Every time you knock one down, the claims are repeated again and again, so they soak into people’s heads.
The public becomes confused and disoriented. They no longer know who or what to believe because they’re constantly being hit with so many of these things all the time.”

Listen to the full episode on Village Squarecast: A Defense of Truth with Jonathan Rauch

Suppressing Information

Once the information environment becomes overwhelmed with false claims, additional efforts are often made to suppress factual information.

Bruce Schneier, a security technologist and New York Times bestselling author of Rewiring Democracy: How AI Will Transform Our Politics, Government, and Citizenship, explains how the same technology that could support an information-rich democracy can also be used to create an information-scarce autocracy.

Bruce Schneier:

“In theory, democracy works best when individuals know what’s going on, have access to accurate information, and are able to express their preferences within the group.
Authoritarianism works better when people do not know what’s going on. Giving people accurate information is harmful for autocracy because it lets people see where the system is failing, who is benefiting, who is not, who disagrees, and where the protests are.
An authoritarian regime wants to suppress accurate information about society, the government, the economy, and the things happening around people.
So the same technology that could be used in a democracy to move information around becomes a system used to suppress information.”

Listen to the full episode on Democracy Works: How AI is changing democracy

Mia Hoffman explains that some groups are particularly vulnerable to disinformation, especially around voting narratives and elections.

Mia Hoffman:

“Immigrant communities, in particular, are at a higher risk of harm because their languages and perspectives are not necessarily represented in mainstream media.
These communities often retreat to more niche platforms for news consumption. There is an information vacuum in their native languages around how election processes work, what changes in voting policy mean, or even the most common disinformation narratives.
There is also a lack of campaign outreach in their native languages. That vacuum creates room for malicious actors to inject disinformation through phony news sources or media pages that target these communities in their native languages.”

Listen to the full episode on Democracy Decoded: Navigating Election Facts in the AI Era

Rachel Maron, author of This Woman Votes, explains why undermining trust in information itself is such an effective tactic.

Rachel Maron:

“I think understanding reputable sources — that’s what this all comes down to. It comes down to the erosion of authority. How do we know what’s true? How do you know anything?
Once you start attacking the very idea of knowing, you erode everything.
You don’t have to tell lies. You don’t have to tell the truth. You simply create an environment where nobody knows what the hell is going on or who to believe.
And in that environment, when everyone is questioning everything, you can commit almost any atrocity you want because there aren’t enough people willing or able to stop you.”

Listen to the full episode on Freedom Over Fascism: Fighting the Epistemic War Against the MAGA Murder Regime with Rachel Maron

Generative AI

Of course, one of the tactics drawing the most concern today is the use of AI-generated content, which has become increasingly easy to create and increasingly believable.

Simone Leeper, host of Democracy Decoded and Senior Legal Counsel at the Campaign Legal Center, explains that misinformation and disinformation have become more convincing and persuasive with the rise of generative AI.

Meanwhile, Bruce Schneier and his co-author Nathan E. Sanders argue that AI itself is simply a tool — one that can either suppress democracy or strengthen it, depending on the intentions of the people using it.

Simone Leeper:

“Misinformation and disinformation are not exactly new threats to American democracy. For most of our history, however, for a piece of false information to spread widely, it had to fool people in the traditional news media into printing or broadcasting it.
Today, bad actors do not need to trick professional journalists. They simply need falsehoods to spread through the digital equivalent of word of mouth.
The newest tool bad actors can use is generative AI. It allows creators to make false audio, like the Biden robocall. They can create convincing images showing things that never happened, or even entire fake news articles in multiple languages.
AI changes the stakes for misinformation and disinformation. The content it creates is not simply false — it becomes false evidence supporting a falsehood.”

Listen to the full episode on Democracy Decoded: Navigating Election Facts in the AI Era

Bruce Schneier:

“AI is a tool, but AI also has affordances. By making something easier, cheaper, or faster, it changes what is possible. We see that with propaganda.
Propaganda is not new. Disinformation is not new. Fake news is not new. What AI does is make it easier for ordinary people to produce convincing fake audio, video, and images.
We recently saw that with the invasion narratives around Venezuela, where fake images spread across the internet within hours.
AI will change things because of that dynamic. We are also living in a decade where democracy is declining in many parts of the world, and AI may exacerbate that decline. But AI also has the potential to supercharge pro-democracy movements, organizations, and governments.
So we are watching democracy and how it uses AI just as much — if not more — than the technology itself.”

Nathan E. Sanders:

“AI is a power-amplifying technology. It can turn speech or thought into action in a way that is both unique and more scalable than previous technologies.
We analyze AI as a technology that amplifies the power of whoever is using it, regardless of their purpose.
In a democratic context, if you want to use AI to improve the efficiency of benefits administration, it can do that. If you want to use it to enforce the agenda of a single authoritarian leader, it can do that as well.”

Listen to the full episode on Democracy Works: How AI is changing democracy

Why It Happens

Why does misinformation happen in the first place? According to Jonathan Rauch, agreeing on what is true is one of the most difficult challenges any social group faces.

Jonathan Rauch:

“The hardest thing for any social group — whether it’s a small tribe or a large society — is agreeing on a common account of truth: how we arrive at truth and who has authority over truth.”

Listen to the full episode on Village Squarecast: A Defense of Truth with Jonathan Rauch

Ulterior Motives

Mia Hoffman explains that misinformation and disinformation related to elections and politics are generally used to achieve three broad goals.

Mia Hoffman:

“The first is to sway voter opinions — to change how people feel about different candidates.
The second is voter suppression: discouraging people from going to the polls in the first place.
The third is seeding mistrust in electoral processes and democratic governance as a whole.
Many of the narratives from the previous election — manipulated voting machines or fraudulent mail-in voting, for example — helped create mistrust in electoral systems. That has really been the target over the past several years.
The internet, and social media in particular, have become the primary environments where misinformation and disinformation are shared and consumed. This includes platforms like Facebook and TikTok, but also more niche platforms we may not immediately think about, such as WhatsApp, Telegram, and WeChat.”

Listen to the full episode on Democracy Decoded: Navigating Election Facts in the AI Era

Are People Stupid?

It can be tempting to assume that people who believe false information are simply unintelligent or naive. But as Pauline Hoffmann explains, anyone can fall prey to misinformation, and a lack of empathy only makes it harder to understand why these narratives spread so easily.

Pauline Hoffmann:

“I still sometimes find myself speaking with people and thinking, ‘Maybe you’re just stupid.’ But the reality is that just because you don’t know what I know does not make you dumb.
I do not know what you know either, and that does not make me stupid.
I think we really need to understand why people believe what they believe — what their frame of reference is and what their background is — if we truly want to understand them.
That does not mean we have to agree with them. But if we want to communicate effectively and explain what is true, we first have to understand where people are coming from…
What I find troubling, particularly with conspiracy theories today, is that some people are living in a very different reality. Their definition of what is real is not the same as ours.
It becomes incredibly difficult to communicate when the world they have created — or are living in — does not match the same shared reality.
I do not mean they literally think the sky is green. But when people begin believing there are secret groups trying to undermine democracy or control the weather so hurricanes hit certain cities — things that are, of course, not true — they are often trying to explain events that do not have easy explanations.
Those narratives may feel logical to them. Of course, they are not logical, but to some people they can seem completely accurate.”

Listen to the full episode on Future Hindsight: Understanding the Infodemic: Pauline Hoffmann

Bad Information Ecosystem

Michael Rich, President and CEO of the RAND Corporation, explains three ways our information ecosystem has deteriorated, allowing misinformation and disinformation to flourish.

Michael Rich:

“The problem is much more insidious. That goes back to why I think about ‘truth decay’ and why I prefer that term over phrases like ‘post-fact era’ or ‘post-truth era.’
The first trend is the erosion of any distinction between fact and opinion — the blurring of that line, whether or not there is an intentional lie involved.
The second, which Sasha implied, is the explosion in the quantity of opinion relative to fact.
The third trend is the diminishing trust the public has in institutions that were once regarded as authorities on factual accuracy, scientific knowledge, and similar issues.
All three of those trends contribute to a fourth trend: increasing disagreement about objective facts, scientific consensus, and analytical interpretations of data.
Intentional lies are an important and difficult challenge, especially as technology makes it easier to fabricate audio, video, and so-called facts. But the problem is larger than that.
It also involves the unintentional spread of misleading information and people’s inability to distinguish between fact and opinion. Both have value, and both are forms of information, but they are not the same thing. They need to remain separate.
A variety of factors are making that separation more difficult, including changes in the information landscape itself.”

Listen to the full episode on Let's Find Common Ground: Truth Decay

Angelo Carusone, president of Media Matters for America, further explains how the people and platforms shaping public information have changed over time — for better or worse.

Angelo Carusone:

“The way we do our work really starts with monitoring, and there are multiple ways to monitor information environments. The important question is: what are you monitoring in real time?
For most of Media Matters’ history, that was actually an easy question to answer. Who mattered — and when they mattered — was very static and stable. You always listened to Rush Limbaugh. You always listened to Glenn Beck. You always watched Fox News and a few other influential voices because their roles rarely changed.
What is different now is that everything is much more dynamic. The question of who matters and when they matter is no longer stable.
There are moments when Steve Bannon’s show gives you the cutting edge of what the right-wing narrative will become over the next day, week, or longer. Then there are periods where he is simply repeating ideas and not driving anything new.
The universe of what is considered politically relevant has also changed dramatically. Fifteen years ago, we would not have monitored comedy programs because they were not shaping political narratives. They were simply part of the broader conversation.
Today, however, we find ourselves monitoring podcasts centered on comedy because they can become deeply influential in politics during certain moments. They help shape the stories and narratives that many Americans ultimately walk away believing.”

Listen to the full episode on The Context: Maybe Don’t Get All Your News from Podcasts, America

Social Media

The next culprit people often point to is social media algorithms. Angelo Carusone explains how recommendation algorithms can gradually guide users toward increasingly extreme or potentially false information through associated content.

At the same time, Jonathan Rauch argues that social media itself is not the largest source of misinformation and disinformation. Rather, politicians are often the primary drivers, while social media acts as an amplification engine.

Angelo Carusone:

“We ran a study asking, ‘If someone watches a Joe Rogan clip on TikTok, what happens next?’ We conducted similar analyses using several of the top online programs.
What we found was fascinating. If you engage with one of those clips even a single time, TikTok’s algorithm assumes you may be interested in a much broader range of extreme content.
It immediately begins serving you a buffet of wild ideas: conspiracy theories, apocalyptic fearmongering, medical misinformation, Christian nationalism, trad-wife content, misogynistic content, and more.
It happens consistently. One engagement with one of these shows can trigger it.
Not all of your feed becomes that content. Much of it remains the normal recommendations based on your interests. But the algorithm starts layering these ideas in.
Then, if you engage with one of those fear-based or conspiratorial videos, the system gives you more and more of it. That is how people get pulled down rabbit holes.
That’s what makes these recommendation systems so significant. Joe Rogan did not necessarily do anything wrong in the clips themselves, nor did many of the other creators. The algorithm is what says, ‘We have a way to identify people who may become interested in extremist ideas, and we know how to guide them there.’
Many people will never become interested in hard misogyny, conspiracy theories, or fearmongering. But if you repeatedly serve people a buffet of increasingly extreme content, eventually something is going to resonate with someone. Once that happens, they get sorted into a narrower category and are pushed further and further down that feed.”

Listen to the full episode on The Context: Maybe Don’t Get All Your News from Podcasts, America

Jonathan Rauch:

“An important point I learned from people like Bill Adair, the founder of PolitiFact, is that social media is not the number one spreader of fake news and disinformation. It’s not even number two.
Number two is cable news. Number one — by a wide margin — is politicians. Politicians are the oldest form of social amplification that exists.
If a major politician says something, the media has to cover it. That is still the primary mechanism through which figures like Donald Trump spread information and narratives.
So we cannot assume Facebook is going to solve this problem for us. It will not, and it cannot.”

Listen to the full episode on Village Squarecast: A Defense of Truth with Jonathan Rauch

What We Can Do

So what can we do about misinformation and disinformation? Fortunately, many experts believe there is a great deal that can be done — from large-scale institutional reforms to small individual actions.

Jonathan Rauch outlines three broad approaches for pushing back against misinformation and disinformation.

Jonathan Rauch:

“The good news is that we are just beginning to fight back, and there is a lot that can be done.
You are never going to eliminate disinformation or social manipulation entirely, either in principle or in practice. But there are three dimensions where meaningful action can happen.
The first is institutions. This is where the work of people like Aaron comes in. Throughout history, including after the invention of the printing press, societies responded to information crises by creating buffers, guardrails, and incentives through institutions. These structures helped people become more resilient and better at navigating new media environments.
One example is Facebook’s Oversight Board. Another is the International Fact-Checking Network, where PolitiFact has been a leader. There are many others.
The second dimension is the individual level. Most of us belong to institutions that are truth-seeking, truth-based, or at least truth-friendly. There is always something we can do in our own environments.
We can be what I call ‘reality allies’ by speaking up and telling the truth, even when others are heading down some conspiracy rabbit hole.
Do not repost nonsense just because it is amusing and likely to go viral. If you see a cancel campaign targeting someone, do not join it. In many cases, those campaigns can be stopped quickly if enough people push back.
If you are an employer, do not automatically fire someone who is being publicly targeted.
We can also become smarter about how we consume media. Some of that is already happening.
The third dimension is civic institutions, which is especially relevant to organizations like Village Square. Propaganda and polarization go hand in hand. Every time we rebuild civic bonds, strengthen relationships, and increase trust between people, we reduce the ability of bad actors — whether Vladimir Putin or anyone else — to weaponize our social vulnerabilities against us.”

Listen to the full episode on Village Squarecast: A Defense of Truth with Jonathan Rauch

Fact Check Before Sharing

On an individual level, one of the simplest things we can do is pause before sharing information online.

Political scientist Jason Reifler reminds us that there are many quick and accessible ways to verify information before spreading it.

Jason Reifler:

“One of the simplest things you can do is ask yourself whether the claim even seems believable. If you are unsure, Google it and fact-check it. If something seems unlikely to be true, there is a good chance it is not.
Use fact-checking organizations. Use PolitiFact. Use the Washington Post Fact Checker. Use FactCheck.org. Use Snopes.”

Listen to the full episode on The Politics Guys: What We’ve Learned About Fake News

Pauline Hoffmann stresses that each of us has a personal responsibility to avoid contributing to the spread of false information.

Pauline Hoffmann:

“Care before you share. Do not automatically repost things online.
I am guilty of knee-jerk reactions too. Sometimes I see something sensational and immediately want to share it, but I really try to stop myself from doing that.
Please do not share something simply because it is funny or entertaining, especially satire. Other people may believe it is real.
We have to be careful about what we spread online and do our best not to contribute to the infodemic. Fact-check information whenever possible.”

Listen to the full episode on Future Hindsight: Understanding the Infodemic: Pauline Hoffmann

Katie Harbath, an award-winning leader at the intersection of technology, policy, and elections, and a former Public Policy Director at Facebook, explains that social media platforms can also be designed to encourage more thoughtful behavior.

Katie Harbath:

“Research shows that directly telling people they are wrong can sometimes make them dig their heels in even further.
But there are other kinds of nudges that platforms can use. Community Notes is one example that many companies have been experimenting with.
Before someone shares content that fact-checkers have marked as false, platforms could display a pop-up saying, ‘Before you share this, please know this claim has been marked false, and here is more information.’
Or a platform might add context to an image, such as: ‘This photo is actually from five years ago, not today.’
There are many design approaches platforms can use to better educate people about what they are sharing and consuming online.
Part of the challenge is that there is no universal agreement about what kinds of speech should or should not be allowed.”

Listen to the full episode on Talkin’ Politics & Religion Without Killin’ Each Other: The Election Whisperer: Katie Harbath on Ten Years Inside Facebook and Panicking Responsibly

Consider the Source

Simone Leeper encourages people to strengthen their information literacy by carefully considering the source of the information they consume. Meanwhile, Samuel C. Spitale offers a simple but powerful question we can ask ourselves to evaluate the intentions behind a message.

Simone Leeper:

“So what can you do to become part of the solution to misinformation and disinformation? For starters, consider the source.
Where did the information come from, and is that source credible? If it is an article from an unfamiliar outlet, investigate the author. Are they a real person?
Consult trusted media organizations and look for other credible sources supporting the same story to determine whether it is accurate.
Taking a moment to double-check information before sharing it can go a long way toward combating misinformation. Encouraging your friends and family to do the same can help as well.”

Listen to the full episode on Democracy Decoded: Navigating Election Facts in the AI Era

Samuel C. Spitale:

“One of the strongest questions you can ask about any media message is: ‘Who benefits from me believing this?’
Whether it is a political campaign ad or some other kind of media, that question often reveals when someone is trying to sell you something you may not need.
Another important thing to examine is emotion. If a message is built around strong negative emotions, there is a good chance someone is trying to manipulate you.
For example, on social media you often see stories highlighting crimes committed by undocumented immigrants. Those stories are usually wildly disproportionate to the actual data, since immigrants generally commit crimes at lower rates than native-born citizens. Most immigrants do not want to risk deportation.
But those headlines are often designed to trigger anti-immigrant sentiment, fear, outrage, or racism. They are also frequently used to discredit political opponents or government administrations.
When you stop and think about it statistically, one immigrant committing a crime does not represent all immigrants, especially when thousands of crimes committed by citizens on the same day receive no similar attention.
So who benefits from those headlines?
We need to pay attention to the language being used and the emotional response a message is trying to create. Fear, outrage, and hate can short-circuit critical thinking and make us more reactive.
The more emotionally charged we become, the less likely we are to stop, think critically, and evaluate the information carefully.”

Listen to the full episode on Outrage Overload: How to Combat the Misinformation Crisis

Educate Yourself

Pauline Hoffmann believes that stronger education and literacy are essential if people are going to accurately assess the information they encounter.

Pauline Hoffmann:

“I think the more educated we are, the better off we are going to be.
We have so many literacy problems in this country. We suffer from a lack of media literacy, health literacy, science literacy, political literacy, civic literacy — really, every kind of literacy you can imagine.
The only way to combat that is through education.”

Listen to the full episode on Future Hindsight: Understanding the Infodemic: Pauline Hoffmann

Katie Harbath explains that organizations can help by proactively providing accurate information before misinformation has the chance to spread widely.

Katie Harbath:

“Beginning around the 2022 midterms, many organizations started working with election officials on something called ‘prebunking.’
The idea is to give people accurate information before misinformation reaches them. That includes explaining how voting works, what different voting methods mean, and how people can vote safely and confidently.
For example, if someone does not want to vote in person, organizations can explain absentee ballots, mail-in voting, and other available options ahead of time.
The goal is to help people understand the process in advance so they already have accurate information if they later encounter misinformation or disinformation online.”

Listen to the full episode on Talkin’ Politics & Religion Without Killin’ Each Other: The Election Whisperer: Katie Harbath on Ten Years Inside Facebook and Panicking Responsibly

Consider Your Messaging

Denver Riggleman, former Congressman from Virginia, explains that politicians and public figures also have a responsibility to communicate carefully and remain committed to factual information, even in a chaotic media environment.

Denver Riggleman:

“Some candidates say to just ignore misinformation entirely. But we are operating in a new information warfare environment and a new kind of information battle space.
What I would tell candidates is that when something is especially outrageous, responding with humor and facts in a non-threatening way is often the best approach.
But you also have to choose your media outlets carefully because so many platforms are built around clickbait. Even if you give a thoughtful interview, headlines can still be manipulated just to generate clicks.
You have to be extremely careful about how you present yourself and how you communicate your message. You need a strong team that understands the media environment and can help provide proper context.
But even with all of that, you cannot abandon your responsibility to tell the truth, even when it is politically painful.”

Listen to the full episode on The Great Battlefield: Fighting Back Against Misinformation with Former Congressman Denver Riggleman

Diversify Your Media Diet

Both Jonathan Rauch and Katie Harbath remind us that we still have agency over the information we consume. One of the healthiest habits we can develop is intentionally seeking out a variety of perspectives — including viewpoints we may disagree with.

Jonathan Rauch:

“Try to avoid getting trapped inside a media bubble where you are only consuming right-wing or left-wing viewpoints.
If you are liberal, read the Wall Street Journal. If you are conservative, read the New York Times.
The goal is not necessarily to agree with the other side, but to understand the contours of their arguments and expose yourself to some degree of disconfirming information.
That is difficult to do. Studies show that many people would rather go to the dentist than encounter political opinions they disagree with.
But trying to understand opposing viewpoints will ultimately make you intellectually healthier.”

Listen to the full episode on Village Squarecast: A Defense of Truth with Jonathan Rauch

Katie Harbath:

“I would really encourage people to remember that they still have agency when it comes to the internet and the information ecosystem.
Personally, I try to conduct a kind of annual ‘news audit’ for myself where I ask: Am I actually reading the sources I want to be reading? Am I exposing myself to viewpoints from across the political spectrum, both within the United States and internationally?
For me, that mostly involves technology and politics, but I think it is valuable for everyone to pause occasionally and ask themselves whether their actual media habits align with the kind of information environment they want to participate in.
We do have control over where we spend our time and attention online.”

Listen to the full episode on Talkin’ Politics & Religion Without Killin’ Each Other: The Election Whisperer: Katie Harbath on Ten Years Inside Facebook and Panicking Responsibly

Don’t Feed the Trolls. Feed the Truth.

Misinformation and disinformation do not spread on their own. They spread because human beings share them, repeat them, emotionally react to them, and build systems that reward outrage more than accuracy. In many ways, the modern information war is not simply a battle over facts — it is a battle over attention.

The uncomfortable reality is that none of us are completely immune.

Intelligent people fall for false narratives. Educated people share misleading headlines. Good people repeat information that later turns out to be wrong. The issue is not whether you are “smart enough” to avoid misinformation forever. The issue is whether you are willing to slow down, think critically, and take responsibility for the role you play in the information ecosystem.

Every time you share an article without reading it, repost a sensational headline because it confirms your worldview, or spread a claim without verifying it, you become part of the problem — even if your intentions are good. But the opposite is also true. Every time you fact-check a claim, pause before reacting emotionally, seek out multiple perspectives, or respectfully push back against false information, you strengthen the culture of truth.

That responsibility does not belong only to journalists, politicians, tech companies, or fact-checkers. It belongs to all of us.

The experts throughout this article repeatedly pointed toward the same solution: stronger civic trust, better media literacy, healthier institutions, more thoughtful communication, and individuals willing to act with intellectual humility instead of tribal outrage.

So before you share that viral clip, repost that headline, or forward that “shocking” story to your friends, stop and ask yourself:

  • Is this true?
  • Who benefits if I believe this?
  • Am I reacting emotionally or thinking critically?
  • Have I verified the source?
  • Am I helping inform people — or simply amplifying noise?

The future of democracy, public trust, and our shared understanding of reality may depend on how enough ordinary people answer those questions every single day.

