
Outrage, Clickbait, and the Critical Thinking We Desperately Need


How to recognize bias, resist manipulation, and navigate information with confidence.

By NOBLE technology's Tara Stewart, as featured in the October Edition of Wellness Education Magazine


The great irony of the Information Age is that it began as a harbinger of truth. With a few keystrokes, we could access vast digital libraries packed with verifiable facts. “Google it” became shorthand for inquiry and proof.

But then tech giants discovered something more profitable than facts: the limitless potential of harvesting our interactions. First, it was goods. Then services. And now, ideas. The innovations that once promised clarity have mutated into tools of distortion. We’ve entered not the Information Age, but the Disinformation Age.

Here, truth is obscured, emotions outweigh evidence, and once-trusted vessels for fact have withered under the pressure of competing for clicks. Outrage has become currency, and “clickbait” outperforms accuracy.


The consequences are everywhere:

  • Young, impressionable minds are misled.

  • Older, trusting minds are manipulated.

  • Educated, caring minds are overwhelmed.

As cognitive scientist Stephan Lewandowsky warns: “Misinformation can erode people’s trust in politics, science, and society, and once trust is lost, it is very difficult to rebuild.”

This is where the conversation must begin, not with technology itself, but with us.


Owning Our Bias

Before we can teach others to resist misinformation, we must start with ourselves. Each of us carries personal bias, shaped by upbringing, experiences, and emotions. Recognizing this is the first step in building resilience.

Misinformation is not random; it is designed. Algorithms feed us personalized realities, versions of events most likely to keep us engaged, even if distorted. As the American Psychological Association notes, “People are more likely to share misinformation when it aligns with their personal identity… or when it elicits strong emotions.”

Even subtle cues can manipulate perception. Studies show that simply changing the color of text in a post from green to red shifts how readers interpret it. Red feels urgent, even threatening; green feels calm or trustworthy. If such a small detail can sway us, imagine the power of engineered headlines, inflammatory language, or AI-generated voices and images.

And yet, most of us still care deeply about accuracy. The search for truth must begin with self-awareness: recognizing our own biases and understanding how they are being exploited. Only then can we begin to separate signal from noise.


The Manipulation of Reality

Psychologists call it the misinformation effect: the distortion of memory for an event by false information introduced after the fact. At scale, this becomes a mass misinformation effect, rewriting collective memory.

Over time, we begin to normalize falsehood. The question shifts from “Is this true?” to “Does this feel true for my side?”

A striking example came in 2019 during “Sharpie Gate.” U.S. President Donald Trump presented a hurricane forecast map altered with a marker to falsely extend the storm’s path. When challenged by meteorologists, he doubled down, insisting his version was correct.

Regardless of politics, the implications were profound. The leader of the free world had circulated misinformation, refused correction, and demanded acceptance of his version of reality. This wasn't about weather; it was about the normalization of misinformation at the highest levels of authority.


The consequences are corrosive:

  • For society: trust erodes and deception is legitimized.

  • For individuals: cynicism grows and truth feels futile.


The Burden of Truth Lost

The toll of misinformation is not just civic; it is deeply personal. Constant exposure to distortions creates confusion, stress, and anxiety. Young people in particular may develop learned helplessness: why bother fact-checking if those in power can erase facts with a marker or a tweet?

Clinicians now report rising levels of ambient anxiety, a background hum of unease fueled by relentless news cycles, online hostility, and blurred boundaries between fact and fiction. When reality itself feels unstable, our psychological foundation wobbles.


Imagine What’s Coming

If today’s misinformation challenges our sanity, the future may test it even more. Advancing AI tools can already:

  • Mimic human voices convincingly.

  • Generate photorealistic images and videos of events that never happened.

  • Produce persuasive propaganda at scale, tailored in real time to our responses.


Super-intelligent systems could take this further, creating self-reinforcing realities that are nearly impossible to distinguish from truth. Imagine a world where your neighbor, your news feed, or even your child’s voice on the phone could be simulated. The ability to manipulate not only facts but trust itself poses profound risks for democracy and for mental wellbeing.

In such a world, the misinformation effect could become near-total, with collective memory and shared truth continually reengineered by those with the most powerful tools.


The CLEAR Framework for Recognizing Disinformation

Whether in schools, families, or counseling rooms, our strongest protection against misinformation lies in the ability to seek truth with intention. That is why NOBLE technology created the CLEAR Framework, a simple, memorable guide to help people recognize bias, resist manipulation, and navigate information with confidence.

C -- Context: Is this the full story or just a clip? Was it original or reshared?

L -- Language: Is the tone neutral or inflammatory? Objective statements inform; subjective language manipulates.

E -- Emotion: Did it spark anger, fear, or triumph? Strong emotional triggers are often a sign of manipulation.

A -- Accuracy: Are claims verifiable with credible sources or references?

R -- Responsibility: Who stands behind it? Is the author accountable, or anonymous and untraceable? Ask yourself: “If I share this, could I cause harm?”


By applying CLEAR, students and adults alike can pause, question, and verify before reacting or sharing. It’s a habit that builds resilience, essential for both mental wellbeing and democratic citizenship.


Connecting in the Real World

I don’t believe our future will be decided by algorithms or AI alone. It will be shaped by us, by our willingness to pause, to question, and to connect honestly with one another. Frameworks like CLEAR help us steady ourselves in this storm of disinformation. They protect mental health by reducing anxiety, reinforce democracy by rooting debate in accountability and truth, and preserve humanity by affirming that disagreement and diversity are not threats but strengths.

At times, it may feel easier to give up on clarity. But these are the very moments when our commitment to truth matters most. If we can inspire one another, from our students to our families and colleagues, to keep questioning, reflecting, and searching, then misinformation loses its grip.

At the heart of it all is something technology can never replicate: authentic human connection. As Brené Brown reminds us, “Connection is why we’re here; it gives purpose and meaning to our lives.” The lived, imperfect reality will always be richer than the contrived. And the more we immerse ourselves, and our communities, in building real connections, the stronger, healthier, and more hopeful we will all be.


