
Calling Bullshit - Book Summary

The Art of Skepticism in a Data-Driven World

Duration: 29:38
Release Date: December 2, 2023
Book Author: Carl T. Bergstrom, Jevin D. West
Categories: Society & Culture, Personal Development, Science

In this episode of 20 Minute Books, we delve into "Calling Bullshit" by Jevin D. West and Carl T. Bergstrom. This enlightening 2020 publication guides readers through the muddied waters of information overload in our modern age. It is a masterclass in identifying the manipulation of data and scientific processes, offering tools and insights to detect and combat the spread of misinformation.

West and Bergstrom, both leading scientists at the University of Washington, lend their expertise to this crucial subject. West, an associate professor in the Information School and the director of the Center for an Informed Public, specializes in analyzing the circulation of misinformation. Meanwhile, his co-author, Carl T. Bergstrom, is a biology professor whose research focuses on the flow of information through complex networks.

This book is particularly suited for enthusiasts of popular science who are keen to peek behind the veil shrouding the data that shapes our perception of the world. Data enthusiasts looking to enhance their analytical skills will find it invaluable, and concerned citizens committed to the fight against misinformation will discover actionable strategies for engaging with the world more critically. Join us as we cut through the noise and empower ourselves with the ability to call out "bullshit" when it arises in the prolific exchange of information marking the 21st century.

Ditch the Nonsense: Sharpen Your Bullshit Radar

It's an avalanche out there — a relentless barrage of information, hurtling towards us from every angle. Your phone chimes, emails stack up, social media beckons, and the news plays on a constant loop. But amid this constant stream of data, there's a lot of... well, nonsense.

Beneath the data, facts, and figures lies a layer of misinformation and deceit that often goes unnoticed at first glance. It's the sort of stuff that sounds plausible, but on closer inspection, it crumbles like a house of cards. The world has coined a term for it: bullshit. And it's everywhere.

But knowledge is power, and with the right tools, you can become a deft detector of drivel. This narrative unveils the secrets to sifting through the science and statistics to spot the bullshit before it can take root. Ready to become an expert in calling out the fluff? Let's dive in.

Discover the phrenological folly of crime prediction

Ever heard that criminals can be spotted by the shape of their skull? It might sound outlandish now, but not so long ago, people believed in phrenology — the notion that one's character, and propensity for criminality, could be read from the bumps on their head. Thankfully, science has moved on since then, but it serves as a prime example of how easily we can be swayed by convincing-sounding, yet ultimately bogus, theories.

Untangle the false links — like housing prices and birth rates

Sometimes, the connections drawn between two data sets are about as sturdy as a house of cards in a windstorm. Take, for instance, the claim that soaring house prices lead to declining birth rates. It's an example of a correlation mistaken for causation, a logical fallacy where two simultaneously occurring trends are mistakenly assumed to be in a cause-and-effect relationship. Learning to discern when this fallacy is at play can prevent us from accepting misguided or erroneous conclusions.

Expose the hidden half-truths of scientific publishing

And then there's the problem of selective storytelling in the realm of scientific research. Here's a bitter pill to swallow: not all studies are created equal, and not all findings see the light of day. This phenomenon, known as selection bias, skews our understanding of scientific truths. It's a bit like being fed the highlights reel and missing out on the entire, nuanced story. Battling this selective bias demands a healthy skepticism and an eye for the missing pieces of the puzzle.

By engaging with this narrative, you'll be on your way to becoming a confident skeptic, someone who doesn't just consume information but interrogates it. You'll learn to look beyond the surface, to question and analyze the credibility of what's presented to you, thus ensuring you're not misled by the vast amounts of bullshit masquerading as truth. Keep this skepticism handy — it's your best defense in a world brimming with balderdash.

Stay Sharp: Navigating the Minefield of Misinformation

Picture this: it's 1998, and a medical bombshell drops. A study published in The Lancet, authored by Andrew Wakefield, alleges a connection between the MMR vaccine and autism. The implication ripples through the global conscience, breeding doubt, fear, and a burgeoning antivax movement. Only later does the truth emerge — the study was deeply flawed. Wakefield's work was discredited, and in 2010, the study was retracted.

This, however, is just one face of a multifaceted beast known as bullshit. Despite being debunked, the tale of the MMR vaccine and autism continues to hinder vaccination rates and foster the resurgence of once-controlled diseases. It's a testament to the unnerving power of misinformation and the difficulty of rooting it out once it's taken hold.

The lesson, stark and true, is that we must all be vigilant in the face of bullshit.

Bullshit isn't confined to the digital age; even the ancient philosopher Plato called out his contemporaries, the Sophists, for their preference for persuasion over truth. But the 21st century has proven alarmingly accommodating for the spread of such deceit. Today, false claims hide behind the veneer of scientific credibility or exploit the persuasive power of visuals, like the fabricated account that circulated after the Boston Marathon bombing: an emotionally charged yet utterly false story of an eight-year-old runner that fooled over 92,000 social media users.

What all these anecdotes emphasize is the harsh reality of our current information ecosystem. The potent combination of modern technology like social media, coupled with hyperpartisan media outlets, production of fake news at scale, and the simplicity of manipulating images, has precipitated a crisis of misinformation. This underscores a societal urgency: to take action against the tide of bullshit, to sharpen our critical faculties, and be the arbiters of our own understanding.

Each of us bears the responsibility to distinguish between fact and fabrication. It's an essential skill in today's world, where the difference between the two can have serious, far-reaching consequences. Stay sharp, stay skeptical, and remember — in the realm of information, not all that glitters is gold.

The Art of Deception: Unmasking the Bullshitter's Toolkit

So, what is at the heart of the information that's too good to be true, or the arguments that seem sound yet feel off? Let's dissect the concept of bullshit. In the authors' view, bullshit is essentially the art of convincing at any cost, where persuasive impact matters far more than whether the claim is actually true.

The modern bullshitter's arsenal is brimming with tools designed for obfuscation instead of illumination — clever language, twisted statistics, and misleading graphics. These tools aren't just lying dormant; they are actively wielded to bombard audiences with a facade of facts that masquerades as authenticity.

Here's the crux of the matter: Bullshitters are adept at convincing others of something's veracity, with scant regard for the evidence supporting it.

Let us delve into the clever strategy that science sociologist Bruno Latour eloquently termed "black boxes." Imagine a process where data is funneled through a convoluted algorithm, akin to magic. What emerges from the other side is often perceived as unassailable truth — but the magic, the black box, must be questioned.

Peek into the source of data — there lies the first thread to pull. For example, consider a 2016 study positing that criminality could be read from a person's facial features, with an algorithm's seal of approval to back it up. However, a glaring flaw was revealed: the comparison was between convicts' ID photos and the professional headshots of non-criminals. A genuine smile is less likely in an ID photo booth than in a professional photographer's studio, isn't it?

This instance exemplifies how a flawed dataset renders any algorithm's conclusion worthless. And this gets us thinking: were the study's authors deliberately deceiving? Not necessarily. Their single-minded focus on validating their hypothesis might have blinded them to the glaring inadequacies in their data selection.

In the end, though well-intended, their study is a classic case of bullshit — embellished with a scientific flourish yet fundamentally flawed at its core. Recognizing these patterns and peeling back the layers of the sophisticated charade is the essence of calling out bullshit. It's about seeing beyond the illusion, challenging the black boxes, and aligning our beliefs with robust, credible evidence.

The Misleading Maze of Correlation and Causation

Take a moment to consider the seemingly eye-opening revelations of certain scientific studies. Some findings appear to nod vigorously to common sense, such as a study that highlights a link between self-esteem and early experiences with romance in college students. But pause and ponder — does this truly unravel the mystery of youthful confidence, or has it merely scratched the surface of a more complex social tapestry?

Imagine hypothesizing about the alchemy of kisses and self-worth — does one lead to the other, or are they fellow travelers on the path of burgeoning young relationships? The research may be illuminating, but it's careful not to leap to conclusions about the underlying causes.

Here's the pivotal takeaway: Just because two things move in sync, it doesn't mean they're dancing to the same tune. Correlation does not imply causation.

The media often plays puppeteer in this deceptive dance, and subtle statistical nuances can be lost in translation once the research headlines hit your news feed. Take, for instance, a 2018 study by Zillow, which pointed out the parallel trends of house price hikes and falling fertility rates. The study was circumspect, noting that these two phenomena coincide but don't necessarily cause one another. Yet, when the baton was passed to the media, caution was thrown to the wind as reports spun a tale of cause and effect, shedding the nuances of the original research.

It's essential to understand that correlation can sometimes be an echo chamber amplifying coincidences. For example, plotting the surge in autism diagnoses alongside the growth of organic food sales may reveal a tight correlation. Both have spiked in recent years, but to imply that one leads to the other is to walk a tightrope with no safety net of reason. They're twin graphs of growth, but hardly a meaningful map of mutual cause and effect.

Remember, studies may point you to the treasure chest, but it's the discerning eye that can tell fool's gold from the real deal. Uncover the truths buried beneath layers of correlational convenience. Question, clarify, and critically evaluate before you etch those findings onto the stone tablets of your understanding.

Decoding Deception: The Alarming Flexibility of Figures

Let's ponder a seemingly trivial moment: one of the authors, Carl, is about to indulge in a late-night cocoa craving at a hotel. He picks up the packet and scans for its caffeine content. "99.9 percent caffeine free," it proudly claims. Momentarily reassured, Carl then calculates this in the context of a standard Starbucks coffee, which, despite its notoriety for caffeine, is already 99.9 percent caffeine free by volume. Suddenly, the cocoa's boast seems less than boast-worthy.
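To see why Carl was unimpressed, here is a rough back-of-the-envelope check in Python. The caffeine and serving figures are approximate assumptions for illustration, not numbers taken from the book, and the calculation is by mass, which for a drink that is mostly water lands close to the by-volume figure.

```python
# Rough check: how "caffeine free" is an ordinary strong coffee?
# The figures below are approximate assumptions, not measured values.
caffeine_g = 0.33        # ~330 mg of caffeine in a large brewed coffee (assumed)
drink_mass_g = 473.0     # ~16 oz of coffee, treating 1 ml as roughly 1 g

caffeine_fraction = caffeine_g / drink_mass_g
print(f"Caffeine by mass: {caffeine_fraction:.3%}")             # ~0.07%
print(f"'Caffeine free' by mass: {1 - caffeine_fraction:.2%}")  # ~99.93%
```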

This anecdote reveals a stark truth: Numbers can become chameleons in the hands of those intent on persuasion.

Consider this: Numbers pack an emotional punch. The website Breitbart once highlighted that 2,139 individuals with DACA status have been convicted or accused of crimes. This number, devoid of context, appears daunting. However, set against the backdrop of the total 700,000 DACA individuals, the perspective shifts dramatically. Suddenly, the statistical likelihood of a DACA individual having a criminal accusation is dwarfed by the incarceration rates of U.S. citizens. Numbers can be wielded like a sword or a shield, magnifying fears or quelling concerns — depending on the wielder's intent.
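A quick calculation, using the rounded figures quoted above, shows how much the missing context matters: the headline count becomes far less alarming once it is turned into a rate.

```python
# Turn the headline count into a rate before reacting to it.
accused_or_convicted = 2_139
total_daca_recipients = 700_000

rate = accused_or_convicted / total_daca_recipients
print(f"Share of DACA recipients accused or convicted: {rate:.2%}")  # roughly 0.31%
```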

When discussing numerical increments, whether in figures or percentages, one can play a conjurer's trick to manifest sizeable issues from minuscule matters. Take, for example, the risk of alcohol-related health problems. A press release from The Lancet noted a 0.5 percent increase in risk from one daily drink. Yet, this statistic hovers over a baseline risk of merely 1 percent in non-drinkers, revealing the actual risk increase to be a negligible 0.005 percentage points.

Herein lies an essential lesson in mathematical literacy — the distinction between percentage differences and percentage point differences. It's all too easy to plunge into a pit of misinformation when these concepts are misunderstood or misrepresented. It's not always about outright fabrication; the devil, as they say, is in the details.
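Here is a minimal sketch of that distinction, using the rounded figures quoted above; the exact baseline in the original study differs slightly.

```python
# Relative change (a percentage of the baseline) vs. absolute change
# (percentage points). The inputs mirror the rounded figures above.
baseline_risk = 0.01          # ~1% risk of alcohol-related problems in non-drinkers
relative_increase = 0.005     # the reported 0.5% relative increase from one daily drink

new_risk = baseline_risk * (1 + relative_increase)
absolute_change_points = (new_risk - baseline_risk) * 100

print(f"Relative increase: {relative_increase:.1%}")                          # 0.5%
print(f"Absolute increase: {absolute_change_points:.3f} percentage points")   # 0.005
```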

As you wade through the swamp of statistics, stay nimble, stay critical. Whether it's the spin of a cocoa packet or the weightier matters of public policy, numbers can be a slippery slope. Remember, vigilance is the key to not sliding down that slope into a pool of number-driven nonsense.

The Slippery Slope of Selection Bias in Data

In a world awash with statistics and data-driven decisions, one crucial question looms large: where does the data come from? Take, for example, the tall tale of the Dutch — renowned for their impressive height. It's not that someone roved the flatlands of the Netherlands measuring every man's stature; they simply took a sample.

Now, imagine if that sample skewed toward the basketball courts.

Or consider the potential distortion if you conduct a political survey at an organic market. Odds are, the needle will swing toward the liberal end of the spectrum, not due to the true distribution of political beliefs, but because of where you gathered your data.

This leads us to a pivotal concept: selection bias.

Here's the essence of the matter: When data isn't neutral, its roots tainted by bias, the fruits it bears — the results of any test or analysis — are equally tainted.

Let's explore an illustrative dilemma — the association between attractiveness and personality. Suppose an exhaustive graph plots all men, randomly scattering dots with no correlation between charm and civility. Now rule out everyone who isn't, on balance, appealing enough to date: too unpleasant, too unattractive, or some unappealing mix of the two. Among the remaining candidates, a seeming trend emerges: the more attractive ones appear less congenial. What you're seeing is not a universal truth but a consequence of selectively curating the sample pool.
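A minimal simulation sketches the mechanism, assuming the two traits start out completely independent; the variable names, sample size, and the combined-appeal cutoff are all invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# In the full population the two traits are independent: correlation ~0.
attractiveness = rng.uniform(0, 10, n)
niceness = rng.uniform(0, 10, n)
print(f"Full population: r = {np.corrcoef(attractiveness, niceness)[0, 1]:+.3f}")

# The "dating pool": keep only candidates whose combined appeal clears a bar.
keep = attractiveness + niceness > 12
print(f"Filtered sample: r = "
      f"{np.corrcoef(attractiveness[keep], niceness[keep])[0, 1]:+.3f}")  # clearly negative
```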

Distortions of this kind aren't intrinsically deceitful. But when car insurance companies boast that switching to them saves people an average sum — say, five hundred dollars annually — we must recognize the sleight of hand in play. Those who switch generally do so because they are guaranteed substantial savings; thus, the average is skewed by this economically motivated migration.

The scientific world grapples with a similar quandary known as data censoring. Consider the clinical trials for a new medication, where side effects might cause a portion of participants to bow out, thus rendering their experiences invisible in the final analysis. If withdrawal from the trial is influenced by specific factors, the end results become skewed.

The quest for entirely random samples is a Herculean task in a landscape rife with selection bias. This omnipresent hazard serves as yet another reminder to approach statistical declarations with a discerning eye, to question their origins, and to consider the bias that may lurk within.

Big Data's Illusions: The Need for Solid Groundwork

In this digital age, graphic designers wield their technological wands to conjure baffling visuals — diagrams twisted into the shape of a ram's horn or whimsical subway maps charting biblical narratives or musical genres. Such creations are more than mere eye candy; they're engaging ways to convey information. Yet, don't let the traditional facade of bar charts and graphs fool you — without an axis that shows the full scale, starting at zero, proportions can be distorted, obscuring the true narrative.

And as we step further into the complex digital landscape, particularly the world of big data, another caveat awaits us.

Let's consider this critical insight: Big data and machine learning aren't immune to the pitfalls of quality — the data underpinning them must be legitimate.

The allure of "big data" lies in its vastness, which, when harnessed by enigmatic algorithms in machine learning, can perform feats from facial recognition to advanced market predictions. But there's a catch. Oftentimes, these algorithms act as inscrutable "black boxes," producing outcomes detached from transparent understanding.

For instance, let's revisit the tale of the machine learning algorithm, making dubious criminal identifications based on facial analysis. Similarly, a chest X-ray analysis machine was fed a diet of images tagged for heart or lung issues, only to learn to spot anomalies based on irrelevant text printed by a specific type of scanner — a triumph in pattern recognition, yes, but medically irrelevant.

A leap of faith in machine learning also led Google astray. In 2008, Google Flu Trends embarked on a mission to map the spread of influenza through search queries. Yet, its algorithm started associating the flu trends with winter search spikes that had no medical connection, such as "high school basketball." As the seasons passed, the algorithm's predictions deteriorated, hampered by the treacherous assumption that correlation equates to causation.

Google's algorithm was adept at analyzing historical patterns but floundered when tasked with navigating the future. Its disconnection from causative links led it into a quagmire of irrelevance.

The lesson is clear: no matter how expansive or sophisticated our datasets and algorithms may appear, their outputs are still bound by the quality of their inputs. Human oversight remains an indispensable arbiter in differentiating between meaningful insights and statistical chimeras.

As we venture through this realm where the virtual and the real intertwine, let's not be blinded by the grandeur of machine learning. Instead, let's apply the human touch — scrutinizing, questioning, and discerning — to ensure we avoid mistaking the mirage of machine insights for oases of truth.

Science's Achilles Heel: The Pervasive Presence of Bullshit

Science's quest for truth is a noble one, driven by a self-correcting impulse that propels progress. Experiments build upon each other, painting an ever-evolving picture of our understanding. However, this does not imply the existence of an absolute scientific truth — instead, we have a collective body of knowledge, growing incrementally through each study's contributions.

But herein lies a dilemma — the scientific domain isn't immaculate; it's riddled with flaws that can let bullshit slip through the cracks.

The publication landscape favors the positive, the groundbreaking, the affirmatory — and this is where selection bias rears its head. If a myriad of experiments fail to produce notable results, they might never see the light of day. It's the success stories, those that confirm hypotheses, that grace the pages of journals.

This brings us to the notion that the imperfections of modern science inadvertently fertilize the fields for bullshit to flourish.

Statistical significance often hinges on the concept of the p-value. A standard threshold is set at 0.05, a statistical way of saying that, if there were no real effect at all, results at least this extreme would crop up by chance less than 5 percent of the time. But this benchmark isn't infallible, especially when we consider Goodhart's Law, which states that once a metric becomes a goal, its utility diminishes as people begin to manipulate circumstances to achieve it.

Enter the dubious practice of p-hacking — the tweaking of experiments to conjure that coveted p-value. It's more common than one might wish to believe.
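A small simulation, with all parameters invented for illustration, shows why that 0.05 threshold is such an inviting target: run enough tests on pure noise and a predictable share will come out "significant" by chance alone.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# 1,000 experiments comparing two groups drawn from the SAME distribution,
# so there is no real effect anywhere to be found.
false_positives = 0
for _ in range(1_000):
    group_a = rng.normal(loc=0.0, scale=1.0, size=30)
    group_b = rng.normal(loc=0.0, scale=1.0, size=30)
    _, p_value = stats.ttest_ind(group_a, group_b)
    if p_value < 0.05:
        false_positives += 1

print(f"'Significant' results from pure noise: {false_positives} out of 1,000")  # ~50
```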

And then there's the media's role in science communication. Only a sliver of research garners media attention, essentially creating a selection bias. Not every study can claim the spotlight — it's those with the most sensationalist potential that dominate headlines.

Even within scientific journals, not all publications are created equal. Less prestigious ones may be less discerning in their selection, sometimes prioritizing financial incentives over academic rigor. That's why, when extravagant claims emanate from the depths of obscure journals, skepticism should be your first response. Credible results fight their way into reputable publications.

It's a scientific landscape where bullshit sometimes masquerades as breakthrough. Understanding the mechanisms that allow for such distortions is the first step. The next? Approach each assertion with a critical eye, dissect the p-values with care, and remember that in the quest for truth, even science must be scrutinized.

Arm Yourself With Effective Anti-Bullshit Tactics

Think like a journalist. They may not always land on the truth, but their inquisitive nature is gold when it comes to filtering fact from fiction. Whenever you encounter new information, channel your inner reporter and ask three critical questions: Who's providing this information? What methods did they use to gather it? And what might they be trying to sell me?

This approach isn't a cure-all, but it's a powerful starting point in the battle against bullshit.

Ready for more tools for your anti-bullshit toolkit? Here are a few:

When something sounds too fantastical, it probably is. Be wary of extraordinary claims that defy common sense. If your gut says it's implausible, chances are, it's soaked in bullshit.

Next up, embrace Fermi estimations. Named after physicist Enrico Fermi, these are back-of-the-envelope calculations you can perform to get a ballpark figure. They're handy for quickly judging the plausibility of a claim.

Let's put it into practice. Suppose someone claims the UK boasts 121,000 individuals named John Smith. Round the UK's population to the nearest power of ten: call it 100 million (the true figure is closer to 67 million, but Fermi estimates trade precision for speed). Now, consider the prevalence of the first name John and the surname Smith. If each occurs in about 1 in 100 people, a double division gives us an approximate 10,000 John Smiths — far from 121,000. This discrepancy hints that the original claim might be dubious.
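Here is that back-of-the-envelope calculation as a short sketch; every input is an order-of-magnitude guess, not census data.

```python
# Fermi estimate: how many people named John Smith live in the UK?
uk_population = 100_000_000   # ~67 million in reality, rounded up to a power of ten
share_named_john = 1 / 100    # rough guess: about 1 person in 100
share_named_smith = 1 / 100   # rough guess: about 1 person in 100

estimate = uk_population * share_named_john * share_named_smith
print(f"Estimated John Smiths: {estimate:,.0f}")   # 10,000, nowhere near 121,000
```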

Be on guard for confirmation bias. It's an easy trap to fall into, embracing information that reinforces our pre-existing beliefs. Stay alert and approach agreeable data with an extra dose of scrutiny.

Recall the lesson on correlation and causation. A claim that one trend causes another should raise your skeptical antenna. Often, seemingly related trends have no direct causative link, and something else entirely could explain their concurrent rise or fall.

And in today's world of frenzy-fed newsfeeds, treat information from questionable internet sources with suspicion. If your news comes via Twitter, for instance, double-check it through more reliable channels.

When you uncover bullshit, don't shy away from calling it out — but do so with tact and politeness. Everyone slips now and then, and a gentle correction can open minds more effectively than harsh confrontation.

That's your armory against the barrage of bullshit: a journalist's questions, Fermi estimations, a guard against confirmation bias, skepticism toward causal claims built on mere correlation, and the critical evaluation of dubious sources. Wield these tools with grace, and you'll navigate the misinformation maze with newfound dexterity.

The Essential Guide to Cutting Through the Bullshit

The siren song of bullshit is a melody that's become ever-present in our daily lives, with its harmonies designed to persuade rather than illuminate. As we navigate the vast seas of information bolstered by the winds of social media and big data, the need for vigilance has never been greater.

In our exploration, we've unearthed that at the heart of bullshit lies a blatant disregard for authenticity. Whether it's in the form of statistics masquerading as facts or studies corrupted by selection bias, the intent remains the same: to convince, not to clarify.

To stand firm against the tides of misinformation, recall that just because events or data points dance hand in hand does not mean one choreographed the other's moves — correlation is not causation. Context is the lens through which numerical claims should be viewed, revealing the truth within the numbers. Scrutinize the roots of the datasets you encounter; their merits determine the value of the fruits they yield.

By cultivating a healthy skepticism, understanding the statistical ruses laid bare before us, and using our newfound discernment to differentiate fact from fanciful fiction, we can arm ourselves against the fog of bullshit. And when we do spot its grey clouds on the horizon, let us shine through it with the bright light of fact-checking and thoughtful skepticism.


Similar Books

Cloudmoney
AI Superpowers
Pegasus by Laurent Richard and Sandrine Rigaud
Privacy Is Power
The Four
The Master Algorithm
The People Vs Tech