Reason Video

Video journalism from Reason magazine

  1. 09/25/2025 · VIDEO

    Donald Trump and Peter Thiel Are Using AI To Supercharge the Surveillance State

    Peter Thiel, the billionaire venture capitalist and PayPal co-founder, has a provocative theory about how the Antichrist could take over the Earth and enslave humanity. "My speculative thesis is that if the Antichrist were to come to power it would be by talking about Armageddon all the time," Thiel told Hoover Institution interviewer Peter Robinson earlier this year. The greatest danger we face, according to Thiel, might not be from global warming, terrorism, nuclear winter, or artificial intelligence going rogue. The real danger is that we're so afraid of these threats that we're willing to give up our freedom in the interest of "peace and safety," which is the phrase Thiel ascribed to the Antichrist, citing 1 Thessalonians 5:3. "It's the opposite of the picture of Baconian science from the 17th, 18th century, where the Antichrist is like some evil tech genius, evil scientist who invents this machine to take over the world," Thiel told the New York Times' Ross Douthat on a podcast. "In our world, it's far more likely to be Greta Thunberg." "I feel like that Antichrist would maybe be using the tools that you are building," replied Douthat. Douthat was referring to Palantir, the government contractor that Thiel co-founded in 2003 during the height of the war on terror. Today, Palantir is "in the white-hot center of the latest trend reshaping the global order," according to The Wall Street Journal, receiving more than $322 million from government contracts in the first half of 2025. It's equipping the government with tools to sift through massive data troves to identify patterns and hunt down illegal immigrants. It's helping the Feds deploy facial recognition technology, and has created AI tools to "predict" where crimes might happen in advance. "Like, wouldn't the Antichrist be like: Great, we're not going to have any more technological progress, but I really like what Palantir has done so far," Douthat asked Thiel. "Isn't that a concern? Wouldn't that be the irony of history, that the man publicly worrying about the Antichrist accidentally hastens his or her arrival?" When Thiel replied that hastening the Antichrist's arrival is "obviously" not what he thinks he's doing, Douthat agreed that it was unlikely but pressed, "I'm just interested in how you get to a world willing to submit to permanent authoritarian rule." It's a great question. While Peter Thiel warns that the Antichrist could bring totalitarianism by exploiting our desire for "peace and safety," a company he co-founded is building tools with great potential for abuse by a totalitarian surveillance state, all justified by that same desire for "peace and safety." Is it too late to stop this nightmare? It's fitting that Palantir is named after the mythical stones in J.R.R. Tolkien's Lord of the Rings, which allow users to see into distant lands, eavesdrop on conversations, peer into the past, and—some claim—conjure visions of the future. Tolkien, a devout Catholic, explored the dangers of unbridled power in a way that resonates with Thiel's ideas about the Antichrist. While the Antichrist invokes peace and safety, the corrupted wizard Saruman in Lord of the Rings, when imploring his fellow wizard Gandalf to join forces with the dark entity Sauron, invokes the values of "Knowledge, Rule, Order." When Trump took the stage to accept the Republican nomination in 2016, he promised to deliver peace and safety by cracking down on crime and riots in America's big cities.
"I have a message for all of you: the crime and violence that afflicts our nation will soon come to an end. Beginning on January 20th 2017, safety will be restored." Thiel, who was the first tech billionaire to back Trump, spoke at that convention. Eight years later, Trump was back at the RNC once again warning of a crisis and promising to restore peace and safety. "There's never been an invasion like this anywhere," said Trump, referring to a spike in illegal immigration.  Exploiting a national emergency, real or manufactured, is how governments typically grow their power and limit our freedoms. As the libertarian economic historian Robert Higgs chronicled in his 1987 classic, Crisis and Leviathan, the government ratcheted up its power in the 20th century by capitalizing on the Great Depression and two world wars. More recently, the 9/11 terrorist attacks justified the war in Afghanistan, provided cover for the invasion of Iraq, and led to the expansion of the surveillance state under a new paradigm known as Total Information Awareness. "[Total Information Awareness was] what Palantir is now. What they were literally trying to do is come up with the ability to ingest data from pretty much any source and then run that against their de facto target list," says Patrick Eddington, a former CIA analyst who studies the surveillance state at the libertarian Cato Institute. His latest book is The Triumph of Fear, a history of federal law enforcement and surveillance. "All this stuff at the end of the day is fear-based. That's how we get a lot this surveillance mentality," says Eddington.  Total Information Awareness was an initiative started at DARPA, the research arm of the Defense Department. The stated goal was to construct a "virtual, centralized, grand database" for tracking terrorists, the same kind of database Trump wants to build to track illegal immigrants. The government hit years of roadblocks before finally achieving this vision. Public backlash to the unnerving name led to a rebranding before Congress defunded the program in 2003. "But a lot of these really bad surveillance-related ideas, even if they get knocked down, they're kind of like a vampire, right? You knock them down. You think you've knocked them down for good, and then one way or the other, they just seem to kind of come back," says Eddington. Total Information Awareness lived on under the innocuous code name "basketball" and was absorbed by the NSA, which Edward Snowden would later reveal was collecting the phone records of millions of Americans and intercepting web traffic without the knowledge of the companies involved. The same year that Congress defunded Total Information Awareness, Peter Thiel and his co-founders created Palantir. They even met with John Poindexter, the recently fired director of Total Information Awareness, who told them they had "an interesting idea." Initially, the company struggled to attract mainstream investment, but when the CIA's venture capital arm, In-Q-Tel, put in $2 million, it signaled interest from Washington, and the company took off. Today, Palantir is worth more than $400 billion with over half of its revenue coming from government contracts. Thanks to advancements in artificial intelligence, the dream—or nightmare—of "total information awareness" seems closer than ever before. ICE is now using mobile facial recognition technology to track immigrants and tie their identities to "derogatory information" compiled in one of their database. 
    The agency also uses Clearview AI, facial recognition tech that scrapes social media, and much of the data is managed by Palantir. Even if you support Trump's objective to deport more immigrants, you should still be concerned about this technology. As with many surveillance tools of the past, immigrants and foreigners serve as a fertile testing ground before the technology is rolled out to the wider domestic population. "People on the Republican side should really not be cheering a lot of this stuff that's going on right now, because if Democrats manage to retake the White House and both chambers of Congress, which is a very real possibility, then they could turn around and use those tools," says Eddington. "And by the way, that's exactly what my own work in this book basically shows." The MAGA movement should understand what it's like to walk under the gaze of the Eye of Sauron. After the January 6 riot, "right-wing extremism" became a priority for America's intelligence agencies. Online censorship reached its zenith under the Biden administration, which leaned on social media companies to suppress speech that criticized COVID policies and vaccines or questioned election integrity. The "triumph of fear," as Eddington calls it, is what has allowed governments—particularly federal law enforcement—to expand and consolidate power throughout history, consistent with Higgs' theory of crisis and Leviathan. Eddington has documented how, since its inception, federal law enforcement has been weaponized to go after not just criminals but political dissidents, which the FBI did with a program called COINTELPRO in the '50s and '60s. "It started out as something designed to target the Communist Party of the United States," says Eddington, "But then of course it branched out. You know, they went after Dr. Martin Luther King, Jr. They went after the organization that he founded. We're talking about hundreds of thousands of human beings who were targeted in the course of this." It's easy to imagine how the technology that Palantir is building to help the government keep tabs on terrorists and illegal immigrants in the interest of peace and safety could be applied more broadly. Snowden has predicted that governments will soon use AI-enhanced surveillance not merely to fight terrorism or deport immigrants but to shape behavior. "We are all entering a new phase of history where what we considered the more enlightened collection of States globally, ones that had embraced the classically liberal ideal, are now some of those working hardest to roll it back, to bureaucratize, to influence, to nudge, to shape, to ultimately control each and every individual within their territory and beyond," said Snowden at a conference in Singapore late last year. Thiel is also worried about AI. "If you were to say that crypto is libertarian, then why can't we say that AI is communist," said Thiel at the Univ

    17 min
  2. 05/19/2025 · VIDEO

    Eisenhower Warned Us About the 'Scientific Elite'

    In President Dwight D. Eisenhower's famous 1961 speech about the dangers of the military-industrial complex, he also cautioned Americans about the growing power of a "scientific, technological elite." "The prospect of domination of the nation's scholars by federal employment, project allocations, and the power of money is ever present," warned Eisenhower. The federal government had become a major financier of scientific research after World War II, and Eisenhower was worried that the spirit of open inquiry and progress would be corrupted by the priorities of the federal bureaucracy. And he was right. Today, many of the people protesting the Trump administration's cuts to federal funding for scientific research are part of that scientific, technological elite. But there's a good chance that slashing federal spending will liberate science from the corrupting forces that Eisenhower warned us about. "If you look at, particularly, 19th century Britain when science was absolutely in the private sector, we have some of the best science," says Terence Kealey, a professor of clinical biochemistry at the University of Buckingham and a critic of government science funding. "It comes from the wealth of the rich. Charles Darwin was a rich person. Even [scientists] who had no money had access to rich men's money one way or another. The rich paid for science." Kealey points out that Britain's gross domestic product (GDP) per capita outpaced that of 19th-century France and Germany—both of which generously subsidized scientific research—indicating that the return on state subsidies in the form of economic growth was low. As America emerged as a superpower, its GDP per capita surpassed Britain's. "So the Industrial Revolution was British, and the second Industrial Revolution was American, and both were in the absence of the government funding of science," says Kealey. Thomas Edison's industrial lab produced huge breakthroughs in telecommunications and electrification. Alexander Graham Bell's lab produced modern telephony and sound recording, all without government money. The Wright Brothers—who ran a bicycle shop before revolutionizing aviation—made the first successful manned airplane flight in December 1903, beating out more experienced competitors like Samuel Langley, secretary of the Smithsonian Institution, who had received a grant from the War Department for his research. The notion that the government needs to accelerate scientific progress was based on America's experience during World War II, when federally funded research led to breakthroughs in rocketry, medicine, and radar. The Manhattan Project, which cost $27 billion in today's dollars, employed more than half a million people and culminated in the creation of the atomic bomb, harnessing the recently discovered process of nuclear fission. "Lobbyists took the Manhattan Project and said, 'Look what government funding of science can do,' and they then twisted it," says Kealey. He acknowledges that the government can accomplish discrete, "mission-based" scientific projects—like racing toward a bomb—but he argues that this is very different from the generalized state funding of "basic research" that followed. In November 1944, President Franklin D. Roosevelt sent a letter to Vannevar Bush, director of the U.S. Office of Scientific Research and Development during the war. Roosevelt instructed Bush to come up with a plan to make federal funding of scientific research permanent.
"It has been basic United States policy that government should foster the opening of new frontiers," wrote Bush in calling for the nationalization of basic science research. "It opened the seas to clipper ships and furnished land for pioneers." Bush's treatise eventually led to the creation of the National Science Foundation in 1950. But it was a stunning accomplishment from America's greatest rival that would supercharge the nationalization of science. Sputnik, the world's first manmade satellite, seemed to confirm fears that the Soviets, with their centrally planned economy, might eclipse the U.S. in scientific innovation and weapons technology. That turned out to be completely wrong. But in 1957, Americans were terrified. After Sputnik, the Eisenhower administration tripled the budget of the National Science Foundation, which would provide federal grants to universities and labs. If federal funding of science is counterproductive, as Kealey argues, what explains the success of Sputnik and the Manhattan Project? Of course, government funding has led to major breakthroughs both during and after World War II, such as the synthesis and mass production of penicillin during World War II (though it was accidentally discovered in a contaminated hospital lab in 1928), cancer immunotherapy, artificial heart valves, and the gene-editing technology CRISPR. But this has to be compared to what might have otherwise happened. Good economics takes into account not only the seen, but the unseen. What are the unseen innovations the world misses out on when governments set the research agenda? "If the government funds science, it actually takes the best scientists out of industry puts them in the universities, and then industry in fact suffers," says Kealey. After Sputnik, government money pushed basic science out of the private sector. By 1964, two-thirds of all research and development was paid for by the federal government. "If you were a tool maker in Ohio in 1964, and you wanted to invest in R&D to make better tools because you wanted the beat your competitors in Utah, you wrote a grant to the Department of Commerce," says Kealey. "That's how nationalized American science was … Eisenhower's warning is absolutely correct." In academic science, process often takes precedence over outcomes. Researchers are incentivized to publish peer-reviewed papers that garner citations, which helps them secure prestigious academic posts and more federal grants. "What happens under peer review under the government is that there's homogenization, and only one set of ideas is allowed to emerge," says Kealey. The pressure to publish has created a positivity bias, where an increasing number of papers supporting a hypothesis are published, while negative findings are often buried. One biotech company could confirm the scientific findings of only six out of 53 "landmark" cancer studies. Swedish researchers found that up to 70 percent of positive findings in certain brain imaging studies could be false. A team of researchers re-examined 100 psychology studies and successfully replicated only 39. "There is still more work to do to verify whether we know what we think we know," they concluded. In an influential 2005 paper, Stanford University professor John Ioannidis flatly concluded that "most published research findings are false." 
    He argued that the current peer review model encourages groupthink, writing that "prestigious investigators may suppress via the peer review process the appearance and dissemination of findings that refute their findings, thus condemning their field to perpetuate false dogma." "You end up with a monolithic view, and so you crush what's so important in science, which is different ideas competing in a marketplace of ideas," says Kealey. For decades, the federal government advised Americans to avoid saturated fat and prioritize carbohydrates based on the work of a researcher named Ancel Keys, who received substantial funding from the U.S. Public Health Service and the National Institutes of Health (NIH). Today, the debate that Keys suppressed rages on. "Ancel Keys said, 'I have the solution, it's all to do with fats,'" says Kealey. "And very quickly, you couldn't get grants from the American Heart Association unless you subscribed to Ancel Keys' theory of fat. Having captured this small little redoubt, he then moved to the [National Science Foundation], and then suddenly the whole world believed only one thing." More recently, Stanford's Jay Bhattacharya was attacked by the public health establishment for questioning the COVID-19 lockdowns. He told Reason there's an inherent conflict between the NIH director setting public health policy and doling out grant money. "If you have an NIH director that [sets policy and distributes money], they control the minds of so many scientists. It's an inherent conflict, and nobody's going to really speak. Nobody's going to disagree with them because that's the cash cow," says Bhattacharya, whom President Donald Trump appointed head of the NIH. His agency now faces a proposed 40 percent spending cut. But if Kealey is right, slashing science funding could, counterintuitively, accelerate medical innovation in the long run. "If these changes can be managed in such a way that these scientists can move from the NIH into the private sector without massive disruptions to all the work and research they're doing, that will be to the benefit of America," says Kealey. It would be similar to what happened in the early 1970s, when Congress slashed the Defense Advanced Research Projects Agency's budget in half, laying the groundwork for the rise of the computer age. "What happens to all those scientists? Well, they all go out to Silicon Valley, because they've all been made redundant … And they invent the modern world," says Kealey. "New frontiers of the mind are before us, and if they are pioneered with the same vision, boldness, and drive with which we have waged this war we can create a fuller and more fruitful employment and a fuller and more fruitful life," wrote Roosevelt in his letter to Bush. But maybe Roosevelt drew the wrong conclusions from the war. "Vision, boldness, and drive" can be found amongst the dreamers and tinkerers working in private laboratories, who are often too iconoclastic to be good candidates for government research grants but whose ideas simply work. "It's technology that k
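    A note on the arithmetic behind Ioannidis' conclusion: whether "most published research findings are false" comes down to the share of tested hypotheses that are true to begin with, the statistical power of the studies, and the false-positive threshold. The sketch below is not from his paper; it plugs purely illustrative numbers into the standard positive-predictive-value formula his argument builds on.

```python
# Illustrative only: positive predictive value (PPV) of a "statistically
# significant" finding, given assumed prior odds, power, and alpha.
def ppv(prior, power, alpha):
    """Share of significant results that reflect real effects."""
    true_positives = power * prior          # true hypotheses correctly detected
    false_positives = alpha * (1 - prior)   # false hypotheses passing p < alpha anyway
    return true_positives / (true_positives + false_positives)

# Assumed numbers: 1 in 10 tested hypotheses is true, studies run at 50% power,
# and the conventional 5% significance threshold.
print(round(ppv(prior=0.10, power=0.50, alpha=0.05), 2))  # 0.53
# With a weaker prior and lower power, most "positive" findings are false.
print(round(ppv(prior=0.05, power=0.30, alpha=0.05), 2))  # 0.24
```

    Publication bias of the kind described above, with positive findings published and negative findings buried, only pushes these figures lower.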

    13 min
  3. 04/08/2025 · VIDEO

    Milton Friedman's Warning to DOGE

    "Wise words," wrote Elon Musk about this 1999 viral clip described as "Milton Friedman casually giving the blueprint for DOGE [the Department of Government Efficiency]" as he ticks off a list of federal government agencies he'd be comfortable eliminating.  Musk is right. Friedman, a Nobel Prize–winning libertarian economist, did offer a solid blueprint for creating a smaller, less intrusive government. At the peak of his fame, he seemed poised to influence an American president to finally slash the federal bureaucracy. But those efforts ended in disappointment because they were blocked by what Friedman called the Iron Triangle of Politics. Slashing government waste and making the federal bureaucracy more accountable are incredibly important. But President Donald Trump and Musk are hitting the same wall President Ronald Reagan did more than four decades ago.  Now more than ever, it's time to pay attention to Milton Friedman's advice for how to defeat the tyranny of the status quo. In the 1980s, Friedman's influence reached deep into the halls of power. "Government is not the solution to our problem. Government is the problem," said President Reagan during his first inaugural address in January 1981.  Like Trump, Reagan was preceded in the White House by a big government liberal, who expanded the size of government and whose presidency was plagued by inflation. Reagan, who awarded Friedman the Presidential Medal of Freedom, promised to enact many of the libertarian policy ideas laid out in the 1980 bestseller co-authored with his wife Rose. "I don't think it's an exaggeration to call Milton Friedman's Free to Choose a survival kit for you, for our nation, and for freedom," Reagan said in an introduction to the television adaptation of Friedman's book. But for the most part, the Reagan Revolution failed to deliver on its libertarian promises. "Reagan's free market principles…clashed with…political reality…everywhere," wrote his former budget director David Stockman in his 1986 book The Triumph of Politics: Why the Reagan Revolution Failed. "For the Reagan Revolution to add up," he wrote, all the people "lured" by politicians into milking social services "had to be cut off." Reagan tried to keep his promises but, like most presidents, he was only partly successful. Reagan lifted price controls on oil, cut taxes, and pushed for deregulation. But his commitment to these initiatives quickly fizzled. Federal spending exploded, and he even left trade quotas in place for the automotive industry. The failure of the Reagan revolution inspired the Friedmans to write The Tyranny of the Status Quo, which examines the political obstacles that obstruct government cost cutting. Their insights are as relevant today as they were 41 years ago.  The book, which came out in 1984, pinpoints the Iron Triangle of Politics as the main obstacle to cutting government. The triangle's three points reinforce each other to uphold the status quo: the Beneficiaries, the Politicians, and the Bureaucrats. The "beneficiaries" are interest groups and connected industries that profit off of government programs at the expense of taxpayers. Today's beneficiaries include farmers who receive federal dollars. The new budget bill backed by the Republican Party would extend the Farm Bill, which subsidizes crop purchases. As Friedman said, the people paying the bill are "dispersed." 
    You might not have noticed your share of the $2.1 billion spent to prop up prices for corn, soybeans, wheat, and other crops when you paid your 2023 taxes, but the farmers who get that money certainly did. The "politicians" depicted on the triangle are supposed to be responsive to their constituents but end up serving interest groups instead. But it's the bureaucrats who actually distribute the money. They grow their power when politicians grow the size of their departments, which generates more spoils to distribute to the beneficiaries. It's a symbiotic relationship, all at taxpayer expense. Bureaucracy tends to "proceed by laws of its own," wrote Friedman, noting that in the half-century between Franklin Delano Roosevelt's New Deal and the Reagan Revolution, the U.S. population "didn't quite double but federal government employees multiplied almost fivefold." Musk has also observed that a metastasizing bureaucracy "proceeds by laws of its own," stating in a press conference from the Oval Office that "if the people cannot vote and have their will be decided by their elected representatives…then we don't live in a democracy, we live in a bureaucracy." And, like Friedman, he senses danger if the ballooning of the bureaucratic state isn't reversed. At another press conference, he told attendees that "the overall goal here with the DOGE team is to help address the enormous deficit….If this continues, the country will become de facto bankrupt." DOGE's strategy is to try breaking through the Iron Triangle by the force of a thousand cuts, looking for little inefficiencies with the mindset of a software engineer. Musk has described his role as "tech support," which is fairly accurate given that the Executive Order that created DOGE actually just rebranded an Obama-era agency called the U.S. Digital Service. It's a good start. The federal work force should be streamlined, and much of it even automated. But Musk might be repeating some mistakes of the Reagan years. As Stockman observed, the Reagan Revolution floundered because his team only focused on "easy solutions" like ferreting out "obscure tidbits of spending that could be excised without arousing massive political resistance," which "yielded savings that amounted to rounding errors in a trillion-dollar budget." To make real progress on cutting spending, the cost reduction must go deeper than tech support could manage on its own. Friedman knew that the path to shrinking the federal government began with abolishing federal agencies. In his viral clip, he lists the departments of Housing and Urban Development, Agriculture, Commerce, and Education as ones to put on the chopping block. Trump has already shut down the Department of Education…kind of. His executive order directs the Education secretary to draw up plans to eliminate the department or shift some of its spending to other departments. It keeps major spending like federal student loans intact, and a total dismantling will require Congress to act. The Trump administration has made severe cuts to the U.S. Agency for International Development (USAID), and it defunded the Consumer Financial Protection Bureau, the brainchild of Elizabeth Warren, which made access to credit and banking more difficult for low-income customers. DOGE also enticed 75,000 federal workers to resign. But many of these cost-cutting initiatives have been challenged in court. Truly eliminating federal agencies requires congressional action.
    Because Trump holds only slim congressional majorities and didn't win on a platform to slash government, he won't be able to eliminate entire federal departments like Commerce or Agriculture. But what would happen if the Trump administration really followed the Friedman blueprint, learned from the shortcomings of the Reagan Revolution, and created a political movement capable of pressuring Congress to finally start permanently eliminating entire agencies? Friedman says it would actually make the federal government function better by narrowly focusing it on providing what state governments and the private sector can't. "One function of government is to protect the country against foreign enemies—national defense," says Friedman. "A second function of government, and one which it performs very, very badly, is to protect the individual citizen against abuse and coercion by other citizens….I believe that the government performs that function very ineffectively because it's doing so many things that it has no business doing." Earlier this year, Musk wielded a chainsaw gifted to him by Argentina's libertarian president, Javier Milei, who more closely followed the Friedman blueprint by targeting the beneficiaries and the bureaucracy, which he calls "La Casta." In Argentina, it took massive poverty and triple-digit inflation to spark a real libertarian movement that now has a chance of overthrowing the tyranny of the status quo. We don't want to wait for things to get that bad. Musk praised Milei's approach at an event in Buenos Aires co-hosted by the libertarian Cato Institute. "I think governments around the world should be actively deleting regulations, questioning whether departments should exist," said Musk. "Obviously President Milei seems to be doing a fantastic job on this front." Fantastic indeed. But how can the Iron Triangle be overcome in the American system? DOGE itself can't legally delete entire departments. DOGE's website claimed $140 billion in cuts out of its $2 trillion goal as of early April 2025. But it hasn't provided full documentation, and various media and open source analyses have ballparked DOGE's total savings at somewhere in the $2 billion to $7 billion range. Either way, DOGE isn't anywhere close to reaching its goal of cutting $2 trillion in government spending, or almost 30 percent of the $7 trillion annual budget. The Congressional Budget Office found the deficit grew 5 percent in February compared to the previous year despite DOGE's early cuts. Meanwhile, the Republican majority passed a budget projected to add $3.4 trillion to our $28.8 trillion debt. And we haven't even talked about Social Security and Medicare, which are the major drivers of debt, and which Trump has promised not to touch. As Stockman came to realize, this is a bipartisan problem. "There isn't a difference [between the parties] when it comes to the debt," he said on an episode of Reason's Just Asking Questions. "How in the world can we keep adding $1 trillion to our pub

    16 min
  4. 02/24/2025 · VIDEO

    Why the Internet Celebrated a Killer

    "Deny," "defend," "depose"—these three words were allegedly written on bullets found at the murder scene of United Healthcare CEO Brian Thompson. The slogan began appearing in graffiti, highway banners, and T-shirts. When the identity of the likely killer was revealed to be a man named Luigi Mangione, he developed a passionate fanbase.  "So many men and women are going nuts over how good-looking this killer is," said Jimmy Kimmel in a breezy monologue joking about his writing staff's adulation of Mangione's physique. "Free Luigi!" exclaimed comedian Bill Burr on a later episode of Kimmel's show. How did a man who allegedly executed a married father of two at dawn on a New York City sidewalk become a hero? Those three bullets with words inscribed on them explain not just why the alleged killer did it, but why he received so much adulation. And it's not for the reasons most people think.  It seemed like the perfect American tragedy: A handsome valedictorian with a promising future suffers a back injury and a botched surgery, robbed of life's pleasures at his physical peak—no surfing, no travel, no sex. The personal became political, so the story goes, when the health insurance industry rejected Mangione's claim for treatment. "Deny, defend, depose" was likely a reference to a 15-year-old book by Jay Feinman exposing how health insurance companies don't pay routine claims.  "Frankly, these parasites simply had it coming," Mangione wrote in a manifesto.  His fans embraced him as "our shooter." The media made him a symbol of American rage towards a system that denies basic treatments with an eye toward the bottom line. Former Washington Post and New York Times reporter Taylor Lorenz defended the celebrations of Thompson's murder, writing that in a nation with "a barbaric healthcare system," where "the people at the top…rake in millions while inflicting pain, suffering, and death on millions of innocent people," "it's natural to wish" that people like Brian Thompson "suffer the same fate."  "I felt alongside so many other Americans, joy, unfortunately," Lorenz told an incredulous Piers Morgan when asked to describe her reaction to Thompson's murder, later clarifying that she felt, "maybe not joy, but certainly not empathy."  Forty-one percent of poll respondents under age 30 say the killing of Brian Thompson was acceptable. More young people polled admitted to viewing the killer favorably than unfavorably. But these poll numbers don't actually tell us very much about popular dissatisfaction with health insurance.  Most people under 30 are healthy and don't interact much with the health care industry. In fact, despite its problems, two-thirds of Americans say they are personally "satisfied" with their own insurance coverage.  Yet, the "delay, deny, and defend" inscribed on bullets do explain Mangione's popularity: Equating words with weapons is a reflection of how our culture increasingly treats language and violence as morally indistinguishable. I first encountered claims that speech equaled violence a decade ago as I interviewed college students about microaggressions, trigger warnings, and deplatforming mobs. One student expressed the view that "political change is hard to conceive of without violence…even taking human life." Today, most students approve of shouting down viewpoints they disagree with; almost half are okay with blocking access to speeches; and a third say violence is a justified response to hateful ideas.  
    These notions trace back to the 1960s and a group of intellectuals who were part of the "Frankfurt School." In a 1965 essay, the German-American philosopher Herbert Marcuse, who was once branded the father of the New Left, called into question the value of free speech in a "manipulated" society, arguing that we need to "reexamine…the traditional distinction between violent and non-violent action" and recognize a difference between "revolutionary and reactionary violence, between violence practiced by the oppressed and the oppressors." "People experience denied claims as an act of violence against them," Rep. Alexandria Ocasio-Cortez (D–N.Y.) said in a social media post addressing Thompson's murder. If words are violence—and denying a service is violence—then actual violence is justified as retribution. To celebrate the murder of Brian Thompson, one must first dehumanize him by transforming him from a three-dimensional human into a low-resolution symbol. But he was a real person with a family: a father of two and the son of a grain elevator operator who worked his way up the corporate ladder. Thompson reportedly rushed $135 billion to an emergency fund for health care providers, keeping "thousands of hospitals and other healthcare providers afloat during the pandemic." Meanwhile, the federal government struggled to find the money. Soon after, Thompson was promoted to CEO. He made a lot of money in that role but didn't start out with the same "privileges" as his accused murderer: an Ivy League son of wealthy parents who spent the months leading up to the murder bumming around with friends in Hawaii. There's also no evidence that Mangione was ever a United Healthcare client. He evidently received back surgery and recommended the procedure to others on Reddit, advising on strategies to get insurers to cover it. The truth is, neither of these men should be viewed as heroes or villains, or simplistic symbols in a memetic war. They're flawed humans who made consequential choices with varying levels of knowledge and virtue. And now Mangione, if proven guilty, will rightly face the consequences of those choices. Fortunately, political violence is still unpopular with the majority of Americans, but it only takes a small and determined minority to wreak havoc when left unchecked. During the Black Lives Matter protests of 2020, a mentality similar to the one that led to celebrating Thompson's murder generated support and excuses when neighborhoods were looted and bricks were tossed through store windows. Following the party's defeat in the 2024 election, some members of the Democratic Party, such as Rep. Seth Moulton (D–Mass.), are distancing themselves from the cancel-mob, anti-speech, deplatforming mindset because they realize that it's a losing strategy. "We have a wing of our party that shames us, that tries to cancel people who even bring up these difficult topics and shames voters," Moulton said in an interview defending his comments about trans athletes that had provoked a backlash in his party. Maybe it's possible to reverse the cultural drift that made the murder of Brian Thompson into something to celebrate. Combine moral zealotry with increasingly blurred lines between political speech and violence long enough, and the outcome is predictable: more violence.
    Photo credits: Antony Mayfield (CC BY 2.0), SEBASTIAN KAULITZKI/SCIENCE PHOTO LIBRARY/Newscom, Mykyta Starychenko, Anthony Behar/Sipa USA/Newscom, Curtis Means/UPI/Newscom, Stefan Krusche, Tomas Del Amo, Kelpfish, Cristina Matuozzi/Sipa USA/Newscom, Steve Sanchez/Sipa USA/Newscom, Marilyn Humphries/Newscom. Producer: John Osterhoudt. Graphics: Lex Villena.

    7 min
  5. 01/17/2025 · VIDEO

    Why Donald Trump Made a Deal To Free Ross Ulbricht

    Ross Ulbricht was arrested at 29. Now, he's 40. He faces a double life sentence plus 40 years with no possibility of parole for creating the Silk Road, a dark web drug marketplace that facilitated $1.2 billion in bitcoin-denominated transactions. "I'll spend the next few decades in this cage. Then, sometime later this century, I'll grow old and die. I'll finally leave prison, but I'll be in a body bag," he told an interviewer at a 2021 virtual blockchain conference. But a second chance might be coming for Ulbricht, from an unlikely savior. "If you vote for me, on day one, I will commute the sentence of Ross Ulbricht," Donald Trump told a crowd of attendees at the 2024 Libertarian National Convention. Trump made a deal with the Libertarian Party. And now, Ulbricht might not have to spend his middle and old age behind bars. He might not have to leave in a body bag if Trump makes good on his promise. Will he? In the coming months, career FBI officers and Department of Justice (DOJ) attorneys may dredge up lies about Ulbricht in hopes that Trump will change his mind. They may appeal to some of his draconian instincts. They may try to pin on Ulbricht some of the disastrous outcomes of the drug war. But Trump should ignore the saboteurs and keep his promise to free Ulbricht. Here's why: Ulbricht's arrest on October 1, 2013, at the Glen Park Branch of the San Francisco Public Library was like a scene from an action movie. As he was downloading an interview with Vince Gilligan, the creator of Breaking Bad, he was simultaneously administering the Silk Road in another browser window. Undercover FBI agents staged a physical fight behind him. When he turned his head to observe the commotion, another agent snatched his laptop before he could close the cover, which would have encrypted the contents of its hard drive. The FBI keeps a picture of the computer on display as a trophy from the hunt. But Ulbricht was no Walter White, the frustrated high school chemistry teacher who transforms into a violent drug kingpin in Gilligan's series, driven by his lust for power and retribution. Ulbricht was an Eagle Scout who studied physics at the University of Texas at Dallas on a full scholarship and later pursued graduate work in materials science and engineering. He was passionate about libertarian philosophy and Austrian economics, and read Ludwig von Mises and Murray Rothbard. He wrote on LinkedIn that he wanted to use economic theory "as a means to abolish the use of coercion and aggression amongst mankind" and create "an economic simulation to give people a first-hand experience of what it would be like to live in a world without the systemic use of force." "I was trying to do something good," said Ulbricht in a 2021 jailhouse interview. "I was trying to help us move forward." And he did. Ulbricht created an underground e-commerce website called the Silk Road. He was its first vendor, selling homegrown psilocybin mushrooms. The Silk Road became the eBay of drugs, with trusted sellers earning higher ratings, and the message boards filled with tips for safer drug use. It established an ethical code of conduct: No fake degrees, no child porn, no stolen goods. "Our basic rules are to treat others as you would wish to be treated and don't do anything to hurt or scam someone else." "I was trying to help us move toward a freer and more equitable world," said Ulbricht.
    At the same time, the Obama administration's Justice Department was pressuring banks and credit card companies to stop servicing gun shops, adult websites, and payday lenders, even though what they were doing was completely legal. The Silk Road demonstrated that, with bitcoin, you could buy things on the internet by circumventing payment rails that the government controlled. Online trade had become virtually unstoppable. "Back then, bitcoin made me feel like anything was possible," Ulbricht says in his jailhouse interview. Ulbricht became a hero to libertarians. But others say he got exactly what he deserved. "Life in prison without parole. Anybody else? Any other wise guys want to do it? That's what you'll get," gloated Bill O'Reilly on Fox News at the time of the sentencing. For his part, Ulbricht is remorseful and regretful, telling his interviewer from the jailhouse that "we all know the road to hell is paved with good intentions, and now here I am. I'm in hell." Does he deserve this fate? As Ulbricht became paranoid that he'd be caught, did he stray from his high-minded ideals and "break bad" like Walter White? And does that mean Trump should think twice before freeing him? Ulbricht's friends were shocked at his arrest. They described him as "sweet-natured," "loyal," and "guileless and nonaggressive." He comes across as poetic and sensitive in his artwork and in an online interview with a friend posted before his arrest, in which they each muse about their first loves and plans for the future. But many who oppose freeing Ulbricht say that he also ordered contract killings, pointing to uncharged allegations that he tried to hire hit men to take out digital bandits during his tenure at the Silk Road. But when you look closer, things get murky. Here's what we know: When building their case, prosecutors drew on chat logs from a moment of crisis at the Silk Road. One of the site's administrators, Curtis Green, had just been arrested. It looks like he might have stolen about $350,000 worth of bitcoin. "Nob," a participant on the Silk Road, was chatting with the site's top administrator, who called himself the "Dread Pirate Roberts." He told Nob, "This will be the first time I have had to call on my muscle," and asked that Green be "beat up, then forced to send the bitcoins he stole back." Later that day, the Dread Pirate Roberts messaged Nob again: Can you change the order to execute rather than torture? Nob sent the Dread Pirate Roberts pictures of what looked like Curtis Green being tortured and killed. It turns out that Nob was DEA agent Carl Force, one of two investigators on the case who went to prison for embezzling bitcoin during the investigation. He had staged Green's murder as part of a sting operation. But Ulbricht's defenders say that the Dread Pirate Roberts who was chatting with the corrupt undercover agent who set up a fake hit wasn't actually Ross Ulbricht. After all, the name, inspired by the film The Princess Bride, refers to a character inhabited over and over by different individuals through many generations. When the Dread Pirate Roberts granted an interview to Forbes two months before Ulbricht's arrest, he insisted that he was not the site's founder. "I didn't start the Silk Road, my predecessor did," he told the Forbes journalist, who pressed him on whether he had written the comments in the Silk Road forums. "The most I am willing to reveal is that I am not the first administrator of Silk Road," replied the Dread Pirate Roberts. The jury never saw this interview.
    Green, the man targeted for the fake hit, has said there were "multiple" people with access to the account, including himself. Green told Ulbricht's mother that he doesn't believe Ulbricht was the one who put out the hit and claims the undercover agent, Carl Force (aka Nob), also had access to the Dread Pirate Roberts' account. The Dread Pirate Roberts also ordered other hits, none of which led to any known murders. Federal prosecutors never charged Ulbricht with attempted murder, but the federal judge who sentenced him to two life sentences plus 40 years with no possibility of parole nonetheless referenced these episodes in her decision. She also said that Ulbricht should serve as a public example for acting as though he "was better than the laws of this country." Well, maybe he is. The drug war is the real villain in his story. It has cost $1 trillion while fueling decades of black market crime and violence—all as U.S. overdose deaths increased. Black markets often become violent because there are no legal mechanisms for contract enforcement, and their participants become desperate to avoid being caught, which is what happened to the Silk Road. It's ironic that Trump, the man who openly admires China's and Singapore's death penalty for drug dealers, might be the one to free the founder of the world's first major dark web drug market. But if there's one thing Trump likes, it's a good deal. One of his most audacious deals of the 2024 campaign was with the Libertarian Party, which never wins but sometimes covers the spread in close presidential elections. Trump showed up at their convention and proclaimed himself something of a Libertarian, and was met with a chorus of boos. But behind the scenes, something else was going on. Trump had struck an agreement with the Libertarian Party leadership: Don't spoil this election for me, and I'll free your boy Ross. Libertarian National Committee chair Angela McArdle told Reason that Trump's former Director of National Intelligence Richard Grenell set up a meeting between her and the president at Mar-a-Lago during the 2024 campaign. "Rick [Grenell] is like, 'Well, President Trump is a deal-maker,'" says McArdle. "'You ask for the world. You ask for whatever you think you can get, and it'll probably land in the middle. He likes making deals.' I was anxious [not] to ask too much and have Ross fall by the wayside. So I really hit it home like, 'We just want you to free Ross Ulbricht. Like, it's the most important thing.' And he said, 'I love freeing people. I'll do that.'" After the party's nominating convention, McArdle vowed that the party would only back its own candidate—Chase Oliver—in noncompetitive states favoring the Democrats. Trump won by thin margins in several swing states while the Libertarian Party finished with its lowest vote share in 16 years. "It was sort of this like a glorious miracle, the way things unfolded with Chase [Oliver]," says McArdle. "It couldn't ha

    16 min
  6. 09/26/2024 · VIDEO

    Three Mile Island Nearly Killed Nuclear. Now It's Coming Back.

    Is a nuclear renaissance about to begin on the very site of the public relations catastrophe that practically destroyed the industry 45 years ago? Constellation Energy recently announced a deal with Microsoft to restore a retired reactor on Pennsylvania's Three Mile Island. Microsoft has agreed to purchase energy from the plant for 20 years to power its AI data centers. A U.S. nuclear reactor has never before been brought out of retirement. Nuclear power was once considered the clean energy source of the future, with dozens of new plants coming online in the late '60s and early '70s. But in March of 1979, a partial meltdown occurred at Three Mile Island's nuclear plant. There were no casualties, and there was no lingering environmental damage. But the incident spooked the nation. From a publicity standpoint, the timing was disastrous—Three Mile Island occurred while The China Syndrome, a fictional account of safety cover-ups at a nuclear plant starring Jane Fonda, Jack Lemmon, and Michael Douglas, was still in theaters. "After Three Mile Island, what was considered to be the best interest of the public was just reducing risk to as low as possible," says Adam Stein, director of the Nuclear Energy Innovation Program at the Breakthrough Institute. "It resulted in a huge volume of regulations that anybody that wanted to build a new reactor had to know. It made the learning curve much steeper to even attempt to innovate in the industry." It was a public relations disaster for the nuclear industry, and the industry's expansion tapered off, culminating in a 20-year spell in which no new nuclear reactors were built in the U.S. "My view is that these supposedly environmentalist groups formed in the 1970s are not primarily pro-environment. They're really primarily anti-nuclear," says Eric Dawson, co-founder of Nuclear New York, a group fighting to protect the industry on the grounds that nuclear is "the most scalable, reliable, efficient, land-conserving, material-sparing, zero-emission source of energy ever created." He says that Three Mile Island empowered the antinuclear movement. The same year as the meltdown, about 200,000 antinuclear activists crowded into New York's Battery Park City, capping off a week-long concert featuring Pete Seeger, Jackson Browne, and Bonnie Raitt, which raised awareness and funding for the antinuclear movement. "Stopping atomic energy is practicing patriotism," Ralph Nader told the crowd. "Stopping atomic energy is fighting cancer; stopping atomic energy is fighting inflation." "They are a generation that was radicalized from the Vietnam War," says Dawson. "They became antiwar. They then became anti-nuclear weapons, and then they conflated nuclear weapons with nuclear energy. And they made it their mission to shut down nuclear energy." And they succeeded in that mission. Environmentalists, in effect, may have crippled the only truly viable form of clean energy. The federal government makes permitting arduous. Many states severely restrict new plant construction and force operational ones to shut down prematurely. A striking recent example was the shutdown of Indian Point Energy Center, New York state's largest nuclear plant. Antinuclear activists had targeted the plant. Their cause gained significant traction with the support of New York State Attorney General—and future governor—Andrew Cuomo, who believed the nuclear plant was "risky." Of course, it is true that nuclear energy carries risk. So does every form of power generation.
"If you look at energy sources, there's nothing that's perfect. There is no utopia.  basically we have a choice. Everything is compared to something else," says Dawson. Decades of political attacks on the nuclear industry have caused the United States to rely more on burning fossil fuels, which brings another set of risks. "[Nuclear] would eliminate the majority of pollution-related fatalities in the US, which is thousands a year, because most of those come from coal-fired power plants," says Stein.  As politicians have slowly realized that the dangers of nuclear power may have been exaggerated by activists, and the benefits of a reliable emissions-free energy source underappreciated, the regulatory landscape has slowly changed. The first new U.S. reactor built from scratch since 1974 opened in Georgia in 2022—albeit at a very high cost. The federal government issued its first ever approval for a small modular reactor in January 2023. Constellation estimates that it will spend about $1.6 billion to bring the Three Mile Island reactor online by 2028 and will seek to renew the operating license through 2054. Pennsylvania's governor Josh Shapiro wrote a letter to federal regulators asking that the application be fast-tracked. Microsoft's VP of energy calls the deal "a major milestone" in the company's effort to "decarbonize the grid" while pursuing an AI-driven future that's going to require a lot of energy. The Microsoft deal is the latest piece of evidence that nuclear energy—after being hampered by decades of hyper-cautious regulation—is poised for a comeback. Three Mile Island could one day become a symbol for nuclear's rebirth.   Photo Credits: RICHARD B. LEVINE/Newscom; FRANCES M. ROBERTS/Newscom; Paul Souders / DanitaDelimont.com / Danita Delimont Photography/Newscom; LAURENCE KESTERSON/KRT/Newscom;  Robert J. Polett/Newscom; Dick Darrell/Toronto Star/ZUMA Press/Newscom; St Petersburg Times/ZUMAPRESS/Newscom; Library of Congress/Bernard Gotfryd; Jmnbqb, CC BY-SA 4.0 DEED, via Wikimedia Commons; Meghan McCarthy/ZUMA Press/Newscom; Erik Mcgregor/ZUMA Press/Newscom; Joe Sohm/Visions of America/Joseph Sohm/Universal Images Group/Newscom; Reginald Mathalone/ZUMAPRESS/Newscom; Bastiaan Slabbers/ZUMAPRESS/Newscom; Anthony Behar/Sipa USA/Newscom; */Kyodo/Newscom; Pacific Press/Sipa USA/Newscom; Paul Hennessy/ZUMAPRESS/Newscom; Michael Siluk/UCG/Universal Images Group/Newscom; KEVIN DIETSCH/UPI/Newscom; Reginald Mathalone/ZUMAPRESS/Newscom; ROGER L. WOLLENBERG/UPI/Newscom Music Credits: "Bubbles Drop" by Cosmonkey via Artlist; "Paper or Plastic" by Bubblz via Artlist; "Digital Abyss" by Stephen Keech via Artlist; "Expand" by Theatre of Delays via Artlist; "Monomer" by Leroy Wild via Artlist; "Behind the City" by Ziv Moran via Artlist; "Fantasma" by Omri Smadar via Artlist Video Editor: Danielle ThompsonAudio Production: Ian KeyserGraphics: Adani SamatThe post Three Mile Island Nearly Killed Nuclear. Now It's Coming Back. appeared first on Reason.com.

    8 min
  7. 05/14/2024 · VIDEO

    A New Law Is Making It Even Harder To Find Day Care in D.C.

    Average toddler day care costs in Washington, D.C., exceed $24,000 a year, outstripping expenses in cities like New York and San Francisco. Despite the steep prices, parents such as Megan McCune and Tom Shonosky, who live in a suburban D.C. neighborhood with their children John and Lizzy, believe day care is still worth it. "They're doing these amazing activities with kids. John's last teacher was planning just all these really stimulating, exciting experiences," McCune says. "That's just not something that we can feasibly do and also have full-time jobs." But day care might soon become a luxury the couple can no longer afford. In 2016, the District passed a regulation mandating that day care workers obtain a college degree. The city's logic is straightforward: If D.C.'s day care staff had college degrees, they could do a better job helping disadvantaged kids climb out of poverty. "The developmental opportunities and those early opportunities that they have really set the foundation for their potential success long term," explained local education official Elizabeth Groginsky, a proponent of the regulation. After a delay, the rule was finally implemented in December 2023. Yet contrary to its intended benefits, this regulation could lead to job losses among day care workers, increased operating costs for day cares, and higher tuition for parents. Ami Bawa, lead teacher and assistant director at a nursery school in northwest D.C., exemplifies the unintended consequences of the regulation. Although she has been working in the field for over 20 years, Bawa may now be forced out of her job. "Even though I have a lot of experiential learning, I don't meet what is now the current standard," she explains. As a veteran teacher, Bawa is technically eligible to apply for a waiver to continue working, but she's been waiting for five months for a response from the city. "All of these roadblocks make it harder. We're going to lose a lot of really good teachers," Bawa says. Proponents argue that the regulation will earn teachers more respect and higher salaries. But Bawa disagrees: "A profession like teaching specifically has to be one where you really care for and love what you're doing. What your education credential is doesn't equate to loving and being committed to the field." The regulation "makes us feel like we're interchangeable, like anybody could do this job, when that really is not the case." In addition, the college requirement complicates the process for day cares to find qualified staff. McCune explains, "It's going to be the smaller day cares, the more affordable day cares that are going to suffer because they're not going to be able to attract talent or retain it, and they're not going to be able to put their prices to the level that they need to be to cover that talent, because people like us aren't going to be able to pay it." In 2018, the libertarian-leaning public-interest law firm the Institute for Justice sued the D.C. government to overturn the education requirements, claiming they interfere with the right to earn a living. But the courts ruled in favor of the city on the grounds that the requirement was reasonable. Yet the effectiveness of college requirements remains a subject of debate. As Robert Pianta, a professor of early childhood education at the University of Virginia, points out, "The evidence for a two-year degree or a four-year degree is not strong."
There are over 3,000 early childhood degree programs across the United States, and they vary significantly in terms of what they teach and focus on. "With all that variation under there, it's no surprise to anyone that the degree itself doesn't matter," Pianta says. Many day care teachers eager to retain their jobs have enrolled part-time at institutions such as Trinity Washington University, a small college in the district. To earn the degree required to be an assistant teacher at a D.C. day care, students at Trinity can take classes like American history and music appreciation but aren't required to take courses in early education. Councilmember Christina Henderson supports the idea that day care workers should study subjects unrelated to early education, emphasizing the importance of "critical thinking and learning." In contrast, McCune remarks, "Let's just back up a little and remember that these are babies….I think the needs of children at that stage, they're pretty primal." Nicole Page, a local preschool director, believes that "it does not only take education, it takes experience" to work at a day care. "That's what we will lose if we are not able to retain our staff, is the wealth of knowledge that they have by hands-on experience." Her preschool is at risk of losing valuable staff, with at least 11 teachers failing to meet the new qualifications. One teacher even has a Ph.D. in family and children studies and is an adjunct professor teaching a policy and advocacy course for early childhood education at a local university, but she's no longer qualified to teach at a day care because her degree isn't in early childhood education. "If we are not able to retain the staff that we have, we may end up having to close some of our classrooms," Page explains. This regulation, intended to improve child care quality, may instead harm those it aims to assist. "I just think in D.C., there's a lot of bureaucracy," says Shonosky. "This is just another case where bureaucracy is going to make our lives worse." Music Credits: "Pizzi Waltz" by Kadir Demir, via Artlist; "Against the Clock" by Rhythm Scott, via Artlist; "The Morning Lights" by Francesco DAndrea, via Artlist; "Sophisticated Nostalgia" by Nobou, via Artlist; "Deep Dive" by Ty Simon, via Artlist; "The Isle" by Rhythm Scott, via Artlist; "Grey Shadow" by ANBR, via Artlist; "Currents" by Ardie Son, via Artlist. Photo Credits: Caroline Brehman/CQ Roll Call/Newscom. Audio Production: Ian Keyser; Graphics: Adani Samat; Writer: Katarina Hall. The post A New Law Is Making It Even Harder To Find Day Care in D.C. appeared first on Reason.com.

    11 min
  8. 05/07/2024 · VIDEO

    Academics Use Imaginary Data in Their Research

    After surviving a disastrous congressional hearing, Claudine Gay was forced to resign as the president of Harvard for repeatedly copying and pasting language used by other scholars and passing it off as her own. She's hardly alone among elite academics, and plagiarism has become a roiling scandal in academia. There's another common practice among professional researchers that should be generating even more outrage: making up data. I'm not talking about explicit fraud, which also happens way too often, but about openly inserting fictional data into a supposedly objective analysis. Instead of doing the hard work of gathering data to test hypotheses, researchers take the easy path of generating numbers to support their preconceptions or to claim statistical significance. They cloak this practice in fancy-sounding words like "imputation," "ecological inference," "contextualization," and "synthetic control." They're actually just making stuff up. Claudine Gay was accused of plagiarizing sections of her Ph.D. thesis, for which she was awarded Harvard's Toppan Prize for the best dissertation in political science. She has since requested three corrections. More outrageous is that she wrote a paper on white voter participation without having any data on white voter participation. In an article in the American Political Science Review that was based on her dissertation, Gay set out to investigate "the link between black congressional representation and political engagement," finding that "the election of blacks to Congress negatively affects white political involvement and only rarely increases political engagement among African Americans." To arrive at that finding, you might assume that Gay had done the hard work of measuring white and black voting patterns in the districts she was studying. You would assume wrong. Instead, Gay used regression analysis to estimate white voting patterns. She analyzed 10 districts with black representatives and observed that those with more voting-age whites had lower turnout at the polls than her model predicted. So she concluded that whites must be the ones not voting. She committed what in statistics is known as the "ecological fallacy": you see two things occurring in the same place and assume a causal relationship. For example, you notice a lot of people dying in hospitals, so you assume hospitals kill people. The classic example is that Jim Crow laws were strictest in the states with the largest black populations; ecological inference would lead to the false conclusion that black residents supported Jim Crow. Gay's theory that a black congressional representative depresses white voter turnout could be true, but there are other plausible explanations for what she observed. The point is that we don't know. The way to investigate white voter turnout is to measure white voter turnout. Gay is hardly the only culprit. Because she was the president of Harvard, it's worth making an example of her work, but it reflects broad trends in academia. Unlike plagiarism, which is treated as an academic crime, inventing data under the guise of statistical sophistication is something students are taught and encouraged to do. Academia values the appearance of truth over actual truth. You need real data to understand the world. The process of gathering real data also leads to essential insights. Researchers pick up on subtleties that often cause them to shift their hypotheses. Armchair investigators, on the other hand, build neat rows and columns that don't say anything about what's happening outside their windows.
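To see why district-level numbers can't settle who stayed home, here is a minimal sketch in Python, using invented figures rather than the data from Gay's study, in which two very different individual-level stories produce exactly the same aggregate turnout pattern:

```python
# Toy illustration of the ecological-inference problem (invented numbers,
# not the data from any real study). Two different individual-level stories
# produce exactly the same district-level turnout figures.

white_share = [0.2, 0.4, 0.6]  # hypothetical share of voting-age whites per district

def district_turnout(white_rate, black_rate, w):
    """Overall turnout when whites vote at white_rate and blacks at black_rate."""
    return w * white_rate + (1 - w) * black_rate

# Story 1: whites vote at 40 percent everywhere, blacks at 60 percent everywhere.
story1 = [district_turnout(0.40, 0.60, w) for w in white_share]

# Story 2: whites vote at 60 percent everywhere; black turnout varies by district.
black_rates = [0.55, 0.28 / 0.6, 0.30]
story2 = [district_turnout(0.60, b, w) for b, w in zip(black_rates, white_share)]

for w, t1, t2 in zip(white_share, story1, story2):
    print(f"white share {w:.0%}: turnout {t1:.2f} (story 1) vs {t2:.2f} (story 2)")
# Both stories print 0.56, 0.52, 0.48: turnout falls as the white share rises,
# yet in one story whites are the non-voters and in the other they are not.
```

Both stories show turnout falling as the white share rises; only individual-level data could tell you which one actually happened.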
Another technique for generating rather than collecting data is called "imputation," which was used in a paper titled "Green innovations and patents in OECD countries" by economists Almas Heshmati and Mike Tsionas. The authors wanted to analyze the number of "green" patents issued by different countries in different years. But the authors only had data for some countries and some years. "Imputation" means filling in data gaps with educated guesses. It can be defensible if you have a good basis for your guesses and they don't affect your conclusions strongly. For example, you can usually guess gender based on a person's name. But if you're studying the number of green patents, and that number is exactly what you don't know, imputation isn't an appropriate tool for solving the problem. The use of imputation allowed them to publish a paper arguing that environmentalist policies lead to innovation, which is likely the conclusion they had hoped for, and to do so with enough statistical significance to pass muster with journal editors. A graduate student in economics working with the same data as Heshmati and Tsionas recounted being "dumbstruck" after reading their paper. The student, who wants to remain anonymous for career reasons, reached out to Heshmati to find out how he and Tsionas had filled in the data gaps. The research accountability site Retraction Watch reported that they had used the Excel "autofill" function. According to an analysis by the economist Gary Smith, altogether there were over 2,000 fictional data points, amounting to 13 percent of all the data used in the paper. The Excel autofill function is a lot of fun and genuinely handy in some situations. When you enter 1, 2, 3, it guesses 4. But it doesn't work when the data, like much of reality, have no simple or predictable pattern. When you give Excel a list of U.S. presidents, it can't predict the next one. I did give it a try, though. Why did Excel think that William Henry Harrison would retake the White House in 1941? Harrison died in office just 31 days after his inauguration, in 1841. Most likely, autofill figured it was only fair that he be allowed to serve out his remaining years. Why did it pick 1941? That's when FDR began his third term, which apparently Excel considered to be illegitimate, so it exhumed Harrison and put him back in the White House.
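The mechanics are easy to reproduce. Here is a minimal sketch in Python, with invented patent counts and a simple linear fill in the spirit of a spreadsheet autofill, not the authors' actual series or spreadsheet, showing how extending the observed pattern over missing years manufactures a trend the data never established:

```python
# Minimal sketch of gap-filling by linear extrapolation, in the spirit of a
# spreadsheet autofill (invented patent counts, not the paper's data).

years   = [2010, 2011, 2012, 2013, 2014, 2015]
patents = [  12,   15,   18, None, None, None]   # hypothetical green-patent counts

# Extend the straight line through the last two observed points, which is
# roughly what dragging a spreadsheet's fill handle over a numeric series does.
known = [(y, p) for y, p in zip(years, patents) if p is not None]
(y1, p1), (y2, p2) = known[-2], known[-1]
slope = (p2 - p1) / (y2 - y1)

filled = [p if p is not None else p2 + slope * (y - y2)
          for y, p in zip(years, patents)]
print(filled)   # [12, 15, 18, 21.0, 24.0, 27.0]

# The "imputed" values continue the observed pattern by construction, so any
# trend a regression later finds in the filled-in years was put there by the
# fill rule, not by anything the researchers measured.
```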
In a paper published in the Journal of the American Medical Association and written up by CNN and the New York Post, a team of academics claimed to show that age-adjusted death rates soared 106 percent during the pandemic among renters who had received eviction filing notices, compared to 25 percent for a control group. The authors gathered 483,408 eviction filings and asked the U.S. Census Bureau how many of those tenants had died. The answer was that 0.3 percent had died and 58 percent were still alive. The status of about 42 percent was unknown, usually because the tenant had moved without filing a change of address. If the authors had assumed that all the unknowns were still alive, the COVID-era mortality increase would be 22 percent for tenants who got eviction notices versus 25 percent for those who didn't. This would have been a statistically insignificant finding, wouldn't have been publishable, and certainly wouldn't have gotten any press attention. Some of the tenants that the Census couldn't find probably did die, though likely not many, since most dead people end up with death certificates, and people who are dead can't move, so you'd expect most of them to be linked to their census addresses. But some might move or change their names and then die, or perhaps they were missing from the Census database before receiving an eviction notice. Whatever the reality, the authors didn't have the data. The entire result of their paper (the claimed 106 percent increase in mortality for renters with eviction filings versus the 22 percent observed rate) comes from a guess about how many of the unknown tenants had died. How did they guess? They made the wildly implausible assumption that the Census and the Social Security Administration are equally likely to lose track of a dead person and a living one. Yet the government is far more interested in when people die than when they move, especially because it doesn't want to keep cutting them Social Security checks. Also, dead people don't move or change their names. Whether or not their assumption was plausible, the paper reported a guess as if it reflected objective data. That's considered acceptable in academia, but it shouldn't be. Another paper, titled "Association Between Connecticut's Permit-to-Purchase Handgun Law and Homicides," was published in the American Journal of Public Health. It cooked up data to use as a control. The study claimed to show that a 1994 gun control law passed in Connecticut cut firearm homicides by 40 percent. But firearm homicide rates in Connecticut followed national trends, with no obvious change after the 1994 law. Forty percent compared to what? The authors arrived at their conclusion by concocting an imaginary state to serve as the control group, combining numbers from California, Maryland, Nevada, New Hampshire, and Rhode Island. This fictional state had 40 percent more homicides than the real Connecticut. Reality is too messy for a technique like this to tell us anything meaningful. The authors' entire finding derived from the fact that Rhode Island, which comprised most of "synthetic Connecticut," experienced a temporary spate of about 20 extra murders from 1999 to 2003, a large percentage increase in such a small state. Since the temporary spike in murders wasn't the result of a change in gun control policy, it tells us little about the efficacy of Connecticut's 1994 law or the policy issue at hand. Is it always wrong to guess about missing data? No. It can be reasonable under conditions of extreme uncertainty, when data can't be collected before a decision has to be made. For example, if you're considering taking a potentially life-saving medicine that hasn't been properly studied, you make the best guess you can with the information you have. But guesses made under those conditions shouldn't shape public policy and aren't worthy of publication. Yet researchers routinely rely on these methods to generate results on matters of no great urgency, because in academia, publishing matters.
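For readers who want to see the machinery, here is a minimal sketch of the general synthetic-control idea in Python, using invented homicide rates, only two donor states, and a coarse grid search rather than the published study's data or its actual estimator:

```python
import numpy as np

# Minimal sketch of the synthetic-control idea (invented homicide rates per
# 100,000, not the published study's data or its exact fitting procedure).
pre  = {"treated": np.array([4.0, 4.2, 3.9, 4.1]),
        "donor_a": np.array([5.0, 5.3, 4.9, 5.1]),
        "donor_b": np.array([3.1, 3.2, 3.0, 3.2])}
post = {"treated": np.array([3.8, 3.6, 3.5]),
        "donor_a": np.array([5.2, 5.4, 5.5]),
        "donor_b": np.array([3.3, 3.4, 3.5])}

def pre_fit_error(w):
    """Squared error between the treated state and a w/(1-w) donor blend, pre-law."""
    synthetic = w * pre["donor_a"] + (1 - w) * pre["donor_b"]
    return float(np.sum((synthetic - pre["treated"]) ** 2))

# Coarse grid search over convex weights on the two donors.
best_w = min(np.linspace(0, 1, 101), key=pre_fit_error)

synthetic_post = best_w * post["donor_a"] + (1 - best_w) * post["donor_b"]
gap = post["treated"] - synthetic_post
print(f"weight on donor_a: {best_w:.2f}")
print("post-law gap (treated minus synthetic):", gap.round(2))
```

The estimated "effect" is just that post-period gap between the treated state and the weighted donors, which is why a temporary spike in a heavily weighted donor state registers as an effect of the law even if nothing changed in the treated state.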

    9 min
