Let's Know Things

A calm, non-shouty, non-polemical, weekly news analysis podcast for folks of all stripes and leanings who want to know more about what's happening in the world around them. Hosted by analytic journalist Colin Wright since 2016. letsknowthings.substack.com

  1. Sports Betting

    1D AGO

    Sports Betting

    This week we talk about prediction markets, incentives, and gambling addiction. We also discuss insider trading, spot-fixing, and Gatorade.

Recommended Book: The Kingdom, the Power, and the Glory by Tim Alberta

Transcript

Prediction markets are hundreds of years old, and have historically been used to determine the likelihood of something happening. In 1503, for instance, there was a market to determine who would become the next pope, and from the earliest days of commercial markets, there were associated prediction markets that were used to gauge how folks thought a given business would do during an upcoming economic quarter. The theory here is that while you can just ask people how well they think a political candidate will fare in an election or who they think will become the next pope, often their guesses, their assumptions, or their analysis will be swayed by things like political affiliation or maybe even what they think they’re meant to say—the popular papal candidate, for instance, or the non-obvious, asymmetric position on a big commercial enterprise that might help an analyst reinforce their brand as a contrarian. If you introduce money into the equation, though, forcing people to put down real currency on their suspicions and predictions, and give them the chance to earn money if they get things right, that will sometimes nudge these markets away from those other incentives, making the markets commercial enterprises of their own. It can shift the bias away from posturing and toward monetization, and that in turn, in theory at least, should make prediction markets more accurate because people will try to align themselves with the actual, real-deal outcome, rather than the popular—with their social tribe, at least—or compellingly unpopular view.
This is the theory that underpins entities like Polymarket, Kalshi, Manifold Markets, and many other online prediction markets that have arisen over the past handful of years as regulations on these types of businesses have been eased, and as they’ve begun to establish themselves as credible players in the predicting-everything space. In politics in particular, these markets have semi-regularly shown themselves to be better gauges of who will actually win elections than conventional polls and surveys, and though their records are far from perfect and still heavily biased in some cases, such community-driven predictions from money-motivated markets are gaining credibility because of their capacity to incentivize people to put their money where their mouths are, and to try to profit from accurate prediction. The flip-side of these markets, and some might even say a built-in flaw with no obvious solution, is that they are rife with insider trading: people who are in the position to know things ahead of time making, in some cases, millions of dollars by placing big bets that, for them, aren’t bets at all, because they know what will or what is likely to happen. This seems to have occurred at least a few times with big political events in 2025, and it’s anticipated that it could become an even bigger issue in the future, especially for markets that use cryptocurrencies to manage payments, as those are even less likely than their fiat currency peers to keep solid tabs on who’s actually behind these bets, and thus who might be trading on knowledge that they’re not supposed to be trading on. That said, it could be argued that such insider trading makes these markets even more accurate, eventually at least.
And that points us toward another problem: the possibility that someone on the inside might look at a market and realize they can make a killing if they use their position, their power, to sway these markets after placing a bet, giving them the ability to assure a payout by abusing that position—major events being influenced by the possibility of a community-funded payday for those in control. What I’d like to talk about today is the same general principle as it’s playing out in the sports world, and why the huge sums of money that are now sloshing around in the sports betting industry in the US are beginning to worry basically everyone, except the sports betting companies themselves. — In October of 2025, Chauncey Billups, head coach of the NBA’s Portland Trail Blazers, Miami Heat player Terry Rozier, former NBA player Damon Jones, and about 30 other people were arrested by the FBI due to their alleged illegal sports gambling activities. Rozier was already under investigation following unusual betting activity that was linked to his performance in a 2023 game—he was later cleared of wrongdoing, but the implication then and in this more recent instance is that he and those other folks who were rounded up by the FBI may have been involved in rigging things so they could get a big payoff on gambling markets.
Similar things have been happening across the sports world, including a lifetime ban for Jontay Porter, a former Toronto Raptors player, who apparently gave confidential information to people who were placing bets on NBA games—he later pleaded guilty to conspiracy to commit wire fraud as a result of that investigation—and in November of 2025 two Major League Baseball players, both of them pitchers for the Cleveland Guardians, Emmanuel Clase and Luis Ortiz, were charged by federal prosecutors with allegedly rigging pitches to benefit people betting on those pitches; they’ve been charged with wire fraud and money laundering, and each could face up to 65 years in prison. And those are just a few of the many instances of game-rigging that have been alleged in recent years. The specifics vary, but the outcome is always to give someone an advantage in these markets, which only recently became broadly legal across the United States, and which thus allow folks with the right connections or some money to invest ahead of time to, for instance, pay a pitcher to throw an inning, or pay a coach to tell them who will be benched and when, so that they can make a big wager with less risk, or in some cases, no risk at all. One of the big issues here is that rather than simply being a which-team-will-win sort of thing, many of these bets are highly specific and granular, including what are called proposition or prop bets, which allow folks to gamble on the number of strikeouts a pitcher will tally in a given inning and other very specific things. If a pitcher were to then place a bet, perhaps through an intermediary, on their own prop bet-related performance, they would stand a decent chance of tallying the right number of strikes and balls.
They could also sell that information to someone else, taking a guaranteed payout in exchange for the foreknowledge they grant that gambler, who could then do what they want with the information—and if they do well with it, they could pay that pitcher to do the same again in the future. This practice is called spot-fixing, and it’s seen across prediction markets, not just sports markets. Pitchers can fix an inning of a game, but poker players can also go all-in or fold a given number of times in a tournament, and the folks in charge of dumping Gatorade over the winning coach following the Super Bowl can leak the color of that Gatorade, based on their foreknowledge of the setup, to gamblers—these markets are sprawling and varied, and anyone in a position of power who can make decisions about such things, or who’s involved enough to leak information, can do so at a profit, either putting down money themselves on spot-fixed prop bets, or selling that information to those who will place a bet. The issue sports organizations in the US are now running into is that they aligned themselves with sports gambling entities like DraftKings and FanDuel after these platforms were legalized in more states following the striking-down of a federal ban on such things in 2018—as I record this, they’re currently legal in 31 states, alongside Washington DC and Puerto Rico—and they’ve profited a fair bit from that alignment, allowing these businesses to become sponsors, to slap their logos on everything, and to generally become interwoven with the leagues themselves. But in doing so, they’ve also created a sports culture in which betting is ultra-common, and that means fans are no longer just fans: they’re putting down money on various possible sports-related outcomes. And that means folks who were maybe previously die-hard fans of their local team may no longer just be disappointed when their team loses—they’ll be financially impacted, perhaps even devastated.
And many athletes who play on these teams, in these leagues, are now suffering all kinds of abuse and threats from people who put a lot of money on their performance, only for those athletes to lose a game, or fail to throw the exact right number of strikes and balls in a given inning. This points at two big issues with sports betting in the US right now. First, there’s a lot of money splashing around in this space. An estimated $160-170 billion was wagered by US citizens in 2025 alone, generating about $16.4 billion in revenue for sportsbooks—the entities that take these sorts of bets. That’s likely a significant undercount, too, as more generalist prediction markets are also getting involved in the sports betting game, blending this type of gambling with other sorts of prediction markets, like those related to politics and international happenings, like war. And second, a lot of people are gambling a lot of money on sports stuff right now, and that’s becoming an issue. In October of 2025, a Pew Research poll found that 43% of US adults think legalized sports betting is bad for society, up from 34% in 2022, and 40% say it’s bad for sports, up from 33%. A whopping 22% of US adults say they personally bet money on sports in the past year, up from 19% in 2022, and 10%, one in ten American adult

    16 min
  2. Data Center Politics

    12/23/2025

    Data Center Politics

    This week we talk about energy consumption, pollution, and bipartisan issues. We also discuss local politics, data center costs, and the Magnificent 7 tech companies.

Recommended Book: Against the Machine by Paul Kingsnorth

Transcript

In 2024, the International Energy Agency estimated that data centers consumed about 1.5% of all electricity generated, globally, that year. It went on to project that energy consumption by data centers could double by 2030, though other estimates are higher, due to the ballooning of investment in AI-focused data centers by some of the world’s largest tech companies. There are all sorts of data centers that serve all kinds of purposes, and they’ve been around since the mid-20th century, since the development of general purpose digital computers, like the 1945 Electronic Numerical Integrator and Computer, or ENIAC, which was programmable and reprogrammable, and used to study, among other things, the feasibility of thermonuclear weapons. ENIAC was built on the campus of the University of Pennsylvania and cost just shy of $500,000, which in today’s money would be around $7 million. It was able to do calculations about a thousand times faster than the electro-mechanical calculators that were available at the time, and was thus considered to be a pretty big deal, making some types of calculation that were previously not feasible, not only feasible, but casually accomplishable. This general model of building big old computers at a central location was the way of things, on a practical level, until the dawn of personal computers in the 1980s. The mainframe-terminal setup that dominated until then necessitated that the huge, cumbersome computing hardware was all located in a big room somewhere, and the terminal devices were points of access that allowed people to tap into those centralized resources.
Microcomputers of the sort a person might have in their home changed that dynamic, but the dawn of the internet reintroduced something similar, allowing folks to have a computer at home or at their desk, which has its own resources, but to then tap into other microcomputers, and still other larger, more powerful computers, across internet connections. Going on the web and visiting a website is basically just that: connecting to another computer somewhere, that distant device storing the website data on its hard drive and sending the results to your probably less-powerful device, at home or work. In the late ’90s and early 2000s, this dynamic evolved still further, those far-off machines doing more and more heavy lifting to create more and more sophisticated online experiences. This manifested as websites that were malleable and editable by the end-user—part of the so-called Web 2.0 experience, which allowed for comments and chat rooms and the uploading of images to those sites, based at those far-off machines—and then, as streaming video and music and proto-versions of social networks became a thing, these channels connecting personal devices to more powerful, far-off devices needed more bandwidth, because more and more work was being done by those powerful, centrally located computers, so that the results could be distributed via the internet to all those personal computers and, increasingly, other devices like phones and tablets. Modern data centers do a lot of the same work as those earlier iterations, though increasingly they do a whole lot more heavy-lifting labor, as well.
They’ve got hardware capable of, for instance, playing the most high-end video games at the highest settings, and then sending, frame by frame, the output of said video games to a weaker device, someone’s phone or comparably low-end computer, at home, allowing the user of those weaker devices to play those games, their keyboard or controller inputs sent to the data center fast enough that they can control what’s happening and see the result on their own screen in less than the blink of an eye. This is also what allows folks to store backups on cloud servers, big hard drives located in such facilities, and it’s what allows the current AI boom to function—all the expensive computers and their high-end chips located at enormous data centers with sophisticated cooling systems and high-throughput cables that allow folks around the world to tap into their AI models, interact with them, have them do heavy-lifting for them, and then those computers at these data centers send all that information back out into the world, to their devices, even if those devices are underpowered and could never do that same kind of work on their own. What I’d like to talk about today are data centers, the enormous boom in their construction, and how these things are becoming a surprise hot-button political issue pretty much everywhere. — As of early 2024, the US was host to nearly 5,400 data centers sprawled across the country. That’s more than any other nation, and that number is growing quickly as those aforementioned enormous tech companies splurge on more and more of them. That includes the Magnificent 7 tech companies—Nvidia, Apple, Alphabet, Microsoft, Amazon, Meta, and Tesla—which have a combined market cap of about $21.7 trillion as of mid-December 2025; that’s about two-thirds of the US’s total GDP for the year, and more than the European Union’s total GDP, which weighs in at around $19.4 trillion, as of October 2025.
These aren’t the only companies building data centers at breakneck speed—there are quite a few competitors in China doing the same, for instance—but they’re putting up the lion’s share of resources for this sort of infrastructure right now, in part because they anticipate a whole lot of near-future demand for AI services, and those services require just a silly amount of processing power, which itself requires a silly amount of monetary investment and electricity. But it’s also because, first, there aren’t a lot of moats, meaning protective, defensive assets, in this industry, as is evidenced by these companies’ continual leapfrogging of each other, and by the notion that a lot of what they’re doing today will probably become commodity services before long, rather than high-end services people and businesses will be inclined to pay big money for. And second, because there’s a suspicion, held by many in this industry, that there’s an AI shake-out coming—a bubble pop, or at bare minimum a release of air from that bubble—which will probably kill off a huge chunk of the industry, leaving just the largest, too-big-to-fail players intact, who can then gobble up the rest of the dying industry at a discount. Those who have the infrastructure, who have invested the huge sums of money to build these data centers, will basically be in a prime position to survive that extinction-level event, in other words. So they’re all scrambling to erect these things as quickly as possible, lest they be left behind. That construction, though, is easier said than done. The highest-end chips account for around 70-80% of a modern data center’s cost, as these GPUs, graphical processing units that are optimized for AI purposes, like Nvidia’s Blackwell chips, can cost tens of thousands of dollars apiece, and millions of dollars per rack.
There are a lot of racks of such chips in these data centers, and the total cost of a large-scale AI-optimized data center is often somewhere between $35 and $60 billion. A recent estimate by McKinsey suggests that by 2030, cumulative data center investment will need to reach around $6.7 trillion just to keep pace with demand for compute power. That’s demand from these tech companies, I should say—there’s a big debate about whether there’s sufficient demand from consumers of AI products, and whether these tech companies are trying to create such demand from whole cloth, to justify heightened valuations, and thus to continue goosing their market caps, which in turn enriches those at the top of these companies. That said, it’s a fair bet that for at least a few more years this influx in investment will continue, and that means pumping out more of these data centers. But building these sorts of facilities isn’t just expensive, it’s also regulatorily complex. There are smaller facilities, akin to ENIAC’s campus location, back in the day, but a lot of them—because of the economies of scale inherent in building a lot of this stuff all at once, all in the same place—are enormous, a single data center facility covering thousands of acres and consuming a whole lot of power to keep all of those computers with their high-end chips running 24/7. Previous data centers from the pre-AI era tended to consume in the neighborhood of 30MW of power, but the baseline now is closer to 200MW. The largest contemporary data centers consume 1GW of electricity, which is about the size of a small city’s power grid—that’s a city of maybe 500,000-750,000 people, though of course climate, industry, and other variables determine the exact energy requirements of a city—and they’re expected to just get larger and more resource-intensive from here. This has resulted in panic and pullbacks in some areas.
In Dublin, for instance, the government has stopped issuing new grid connections for data centers until 2028, as it’s estimated that data centers will account for 28% of Ireland’s power use by 2031. Some of these big tech companies have read the writing on the wall, and are either making deals to reactivate aging power plants—nuclear, gas, coal, whatever they can get—or are saying they’ll build new ones to offset the impact on the local power grid. And that impact can be significant. In addition to the health and pollution issues caused by some of these sites—in Memphis, for instance, where Elon Musk’s company, xAI, built a huge data center to help power his AI chatbot, Grok, the company is operating 35 unpermitted gas turbi

    17 min
  3. Chip Exports

    12/16/2025

    Chip Exports

    This week we talk about NVIDIA, AI companies, and the US economy. We also discuss the US-China chip-gap, mixed-use technologies, and export bans.

Recommended Book: Enshittification by Cory Doctorow

Transcript

I’ve spoken about this a few times in recent months, but it’s worth rehashing real quick, because this collection of stories and entities is so central to what’s happening across a lot of the global economy, and is also fundamental, in a very load-bearing way, to the US economy right now. As of November of 2025, around the same time that Nvidia, the maker of the world’s best AI-optimized chips at the moment, became the world’s first company to achieve a $5 trillion market cap, the top seven highest-valued tech companies, including Nvidia, accounted for about 32% of the total value of the US stock market. That’s an absolutely astonishing figure, as while Nvidia, Apple, Microsoft, Alphabet, Amazon, Broadcom, and Meta all have a fairly diverse footprint even beyond their AI efforts, a lot of that value for all of them is predicated on expected future income; which is to say, their market caps, their value according to that measure, is determined not by their current assets and revenue, but by what investors think or hope they’ll pull in and be worth in the future. That’s important to note because historically the sorts of companies that have market caps that are many multiples of their current, more concrete values are startups; companies in their hatchling phase that have a good idea and some kind of big potential, a big moat around what they’re offering or a blue ocean sub-industry with little competition in which they can flourish, and investment is thus expected to help them grow fast.
These top seven tech companies, in contrast, are all very mature; they’ve been around for a while and have a lot of infrastructure, employees, expenses, and all the other things we typically associate with mature businesses, not flashy startups with their best days hopefully ahead of them. Some analysts have posited that part of why these companies are pushing the AI thing so hard, and in particular pushing the idea that they’re headed toward some kind of generally useful AI, or AGI, or superhuman AI that can do everyone’s jobs better and cheaper than humans can do them, is that in doing so, they’re imagining a world in which they, and they alone, because of the costs associated with building the data centers required to train and run the best-quality AI right now, are capable of producing basically an economy’s-worth of AI systems and bots and machines operated by those AI systems. In other words, they’re creating, from whole cloth, an imagined scenario in which they’re not just worthy of startup-like valuations, worthy of market caps that are tens or hundreds of times their actual concrete value, because of those possible futures they’re imagining in public, but they’re the only companies worthy of those valuation multiples; the only companies that matter anymore. It’s likely that even if this is the case, the folks in charge of these companies, and the investors who have money in them and are likely to profit when the companies grow and grow, actually do believe what they’re telling everyone about the possibilities inherent in building these sorts of systems. But there also seems to be a purely economic motive for exaggerating a lot and clearing out as much of the competition as possible as they grow bigger and bigger.
Because maybe they’ll actually make what they’re saying they can make as a result of all that investment, that exuberance; but maybe, failing that, they’ll just be the last companies standing after the bubble bursts and an economic wildfire clears out all the smaller companies that couldn’t get the political relationships and sustaining cash they needed to survive the clear-out, if and when reality strikes and everyone realizes that sci-fi outcome isn’t gonna happen, or isn’t gonna happen any time soon. What I’d like to talk about today is a recent decision by the US government to allow Nvidia to sell some of its high-powered chips to China, and why that decision is being near-universally derided by those in the know. — In early December 2025, after a lot of back-and-forthing on the matter, President Trump announced that the US government will allow Nvidia, which is a US-based company, to export its H200 processors to China. He also said that the US government will collect a 25% fee on these sales. The H200 is Nvidia’s second-best chip for AI purposes, and it’s about six times as powerful as the H20, which is currently the most advanced Nvidia chip that’s been cleared for sale to China. The Blackwell chip that is currently Nvidia’s most powerful AI offering is about 1.5 times faster than the H200 for training purposes, and five times faster for AI inferencing—the work a chip does after a model is trained, when the model is used for predictions, decisions, and so on.
The logic of keeping the highest-end chips from would-be competitors, especially military competitors like China, isn’t new—this is something the US and other governments have pretty much always done, and historically even higher-end gaming systems like PlayStation consoles have been subject to export bans in some cases, because the chips they contained could be repurposed for military things, like plucking them out and using them to guide missiles—Sony initially needed special permits to sell the PlayStation 2 outside of Japan because the hardware was considered so militarily capable, and it remained unsellable in countries like Iraq, Iran, and North Korea throughout its production period. The concern with these Nvidia chips is that if China has access to the most powerful AI processors, it might be able to close the estimated 2-year gap between US companies and Chinese companies when it comes to the sophistication of their AI models and the power of their relevant chips. Beyond being potentially useful for productivity and other economic purposes, this hardware and software is broadly expected to shape the next generation of military hardware, and is already in use for all sorts of wartime and defense purposes, including sophisticated drones used by both sides in Ukraine. If the US loses this advantage, the thinking goes, China might step up its aggression in the South China Sea, potentially even moving up plans to invade Taiwan. Thus, one approach, which has been in place since the Biden administration, has been to do everything possible to keep the best chips out of Chinese hands, because that would ostensibly slow them down, make them less capable of just splurging on the best hardware, which they could then use to further develop their local AI capabilities. This approach, however, also incentivized the Chinese government to double down on its own homegrown chip industry.
That industry is still generally thought to be about two years behind the US industry, but it does seem to be closing the gap rapidly, mostly by copying designs and approaches used by companies around the world. An alternative theory, the one that seems to be at least partly responsible for Trump’s about-face on this, is that if the US allows the sale of sufficiently powerful chips to China, the Chinese tech industry will become reliant on goods provided by US companies, and thus its own homegrown AI sector will shrivel and never fully close that gap. If necessary, the US can then truncate or shut down those shipments, crippling the Chinese tech industry at a vital moment, and that would give the US the upper hand in many future negotiations and scenarios. Most analysts in this space no longer think this is a smart approach, because the Chinese government is wise to this tactic, having used it itself many times. Even in spaces where China has plenty of incoming resources from elsewhere, it still tries to shore up its own homegrown versions of the same, copying those international inputs rather than relying on them, so that someday it won’t need them anymore. The same is generally thought to be true here. Ever since the first Trump administration, when the US government started its trade war with China, the Chinese government has not been keen on ever relying on external governments and economies again, and it looks a lot more likely, based on what the Chinese government has said, and based on investments across the Chinese market in Chinese AI and chip companies following this announcement, that they’ll basically just scoop up as many Nvidia chips as they can, while they can, primarily for the purpose of reverse-engineering those chips, speeding up their gap-closing with US companies, and then, as soon as possible, severing that tie, competing with Nvidia rather than relying on it.
This is an especially pressing matter right now, then, because the US economy, and basically all of its growth, is so completely reliant on AI tech and the chips that are allowing that tech to move forward. If this plan by the US government doesn’t pan out, and ends up being a short-term gain situation—a little bit of money earned from that 25% cut the government takes, and Nvidia temporarily enriching itself further through Chinese sales, but in exchange both entities give up their long-term advantage to Chinese AI companies and the Chinese government—that could be bad not just for AI companies around the world, which could be rapidly outcompeted by Chinese alternatives, but also for all economies exposed to the US economy, which could be in for a long-term correction, slump, or full-on depression.

Show Notes

https://www.nytimes.com/2025/12/09/us/politics/trump-nvidia-ai-chips-china.html

https://arstechnica.com/tech-policy/2025/12/us-taking-25-cut-of-nvidia-chip-sales-makes-no-sense-experts-say/

https://www.pcmag.com/news/20-years-later-how-concerns-about-weaponized-consoles-a

    14 min
  4. Digital Asset Markets

    12/09/2025

    Digital Asset Markets

    This week we talk about in-game skins, investment portfolios, and Counter-Strike 2. We also discuss ebooks, Steam, and digital licenses.

Recommended Book: Apple in China by Patrick McGee

Transcript

Almost always, if you buy an ebook or game or movie or music album online, you’re not buying that ebook, or that game, or whatever else—you’re buying a license that allows you to access it, often on a specified device or in a specified way, and almost always in a non-transferable, non-permanent manner. This distinction doesn’t matter much to most of us most of the time. If I buy an ebook, chances are I just want to read that ebook on the device I used to buy it, or on the Kindle attached to my Amazon or other digital book service account. So I buy the book, read it on my ebook reader or phone, and that’s that; same general experience I would have with a paperback or hardback book. This difference becomes more evident when you think about what happens to the book after you read it, though. If I own a hard-copy, physical book, I can resell it. I can donate it. I can put it in a Little Free Library somewhere in my neighborhood, or give it to a friend who I think will enjoy it. I can pick it up off my shelf later and read the exact same book I read years before. Via whichever mechanism I choose, I’m either holding onto that exact book for later, or I’m transferring ownership of that book, that artifact that contains words and/or images that can now be used, read, whatever, by that second owner. And they can go on to do the same: handing it off to a friend, selling it on eBay, or putting it on a shelf for later reference. Often the convenience and immediacy of electronic books makes this distinction a non-issue for those who enjoy them.
I can buy an ebook from Amazon or Bookshop.org and that thing is on my device within seconds, giving me access to the story or information that’s the main, valuable component of a book for most of us, without any delay, without having to drive to a bookstore or wait for it to arrive in the mail. That’s a pretty compelling offer. This distinction becomes more pressing, however, if I decide I want to go back and read an ebook I bought years ago, only to find that the license has changed and maybe that book is no longer accessible via the marketplace where I purchased it. If that happens, I no longer have access to the book, and there’s no recourse for this absence—I agreed to this possibility when I “bought” the book, based on the user agreement I clicked ‘OK’ or ‘I agree’ on when I signed up for Amazon or whichever service I paid for that book access. It also becomes more pressing if, as has happened many times over the past few decades, the publisher or some other entity with control over these book assets decides to change them. A few years ago, for instance, British versions of Roald Dahl’s ‘Matilda’ were edited to remove references to Joseph Conrad, who has in recent times been criticized for racist themes in his writing. Some of RL Stine’s Goosebumps books were edited to remove references to crushes schoolgirls had on their headmaster, and descriptions of an overweight character that were, in retrospect, determined to be offensive. And various racial and ethnic slurs were edited out of some of Agatha Christie’s works around the same time. Almost always, these changes aren’t announced by the publishers who own the rights to these books, and they’re typically only discovered by eagle-eyed readers who note that, for instance, the publishers decided to change the time period in which something occurred, which apparently happened in one of Stine’s works, without obvious purpose. 
This also frequently happens without the author being notified, as was the case with Stine and the edits made to his books. The publishers themselves, when asked directly about these changes, often remain silent on the matter. What I’d like to talk about today is another angle of this distinction between physically owned media and digital, licensed versions of the same, and the at times large sums of money that can be gained or lost based on the decisions of the companies that control these licensed assets. — Counter-Strike 2 is a first-person shooter game that’s free-to-play, was released in 2023, and was developed by a company called Valve. Valve has developed all sorts of games over the years, including the Counter-Strike, Half-Life, DOTA, and Portal games, but they’re probably best known for their Steam software distribution platform. Steam allows customers to buy all sorts of software, but mostly games, through an interface that also provides chat services and community forums. The primary utility of this platform is that it’s a marketplace for buying and selling games; it has match-making features for online multiplayer games; it serves as a sort of library for gamers, so all their games are launchable from one place; and it serves as a digital rights management hub, which basically means it helps game companies ensure users aren’t playing with pirated software—if you want to use Steam to store and launch your games, they have to be legit, purchased games, not pirated ones. As of early 2025, it was estimated that Steam claimed somewhere between 75-80% of the PC gaming market, compared to competitors like the Epic Games Store, which was founded by the folks behind the wildly successful game Fortnite, and which can only claim something like 5%. And Counter-Strike is one of Valve’s, and Steam’s, crown jewels. 
It’s a free-to-play game that was originally developed as a mod, a free add-on to another game Valve owns called Half-Life, but Valve bought up the rights to that mod and developed it into its own thing, releasing the initial entry in the series in 2000, several main-series games after that in subsequent years, and then Counter-Strike 2 in 2023, to much acclaim and fanfare. Counter-Strike 2 often has around a million players online at any given moment, and its tournaments can attract closer to 1.5 million. As of early 2024, it was estimated that Counter-Strike 2 pulled in around a billion dollars a year for Valve, primarily via what are called Case Keys, which allow players to open in-game boxes, each key selling for $2.50. Valve also takes a 15% cut of all player-to-player sales of items conducted on the Steam Community Market, which is a secure eBay- or Amazon-like component of their platform where players can sell digital items from the game. These items are primarily aesthetic add-ons, like skins for weapons, stickers, and clothing—things that allow players to look different in the game, as opposed to things that allow them to perform better, which would give players who spent the most money an unfair advantage and thus make the game less competitive and fun. Because this is a free game, though, and by many estimates a really balanced and well-made one, a lot of people play it, and a lot of people want to customize the look of their in-game avatar. So being able to open in-game boxes that contain loot, and being able to buy and sell said loot on the Steam Community Market, has led to a rich secondary economy that makes that component of the game more interesting for players, while also earning Valve a whole lot of money on the backend for those keys and that cut of sales between players. 
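The marketplace cut described above is simple percentage math, but it compounds into real money at CS2 price levels. Here's a minimal sketch of that math, assuming the 15% figure mentioned in the episode; the split of that cut into component fees is not specified here, so the sketch just treats it as one flat rate.

```python
# Sketch of the Steam Community Market's cut on a player-to-player sale,
# using the 15% combined rate mentioned above. A flat single rate is an
# assumption; the actual fee structure may be split into multiple parts.

def seller_proceeds(sale_price: float, fee_rate: float = 0.15) -> float:
    """What the seller nets after the marketplace's cut."""
    return round(sale_price * (1 - fee_rate), 2)

def marketplace_cut(sale_price: float, fee_rate: float = 0.15) -> float:
    """What the marketplace keeps on the sale."""
    return round(sale_price * fee_rate, 2)

# On a $100 skin sale, the seller nets $85 and the platform keeps $15;
# on a $40,000 ultra-rare glove skin, the platform's cut is $6,000.
assert seller_proceeds(100.0) == 85.0
assert marketplace_cut(100.0) == 15.0
assert marketplace_cut(40_000.0) == 6_000.0
```

At the volumes the episode describes, that per-sale cut is how a secondary market in purely cosmetic items becomes a meaningful revenue stream.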
In late October 2025, Valve announced a change in the rules for Counter-Strike 2, allowing players to trade up more item types, including previously un-trade-up-able items like gloves and knives, into higher-grade versions of the same. So common items could be bundled together and traded in for less common items, and those less common items could be bundled together and traded up for rare ones. This seems like a small move from the outside, but it roiled the CS2 in-game economy, by some estimates causing upwards of $2 billion to basically disappear overnight, because rare gloves and knives were at times valued at as much as $1.5 million; again, these are just aesthetic skins that change the look of a player’s avatar or weapons, but there’s enough demand for these things that some people are willing to pay that much for ultra-rare and unique glove and knife skins. Because of that demand, some players had taken to spending real money on these ultra-rare items, treating their in-game collections of skins as something like an investment portfolio. If you can buy an ultra-rare glove skin for $40,000 and maybe sell it later for twice that, it might seem like a really good investment, despite how strange it may seem, to those not involved in this corner of the gaming world, to spend $40,000 on what’s basically just some code that tells the game the gloves on your avatar will look a certain way. This change, then, made those rarer gloves and knives, which were previously unattainable except by lottery-like chance, a lot more common, because people could trade up for them, increasing their chances of getting the ultra-rare stuff. The market was quickly flooded with more of these things, and about half the value of rare CS2 skins disappeared, initially knocking about $6 billion of total value from the market before the losses stabilized at around $1.5-2 billion. 
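The trade-up mechanic described above can be modeled as a simple function: consume a fixed number of same-rarity items, mint one item of the next rarity up. The 10-to-1 ratio used below matches how Counter-Strike trade-up contracts have historically worked for weapon skins; whether the same ratio applies to the newly eligible item types is an assumption, and the rarity tier names here are illustrative.

```python
# Toy model of a trade-up contract, as an illustration of the mechanic:
# 10 items of one rarity are consumed and one item of the next rarity up
# is minted. The 10:1 ratio and tier names are assumptions for the sketch.

RARITY_LADDER = ["consumer", "industrial", "mil-spec",
                 "restricted", "classified", "covert"]

def trade_up(items: list[str]) -> str:
    """Consume 10 same-rarity items and return the next rarity up."""
    if len(items) != 10:
        raise ValueError("a trade-up contract requires exactly 10 items")
    rarity = items[0]
    if any(item != rarity for item in items):
        raise ValueError("all items in the bundle must share one rarity")
    idx = RARITY_LADDER.index(rarity)
    if idx + 1 >= len(RARITY_LADDER):
        raise ValueError("top-rarity items cannot be traded up")
    return RARITY_LADDER[idx + 1]

# 10 "restricted" items become 1 "classified" item.
assert trade_up(["restricted"] * 10) == "classified"
```

The economic effect follows directly from the function: every ten lower-tier items consumed mint a new higher-tier item, so extending trade-ups to gloves and knives increased the supply of the rarest items, which is consistent with the price collapse the episode describes.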
Volatility in this market continues, and people who invested a lot of money, sometimes their life savings, and sometimes millions of dollars into CS2 in-game skins, have been looking into potential legal recourse, though without much luck; Valve’s user agreements make very clear that players don’t own any of this stuff, and as a result, Valve can manipulate the market however they like, whenever they like. Just like with ebooks and movies we “buy” from Amazon and other services, then, these in-game assets are licensed to us, not sold. We may, at times, have a means of putting our license to some of these things on a secondary mar

    14 min
  5. Climate Risk

    12/02/2025

    Climate Risk

This week we talk about floods, wildfires, and reinsurance companies. We also discuss the COP meetings, government capture, and air pollution. Recommended Book: If Anyone Builds It, Everyone Dies by Eliezer Yudkowsky and Nate Soares Transcript The urban area that contains India’s capital city, New Delhi, called the National Capital Territory of Delhi, has a population of around 34.7 million people. That makes it the most populous city in the country, and one of the most populous cities in the world. Despite the many leaps India has made over the past few decades, in terms of economic growth and overall quality of life for residents, New Delhi continues to have absolutely abysmal air quality—experts at India’s top research hospital have called New Delhi’s air “severe and life-threatening,” and the level of toxic pollutants in the air, from cars and factories and from the crop-waste burning conducted by nearby farmers, can reach 20 times the recommended level for safe breathing. In mid-November 2025, the problem became so bad that the government told half its workers to work from home, because of the dangers represented by the air, and in the hope that doing so would remove some of the cars from the road and, thus, some of the pollution being generated in the area. Trucks spraying mist along busy roads and pedestrian centers, using what are called anti-smog guns, help: the mist keeps some of the pollution from cars from billowing into the air and becoming part of the regional problem, rather than an ultra-localized one, and pushes the pollutants that would otherwise get into people’s lungs down to the ground. The use of these mist-sprayers has been controversial, though, as there are accusations that they’re primarily deployed near air-quality monitoring stations, and that those in charge put them there to make the overall air quality seem better than it is, manipulating the stats so that their failure to improve practical air quality isn’t as evident. 
And in other regional news, just southeast across the Bay of Bengal, the Indonesian government, as of the day I’m recording this, is searching for the hundreds of people who are still missing following a period of unusually heavy rains. These rains have sparked floods and triggered mudslides that have blocked roads, damaged bridges, and forced the evacuation of entire villages. More than 300,000 people have been evacuated as of last weekend, and more rain is forecast for the coming days. The death toll of this round of heavy rainfall—the heaviest in the region in years—has already surpassed 440 people in Indonesia, with another 160 and 90 in Thailand and Vietnam, respectively, being reported by those countries’ governments, from the same weather system. In Thailand, more than two million people were displaced by flooding, and the government had to deploy military assets, including helicopters launched from an aircraft carrier, to help rescue people from the roofs of buildings across nine provinces. In neighboring Malaysia, tens of thousands of people were forced into shelters as the same storm system barreled through, and Sri Lanka was hit with a cyclone that left at least 193 dead and more than 200 missing, marking one of the country’s worst weather disasters in recent years. What I’d like to talk about today is the climatic moment we’re at, as weather patterns change and, in many cases, amplify, and how these sorts of extreme disasters are also causing economic impacts that are less reported upon, but perhaps even more vital, at least for future policy shifts. — The UN Conference of the Parties, or COP meetings, are high-level climate change conferences that have typically been attended by representatives from most governments each year, and where these representatives angle for various climate-related rules and policies, while also bragging about individual nations’ climate-related accomplishments. 
In recent years, such policies have been less ambitious than in previous ones, in part because the initial goal of preventing a 1.5 degree C increase in average global temperatures is almost certainly no longer achievable; climate models were somewhat accurate, but as with many things climate-related, seem to have actually been a little too optimistic—things got worse faster than anticipated, and now the general consensus is that we’ll continue to shoot past 1.5 degrees C over the baseline level semi-regularly, and within a few years or a decade, that’ll become our new normal. The ambition of the 2015 Paris Agreement is thus no longer an option. We don’t yet have a new, generally acceptable—by all those governments and their respective interests—rallying cry, and one of the world’s biggest emitters, the United States, is more or less absent at new climate-related meetings, except to periodically show up and lobby for lower renewables goals and an increase in subsidies for and policies that favor the fossil fuel industry. The increase in both the number and potency of climate-influenced natural disasters is partly the result of this failure to act, and to act forcefully and rapidly enough, by governments and by all the emitting industries they’re meant to regulate. The cost of such disasters is skyrocketing—there are expected to be around $145 billion in insured losses, alone, in 2025, which is 6% higher than in 2024—and their human impact is booming as well, including deaths and injuries, but also the number of people being displaced, in some cases permanently, by these disasters. But none of that seems to move the needle much in some areas, in the face of entrenched interests, like the aforementioned fossil fuel industry, and the seeming inability of politicians in some nations to think and act beyond the needs of their next election cycle. 
That said, progress is still being made on many of these issues; it’s just slower than it needs to be to reach previously set goals, like that now-defunct 1.5 degrees C ceiling. Most nations, beyond petro-states like Russia and those with fossil fuel industry-captured governments like the current US administration, have been deploying renewables, especially solar panels, at extraordinary rates. This is primarily the result of China’s breakneck deployment of solar, which has offset a lot of energy growth that would have otherwise come from dirty sources like coal in the country, and which has led to a booming overproduction of panels that’s allowed them to sell said panels cheap, overseas. Consequently, many nations, like Pakistan and a growing number of countries across Sub-Saharan Africa, have been buying as many cheap panels as they can afford and bypassing otherwise dirty and unreliable energy grids, creating arrays of microgrids instead. Despite those notable absences, then, solar energy infrastructure installations have been increasing at staggering rates, and the first half of 2025 saw the highest rate of capacity additions yet—though China is still installing twice as much solar as the rest of the world combined, at this point. Which is still valuable, as they still have a lot of dirty energy generation to offset as their energy needs increase, but more widely disseminated growth is generally seen to be better in the long term—so the expansion into other parts of the world is arguably the bigger win, here. The economics of renewables may, at some point, convince even the skeptics, and those who are politically opposed to the concept of renewables rather than practically opposed to them, that it’s time to change teams. 
Already, conservative parts of the US, like Texas, are becoming renewables boomtowns, quietly deploying wind and solar because they’re often the best, cheapest, most resilient options, even as their politicians rail against them in public and vote for more fossil fuel subsidies. And it may be economics that eventually serves as the next nudge, or forceful shove, on this movement toward renewables, as we’re reaching a point at which real estate and the global construction industry, not to mention the larger financial system that underpins them and pretty much all other large-scale economic activities, are being not just impacted, but rattled at their roots, by climate change. In early November 2025, real estate listing company Zillow, the biggest such company in the US, stopped showing extreme weather risks for more than a million home sale listings on its site. It started showing these risk ratings in 2024, using data from a risk-modeling company called First Street, and the idea was to give potential buyers a sense of how at-risk a property they were considering buying might be when it comes to wildfires, floods, poor air quality, and other climate- and pollution-related issues. Real estate agents hated these ratings, though, in part because there was no way to protest and change them, but also because, well, they might have an expensive coastal property listed that now showed potential buyers it was flood-prone, if not today, then in a couple of years. It might also show a beautiful mountain property to be uninsurable because of the risk of wildfire damage. A good heuristic for understanding the impact of global climate change is not to think in terms of warming, though that’s often part of it, but rather to think in terms of more radical temperature and weather swings. That means areas that were previously at little or no risk of flooding might suddenly be very at risk of absolutely devastating floods. 
And the same is true of storms, wildfires, and heat so intense people die just from being outside for an hour, and in which components of one’s house might fry or melt. This move by Zillow, the appearance and removal of these risk scores, happened at the same time global insurers are warning that they may have to pull out of more areas, because it’s simply no longer possible for them to do business in pla

    16 min
  6. Thorium Reactors

    11/25/2025

    Thorium Reactors

This week we talk about radioactive waste, neutrons, and burn while breeding cycles. We also discuss dry casks, radioactive decay, and uranium. Recommended Book: Breakneck by Dan Wang Transcript Radioactive waste, often called nuclear waste, typically falls into one of three categories: low-level waste, which contains a small amount of radioactivity that will last a very short time—this is stuff like clothes or tools or rags that have been contaminated; intermediate-level waste, which has been contaminated enough that it requires shielding; and high-level waste, which is very radioactive material that creates a bunch of heat because of all the radioactive decay, so it requires both shielding and cooling. Some types of radioactive waste, particularly spent fuel of the kind used in nuclear power plants, can be reprocessed, which means separating it into other types of useful products, including another type of mixed nuclear fuel that can be used in lieu of uranium, though generally not economically unless uranium supplies are low. About a third of all spent nuclear fuel has already been reprocessed in some way. About 4% of even the recyclable stuff, though, doesn’t have that kind of second-life purpose, and that, combined with the medium- and long-lived waste that is quite dangerous to have just sitting around, has to be stored somehow, shielded and maybe cooled, and in some cases for a very long time: some especially long-lived fission products have half-lives that stretch into the hundreds of thousands or millions of years, which means they will be radioactive deep into the future, many times longer than humans have existed as a species. According to the International Atomic Energy Agency, something like 490,000 metric tons of radioactive spent fuel is currently being stored, on a temporary basis, at hundreds of specialized sites around the world. 
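Those half-life figures follow the standard exponential decay rule: after each half-life, half of the remaining radioactive atoms are left. A minimal sketch of that formula, using technetium-99, a long-lived fission product whose roughly 211,000-year half-life is a standard reference value rather than a figure from the episode:

```python
# Standard radioactive decay: the fraction of material still undecayed
# after some elapsed time is (1/2) raised to (elapsed time / half-life).

def remaining_fraction(elapsed_years: float, half_life_years: float) -> float:
    """Fraction of the original radioactive material still undecayed."""
    return 0.5 ** (elapsed_years / half_life_years)

# Technetium-99 (half-life ~211,000 years, a textbook value): after one
# half-life, half remains; after two, a quarter remains.
assert abs(remaining_fraction(211_000, 211_000) - 0.5) < 1e-9
assert abs(remaining_fraction(422_000, 211_000) - 0.25) < 1e-9
```

This is why such waste stays hazardous on timescales far longer than recorded human history: even after ten half-lives, about a thousandth of the material is still radioactive.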
The majority of this radioactive waste is stored in spent-fuel pools, cooled in water somewhere near the nuclear reactors where the waste originated. Other waste has been relocated into what’re called dry casks, which are big, barrel-like containers made of several layers of steel, concrete, and other materials, which surround a canister that holds the waste, and the canister is itself surrounded by inert gas. These casks hold and cool waste using natural air convection, so they don’t require any kind of external power or water sources, while other solutions, including storage in water, sometimes do—and often the fuel is initially stored in pools, and is then moved to casks for longer-term storage. Most of the radioactive waste produced today comes in the form of spent fuel from nuclear reactors, which is typically small ceramic pellets made of low-enriched uranium oxide. These pellets are stacked on top of each other and encased in metal, and that creates what’s called a fuel rod. In the US alone, about 2,000 metric tons of spent nuclear fuel is created each year, which is just shy of half an Olympic-sized swimming pool in terms of volume, and in many countries, the non-reusable stuff is eventually buried, near the surface for the low- to intermediate-level waste, and deeper for high-level waste—deeper, in this context, meaning something like 200-1000 m, which is about 650-3300 feet, beneath the surface. 
The goal of such burying is to prevent potential leakage that might impact life on the surface, while also taking advantage of the inherent stability and cooler nature of underground spaces, which are chosen for their isolation, natural barriers, and water impermeability, and which are also often reinforced with human-made supports and security, blocking everything off and protecting the surrounding area so nothing will access these spaces far into the future, and so that they won’t be broken open by future glaciation or other large-scale impacts, either. What I’d like to talk about today is another potential use and way of dealing with this type of waste, and why a recent, related development in China is being heralded as such a big deal. — An experimental nuclear reactor was built in the Gobi Desert by the Chinese Academy of Sciences’ Shanghai Institute of Applied Physics, and back in 2023 the group achieved its first criticality, got started up, basically, and it has been generating heat through nuclear fission ever since. What that means is that the nuclear reactor did what a nuclear reactor is supposed to do. Most such reactors exist to generate heat, which then creates steam and spins turbines, which generates electricity. What’s special about this reactor, though, is that it is a thorium molten salt reactor, which means it uses thorium instead of uranium as a fuel source, and the thorium is processed into uranium as part of the energy-making process, because thorium only contains trace amounts of fissile material, which isn’t enough to get a power-generating nuclear chain reaction going. This reactor was able to successfully perform what’s called in-core thorium-to-uranium conversion, which allows the operators to use thorium as fuel, and have that thorium converted into uranium, which is sufficiently fissile to produce nuclear power, inside the core of the reactor. 
This is an incredibly fiddly process, and requires that the thorium-232 used as fuel absorb a neutron, which turns it into thorium-233. Thorium-233 then decays into protactinium-233, and that, in turn, decays into uranium-233—the fuel that powers the reactor. One innovation here is that this entire process happens inside the reactor, rather than occurring externally, which would require a bunch of supplementary infrastructure to handle fuel fabrication, increasing the amount of space and cost associated with the reactor. Those neutrons required to start the thorium conversion process are provided by small amounts of more fissile material, like enriched uranium-235 or plutonium-239, and the thorium is dissolved in a fluoride salt and becomes a molten mixture that allows it to absorb that necessary neutron, and go through that multi-step decay process, turning into uranium-233. That end-point uranium then releases energy through nuclear fission, and this initiates what’s called a burn while breeding cycle, which means it goes on to produce its own neutrons moving forward, which obviates the need for those other, far more fissile materials that were used to start the chain reaction. All of which makes this process a lot more fuel efficient than other options, dramatically reduces the amount of radioactive waste produced, and allows reactors that use it to operate a lot longer without needing to refuel, which also extends a reactor’s functional life. On that last point, many typical nuclear power plants built over the past handful of decades use pressurized water reactors which have to be periodically shut down so operators can replace spent fuel rods. This new method instead allows the fissile materials to continuously circulate, enabling on-the-fly refueling—so no shut-down, no interruption of operations necessary. 
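The conversion chain described above can be written out compactly. The half-lives shown on the arrows are standard reference values for these isotopes, not figures from the episode:

```latex
\[
^{232}\mathrm{Th} + n \;\longrightarrow\; ^{233}\mathrm{Th}
  \;\xrightarrow[t_{1/2}\,\approx\,22\ \mathrm{min}]{\beta^{-}}\;
  ^{233}\mathrm{Pa}
  \;\xrightarrow[t_{1/2}\,\approx\,27\ \mathrm{d}]{\beta^{-}}\;
  ^{233}\mathrm{U}
  \;\xrightarrow{+\,n}\; \text{fission products} + \text{energy} + \text{neutrons}
\]
\]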
This method also requires zero water, which could allow these reactors to be built in more and different locations, as conventional nuclear power plants have typically been built near large water sources, like oceans, because of their cooling needs. China initiated the program that led to the development of this experimental reactor back in 2011, in part because it has vast thorium reserves it wanted to tap in its pursuit of energy independence, and in part because this approach to nuclear energy should, in theory at least, allow plant operators to use existing, spent fuel rods as part of the process. That could be very economically interesting: operators could use the waste from their existing plants to help fuel these new plants, and China could also take such waste off other governments’ hands, maybe even getting paid for it, because those other governments would then no longer need to store the stuff, and China could use it as cheap fuel; win-win. Thinking further along, though, maybe the real killer application of this technology is that it allows for the dispersion of nuclear energy without the risk of nuclear weapons proliferation. The plants are smaller, they have a passive safety system that disallows the sorts of disasters we saw at Chernobyl and Three Mile Island—that sort of thing just can’t happen with this setup—and the fissile materials, aside from those starter materials used to get the initial cycle going, can’t be used to make nuclear weapons. Right now, there’s a fair amount of uranium on the market, but just like oil, that availability is cyclical and controlled by relatively few governments. In the future, that resource could become more scarce, and this reactor setup may become even more valuable as a result, because thorium is a lot cheaper and more abundant, and it’s less tightly controlled because it’s useless from a nuclear weapons standpoint. 
This is only the very first step on the way toward a potentially thorium-reactor-dominated nuclear power industry, and the conversion rate on this experimental model was meager. That said, it is a big step in the right direction, and a solid proof-of-concept, showing that this type of reactor has promise and would probably work scaled up, as well, and that means the 100 MW demonstration reactor China is also building in the Gobi, hoping to prove the concept’s full value by 2035, stands a pretty decent chance of having a good showing. Show Notes https://www.deepisolation.com/about-nuclear-waste/where-is-nuclear-waste-now https://www.energy.gov/ne/articles/5-fast-facts-about-spent-nuclear-fuel https://www.energy.gov/ne/articles/3-advanced-reactor-systems-watch-2030 https://world-nuclear.org/information-library/nuclear-fuel-cycle/nuclear-waste/radioactive-wastes-myths-and-realities https://www.visualcapitalist.com/visualizing-all-the-nuclear-waste-in-the-world/ https://en

    13 min
  7. Extrajudicial Killing

    11/18/2025

    Extrajudicial Killing

This week we talk about Venezuela, casus belli, and drug smuggling. We also discuss oil reserves, Maduro, and Machado. Recommended Book: Dungeon Crawler Carl by Matt Dinniman Transcript Venezuela, which suffered all sorts of political and economic crises under former president Hugo Chávez, has suffered even more of the same, and on a more dramatic scale, under Chávez’s successor, Nicolás Maduro. Both Chávez and Maduro have ruled over autocratic regimes, turning an ostensibly democratic Venezuelan government into one ruled by a single person, plus those they like and empower and reward, over time removing anyone from power who might challenge them, and collapsing all checks and balances within the structure of their government. They still hold elections, then, but like in Russia, the voting is just for show, the outcome predetermined, and anyone who gets too popular and who isn’t favored by the existing regime is jailed or killed or otherwise neutralized; the votes are then adjusted when necessary to make it look like the regime is still popular, and anyone who challenges that seeming popularity is likewise taken care of. As a result of that state of affairs, an unpopular regime with absolute power running things into the ground over the course of two autocrats’ administrations, Venezuela has suffered immense hyperinflation, high levels of crime and widespread disease, ever-increasing mortality rates, and even starvation, as fundamentals like food periodically become scarce. This has led to a swell of emigration out of the country, which has, during the past decade, become the largest refugee crisis ever recorded in the Americas, those who leave mostly flooding into neighboring countries like Colombia, Peru, and Ecuador. 
As of 2025, it’s estimated that nearly 8 million people, more than 20% of Venezuela’s entire population as of 2017, have fled the country to get away from the government, its policies, its collapsed economy, and the cultural hegemony that has led to so much crime, conflict, and oppression of those not favored by the people in charge. This has also led to some Venezuelans trying to get into the US, which was part of the justification for a proposed invasion of the country, by the US government, under the first Trump administration in 2017. The idea was that this is a corrupt, weak government that also happens to possess the largest proven oil reserves in the world. Its production of oil has collapsed along with everything else, in part because the government is so ineffectual, and in part because of outside forces, like longstanding sanctions by the US, which make selling and profiting from said oil on the global market difficult. Apparently, though, Trump also just liked the idea of invading Venezuela through US ally Colombia, saying—according to Trump’s National Security advisor at the time, John Bolton—that Venezuela is really part of the US, so it would be “cool” for the US to take it. Trump also later said, in 2023, that when he left office Venezuela was about to collapse, and that he would have taken it over if he had been reelected instead of losing to Joe Biden, and the US would have then kept all the country’s oil. So there’s long been a seeming desire by Trump to invade Venezuela, partly on vibe grounds: the state being weak, and why shouldn’t we own it, that kind of thing. 
But underlying that is the notion of the US being a country that can stomp into weaker countries, take their oil, and then nation-build, similar to what the government seemed to be trying to do when it invaded Iraq in the early 2000s, using 9/11 as a casus belli, an excuse to go to war, with an uninvolved nation that happened to own a bunch of oil resources the US government wanted for itself. What I’d like to talk about today is the seeming resurgence of that narrative, but this time with an actual, tangible reason to believe an invasion of Venezuela might occur sometime soon. — As I mentioned, though previously kind of a success story in South America, bringing people in from all over the continent and the world, Venezuela has substantially weakened under its two recent autocratic leaders, who have rebuilt everything in their image, and made corruption and self-serving the main driver behind their decisions for the direction of the country. A very popular candidate, María Corina Machado, was barred from participating in the country’s 2024 election, the country’s Supreme Court upholding a 15-year ban on her holding public office because of her alleged involvement in a plot against Maduro alongside a previous opposition leader, Juan Guaidó; Guaidó is now in exile, run out of the country after dozens of governments recognized him, rather than Maduro, as Venezuela’s legitimate interim president following a disputed election, a claim Maduro’s government rejected as Maduro clung to power. So Machado is accused of being corrupt by Maduro’s corrupt government, and thus isn’t allowed to run for office. Another candidate that she wanted to have run in her place was also declared ineligible by Maduro’s people, so another sub was found, Edmundo González, and basically every outside election watchdog group says that he won handily over Maduro in 2024. 
But the government’s official results say that’s not the case, that Maduro won, and that has created even more conflict and chaos in the country as it’s become clearer and clearer that there’s no way to oust the autocrat in control of the government—not through the ballot box, at least. This is part of what makes Venezuela an even more appealing target, for the Trump administration, right now, because not only is Maduro incredibly unpopular and running the country into the ground, there’s also a very popular alternative, in the shape of María Corina Machado, who could conceivably take control of things should Maduro be toppled. So there’s a nonzero chance that if someone, like the US military, were to step in and either kill Maduro or run him out of town, they could make a very sweet deal with the incoming Machado government, including a deal that grants access to all that currently underutilized oil wealth. This is theoretical right now, but recent moves by the US government and military suggest it might not remain theoretical for much longer. In mid-November, 2025, the US Navy moved the USS Gerald R. Ford Carrier Strike Group to the Caribbean—the USS Gerald R. Ford being an aircraft carrier, and the strike group being the array of ships and aircraft that accompany it—it was moved there from the Eastern Mediterranean, where it had been sent following the attack on Israel that led to Israel’s invasion of the Gaza Strip. This, by itself, doesn’t necessarily mean anything; the shifting of aircraft carrier groups is often more symbolic than practical. But the US government has suggested it might use these vessels and aircraft to strike drug manufacturers across South and Central America, and specifically in Venezuela. This is being seen as an escalation of an already fraught moment in the region, because the US has launched a series of strikes against small boats in the area, beginning back in September of 2025. 
These boats, according to the US government, are drug smuggling vessels, bringing fentanyl, among other drugs, to US shores. So the idea is that the people aboard these boats are criminals who are killing folks in the US by bringing this drug, which is highly addictive and super potent, and thus more likely to kill its users than other opioids, into the country for illegal sale and distribution. So, the claim goes, this is a justified use of force. These strikes have thus far, over the past two months, killed at least 79 people, all alleged by the US government to be drug smugglers, despite some evidence to the contrary, in some cases. The US’s allies have not been happy about these strikes, including allies the government usually relies on to help with drug-related detection and interdiction efforts, among them regional governments that take action to keep drugs from shuffling around the region and eventually ending up in the US. Many US allies have also called the strikes illegal. The French foreign minister recently said they violate international law, and the EU’s foreign policy chief said something similar, indicating that such use of force is only valid in cases of self-defense, or when there’s a UN Security Council resolution on the matter. The Canadian and Dutch governments have been doing what they can to distance themselves from the strikes, without outright criticizing the at times vindictive US government, and some regional allies, like Colombia, have been signaling that they’ll be less cooperative with the US when it comes to drug-related issues, saying that they will no longer share intelligence with the US until the strikes stop, which they’ve called “extrajudicial executions.” An extrajudicial killing is one that is not lawful; it doesn’t have the backing of a judicial proceeding, and thus lacks the authority typically granted by the proper facets of a government. Lacking such authority, a killing is illegal. 
Given said authority, though, a killing can be made legal, at least according to the laws of the government doing the killing. The argument here is that while governments can usually get away with killing people, only authoritarian regimes typically and regularly use that power to kill folks without going through the proper channels and thus getting the legal authority to do so. In this case, the facts seem to support the accusations of those who are saying these killings aren’t legally legitimate: the Trump administration has launched these attacks on these vessels without going through the usual channels, and without declaring Congressionally approved war on anyone in particular. They’ve instead claimed that drug cartels are terrorists,

    15 min
  8. Nitazenes

    11/11/2025

    Nitazenes

    This week we talk about OxyContin, opium, and the British East India Company. We also discuss isotonitazene, fentanyl, and Purdue. Recommended Book: The Thinking Machine by Stephen Witt Transcript Opioids have been used as painkillers by humans since at least the Neolithic period; there’s evidence that people living in the Iberian and Italian Peninsulas kept opium poppy seeds with them, and there’s even more evidence that the Ancient Greeks were big fans of opium, using it to treat pain and as a sleep aid. Opium was the only available opioid for most of human history, and it was almost always considered to be a net-positive, despite its downsides. It was incorporated into a mixture called laudanum, which was a blend of opium and alcohol, in the 17th century, and that helped it spread globally as Europeans spread globally, though it was also in use locally, elsewhere, especially in regions where the opium poppy grew naturally. In India, for instance, opium was grown and often used for its painkilling properties, but when the British East India Company took over, they decided to double down on the substance as a product they could monopolize and grow into a globe-spanning enterprise. They went to great lengths to expand production and prevent the rise of potential competitors, in India and elsewhere, and they created new markets for opium in China by forcing the product onto Chinese markets, initially via smuggling, and then eventually, after fighting a series of wars over whether or not the British should be allowed to sell opium on the Chinese market, the British defeated the Chinese. And among other severely unbalanced new treaties, including the ceding of the Kowloon peninsula to the British as part of Hong Kong, which they controlled as a trading port, and the legalization of Christians coming into the country, proselytizing, and owning property, the Chinese were forced to accept the opium trade. 
This led to generations of addicts, even more so than before, when opium was available only illicitly, and it became a major bone of contention between the two countries, and informed China’s relationship with the world in general, especially other Europeans and the US, moving forward. A little bit later, in the early 1800s, a German pharmacist was able to isolate a substance called morphine from opium. He published a paper on this process in 1817, and in addition to this being the first alkaloid, the first organic compound of this kind to be isolated from a medicinal plant, which was a milestone in the development of modern drug discovery, it also marked the arrival of a seeming new wonder drug that could ease pain, but also help control cold-related symptoms like coughing and gut issues, like diarrhea. Like many such substances back in the day, it was also often used to treat women who were demonstrating ‘nervous character,’ which was code for ‘behaving in ways men didn’t like or understand.’ Initially, it was thought that, unlike with opium, morphine wasn’t addictive. And this thinking was premised on the novel application method often used for morphine, the hypodermic needle, which arrived a half-century after that early 1800s isolation of morphine from opium, but which became a major driver of the new drug’s success and utility. Such drugs, derived scientifically rather than just processing a plant, could be administered at specific, controllable doses. So surely, it was thought, this would alleviate those pesky addictive symptoms that many people experienced when using opioids in a more natural, less science-y way. That, of course, turned out not to be the case. But it didn’t stop the progression of this drug type, and the further development of more derivations of it, including powerful synthetic opioids, which first hit the scene in the mid-20th century. 
What I’d like to talk about today is the recent wave of opioid addictions, especially but not exclusively in the US, and the newest concern in this space, which is massively more powerful than anything that’s come before. — As I mentioned, there have been surges in opioid use, latent and externally forced, throughout modern human history. The Chinese saw an intense wave of opioid addiction after the British forced opium onto their markets, to the point that there was a commonly held belief that the British were trying to overthrow and enslave the Chinese by weighing them down with so many addicts who were incapable of doing much of anything; which, while not backed by the documentation we have from the era—it seems like they were just chasing profits—is not impossible, given what the Brits were up to around the world at that point in history. That said, there was a huge surge in opioid use in the late 1980s, when a US-based company called Purdue Pharma began producing and pushing a time-released opioid medication, which really hit the big-time in 1995, when they released a version of the drug called OxyContin. OxyContin flooded the market, in part because it promised to help prevent addiction and accidental overdose, and in part because Purdue was just really, really good at marketing it; among other questionable and outright illegal things it did as part of that marketing push, it gave kickbacks to doctors who prescribed it, and some doctors did so, a lot, even when patients didn’t need it, or were clearly becoming addicted. By the early 2000s, Purdue and the Sackler family that owned the company were spending hundreds of millions of dollars a year to push this drug, and they were making billions a year in sales. 
Eventually the nature of Purdue’s efforts came to light, there were a bunch of trials and other legal hearings, some investigative journalists exposed Purdue’s foreknowledge of their drug’s flaws, and there was a big government investigation and some major lawsuits that caused the collapse of the company in 2019—though they rebranded in 2021, becoming Knoa Pharma. All of which is interesting because much like the forced legalization of opium on Chinese markets led to their opioid crisis a long time ago, the arrival of this incredibly, artificially popular drug on the US market led to the US’s opioid crisis. The current bogeyman in the world of opioids—and I say current because this is a fast-moving space, with new, increasingly powerful or in some cases just a lot cheaper drugs arriving on the scene all the time—is fentanyl, which is a synthetic opioid that’s about 30-50 times more potent than heroin, and about 100 times as potent as morphine. It has been traditionally used in the treatment of cancer patients and as a sedative, and because of how powerful it is, a very small amount serves to achieve the desired, painkilling effect. But just like other opioids, its administration can lead to addiction, people who use it can become dependent and need more and more of it to get the same effects, and people who have too much of it can experience adverse effects, including, eventually, death. This drug has been in use since the 1960s, but illicit use of fentanyl began back in the mid-1970s, initially as its own thing, but eventually to be mixed in with other drugs, like heroin, especially low-quality versions of those drugs, because a very small amount of fentanyl can have an incredibly large and potent effect, making those other drugs seem higher quality than they are. 
That utility is also this drug’s major issue, though: it’s so potent that a small amount of it can kill, and even people with high opioid tolerances can see those tolerances pushed up and up and up until they eventually take a too-large, killing dose. There have been numerous efforts to control the flow of fentanyl into the US, and beginning in the mid-2010s, there were high-profile seizures of the illicitly produced stuff around the country. As of mid-2025, China seems to be the primary source of most illicit fentanyl around the world, the drug precursor produced in China, shipped to Mexico where it’s finalized and made ready for market, and then smuggled into the US. There have been efforts to shut down this supply chain, including recent tariffs put on Chinese goods, ostensibly, in part at least, to get China to handle those precursor suppliers. Even if that effort eventually bears fruit, though, India seems to have recently become an alternative source of those precursors for Mexican drug cartels, and for several years they’ve been creating new markets for their output in other countries, like Nigeria, Indonesia, and the Netherlands, as well. Amidst all that, a new synthetic drug, which is 40 times as potent as fentanyl, is starting to arrive in the US, Europe, and Australia, and has already been blamed for thousands of deaths—and it’s thought that that number might be a significant undercount, because of how difficult it can be to attribute cause with these sorts of drugs. Nitazenes were originally synthesized back in the 1950s in Austria, and they were never sold as painkillers because they were known, from the get-go, to be too addictive, and to have a bad tradeoff ratio: a little bit of benefit, but a high likelihood of respiratory depression, which is a common cause of death for opioid addicts, or those who accidentally overdose on an opioid. 
One nitazene, called isotonitazene, first showed up on US drug enforcement agency radars back in 2019, when a shipment was intercepted in the Midwest. Other agencies noted the same across the US and Europe in subsequent years, and this class of drugs has now become widespread in these areas, and in Australia. It’s thought that nitazenes might be seeing a surge in popularity with illicit drugmakers because their potency can be amped up so far, way, way higher than even fentanyl, and because their effects are similar in many ways to heroin. They can also use them the way they use fentanyl, a tiny bit blended into lower-quality versions of other drugs, like cocaine.

    14 min
4.8 out of 5, 509 Ratings

