
Data Center Politics
23/12/2025 | 16 mins.
This week we talk about energy consumption, pollution, and bipartisan issues. We also discuss local politics, data center costs, and the Magnificent 7 tech companies.

Recommended Book: Against the Machine by Paul Kingsnorth

Transcript

In 2024, the International Energy Agency estimated that data centers consumed about 1.5% of all electricity generated, globally, that year. It went on to project that energy consumption by data centers could double by 2030, though other estimates are higher, due to the ballooning of investment in AI-focused data centers by some of the world’s largest tech companies.

There are all sorts of data centers that serve all kinds of purposes, and they’ve been around since the mid-20th century, since the development of general-purpose digital computers, like the 1945 Electronic Numerical Integrator and Computer, or ENIAC, which was programmable and reprogrammable, and used to study, among other things, the feasibility of thermonuclear weapons.

ENIAC was built on the campus of the University of Pennsylvania and cost just shy of $500,000, which in today’s money would be around $7 million. It was able to do calculations about a thousand times faster than the electro-mechanical calculators that were available at the time, and was thus considered to be a pretty big deal, making some types of calculation that were previously not feasible, not only feasible, but casually accomplishable.

This general model of building big computers at a central location was the way of things, on a practical level, until the dawn of personal computers in the 1980s.
The mainframe-terminal setup that dominated until then necessitated that the huge, cumbersome computing hardware all be located in a big room somewhere, with terminal devices serving as points of access that allowed people to tap into those centralized resources.

Microcomputers of the sort a person might have in their home changed that dynamic, but the dawn of the internet reintroduced something similar, allowing folks to have a computer at home or at their desk, with its own resources, that could then tap into other microcomputers, and into still other larger, more powerful computers across internet connections. Going on the web and visiting a website is basically just that: connecting to another computer somewhere, that distant device storing the website data on its hard drive and sending the results to your probably less-powerful device, at home or work.

In the late-90s and early 2000s, this dynamic evolved still further, those far-off machines doing more and more heavy-lifting to create more and more sophisticated online experiences. This manifested first as websites that were malleable and editable by the end-user—part of the so-called Web 2.0 experience, which allowed for comments and chat rooms and the uploading of images to those sites, hosted on those far-off machines. Then, as streaming video and music and proto-versions of social networks became a thing, the channels connecting personal devices to more powerful, far-off devices needed more bandwidth, because more and more work was being done by those powerful, centrally located computers, so that the results could be distributed via the internet to all those personal computers and, increasingly, other devices like phones and tablets.

Modern data centers do a lot of the same work as those earlier iterations, though increasingly they do a whole lot more heavy-lifting labor, as well.
They’ve got hardware capable of, for instance, playing the most high-end video games at the highest settings, and then sending, frame by frame, the output of said video games to a weaker device, someone’s phone or comparably low-end computer at home, allowing the user of that weaker device to play those games, their keyboard or controller inputs sent to the data center fast enough that they can control what’s happening and see the result on their own screen in less than the blink of an eye.

This is also what allows folks to store backups on cloud servers, big hard drives located in such facilities, and it’s what allows the current AI boom to function—all the expensive computers and their high-end chips located at enormous data centers with sophisticated cooling systems and high-throughput cables that allow folks around the world to tap into their AI models, interact with them, and have them do heavy-lifting on their behalf, those computers at the data centers then sending all that information back out into the world, to users’ devices, even if those devices are underpowered and could never do that same kind of work on their own.

What I’d like to talk about today are data centers, the enormous boom in their construction, and how these things are becoming a surprise hot-button political issue pretty much everywhere.

—

As of early 2024, the US was host to nearly 5,400 data centers sprawled across the country.
That’s more than any other nation, and that number is growing quickly as those aforementioned enormous tech companies splurge on more and more of them. That group includes the Magnificent 7 tech companies, Nvidia, Apple, Alphabet, Microsoft, Amazon, Meta, and Tesla, which have a combined market cap of about $21.7 trillion as of mid-December 2025; that’s about two-thirds of the US’s total GDP for the year, and more than the European Union’s total GDP, which weighs in at around $19.4 trillion, as of October 2025.

These aren’t the only companies building data centers at breakneck speed—there are quite a few competitors in China doing the same, for instance—but they’re putting up the lion’s share of resources for this sort of infrastructure right now. That’s in part because they anticipate a whole lot of near-future demand for AI services, and those services require just a silly amount of processing power, which itself requires a silly amount of monetary investment and electricity. But it’s also because, first, there aren’t a lot of moats, meaning protective, defensive assets, in this industry, as is evidenced by these companies’ continual leapfrogging of each other, and by the notion that a lot of what they’re doing today will probably become commodity services before too long, rather than high-end services people and businesses will be inclined to pay big money for. And second, because there’s a suspicion, held by many in this industry, that there’s an AI shake-out coming, a bubble pop or, bare-minimum, a release of air from that bubble, which will probably kill off a huge chunk of the industry, leaving just the largest, too-big-to-fail players intact, who can then gobble up the rest of the dying industry at a discount.

Those who have the infrastructure, who have invested the huge sums of money to build these data centers, will be in a prime position to survive that extinction-level event, in other words.
So they’re all scrambling to erect these things as quickly as possible, lest they be left behind.

That construction, though, is easier said than done.

The highest-end chips account for around 70-80% of a modern data center’s cost, as these GPUs, graphical processing units that are optimized for AI purposes, like Nvidia’s Blackwell chips, can cost tens of thousands of dollars apiece, and millions of dollars per rack. There are a lot of racks of such chips in these data centers, and the total cost of a large-scale AI-optimized data center is often somewhere between $35 and $60 billion.

A recent estimate by McKinsey suggests that by 2030, data center investment will need to be around $6.7 trillion a year just to keep up the pace and meet demand for compute power. That’s demand from these tech companies, I should say—there’s a big debate about whether there’s sufficient demand from consumers of AI products, and whether these tech companies are trying to create such demand from whole cloth, to justify heightened valuations, and thus to continue goosing their market caps, which in turn enriches those at the top of these companies.

That said, it’s a fair bet that for at least a few more years this influx of investment will continue, and that means pumping out more of these data centers.

But building these sorts of facilities isn’t just expensive, it’s also regulatorily complex. There are smaller facilities, akin to ENIAC’s campus location back in the day, but a lot of them—because of the economies of scale inherent in building a lot of this stuff all at once, all in the same place—are enormous, a single data center facility covering thousands of acres and consuming a whole lot of power to keep all of those computers with their high-end chips running 24/7.

Data centers from the pre-AI era tended to consume in the neighborhood of 30MW of energy, but the baseline now is closer to 200MW.
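To make those cost proportions concrete, here’s a rough back-of-envelope sketch in Python. The per-GPU price and GPUs-per-rack count are illustrative assumptions, not figures from the episode; they’re chosen to sit within the "tens of thousands per chip, millions per rack" framing above.

```python
# Rough cost sketch for an AI-optimized data center.
# Assumptions (illustrative, not sourced): ~$35,000 per high-end GPU,
# 72 GPUs per rack.
gpu_price = 35_000          # dollars per GPU (assumed)
gpus_per_rack = 72          # GPUs per rack (assumed)
rack_cost = gpu_price * gpus_per_rack
print(f"Cost per rack: ${rack_cost:,}")  # $2,520,000 -- "millions per rack"

# If chips are ~75% of a $40B facility's cost (midpoints of the 70-80%
# and $35-60B figures above), how many racks does that imply?
facility_budget = 40e9      # dollars (assumed midpoint)
chip_share = 0.75           # assumed midpoint of the 70-80% range
racks = chip_share * facility_budget / rack_cost
print(f"Implied rack count: ~{racks:,.0f}")
```

Under those assumptions a single facility holds on the order of ten thousand racks, which is why the chip bill dominates everything else in the budget.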
The largest contemporary data centers consume 1GW of electricity, which is about the size of a small city’s power grid—a city of maybe 500,000-750,000 people, though of course climate, industry, and other variables determine the exact energy requirements of a city—and they’re expected to just get larger and more resource-intensive from here.

This has resulted in panic and pullbacks in some areas. In Dublin, for instance, the government has stopped issuing new grid connections for data centers until 2028, as it’s estimated that data centers will account for 28% of Ireland’s power use by 2031.

Some of these big tech companies have read the writing on the wall, and are either making deals to reactivate aging power plants—nuclear, gas, coal, whatever they can get—or are saying they’ll build new ones to offset the impact on the local power grid.

And that impact can be significant. There are the health and pollution issues caused by some of the sites—in Memphis, for instance, where Elon Musk’s company, xAI, built a huge data center to help power his AI chatbot, Grok, the company is operating 35 unpermitted gas turbines, which it says are temporary, but which have been exacerbating locals’ health issues and worsening local particulate pollution. And beyond those issues, energy prices across the US are up 6.9% year over year as of December 2025, which is much higher than overall inflation.
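As a sanity check on that city-sized comparison, here’s a quick sketch. The per-household consumption figure is an assumption (roughly the oft-cited US average of ~10,500 kWh per year), not a number from the episode:

```python
# How many average homes does 1 GW of continuous draw correspond to?
# Assumption: an average US household uses ~10,500 kWh/year,
# i.e. ~1.2 kW of average continuous load.
kwh_per_home_per_year = 10_500      # assumed average
hours_per_year = 24 * 365
avg_home_load_kw = kwh_per_home_per_year / hours_per_year

data_center_load_kw = 1_000_000     # 1 GW expressed in kW
homes = data_center_load_kw / avg_home_load_kw
print(f"1 GW of continuous draw ~ {homes:,.0f} average homes")  # on the order of 800k
```

That lands in the high hundreds of thousands of households, which is consistent with the small-city comparison, since a city of 500,000-750,000 people contains fewer households than residents and also runs businesses and industry on the same grid.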
Those costs are expected to increase still further as data centers claim more of the finite energy available on these grids, which in turn means less available for everyone else, and that scarcity, because of supply and demand, increases the cost of the remaining energy.

As a consequence of these issues, and of what’s broadly being seen as casual overstepping of laws and regulations by these companies, which often funnel a lot of money to local politicians to help smooth the path for their construction ambitions, there are bipartisan efforts around the world to halt construction on these things. Locals say the claimed benefits, like jobs, don’t actually make sense—the construction jobs will be temporary, and the data centers themselves don’t require many human maintainers or operators. And the facilities consume all that energy, in some cases might consume a bunch of water—possibly not as much as other grand-scale developments, like golf courses, but still—tend to generate a bunch of low-level, at times harmful background noise, can create a bunch of local pollution, and in general take up a bunch of space without giving any real benefit to the locals.

Interestingly, this is one of the few truly bipartisan issues that seems to be persisting in the United States, at a moment in which it’s often difficult to find things Republicans and Democrats can agree on. That’s seemingly because it’s not just a ‘big companies led by untouchable rich people stomping around in often poorer communities and taking what they want’ sort of issue, it’s also an affordability issue, because the installation of these things seems to already be pushing prices higher—when the price of energy goes up, the price of just about everything goes up—and it seems likely to push prices even higher in the coming years.

We’ll see to what degree this influences politics and platforms moving forward, but some local politicians are already making hay by building antagonism toward the construction of new data centers into their policy and campaign promises. And considering the speed at which these things are being constructed, and the slow build of resistance toward them, it’s also an issue that could persist through the US congressional election in 2026, to the subsequent presidential election in 2028.

Show Notes

https://www.wired.com/story/opposed-to-data-centers-the-working-families-party-wants-you-to-run-for-office/
https://finance.yahoo.com/news/without-data-centers-gdp-growth-171546326.html
https://time.com/7308925/elon-musk-memphis-ai-data-center/
https://wreg.com/news/new-details-on-152m-data-center-planned-in-memphis/
https://www.politico.com/news/2025/05/06/elon-musk-xai-memphis-gas-turbines-air-pollution-permits-00317582
https://www.datacenterwatch.org/report
https://www.govtech.com/products/kent-county-mich-cancels-data-center-meeting-due-to-crowd
https://www.woodtv.com/news/kent-county/gaines-township-planning-commission-to-hold-hearing-on-data-center-rezoning/
https://www.theverge.com/science/841169/ai-data-center-opposition
https://www.iea.org/reports/energy-and-ai/energy-demand-from-ai
https://www.cbre.com/insights/reports/global-data-center-trends-2025
https://www.phoenixnewtimes.com/news/chandler-city-council-unanimously-kills-sinema-backed-data-center-40628102/
https://www.mlive.com/news/ann-arbor/2025/11/rural-michigan-fights-back-how-riled-up-residents-are-challenging-big-tech-data-centers.html?outputType=amp
https://www.courthousenews.com/nonprofit-sues-to-block-165-billion-openai-data-center-in-rural-new-mexico/
https://www.datacenterdynamics.com/en/news/microsoft-cancels-plans-for-data-center-caledonia-wisconsin/
https://www.cnbc.com/2025/11/25/microsoft-ai-data-center-rejection-vs-support.html
https://www.wpr.org/news/microsoft-caledonia-data-center-site-ozaukee-county
https://thehill.com/opinion/robbys-radar/5655111-bernie-sanders-data-center-moratorium/
https://www.investopedia.com/magnificent-seven-stocks-8402262
https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/the-cost-of-compute-a-7-trillion-dollar-race-to-scale-data-centers
https://www.mckinsey.com/industries/technology-media-and-telecommunications/our-insights/ai-power-expanding-data-center-capacity-to-meet-growing-demand
https://www.marketplace.org/story/2025/12/19/are-energyhungry-data-centers-causing-electric-bills-to-go-up
https://en.wikipedia.org/wiki/Data_center
https://en.wikipedia.org/wiki/ENIAC

This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit letsknowthings.substack.com/subscribe

Chip Exports
16/12/2025 | 13 mins.
This week we talk about NVIDIA, AI companies, and the US economy. We also discuss the US-China chip-gap, mixed-use technologies, and export bans.

Recommended Book: Enshittification by Cory Doctorow

Transcript

I’ve spoken about this a few times in recent months, but it’s worth rehashing real quick, because this collection of stories and entities is so central to what’s happening across a lot of the global economy, and is also fundamental, in a very load-bearing way, to the US economy right now.

As of November of 2025, around the same time that Nvidia, the maker of the world’s best AI-optimized chips at the moment, became the world’s first company to achieve a $5 trillion market cap, the top seven highest-valued tech companies, including Nvidia, accounted for about 32% of the total value of the US stock market.

That’s an absolutely astonishing figure: while Nvidia, Apple, Microsoft, Alphabet, Amazon, Broadcom, and Meta all have a fairly diverse footprint even beyond their AI efforts, a lot of the value of all of them is predicated on expected future income; which is to say, their market caps, their value according to that measure, are determined not by their current assets and revenue, but by what investors think or hope they’ll pull in and be worth in the future.

That’s important to note because historically the sorts of companies that have market caps that are many multiples of their current, more concrete values are startups: companies in their hatchling phase that have a good idea and some kind of big potential, a big moat around what they’re offering or a blue-ocean sub-industry with little competition in which they can flourish, and investment is thus expected to help them grow fast.

These top seven tech companies, in contrast, are all very mature; they’ve been around for a while and have a lot of infrastructure, employees, expenses, and all the other things we typically associate with mature businesses, not flashy startups with their best days hopefully ahead of
them.

Some analysts have posited that part of why these companies are pushing the AI thing so hard, and in particular pushing the idea that they’re headed toward some kind of generally useful AI, or AGI, or superhuman AI that can do everyone’s jobs better and cheaper than humans can do them, is that in doing so, they’re imagining a world in which they, and they alone, because of the costs associated with building the data centers required to train and run the best-quality AI right now, are capable of producing basically an economy’s-worth of AI systems and bots and machines operated by those AI systems.

In other words, they’re creating, from whole cloth, an imagined scenario in which they’re not just worthy of startup-like valuations, worthy of market caps that are tens or hundreds of times their actual concrete value, because of those possible futures they’re imagining in public, but are the only companies worthy of those valuation multiples; the only companies that matter anymore.

It’s likely that even if this is the case, the folks in charge of these companies, and the investors who have money in them and are likely to profit when the companies grow and grow, actually do believe what they’re telling everyone about the possibilities inherent in building these sorts of systems.

But there also seems to be a purely economic motive for exaggerating a lot and clearing out as much of the competition as possible as they grow bigger and bigger.
Because maybe they’ll actually make what they’re saying they can make as a result of all that investment, all that exuberance. But maybe, failing that, they’ll just be the last companies standing after the bubble bursts and an economic wildfire clears out all the smaller companies that couldn’t get the political relationships and sustaining cash they needed to survive the clear-out, if and when reality strikes and everyone realizes that sci-fi outcome isn’t gonna happen, or isn’t gonna happen any time soon.

What I’d like to talk about today is a recent decision by the US government to allow Nvidia to sell some of its high-powered chips to China, and why that decision is being near-universally derided by those in the know.

—

In early December 2025, after a lot of back-and-forthing on the matter, President Trump announced that the US government will allow Nvidia, which is a US-based company, to export its H200 processors to China. He also said that the US government will collect a 25% fee on these sales.

The H200 is Nvidia’s second-best chip for AI purposes, and it’s about six times as powerful as the H20, which is currently the most advanced Nvidia chip that’s been cleared for sale to China.
The Blackwell chip that is currently Nvidia’s most powerful AI offering is about 1.5 times faster than the H200 for training purposes, and five times faster for AI inferencing, which is what these chips are used for after a model is trained, when it’s used for predictions, decisions, and so on.

The logic of keeping the highest-end chips from would-be competitors, especially military competitors like China, isn’t new—this is something the US and other governments have pretty much always done. Historically, even higher-end gaming systems like PlayStation consoles have been banned for export in some cases, because the chips they contained could be repurposed for military things, like plucking them out and using them to guide missiles. Sony was initially unable to sell the PlayStation 2 outside of Japan because it needed special permits to sell something so militarily capable outside the country, and the console remained unsellable in countries like Iraq, Iran, and North Korea throughout its production period.

The concern with these Nvidia chips is that if China has access to the most powerful AI processors, it might be able to close the estimated two-year gap between US companies and Chinese companies when it comes to the sophistication of their AI models and the power of their relevant chips. Beyond being potentially useful for productivity and other economic purposes, this hardware and software is broadly expected to shape the next generation of military hardware, and is already in use for all sorts of wartime and defense purposes, including sophisticated drones used by both sides in Ukraine.
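Chaining those stated ratios together gives a rough sense of the gap between what China can currently buy and Nvidia’s best, treating the cited multipliers as exact, which they aren’t:

```python
# Relative training performance, using the H20 (the chip currently
# cleared for sale to China) as the baseline of 1. The multipliers are
# the rough figures cited above, treated as exact for illustration.
h20 = 1.0
h200 = 6.0 * h20          # H200: ~6x the H20
blackwell = 1.5 * h200    # Blackwell: ~1.5x the H200 for training

print(f"H200 is ~{h200:.0f}x the H20")
print(f"Blackwell is ~{blackwell:.0f}x the H20 for training")
```

So by this arithmetic, approving the H200 moves China from a baseline chip to one roughly six times faster, while the export-banned Blackwell sits around nine times the baseline for training. Multiplying rough ratios compounds their uncertainty, though, so treat these as order-of-magnitude indicators, not benchmarks.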
If the US loses this advantage, the thinking goes, China might step up its aggression in the South China Sea, potentially even moving up plans to invade Taiwan.

Thus, one approach, which has been in place since the Biden administration, has been to do everything possible to keep the best chips out of Chinese hands, because that would ostensibly slow them down, making them less capable of just splurging on the best hardware, which they could then use to further develop their local AI capabilities.

This approach, however, also incentivized the Chinese government to double down on its own homegrown chip industry. That industry, again, is still generally thought to be about two years behind the US industry, but it does seem to be closing the gap rapidly, mostly by copying designs and approaches used by companies around the world.

An alternative theory, the one that seems to be at least partly responsible for Trump’s about-face on this, is that if the US allows the sale of sufficiently powerful chips to China, the Chinese tech industry will become reliant on goods provided by US companies, and thus its own homegrown AI sector will shrivel and never fully close that gap. If necessary, the US can then truncate or shut down those shipments, crippling the Chinese tech industry at a vital moment, and that would give the US the upper hand in many future negotiations and scenarios.

Most analysts in this space no longer think this is a smart approach, because the Chinese government is wise to this tactic, having used it itself all the time. Even in spaces where China has plenty of incoming resources from elsewhere, it still tries to shore up its own homegrown versions of the same, copying those international inputs rather than relying on them, so that someday they won’t be needed anymore.

The same is generally thought to be true, here.
Ever since the first Trump administration, when the US government started its trade war with China, the Chinese government has not been keen on ever relying on external governments and economies again. And it looks a lot more likely, based on what the Chinese government has said, and based on investments across the Chinese market in Chinese AI and chip companies following this announcement, that they’ll basically just scoop up as many Nvidia chips as they can, while they can, primarily for the purpose of reverse-engineering those chips and speeding up their gap-closing with US companies, and then, as soon as possible, sever that tie, competing with Nvidia rather than relying on it.

This is an especially pressing matter right now, then, because the US economy, and basically all of its growth, is so completely reliant on AI tech and the chips that are allowing that tech to move forward.

If this plan by the US government doesn’t pan out, and it ends up being a short-term-gain situation—a little bit of money earned from that 25% cut the government takes, and Nvidia temporarily enriching itself further through Chinese sales, in exchange for both entities giving up their long-term advantage to Chinese AI companies and the Chinese government—that could be bad not just for AI companies around the world, which could be rapidly outcompeted by Chinese alternatives, but also for all economies exposed to the US economy, which could be in for a long-term correction, slump, or full-on depression.

Show Notes

https://www.nytimes.com/2025/12/09/us/politics/trump-nvidia-ai-chips-china.html
https://arstechnica.com/tech-policy/2025/12/us-taking-25-cut-of-nvidia-chip-sales-makes-no-sense-experts-say/
https://www.pcmag.com/news/20-years-later-how-concerns-about-weaponized-consoles-almost-sunk-the-ps2
https://archive.is/20251211090854/https://www.reuters.com/world/china/us-open-up-exports-nvidia-h200-chips-china-semafor-reports-2025-12-08/
https://theconversation.com/with-nvidias-second-best-ai-chips-headed-for-china-the-us-shifts-priorities-from-security-to-trade-271831
https://www.economist.com/business/2025/12/09/donald-trumps-flawed-plan-to-get-china-hooked-on-nvidia-chips
https://www.scmp.com/tech/tech-trends/article/3335900/chinas-moore-threads-unveil-ai-chip-road-map-rival-nvidias-cuda-system
https://www.investopedia.com/nvidia-just-became-the-first-usd5-trillion-company-monitor-these-crucial-stock-price-levels-11839114
https://aventis-advisors.com/ai-valuation-multiples/

Digital Asset Markets
09/12/2025 | 13 mins.
This week we talk about in-game skins, investment portfolios, and Counter-Strike 2. We also discuss ebooks, Steam, and digital licenses.

Recommended Book: Apple in China by Patrick McGee

Transcript

Almost always, if you buy an ebook or game or movie or music album online, you’re not buying that ebook, or that game, or whatever else—you’re buying a license that allows you to access it, often on a specified device or in a specified way, and almost always in a non-transferable, non-permanent manner.

This distinction doesn’t matter much to most of us most of the time. If I buy an ebook, chances are I just want to read that ebook on the device I used to buy it, or on the Kindle attached to my Amazon or other digital book service account. So I buy the book, read it on my ebook reader or phone, and that’s that; same general experience I would have with a paperback or hardback book.

The difference becomes more evident when you think about what happens to the book after you read it, though. If I own a hard-copy, physical book, I can resell it. I can donate it. I can put it in a Little Free Library somewhere in my neighborhood, or give it to a friend who I think will enjoy it. I can pick it up off my shelf later and read the exact same book I read years before. Via whichever mechanism I choose, I’m either holding onto that exact book for later, or I’m transferring ownership of that book, that artifact containing words and/or images that can now be used, read, whatever, by that second owner. And they can go on to do the same: handing it off to a friend, selling it on eBay, or putting it on a shelf for later reference.

Often the convenience and immediacy of electronic books makes this distinction a non-issue for those who enjoy them.
I can buy an ebook from Amazon or Bookshop.org and that thing is on my device within seconds, giving me access to the story or information that’s the main, valuable component of a book for most of us, without any delay, without having to drive to a bookstore or wait for it to arrive in the mail. That’s a pretty compelling offer.

This distinction becomes more pressing, however, if I decide I want to go back and read an ebook I bought years ago, only to find that the license has changed and that book is no longer accessible via the marketplace where I purchased it. If that happens, I no longer have access to the book, and there’s no recourse for this absence—I agreed to this possibility when I “bought” the book, based on the user agreement I clicked ‘OK’ or ‘I agree’ on when I signed up for Amazon or whichever service I paid for that book-access.

It also becomes more pressing if, as has happened many times over the past few decades, the publisher or some other entity with control over these book assets decides to change them.

A few years ago, for instance, British versions of Roald Dahl’s ‘Matilda’ were edited to remove references to Joseph Conrad, who has in recent times been criticized for racist and colonialist themes in his writing. Some of RL Stine’s Goosebumps books were edited to remove references to crushes schoolgirls had on their headmaster, and descriptions of an overweight character that were, in retrospect, determined to be offensive. And various racial and ethnic slurs were edited out of some of Agatha Christie’s works around the same time.

Almost always, these changes aren’t announced by the publishers who own the rights to these books, and they’re typically only discovered by eagle-eyed readers who note that, for instance, the publishers decided to change the time period in which something occurred, which apparently happened in one of Stine’s works, without obvious purpose.
This also frequently happens without the author being notified, as was the case with Stine and the edits made to his books. And the publishers themselves, when asked directly about these changes, often remain silent on the matter.

What I’d like to talk about today is another angle of this distinction between physically owned media and digital, licensed versions of the same, and the at times large sums of money that can be gained or lost based on the decisions of the companies that control these licensed assets.

—

Counter-Strike 2 is a first-person shooter game that’s free-to-play, was released in 2023, and was developed by a company called Valve.

Valve has developed all sorts of games over the years, including the Counter-Strike, Half-Life, DOTA, and Portal games, but they’re probably best known for their Steam software distribution platform.

Steam allows customers to buy all sorts of software, mostly games, through an interface that also provides chat services and community forums. The primary utility of the platform is that it’s a marketplace for buying and selling games, but it also has matchmaking features for online multiplayer games, serves as a sort of library for gamers, so all their games are launchable from one place, and serves as a digital rights management hub, which basically means it helps game companies ensure users aren’t playing with pirated software—if you want to use Steam to store and launch your games, they have to be legit, purchased games, not pirated ones.

As of early 2025, it was estimated that Steam claimed somewhere between 75 and 80% of the PC gaming market, compared to competitors like the Epic Games Store, founded by the folks behind the wildly successful game Fortnite, which can only claim something like 5%.

And Counter-Strike is one of Valve’s, and Steam’s, crown jewels.
It’s a free-to-play game that was originally developed as a mod, a free add-on to another game Valve owns called Half-Life, but Valve bought up the rights to that mod and developed it into its own thing, releasing the initial entry in the series in 2000, several main-series games in subsequent years, and then Counter-Strike 2 in 2023, to much acclaim and fanfare.

Counter-Strike 2 often has around a million players online at any given moment, and its tournaments can attract closer to 1.5 million. As of early 2024, it was estimated that Counter-Strike 2 pulled in around a billion dollars a year for Valve, primarily via what are called Case Keys, which allow players to open in-game boxes, each key selling for $2.50. Valve also takes a 15% cut of all player-to-player sales of items conducted on the Steam Community Market, a secure eBay- or Amazon-like component of their platform where players can sell digital items from the game. These items are primarily aesthetic add-ons, like skins for weapons, stickers, and clothing—things that allow players to look different in the game, as opposed to things that allow them to perform better, which would give players who spent the most money an unfair advantage and thus make the game less competitive and fun.

Because this is a free game, though, and by many estimates a really balanced and well-made one, a lot of people play it, and a lot of people want to customize the look of their in-game avatar.
So being able to open in-game boxes that contain loot, and being able to buy and sell said loot on the Steam Community Market, has led to a rich secondary economy that makes that component of the game more interesting for players, while also earning Valve a whole lot of money on the backend for those keys and that cut of sales between players.

In late October of 2025, Valve announced a change in the rules for Counter-Strike 2, now allowing players to trade up more item types, including previously un-trade-up-able items like gloves and knives, into higher-grade versions of the same. So common items could be bundled together and traded in for less common items, and those less common items could be bundled together and traded up for rare ones.

This seems like a small move from the outside, but it roiled the CS2 in-game economy, by some estimates causing upwards of $2 billion to basically disappear overnight, because rare gloves and knives were at times valued at as much as $1.5 million; again, these are just aesthetic skins that change the look of a player’s avatar or weapons, but there’s enough demand for these things that some people are willing to pay that much for ultra-rare and unique glove and knife skins.

Because of that demand, some players had taken to spending real money on these ultra-rare items, treating their in-game portfolios of skins as something like an investment portfolio.
If you can buy an ultra-rare glove skin for $40,000 and maybe sell it later for twice that, that might seem like a really good investment, despite how strange it may seem to those not involved in this corner of the gaming world to spend $40,000 on what’s basically just some code in a machine that tells the game that the gloves on your avatar will look a certain way.

This change, then, made those rarer gloves and knives, which were previously unattainable except by lottery-like chance, a lot more common, because people could trade up for them, increasing their chances of getting the ultra-rare stuff. The market was quickly flooded with more of these things, and about half the value of rare CS2 skins disappeared, initially knocking about $3 billion of total value from a roughly $6 billion market, before the losses stabilized at around $1.5-2 billion.

Volatility in this market continues, and people who invested a lot of money, sometimes their life savings, and sometimes millions of dollars, into CS2 in-game skins have been looking into potential legal recourse, though without much luck; Valve’s user agreements make very clear that players don’t own any of this stuff, and as a result, Valve can manipulate the market however they like, whenever they like.

Just like with ebooks and movies we “buy” from Amazon and other services, then, these in-game assets are licensed to us, not sold.
We may, at times, have a means of putting our license to some of these things on a secondary market, but that secondary market exists completely at the whim of the entity that actually owns the digital assets—in this case, Valve.

Recent court cases have resulted in clearer language from some license-selling companies, including Valve—though in most cases the buttons we click still say something like “Buy Now” rather than “Acquire License,” and the specifics of what we’re purchasing are hidden within a wall of legal text.

So for the moment, at least, this sort of confusion will probably continue, with periodic wake-up calls for folks on the receiving end of updates or edits that impact them financially, or impact their ability to access what they thought they were buying, but which is later removed from their account, or changed without their knowledge or permission.

Show Notes
https://en.wikipedia.org/wiki/Steam_(service)
https://en.wikipedia.org/wiki/Valve_Corporation
https://theconversation.com/2b-counter-strike-2-crash-exposes-a-legal-black-hole-your-digital-investments-arent-really-yours-268749
https://blix.gg/news/cs-2/how-to-make-money-with-cs2-skins-in-2025/
http://tomshardware.com/video-games/ludicrous-usd6-billion-counter-strike-2-skins-market-crashes-loses-usd3-billion-overnight-game-update-destroys-inventories-collapses-market
https://www.kvue.com/article/news/nation-world/counter-strike-2-online-market-crash/507-ae9be038-2833-49d4-a5b4-d8f24fd0b33c
https://en.wikipedia.org/wiki/Counter-Strike
https://en.wikipedia.org/wiki/Epic_Games_Store
https://www.sahmcapital.com/news/content/counter-strike-skins-market-hits-1-billion-valves-virtual-goldmine-revealed-2024-01-22
https://mezha.ua/en/news/counter-strike-2-100-mln-dohodu-za-keysi-u-berezni-301011/
https://www.morganlewis.com/pubs/2024/10/california-becomes-first-state-to-pass-law-targeting-advertising-of-digital-media-licenses
https://en.wikipedia.org/wiki/Virtual_economy
https://www.nytimes.com/2023/04/04/arts/dahl-christie-stine-kindle-edited.html
https://bookriot.com/do-you-really-own-your-ebooks
https://jipel.law.nyu.edu/can-you-own-an-ebook-a-summary-of-the-anti-ownership-ebook-economy-report/

This is a public episode. If you'd like to discuss this with other subscribers or get access to bonus episodes, visit letsknowthings.substack.com/subscribe

Climate Risk
02/12/2025 | 16 mins.
This week we talk about floods, wildfires, and reinsurance companies. We also discuss the COP meetings, government capture, and air pollution.

Recommended Book: If Anyone Builds It, Everyone Dies by Eliezer Yudkowsky and Nate Soares

Transcript

The urban area that contains India’s capital city, New Delhi, called the National Capital Territory of Delhi, has a population of around 34.7 million people. That makes it the most populous city in the country, and one of the most populous cities in the world.

Despite the many leaps India has made over the past few decades, in terms of economic growth and overall quality of life for residents, New Delhi continues to have absolutely abysmal air quality—experts at India’s top research hospital have called New Delhi’s air “severe and life-threatening,” and the level of toxic pollutants in the air, from cars and factories and from the crop-waste burning conducted by nearby farmers, can reach 20 times the recommended level for safe breathing.

In mid-November 2025, the problem became so bad that the government told half its workers to work from home, because of the dangers represented by the air, and in the hope that doing so would remove some of the cars from the road and, thus, some of the pollution being generated in the area.

Trucks spraying mist along busy roads and pedestrian centers, using what are called anti-smog guns, help: the mist keeps some of the pollution from cars from billowing into the air and becoming part of the regional problem, rather than an ultra-localized one, and pushes the pollutants that would otherwise get into people’s lungs down to the ground. The use of these mist-sprayers has been controversial, though, as there are accusations that they’re primarily deployed near air-quality monitoring stations, and that those in charge put them there to make it seem like overall pollution levels are lower than they are, manipulating the stats so that their failure to improve practical air quality isn’t as evident.

And in other regional news, just southeast across the Bay of Bengal, the Indonesian government, as of the day I’m recording this, is searching for the hundreds of people who are still missing following a period of unusually heavy rains. These rains have sparked floods and triggered mudslides that have blocked roads, damaged bridges, and forced the evacuation of entire villages. More than 300,000 people have been evacuated as of last weekend, and more rain is forecast for the coming days.

The death toll of this round of heavy rainfall—the heaviest in the region in years—has already surpassed 440 people in Indonesia, with another 160 and 90 in Thailand and Vietnam, respectively, being reported by those countries’ governments, from the same weather system.

In Thailand, more than two million people were displaced by flooding, and the government had to deploy military assets, including helicopters launched from an aircraft carrier, to help rescue people from the roofs of buildings across nine provinces.

In neighboring Malaysia, tens of thousands of people were forced into shelters as the same storm system barreled through, and Sri Lanka was hit with a cyclone that left at least 193 dead and more than 200 missing, marking one of the country’s worst weather disasters in recent years.

What I’d like to talk about today is the climatic moment we’re at, as weather patterns change and, in many cases, amplify, and how these sorts of extreme disasters are also causing less-reported, but perhaps even more vital for future policy shifts, economic impacts.

—

The UN Conference of the Parties, or COP meetings, are high-level climate change conferences that have typically been attended by representatives from most governments each year, and where these representatives angle for various climate-related rules and policies, while also bragging about individual nations’ climate-related accomplishments.

In recent years, such policies have been less ambitious than in previous ones, in part because the initial goal of preventing a 1.5 degrees C increase in average global temperatures is almost certainly no longer attainable; climate models were somewhat accurate, but as with many things climate-related, seem to have actually been a little too optimistic—things got worse faster than anticipated, and now the general consensus is that we’ll continue to shoot past 1.5 degrees C over the baseline semi-regularly, and within a few years or a decade, that’ll become our new normal.

The ambition of the 2015 Paris Agreement is thus no longer an option. We don’t yet have a new rallying cry that’s generally acceptable to all those governments and their respective interests, and one of the world’s biggest emitters, the United States, is more or less absent from new climate-related meetings, except to periodically show up and lobby for lower renewables goals and an increase in subsidies for, and policies that favor, the fossil fuel industry.

The increase in both the number and potency of climate-influenced natural disasters is partly the result of this failure to act, and to act forcefully and rapidly enough, by governments and by all the emitting industries they’re meant to regulate.

The cost of such disasters is skyrocketing—there are expected to be around $145 billion in insured losses, alone, in 2025, which is 6% higher than in 2024—and their human impact is booming as well, including deaths and injuries, but also the number of people being displaced, in some cases permanently, by these disasters.

But none of that seems to move the needle much in some areas, in the face of entrenched interests, like the aforementioned fossil fuel industry, and the seeming inability of politicians in some nations to think and act beyond the needs of their next election cycle.

That said, progress is still being made on many of these issues; it’s just slower than it needs to be to reach previously set goals, like that now-defunct 1.5 degrees C ceiling.

Most nations, beyond petro-states like Russia and those with fossil fuel industry-captured governments like the current US administration, have been deploying renewables, especially solar panels, at extraordinary rates. This is primarily the result of China’s breakneck deployment of solar, which has offset a lot of energy growth that would have otherwise come from dirty sources like coal in the country, and which has led to a booming overproduction of panels that’s allowed them to sell said panels cheap, overseas.

Consequently, many nations, like Pakistan and a growing number of countries across Sub-Saharan Africa, have been buying as many cheap panels as they can afford and bypassing otherwise dirty and unreliable energy grids, creating arrays of microgrids instead.

Despite those notable absences, then, solar energy infrastructure installations have been increasing at staggering rates, and the first half of 2025 saw the highest rate of capacity additions yet—though China is still installing twice as much solar as the rest of the world, combined, at this point. Which is still valuable, as they still have a lot of dirty energy generation to offset as their energy needs increase, but more widely disseminated growth is generally seen to be better in the long term—so the expansion into other parts of the world is arguably the bigger win, here.

The economics of renewables may, at some point, convince even the skeptics, and those who are politically opposed to the concept of renewables rather than practically opposed to them, that it’s time to change teams.
Already, conservative parts of the US, like Texas, are becoming renewables boom-towns, quietly deploying wind and solar because they’re often the best, cheapest, most resilient options, even as their politicians rail against them in public and vote for more fossil fuel subsidies.

And it may be economics that eventually serves as the next nudge, or forceful shove, on this movement toward renewables, as we’re reaching a point at which real estate and the global construction industry, not to mention the larger financial system that underpins them and pretty much all other large-scale economic activities, are being not just impacted, but rattled at their roots, by climate change.

In early November 2025, real estate listing company Zillow, the biggest such company in the US, stopped showing extreme weather risks for more than a million home sale listings on its site.

It started showing these risk ratings in 2024, using data from a risk-modeling company called First Street, and the idea was to give potential buyers a sense of how at-risk a property they were considering buying might be when it comes to wildfires, floods, poor air quality, and other climate- and pollution-related issues.

Real estate agents hated these ratings, though, in part because there was no way to protest and change them, but also because, well, they might have an expensive coastal property listed that now showed potential buyers it was flood-prone, if not today, then in a couple of years. The ratings might also show a beautiful mountain property to be uninsurable because of the risk of wildfire damage.

A good heuristic for understanding the impact of global climate change is not to think in terms of warming, though that’s often part of it, but rather to think in terms of more radical temperature and weather swings.

That means areas that were previously at little or no risk of flooding might suddenly be very at risk of absolutely devastating floods.
And the same is true of storms, wildfires, and heat so intense that people die just from being outside for an hour, and in which components of one’s house might fry or melt.

This move by Zillow, the appearance and then removal of these risk scores, happened at the same time global insurers are warning that they may have to pull out of more areas, because it’s simply no longer possible for them to do business in places where these sorts of devastating weather events are happening so regularly, but often unpredictably, and with such intensity—and where the landscapes, ecologies, and homes are not made to withstand such things; all that stuff came of age or was built in another climate reality, so many such assets are simply not made for what’s happening now, and what’s coming.

This is of course an issue for those who already own such assets—homes in newly flood-prone areas, for instance—because it means if there’s a flood and a homeowner loses their home, they may not be able to rebuild or get a payout that allows them to buy another home elsewhere. That leaves some of these assets stranded, and it leaves a lot of people with a huge chunk of their total resources permanently at risk, unable to move them, or unable to recoup most of their investment and shift that money elsewhere. It also means entire industries could be at risk, especially banks and other financial institutions that provide loans for those who have purchased homes and other assets in such regions.

An inability to get private insurance also means governments will be increasingly on the hook for issuing insurance of last resort to customers, which often costs more, but also, as we’ve seen with flood insurance in the US, means the government tends to lose a lot of money when increasingly common, major disasters occur on its soil.

This isn’t just a US thing, though; far from it.
Global reinsurers Swiss Re and Munich Re, companies that provide insurance for insurance companies, and whose presence and participation in the market allow the insurance world to function, recently said that uninsurable areas are growing around the world right now, and that, lacking some kind of fundamental change to address the climate paradigm shift, we could see a period of devastation in which rebuilding is unlikely or impossible, and a resultant period in which there’s little or no new construction, because no one wants to own a home or factory or other asset that cannot be insured—it’s just not a smart investment.

This isn’t just a threat to individual homeowners, then; it’s potentially a threat to the whole of the global financial system, and every person and business attached to it, which in turn is a threat to global governance and the way property and economics work.

There’s a chance the worst-possible outcomes here can still be avoided, but with each new increase in global average temperature, the impacts become worse and less predictable, and the economics of simply making, protecting, and owning things become less and less favorable.

Show Notes
https://www.nytimes.com/2025/11/30/climate/zillow-climate-risk-scores-homes.html
https://www.nytimes.com/2025/11/30/climate/climate-change-disinformation.html
https://www.nytimes.com/2025/11/30/world/asia/india-delhi-pollution.html
https://www.nytimes.com/2025/11/30/world/asia/flooding-indonesia-thailand-southeast-asia.html
https://www.bbc.com/news/articles/c5y9ejley9do
https://www.theguardian.com/environment/2025/nov/22/cop30-deal-inches-closer-to-end-of-fossil-fuel-era-after-bitter-standoff
https://theconversation.com/the-world-lost-the-climate-gamble-now-it-faces-a-dangerous-new-reality-270392
https://theconversation.com/earth-is-already-shooting-through-the-1-5-c-global-warming-limit-two-major-studies-show-249133
https://www.404media.co/americas-polarization-has-become-the-worlds-side-hustle/
https://www.cnbc.com/2025/08/08/climate-insurers-are-worried-the-world-could-soon-become-uninsurable-.html
https://www.imd.org/ibyimd/sustainability/climate-change-the-emergence-of-uninsurable-areas-businesses-must-act-now-or-pay-later/
https://www.jec.senate.gov/public/index.cfm/democrats/2024/12/climate-risks-present-a-significant-threat-to-the-u-s-insurance-and-housing-markets
https://www.weforum.org/stories/2025/04/financial-system-warning-climate-nature-stories-this-week/
https://www.weforum.org/stories/2025/05/costs-climate-disasters-145-billion-nature-climate-news/
https://arstechnica.com/science/2025/11/solars-growth-in-us-almost-enough-to-offset-rising-energy-use/
https://ember-energy.org/latest-updates/global-solar-installations-surge-64-in-first-half-of-2025/

Thorium Reactors
25/11/2025 | 12 mins.
This week we talk about radioactive waste, neutrons, and burn while breeding cycles. We also discuss dry casks, radioactive decay, and uranium.

Recommended Book: Breakneck by Dan Wang

Transcript

Radioactive waste, often called nuclear waste, typically falls into one of three categories: low-level waste, which contains a small amount of radioactivity that will last a very short time—this is stuff like clothes or tools or rags that have been contaminated; intermediate-level waste, which has been contaminated enough that it requires shielding; and high-level waste, which is very radioactive material that creates a bunch of heat because of all the radioactive decay, so it requires both shielding and cooling.

Some types of radioactive waste, particularly spent fuel of the kind used in nuclear power plants, can be reprocessed, which means separating it into other types of useful products, including a type of mixed nuclear fuel that can be used in lieu of uranium, though generally not economically unless uranium supplies are low. About a third of all spent nuclear fuel has already been reprocessed in some way.

About 4% of even the recyclable stuff, though, doesn’t have that kind of second-life purpose, and that, combined with the medium- and long-lived waste that is quite dangerous to have just sitting around, has to be stored somehow, shielded and maybe cooled, and in some cases for a very long time: some especially long-lived fission products have half-lives that stretch into the hundreds of thousands or millions of years, which means they will be radioactive deep into the future, many times longer than humans have existed as a species.

According to the International Atomic Energy Agency, something like 490,000 metric tons of radioactive spent fuel is currently being stored, on a temporary basis, at hundreds of specialized sites around the world.
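To put those half-lives in concrete terms, the fraction of an isotope remaining after a given time follows simple exponential decay. Here’s a minimal sketch in Python; the ~211,000-year half-life in the example is the commonly cited figure for technetium-99, a long-lived fission product, and isn’t a number from the episode itself:

```python
def remaining_fraction(t_years: float, half_life_years: float) -> float:
    """Fraction of a radioactive isotope remaining after t_years,
    given its half-life: N(t)/N0 = (1/2) ** (t / half_life)."""
    return 0.5 ** (t_years / half_life_years)

# Technetium-99 (half-life ~211,000 years): after one full half-life,
# half of the original material is still radioactive.
print(remaining_fraction(211_000, 211_000))  # 0.5
```

After ten half-lives, over two million years for an isotope like that, about a thousandth of the original material remains, which is why storage plans have to account for timescales far beyond any human institution.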
The majority of this radioactive waste is stored in pools of water, cooled somewhere near the nuclear reactors where the waste originated. Other waste has been relocated into what’re called dry casks, which are big, barrel-like containers made of several layers of steel, concrete, and other materials, which surround a canister that holds the waste, and the canister is itself surrounded by inert gas. These casks hold and cool waste using natural air convection, so they don’t require any kind of external power or water sources, while other solutions, including storage in water, sometimes do—and often the fuel is initially stored in pools, and is then moved to casks for longer-term storage.

Most of the radioactive waste produced today comes in the form of spent fuel from nuclear reactors, typically small ceramic pellets made of low-enriched uranium oxide. These pellets are stacked on top of each other and encased in metal, and that creates what’s called a fuel rod.

In the US alone, about 2,000 metric tons of spent nuclear fuel is created each year, which is just shy of half an Olympic-sized swimming pool in terms of volume, and in many countries, the non-reusable stuff is eventually buried: near the surface for the low- to intermediate-level waste, and deeper for high-level waste—deeper, in this context, meaning something like 200-1,000 m, which is about 650-3,300 feet, beneath the surface.

The goal of such burying is to prevent potential leakage that might impact life on the surface, while also taking advantage of the inherent stability and cooler nature of underground spaces, which are chosen for their isolation, natural barriers, and water impermeability, and which are also often reinforced with human-made supports and security, blocking everything off and protecting the surrounding area so nothing will access these spaces far into the future, and so that they won’t be broken open by future glaciation or other large-scale impacts, either.

What I’d like to talk about today is another potential use of, and way of dealing with, this type of waste, and why a recent, related development in China is being heralded as such a big deal.

—

An experimental nuclear reactor was built in the Gobi Desert by the Chinese Academy of Sciences’ Shanghai Institute of Applied Physics, and back in 2023 the group achieved first criticality, got the reactor started up, basically, and it has been generating heat through nuclear fission ever since.

What that means is that the nuclear reactor did what a nuclear reactor is supposed to do. Most such reactors exist to generate heat, which then creates steam and spins turbines, which generates electricity.

What’s special about this reactor, though, is that it is a thorium molten salt reactor, which means it uses thorium instead of uranium as a fuel source, and the thorium is processed into uranium as part of the energy-making process, because thorium only contains trace amounts of fissile material, which isn’t enough to get a power-generating nuclear chain reaction going.

This reactor was able to successfully perform what’s called in-core thorium-to-uranium conversion, which allows the operators to use thorium as fuel and have that thorium converted into uranium, which is sufficiently fissile to produce nuclear power, inside the core of the reactor. This is an incredibly fiddly process, and it requires that the thorium-232 used as fuel absorb a neutron, which turns it into thorium-233.
Thorium-233 then decays into protactinium-233, and that, in turn, decays into uranium-233—the fuel that powers the reactor.

One innovation here is that this entire process happens inside the reactor, rather than occurring externally, which would require a bunch of supplementary infrastructure to handle fuel fabrication, increasing the amount of space and cost associated with the reactor.

The neutrons required to start the thorium conversion process are provided by small amounts of more fissile material, like enriched uranium-235 or plutonium-239, and the thorium is dissolved in a fluoride salt, becoming a molten mixture that allows it to absorb that necessary neutron and go through that multi-step decay process, turning into uranium-233. That end-point uranium then releases energy through nuclear fission, and this initiates what’s called a burn while breeding cycle, which means the reaction goes on to produce its own neutrons moving forward, obviating the need for those other, far more fissile materials that were used to start the chain reaction. All of which makes this process a lot more fuel-efficient than other options, dramatically reduces the amount of radioactive waste produced, and allows reactors that use it to operate a lot longer without needing to refuel, which also extends a reactor’s functional life.

On that last point, many typical nuclear power plants built over the past handful of decades use pressurized water reactors, which have to be periodically shut down so operators can replace spent fuel rods.
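The conversion chain described above can be written out as a neutron capture followed by two successive beta decays; the half-life figures below come from standard nuclear decay data, not from the episode itself:

```latex
% Neutron capture on thorium-232, then two beta-minus decays to uranium-233:
\[
^{232}\mathrm{Th} + n \;\longrightarrow\; ^{233}\mathrm{Th}
\;\xrightarrow[t_{1/2}\,\approx\,22\ \text{min}]{\beta^-}\;
^{233}\mathrm{Pa}
\;\xrightarrow[t_{1/2}\,\approx\,27\ \text{days}]{\beta^-}\;
^{233}\mathrm{U}
\]
```

Uranium-233 then fissions, releasing the neutrons that sustain the burn while breeding cycle described above.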
This new method instead allows the fissile materials to continuously circulate, enabling on-the-fly refueling—so no shut-down, no interruption of operations necessary.

This method also requires zero water, which could allow these reactors to be built in more and different locations, as conventional nuclear power plants have typically been built near large water sources, like oceans, because of their cooling needs.

China initiated the program that led to the development of this experimental reactor back in 2011, in part because it has vast thorium reserves it wanted to tap in its pursuit of energy independence, and in part because this approach to nuclear energy should, in theory at least, allow plant operators to use existing, spent fuel rods as part of the process. That could be very economically interesting: they could use the waste from their existing plants to help fuel these new plants, but also take such waste off other governments’ hands, maybe even being paid for it, because those other governments would then no longer need to store the stuff, and China could use it as cheap fuel; win-win.

Thinking further along, though, maybe the real killer application of this technology is that it allows for the dispersion of nuclear energy without the risk of nuclear weapons proliferation. The plants are smaller, they have a passive safety system that disallows the sorts of disasters we saw at Chernobyl and Three Mile Island—that sort of thing just can’t happen with this setup—and the fissile materials, aside from those starter materials used to get the initial cycle going, can’t be used to make nuclear weapons.

Right now, there’s a fair amount of uranium on the market, but just like oil, that availability is cyclical and controlled by relatively few governments.
In the future, that resource could become more scarce, and this reactor setup may become even more valuable as a result, because thorium is a lot cheaper and more abundant, and it’s less tightly controlled because it’s useless from a nuclear weapons standpoint.

This is only the very first step on the way toward a potentially thorium-reactor-dominated nuclear power industry, and the conversion rate on this experimental model was meager.

That said, it is a big step in the right direction, and a solid proof-of-concept, showing that this type of reactor has promise and would probably work scaled up, as well. That means the 100MW demonstration reactor China is also building in the Gobi, hoping to prove the concept’s full value by 2035, stands a pretty decent chance of having a good showing.

Show Notes
https://www.deepisolation.com/about-nuclear-waste/where-is-nuclear-waste-now
https://www.energy.gov/ne/articles/5-fast-facts-about-spent-nuclear-fuel
https://www.energy.gov/ne/articles/3-advanced-reactor-systems-watch-2030
https://world-nuclear.org/information-library/nuclear-fuel-cycle/nuclear-waste/radioactive-wastes-myths-and-realities
https://www.visualcapitalist.com/visualizing-all-the-nuclear-waste-in-the-world/
https://en.wikipedia.org/wiki/High-level_radioactive_waste_management
https://en.wikipedia.org/wiki/Radioactive_waste
https://en.wikipedia.org/wiki/Nuclear_reprocessing
https://en.wikipedia.org/wiki/Dry_cask_storage
https://en.wikipedia.org/wiki/Deep_geological_repository
https://onlinelibrary.wiley.com/doi/abs/10.1002/er.3854
https://archive.is/DQpXM
https://en.wikipedia.org/wiki/Thorium-based_nuclear_power
https://en.wikipedia.org/wiki/Thorium_fuel_cycle



Let's Know Things