
Objections to income inequality are commonplace. We hear these today from across the ideological spectrum, including, for example, from the far-left data-gatherer Thomas Piketty, the far-right provocateur Tucker Carlson, and Pope Leo XIV. 

Nothing is easier – and, apparently, few things are as emotionally gratifying – as railing against “the rich.” The principal qualification for issuing, and exulting in, denouncements of income inequality is first-grade arithmetic: One billion dollars is a larger sum of money than is ten thousand dollars, and so subtracting some dollars from the former sum and adding these funds to the latter sum will make incomes more equal. And because income is what people spend to achieve their standard of living, such ‘redistribution’ would also result in people being made more equal. What could be more obvious?

Countless careful researchers have convincingly shown that popular accounts of the magnitude of differences in monetary incomes are vastly overstated. But let’s here grant, for the sake of argument, that differences in monetary incomes within the United States are indeed vast. And then let’s pose some probing questions to proponents of using the state to tax and ‘redistribute’ high incomes.

• Do you teach your children to envy what other children have? Do you encourage your children to form gangs with their playmates to ‘redistribute’ toys away from richer kids on the schoolyard toward kids less rich? If not, why do you suppose that envy and ‘redistribution’ become acceptable when carried out on a large scale by the government?

• Suppose that Jones chooses a career as a poet. Jones treasures the time he spends walking in the woods and strolling city streets in leisurely reflection. His reflections lead him to compose poems critical of capitalist materialism. Working as a poet, Jones earns $40,000 annually. Smith chooses a career as an emergency-room physician. She works an average of 60 hours weekly and seldom takes a vacation. Her annual salary is $400,000. Is this “distribution” of income unfair? Is Smith responsible for Jones’s relatively low salary? Does Smith owe Jones money? If so, how much? And what formula would you use to determine Smith’s debt to Jones? What, in short, is the “fair” amount by which Smith’s income should be lowered in order to raise Jones’s income?

• While Dr. Smith earns more money than poet Jones, poet Jones earns more leisure than Dr. Smith. Do you believe that leisure has value to those who possess it? If so, are you disturbed by the inequality of leisure that separates leisure-rich Jones from leisure-poor Smith? Do you advocate policies to ‘redistribute’ leisure from Jones to Smith – say, by forcing Jones to wash Smith’s dinner dishes or to chauffeur Smith to and from work? If not, why not?

• Nobel-laureate economist William Nordhaus found that entrepreneurial innovators in the US from 1948 through 2001 captured, on average, only 2.2 percent of the total social value of their technological innovations. As Nordhaus put it, nearly 98 percent “of the benefits of technological change are passed on to consumers rather than captured by producers.” Does the fact that market competition obliges entrepreneurs to share the vast bulk of their wealth creation with consumers give you pause in your demands for ‘redistributing’ the wealth that these entrepreneurs manage to retain for themselves?

• Surveys show that Americans in general are not as bothered by income inequality as are academics and media pundits. Are the many Americans who don’t suffer searing envy of others’ monetary incomes stupid, naïve, or uninformed? Do the professors and pundits who agonize incessantly over income inequality know something that most Americans don’t? If so, what?

• You allege that great differences in incomes are psychologically harmful to relatively poor people even if these poor people are, by historical standards, quite wealthy. How, then, do you explain the great demand of very poor immigrants to come to America, where these immigrants are relatively much poorer than they are in their native lands?

• Do you believe that someone to whom government gives, say, $100,000 annually, year in and year out, simply because that person is a citizen of the country, feels as much psychological satisfaction as that person would feel if he learned a trade or a profession at which he earns an annual salary of, say, $80,000?

• Would you prefer to live in a society in which everyone’s annual income is $50,000 or in a society with an average annual income of $75,000 but in which annual incomes range from $30,000 to $3 million, and in which no occupation is obstructed by government-erected barriers to entry? And regardless of the choice you would make, do you believe that others who choose differently from you are in error?

• You often speak of income inequality as a market failure. Can you identify an economic theory that predicts that every well-functioning market economy generates incomes that are equal or close to equal? I’m an economist and have never encountered such a theory, so I’d be delighted if you would expand my intellectual horizons.

• You also warn that large differences in incomes make society unstable – or, as Paul Krugman insists, jeopardize “the whole nature of our society.” Can you point to historical evidence in support of this claim? But remember: To be valid, the evidence must be from market economies in which the great majority of people – rich and not rich – earn their incomes through voluntary market activities and where the size of the economic pie isn’t fixed.

Evidence of social unrest in pre-industrial and nonmarket societies doesn’t count. Economic arrangements in such societies are fundamentally different than in our own. And unlike in our market economy, the amount of wealth in nonmarket economies is largely fixed. Therefore, in nonmarket economies, more wealth for some people does indeed mean less wealth for other people. Our economy differs categorically: Because the amount of wealth in market economies isn’t fixed, people get rich by creating more wealth rather than by seizing the wealth of others. In market economies, more wealth for rich people means, not less, but more wealth for other people.

• When you describe growing income inequality in the United States, you typically look only at the incomes of the rich before they pay taxes and at the incomes of the poor before they receive noncash transfers from the government such as food stamps, Medicare and Medicaid. You also ignore noncash transfers that the poor receive from private charities. Why? If you’re trying to determine whether or not more income ‘redistribution’ is warranted, doesn’t it make more sense to look at income differences after the rich have paid their taxes and after the poor have received all of their benefits from government and private sources?

• Have you considered that greater income inequality might result from demographic changes that reflect neither weakness nor injustice in the economy, nor any increasing differences in economic well-being? For example, do you account for the fact that retirees rely heavily on consuming their capital – for instance, by selling their expensive large homes, moving into less-expensive smaller homes, and using the difference in sale proceeds to fund some of their living expenses? People’s annual incomes are typically lower when they are retired than when they were working, but their wealth – their ability to maintain their standard of living – isn’t necessarily lower.

• Do you not worry that creating government power today to take from Smith and give to Jones – simply because Smith has more material wealth than Jones – might eventually be abused so that tomorrow the government takes from Jones and gives to Smith simply because Smith has more political influence than Jones?

• Do you disagree with Thomas Sowell when he writes that “when politicians say ‘spread the wealth,’ translate that as ‘concentrate the power,’ because that is the only way they can spread the wealth. And once they get the power concentrated, they can do anything else they want to, as people have discovered – often to their horror – in countries around the world”? Asked differently, if you worry that abuses of power are encouraged by concentrations of income, shouldn’t you worry even more that abuses of power are encouraged by concentrations of power?

Americans love a garage, but we don’t park cars there. We store old bikes with bent wheels, parts of beds and dressers, and a wide variety of tools and old clothes. Personally, I have a box of old electronic parts and cables that I’m sure I’ll use… someday.

It was not always like this. People had only a few things, and they shared them through reciprocal borrowing. Often there was some communal arrangement for bigger things, as with the neighborhood bread ovens of medieval Europe.

Today, though, separate private ownership is the norm. To understand why, one has to understand transaction costs. Of course, I would say that, since I think the answer to almost every question in economics is “transaction costs.” But in this case, it’s actually true.

Coase’s Other Question: Why Do We Own Instead of Rent?

Ronald Coase famously asked, “If markets are so great, why are there firms?” His answer was “transaction costs.” Using markets is costly. Prices are important signals at broad levels, but for many small and routine choices, it’s much cheaper to “internalize” the costs of using markets. Firms arise to bypass markets and reduce transaction costs.

If Coase were alive today, he might ask a different question: If renting is cheaper and more efficient for most durables, why do we own almost everything?

Different question, but the same answer: transaction costs. Ownership internalizes the costs of using rental markets or sharing arrangements. The transaction cost of renting a ladder, a car, a spare room, or a kitchen appliance wildly exceeded the value to the potential user, or the revenue an owner could hope to earn.

To share any asset or tool, you must account for certain costs:

  1. Triangulation – finding someone who wants to rent it to you.
  2. Transfer – moving the item or coordinating access and payments.
  3. Trust – ensuring the renter will not damage, steal, or misuse the asset.

Ownership was a workaround for high transaction costs. We purchased goods we barely used because we wanted reliable access to them without depending on others.

But what if depending on others and sharing were easy and cheap, even invisible?

Platforms as Middlemen Selling Reductions in Transaction Costs

Transactions “take place.” That means exchanges happen, but they also require a “location.” That place was once a physical market, or a mall. Today, the “place” can be virtual, on a platform. A platform’s true output is not a product but a service: the reduction of the transaction costs that otherwise make commodifying excess capacity expensive. Amazon, Uber, Airbnb, and thousands of other platforms sell connections. If I have a car, and a few minutes, and you need a ride, we can now transact at low cost. Platforms are factories that produce reliable cooperation between strangers. The result is that the idle time of durable assets becomes legible to markets. What was once an expensive object to be stored becomes an income-producing asset.

We buy durable goods only for the stream of services they provide. I don’t actually want a power tool; what I want is two holes in this wall, now. The lowest-cost solution to that problem (triangulating, transferring, and trusting) has been ownership. But that comes bundled with idle storage time: a drill may be used for 10 minutes a year, yet occupies physical and financial space for 365 days.

The same is true for cars, guest bedrooms, clothing, tools, musical instruments, and row after row of kitchen equipment quietly rusting under our sinks. We store things to preserve future access: we want the option to use the thing immediately when we need it. But for most of our lives, we are simply paying a two-part tax — the opportunity cost of capital tied up in stuff, and the costs of storage — for the privilege of being able to get these two holes in this wall, right now, or for two extra chairs used only when company visits.
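As a rough illustration of that two-part tax, here is a minimal sketch comparing the annual cost of owning a rarely used drill with renting one per use. Every figure below (drill price, lifespan, storage cost, rental fee, per-rental transaction cost) is a hypothetical assumption chosen for illustration, not a number from the text.

```python
# Illustrative only: all figures are hypothetical assumptions.
# Compares the annual cost of owning a rarely used drill with renting
# one per use, before and after platforms cut transaction costs.

def own_per_year(price, life_years, opportunity_rate, storage_cost):
    # "Two-part tax" of ownership: capital tied up in the object
    # (straight-line depreciation plus forgone return) and storage.
    return price / life_years + price * opportunity_rate + storage_cost

def rent_per_year(uses, rental_fee, transaction_cost):
    # Each rental pays the fee plus the cost of triangulation,
    # transfer, and trust.
    return uses * (rental_fee + transaction_cost)

owning = own_per_year(price=120, life_years=10, opportunity_rate=0.05, storage_cost=10)
rent_before = rent_per_year(uses=1, rental_fee=15, transaction_cost=40)  # pre-platform frictions
rent_after = rent_per_year(uses=1, rental_fee=15, transaction_cost=2)    # frictions cut by a platform

print(f"Own: ${owning:.0f}/yr   rent (high frictions): ${rent_before:.0f}/yr   "
      f"rent (platform): ${rent_after:.0f}/yr")
```

Under the old frictions, buying beats renting even for one use a year, which is why the drill ends up in the garage; once a platform pushes the per-rental transaction cost toward zero, renting wins and the drill can stay on someone else’s shelf.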

It’s not just garages being used to store stuff we don’t use. The US has more than 23 million individual storage units in 50,000 facilities, with annual revenues of nearly $45 billion. Well over 10 percent of US households rent at least one storage unit; many rent several.

Two-Sided Markets

In two-sided markets, a person can be both buyer and seller, as suits them. Consider an example: A person who is normally a consumer of housing is leaving town for two weeks. She is now a seller or producer of housing services, renting out her flat on Airbnb. When she drives her car to the airport, “buying” transport services from herself, she parks in the Turo lot. The car is then rented out for 10 days, and is available for her when she returns from her trip. She both saves the cost of parking the car in the airport lot and makes money from renting the car out when it would otherwise sit idle. Overall, between renting her flat and renting out her car, she makes more than $4,000: enough to pay for most of her vacation expenses.

The bright-line distinction between producer and consumer, a relic of the Industrial Revolution, dissolves in platform space. This has consequences for policy, taxation, labor regulation, and even our intuitions about property. When excess capacity is commodified, ownership becomes less a categorical state and more a bundle of transferable rights that can be partitioned and sold temporarily.

In this sense, the “sharing economy” is poorly named. Nothing is being shared in the gift-exchange sense. What is being shared is temporary access. What is being commodified is excess capacity. Fifty years from now, observers will look back on our era with incredulity. Why did people buy all their own tools? Why did cities devote up to 30 percent of their usable road area to storing empty cars? (In Manhattan, it’s more than 40 percent!) Why did we allow trillions of dollars of capital to sit idle for 95+ percent of its lifespan?

The answer we will give — “Well, transaction costs were too high!”— will seem quaint.

Platforms are revealing that much of what we think of as “ownership” is really just expensive access insurance. Once platforms reliably provide that insurance, the original rationale for owning evaporates.

The commodification of excess capacity is not a fad. It is the natural consequence of entrepreneurs discovering that the most valuable thing they can sell is not a product or a service, but the reduction of friction that once made sharing impossible.

Every era of easy money produces its speculative mascots. In the late 1990s, it was Beanie Babies — tiny stuffed animals that cost under a dollar to make, yet sold for hundreds to thousands. The Princess Diana memorial bear that once fetched more than $60,000 now goes for around $3; one variant, Peanut the Royal Blue Elephant, once a $5,000 trophy, now sells for about $6. At the height of the craze, these toys accounted for 10 percent of all eBay listings. Their boom coincided perfectly with the late-1990s dot-com melt-up, a period defined by suppressed interest rates, rapid credit expansion, and rampant speculation. When the tech bubble burst in 2000, the Beanie Baby market had already collapsed — a micro-indicator of distorted price signals and misallocated capital.

A full generation later, the same underlying forces have produced the newest collectible frenzy: Labubus. Made by Chinese toymaker Pop Mart, these demonic-looking plush figures were sold in “blind boxes,” injecting a gambling-like payoff structure into retail purchases. For much of the past year, drops sold out instantly. Secondary-market prices rocketed: a limited-edition Vans Old Skool Labubu sold for $10,585, and a unique four-foot-tall version went for $170,000 in China. Counterfeits proliferated, and stores faced crowds, shouting matches, and physical brawls. Even Forbes briefly labeled the toys “good investments.”

But the correction has now arrived. Only months after its enthusiastic coverage, the same Forbes writer issued an update: prices were falling, inventories rising, and attention shifting elsewhere. Today, none of the 60 priciest Labubus on eBay has significant bids. And, as one middle-schooler put it succinctly, “they’re not that cool anymore.” The wall may have already been hit.

This pattern will feel familiar to readers of an earlier analysis of COVID-era collectible markets. As previously pointed out in our article on the “Pokéflation,” massive monetary expansion — stimulus payments, PPP funds, lockdown savings, and near-zero interest rates — drove Pokémon card prices up tremendously from 2020 through 2022. Professional grading services amplified the mania, valuations soared into the tens or hundreds of thousands, and speculative participation surged. That earlier piece traced how artificially suppressed interest rates mimicked an increase in real savings, misleading producers and consumers alike. The result, as Austrian theory predicts, was widespread malinvestment — new projects, longer production structures, and frenzied bidding in assets ranging from stocks and crypto to trading cards.

[Chart: Pop Mart stock price (9992 Hong Kong) and US Money Supply M2, 2000 – present. Source: Bloomberg Finance, LP]

The subsequent bust unfolded exactly as the Austrian explanation suggested. As rates rose in 2022 and credit contracted, asset prices slid broadly. The S&P 500 fell more than 20 percent, the NASDAQ nearly 30 percent, and Bitcoin dropped from $65,000 to below $19,000. NFT markets collapsed. A Holographic McDonald’s Pikachu card that once sold for $51 dropped to $16.88, a 67 percent decline. Jack Dorsey’s first-tweet NFT went from $2.9 million to $280. The BITA NFT Index fell 68 percent, and has declined consistently. As that earlier article argued, these corrections were the necessary liquidation of projects and valuations sustained by monetary illusion rather than genuine shifts in consumer time preferences.


Viewed through that lens, the Labubu phenomenon is less of an isolated cultural oddity than the next chapter in the same story — a minor but telling update to what was previously observed in Pokémon cards, NFTs, and other speculative submarkets. The same monetary distortions that propelled card prices and crypto valuations upward have pushed speculative enthusiasm into ever-more-marginal corners of consumer culture. When plush dolls trade hands at used-car prices, the underlying issue isn’t the toys themselves, but the environment producing the bidding frenzy.

If the broader asset bubble deflates (further), the policy response is predictable: rapid rate cuts and another round of extraordinary quantitative easing. But as Austrian theory warns, and as earlier episodes repeatedly confirmed: monetary manipulation merely postpones the reckoning and intensifies the next cycle of errors.

Are Labubus the new Beanie Babies? In the narrow sense, yes. But more importantly, they are a small, harmless reminder — an update, really — of the same deeper dynamic previously pointed out: easy money distorts choices, inflates curiosities into “investments,” and turns even plush toys into bellwethers of a monetary system addicted to perpetual stimulus.

The most common cancer in America is also one of the most preventable — if people simply had access to effective sunscreen. We spend $9 billion a year treating the cancerous effects of sun damage, not to mention the billions we spend to soothe the sun’s more minor effects. So many people get skin cancer that the statistics aren’t even reportable to cancer registries. But nearly all skin cancer is the result of sunlight and UV exposure, which means it is preventable. 

But that’s the (often greasy) rub: in the United States, sunscreen is locked inside a bureaucratic vault built in 1938, guarded by the Food and Drug Administration as if it were an experimental medical treatment. 

The FDA’s Precautionary Paralysis

Americans don’t hate sunscreen. We hate American sunscreen. Thick, greasy, chalky — our “broad spectrum” formulas barely block the most dangerous rays, meaning damaging UVA rays still get through. Even when you’re wearing “good” American sunscreen, you remain vulnerable to aging-accelerating sunspots and cancer-causing skin damage. 

Why? Because American sunscreen is trapped in a regulatory time warp. Since the late 1990s, the FDA has refused to approve a single new UV filter. Europe and Asia now use more than 30 modern filters, with similar safety standards. The US? Just 17 — most of them older, less effective, and less pleasant on the skin. Foreign formulations reach beyond visible sunburns well into the UVA wavelength, offering superior protection from the rays that cause 90 percent of visible aging and much of the skin cancer burden. That’s protection Americans are being deliberately denied.

Susan Swetter, MD, is exactly the person you’d want to ask about that kind of thing. She’s a professor of dermatology and the physician in charge of cutaneous oncology (skin cancers) at Stanford University Medical Center. She was blunt: “The best sunscreens abroad contain Tinosorb, Mexoryl or Uvinul — none of which are currently FDA-approved.”

The reason is almost comical. Because sunscreen prevents cancer, the FDA classifies it as a drug, not a cosmetic. Approving a new UV filter as a drug here requires animal testing, multi-million-dollar studies, and years — sometimes decades — of regulatory limbo before a new ingredient can hit US shelves. The result of this “precautionary principle” is not more safety but less. By locking out proven, widely used ingredients like bemotrizinol — sold abroad for more than 20 years under EU standards without incident — the FDA has left Americans with weaker protection, higher cancer rates, and ballooning medical costs.

It’s inaction in the name of public health, and the costs are becoming more visible.

“The sunscreen issue has gotten people to see that you can be unsafe if you’re too slow,” economist Alex Tabarrok told NPR. Regulation by delay doesn’t always prevent harm. In many cases, it guarantees harm.

Consumers vote with their wallets, importing bottles of Korean and European brands through web outlets, Reddit fora, and TikTok recommendations. New formulas are chemically superior: some are sweat-proof even in humid conditions, others defend skin against air pollution. Australian sunscreens are among the best in the world, and their SPF claims are rigorously checked and enforced. When sunscreen feels better, looks better, and works better, people are more likely to wear it.

Around the world, innovation races ahead where sunscreen is treated as skincare. 

The Incentive to Do Nothing

Industry has little reason to push the FDA to move faster. The cost of approval can reach $20 million, yet the reward is just 18 months of exclusivity. After that, competitors can copy the formula, leaving innovators to cover all the upfront costs.

Congress has repeatedly prompted the FDA to reconsider its classification and speed up approvals (most recently in November 2025’s continuing resolution, but also in 2020, 2014, 2011, and 2005, to absolutely no effect). If bemotrizinol wins FDA approval in 2026, it will be the first new filter in a generation.

Swiss-Dutch skincare company DSM-Firmenich, branding the compound as PARSOL Shield, has petitioned and lobbied the FDA for nearly a decade.

US companies keep recycling the same tired formulas. L’Oréal, Neutrogena, and others already sell better versions of their products abroad. Sephora is reportedly eager to supply better products to US buyers, but will have to settle for intentionally inferior formulations until the FDA moves. For 20 years, the discriminating skin care buyer could pay extra to import the good stuff.

And speaking of paying a premium for imported goods…

Tariffs Make a Bad Problem Worse

In 2025, the Trump administration slapped a 25 percent tariff on Korean imports, including cosmetics like sunscreen. Though the rate has seesawed since — sometimes 15 percent, sometimes zero — the uncertainty has sparked panic buying and price spikes. Retailers warn that if the full tariff returns, they’ll have no choice but to pass costs on to consumers.

Now, the administration has also eliminated the de minimis exemption, which used to let individuals import up to $800 in goods tariff-free. Without that protection from costly customs duties, millions of American consumers who rely on direct-to-door K-beauty orders will see the cost of reliable skincare soar overnight.

The Cost of Bureaucracy and Protectionism

First, the FDA blocks innovation, so American products are distinctly inferior. Reform (say, to streamline FDA approvals, remove required animal testing, or approve new UV filters) moves slower than skin cancer spreads across an unprotected brow. Now, ill-conceived trade policy threatens to choke off the only affordable workaround consumers have left. 

Bipartisan glimmers exist. Rep. Alexandria Ocasio-Cortez and Sen. Mike Lee have called for regulatory reform to streamline FDA approvals and allow modern testing methods without mandatory animal testing (funnily enough, we could rely on what amounts to a 30-year longitudinal study on the human populations of Europe and Southeast Asia).

Busybody-bullies make it their job to get in the way of consumers’ choices for themselves and entrepreneurs’ attempts to meet those needs. As usual, the twin idols of American bureaucracy — safety theater and national security hobgoblins — generate fear, feed lobbyists, and prop up campaign funding, but produce the opposite of their intent.

The results of FDA protection:

Not safety, but exposure — to the sun, to higher prices, to worse health outcomes, and an estimated 8,000 preventable deaths a year. Medicare and Medicaid are likely to pay billions annually to treat skin cancer that could’ve been prevented, had the FDA not, well, prevented that. 

American standards force companies here and elsewhere to produce deliberately inferior products at higher prices than those freely available to buyers in other countries. American manufacturers are excluded from a booming global skin care market. 

One of our best tools for reliable, risk-free cancer prevention is being treated as a luxury good. 

COVID may be on its way to being a chapter in our history books, but it’s left its fingerprints all over life as we know it. The world post-pandemic is not the same as the one we had in the early days of 2020. Work-from-home is now normalized; many companies are partially (and even fully) remote. Entire city populations have shifted as large numbers of Americans relocated around the country.

Perhaps less obvious to the naked eye — although not less significant — is COVID’s effect on the public school system.

The pandemic, and all the social changes that came with it, shattered some of our culture’s biggest educational taboos. More importantly, it shattered the illusion that our public schools are a great and trustworthy American institution.

Everybody talks about “COVID learning loss” (and the large gaps in learning students are still suffering months after school closures and missed lessons). Far less discussed is the “COVID trust loss” in our public schools (at a 25-year low) and all the ways that the social norms that bound public school together as an American bedrock have begun to fragment.

Since 2020, a wave of school choice policy has swept across the country. Its seeds were sown long before the pandemic, but COVID trust loss created the cultural soil in which they could finally take root.

Four shifts post-pandemic are changing the fabric of American education: increased transparency inside the classroom (and more light shed on all the shortcomings of public schooling), the breakdown of the homeschooling taboo, the shift toward remote work, and demographic migrations into states prioritizing school choice.

Each of these paradigm shifts is quietly rewriting the education world. Each is important, and each is reshaping education in its own way.

If you’re a skeptic of government-run schools, there’s a lot to be excited about.

Zoom School Revealed the Rot in Public Education

In the early days of COVID, public schools went online, and parents had the chance to watch what was happening in their child’s classroom in real time. Many were not pleased.

Teachers and administrators were trying to translate an already broken model of education onto a format it didn’t fit, breaking it even further in the process.

Parents, also shut up inside their houses and in close proximity to their children, saw Zoom school and were horrified — is this really what my kid does all day?

Some chalked it up to the shortcomings of the medium: public school was designed for real-life rooms and three-dimensional interactions, not computer screens.

Others (more astutely) blamed the model itself.

It’s no secret that America’s public school outcomes aren’t great — the Nation’s Report Card, published by the federal government, publicly documents as much. But many parents were confused by the content of their children’s classes — like the parents documented in the Sold a Story podcast, who were horrified to discover their children weren’t learning to read.

Public school enrollment dropped sharply. Many families switched to private schools (which re-opened faster than public schools) or began homeschooling. Some of those families returned to public school after the lockdowns ended, but many didn’t, and public school enrollment is trending downward. Nationally, enrollment dropped 2.5 percent between 2019 and 2023, and continues to decline.

Even in cities like Austin (with population trending upward), public school enrollment is falling — Austin’s school district has lost 10,000 students over the past decade, despite the city population growing by 10 percent (nearly 100,000 new residents) over the same period.

If public schools were private companies, and surveyed their customers (the parents) — or just looked at their retention data — the market feedback would be clear. Parents aren’t happy with the results public schools are delivering, and are looking elsewhere.

Zoom School Made Homeschooling Less Taboo

Millions of parents pulled their kids out of public school in 2020 and started homeschooling them, confident that their homespun instruction would be better than whatever Zoom school was meting out.

Nearly overnight, homeschooling — once a strange practice reserved for the hippies and religious zealots and social outcasts — became normal. It went from a fringe concept to a shared cultural experience.

Nearly everybody knows somebody who homeschooled for at least a few months during the pandemic. And you can’t say “homeschoolers are weirdos” without a tinge of irony if you yourself (or your sister, or your best friend, or your cool neighbor) were once a homeschooling parent, no matter what the extenuating circumstances.

More importantly, even if you weren’t homeschooling, you still had your kids at home all day — one of the core (unimaginable?) realities of homeschooling life. Pre-COVID, parents could say “I could never homeschool my kids, I can’t imagine having them home all day.” Post-COVID, no longer: having your kids home all day was something everyone could imagine, because it was something everyone had experienced.

By the time the lockdown had abated, having your child home all day had gone from unimaginable, to practical, to a very viable possibility for the future.

Work-From-Home Broke Up “Default” Childcare 

At the same time Zoom school was in full swing, COVID was permanently rearranging the workplace. Technology had long before made remote work possible; the pandemic forced employers to catch up. Parents went from 9-5 office residencies to commuting only as far as the kitchen table, taking meetings while waiting for their sourdough to rise.

One of the core services public school offers, to families and to society generally, is childcare. Parents who go to work need somewhere for their kids to go. But if parents work from home, they can be the adult in the room while kids do school — especially if their child is enrolled in an online program (so mom doesn’t have to be the teacher).

For many kids, especially older students who don’t need constant supervision, doing school online (with mom or dad in the other room for support when needed) is a viable option. If your child doesn’t like the public school curriculum, or prefers working at their own pace, or is on the receiving end of public school bullying and social hierarchies, online school can offer a compelling prospect.

During COVID, all sorts of online schools grew quickly: Sora School, an online project-based middle and high school; Synthesis, the game-based spinoff program from Elon Musk’s Ad Astra school; Kubrio, a “world school” with three time zone swaths and students from all around the globe — to name just a few.

More traditional models like online charter schools and public cyber schools are also on the menu; but for families with self-directed children, more custom combinations of tools and programs (Khan Academy paired with Teaching Company lectures, IXL supplemented with Coursera MOOCs) abound.

Post-COVID, remote work appears to be here to stay. As Cal Newport wrote in his book Slow Productivity, referencing Apple employees refusing to go back to the office: “These frustrated Apple employees [are] at the vanguard of a movement that’s leveraging the disruptions of the pandemic to question so many more of the arbitrary assumptions that have come to define the workplace.” 

This questioning of assumptions, not incidentally, applies equally to schooling.

Perhaps equally importantly, remote work also frees families from work-induced geographic constraints, making it easier for them to relocate to states with the best schools or robust school-choice supports.

COVID Migration Moved Families To Choice-Friendly States

COVID policy rearranged the demographic spread of the country en masse: people fled in droves from locked-down states (like New York and Illinois and California) to open ones (like Texas and Tennessee and Florida).

States with less-restrictive school closure policies also tend toward freer education policy (both are correlated with the relative power of teachers unions). Many of these states have since passed sweeping school choice policies, under which families have access to public vouchers for use at private schools.

The effect is that a large number of kids — who would’ve otherwise been stuck in states without school choice — now live in states with a huge number of school options emerging.

Eighteen states have implemented universal school choice since 2020. Some of those states, like Florida and Texas, are becoming hotbeds for education innovation, and many are seeing an increasing number of private school options emerging.

Incidentally (or perhaps not incidentally at all), many of these school choice-friendly states are also the places people are having the most kids — a positive indicator of the education market’s future growth. Where there is demand (young students) and capital (school choice dollars), supply (interesting new schools) will follow.

Part of the reason public schools in America have had such a monopoly on education is a set of surrounding circumstances: parents needed to work all day; no one was at home to watch the kids; tax dollars were bundled exclusively into the public school system — and everybody trusted the public school system. After all, it’s one of the great American institutions (or so we’re led to believe).

But with those undergirdings starting to shift, public school’s Herculean hold on the American psyche (and the American way of life) is shifting too. Logistically, we don’t need public schools the way we did a decade ago. We’re more skeptical of them. And alternatives have been destigmatized.

The 2.5 percent public school enrollment drop is still small; it’s early days. Nearly 50 million kids are still enrolled in public schools. Public education is still the default.

But the cultural landscape — and the cultural paradigm — has shifted. And the education landscape will continue to shift in response — slowly now, but more and more, until the unquestioned “default” of one school for all children feels as distant as normal reality did during the height of the pandemic.

The US housing market in late 2025 is defined by contradictory forces: rising prices but slowing growth, increasing inventory but falling affordability, and a demographic shift that is weakening long-run demand even as short-run supply remains structurally constrained. 

Against this backdrop, President Trump’s proposal for a 50-year mortgage is an attempt to stretch affordability in a market that has outpaced incomes, and it exposes deeper issues. Mortgage duration is both a financial feature and a policy artifact shaped by decades of government intervention dating back to the New Deal. A 50-year mortgage may expand access by lowering monthly payments, but it also dramatically increases lifetime interest costs and could raise prices depending on supply elasticity. The debate over this proposal is ultimately a debate over the real frictions in the housing market, namely interest-rate lock-in, constrained supply, and the institutional architecture that prevents solutions like portable mortgages from being widely available.

The American housing market rarely changes in sudden leaps. Prices adjust gradually, construction responds slowly, and mortgage product design barely shifts at all. That is why the mere suggestion of a 50-year mortgage by President Trump is so economically revealing. If housing finance policymakers are floating half-century debt structures, something fundamental in the market is out of balance.

Today’s housing market presents a strange tableau: home prices are still rising, but at a slowing pace. According to the latest Federal Housing Finance Agency (FHFA) data, prices are up roughly 2.2 percent year-over-year in Q3 2025. Sales have ticked up modestly, with existing-home transactions rising 1.2 percent in October. Inventory is finally improving, up about 12.6 percent year over year, driven mostly by new construction rather than existing homes. Mortgage rates have eased from their peaks and now sit in the six-percent range for many buyers. On paper this resembles a soft landing, but in reality the market remains defined by broad affordability stress.

Many homes are sitting unsold for long stretches, not because buyers are absent but because sellers are holding out for pandemic-era prices. Renters’ expectations of becoming homeowners have collapsed from 52.6 percent in 2019 to just 33.9 percent today, according to the Federal Reserve Bank of St. Louis. And demographic headwinds are emerging: births are declining, population growth is slowing, and long-run demand will weaken as the nation ages. Housing should be cooling naturally, yet it isn’t. The affordability crisis is so acute in the short run that policymakers are reaching for financial engineering solutions rather than addressing structural constraints.

It is against this backdrop that Trump proposed the 50-year mortgage, casting himself in the lineage of major housing-finance reformers, much like President Roosevelt ushering in the modern 30-year mortgage during the New Deal. By extending loan terms, the administration argues it can meaningfully lower monthly payments and open the door to homeownership for buyers priced out of today’s market. In that narrow sense, the idea appears palatable; when affordability is collapsing and buyers are increasingly constrained by monthly cash flow, stretching the mortgage horizon looks like an intuitive policy lever. But as with any major change in mortgage design, the economic logic is more complicated.

A longer mortgage lowers monthly payments at the cost of paying much more interest over many more years. For a median-priced $415,000 home purchased with an FHA loan at 3.5 percent down, here is how the math works:

| Mortgage Term | Interest Rate | Monthly Payment | Number of Payments | Total of All Payments | Total Interest Paid |
|---|---|---|---|---|---|
| 15-year | 5.5% | $3,272.21 | 180 | $588,998.69 | $188,523.69 |
| 30-year | 5.99% | $2,398.48 | 360 | $863,451.30 | $462,976.30 |
| 50-year | 6.4% | $2,227.44 | 600 | $1,336,462.35 | $935,987.35 |

The 50-year mortgage trims the monthly payment relative to the 30-year, but at the cost of doubling the total interest burden. For households focused solely on monthly cash flow, particularly first-time buyers, this tradeoff can appear worth it. A lower payment either gets a family into the home they want or allows them to buy a more expensive house. Economically, it functions like any intertemporal tradeoff: more affordability now, much higher cost later.
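For readers who want to check the arithmetic, here is a minimal sketch of the standard fixed-rate amortization formula applied to the table’s assumptions: a $415,000 price with 3.5 percent down, so roughly $400,475 financed, with FHA mortgage-insurance premiums and other fees ignored.

```python
# Minimal sketch of fixed-rate amortization. Assumes the financed
# principal is the $415,000 price minus the 3.5 percent down payment;
# FHA mortgage-insurance premiums and closing costs are ignored.

def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Level monthly payment for a fully amortizing fixed-rate loan."""
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

principal = 415_000 * (1 - 0.035)   # about $400,475 financed

for years, rate in [(15, 0.055), (30, 0.0599), (50, 0.064)]:
    pmt = monthly_payment(principal, rate, years)
    total = pmt * years * 12
    interest = total - principal
    print(f"{years}-year @ {rate:.2%}: payment ${pmt:,.2f}, "
          f"total paid ${total:,.2f}, interest ${interest:,.2f}")
```

Run as written, the sketch reproduces the table’s figures to within a few dollars of rounding.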

Critics argue that introducing ultra-long mortgages will push home prices higher. Whether they do depends entirely on supply elasticity. In markets where new housing is constrained by zoning, permitting delays, or not-in-my-backyard (NIMBY)-driven land-use restrictions, extended mortgage terms can, in fact, capitalize into higher prices. In elastic markets, the effect is muted. This is not a moral failing of the policy; it is simple microeconomics. If your policy goal is to increase homeownership, you accept certain tradeoffs, just as every country with 40- to 100-year mortgages has.

What often gets lost in the discussion is that the United States did not adopt the 30-year mortgage because markets naturally arrived at it. The product is fundamentally the outcome of government intervention. During FDR’s New Deal, the Federal Housing Administration standardized long-term amortized mortgages, displacing the short-term, interest-only loans that had dominated before the Great Depression. Fannie Mae later expanded liquidity and uniformity in mortgage finance. The 30-year mortgage was authorized by Congress in the late 1940s and eventually became dominant because federal agencies guaranteed it.

In other words, the “normal” American mortgage is not a market creation; it is a political one. A 50-year mortgage would simply be the next step in an 80-year continuum of policy-driven mortgage evolution.

The deeper issue in the housing market is not the absence of exotic mortgage products. It is that existing homeowners are frozen in place. Millions of households locked in 2–4 percent mortgage rates during the pandemic. With current rates near six percent, these homeowners don’t want to move, even when downsizing or scaling up might make sense. That keeps existing inventory off the market. Meanwhile, a record share of homes for sale are new construction, not existing properties.
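A rough calculation shows the size of the lock-in penalty. The numbers below are illustrative assumptions (a hypothetical $400,000 balance on a 30-year term at 3 percent versus 6 percent), not figures from the FHFA data above, but they convey how large the payment jump is for a household that gives up a pandemic-era rate to move.

```python
# Illustrative only: hypothetical $400,000 balance on a 30-year term,
# comparing a locked-in pandemic-era 3% rate with a ~6% rate on a new loan.

def monthly_payment(principal, annual_rate, years=30):
    r = annual_rate / 12          # monthly interest rate
    n = years * 12                # number of monthly payments
    return principal * r / (1 - (1 + r) ** -n)

balance = 400_000
old = monthly_payment(balance, 0.03)   # payment at the locked-in rate
new = monthly_payment(balance, 0.06)   # payment if the same balance is re-borrowed today
print(f"3% payment: ${old:,.0f}/mo, 6% payment: ${new:,.0f}/mo, "
      f"increase: ${new - old:,.0f}/mo")
```

Under these assumptions the monthly payment rises by roughly $700, or more than 40 percent, which is the financial wall that keeps would-be sellers in place.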

One innovative solution would be portable mortgages, where the borrower could keep their existing mortgage rate but shift the collateral to a new home. Instead of being trapped in their current house because of a pandemic-era 3 percent mortgage, a household could sell, buy a different property, and simply move the lien from House A to House B. In theory, this would dramatically improve mobility, unfreeze existing-home inventory, and loosen one of the tightest bottlenecks in the current housing market: interest-rate lock-in.

But portable mortgages do not exist in the United States for reasons deeply rooted in the structure of American mortgage finance. The US system is built around long-term, fixed-rate mortgages that are pooled into mortgage-backed securities, financial instruments priced according to the specific borrower and the specific property at the moment the loan is issued. Letting borrowers carry their old loan to a new house would upend that securitization model, causing investors to absorb unknown collateral risk midstream and making the securities far harder to price.

The dominance of the 30-year fixed-rate mortgage compounds the issue. If borrowers could port low rates across multiple moves, they would have little reason to refinance, starving lenders of the fee income and interest-rate resets they depend on to originate new loans. Investors could also face greater duration risk, being stuck earning three percent for decades even as market rates rise, causing them to demand higher rates across the entire mortgage market. And unlike countries such as the UK or Canada, where portable mortgages are common, US mortgages are secured by a specific property for the life of the loan and fixed for thirty years, not two to five.

Changing collateral midstream would require new appraisals, new legal filings, and a fundamental reengineering of mortgage securitization. All of this means that while portable mortgages could meaningfully improve housing mobility, they run directly counter to the incentives and infrastructure of the US mortgage system. Banks and investors prefer refinancing into higher rates, and the legal plumbing is built around property-specific collateral, not borrower-specific contracts. As a result, portable mortgages remain economically appealing in theory but institutionally implausible in practice.

If, however, banks and regulators could find a way to make portability compatible with the existing system, it could be a genuine game changer for American homeowners. Imagine a world where a young couple who locked in a low rate on their starter home is not punished financially for having a third child and needing more space, or where empty nesters can downsize without watching their mortgage payment jump simply because they move. Portability would make the mortgage contract follow the household’s life cycle rather than anchor it to a single property, smoothing mobility across labor markets, helping people move closer to better jobs, and reducing the mismatch between housing stock and household needs.

This would mean more efficient use of the existing housing stock, less pressure to overbuild in certain markets, and a healthier, more dynamic relationship between housing and labor-market mobility. It is precisely because the gains to households and to the broader economy are so large that portable mortgages are worth serious experimentation, even if the institutional and regulatory hurdles are high.

Ultimately, the debate over 50-year mortgages is less about exotic loan structures and more about the deeper structural limits of America’s housing system. Affordability has deteriorated because supply is constrained, mobility is frozen, and our mortgage architecture has not evolved with economic realities.

Extending mortgage terms may offer short-term relief, but the real innovations, like portable mortgages or reforms that unlock supply, require rethinking the institutional plumbing that has defined US housing finance since the New Deal. If policymakers want lasting affordability rather than financial patches, they must address the structural forces that make such extreme proposals politically viable in the first place.

In an era where “democratic socialism” has gained renewed traction among politicians, activists, and intellectuals, one might assume the term carries a clear, operational meaning. Yet, a closer examination reveals a concept shrouded in ambiguity, often serving as a rhetorical shield rather than a blueprint for policy.  

Proponents often invoke it to promise equality and democracy without the baggage of historical socialist failures, but this vagueness undermines serious discourse. Precise definitions are essential for theoretical, empirical, and philosophical scrutiny. Without them, democratic socialism risks becoming little more than a feel-good label, evading accountability while potentially eroding the very freedoms it claims to uphold. 

The Historical Consensus on Socialism: State Ownership and Its Perils 

During the socialist calculation debate of the early twentieth century – a clash between Austrian economists like Ludwig von Mises and Friedrich Hayek and their socialist counterparts, including Oskar Lange and Abba Lerner – the consensus definition of socialism was straightforward: state ownership of the means of production. As I demonstrate in my coauthored paper, “The Road to Serfdom and the Definitions of Socialism, Planning, and the Welfare State, 1930-1950,” this understanding was shared not only by critics but also by the socialist intellectuals of the time.

Socialism, in this context, entailed the state directing resources through planning, often requiring ownership to fund expansive welfare programs. This definition is crucial for interpreting Hayek’s seminal work, The Road to Serfdom (1944), which posits a unique threat to democracy arising from state ownership of the means of production. Hayek argued that central planning inevitably concentrates power, leading to authoritarianism as planners override individual choices to meet arbitrary goals. Far from a slippery slope toward any government intervention, Hayek’s warning targeted the specific dynamics of state-owned economies, where the absence of market prices stifles the flow of information and the structuring of incentives, ultimately endangering democratic institutions. Using this definition, my coauthors and I, in our paper “You Have Nothing to Lose but Your Chains?” empirically test and confirm Hayek’s hypothesis that democratic freedoms cannot be sustained under socialism.  

Economists working in this tradition, from Mises to contemporary scholars, retain this rigorous definition. It serves as a foundation for understanding why socialist systems have repeatedly faltered: without private ownership of the means of production, rational economic calculation becomes impossible, resulting in waste, shortages, and coercion.  

The Vagueness and Contradictions of Modern Socialist Rhetoric 

Contrast this clarity with the approach of many contemporary socialists, including those advocating democratic variants. Definitions of socialism often shift, praised in moments of perceived success and disowned when failures mount. This pattern is not new; it has recurred across a range of historical experiments, from the Soviet Union to Venezuela. Kristian Niemietz’s Socialism: The Failed Idea That Never Dies offers a comprehensive review of socialist rhetoric that highlights this inconsistency: regimes are initially hailed as “true” socialism, such as “worker-led” and “democratic,” only to be retroactively labeled as distortions or “state capitalism” once repression and economic stagnation emerge.  

When Hugo Chavez introduced socialism in Venezuela in 2005, he claimed that he was re-inventing socialism so as to avoid the outcomes of the Soviet Union, stating that they would “develop new systems that are built on cooperation, not competition,” and that they “cannot resort to state capitalism.” Bernie Sanders famously endorsed this socialism, saying that the American dream was more likely to be realized in places like Venezuela and calling the United States a banana republic in comparison. Nobel laureate economist Joe Stiglitz was quick to point out the “very impressive” growth rates and the eradication of poverty. But socialism in Venezuela, according to the state ownership of the economy measure from Varieties of Democracy, corresponded to the classic definition of socialism, leading to the very blackouts, empty grocery shelves, and suppression of political freedom socialists explicitly sought to avoid.

This vagueness extends to democratic socialism today. Proponents often speak in lofty terms, such as “workplace democracy,” without specifying policies. Such abstractions allow evasion of empirical evidence. By rendering the concept unfalsifiable, socialists can dismiss critiques as attacks on straw men, perpetuating debates that stall progress. If democratic socialists insist on reclaiming the term “socialism,” as distinct from the technical term used by economists, the burden falls on them to explicitly state their divergence and provide a concrete definition amenable to empirical testing. 

The Imperative of Precision for Empirical and Philosophical Inquiry 

A precise definition is not mere pedantry; it is the prelude to meaningful investigation. To enable cross-country comparisons, socialism must be defined through specific policies, not vague platitudes. What exact measures constitute this vision? Some socialists point to the Nordic countries as their model, but these countries have important differences between them. And, if a country is the model, then democratic socialists must consistently advocate for all the policies in that country, including those that might contradict their ideals, such as flexible labor markets or low corporate taxes. The Nordic countries, as measured by state ownership of the economy, are capitalist. Similarly, as measured by the Fraser Institute’s Economic Freedom of the World index, they are among the most economically free.

Empirical literature in economics often examines the effects of specific policies in isolation, separate from the discussion of comparative economic systems, revealing trade-offs often ignored by democratic socialists. Minimum wage laws, for example, often supported by unions, can reduce employment opportunities, particularly for low-skilled workers and minorities. Prevailing wage requirements, pushed by unions, may inflate costs and exclude smaller firms, suppressing economic mobility and also having racially disparate economic impacts.  

Philosophical debates demand equal rigor. Consider unions, a cornerstone of many democratic socialist platforms. Do proponents support secret-ballot laws, which protect workers from intimidation during union votes, or do they favor open ballots in the name of true democracy? Exempting unions—as labor cartels—from antitrust laws raises concerns: why allow monopolistic practices that could hike prices and limit competition, regressively harming consumers? If a national or subnational electorate democratically enacts right-to-work laws, preventing closed-shop unions, should this override a workplace vote? Such questions expose potential anti-democratic undercurrents, where “worker democracy” might privilege special interests over broader societal choice.

These inconsistencies highlight a deeper issue: democratic socialism often conflates social democracy – market economies with robust safety nets – with true socialism, diluting the latter’s radical edge while inheriting its definitional baggage. Without clarity, it risks repeating history’s errors, where good intentions devolve into coercion. 

Toward Clarity and Accountability 

Democratic socialism’s appeal lies in its promise of equity without tyranny, but its vagueness invites skepticism. Only by adhering to historical definitions and demanding specificity can we foster advancement in these debates. What policies do democratic socialists argue for exactly? How will they avoid the pitfalls of past experiments in socialism, which often started with the noblest of intentions? Until answered, democratic socialism remains an elusive mirage.  

The Trump administration is making good on its promise to shrink the bloated federal bureaucracy, starting with the Department of Education. Education Secretary Linda McMahon recently announced that her department has signed six interagency agreements with four other federal departments – Health and Human Services, Interior, Labor, and State – to shift major functions away from the Education Department.  

These agreements will redistribute responsibilities to agencies better equipped to handle them without the added layer of bureaucratic meddling: elementary and secondary education programs, including Title I funding for low-income schools, along with postsecondary education grants, go to the Department of Labor; Indian Education programs to the Interior Department; foreign medical accreditation and child care support for student parents to Health and Human Services; and international education and foreign language studies to the State Department.

Interagency agreements, or IAAs, aren’t some radical invention. They’re commonplace in government operations. The Department of Education already maintains hundreds of such pacts with other agencies to coordinate on everything from data sharing to program implementation. What makes this move significant isn’t the mechanism – it’s the intent. By offloading core duties, the administration is systematically reducing the department’s scope, making it smaller, less essential, and easier to eliminate altogether. This approach is the next logical step in a process aimed at convincing Congress to vote to abolish the agency entirely. 

Remember, the Department of Education was created by an act of Congress in 1979, so dismantling it requires congressional action. In the Senate, that means overcoming the filibuster, which demands a 60-vote supermajority. Without it, Republicans would need a handful of Democrats to cross the aisle – or they’d have to invoke the “nuclear option” to eliminate the filibuster for this legislation.  

Conservatives have wisely resisted that temptation. Ending the filibuster might feel expedient now, but it would set a dangerous precedent, allowing Democrats to ram through their big-government agendas – like expanded entitlements or gun control – with a simple majority the next time they hold power. It’s better to build consensus and preserve the procedural safeguards that protect limited government. 

The Trump team’s strategy is smart: It breaks down the bureaucracy piece by piece, demonstrating to the public and lawmakers that other agencies can handle education-related workloads more efficiently. Why prop up a standalone department riddled with waste when existing structures can absorb its functions? The administration’s approach goes beyond administrative housekeeping to serve as proof of concept that education policy belongs closer to home, not in the hands of distant D.C. officials. 

Of course, the only ones howling about sending education back to the states are the teachers unions and the politicians in their pockets. Groups like the National Education Association (NEA) and the American Federation of Teachers (AFT) thrive on centralized power. It’s easier for them to influence one federal agency into which they’ve already sunk their claws than to battle for control across 50 states and thousands of local districts.

We’ve seen this playbook in action. During the COVID-19 pandemic, unions lobbied the Centers for Disease Control and Prevention – another federal entity – to impose draconian guidelines that made school reopenings nearly impossible. They held children’s education hostage, demanding billions in taxpayer-funded ransom payments through stimulus packages. 

The unions’ power grab isn’t new. The Department of Education itself was born as a political payoff. Democrat President Jimmy Carter created it in 1979 to secure the NEA’s endorsement for his reelection bid. It’s no secret that teachers unions have long controlled Democrat politicians, but even some Republicans aren’t immune.  

Rep. Brian Fitzpatrick (R., Pa.) came out swinging against dismantling the department, claiming it was established “for good reason.” That “good reason” apparently includes his own ties to the unions. Fitzpatrick is the only Republican in Congress currently endorsed by the NEA. Back in 2018, the NEA even backed him over a Democrat challenger. Over the years, he’s raked in hundreds of thousands of dollars in campaign contributions from public-sector unions. Is it any wonder he’s against Trump’s plan?  

Meanwhile, more than 98% of the NEA’s political donations went to Democrats in the last election cycle, yet less than 10% of its spending went toward actually representing teachers. Follow the money, and you’ll see why federal control suits them just fine.

Sending education to the states would empower local communities, where parents and educators know best what’s needed. It would also mean more dollars reaching actual classrooms instead of lining the pockets of useless bureaucrats in Washington. Federal education spending gets skimmed at every level, with administrative overhead siphoning off funds that could buy books, hire teachers, or upgrade facilities. 

Critics claim abolishing the department would gut protections for vulnerable students, but that’s a red herring. Federal special-needs laws, like the Individuals with Disabilities Education Act, predated the department and can continue without it. Civil-rights enforcement in schools doesn’t require a dedicated agency; the Justice Department and other entities already handle similar oversight. Moreover, the word “education” appears nowhere in the US Constitution. The department’s very existence arguably violates the 10th Amendment, which reserves powers not delegated to the federal government to the states or the people. 

The evidence against federal involvement is damning. Since the department’s inception, Washington has poured about $3 trillion into K-12 education. Achievement gaps between rich and poor students haven’t closed, and in many cases, they’ve widened. Overall academic outcomes have stagnated or declined. Per-student spending, adjusted for inflation, has surged 108% since 1980, yet test scores remain flat. The US spends more per pupil than nearly any other developed nation, but our results are an international embarrassment. 

The Trump administration has already taken decisive action to chip away at this failed experiment. They’ve slashed millions in diversity, equity, and inclusion grants that promote division rather than learning. Thousands of department employees have been let go, streamlining operations and cutting costs. The unions are probably gearing up to sue over these latest interagency agreements. But they tried that before – challenging the administration’s personnel reductions – and lost at the Supreme Court. The chief executive has clear authority to manage the executive branch, and the unions would likely face another defeat if they push this latest move to litigation. 

It’s time to end the charade. The Department of Education focuses on control rather than helping kids. By dispersing its functions and proving the sky won’t fall, the Trump team is paving the way for real reform. America’s students deserve better than a federal fiefdom beholden to special interests. Let’s send education back where it belongs: to the states, the localities, and the families who know their children best. 

Thanksgiving draws people together, regardless of race or creed, around a table heavy with food and laughter. At its center sits a golden turkey, but it’s the sides (mashed potatoes, green beans, stuffing, and gravy) that spark the most excitement. American football murmurs from the television as plates and hands cross the table, passing dishes with the casual choreography of family life.

This is, in spirit, the very scene Frédéric Bastiat once imagined when he marveled at how Paris was fed each morning. “It staggers the imagination,” he wrote, “to comprehend the vast multiplicity of objects that must pass through its gates tomorrow… And yet all are sleeping peacefully at this moment.” No single mind coordinates the miracle and yet, it happens.

Thanksgiving is the modern version of Bastiat’s wonder. What we see is the feast itself: Mom and Nana pulling the turkey from the oven. The Department of Agriculture reports that roughly 46 million turkeys, about the population of Spain, are eaten every Thanksgiving. The extended family that arrives hours before the meal is ready joins the 1.6 million people who travel on Thanksgiving. Dad and his child switch between American football and the Macy’s Thanksgiving Day Parade, joining the more than 100 million viewers who tune in each year to broadcasts coordinated across satellites, networks, advertisers, and camera crews so that the same spectacle can play out in millions of living rooms at once.

What remains unseen are the invisible threads of cooperation that make the Thanksgiving table possible. Long before the turkey reached the oven, farmers in Iowa, Nebraska, and Arkansas were raising it, relying on feed grown by other farmers and transported by rail from thousands of miles away. The green beans and sweet potatoes come from networks of growers, processors, and distributors whose work depends on forecasts, algorithms, and trade routes most of us never think about. Truck drivers cross state lines to deliver ingredients to logistics managers who ensure that shelves stay stocked. Every piece comes together until someone realizes the cranberry sauce is missing. Last-minute panic sets in, and a quick dash to the grocery store follows.

Today such a trip isn’t met with wide-eyed wonder. But in 1989, during a policy shift called perestroika, or restructuring, the USSR sent a delegation to thaw relations with the United States. Alongside a tour of NASA’s Johnson Space Center in Texas, the foreign delegation made an unscheduled stop at a Randalls supermarket. Among them was future Russian President Boris Yeltsin, who, astonished by the variety of foods, remarked, “Even the Politburo doesn’t have this choice. Not even Mr. Gorbachev.” The visit left Yeltsin shaken: “I think we have committed a crime against our people by making their standard of living so incomparably lower than that of the Americans.”

Since the Bolshevik Revolution of 1917, the Soviet state endured famine with grim regularity. The Volga famine of 1921–1922 claimed between five and seven million lives. A decade later, the Holodomor of 1932–1933 starved another five to eight million, and after World War II, the famine of 1946–1947 took roughly two million more. Each disaster was born not of nature but of policy: central planning, forced collectivization, and the state’s determination to control production.

Into the 1990s, the pattern of scarcity persisted, mocking the propaganda that had declared, “Life has become better, comrades; life has become more joyous.” In April 1991, bread prices rose 300 percent, beef 400 percent, and milk 350 percent. Shortages grew so severe that Soviet President Mikhail Gorbachev appealed to the international community for humanitarian aid, with officials admitting that the USSR had “flung itself around the world, looking for aid and loans.”

Frozen chicken legs, nicknamed “Bush legs” after President George H. W. Bush, were shipped in to feed the population. The image carried an irony history could not have scripted better: just decades earlier, at the height of the Cold War in the 1950s, Soviet leader Nikita Khrushchev had thundered before Western diplomats, “About the capitalist states, it doesn’t depend on you whether or not we exist. If you don’t like us, don’t accept our invitations, and don’t invite us to come see you. Whether you like it or not, history is on our side. We will bury you.” Yet by the end of the century, the USSR that vowed to bury the West was surviving on American poultry – in other words, on capitalist chicken. The spiraling crisis soon escalated into nationwide strikes and protests demanding the end of the system itself. On Christmas Day, December 25, 1991, the Soviet Union dissolved, undone by the same command economy that had once promised to abolish hunger.

Even the most ardent bureaucrats, armed with vast tracts of farmland and central plans, could not guide the Soviet Union into prosperity, let alone feed its people. Yet the urge to direct, ration, and manage markets never disappears; it only changes its accent. 

Today, in New York City, the beating heart of global finance, the temptation to fix the market endures. Mayor-elect Zohran Mamdani has proposed government-run grocery stores as “a public option for produce,” arguing that too many New Yorkers find groceries out of reach. His plan would cost roughly $60 million, financed by raising the corporate tax rate to 11.5 percent and adding a new 2 percent levy on those earning over a million dollars a year. The precedent is not encouraging: a recent government-run grocery store in Kansas City failed, leaving local taxpayers with a $750,000 bill. New York’s food culture, meanwhile, already rests on some 13,000 independent bodegas: small, adaptive enterprises that thrive precisely because they respond to local needs. A state-run grocery network would not only crowd them out, but also make the city more vulnerable to the very shortages it hopes to prevent.

Thanksgiving is a yearly proof of concept for liberty: a society of free individuals coordinating better than any plan could dictate. From Moscow to New York, the lesson remains the same. The miracle of prosperity does not flow from ministries or mayors, but from the voluntary cooperation of ordinary people who produce, trade, and trust one another. 

The Soviet Union collapsed because it tried to command what can only be discovered: the daily knowledge of millions working freely. New York, for all its wealth, risks forgetting that lesson each time it trades competition for control. The feast that fills our tables each November is more than a meal; it is civilization itself, renewed by freedom and gratitude. It reminds us that civilization’s greatest miracles are not decreed; they are cooked, carried, traded, and shared by free people every day.