The most common cancer in America is also one of the most preventable — if people simply had access to effective sunscreen. We spend $9 billion a year treating the cancerous effects of sun damage, not to mention the billions we spend to soothe the sun’s more minor effects. Skin cancer is so common that most cases aren’t even tracked by cancer registries. But nearly all skin cancer is the result of sunlight and UV exposure, which means it is preventable.

But that’s the (often greasy) rub: in the United States, sunscreen is locked inside a bureaucratic vault built in 1938, guarded by the Food and Drug Administration as if it were an experimental medical treatment. 

The FDA’s Precautionary Paralysis

Americans don’t hate sunscreen. We hate American sunscreen. Thick, greasy, chalky — our “broad spectrum” formulas barely block the most dangerous rays, meaning damaging UVA rays still get through. Even when you’re wearing “good” American sunscreen, you remain vulnerable to aging-accelerating sunspots and cancer-causing skin damage. 

Why? Because American sunscreen is trapped in a regulatory time warp. Since the late 1990s, the FDA has refused to approve a single new UV filter. Europe and Asia now use more than 30 modern filters, with similar safety standards. The US? Just 17 — most of them older, less effective, and less pleasant on the skin. Foreign formulations protect beyond the UVB rays that cause visible sunburn, reaching well into the UVA spectrum and offering superior protection from the rays that cause 90 percent of visible aging and much of the skin cancer burden. That’s protection Americans are being deliberately denied.

Susan Swetter, MD, is exactly the person you’d want to ask about that kind of thing. She’s a professor of dermatology and the physician in charge of cutaneous oncology (skin cancers) at Stanford University Medical Center. She was blunt: “The best sunscreens abroad contain Tinosorb, Mexoryl or Uvinul — none of which are currently FDA-approved.”

The reason is almost comical. Because sunscreen prevents cancer, the FDA classifies it as a drug, not a cosmetic. Approving a new UV filter here requires extensive animal testing, multi-million-dollar studies, and years — sometimes decades — of regulatory limbo before a new ingredient can hit US shelves. The result of this “precautionary principle” is not more safety but less. By locking out proven, widely used ingredients like bemotrizinol — sold abroad for more than 20 years under EU standards without incident — the FDA has left Americans with weaker protection, higher cancer rates, and ballooning medical costs.

It’s inaction in the name of public health, and the costs are becoming more visible.

“The sunscreen issue has gotten people to see that you can be unsafe if you’re too slow,” economist Alex Tabarrok told NPR. Regulation by delay doesn’t always prevent harm. In many cases, it guarantees harm.

Consumers vote with their wallets, importing bottles of Korean and European brands through web outlets, Reddit fora, and TikTok recommendations. New formulas are chemically superior: some are sweat-proof even in humid conditions, others defend skin against air pollution. Australian sunscreens are among the best in the world, and their SPF claims are rigorously checked and enforced. When sunscreen feels better, looks better, and works better, people are more likely to wear it.

Around the world, innovation races ahead where sunscreen is treated as skincare. 

The Incentive to Do Nothing

Industry has little reason to push the FDA to move faster. The cost of approval can reach $20 million, yet the reward is just 18 months of exclusivity. After that, competitors can copy the formula, leaving innovators to cover all the upfront costs.

Congress has repeatedly prompted the FDA to reconsider its classification and speed up approvals (most recently in November 2025’s continuing resolution, but also in 2020, 2014, 2011, and 2005), to absolutely no effect. If bemotrizinol wins FDA approval in 2026, it will be the first new filter in a generation.

Swiss-Dutch skincare company DSM-Firmenich, branding the compound as PARSOL Shield, has petitioned and lobbied the FDA for nearly a decade.

US companies keep recycling the same tired formulas. L’Oréal, Neutrogena, and others already sell better versions of their products abroad. Sephora is reportedly eager to supply better products to US buyers, but will have to settle for intentionally inferior formulations until the FDA moves. For 20 years, the discriminating skin care buyer could pay extra to import the good stuff.

And speaking of paying a premium for imported goods…

Tariffs Make a Bad Problem Worse

In 2024, the Trump administration slapped a 25 percent tariff on Korean imports, including cosmetics like sunscreen. Though the rate has seesawed since — sometimes 15 percent, sometimes zero — the uncertainty has sparked panic buying and price spikes. Retailers warn that if the full tariff returns, they’ll have no choice but to pass costs onto consumers.

Now, the administration has also eliminated the de minimis exemption, which used to let individuals import up to $800 in goods tariff-free. Without that protection from costly customs duties, millions of American consumers who rely on direct-to-door K-beauty orders will see the cost of reliable skincare soar overnight.

The Cost of Bureaucracy and Protectionism

First, the FDA blocks innovation, so American products are distinctly inferior. Reform (say, to streamline FDA approvals, remove required animal testing, or approve new UV filters) moves slower than skin cancer spreads across an unprotected brow. Now, ill-conceived trade policy threatens to choke off the only affordable workaround consumers have left. 

Bipartisan glimmers exist. Rep. Alexandria Ocasio-Cortez and Sen. Mike Lee have called for regulatory reform to streamline FDA approvals and allow modern testing methods without the requirement for mandatory animal testing (funny enough, we could rely on a 30-year longitudinal study on the human populations of Europe and Southeast Asia).

Busybody-bullies make it their job to get in the way of consumers’ choices for themselves and entrepreneurs’ attempts to meet those needs. As usual, the twin idols of American bureaucracy — safety theater and national security hobgoblins — generate fear, feed lobbyists, and prop up campaign funding, but produce the opposite of their intent.

The results of FDA protection:

Not safety, but exposure — to the sun, to higher prices, to worse health outcomes, and an estimated 8,000 preventable deaths a year. Medicare and Medicaid are likely to pay billions annually to treat skin cancer that could’ve been prevented, had the FDA not, well, prevented that. 

American standards force companies here and elsewhere to produce deliberately inferior products at higher prices than those freely available to buyers in other countries. American manufacturers are excluded from a booming global skin care market. 

One of our best tools for reliable, risk-free cancer prevention is being treated as a luxury good. 

COVID may be on its way to being a chapter in our history books, but it’s left its fingerprints all over life as we know it. The world post-pandemic is not the same as the one we had in the early days of 2020. Work-from-home is now normalized; many companies are partially (and even fully) remote. Entire city populations have shifted as large numbers of Americans relocated around the country.

Perhaps less obvious to the naked eye — although not less significant — is COVID’s effect on the public school system.

The pandemic, and all the social changes that came with it, shattered some of our culture’s biggest educational taboos. More importantly, it shattered the illusion that our public schools are a great and trustworthy American institution.

Everybody talks about “COVID learning loss” (and the large gaps in learning students are still suffering months after school closures and missed lessons). Far less discussed is the “COVID trust loss” in our public schools (at a 25-year low) and all the ways that the social norms that bound public school together as an American bedrock have begun to fragment.

Since 2020, a wave of school choice policy has swept across the country. Its seeds were sown long before the pandemic, but COVID trust loss created the cultural conditions for them to take root.

Four shifts post-pandemic are changing the fabric of American education: increased transparency inside the classroom (and more light shed on all the shortcomings of public schooling), the breakdown of the homeschooling taboo, the shift toward remote work, and demographic migrations into states prioritizing school choice.

Each of these paradigm shifts is important, and each is quietly reshaping American education in its own way.

If you’re a skeptic of government-run schools, there’s a lot to be excited about.

Zoom School Revealed the Rot in Public Education

In the early days of COVID, public schools went online, and parents had the chance to watch what was happening in their child’s classroom in real time. Many were not pleased.

Teachers and administrators were trying to translate an already broken model of education onto a format it didn’t fit, breaking it even further in the process.

Parents, also shut up inside their houses and in close proximity to their children, saw Zoom school and were horrified — is this really what my kid does all day?

Some chalked it up to the shortcomings of the medium: public school was designed for real-life rooms and three-dimensional interactions, not computer screens.

Others (more astutely) blamed the model itself.

It’s no secret that America’s public school outcomes aren’t great — the Nation’s Report Card, published by the federal government, publicly documents as much. But many parents were confused by the content of their children’s classes — like the parents documented in the Sold a Story podcast, who were horrified to discover their children weren’t learning to read.

Public school enrollment dropped sharply. Many families switched to private schools (which re-opened faster than public schools) or began homeschooling. Some of those families returned to public school after the lockdowns ended, but many didn’t, and public school enrollment is trending downward. Nationally, enrollment dropped 2.5 percent between 2019 and 2023, and continues to decline.

Even in cities like Austin (with population trending upward), public school enrollment is falling — Austin’s school district has lost 10,000 students over the past decade, despite the city population growing by 10 percent (nearly 100,000 new residents) over the same period.

If public schools were private companies, and surveyed their customers (the parents) — or just looked at their retention data — the market feedback would be clear. Parents aren’t happy with the results public schools are delivering, and are looking elsewhere.

Zoom School Made Homeschooling Less Taboo

Millions of parents pulled their kids out of public school in 2020 and started homeschooling them, confident that their homespun instruction would be better than whatever Zoom school was meting out.

Nearly overnight, homeschooling — once a strange practice reserved for the hippies and religious zealots and social outcasts — became normal. It went from a fringe concept to a shared cultural experience.

Nearly everybody knows somebody who homeschooled for at least a few months during the pandemic. And you can’t say “homeschoolers are weirdos” without a tinge of irony if you yourself (or your sister, or your best friend, or your cool neighbor) were once a homeschooling parent, no matter what the extenuating circumstances.

More importantly, even if you weren’t homeschooling, you still had your kids at home all day — one of the core (unimaginable?) realities of homeschooling life. Pre-COVID, parents could say “I could never homeschool my kids, I can’t imagine having them home all day.” Post-COVID, no longer: having your kids home all day was something everyone could imagine, because it was something everyone had experienced.

By the time the lockdown had abated, having your child home all day had gone from unimaginable, to practical, to a very viable possibility for the future.

Work-From-Home Broke Up “Default” Childcare 

At the same time Zoom school was in full swing, COVID was permanently rearranging the workplace. Technology had long before made remote work possible; the pandemic forced employers to catch up. Parents went from 9-5 office residencies to commuting only as far as the kitchen table, taking meetings while waiting for their sourdough to rise.

One of the core services public school offers, to families and to society generally, is childcare. Parents who go to work need somewhere for their kids to go. But if parents work from home, they can be the adult in the room while kids do school — especially if their child is enrolled in an online program (so mom doesn’t have to be the teacher).

For many kids, especially older students who don’t need constant supervision, doing school online (with mom or dad in the other room for support if needed) is a viable option. If your child doesn’t like the public school curriculum, or prefers working at their own pace, or bears the brunt of public school bullying and social hierarchies, online school can offer a compelling prospect.

During COVID, all sorts of online schools grew quickly: Sora School, an online project-based middle and high school; Synthesis, the game-based spinoff program from Elon Musk’s Ad Astra school; Kubrio, a “world school” with three time zone swaths and students from all around the globe — to name just a few.

More traditional models like online charter schools and public cyber schools are also on the menu; but for families with self-directed children, more custom combinations of tools and programs (Khan Academy paired with Teaching Company lectures, IXL supplemented with Coursera MOOCs) abound.

Post-COVID, remote work appears to be here to stay. As Cal Newport wrote in his book Slow Productivity, referencing Apple employees refusing to go back to the office: “These frustrated Apple employees [are] at the vanguard of a movement that’s leveraging the disruptions of the pandemic to question so many more of the arbitrary assumptions that have come to define the workplace.” 

This questioning of assumptions, not incidentally, applies equally to schooling.

Perhaps equally importantly, remote work also frees families from work-induced geographic constraints, making it easier for them to relocate to states with the best schools or robust school-choice supports.

COVID Migration Moved Families To Choice-Friendly States

COVID policy rearranged the demographic spread of the country en masse: people fled in droves from locked-down states (like New York and Illinois and California) to open ones (like Texas and Tennessee and Florida).

States with less-restrictive school closure policies also tend towards freer education policy (both are correlated with the relative power of teachers unions). Many of these states have since passed sweeping school choice policies, giving families access to public vouchers for use at private schools.

The effect is that a large number of kids — who would’ve otherwise been stuck in states without school choice — now live in states where a huge number of school options are emerging.

Eighteen states have implemented universal school choice since 2020. Some of those states, like Florida and Texas, are becoming hotbeds for education innovation, and many are seeing new private school options emerge.

Incidentally (or perhaps not incidentally at all), many of these school choice-friendly states are also the places people are having the most kids — a positive indicator of the education market’s future growth. Where there is demand (young students) and capital (school choice dollars), supply (interesting new schools) will follow.

Part of the reason public schools in America have had such a monopoly on education is circumstantial: parents needed to work all day; no one was at home to watch the kids; tax dollars were exclusively bundled into the public school system — and everybody trusted the public school system. After all, it’s one of the great American institutions (or so we’re led to believe).

But with those undergirdings starting to shift, public school’s Herculean hold on the American psyche (and the American way of life) is shifting too. Logistically, we don’t need public schools the way we did a decade ago. We’re more skeptical of them. And alternatives have been destigmatized.

The 2.5 percent public school enrollment drop is still small; it’s early days. Nearly 50 million kids are still enrolled in public schools. Public education is still the default.

But the cultural landscape — and the cultural paradigm — has shifted. And the education landscape will continue to shift in response — slowly now, but more and more, until the unquestioned “default” of one school for all children feels as distant as normal reality did during the height of the pandemic.

The US housing market in late 2025 is defined by contradictory forces: rising prices but slowing growth, increasing inventory but falling affordability, and a demographic shift that is weakening long-run demand even as short-run supply remains structurally constrained. 

Against this backdrop, President Trump’s proposal for a 50-year mortgage is an attempt to stretch affordability in a market that has outpaced incomes, and it exposes deeper issues. Mortgage duration is both a financial feature and a policy artifact shaped by decades of government intervention dating back to the New Deal. A 50-year mortgage may expand access by lowering monthly payments, but it also dramatically increases lifetime interest costs and could raise prices depending on supply elasticity. The debate over this proposal is ultimately a debate over the real frictions in the housing market, namely interest-rate lock-in, constrained supply, and the institutional architecture that prevents solutions like portable mortgages from being widely available.

The American housing market rarely changes in sudden leaps. Prices adjust gradually, construction responds slowly, and mortgage product design barely shifts at all. That is why the mere suggestion of a 50-year mortgage by President Trump is so economically revealing. If housing finance policymakers are floating half-century debt structures, something fundamental in the market is out of balance.

Today’s housing market presents a strange tableau: home prices are still rising, but at a slowing pace. According to the latest Federal Housing Finance Agency (FHFA) data, prices are up roughly 2.2 percent year-over-year in Q3 2025. Sales have ticked up modestly, with existing-home transactions rising 1.2 percent in October. Inventory is finally improving, up about 12.6 percent year over year, driven mostly by new construction rather than existing homes. Mortgage rates have eased from their peaks and now sit in the six-percent range for many buyers. On paper this resembles a soft landing, but in reality the market remains defined by broad affordability stress.

Many homes are sitting unsold for long stretches, not because buyers are absent but because sellers are holding out for pandemic-era prices. Renters’ expectations of becoming homeowners have collapsed from 52.6 percent in 2019 to just 33.9 percent today, according to the Federal Reserve Bank of St. Louis. And demographic headwinds are emerging: births are declining, population growth is slowing, and long-run demand will weaken as the nation ages. Housing should be cooling naturally, yet it isn’t. The affordability crisis is so acute in the short run that policymakers are reaching for financial engineering solutions rather than addressing structural constraints.

It is amidst this backdrop that Trump proposed the 50-year mortgage, casting himself in the lineage of major housing-finance reforms, much like President Roosevelt’s role in ushering in the modern 30-year mortgage during the New Deal. By extending loan terms, the administration argues it can meaningfully lower monthly payments and open the door to homeownership for buyers priced out of today’s market. In that narrow sense, the idea appears palatable; when affordability is collapsing, and buyers are increasingly constrained by monthly cash flow, stretching the mortgage horizon looks like an intuitive policy lever. But as with any major change in mortgage design, the economic logic is more complicated.

A longer mortgage lowers monthly payments at the cost of paying much more interest over many more years. For a median-priced $415,000 home purchased with an FHA loan at 3.5 percent down, here is how the math works:

| Mortgage Term | Interest Rate | Monthly Payment | Number of Payments | Total of All Payments | Total Interest Paid |
| --- | --- | --- | --- | --- | --- |
| 15-year | 5.5% | $3,272.21 | 180 | $588,998.69 | $188,523.69 |
| 30-year | 5.99% | $2,398.48 | 360 | $863,451.30 | $462,976.30 |
| 50-year | 6.4% | $2,227.44 | 600 | $1,336,462.35 | $935,987.35 |

The 50-year mortgage trims the monthly payment relative to the 30-year, but at the cost of doubling the total interest burden. For households focused solely on monthly cash flow, particularly first-time buyers, this tradeoff can appear worth it. A lower payment either gets a family into the home they want or allows them to buy a more expensive house. Economically, it functions like any intertemporal tradeoff: more affordability now, much higher cost later.
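These figures follow from the standard fixed-rate amortization formula. A minimal Python sketch (the loan amount here is assumed to be the $415,000 price less the 3.5 percent down payment, ignoring FHA mortgage insurance premiums):

```python
def monthly_payment(principal: float, annual_rate: float, months: int) -> float:
    """Fixed-rate amortization: payment = P * r / (1 - (1 + r)^-n), r = monthly rate."""
    r = annual_rate / 12
    return principal * r / (1 - (1 + r) ** -months)

# Assumed loan amount: $415,000 median price less a 3.5% down payment.
principal = 415_000 * (1 - 0.035)  # $400,475

for years, rate in [(15, 0.055), (30, 0.0599), (50, 0.064)]:
    n = years * 12
    pmt = monthly_payment(principal, rate, n)
    print(f"{years}-year @ {rate:.2%}: ${pmt:,.2f}/mo, "
          f"${pmt * n - principal:,.0f} total interest")
```

Running it reproduces each term’s monthly payment and lifetime interest from the table, to within rounding.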

Critics argue that introducing ultra-long mortgages will push home prices higher. Whether they are right depends entirely on supply elasticity. In markets where new housing is constrained by zoning, permitting delays, or not-in-my-backyard (NIMBY)-driven land-use restrictions, extended mortgage terms can, in fact, capitalize into higher prices. In elastic markets, the effect is muted. This is not a moral failing of the policy; it is simple microeconomics. If your policy goal is to increase homeownership, you accept certain tradeoffs, just as every country with 40- to 100-year mortgages has.
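The capitalization mechanism can be made concrete by inverting the payment formula: if buyers bid up to whatever principal a given monthly budget can service, a longer term raises the maximum bid, and where supply cannot respond that extra bidding power shows up as price. A stylized sketch (rates and the monthly budget are taken from the table above; the one-for-one price pass-through assumes perfectly inelastic supply, an extreme case):

```python
def max_principal(payment: float, annual_rate: float, months: int) -> float:
    """Largest loan a fixed monthly payment can service (inverse amortization)."""
    r = annual_rate / 12
    return payment * (1 - (1 + r) ** -months) / r

budget = 2398.48  # monthly payment a 30-year buyer can afford

loan_30 = max_principal(budget, 0.0599, 360)  # roughly the $400,475 loan above
loan_50 = max_principal(budget, 0.064, 600)   # same payment spread over 600 months

# Under perfectly inelastic supply, the extra bidding power capitalizes
# into price roughly one-for-one; under elastic supply, builders absorb it.
print(f"30-year max bid: ${loan_30:,.0f}")
print(f"50-year max bid: ${loan_50:,.0f} ({loan_50 / loan_30 - 1:.1%} more)")
```

The same monthly budget supports a bid several percent larger at 50-year terms, which is exactly the margin an inelastic market can convert into higher prices.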

What often gets lost in the discussion is that the United States did not adopt the 30-year mortgage because markets naturally arrived at it. The product is fundamentally the outcome of government intervention. During FDR’s New Deal, the Federal Housing Administration standardized long-term amortized mortgages, displacing the short-term, interest-only loans that had dominated before the Great Depression. Fannie Mae later expanded liquidity and uniformity in mortgage finance. The 30-year mortgage was authorized by Congress in the late 1940s and eventually became dominant because federal agencies guaranteed it.

In other words, the “normal” American mortgage is not a market creation; it is a political one. A 50-year mortgage would simply be the next step in an 80-year continuum of policy-driven mortgage evolution.

The deeper issue in the housing market is not the absence of exotic mortgage products. It is that existing homeowners are frozen in place. Millions of households locked in 2–4 percent mortgage rates during the pandemic. With current rates near six percent, these homeowners don’t want to move, even when downsizing or scaling up might make sense. That keeps existing inventory off the market. Meanwhile, a record share of homes for sale are new construction, not existing properties.

One innovative solution would be portable mortgages, where the borrower could keep their existing mortgage rate but shift the collateral to a new home. Instead of being trapped in their current house because of a pandemic-era 3 percent mortgage, a household could sell, buy a different property, and simply move the lien from House A to House B. In theory, this would dramatically improve mobility, unfreeze existing-home inventory, and loosen one of the tightest bottlenecks in the current housing market: interest-rate lock-in.

But portable mortgages do not exist in the United States for reasons deeply rooted in the structure of American mortgage finance. The US system is built around long-term, fixed-rate mortgages that are pooled into mortgage-backed securities, financial instruments priced according to the specific borrower and the specific property at the moment the loan is issued. Letting borrowers carry their old loan to a new house would upend that securitization model, causing investors to absorb unknown collateral risk midstream and making the securities far harder to price.

The dominance of the 30-year fixed-rate mortgage compounds the issue. If borrowers could port low rates across multiple moves, they would have little reason to refinance, starving lenders of the fee income and interest-rate resets they depend on to originate new loans. Investors could also face greater duration risk, being stuck earning three percent for decades even as market rates rise, causing them to demand higher rates across the entire mortgage market. And unlike countries such as the UK or Canada, where portable mortgages are common, US mortgages are secured by a specific property for the life of the loan and fixed for thirty years, not two to five.

Changing collateral midstream would require new appraisals, new legal filings, and a fundamental reengineering of mortgage securitization. All of this means that while portable mortgages could meaningfully improve housing mobility, they run directly counter to the incentives and infrastructure of the US mortgage system. Banks and investors prefer refinancing into higher rates, and the legal plumbing is built around property-specific collateral, not borrower-specific contracts. As a result, portable mortgages remain economically appealing in theory but institutionally implausible in practice.

If, however, banks and regulators could find a way to make portability compatible with the existing system, it could be a genuine game changer for American homeowners. Imagine a world where a young couple who locked in a low rate on their starter home is not punished financially for having a third child and needing more space, or where empty nesters can downsize without watching their mortgage payment jump simply because they move. Portability would make the mortgage contract follow the household’s life cycle rather than anchor it to a single property, smoothing mobility across labor markets, helping people move closer to better jobs, and reducing the mismatch between housing stock and household needs.

This would mean more efficient use of the existing housing stock, less pressure to overbuild in certain markets, and a healthier, more dynamic relationship between housing and labor-market mobility. It is precisely because the gains to households and to the broader economy are so large that portable mortgages are worth serious experimentation, even if the institutional and regulatory hurdles are high.

Ultimately, the debate over 50-year mortgages is less about exotic loan structures and more about the deeper structural limits of America’s housing system. Affordability has deteriorated because supply is constrained, mobility is frozen, and our mortgage architecture has not evolved with economic realities.

Extending mortgage terms may offer short-term relief, but the real innovations, like portable mortgages or reforms that unlock supply, require rethinking the institutional plumbing that has defined US housing finance since the New Deal. If policymakers want lasting affordability rather than financial patches, they must address the structural forces that make such extreme proposals politically viable in the first place.

In an era where “democratic socialism” has gained renewed traction among politicians, activists, and intellectuals, one might assume the term carries a clear, operational meaning. Yet, a closer examination reveals a concept shrouded in ambiguity, often serving as a rhetorical shield rather than a blueprint for policy.  

Proponents often invoke it to promise equality and democracy without the baggage of historical socialist failures, but this vagueness undermines serious discourse. Precise definitions are essential for theoretical, empirical, and philosophical scrutiny. Without them, democratic socialism risks becoming little more than a feel-good label, evading accountability while potentially eroding the very freedoms it claims to uphold. 

The Historical Consensus on Socialism: State Ownership and Its Perils 

During the socialist calculation debate of the early twentieth century, a clash between Austrian economists like Ludwig von Mises and Friedrich Hayek and their socialist counterparts, including Oskar Lange and Abba Lerner, the consensus definition of socialism was straightforward: state ownership of the means of production. As I demonstrate in my coauthored paper, “The Road to Serfdom and the Definitions of Socialism, Planning, and the Welfare State, 1930-1950,” this understanding was shared not only by critics but also by the socialist intellectuals of the time.

Socialism, in this context, entailed the state directing resources through planning, often requiring ownership to fund expansive welfare programs. This definition is crucial for interpreting Hayek’s seminal work, The Road to Serfdom (1944), which posits a unique threat to democracy arising from state ownership of the means of production. Hayek argued that central planning inevitably concentrates power, leading to authoritarianism as planners override individual choices to meet arbitrary goals. Far from a slippery slope toward any government intervention, Hayek’s warning targeted the specific dynamics of state-owned economies, where the absence of market prices stifles the flow of information and the structuring of incentives, ultimately endangering democratic institutions. Using this definition, my coauthors and I, in our paper “You Have Nothing to Lose but Your Chains?” empirically test and confirm Hayek’s hypothesis that democratic freedoms cannot be sustained under socialism.  

Economists working in this tradition, from Mises to contemporary scholars, retain this rigorous definition. It serves as a foundation for understanding why socialist systems have repeatedly faltered: without private ownership of the means of production, rational economic calculation becomes impossible, resulting in waste, shortages, and coercion.  

The Vagueness and Contradictions of Modern Socialist Rhetoric 

Contrast this clarity with the approach of many contemporary socialists, including those advocating democratic variants. Definitions of socialism often shift, praised in moments of perceived success and disowned when failures mount. This pattern is not new; it has recurred across a range of historical experiments, from the Soviet Union to Venezuela. Kristian Niemietz’s Socialism: The Failed Idea That Never Dies offers a comprehensive review of socialist rhetoric that highlights this inconsistency: regimes are initially hailed as “true” socialism (“worker-led,” “democratic”), only to be retroactively labeled as distortions or “state capitalism” once repression and economic stagnation emerge.

When Hugo Chavez introduced socialism in Venezuela in 2005, he claimed that he was re-inventing socialism so as to avoid the outcomes of the Soviet Union, stating that Venezuela would “develop new systems that are built on cooperation, not competition” and “cannot resort to state capitalism.” Bernie Sanders famously endorsed this socialism, saying that the American dream was more likely to be realized in places like Venezuela and calling the United States a banana republic in comparison. Nobel laureate economist Joseph Stiglitz was quick to point out the “very impressive” growth rates and the eradication of poverty. But socialism in Venezuela, according to the state ownership of the economy measure from Varieties of Democracy, corresponded to the classic definition of socialism, leading to the very blackouts, empty grocery shelves, and suppression of political freedom socialists explicitly sought to avoid.

This vagueness extends to democratic socialism today. Proponents often speak in lofty terms, such as “workplace democracy,” without specifying policies. Such abstractions allow evasion of empirical evidence. By rendering the concept unfalsifiable, socialists can dismiss critiques as attacks on straw men, perpetuating debates that stall progress. If democratic socialists insist on reclaiming the term “socialism,” as distinct from the technical term used by economists, the burden falls on them to explicitly state their divergence and provide a concrete definition amenable to empirical testing. 

The Imperative of Precision for Empirical and Philosophical Inquiry 

A precise definition is not mere pedantry; it is the prelude to meaningful investigation. To enable cross-country comparisons, socialism must be defined through specific policies, not vague platitudes. What exact measures constitute this vision? Some socialists point to the Nordic countries as their model, but those countries differ from one another in important ways. And if a particular country is the model, then democratic socialists must consistently advocate all of that country’s policies, including those that might contradict their ideals, such as flexible labor markets or low corporate taxes. The Nordic countries, as measured by state ownership of the economy, are capitalist. Similarly, by the Fraser Institute’s Economic Freedom of the World index, they rank among the most economically free.   

Empirical literature in economics often examines the effects of specific policies in isolation, separate from the discussion of comparative economic systems, revealing trade-offs often ignored by democratic socialists. Minimum wage laws, for example, often supported by unions, can reduce employment opportunities, particularly for low-skilled workers and minorities. Prevailing wage requirements, pushed by unions, may inflate costs and exclude smaller firms, suppressing economic mobility and also having racially disparate economic impacts.  

Philosophical debates demand equal rigor. Consider unions, a cornerstone of many democratic socialist platforms. Do proponents insist on secret-ballot elections, which protect workers from intimidation during union votes, or do they favor open card-check procedures that expose each worker’s choice? Exempting unions, which function as labor cartels, from antitrust laws raises its own concerns: why allow monopolistic practices that could hike prices and limit competition, regressively harming consumers? If a national or subnational electorate democratically enacts right-to-work laws, preventing closed-shop unions, should this democratic choice override a workplace vote? Such questions expose potential anti-democratic undercurrents, where “worker democracy” might privilege special interests over broader societal choice. 

These inconsistencies highlight a deeper issue: democratic socialism often conflates social democracy – market economies with robust safety nets – with true socialism, diluting the latter’s radical edge while inheriting its definitional baggage. Without clarity, it risks repeating history’s errors, where good intentions devolve into coercion. 

Toward Clarity and Accountability 

Democratic socialism’s appeal lies in its promise of equity without tyranny, but its vagueness invites skepticism. Only by adhering to historical definitions and demanding specificity can we foster advancement in these debates. What policies do democratic socialists argue for exactly? How will they avoid the pitfalls of past experiments in socialism, which often started with the noblest of intentions? Until answered, democratic socialism remains an elusive mirage.  

The Trump administration is making good on its promise to shrink the bloated federal bureaucracy, starting with the Department of Education. Education Secretary Linda McMahon recently announced that her department has signed six interagency agreements with four other federal departments – Health and Human Services, Interior, Labor, and State – to shift major functions away from the Education Department.  

These agreements will redistribute responsibilities to agencies better equipped to handle them without the added layer of bureaucratic meddling: elementary and secondary education programs, including Title I funding for low-income schools, along with postsecondary education grants, go to the Department of Labor; Indian Education programs go to the Interior Department; foreign medical accreditation and child care support for student parents go to Health and Human Services; and international education and foreign language studies go to the State Department. 

Interagency agreements, or IAAs, aren’t some radical invention. They’re commonplace in government operations. The Department of Education already maintains hundreds of such pacts with other agencies to coordinate on everything from data sharing to program implementation. What makes this move significant isn’t the mechanism – it’s the intent. By offloading core duties, the administration is systematically reducing the department’s scope, making it smaller, less essential, and easier to eliminate altogether. This approach is the next logical step in a process aimed at convincing Congress to vote to abolish the agency entirely. 

Remember, the Department of Education was created by an act of Congress in 1979, so dismantling it requires congressional action. In the Senate, that means overcoming the filibuster, which demands a 60-vote supermajority. Without it, Republicans would need a handful of Democrats to cross the aisle – or they’d have to invoke the “nuclear option” to eliminate the filibuster for this legislation.  

Conservatives have wisely resisted that temptation. Ending the filibuster might feel expedient now, but it would set a dangerous precedent, allowing Democrats to ram through their big-government agendas – like expanded entitlements or gun control – with a simple majority the next time they hold power. It’s better to build consensus and preserve the procedural safeguards that protect limited government. 

The Trump team’s strategy is smart: It breaks down the bureaucracy piece by piece, demonstrating to the public and lawmakers that other agencies can handle education-related workloads more efficiently. Why prop up a standalone department riddled with waste when existing structures can absorb its functions? The administration’s approach goes beyond administrative housekeeping to serve as proof of concept that education policy belongs closer to home, not in the hands of distant D.C. officials. 

Of course, the only ones howling about sending education back to the states are the teachers unions and the politicians in their pockets. Groups like the National Education Association (NEA) and the American Federation of Teachers (AFT) thrive on centralized power. It’s easier for them to influence one federal agency where they’ve already sunk their claws than to battle for control across 50 states and thousands of local districts.  

We’ve seen this playbook in action. During the COVID-19 pandemic, unions lobbied the Centers for Disease Control and Prevention – another federal entity – to impose draconian guidelines that made school reopenings nearly impossible. They held children’s education hostage, demanding billions in taxpayer-funded ransom payments through stimulus packages. 

The unions’ power grab isn’t new. The Department of Education itself was born as a political payoff. Democrat President Jimmy Carter created it in 1979 to secure the NEA’s endorsement for his reelection bid. It’s no secret that teachers unions have long controlled Democrat politicians, but even some Republicans aren’t immune.  

Rep. Brian Fitzpatrick (R., Pa.) came out swinging against dismantling the department, claiming it was established “for good reason.” That “good reason” apparently includes his own ties to the unions. Fitzpatrick is the only Republican in Congress currently endorsed by the NEA. Back in 2018, the NEA even backed him over a Democrat challenger. Over the years, he’s raked in hundreds of thousands of dollars in campaign contributions from public-sector unions. Is it any wonder he’s against Trump’s plan?  

Meanwhile, more than 98% of the NEA’s political donations went to Democrats in the last election cycle, yet less than 10% of their total funding went towards representing teachers. Follow the money, and you’ll see why federal control suits them just fine. 

Sending education to the states would empower local communities, where parents and educators know best what’s needed. It would also mean more dollars reaching actual classrooms instead of lining the pockets of useless bureaucrats in Washington. Federal education spending gets skimmed at every level, with administrative overhead siphoning off funds that could buy books, hire teachers, or upgrade facilities. 

Critics claim abolishing the department would gut protections for vulnerable students, but that’s a red herring. Federal special-needs laws, like the Individuals with Disabilities Education Act, predated the department and can continue without it. Civil-rights enforcement in schools doesn’t require a dedicated agency; the Justice Department and other entities already handle similar oversight. Moreover, the word “education” appears nowhere in the US Constitution. The department’s very existence arguably violates the 10th Amendment, which reserves powers not delegated to the federal government to the states or the people. 

The evidence against federal involvement is damning. Since the department’s inception, Washington has poured about $3 trillion into K-12 education. Achievement gaps between rich and poor students haven’t closed, and in many cases, they’ve widened. Overall academic outcomes have stagnated or declined. Per-student spending, adjusted for inflation, has surged 108% since 1980, yet test scores remain flat. The US spends more per pupil than nearly any other developed nation, but our results are an international embarrassment. 

The Trump administration has already taken decisive action to chip away at this failed experiment. They’ve slashed millions in diversity, equity, and inclusion grants that promote division rather than learning. Thousands of department employees have been let go, streamlining operations and cutting costs. The unions are probably gearing up to sue over these latest interagency agreements. But they tried that before – challenging the administration’s personnel reductions – and lost at the Supreme Court. The chief executive has clear authority to manage the executive branch, and the unions would likely face another defeat if they push this latest move to litigation. 

It’s time to end the charade. The Department of Education focuses on control rather than helping kids. By dispersing its functions and proving the sky won’t fall, the Trump team is paving the way for real reform. America’s students deserve better than a federal fiefdom beholden to special interests. Let’s send education back where it belongs: to the states, the localities, and the families who know their children best. 

Thanksgiving draws people, regardless of race or creed, together around a table heavy with food and laughter. At its center sits a golden turkey, but it’s the sides (mashed potatoes, green beans, stuffing, and gravy) that spark the most excitement. American football murmurs from the television as plates and hands cross the table, passing dishes with the casual choreography of family life.

This is, in spirit, the very scene Frédéric Bastiat once imagined when he marveled at how Paris was fed each morning. “It staggers the imagination,” he wrote, “to comprehend the vast multiplicity of objects that must pass through its gates tomorrow… And yet all are sleeping peacefully at this moment.” No single mind coordinates the miracle and yet, it happens.

Thanksgiving is the modern version of Bastiat’s wonder. What we see is the feast itself: Mom and Nana pulling the turkey from the oven. The Department of Agriculture reports that roughly 46 million turkeys, about the population of Spain, are eaten every Thanksgiving. The extended family that arrives hours before the meal is ready is joined by 1.6 million people who travel on Thanksgiving. Dad and his child switch between American football and the Macy’s Thanksgiving Day Parade, joining the more than 100 million viewers who tune in each year, coordinated across satellites, networks, advertisers, and camera crews, so that the same spectacle can play out in millions of living rooms at once. 

What remains unseen are the invisible threads of cooperation that make the Thanksgiving table possible. Long before the turkey reached the oven, farmers in Iowa, Nebraska, and Arkansas were raising it, relying on feed grown by other farmers and transported by rail from thousands of miles away. The green beans and sweet potatoes come from networks of growers, processors, and distributors whose work depends on forecasts, algorithms, and trade routes most of us never think about. Truck drivers cross state lines to deliver ingredients to logistics managers who ensure that shelves stay stocked. Every piece comes together until someone realizes the cranberry sauce is missing. Last-minute panic sets in, and a quick dash to the grocery store follows.

Today such a trip isn’t seen with wild wonder. But in 1989, during a policy shift called perestroika, or restructuring, the USSR sent a delegation to thaw relations with the United States. Alongside a tour of NASA’s Johnson Space Center in Texas, the foreign delegation made an unscheduled stop at a Randalls Supermarket. Among them was future Russian President Boris Yeltsin, who, astonished by the variety of foods, claimed, “Even the Politburo doesn’t have this choice. Not even Mr. Gorbachev.” The visit left Yeltsin at a loss for words: “I think we have committed a crime against our people by making their standard of living so incomparably lower than that of the Americans.” 

From its earliest years, the Soviet state endured famine with grim regularity. The Volga famine of 1921–1922 claimed between five and seven million lives. A decade later, the Holodomor of 1932–1933 starved another five to eight million, and after World War II, the famine of 1946–1947 took roughly two million more. Each disaster was born not of nature but of policy: central planning, forced collectivization, and the state’s determination to control production.

By the 1990s, the pattern of scarcity persisted, mocking the propaganda that declared, “Life has become easier, comrades; life has become happier.” In April 1991, bread prices rose 300 percent, beef 400 percent, and milk 350 percent. Shortages grew so severe that Soviet leader Mikhail Gorbachev appealed to the international community for humanitarian aid, with officials admitting that the USSR had “flung itself around the world, looking for aid and loans.”

Shipments of frozen chicken, nicknamed “Bush legs” after President George H. W. Bush, were flown in to feed the population. The image carried an irony history could not have scripted better: just decades earlier, at the height of the Cold War in the 1950s, Premier Nikita Khrushchev had thundered before Western diplomats, “About the capitalist states, it doesn’t depend on you whether or not we exist. If you don’t like us, don’t accept our invitations, and don’t invite us to come see you. Whether you like it or not, history is on our side. We will bury you.” Yet by the end of the century, the USSR that vowed to bury the West was surviving on American poultry—in other words, on capitalist chicken. The spiraling crisis soon escalated into nationwide strikes and protests demanding the end of the system itself. By Christmas Day, December 25, 1991, the Soviet Union dissolved, undone by the same command economy that had once promised to abolish hunger.

Even the most ardent bureaucrats, armed with vast tracts of farmland and central plans, could not guide the Soviet Union into prosperity, let alone feed its people. Yet the urge to direct, ration, and manage markets never disappears; it only changes its accent. 

Today, in New York City, the beating heart of global finance, the temptation to fix the market endures. Mayor-elect Zohran Mamdani has proposed government-run grocery stores as “a public option for produce,” arguing that too many New Yorkers find groceries out of reach. His plan would cost roughly $60 million, financed through higher corporate taxes at 11.5 percent and a new 2 percent levy on those earning over a million dollars a year. The idea ignores the recent failure of a government-run grocery store in Kansas City, which left local taxpayers with a $750,000 bill. New York’s food culture, moreover, already rests on some 13,000 independent bodegas: small, adaptive enterprises that thrive precisely because they respond to local needs. A state-run grocery network would not only crowd them out, but also make the city more vulnerable to the very shortages it hopes to prevent.

Thanksgiving is a yearly proof of concept for liberty: a society of free individuals coordinating better than any plan could dictate. From Moscow to New York, the lesson remains the same. The miracle of prosperity does not flow from ministries or mayors, but from the voluntary cooperation of ordinary people who produce, trade, and trust one another. 

The Soviet Union collapsed because it tried to command what can only be discovered, the daily knowledge of millions working freely. New York, for all its wealth, risks forgetting that lesson each time it trades competition for control. The feast that fills our tables each November is more than a meal; it is civilization itself, renewed by freedom and gratitude. Each Thanksgiving feast reminds us that civilization’s greatest miracles are not decreed; they are cooked, carried, traded, and shared by free people every day.

It is important to celebrate victories for economic freedom as they emerge, even when they come in the most peculiar of places. One such place is the racing world. 

In October, North Carolina Governor Josh Stein signed into law HB 926, called the “Right to Race” law. This new measure shields racetracks from noise-related nuisance lawsuits if the facility existed and was permitted before nearby properties were developed. It is an incredible win for economic freedom against NIMBYs who demand that roaring engines fall silent after they chose to move next to a racetrack. North Carolina has definitively answered “no” to the question: should those who knowingly move next to a racetrack be allowed to use the government to quiet it?

As areas around the country redevelop, with rural areas becoming suburbs and suburbs evolving into de facto metropolises, racetracks have found themselves the target of those moving into these newer developments. Racetracks that long predate the existence of these neighbors are facing legal action by the newcomers. 

For example, in Tennessee, the Nashville Fairgrounds Speedway, which first opened in 1904, now faces opposition from residents as the track seeks to renovate in order to lure NASCAR racing back to the facility. Despite being there first, despite the track’s positive economic impact, many tracks find themselves without legal protection from the locals. 

In response, some states have taken action. Over the summer, Iowa Governor Kim Reynolds signed HF 645 into law, which protects racetracks, such as the famed Knoxville Raceway, from the constant threat of litigation from their neighbors, provided the track predated the neighboring property’s purchase or development.

Last month, North Carolina followed suit. Racing has been embedded in the state for almost a century, with tracks like Bowman Gray Stadium (1937), North Wilkesboro Speedway (1947), Hickory Motor Speedway (1951), and Charlotte Motor Speedway (1960) becoming renowned venues of racing around the world. 

There is a long-held legal, philosophical, and economic position: first use establishes right. This is how we arrive at property rights, through homesteading. These rights secure not only possession but also established and peaceful use. From property rights follows the common-law doctrine of “coming to the nuisance”: by choosing to move next to an existing activity, you consent to its effects. Your choice of proximity entails your acceptance of the conditions. 

In “Law, Property Rights, and Air Pollution,” economist Murray Rothbard makes this point with the example of an airport. Prior use generates a legitimate easement-like claim in sound or emission. In the case of the airport, Rothbard writes, “The airport has already homesteaded X decibels worth of noise. By its prior claim, the airport now owns the right to emit X decibels into the surrounding area.” 

This simple point illustrates the following: when an airport operates openly for years, it “homesteads” its noise. The sound waves become part of its legitimate use, and newcomers consent when moving next door. The same logic applies here to racetracks. Their races, the noise of the engines, do not violate anyone’s rights — they exercise, instead, preexisting ones. What seems like an abstract theory is expressed in clear statutes like those in North Carolina and Iowa.

These states are translating the principles of homesteading into positive law, at least in the defense of racing. In effect, they take what Rothbard described as a natural rights easement — earned through peaceful, longstanding use — and make it explicit law. What these laws help clarify is the difference between preferences and rights. When preferences override rights, this signals institutional instability — rights are negotiable. Nashville proves itself as a cautionary counterexample. Without statutory protection, the Fairgrounds Speedway is vulnerable to neighborhood pressures that could lead to the violation of the track owners’ rights.

With any luck, North Carolina and Iowa will prove to be not outliers but the start of a broad legal correction. When courts and city councils prefer what might be called “aesthetic interventionism,” where neighbors’ preferences rather than owners’ rights dictate outcomes, property rights become uncertain, threatening the very foundation of a free economy. When property owners and entrepreneurs can rely on institutional stability, they can invest with confidence in the future. Without that confidence, the resulting erosion of trust deters economic growth. 

These “Right to Race” laws push back against this drift, restoring predictability to this segment of the market. These racetrack cases are only a small, visible example, but the same logic applies to other industries in various ways, such as nightlife ordinances and noise complaints for musicians. The order of homesteading matters, and these laws help preserve the space for voluntary exchange. 

The sound of dozens of racecars flying around North Wilkesboro or Charlotte may not be music to everyone’s ears, but it represents something deeper than sport. It is the roar of property rights at work — the anchor of fairness, stability, and freedom. States like North Carolina and Iowa have protected not only racing, but the freedom that depends upon stable expectations. 

Nashville’s ongoing fight, on the other hand, shows what happens without such clarity. When rights are negotiable, every market action becomes provisional. Economic freedom demands the simple rule that those who came, acted, and homesteaded first have property rights. Sometimes, that means protecting racetracks. Thank you, North Carolina and Iowa.

In a recent analysis gone viral, financial blogger Michael W. Green traced how modern American families can earn anywhere from $40,000 to $100,000 and still fall further behind. The argument is devastatingly simple: the mathematical parameters defining “poverty” are built upon a benchmark drawn in 1963, multiplied by three, and only lightly adjusted for inflation. Everything else — childcare, healthcare, housing, transportation, and the structural design of the welfare state — has transformed beyond recognition. The result is a system in which the official poverty line tells us less about deprivation than it does about starvation. And once you trace the math, the inescapable metaphor emerges: America’s working households require escape velocity to break free from the gravitational well of modern costs of living.

In physics, escape velocity is the minimum speed needed to break free from a body’s gravitational pull. Below that threshold, every burst of energy merely bends the trajectory and drops the object back into orbit. The same dynamic now governs mobility in the United States.

Using conservative assumptions, a bare-bones “participation budget,” the minimal cost necessary for a household to work, raise children, and avoid freefall, is roughly $136,000 to $150,000. That figure doesn’t represent luxurious living; it’s the updated application of Mollie Orshansky’s original method, which assumed food was one-third of a household’s budget. Today, food is closer to 5 to 7 percent of household spending, and the real multipliers reside in the unavoidable costs of existing in a post-industrial service economy. The system still uses the original 1963 architecture, so the “poverty line” is measured as if housing, childcare, and healthcare still operated as they did during the Kennedy administration.
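The Orshansky arithmetic can be sketched directly. This is a minimal illustration: the one-third food share and the modern 5–7 percent share come from the discussion above, but the dollar food budgets are hypothetical stand-ins, not official figures.

```python
# Sketch of the Orshansky method: the poverty threshold is the food
# budget scaled up by the inverse of food's share of total spending.
# Dollar amounts below are illustrative assumptions only.

def orshansky_threshold(food_budget: float, food_share: float) -> float:
    """Threshold implied by a food budget and food's budget share."""
    return food_budget / food_share

# 1963 method: food assumed to be one-third of the budget, so the
# threshold is simply the food budget multiplied by three.
food_1963 = 1_000  # hypothetical annual food budget, 1963 dollars
print(orshansky_threshold(food_1963, 1 / 3))       # -> 3000.0

# Same method with a modern ~6 percent food share: the implied
# multiplier jumps from 3x to roughly 16-17x the food budget.
food_today = 9_000  # hypothetical annual food budget, current dollars
print(round(orshansky_threshold(food_today, 0.06)))  # -> 150000
```

The point of the sketch is that the method itself is unchanged; only the food share has moved, and that alone pushes the implied threshold into the $136,000–$150,000 range the text describes.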

Below this new-era threshold, income gains are eaten by benefit cliffs: the loss of Medicaid, SNAP, and childcare subsidies, and, at the same point, sudden full exposure to market prices in sectors that the United States has spent decades distorting through subsidies, mandates, and regulatory sclerosis. A family can leap from $45,000 to $65,000 and end up poorer, because the system confiscates more than 100 percent of that incremental income. From that perspective, it’s not irrational to stay put rather than aggressively seek higher earnings that will only bring more hardship and deprivation.
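A stylized model makes the cliff concrete. All cutoffs and benefit values below are hypothetical illustrations of the mechanism, not actual program rules; only the $45,000-to-$65,000 jump comes from the text.

```python
# A stylized benefit-cliff model: each benefit vanishes entirely at
# an income cutoff, so a raise can reduce total resources.
# Thresholds and dollar values are invented for illustration.

def benefits(income: float) -> float:
    """Hypothetical benefit bundle with hard cutoffs (no phase-out)."""
    total = 0.0
    if income < 50_000:
        total += 12_000   # e.g., Medicaid-like coverage value
    if income < 55_000:
        total += 6_000    # e.g., SNAP-like food assistance
    if income < 60_000:
        total += 8_000    # e.g., a childcare subsidy
    return total

def net_resources(income: float) -> float:
    """Earnings plus the value of benefits still retained."""
    return income + benefits(income)

print(net_resources(45_000))  # 45,000 earned + 26,000 in benefits
print(net_resources(65_000))  # 65,000 earned + 0 in benefits
# The $20,000 raise leaves this family $6,000 worse off: an
# effective marginal rate above 100 percent, before any income tax.
```

Smoothing the phase-outs, rather than raising benefit levels, is what removes the perverse incentive in this toy model.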

Using the 1963 poverty line today is like measuring the distance from Earth to the moon with a yardstick whose markings have been sandblasted away. It ensures two outcomes. First, because the benchmark is too low, benefits are means-tested too early. The ladder gets sawed off halfway up. The poor face marginal tax rates that would make a hedge fund blanch, and the working poor find that one extra dollar of income can trigger thousands of dollars in lost benefits. The mathematics are inherently punitive, punishing upward mobility and the productive instincts that animate it.

Second, persistent inflation, especially in non-discretionary categories, reshapes the spending basket faster than the poverty formula can adjust. This is not purely the result of supply-and-demand fundamentals. It is a direct consequence of decades of monetary expansion, financial repression, interest-rate suppression, and regulatory barriers that choke off the supply in housing, healthcare, education, and childcare. When the Federal Reserve aims to stabilize macroeconomic aggregates, it also inadvertently distorts the production of essential goods that determine whether a family can remain afloat. Price levels matter for survival even if economic science has come to prefer analyzing rates of change.

A similar mismatch between past prices and present reality — the real versus nominal divide — haunts the financial system. The $10,000 reporting requirement for bank transfers was created in the early 1970s, when $10,000 represented a down payment on a house. Today it represents two or three months’ rent in many cities — or a single dental emergency. Inflation has quietly turned an anti-money-laundering threshold into a mass-surveillance dragnet for normal people performing normal transactions. That same inflation, coupled with outdated benchmarks, now pushes American families into poverty by statistical invisibility and brutally repels attempts at upward mobility.
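The erosion of that $10,000 threshold is a one-line calculation. The roughly sevenfold cumulative price-level increase since the early 1970s used below is an approximate assumption for illustration, not an official statistic.

```python
# Back-of-envelope inflation adjustment for the never-indexed
# $10,000 reporting threshold. The ~7x cumulative price-level
# factor since the early 1970s is an assumed approximation.

NOMINAL_THRESHOLD = 10_000   # set in the early 1970s, never indexed
CPI_FACTOR = 7               # assumed cumulative price-level increase

# What $10,000 of early-1970s purchasing power looks like today:
real_equivalent_today = NOMINAL_THRESHOLD * CPI_FACTOR
print(real_equivalent_today)                    # -> 70000

# Equivalently, a fixed $10,000 today corresponds to only about
# $1,400 in early-1970s purchasing power.
print(round(NOMINAL_THRESHOLD / CPI_FACTOR))    # -> 1429
```

A threshold meant to flag house-sized transfers now flags rent payments, which is the surveillance-creep the paragraph describes.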

When escape velocity is $140–$150k, and the effective marginal tax rate is 80–120 percent, buying scratch-off tickets ceases to be obviously irrational. One needs a tremendous economic leap of roughly $100,000 a year to continue living without disruption. In a nonlinear system with cliffs and arbitrary phase changes, a low-probability high-payout gamble can be mathematically defensible. Tilting at heavy-tailed payoffs is not illogical; it is a response to a payoff structure policymakers engineered.
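The claim that a long-shot gamble can be mathematically defensible follows from the payoff structure, and a toy expected-value comparison shows why. Every parameter below (the benefit bundle, the confiscation rate, the odds) is a hypothetical illustration of the logic, not measured data.

```python
# Why a lottery-like bet can beat a raise under cliff payoffs:
# a stylized expected-value comparison. All numbers are invented
# to illustrate the nonlinear payoff structure in the text.

def net_of_cliffs(income: float) -> float:
    """Net resources under an assumed effective marginal rate above
    100 percent between $45k and a ~$140k escape threshold."""
    if income < 45_000:
        return income + 26_000          # hypothetical benefit bundle
    if income < 140_000:
        # each extra dollar earned costs $1.20 in lost benefits,
        # so net resources fall by $0.20 per dollar of raise
        return 71_000 - 0.2 * (income - 45_000)
    return income                        # past the cliffs

# Option A: a guaranteed $5,000 raise from $45,000.
gain_raise = net_of_cliffs(50_000) - net_of_cliffs(45_000)

# Option B: a 1-in-1,000 shot at jumping clear to $150,000.
p = 0.001
gain_gamble = p * (net_of_cliffs(150_000) - net_of_cliffs(45_000))

print(gain_raise)    # -> -1000.0 : the sure raise makes things worse
print(gain_gamble)   # -> 79.0   : positive expected gain, long odds
```

In a smooth payoff landscape the guaranteed raise always dominates; it is the engineered cliff, not irrationality, that flips the ranking.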

A likely response, politically, is to suggest simply lifting eligibility all the way up to the true cost-of-living threshold. But indexing benefits to the real cost of American life would balloon federal outlays by trillions. Extending Medicaid, SNAP, housing subsidies, and childcare credits to households making $140,000 would produce deficit dynamics that would make the 2020–2021 stimulus era look mild and restrained. The welfare state is already actuarially fragile; expanding it to cover half the US population would collapse it. On the other hand, three somewhat simple reforms could help restore a sane poverty escape velocity:

  • Use a modern participation-budget approach, not a 1963 grocery multiple. If there is to be a social safety net, it should rely on means testing that phases out smoothly instead of falling off cliffs. 
  • Deregulate housing, healthcare, childcare, and education: the sectors where supply is most strangled by regulation. Deregulation — particularly zoning, certificate of need laws, licensing, and insurance mandates — would create downward price pressure far more powerful than subsidies.
  • Replace monetary discretion with a rules-based regime. The Federal Reserve’s century-long experiment with cheap money has inflated asset prices, destroyed purchasing power, raised the cost of entry into middle-class life, and widened the gap between wages and participation requirements. Shifting to a transparent, market-tested anchor (whether Taylor-style, commodity-linked, or another rules-based framework) would stabilize prices and reduce the boom-bust cycles that erode household stability.

America’s primary poverty crisis is not moral failure, laziness, or poor financial literacy. It is math. A system built on 1963 assumptions cannot function in a 2025 reality. Until the parameters shift, which is to say until lawmakers acknowledge the true cost of participation, that escape velocity will remain impossibly out of reach for tens of millions. The tragedy is not that people are failing; it is that the system is calibrated for a world that has not existed in over three generations. There is no reform, no genuine improvement in the condition of the poor, no revival in the living standards of consumers — or of any American who works — without monetary reform beginning at the very top, with the Federal Reserve.

In most sectors of the American economy, we celebrate the moment when insiders break away to build something better. Engineers start their own firms. Chefs open their own restaurants. Innovators leave incumbents and test their mettle in the market. Only in US healthcare do we treat that entrepreneurial impulse as a threat worthy of prohibition. 

Section 6001 of the 2010 Affordable Care Act froze the growth of physician-owned hospitals (POHs) by barring new POHs from getting paid by Medicare and Medicaid, and by restricting the expansion of existing POHs. It did not ban POHs outright, but it had roughly the effect of a ban; after years of growth, the number of POHs in the US abruptly plateaued at around 230-250, and practically no new POHs have opened since 2010.   

Supporters of the ban on POHs say it is needed to prevent conflicts of interest, cream-skimming, and overuse.

One argument is that without such a ban, POHs would cherry-pick the healthier and more profitable patients, leaving other hospitals with sicker and more costly patients. There is some evidence that physician-owned specialty hospitals tend to attract healthier patients and tend to focus on lucrative service lines. But why does that justify a ban on POHs? Specialization is one way that entrepreneurs create value. Cardiac centers, orthopedic hospitals, and focused surgical facilities exist precisely because repetition and standardization can improve outcomes and reduce costs. Specialty hospitals can even exert a positive influence on surrounding general hospitals to improve quality and reduce costs for everyone. 

Another argument is that uncontrolled self-referral would lead to overutilization of services and rising healthcare spending. Overutilization is indeed a major contributor to wasteful spending, and waste overall has been estimated at approximately 25 percent of total healthcare spending, or between $760 billion and $935 billion a year nationwide. The reasoning is that if physicians can refer patients internally for services, procedures, and tests, they will cease to exercise careful cost control. This, however, is more an indictment of the current price and payment systems than of physician ownership. By setting prices via committee instead of relying on genuine market prices, policymakers have created in Medicare and Medicaid a gameable system that rewards volume. The response to poorly designed reimbursement mechanisms should be to fix the mechanisms, not to blame ownership models.

The POH issue illustrates how, in a mixed economy, controls beget controls. To keep the program politically popular, lawmakers have made Medicare’s pre-payment review and protections against waste generally less stringent than those found in the private insurance world. Given that context, preventing physicians from referring patients to the entities they own can seem like a sensible check against waste and abuse.

In a more market-driven system, however, the problem would evaporate without the need for a ban on POHs. Individuals (or their plan sponsors) would control more of their healthcare dollars; prices would be transparent and site-neutral; and hospitals and physician-led facilities would compete on bundled prices, warranties, and measured outcomes. The alleged perils of physician ownership would be addressed through competition and reputation. Insurers and self-funded employers would exercise discipline on overuse through selective contracting, reference-based pricing, and value-based payments, and patients would reward cost-effective specialists. 

In a free-market system, a physician’s ownership stake in a hospital is no more a threat to the taxpayer than a chef’s ownership stake in a restaurant is to an individual looking for a good place to dine. 

Often in US health policy, we are in the position of needing to make multiple fixes simultaneously in order to take a real step forward. Philosophically, the ban is indefensible. Physicians should be as free as any other professionals to become entrepreneurs and to form, finance, and run institutions. Entrepreneurship should not require special permission; in nearly every other industry, it is the very engine of specialization, quality improvement, and cost discipline, and entrepreneurial profit is the reward for foresight, innovation, and service. Yet prior policy decisions give the ban a veneer of justification.

If we let the POH ban stand, then incumbency triumphs over innovation, with large hospital systems holding a legislated shield against potential competitors. If we lift the ban but make no accompanying changes, some fleecing of the taxpayer could occur.

We ought to lift the ban on POHs while simultaneously making reforms that let individuals control more of their own healthcare dollars. This would incentivize physicians to compete on value, mitigating concerns about overutilization.

One way to do this is to pair repeal of the POH ban with payment neutrality and consumer control, ending the artificial price differences that federal policy assigns to different sites of care. MedPAC has long recommended site-neutral payment to strip away hospital markups for services that can be safely delivered in lower-cost settings. Efficient entrants would then thrive by being better at care, not better at “location arbitrage.”

Another way is to put more real dollars under patient control. Empowering individuals with flexible accounts — yes, even in the Medicare and Medicaid contexts — would guard against overutilization. Evidence shows that when consumers face prices and control the marginal dollar, spending becomes more disciplined. These populations could also serve as a proving ground for broader reforms that pair portable health savings accounts with catastrophic coverage.

Maintaining the ban on POHs is wrong. It denies clinicians the freedom to build their own institutions, and it denies patients the freedom to choose them. However, simply repealing the ban without making any other changes could open the door to overutilization at the expense of taxpayers, which is why we should pair the lifting of the ban with other changes. We should protect voluntary exchange among free individuals, while taking steps to align incentives so that patients, not political pull, direct the flow of dollars.