New research strongly suggests teachers’ unions are driving the skyrocketing administrative bloat that’s sucking resources away from classrooms. By diverting additional funding toward hiring more people, they starve effective educators of the raises and support they need, all to pad their own power structures. Unions benefit enormously from inflating the number of employees in the system, turning public schools into top-heavy bureaucracies that serve adults — not our kids.

Teachers’ union bosses gain in two major ways from the rapid expansion in administrative hiring — which also siphons resources away from teachers, students, and classrooms. 

First, a larger workforce means a bigger voting bloc in local and state politics. Public school employees can be mobilized to push for ideological agendas, from changing curricula to boxing out alternatives by blocking school choice. 

Second, more employees mean more revenue. Membership dues from 40 percent more administrators aren’t chump change — they add up to hundreds of millions of dollars annually, giving union leaders immense financial clout.

Has that money brought down student-teacher ratios or rewarded excellent teachers with raises? The latest LM-2 report from the National Education Association (NEA), the largest labor union in the country, paints a damning picture. Out of more than $400 million in annual revenues, less than 10 percent goes toward representing teachers — the very people the union claims to champion. Instead, those funds are funneled into left-wing causes and Democratic campaign coffers.

According to the latest data from OpenSecrets, more than 98 percent of the NEA’s campaign contributions went to Democrats in the last election cycle. At this point, the teachers’ union has become nothing more than a money laundering operation for the Democratic Party, using educators’ hard-earned dues to bankroll partisan agendas.

This self-serving system explains why teacher salaries have remained flat since 1970, even after adjusting for inflation. Meanwhile, spending per student has ballooned by about 170 percent in real terms over the same period. If the money were truly going to improve education, we’d see it reflected in better pay for teachers or enhanced classroom resources. But it’s not.

The unions actively push to funnel that funding toward hiring more people — administrators, support staff, and other non-teaching roles. Why? Because if the money went toward increasing salaries for good teachers instead, school systems would have less to spend on expanding headcount. That means fewer dues-paying members and lower total revenues for union bosses to control and redirect toward their political allies.

According to the latest data from the Edunomics Lab at Georgetown University, student enrollment in US public schools has dropped by about 750,000 since 2014. In the same period, public school employment has increased by more than 600,000 people. Fewer kids, more adults. The government school system has morphed into a jobs program for grown-ups, with education for children as an afterthought. The situation isn’t sustainable, and it’s certainly not in the best interest of families or taxpayers.

My new peer-reviewed study, coauthored with Christos Makridis and published in Politics and Policy, confirms the unions’ role in this mess. We found that K-12 administrative bloat is more pronounced in places with stronger teachers’ union influence, all else equal. When unions hold sway, districts prioritize empire-building over efficiency, leading to layers of unnecessary bureaucracy that drain budgets without benefiting students.

Look at union-controlled Los Angeles public schools as a prime example. The Edunomics Lab data show that staffing there has increased by about 19 percent since 2014, even as the district has lost about 26 percent of its student population over the same period.

In what other industry do you go on a hiring spree just as you’re losing more than a quarter of your customers? Such decision-making would bankrupt a private business overnight. But public schools aren’t subject to market forces — they’re near-monopolies, insulated from competition by law. Without school choice, these unwise spending decisions hurt everyone: taxpayers foot the bill for the bloat, parents see their kids stuck in underperforming systems, and children suffer from misallocated resources. There’s no recourse because families can’t easily take their education dollars elsewhere.

The pattern holds in other union strongholds. In California, where unions dominate, staffing has increased by 11 percent while student enrollment has dropped about eight percent. Compare that trend to a state like North Carolina, which has banned collective bargaining for public employees. There, staffing has only risen by about four percent since 2015, with student enrollment remaining essentially unchanged. The difference is stark: without union pressure to inflate headcounts, districts focus on stability rather than expansion at all costs.

Teachers themselves are the forgotten victims in this scheme. Many of them feel left out by the one-size-fits-all system that prioritizes politics and union bosses over educators and kids. Teachers who are tired of seeing their dues siphoned off can join alternative representation like the Teacher Freedom Alliance — for free. By exiting, they would send a powerful message, incentivizing unions to refocus on fighting for higher salaries rather than just adding more boots on the ground to bolster their power.

Ultimately, the solution lies in introducing real competition through school choice. When families can vote with their feet — and take their education funding with them — districts will be forced to spend money wisely. Teachers will be financially free to find models they can believe in and that serve students well. Resources will flow into classrooms and toward training and retaining effective teachers, not toward unnecessary administrators and bloated staffs. It’s time to break the unions’ stranglehold and put kids first.

The Supreme Court has been systematically dismantling the modern administrative state. In several decisions, the justices have pushed back against the idea that executive-branch agencies can be insulated from presidential oversight. The constitutional principle is straightforward: Executive power must be accountable to the president.

Yet the court has hesitated to apply this logic to the Federal Reserve, easily the most important independent agency. That exception is increasingly hard to defend.

Recent Supreme Court cases such as Seila Law v. CFPB and Collins v. Yellen reject the notion that Congress may create powerful agencies whose leaders are shielded from removal by the president. The Court has been clear that technocratic expertise, political convenience, and even good policy outcomes do not override the Constitution’s separation of powers. If an agency exercises executive authority, it must ultimately answer to the elected chief executive.

Monetary policy would seem to fit squarely within that framework. The Fed regulates banks, influences the availability and price of credit, and controls the nation’s ultimate settlement asset. These decisions materially shape markets for labor, housing, and securities, which include the market for Treasury debt. If this does not count as executive power, what does?

And yet the Court appears willing to carve out an exception for the central bank. Defenders of Fed independence point to history, especially the First and Second Banks of the United States, and to the dangers of presidential meddling with monetary policy. They warn that subjecting Fed decisions to democratic accountability would invite political interference, with the likely result of excessive dollar depreciation.

But these are not constitutional arguments. They are prudential ones. They do not change the basic matter of what the Constitution says about executive authority. If the Constitution rules out conventional central banking, it is central banking that needs to change, not the Constitution.

History alone cannot justify departures from constitutional structure. Contrary to the Supreme Court’s claims, the First and Second Banks of the United States bore little resemblance to today’s Federal Reserve, as even Fed Chair Powell recognized. They lacked modern macroeconomic stabilization powers, operated under government charters, and existed for limited terms. Invoking them as precedent for an unaccountable central bank with sweeping discretionary authority is an historical solecism.

Nor can expertise supply a constitutional warrant. The Supreme Court has repeatedly rejected the idea that technical competence licenses insulation from political control. If Ph.D. economists qualify for special treatment, why not epidemiologists, climate scientists, or national security analysts? The argument has no limiting principle.

The real issue, though perhaps uncomfortable, is simple: Either the president runs the executive branch, or he does not.

If we believe that presidential interference with monetary policy is so dangerous that it must be prevented at all costs, the Constitution offers two solutions. One is to place the Federal Reserve firmly under executive control and accept political accountability for monetary outcomes, just as we do for every other entity that enforces laws passed by Congress. The other is to eliminate discretion altogether by binding monetary policy to strict, automatic rules—ones that leave no room for judgment calls by policymakers, however credentialed.

What the Constitution does not permit is what we have now: discretionary macroeconomic governance by financial insiders who answer to no elected official.

The Fed’s defenders often invoke “constrained discretion” as a middle ground. But discretion is still discretion. Choosing inflation targets, interpreting economic data, timing interventions, and deciding when to bend or suspend rules all involve judgment. Those judgments often have significant distributional consequences, benefiting some groups at the expense of others. Exercising such power without political accountability is precisely what the Court has rejected in other contexts.

To be sure, markets have grown accustomed to an independent Fed. But market expectations do not confer constitutional legitimacy. Investors once took Chevron deference and expansive agency authority for granted, too. Stability is desirable, but it cannot come at the expense of constitutional government.

The uncomfortable truth is that the Federal Reserve survives not because it fits neatly within our constitutional order, but because the alternative frightens us. Presidents might pressure the Fed to run the printing presses before elections, just as President Nixon had Fed Chair Arthur F. Burns do. Yes, inflation might follow. These are real concerns—but they are not legal ones.

If discretionary monetary policy is incompatible with democratic accountability, the answer is to reform monetary institutions so that discretion is radically constrained, not exempt those institutions from constitutional scrutiny. Alternatively, we should rethink whether a centralized monetary authority is compatible with the letter and spirit of constitutional law in the first place.

The Supreme Court has rightly insisted that the separation of powers means what it says. If that principle stops at the doors of the Federal Reserve, it is not a principle. It is an exception born of fear. And fear is a poor foundation for constitutional self-government.

In 1980, at Dartmouth College, psychologists Robert E. Kleck and Angelo G. Strenta set out to study how people perceive subtle social cues. In their own mischievous words, “Individuals were led to believe that they were perceived as physically deviant in the eyes of an interactant.”

Over a series of studies, dozens of volunteers (mostly women) pretended to have a physical ailment or disfigurement — often a facial scar — during an interview. The rub was this: the person pretending to have the disfigurement was the true test subject.

To make volunteers believe they bore a scar, each subject sat down with a professional makeup artist, who carefully applied a facial “injury.” The participants were shown their fake scar in a hand mirror and given a moment to absorb their appearance. But before meeting the stranger, the makeup artist returned under the pretense of “touching up” the scar — and quietly removed it. Participants were unaware of the subterfuge and entered their interview convinced their face still bore the disfigurement.

Even though no scar was present, nearly all subjects reported that the strangers they met seemed unsettled by their appearance — avoiding eye contact, speaking awkwardly, or looking at them with pity.

A Distorted ‘Social Reality’

What Kleck and Strenta had uncovered was how easily human expectations about how we’re perceived can color — and distort — our reading of other people’s behavior.

Though the study was groundbreaking, previous studies had treaded similar ground. This included Beatrice A. Wright’s “Physical Disability: A Psychological Approach,” which had found that once a person acquires a physical disability, their perception of social reality becomes filtered through that disability.

While Kleck and Strenta noted that their research differed from Wright’s in key ways, they also saw a clear connection, noting “that persons who are permanently physically deviant make the same kinds of disability-linked attributes to a natural stream of behavior as did the subjects in the present studies.”

The findings of Kleck and Strenta are highly relevant nearly half a century later. In modern America, it’s not uncommon for people to see their group identity as a kind of social handicap. The idea that certain identity groups face discrimination has taken root in the American mind. For example, a May 2025 Pew Research survey asked Americans how much discrimination they think various groups face. Their responses are instructive:

  • Illegal immigrants: 82% say some (57% say “a lot”)
  • Transgender people: 77%
  • Muslims: 74%
  • Jews: 72%
  • Black people: 74%
  • Hispanic people: 72%
  • Asian people: 66% 
  • Legal immigrants: 65% 
  • Gay/Lesbian individuals: 70%
  • Women: 64%
  • Older people: 59%
  • Religious people: 57%

Discrimination certainly exists, at both the individual and collective level. No doubt some individuals in all these groups have faced discrimination, just as some individuals have in groups less commonly associated with it, such as rural Americans (41%), young people (40%), white people (38%), and atheists (33%).

Kleck and Strenta’s experiment shows that people often believe they’re being discriminated against because they expect to be, not because they are. Edgier comedies of the 1990s explored this idea.

In one Seinfeld episode, Jerry is on a date and stops to ask a mailman (whose face is obscured) if he knows where a certain Chinese restaurant is. “Excuse me, you must know where the Chinese restaurant is around here,” the comedian says.

Things get awkward, however, when it turns out the mailman is Asian. “Why must I know? Because I’m Chinese?” the mailman says in a heavy accent. “You think I know where all the Chinese restaurants are?”

The scene is comical and a bit absurd, but it reflects a phenomenon Kleck and Strenta observed in their experiments: Once humans begin to feel they are being treated differently based on their physical appearance or inherent attributes, they will believe people are treating them differently even when that is not the case.

“… if we expect others to react negatively to some aspect of our physical appearance,” the authors wrote, “there is probably little those others can do to prevent us from confirming our expectation.”

The Cost of Victimhood Culture

I learned about the Kleck-Strenta experiment as an undergraduate in an introductory psychology class nearly 30 years ago. But I had rarely thought about it since — until Konstantin Kisin, a British-Russian political commentator, brought it up in recent podcast appearances and connected the study to the rise of victimhood culture.

“If you preach to people constantly that we’re all oppressed, that we’re all being discriminated against, then that primes people to look for that,” says Kisin, “even where it doesn’t exist.”

As a concept, victimhood culture wasn’t articulated until the 1990s. But by 2015, the phenomenon was widely discussed in academic literature and mainstream publications, likely due to the rise of political correctness that preceded it. 

“Victimhood culture,” Arthur Brooks wrote in the New York Times a decade ago, “has now been identified as a widening phenomenon by mainstream sociologists.”

Though relatively new in scholarship, the psychological phenomenon itself has been around for ages. In his masterpiece The Brothers Karamazov, Fyodor Dostoevsky explores the human proclivity to manufacture grievance by taking offense. 

“A man who lies to himself is the first to take offense. It sometimes feels very good to take offense, doesn’t it? And surely he knows that no one has offended him, and that he himself has invented the offense and told lies just for the beauty of it, that he has exaggerated for the sake of effect, that he has picked on a word and made a mountain out of a pea — he knows all that, and still he is the first to take offense, he likes feeling offended, it gives him great pleasure, and thus he reaches the point of real hostility.”

These words are uttered by Father Zosima, a wise monk tasked with settling a dispute among the Karamazov family. Fyodor Karamazov, the lecherous patriarch, practically purrs in agreement.

“Precisely, precisely—it feels good to be offended,” he replies. “You put it so well, I’ve never heard it before.”

Dostoevsky was observing a psychological tendency: our ability to see ourselves as victims. (Anyone who has watched The Sopranos has seen this idea explored in brilliant artistic fashion.) 

Combine that impulse with postmodern philosophies that have turned oppression into a kind of fascination, and you get a potent—and corrosive—worldview.

None of this is to deny that real oppression exists or that people are sometimes treated differently because of their appearance. But the research of Kleck and Strenta suggests that, oftentimes, the perceived discrimination exists only in the minds of those who believe they’ve been wronged.

“Regime uncertainty” should be our byword for understanding the economy of 2025. Trump’s push for “state capitalism,” ranging from tariffs to taking federal stakes in companies to industrial policy to jawboning companies to fire executives to targeted regulatory carveouts, has created a chaotic, pay-to-play environment in which firms find they can get favorable treatment by contributing to Trump’s political success, but the basic rules of the economic game are unpredictable and open to constant negotiation. That unpredictability has in turn deterred private investment and brought on stagflation.

Economist Robert Higgs developed the concept of regime uncertainty to explain why American recovery from the Great Depression was so slow. Investors feared for the security of their contracts and their private property rights as FDR turned explicitly anti-business during the 1936 presidential campaign. As a result, private investment stagnated and the economy tipped back into recession, prolonging the Great Depression. Investor confidence didn’t return until after the war; when it did, it helped launch the postwar economic boom.

Regime uncertainty was on my mind when I watched the fateful “Liberation Day” press conference in the Rose Garden on April 2, when President Trump held up the schedule of so-called “reciprocal tariffs” that would apply to imports from other countries. The tariff rates themselves were extremely high, but more than that, they were absurd and irrational. They had no logical basis in law or economics. I immediately moved my retirement savings out of US equities into global equities and bought physical gold with all my family’s free cash. After most of the tariffs were rolled back, I gradually started shifting back to a 60-40 US-global equities mix, which is still overweighted to international stocks compared to most Americans’ portfolios.

I didn’t shift to gold or global stocks because I thought the tariffs themselves would be so destructive, but because I lost all confidence in this administration’s economic policy team. I figured we were in for a wild ride, and unfortunately, we have been.

Regime uncertainty explains not just why global stocks have outperformed US stocks (Figure 1), but why gold has performed so well, despite reasonably moderate inflation (Figure 2). Gold is hedging not just ongoing inflation, but policy uncertainty more broadly.

Figure 1: S&P 500 Index vs. S&P Global Index, Jan. 2025 to Dec. 2025 (source: WSJ)

Figure 2: Gold Spot Price, Jan. 2025 to Dec. 2025 (source: TradingView)

Regime uncertainty explains another curious fact about the current US economy: the return of mild stagflation. With private investment lagging, unemployment has risen along with inflation. Macroeconomists would interpret this combination as an “adverse supply shock,” that is, a loss of productivity growth.

As Figure 3 shows, US job growth has slowed markedly since April. Monthly job growth since then has averaged a measly 35,000 workers, compared to over 100,000 every single month going back to October 2024.

Figure 3: US Nonfarm Jobs, Monthly Change, Jan. 2022 to Nov. 2025

The unemployment rate has also now risen a full percentage point since its 2023 low. Note that there is a gap in the series because of the government shutdown: the October figure is missing. The November figure is 4.6%, which means that the unemployment rate has risen half a percentage point just since June.

Figure 4: US Unemployment Rate, Jan. 2022 to Nov. 2025

Figure 5 shows monthly personal consumption expenditures (PCE) inflation rates up through November (excluding October). While this series is highly noisy, an upward trend since March is plainly apparent. Moreover, the disinflationary trend that we saw from the beginning of the series up until March has definitively come to an end. For the months through September, we had six straight months of PCE inflation near or above the Fed’s target. The last time that happened was the six months ending in June 2023.

Figure 5: US PCE Inflation, Monthly, Jan. 2022 to Sept. 2025

So far the data suggest the US economy is slowing, investors are hedging against something, and investors prefer to park their money abroad rather than in the US. But the smoking gun that suggests regime uncertainty is at fault is private investment. Real gross private investment is the series that Higgs himself used to establish a “capital strike” during the late New Deal period.

What do we see when we look at US real gross private domestic investment in 2025? In Q2 2025, the quarter that includes Liberation Day, we see the largest single-quarter decline in US private investment since Q2 2020, the quarter most affected by the global pandemic (Figure 6). To see another decline as large, we have to go all the way back to 2009, during the throes of the Great Recession. Indeed, to find another quarterly decline as large outside the immediate period during or around an official recession, we have to go all the way back to Q1 1988. The just-released figures for Q3 show another decline in real private investment, of 0.3%.

Figure 6: US Real Gross Private Domestic Investment, Quarterly Growth, Q1 2020 to Q3 2025

All of this has been happening at the same time as an AI-fueled boom in capital investment for electricity generation and data centers has taken hold. Had this technologically driven boom not been ongoing, the effects of regime uncertainty on the US economy presumably would have been much worse.

Indeed, if we look at the components of private investment, “information processing equipment,” “software,” and “research and development” have contributed significantly to growth this year. But their growth has been more than offset in the last two quarters by declines in nonresidential structures and residential construction.

Why has regime uncertainty become such a problem? After all, the Trump Administration has made a verbal commitment to deregulation, and the tax cuts passed in the One Big Beautiful Bill should have incentivized business investment. But all of the deregulation enacted by the Trump Administration has come through executive orders and the rulemaking process, not legislation. A future administration could easily undo it. As a result, businesses lack the confidence that large capital investments will pay off in the long run.

The evidence suggests that to turn the American economy around, the Trump Administration needs to work through Congress to pass statutory deregulation, end its experiments with industrial policy, government ownership, and tariffs, and shift from a “deal-making” posture of transactional politics to a firm, credible commitment to enforcing a level playing field for private business. Without a believable shift in strategy, this administration risks incurring an economic malaise that could last for another three years.

The darkest days of the year have always asked something of us.

Long before we strung electric lights and gathered around brick fireplaces, people across cultures marked the winter solstice — the darkest day of the year. We do so still, not with despair, but as a moment of deliberate action, of invitation. We bring greenery indoors. We light candles in the windows and fires in the hearth. We gather, sing, feast, celebrate. We tell stories about the sun’s return, even though the bulk of winter lies heavy ahead.

Why We Bring Life and Light Indoors

The solstice is neither the end of winter nor its harshest nadir. It is the turning point, the time when decline stops and reversal begins. The days will begin, imperceptibly, to lengthen again. There seems to be some universal human impulse to mark the deepest darkness with light of our own making, and to gather in the good things that sustain us through lean times.

By the second century BCE, Roman households decorated their doors with holly, sacred to the god Saturn. Its red berries and vibrant green foliage set the season’s signature color scheme. Around the same time, Druids were decorating holly trees (though cutting one down would bring bad luck) and bringing in red-and-white speckled fly agaric mushrooms to dry by the fire. Further north, the Norse were hanging mistletoe, with its tiny white berries, under which couples would stop to kiss. Celts and Germanic tribes cut boughs from the evergreen plants around them — ivy, fir, laurel — to symbolize enduring life. 

As Christianity crept across the continent, these cultural practices were absorbed by new narratives. The decorated tree was eventually brought indoors and adorned with ornaments, migrating from Germany with Prince Albert and Queen Victoria. It was subsumed into the Christmas tradition and subsequently spread across the Anglosphere, including the southern hemisphere.

But indigenous midwinter celebrations of fire, music, and revelry were already observed there, in June. The Mapuche of southern Chile and adjacent Argentina celebrate We Tripantu around the June solstice as a new year and renewal of the natural cycle. Each of these traditions recognizes that midwinter is not a time of death, but of quiet hibernation, preparation, and anticipation.

Why We Light the Dark

In Scandinavia, a giant oak log, the Yule log, was burned to symbolize strength and endurance, its light defying the darkness and promising regrowth and rebirth. A fragment was saved each year to start next year’s fire.

And that, perhaps, is the enduring lesson of these solstice ceremonies of renewal.

Evergreens remain visibly living when deciduous plants appear dead, so they become symbols of continuity, rebirth, and eternal life. But the deciduous plants are not really dead either. They are storing away energy, waiting for the return of light and the opportunity to grow.

The embers of the Yule log, saved for next year; the hard-won sugars stored up in evergreen boughs. Human beings have always marked the darkest economic and seasonal moments not by denial, but by deliberate acts of renewal — bringing light and living things indoors as a vote of confidence in the future.

Discipline in Dormancy

Human capital works the same way. Periods of stagnation and loss can be times of preparation — if we take responsibility for them. Skills can be sharpened. Habits can be examined. Character can be rebuilt. But none of this happens automatically. Winter only becomes preparation if we choose to treat it as such.

The first signs of recovery are often invisible: better decisions, renewed discipline, a willingness to accept responsibility for one’s future. These do not immediately produce abundance, but they change the trajectory. Compounding works quietly at first.

In economic life, downturns, stagnation, and personal failure function as winter solstices. In the moments when progress is slowest, we might not notice that reversal has already begun. Prosperity does not return automatically — it returns because people act as if it will. We conserve capital, tend embers, make plans, and orient ourselves toward the future. We honor the natural cycles by preparing ourselves to grow again.

Prosperity depends not only on policies or institutions, but on the daily choices of individuals who conserve, invest, and prepare. It depends on people willing to tend embers rather than curse the cold.

In a time when economic anxiety is widespread and faith in the future often feels thin, the solstice offers a bracing reminder. Darkness does not mean directionlessness. Dormancy does not mean decay. And renewal does not require grand gestures — only the discipline to preserve what still lives and the courage to believe that patient effort matters.

The people who celebrated the “longest night” were not naïve. They knew months of cold still lay ahead. They knew crops would not sprout for a long time. Yet they marked the moment anyway, because direction mattered more than speed. After the solstice, as in recession and personal loss, progress begins before comfort returns.

President Trump has accused virtually every country, including those inhabited only by penguins, of ripping us off when it comes to trade. But there’s one region that the President has neglected to protect us from: the North Pole. By every metric that the Trump administration has used, Good Saint Nick should really be considered an economic terrorist. Consider the following:

North Pole Trade Deficit

Santa operates out of the North Pole, which is (as of yet) not part of the United States; everything that he brings into the country is considered an import. Meanwhile, we export nothing to the North Pole, giving us an entirely one-sided trade deficit. But just how much does Santa actually import into the United States each Christmas?

With just under two-thirds of Americans identifying as “Christian,” about 200 million people are eligible for gifts from Santa, of whom about half would be considered children. A recent survey finds that parents anticipate spending about $521 per child on Christmas presents, which works out to roughly $52.1 billion of Christmas spending across those 100 million children. Obviously, parents and other family members will contribute the bulk of these presents to the kids. If we assume that only a quarter of the gifts that children receive on Christmas morning are “From: Santa,” that means Santa must be importing about $13 billion worth of Christmas presents on Christmas Eve.

Using the same methodology that the President’s Council of Economic Advisers has used, we can calculate the devastation that this pile of presents would bring upon our nation. At an average wage of $36 per hour — about $72,000 for a full-time year — Santa’s imports are the equivalent of 180,555 manufacturing jobs destroyed by his decision to spread his “good cheer.”

Of course, if Santa were to contribute more than a mere quarter of the Christmas presents per year, this number would only rise.
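For readers who want to check the math, here is a minimal sketch (in Python) that reproduces the back-of-envelope estimate above. Every input is one of the assumptions already stated in this piece — the 200 million eligible recipients, the $521-per-child survey figure, the one-quarter “From: Santa” share, and a $36-per-hour manufacturing wage over a roughly 2,000-hour work year — not an official statistic.

```python
# Back-of-envelope reproduction of the estimates above. All inputs are the
# article's stated assumptions, not official trade or labor statistics.

ELIGIBLE_RECIPIENTS = 200_000_000   # "about 200 million" Americans eligible for gifts
CHILD_SHARE = 0.5                   # roughly half of them are children
SPEND_PER_CHILD = 521               # survey estimate of spending per child, in dollars
SANTA_SHARE = 0.25                  # share of gifts assumed to be "From: Santa"
MFG_WAGE_PER_HOUR = 36              # average manufacturing wage, in dollars
HOURS_PER_YEAR = 2_000              # a full-time work year

children = ELIGIBLE_RECIPIENTS * CHILD_SHARE        # ~100 million children
total_spend = children * SPEND_PER_CHILD            # ~$52.1 billion
santa_imports = total_spend * SANTA_SHARE           # ~$13 billion

# "Jobs destroyed": value of the imports divided by a full-time factory salary.
# (The article rounds imports to an even $13 billion, which yields its 180,555 figure.)
jobs_equivalent = santa_imports / (MFG_WAGE_PER_HOUR * HOURS_PER_YEAR)

print(f"Santa's annual imports: ${santa_imports / 1e9:.2f} billion")
print(f"Manufacturing-job equivalent: {jobs_equivalent:,.0f} jobs")
```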

Unfair Trade Practices

Worse still is Santa’s practice of dumping gifts on the American economy. “Dumping,” according to US law, occurs when a foreign producer sells goods in America below the cost of production. Previous administrations have addressed this through antidumping duties, sometimes exceeding 200 percent of the product’s value.

But Santa does not merely sell below cost. He gives his goods away for free. This is dumping at a price of zero, which is completely indefensible under US law. Even China, often referred to as the worst trade offender in the world, has the decency to charge us something for their harmful production. 

Using standard methodology, calculating the appropriate response is simple: take the value of the good, divide it by the price the importer is charging, and multiply by 100 to arrive at the percentage penalty to apply. Since Santa charges us nothing, the appropriate response is an infinite tariff rate applied to any and all goods imported from the North Pole.
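A literal rendering of that rule — the simplified version described here, not the statutory antidumping-margin methodology — makes the point plain: as the import price falls toward zero, the implied duty grows without bound. A minimal Python sketch:

```python
# The simplified dumping-penalty rule described above:
# duty (%) = value of the good / price the importer charges x 100.
# Real antidumping margins compare normal value to export price; this sketch
# only illustrates why a price of zero implies an infinite rate.

def dumping_penalty_pct(fair_value: float, import_price: float) -> float:
    """Percentage duty implied by the simplified rule; infinite at a zero price."""
    if import_price == 0:
        return float("inf")
    return fair_value / import_price * 100

print(dumping_penalty_pct(100, 50))   # sold at half its value -> 200.0 percent
print(dumping_penalty_pct(100, 0))    # given away free, Santa-style -> inf
```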

Unfair Production Processes

We must also consider the means by which the North Pole produces its wares: intellectual property theft. Santa does not produce his own merchandise, but rather creates facsimiles of products readily available on shelves of stores around the country. This is intellectual property theft at its finest, which by some estimates costs the US up to $600 billion annually. Is it possible that all of those “elves on the shelves” are really spies, seeking to steal trade secrets through corporate espionage? After all, they apparently return to the North Pole every evening to “report to Santa.” Just what is in those reports? Has anyone seen them?

And how does Santa build these gifts? Through the use of what can only be considered child and slave labor, no less. Watching 1994’s The Santa Clause with the eyes of a trade representative reveals just how abhorrent Santa’s labor practices are as Santa has been using child elf labor since the beginning of his operation. Worse is 2003’s Elf, which reveals that when an elf does manage to escape, none other than Santa himself will descend from his throne and seek to collect the escapee. And should an elf wish to be anything other than a toymaker, he is berated and faced with serious pressure to conform, as the 1964 Rudolph the Red-Nosed Reindeer shows with the plight of Hermey and his desire to become a dentist.

“And what are these elves paid with?” you ask. That’s easy: candy canes, hot cocoa, and “Christmas spirit.” This is currency manipulation at its finest. Meanwhile, these workers live in a region with no significant thermal activity and average temperatures of about -40°F, six months of darkness, and zero compliance with standard OSHA practices. Still, Santa reports that his elves are “merry.”

National Security Threat

Finally, we must consider the national security threat that Santa presents. President Trump has secured the border against illegal crossings. So why has nothing been done against Santa? Has he been vetted by national security advisors? Does he have visa paperwork or asylum status? Does he enter through a designated port of entry, submit to customs inspections, or declare the goods he is importing? It appears that the answers to all of these questions are a resounding “no,” which means that if any border is in need of a wall, it’s our northern border. Good luck building one tall enough to stop reindeer flying at over 650 miles per second.

Either Condemn Santa—or Thank Free Traders 

To his credit, the President has tried to convince parents that they should give their children fewer Christmas presents. At his affordability rally in Pennsylvania just a few weeks ago, he pointed out very clearly that “you don’t need 37 [dolls] for your daughter. Two or three is nice.” This isn’t the first time the President has derided excess consumption in the name of national security: he said as much back in May as well.

The reality is that the American people understand full well that Santa is no “economic terrorist.” While we may bemoan having to clean up the mess of wrapping paper, find ourselves unprepared for the sheer number of toys that need (but do not include) batteries, and perhaps find frustration that after a late night, we have to get up absurdly early, we still see Christmas Day not as a sign of us being taken advantage of, but as a day of celebration and good cheer.

But if we really stop and think about it, foreign producers have a degree of “Santa” in them. While they do not sell us their wares at zero price, they still charge lower prices than our domestic counterparts can match. This means more access to goods and services that allow us to live healthily and wealthily, however we choose to define those terms. Unlike Santa, foreign producers sell their “gifts” to everyone regardless of age or religious affiliation, and they do so year-round.

So what we should really be after here is consistency: either condemn Santa as the job-destroying, IP-stealing, border-flouting menace he is — or thank foreign producers for enriching our lives with their gifts of specialization. You cannot have it both ways.

For decades, pundits have declared that Americans shouldn’t have to save for retirement in the casino of the stock market. They argued that individuals saving for themselves was too risky and that only a strong collective safety net could provide a secure retirement. 

Those pundits have been proven wrong. 

A recent Wall Street Journal story highlighted the hundreds of thousands of “401(k) millionaires” just at the Fidelity brokerage. Far from being a refuge just for the wealthy, individual retirement accounts have become a widespread and secure way to save for retirement. They have also become one of the main contributors to America’s national wealth.

In the decades after World War II, federal tax policy encouraged employers to offer what are known as “defined benefit” retirement plans, where companies promised to pay set amounts to their former employees after retirement. But in 1974 the government began allowing people to open individual retirement accounts (IRAs) for themselves, with no tax on the contributions. More importantly, in 1978, Congress added what would become the famous Section 401(k) to the US tax code, giving employers the option to support individual retirement accounts.

Around the time of the 401(k) tax code change, there were about 30 million defined benefit plan participants in the private sector, an all-time peak. That was nearly double the total in “defined contribution” or individual retirement plans, such as the 401(k).

Today, the number of active participants in defined benefit plans is down to about 10 million, but there are almost 90 million in defined contribution plans. Thanks to 401(k)s, the total number of workers with any retirement plan is at an all-time high, even accounting for population growth.  

While many have lamented the decline of the defined benefit package, in one sense the market has spoken. People have moved away from stodgy jobs with strict defined benefits packages. One reason is that the government allows companies to wait up to five years before any of their defined benefits are vested, and companies often choose to vest such plans slowly, because the plans are quite risky for the companies themselves. Since the median length of a job in the US is about four years, defined benefit plans can leave employees at these companies without any savings at all.  

The riskiness of defined benefits packages is demonstrated by the long history of their bankruptcies. The only reason those plans are not even rarer today is that they are supported by a government corporation that came along at the same time the IRA was created, the Pension Benefit Guaranty Corporation. Due to plan bankruptcies, the PBGC was tens of billions of dollars in the hole until an American Rescue Plan bailout in 2021 salvaged it.

In 2025, Americans held $13 trillion in defined contribution accounts, mainly 401(k)s, and another $18 trillion in individual retirement accounts not directly attached to employers. Most of those individual retirement accounts, though, came from “rollovers” of previous employer accounts into IRAs, showing the flexibility that comes from individual savings when people move or change jobs. In total, almost a quarter of all household financial wealth in America is in individual retirement savings.

Despite periodic cries about a retirement crisis, people with the option to save for retirement are saving a lot. Fidelity estimates that people with 401(k)s are saving over 14 percent of their income in them, including both employer and employee contributions. The median retirement savings for the recently retired is $200,000, which helps explain the all-time record net worth for this group. The amount of savings will go up as more people retire who only know of defined contribution accounts. The number of people with individual accounts at middle age is actually higher than it is for older groups.  

Beyond the benefits to individuals, there are social benefits to individual retirement accounts. In countries with more expansive collective safety nets and social security, most people don’t have to save as much for retirement. Although for some individuals that could work out fine, for society as a whole, it can be devastating. Retirement is one of the main reasons people save, and savings are the main reason businesses can invest, and investment is the main reason economies grow.  

Decades ago, economist Martin Feldstein showed that the Social Security system in the US reduces personal savings by anywhere from a third to more than half. Although the precise magnitude of this effect is debatable, the broader point is not: an even more expansive social safety net would further depress savings.

The proliferation and success of 401(k)s is one reason political pressure to expand Social Security has remained muted. Social Security is a necessary provider for those with limited savings or options, but there is broad agreement that the program requires reform. One sensible approach would be to limit payouts for individuals who already have substantial incomes in retirement—and here, again, 401(k)s will be central. Many Americans are entering their later years with sizable holdings of stocks and bonds in 401(k)s, and modest reductions in Social Security benefits for these groups would not be devastating.

The success of tax-advantaged savings accounts should be celebrated. Yet that success has gone too far in one respect. The federal government now has not just tax-advantaged retirement accounts, but tax-advantaged savings accounts for higher education, accounts for K-12 education, accounts for health care expenses, accounts for funds spent on people with disabilities, and, most recently, accounts for the expenses of “emergencies” more generally.

The surprising proliferation of tax-advantaged savings accounts is moving much of the population into a system where their savings are not taxed at all, which is much to the good. But now families have to navigate how much money to put into each bucket and for how long, and what will happen if they don’t spend the funds or if funds from one bucket are needed for other expenses. 

Ideally, the system could just stop taxing people’s savings and focus on taxing consumption.  

In the meantime, believers in individual liberty should celebrate the success of the 401(k) and pray for more successes to come. 

For three months at the peak of COVID-19, I treated some of New York City’s sickest patients at Bellevue Hospital, the city’s historic public hospital. There, extraordinary clinicians delivered heroic care to the most at-risk patients. While there, I couldn’t help but compare Bellevue to the gleaming NYU Langone Tisch Hospital — a nonprofit private facility almost next door where patients with robust insurance predominantly received care. The hospital even maintained a quasi-VIP room in its emergency department, a feature that had ignited controversy in 2022 for symbolizing stratified care.

Rich and poor patients receive starkly different treatment in New York City — and nationwide. It’s exactly these types of disparities that infuriate newly elected New York City Mayor Zohran Mamdani, who vows to eradicate them in the name of equity.

The mayor-elect wants to increase access to healthcare. His administration has prioritized affordability and expansion of public services, building on a campaign that mobilized young voters and progressives toward a vision of universal rights.

Democratic socialists champion healthcare as a universal right, yet this vision confronts an intractable barrier: will the government compel physicians, nurses, or hospitals to participate? Insurance coverage, however robust, remains meaningless without actual access and delivery. Expanding coverage alone does not guarantee providers will accept patients, especially when financial realities favor higher-paying private plans.

A concrete example is joint arthroplasty, such as hip or knee replacement. Abundant data confirm that, for appropriate candidates, surgery dramatically enhances quality of life and functional status. Studies consistently show improvements in mobility, pain reduction, and overall patient-reported outcomes, making it a benchmark for assessing equitable access to elective procedures.

Countries with socialized medicine, like Canada, treat healthcare as a positive right and provide universal coverage — yet they falter on universal access. Canada sets a national benchmark of 26 weeks for hip replacement; according to the Fraser Institute, only 66 percent of patients undergo surgery within that timeframe. Wait times also reflect resource allocation challenges in single-payer systems, where rationing occurs through queues rather than price, often delaying care for non-urgent but life-improving interventions.

In the United States, Medicaid patients — covered by the government’s safety-net insurance — are less likely to receive arthroplasty and face longer surgical waits than those with commercial insurance. Research from national databases reveals that Medicaid enrollees not only access these procedures less frequently but also experience barriers in specialist referrals and pre- and post-operative optimization.

Surgeons struggle to treat Medicaid patients for several reasons, chief among them reimbursement. Medicaid pays far less for identical work: if Medicare reimburses a physician $1.00 per procedure, private insurance averages $1.43, while New York Medicaid pays just 76 cents.

To achieve equity, Mayor Mamdani will have to lobby Governor Hochul and the federal government to increase Medicaid reimbursement rates.

Medicaid patients also experience higher rates of complications, readmissions, prolonged hospital stays, and worse patient-reported outcomes. They face 81.7 percent greater odds of emergency department visits compared to privately insured patients.

The new city administration campaigned on a pledge to “expand access” and “lower costs for everyone.” To achieve this, Mayor Mamdani will need to substantially increase physician and provider participation in safety-net hospitals and insurance. What if physicians don’t want to participate?

The uncomfortable (and usually unspoken) reality is this: Achieving true healthcare equity will require the forcible appropriation of physicians’ property — their time, expertise, and professional autonomy.

At its core, the conflict pits positive rights (entitlements to goods and services) against negative rights (freedoms from coercion), inseparable from the foundational principles of property ownership. Philosophically, positive rights demand active provision by others, potentially infringing on individual liberties, while negative rights protect against interference — a tension central to debates on mandatory service or quotas.

This is the fundamental challenge posed by positive rights. For example, the European Union recognizes a right to education, yet someone must actually provide that education. Similarly, the EU acknowledges a right to healthcare, but someone must deliver that care. Even more critically, these positive rights can come into direct conflict with one another. The EU, for instance, guarantees workers certain work-life balance protections, including a minimum of four weeks of paid vacation per year. Physicians, however, are a scarce and finite resource. What happens when physicians exercise their mandated vacation time and there are not enough doctors available to meet patient demand?

In NYC, in the name of Mamdani’s equity, will the city compel physicians to accept every insurance plan? Should it mandate minimum patient quotas? Should it outlaw tiered care — framed through the lens of oppressor and oppressed — to enforce uniform outcomes?

If a city government can conscript doctors in these ways, what else can it command them to do?

“Was the government to prescribe to us our medicine and diet, our bodies would be in such keeping as our souls are now.” -Thomas Jefferson

Let me introduce you to Sam. Sam has obesity, Type 2 diabetes, heart disease, and high blood pressure. His diet consists mostly of refined grains and trans fats. He’s got cabinets full of dirt-cheap junk food and sky-high healthcare costs to address its effects. He takes home $27,000 a year, but spends $36,000. He’s in debt up to his jaundiced eyeballs, and he wants his niece to foot the bill for weight-loss medication.

As a real-life niece of my Uncle Sam, I’m concerned about his diet. Some 56.2 percent of the daily calories consumed by US adults come from federally subsidized food commodities: corn, soybeans, wheat, rice, sorghum, dairy, and livestock. While these calorie-dense foods once made sense for a government preparing for famine or total war, in recent decades they’ve instead helped make us fatter and sicker. 

Obesity is a top driver of healthcare costs. One study compared the health of people who eat mostly foods the federal government subsidizes to those who eat fewer of them. Those who follow the revealed preferences of what the government subsidizes (rather than the diet it consciously recommends) are almost 40 percent more likely to be obese and face significant diet-related health issues. Those with the highest consumption of federally subsidized foods also have significantly higher rates of belly fat, abnormal cholesterol, high levels of blood sugar, and more markers of chronic inflammation. All these are increasing contributors to the most common causes of death in the developed world.

The negative impact of subsidized crop consumption on health — while it can’t be called causal — persists even after controlling for age, sex, and socioeconomic factors. But life does not control for those factors.

The Great Grain Giveaway

The federal government recommends one diet to Americans, and subsidizes another. The Dietary Guidelines for Americans from the USDA and HHS promote eating fruits, vegetables, whole grains, protein, and moderate dairy, while limiting saturated fats, sugars, salt, and refined grains. According to data compiled for Meatonomics, American agribusiness receives about $38 billion annually in federal funding, with only 0.4 percent ($17 million) going to fruits and vegetables. Just three percent of cropland is devoted to fruits and vegetables, despite USDA guidelines’ insistence that they should cover half of your dinner plate. Just 10 percent of Americans consume the recommended amount of fresh produce, and the poor consume the least. (Fruit and vegetable producers’ exclusion from the federal direct payments program provides a valuable example of a food industry thriving without significant subsidies. They do, however, rely heavily on migrant labor to lower costs.)

Instead, the US spends tens of billions annually to subsidize seven major commodities. The three largest farm subsidy programs contribute 70 percent of funds to producers of just three crops — corn, soybeans, and wheat. Approximately 30-40 percent of US corn, over half of soybeans, and nearly all sorghum feed livestock, heavily discounting high-fat, lower-nutrition meat and dairy (especially compared to grass-fed options). The prevalence of grain-fed livestock generates demand for commodities used to feed them, completing the circle. 

Subsidies also contribute to our consumption of refined grains, sugary drinks, and processed foods. About five percent of corn becomes artificially cheap high-fructose corn syrup (which allows it to compete with tariffed natural sugars), and half of soybeans are processed into oils, which also contribute to obesity.

My Uncle Sam is sick because he eats the food the government makes artificially more affordable. Those foods are poorer in quality and more harmful to health than their unsubsidized alternatives. We are paying to make ourselves sicker.

Diet-Related Health Issues Fuel Healthcare Costs

For more than 20 years, the FDA has known that trans fats and refined grains harm health, damage metabolism, and cause disease. Diet-related illnesses like obesity, Type 2 diabetes, and high blood pressure are increasing, while heart disease remains the leading cause of death. These epidemics are intertwined at the artery level, and both contribute hugely to rising US health care costs.

In an economic order awash with subsidies and regulation, agricultural policy is health policy. Government subsidies for agricultural products have shaped the current American nutritional environment, and they are exacerbating obesity trends.

An article in the American Journal of Preventive Medicine confirms: “Current agricultural policy remains largely uninformed by public health discourse.”

Johns Hopkins physician (and current Commissioner of the US Food and Drug Administration) Marty Makary called out the disconnect clearly. “Half of all federal spending is going to health care in its many hidden forms,” he told an interviewer in October, but Americans continue “getting sicker and sicker… Chronic diseases are on the rise. Cancers are on the rise. And we have the most medicated generation in human history.”

We’re getting more medicated every day — and more of it is at taxpayer expense. 

A Better Answer Than Ozempic?

Government spending on healthcare now exceeds the entire discretionary budget. Excess weight is a significant risk for older Americans, who are also the most likely to both have high healthcare costs and to rely on government health care. Forty percent of Americans over 60 are classified as having obesity, which is a contributing or complicating factor in diseases that kill older Americans: cancers, heart disease, infection, stroke, and cirrhosis.

Late last year, the Food and Drug Administration approved the weight-loss drug Wegovy as a treatment for people at risk of heart attack or stroke. Medicare is forbidden by statute from covering prescription drugs for weight loss alone, but in 2021 regulators approved Wegovy for reducing weight-related risks in patients with diabetes. Medicare Part D plans spent $2.6 billion last year on Ozempic, a different brand of the same compound, to keep 500,000 patients with diabetes stable. Wegovy’s list price is around $1,300 per month, but that’s still small compared to the $1.4 trillion Americans spend on direct and indirect costs from obesity.

It has a certain economic logic. Instead of waiting for a patient to develop a cascade of expensive comorbidities like heart failure or diabetes, we could consider asking Medicare to pay for anti-obesity meds on the front end. That wouldn’t work as well as lifestyle changes, but all our health and activity messaging over the past several years doesn’t seem to have moved that needle, and significant evidence suggests our efforts are counterproductive. 

The Tangled Web of Farm Subsidies

To understand the insanity of American agricultural and health policy, it’s hard to do better than comedian-illusionists Penn & Teller, who in characteristically salty style (really — you’ll want headphones and a sense of humor to watch the video) explained it this way 15 years ago: 

High fructose corn syrup is a dirt-cheap way to add sweetener and extend shelf life. And why is it so cheap? Because we subsidize corn farmers! Our government gives about 10 billion of our tax dollars to corn farmers every year so they can produce more corn than we need. They then sell the corn at artificially low prices. They spend our money to make corn syrup cheap, and now the same government that uses our tax money to keep soft drinks cheap wants more of our tax money to make soft drinks more expensive. Does anyone else think this is incredibly f—d up?

Yes, Penn. We do. And since that clip aired, obesity rates have worsened by 50 percent, and by 78 percent among children. Medical spending on the consequences of obesity has doubled. Over the same period, subsidies to corn growers (which include disaster aid and insurance) have tripled.

Rather than cut back on his terrible diet, Uncle Sam wants us to pony up for weight loss drugs — to undo what our food policy has done.

Over the five years since the COVID pandemic, the AIER Year End Holiday Index has climbed by an average of about 3.8 percent per year, resulting in a total increase of just under 21 percent. In the preceding five-year period from 2015 to 2020, the index rose only slightly — just over 2.7 percent in total — equivalent to an average annual gain of about 0.5 percent.

(Source: Bloomberg Finance, LP. Data subject to shutdown limitations.)

Our proprietary HDAY Index captures price movements across a broad basket of holiday-relevant goods and services, including apparel, toys, books, software, jewelry, pet and personal care items, gift-wrapping materials, postage and shipping, alcohol, confectionery, houseplants, and movie tickets. The table below presents both the average annual rate of change and the cumulative price increase for the five years preceding the pandemic and the five years that followed. These results are shown alongside changes in the Employment Cost Index as well as key holiday travel expenses over the same periods, including airfare and gasoline.

Category              Avg Annual Change   Avg Annual Change   Total Change    Total Change
                      (2015-2020)         (2020-2025)         (2015-2020)     (2020-2025)
HDAY Index            0.68%               4.17%               3.46%           22.64%
ECI Index             2.56%               4.10%               13.48%          22.28%
Airfare               -6.41%              5.50%               -28.18%         30.70%
Gasoline (average)    -0.87%              7.60%               -4.28%          44.23%
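For readers wondering how the two sets of columns relate: if the “average annual change” figures are read as compound (geometric) annual rates over each five-year window — the table does not state its method, so this is an inference that happens to match the published numbers — each average can be recovered from the corresponding total change. A minimal Python sketch:

```python
# Convert a cumulative five-year change into a compound annual rate.
# Assumes the table's "average annual change" is geometric, which is an
# inference; the results line up with the published figures.

def avg_annual_from_total(total_change_pct: float, years: int = 5) -> float:
    """Compound annual rate implied by a cumulative percentage change."""
    return ((1 + total_change_pct / 100) ** (1 / years) - 1) * 100

print(f"{avg_annual_from_total(22.64):.2f}%")   # HDAY, 2020-2025    -> 4.17%
print(f"{avg_annual_from_total(13.48):.2f}%")   # ECI, 2015-2020     -> 2.56%
print(f"{avg_annual_from_total(-28.18):.2f}%")  # Airfare, 2015-2020 -> about -6.4%
```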

Since the end of 2019, the HDAY Index reveals an increase of over 18 percent in the prices of selected goods and services. As shown below, every category other than recreational books and toys has risen in price. Notable increases over the past half-decade have occurred in categories most closely associated with Christmas, Hanukkah, and other end-of-year festivities: postage and delivery services, stationery and gift wrapping, confectionery, and indoor plants and flowers.

Category                                     Avg Annual Change   Avg Annual Change   Total Change    Total Change
                                             (2015-2020)         (2020-2025)         (2015-2020)     (2020-2025)
Sugar and Sweets                             0.92%               6.44%               4.68%           35.26%
Women’s and Girls Apparel                    -2.35%              2.33%               -11.23%         12.30%
Men’s and Boys Apparel                       -0.98%              3.42%               -4.79%          18.47%
Toys                                         -8.18%              -0.74%              -34.21%         -3.66%
Recreational Books                           -0.97%              -0.01%              -4.74%          -0.05%
Pets, Pet Products, and Services             1.26%               4.71%               6.42%           25.85%
Postage and Delivery Services                2.97%               4.22%               15.74%          23.03%
Jewelry and Watches                          0.27%               2.01%               1.38%           10.50%
Indoor Plants and Flowers                    0.93%               4.97%               4.74%           27.51%
Haircuts and Other Personal Care Services    2.83%               4.82%               15.03%          26.48%
Cakes, Cupcakes, and Cookies                 1.05%               5.68%               5.39%           31.25%
Alcoholic Beverages At Home                  1.48%               2.24%               7.64%           11.74%
Stationery, Stationery Supplies, Gift Wrap   -0.20%              6.44%               -1.02%          36.63%

As the 2025 holiday shopping season unfolds, several Christmas-related prices have climbed noticeably, reflecting broader inflationary pressures and lingering effects from tariffs on imported goods. One of the most visible examples is in artificial Christmas trees, where higher import costs have pushed retail prices up by roughly 10–15 percent this year, affecting a staple purchase for many American households. The tariff-driven increase represents a meaningful rise against the backdrop of generally elevated seasonal costs. In addition, a growing number of consumers and small retailers have reported higher prices on holiday decorations and gift items, including ornaments and novelty gifts, with some toys and decorative goods seeing wholesale cost increases in the range of 5 to 20 percent, which retailers in turn are passing on to shoppers. Those trends, in turn, are contributing to heightened consumer awareness of ongoing inflationary pressures as gift budgets tighten and shoppers adjust their purchases. 

One hopes that consumers increasingly recognize these affordability strains as the cumulative result of the past five years of extraordinary monetary and fiscal expansion, pandemic-era interventionism, global spending largesse, and a sudden shift toward mercantilist trade policies.