The Federal Reserve’s Board of Governors got a taste of The Apprentice treatment last week. 

On August 25, President Trump removed Lisa Cook from her position as a Fed governor. Her ousting is widely viewed as an attack on the central bank’s independence. Nearly 600 economists have signed an open letter to express their “strong support” for “Lisa Cook and for the longstanding principle of central bank independence.”

It is easy to see why the president might want the Biden appointee gone. Trump has consistently called for the Fed to cut its federal funds rate target, thus far to no avail. Sacking Cook gives him another permanent seat to fill on the Federal Open Market Committee — and may persuade the remaining governors to get in line. In other words, firing Cook may enable Trump to remake the Fed in his own image.

But that’s not the reason the president offered. In a letter published to Truth Social, Trump indicated he was removing Cook “for cause” following a criminal referral from Federal Housing Finance Agency Director William Pulte. That is convenient, to say the least. The Federal Reserve Act permits the president to remove a governor for cause. It does not permit the president to remove a governor over policy disputes.

The Allegations

Let’s start with the allegations. According to Pulte, Cook made false statements on one or more mortgage documents. He cited two loans in the initial referral. As The Wall Street Journal reports, Cook took out a $203,000, 15-year mortgage in June 2021 on an Ann Arbor, Michigan home she had owned since 2005, indicating she would use the property as her primary residence for at least one year. Then, just weeks later, she took out a $540,000, 30-year mortgage to purchase an Atlanta, Georgia condo, again indicating she would use the property as her primary residence for at least one year. Pulte alleges Cook committed occupancy fraud by claiming she would use both properties as her primary residence for at least one year.

Falsely claiming a secondary residence as a primary residence will typically reduce the interest rate a borrower must pay, since borrowers are much less likely to default on a loan that would see them lose their primary residence. 

How significant is the offense? It’s a federal felony. However, as Megan McArdle explains at the Washington Post, “individuals are rarely prosecuted” for occupancy fraud “because that would take a lot of time that the bank and prosecutors could more profitably spend doing something else.” Still, it is hard to justify “letting a public official get away with something the system can’t afford to publicly condone” once the offense has come to light. That the public official is a bank regulator makes it even more difficult to justify.

But that’s not all! In a second criminal referral, submitted on Friday, Pulte alleged Cook made false statements on a third loan as well. In April 2021, Cook took out a $361,000, 15-year mortgage on a Cambridge, Massachusetts condo she had purchased in 2002, indicating she would use the property as a second home for at least one year. According to Pulte, this property was not used as a second home, but as an investment property. “Documents she filed with the federal Office of Government Ethics show that Cook was already drawing rental income from the property by December 2021,” The Wall Street Journal reports.

The Court Battle

Cook sued President Trump, the Federal Reserve Board, and Fed Chair Jerome Powell on August 28, seeking “immediate declaratory and injunctive relief to confirm her status as a member of the Board of Governors, safeguard her and the Board’s congressionally mandated independence, and allow Governor Cook and the Federal Reserve to continue its critical work.” She is also seeking a temporary restraining order, which would permit her to remain in her position as governor until the case is settled, on the basis that she “is likely to succeed on the merits of her claims that President Trump’s purported firing violated her statutory and constitutional rights.”

In a hearing held on Friday, Cook’s attorney argued “the President has relied on a thinly-veiled pretext in an attempt to remove Governor Cook over her unwillingness to lower interest rates.”

The administration’s attorney responded to the pretext argument by reiterating that Cook was removed for cause and citing the decision in Trump v. Hawaii, which rejected a theory that would require “an inquiry into the President’s motives,” continuing,

Insofar as Dr. Cook seeks a ruling that the President’s stated rationale was pretext, the Court should decline ‘to probe the sincerity of the [president’s] stated justifications’ for an action when the President has identified a facially permissible basis for it. Not only does precedent foreclose that path as a matter of law, but Dr. Cook offers nothing but speculation to support her charge of insincerity. That is no basis to set aside a presidential action committed to the President’s discretion by law.

In other words, the president is free to reshape the Fed Board to achieve his policy goals — so long as he can show cause.

US District Judge Jia Cobb has yet to rule on the matter, but a decision is expected before the Federal Open Market Committee meets in September.

Central Bank Independence

Democrats are understandably upset about Trump’s attempt to fire Cook. But their calls for central bank independence ring hollow. Time and time again, they have shown themselves willing to play politics with the Fed — when it suits their interests.

For starters, consider their relatively recent efforts to change the Fed’s mandate. In 2019, Rep. Alexandria Ocasio-Cortez (D-NY) and Sen. Ed Markey (D-MA) sponsored legislation for a Green New Deal, which would have seen the Fed adjust policy to help achieve climate goals. In 2023, Rep. Maxine Waters (D-CA) and Sen. Elizabeth Warren (D-MA) sponsored the Federal Reserve Racial and Economic Equity Act, which would have required “the Federal Reserve Board to carry out its duties in a manner that supports the elimination of racial and ethnic disparities in employment, income, wealth, and access to affordable credit.” Congress certainly has the right to modify the Federal Reserve Act. But it is hard to square these particular efforts with the current calls for central bank independence. Indeed, they look like efforts that would further politicize the Fed in order to advance so-called progressive political causes.

Democrats have also pushed out a Fed governor over purported ethics violations. Richard Clarida resigned in January 2022, amid claims that he had profited from insider information about forthcoming Fed policy in the early days of the pandemic. As the New York Times reported, he had moved somewhere between $1 million and $5 million from a broad-based bond fund to broad-based stock funds on Feb. 27, 2020. The trade, which the Fed described as a preplanned portfolio rebalancing that was similar to a trade he had made the prior year, complied with the central bank’s financial ethics rules. And, given the timing, it is a trade that probably cost him dearly: the S&P 500 declined 11.7 percent over the month that followed, while domestic bonds declined just 1.5 percent. Still, Sen. Warren requested Securities and Exchange Commission Chair Gary Gensler open an investigation in October 2021 and was still going on about the supposed “trading scandal” as late as August 2025. The real scandal — for genuine advocates of central bank independence — is that Democrats misconstrued a standard portfolio rebalancing to get rid of a Trump appointee.

Finally, consider how Cook’s appointment came about. In February 2018, Janet Yellen resigned, creating a vacancy on the Fed Board. Then-President Trump nominated Judy Shelton for the position in July 2019. However, her nomination stalled in the Senate. When Shelton finally came up for a vote in November 2020, not a single Democrat voted to confirm her. This left the vacancy for Biden to fill. He nominated Cook, the Senate split along party lines, and Vice President Kamala Harris broke the tie in favor of Cook’s appointment.

Of course, Senators have the right to oppose a president’s nominee. But it is difficult to argue they were not playing politics when they refused to confirm Shelton. Unlike Cook, who to the best of my knowledge had never written or spoken publicly about monetary policy prior to being considered for the Board seat, Shelton had written and spoken extensively on the subject. She was certainly qualified for the position, as judged by Cook’s later appointment. But Senate Democrats refused to confirm Shelton to get a Fed Governor with policy views closer to their own. It was a lawful decision, to be sure. But it was also a political decision.

Now, Trump is making what appears to be a lawful decision to fire Cook — for cause — in order to appoint a Fed Governor with policy views closer to his own. 

Democrats do not like it. But they would almost certainly do the same if given the chance.

Earth is going to hit “peak population” before the end of this century. Within 25 years, most of the world’s developed nations will be facing sharp population declines, with shrinking pools of young people working to support an ever-aging population.

The reason is not famine, war, or pestilence. We did this to ourselves, by creating a set of draconian solutions to a problem that didn’t even exist. Fear has always been the best tool for social control, and the fear of humanity was deployed by generations of “thinkers” on the control-obsessed left. 

Most starkly, Paul Ehrlich made a remarkably frightening, and entirely false, prediction in 1968, in his book The Population Bomb (PDF):

The battle to feed all of humanity is over. In the 1970s the world will undergo famines —  hundreds of millions of people are going to starve to death in spite of any crash programs embarked upon now. At this late date nothing can prevent a substantial increase in the world death rate…

We may be able to keep famine from sweeping across India for a few more years. But India can’t possibly feed two hundred million more people by 1980. Nothing can prevent the death of tens of millions of people in India in the 1970s…

And England? If I were a gambler, I would take even money that England will not exist in the year 2000.

PJ O’Rourke explained what was going on, in his 1994 book All the Trouble in the World:

The bullying of citizens by means of dreads and frights has been going on since paleolithic times. Greenpeace fundraisers on the subject of global warming are not much different than the tribal Wizards on the subject of lunar eclipses. ‘Oh no, Night Wolf is eating the Moon Virgin. Give me silver and I will make him spit her out.’

Family Planning and State Intervention

But there is more going on here than just gulling the gullible; the overpopulation hysteria of the 1960s and 1970s had world-changing consequences, effects that are just now becoming clear. It’s not fair (though it is fun) to blame Ehrlich alone; the truth is that the full-blown family-size freakout emerged from a pseudo-science that held that growth was a threat to prosperity. Influential organizations were founded by very worried people: the Population Council and the International Planned Parenthood Federation were both created in 1952. Developing nations began promoting aggressive family planning initiatives, often with substantial support, and sometimes coercive pressure, from Western governments and international agencies.

The United Nations, the World Bank, and bilateral donors, particularly the United States through USAID, increasingly integrated population control into foreign aid programs. High fertility rates, particularly in Asia, Africa, and Latin America, were viewed not merely as demographic trends but as Malthusian obstacles to modernization, poverty alleviation, and global security. China implemented its infamous “One-Child Policy” in 1979 with coercive measures, including forced sterilizations and abortions. India conducted mass sterilization campaigns, particularly during the Emergency period (1975–1977), often using force or extreme social pressure, including withholding ration cards. A number of countries in East Asia saw aggressive state-controlled programs, often funded by the World Bank, that sought to use questionable and coercive methods to reduce population growth quickly and permanently.

In more than a few cases, of course, the availability of contraception was actually a means of freeing women to make a choice to have fewer children. But combining this choice with state-sponsored coercion meant that even those who wanted more children, or would have wanted more children if the social pressures had been more sensibly used, were diverted from their private dream of several children.

That would be bad enough, if that were the end of the story. But it is only the beginning, because the sanctimony of scientism has created an actual population crisis, one that will affect the world for decades. Some nations may never recover, at least not in their present form. That crisis is the population bust.


Shrinking Planet: Which Nations Will Peak When?


I did some back-of-the-envelope calculations, using available data, to estimate the year of projected peak population for the 28 countries where the data are reliable enough to make an educated guess. The projections are based on Total Fertility Rates (TFR), adjusted for immigration and mortality (life-expectancy) trends. These estimates are, at best, approximations, because in some cases the data are not strictly comparable. The data I do have are drawn from the United Nations World Population Prospects, OECD statistical reports, and national demographic sources.

| Country | Total Fertility Rate | Projected Peak Population Year |
|---|---|---|
| Australia | 1.66 (2023) | 2035 |
| Austria | 1.45 (2022) | 2040 |
| Belgium | 1.60 (2022) | 2038 |
| Canada | 1.40 (2022) | 2045 |
| Chile | 1.48 (2022) | 2040 |
| Czech Republic | 1.70 (2021) | 2033 |
| Denmark | 1.55 (2022) | 2037 |
| Finland | 1.35 (2021) | 2035 |
| France | 1.84 (2021) | 2050 |
| Germany | 1.53 (2021) | 2035 |
| Greece | 1.43 (2021) | 2030 |
| Hungary | 1.55 (2021) | 2035 |
| Ireland | 1.78 (2021) | 2045 |
| Israel | 3.00 (2021) | No peak this century |
| Italy | 1.25 (2021) | 2030 |
| Japan | 1.30 (2021) | 2008 (already peaked) |
| Korea | 0.70 (2023) | 2025 (peaking) |
| Mexico | 1.73 (2021) | 2050 |
| Netherlands | 1.60 (2021) | 2040 |
| New Zealand | 1.65 (2022) | 2045 |
| Norway | 1.50 (2021) | 2040 |
| Poland | 1.39 (2021) | 2032 |
| Portugal | 1.40 (2024) | 2028 |
| Spain | 1.19 (2021) | 2028 |
| Sweden | 1.60 (2021) | 2045 |
| Turkey | 2.05 (2021) | 2050 |
| United Kingdom | 1.53 (2021) | 2040 |
| United States | 1.62 (2023) | 2045 |
| REPLACEMENT TFR | 2.08-2.11 | Constant population |
See endnote for more source information.

Peak population years are based on UN World Population Prospects (PDF) mid‑variant projections, supported by regional reports noting that most European/North American nations will peak in the late 2030s. Japan already peaked around 2008, South Korea around 2025, and Israel — with TFR near 3.0 — may not peak this century.

As is noted in the final row of the table, the replacement rate for total fertility is about 2.10, given trends in life expectancy and assuming no net migration.
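To see what sub-replacement fertility implies, here is a minimal sketch of the generation-to-generation arithmetic (my own illustration, not the author's projection model): in a closed population with a constant TFR, each generation is roughly TFR divided by the replacement rate times the size of the one before it.

```python
# Toy illustration of sub-replacement fertility. Assumes a closed population,
# constant TFR, and ignores age structure and migration entirely.
REPLACEMENT_TFR = 2.1  # roughly the replacement rate noted above

def generation_ratio(tfr):
    """Approximate size of each generation relative to the previous one."""
    return tfr / REPLACEMENT_TFR

for country, tfr in [("Korea", 0.70), ("Italy", 1.25),
                     ("United States", 1.62), ("Israel", 3.00)]:
    print(f"{country}: each generation is about "
          f"{generation_ratio(tfr):.0%} the size of the last")
```

At Korea's TFR of 0.7, each generation would be roughly a third the size of the one before it, which is why its population is already peaking; only Israel, above replacement, grows from one generation to the next.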

This raises a question: if all these countries have TFRs below replacement, what is actually happening to the world’s population? The answer is simple, though it has not been talked about much. The world population is going to peak, and then start to decline. The total number of people on Earth will begin to fall sometime in the near future. The actual date of the peak is a matter of conjecture, since it depends on specific assumptions, but the estimates appear mostly to fall between 2060 (assuming current TFRs are constant) and 2080 (if TFRs increase slightly, and life span increases):

[Chart: projected world population peak. Sources: United Nations Medium-Fertility Projection; simplified Lancet Projection Population Scenario.]

None of this needed to happen, folks. There is plenty of room on Earth, as you know if you have ever flown across Australia, Canada, or for that matter the US, at night. There is a lot of empty space.

Let’s do a thought experiment: there are 8.1 billion people on Earth now. Suppose all of them lived in the US state of Texas (for those Texans reading this, I know it seems like we are moving in that direction; the traffic in Dallas is remarkable!). Texas has an area of 676,600 square kilometers. If literally the whole world did move to Texas, what would that look like?

Well, 8.1 billion / 676,600 is about 12,000 people per square kilometer. That’s slightly more dense than the five boroughs of New York (about 11,300 per square kilometer), but much less than Paris (20,000), and dramatically less than Manila (nearly 44,000). Now, New York and Paris are pretty crowded, but people do live there, and even go there voluntarily to visit sometimes. Even if the entire current global population had to move into Texas, it’d be only marginally more annoying than Manhattan at rush hour.
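The arithmetic is easy to check; a quick sketch, using the figures quoted above:

```python
# Back-of-the-envelope check of the density figures quoted in the text.
world_population = 8.1e9   # people
texas_area_km2 = 676_600   # square kilometers

density = world_population / texas_area_km2
print(f"{density:,.0f} people per square kilometer")  # roughly 12,000

# Comparison densities quoted in the text (people per km^2)
nyc_five_boroughs = 11_300
paris = 20_000
manila = 44_000
assert nyc_five_boroughs < density < paris < manila
```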

So, here’s the takeaway: there was no good reason for the population hysteria of past decades. As I tried to argue in an earlier piece, those predictions were ridiculous even at the time. And we need not be concerned about reviving the “population bomb,” because there is plenty of room, even if the human population does start to grow again, and even if we all had to move to Texas.

The effects of population decline are already starting to be felt in countries such as South Korea and Japan. As the average age climbs, the absolute number of people under 40 starts to decline. Unless something changes, the world population in general, and many specific countries, will face circumstances that, until now, have only ever been observed during catastrophic plagues or savage wars: blocks of empty houses, abandoned cities, and hordes of elderly people who lack the ability to provide for themselves. The difference in the present case, however, is that we are not suffering from famine or war. As Antony Davies pointed out, the looming population collapse is a consequence of a striking failure to recognize that human beings are the most valuable resource we have.

Some Notes on Sources

  • TFR data comes from OECD and UN: OECD average TFR was 1.5 in 2022
  • OECD Social Indicators 2024
  • The Real Reason People Aren’t Having Kids, The Atlantic 
  • Fertility Rate, Total for OECD Members, St. Louis Federal Reserve
  • List of countries by past fertility rate Wikipedia.org
  • Country‑specific TFRs drawn mostly from UN/EU data such as: Total fertility rate Wikipedia.org
  • Charted: When Every Continent’s Population Will Peak This Century visualcapitalist.com
  • More countries, including China, are grappling with shrinking and aging populations, The Atlantic
  • Denmark’s TFR (1.55 in 2022) is from its national statistics
  • Korea’s extremely low TFR (0.7 in 2023) is from OECD press releases

In a TikTok video that went viral this week (newsworthy, I know), a barefaced young woman sits in her car, in the middle of what we Gen Zers call “a crash out.”

Alyssa Jeacoma, you see, has been making student loan payments of $1500/month (the cost of a one-bedroom apartment or the mortgage on a starter house) for two years.

In tears and disbelief, she explains that she spent two years thinking she was paying down her debt, and was shocked when she discovered that, thanks to a 17-percent interest rate, her total balance had gone up.
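The mechanics are worth making explicit. A rough amortization sketch, using hypothetical numbers rather than Ms. Jeacoma's actual loan terms: at a given APR there is a break-even balance above which a fixed payment doesn't even cover the month's interest, so the balance grows despite on-time payments.

```python
# Hypothetical illustration: fixed monthly payment vs. monthly interest accrual.
apr = 0.17         # 17 percent annual rate, as cited above
payment = 1500.0   # fixed monthly payment

# Break-even: the balance at which one month's interest equals the payment.
break_even = payment * 12 / apr   # about $105,900

def one_month(balance):
    """Accrue one month of interest, then apply the payment."""
    return balance * (1 + apr / 12) - payment

balance = 110_000.0  # hypothetical starting balance above the break-even point
for _ in range(24):  # two years of on-time payments
    balance = one_month(balance)
# After two years the balance is higher than where it started.
```

Above that threshold, every on-time payment still leaves the borrower deeper in the hole, and unpaid interest that capitalizes during deferment can push a balance over it.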

The comments section is full of commiseration: “Yep. Used $31k in student loans. I graduated 10 years ago and I now owe $59k…”

The video went viral, with millions of views, and for good reason. It hits a nerve for many young Americans who — 15 years after Obama’s drastic federal takeover of the student loan program — are drowning in debt, unable to attain the upward trajectory they were told was the American dream.

If a young person with no credit history and no collateral assets tried to take out a $50,000 loan to start a business, few if any banks would take the risk. Ms. Jeacoma, who now owes $90,000, seems to agree that’s questionable: “How does this even make sense? I signed up for this…when I was 17. This should not be legal, bro.” A college education was once thought such a safe investment that no one blinked at the idea of giving tens of thousands of dollars to a teenager with only the barest understanding of the contract she was signing.

This woman’s student debt has an interest rate close to that of a credit card (the great no-no of financial advice – whatever you do, don’t take on credit card debt, or you may never climb back out of the abyss again). We’re giving away crushing student loans like candy. The entire university system is riding on the shoulders of broke twenty-somethings, mortgaging their futures to pay for football stadiums and presidential salaries, asking them to be small Atlases holding up a whole world while they slowly suffocate under the weight.

Pundits decry that Millennials and Gen Z are “stuck in their parents’ basements.” Headlines lament that we aren’t growing up fast enough, aren’t buying houses fast enough, aren’t getting out of debt fast enough. The average age of a first-time homebuyer in America is now 38, the highest on record. And no wonder, when so many are still trying to pay down their student loans.

But, the argument goes, you have to go to college to be successful in life. If you don’t go to college you’ll end up working at Burger King and living at your parents’ house and never making it on your own.

Unfortunately, if that was once true, the data show it isn’t anymore. Yes, young people are working at Burger King and Starbucks, living at their parents’ houses, and failing to strike out on their own. But a lot of them also have college degrees, and those college degrees aren’t saving them. Pundits and guidance counselors alike are selling college degrees as a life raft in the rising waters of an unhappy economy. But said college degrees are proving to be oxygen masks that don’t work, life rafts that fail to expand when they hit the water. Even students with the coveted sheepskins are discovering, too late, that parchment doesn’t float.

52 percent of college graduates are underemployed a year after graduating (52 percent of the class of 2023 were working jobs that didn’t require degrees at all). At the ten-year mark, 45 percent are still underemployed. The New York Fed estimates that 33 percent of all college graduates are underemployed – of all ages, in the entire economy.

As Cassandras like entrepreneur Isaac Morehouse, writer Ryan Craig, political player Robert Reich, and countless others have been saying for well over a decade, college does not make you employable.

College doesn’t even guarantee you an education. The great redeeming promise of college has always been its educational value: even if you aren’t going to use the degree in a specific field, a liberal arts degree will still help make you a well-rounded person.

But again, the statistics say otherwise.

Only 46 percent of American adults can read above a sixth-grade level, yet 47 percent of American adults have at least an associate’s degree (another 15 percent attended college but didn’t graduate). That means more adults hold college degrees than can read above a sixth-grade level. We send young people to college who shouldn’t have graduated from high school, and even after they finish college, many still read at an elementary level. How does that even happen?

And if all these statistics are true, then why on earth are we consigning kids to a debt rat race?

The cultural fear runs deep: if you can’t get a job that requires a college degree, you’re not going to make it in life (fundamentally not true). A blue-collar job can pay well into the six figures. Even a moderate hourly rate (say $20/hr) can be enough to establish a foundation in life. A young person living at home and saving for the first couple years of their career can have an enormous advantage over their college-educated peers.

What’s actually shackling young people is the debt: an anchor around their necks that they can’t cut loose, dragging them down at an alarming rate (to the tune of 17-percent annual interest). It’s almost impossible to tread water fast enough to stay above the surface.

And debt for what? Not for the benefit of the students, clearly, if some are graduating still unable to even read at a high school level.

Of course, some make it inside the system and thrive. According to the stats, most don’t. Fewer than half who start finish in four years. Every industry is filled with stories — doctors who hate their careers but are so shackled by crippling med school debt they have no choice but to carry on.

The cost-benefit analysis doesn’t check out. But high schoolers aren’t taught how to do a cost-benefit analysis, so they don’t know how to analyze the life-altering decision that comes at them often before they’re even legally adults. Most high schools don’t require an economics class as a prerequisite for graduation. Classrooms don’t cover personal finance – just interpreting Shakespeare and prepping for the SAT and putting a condom on a banana. Everyone is taught how to get good grades, how to ace a test, and how to impress a college admissions officer.

Which they do, to the tune of 62.8 percent (the share of 18-24 year olds enrolled in college in October of 2024). High schools are really, really good at producing students who can get into college.

But they’re really bad at preparing kids to make financial decisions that won’t weigh them down for decades to come. Students sign up for loans (the average graduate owes $33,150) with a 17-percent interest rate without knowing what that means, then are shocked when they see their balances rising.

The solution isn’t free college — because clearly college isn’t solving anyone’s employment problems.

The solution is to stop the cycle. Do a cost-benefit analysis on what you’re buying before you make the purchase. Assess: is the career path I’m on really going to work for me? Or is the liberal-arts-to-Starbucks-pipeline not the well-worn highway I want to traverse?

Teaching kids how to run cost-benefit analyses and understand finance might cripple our opulent, bloated, debt-fueled higher education system. It depends on our young people’s naïveté to pad its budgets and the pockets of its administrators. But it might just save the next generation of young people, who are far more important.

The Great Enrichment refers to substantial improvements in well-being, driven by comparatively permissive, enlightened, and liberal attitudes towards experimentation and entrepreneurship. Such attitudes coalesced in western Europe, fostered innovation, and have increased real incomes by a conservatively estimated 3,000 percent since 1800. Deirdre McCloskey describes it as an era of “human ingenuity emancipated.”

Many factors influenced this enrichment, and some economists note how the introduction of luxury goods like coffee increased well-being and productivity, especially in eighteenth-century Europe. But where did people get such a boon?

Whether you have five cups a day or none at all, we often take such questions and connections for granted. We can order our favorite coffee drinks on the Starbucks app, we can quickly brew another K-Cup, and there are myriad devices for the more discerning coffee drinker. There are over 38,000 Starbucks and 14,000 Dunkin’ Donuts stores across the globe. Such methods of caffeination are commonplace.

People in the sixteenth and seventeenth centuries didn’t have it so easy, but they did have coffeehouses!

Coffee had only recently been discovered — cultivation took off in Ethiopia in the 1400s — but it quickly revolutionized society once introduced. Coffeehouses became an efficient organization to satisfy the desires of myriad coffee producers and consumers, from farmers and merchants to daily drinkers and occasional passersby.

Seventeenth century coffeehouse in England. Bodleian Library, University of Oxford.

One of the earliest references to coffeehouses in Istanbul (formerly Constantinople), in 1554, notes the origin and relative luxury of these places: 

two private persons called Schems and Hekem, one from Damascus, and the other from Aleppo, set up each of them a coffee-house at Constantinople, in the quarter named Takh-tacalah, and began to sell coffee publickly, receiving the company on very neat couches or sofas.

In London, one of the first coffeehouses opened in the 1650s under the direction of a Levantine merchant and his assistant. The merchant, Thomas Hodges, had learned of coffee during his travels; his assistant was Pasqua Rosee. Hodges started serving coffee to family and friends at his home in London. When this operation became too much, he encouraged Rosee to open a shop to sell coffee. The shop in St. Michael’s Alley in Cornhill became a success and eventually allowed for the purchase of a larger shop nearby.

Commemorative plaque at the first coffee house in London, now site of The Jamaica Wine House. Shutterstock.

So entrepreneurship was key, but so were institutions that encouraged trade in coffee, the formation of coffeehouses, and the use of coffeehouses in other entrepreneurial plans.

In a recent article, “From Beans to Houses: the Entrepreneurship and Institutions of Coffeehouses,” in The Review of Austrian Economics, I expand upon these connections. The article reexamines the history of coffeehouses, how entrepreneurs helped bring coffee to our lives, and how institutions shaped that history. The main takeaway is not just that entrepreneurs served coffee to earn profits; it is that institutions — humanly devised formal and informal rules — influence whether innovation continues or becomes stifled.

Coffeehouses in sixteenth-century Istanbul and seventeenth-century London were similar in many respects, but they flourished in London. They became a source of social, political, technological, and financial innovation. Istanbul, the larger Ottoman Empire, and their coffeehouses remained stagnant. 

British institutions that generally protected private property rights and limited governmental abuses encouraged innovation. Such institutions, perhaps along with changes in social norms about commerce, bourgeois dignity, or the regard people held for entrepreneurs, encouraged coffee entrepreneurs to make a go of it, to develop coffeehouses, and to find ways to better serve and attract customers. Such innovations include coffeehouses as sources of news from incoming ships, coffeehouses as places to gather, and coffeehouses as marketplaces. 

Coffeehouses throughout the Ottoman Empire, however, rarely helped to spur the same kinds of innovations; their institutions discouraged such developments. Such institutions include the Ottoman fiscal system that left many facing uncertain taxes, Islamic charitable organizations known as awqaf, and the Janissary military corps, among others. These institutions generally encouraged entrepreneurs to serve the interests of elites, religious leaders, and Ottoman rulers, rather than customers.

These insights help us better understand these complex histories. Moreover, they help us navigate modern coffee markets and whether our institutions foster similar kinds of innovations. For example, a recent National Bureau of Economic Research (NBER) working paper shows that areas with more Starbucks also have more startup businesses. Recent tariffs over imported coffee beans (say, a 50 percent tariff on all Brazilian goods) and the uncertainties surrounding tariff schedules, however, serve to discourage such effects.

In the extreme, recent tariffs might gut local coffee shops. One story from Jessica Simmons, owner of Bethany’s Coffee Shop in Lincoln, Nebraska, indicates how. She estimates her prices have increased 18-25 percent since January: “We are at a point where we don’t have a choice but to raise prices. Our margins are thin. Small businesses are struggling with the rising costs of tariffs.” Similar stories can be told in many other places, from New York to New Mexico. These are the predictable effects of raising tariffs, especially when there are few domestic substitutes, as is the case for coffee beans.

Coffeehouses can bring us coffee, facilitate additional interaction, and continue the great enrichment, but only if our institutions reward entrepreneurship and innovation.

British satirist and cultural commentator Konstantin Kisin — author of An Immigrant’s Love Letter to the West (2022) — recently shared a debate clip from Doha, Qatar, in which he made a simple observation: Slavery has existed in every human society and across the whole of human history. 

It’s a statement so uncontroversial it should have, at most, drawn some polite nods. Instead, it provoked gasps, giggles, boos, and tut-tuts from the hostile audience.

This reaction reflects a troubling trend in modern discourse.

Rather than seriously engaging with arguments that challenge their preconceived ideas, many people have been so ravaged by ideological tribalism that they retreat into their comforting bubbles of confirmation bias.

In this case, Kisin’s point disrupts the one-dimensional narrative often presented in discussions on colonialism, slavery, and racial politics — a narrative reinforced in recent decades by figures like Ibram X. Kendi, Robin DiAngelo, and Nikole Hannah-Jones.

But slavery was the global norm for millennia and existed on every inhabited continent. 

The very word “slave” originally referred to the Slavic peoples of eastern Europe, who were frequently captured and enslaved by Vikings. 

The Arab slave trade operated for over a thousand years and likely enslaved as many as 18 million people, compared to 12 million over about 400 years for the transatlantic slave trade. It was also often far more brutal. Male slaves were routinely castrated using barbaric and unsanitary methods that led to the painful deaths of between 60 and 90 percent of them, according to historians such as Bernard Lewis, Murray Gordon, Jan Hogendorn, and Marion Johnson.

Barbary pirates from North Africa enslaved perhaps a million Europeans, in addition to millions of Africans, between the mid-1400s and the early nineteenth century, when the US destroyed much of the pirates’ capacity to capture slaves in the First and Second Barbary Wars.

Slavery was widespread across Asia as well — India, China, Japan, Korea, Thailand, and Mongolia all have significant histories of enslavement, and the practice still persists in parts of Africa, the Middle East, and China today. 

To place sole blame on Europeans, as many leftists do, betrays profound ignorance and an utter disregard for reality.

As Kisin correctly pointed out, it was in fact the West — particularly Britain and the United States — that ultimately led the global effort to abolish slavery. 

The abolitionist movement was born from the same philosophical ideas that informed America’s founding, typically coupled with the evolving religious viewpoint that all human beings deserve to be treated with dignity and grace as they are all part of God’s creation.

Given that fact, it’s unsurprising that one of the earliest organized objections to slavery came from Quakers in Germantown, Pennsylvania, in 1688. 

Four men — Francis Daniel Pastorius, Garret Hendericks, Derick op den Graeff, and Abraham op den Graeff — wrote a petition invoking the principle, “do unto others as you would have them do unto you,” challenging society’s acceptance of enslavement. Over the eighteenth century, Quakers increasingly barred slaveholders from their congregations and lobbied governments to outlaw the practice.

Enlightenment philosophers provided a broader intellectual foundation. 

John Locke’s theory of natural rights — “life, liberty, and property” — strongly influenced Thomas Jefferson and the entire structure of America’s Constitution and Bill of Rights. David Hume, Montesquieu, and Adam Smith critiqued both the morality and economics of slavery. In The Wealth of Nations, Smith argued that slavery was not only unjust but also economically inefficient: 

“The work done by slaves… comes dearer to the master than that performed by freemen.”

These ideas fueled some of history’s most remarkable anti-slavery campaigns. 

British Abolitionists such as William Wilberforce, Thomas Clarkson, and Granville Sharp wove those Enlightenment ideas and Christian teachings into effective advocacy, while firsthand accounts amplified the call for change. Former slave Olaudah Equiano documented the horrors of enslavement and the promise of liberty. He worked with Thomas Clarkson, who collected evidence of the trade’s brutality to persuade the public and Parliament. 

As Clarkson famously wrote, 

“We cannot suppose that God has made such a difference between us and them, as to intend one part of mankind to be the perpetual slaves of another.”

Thanks to their efforts, Britain abolished the slave trade across its vast empire with the Slave Trade Act in 1807.

Abolition was neither cheap nor politically expedient.

Britain spent the modern equivalent of hundreds of billions of dollars enforcing abolition, tasking the Royal Navy’s West Africa Squadron with hunting down slavers, sinking their ships, demolishing their slave ports, and freeing the men and women enslaved there. Over a period of a few decades, they seized over 1,600 slave ships and freed at least 150,000 Africans. They also paid an enormous amount of money as compensation to slaveowners under the 1833 Slavery Abolition Act, in exchange for more peacefully ending the practice where they could.

William Wilberforce, who championed abolition in Parliament for decades, finally witnessed his life’s work realized just days before his death in 1833.

Meanwhile, the United States’ founders grappled with the same moral questions. 

America’s relationship to slavery was more complicated but still deeply informed by the same Enlightenment principles. At the Constitutional Convention, slavery was one of the most divisive issues. Northern states were already moving toward abolition. Several southern states had begun phasing it out or restricting the trade. 

But even the strongest voices for abolition among the Founders, such as Benjamin Franklin and John Adams, recognized that pushing total abolition in that moment would have broken the union in its infancy, negating their hard-won fight for independence. Instead, they deployed mechanisms like the Three-Fifths Compromise — which, contrary to many people’s mistaken understanding, was a way to reduce the political power of slave states within the federal government. 

Even some of the slave-owning Founders such as George Washington (who inherited slaves from his father and his wife Martha’s family) viewed slavery as morally abhorrent, writing “I can only say that there is not a man living who wishes more sincerely than I do to see a plan adopted for the abolition of it,” in a letter to Robert Morris in 1786.

Hypocrite though he may have been, Washington also freed the 123 slaves he owned at the time of his death. It’s worth noting that this was the only example of such a large-scale emancipation in Virginia at the time.

The moral tension between human dignity and political expediency was always inescapable — and it percolated for decades. 

Abolitionist voices grew louder, from Frederick Douglass to Sojourner Truth. Eventually, that tension exploded in the form of the Civil War — the bloodiest conflict in US history. And at its conclusion, the Thirteenth Amendment finally abolished slavery nationwide in 1865.

The West’s confrontation with slavery was imperfect but principled. It was driven by ideas valuing human dignity, liberty, and moral responsibility. These efforts demonstrate that moral courage, grounded in reason and ethics, can reshape societies.

The lesson remains relevant today. Ideologically blinkered presentism has compelled a huge number of people to reduce the complex reality of these issues to a moral black and white built almost entirely on falsehoods.

But recognizing the historical scope of slavery, the complexity of its abolition, and the immense human cost involved should not be a matter of ideology; it is a matter of truth. Understanding this history equips us to engage with moral and political questions more thoughtfully and to appreciate the principles that helped dismantle one of humanity’s darkest institutions.

US Energy Secretary Chris Wright recently summed up the new federal approach: “We are unabashedly pursuing a policy of more American energy production and infrastructure, not less.” That’s a welcome shift after four years of Washington micromanaging energy markets, imposing costly regulations, and forcing unreliable sources into the grid. 

But as someone who lives near Austin, Texas, I’ve seen firsthand that avoiding federal overreach is only part of the solution. States must also resist the temptation to control energy markets — something Texas is starting to get wrong.

Texas and California are the two largest US states by population and economic output. Their energy policies have produced starkly different outcomes. 

See endnote for chart references.
  • California has some of the highest residential electricity rates in the country, averaging 31.8 cents per kilowatt-hour, compared to 15.5 cents in Texas. 
  • Texas produced 512 terawatt-hours of electricity in 2023 — more than any other state — while California generated 194 terawatt-hours.
  • California’s renewable share is 57 percent, compared to Texas’s 31 percent. 
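The rate gap above can be put in household terms with a rough sketch. The 900 kWh monthly usage figure is an illustrative assumption, not a number from the sources below:

```python
# Monthly electricity bill at the average residential rates cited
# above. Usage of 900 kWh/month is an assumed figure for illustration.

CA_RATE = 0.318    # $/kWh, California average residential rate
TX_RATE = 0.155    # $/kWh, Texas average residential rate
MONTHLY_KWH = 900  # assumed household usage

ca_bill = CA_RATE * MONTHLY_KWH
tx_bill = TX_RATE * MONTHLY_KWH
print(f"California: ${ca_bill:.2f}  Texas: ${tx_bill:.2f}  "
      f"ratio: {ca_bill / tx_bill:.2f}x")
```

At these rates, the same household pays roughly twice as much in California as in Texas.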

But being more “green” doesn’t necessarily mean better. California’s grid is fragile and prone to blackouts, overreliant on intermittent sources. Texans benefit from lower prices and higher output. Yet Texas now leads the nation in wind generation — a title earned through decades of subsidies, not pure market forces.

The numbers tell the story: Texas still delivers more energy for less money, but the policy gap is closing — in the wrong direction. Texas is in danger of losing its competitive edge by repeating California’s mistakes.

Texas Risks Becoming California

The Texas blackout during 2021’s Winter Storm Uri was a wake-up call. As energy economist Rob Bradley, policy expert Bill Peacock, and others have reported, frozen wind turbines and iced-over solar panels were unable to meet demand. Natural gas plants underperformed due to poor weatherization, and the state’s overreliance on intermittent sources magnified the crisis. 

Texas’s electricity market is not as “free” as many believe. While Texans can choose their electricity providers in a competitive retail market under the Electric Reliability Council of Texas (ERCOT), the generation side is riddled with subsidies and political interference. 

For years, Texas’s Chapter 313 property tax abatement program was heavily used by wind and solar projects, distorting investment decisions. That program expired in December 2022. In 2023, the legislature passed another version of property tax abatements in Chapter 403, and voters approved a new constitutionally dedicated Texas Energy Fund, with $5 billion in low-interest, taxpayer-funded loans for new natural gas plants. This year, the legislature passed another $5 billion in subsidized loans and could soon subsidize nuclear power.

This is still political favoritism — only the fuel source changes, leaving a flawed policy intact.

Subsidies Distort Energy Markets

Subsidies heavily distort Texas’s power generation market. Federal production tax credits and Inflation Reduction Act subsidies for wind and solar drive massive overbuilding, even though those sources could not meet demand during periods of peak stress.

Now, Texas is repeating its errors with dispatchable generation — bolstering natural gas with state-backed loans rather than allowing higher market prices to attract private investment. As AIER research has shown, markets allocate resources more efficiently when prices reflect actual costs and risks. When the government guarantees a return, investors chase subsidies rather than consumer demand.

Gabriella Hoffman, director of the Center for Energy and Conservation at the Independent Women’s Forum and a recent guest on my “Let People Prosper” show, makes this point clearly: conservation and market-based energy policy can coexist, but only if the government stops trying to pick winners. Europe’s energy crisis serves as a warning about what happens when political goals take precedence over reliability and affordability.

Energy Freedom Drives Prosperity

Energy is a foundational component of economic growth. It powers manufacturing, health care, technology, and every part of modern life. Energy policy must be grounded in cost-benefit analysis and market principles — not central planning and political favors.

Secretary Wright is correct to warn that Europe’s “net zero” strategy has made energy more expensive and less reliable. The US should not follow suit, and neither should Texas. Instead, lawmakers should remove barriers to pipelines, LNG terminals, and power plants, while eliminating taxes and fees on oil and natural gas production that fund unnecessary government growth.

Living near Austin, I see every day how energy costs shape business decisions and family budgets. People continue to move here from California for lower costs and greater opportunities. But if Texas continues to copy California’s interventionist approach, that advantage will erode.

Power for the Future

Texas’s energy sector still outproduces California’s, delivering lower prices and greater reliability. But government meddling — whether for wind, solar, gas, or nuclear — threatens to undo many of these benefits. The path forward is clear:

  • End all tax preferences and subsidies for every energy source.
  • Let ERCOT’s price signals work without political interference.
  • Approve infrastructure quickly, but without taxpayer guarantees.
  • Let consumers and suppliers, not politicians or regulators, decide the energy mix.

If Texas allows people in the market to navigate freely, the state can remain the energy leader that California isn’t. This would also ensure families and businesses continue to move here for the freedom and prosperity they can’t find elsewhere. If not, Texas could find itself with California’s prices, reliability problems, and dependence on politics to keep the lights on.

Chart references:

  • US Energy Information Administration (EIA). Net Generation by State (2023).
  • EIA – State Electricity Profiles: Texas (2023).
  • EIA – State Electricity Profiles: California (2023).
  • EIA – Electric Power Monthly, Table 5.6.A: Average Retail Price of Electricity to Ultimate Customers by End-Use Sector, by State (May 2025).
  • EIA – California State Energy Profile (includes Renewables Share).
  • EIA – Texas State Energy Profile (includes Renewables Share).

A recent report from The Heritage Foundation argues that “the wealthy” are not “idle idols” but are instead owners of and investors in wealth-creating ventures. Through their ownership of productive assets, they are the driving force behind overall wealth creation in the country and, in some cases, the world. The report reveals a crucial truth that is often lost in today’s political rhetoric: the overwhelming majority (88.2 percent) of the wealth held by the wealthiest Americans consists of assets linked directly to businesses and economic production. Despite the commonly accepted belief that millionaires hold their money in real estate or “yachts, sports cars, private planes, gold bars, and jewelry,” most of that wealth is investment, not consumption goods.

Building on this important truth, we emphasize two additional insights that may inform our current policy debates, particularly as the Trump administration seeks to expand government ownership stakes in private companies.

First, we must acknowledge that capitalism, for all its flaws in practice, is fundamentally a system that rewards serving others, not exploiting them. Look at Henry Ford: he benefited tremendously from figuring out how to mass-produce cars so that the common man could afford a vehicle. But we submit that, while he became fabulously wealthy from his innovations, the real winners of this exchange were people like you and me. Everyday Americans gained access to transportation that fundamentally transformed their lives. People like Ford already enjoyed what was then a privilege, so while he commanded more wealth as a result of his efforts, those efforts improved our lives far more than they improved his.

The people who create medical treatments and vaccines against disease also often become wealthy. But when people all around the world are spared from disease, enjoying fundamentally better and happier lives, the wealth gained by the inventors seems small by comparison. 

Or think of tech moguls like Bill Gates, Steve Jobs, and Tim Cook. By bringing computing power to the masses, they fundamentally transformed the way we all live our everyday lives.

Consider this ad for computers from 1990:

In today’s dollars, these items would cost $6,590, $2,533, and $5,829, respectively. Also in 1990, average nominal pay in the United States was $23,602, meaning the average person would have to work 220, 84, and 194 hours, respectively, to buy these items. Today, with an average wage of $36.44 ($72,880 annually), the hours worked to afford these items (at their 2025 prices) would be 181, 70, and 160.

But we wouldn’t be buying computer equipment from 1990, anyway. Computer prices have actually fallen dramatically. At the time of this writing, a comparable baseline iMac costs $1,299 (35 hours of work). The latest LaserJet printer from HP costs $169 (4.6 hours). IBM sold its computer hardware division to Lenovo in 2005, and a Lenovo desktop computer now costs $859 (23.5 hours). Even if we ignore the massive improvements in quality and the explosion of computing power contained in those devices, computing power has never been more affordable. Millions of careers were transformed by the efforts of Bill Gates, Steve Jobs, Tim Cook, and everyone else, from engineers to assembly-line workers, at Microsoft and Apple. And while the CEOs and employees of these companies have surely become wealthier, the real winners of the innovations are everyday people like you and me.
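The hours-of-work figures above are simply price divided by the hourly wage. A quick sketch reproduces the 2025 numbers (the 1990 numbers additionally require deflating prices and pay into the same year’s dollars, so they cannot be checked from the quoted figures alone; the item labels here are placeholders for the three items in the ad):

```python
# Hours of work needed to buy each item at the average 2025 wage.
# Prices are the inflation-adjusted figures quoted above; the three
# entries are placeholders for the three items in the 1990 ad.

prices_today = [6590, 2533, 5829]  # dollars, today's prices
wage_today = 36.44                 # average hourly wage ($72,880 / 2,000 hours)

hours_today = [price / wage_today for price in prices_today]
print([round(h) for h in hours_today])  # → [181, 70, 160]
```

Note that the annual figure of $72,880 implies a 2,000-hour work year at $36.44 per hour.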

In a free society characterized by capitalism, wealth is generated by serving others. Those who can best serve others — and consume less than they generate — find themselves amassing what we define as “wealth.”

The second lesson we can glean from the Heritage study is what the ultra-wealthy actually do with the wealth they amass. Nearly 90 percent of their fortunes are tied up in productive economic activity, not luxury consumption. Only three percent of the top one percent’s wealth is in consumer durables — things like cars, furniture, and jewelry. For the bottom quintile, that ratio is likely to be 15-20 percent.  

Far from being “idle rich,” the wealthy invest their fortunes, providing the capital necessary to fund increased economic activity. For the rest of us, that means more jobs, more production, and better access to the goods and services that enable us to live healthily and wealthily, however we choose to define these terms.

That investment cycle also helps explain why “eating the rich” is a recipe for disaster. Sticking the rich with exorbitant federal taxes can only mean that wealth is removed from productive economic uses to pay for public-sector malfeasance. Policymakers are not taking gold coins out of a swimming pool à la Scrooge McDuck; they’re taking investments out of the private sector. The loss of capital primarily harms not the rich but the prosperity that the rest of us have come to enjoy and depend upon.

The reality is that the wealth of the wealthiest people in America largely represents the market’s assessment of their ability to continue serving their customers in the future. As new information comes to light, this assessment can and does change. Tesla, for example, started off white-hot, with stock prices skyrocketing. But lately, after the abysmal launch of the Cybertruck and delays in its production and delivery, combined with some of Musk’s stupendously bad investments, the market has revised its assessment of Tesla downward.  As a result, Musk has lost more than $80 billion in wealth thus far in 2025 alone.

This brings us to a troubling development: President Trump, Congressional Republicans, and members of the so-called New Right have recently floated the idea that we should tax the rich more. Even more alarmingly, these same people hold that the federal government should take equity stakes in private companies. This is a fundamental departure from the principles that allowed for the creation of the wealth policymakers now wish to strip away, and a complete rejection of lower-tax, small-government Republicanism.

President Trump is “taking a 10 percent stake in Intel,” making the federal government the company’s single largest shareholder. Earlier this year, the sale of US Steel to Nippon Steel was approved, contingent on the US government receiving a “golden share.” While Trump is in office, this golden share is held by the President (i.e., Donald Trump); after he leaves office, it will revert to the Treasury and Commerce Departments. Importantly, while he is in office, the President will have veto power over some production and wage decisions. Not wishing to be left behind, the Pentagon is taking a 15 percent stake in MP Materials, a producer of rare-earth magnets, among other things.

All of this shifts the nation away from the capitalism that created an economy (and indeed, society) the likes of which has never been seen in human history and toward the type of capitalism found in, say, China. Trying to “out-China” China is a fool’s errand.

The reality is that economies, societies, and the nation itself are best served when individual people are given the freedom and tools to succeed, not when government bureaucrats pick winners and losers. In a free-market, capitalist system like the one the US for the most part enjoys, the best way to serve oneself is by serving others.

Federal Reserve Chair Jerome Powell announced changes to the Fed’s monetary policy framework in his remarks at the annual Jackson Hole Economic Policy Symposium. Most notably, the Fed has decided to scrap its controversial Flexible Average Inflation Targeting (FAIT) regime in favor of a Flexible Inflation Targeting (FIT) regime. FAIT was adopted in August 2020, when the Fed last revised its framework; it replaced an earlier FIT regime. Out with the old, in with the older.

The Fed’s monetary policy framework serves as the central bank’s operational blueprint: a set of principles and guidelines that govern how monetary policymakers respond to economic conditions and communicate their decisions to the public. Think of the policy framework as a (non-binding) monetary constitution for making interest rate decisions and signaling intentions to markets.

Central to this framework is the dual mandate from Congress: maintaining price stability and achieving maximum employment. For price stability, policymakers target 2 percent inflation as measured by the Personal Consumption Expenditures Price Index (PCEPI). Since maximum employment cannot be directly observed, the central bank aims for conditions that support the broadest possible participation in the labor market that is consistent with price stability.

Since some fluctuation in inflation may be desirable (e.g., following supply disruptions), the Fed has opted for a flexible rather than strict inflation target. With a strict inflation target, the monetary authority indicates it will attempt to deliver 2-percent inflation regardless of the circumstances. With a flexible inflation target, policymakers indicate they will take the circumstances into account. For example, they might look through supply disruptions they expect will cause the rate of inflation to rise temporarily. In other words, the flexibility of FIT gives the Fed some discretion, which they believe will result in better monetary policy.

In its last monetary policy framework review, which concluded in August 2020, the Fed adopted FAIT. At the time, Fed officials were concerned that inflation was too low. Inflation had run persistently below 2 percent since the Fed officially adopted the target in 2012, and it remained below target even after the Fed clarified in its 2016 revisions that the target was symmetric—i.e., that it would be just as likely to overshoot its target as to undershoot it. With the 2020 move to FAIT, Fed officials committed to let inflation rise above 2 percent for a time following periods where inflation had fallen below 2 percent, in order to ensure inflation averaged 2 percent over time. They believed committing to a make-up policy would help anchor expectations on the target, and in doing so, make that target easier to hit.
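The arithmetic behind a make-up policy is simple, and a small sketch makes the contrast with a bygones approach concrete. This is a hypothetical illustration with made-up inflation readings, not the Fed's actual reaction function:

```python
# Hypothetical sketch: a make-up (averaging) rule vs. a bygones rule.
# Under a make-up rule, next period's target compensates for past misses
# so that inflation averages TARGET over the window.

TARGET = 2.0  # percent

def bygones_target(past_inflation):
    """Flexible inflation targeting: past misses are ignored."""
    return TARGET

def makeup_target(past_inflation, window=3):
    """Average inflation targeting: aim so the last `window` periods
    (including the upcoming one) average TARGET."""
    past = past_inflation[-(window - 1):]
    return TARGET * window - sum(past)

# Suppose inflation ran below target for two years: 1.5 and 1.7 percent.
history = [1.5, 1.7]
print(bygones_target(history))           # 2.0 -- let bygones be bygones
print(round(makeup_target(history), 1))  # 2.8 -- overshoot to hit the average
```

The overshoot of 2.8 percent is exactly what makes the window average out to 2 percent: (1.5 + 1.7 + 2.8) / 3 = 2.0.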

Although the Fed did not indicate how it would respond if inflation were to rise above 2 percent in its official Statement on Longer-Run Goals and Monetary Policy Strategy, statements from Fed officials made clear that the FAIT framework was asymmetric: the Fed would only make up for below-target inflation, not above-target inflation. At the time, no one was worried about high inflation. Inflation had been very low for more than a decade. Correspondingly, there was no concern that inflation expectations might rise above target.

The FAIT framework became outdated almost immediately. Inflation climbed above 2 percent in early 2021 and would not reach a peak until mid-2022. Any ambiguity related to the Fed’s asymmetric make-up policy was resolved. Powell clearly stated that the Fed had no intention of delivering inflation below 2 percent for a period, to ensure that inflation would average 2 percent. Rather, the Fed would merely bring inflation back down to 2 percent. 

Many market watchers and economists were surprised to learn that FAIT was asymmetric, especially given the Fed’s insistence that FAIT would anchor expectations at target. Why would one expect inflation to average 2 percent if the Fed only intended to make up for periods where inflation fell below 2 percent? Since such a policy would tend to deliver more than 2 percent inflation, market participants would come to expect more than 2 percent inflation. And they did. Inflation expectations implied by bond prices have exceeded the Fed’s target in all but two months since the Fed adopted FAIT.

The newly revised framework removes the Fed’s commitment to make up for past mistakes, essentially marking a return to the pre-2020 framework. Fed officials believe this policy will be easier to communicate to the public. For one, they will not have to explain why they will not let inflation fall below 2 percent, as would be required to ensure inflation averages 2 percent over time. Instead, they will be able to let bygones be bygones and aim at 2 percent on a go-forward basis.

A Missed Opportunity for Real Reform

The Fed’s return to the pre-2020 framework is disappointing. Fed officials could have used the opportunity to introduce a symmetric average inflation target or nominal income level target, both of which would tend to ensure that inflation averages 2 percent over time. Such a regime would have helped the Fed prevent inflation from rising so high in 2021 and 2022.

Throughout 2021, central bank officials generally believed inflation had risen due to supply disruptions associated with COVID-19 policies and the corresponding restrictions on economic activity. On its own, this negative supply shock would cause the level of prices to rise temporarily above trend, and then return to trend once those constraints eased.

The economy had been hit by a negative supply shock, to be sure. But it also suffered from a positive demand shock. Indeed, the supply shock had largely reversed by September 2021—and, still, inflation climbed higher. Rather than returning to trend, prices grew faster.

Had the Fed been targeting nominal income, misidentifying the shock would have been of little consequence. The positive demand shock would have pushed nominal spending above target, forcing the Fed to contract.

Had the Fed committed to a symmetric average inflation target, it is unlikely that officials would have waited so long to contract. A symmetric average inflation target would have required the Fed to make up for above-target inflation. The further inflation rose above target, the more the Fed would eventually have had to contract. To avoid a large contraction, Fed officials would likely have begun tightening much sooner.

In both cases, it is relatively straightforward to communicate the policy—certainly easier than trying to explain a confusing asymmetric makeup policy.

Instead of introducing a new framework, the Fed has returned to the familiar. But the FIT approach has known problems. It does not anchor inflation expectations very well. And it does not discourage the Fed from responding to supply shocks. A symmetric average inflation target or nominal income level target would have been an improvement on these margins. Instead, we got old wine in new bottles. Expect the next major economic disruption to leave a sour taste in your mouth.

New data from the Bureau of Economic Analysis show that inflation ticked down in July. The Personal Consumption Expenditures Price Index (PCEPI), which is the Federal Reserve’s preferred measure of inflation, grew at an annualized rate of 2.4 percent in July, down from 3.5 percent in the prior month. It has averaged 2.5 percent over the last six months and 2.6 percent over the last year.
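The "annualized rate" cited above converts a one-month change in the price index into the rate it would imply if compounded over a full year. A quick sketch of that conversion, using an illustrative monthly figure rather than the BEA's published data:

```python
# Annualizing a one-month percent change: compound the monthly growth
# factor over 12 months, then convert back to a percent.

def annualize_monthly(pct_change_1m):
    """Annualized rate (percent) implied by a one-month percent change."""
    return ((1 + pct_change_1m / 100) ** 12 - 1) * 100

# A 0.2 percent month-over-month rise in the price index annualizes to
# roughly 2.4 percent.
print(round(annualize_monthly(0.2), 1))  # 2.4
```

Note that compounding matters: multiplying the monthly change by 12 would understate the annualized rate slightly.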

Core inflation, which excludes volatile food and energy prices but also places more weight on housing services prices, edged up. According to the BEA, core PCEPI grew 3.3 percent in July, up from 3.2 percent in June. It has averaged 3.0 percent over the last six months and 2.9 percent over the last year.

Figure 1. Headline and core PCEPI inflation, July 2015 to July 2025

The decline in inflation is even larger when imputed prices are excluded. Market-based PCE, which is a supplemental measure offered by the BEA based on household expenditures for which there are observable prices, grew at an annualized rate of just 1.1 percent in July, down from 4.1 percent in the prior month. It has averaged 2.2 percent over the last six months and 2.3 percent over the last year.

Market-based core PCE, which removes food and energy prices in addition to most imputed prices, grew 2.1 percent in July after having grown 3.8 percent in June. It has averaged 2.8 percent over the last six months and 2.6 percent over the last year.

Taken together, the latest release reaffirms the view that inflation is on a path back to the Fed’s 2-percent target—and that the Fed should start cutting its federal funds rate target. 

Fed Governor Christopher Waller, who was one of two dissenting votes at the Federal Open Market Committee’s July meeting, has been making the case for rate cuts. Speaking to the Economic Club of Miami on Thursday, he said the more recent “economic data have reinforced” his “view of the outlook and my judgment that the time has come to ease monetary policy and move it to a more neutral stance.”

Federal Reserve Chair Jerome Powell, in contrast, has preferred a wait-and-see approach. However, in his remarks at the annual Jackson Hole Economic Symposium, he appeared to suggest a rate cut could be coming soon. He still believes the “risks to inflation are tilted to the upside,” but his “baseline outlook and the shifting balance of risks may warrant adjusting our policy stance.”

Markets certainly expect the Fed to cut its federal funds rate target soon. The CME Group currently puts the implied odds of a 25 basis point rate cut in September at 87.1 percent, up from 63.3 percent one month ago.

Interestingly, markets also suggest a September cut could be the first in a series of cuts. According to the CME Group, there is currently a 46.5 percent chance that the federal funds rate target is 50 basis points lower following the October meeting and a 38.8 percent chance that it is 75 basis points lower following the December meeting. That would put the federal funds rate between 3.5 and 3.75 percent by the end of the year.
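The year-end range follows from straightforward basis-point arithmetic. A short sketch, assuming the current target range is 4.25 to 4.50 percent (an assumption consistent with the article's year-end figures):

```python
# Basis-point arithmetic: 1 basis point (bp) = 0.01 percentage points.
# Assumes a current target range of 4.25-4.50 percent, consistent with
# the year-end figures cited in the text.

def shift_range(low, high, bps):
    """Shift a target range (in percent) down by a number of basis points."""
    delta = bps / 100  # basis points -> percentage points
    return (low - delta, high - delta)

print(shift_range(4.25, 4.50, 75))  # (3.5, 3.75)
```

Three 25-basis-point cuts (September, October, December) would thus bring the range from 4.25–4.50 percent to 3.50–3.75 percent.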

Waller appeared to endorse a series of rate cuts in Miami:

I believe the data on economic activity, the labor market, and inflation support moving policy toward a neutral setting. Based on the median of FOMC participants’ estimates of the longer-run value of the federal funds rate, neutral is 125 to 150 basis points lower than the current setting. While I believe we should have cut in July, I am still hopeful that easing monetary policy at our next meeting can keep the labor market from deteriorating while returning inflation to the FOMC’s goal of 2 percent. So, let’s get on with it.

He said “there is a growing consensus that monetary policy needs to be more accommodative, and even some recognition that it would have been wise to begin this process in July.” Waller said he anticipates “additional cuts over the next three to six months, and the pace of rate cuts will be driven by the incoming data.”

Whether other Fed officials have come around to Waller’s view or continue to believe a slow path back toward neutral is preferable remains to be seen. But the tide is turning.