
In May 1938, in the ninth year of the Great Depression, a minister in Columbia County, Pennsylvania, cast about for a sermon topic. In the past the minister, C.R. Ness, had spoken to the members of North Berwick Evangelical Church on a variety of themes: Paul’s Letter to the Philippians, the healing power of Jesus. He counseled families on their private lives, selecting topics such as “The Devotional Life of Men” or “Traps for Young People.” This time, however, Reverend Ness chose the Book of Job.

And small wonder. By 1938, Columbia County felt like Job. The Book opens with Job at his prime, so righteous and prosperous he is known as “The Greatest Man of the East.” But God allows the Devil to play with Job and test him with trial upon trial. One after another, Job’s fortune, his family, and his health are taken away. Since the beginning of the decade, Columbia County had endured its own series of trials: joblessness, shortages of cash, mortgage foreclosures, meals of only stewed tomatoes, or no meals at all. The siege was gradually wearing away citizens’ confidence in their own future.

“Why Do the Godly Suffer?” Ness asked his flock. Was God testing Columbia County’s faith? “This will be a very timely series,” commented the editors of the local paper, the Berwick Enterprise. Job’s ancient travail, and his sustained faith, would console people now, “in these days of depression.”

Nor were Ness’s parishioners alone in their sense of desperation. In nearby Allentown, cash-short and underemployed citizens had, two years before, marched to the state capital and occupied the Senate gallery. Philadelphia, also nearby, had once been known as the Workshop of the World. No longer. In the early years of the Depression, hungry families there had revolted. Now two in ten Philadelphians still sought work.

Such troubles beset every region, and with such severity that Americans everywhere felt a sense of biblical retribution. Drought had plagued Western states for many seasons, beggaring farmers. Black walls of dust, some a mile high, descended on farm towns, destroying crops and farms and recalling the punishments in Job: “a great wind came from the wilderness, and smote the four corners of the house.” It was the Book of Exodus that crossed the mind of a columnist when another plague, grasshoppers, covered fields, devouring harvests. “Grasshoppers are locusts,” as in the plagues suffered by Pharaoh, wrote the columnist. The grasshoppers’ arrival was “the American equivalent of the Biblical plague that smote Egypt.”1 Later, the folksinger Woody Guthrie would recall that many believed they were suffering as the Egyptians had: “We watched the dust storm coming up like the Red Sea.”2

In the spring of 1938, when Ness sermonized on Job, unemployment in the building trades averaged more than 40 percent. Nationwide, one in five workers found themselves still jobless, or jobless yet again. It seemed that God was indeed smiting the land. Afterward, in the 1940s and 1950s, recalling their benumbed state, many Americans still treated the decade-long downturn as a divine test or, more often, a great mystery, better left unexplored.

The eight mostly prosperous decades that have intervened since have only deepened the aura of mystery around the Depression. That is partly because the facts about the Depression lie outside the range of our experience. Today even the thought of unemployment greater than 10 percent spooks us; joblessness stayed stubbornly above the 10 percent line throughout the 1930s. Today we consider a rising stock market our national birthright. After the initial crash from a market high of 381 in 1929, the market stayed low for more than a generation, attaining its 1929 level only in the 1950s.

Further befogging the record have been those most equipped to illuminate it: scholars. Largely out of fealty to the president who led the country through the Depression, Franklin Roosevelt, many historians have been unwilling to probe the effect of Roosevelt’s multi-year recovery program, the New Deal. Economists, especially economics professors, will readily blame the rough post-crash period on the Federal Reserve’s failure to supply sufficient liquidity: money.

But with a few exceptions, the economics trade neglects the obvious next question: What about the years that followed? Why did recovery not return after five years, or after seven? It was not, after all, a deflation shock, however sharp, that converted the initial depression, lower case, into a Great Depression, with capital letters. It was the duration that made the Depression great. Whether monetarist or Keynesian, economists respond to commonsense queries about the later years with a single line: “That is complicated.” It is as if a sign has been placed over the period to intimidate the curious: “Here Be Dragons.”

Still, the duration of the Depression matters. These days, politicians routinely invoke the New Deal as a model and an inspiration, without delving into the evidence of its effect. They do so even though the New Deal never, not even eight years in, met Roosevelt’s primary goal to “put America back to work.”

In reality, recovery’s absence in the 1930s is not so mysterious. Natural disasters contributed to the plight of the farms. The drought of the 1930s was unusually severe, the summers unusually hot. And the “mighty wind” of the Dust Bowl resulted from a man-made eco-disaster: the overplowing of tens of thousands of acres.

The absence of a general recovery also can be explained. Recoveries, after all, are like people. They make choices. In each year of the 1930s, the recovery surveyed the economic landscape and opted to stay away a while longer. Simple facts go a good way toward explaining why, each year and for slightly different reasons, the recovery absented itself. The same facts reveal the dangers of faith — not religious faith, but the political kind.

This story starts with the market crashes before 1929. Such crashes occurred with regularity as the nineteenth century closed and the twentieth began. In those years, our laws did not task the federal government with managing the economy. When the economy stumbled, the economy had to right itself. The private sector, the belief was, would and should take the lead. Congress limited the federal government’s assignment to budget discipline that would keep the currency stable. Later, Congress created a new institution to aid in the management of money, the Federal Reserve. Most relief work, however, was the work of the states.

In such a situation, the market determines how far prices fall.  Even the price of labor. Nor is this necessarily evil. When businesses are less profitable, employers have fewer funds for wages. The obvious move for employers is to reduce wages. Most workers prefer lower pay to losing their jobs altogether.

In the early 1920s, as James Grant has shown, Washington and the young Fed addressed a severe downturn by halving federal spending and raising interest rates. These moves would today be considered counterintuitive, to put it politely. In the same period, a new president, Warren Harding, sent a signal: there was no need for grand reform from the government, despite the downturn. Grand reforms might impede recovery; what business needed was the assurance that giant changes from the government would not be forthcoming. “No altered system will work a miracle,” said Harding. “Any wild experiment will only add to the confusion. Our best assurance lies in efficient administration of our proven system.” Assailing the heavy burden of postwar taxes, Harding made it clear to the public that he intended to reduce taxes wherever and whenever he could. Fewer burdens would free the private sector to pull the country forward.

It did. Indeed, the economy recovered so rapidly that the early 1920s downturn is today known as the Forgotten Depression. Stock prices rose dramatically, more than tripling over the decade. Jobs materialized, and, most important, the standard of living increased. Productivity gains meant the old six-day work week could drop to five days. That gave America a gift we still enjoy: Saturday. Perhaps the best symbol of the general 1920s acceleration came from Henry Ford’s production line. At the beginning of the decade the standard car, the Model T, could reach the impressive speed of 40 miles per hour. But the Model A, sold from 1927, moved at 55 or 60 miles per hour.

When the Dow Jones Industrial Average accelerated, moving up by more than half within the course of a year, citizens of course expected some kind of market crash. The fall of 1929 brought that crash. Beyond inflated share prices, other factors, well documented by the extensive studies of the early years, exacerbated the subsequent downturn: the young Fed’s missteps, an international crisis, the collapse of vulnerable small banks across the land. Congress passed, and President Herbert Hoover went along with, a damaging tariff, Smoot-Hawley. Hoover’s tolerance of tariffs was particularly regrettable since Hoover knew better: his insights on the costs of the perverse policies at Versailles were so trenchant they won praise from even that most judgmental of colleagues, John Maynard Keynes. The 1929 plunge of the Dow was dramatic, but from November, the market began to move up smartly, from a low of 199 to 294 in April of 1930. The three preceding downturns had averaged 15 months in duration.3 Citizens therefore reasoned, just as they might today: “All things being equal, the recovery will come soon.”

But this time, all things were not equal. For Hoover, unlike Harding or his successor Calvin Coolidge, was inclined to action. Indeed, Hoover’s very reputation as a rescuer was what had brought this nonpolitician, engineer, and investor to the Presidency. Hoover had made his name in World War I by organizing and seeing through a program to feed starving Belgians behind enemy lines. Further fame had come to Hoover when he directed a program to feed revolutionary Russia. Now Hoover ached to mount the most glorious rescue of all, the rescue of the American economy.

Hoover therefore turned to measures his “do less” predecessors would have eschewed. Rather than allowing the financial markets to bottom out, Hoover tried to stop their drop personally, railing against short sellers, whom he characterized pejoratively as those who conduct “raids on our markets with purpose to profit.” Hoover loaded burdens on business with a large tax hike, raising the top income tax rate to 63 percent from 25 percent. Even Hoover’s smaller interventions today look perverse: at a time when transactions were difficult, Hoover threw sand in the gears by introducing a tax on checks. As the President’s machinations proceeded, the banking crisis deepened, and the Dow plunged again.

Not content with meddling in markets, Hoover likewise tried to manage prices in another area: labor. The economy wasn’t producing enough. Under a then-novel theory, higher wages would prompt recovery because they would invigorate workers and enable them to spend more, stimulating the economy. Production, Hoover said, “depends upon a widening range of consumption only to be obtained from the purchasing power of high real wages.” As Lee Ohanian of UCLA has noted, through a combination of suasion and brute pressure, Hoover drove employers to raise the average real manufacturing wage by 10 percent.4

In 1931, Hoover and Congress codified their wage drive with passage of the Davis-Bacon Act. The official purpose of Davis-Bacon was to boost the economy through federal spending on construction projects in the states. But the law also mandated that builders pay “prevailing wages,” which translated to higher wages, for Washington and unions had a hand in setting levels. Strapped firms had to comply if they wanted the contracts, but could hardly afford the wages. So the firms cut worker hours dramatically, hired fewer workers, or rehired more slowly. Bank failures did their damage. So did scarce credit. Still, the pressure on wages mattered more than the history books convey. By 1932 joblessness hit 25 percent. The same year, the Dow Jones Industrial Average plunged further, dropping to 41.2 in July 1932, or almost 90 percent off its 1929 high.

Such stunning numbers, yet more stunning because they came years into the downturn, struck many Americans dumb. Others sought to reverse their troubles through protest. In 1932, some 25,000 veterans converged on Washington to demand the government advance their “bonus,” a pension package set to be paid in 1945. Congress deadlocked, and for months the vets camped off Capitol Hill. Eventually, the President sent in federal troops, who cleared the camps and torched the veterans’ makeshift shelters. One of the presidential candidates that year, Franklin Roosevelt of New York, occasionally made conservative speeches, vowing to cut federal spending. Roosevelt likewise promised to help the worker, whom he characterized as “the forgotten man at the bottom of the economic pyramid.” But in other speeches, Roosevelt, like a politician from our own day, tried out class-war rhetoric, assailing “princes of property.”

Though Roosevelt hardly made clear which policies he would promulgate, voters in 1932 opted for change and elected him. Over the winter of 1932-1933, rounds of bank failures amplified national anxiety. Like Hoover before him, the new president rejected Harding’s “no change” rule. Indeed, Roosevelt promised the opposite: “bold, persistent experimentation,” action for action’s sake, so much action that Roosevelt made Hoover look tame.

By the spring of 1933, the normally can-do American public was ready to go along with new interventions. Americans went along with, even savored, Roosevelt’s growing habit of scapegoating business. They also accepted the notion that Roosevelt’s experts, a clutch of professors quickly nicknamed the Brains Trust, knew better than they. Roosevelt was presented as a kind of political pastor, and Americans warmed to that, too. Today presidents who speak through new media gain special popularity: think of Donald Trump on X. Then, a new medium also worked magic: the radio. As Americans sat in their living rooms, Roosevelt’s disembodied voice reassured them: “The only thing we have to fear is fear itself.” Soon Roosevelt was delivering routine talks to the American population, his Fireside Chats. “Even if what he does is wrong, they are with him,” commented the humorist Will Rogers of the electorate and Roosevelt at inauguration time. “If he burned down the Capitol we would cheer and say, ‘Well, we at least got a fire started, anyhow.’”

The dire situation offered Roosevelt a license available to none of his predecessors, at least not in peacetime: the license to direct the entire economy, from monetary policy to Wall Street, industry, agriculture, and even then-new industries, such as utilities. With his New Deal, the President claimed that license. In the famous 100 Days, his first legislative drive, Roosevelt established dozens of large programs to oversee or alter virtually every sector of the economy. 

The National Recovery Administration (NRA), tasked with managing industry, became the centerpiece of the New Deal. It was no accident that the President selected a brigadier general, Hugh Johnson, to lead the NRA, and a blue eagle as its symbol. The NRA was to be a kind of military campaign, demanding suspension of disbelief. “Do not trifle with that bird,” warned Johnson.

Under statutes bearing visible traces of Benito Mussolini’s syndicalism, the NRA assigned large firms and industry leaders to draft codes to promote efficiency in their markets. These codes spelled out, in magnificent detail, every aspect of daily business, right down to what price a cleaner might charge to press pants or which chicken a butcher must kill first. One theory the NRA applied was a cartoon version of Henry Ford’s assembly line. Consumer choice at the counter slowed down commerce, the theory ran, and, the codes’ authors maintained, slowed recovery. Fewer choices would accelerate the rate of transactions. Today, businesses make their money precisely because they offer consumers options. Starbucks comes to mind. Under the New Deal, such choice was suppressed.

Another principle embedded in the NRA was that wages and prices must stay high or move higher. All codes, as Lee Ohanian reports, set a minimum wage for lower-skilled workers, and most set wages for higher-skilled workers as well.5 Thus did the New Deal scale up Hoover’s higher-wage policy. That smaller businesses might be disadvantaged by codes crafted by industry giants, few dared discuss.

One who did was a tiremaker from Newark, Ohio, Carl Pharis. Pharis detailed his situation in a letter to Senator William Borah. Pharis Tire and Rubber was a small firm, which, through great discipline, held market share by producing “the best possible rubber tire” and selling it at “the lowest price consistent with a modest but safe profit.” The NRA code imposed a price floor on tires, forcing the prices of Pharis tires up to the levels of the tire giants. Under this system, even a firm loyal to the New Deal like Pharis’s would fail. “We are surely on our way to ruin,” Pharis told Borah.

The absurdity of these methods went overlooked, however, in part because they were put forth by great minds. This, even though the Brains Trusters displayed little awareness of how a business such as Pharis Tire and Rubber operated on the ground, or what roles consumers and sellers, individuals, might play in the marketplace. The most perspicacious of Roosevelt’s advisors, Raymond Moley, captured the myopic credentialism of his fellow Brains Truster, Felix Frankfurter:

The problems of economic life were [to Frankfurter] matters to be settled in a law office, a court room, or around a labor-management bargaining table. The government was the protagonist. Its agents were its lawyers and commissioners. The antagonists were big corporate lawyers. In the background were misty principals whom Frankfurter never really knew at first hand…These background figures were owners of the corporations, managers, workers and consumers.

Those who dared to violate NRA rules confronted criminal charges and jail time: the NRA’s fathers had given their statute teeth. One company the Justice Department indicted was a small wholesale kosher butcher firm in Brooklyn, New York, Schechter Brothers. A third of their industry had already failed. To survive, Schechter Poultry kept wages lower than the poultry code mandated. Paying wages that were too low was one of the many charges the Justice Department brought against them. In the Schechters’ culture, customers liked to pick their own chicken, as they had in the old live poultry markets of Europe. Choice was the Schechters’ market advantage over their new antagonist, the supermarkets. The poultry code held that the butchers must hand the buyer the first chicken that came into their hands. The Schechters feared, legitimately enough, that denying buyers this option would lose them business, and so they allowed customers to continue choosing. That emerged as another count against them.

To win in public or in court, the Roosevelt Administration routinely resorted to intellectual bullying. In the Schechter case a federal prosecutor, Walter Lyman Rice, listened as one chicken dealer, Louis Spatz, explained that he liked to set prices low to win customers, “same as any other business.” This common sense the prosecution treated as primitive ignorance. An exchange between Rice, a graduate of Harvard Law, on the one hand, and Spatz, an immigrant with little English, on the other, captures the confidence with which the New Deal pulled rank.

Rice: You are not an expert. 

Spatz: I am experienced, but not an expert…

Rice: You have not studied agricultural economics….

Rice: Or any sort of economics?

Rice: What is your education?

Spatz: None, very little.

The Supreme Court put the bully in its place. In its 1935 opinion on Schechter, the justices unanimously struck down the entire NRA. The High Court was even confident enough to pun about its decision: the NRA must go, “bone and sinew,” as one justice put it. But in the meantime, the Administration tugged and shoved at the demand curve for other industries. Hunger was still common across the land. Yet the NRA’s corollary agency in agriculture, the Agricultural Adjustment Administration, both forced and paid farmers to destroy their crops, again on the principle that less product would drive up prices. No potato farmer could produce more than five bushels of potatoes without a special permit. “Despite the millions of times those cabalistic letters have appeared in print,” wrote a columnist in 1935 of the AAA, “not all understand fully what they mean.” In Maine, it is said, some authorities even required that farmers pour poisonous blue dye on the extra potatoes, to ensure they were unsalable and inedible. In Texas, mules long trained to step over delicate cotton rows were now driven over those plants to destroy them. The mules balked, as the papers reported.6

Farmers hardly saw the logic in the crop-reduction scheme, but, cash-short as they were, they found themselves reluctant to turn down the generous payments. In Texas, where Lyndon Johnson worked for the New Deal, farmers in 1934 received some $50 million for complying with the production cutback.

Consumers were franker. After six million piglets were slaughtered in the name of nudging up prices, a disconcerted housewife wrote to the Agriculture Secretary, Henry Wallace: “It just makes me sick all over to think how the government has killed millions of millions of little pigs.” As it happened, the price increase that followed was too violent, raising “pork prices until today we poor people cannot even look at a piece of bacon.”

The extent to which Roosevelt relished disruptive experiments becomes clear in accounts of his efforts to devalue the currency. After committing to the gold standard, Roosevelt abruptly reversed course and, surprising even his advisors, announced in April 1933 that the U.S. was leaving the standard. The move, an effective surprise devaluation, rocked other nations, many likewise enduring downturns. American central bankers dispatched to the London Conference a few weeks later considered their main work to be restoring comity among rattled foreign governments. But Roosevelt undermined his negotiators by lobbing another bomb, calling plans for stabilization a “specious fallacy.” Returning home, George Harrison of the New York Fed told others that he “felt as if he had been kicked in the face by a mule.”7

When cutting the dollar from gold did not keep commodity prices at the levels he sought, Roosevelt undertook to do the job himself. His method was direct purchases of gold on the open market. Even for an institution as large as the United States government, this was a puzzling move, the equivalent of trying to raise the level of the ocean by dropping in water with a thimble. One by one, Roosevelt drove from the Administration those alert to his fallacies. “We are entering on waters for which I have no charts,” a financial advisor to Roosevelt, James Warburg, said in his resignation letter. As Liaquat Ahamed reports in Lords of Finance, Britain’s central banker, Montagu Norman, reacted by saying, “This is the most terrible thing that has happened.” Even the truest of Roosevelt loyalists, Henry Morgenthau, found himself confused by the fashion in which Roosevelt set his gold purchase prices: from his bed, in the morning, after breakfast.

One morning, as Morgenthau recorded in his diary, FDR ordered the gold price up 21 cents. Why 21? Morgenthau asked. Because 21 was three times seven, Roosevelt replied, and three was a lucky number. “If anybody really knew how we set the gold price,” Morgenthau recorded, “they really would be frightened.”

Markets were. The Dow Jones Industrial Average slipped down again. This is not to say that the private sector gave up in 1933 and 1934. In the spirit of a then-popular children’s book, The Little Engine That Could, companies responded to their obstacles by simply pushing harder. As Alexander Field has shown in his book A Great Leap Forward, engineers and scientists at private companies strained mightily and produced innovations that enabled companies to do more with less: diesel replacing the steam engine in railroading, for example.8 A road network planned by the states in the 1920s was in the process of construction, making it easier for firms to ship goods around the country. But these efforts proceeded under duress, against the obstacles of new regulations and random interventions.

In every downturn, certain industries continue to grow and have the capacity to serve as locomotives of recovery; the energy sector after the financial crisis of 2008 is a good example. The industry with that magic potential in the 1930s was electricity. In each year of the Depression—except 1933—American households used more electricity than before.9 Utilities such as Commonwealth and Southern were commencing the expensive work of wiring even hard-to-reach rural areas.

Now the federal government entered this market. Its start was the establishment of the Tennessee Valley Authority, or TVA, which aimed to serve the South through hydropower. Soon after came the Rural Electrification Administration, or REA, to fund the wiring of farms. At first, power companies told themselves and their shareholders they could work with the government. REA officials met with the companies, which assumed the funding would subsidize their work laying lines. The press interpreted these events with similar optimism: “As 95% of the electric industry in this country is in the hands of private operating companies,” wrote the New York Times, “the administration’s public funds will be dispensed in that proportion.”10 Instead, the REA bypassed the companies and funded local cooperatives, new and willing to follow REA’s direction. For a while, competition ensued; when the REA’s co-ops threw up lines on a rural road, a private company would quickly lay its own, to stake out its territory. New Dealers smeared these efforts as “spite lines.”11 Over time, it became clear that the REA and TVA together were not helping companies like Commonwealth and Southern. Rather, they were squeezing them out of the market in a pincer action.

After the pincers came the hammer. The Administration saw through the passage of a law governing the capital-intensive utilities industry: the 1935 Public Utility Holding Company Act, or PUHCA. The private companies labeled the law “a death sentence” because PUHCA so constrained their ability to raise capital that they genuinely could not compete.

Even those industries not scapegoated suffered under New Deal tax policy. Though Hoover had raised taxes, Roosevelt boosted them yet again, specifically targeting those who were most likely to create jobs through investment: top earners. On a theory we would call Keynesian, though John Maynard Keynes was just beginning to elucidate it, the Administration wanted businesses to overcome their prudent tendency to save in hard times and distribute their profits instantly to stimulate the economy. In 1936, Roosevelt set out to force businesses to do so by seeing through the passage of an undistributed profits tax of up to 27 percent. The levy came on top of the (also increased) corporate income tax.

The New Dealers practiced an early iteration of what we now call lawfare, selecting and prosecuting political opponents and stars of industry for tax evasion. The Administration picked the symbol of 1920s prosperity, the former Treasury Secretary Andrew Mellon, for one show trial. The initial decision to order an audit had been made by Treasury Secretary Morgenthau over the protest of Elmer Irey, head of the Bureau of Internal Revenue’s intelligence unit, whose review of Mellon’s returns had not alarmed him. Irey’s own experience also probably told him that Mellon, one of the authors of the tax code, was unlikely to claim anything other than legal deductions.

But President Roosevelt made it clear he would not honor American law’s traditional distinction between tax avoidance, meaning legal deductions, and tax evasion, meaning illegal moves. Over time it emerged that politics, not logic, drove the officials: “You can’t be too tough in this trial to suit me,” Morgenthau told the prosecutor, the future Supreme Court justice Robert Jackson. “I consider that Mr. Mellon is not on trial but Democracy and the privileged rich and I want to see who will win.”12 Mellon was largely exonerated, but only after years: he spent his eightieth birthday in a Manhattan courtroom. By the end, even liberal columnists such as Walter Lippmann reckoned the Mellon prosecution a “profound injustice.”

Some businesses dared to pipe up, suggesting that higher tax rates created a moral hazard of their own. “When you outrage a citizen’s sense of equity,” warned a spokesman for the Ford Motor Company, “you school him in evasion.” A Harvard friend of the President’s, Alexander Forbes, wrote to the President to argue that the causes for which rich men took tax deductions, charities such as university research, often did better than the federal jobs programs the Administration now funded. The latter were mere temporary jobs, “boondoggle.” Roosevelt replied by impugning Forbes’s morals; in a national emergency, the righteous move was to pay even those taxes not required by the tax code. Roosevelt, now well at home in his role as minister to the national flock, did not hesitate to chide or shame. To a lawyer who asked about the courts’ treatment of tax avoidance, FDR would write in 1937: “Ask yourself what Christ would say about the American Bar and Bench were he to return today.”13

The seasons passed, and the New Deal created millions of jobs through programs such as the Works Progress Administration and the Public Works Administration. Those who claimed one of these jobs, or who worked in the Civilian Conservation Corps, clearing forest, say, were truly glad of the employment. But even millions of make-work jobs, and even the billions spent on these posts, did not bring joblessness down to what Americans considered tenable levels. As the election year 1936 dawned, unemployment remained close to two in ten.

Yet that year, the nation gave President Roosevelt a second term in a resounding landslide. Only two states, Vermont and Maine, declined to support the man and the New Deal. Though the landslide still warrants more study, some explanations for the victory are evident. Four years in, Roosevelt had indeed become something like a national father. Alf Landon, the Republican candidate, chose to offer the electorate a version of New Deal Lite, which gained him no votes.

Now more thoroughly intimidated, for the downturn was beginning to feel permanent, voters supported Roosevelt as passengers support a captain in a storm: it was too late for another choice.

Yet a third reason, doubtless the strongest, was Roosevelt’s exquisite preparation for the 1936 presidential contest. Through 1935 and 1936, as authors such as Burton Folsom have shown, the New Dealers found a way to please, support, and fund nearly every large bloc that mattered to the election’s outcome. In dark times, and no longer on the land, seniors in cities and towns sought a pension. Social Security became the law of the land in 1935. Labor unions sought laws that would give them more muscle to pressure firms to accept unionization. The 1935 Wagner Act gave them muscle and then some, enough to force unionization on a resistant giant like Henry Ford. For farmers, likewise, there was a gift: the subsidies flowed. 

In swing states, as Folsom shows, New Deal outlays increased dramatically. And for governors and mayors, always influential in presidential contests, millions came for construction projects, sparing the states and towns the difficult task of raising taxes. Many elections are bought, but the 1936 election qualifies as the boughtest. The states, the commentator Raymond Clapper wrote in the Washington Post, now realized that it was important “to keep on the good side of Santa Claus.” To voters still struggling, however, this Santa Claus seemed like a necessary miracle.

Still, such unprecedented and cynical electioneering cost the country troubles that endured well into prosperity, and endure even today. Through its outlays, the government was not only squeezing out business but also squeezing out what remained of Tocqueville’s local America. The statistics tell the story: 1936 was the first peacetime year in American history in which federal spending outpaced state and local spending. The following year, federal spending fell back, but only for that year. The federal government has dominated the states ever since.

There was another cost, this one to the political culture, and one we likewise still feel. Sensing the inevitability of victory, Roosevelt’s opponents took to sliming the Administration. Members of an opposing group, the Liberty League, pointed to the record, but also assailed Roosevelt personally, a practice more unusual then than now. This venture into the mud did not profit those opponents. Roosevelt responded with not just mud but a kind of mudslide, assailing “business and financial monopoly.” In tones that would sound unstatesmanlike even in our own era of trash talk, he said of his opponents: “I welcome their hatred.”

“I should like to have it said of my first administration,” the campaigning president said in the same speech, delivered just days before the election, “that in it these forces of selfishness and lust for power met their match. I should like to have it said of my second administration that in it these forces met their master.” Some voters relished the prospect of all-out class war; many more, one suspects, chose not to consider the consequences of such a declaration. 

But businesses did consider them. After the election, Roosevelt made good on his promises, launching a vigorous antitrust campaign, effectively the opposite of the policy of the early New Deal and its syndicate-friendly NRA. Now New Deal officials, who had themselves set prices, officially deplored the “disappearance of price competition.”14 New actions assailed the oil industry; tobacco manufacturers were convicted of colluding absent proof of meetings or agreements, a daunting precedent. Companies reeled. It was in this period that the utilities surrendered; Wendell Willkie of Commonwealth and Southern even sold part of the company to the TVA. After waiting politely through the 1936 election, labor unions used their new power to mount an unprecedented series of strikes, even occupying factories in so-called “sit-down strikes.” As Roosevelt’s own Labor Department mournfully reported:

“There were 4,740 strikes which began in the United States during 1937, in which 1,860,621 workers were involved. These workers lost approximately 28,425,000 man-days of work while strikes were in progress during the year.”15 

Each of these man-days lost postponed recovery. To halt the strikes, employers again paid higher wages than they otherwise would have. Again, that constraint forced them to rehire more slowly. Companies despaired and went on strike themselves: net domestic private investment, the statistic that measures whether businesses are adding to their productive capacity, had been negative earlier in the Depression. In 1938 the measure turned negative yet again.16 A new Federal Reserve rule increased the amount of cash banks had to hold in reserve; anxious, the banks held far more than officials had predicted, contracting credit. As Robert Wright shows in a forthcoming history of the period, even Social Security, which many Americans today rate the best of the New Deal, impeded recovery in the later 1930s: the paycheck dollars held back by the government as worker contributions were dollars that went unspent and uninvested.

Workers mulled the failure of relief programs. There was a new awareness of the degradation that came with competing with neighbors for federal benefits. Voters even reconsidered Roosevelt’s promise to help the Forgotten Man. Their mood resembled that of an editorialist in Muncie, Indiana, who had published on the question two years before.17 “Who is the Forgotten Man in Muncie? I know him as intimately as I know my own undershirt. He is the man who is trying to get along without public relief and has been attempting the same thing since the Depression cracked down on him. He is too proud to accept relief yet deserves it more than three-quarters of those who are getting it.” This 1938 period came to be known as the “Depression within the Depression.”

Preparing for the 1938 midterm election, the President sought during the primaries to drive out disloyal Democrats, even as he tried to give the appearance of impartiality. But this time, Moley later recalled, “the mysticism did not sit well with the country.”18 While voters might repeat the Job story at church, they were now beginning to realize that populist slogans are no substitute for prosperity, and they voted accordingly, weakening the President’s majorities in both houses. Roosevelt himself began to weary of the New Deal.

The flood of new laws slowed, which is certainly one reason the nation finally began to recover in 1939 and 1940. Many have argued that the dramatic rise in federal spending to support Britain in its war with Germany facilitated the comeback, and that the further billions in outlays that came after Pearl Harbor guaranteed it. But another factor was perhaps more important: Roosevelt turned to the war long before Pearl Harbor. To arm Britain, and then America, he needed to ally himself with great companies and drop his domestic class war. Relieved businesses appreciated the ceasefire and began to produce and hire accordingly.

Historian Robert Higgs has developed a useful thesis to explain this lost decade: “regime uncertainty,” the notion that an erratic, aggressive government can terrify businesses into slowdown. The same theme was taken up by the chief economist of Chase Bank, Benjamin Anderson, in a 1949 book, Economics and the Public Welfare. Though individual policies promulgated during the Depression may have differed, Anderson noted, there was one commonality: the authorities’ arrogance. “Preceding chapters,” concluded Anderson at the end of his section on the Great Depression, “have explained the Great Depression of 1930–1939 as due to the efforts of governments, and very especially of the Government of the United States, to play God.” When playing God failed, Anderson noted, our government had determined that “far from retiring from the role of God,” it “must play God yet more vigorously.”19

What is the relevance for our own fractious age, and our own future downturns? The first point is that realization comes gradually: Americans did not see all the errors in the New Deal at first. Another point is that policy matters. When we belabor, lame, or derail the private sector, we reduce the likelihood of strong recovery. A third is that countercyclical spending, now institutionalized as the standard antidote to downturns, may not deliver all we imagine. Perhaps the federal response to the early 1920s is the better model. However severe the initial monetary challenges, the downturn after 1929 would not have become the Great Depression had Presidents Hoover and Roosevelt replayed the restrained federal policy of the early 1920s: reduce uncertainty and allow the market to take the lead. 

The Depression also shows that we underrate the damage of bitter partisan attacks. There is a price to slinging the first mud in a conflict, and a price to descending into the dirt with your opponent. Had the opposition to Roosevelt been stronger, clearer, and less demagogic, it would have wielded more influence. Another point, that of Robert Higgs, also warrants underscoring: the uncertainty generated by a power-happy demagogue, or an arrogant regime, costs all of us more than standard texts convey.

Yet a final thought, especially relevant now, involves the cost of politicizing economics. Any of us can understand that politicians must back silly or profoundly perverse policies to win an election. Many advocates of free markets will vote for candidates who emphasize anti-market concessions during campaign season, consoling themselves with the fact that the same campaigners, in a kind of aside, occasionally pay lip service to the power of markets. The voter, sometimes naively, hopes that once secure in office, the politicians will deliver the free-market policy. Sometimes, they do.

What is truly insidious, however, is when politicians advertise those subpar policies, from tariffs to, say, child credits or make-work jobs, as optimal economics, a guarantee of prosperity. For then, at least for a while, voters believe them. That is what happened in the 1930s. In such cases, the public, like the long-suffering 1930s electorate, becomes complicit in its own deception and disappointment.

In short, there is a price for placing faith in political leaders as one would in a church, a price for which voters are also responsible. “Have we found our happy valley?” Roosevelt asked when he called for a political license to continue his experimentation from on high in 1936. By reelecting him, even those many voters who did not have New Deal jobs agreed to continue to travel behind Roosevelt in his quest, and to tolerate policy that, at some level, they knew could not deliver. The consequence of this complicity was that second term of Depression. 

The Americans who succumbed to political lures and failed to wind down, halt, block, or attenuate the New Deal were our great-grandparents, or at the very least, our forerunners. Americans today owe it to them to forgive that error, and to remember it. After all, they and we are one people. It is possible therefore to extend the sage Anderson’s point. The Great Depression did not endure because God struck America. It endured because our leaders played God. And because we let them.

References

  1. Thorne, Frank. “Warning of Pest Heeded Too Late.” Buffalo News, July 11, 1936.
  2. Burns, Ken. “Bonus Material.” The Dust Bowl. Public Broadcasting Service, 2012.
  3. Sylla, Richard. “Banking Panics, 1930-1931.” Museum of American Finance. https://www.moaf.org/publications-collections/financial-history-magazine/82/_res/id=Attachments/index=0/Article_82.pdf (https://www.federalreservehistory.org/essays/banking-panics-1930-31)
  4. Ohanian, Lee E. “What, or Who, Started the Great Depression?” Working Paper 15258, National Bureau of Economic Research, 2009.
  5. Cole, Harold L., and Lee E. Ohanian. “New Deal Policies and the Persistence of the Great Depression: A General Equilibrium Analysis.” Journal of Political Economy 112, no. 4 (2004).
  6. Pegler, Westbrook. “New Farming Theories.” Chicago Tribune Service, August 30, 1933.
  7. Ahamed, Liaquat. Lords of Finance: The Bankers Who Broke the World. New York: Penguin Press, 2009.
  8. Field, Alexander. A Great Leap Forward: The 1930s Depression and U.S. Economic Growth. New Haven: Yale University Press, 2012.
  9. Historical Statistics of the United States, Colonial Times to 1957. Series S 81-93, “Use of Electric Energy: 1902 to 1956,” p. 511.
  10. Hirsh, Richard F. Powering American Farms: The Overlooked Origins of Rural Electrification. Baltimore: Johns Hopkins University Press, 2022.
  11. Chayes, Antonia H. “Restrictions on Rural Electrification Cooperatives.” Yale Law Journal 61 (1951).
  12. Blum, John Morton, ed. From the Morgenthau Diaries: Years of Crisis, 1928-1938. Boston: Houghton Mifflin, 1959.
  13. Martin, George. CCB: The Life and Century of Charles C. Burlingham. New York: Hill and Wang, 2005; and Folsom, Burton W., Jr. New Deal or Raw Deal? How FDR’s Economic Legacy Has Damaged America. New York: Threshold Editions, 2008.
  14. Roosevelt, Franklin. “Message to Congress on Curbing Monopolies,” 1938.
  15. Monthly Labor Review, May 1938. United States Department of Labor.
  16. “Net Private Domestic Investment: Net Fixed Investment.” Series A560RC1A027NBEA, U.S. Bureau of Economic Analysis, via Federal Reserve Bank of St. Louis. https://fred.stlouisfed.org/series/A560RC1A027NBEA, accessed August 22, 2024.
  17. Lynd, Robert S., and Helen Merrell Lynd. Middletown in Transition: A Study in Cultural Conflicts. New York: Harcourt Brace and Company, 1937.
  18. Moley, Raymond. After Seven Years. New York: Harper and Brothers, 1939.
  19. Anderson, Benjamin M. Economics and the Public Welfare. New York: Van Nostrand Company, 1949.


AIER has submitted amicus curiae briefs to the New Hampshire Supreme Court in the combined cases Rand v. State and ConVal v. State. We took this step because our economic research is directly relevant to the issues raised in these cases.

In Rand, the trial court ruled that New Hampshire’s property tax system was unconstitutional and ordered the State to start redistributing property tax revenue from towns with high property valuations to the rest of the state.

In ConVal, the trial court ruled that New Hampshire’s “adequacy aid” to local school districts was insufficient and ordered a near-doubling of state funding for local schools.

Our briefs support the State’s position that the N.H. Supreme Court should overrule the trial court and find New Hampshire’s property tax and school finance systems constitutional. Our research finds that “property-wealthy” towns don’t necessarily have wealthier people than property-poor towns. They just made good local government decisions in the past. Moreover, their high property values mean that most residents have already fully paid for the privilege of living in a place with good services and/or low taxes. In general, AIER supports choice and competition among local governments as the best way to give households the mix of services and tax levels that they want, while keeping local government as efficient as possible. Centralizing school finance would undermine that objective.

Our filings (pdf downloads):

    • Motion to Appear as Amicus Curiae, Contoocook Valley School District & a. v. The State of New Hampshire
    • Brief of Amicus Curiae, Contoocook Valley School District & a. v. The State of New Hampshire
    • Motion to Appear as Amicus Curiae, Steven Rand & a. v. The State of New Hampshire
    • Brief of Amicus Curiae, Steven Rand & a. v. The State of New Hampshire

In Leave Me Alone and I’ll Make You Rich: How the Bourgeois Deal Enriched the World, Deirdre McCloskey and I distinguish the Bourgeois Deal — ”leave me alone, and I’ll make you rich” — from the Blue Blood Deal of aristocratic oligarchy and the Bureaucratic Deal of the modern welfare state. The Bourgeois Deal is the ethos of Adam Smith’s Commercial Society, and the permission-requiring, command-giving Blue Blood and Bureaucratic Deals are the ethos of the Political Society. A single sentence embodies each:

  • Bourgeois Deal, Voluntary, Commercial Society: “May I take your order?”
  • Blue Blood and Bureaucratic Deals, Administrative Society: “That’s an order!”

Note the assumptions here about political equality — or the lack of it. The person saying, “May I take your order?” voluntarily subordinates himself to another’s wishes. The person saying, “That’s an order!” subordinates others to his wishes. The person saying, “May I take your order?” invites others to evaluate a menu of options in light of their own knowledge and preferences. The person saying, “That’s an order!” compels others to ignore their own knowledge and preferences. The commercial society’s order-taker asks people to cooperate. The administrative society’s order-giver commands people to cooperate.

Which respects others’ humanity and dignity? Which respects their knowledge, experience, and autonomy? 

Consider a chicken restaurant. “May I take your order?” contains a lot of information. It says, in effect, that a team of people who are there of their own volition — even if being there is the best of a lot of very bad options — stand ready to fry chicken obtained from one willing seller with knowledge about chicken farming and put it on a bun obtained from another willing seller with knowledge about baking. These willing sellers, in turn, went into their occupations with the conviction that raising chickens or baking buns would be the best way to provide for themselves and their families.

Political choices are different. The candidate who covets your vote offers an exchange of sorts: a “plausible belief,” to quote Thomas Sowell, in exchange for that vote. It’s a plausible belief that the candidate will give orders that the voter finds congenial, and, with luck, will give them to other people. It is, in effect, an offer to make someone else do what you want them to do without your having to go to the trouble of offering them something better than their alternatives. It’s an offer to give other people “offers” they can’t refuse.

The great American statesman Daniel Webster put it this way in 1837:

“There are men, in all ages, who mean to exercise power usefully; but who mean to exercise it. They mean to govern well; but they mean to govern. They promise to be kind masters; but they mean to be masters.”

“That’s an order!” might be necessary under certain circumstances. Firms exist because of prohibitive negotiation and transaction costs. The military has a chain of command. It might be necessary to tolerate an evil like taxation to avoid the greater evils of invasion, subjugation, and domination. “Because I said so” isn’t an entirely indefensible response to a child wondering why he can’t drink the stuff in the bottles under the kitchen sink. These are exceptions to the general rules that Adam Smith, and so many after him, have thought should govern relationships between adults and equals, not general rules to which liberty is the exception. 

When we ask what kind of society we want to live in, we might well prefer one in which we recognize one another’s right to say “no, thank you” to an offer — that is to say, a world where people take orders instead of giving them.


During the most recent debate with Vice President Harris, former President Trump declared that he has “been a leader on IVF… everybody else knows it.” Trump, of course, was referring to his recent campaign promise that the government would pay for, or that insurers would be mandated to cover, all IVF treatment costs.

Whether Trump’s proposal makes him a leader is a point of debate, given that Democrats introduced a bill mandating insurance coverage of IVF earlier this summer. But whatever the case, Trump’s IVF proposal would certainly lead in the wrong direction.

The proposal has many downsides. To begin with, government-funded IVF would be enormously costly. A back-of-the-envelope estimate indicates that government funding of IVF would cost about $7 billion annually. This figure assumes that the average IVF cycle costs between $15,000 and $20,000, that doctors perform about 413,776 assisted reproductive technology (ART) cycles annually, and that IVF constitutes more than 99 percent of ART cycles.

This figure, however, assumes that the current number of ART cycles and the average cost per IVF cycle stay constant, which is highly unlikely. Currently, most patients self-pay for IVF, which limits IVF use. Furthermore, a subsidized program creates new incentives for would-be parents to delay childbearing or engage in elective fertility preservation, leading to growing use of the program over time.

Israel provides a case in point: there, IVF has been publicly funded since it was first introduced in 1981. Reliance on the then-nascent technology has grown ever since; between 1990 and 2012, the number of IVF cycles increased eightfold.

Some of the increase in utilization is no doubt due to innovations that improve the procedure’s effectiveness. For instance, the development of intracytoplasmic sperm injection (ICSI) in the early 1990s meant that IVF became beneficial to a much larger portion of the population, as ICSI helped resolve many cases of male infertility. But even since major technological innovations like ICSI, IVF utilization in Israel has continued to grow. The percentage of births attributable to IVF in Israel was only 1.7 percent in 1995, but by 2018 that figure had nearly tripled.

In large part due to its generous policy, Israel also has by far the highest per capita IVF use of any country. Israel’s generous IVF program funds unlimited IVF until a woman has delivered two live children, and benefit eligibility continues up until 45 years of age. Israel also covers elective fertility preservation, and in line with Trump’s proposal, Israel’s policy covers “all treatment costs,” including medication, procedures, testing, and more advanced add-ons like preimplantation genetic testing (PGT).  

If the US implemented a program that subsidized or mandated coverage for “all treatment costs,” substantial growth in IVF use would likely occur. Current IVF use in Israel is more than six times greater per capita than in the US. In countries like Denmark, which subsidize IVF generously but to a lesser extent than Israel, IVF use is still more than four times greater per capita than in the US.   

If a US policy were so generous that it induced Israeli levels of IVF use, the program would cost around $43 billion annually, or about what the federal government spends annually on its major housing rental assistance programs (housing vouchers and project-based rental assistance). Even if the program were “only” generous enough to induce Denmark’s level of IVF use, it would cost $27 billion per year, or more than NASA’s annual budget. 
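These scenario figures follow from simple multiplication. Below is a minimal Python sketch of the back-of-the-envelope arithmetic, using only the assumptions stated above (the cycle count, the $15,000-$20,000 cost range, and roughly six- and four-fold per-capita multipliers for Israel and Denmark). Because the multipliers are rounded, the outputs land near, rather than exactly on, the $7 billion, $43 billion, and $27 billion figures.

```python
# Back-of-the-envelope sketch of the IVF subsidy cost estimates above.
# All inputs are the article's stated assumptions, not official figures.

ART_CYCLES_PER_YEAR = 413_776          # annual US ART cycles cited above
IVF_SHARE_OF_ART = 0.99                # IVF is more than 99 percent of ART
COST_LOW, COST_HIGH = 15_000, 20_000   # assumed dollars per IVF cycle

ivf_cycles = ART_CYCLES_PER_YEAR * IVF_SHARE_OF_ART
low, high = ivf_cycles * COST_LOW, ivf_cycles * COST_HIGH
mid = (low + high) / 2
print(f"Baseline: ${low / 1e9:.1f}-{high / 1e9:.1f}B (midpoint ~${mid / 1e9:.1f}B per year)")

# Scenario scaling: IVF use rising to roughly Israeli (~6x) or Danish (~4x)
# per-capita levels, holding the cost per cycle constant.
for country, multiplier in [("Israel", 6), ("Denmark", 4)]:
    print(f"{country}-level use: ~${multiplier * mid / 1e9:.0f}B per year")
```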

Yet, unlike the benefits of the federal government’s housing assistance programs, the benefits of an IVF subsidy would surely be regressive if fertility patterns hold. Under existing patterns, women with higher education or higher income are more likely to delay childbearing: according to CDC research, 42.9 percent of women with a bachelor’s degree or greater delivered their first child at 30 or older. In comparison, just 3.3 to 10.5 percent of women with less than a bachelor’s degree delivered their first child at 30 or older. But older women are also more likely to run into fertility issues and subsequently utilize IVF.

Given the current national debt and deficit’s threat to our economic stability and the related need for fiscal restraint, creating a new, expensive entitlement program with benefits captured by highly educated and high-income beneficiaries is misguided.  

Even setting aside such a program’s steep price tag and regressive profile, would the money be “worth it”? Trump’s stated motives for the program are pro-natal, yet it is not clear that a subsidized program would actually result in more births.  

The new incentives created by such a program suggest that growing reliance on IVF alongside fewer births overall is possible or likely. This is partly because would-be beneficiaries may falsely believe that a subsidized or mandated policy allows them leeway to delay childbearing, only to find that childbearing is more difficult later in life, even with the assistance of reproductive technology. 

Countries like Singapore, Japan, Australia, and Denmark have subsidized reproductive technology and still seen fertility decline in recent years. And in all countries that subsidize IVF besides Israel — a unique country not only because of its extremely generous subsidies but also its broader cultural commitment to natalism — the fertility rate is currently below replacement. 

Beyond the program’s enormous cost and uncertain or negative influence on births, a subsidy or mandate would conflict with some taxpayers’ views on conception and reproduction. While most Americans disagree with more extreme views put forward by IVF critics, it is nonetheless reasonable that critical parties not be forced to subsidize activities that they find objectionable. 

Although Trump’s plan is a disaster from the perspective of cost, incentives, and value neutrality, IVF is a true medical miracle for many couples with fertility challenges. Protecting IVF, which means protecting individuals’ freedom to avail themselves of the most successful procedure to treat a range of fertility issues and create human life, is critical.

But protecting IVF from efforts to limit its use and reduce its efficacy does not mean subsidizing or mandating coverage. Trump and future policymakers would do well to enthusiastically defend the procedure, but avoid the cost and pitfalls of a government-supported industry. 

Revolutionaries burn a carriage in front of the Château d’Eau in Paris during the French Revolution of 1848. Lithograph, Nathaniel Currier, 1848.

“In one sense, at any rate, it is more valuable to read bad literature than good literature. Good literature may tell us the mind of one man; but bad literature may tell us the mind of many men….The more dishonest a book is as a book the more honest it is as a public document.” ~G.K. Chesterton, Heretics 

Limitarianism: The Case Against Extreme Wealth by Ingrid Robeyns is a very bad book. Writing a review of it thus presents a challenge. Who wants to read a review that is the equivalent of shooting fish in a barrel of dead fish? Yet, while reading Robeyns’ tendentious screed, I was faced with the absolute certainty that quite a few of my colleagues and students would love this book. Chesterton’s observation thus puts the right question forward. The interesting thing about Limitarianism is not why it is so very flawed, but rather why Robeyns and others would think it was good. 

The thesis of the book is simple. Robeyns thinks it is wrong for anyone to have more than a million dollars in wealth, but she will agree to a compromise of a maximum wealth of ten million dollars. Robeyns doesn’t care what currency unit you use (dollars, pounds, or euros) as long as there is an enforced maximum. To the immediate reply that a 100 percent tax on wealth over that amount might be problematic, Robeyns repeatedly insists that she isn’t necessarily advocating that tax rate. Not that she thinks there is anything wrong with a 100 percent wealth tax; there are just other ways to get there. For example, you could convince everyone in the world it is bad to have lots of wealth.

The bulk of the book is Robeyns shouting at the reader about why anyone having high wealth is so incredibly bad. First: “It’s Dirty Money.” Some wealthy people acquired their wealth by stealing it. Obviously, that is an argument against theft, not high wealth, but in a perfect example of how this book works, having established that we all agree stealing is bad, Robeyns then notes that people get wealthy in lots of other similar ways — like only paying whatever they are required to pay in taxes or owning companies that pay wages less than what Robeyns thinks workers should be paid. You see? Stealing wealth and not paying more than you owe in taxes are both “dirty money.” So, high wealth is evil. 

The roll call of reasons why high wealth is evil goes on like that for a couple hundred pages. High wealth is bad because it “undermines democracy” when wealthy people convince legislators to vote for things Robeyns doesn’t like. High wealth is “setting the world on fire” because rich people use airplanes and some corporations produce and use fossil fuels. Nobody deserves high wealth because wealthy people need a society in order to protect their wealth from theft, and the social contract should be fair and inclusive, not allowing people to get high wealth because of inheritance, luck, or having talent and the ability to work hard. Allowing some people to have high wealth is bad because “there is so much we could do with that money,” the “we” meaning (of course) people like Robeyns. High wealth is bad because it leads to philanthropy, which is terrible because the wealthy person gets to decide who should benefit from the philanthropic enterprise. 

Most of all, it would be good for the wealthy people themselves to give up their wealth because being wealthy is not only psychologically bad for the wealthy, but also the children of the wealthy really suffer from growing up with wealth. So, if you care about the kids, don’t let them grow up wealthy. I know that last sentence sounds like I am exaggerating and that there is no way Robeyns is as extreme as the last three paragraphs make her sound. But here is Robeyns: “People are free to make themselves as unhappy as they like. But that doesn’t take away our societal responsibility toward their children.” Similarly, the rich “are just as vulnerable, psychologically, as the rest of us, and if we care about the vulnerability of other people in general, then we should also care about how excessive wealth can destroy the lives of the super-rich.” 

There is an aura of unreality hovering over nearly every page of this book. The most jarring portion comes early when Robeyns sets out to refute anyone who thinks that all the wealth in the world today has been a big benefit to the poor. Lots of people are under the impression that there is less extreme poverty in the world now than there was in the past. Robeyns is here to assure us that this may not be true. Again, it may seem hard to believe Robeyns really says this. But, “the dominant narrative—that in the past everyone was very poor, and we have greatly reduced extreme poverty on a global scale—is misleading at best.” How is it possible that Robeyns could raise doubt about the fact that there is less extreme poverty today than there was in the past? First, the data before 1981 are not perfect, so maybe people really were better off in the past. Second, if instead of using $2 a day in income as the measuring line for extreme poverty, we use a higher number, then there are more poor people today than we estimate using the lower number. (Not surprisingly, she does not note that no matter what threshold you pick for extreme poverty, the global rate has declined.) 

Robeyns is willing to concede, however, that maybe there is more wealth in the world than in the past. But, even if so, the higher levels of wealth still aren’t a good thing. Because some people have much higher wealth than others, we cannot say that the increasing wealth is actually a good thing for the poor people who, while they may no longer be starving to death, are not as rich as the super wealthy. Her inability to acknowledge joyfully that there has been a massive decline in extreme poverty over time is tied very closely to the strangest parts of the book. There is no place in this book where Robeyns seems aware of the mechanisms by which wealth is generated. In Robeyns’ view, some very bad people have acquired a large amount of wealth by doing very bad things, and thus the net result of all that increase in wealth is negative no matter what has happened to the poorest people in the world. 

As I said at the outset, writing an entire review just documenting how bad this book is would be an incredibly easy task. Pick a page at random, and you’ll find multiple examples of an argument neither cohesive nor persuasive. The question is: how is it possible that the book is this bad? The answer is found in the Introduction. On the third page, Robeyns notes, “For a long time, I felt that there was something wrong with an individual amassing so much money, but I couldn’t properly articulate why.” So, she “decided to deploy my training in philosophy and economics to answer the question: Can a person be too rich?” The arguments in this book did not lead Robeyns to her conclusion; she started with the conclusion. When you start your investigation already knowing the answer to the question, then you may not notice that the reasons you offer for your conclusion are not persuasive to someone who is skeptical about the conclusion. If it seems like the arguments are non sequiturs attacking straw men, that isn’t important to Robeyns. The conclusion is right even if the arguments fail. The result of this approach is a religious book written for the already converted. 

What makes Robeyns’ book so useful for understanding what many people are thinking is that it makes one thing obvious: people who want to get rid of high wealth are not reaching that conclusion because they are persuaded by reasons of the sort found in the book. Instead, it is an article of faith. If having high wealth is inherently evil, then the conclusion is obvious. There is no reason to permit inherently evil acts to continue if we can stop them. Trying to explain why high wealth is evil is beside the point; it just is.

Ten Years After, the 1970s rock band, provides a marvelous way to think about this mindset in “I’d Love To Change the World.” “Tax the rich, feed the poor/ ‘Til there are no rich no more.” I have always thought those lines were pretty funny and highly ironic; taxing the rich to feed the poor doesn’t help end poverty; it just gets rid of the rich. But, in reading Robeyns’ book, my realization was that there are people who do not think those lines are ironic. Taxing the rich to feed the poor is desirable not because it will help the poor, but simply to get rid of the rich.  

Of course, the idea that a society should get rid of the wealthy is not new. Lycurgus, the crafter of ancient Spartan society, implemented a whole series of radical changes (breaking up large land holdings, forbidding the manufacture of luxurious items, inhibiting trade with other cities, forcing everyone to eat at communal meals) in order to rid Sparta of the rich. He seemed totally unconcerned that Sparta would be a poorer society; Lycurgus’ ideal Spartan lifestyle was one devoid of any hint of luxury.

Lycurgus provides an interesting contrast to Robeyns. Both have the ideal of a world in which there “are no rich no more.” There is an intellectual honesty in Lycurgus’ implicit argument that a poor-but-equal world is superior to a rich-but-unequal world. That is not what Robeyns is arguing, however. Limitarianism wants to have it both ways. Robeyns wants to get rid of the wealthy, but does not want to get rid of the wealth. In Robeyns’ Limitarian Paradise, there is no trade-off between the technological marvels and phenomenal wealth in the modern world and limiting everyone to no more than one or ten million dollars of wealth. Somehow, we can redistribute all the wealth in the world and still keep on generating just as much wealth in the future, even though creative and hard-working people have hit their personal limit on wealth. Robeyns argues this will happen if we develop a culture “where material gain is not the leading incentive — where people may also choose to work hard because of personal commitment, challenges they have set for themselves, or for intrinsic pleasure, esteem, and honor.” 

To pretend that you can have all the riches of the modern world and eliminate the ability of anyone to become wealthy is a sure sign of someone who has no understanding of how all this wealth was generated in the first place. Robeyns’ book, however, provides insight into why people advocating income limitation plans often seem so unaware of how economic growth occurs. If getting rid of rich people is akin to a religious mandate to rid the world of evil, then of course it is safe to impute bad motives to anyone arguing that the world might benefit from allowing people to do things that will make them wealthy. Despite appearances, Robeyns’ book is not really an attempt to persuade anyone of her beliefs; instead, it is an insight into the minds of zealots.

Andrés Manuel López Obrador addresses a press conference. EneasMx. July 2024.

Here in the US, we don’t normally have much interest in the domestic politics of our neighbors Mexico and Canada. Mexico, for example, figures in our presidential election only insofar as immigration is a top issue. But we don’t ask why people from throughout Mexico and Latin America are crossing our border in large numbers.

Mexicans aren’t just fleeing their homeland for the relatively safe and stable US; they are now leaving Southern Mexico for Guatemala, of all places, amid a near-civil war between feuding drug lords in Chiapas. As Mexico’s political system deteriorates, crime is rising, and the outgoing president, Andres Manuel Lopez Obrador (or AMLO, as he is known), is completing his quest to eliminate the independence of the country’s judiciary.

AMLO’s administration has followed in the footsteps of other left-wing populist leaders in the Western Hemisphere, in Venezuela, Colombia, Bolivia, and Nicaragua. He has decried “neoliberalism,” consolidated power, and appealed directly to Mexico’s poor, working class, and alienated middle class. During his time as president, AMLO clashed repeatedly with the Mexican Supreme Court specifically and the judiciary more broadly. Particularly in the last few years of his presidency, as his power increased and his respect for the law declined, he and the court fought as he tried to expand the power of the military, pull back from fighting crime, and ‘pause’ stable, productive economic relations with the US. This prompted him to push through a “reform” that will subject every judge in Mexico to regular election, rather than the lifetime appointment that American federal judges enjoy and that Mexican judges previously held.

He is promoting the reform as a way to eliminate judicial corruption, but this is a charade. The reform is merely a way for his powerful Morena party to exert control over one of the last independent institutions in the country. Judges will be beholden to political interests, not law. Voters in Mexico are completely unprepared to judge what makes a “good” or “qualified” judge. A voter in Mexico City, a metropolitan area of perhaps 20 million people, would be voting for thousands of judicial candidates, including for the Supreme Court, with no information other than party identification. Money will purchase judges, and corruption will far outstrip the problems the system now faces.

Some US states, and some countries, have various forms of judicial “elections,” be they options for recall, yes-or-no retention votes, or, in a few cases, regularly competitive elections. US federal judges, however, are appointed for life, which provides some insulation from political forces. So why has Lopez Obrador decided to make this radical move? In Bolivia, which has judicial elections, the Economist recently described the election of top judges as a “disaster” that poisoned the country’s politics. What’s bad for the nation as a whole, though, can still be very good for a power-hungry politician. In Bolivia, the two main competitors for the presidency are jockeying for support from the elected court, and everyone understands that the court’s decisions are motivated by nothing more substantive than politics.

In some ways, this looks like a return to the days of the monolithic PRI party that dominated Mexican politics in the twentieth century. But there’s an additional reason why AMLO has proposed the reform. Destroying the judiciary as a counterbalance to elected power is straight out of the playbook populists have used to consolidate their control. In Venezuela, for example, President/Supreme Dictator Nicolas Maduro had his hand-picked Supreme Court stamp his stolen election as legitimate, and AMLO dreams of a similar scenario should the US or a future Mexican president come after him for aiding Mexican drug traffickers, or force him to explain how he and his sons have bank accounts in the Cayman Islands or Switzerland stuffed full of pesos. Once the judiciary is under control, populists next take over the central bank and begin to intimidate independent media outlets. The institutional and social checks on majoritarian power are eliminated, creating an opening for dictatorship: lifetime rule and enrichment through graft.

How is this related to immigration and relations between the US and Mexico? The Mexicans who are leaving for other countries see very limited economic opportunity, no domestic security, and no hope in their political institutions after a six-year assault on foreign investment, the rule of law, and economic development under President Lopez Obrador. Daniel Ortega in Nicaragua, the Castro boys in Cuba, Maduro in Venezuela, and now AMLO all claim to be men of the people fighting for the common person against the evil forces of capitalism and imperialism. Shockingly, the situations in their countries deteriorate, their citizens flee, and they consolidate their grip on power.

More than $35 billion in direct foreign investment in Mexico is now on hold, according to the Wall Street Journal. The peso, historically a very stable currency, has dropped 15 percent since the judicial ‘reform’ was announced. Protests from the legal community and the political opposition are intensifying. In at least five of Mexico’s states, battles between rival drug organizations have produced bloodshed and instability. The costs to the Mexican nation will be enormous. Eliminating an independent judiciary may very well force Mexico to pull out of the USMCA, the replacement for NAFTA. Remember, Mexico is the United States’ largest trading partner, and vice versa. Such a rupture would have devastating economic effects on both sides of the border.

But three sets of interests benefit from Lopez Obrador’s plan. The first is the old institutional power brokers in public and private unions. Economic and political liberalization undercut their ability to pursue graft, control jobs, and influence policy. They’d like a return to the “good old days” of no foreign businesses interfering with their rule of the labor market, a market in which the line between public and private will increasingly blur as the private sector shrinks and the economy contracts.

The second set of interests is those involved in the illegal drug trade. Analysts once argued that the Mexican government lacked the capacity to effectively fight the large cartels that once ruled the drug trade, but under Lopez Obrador two things have changed. The first is that the days of the large cartels are in the past. The Netflix series Narcos is great television, but there is a reason it is set in the ’80s and ’90s. Smaller operations, more prone to violent turf wars and with unstable business practices, have replaced the once-mighty cartels, with bloody results. The second is that Morena has used local alliances with this new entrepreneurial drug “industry” to help ensure that it convincingly wins local and state elections.

Lopez Obrador in fact helped shield some of the larger drug organizations from US prosecution through his infamous “hugs not bullets” policy. Whether he was motivated by a newfound humanitarianism or by the requests of his friends in the drug industry, AMLO diverted military and police resources away from fighting drug trafficking. Unsurprisingly, hugs did not stop the drugs, and the US responded by pressuring the Mexican government to cooperate on high-profile arrests, like that of the son of the notorious “El Chapo,” now in a US prison. According to press reports, the Blackhawk helicopters and troops sent in to get him were not hugging anyone.

Finally, this new system benefits the army; courting the military is another important strategy in the populist playbook. As we’ve seen in Venezuela, once politicians control the formal political institutions and the criminal activities in society, they need the support of the army. Military coups are part of the landscape and history of Latin America.

Lopez Obrador has adroitly freed the military from fighting the drug war, something it had no desire to do. Instead, he has placed the military in charge of lucrative endeavors like managing ports and airports, and it has taken over the construction of major infrastructure projects. These activities are all in direct violation of the Mexican Constitution, and the Supreme Court ruled against them just last year in one of its many fights with AMLO. The potential for political power, and of course for further bribes and graft at ports and large construction projects, isn’t difficult to predict. He has also essentially placed the police under military control, further expanding the army’s power base.

The third winner in all of this is Morena, and indirectly Lopez Obrador. His party controls Mexico, and should the judicial reform pass, its formal power may rival, if not exceed, that of the old PRI. Some observers hope that his successor, Claudia Sheinbaum, may pivot in a different direction and guide Mexico toward more moderation, but it’s difficult to see how she might do so even if she wishes to reject AMLO’s legacy. As one astute observer noted to me, while it is possible she may become her own leader and escape AMLO’s shadow, a more likely alternative is that she follows the Medvedev/Putin model and is simply complicit in allowing his rule to continue into a second term. After all, this observer said, it is AMLO who controls Morena, and Morena is the most powerful civilian force in the nation.

The losers here are obvious. First and foremost, Mexicans of all social and economic classes will lose the lifeline that NAFTA/USMCA, economic liberalization, and attempts at political liberalism have created. Millions of Mexicans ascended from the working class (“clase trabajadora”) to the middle class through manufacturing jobs and foreign investment from the US and abroad. As others have correctly outlined, those treaties depended on a guarantee of an independent judiciary. While some of those commercial agreements may transfer to international commercial courts, many won’t. Existing investment may wither, and new investment will almost certainly go from paused to stopped.

AMLO himself has a vision of Mexico that is very much anti-development: a romantic vision of an early-twentieth-century, rural Mexico. Whether his base will realize (too late) what that means for the future is difficult to tell. But if I were to predict the future, I’d return to the issue of immigration. While many across the political spectrum are falling all over themselves to oppose “illegal” immigrants, immigration has long benefited the US, whether through the waves of Southern European immigrants at the end of the nineteenth and start of the twentieth century, the Irish immigrants who arrived from the 1850s onward after the potato famine, or the many Venezuelans who have entered the US recently, running for their lives from Maduro and his Cuban handlers. Immigrants come with skills, and they are willing to take risks and work. Once we move past the political grandstanding, we will eventually see how good it is to gain a new influx of workers.

But in the short term, these developments will be problematic for the US, because Mexico is right next door and heading down a very dangerous path. Empowering one party, strengthening the military, destroying checks and balances, and allowing criminals free rein throughout Mexico will not end well for other countries. The attraction of militarism and of reliance on a strong hand (“la mano dura,” as it is known in Latin America) is tantalizing. But it is a fantasy, a mirage that leads to dictatorship. Mexico should reject the abolition of its independent, albeit imperfect, courts. If Mexicans need more evidence of how this will end, they shouldn’t look to AMLO for promises but to Venezuelans, whose courts have protected only one person: the dictator, not his subjects.

Economic misconceptions persist due to misguided intuitions that overlook complex factors, a preference for principles over outcomes, the influence of epistemic bubbles, and political tribalism. Despite frequent refutation, flawed ideas endure, requiring constant vigilance from economists.

Harwood Economic Review

Table of Contents

Governments, Not Markets, Impel ESG
Allen Mendenhall and Daniel Sutter

Investors Make Houses More Affordable, Not Less
David Youngberg

Sense and Nonsense on Petrodollars
Peter C. Earle

Boosters Beware: Stadiums Aren’t Magic
Art Carden

Protectionists Are Wrong: Free Trade is the Path to Prosperity
Vance Ginn

Overpopulation: An Ancient Myth Refuted
Aidan Grogan


Signage at the East River Plaza mall in Harlem, NY reflects grocery options competing side-by-side, including warehouse clubs and discounters. 2021.

Nearly two years ago, Kroger and Albertsons, America’s two largest traditional brick-and-mortar supermarket companies, agreed to a $24.6 billion merger. Ever since, the Federal Trade Commission has argued against allowing the merger, claiming that it would “lead to higher prices for groceries and other essential items” and “lead to lower quality products and services.”

That led to a just-completed hearing (whose results have not yet been announced) on whether to grant an injunction against the merger until the FTC takes its case before one of its administrative law judges. There are also state-level challenges. Kroger, for its part, has sued to challenge the constitutionality of the FTC trying its case before a “home team” ALJ rather than in an actual trial in federal court.

However, the picture the FTC is painting, of the “biggest getting bigger” to the detriment of consumers, is too muddled to support its argument.

To begin with, simply counting the stores in a merged K-A chain (over 5,000) is far less indicative of increased market power than the FTC suggests. The reason, seldom even mentioned, is that “the vast majority of Kroger and Albertsons stores are in markets where the other is not located.” In the vast majority of areas, where their footprints do not significantly overlap, merging the chains creates no increased market power to harm consumers. In all those places, the FTC’s case that the merger will cause consumer harm collapses. In contrast, the claims in support of the merger, that it will allow merged operations to lower costs and become more effective competitors for shoppers’ patronage at all their stores, still make sense.

The magnitudes involved are instructive. Most measures put the number of overlapping stores at about 1,400 (roughly 28 percent). How believable is it that K-A would go to the great expense of integrating all their operations just to be able to raise prices in no more than 28 percent of their stores? Not very.

In addition, not every case where the chains’ stores are in proximity would cause competitive concerns. I live in one such area. My wife and I live roughly a mile from a Ralphs (Kroger) and a mile in the other direction from a Vons (Albertsons), and between us, we shop at both of them multiple times in most months. But if they merged, it would not be a competitive disaster that puts us at risk. We are even closer to a Trader Joe’s and a Sprouts (in what was previously an Albertsons store), which we also shop at. We are two miles from a Walmart Neighborhood Market and a Target with a sizeable supermarket section. We are within five miles of a Costco (and another one is being planned even closer to us), a Sam’s Club, a Walmart Supercenter, and an Aldi. We also use Amazon and Instacart to get groceries. There is intense competition, whether or not Vons and Ralphs merge. But if that merger made them a stronger, lower-cost competitor, we would gain as consumers. And our case is not so unusual. Supermarket News has reported that “the average family today shops at five different grocers on a regular basis.”

Even if we ignore the fact that proximity does not equate to monopoly power over consumers, it would only require roughly 700 divestitures (half of the overlapping stores, or 14 percent of the over 5,000 combined stores) to address all such market-power concerns. And Kroger has from the beginning offered divestitures, a remedy long used satisfactorily in grocery mergers, to ameliorate the FTC’s competitive concerns, making it all but impossible to believe that a Kroger-Albertsons merger would harm consumers. Interestingly, the FTC argued that the company that would manage the divestitures (C&S Wholesale Grocers) might not operate as efficiently as Albertsons, which would undermine competition. But since Albertsons’ costs are reportedly higher than Kroger’s, the FTC is essentially admitting the case that the K-A merger would increase efficiency.
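The arithmetic behind those shares is simple enough to verify; here is a minimal sketch using only the figures cited above.

```python
# The overlap and divestiture shares cited above, computed directly.
combined_stores = 5000                  # "over 5,000" combined Kroger-Albertsons stores
overlapping_stores = 1400               # stores in markets where both chains operate
divestitures = overlapping_stores // 2  # roughly half of the overlapping stores

print(f"Overlap share:     {overlapping_stores / combined_stores:.0%}")  # ~28%
print(f"Divestiture share: {divestitures / combined_stores:.0%}")        # ~14%
```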

We must also understand that in antitrust, the higher the market share forecast to result from a merger, the stronger the presumption of monopoly power and harm to consumers, and the more likely the FTC is to prevail in litigation (despite a recent series of court losses due to its overreach). That gives an FTC determined to win a massive incentive to manipulate market definitions to make monopoly power appear even where it doesn’t exist. For instance, say you had a small store on a street corner that sold salt, among other things. If it was the only store on that corner selling salt, defining the relevant market as sellers of salt on that street corner would make you a monopolist, even though you had no market power in fact.

That explains why the FTC has in this case reached back into their long-rejected 1960s bag of anti-consumer tricks to get their desired result, aiming to uphold Justice Potter Stewart’s famous dissent that “The sole consistency I can find is that, in [merger] litigation under Section 7 [Of the Clayton Antitrust Act] ‘the Government always wins’.” Or as I put it elsewhere, “The government’s desire to demonstrate monopoly power to justify the rejection of a merger led to a cottage industry of sorts, finding ways to distort measures…to find monopoly power where there was no power to hurt consumers.”

In recent years, the FTC has defined the relevant market for such mergers as including “traditional” brick-and-mortar supermarkets (of which Kroger and Albertsons are the largest) and food and grocery sales at hypermarkets (Walmart supercenters). Further, they have viewed the relevant market as only including stores where a consumer could purchase all or nearly all of their household’s weekly food and grocery needs at a single stop at a single retailer, within a range of between two and 10 miles (depending on circumstances).

That definition is nowhere near reasonable today, unless the goal is to maximize the apparent monopoly power a K-A merger would create, despite the current grocery market being perhaps the most competitive in history.

Walmart stores that are not supercenters are excluded. But Walmart and Sam’s Club have more than 5,300 stores, and their grocery revenue is more than twice that of Kroger and Albertsons combined. And when it comes to local competition, it is worth noting that 90 percent of the US population lives within 10 miles of a Walmart store.

Wholesale club stores like Costco (and Sam’s Club and BJ’s Wholesale Club) are omitted from that definition of the market, which is particularly problematic because they also have a larger catchment area than supermarkets. Further, it is hard to see how they are not part of the relevant market when roughly 40 percent of Americans are Costco members, the average store of Costco (the world’s second-largest grocer) sells five times the groceries of the average US supermarket, and Costco does half again as much business as Albertsons.

Online sellers like Amazon/Whole Foods are also excluded, even though Amazon is the world’s fifth-largest grocer and closing in on Albertsons. Aldi (also owner of Trader Joe’s) is excluded as a “hard discounter” or “limited assortment” store, even though a quarter of Americans now shop there. Instacart sales are excluded, as are natural and organic markets and ethnic and specialty stores.

Looking at the broader grocery market also undermines the FTC’s claims. Kroger might be the biggest traditional grocery retailer, but it sells fewer total groceries in the US than Walmart, Amazon, or Costco. Even after the proposed Kroger-Albertsons merger, the combined company would represent only 9 percent of those grocery sales. And while a Kroger-Albertsons merger might appear to threaten competition based on their share of the FTC’s market definition, traditional supermarkets have been losing a great deal of market share to the competitors excluded from that definition, showing just how effective those competitors are. Since 1998, warehouse clubs and supercenters have seen their share of retail grocery sales double, while supermarkets’ share has dropped by more than a quarter. In 2020, 98 percent of people who regularly bought “center aisle” products like paper towels, cleaning supplies, and canned goods bought them at a grocery store; by 2023, 37 percent said they bought none of those goods in a grocery store, having largely shifted to online purchases. And now about one in eight consumers buys groceries “mostly” or “exclusively” online.

These results are consistent with the National Academies of Sciences’ description of the retail grocery sector as “highly competitive,” largely due to the growth of warehouse clubs, superstores, and online retailers, all of which are overlooked by the FTC’s market definition rather than threatened with monopolization by a prospective Kroger-Albertsons merger. No amount of repeating the claim that the FTC’s actions protect consumers makes it true.

Artist’s concept of a central bank digital currency.

When it comes to designing digital currencies that protect the identity and transactions data of their users, developers have made a lot of progress in a relatively short period of time. It is technically feasible to design a retail central bank digital currency — or, CBDC — that promotes financial privacy. But one must also consider what is politically feasible. Unfortunately, there is little prospect that the United States government would actually adopt a privacy-protecting CBDC.

If adopted, a CBDC will eventually — if not initially — be used to surveil the transactions of Americans.

The government is already using existing technologies to surveil its citizens. There’s no reason to think the government would give up its ability to monitor transactions with the introduction of a CBDC. Indeed, it seems much more likely that the government would seize the opportunity to expand its capabilities. Therefore, it is absolutely crucial to maintain a private banking system firewall between the government and our transactions data.

Let’s start with the status quo. The government has essentially deputized the private banking system to monitor customer transactions. Banks keep records on customer transactions, which the government can access by subpoena. The government also requires banks to report suspicious activity and currency transactions in excess of $10,000.

As Nick Anthony at Cato has shown, the real (inflation-adjusted) reporting thresholds have gradually declined over time. When the Bank Secrecy Act rules were rolled out in 1972, banks were required to report currency transactions worth $10,000 or more. If that reporting threshold had been indexed to inflation, it would be around $74,000 today. Since it wasn’t indexed to inflation, banks must file many more reports today on transactions worth much less than those that would have triggered a reporting requirement in the past.
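The indexing arithmetic is straightforward. In the sketch below, the CPI values are approximate annual averages that I am supplying for illustration; they are not drawn from Anthony’s analysis.

```python
# What the 1972 reporting threshold would be if indexed to inflation.
# CPI-U values are approximate annual averages, supplied for illustration.
cpi_1972 = 41.8
cpi_2024 = 313.0
threshold_1972 = 10_000

indexed_threshold = threshold_1972 * (cpi_2024 / cpi_1972)
print(f"${indexed_threshold:,.0f}")  # ~$74,900, consistent with the ~$74,000 figure cited
```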

Other thresholds are even lower. For example, money-service businesses must obtain and record information for transactions worth just $3,000.

The government vigorously defends its ability to monitor transactions. It prosecutes those making transactions just below reporting thresholds, a separate crime called structuring. It seizes cash and collectibles, which make it more difficult to monitor transactions, even in cases where there is no evidence of criminal activity. And it undermines new financial privacy-protecting technologies.

Consider the government’s response to cryptocurrencies, some of which offer a high degree of financial privacy. The Financial Crimes Enforcement Network requires cryptocurrency exchanges to register as money-service businesses and comply with Know Your Customer requirements. If transactions can ultimately be traced through the blockchain to these on- and off-ramps, then the financial privacy that cryptocurrencies offer is largely eroded.

Consider the government’s response to cryptocurrency mixing services, which make it more difficult to trace one’s transactions back to an exchange where his or her identity may be discovered. The Office of Foreign Assets Control has added the wallet addresses of mixing services to the Specially Designated Nationals and Blocked Persons list, effectively making it illegal for Americans to employ those mixing services.

Why would a government work so hard to ensure it can monitor transactions just to turn around and issue a financial privacy-protecting CBDC? Again: it seems much more likely that the government would issue a CBDC that bolsters its ability to monitor transactions.

The ostensibly private messaging service ANOM serves as a useful comparison. ANOM was not private. Unbeknownst to its users, ANOM was actually the centerpiece of the Federal Bureau of Investigation’s Operation Trojan Shield. Messages sent using the ANOM app were not only delivered to recipients, but also to the FBI’s database.

The FBI maintains that it did not technically violate the Fourth Amendment by using a backdoor in the messaging app to snoop on US citizens, because it transferred the data to Lithuania, where foreigners would snoop on US citizens and then tip off the FBI when illegal activity was suspected. Think about that. The FBI developed the ability to spy on US citizens, promoted the use of the enabling technology, and then handed the data collected by this technology over to foreign nationals in order to circumvent the constitutional constraints designed to safeguard US citizens from such activities. These efforts not only undermined the due process afforded to criminals, though that would be bad enough; they also facilitated snooping on perfectly lawful messages. Some of those messages involved intimate details shared between romantic partners. Others involved protected conversations between attorneys and their clients.

If the government will build a backdoor into a messaging app — and has been caught trying to bribe engineers to install others — then one should expect it will build a backdoor into a payments app, as well.

Americans do not have much financial privacy today. We would have even less financial privacy if not for the private banking system firewall between the government and our transactions data. This firewall isn’t perfect. But it is better than nothing. 

To see how such a firewall promotes financial privacy, consider the Internal Revenue Service’s efforts to access Coinbase customer data in 2016. At the time, Coinbase was boasting that it had 5.9 million customers — many more than had reported crypto holdings to the IRS. Citing this discrepancy, the IRS secured a John Doe summons.

In 2017, I described the summons as follows:

Basically, the IRS wants any and all information that Coinbase has so that it can sift through that information for the slightest hint of misreporting. It has requested account registration information for all Coinbase account holders, including confirmed devices and payment methods; any agreements or instructions that grant third party access or control for any account; records of all payments processed by Coinbase for merchants; and all correspondence between Coinbase and its users regarding accounts.

Needless to say, the scope of the summons was very broad.

Recognizing the duty — and, perhaps more importantly, the profit motive — it had to protect its customers, Coinbase appealed. Eventually, the courts decided that Coinbase would have to hand over some customer data on around 13,000 high-transacting users.

Kraken has also resisted an overly broad summons to hand over customer data to the IRS, to similar effect.

I hold the old-fashioned view that, in a liberal democracy, the government should have to demonstrate probable cause before acquiring the authority and ability to sift through one’s financial records. The degree of financial privacy afforded by the current system certainly falls short of that standard. Nonetheless, it affords much more financial privacy than one could reasonably hope for if the government held the data, as would likely be the case with a CBDC.

Financial privacy is very important for a free society. What we do reveals much more about who we are than what we say. And what we do often requires making payments. In order to exercise our freedoms, we must be able to selectively share the details of our lives with others — and withhold such details from those who would otherwise use them to harm us.

We should take steps to bolster financial privacy in the United States. The introduction of a retail CBDC would be a step in the wrong direction.

President Franklin D. Roosevelt signs the Social Security Act. 1935. 

Donald Trump and Joe Biden began the campaign season by staying away from Social Security reform. Kamala Harris has only promised to strengthen the program, without providing details. Mr. Trump then proposed a truly bad idea and has refused to back down: eliminating income taxes on Social Security benefits.

The richest retirees receive the most Social Security and thereby put the most pressure on an already unsustainable budget. Eliminating the income tax on benefits would give them even more after-tax income while significantly reducing income tax revenue, at a time when it takes our country only 260 days to tack another trillion onto the national debt.

The Social Security program was vulnerable to demographic bubbles from the very beginning, and subsequent reforms have increasingly over-promised benefits, inviting the present insolvency. Voters are frustrated and losing confidence. They are looking for genuine leadership, not the “third rail of politics” détente we now have.

Harris and Trump now have an opportunity to provide such leadership. One thing could be done quickly to reduce the unfunded liability gap in Social Security. It’s easy to explain to voters, it will appeal to both younger and older voters, and it will especially appeal to those in the political middle who are looking for practical solutions rather than ideologically driven bumper-sticker slogans. It would behoove both candidates to jump on this reform proposal first.

In 1972, an amendment was passed to protect Social Security beneficiaries from the effects of inflation, but a mistake was made in implementing the cost-of-living adjustment (COLA) indexing of benefits. The formula over-accounted for inflation, raising the prospect of benefit levels soaring out of control as inflation worsened in the 1970s. In 1976, a Congressional panel led by Harvard economist William Hsiao was convened in part to correct the error. The panel also recommended that the initial benefits calculation employ price indexing rather than wage indexing, out of fear that the latter would produce an unsustainable budget. Unfortunately, wage indexing was chosen over price indexing.

This was a costly mistake, and we are still paying for it. As noted by Alex Durante in a recent Tax Foundation report,   

Had price indexing [rather than wage indexing] been implemented under Hsiao’s proposal, Social Security would have run surpluses every year from 1982 to 2023, except for 2021. There would have been temporary shortfalls starting in 2024, but by 2044, Social Security would have been running surpluses again. Surpluses in Social Security could permit a reduction in the tax rate or allow some of the revenue raised from payroll taxes to support Medicare, which is also running large deficits.

While this was a terrible missed opportunity, the main lesson is still valid: wage indexing makes benefits grow too fast for program stability. Luckily, it is not too late to take Hsiao’s advice.   

According to the Social Security Administration’s 2023 Trustees Report, adjusting the initial benefit calculation with a price index rather than a wage index would remove about 80 percent of the unfunded liability gap over the next 75 years, and that’s if instituted in 2029. The results are even more dramatic if we start sooner. That is a major gain with minimal pain.

Most voters don’t realize that Social Security benefits have been, and continue to be, rising in inflation-adjusted terms because of the wage indexing of the initial benefit calculation. When the economy is growing, wages normally grow faster than prices (that’s what produces growing real personal income over time). As a result, since 1977, each new class of Social Security recipients lives a little larger than the ones before.
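To see the compounding effect, consider a minimal sketch with hypothetical growth rates, assumed here only to illustrate the mechanism, not taken from SSA projections.

```python
# How wage indexing raises real initial benefits across retiree cohorts.
# Growth rates are hypothetical, chosen only to illustrate the mechanism.
years = 30
nominal_wage_growth = 0.035  # assumed average annual wage growth
price_inflation = 0.025      # assumed average annual inflation

real_growth = ((1 + nominal_wage_growth) / (1 + price_inflation)) ** years - 1
print(f"Real initial benefit after {years} years: {real_growth:.0%} higher")
# ~34% higher: under wage indexing, each retiring cohort starts out richer in
# real terms than the one before; price indexing would instead hold real
# initial benefits roughly constant across cohorts.
```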

This is very foolish.  

Young people are understandably worried about being cheated out of some of their Social Security benefits and about having the real value of the benefits they do receive eroded by inflation. They are not worried about whether they will get more from Social Security, in real terms, than their parents and grandparents did.

Most young people will happily support this reform because it provides strong assurance that they will get something they value greatly (a credible guarantee of not being impoverished in old age) in return for giving up something they don’t care about (getting more per dollar contributed than their parents and grandparents did).

This simple reform will not harm current retirees in any way, and it will bring tremendous relief to those who are ready to retire and are already uneasy about their 401(k)s, as well as to younger workers who are simply looking for fair treatment.

The media and voters should force the candidates to explain why they won’t pledge, now, to drop wage indexing to stabilize Social Security going forward.