Jason Riley describes the late Paul Ehrlich as “always wrong, never in doubt.” A slice:
Ehrlich had visited India and concluded that poor people were overbreeding. He believed that the developing world simply had “too many people” and calculated that the earth’s population needed to be cut in half. “The operation will demand many apparently brutal and heartless decisions,” and the “pain may be intense,” he cautioned, sounding like a cartoon villain. But it would be “coercion in a good cause.” Ehrlich urged wealthy nations to cut off food assistance to the Third World. He endorsed an Indian official’s proposal for “sterilizing all Indian males with three or more children.” It was for their own good, he insisted.
The world’s population grew, but famine on the scale that Ehrlich predicted never materialized. Within a decade, India not only produced enough food to feed itself, thanks to technological advances in agriculture that Ehrlich hadn’t anticipated, but was a net exporter of wheat. “Since 1900 the world has increased its population by 400 per cent; its cropland area by 30 per cent; its average yields by 400 per cent and its total crop harvest by 600 per cent,” Matt Ridley wrote in his 2010 book, “The Rational Optimist.” “So per capita food production has risen by 50 per cent.”
Making spectacularly wrong predictions of imminent catastrophe became something of a habit for Ehrlich over the decades. His dire forecasts about global cooling and warming were wide of the mark, a twofer. He speculated that the U.S. and Europe would be forced to ration food and encouraged couples to limit themselves to one or two children. In 1971, he said that by “the year 2000 the United Kingdom will be simply a small group of impoverished islands, inhabited by some 70 million hungry people.” Three years later, he predicted that “America’s economic joyride is coming to an end: there will be no more cheap, abundant energy, no more cheap abundant food.”
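The arithmetic in the Ridley passage quoted above is worth making explicit. Here's a quick back-of-the-envelope sketch, using only the rounded percentages in the quotation (the small gap from Ridley's 50 per cent figure presumably reflects rounding in his underlying data):

```python
# Sanity check of the Ridley figures quoted above.
# An "X per cent increase" means a multiple of (1 + X/100).
population = 1 + 400 / 100  # 5.0x the 1900 level
cropland   = 1 + 30 / 100   # 1.3x
yields     = 1 + 400 / 100  # 5.0x
harvest    = 1 + 600 / 100  # 7.0x (consistent with cropland * yields = 6.5x, after rounding)

per_capita = harvest / population  # 7 / 5 = 1.4
print(f"per-capita food production: {per_capita:.2f}x the 1900 level")
# -> 1.40x on these rounded inputs, i.e. a rise of roughly 40-50 per cent.
```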
Noah Rothman reports on Paul Ehrlich’s disastrous legacy. A slice:
Central to Ehrlich’s thesis in The Population Bomb was his contention that the Earth had a finite “carrying capacity,” and its limits were already being tested by the mid-20th century. Humanity would soon have to ration its resources and consign to triage those for whom it could no longer care.
Ehrlich’s modern Malthusianism fired the imaginations of the international environmental left, but he seemed compelled to forever up the ante on his dire predictions. He subsequently anticipated that, by 1980, the average American lifespan would decline to just 42. “Most of the people who are going to die in the greatest cataclysm in the history of man have already been born,” Ehrlich wrote in 1969. “The death rate will increase until at least 100-200 million people per year will be starving to death during the next ten years,” he declared the following year. By 1971, Ehrlich was willing to “take even money that England will not exist in the year 2000.” Roughly four billion people, he assumed, would die of starvation between 1980 and 1989 in what he deemed “the Great Die-Off.”
Sure, he got some of the “details and timing” of the events he predicted wrong, his allies will concede. But, to them, the eschatological gist of his work still rang true. “Population growth, along with over-consumption per capita, is driving civilization over the edge,” Ehrlich told The Guardian as recently as 2018, “billions of people are now hungry or micronutrient malnourished, and climate disruption is killing people.” With the confidence of a Marxian economist, Ehrlich never questioned his faith in where humanity’s addiction to prosperity was taking it. “As I’ve said many times, ‘perpetual growth is the creed of the cancer cell,’” he said.
When President Donald Trump implemented his Liberation Day tariffs last spring, the president’s senior adviser, Peter Navarro, suggested that these tariffs could generate an additional $700 billion a year.
At the time, I estimated that a more realistic estimate of the maximum additional revenue these tariffs could reap was likely less than $300 billion, which would barely fund two weeks of federal government spending.
Almost a year later, customs duties revenue data suggest that prediction was far closer to reality than what the administration was promising.
Tariff revenues did tick up during this 11-month period, averaging just under $27 billion a month from April 2025 through February 2026, or roughly $296 billion in cumulative tariff revenues since Liberation Day.
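The arithmetic here is simple enough to check. A minimal sketch, where the roughly $27 billion monthly average comes from the paragraph above and the roughly $7 trillion in annual federal outlays is my own assumption for the scale comparison:

```python
# Checking the cumulative tariff-revenue figure quoted above.
months = 11           # April 2025 through February 2026, inclusive
avg_monthly = 26.9    # "just under $27 billion" a month, in $ billions (assumed value)

cumulative = months * avg_monthly
print(f"cumulative tariff revenue: ~${cumulative:.0f} billion")  # ~$296 billion

# Scale comparison: at ~$7 trillion a year in federal outlays (an assumption),
# two weeks of spending is about $268 billion -- roughly the $300 billion
# that the tariffs were estimated to raise at most.
two_weeks = 7_000 * 14 / 365
print(f"two weeks of federal spending: ~${two_weeks:.0f} billion")
```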
To understand why America’s rare-earth problem is really a regulatory problem, it helps to understand what naturally occurring radioactive materials (NORM) actually are. Every piece of rock on earth contains trace amounts of every element, including radioactive ones. A mineral deposit is simply a place where a particular element is more concentrated than usual. Most of the time, the radioactive trace elements in a deposit stay with the waste rock when the ore is extracted, get dumped back in the hole, and nobody thinks about them again.
Rare earths are different. When you process rare earth ores, the naturally occurring thorium in the rock tends to concentrate alongside the rare earth elements you actually want. The processing step that gives you usable rare earth material also gives you concentrated thorium, which is radioactive. Concentrated radioactivity is, rightly, subject to strict regulation. The question is not whether to regulate it. The question is whether the regulations are sensible.
In the United States, they are not.
After numerous exchanges with Tim Worstall, a former metals trader and commodities analyst who spent decades working in rare-earth markets and who explained all of this with enormous patience (thank you, Tim), here’s what I have come to understand. One of the most important rare-earth feedstocks in the world is monazite, a mineral that occurs as a byproduct of common mining operations. When companies mine mineral sands for ilmenite and rutile, the feedstocks for titanium production (an industry operating at millions of tons per year), they produce monazite as a leftover. Monazite is loaded with rare-earth elements. But because processing it involves handling concentrated thorium, and because American NORM regulations make that legally treacherous for any company without specific legacy authorizations that almost no one has, the monazite either gets sold to China for a fraction of its value (because there is no other market) or is simply stockpiled on site indefinitely.
How stupid is that? A mineral containing exactly what America says it urgently needs sits in warehouses and waste piles across the country, made worthless by the regulations.
One common consequence of war is to make censorship more politically tempting, which makes it imperative to call out government efforts to chill free speech in moments like this.
President Donald Trump complained over the weekend on social media about a “misleading headline” that he said overstated the damage to United States warplanes from an Iranian strike in Saudi Arabia. His Federal Communications Commission chairman, Brendan Carr, sprang to attention, amplifying the president’s post and threatening broadcasters. Carr, who met with Trump at Mar-a-Lago on Saturday, said they will “lose their licenses” if they don’t “operate in the public interest.”
The message to broadcasters such as NBC, ABC and CBS is clear: They might face regulatory reprisals, up to being forced off the air, for casting the Iran war in a negative light — or as Carr puts it, engaging in “news distortions.” Presumably, coverage favorable to the war, even if distorted, would be in the public interest.
John O. McGinnis talks about his new book, Why Democracy Needs the Rich.
James Hartley praises Adam Smith’s “knack for synthesis.”
The Wall Street Journal’s Matthew Hennessey hits an important nail squarely on its head:
The Journal reports that 39 states now mandate high-school students take a personal-finance course as a graduation requirement.
Wonderful! Great! It’s good to learn how to live within your means. Young people need to hear about credit scores and nest eggs.
Unfortunately, economics is being edged aside to make room. To some ears that may sound like a wash. It’s actually a net loss.
The difference between economics and finance is the difference between physics and engineering. One is general, the other specific. If it’s easier to conceptualize, think of the difference between a literature course and a journalism class.
Economics is a social science, which means it’s fundamentally an attempt to better understand human behavior. It’s concerned with production and consumption. It teaches students to think about margins and trade-offs.
Finance is more concrete, pragmatic, practical. It’s budgeting and planning, savings and investment.
Studying economics makes you a more thoughtful decision-maker. Studying personal finance makes you a better manager of money and credit. Each has its place, but when teaching young people, we generally start with concepts before moving to real-world applications, right? We want students to walk before they run.
Basic economic education is essential. In an ideal world every American would get a rudimentary grounding in the principles of supply and demand while still in short pants.
Middle-school students would learn about scarcity, incentives and opportunity costs alongside their algebra and Earth science. High schoolers would read Adam Smith as well as Shakespeare.
That this isn’t already standard practice has contributed to an epidemic of economic ignorance. The consequences are everywhere—at least two generations of Americans who don’t understand what prices are and how they’re set, who are ignorant of how wealth is created, who see markets as a rigged game, who think that national prosperity is a fixed pie, who believe changes in tax rates have no effect on behavior, who think you can just freeze the rent . . . the list goes on.
Teaching kids how to be responsible in their personal financial lives is important. But let’s not stop teaching them why it matters. That’s a trade-off we’ll live to regret.


The 2009 American Recovery and Reinvestment Act is probably the single piece of legislation with the most provisions expanding the safety net. To name a few: it increased unemployment and food stamp benefits; it expanded eligibility for both programs with its “alternative base period” calculation of the unemployment benefit and by granting states relief from the food stamp program’s work requirements; it federally funded extended unemployment benefits, so that employers would not have to pay for the extended benefits received by their former employees. The act’s “recovery” and “stimulus” monikers are ironic because, like other legislation that expanded the safety net, the transfer provisions of the act helped keep labor hours low after the act went into effect.
The policymakers behind Social Security took it upon themselves to manage the future and savings of all Americans intelligently and rationally. But what they set in place was a system that would eventually bind the coming generations to promises they could not reasonably afford. It was, in other words, the foundational political program of the twentieth century – well-meaning, choice-eliminating, and ignoring obvious secondary effects.
[I]t hath been the Observation of many Ages, that Bigotry and Industry, Manufactures and Persecution, cannot possibly subsist together, or cohabit in the same Country.
As a moral philosopher, Smith was concerned about the nature of moral excellence. But like many other Enlightenment intellectuals, he tried to begin by describing man as he really is. His conception of man was not as an intrinsically good creature corrupted by society, nor as a creature irredeemably evil but for the grace of God. His project was to take man as he is and to make him more like what he is capable of becoming, not by exerting government power and not primarily by preaching, but by discovering the institutions that make men tolerably decent and may make them more so.
