Crash and War Anger

All of us love validation – especially when it comes from an admired source.

That's the way I feel after reading the NYT review by Fareed Zakaria, possibly my most admired journalist.

The review is of a book called Crashed, by an eminent scholar writing about the consequences of the crash of 2008. The review is below.

It validates my deep belief that the seeds of Trump's victory go back to the "crash" of 2008. It was a moment of major negative "reset" for far too many Americans. Their savings, or their employability, or their home values, or their prospects for credit changed so negatively that it created an emergent body politic. The new body politic was characterized by a primary sentiment: seething anger. More importantly, it was characterized by a call to action: "throw the bums out!"

The deep irony here is that democracy handed the angry a “throw the bums out” choice that many didn’t want – Barack Obama.

But their anger at inside-the-beltway Republicans, and George W. Bush, was so strong that – inside the ballot box – they pulled the lever for Obama.

When Donald Trump had the courage to viciously criticize the Republican establishment, and especially Bush, he was speaking directly to this new body politic. If their sentiment was resentment, they found their gladiator in Trump.

My only beef with the book and the review is that they do not go back far enough.

I believe the seeds of Trump's victory go back to 9/11. It was that fateful day that created a new body politic, whose primary sentiment was "We are under attack and we must fight back."

George W. Bush was responding to that scary new sentiment when he announced not just one, but two new wars. History will record that the Iraq War – which cost trillions – was a major mistake. History will be somewhat kinder to the war in Afghanistan, which led the nation into a massively expensive 15+ year engagement of limited success and many, many unintended consequences.

So my point is that 9/11 rained holy hell on the nation. The new body politic of "we are under attack and we must fight back" pushed a very weak leader, George W. Bush, to start two wars that almost immediately looked incompetent and wrong.

The 2008 crash was the final straw. Two stupid wars and a major economic reset were enough to push most Americans over the edge to a seething anger and a “throw the bums out” call to action.

It took the decade after to weave a tapestry of cause and effect, supported by right-wing media. Never mind that most of the tapestry was a fabrication. Never mind that Trump was a serial liar. It was soothing to have a gladiator who spoke the truth about the subjects that really mattered: "those folks in Washington don't know what they are doing and they need to go"; "you are being screwed by the economic resets"; and "the war in Iraq was a major mistake".

There is very little question in my mind that Donald Trump will go down in history as our worst president. He will be remembered for his failures, his indecency and his lack of integrity. He will be remembered for his failures abroad, where he embarrasses us and plays the fool, and his failures at home, where he depletes the treasury and breaks the back of the Affordable Care Act. Everywhere he goes, he does what is bad and undoes decades of progress in defining what is good, e.g., environmental regulation. His indecency and his lack of integrity will leave lasting scars on the office, but hopefully schools and parents will now have an example of what not to be, and how not to act.

How did the nation get to this horrible outcome? We will only have perspective on this decades from now, but the "second draft of history", to me, traces it all back to 9/11 and its two awful wars. The cruel irony was that after eight Bush years of misguided foreign adventurism, the American economy collapsed. It was the straw that broke the camel's back, causing all of us to say "we are mad as hell and we need to throw the bums out!"

So the math, looking back thirty years from now, might well be:

Afghan war + Iraq war + economic crash = Obama
(Obama was the backlash. We threw the bums out for him, and thank God he was as level-headed and as competent and decent as he was)

Economic reset for most Americans + unresolved racist and nationalistic impulses + Comey + Russia = Trump
(Trump was the backlash to Obama, supported by all events above)
================
CRASHED

CREDIT: https://www.nytimes.com/2018/08/10/books/review/adam-tooze-crashed.html?rref=collection%2Fsectioncollection%2Fbook-review&action=click&contentCollection=review&region=rank&module=package&version=highlights&contentPlacement=1&pgtype=sectionfront
NONFICTION
Looking Back at the Economic Crash of 2008

By Fareed Zakaria

Aug. 10, 2018

CRASHED
How a Decade of Financial Crises Changed the World

By Adam Tooze
706 pp. Viking. $35.

Steve Bannon can date the start of the Trump “revolution.” When I interviewed him for CNN in May, in Rome, he explained that the origins of Trump’s victory could be found 10 years ago, in the financial crisis of 2008. “The implosion of those world capital markets has never really been sorted out,” he told me. “The fuse that was lit then that eventually brought the Trump revolution is the same thing that’s happened here in Italy.” (Italy had just held elections in which populist forces had won 50 percent of the vote.) Adam Tooze would likely agree. An economic historian at Columbia University, he has written a detailed account of the financial shocks and their aftereffects, which, his subtitle asserts, “changed the world.”

If journalism is the first rough draft of history, Tooze’s book is the second draft. A distinguished scholar with a deep grasp of financial markets, Tooze knows that it is a challenge to gain perspective on events when they have not yet played out. He points out that a 10-year-old history of the crash of 1929 would have been written in 1939, when most of its consequences were ongoing and unresolved. But still he has persisted and produced an intelligent explanation of the mechanisms that produced the crisis and the response to it. We continue to live with the consequences of both today.

As is often the case with financial crashes, markets and experts alike turned out to have been focused on the wrong things, blind to the true problem that was metastasizing. By 2007, many were warning about a dangerous fragility in the system. But they worried about America’s gargantuan government deficits and debt — which had exploded as a result of the Bush administration’s tax cuts and increased spending after 9/11. It was an understandable focus. The previous decade had been littered with collapses when a country borrowed too much and its creditors finally lost faith in it — from Mexico in 1994 to Thailand, Malaysia and South Korea in 1997 to Russia in 1998. In particular, many fretted about the identity of America’s chief foreign creditor — the government of China. Yet it was not a Chinese sell-off of American debt that triggered the crash, but rather, as Tooze writes, a problem “fully native to Western capitalism — a meltdown on Wall Street driven by toxic securitized subprime mortgages.”

Tooze calls it a problem in "Western capitalism" intentionally. It was not just an American problem. When it began, many saw it as such and dumped the blame on Washington. In September 2008, as Wall Street burned, the German finance minister Peer Steinbrück explained that the collapse was centered in the United States because of America's "simplistic" and "dangerous" laissez-faire approach. Italy's finance minister assured the world that its banking system was stable because "it did not speak English."

In fact this was nonsense. One of the great strengths of Tooze’s book is to demonstrate the deeply intertwined nature of the European and American financial systems. In 2006, European banks generated a third of America’s riskiest privately issued mortgage-backed securities. By 2007, two-thirds of commercial paper issued was sponsored by a European financial entity. The enormous expansion of the global financial system had largely been a trans-Atlantic project, with European banks jumping in as eagerly and greedily to find new sources of profit as American banks. European regulators were as blind to the mounting problems as their American counterparts, which led to problems on a similar scale. “Between 2001 and 2006,” Tooze writes, “Greece, Finland, Sweden, Belgium, Denmark, the U.K., France, Ireland and Spain all experienced real estate booms more severe than those that energized the United States.”

But while the crisis may have been caused in both America and Europe, it was solved largely by Washington. Partly, this reflected the post-Cold War financial system, in which the dollar had become the hyperdominant global currency and, as a result, the Federal Reserve had truly become the world’s central bank. But Tooze also convincingly shows that the European Central Bank mismanaged things from the start. The Fed acted aggressively and also in highly ingenious ways, becoming a guarantor of last resort to the battered balance sheets of American but also European banks. About half the liquidity support the Fed provided during the crisis went to European banks, Tooze observes.

Before the rescue and even in its early stages, the global economy was falling into a bottomless abyss. In the first months after the panic on Wall Street, world trade and industrial production fell at least as fast as they did during the first months of the Great Depression. Global capital flows declined by a staggering 90 percent. The Federal Reserve, with some assistance from other central banks, arrested this decline. The Obama fiscal stimulus also helped to break the fall. Tooze points out that almost all serious analyses of the stimulus conclude that it played a significant positive role. In fact, most experts believe it ended much too soon. He also points out that large parts of the so-called Obama stimulus were the result of automatic government spending, like unemployment insurance, that would have happened no matter who was president. And finally, he notes that China, with its own gigantic stimulus, created an oasis of growth in an otherwise stagnant global economy.

The rescue worked better than almost anyone imagined. It is worth recalling that none of the dangers confidently prophesied by legions of critics took place. There was no run on the dollar or American treasuries, no hyperinflation, no double-dip recession, no China crash. American banks stabilized and in fact prospered, households began saving again, growth returned slowly but surely. The governing elite did not anticipate the crisis — as few elites have over hundreds of years of capitalism. But once it happened, many of them — particularly in America — acted quickly and intelligently, and as a result another Great Depression was averted. The system worked, as Daniel Drezner notes in his own book of that title.

But therein lies the unique feature of the crash of 2008. Unlike that of 1929, it was not followed by a Great Depression. It was not so much the crisis as the rescue and its economic, political and social consequences that mattered most. On the left, the entire episode discredited the market-friendly policies of Tony Blair, Bill Clinton and Gerhard Schröder, disheartening the center-left and emboldening those who want more government intervention in the economy in all kinds of ways. On the right, it became a rallying cry against bailouts and the Fed, buoying an imaginary free-market alternative to government intervention. Unlike in the 1930s, when the libertarian strategy was tried and only deepened the Depression, in the last 10 years it has been possible for the right to argue against the bailouts, secure in the knowledge that their proposed policies will never actually be implemented.

Bannon is right. The crash brought together many forces that were around anyway — stagnant wages, widening inequality, anger about immigration and, above all, a deep distrust of elites and government — and supercharged them. The result has been a wave of nationalism, protectionism and populism in the West today. A confirmation of this can be found in the one major Western country that did not have a financial crisis and has little populism in its wake — Canada.

The facts remain: No government handled the crisis better than that of the United States, which acted in a surprisingly bipartisan fashion in late 2008 and almost seamlessly coordinated policy between the outgoing Bush and incoming Obama administrations. And yet, the backlash to the bailouts has produced the most consequential result in the United States.

Tooze notes in his concluding chapter that experts are considering the new vulnerabilities of a global economy with many new participants, especially the behemoth in Beijing. But instead of a challenge from an emerging China that began its rise outside the economic and political system, we are confronting a quite different problem — an erratic, unpredictable United States led by a president who seems inclined to redo or even scrap the basic architecture of the system that America has painstakingly built since 1945. How will the world handle this unexpected development? What will be its outcome? This is the current crisis that we will live through and that historians will soon analyze.

Fareed Zakaria is a CNN anchor, a Washington Post columnist and the author of “The Post American World.”

Our miserable 21st century

Below is dense – but worth it. It is written by a conservative, but an honest one.

It is the best documentation I have found on the thesis that I wrote about last year: that the 21st century economy is a structural mess, and the mess is a non-partisan one!

My basic contention is really simple:

9/11 diverted us from this issue, and then …
we compounded the diversion with two idiotic wars, and then …
we compounded the diversion further with an idiotic, devastating recession, and then …
we started to stabilize, which is why President Obama goes to the head of the class, and then …
we built a three-ring circus, and elected a clown as the ringmaster.

While we watch this three-ring circus in Washington, no one is paying attention to this structural problem in the economy….so we are wasting time, when we should be tackling this central issue of our time. It's a really complicated one, and there are no easy answers (sorry Trump and Bernie Sanders).

PUT YOUR POLITICAL ARTILLERY DOWN AND READ ON …..

=======BEGIN=============

CREDIT: https://www.commentarymagazine.com/articles/our-miserable-21st-century/

Our Miserable 21st Century
From work to income to health to social mobility, the year 2000 marked the beginning of what has become a distressing era for the United States
NICHOLAS N. EBERSTADT / FEB. 15, 2017

On the morning of November 9, 2016, America's elite—its talking and deciding classes—woke up to a country they did not know. To most privileged and well-educated Americans, especially those living in its bicoastal bastions, the election of Donald Trump had been a thing almost impossible even to imagine. What sort of country would go and elect someone like Trump as president? Certainly not one they were familiar with, or understood anything about.

Whatever else it may or may not have accomplished, the 2016 election was a sort of shock therapy for Americans living within what Charles Murray famously termed “the bubble” (the protective barrier of prosperity and self-selected associations that increasingly shield our best and brightest from contact with the rest of their society). The very fact of Trump’s election served as a truth broadcast about a reality that could no longer be denied: Things out there in America are a whole lot different from what you thought.

Yes, things are very different indeed these days in the “real America” outside the bubble. In fact, things have been going badly wrong in America since the beginning of the 21st century.

It turns out that the year 2000 marks a grim historical milestone of sorts for our nation. For whatever reasons, the Great American Escalator, which had lifted successive generations of Americans to ever higher standards of living and levels of social well-being, broke down around then—and broke down very badly.

The warning lights have been flashing, and the klaxons sounding, for more than a decade and a half. But our pundits and prognosticators and professors and policymakers, ensconced as they generally are deep within the bubble, were for the most part too distant from the distress of the general population to see or hear it. (So much for the vaunted “information era” and “big-data revolution.”) Now that those signals are no longer possible to ignore, it is high time for experts and intellectuals to reacquaint themselves with the country in which they live and to begin the task of describing what has befallen the country in which we have lived since the dawn of the new century.

II
Consider the condition of the American economy. In some circles it is still widely believed, as one recent New York Times business-section article cluelessly insisted before the inauguration, that "Mr. Trump will inherit an economy that is fundamentally solid." But this is patent nonsense. By now it should be painfully obvious that the U.S. economy has been in the grip of deep dysfunction since the dawn of the new century. And in retrospect, it should also be apparent that America's strange new economic maladies were almost perfectly designed to set the stage for a populist storm.

Ever since 2000, basic indicators have offered oddly inconsistent readings on America’s economic performance and prospects. It is curious and highly uncharacteristic to find such measures so very far out of alignment with one another. We are witnessing an ominous and growing divergence between three trends that should ordinarily move in tandem: wealth, output, and employment. Depending upon which of these three indicators you choose, America looks to be heading up, down, or more or less nowhere.

From the standpoint of wealth creation, the 21st century is off to a roaring start. By this yardstick, it looks as if Americans have never had it so good and as if the future is full of promise. Between early 2000 and late 2016, the estimated net worth of American households and nonprofit institutions more than doubled, from $44 trillion to $90 trillion. (SEE FIGURE 1.)

Although that wealth is not evenly distributed, it is still a fantastic sum of money—an average of over a million dollars for every notional family of four. This upsurge of wealth took place despite the crash of 2008—indeed, private wealth holdings are over $20 trillion higher now than they were at their pre-crash apogee. The value of American real-estate assets is near or at all-time highs, and America’s businesses appear to be thriving. Even before the “Trump rally” of late 2016 and early 2017, U.S. equities markets were hitting new highs—and since stock prices are strongly shaped by expectations of future profits, investors evidently are counting on the continuation of the current happy days for U.S. asset holders for some time to come.

A rather less cheering picture, though, emerges if we look instead at real trends for the macro-economy. Here, performance since the start of the century might charitably be described as mediocre, and prospects today are no better than guarded.

The recovery from the crash of 2008—which unleashed the worst recession since the Great Depression—has been singularly slow and weak. According to the Bureau of Economic Analysis (BEA), it took nearly four years for America’s gross domestic product (GDP) to re-attain its late 2007 level. As of late 2016, total value added to the U.S. economy was just 12 percent higher than in 2007. (SEE FIGURE 2.) The situation is even more sobering if we consider per capita growth. It took America six and a half years—until mid-2014—to get back to its late 2007 per capita production levels. And in late 2016, per capita output was just 4 percent higher than in late 2007—nine years earlier. By this reckoning, the American economy looks to have suffered something close to a lost decade.

But there was clearly trouble brewing in America’s macro-economy well before the 2008 crash, too. Between late 2000 and late 2007, per capita GDP growth averaged less than 1.5 percent per annum. That compares with the nation’s long-term postwar 1948–2000 per capita growth rate of almost 2.3 percent, which in turn can be compared to the “snap back” tempo of 1.1 percent per annum since per capita GDP bottomed out in 2009. Between 2000 and 2016, per capita growth in America has averaged less than 1 percent a year. To state it plainly: With postwar, pre-21st-century rates for the years 2000–2016, per capita GDP in America would be more than 20 percent higher than it is today.
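
To gauge that last claim, here is a minimal arithmetic sketch (Python, treating the growth rates cited above as round approximations rather than exact BEA figures):

# Compound the postwar trend rate vs. the actual 2000-2016 rate.
postwar_rate = 0.023   # long-term 1948-2000 per capita growth, per the text
actual_rate = 0.010    # approximate 2000-2016 per capita growth
years = 16

counterfactual = (1 + postwar_rate) ** years
actual = (1 + actual_rate) ** years
gap = counterfactual / actual - 1
print(f"Per capita GDP gap vs. postwar trend: {gap:.1%}")
# -> roughly 23%, consistent with "more than 20 percent higher"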

The reasons for America's newly fitful and halting macroeconomic performance are still a puzzlement to economists and a subject of considerable contention and debate.1 Economists are generally in consensus, however, in one area: They have begun redefining the growth potential of the U.S. economy downwards. The U.S. Congressional Budget Office (CBO), for example, suggests that the "potential growth" rate for the U.S. economy at full employment of factors of production has now dropped below 1.7 percent a year, implying a sustainable long-term annual per capita economic growth rate for America today of well under 1 percent.

Then there is the employment situation. If 21st-century America’s GDP trends have been disappointing, labor-force trends have been utterly dismal. Work rates have fallen off a cliff since the year 2000 and are at their lowest levels in decades. We can see this by looking at the estimates by the Bureau of Labor Statistics (BLS) for the civilian employment rate, the jobs-to-population ratio for adult civilian men and women. (SEE FIGURE 3.) Between early 2000 and late 2016, America’s overall work rate for Americans age 20 and older underwent a drastic decline. It plunged by almost 5 percentage points (from 64.6 to 59.7). Unless you are a labor economist, you may not appreciate just how severe a falloff in employment such numbers attest to. Postwar America never experienced anything comparable.

From peak to trough, the collapse in work rates for U.S. adults between 2008 and 2010 was roughly twice the amplitude of what had previously been the country’s worst postwar recession, back in the early 1980s. In that previous steep recession, it took America five years to re-attain the adult work rates recorded at the start of 1980. This time, the U.S. job market has as yet, in early 2017, scarcely begun to claw its way back up to the work rates of 2007—much less back to the work rates from early 2000.

As may be seen in Figure 3, U.S. adult work rates never recovered entirely from the recession of 2001—much less the crash of ’08. And the work rates being measured here include people who are engaged in any paid employment—any job, at any wage, for any number of hours of work at all.

On Wall Street and in some parts of Washington these days, one hears that America has gotten back to “near full employment.” For Americans outside the bubble, such talk must seem nonsensical. It is true that the oft-cited “civilian unemployment rate” looked pretty good by the end of the Obama era—in December 2016, it was down to 4.7 percent, about the same as it had been back in 1965, at a time of genuine full employment. The problem here is that the unemployment rate only tracks joblessness for those still in the labor force; it takes no account of workforce dropouts. Alas, the exodus out of the workforce has been the big labor-market story for America’s new century. (At this writing, for every unemployed American man between 25 and 55 years of age, there are another three who are neither working nor looking for work.) Thus the “unemployment rate” increasingly looks like an antique index devised for some earlier and increasingly distant war: the economic equivalent of a musket inventory or a cavalry count.
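
The distinction drawn above can be made concrete with a toy example (all numbers made up for illustration, not BLS data): the unemployment rate ignores workforce dropouts, while the work rate counts them.

# Toy population of 1,000 adults.
adult_population = 1000
employed = 700
unemployed_looking = 35    # jobless and actively seeking work
dropouts = 265             # neither working nor looking for work

labor_force = employed + unemployed_looking   # dropouts excluded
unemployment_rate = unemployed_looking / labor_force
work_rate = employed / adult_population       # dropouts included

print(f"Unemployment rate: {unemployment_rate:.1%}")  # ~4.8% - looks like full employment
print(f"Work rate:         {work_rate:.1%}")          # 70.0% - tells a bleaker story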

By the criterion of adult work rates, by contrast, employment conditions in America remain remarkably bleak. From late 2009 through early 2014, the country’s work rates more or less flatlined. So far as can be told, this is the only “recovery” in U.S. economic history in which that basic labor-market indicator almost completely failed to respond.

Since 2014, there has finally been a measure of improvement in the work rate—but it would be unwise to exaggerate the dimensions of that turnaround. As of late 2016, the adult work rate in America was still at its lowest level in more than 30 years. To put things another way: If our nation’s work rate today were back up to its start-of-the-century highs, well over 10 million more Americans would currently have paying jobs.

There is no way to sugarcoat these awful numbers. They are not a statistical artifact that can be explained away by population aging, or by increased educational enrollment for adult students, or by any other genuine change in contemporary American society. The plain fact is that 21st-century America has witnessed a dreadful collapse of work.

For an apples-to-apples look at America's 21st-century jobs problem, we can focus on the 25–54 population—known to labor economists for self-evident reasons as the "prime working age" group. For this key labor-force cohort, work rates in late 2016 were down almost 4 percentage points from their year-2000 highs. That is a jobs gap approaching 5 million for this group alone.

It is not only that work rates for prime-age males have fallen since the year 2000—they have, but the collapse of work for American men is a tale that goes back at least half a century. (I wrote a short book last year about this sad saga.2) What is perhaps more startling is the unexpected and largely unnoticed fall-off in work rates for prime-age women. In the U.S. and all other Western societies, postwar labor markets underwent an epochal transformation. After World War II, work rates for prime women surged, and continued to rise—until the year 2000. Since then, they too have declined. Current work rates for prime-age women are back to where they were a generation ago, in the late 1980s. The 21st-century U.S. economy has been brutal for male and female laborers alike—and the wreckage in the labor market has been sufficiently powerful to cancel, and even reverse, one of our society’s most distinctive postwar trends: the rise of paid work for women outside the household.

In our era of no more than indifferent economic growth, 21st-century America has somehow managed to produce markedly more wealth for its wealthholders even as it provided markedly less work for its workers. And trends for paid hours of work look even worse than the work rates themselves. Between 2000 and 2015, according to the BEA, total paid hours of work in America increased by just 4 percent (as against a 35 percent increase for 1985–2000, the 15-year period immediately preceding this one). Over the 2000–2015 period, however, the adult civilian population rose by almost 18 percent—meaning that paid hours of work per adult civilian have plummeted by a shocking 12 percent thus far in our new American century.
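
A quick check of that last figure, using only the two percentages just cited (a back-of-the-envelope Python sketch, not the underlying BEA series):

# Paid hours per adult = total hours / adult population, indexed to 2000.
hours_growth = 1.04        # total paid hours, 2015 vs. 2000 (+4%)
population_growth = 1.18   # adult civilian population, 2015 vs. 2000 (+18%)

change = hours_growth / population_growth - 1
print(f"Change in paid hours per adult: {change:.1%}")
# -> about -11.9%, matching the "shocking 12 percent" decline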

This is the terrible contradiction of economic life in what we might call America’s Second Gilded Age (2000—). It is a paradox that may help us understand a number of overarching features of our new century. These include the consistent findings that public trust in almost all U.S. institutions has sharply declined since 2000, even as growing majorities hold that America is “heading in the wrong direction.” It provides an immediate answer to why overwhelming majorities of respondents in public-opinion surveys continue to tell pollsters, year after year, that our ever-richer America is still stuck in the middle of a recession. The mounting economic woes of the “little people” may not have been generally recognized by those inside the bubble, or even by many bubble inhabitants who claimed to be economic specialists—but they proved to be potent fuel for the populist fire that raged through American politics in 2016.

III
So general economic conditions for many ordinary Americans—not least of these, Americans who did not fit within the academy's designated victim classes—have been rather more insecure than those within the comfort of the bubble understood. But the anxiety, dissatisfaction, anger, and despair that rage within our borders today are not wholly a reaction to the way our economy is misfiring. On the nonmaterial front, it is likewise clear that many things in our society are going wrong and yet seem beyond our powers to correct.

Some of these gnawing problems are by no means new: A number of them (such as family breakdown) can be traced back at least to the 1960s, while others are arguably as old as modernity itself (anomie and isolation in big anonymous communities, secularization and the decline of faith). But a number have roared down upon us by surprise since the turn of the century—and others have redoubled with fearsome new intensity since roughly the year 2000.

American health conditions seem to have taken a seriously wrong turn in the new century. It is not just that overall health progress has been shockingly slow, despite the trillions we devote to medical services each year. (Which “Cold War babies” among us would have predicted we’d live to see the day when life expectancy in East Germany was higher than in the United States, as is the case today?)

Alas, the problem is not just slowdowns in health progress—there also appears to have been positive retrogression for broad and heretofore seemingly untroubled segments of the national population. A short but electrifying 2015 paper by Anne Case and Nobel Economics Laureate Angus Deaton talked about a mortality trend that had gone almost unnoticed until then: rising death rates for middle-aged U.S. whites. By Case and Deaton's reckoning, death rates rose slightly over the 1999–2013 period for all non-Hispanic white men and women 45–54 years of age—but they rose sharply for those with high-school degrees or less, and for this less-educated grouping most of the rise in death rates was accounted for by suicides, chronic liver cirrhosis, and poisonings (including drug overdoses).

Though some researchers, for highly technical reasons, suggested that the mortality spike might not have been quite as sharp as Case and Deaton reckoned, there is little doubt that the spike itself has taken place. Health has been deteriorating for a significant swath of white America in our new century, thanks in large part to drug and alcohol abuse. All this sounds a little too close for comfort to the story of modern Russia, with its devastating vodka- and drug-binging health setbacks. Yes: It can happen here, and it has. Welcome to our new America.

In December 2016, the Centers for Disease Control and Prevention (CDC) reported that for the first time in decades, life expectancy at birth in the United States had dropped very slightly (to 78.8 years in 2015, from 78.9 years in 2014). Though the decline was small, it was statistically meaningful—rising death rates were characteristic of males and females alike; of blacks and whites and Latinos together. (Only black women avoided mortality increases—their death levels were stagnant.) A jump in “unintentional injuries” accounted for much of the overall uptick.

It would be unwarranted to place too much portent in a single year's mortality changes; slight annual drops in U.S. life expectancy have occasionally been registered in the past, too, followed by continued improvements. But given other developments we are witnessing in our new America, we must wonder whether the 2015 decline in life expectancy is just a blip, or the start of a new trend. We will find out soon enough. It cannot be encouraging, though, that the Human Mortality Database, an international consortium of demographers who vet national data to improve comparability between countries, has suggested that health progress in America essentially ceased in 2012—that the U.S. gained on average only about a single day of life expectancy at birth between 2012 and 2014, before the 2015 turndown.

The opioid epidemic of pain pills and heroin that has been ravaging and shortening lives from coast to coast is a new plague for our new century. The terrifying novelty of this particular drug epidemic, of course, is that it has gone (so to speak) “mainstream” this time, effecting breakout from disadvantaged minority communities to Main Street White America. By 2013, according to a 2015 report by the Drug Enforcement Administration, more Americans died from drug overdoses (largely but not wholly opioid abuse) than from either traffic fatalities or guns. The dimensions of the opioid epidemic in the real America are still not fully appreciated within the bubble, where drug use tends to be more carefully limited and recreational. In Dreamland, his harrowing and magisterial account of modern America’s opioid explosion, the journalist Sam Quinones notes in passing that “in one three-month period” just a few years ago, according to the Ohio Department of Health, “fully 11 percent of all Ohioans were prescribed opiates.” And of course many Americans self-medicate with licit or illicit painkillers without doctors’ orders.

In the fall of 2016, Alan Krueger, former chairman of the President’s Council of Economic Advisers, released a study that further refined the picture of the real existing opioid epidemic in America: According to his work, nearly half of all prime working-age male labor-force dropouts—an army now totaling roughly 7 million men—currently take pain medication on a daily basis.

We already knew from other sources (such as BLS “time use” surveys) that the overwhelming majority of the prime-age men in this un-working army generally don’t “do civil society” (charitable work, religious activities, volunteering), or for that matter much in the way of child care or help for others in the home either, despite the abundance of time on their hands. Their routine, instead, typically centers on watching—watching TV, DVDs, Internet, hand-held devices, etc.—and indeed watching for an average of 2,000 hours a year, as if it were a full-time job. But Krueger’s study adds a poignant and immensely sad detail to this portrait of daily life in 21st-century America: In our mind’s eye we can now picture many millions of un-working men in the prime of life, out of work and not looking for jobs, sitting in front of screens—stoned.

But how did so many millions of un-working men, whose incomes are limited, manage en masse to afford a constant supply of pain medication? Oxycontin is not cheap. As Dreamland carefully explains, one main mechanism today has been the welfare state: more specifically, Medicaid, Uncle Sam’s means-tested health-benefits program. Here is how it works (we are with Quinones in Portsmouth, Ohio):

[The Medicaid card] pays for medicine—whatever pills a doctor deems that the insured patient needs. Among those who receive Medicaid cards are people on state welfare or on a federal disability program known as SSI. . . . If you could get a prescription from a willing doctor—and Portsmouth had plenty of them—Medicaid health-insurance cards paid for that prescription every month. For a three-dollar Medicaid co-pay, therefore, addicts got pills priced at thousands of dollars, with the difference paid for by U.S. and state taxpayers. A user could turn around and sell those pills, obtained for that three-dollar co-pay, for as much as ten thousand dollars on the street.

In 21st-century America, “dependence on government” has thus come to take on an entirely new meaning.

You may now wish to ask: What share of prime-working-age men these days are enrolled in Medicaid? According to the Census Bureau’s SIPP survey (Survey of Income and Program Participation), as of 2013, over one-fifth (21 percent) of all civilian men between 25 and 55 years of age were Medicaid beneficiaries. For prime-age people not in the labor force, the share was over half (53 percent). And for un-working Anglos (non-Hispanic white men not in the labor force) of prime working age, the share enrolled in Medicaid was 48 percent.

By the way: Of the entire un-working prime-age male Anglo population in 2013, nearly three-fifths (57 percent) were reportedly collecting disability benefits from one or more government disability programs. Disability checks and means-tested benefits cannot support a lavish lifestyle. But they can offer a permanent alternative to paid employment, and for growing numbers of American men, they do. The rise of these programs has coincided with the death of work for larger and larger numbers of American men not yet of retirement age. We cannot say that these programs caused the death of work for millions upon millions of younger men: What is incontrovertible, however, is that they have financed it—just as Medicaid inadvertently helped finance America's immense and increasing appetite for opioids in our new century.

It is intriguing to note that America’s nationwide opioid epidemic has not been accompanied by a nationwide crime wave (excepting of course the apparent explosion of illicit heroin use). Just the opposite: As best can be told, national victimization rates for violent crimes and property crimes have both reportedly dropped by about two-thirds over the past two decades.3 The drop in crime over the past generation has done great things for the general quality of life in much of America. There is one complication from this drama, however, that inhabitants of the bubble may not be aware of, even though it is all too well known to a great many residents of the real America. This is the extraordinary expansion of what some have termed America’s “criminal class”—the population sentenced to prison or convicted of felony offenses—in recent decades. This trend did not begin in our century, but it has taken on breathtaking enormity since the year 2000.

Most well-informed readers know that the U.S. currently has a higher share of its populace in jail or prison than almost any other country on earth, that Barack Obama and others talk of our criminal-justice process as "mass incarceration," and that well over 2 million men were in prison or jail in recent years.4 But only a tiny fraction of all living Americans ever convicted of a felony is actually incarcerated at this very moment. Quite the contrary: Maybe 90 percent of all sentenced felons today are out of confinement and living more or less among us. The reason: the basic arithmetic of sentencing and incarceration in America today. Correctional release and sentenced community supervision (probation and parole) guarantee a steady annual "flow" of convicted felons back into society to augment the very considerable "stock" of felons and ex-felons already there. And this "stock" is by now truly enormous.

One forthcoming demographic study by Sarah Shannon and five other researchers estimates that the cohort of current and former felons in America very nearly reached 20 million by the year 2010. If its estimates are roughly accurate, and if America’s felon population has continued to grow at more or less the same tempo traced out for the years leading up to 2010, we would expect it to surpass 23 million persons by the end of 2016 at the latest. Very rough calculations might therefore suggest that at this writing, America’s population of non-institutionalized adults with a felony conviction somewhere in their past has almost certainly broken the 20 million mark by the end of 2016. A little more rough arithmetic suggests that about 17 million men in our general population have a felony conviction somewhere in their CV. That works out to one of every eight adult males in America today.

We have to use rough estimates here, rather than precise official numbers, because the government does not collect any data at all on the size or socioeconomic circumstances of this population of 20 million, and never has. Amazing as this may sound and scandalous though it may be, America has, at least to date, effectively banished this huge group—a group roughly twice the total size of our illegal-immigrant population and an adult population larger than that in any state but California—to a near-total and seemingly unending statistical invisibility. Our ex-cons are, so to speak, statistical outcasts who live in a darkness our polity does not care enough to illuminate—beyond the scope or interest of public policy, unless and until they next run afoul of the law.

Thus we cannot describe with any precision or certainty what has become of those who make up our “criminal class” after their (latest) sentencing or release. In the most stylized terms, however, we might guess that their odds in the real America are not all that favorable. And when we consider some of the other trends we have already mentioned—employment, health, addiction, welfare dependence—we can see the emergence of a malign new nationwide undertow, pulling downward against social mobility.

Social mobility has always been the jewel in the crown of the American mythos and ethos. The idea (not without a measure of truth to back it up) was that people in America are free to achieve according to their merit and their grit—unlike in other places, where they are trapped by barriers of class or the misfortune of misrule. Nearly two decades into our new century, there are unmistakable signs that America's fabled social mobility is in trouble—perhaps even in serious trouble.

Consider the following facts. First, according to the Census Bureau, geographical mobility in America has been on the decline for three decades, and in 2016 the annual movement of households from one location to the next was reportedly at an all-time (postwar) low. Second, as a study by three Federal Reserve economists and a Notre Dame colleague demonstrated last year, "labor market fluidity"—the churning between jobs that among other things allows people to get ahead—has been on the decline in the American labor market for decades, with no sign as yet of a turnaround. Finally, and not least important, a December 2016 report by the "Equality of Opportunity Project," a team led by the formidable Stanford economist Raj Chetty, calculated that the odds of a 30-year-old's earning more than his parents at the same age were now just 51 percent, down from 86 percent 40 years ago. Other researchers who have examined the same data argue that the odds may not be quite as low as the Chetty team concludes, but agree that the chances of surpassing one's parents' real income have been on the downswing and are probably lower now than ever before in postwar America.

Thus the bittersweet reality of life for real Americans in the early 21st century: Even though the American economy still remains the world’s unrivaled engine of wealth generation, those outside the bubble may have less of a shot at the American Dream than has been the case for decades, maybe generations—possibly even since the Great Depression.

IV
The funny thing is, people inside the bubble are forever talking about “economic inequality,” that wonderful seminar construct, and forever virtue-signaling about how personally opposed they are to it. By contrast, “economic insecurity” is akin to a phrase from an unknown language. But if we were somehow to find a “Google Translate” function for communicating from real America into the bubble, an important message might be conveyed:

The abstraction of “inequality” doesn’t matter a lot to ordinary Americans. The reality of economic insecurity does. The Great American Escalator is broken—and it badly needs to be fixed.

With the election of 2016, Americans within the bubble finally learned that the 21st century has gotten off to a very bad start in America. Welcome to the reality. We have a lot of work to do together to turn this around.

1 Some economists suggest the reason has to do with the unusual nature of the Great Recession: that downturns born of major financial crises intrinsically require longer adjustment and correction periods than the more familiar, ordinary business-cycle downturn. Others have proposed theories to explain why the U.S. economy may instead have downshifted to a more tepid tempo in the Bush-Obama era. One such theory holds that the pace of productivity is dropping because the scale of recent technological innovation is unrepeatable. There is also a “secular stagnation” hypothesis, surmising we have entered into an age of very low “natural real interest rates” consonant with significantly reduced demand for investment. What is incontestable is that the 10-year moving average for per capita economic growth is lower for America today than at any time since the Korean War—and that the slowdown in growth commenced in the decade before the 2008 crash. (It is also possible that the anemic status of the U.S. macro-economy is being exaggerated by measurement issues—productivity improvements from information technology, for example, have been oddly elusive in our officially reported national output—but few today would suggest that such concealed gains would totally transform our view of the real economy’s true performance.)
2 Nicholas Eberstadt, Men Without Work: America’s Invisible Crisis (Templeton Press, 2016)
3 This is not to ignore the gruesome exceptions—places like Chicago and Baltimore—or to neglect the risk that crime may make a more general comeback: It is simply to acknowledge one of the bright trends for America in the new century.
4 In 2013, roughly 2.3 million men were behind bars according to the Bureau of Justice Statistics.

================

One could be forgiven for wondering what Kellyanne Conway, a close adviser to President Trump, was thinking recently when she turned the White House briefing room into the set of the Home Shopping Network. "Go buy Ivanka's stuff!" she told Fox News viewers during an interview, referring to the clothing and accessories line of the president's daughter. It's not clear if her cheerleading led to any spike in sales, but it did lead to calls for an investigation into whether she violated federal ethics rules, and prompted the White House to later state that it had "counseled" Conway about her behavior.

To understand what provoked Conway’s on-air marketing campaign, look no further than the ongoing boycotts targeting all things Trump. This latest manifestation of the passion to impose financial harm to make a political point has taken things in a new and odd direction. Once, boycotts were serious things, requiring serious commitment and real sacrifice. There were boycotts by aggrieved workers, such as the United Farm Workers, against their employers; boycotts by civil-rights activists and religious groups; and boycotts of goods produced by nations like apartheid-era South Africa. Many of these efforts, sustained over years by committed cadres of activists, successfully pressured businesses and governments to change.

Since Trump’s election, the boycott has become less an expression of long-term moral and practical opposition and more an expression of the left’s collective id. As Harvard Business School professor Michael Norton told the Atlantic recently, “Increasingly, the way we express our political opinions is through buying or not buying instead of voting or not voting.” And evidently the way some people express political opinions when someone they don’t like is elected is to launch an endless stream of virtue-signaling boycotts. Democratic politicians ostentatiously boycotted Trump’s inauguration. New Balance sneaker owners vowed to boycott the company and filmed themselves torching their shoes after a company spokesman tweeted praise for Trump. Trump detractors called for a boycott of L.L. Bean after one of its board members was discovered to have (gasp!) given a personal contribution to a pro-Trump PAC.

By their nature, boycotts are a form of proxy warfare, tools wielded by consumers who want to send a message to a corporation or organization about their displeasure with specific practices.

Trump-era boycotts, however, merely seem to be a way to channel an overwhelming yet vague feeling of political frustration. Take the “Grab Your Wallet” campaign, whose mission, described in humblebragging detail on its website, is as follows: “Since its first humble incarnation as a screenshot on October 11, the #GrabYourWallet boycott list has grown as a central resource for understanding how our own consumer purchases have inadvertently supported the political rise of the Trump family.”

So this boycott isn't against a specific business or industry; it's a protest against one man and his children, with trickle-down effects for anyone who does business with them. Grab Your Wallet doesn't just boycott Trump-branded hotels and golf courses; the group targets businesses such as Bed Bath & Beyond because it carries Ivanka Trump diaper bags. Even QVC and the Carnival Cruise corporation are targeted for boycott because they advertise on Celebrity Apprentice, which supposedly "further enriches Trump."

Grab Your Wallet has received support from "notable figures" such as "Don Cheadle, Greg Louganis, Lucy Lawless, Roseanne Cash, Neko Case, Joyce Carol Oates, Robert Reich, Pam Grier, and Ben Cohen (of Ben & Jerry's)," according to the group's website. This rogues' gallery of celebrity boycotters has been joined by enthusiastic hashtag activists on Twitter who post remarks such as, "Perhaps fed govt will buy all Ivanka merch & force prisoners & detainees in coming internment camps 2 wear it" and "Forced to #DressLikeaWoman by a sexist boss? #GrabYourWallet and buy a nice FU pantsuit at Trump-free shops." There's even a website, dontpaytrump.com, which offers a free plug-in extension for your Web browser. It promises a "simple Trump boycott extension that makes it easy to be a conscious consumer and keep your money out of Trump's tiny hands."

Many of the companies targeted for boycott—Bed Bath & Beyond, QVC, TJ Maxx, Amazon—are the kind of retailers that carry moderately priced merchandise that working- and middle-class families can afford. But the list of Grab Your Wallet–approved alternatives for shopping consists of places like Bergdorf's and Barney's. These are hardly accessible choices for the TJ Maxx customer. Indeed, there is more than a whiff of quasi-racist elitism in the self-congratulatory tweets posted by Grab Your Wallet supporters, such as this response to news that Nordstrom is no longer planning to carry Ivanka's shoe line: "Soon we'll see Ivanka shoes at Dollar Store, next to Jalapeno Windex and off-brand batteries."

If Grab Your Wallet is really about “flexing of consumer power in favor of a more respectful, inclusive society,” then it has some work to do.

And then there are the conveniently malleable ethics of the anti-Trump boycott brigade. A small number of affordable retailers like Old Navy made the Grab Your Wallet cut for "approved" alternatives for shopping. But just a few years ago, a progressive website described in detail the "living hell of a Bangladeshi sweatshop" that manufactures Old Navy clothing. Evidently progressives can now sleep peacefully at night knowing large corporations like Old Navy profit from young Bangladeshis making 20 cents an hour and working 17-hour days churning out cheap cargo pants—as long as they don't bear a Trump label.

In truth, it matters little if Ivanka’s fashion business goes bust. It was always just a branding game anyway. The world will go on in the absence of Ivanka-named suede ankle booties. And in some sense the rash of anti-Trump boycotts is just what Trump, who frequently calls for boycotts of media outlets such as Rolling Stone and retailers like Macy’s, deserves.

But the left's boycott braggadocio might prove short-lived. Nordstrom denied that it dropped Ivanka's line of apparel and shoes because of pressure from the Grab Your Wallet campaign; it blamed lagging sales. And the boycotters' tone of moral superiority—like the ridiculous posturing of the anti-Trump left's self-flattering designation, "the resistance"—won't endear them to the Trump voters they must convert if they hope to gain ground in the midterm elections.

As for inclusiveness, as one contributor to Psychology Today noted, the demographic breakdown of the typical boycotter, “especially consumer and ecological boycotts,” is a young, well-educated, politically left woman, undermining somewhat the idea of boycotts as a weapon of the weak and oppressed.

Self-indulgent protests and angry boycotts are no doubt cathartic for their participants (a 2016 study in the Journal of Consumer Affairs cited psychological research that found “by venting their frustrations, consumers can diminish their negative psychological states and, as a result, experience relief”). But such protests are not always ultimately catalytic. As researchers noted in a study published recently at Social Science Research Network, protesters face what they call “the activists’ dilemma,” which occurs when “tactics that raise awareness also tend to reduce popular support.” As the study found, “while extreme tactics may succeed in attracting attention, they typically reduce popular public support for the movement by eroding bystanders’ identification with the movement, ultimately deterring bystanders from supporting the cause or becoming activists themselves.”

The progressive left should be thoughtful about the reality of such protest fatigue. Writing in the Guardian, Jamie Peck recently enthused: “Of course, boycotts alone will not stop Trumpism. Effective resistance to authoritarianism requires more disruptive actions than not buying certain products . . . . But if there’s anything the past few weeks have taught us, it’s that resistance must take as many forms as possible, and it’s possible to call attention to the ravages of neoliberalism while simultaneously allying with any and all takers against the immediate dangers posed by our impetuous orange president.”

Boycotts are supposed to be about accountability. But accountability is a two-way street. The motives and tactics of the boycotters themselves are of the utmost importance. In his book about consumer boycotts, scholar Monroe Friedman advises that successful ones depend on a “rationale” that is “simple, straightforward, and appear[s] legitimate.” Whatever Trump’s flaws (and they are legion), by “going low” with scattershot boycotts, the left undermines its own legitimacy—and its claims to the moral high ground of “resistance” in the process.

========END===============

2013 Health Spend Slows

The flattening out of health care costs – at 17% of GDP – is now a five-year trend. $9,255 per person!

Kaiser Report

Growth In U.S. Health Spending In 2013 Is Lowest Since 1960

National health spending grew 3.6 percent in 2013, the lowest annual increase since the Centers for Medicare and Medicaid Services (CMS) began tracking the statistic in 1960.
Spending slowed for private health insurance, Medicare, hospitals, physicians and clinical services, and out-of-pocket spending by consumers. However, it accelerated for Medicaid and for prescription drugs, according to the report, published online by the journal Health Affairs.

Health care spending has grown at historically low rates for the past five years. The study found that health care spending rose to $2.9 trillion, or $9,255 per person, in 2013. As a share of gross domestic product, health care remained at 17.4 percent, the same share it has held since 2009, the CMS researchers found.

The 3.6 percent spending growth for 2013 tracks a CMS estimate from September and is 0.5 percentage point lower than in 2012.
Medicare spending, which was nearly $586 billion in 2013, rose 3.4%.
Consumer out-of-pocket spending, including co-payments and deductibles or payments for services not covered by a consumer’s health insurance, was $339.4 billion in 2013.
Spending for physician and clinical services was $586.7 billion, up 3.8%.
By the way, 190 million people are covered by private health insurance, up 0.7%.
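
A quick sanity check of the per-person figure, using only the two totals cited above (a one-line Python sketch; the population it implies is an inference, not a number from the report):

total_spending = 2.9e12   # $2.9 trillion in 2013
per_person = 9255         # dollars per person

print(f"Implied U.S. population: {total_spending / per_person / 1e6:.0f} million")
# -> roughly 313 million, in line with the 2013 U.S. population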

Von Hayek and Decentralization of Knowledge

Friedrich von Hayek

"In The Use of Knowledge in Society (1945), Hayek argued that society's members achieve diverse, complicated ends through a principle of spontaneous self-organization."

In 2011 “The Use of Knowledge in Society” was selected as one of the top 20 articles published in the American Economic Review during its first 100 years.

He won the Nobel Memorial Prize in Economic Sciences in 1974.

Jimmy Wales cites "The Use of Knowledge in Society", which he read as an undergraduate, as "central" to his thinking about "how to manage the Wikipedia project".

Hayek argued that information is decentralized – that each individual only knows a small fraction of what is known collectively – and that, as a result, decisions are best made by those with local knowledge rather than by a central authority.

Harvard Economic Research Project

After my senior year at Brandeis, my Economics Department Chair, Dr. Anne Carter, asked me to intern for the summer of 1972 at the Harvard Economic Research Project. It was an amazing time.

Dr. Carter was the Director of Research for the Project, which was begun in 1948 by Wassily Leontief. Leontief remained its director until 1973.

Little did I know that Wassily Leontief would receive the Nobel Prize for Economics in 1973. And Anne Carter, who certainly mentored me, went on to become Dean of Faculty at Brandeis.

The Project's mission was to exploit the new methodology of Input-Output Economics, using the very primitive computers available at the time.
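
For readers unfamiliar with the methodology, here is a minimal sketch of the core input-output calculation (Python with NumPy; the three sectors and every coefficient are made up for illustration). Total output x must satisfy x = Ax + d, so x = (I - A)^-1 d: the Leontief inverse applied to final demand.

import numpy as np

# A[i][j]: dollars of sector i's output needed per dollar of sector j's output.
A = np.array([
    [0.10, 0.20, 0.05],   # agriculture inputs
    [0.15, 0.10, 0.30],   # manufacturing inputs
    [0.05, 0.25, 0.10],   # services inputs
])
d = np.array([100.0, 200.0, 150.0])  # final demand by sector

# Solve (I - A) x = d rather than inverting explicitly.
x = np.linalg.solve(np.eye(3) - A, d)
print("Total output required by sector:", np.round(x, 1))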

My summer job was to see if I could find tools that would allow the methodology to be used in environmental fields. I decided to focus on water pollution, and zeroed in on the extent to which water pollution could be predicted from the level of "biochemical oxygen demand" present in any given body of water.