Category Archives: History

A New Americanism

Why a Nation Needs a National Story
By Jill Lepore – Carl Degler issued a warning: “If we historians fail to provide a nationally defined history, others less critical and less informed will take over the job for us.”

The nation-state was in decline, said the wise men of the time. The world had grown global. Why bother to study the nation?

Francis Fukuyama is a political scientist, not a historian. But his 1989 essay “The End of History?” illustrated Degler’s point. Fascism and communism were dead, Fukuyama announced at the end of the Cold War.

Fukuyama was hardly alone in pronouncing nationalism all but dead; plenty of others had done the same. That’s what worried Degler.

Nation-states, when they form, imagine a past. That, at least in part, accounts for why modern historical writing arose with the nation-state.

But in the 1970s, studying the nation fell out of favor in the American historical profession. Most historians started looking at either smaller or bigger things, investigating the experiences and cultures of social groups or taking the broad vantage promised by global history.

But meanwhile, who was doing the work of providing a legible past and a plausible future—a nation—to the people who lived in the United States? Charlatans, stooges, and tyrants.

The endurance of nationalism proves that there’s never any shortage of blackguards willing to prop up people’s sense of themselves and their destiny with a tissue of myths and prophecies, prejudices and hatreds, or to empty out old rubbish bags full of festering resentments and calls to violence.

When historians abandon the study of the nation, when scholars stop trying to write a common history for a people, nationalism doesn’t die. Instead, it eats liberalism.

Maybe it’s too late to restore a common history, too late for historians to make a difference. But is there any option other than to try to craft a new American history—one that could foster a new Americanism? more>

Updates from Chicago Booth

The safest bank the Fed won’t sanction – A ‘narrow bank’ offers security against financial crises
By John H. Cochrane – One might expect that those in charge of banking policy in the United States would celebrate the concept of a “narrow bank.” A narrow bank takes deposits and invests only in interest-paying reserves at the Fed. A narrow bank cannot fail unless the US Treasury or Federal Reserve fails. A narrow bank cannot lose money on its assets. It cannot suffer a run. If people want their money back, they can all have it, instantly. A narrow bank needs essentially no asset risk regulation, stress tests, or anything else.
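
To make Cochrane's contrast concrete, here is a minimal, purely illustrative sketch (not from his article; all balance-sheet figures are invented) of why a bank whose only asset is reserves at the Fed cannot suffer a run, while a stylized fractional-reserve bank can:

```python
# Illustrative sketch only: why a narrow bank cannot suffer a run.
# All balance-sheet figures are invented for illustration.

def can_meet_withdrawals(liquid_assets: float, withdrawals: float) -> bool:
    """A bank can honor withdrawals only if its liquid assets cover them."""
    return liquid_assets >= withdrawals

# Narrow bank: every dollar of deposits is held as interest-paying reserves at the Fed.
narrow_deposits = 100.0
narrow_reserves = narrow_deposits            # assets equal deposits by construction

# Stylized fractional-reserve bank: most deposits are lent out in illiquid assets.
frac_deposits = 100.0
frac_reserves = 10.0                         # only a fraction is kept liquid
frac_loans = frac_deposits - frac_reserves   # cannot be sold quickly at face value

# Suppose every depositor asks for their money back at once (a run).
run = 100.0

print("Narrow bank meets the run:    ", can_meet_withdrawals(narrow_reserves, run))   # True
print("Fractional bank meets the run:", can_meet_withdrawals(frac_reserves, run))     # False
```

The only point of the sketch is that the narrow bank's liquid assets equal its deposits by construction, so the question of a run never arises.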

A narrow bank would fill an important niche. Right now, individuals can have federally insured bank accounts, but large businesses need to handle amounts of cash far above deposit insurance limits. For that reason, large businesses invest in repurchase agreements, short-term commercial paper, and all the other forms of short-term debt that blew up in the 2008 financial crisis. These assets are safer than bank accounts, but, as we saw, not completely safe.

A narrow bank is completely safe without deposit insurance. And with the option of a narrow bank, the only reason for companies to invest in these other arrangements is to try to harvest a little more interest. Regulators can feel a lot more confident shutting down run-prone alternatives if narrow bank deposits are widely available. more>

Related>

Globalization at a Crossroads

By Gordon Brown – Whether or not one realizes it, 2018 may have been a historic turning point. Poorly managed globalization has led to nationalist “take-back-control” movements and a rising wave of protectionism that is undermining the 70-year-old American-led international order. The stage is set for China to develop its own parallel international institutions, auguring a world divided between two competing global-governance systems.

Whatever happens in the next few years, it is already clear that the 2008-2018 decade marked an epochal shift in the balance of economic power.

Whereas around 40% of production, manufacturing, trade, and investment was located outside the West in 2008, over 60% is today.

For decades after its formation in the 1970s, the Group of Seven (G7) – Canada, France, Germany, Italy, Japan, the United Kingdom, and the US – essentially presided over the entire world economy. But by 2008, I and others had begun to discern a changing of the guard. Behind the scenes, North American and European leaders were debating whether it was time to create a new premier forum for economic cooperation that would include emerging economies.

These debates were often heated. On one side were those who wanted to keep the group small (one early US proposal envisioned a G7+5); on the other side were those who wanted the group to be as inclusive as possible. To this day, the results of those earlier negotiations are not fully understood.

The current trade conflict between the United States and China is symptomatic of a larger transition in global financial power. On the surface, the Trump administration’s confrontation with China is about trade, with disputes over currency manipulation thrown in for good measure. But from Trump’s speeches, one gathers that the real battle is about something bigger: the future of technological dominance and global economic power.

While Trump at least detects the growing threat to American supremacy, he has ignored the most obvious strategy for responding to it: namely, a united front with US allies and partners around the world. Instead, Trump has asserted a prerogative to act unilaterally, as if America still rules over a unipolar world. As a result, a trail of geopolitical ruin already lies in his wake. more>

Economics as a moral tale

By John Rapley – Think of human development as a long journey.

At the beginning, we live at the mercy of nature. Dependent on its bounty, we pray for rains and freedom from natural disasters and plagues. At the end of the journey, nature lives at our mercy. We use science and technology to release new wealth and remake the planet. Today, as humans implant themselves with microchips, install artificial organs and plan Mars colonies, we even aim for a ‘singularity’ that will lift us out of nature once and for all.

Economists began to compose the narrative of this odyssey, from subjection to dominion, in the 1700s. Once it became apparent that Europe had broken with millennia of stasis to begin a long period of rising growth – the same through which we are still living – political economists abandoned philosophical reflection to draft roadmaps to development.

Two broad types emerged. One approach described the walk, the other the walker.

The first presumed that the context in which we made the journey – the natural environment, the institutions, the culture, the legal and political systems – determined the direction of the path. In this model, the government bore responsibility to build the path so that it could accommodate as many people as possible.

The second approach took a more individualist perspective. It presumed that the walker determined his or her own success in the journey. It concentrated on the moral, intellectual and physical attributes it believed an individual needed to advance. In this model, the task of the government was to sweep aside obstacles impeding the gifted few from embarking on their personal journeys – restraints that ranged from restrictions on labor mobility to usury laws. Thus liberated, gifted individuals would beat the path to prosperity.

By 1948, Western economies had emerged from crisis, beginning a decades-long period of rising growth and prosperity. Rather than pack up and go home, the development industry now turned its attention to a new frontier. With Europe’s overseas empires breaking up, dozens of new nation-states were coming into being, each of them eager to ‘catch up’ with its erstwhile colonial master. Amid this exciting atmosphere, the development industry could use its expertise to play a clear and prominent role, one captured in the subtitle to the then-Bible of development, Walt Rostow’s Stages of Economic Growth (1960) – ‘a non-communist manifesto’.

By now, statist economics was enshrined in theory and sanctified by practice. more>

The Truth About the Gig Economy

By Annie Lowrey – The workforce is getting Uberized. The gig economy is taking over the world. Independent contractor jobs are the new normal.

In the post-recession years, this became conventional wisdom, as more and more Americans took jobs—well, “jobs”—with companies like Postmates, Fiverr, TaskRabbit, and Lyft. But the gig economy was then and is now a more marginal phenomenon than it might have seemed.

The gig economy might be new and big and radical and transformative. It might represent a powerful business model for venture investors and tech companies. But Uber and similar companies were not and are not driving tidal changes in the way that Americans make a living.

Wild predictions aside, it was always clear that many gig workers were taking on these kinds of jobs as a temporary stopgap or a way to supplement their income, rather than as a substitute for a full-time position. A comprehensive look at the Uber workforce by the economist Alan Krueger and Jonathan Hall, the company’s internal head of economic research, found that “Most of Uber’s driver-partners had full- or part-time employment prior to joining Uber, and many continued in those positions after starting to drive with the Uber platform.”

There’s another reason why a false narrative might have taken hold: gig work is vastly more prevalent in the big coastal cities where many investors and journalists live, leading to a kind of media myopia about the scale of the phenomenon. And gig work seemed like the future. more>

Why Wall Street Isn’t Useful for the Real Economy

By Lynn Stout – In the wake of the 2008 crisis, Goldman Sachs CEO Lloyd Blankfein famously told a reporter that bankers are “doing God’s work.” This is, of course, an important part of the Wall Street mantra: it’s standard operating procedure for bank executives to frequently and loudly proclaim that Wall Street is vital to the nation’s economy and performs socially valuable services by raising capital, providing liquidity to investors, and ensuring that securities are priced accurately so that money flows to where it will be most productive.

The mantra is essential, because it allows (non-psychopathic) bankers to look at themselves in the mirror each day, as well as helping them fend off serious attempts at government regulation. It also allows them to claim that they deserve to make outrageous amounts of money.

According to the Statistical Abstract of the United States, in 2007 and 2008 employees in the finance industry earned a total of more than $500 billion annually—that’s a whopping half-trillion-dollar payroll (Table 1168).

Let’s start with the notion that Wall Street helps companies raise capital. If we look at the numbers, it’s obvious that raising capital for companies is only a sideline for most banks, and a minor one at that. Corporations raise capital in the so-called “primary” markets where they sell newly-issued stocks and bonds to investors.

However, the vast majority of bankers’ time and effort is devoted to (and most bank profits come from) dealing, trading, and advising investors in the so-called “secondary” market where investors buy and sell existing securities with each other.

In 2009, for example, less than 10 percent of the securities industry’s profits came from underwriting new stocks and bonds; the majority came instead from trading commissions and trading profits (Table 1219).

This figure reflects the imbalance between the primary issuing market (which is relatively small) and the secondary trading market (which is enormous). In 2010, corporations issued only $131 billion in new stock (Table 1202).

That same year, the World Bank reports, more than $15 trillion in stocks were traded in the U.S. secondary market, more than the nation’s GDP. more>
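
A quick back-of-the-envelope calculation, using only the figures quoted above (the rough 2010 US GDP number is an approximation added here for scale, not a figure from the article), makes the imbalance between the primary and secondary markets concrete:

```python
# Back-of-the-envelope comparison using the figures quoted in the excerpt.
# The US GDP value is an approximation added for scale, not from the article.

new_stock_issued_2010 = 131e9   # primary market: new stock issued by corporations
stocks_traded_2010 = 15e12      # secondary market: stocks traded (World Bank figure)
us_gdp_2010_approx = 15e12      # roughly the size of US GDP in 2010 (approximation)

print(f"Secondary trading vs. new issuance: {stocks_traded_2010 / new_stock_issued_2010:.0f}x")
print(f"Secondary trading vs. GDP:          {stocks_traded_2010 / us_gdp_2010_approx:.1f}x")
# Roughly 115 dollars of existing stock change hands for every dollar
# of new equity capital that corporations actually raise.
```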

How a Decade of Crisis Changed Economics

By J. W. Mason – Has economics changed since the crisis?

As usual, the answer is: it depends. If we look at the macroeconomic theory of PhD programs and top journals, the answer is clearly, no. Macroeconomic theory remains the same self-contained, abstract art form that it has been for the past twenty-five years.

As Joan Robinson once put it, economic theory is the art of pulling a rabbit out of a hat right after you’ve stuffed it into the hat in full view of the audience.

Many producers of this kind of model actually have a quite realistic understanding of the behavior of real economies, often informed by firsthand experience in government. The combination of real insight and tight genre constraints leads to a strange style of theorizing, where the goal is to produce a model that satisfies the methodological conventions of the discipline while arriving at a conclusion that you’ve already reached by other means. It’s the economic equivalent of the college president in Randall Jarrell’s Pictures from an Institution:

About anything, anything at all, Dwight Robbins believed what Reason and Virtue and Tolerance and a Comprehensive Organic Synthesis of Values would have him believe. And about anything, anything at all, he believed what it was expedient for the president of Benton College to believe. You looked at the two beliefs, and lo! the two were one. more>

Is America’s future capitalist or socialist?

By Ezra Klein – In American politics, and particularly in the Democratic Party, the primacy of capitalism is, for the first time in ages, an open question.

Sanders is expected to run again in 2020, and to run with the support of a grassroots movement that thrills to his break with capitalist convention. He’ll face, among others, Massachusetts Sen. Elizabeth Warren, who says one key difference between her and Sanders is that she’s “a capitalist to my bones.”

But what are the actual differences between liberal reformers of capitalism, like Warren and Steven Pearlstein, and democratic socialists, like Sanders? I invited Pearlstein to discuss his book, and the broader capitalism vs. socialism divide, with Bhaskar Sunkara, editor of Jacobin magazine and author of the forthcoming book The Socialist Manifesto. Their debate follows, lightly edited for style and length.

A CEO like Charles Wilson could say “what was good for the country was good for General Motors and vice versa,” but he was responding to the same exact market pressures as CEOs today. The only difference is that he was constrained by unions and a liberal political coalition.

Social democracy was always predicated on economic expansion. Expansion gave succor to both the working class and capital. When growth slowed and the demands of workers made deeper inroads into firm profits, business owners rebelled against the class compromise. And they were in the structural position to force their own solutions, even in countries like Sweden where there were experiments with wage-earner funds and other left-solutions to the crisis. more>

How Universal Basic Income Solves Widespread Insecurity and Radical Inequality

By Daniel Nettle – Today should be the best time ever to be alive. Thanks to many decades of increasing productive efficiency, the real resources available to enable us to do the things we value—the avocados, the bicycles, the musical instruments, the bricks and glass—are more abundant and of better quality than ever. Thus, at least in the industrialized world, we should be living in the Age of Aquarius, the age where the most urgent problem is self-actualization, not mere subsistence: not ‘How can we live?’, but ‘How shall we live?’.

Why then, does it not feel like the best time ever?

Contrary to the predictions of mid-twentieth-century economists, the age of universal wellbeing has not really materialized. Working hours are as high as they were for our parents, if not higher, and the quality of work is no better for most people. Many people work several jobs they do not enjoy, just to keep a roof over their heads, food on the table, and the lights on. In fact, many people are unable to satisfy these basic wants despite being in work.

Big problems require big ideas.

Our current generation of politicians doesn’t really have ideas big enough to deal with the problems of widespread insecurity and marked inequality. Big ideas come along every few decades. The last one was about forty years ago: neoliberalism, the idea that market competition between private-sector corporations would deliver the social outcomes we all wanted, as long as government got out of the way as far as possible.

Our current politicians propose to deal with symptoms piecemeal—a minimum-wage increase here, a price cap there, rent-control in the other place; tax credits for those people; financial aid to buy a house for those others. At best we are dealing with one symptom at a time. Each piecemeal intervention increases the complexity of the state; divides citizens down into finer and finer ad hoc groups each eligible for different transactions; requires more bureaucratic monitoring; and often has unintended and perverse knock-on effects.

A Universal Basic Income (UBI) is a regular financial payment made to all eligible adults, whether they work or not, regardless of their other means, and without any conditionality whatever. Receiving it is a fundamental entitlement that comes with being a member of society: people can know that it will always be there, now and in the future. more>
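
As a purely illustrative sketch (the payment level and withdrawal rate below are invented, not Nettle's), the contrast between a UBI and a conventional means-tested benefit comes down to a single rule: the UBI never depends on earnings, while a means-tested payment is withdrawn as earnings rise.

```python
# Illustrative sketch only: an unconditional UBI versus a means-tested benefit.
# The payment level and withdrawal rate are invented for illustration.

UBI_PAYMENT = 800.0        # paid to every eligible adult, unconditionally
MEANS_TESTED_MAX = 800.0   # maximum means-tested payment
WITHDRAWAL_RATE = 0.6      # benefit reduced by 60 cents per dollar earned

def ubi(earnings: float) -> float:
    """A UBI ignores earnings entirely."""
    return UBI_PAYMENT

def means_tested(earnings: float) -> float:
    """A means-tested benefit is clawed back as earnings rise."""
    return max(0.0, MEANS_TESTED_MAX - WITHDRAWAL_RATE * earnings)

for earnings in (0.0, 500.0, 1500.0):
    print(f"earnings {earnings:7.0f}: UBI {ubi(earnings):6.0f} | "
          f"means-tested {means_tested(earnings):6.0f}")
```

Because the UBI line never moves with earnings, it requires no means test and no monitoring of work status, which is what distinguishes it from the piecemeal interventions described above.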