Category Archives: History

The information arms race can’t be won, but we have to keep fighting

By Cailin O’Connor – Arms races happen when two sides of a conflict escalate in a series of ever-changing moves intended to outwit the opponent. In biology, a classic example comes from cheetahs and gazelles. Over time, these species have evolved for speed, each responding to the other’s adaptations.

One hallmark of an arms race is that, at the end, the participants are often just where they started. Sometimes, the cheetah catches its prey, and sometimes the gazelle escapes. Neither wins the race because, as one gets better, so does its opponent. And, along the way, each side expends a great deal of effort. Still, at any point, the only thing that makes sense is to keep escalating.

Arms races happen in the human world too. The term arms race, of course, comes from countries at war who literally amass ever-more sophisticated and powerful weapons. But some human arms races are more subtle.

As detailed in the Mueller report – but widely known before – in the lead-up to the 2016 presidential election in the United States, the Russian government (via a group called the Internet Research Agency) engaged in large-scale efforts to influence voters, and to polarize the US public. In the wake of this campaign, social-media sites and research groups have scrambled to protect the US public from misinformation on social media.

What is important to recognize about such a situation is that whatever tactics are working now won’t work for long. The other side will adapt. In particular, we cannot expect to be able to put a set of detection algorithms in place and be done with it. Whatever efforts social-media sites make to root out pernicious actors will regularly become obsolete.

The same is true for our individual attempts to identify and avoid misinformation. Since the 2016 US election, ‘fake news’ has been widely discussed and analyzed. And many social-media users have become more savvy about identifying sites mimicking traditional news sources. But the same users might not be as savvy, for example, about sleek conspiracy theory videos going viral on YouTube, or about deep fakes – expertly altered images and videos.

What makes this problem particularly thorny is that internet media changes at dizzying speed. more>

Updates from Chicago Booth

A.I. is only human
By Jeff Cockrell – If you applied for a mortgage, would you be comfortable with a computer using a collection of data about you to assess how likely you are to default on the loan?

If you applied for a job, would you be comfortable with the company’s human-resources department running your information through software that will determine how likely it is that you will, say, steal from the company, or leave the job within two years?

If you were arrested for a crime, would you be comfortable with the court plugging your personal data into an algorithm-based tool, which will then advise your judge on whether you should await trial in jail or at home? If you were convicted, would you be comfortable with the same tool weighing in on your sentencing?

Much of the hand-wringing about advances in artificial intelligence has been concerned with AI’s effects on the labor market. “AI will gradually invade almost all employment sectors, requiring a shift away from human labor that computers are able to take over,” reads a report of the 2015 study panel of Stanford’s One Hundred Year Study on Artificial Intelligence. But whether AI ultimately creates massive unemployment or inspires new, as-yet-unknown professional fields, its perils and promises extend beyond the job market. By replacing human decision-making with automated processes, we can make businesses and public institutions more effective and efficient—or further entrench systemic biases, institutionalize discrimination, and exacerbate inequalities.

It’s an axiom of computing that results are dependent on inputs: garbage in, garbage out.

What if companies’ machine-learning projects come up with analyses that, while logical and algorithmically based, are premised on faulty assumptions or mismeasured data?
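As a concrete, if toy, illustration of that risk, here is a minimal Python sketch with invented numbers: a lending cutoff “learned” from records that under-measured one group’s income keeps denying that group, even though the rule itself is perfectly logical.

```python
# A minimal sketch of "garbage in, garbage out", with invented numbers:
# a rule fit to mismeasured historical data reproduces the measurement
# error at decision time.

# Suppose past records systematically under-recorded one group's income
# (say, informal earnings were never captured).
applicants = [
    {"group": "A", "true_income": 52_000, "recorded_income": 52_000},
    {"group": "A", "true_income": 48_000, "recorded_income": 48_000},
    {"group": "B", "true_income": 51_000, "recorded_income": 31_000},  # mismeasured
    {"group": "B", "true_income": 49_000, "recorded_income": 29_000},  # mismeasured
]

APPROVAL_THRESHOLD = 40_000  # a cutoff "learned" from the recorded data

for person in applicants:
    decision = "approve" if person["recorded_income"] >= APPROVAL_THRESHOLD else "deny"
    print(person["group"], decision, "| true income:", person["true_income"])

# Group A is approved and group B denied, although true incomes are nearly
# identical: the rule is logical and algorithmic, but its inputs carry the bias.
```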

What if these analyses lead to bad or ethically questionable decisions—either among business leaders or among policy makers and public authorities? more>

Related>

The Surveillance Threat Is Not What Orwell Imagined

By Shoshana Zuboff – George Orwell repeatedly delayed crucial medical care to complete 1984, the book still synonymous with our worst fears of a totalitarian future — published 70 years ago this month.

Since 1984’s publication, we have assumed with Orwell that the dangers of mass surveillance and social control could only originate in the state. We were wrong. This error has left us unprotected from an equally pernicious but profoundly different threat to freedom and democracy.

For 19 years, private companies practicing an unprecedented economic logic that I call surveillance capitalism have hijacked the Internet and its digital technologies. Invented at Google beginning in 2000, this new economics covertly claims private human experience as free raw material for translation into behavioral data. Some data are used to improve services, but the rest are turned into computational products that predict your behavior.

These predictions are traded in a new futures market, where surveillance capitalists sell certainty to businesses determined to know what we will do next. This logic was first applied to finding which ads online will attract our interest, but similar practices now reside in nearly every sector — insurance, retail, health, education, finance and more — where personal experience is secretly captured and computed for behavioral predictions. By now it is no exaggeration to say that the Internet is owned and operated by private surveillance capital.

In the competition for certainty, surveillance capitalists learned that the most predictive data come not just from monitoring but also from modifying and directing behavior. more>

Updates from Chicago Booth

Financial contagion spreads through supply chains
By Michael Maiello – As big financial institutions such as Lehman Brothers fell into distress in 2008, a credit contagion spread through the financial industry, creating a credit drought for the economy as lenders retrenched and hoarded capital.

It has been less clear how credit contagion can spread through other industries, but research by George Washington’s Şenay Ağca, Georgetown’s Volodymyr Babich, Chicago Booth’s John R. Birge, and City University of Hong Kong’s Jing Wu suggests that credit shocks follow the supply chains of distressed companies.

Ağca, Babich, Birge, and Wu examined daily changes in credit default swap (CDS) spreads for all contracts with a five-year maturity between 2003 and 2014. A CDS is a derivative contract guaranteeing the owner a payout in the event that the borrower defaults. The contract’s price is known as the spread, which is the cost to insure against the default of $100 of the issuer’s debt. A widening spread signals that the market believes the issuer is more likely to default. Because the CDS market is deep and liquid, with information priced rapidly into the spread, the researchers argue that it is a better indicator of default expectations than laggard credit ratings or notes from bond analysts.
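As a rough illustration of how a quoted spread translates into an insurance cost (the notional and the 250-basis-point spread below are invented, not figures from the study):

```python
# A CDS spread quoted in basis points is an annual cost per unit of debt insured.
notional = 10_000_000   # dollars of the issuer's debt being insured (hypothetical)
spread_bps = 250        # quoted spread: 250 basis points = 2.50% per year (hypothetical)

annual_premium = notional * spread_bps / 10_000
print(f"Annual cost of protection: ${annual_premium:,.0f}")  # $250,000

# If the market's view of the issuer darkens and the spread widens to 400 bps,
# the same protection costs $400,000 a year.
print(f"At 400 bps: ${notional * 400 / 10_000:,.0f}")
```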

Take Ford Motor Company’s November 2008 earnings report, which highlighted massive losses, looming layoffs, and drastic cuts in capital spending. The CDS spreads linked to the company’s debt quickly widened, as one might expect. CDS spreads of American Axle & Manufacturing, a major Ford supplier, did the same, the researchers find. It makes sense that if Ford was slashing spending, its suppliers would have been suffering, they note.

By contrast, CDS spreads were unchanged for companies with no relationship to either Ford or American Axle, such as semiconductor manufacturer Advanced Micro Devices. This suggests the mechanism by which contagion spreads is based on quantifiable business relationships, the researchers find. more>

Related>

How Adam Smith became a (surprising) hero to conservative economists

By Glory M Liu – People like to fight over Adam Smith. To some, the Scottish philosopher is the patron saint of capitalism who wrote that great bible of economics, The Wealth of Nations (1776). Its doctrine, his followers claim, is that unfettered markets lead to economic growth, making everyone better off. In Smith’s now-iconic phrase, it’s the ‘invisible hand’ of the market, not the heavy hand of government, that provides us with freedom, security and prosperity.

To others, such as the Nobel prizewinning economist Joseph Stiglitz, Smith is the embodiment of a ‘neoliberal fantasy’ that needs to be put to rest, or at least revised. They question whether economic growth should be the most important goal, point to the problems of inequality, and argue that Smith’s system would not have enabled massive accumulations of wealth in the first place. Whatever your political leanings, one thing is clear: Smith speaks on both sides of a longstanding debate about the fundamental values of modern market-oriented society.

But these arguments over Smith’s ideas and identity are not new. His complicated reputation today is the consequence of a long history of fighting to claim his intellectual authority.

Smith’s first biographer, Dugald Stewart, deliberately portrayed him in the 1790s as an introverted, awkward genius whose magnum opus was an apolitical handbook of sorts. Stewart downplayed Smith’s more politically subversive moments, such as his blistering criticism of merchants, his hostility towards established religion, and his contempt for ‘national prejudice’, or nationalism. Instead, Stewart shined a spotlight on what he believed was one of ‘the most important opinions in The Wealth of Nations’: that ‘Little else is requisite to carry a state to the highest degree of opulence from the lowest barbarism, but peace, easy taxes, and a tolerable administration of justice; all the rest being brought about by the natural course of things.’

Stewart’s biography (first delivered as a eulogy in 1793, then published in 1794 and 1795) appeared in the wake of major events that terrified British audiences: the French Revolution of 1789, the Reign of Terror that followed, and the subsequent sedition trials in both England and Scotland. As the British historian Emma Rothschild has shown, Stewart’s depiction cherrypicked Smith’s ideas in order to imbue political economy with scientific authority. She writes that he wanted to portray political economy as ‘an innocuous, technical sort of subject’, to help construct a politically ‘safe’ legacy for Smith during politically dangerous times. Stewart’s effort marked the beginning of Smith’s association with ‘conservative economics’.

Smith would soon earn a reputation as the father of the science of political economy – what we now know as economics. more>

Welcome to GAFA land—where the winner takes it all

By Susanne Wixforth – Amazon symbolizes the four platform giants, also known as GAFA (Google, Apple, Facebook and Amazon). The company is Europe’s largest platform retailer and its turnover is twice as high as that of its 20 largest competitors. While Amazon’s chief executive earned 2.16 million dollars per hour in 2017, its workers must be grateful if they receive the statutory minimum wage, which in the EU varies between €1.42 and €11.27 per hour.

In 2018, Amazon generated a turnover of about 210 billion euro worldwide, an increase of over 30 per cent on the previous year. The company has more than 600,000 employees across the globe. With a market capitalization of more than 730 billion euro, it is one of the most valuable listed companies. Its operating profit amounts to around 11 billion euro. Nevertheless, thanks to a ruling it had agreed on in advance with the tax authorities of Luxembourg, Amazon did not pay taxes on 75 per cent of its turnover between 2003 and 2014.

In the long run, the platform economy not only poses a risk to the stability and budgets of countries where the corporations earn their money but do not pay taxes—it also undermines social cohesion.

Amazon generates its turnover mainly through four channels: as one of the biggest online retailers, as the operator of by far the largest online marketplace for third-party suppliers, as one of the largest providers of online services and as the distributor of the ordered products.

Because of its large market power in some trading segments, independent traders depend on Amazon to reach their customers. There is evidence that Amazon is trying to force traders out through its sheer market power, for instance through copying products and undercutting prices. more>

The unlikely origins of USB, the port that changed everything

By Joel Johnson – In the olden days, plugging something into your computer—a mouse, a printer, a hard drive—required a zoo of cables.

Whether you’ve never heard of those things or remember them all too well, thank USB.

When it was first released in 1996, the idea was right there in the first phrase: Universal Serial Bus. And to be universal, it had to just work. “The technology that we were replacing, like serial ports, parallel ports, the mouse and keyboard ports, they all required a fair amount of software support, and any time you installed a device, it required multiple reboots and sometimes even opening the box,” says Ajay Bhatt, who retired from Intel in 2016. “Our goal was that when you get a device, you plug it in, and it works.”

But it was an initial skeptic that first popularized the standard: in a shock to many geeks in 1998, the Steve Jobs-led Apple released the groundbreaking first iMac as a USB-only machine.

Now a new cable design, Type-C, is creeping in on the typical USB Type-A and Type-B ports on phones, tablets, computers, and other devices—and mercifully, unlike the old USB cable, it’s reversible. The next-generation USB4, coming later this year, will be capable of achieving speeds upwards of 40Gbps, which is over 3,000 times faster than the highest speeds of the very first USB.
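That comparison is easy to verify against the published data rates, USB 1.0’s 12-megabit-per-second ‘full speed’ and USB4’s 40 gigabits per second:

```python
# A quick check of the speed comparison using published USB data rates.
usb1_full_speed_mbps = 12   # USB 1.0 "full speed", 1996
usb4_gbps = 40              # USB4 headline rate

speedup = usb4_gbps * 1_000 / usb1_full_speed_mbps
print(f"USB4 is roughly {speedup:,.0f}x faster")  # ~3,333x, i.e. "over 3,000 times"
```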

Bhatt couldn’t have imagined all of that when, as a young engineer at Intel in the early ’90s, he was simply trying to install a multimedia card. The rest is history, one that Joel Johnson plugged in to with some of the key players. more>

Eight Reasons Why Inequality Ruins the Economy

What matters is not so much the level of inequality as the effect it has.
By Chris Dillow – Roland Benabou gave the example (pdf) of how egalitarian South Korea has done much better than the unequal Philippines. And IMF researchers have found (pdf) a “strong negative relation” between inequality and the rate and duration of subsequent growth spells across 153 countries between 1960 and 2010.

Correlations, of course, are only suggestive. They pose the question: what is the mechanism whereby inequality might reduce growth? Here are eight possibilities:

1. Inequality encourages the rich to invest not in innovation but in what Sam Bowles calls “guard labor” (pdf) – means of entrenching their privilege and power. This might involve restrictive copyright laws, ways of overseeing and controlling workers, or the corporate rent-seeking and lobbying that has led to what Brink Lindsey and Steven Teles call the “captured economy.”

An especially costly form of this rent-seeking was banks’ lobbying for a “too big to fail” subsidy. This encouraged over-expansion of the banking system and the subsequent crisis, which has had a massively adverse effect upon economic growth.

3. “Economic inequality leads to less trust” say (pdf) Eric Uslaner and Mitchell Brown. And we’ve good evidence that less trust means less growth.

One reason for this is simply that if people don’t trust each other they’ll not enter into transactions where there’s a risk of them being ripped off.

5. Inequality can cause the rich to be fearful of future redistribution or nationalization, which will make them loath to invest. National Grid is belly-aching, maybe rightly, that Labour’s plan to nationalize it will delay investment. But it should instead ask: why is Labour proposing such a thing, and why is it popular? more>

Updates from Chicago Booth

How to react to a colleague’s microaggression
Should you intervene when one coworker is being insensitive toward another?
By Jane L. Risen and George Wu – The fourth installment of our quarterly Business Practice feature invites you to imagine witnessing a slight in a group meeting.

Greg’s request that Becky take notes is commonly termed a microaggression, described by Columbia’s Derald Wing Sue and his coresearchers as “brief and commonplace daily verbal, behavioral, or environmental indignities, whether intentional or unintentional, that communicate hostile, derogatory, or negative . . . slights and insults.”

The term, as coined by the psychiatrist Chester Pierce, refers to an action that denigrates a racial group; but in this case, Greg’s request can be seen as disparaging Becky and women more generally.

Scholars such as Joan C. Williams of the University of California, Hastings College of the Law have observed that women get “stuck” disproportionately with administrative tasks, such as taking notes, ordering lunch, and scheduling meetings, and research by Carnegie Mellon’s Linda Babcock and Laurie Weingart, Maria P. Recalde of the International Food Policy Research Institute, and Lise Vesterlund of the University of Pittsburgh has found women are more likely to be assigned or volunteer to take on “nonpromotable work.”

Interpersonal conflict is seldom pleasant, and this scenario is especially tricky because Greg may not have meant to slight Becky. A confrontation, particularly a public one in front of other product managers, could therefore lead Greg to be defensive.

Finally, the situation is complex strategically: Should you speak to Greg now or later?

Is a subtle approach or a more direct confrontation appropriate?

Should you talk about the specific behavior or provoke a larger conversation about culture and norms? more>

Related>

The free market is not the answer

By Jochen Steinhilber – We are discussing the digital transformation, which will profoundly change how we live, work and participate in politics and society in the decades to come.

The political and social significance of digital networking, smart factories and big data depends on how technology is used. It can deepen social inequalities and cement domination and profit maximization, or it can improve working and living conditions and facilitate participation. That is why digitalization needs political direction and should be based on social agreements.

But how can this be achieved without, for example, bringing under tighter democratic control those companies that, for many years, have been engaged in secret negotiations on international trade policy to ‘protect’ the digital and services agenda from all state intervention for years to come?

Also, those who will rightly champion the ecological transformation in the coming years and want to pursue it in a maximally inclusive way will have to ask themselves how this can be achieved under the current relations of power between the economy, politics and democracy—especially under lower growth rates that allow less space for redistribution.

Anyone who now claims that, considering the challenges of climate protection, a debate on economic democracy is a diversionary tactic and at best of theoretical rather than political interest, ignores the fact that the important strategic decisions must be taken at the economic level.

Do we really want to leave crucial questions—where can growth continue because it serves the common good? what must be dismantled because it is ecologically and socially harmful? and who pays for the change?—for the most part to the dominant market players?

And finally, the frequently-invoked crisis of democracy at least suggests that we need to rethink how the economy works. more>