Monthly Archives: September 2019

Yes, contemporary capitalism can be compatible with liberal democracy

By William A. Galston – Ever since Aristotle examined the relationship between class structure and regimes of government, political scientists have understood that a strong middle class is the foundation of stable constitutional governance. The reason is straightforward: in societies divided into the rich few and the many poor, class warfare is inevitable.

The rich will use the state to defend what they have; the poor, to gain a bigger share. Sometimes this struggle plays out through majoritarian democracy; often it is led by a strongman claiming to act in the people’s name.

In contrast to both of these ambitions, the middle class tends to prize the rule of law and to seek incremental rather than radical change.

Through much of human history, class structure was a product of chance and force, not policy. Because economic growth as we know it today existed neither in theory nor reality, economies were understood as zero-sum games. Political communities gained through plunder and conquest (or imperial tribute); economic classes gained through redistribution.

It is only during the past three centuries that long-term, secular economic growth, from which in principle all can gain, was conceptualized and realized as part of humankind’s lived experience.

There was nothing natural or automatic about this process. The vibrant markets on which growth depends are systems of rules backed by public power as well as social norms. Wise policies are needed to ensure that the fruits of growth are widely shared. When these conditions are satisfied, market economies tend to generate not only broad improvements in living standards but also growing middle classes that the poor can hope to enter.

Market-driven economic growth tends therefore to support constitutional governance in its modern form, combining elements of majoritarian democracy with protected individual rights and liberties. more>

Updates from Chicago Booth

How to develop a superstar strategy
By Ram Shivakumar – We live in an age of growing corporate inequality, with a few dominant companies and many underperformers.

The superstar archetype is Google, established in 1998 with the aim of rank-ordering web pages in what was then the nascent industry of search. By the beginning of the 21st century, Google had no revenues and no established business model. Fast-forward 18 years and a few hundred acquisitions, and Alphabet, Google’s parent company, has a market value in excess of US$750 billion.

In almost every industry, a small number of companies are capturing the lion’s share of profits. The top 10 percent of companies worldwide with more than $1 billion in revenues (when ranked by profit) earned 80 percent of all economic profits from 2014 to 2016, according to a recent study by the McKinsey Global Institute. The 40 biggest companies in the Fortune 500 captured 52 percent of the total profit earned by all the corporations on that list, according to an analysis of the 2019 ranking by Fortune.

This leaves less and less for the smaller fish to feed on. The middle 60 percent of businesses earned close to zero economic profit from 2014 to 2016, according to McKinsey, while each of those in the bottom 10 percent recorded economic losses of $1.5 billion on average.

Why do some companies succeed so categorically while the majority struggle? This question drives much of the management-consulting industry. It has also inspired a library’s worth of management books with varying explanations. Is it because successful companies have visionary and disciplined leaders, as management consultant Jim Collins argues in his best seller Good to Great? Is it because successful companies have superior management systems and organizational cultures? Is it because of positional advantages, as Harvard’s Michael Porter might argue? Or is it all down to timing and luck?

Concluding that luck is a big factor would be unlikely to sell many paperbacks in an airport bookstore; yet, undoubtedly, chance events have played an important role in many successes and failures. more>

Related>

Updates from Ciena

How coherent technology decisions that start in the lab impact your network
What is the difference between 400G, 600G and 800G coherent solutions? The answer may seem obvious, but is it just about maximum wavelength capacity? Why are different baud rates, modulations or DSP implementations used, and more importantly, what are the networking implications associated with each?
By Helen Xenos – 32QAM, 64QAM, and hybrid modulation… 32, 56, 64, now 95Gbaud? Are they really any different? Fixed grid, flex grid, what’s 75GHz? Is your head spinning yet?

Coherent optical technology is a critical element in determining how much capacity and how many high-speed services can be carried across networks, and in controlling their cost. But with multiple generations of coherent solutions available and more coming soon, navigating the different choices can be difficult. Unless you are immersed in the details of, and relationships between, bits and symbols, constellations and baud in your everyday life, it can be confusing to understand how the technology choices made in each solution influence overall system performance and network cost.

To clarify these relationships, here is an analogy that provides a more intuitive understanding: think of performance-optimized coherent optical transport as freight transport.

The goal of network providers using coherent is to transport as much capacity as they can, in the most cost-efficient manner that they can, using wavelengths across their installed fiber. This is similar to wanting to be as efficient as possible in freight transport, carrying as much payload as you can using available truck and road resources.

So now, let’s look at a coherent modem. This is the subsystem that takes in client traffic (e.g., 100 Gigabit Ethernet) and converts it into an optical signal using a certain modulation technique; this optical signal is what we call a wavelength. Each wavelength carries a certain throughput (for example 100Gb/s), takes up a certain amount of spectrum, and requires a certain amount of channel spacing on a fiber. In most systems today, there is 4800GHz of spectrum available in the C-band. So, for example, if a user deploys 100G wavelengths with 50GHz fixed channel spacing, their fiber can transport 96 x 100Gb/s, or 9.6Tb/s, of capacity. more>
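
To make that arithmetic concrete, here is a minimal sketch of the capacity calculation using the figures from the example above (4800GHz of C-band spectrum, 100G wavelengths on a 50GHz fixed grid). The function is ours, for illustration only, not a vendor API.

```python
# Minimal sketch of the capacity arithmetic described above, using the
# example's figures. Illustrative only.
def fiber_capacity_gbps(spectrum_ghz: float, channel_spacing_ghz: float,
                        wavelength_rate_gbps: float) -> float:
    """Total fiber capacity = number of channels x rate per wavelength."""
    channels = int(spectrum_ghz // channel_spacing_ghz)
    return channels * wavelength_rate_gbps

# 4800 GHz / 50 GHz spacing = 96 channels; 96 x 100 Gb/s = 9,600 Gb/s (9.6 Tb/s)
print(fiber_capacity_gbps(4800, 50, 100))
```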

Related>

Updates from Siemens

Essentra Components Achieves Cost Savings Up To 10%
By Emilia Maier – Essentra Components is a global leader in manufacturing and distributing plastic injection molded, vinyl dip molded and metal items.

The company is focused on being a low-cost producer so that it can secure revenue growth at attractive margins, and on facilitating continuous improvement programs with tight cost controls and productivity gains that reduce conversion costs.

With the integrated calculation system for component and tool costs from Siemens, Essentra Components delivers cost-effective, high-quality products in response to customer needs. Essentra is using the global costing solution in the bidding phase to deliver fast and accurate costs worldwide.

“Quote generation is done today within one hour, as opposed to five hours before we had Teamcenter product cost management, so we save 80% of our time,” says Derek Bean, Manager of Divisional Engineering Solutions at Essentra Components.

The cost estimators at Essentra consolidate and verify the cost results in terms of plausibility, competitiveness, opportunities and risks with the help of the Profitability Analysis module in Teamcenter Product Cost Management. more>

Related>

National Security: Keeping Our Edge

By James Manyika, William H. McRaven and Adam Segal – The United States leads the world in innovation, research, and technology development. Since World War II, the new markets, industries, companies, and military capabilities that emerged from the country’s science and technology commitment have combined to make the United States the most secure and economically prosperous nation on earth.

This seventy-year strength arose from the expansion of economic opportunities at home through substantial investments in education and infrastructure, unmatched innovation and talent ecosystems, and the opportunities and competition created by the opening of new markets and the global expansion of trade.

This time there is no Sputnik satellite circling the earth to catalyze a response, but the United States faces a convergence of forces that equally threaten its economic and national security. First, the pace of innovation globally has accelerated, and it is more disruptive and transformative to industries, economies, and societies. Second, many advanced technologies necessary for national security are developed in the private sector by firms that design and build them via complex supply chains that span the globe; these technologies are then deployed in global markets.

The capacities and vulnerabilities of the manufacturing base are far more complex than in previous eras, and the ability of the U.S. Department of Defense (DOD) to control manufacturing-base activity using traditional policy means has been greatly reduced. Third, China, now the world’s second-largest economy, is both a U.S. economic partner and a strategic competitor, and it constitutes a different type of challenger.

Tightly interconnected with the United States, China is launching government-led investments, increasing its numbers of science and engineering graduates, and mobilizing large pools of data and global technology companies in pursuit of ambitious economic and strategic goals.

The United States has had a time-tested playbook for technological competition. It invests in basic research and development (R&D), making discoveries that radically change understanding of existing scientific concepts and serve as springs for later-stage development activities in private industry and government.

It trains and nurtures science, technology, engineering, and mathematics (STEM) talent at home, and it attracts and retains the world’s best students and practitioners. It wins new markets abroad and links emerging technology ecosystems to domestic innovations through trade relationships and alliances. And it converts new technological advances into military capabilities faster than its potential adversaries.

Erosion in the country’s leadership in any of these steps that drive and diffuse technological advances would warrant a powerful reply. However, the United States faces a critical inflection point in all of them. more>

Economics Can’t Explain Why Inequality Decreases

A problem with Piketty’s explanation
By Peter Turchin – In September I went to an international conference in Vienna, Austria, “The Haves and the Have Nots: Exploring the Global History of Wealth and Income Inequality.” One thing I learned at the conference is that, apparently, economists don’t really know why inequality increases and decreases. Especially why it decreases.

Let’s start with Thomas Piketty, since Capital in the Twenty-First Century is currently the “bible” (or should I say “Das Kapital”?) of inequality scholars.

Piketty provides a good explanation of why inequality increases. It’s good not in the sense that everybody agrees with it, but in the sense of being good science: a general mechanism that is supported by mathematics and by data.

So far so good. But how does Piketty explain the decline of inequality during the middle of the twentieth century? It was a result of unique circumstances—two destructive world wars and the Great Depression. In other words, and forgive me for crudeness, shit happens.

This is not a particularly satisfactory conclusion. Of course, it’s possible that the general trend of inequality is always up, except for random exogenous events that knock it down once in a while. So devastating wars destroy property and, by making the wealthy poorer, reduce inequality. This is one of the inequality-reducing forces that Piketty mentions several times in his book.

To me such exogenous explanations are not satisfactory. My intuition (which I understand may not be shared by all) is that when inequality gets too high, there are forces that bring it down. In other words, to some degree it’s a regulatory process, and that’s why we don’t see truly extreme forms of inequality (when one person owns everything).

In Piketty’s view, the only reason we don’t see such extremes is because some kind of random event always intervenes before we get to it. more>

Updates from ITU

Why radiocommunications are so crucial for natural disaster management
By Mario Maniewicz – As the Director of ITU’s Radiocommunication Bureau, I cannot emphasize enough the relevance of radiocommunications, and more specifically of satellite communications, in the management and mitigation of potential crises.

Radiocommunication services have driven substantial transformation in many development-related sectors including environment, health and education – making them a key accelerator towards the achievement of the SDGs.

If we look into the United Nations Sustainable Development Goal (SDG) No. 13 on Climate Action, its first target is to strengthen resilience and adaptive capacity to climate-related hazards and natural disasters in all countries.

Allow me to illustrate the key role satellite communications play towards achieving this SDG target by providing vital connectivity before, during and after a disaster occurs.

  • In order to be well prepared for an event, accurate climate prediction and the detection of climate-related hazards are key, and both rely heavily on data obtained from space-sensing and Earth-observation satellite systems.
  • When a natural disturbance in the state of the atmosphere is detected, timely awareness and early warning allow the population to be better prepared and less affected by a natural or environmental adversity.
  • Moreover, satellite communications are often used for rescue and relief operations as well as in vital life-saving responses, since they remain a resilient solution even when terrestrial communications have been severely damaged.
  • Finally, satellite communications continue to provide valuable services until other telecommunication and basic services have been restored.

Taking into account the relevance of connectivity, especially for regions and countries affected by disasters, the ITU is striving to ensure that all the world’s people have access to affordable communications. more>

Related>

Are wages rising, falling, or stagnating?

By Richard V. Reeves, Christopher Pulliam, and Ashley Schobert – What is really happening to wages in America?

Over the past 12 months, average hourly wages rose 3.2 percent, according to the latest jobs report from the Bureau of Labor Statistics. But the longer-term story is contested.

Many analysts and commentators lament stagnating wages, while others celebrate wage growth.

To take just two of hundreds of examples, our colleagues in the Hamilton Project here at Brookings report “long-run wage stagnation for lower-wage workers”, while Michael Strain over at AEI writes that “the wages of a typical worker have increased by 32% over the past three decades. That’s a significant increase in purchasing power”. Though we would be remiss if we did not point out that this corresponds to less than a one percent increase per year.

The honest but boring answer to the question of what is happening to wages is: It depends. Specifically, it depends on how you measure it.

As so often, methodology really, really, really matters.

In the case of wage growth, four analytical decisions bear heavily on the results: which time period, which deflator, which workers (by gender), and which workers (in terms of position). more>
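
To see how just one of these decisions, the choice of deflator, can flip the story, here is a minimal sketch with purely illustrative numbers; the specific deflators (CPI versus PCE) and growth figures are our assumptions, not values from the article.

```python
# Minimal sketch of how the choice of price deflator changes measured
# real wage growth. All numbers are illustrative assumptions, not
# actual BLS/BEA series.
nominal_growth = 0.90   # assumed cumulative nominal wage growth over ~30 years
cpi_inflation = 0.75    # assumed cumulative inflation under a CPI-style deflator
pce_inflation = 0.55    # assumed cumulative inflation under a PCE-style deflator

def real_growth(nominal: float, inflation: float) -> float:
    """Cumulative real growth implied by nominal growth and a deflator."""
    return (1 + nominal) / (1 + inflation) - 1

print(f"real wage growth, CPI deflator: {real_growth(nominal_growth, cpi_inflation):.1%}")
print(f"real wage growth, PCE deflator: {real_growth(nominal_growth, pce_inflation):.1%}")
# With these assumed inputs the same nominal series shows ~8.6% real
# growth under the CPI-style deflator but ~22.6% under the PCE-style
# deflator -- close to "stagnation" in one telling, meaningful gains in the other.
```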

What’s Elizabeth Warren’s wealth tax worth?

By Isabel V. Sawhill and Christopher Pulliam – On both sides of the Atlantic, economic inequality has rocketed up the political agenda and inspired a new wave of populism. Wealth inequality is high and rising in the UK and staggeringly so in the US. The top 1% of American households now have more wealth than the bottom 90%. In the UK, the top 10% holds over half the wealth. The richest 400 individuals in the US average a net worth of $7.2 billion.

How did we get to this point? As Thomas Piketty, in his book Capital, famously argued, a capitalist economy left to its own devices will tend to produce not just inequality but ever-rising inequality of wealth – and the income derived from wealth. The main reason is that the returns earned on assets such as stocks and bonds normally exceed the growth of wages.

Imagine an economy with one capitalist and one wage earner. If the annual rate of return to financial assets is, say, 3%, but wages are only growing by 2%, more and more income ends up in the hands of the capitalist. Wealth then begets more wealth as the capitalist, not needing to spend all of his added income, adds to his existing wealth and reaps ever-growing income from that wealth. Unless a war or other shock destroys his wealth (think depression or the devastation in Europe after the Second World War), or government decides to tax it away, we end up with the rise in wealth inequality that we are now seeing in many rich countries – the US in particular.
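
The arithmetic behind this two-person story is easy to check. Below is a minimal sketch using the article’s illustrative rates (a 3% return and 2% wage growth) and assuming, for simplicity, that the capitalist reinvests all of his returns; the starting values are arbitrary.

```python
# Minimal sketch of the two-person economy described above. Rates come
# from the article's illustration; starting values and full reinvestment
# of returns are simplifying assumptions.
r, g = 0.03, 0.02           # return on wealth vs. wage growth
wealth, wage = 100.0, 5.0   # assumed initial wealth and annual wage

for year in (0, 25, 50, 100):
    capital_income = r * wealth * (1 + r) ** year   # wealth compounds at r
    labour_income = wage * (1 + g) ** year          # wages grow at g
    share = capital_income / (capital_income + labour_income)
    print(f"year {year:3d}: capitalist's share of total income = {share:.0%}")
# With these numbers the capitalist's share rises from about 38% at the
# start to roughly 61% after a century -- wealth begets more wealth.
```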

There is something deeply disturbing about Piketty’s work. If one takes his thesis seriously, it means that the inequality of wealth and its corollary, income inequality, along with their continued growth, are the new normal. They are baked into a capitalist economy.

Of course, some financial capital gets invested in productive assets that help the economy grow. But productive investment and growth have slowed in recent decades, making it hard to argue that the rise in wealth at the top has benefited everyone. In the meantime, the accumulation of wealth in high-income households is one reason that income inequality is rising so sharply at the very top. While the richest 20% of US households, which benefit from a lot of human capital but not a lot of wealth, saw their market incomes rise by 96% between 1979 and 2016, the top 1% – which receives far more of its income from wealth – saw its income rise by a staggering 219%.

In short, growing wealth inequality spawns growing income inequality, so if we care about the latter, we cannot focus only on redistributing income. We need to tackle the accumulation of wealth as well.

What to do? Senator Elizabeth Warren, a serious contender for the US presidency, has proposed a wealth tax. more>

Updates from Chicago Booth

Want to pay less tax? Improve your firm’s internal reporting
By Marty Daks – When companies engage in the great American pastime known as tax avoidance, many parse the Internal Revenue Code for loopholes to reduce their effective tax rate. But research suggests they should also scrutinize the quality of their internal reporting.

Internal information quality (IIQ), a term coined by Chicago Booth’s John Gallemore and University of North Carolina’s Eva Labro, encompasses computer reporting systems and any other resources that a company devotes to ensuring the quality and ease of access of information within a firm. The elements that constitute IIQ have been largely overlooked in tax-avoidance literature—perhaps because they are usually not observable, and are difficult for academics to measure.

Gallemore and Labro argue companies should pay more attention to these issues, which they define in terms of the accessibility, usefulness, reliability, accuracy, quantity, and signal-to-noise ratio of the data and knowledge within an organization. Their findings suggest that firms with high IIQ tend to enjoy lower effective tax rates and, all else being equal, a smaller tax bite.

Gallemore and Labro employed four publicly available variables, using data from 1994 to 2010, to rate firms’ IIQ: the speed at which management released an earnings announcement after its fiscal year closed, the accuracy of management’s earnings forecasts, the absence of material weaknesses in internal controls, and the lack of restatements due to errors.

The researchers used these measures to identify companies that released earnings more rapidly and forecasted them more accurately, and had fewer Section 404 citations and restatements due to errors. They assigned these firms higher IIQ ratings.
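
As a purely hypothetical illustration of how such proxies might be folded into a single rating (not Gallemore and Labro’s actual methodology or weighting), consider a simple composite score:

```python
# Hypothetical composite IIQ score built from the four proxies described
# above. Thresholds and equal weights are our assumptions for
# illustration only, not the researchers' actual method.
from dataclasses import dataclass

@dataclass
class FirmYear:
    days_to_earnings_announcement: int   # fewer days after fiscal year-end is better
    forecast_error: float                # |actual - forecast| / |actual|, lower is better
    material_weakness: bool              # material weakness in internal controls reported
    error_restatement: bool              # restatement due to errors

def iiq_score(f: FirmYear) -> float:
    """0-4 composite: one point per favorable indicator (assumed thresholds)."""
    score = 0.0
    score += 1.0 if f.days_to_earnings_announcement <= 30 else 0.0
    score += 1.0 if f.forecast_error <= 0.05 else 0.0
    score += 1.0 if not f.material_weakness else 0.0
    score += 1.0 if not f.error_restatement else 0.0
    return score

print(iiq_score(FirmYear(25, 0.02, False, False)))  # high IIQ -> 4.0
print(iiq_score(FirmYear(60, 0.12, True, True)))    # low IIQ  -> 0.0
```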

High-IIQ firms, they find, tend to exhibit some positive traits, including centralized and standardized business transaction processing, more-efficient reporting practices, and the ability to share data across business units and geographical locations. more>

Related>