Tag Archives: Skills

How digitization must be harnessed to save jobs

A framework agreement between the social partners should ensure job security and worker involvement are prioritized across the European Union.
By Esther Lynch – The announcement of job losses around Europe as a result of the Covid-19 pandemic has become an almost daily occurrence, as all sectors struggle to cope with the impact of lockdown. Preliminary research indicates that job losses from restructuring doubled in the second quarter of 2020, compared with previous years. Some 60 million EU workers are at risk of unemployment as the consequences of the crisis play out.

Trade unions are at the forefront of efforts to protect workers’ livelihoods. At the end of June, the European Trade Union Confederation and the three major EU-level employers’ organizations—BusinessEurope, CEEP and SMEunited—signed an Autonomous Framework Agreement to work together on the introduction of digitalization in workplaces across Europe. In the context of Covid-19, the deal has much wider relevance, as it provides a blueprint for negotiating a ‘just transition’ and change in the world of work.

The priority is to encourage an approach that fully involves workers and their trade unions. This must apply to restructuring situations caused by the virus, as well as to planned change. The agreement sets out that, instead of making redundancies, employers need to look at other options for maintaining and investing in their workforces, creating new opportunities and enabling workers to adapt to change.

The agreement applies across the EU, covering both public and private sectors and all economic activities, including online platform workers. The right of trade unions to represent workers is recognized and the agreement specifies that, in preparing for negotiations, unions must be able to consult all employees and should have the facilities and information required to participate fully throughout.

The issues of digitization, restructuring and equipping different sectors to respond to the coronavirus crisis are all interlinked. The fact that so many workers have suddenly found themselves relying on digital technologies to carry out their tasks has created a step change in terms of work organization. One survey indicates that 74 per cent of companies expect some of their staff to continue working remotely in the long term. These workers must have full employee rights and representation, with no erosion of pay and working conditions. more>

Updates from Adobe

Drawing Fashion
Kasia Smoczynska finds her inspiration on the catwalk.

At the launch of Givenchy’s 2019 Spring Collection in Paris, it was a technicolor gown that took her breath away. Created by British fashion designer Clare Waight Keller, the dress is a kaleidoscope of moving ribbons topped with an Elizabethan ruffle. To Smoczynska, it looked like “thousands of colorful fringes moving in every direction.”

“I had to draw it,” she says.

A few days later, Smoczynska was back in Leeds, England, where she lives and works. She dropped her iPad Pro onto an easel she had once purchased to make gouache paintings. (“I did maybe two paintings, and now I use it only for my iPad,” she admits.) Then, firing up Adobe Fresco, Smoczynska started to draw the gown with sweeping digital brushstrokes of yellows, blues, and reds. “I knew it would be fun to illustrate all those fringes,” she recalls.

The finished piece is typical of Smoczynska’s work: expressive, spontaneous, and charged with energy. “In my eyes it’s a look for a modern queen,” she says, explaining why her model sits atop a throne instead of marching down the catwalk. more>

Related>

Updates from Ciena

Planning for 5G Success: A Tale of Two Operators
The industry is moving forward with 5G deployments, motivated by differentiated service offerings. Blue Planet’s Soumen Chatterjee describes how 5G Automation is helping two mobile network operators plan their own path to 5G success.
By Soumen Chatterjee – In my earlier blog, I wrote about the promise of 5G network slicing, which opens the door to a variety of service offerings to support differentiated requirements across industry sectors. In the interim, the challenging economic conditions of the coronavirus pandemic have given mobile network operators (MNOs) a chance to reassess their 5G strategies and double down on pursuing new service opportunities.

The shift in consumer lifestyle patterns may have affected the timing of some 5G use cases – industrial automation demand may slow, but interest in multimedia remote sporting experiences is expected to grow. 5G brings unprecedented opportunities to provide customers with new services and an exceptional user experience, given performance of up to 100 Gbps and latency on the order of 1 millisecond. But 5G also brings additional operational complexity, with network slicing technology, new radios, rearchitected transport, and a virtualized 5G core. 5G needs automation in the back end to manage this increased complexity and to contain the associated operational costs. For MNOs, automation is a must, not an option.

In my discussions with MNOs, it is apparent that planning for 5G deployments is heavily influenced by an operator’s legacy infrastructure – infrastructure that exists in the field and systems that exist in the network operations center (NOC). However, no matter the starting point, it is essential to have dynamic planning capabilities that simplify and accelerate each phase of the process.

One incumbent mobile operator is planning to roll out small-cell 5G radios alongside its 4G radios, in non-standalone (NSA) mode. First, however, it needs visibility of its current network assets. Its legacy inventory and operational support systems (OSS) are disjointed, so it is difficult to obtain an accurate and comprehensive view.

Furthermore, those OSS are not up to the task of modelling new 5G constructs. It would be an extremely heavy lift to shoehorn 5G data in, with very limited scope for extensibility. On the other hand, introduction of a new system could further fragment or duplicate operational data.

This is when Blue Planet’s federation capabilities prove to be a crucial step for 5G planning. With Blue Planet’s 5G Automation solution, data from existing systems is federated, reconciled, and synchronized into a new unified data model built on state-of-the-art graph database technology which can accommodate complex 5G relationships. There are also existing business processes – mostly manual – that rely on OSS, which need to be modernized to support automated 5G workflows.
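
To make the federation idea concrete, here is a minimal sketch (in Python, using the networkx graph library) of what reconciling two inventory exports into one graph model might look like. The records, field names, and the "nsa-anchor" relationship are invented for illustration; this is not Blue Planet's actual data model.

```python
# Minimal sketch of inventory federation (hypothetical data and field names,
# not Blue Planet's actual model): merge records from two disjoint systems
# into one graph keyed on a shared asset ID, then model a 5G relationship.
import networkx as nx

legacy_oss = [   # export from a legacy inventory system (invented records)
    {"id": "cell-0042", "type": "4g-radio", "site": "LON-01"},
    {"id": "rtr-0007",  "type": "router",   "site": "LON-01"},
]
planning_tool = [  # export from a separate planning database (invented records)
    {"id": "cell-0042", "sw_version": "21.3"},  # same asset, extra attributes
    {"id": "nr-0101", "type": "5g-nr-radio", "site": "LON-01"},
]

G = nx.Graph()
for record in legacy_oss + planning_tool:
    node_id = record["id"]
    if G.has_node(node_id):
        G.nodes[node_id].update(record)  # reconcile: merge, don't duplicate
    else:
        G.add_node(node_id, **record)

# A relationship the siloed systems could not express: an NSA 5G radio
# anchored to an existing 4G cell.
G.add_edge("nr-0101", "cell-0042", relation="nsa-anchor")

print(G.nodes["cell-0042"])  # unified view assembled from both sources
```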

Another MNO customer is a new entrant that is not encumbered by pre-existing infrastructure and OSS, has more flexibility in designing new systems and processes to support its 5G strategy, and can implement them more quickly. This MNO is planning to deploy tens of thousands of 5G cell sites in standalone (SA) mode within a few years. To scale quickly, it needs to design automation into its business processes from the outset. Blue Planet’s 5G Automation solution is a natural fit, as it provides multi-vendor service orchestration and assurance founded on a unified inventory of hybrid physical and virtual infrastructure.

Beyond the radio infrastructure, both MNOs are looking ahead to architecting customizable network slices end-to-end across the radio access network (RAN), transport and cloud domains, to satisfy their customers’ requirements. To this end, Blue Planet provides the holistic operational system to help determine the placement of 5G Core (5GC) virtualized network functions (VNFs) at the edge or in the core, with necessary compute capacity, to best support a variety of latency and bandwidth needs. more>
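
As a rough illustration of the edge-versus-core placement trade-off described above, the toy heuristic below picks a site for a 5GC function based on a slice's latency budget and available compute. The site names, RTT figures, and selection rule are assumptions made for this sketch, not Blue Planet's algorithm.

```python
# Toy illustration (not Blue Planet's algorithm): place a 5G core function at
# the edge or in the core based on a slice's latency budget and free compute.
from typing import Optional

SITES = {
    "edge-leeds":  {"rtt_ms": 2,  "cpu_free": 64},    # scarce, low latency
    "core-london": {"rtt_ms": 12, "cpu_free": 2048},  # plentiful, farther away
}

def place_vnf(latency_budget_ms: float, cpu_needed: int) -> Optional[str]:
    """Pick the site with the most spare capacity among those meeting the budget."""
    candidates = [
        (name, site) for name, site in SITES.items()
        if site["rtt_ms"] <= latency_budget_ms and site["cpu_free"] >= cpu_needed
    ]
    if not candidates:
        return None  # no feasible placement; capacity planning needed
    name, site = max(candidates, key=lambda c: c[1]["cpu_free"])
    site["cpu_free"] -= cpu_needed  # reserve the compute
    return name

print(place_vnf(latency_budget_ms=5, cpu_needed=8))   # tight budget -> edge-leeds
print(place_vnf(latency_budget_ms=50, cpu_needed=8))  # relaxed budget -> core-london
```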

Related>

Updates from Chicago Booth

Can regulation rein in algorithmic bias?
By Sendhil Mullainathan – Last year, you published a paper documenting how an algorithm used by health-care organizations generated racially biased results. What takeaways did that offer in terms of how algorithmic bias differs from human bias?

That paper might be, by some measures, among the strangest papers I’ve ever worked on. It’s a reminder of the sheer scale that algorithms can reach.

Exact numbers are hard to get, but about 80 million Americans are evaluated through this algorithm. And it’s not for some inconsequential thing: it is an algorithm used by many health-care systems to decide which patients should get put into what are called care-management programs. Care-management programs are for people who are going to be at the hospital a lot. If you have many conditions, you’re going to be in the system frequently, so you shouldn’t have to go through the normal front door, and maybe you should have a concierge who works just with you. You get additional resources to manage this complex care.

It costs a lot of money to put somebody in a care-management program. You really want to target these programs. So the question is, who should be in them?

Over the past five years, there have been algorithms developed using health records of people to figure out who is at highest risk of using health care a lot. These algorithms produce a risk score, and my coresearchers and I wanted to know if there was any racial bias in these scores.

The way we looked for it was to take two people given the same score by the algorithm—one white and one Black. Then we looked at those two people and asked whether, on average, the white person had the same level of sickness as the Black person. What we found is that he or she didn’t, that when the algorithm gives two people the same score, the white person tends to be much healthier than the Black person. And I mean much healthier, extremely so. If you said, “How many white people would I have to remove from the program, and how many Black people would I have to put in, until their sickness levels were roughly equalized?” you would have to double the number of Black patients. It is an enormous gap.
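
Mechanically, the check described here is a calibration comparison across groups: at each risk score, compare average sickness. Below is a minimal Python sketch with made-up numbers (not the study's data or code) of what that comparison looks like.

```python
# Illustrative sketch of the audit described above (toy data, not the study's):
# at each algorithm risk-score level, compare average sickness across groups.
from collections import defaultdict
from statistics import mean

# Each patient: (risk_score_decile, group, number_of_chronic_conditions)
patients = [
    (9, "white", 3.1), (9, "white", 2.8), (9, "black", 4.9), (9, "black", 5.2),
    (5, "white", 1.4), (5, "white", 1.7), (5, "black", 2.6), (5, "black", 2.9),
]

by_score_group = defaultdict(list)
for score, group, conditions in patients:
    by_score_group[(score, group)].append(conditions)

for score in sorted({s for s, _, _ in patients}, reverse=True):
    w = mean(by_score_group[(score, "white")])
    b = mean(by_score_group[(score, "black")])
    # Equal scores should mean equal sickness; a persistent gap signals bias.
    print(f"decile {score}: white avg {w:.1f} vs Black avg {b:.1f} conditions")
```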

I say it’s one of the craziest projects I’ve worked on in part because of the sheer scale of this thing. But there are a lot of social injustices that happen at a large scale. What made it really weird was when we said, “Let’s figure out what’s causing it.” In the literature on algorithmic bias, everyone acts like algorithms are people, like they’re biased [in the sense that people are]. It’s just a little piece of code. What went wrong in the code?

What we found is something that we’re finding again and again in all of our A.I. work, that every time you see that an algorithm has done something really bad, there’s no engineering error. That’s very, very different than the traditional bugs in code that you’re used to: when your computer crashes, some engineering bug has shown up. I’ve never seen an engineering bug in A.I. The bug is in what people asked the algorithm to do. They just made a mistake in how they asked the question. more>

Related>

Supersonic Flight is Back

The 1970s Concorde supersonic transport has been re-imagined by Boom Technology with an eye toward the modern business traveler.
By John Blyler – The supersonic phoenix is rising again. The latest incarnation of faster-than-sound flight for the commercial market is being created by Boom Supersonic, an aerospace startup. Boom recently announced that its supersonic demonstrator, XB-1, will roll out on October 7, 2020. XB-1 is an independently developed supersonic jet that will demonstrate key technologies for Overture, Boom’s commercial airliner, such as advanced carbon-fiber composite construction, computer-optimized high-efficiency aerodynamics, and an efficient supersonic propulsion system.

Boom Technology’s co-founder and then-VP of Technology, Joshua Krall, was a keynote speaker at Dassault Systèmes’ annual 3DExperience Forum in 2019.

“Our goal is to make high-speed travel available to everyone – not just the thrill seekers or rich travelers,” explained Krall at the forum. “It’s not so much about time saved but about life gained back from faster air travel.”

For all but the wealthiest travelers, the promise of a supersonic adventure never arrived. The last supersonic airliner was the British-French turbojet-powered Concorde, which operated from 1976 until 2003. It had a maximum speed of just over twice the speed of sound, Mach 2.04 (1,354 mph or 2,180 km/h at cruise altitude). Most of today’s commercial jet aircraft reach around 400 – 500 knots (460 – 575 mph), roughly Mach 0.6 to 0.75 at sea-level conditions. The Mach number is simply the ratio of a vehicle’s speed to the speed of sound. At sea level with an air temperature of 15 degrees Celsius, the speed of sound is 761 mph.
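
For reference, the relationship can be written out: the Mach number is the flight speed divided by the local speed of sound, which for air at sea-level temperature works out to the 761 mph quoted above.

```latex
M = \frac{v}{a}, \qquad
a = \sqrt{\gamma R T}
  \approx \sqrt{1.4 \times 287\,\mathrm{J/(kg\,K)} \times 288\,\mathrm{K}}
  \approx 340\,\mathrm{m/s} \approx 761\,\mathrm{mph}
```

By that yardstick, a 500 mph airliner flies at about M = 500/761 ≈ 0.66; at cruise altitude the air is colder and the speed of sound lower, so the true cruise Mach number is somewhat higher.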

There are zero scientific barriers to supersonic flight, said Krall. However, there are cost barriers. Boom hopes to reduce the cost barrier for supersonic flight with Overture, the first Boom aircraft for commercial use. Overture will travel at Mach 2.2, approximately 2.6 times faster than existing commercial jet aircraft. It will cruise at 60,000 feet, about twice as high as today’s commercial aircraft. more>

Is this Last Mile for the Million-Mile Battery?

Announcements from Tesla and CATL show that long-lived, cobalt-free, and competitively priced EV and grid/home batteries may finally have arrived.
By John Blyler – The much-discussed 1-million-mile (1.6-million-kilometer) battery may now be a reality. As the name suggests, these batteries would last for 1 million miles without breaking down. Tesla and China-based Contemporary Amperex Technology (CATL) have announced such a battery, one that not only lasts longer but also costs less than $100/kWh and uses cobalt-free materials. Why are these two features important?

It has long been a metric for the success of electric vehicles (EVs) that their battery energy density reach parity with traditional gasoline-powered engines. Such a condition would allow EVs to compete with gasoline vehicles on both weight and range – especially the latter. Put simply, if gasoline were 100 times more energy-dense than a battery, a vehicle would need 100 lbs of battery to go as far as it could on 1 lb of gasoline.

But past studies by Argonne National Laboratory have shown that system efficiency is another key consideration when comparing EV and gasoline energy densities. The research lab noted that electric powertrains are far more efficient than gasoline powertrains. In many cases, less than 20% of the energy contained in a gallon of gas is actually converted to forward motion; by the time that power has passed through the transmission and differential to the wheels, it has suffered significantly more mechanical losses.

By contrast, an electric powertrain can be more than 90% efficient. This suggests that an EV battery’s energy density could be far lower than the gasoline equivalent and the EV would still come out ahead. more>
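
Putting the excerpt's rough figures together (gasoline 100 times more energy-dense, under 20% gasoline-powertrain efficiency, over 90% EV efficiency) shows how far the efficiency gap closes the density gap; this is illustrative arithmetic on the article's numbers, not a measured result.

```latex
\frac{m_\mathrm{battery}}{m_\mathrm{gasoline}}
  \approx \frac{e_\mathrm{gas}}{e_\mathrm{batt}} \times \frac{\eta_\mathrm{gas}}{\eta_\mathrm{EV}}
  \approx 100 \times \frac{0.20}{0.90} \approx 22
```

On those assumptions, roughly 22 lbs of battery, not 100, would match the wheel energy of 1 lb of gasoline.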

Related>

5 Best Practices for Utilizing Open Source Software

Open source software is everywhere and has the potential to help businesses accelerate development and improve software quality. Achieving these results can be challenging if care is not taken.
By Jacob Beningo – Here are five best practices for utilizing open source software successfully.

Best Practice #1 – Use an abstraction layer to remove dependencies
One of the common issues with code bases I review is that developers tightly couple their application code with the software libraries they use. For example, if a developer is using FreeRTOS, their application code makes calls specific to the FreeRTOS APIs in such a way that if a developer ever decided to change their RTOS, they’d have to rewrite a lot of code to replace all those RTOS calls. You might decide that changing libraries is rare, but you’d be surprised how often teams start down a path with one OS, library or component only to have to go back and rewrite code when they decide they need to make a change.

The first thing teams should do when they select an open source component – and even a commercial component – is to create an abstraction layer for interacting with it. Using an RTOS as an example, a team would use an OS abstraction layer (OSAL) that allows them to write their application code against OS-independent APIs. If the OS changes, the application doesn’t care, because it’s accessing the abstraction layer, and the software change can take minutes rather than days.
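
The pattern is language-agnostic; here is a minimal sketch of it in Python for brevity (a real FreeRTOS OSAL would be written in C, wrapping calls such as xQueueSend()). All class and function names below are invented for illustration.

```python
# OSAL sketch: application code depends only on the abstract interface, so
# swapping the OS touches one adapter class, not the application.
from abc import ABC, abstractmethod
from queue import Queue, Empty, Full
from typing import Any, Optional

class OSQueue(ABC):
    """OS-independent queue API that application code is written against."""
    @abstractmethod
    def send(self, item: Any, timeout_s: float) -> bool: ...
    @abstractmethod
    def receive(self, timeout_s: float) -> Optional[Any]: ...

class FreeRTOSStyleQueue(OSQueue):
    """Adapter: in a C OSAL these methods would wrap xQueueSend()/xQueueReceive().
    Here a host-side stand-in (queue.Queue) plays the role of the RTOS primitive."""
    def __init__(self, depth: int = 8) -> None:
        self._q: Queue = Queue(maxsize=depth)
    def send(self, item: Any, timeout_s: float) -> bool:
        try:
            self._q.put(item, timeout=timeout_s)
            return True
        except Full:
            return False
    def receive(self, timeout_s: float) -> Optional[Any]:
        try:
            return self._q.get(timeout=timeout_s)
        except Empty:
            return None

def application_task(q: OSQueue) -> None:
    # No RTOS-specific names here: retargeting the OS means writing one new
    # OSQueue adapter, not rewriting the application.
    q.send({"sensor": "temp", "value": 21.5}, timeout_s=0.1)
    print(q.receive(timeout_s=0.1))

application_task(FreeRTOSStyleQueue())
```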

Best Practice #2 – Leverage integrated software when possible
Most open source software is written in its own sandbox without much thought given to other components with which it may need to interact. Components are often written with different coding standards, styles, degrees of testing, and so on. When you start to pull together multiple open source components that were not designed to work with each other, it can result in long debugging sessions, headaches, and missed deadlines. Whenever possible, select components that have already been integrated and tested together.

Best Practice #3 – Perform a software audit and quality analysis
There is a lot of great open source software and a lot of not so great software. Before a developer decides to use an open source component in their project, they need to make sure they take the time to perform their due diligence on the software or hire someone to do it for them. This involves taking the time to audit the component and perform a quality analysis. Quality is often in the eye of the beholder.

At a minimum, when starting out with an open source component, the source code should be reviewed for the following (a minimal complexity-audit sketch appears after the list):

Complexity, using cyclomatic complexity measurements
Functionality, to ensure it meets the business’s needs and objectives
Adherence to best practices and coding standards (based on needs)
Ability to handle errors
Testability
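
As a starting point for the complexity item above, here is a small, stdlib-only Python sketch that approximates McCabe cyclomatic complexity for each function in a Python source file by counting branch points. It is a rough screen under that approximation, not a substitute for a dedicated tool (for C components, tools such as pmccabe or lizard fill the same role).

```python
# Approximate the cyclomatic complexity of every function in a Python file
# by counting decision points (McCabe's metric: decisions + 1).
import ast
import sys

BRANCHES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(func: ast.AST) -> int:
    return 1 + sum(isinstance(n, BRANCHES) for n in ast.walk(func))

tree = ast.parse(open(sys.argv[1]).read())
for node in ast.walk(tree):
    if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
        score = cyclomatic_complexity(node)
        print(f"{node.name}: {score}" + ("  <-- review" if score > 10 else ""))
```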

Best Practice #4 – Have the license reviewed by an attorney
Open source software licensing can be difficult to navigate. There are a dozen or so different licensing schemes, which place different requirements on the user. In some cases, the developer can use the open source software as they see fit. In others, the software can be used, but any software combined with it must also be open sourced. That can mean releasing a product’s secret sauce, which could damage the company’s competitive market advantage. more>

There’s a hidden economic trendline that is shattering the global trade system

By Marshall Auerback and Jan Ritch-Frel – Former U.S. Treasury Secretary Lawrence Summers has recently conceded: “In general, economic thinking has privileged efficiency over resilience, and it has been insufficiently concerned with the big downsides of efficiency.” Policy across the globe is, therefore, moving in a more overtly nationalistic direction to rectify this shortcoming.

COVID-19 has accelerated a process that was well underway before it, spreading beyond U.S.-China-EU trade negotiations and into the world’s 50 largest economies. As much as many defenders of the old order lament this trend, it is as significant a shift as the dawn of the World Trade Organization global trade era.

Economists, politicians, and leading pundits are often tempted to see new economic patterns through the prisms of the past; we are therefore likely to hear that we’re back in an era of 19th-century mercantilism, or 1970s-style stagflation. But that misses the moment—the motives are different, and so are the outcomes.

What we are experiencing is the realisation by state planners of developed countries that new technologies enable them rapidly to expand, or newly establish, profitable production capacity closer to or inside their own markets. The cost savings in transport, packaging and security, and the benefits to regional neighbours and these countries’ domestic workforces, will increasingly compete with the price of goods produced through the current internationalized trade system. U.S. national politicians from President Donald Trump to Senator Elizabeth Warren will be joined by a growing chorus who see the long-term domestic political benefit of supporting this transition.

The combination of high-speed communication, advances in automated manufacturing and computing, and widespread access to the blueprints and information necessary to kick-start new production capacity increasingly makes the current international network of supply chains resemble a Rube Goldberg contraption. It also lightens the currency-outflow challenge that many economies have had to deal with for the past seven decades.

Growing political will to restore manufacturing capacity in the national interest will have a shattering effect on countries that built up their economies through a labour price advantage over the past 40 years. No amount of currency depreciation or product dumping can overcome the reality of a country’s foreign customer base suddenly opting to produce and buy their own goods at competitive prices.

Taken in sum, the transformation underway isn’t just Donald Trump demanding less dependency on China’s production capacity—it’s a global process. It’s also India signalling it’s going to try to strike its own technological path away from China.

The rationales governments offer for escaping the strictures of the existing trade arrangements are fairly easy to supply: a mix of opportunism and need, tied to the exigencies of the moment, such as the current pandemic, and to long-term national security, which can of course be stretched to cover almost any economic activity of significant scope. Senator Elizabeth Warren’s introduction in July of her sweeping Pharmaceutical Supply Chain Defense and Enhancement Act demonstrates that the U.S. power establishment is beginning to reach a consensus on this issue—no longer the sole province of Trump-era nationalism. “To defeat the current COVID-19 crisis and better equip the United States against future pandemics, we must boost our country’s manufacturing capacity,” Warren said, recasting the consequences of decades of policy to offshore our economic production as an “overreliance on foreign countries.” Likewise, Senator Tom Cotton has introduced a new bill focusing on domestic production of semiconductors, titled the “American Foundries Act of 2020,” which aims to rebuild the country’s semiconductor capacity. This bill too has significant bipartisan backing.

The Japanese government’s newly defined restrictions on foreign investment, which the Financial Times reports cover around a dozen sectors including “power generation, military equipment, [computer] software [and technology],” in effect prioritize the claims of domestic manufacturers on national security grounds.

The government of Australia has likewise outlined new powers to scrutinize overseas investment, as well as to force foreign companies to sell their assets if they pose a national security threat. The proposals come in the wake of an intensifying trade war between the governments of Beijing and Canberra, alongside “a dramatic increase in the number of foreign investment bids probed by Australia’s spy agency ASIO, over fears that China was spying on sensitive health data,” according to news.com.au. At the same time, there has been an overhaul of thought with regard to manufacturing, something Australia hasn’t typically done much of. The headlines from Australia are beginning to look a lot like the Area Development stories in the United States.

The Canadian government has also announced plans to enhance foreign investment scrutiny “related to public health or critical supply chains during the pandemic, as well as any investment by state-owned companies or by investors with close ties to foreign governments,” according to the Globe and Mail. This attempt to disaggregate beneficial foreign investment flows from those deemed contrary to the national interest used to be a common feature of government policy in the post-World War II period. Canada established the Foreign Investment Review Agency in 1973 as a result of mounting concerns about rising overseas investment, notably the domination of U.S. multinationals, in the Canadian economy. Its provisions were repeatedly downgraded as globalisation pressures intensified, but its value is now being reassessed for compatibility with national health policy and resiliency in manufacturing chains. Predictably, pharmaceutical independence is high on the list.

Taiwan, “a net importer of surgical masks before the pandemic, [has] created an onshore mask-manufacturing industry in just a month after registering its first infections in January,” reports the Financial Times. “Taiwan’s President Tsai Ing-wen… said Taipei would repeat that approach to foster other new industries.” And world economists have noted that Taiwan and Vietnam lead the world in growth of global market share in exports, at the expense of larger economies like China.

In Europe, the EU leadership is publicly indicating a policy of subsidy and state investment in companies to prevent Chinese buyouts or “undercutting… prices.” This was supposed to represent a cross-European effort, but the coronavirus policy response is increasingly driven at the national level. Consequently, it is starting to fracture the EU’s single market, which has long been constructed on an intricate network of cross-border supply chains and strict rules preventing state subsidies to national champions.

Even Germany, with a vibrant export sector that has long made it a beneficiary of globalization, has also signaled a move toward greater economic nationalism.

Economic nationalist considerations are also driving a shift in Britain’s negotiating stance in the current Brexit trade negotiations with the EU, with the UK clearly prioritizing national sovereignty over frictionless free trade with its former single-market partners, even if that means a so-called “Hard Brexit.” The EU’s single-market rules specifically preclude state aid to specific industries if it undermines the operation of the single market. But the UK’s chief negotiator, David Frost, has made it clear that the ability to break free from the EU’s rulebook was essential to the purpose of Brexit, even if that meant reverting to the less favorable WTO trade relationship that exists for other non-EU countries.

Over the past 40 years, this kind of overt economic nationalism, especially as it has pertained to domestic manufacturing capabilities, has generally been eschewed by the United States, at least until the ascension of Donald Trump to the White House. In part, this is a product of the fact that as global hegemon, the United States used to be able to dominate global institutions (such as the International Monetary Fund and the WTO) and shape them toward U.S. national interests. But when necessary, national security considerations have intervened.

More recently, national security considerations in the semiconductor industry have again revived in the wake of the Trump administration’s growing dispute with Chinese 5G telecommunications equipment maker Huawei. The U.S. Commerce Department has now mandated that all semiconductor chip manufacturers using U.S. equipment, IP, or design software will require a license before shipping to Huawei. This decision has forced the world’s biggest chipmaker—Taiwan Semiconductor Manufacturing Company (TSMC)—to stop taking fresh orders from Huawei, as it uses U.S. equipment in its own manufacturing processes. Paradoxically, then, the Trump administration has exploited pre-existing global supply linkages in the furtherance of a more robust form of economic nationalism. The same policy attitude is now visible with regard to pharmaceuticals (as it is in other parts of the world, to the likely detriment of China and India).

A shift like this will have a knock-on effect that will reverberate to the other parts of the world that for centuries have been forcibly limited—by arms and finance—to being sources of raw material export, refined if they were lucky. They will watch closely what happens with Australia, which for the majority of the past 150 years has been an exporter of food and minerals, but is now jumping on the project to establish a national manufacturing base.

As dozens of countries build their own manufacturing base—something only a handful of countries controlled for most of modern history—big questions will emerge about geopolitical stabilization and the classical tools of foreign influence. The world today in some respects resembles the 19th century’s balance-of-power politics, even as the majority of countries understand that some minimal level of state collaboration is essential to combat shared challenges. China is party to a growing number of global disputes, as emerging great powers typically are: the U.S. vs. China, China vs. India, Japan vs. China, China vs. Australia, and the EU vs. China. But hot wars are unlikely to feature as prominently as they did two centuries ago.

Expect to see Cold War-style conflict intensify, however, albeit in new forms. Instead of the old geopolitical arenas, such as access to vital commodities or stable petroleum markets, the new competition will put greater weight on access to advanced research and technologies, such as the collection, transfer and storage of data and the quantum computing power to process it.

The speed at which global supply chains can potentially shift to accommodate the rise in economic nationalism is considerable. The success with which we manage the transition will largely settle the debate as to whether it is, in fact, the better path to greater prosperity and global stability. more>

Updates from McKinsey

Taking supplier collaboration to the next level
Closer relationships between buyers and suppliers could create significant value and help supply chains become more resilient. New research sheds light on the ingredients for success.
By Agustin Gutierrez, Ashish Kothari, Carolina Mazuera, and Tobias Schoenherr – Companies with advanced procurement functions know that there are limits to the value they can generate by focusing purely on the price of the products and services they buy. These organizations understand that when buyers and suppliers are willing and able to cooperate, they can often find ways to unlock significant new sources of value that benefit them both.

Buyers and suppliers can work together to develop innovative new products, for example, boosting revenues and profits for both parties. They can take an integrated approach to supply-chain optimization, redesigning their processes together to reduce waste and redundant effort, or jointly purchasing raw materials. Or they can collaborate in forecasting, planning, and capacity management—thereby improving service levels, mitigating risks, and strengthening the combined supply chain.

Earlier work has shown that supplier collaboration really does move the needle for companies that do it well. In one McKinsey survey of more than 100 large organizations in multiple sectors, companies that regularly collaborated with suppliers demonstrated higher growth, lower operating costs, and greater profitability than their industry peers.

Despite the value at stake, however, the benefits of supplier collaboration have proved difficult to access. While many companies can point to individual examples of successful collaborations with suppliers, executives often tell us that they have struggled to integrate the approach into their overall procurement and supply-chain strategies.

Several factors make supplier collaboration challenging. Projects may require significant time and management effort before they generate value, leading companies to prioritize simpler, faster initiatives, even if they are worth less. Collaboration requires a change in mind-sets among buyers and suppliers, who may be used to more transactional or even adversarial relationships. And most collaborative efforts need intensive, cross-functional involvement from both sides, a marked change to the normal working methods at many companies. This change from a cost-based to a value-based way of thinking requires a paradigm shift that is often difficult to come by. more>

Related>

‘Shareholder value’ versus the public good: the case of Germany

Support for companies amid the pandemic must come with social and ecological strings attached.
By Emre Gömec and Mustafa Erdem Sakinç – With uncertainty around the world about how and when the coronavirus outbreak will decelerate, whole business sectors have been affected by lockdowns and are facing ruin. In Germany, more than 750,000 companies have put over 12 million employees on reduced working hours (Kurzarbeit), dwarfing the 3 million hit by the 2008 crisis.

Society’s loss goes beyond the toll on employment. As the crisis lengthens, innovative capabilities accumulated over years and even decades may atrophy and disappear, making it far more difficult to emerge from the pandemic with a healthy economy.

This ‘innovation drain’ can be avoided if, and only if, corporations devote every available resource to retaining, and reinvesting in, productive capacity. Implementation of the rescue packages adopted in Germany in March and June must thus fundamentally address future practices of corporate resource allocation.

Making government support conditional on replacing value-extractive practices, such as excessive dividend payments and executive compensation, is the most effective way to block damaging business decisions which undermine investment in productive capabilities and secure employment.

Germany’s case was, it’s true, not as dramatic as that of the US, where S&P 500 companies, having fallen victim to the American disease of corporate financialization, distributed 92 per cent of their net income between 2009 and 2018 in stock buybacks and dividends. Still, in the decade from 2010 to 2019, 65 German companies in the DAX 30 and MDAX 60 indices paid out a total of €338.8 billion, or 46 per cent of their combined profits, in dividends, in addition to €35.3 billion, or 5 per cent of profits, in stock buybacks. more>