Updates from McKinsey

Skill shift: Automation and the future of the workforce
Demand for technological, social and emotional, and higher cognitive skills will rise by 2030. How will workers and organizations adapt?
By Jacques Bughin, Eric Hazan, Susan Lund, Peter Dahlström, Anna Wiesinger, and Amresh Subramaniam – Skill shifts have accompanied the introduction of new technologies in the workplace since at least the Industrial Revolution, but the adoption of automation and artificial intelligence (AI) will mark an acceleration over the shifts of even the recent past. The need for some skills, such as technological and social and emotional skills, will rise, even as demand for others, including physical and manual skills, falls. These changes will require workers everywhere to deepen their existing skill sets or acquire new ones. Companies, too, will need to rethink how work is organized.

This briefing, part of our ongoing research on the impact of technology on the economy, business, and society, quantifies time spent on 25 core workplace skills today and in the future for five European countries—France, Germany, Italy, Spain, and the United Kingdom—and the United States and examines the implications of those shifts.

  1. How will demand for workforce skills change with automation?
  2. Shifting skill requirements in five sectors
  3. How will organizations adapt?
  4. Building the workforce of the future

Over the next ten to 15 years, the adoption of automation and AI technologies will transform the workplace as people increasingly interact with ever-smarter machines. These technologies, and that human-machine interaction, will bring numerous benefits in the form of higher productivity, GDP growth, improved corporate performance, and new prosperity, but they will also change the skills required of human workers.

To measure skill shifts from automation and AI, we modeled skill shifts through 2030—and found that they will accelerate. While the demand for technological skills has been growing since 2002, it will gather pace in the 2016 to 2030 period. The increase in the need for social and emotional skills will similarly accelerate. By contrast, the need for both basic cognitive skills and physical and manual skills will decline. more>

Related>

Updates from ITU

New Opportunities, New Challenges for AI
By Houlin Zhao – At ITU, we are working hard with partners across the world to ensure the trusted, safe and inclusive development of AI technologies — and equitable access to their benefits. That is why we organize the annual AI for Good Global Summit, the leading United Nations summit on how to harness the power of AI to improve lives worldwide.

The Summit connects AI innovators with those seeking solutions to the world’s greatest challenges so as to identify practical applications of AI that can accelerate progress towards the UN Sustainable Development Goals (SDGs).

This year’s Summit was organized into five “Breakthrough Tracks”: AI and Health; AI and Education; AI and Human Dignity and Equality; Scaling AI; and AI for Space. There were also sessions on the future of Smart Mobility, AI and agriculture, AI’s role in arts and culture, the unintended consequences of AI — and much more.

In addition, the Summit showcased the latest in AI technologies — from drones, exoskeletons, and robotics to avatars, autonomous cars, and AI-powered health solutions. more>

Related>

3 Reasons Embedded Security Is Being Ignored

By Jacob Beningo – The IoT has grown to the point that everyone and their brother is connecting their products to the Internet. This is great because it opens new revenue-generating opportunities for businesses and, in some cases, completely new business models that can generate rapid growth. The problem I am seeing, though, is that in many cases there seems to be little to no interest in securing these devices.

(I draw this conclusion from the fact that, at embedded conferences, in webinars, in articles, and even in social media conversations, security seems to draw far less interest than nearly any other topic.)

I’m going to explore the primary reasons why I believe development teams are neglecting security in their embedded products and explain why security doesn’t have to be a necessary evil.

Reason #1 – The Perception That Adding Security Is Expensive

I believe there is still a perception in the embedded space that security is expensive. And if you were to survey the availability of security experts right now, you would find a severe shortage.

Reason #2 – We Will “Add It Later”

Nobody wants to be on the front page due to a security breach. I believe that in many cases companies want to include security, but in the early stages of product development, when funds are short, security is often the lowest priority. With many good intentions, teams often think they’ll add it later, after they get through this sprint or this development cycle. The problem is that you can’t add security on at the end of the development cycle.

Reason #3 – Teams Are In Too Big A Hurry

Nearly every development team that I encounter is behind schedule and in a hurry. Whether they are new start-ups or seasoned, successful teams, there is always way too much to do and never enough time (or budget). In many cases, teams may be developing a new product and need to get to market fast in order to start generating revenue so that they can pay the bills.

Security is a foundational element of any connected device. It cannot be added on at the end of a product’s development; it must be carefully thought through from the very beginning. Without thinking about it up front, the development team can’t ensure they have the right hardware components in place to properly isolate their software components, or expect to have the right software frameworks in their application to properly manage and secure their product. more>
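One small, concrete instance of “security from the very beginning” is verifying firmware integrity before it is allowed to run. The sketch below is a hypothetical illustration (not from the article, and greatly simplified relative to real secure-boot schemes, which use asymmetric signatures rather than a provisioned hash): an image is accepted only if its SHA-256 digest matches one recorded in a trusted provisioning step, compared in constant time.

```python
import hashlib
import hmac

def firmware_is_trusted(image: bytes, expected_digest: bytes) -> bool:
    """Return True only if the image hashes to the digest provisioned at
    manufacture time; hmac.compare_digest avoids timing side channels."""
    actual = hashlib.sha256(image).digest()
    return hmac.compare_digest(actual, expected_digest)

# Provisioning step (done once, in a trusted environment):
trusted_image = b"firmware-image-v1"
provisioned_digest = hashlib.sha256(trusted_image).digest()

# Boot-time checks:
genuine_ok = firmware_is_trusted(trusted_image, provisioned_digest)
tampered_ok = firmware_is_trusted(trusted_image + b"\x00", provisioned_digest)
```

The point of the sketch is architectural: the check only works if a trusted place to store `provisioned_digest` (e.g., protected flash or a secure element) was designed in from the start, which is exactly the kind of hardware decision that cannot be retrofitted at the end of a development cycle.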

Updates from Adobe

On the Edge of Failure
By Alejandro Chavetta – I’m in Hollywood to meet photographer Joe Pugliese. I walk past star-studded sidewalks and restaurants you’ve seen a million times on movies and TV, but there are no celebrity sightings, just regular Angelenos going about their business. It’s a fitting match for Joe’s photographs, which bridge the gap between stars and civilians by normalizing the celebrity and elevating the rest of us to a hero expression of ourselves.

Today, Joe is known for celebrity portraits of Jennifer Lopez, President Obama, Jamie Lee Curtis, and many others that appear in such publications as Wired, Variety, and Texas Monthly, but what most folks don’t know is that Joe got his start putting out a BMX zine using his mom’s Xerox machine, a starting point rooted in graphic design that continues to inform his practice even now.

In high school, I made a Xerox zine of me and my BMX friends. I was having a lot of fun with the graphic design and realized that I needed to take some photos for it, so I picked up a yard-sale camera.

I was still more interested in graphic design as I started shooting. And it was a little clumsy because I would shoot and then I would take it to the processing lab, wait a day or two, get back a print that would get messed up, or I wanted it to be bigger or smaller. Photography didn’t click for me until I set up a darkroom. My parents let me black out the window in my bedroom, and I had another yard-sale find: an enlarger, trays, and caustic chemicals. It was the most rudimentary set-up.

I had a book that showed me how to develop in a darkroom. The first time I put that print into the developer and nothing happened, I thought, “Total failure. Why did I bother with this?” And as I’m thinking about the failure, the print comes up, the image appears, and it was absolute magic. I wasn’t a failure. I could shoot and be in control of the output from start to finish. more>

Related>

Why hiring the ‘best’ people produces the least creative results

By Scott E Page – The complexity of modern problems often precludes any one person from fully understanding them. Factors contributing to rising obesity levels, for example, include transportation systems and infrastructure, media, convenience foods, changing social norms, human biology and psychological factors.

Designing an aircraft carrier, to take another example, requires knowledge of nuclear engineering, naval architecture, metallurgy, hydrodynamics, information systems, military protocols, the exercise of modern warfare and, given the long building time, the ability to predict trends in weapon systems.

The multidimensional or layered character of complex problems also undermines the principle of meritocracy: the idea that the ‘best person’ should be hired. There is no best person. When putting together an oncological research team, a biotech company such as Gilead or Genentech would not construct a multiple-choice test and hire the top scorers, or hire people whose resumes score highest according to some performance criteria. Instead, they would seek diversity. They would build a team of people who bring diverse knowledge bases, tools and analytic skills. That team would more likely than not include mathematicians (though not logicians such as Griffeath). And the mathematicians would likely study dynamical systems and differential equations.

Believers in a meritocracy might grant that teams ought to be diverse but then argue that meritocratic principles should apply within each category. Thus the team should consist of the ‘best’ mathematicians, the ‘best’ oncologists, and the ‘best’ biostatisticians from within the pool.

That position suffers from a similar flaw. Even within a knowledge domain, no test or criterion applied to individuals will produce the best team. Each of these domains possesses such depth and breadth that no such test can exist.

Consider the field of neuroscience. Upwards of 50,000 papers were published last year covering various techniques, domains of inquiry and levels of analysis, ranging from molecules and synapses up through networks of neurons. Given that complexity, any attempt to rank a collection of neuroscientists from best to worst, as if they were competitors in the 50-metre butterfly, must fail.

What could be true is that given a specific task and the composition of a particular team, one scientist would be more likely to contribute than another. Optimal hiring depends on context. Optimal teams will be diverse.

Yet the fallacy of meritocracy persists. Corporations, non-profits, governments, universities and even preschools test, score and hire the ‘best’. This all but guarantees not creating the best team.

Ranking people by common criteria produces homogeneity. And when biases creep in, it results in people who look like those making the decisions. That’s not likely to lead to breakthroughs. more>

Updates from McKinsey

Nudge, don’t nag
With such a fine line between a nudge and a nag, it’s important to acknowledge and understand the subtle differences between the two.
By Bill Schaninger, Alexander DiLeonardo and Stephanie Smallets – In recent years, nudging has been hailed as the latest trend in HR and a novel scientific approach to management. And for good reason: using nudges has improved everything from customer retention and employee safety to organizational commitment and innovation.

When nudges are executed with care, they have remarkable results. However, in many cases there is a misconception about what a nudge actually is – organizations often launch initiatives that either miss the mark or are just reminders in disguise. When that happens, the nudge is actually a nag, and it risks losing its impact and becoming downright annoying. What can you do to ensure you’re using nudges and not nags?

If your nudges check the three boxes below, you’re well on your way.

Before we dive into what makes a good nudge, it’s important to note what a nudge is. According to Harvard professor Cass Sunstein and Nobel prize winner Richard Thaler, a nudge guides choice without removing options or changing incentives. It’s like leading a horse to water and framing its options such that the horse is empowered to drink and actively chooses to do so – rather than eat the grass or lie in the sunshine.

It’s also just as important to highlight what a nudge is not. A nudge is not a reminder to do something, nor is it a call to action. Nudges aren’t mandatory and they don’t have consequences. If you’re constantly reminding or commanding the horse to drink, it’s not a nudge. It’s also not a nudge if the horse isn’t brought to the water the next day for forgoing the water the day before.

A good nudge is all about choice. The reason nudging is so impactful is that it gives people control over their destiny: they can choose whether or not to proceed with the “desirable” option.

A good nudge is easy to follow. Good nudges are easy to understand and empower people to be well informed. Irrelevant, complex, or confusing information is difficult to process and can make people feel they were hoodwinked into making the choice.

A good nudge is personal. By far, the best nudges are those that use technology and analytics to tailor them to the audience. Nudges that take into account individuals’ mindsets, preferences and behaviors ensure that the most desirable option overall is also the most desirable option for that specific individual – a true win-win. more>

Related>

Updates from Ciena

The future is near. Is your business network ready to adapt?
New technologies are changing the way we do business, so enterprises cannot be limited by network performance. Learn why your IP network should adapt to support your business’ needs, and not the other way around.
By Vinicius Santos – Legacy business models are disappearing fast, almost making us forget how things were done just a few years ago. The video streaming business is less than ten years old, yet physically traveling to a store to rent a hard copy of a movie seems like ancient history. Most of us can barely remember when we had to save essential files on in-house data-center servers instead of somewhere in the cloud. Even sharing data using thumb drives is becoming rather “unusual.”

Well-established businesses are facing waves of digital transformation and are trying to align with customers’ expectations while fighting to maintain their current market share from disruptive innovators. At the same time, these disruptive innovators are becoming much faster when moving from niche markets to mainstream and highly lucrative markets, using technology, speed, and agility as their main tools to better serve their targeted markets.

It’s a process where every new technology is a piece of the transformational engine, creating new business models and opportunities that consequently create additional technologies. The wheel of innovation is not just spinning fast, it’s accelerating!

At the forefront of this transformation are technologies such as cloud computing, analytics, edge compute, machine learning, big data, automation, and the Internet of Things (IoT). All of these technology building blocks have a single enabling factor that tends to be neglected in most conversations: connectivity. more>

Related>

Updates from Chicago Booth

A plain way to cut smoking rates
By Meredith Lidard Kleeman – Tobacco has been a known carcinogen for more than 50 years, yet cigarettes continue to attract new smokers to the harmful, addictive habit every day. Research suggests that marketing, including package labels and brand logos, plays an important role in encouraging young people to take up smoking and legitimizing the habit for many smokers who are trying to quit—and that policy makers may have a way to change that.

In recent years, some 120 countries have added mandatory pictorial health warnings to packaging, and a handful have passed plain-packaging laws. These efforts to discourage new smokers and reduce tobacco-related disease and deaths appear to be working. Australia, the first country to implement a plain-packaging mandate, in 2012, saw monthly cigarette sales decline after the mandate was introduced, according to research from Chicago Booth’s Pradeep K. Chintagunta, Deakin University’s André Bonfrer, University of New South Wales’s John Roberts, and University of South Australia’s David Corkindale.

The researchers analyzed sales data from before and after Australia implemented the plain-packaging mandate and compared these with data from New Zealand, where the mandate hadn’t yet been imposed (but was in 2018).

Across the world, tobacco products are subject to strict marketing and advertising regulations, but tobacco marketers still control packaging design in most respects. Australia’s plain-packaging mandate presented the researchers with an opportunity to study the role this packaging plays in influencing product sales. more>

Related>

Limited liability is causing unlimited harm

The purpose of limited-liability protection was to encourage investment in corporations, yet it has evolved into a source of systemic market failure.
By Katharina Pistor – In a recent tweet, Olivier Blanchard, a former chief economist of the International Monetary Fund, wondered how we can ‘have so much political and geopolitical uncertainty and so little economic uncertainty’. Markets are supposed to measure and allocate risk, yet shares in companies that pollute, peddle addictive painkillers, and build unsafe airplanes are doing just fine. The same goes for corporations that openly enrich shareholders, directors and officers at the expense of their employees, many of whom are struggling to make a living and protect their pension plans. Are markets wrong, or are the red flags about climate change, social tensions, and political discontent actually red herrings?

Closer inspection reveals that the problem lies with markets. Under current conditions, markets simply cannot price risk adequately, because market participants are shielded from the harms that corporations inflict on others. This pathology goes by the name of ‘limited liability’, but when it comes to the risk borne by shareholders, it would be more accurate to call it ‘no liability’.

Under the prevailing legal dispensation, shareholders are protected from liability when the corporations whose shares they own harm consumers, workers and the environment. Shareholders can lose money on their holdings, but they also profit when (or even because) companies have caused untold damage by polluting oceans and aquifers, hiding the harms of the products they sell or pumping greenhouse-gas emissions into the atmosphere. The corporate entity itself might face liability, perhaps even bankruptcy, but the shareholders can walk away from the wreckage, profits in hand.

The stated justification for limited liability is that it encourages investment in—and risk-taking by—corporations, leading to economically beneficial innovations. But we should recognize that sparing owners from the harms their companies cause amounts to a hefty legal subsidy. As with all subsidies, the costs and benefits should be reassessed from time to time. And in the case of limited liability, the fact that markets fail to price the risk of activities that are known to cause substantial harm should give us pause. more>

Updates from McKinsey

Climate risk and decarbonization: What every mining CEO needs to know
Building a climate strategy won’t be quick or easy—but waiting is not an option.
By Lindsay Delevingne, Will Glazener, Liesbet Grégoir, and Kimberly Henderson – In the mining industry, the impact of climate change and how the industry can respond to it have increasingly been topics of discussion over the past decade.

Mining is no stranger to harsh climates; much of the industry already operates in inhospitable conditions. But forecasts of hazards such as heavy precipitation, drought, and heat indicate these effects will get more frequent and intense, increasing the physical challenges to mining operations.

Under the 2015 Paris Agreement, 195 countries pledged to limit global warming to well below 2.0°C, and ideally to no more than 1.5°C above preindustrial levels. That target, if pursued, would require decarbonization across industries, creating major shifts in commodity demand for the mining industry and likely shrinking global mining revenue pools. Mining-portfolio evaluation must now account for the potential decarbonization of other sectors.

The mining sector itself will also face pressure from governments, investors, and society to reduce emissions. Mining is currently responsible for 4 to 7 percent of greenhouse-gas (GHG) emissions globally. Scope 1 and Scope 2 CO2 emissions from the sector (those incurred through mining operations and power consumption, respectively) amount to 1 percent, and fugitive-methane emissions from coal mining are estimated at 3 to 6 percent. A significant share of global emissions—28 percent—would be considered Scope 3 (indirect) emissions, including the combustion of coal.

The mining industry has only just begun to set emission-reduction goals. Current targets published by mining companies range from 0 to 30 percent by 2030, far below the Paris Agreement goals. Mines theoretically can fully decarbonize (excluding fugitive methane) through operational efficiency, electrification, and renewable-energy use. Capital investments are required to achieve most of the decarbonization potential, but certain measures, such as the adoption of renewables, electrification, and operational efficiency, are economical today for many mines. more>

Related>