Category Archives: Science

Updates from ITU

How the city of Philadelphia plans to measure its digital divide
By Sarah Wray – The City of Philadelphia has issued a request for proposal (RFP) to rapidly quantify the number of households that lack Internet connectivity or rely on unstable, low-bandwidth options.

The RFP, issued with the non-profit Mayor’s Fund for Philadelphia, seeks to enable the city to benchmark its progress on closing the digital divide and to inform the next phase of policy, program and budget decisions.

Mark Wheeler, Chief Information Officer, City of Philadelphia, told Cities Today: “To address digital equity problems, the City of Philadelphia needs to be able to benchmark its impact with programmes like PHLConnectED.”

“The city seeks feedback from firms or research agencies who have the means to measure Internet use (by type of technology) by Philadelphia households. We are looking for any and all ways to achieve quantifiable measures,” said Wheeler. “Because we are a smart city and innovation-oriented, proposals that make sophisticated use of commercial data modelling and artificial intelligence are of particular interest.”
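
What such a measurement could look like in practice: the sketch below computes weighted connectivity shares from a household survey extract. It is a minimal illustration only; the file name, column names and connection categories are hypothetical placeholders, not anything specified in the city’s RFP.

```python
# Hypothetical sketch: estimating household connectivity shares from a
# weighted survey extract. The file, columns and categories are invented
# for illustration; they are not from Philadelphia's actual RFP or data.
import pandas as pd

# One row per surveyed household: a survey weight plus a self-reported
# connection type (e.g. fiber, cable, dsl, mobile_only, none).
df = pd.read_csv("household_survey.csv")

# Weighted share of households in each category; the weights let a sample
# generalize to the full population of households.
shares = df.groupby("connection_type")["weight"].sum() / df["weight"].sum()
print(shares.sort_values(ascending=False))

# One possible benchmark: households that are unconnected or mobile-only,
# i.e. relying on unstable, low-bandwidth options.
under_connected = shares.get("none", 0.0) + shares.get("mobile_only", 0.0)
print(f"Estimated under-connected share: {under_connected:.1%}")
```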

Closing the digital divide has shot to the top of cities’ priority lists amid the pandemic as everything from work to shopping for essentials and even access to critical information and services has shifted online. Access to education has been a particularly urgent concern. more>

Related>

Why Immigration Drives Innovation

Economic history reveals one unmistakable psychological pattern.
By Joseph Henrich – When President Coolidge signed the Johnson-Reed Act into law in 1924, he drained the wellspring of American ingenuity. The new policy sought to restore the ethnic homogeneity of 1890 America by tightening the 1921 immigration quotas. As a result, immigration from eastern Europe and Italy plummeted, and Asian immigrants were banned. Assessing the law’s impact, the economists Petra Moser and Shmuel San show how this steep and selective cut in immigration stymied U.S. innovation across a swath of scientific fields, including radio waves, radiation and polymers—all fields in which Eastern European immigrants had made contributions prior to 1924. Not only did patenting drop by two-thirds across 36 scientific domains, but U.S.-born researchers became less creative as well, experiencing a 62% decline in their own patenting. American scientists lost the insights, ideas and fresh perspectives that inevitably flow in with immigrants.

Before this, from 1850 to 1920, American innovation and economic growth had been fueled by immigration. The 1899 inflow included a large fraction of groups that were later deemed “undesirable”: e.g., 26% Italians, 12% “Hebrews,” and 9% “Poles.” Taking advantage of the randomness provided by expanding railroad networks and changing circumstances in Europe, a trio of economists—Sandra Sequeira, Nathan Nunn and Nancy Qian—demonstrate that counties that ended up with more immigrants subsequently innovated more rapidly and earned higher incomes, both in the short term and today. The telephone, hot blast furnace, screw propeller, flashlight and ironclad ship were all pioneered by immigrants. The analysis also suggests that immigrants made native-born Americans more creative. Nikola Tesla, a Serbian who grew up in the Austrian Empire, provided George Westinghouse, a New Yorker whose parents had migrated from Westphalia, with a key missing component for his system of electrification based on AC current (Tesla also patented hundreds of other inventions).

In ending the quotas imposed under the Harding-Coolidge administration, President Johnson remarked in 1965 that “Today, with my signature, this system is abolished…Men of needed skill and talent were denied entrance because they came from southern or eastern Europe or from one of the developing continents…” By the mid-1970s, U.S. innovation was again powerfully fueled by immigrants, now coming from places like Mexico, China, India, the Philippines and Vietnam. From 1975 to 2010, an additional 10,000 immigrants generated 22% more patents every five years. Again, not only did immigrants innovate, they also stoked the creative energies of the locals. more>

Updates from ITU

Towards environmental efficiency in the age of AI
ITU – The rapid adoption of artificial intelligence (AI) and emerging technologies has sparked the need for a sustainable approach capable of safeguarding the environment. A recent ITU workshop provided a platform to discuss environmental efficiency in the age of AI, increasing automation, and smart manufacturing.

The workshop discussed emerging technologies’ potential to contribute to climate action as part of global efforts to achieve the UN Sustainable Development Goals. It also highlighted practical tools to evaluate environmental aspects of emerging technologies and discussed the role to be played by international standardization in supporting the expansion of this toolkit.

The workshop’s discussions fed into a meeting of the ITU Focus Group on environmental efficiency for AI and emerging technologies (FG-AI4EE). The group is analyzing the relationship between emerging technologies and environmental efficiency to benchmark best practices and provide a basis for new ITU standards. “This focus group is among the first global platforms for the environmental aspects of emerging technologies,” noted Paolo Gemma, Huawei, Co-Chair of the Focus Group.

The Focus Group is open to all interested parties. Sign up as a participant and join the mailing list on the homepage. For more information, contact tsbfgai4ee@itu.int. more>

Related>

Updates from McKinsey

Derisking digital and analytics transformations
While the benefits of digitization and advanced analytics are well documented, the risk challenges often remain hidden.
By Jim Boehm and Joy Smith – A bank was in the midst of a digital transformation, and the early stages were going well. It had successfully transformed its development teams into agile squads, and leaders were thrilled with the resulting speed and productivity gains. But within weeks, leadership discovered that the software developers had been taking a process shortcut that left customer usernames and passwords vulnerable to being hacked. The transformation team fixed the issue, but then the bank experienced another kind of hack, which compromised the security of customer data. Some applications had been operating for weeks before errors were detected because no monitors were in place to identify security issues before deployment. This meant the bank did not know who might have had access to the sensitive customer data or how far and wide the data might have leaked. The problem was severe enough that it put the entire transformation at risk. The CEO threatened to end the initiative and return the teams to waterfall development if they couldn’t improve application development security.
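
The article does not say what the shortcut was, but skipping proper credential hashing is a classic instance of the class of flaw described. As a minimal sketch (not the bank’s actual code), salted password hashing needs nothing beyond the Python standard library:

```python
# Minimal sketch of safe credential storage using only the standard library.
# This illustrates the general kind of control the bank lacked; the
# parameter choices are illustrative, not taken from the article.
import hashlib
import hmac
import os

ITERATIONS = 600_000  # slows brute-force attempts; tune to your hardware

def hash_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, digest) using PBKDF2-HMAC-SHA256 with a per-user salt."""
    salt = os.urandom(16)  # unique random salt per credential
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)  # constant-time comparison

salt, digest = hash_password("correct horse battery staple")
assert verify_password("correct horse battery staple", salt, digest)
assert not verify_password("wrong guess", salt, digest)
```

Controls like this are also cheap to verify automatically before release, which is the kind of pre-deployment monitoring the bank was missing.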

This bank’s experience is not rare. Companies in all industries are launching digital and analytics transformations to digitize services and processes, increase efficiency via agile and automation, improve customer engagement, and capitalize on new analytical tools. Yet most of these transformations are undertaken without any formal way to capture and manage the associated risks. Many projects have minimal controls designed into the new processes, underdeveloped change plans (or none at all), and often scant design input from security, privacy, and risk and legal teams. As a result, companies are creating hidden nonfinancial risks in cybersecurity, technical debt, advanced analytics, and operational resilience, among other areas. The COVID-19 pandemic and the measures employed to control it have only exacerbated the problem, forcing organizations to innovate on the fly to meet work-from-home and other digital requirements.

McKinsey recently surveyed 100 digital and analytics transformation leaders from companies across industries and around the globe to better understand the scope of the issue. From our survey and subsequent interviews, several key findings emerged:

  1. Digital and analytics transformations are widely undertaken now by organizations in all sectors.
  2. Risk management has not kept pace with the proliferation of digital and analytics transformations—a gap is opening that can only be closed by risk innovation at scale.

more>

The rule of law: a simple phrase with exacting demands

If the finger is to be pointed—rightly—at Hungary and Poland, then the EU must insist on compliance by all with universal norms.
By Albena Azmanova and Kalypso Nicolaidis – That the European Union, in its moment of public healthcare emergency and acute economic plight, should find itself paralysed over such a seemingly abstract matter as the rule of law is one of the great paradoxes of our times. And yet this is exactly the conundrum plaguing approval of the EU’s seven-year budget and recovery fund, totaling €1.81 trillion, which Poland and Hungary have been blocking over rule-of-law conditionality for the funds’ disbursement.

Respect for the rule of law is one of those self-evident truths—the absolute minimum requirement of decent political rule—which should be unproblematic in the family of liberal democracies that is the EU. It is equally beyond doubt that the prompt approval of the pandemic recovery fund is in everyone’s interest.

Many commentators assert that the EU should stand up to the defiant governments, in the name of its fundamental values. We do too. But our hope is that we, in Europe, can use this moment as an opportunity to question ourselves further.

Most of us may believe that the arguments put forward to resist rule-of-law conditionality are disingenuous. And they are. But we must still take them seriously when they are presented in line with … the rule of law.

Hungary and Poland are claiming that, by being poorly defined, the rule-of-law principle opens the door to discretionary decisions and thus to the abuse of power.

The rule of law as a political principle and legal norm was indeed born of the ambition to constrain the arbitrary power of central authority. This was why the English barons forced King John to adopt the royal charter of rights, the Magna Carta, on June 15th, 1215. The specification of basic freedoms, codified not as privileges for a handful of aristocrats but as abstract and unconditional rights, was meant to ensure that no authority could place itself above these rights in pursuit of its political ends.

It is true that the EU should make no compromises with the very foundation of the liberal political order. But the EU itself has complied with these principles erratically and selectively, thus violating the spirit of the rule of law.

This has been evident in several instances—from lack of concern with the Silvio Berlusconi media monopoly in Italy to France’s semi-permanent state of emergency, Malta’s and Slovakia’s complacency with political murder and the Spanish government’s response to the 2017 independence referendum in Catalonia. Often, the EU is content with narrowly reducing the remit of the rule of law to a simple matter of legality—ignoring routine violations of core values, such as the right to peaceful assembly, freedom of speech or even the right to liberty and life itself.

Has the EU not thereby set itself up for the current crisis, supplying the ammunition for autocrats to try to absolve themselves from compliance with the rule of law? more>

Updates from McKinsey

E-commerce: How consumer brands can get it right
Consumer brands need to make direct-to-consumer economics feasible and the customer experience seamless.
By Arun Arora, Hamza Khan, Sajal Kohli, and Caroline Tufft – Consumer brands have been seeking to establish direct relations with end customers for a range of reasons: to generate deeper insights about consumer needs, to maintain control over their brand experience, and to differentiate their proposition to consumers. Increasingly, they also do it to drive sales (see sidebar, “Why go direct?”).

For any brands that have considered establishing a direct-to-consumer (DTC) channel in the past and decided against it, now is the time to reconsider. COVID-19 has accelerated profound business trends, including the massive consumer shift to digital channels. In the United States, for example, the increase in e-commerce penetration observed in the first half of 2020 was equivalent to that of the last decade. In Europe, overall digital adoption has jumped from 81 percent to 95 percent during the COVID-19 crisis.

Many companies have been active in launching new DTC programs during the pandemic. For example, PepsiCo and Kraft Heinz have both launched new DTC propositions in recent months. Nike’s digital sales grew by 36 percent in the first quarter of 2020, and Nike is aiming to grow the share of its DTC sales from 30 percent today to 50 percent in the near future. “The accelerated consumer shift toward digital is here to stay,” said John Donahoe, a Silicon Valley veteran who became Nike president and CEO in January. Our consumer sentiment research shows that two-thirds of consumers plan to continue to shop online after the pandemic.

The vast majority of consumer brands are used to selling through intermediaries, including retailers, online marketplaces, and specialized distributors. Their experience with direct consumer relationships and e-commerce is limited. As a result, they often hesitate to launch an e-commerce channel despite the obvious opportunity it offers. Just 60 percent of consumer-goods companies, at best, feel even moderately prepared to capture e-commerce growth opportunities. more>

Updates from Ciena

Can utilities have their multi-layered cake and eat it too?
Utilities are facing increasing bandwidth demands on their communications networks. Ciena’s Mitch Simcoe explains how modernizing networks to a private packet-optical fiber architecture can help utilities scale to support new smart grid applications.
By Mitch Simcoe – Utilities are increasingly in the eye of the storm these days. Whether it’s dealing with hurricanes on the Gulf Coast over the last few months or wildfires on the West Coast, utilities have had to put more sensors out in the field to keep abreast of changing weather conditions and potential risks to their power grids. The increasing demand for utilities to show that they are carbon-free is also changing the way they generate and distribute energy. The one common denominator is that utilities have more data to collect and backhaul from their power grids, which is driving increasing demand on their communications networks.

Many utilities may not realize it, but recent advancements have resulted in several bandwidth-intensive applications and processes driving up demand on their networks:

  1. Video Surveillance
    Security continues to be top of mind for utilities. In the past, security surveillance was largely “after the fact”: video footage was stored locally at the substation and only accessed after a security breach. Today’s approach is to backhaul all security video footage to a centralized data center and apply artificial intelligence (AI) techniques to determine proactively whether a security breach is occurring. In those cases, security personnel can be dispatched to the site in near-real time. Each video camera at a substation can generate 9 gigabytes of data per day, and a typical substation could have a dozen video cameras to surveil.
  2. Synchrophasors
    Prior to the big power outage of 2003 in the Northeast United States (when 50 million people lost power for two days), sensors on the power grid using SCADA (Supervisory Control and Data Acquisition) would sample the state of the grid about once every four seconds. This significant outage could have been avoided had the grid been sampled more frequently. To address this, a device called a synchrophasor (not the Star Trek type!) was introduced, which samples the state of the grid 30 to 60 times per second. This has made the grid more reliable but produces significantly more data to backhaul and process. Each synchrophasor, or phasor measurement unit (PMU), can generate 15 gigabytes of data per day, and all of that must be backhauled to a central data center for analysis.
  3. Smart Meters
    In the US, over 50% of households are now served by a smart meter that measures the household’s power consumption every 15 minutes. Beyond their billing function, smart meters help utilities track power consumption hotspots during peak usage. For a utility of 1 million households, the middle range for most US investor-owned utilities (IOUs), this can generate 1 terabyte of data per day that needs to be backhauled to a central data center for processing.
  4. Internet of Things (IoT) devices
    These include what we mentioned earlier: weather sensors and sensors on power equipment that proactively identify issues. Smart thermostats in homes are another growing trend, which utilities are using to offer smart “on-demand” billing plans: you allow the utility to raise your thermostat during periods of peak usage in the hot summer months in exchange for a lower price per kWh.

For the first three categories above, a utility serving 1 million households would need to backhaul 6 to 8 terabytes of data per day; the rough tally sketched below shows how the per-device volumes could add up. With this much data to backhaul and process, it is no wonder utilities are exhausting the available capacity of their legacy communications networks.
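
As a sanity check on that figure, the back-of-the-envelope calculation below combines the per-device volumes quoted in the article. The fleet sizes (substation and PMU counts) are illustrative assumptions, not numbers from the article.

```python
# Back-of-the-envelope daily backhaul for a utility serving 1 million
# households. Per-device volumes come from the article; the fleet sizes
# (substations, PMUs) are illustrative assumptions only.
GB_PER_CAMERA_PER_DAY = 9      # from the article
CAMERAS_PER_SUBSTATION = 12    # "a typical substation could have a dozen"
SUBSTATIONS = 30               # assumption for a 1M-household utility
GB_PER_PMU_PER_DAY = 15        # from the article
PMUS = 200                     # assumption
SMART_METER_TB_PER_DAY = 1.0   # from the article, for 1M households

video_tb = SUBSTATIONS * CAMERAS_PER_SUBSTATION * GB_PER_CAMERA_PER_DAY / 1000
pmu_tb = PMUS * GB_PER_PMU_PER_DAY / 1000
total_tb = video_tb + pmu_tb + SMART_METER_TB_PER_DAY

print(f"Video surveillance: {video_tb:.2f} TB/day")  # ~3.2 TB
print(f"Synchrophasors:     {pmu_tb:.2f} TB/day")    # ~3.0 TB
print(f"Smart meters:       {SMART_METER_TB_PER_DAY:.2f} TB/day")
print(f"Total:              {total_tb:.2f} TB/day")  # lands in the 6-8 TB range
```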

The Information Technology (IT) group in a utility is tasked with managing many of these new applications associated with a smarter grid. Some utilities have been leasing copper-based TDM services for many years from service providers for smart grid, IT and substation traffic. The cost of this approach has been onerous and only gets more expensive as service providers are migrating their networks away from copper to fiber and wireless options. more>

Updates from Chicago Booth

This one ubiquitous job actually has four distinct roles
The avatars of the strategist
By Ram Shivakumar – Among the occupational titles that have become ubiquitous in the 21st century, “strategist” remains something of a mystery. What does the strategist do? What skills and mindset distinguish the strategist from others?

Is the strategist a visionary whose mandate is to look into the future and set a course of direction? A planner whose charter is to develop and implement the company’s strategic plan? An organization builder whose mission is to inspire a vibrant and energetic culture? Or is it all of the above?

Academic scholarship does not settle this question. Over the past 50 years, many competing schools of thought on strategy have emerged. The two most prominent are the positioning school and the people school. The positioning school, closely associated with ideas developed by Harvard’s Michael Porter, argues that strategy is all about distinctiveness and not operational efficiency. In this view, the acquisition of a valuable position depends on the unique combination of activities that an organization performs (or controls). The people school, closely associated with the ideas of Stanford’s Jeffrey Pfeffer, posits that the principal difference between high-performance organizations and others lies in how each group manages its most important resource—people. In this view, high-performance organizations foster a culture that rewards teamwork, integrity, and commitment.

Because these two schools differ in their doctrines (assumptions and beliefs) and principles (ideas and insights), each envisions a distinct role for the strategist. more>

Related>

Democracy’s biggest challenge is information integrity

By Laura Thornton – As the world watches the United States’ elections unfold, the intensity of our polarization is on display. This election was not marked by apathy. On the contrary – citizens turned out in record numbers, some standing in lines all day, to exercise their franchise with powerful determination and the conviction of their choice.

What is notable is how diametrically opposed those choices are: the divergence is not only in voters’ visions for America but in their perceptions of the reality of America. It has long been said that Americans, like citizens elsewhere, increasingly live in parallel universes. Why is this? I believe it boils down, quite simply, to information.

While there are ample exceptions and complexities, in one universe, people consume a smattering of different news sources, perhaps one or two newspapers, some journals, television and radio broadcasts and podcasts. Many of the sources are considered left-leaning. These Americans tend to hold university degrees and vote for Democrats.

The other universe includes those who primarily get their news from one or two sources, like Fox News, and rely on Facebook and perhaps their community, friends, and family for information. They lean Republican, and many are not university educated — the so-called “education gap” in American politics. The majority of Republicans, in fact, cite Fox as their primary source of news, and those who watch Fox News are overwhelmingly supportive of Republicans and Trump. Both universes gravitate toward echo chambers of like-minded compatriots, rarely open or empathetic to the views and experiences of others.

There are obvious exceptions and variations. The New York Times-reading, educated Republican holding his nose but counting on a tax break. Or the low-information voter who votes Democratic with her community.

In the two big general universes, sadly, the divide is not just about opinions or policy approaches. They operate with different facts, or, as Kellyanne Conway, former Trump advisor, famously put it, “alternative facts.” more>

Updates from Siemens

Designing large scale automation and robotic systems using Solid Edge
By David Chadwick – Precision Robotics and Automation Ltd (PARI) is a leading global developer of automation and robotic systems. Their customers in the automotive sector include established giants like Ford, Chrysler, PSA, Daimler-Benz, Tata Motors and Mahindra, and significant new players like VinFast. PARI designs, manufactures and installs complete, automated systems including multi-station lines for machining and assembly of powertrain components and assemblies.

PARI has been a major user of Solid Edge for 15 years with 160 licenses deployed at their headquarters near Pune in India. Typical automation solutions deployed by PARI incorporate a wide variety of robots, actuators and sensors and other mechatronic items. These systems can comprise over 25,000 unique components.

Mangesh Kale, Managing Director of PARI, describes their design process: “If a six-axis robot is required for a specific application then we use robots from major suppliers like FANUC, ABB and Kuka, or other makes specified by the customer. We typically receive 3D models from these manufacturers and we integrate these into our automation system designs. However, many applications demand gantry type robots that we design and manufacture ourselves. In a typical solution, about 60% of the design is using standardized commodities of PARI. However, custom parts are typically 40% of the design. For example, the gripper sub-assembly for any material handling solution is typically a custom design. This design meets specific application needs to handle components at different stages in the machining or assembly process. The customization required for assembly processes is even higher. We find that Solid Edge is a very powerful and flexible solution for designing these sub-systems.” more>

Related>