Tag Archives: Technology

Updates from Siemens

Designing large scale automation and robotic systems using Solid Edge
By David Chadwick – Precision Robotics and Automation Ltd (PARI) is a leading developer of automation and robotic systems globally. Their customers in the automotive sector include established giants like Ford, Chrysler, PSA, Daimler-Benz, Tata Motors, Mahindra, and significant new players like VinFast. PARI designs, manufactures and installs complete, automated systems including multi-station lines for machining and assembly of powertrain components and assemblies.

PARI has been a major user of Solid Edge for 15 years with 160 licenses deployed at their headquarters near Pune in India. Typical automation solutions deployed by PARI incorporate a wide variety of robots, actuators and sensors and other mechatronic items. These systems can comprise over 25,000 unique components.

Mangesh Kale, Managing Director of PARI, describes their design process. “If a six-axis robot is required for a specific application then we use robots from major suppliers like FANUC, ABB and Kuka, or other makes specified by the customer. We typically receive 3D models from these manufacturers and we integrate these into our automation system designs. However, many applications demand gantry-type robots that we design and manufacture ourselves. In a typical solution, about 60% of the design uses PARI’s standardized components; the remaining 40% is typically custom parts. For example, the gripper sub-assembly for any material handling solution is typically a custom design. This design meets specific application needs to handle components at different stages in the machining or assembly process. The customization required for assembly processes is even higher. We find that Solid Edge is a very powerful and flexible solution for designing these sub-systems.” more>


Keep science irrational

By Michael Strevens – Modern science has a whole lot going for it that Ancient Greek or Chinese science did not: advanced technologies for observation and measurement, fast and efficient communication, and well-funded and dedicated institutions for research. It also has, many thinkers have supposed, a superior (if not always flawlessly implemented) ideology, manifested in a concern for objectivity, openness to criticism, and a preference for regimented techniques for discovery, such as randomized, controlled experimentation. I want to add one more item to that list, the innovation that made modern science truly scientific: a certain, highly strategic irrationality.

‘Experiment is the sole judge of scientific “truth”,’ declared the physicist Richard Feynman in 1963. ‘All I’m concerned with is that the theory should predict the results of measurements,’ said Stephen Hawking in 1994. And dipping back a little further in time, we find the 19th-century polymath John Herschel expressing the same thought: ‘To experience we refer, as the only ground of all physical enquiry.’ These are not just personal opinions or propaganda; the principle that only empirical evidence carries weight in scientific argument is widely enforced across the scientific disciplines by scholarly journals, the principal organs of scientific communication. Indeed, it is widely agreed, both in thought and in practice, that science’s exclusive focus on empirical evidence is its greatest strength.

Yet there is more than a whiff of dogmatism about this exclusivity. Feynman, Hawking and Herschel all insist on it: ‘the sole judge’; ‘all I’m concerned with’; ‘the only ground’. Are they, perhaps, protesting too much? What about other considerations widely considered relevant to assessing scientific hypotheses: theoretical elegance, unity, or even philosophical coherence? Except insofar as such qualities make themselves useful in the prediction and explanation of observable phenomena, they are ruled out of scientific debate, declared unpublishable. It is that unpublishability, that censorship, that makes scientific argument unreasonably narrow. It is what constitutes the irrationality of modern science – and yet also what accounts for its unprecedented success. more>

Updates from Chicago Booth

There will be more innovation post-COVID. Here’s why.
By Harry L. Davis – Since the COVID-19 pandemic threw our lives into disarray, we’ve had to change how we do anything involving other people. Rather than counting on bumping into colleagues in the hall, we now have to schedule Zoom calls around the competing demands (childcare, a broken water heater) that everyone is dealing with. There isn’t time for the kind of small talk that often, unpredictably, leads to big ideas.

There are unquestionably benefits to handling some tasks over video conference. Last spring, I taught a class in which groups of students take on consulting projects with the guidance of Chicago-based Kearney. Consultants spend countless hours on airplanes to make face-to-face meetings with their clients possible, and it’s a big part of their culture. In past years, regular in-person meetings and schmoozing were built into the syllabus.

Of course, none of that was possible this year. Our students were thrust into a new world where even senior executives were caught off-guard and without webcams. Whiteboard brainstorming sessions became Zoom calls.

Curious about their experiences, we surveyed the students about the impact of remote work throughout the quarter. While pessimistic at first, by the end of the nine-week course they felt that their remote situation was actually helping them work more efficiently and do a better job of responding to their clients’ needs. I had a similar experience with teaching remotely—although daunted at first, I found that I was able to deliver my classes effectively, even if I was tethered to my desk chair.

Once the pandemic is behind us, we’ll have to choose what to return to and what to keep from our remote way of working. I think Zoom and its ilk will continue to have an important place for those situations where teams are geographically dispersed or there’s some urgent decision that needs to be made. But the type of work that delivers innovation—creative work—will still best be done in person. more>


Do social media algorithms erode our ability to make decisions freely?

Social media algorithms, artificial intelligence, and our own genetics are among the factors influencing us beyond our awareness. This raises an ancient question: do we have control over our own lives? This article is part of The Conversation’s series on the science of free will.
By Lewis Mitchell and James Bagrow – Have you ever watched a video or movie because YouTube or Netflix recommended it to you? Or added a friend on Facebook from the list of “people you may know”?

And how does Twitter decide which tweets to show you at the top of your feed?

These platforms are driven by algorithms, which rank and recommend content for us based on our data.

As Woodrow Hartzog, a professor of law and computer science at Northeastern University, Boston, explains:

If you want to know when social media companies are trying to manipulate you into disclosing information or engaging more, the answer is always.

So if we are making decisions based on what’s shown to us by these algorithms, what does that mean for our ability to make decisions freely?

An algorithm is a digital recipe: a list of rules for achieving an outcome, using a set of ingredients. Usually, for tech companies, that outcome is to make money by convincing us to buy something or keeping us scrolling in order to show us more advertisements.

The ingredients used are the data we provide through our actions online – knowingly or otherwise. Every time you like a post, watch a video, or buy something, you provide data that can be used to make predictions about your next move.
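
As a toy illustration of such a “digital recipe,” here is a minimal, entirely hypothetical ranking algorithm: it scores each item by how many of its topic tags the user has previously liked, then recommends the highest-scoring items. The items, tags, and scoring rule are invented for illustration; real platform algorithms are vastly more sophisticated and proprietary.

```python
# Toy content-recommendation "recipe": rank items by overlap with past likes.
# Purely illustrative; not any real platform's algorithm.

def score(item, user_likes):
    """Score an item by how many of its tags the user has liked before."""
    return sum(1 for tag in item["tags"] if tag in user_likes)

def recommend(items, user_likes, top_n=2):
    """Rank items by score, highest first, and return the top few."""
    return sorted(items, key=lambda it: score(it, user_likes), reverse=True)[:top_n]

items = [
    {"title": "Cat video", "tags": {"cats", "funny"}},
    {"title": "Cooking show", "tags": {"food"}},
    {"title": "Funny fails", "tags": {"funny"}},
]
user_likes = {"funny", "cats"}  # "ingredients": data gathered from past behaviour

for item in recommend(items, user_likes):
    print(item["title"])
```

Even this trivial recipe shows the feedback loop the article describes: every new like changes `user_likes`, which changes what gets ranked to the top next time.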

These algorithms can influence us, even if we’re not aware of it. As the New York Times’ Rabbit Hole podcast explores, YouTube’s recommendation algorithms can drive viewers to increasingly extreme content, potentially leading to online radicalization.

Facebook’s News Feed algorithm ranks content to keep us engaged on the platform. It can produce a phenomenon called “emotional contagion”, in which seeing positive posts leads us to write positive posts ourselves, and seeing negative posts means we’re more likely to craft negative posts — though this study was controversial partially because the effect sizes were small.

Also, so-called “dark patterns” are designed to trick us into sharing more, or spending more on websites like Amazon. These are tricks of website design such as hiding the unsubscribe button, or showing how many people are buying the product you’re looking at right now. They subconsciously nudge you towards actions the site would like you to take. more>

Updates from McKinsey

Small capital markets businesses have been insulated against many of the troubles affecting their larger competitors. Now things are getting tougher.
By Fuad Faridi, Jared Moon, Anoop Ravindranath, Roger Rudisuli, Manu Saxena, Matthew Steinert – The capital markets arms of regional and national banks are often seen as smaller versions of the capital markets businesses of the top 10 global firms. However, they are actually quite different. Regional businesses have evolved along their own paths, with distinct client franchises, operating models, and sources of profitability. As a result, they require a strategic agenda that is tailored to their specific needs.

Until recently, the capital markets businesses of regional banks have been insulated against many of the troubles affecting larger global banks, and in some cases have performed better than them. Regional capital markets businesses’ returns on equity (ROE) held up better than those of their larger counterparts and regionals maintained their market share.

Now, however, there are signs that life is getting tougher. Structural shifts such as increasing electronification and falling revenues, along with questions about these businesses’ sustainability and “fit” with their parent organizations, are leading to increased scrutiny. This has been especially pronounced in Europe. Revenue upticks from market volatility linked to COVID-19 are seen as providing only temporary relief.

Even seemingly more robust franchises are being forced to answer difficult questions. They have found that after stripping out the impact from internal flows and adjacent client bases, the businesses that remain are often far less profitable. They have also found it challenging to unlock the next phase of growth.

In response, regional players across geographies are focusing on improving productivity end to end. A subset of firms is also trying to identify three to five pockets of opportunity for capturing revenue. more>

3 Keys to Engineering Success

Although success can be defined in different ways by different people, there are three very specific keys to engineering success.

By Jacob Beningo – Every engineer and engineering team wants to be successful. Success can be defined in many different ways whether it is meeting a deadline, making a customer happy, or completing work within the budget. Whatever the definition of success is, there are three keys to successful engineering, and they aren’t necessarily technical.

Success Key #1 – Maintaining Discipline

Related: 50 Top Private Engineering Firms of 2020

The first key to success is that even under the toughest conditions, discipline needs to be maintained. This isn’t a military thing; it’s common sense. I see a lot of teams that, when things start to get tough, recklessly start cutting corners. The loss of discipline creates additional problems that further get in the way of delivering, and these quickly become a self-feeding doom loop that wastes time and kills budget.

Maintaining discipline for success must be done at more than one level of the company. First, individual developers need to agree that no matter what pressure is put on them, they will follow their processes, perform their due diligence, and not allow themselves to decay into wild-west programming. Individual developers form the foundation, and if they crack, the whole project is going with them. Second, the collective team needs to agree that they will maintain their discipline no matter what. Everyone working together will help ensure that they are successful. Finally, the company management team needs to be on board and understand that while there may be a fire today or a critical delivery date, the team has to maintain the discipline to make the delivery successful. All three levels of the business need to be on board.

In my experience, engineering success comes down to much more than technical prowess. It comes down to having and maintaining discipline. It requires carefully managing expectations to deliver what is needed when it is needed, rather than overpromising and under-delivering. Perhaps most importantly, to have long-term success, it requires having fun doing whatever it is that you do and with the people you are doing it with. more>

Staying Focused on the Big Picture

U.S. election-related uncertainty may persist a while longer, but the relatively optimistic longer-term economic outlook hasn’t changed.
By Lisa Shalet – Now that former Vice President Joseph Biden is President-Elect, much of the election uncertainty has dissipated. Markets have factored in Biden’s win as well as the apparent lack of a Congressional Democratic sweep, but headlines concerning the transition of power could contribute to volatility.

We encourage investors to ignore short-term price swings based on the headlines and stay focused on the bigger picture. We still believe that investors should emphasize global stocks over bonds. Morgan Stanley & Co. strategists forecast that the S&P 500 Index, a broad measure of the U.S. market that is now trading around 3500, may reach 3700 by the middle of next year.

Several key points in our economic outlook are unlikely to change due to election results. Here are three reasons why:

The V-shaped economic recovery is on solid ground. October’s nonfarm payroll data was a solid upside surprise, with the unemployment rate falling and the labor participation rate rising. Consumer sentiment is holding up, and manufacturing and services indicators continue to show expansion. Housing and durable goods orders support the capital spending narrative of the new business cycle. In 2021, U.S. GDP could grow at an annualized pace of 5% to 6%—in part because the recession this year enhances the year-over-year comparison, but also given the midyear return to growth. Such economic expansion could power double-digit increases in corporate profits.

The Federal Reserve remains ultra-dovish. The central bank has stayed firm on holding its key short-term fed funds rate near zero through December 2023. Low interest rates can stimulate growth by facilitating more borrowing, allowing consumers and businesses to spend more. The Fed has yet to define metrics or time frames for “average inflation targeting,” which will likely allow inflation to trend higher without rate intervention to check its rise. Under a policy known as quantitative easing, the Fed also continues to buy government bonds at a significant pace, a direct injection of liquidity across fixed-income markets that can also contribute to economic growth.

The COVID-19 trajectory is unlikely to lead to national lockdowns. The recent surge in new infections is unfortunate and concerning; however, as was the case in the summer, the U.S. economy remains resilient in the face of localized shutdowns. We believe that public health measures and vaccine availability will drive the pandemic’s economic impact. Hopefully by January, we could be past the peak of new cases and closer to available vaccines. Drug development pipelines remain on track to deliver some scaled vaccine distribution by summer 2021. more>

Updates from McKinsey

Unlocking value: Four lessons in cloud sourcing and consumption
Companies that are successful in sourcing and managing the consumption of cloud adopt a more dynamic, analytical, and demand-driven mindset.
By Abhi Bhatnagar, Will Forrest, Naufal Khan, and Abdallah Salami – Cloud adoption is no longer a question of “if” but of “how fast” and “to what extent.” Between 2015 and 2020, the revenue of the big-three public cloud providers (AWS, Microsoft Azure, and Google Cloud Platform) has quintupled, and they have more than tripled their capital-expenditures investment to meet increasing demand. And enterprises are ever more open to cloud platforms: more than 90 percent of enterprises reported using cloud technology in some way.

These trends reflect a world where enterprises increasingly “consume” infrastructure rather than own it. The benefits of this model are plentiful. Cloud adopters are attracted by the promise of flexible infrastructure capacity, rapid capacity deployment, and faster time to market for digital products. The COVID-19 crisis has accentuated the need for speed and agility, making these benefits even more important. From an infrastructure-economics perspective, perhaps the most attractive innovation of cloud is the ability to tailor the consumption of infrastructure to the needs of the organization. This promises greater economic flexibility by transforming underutilized capital expenditures into optimally allocated operations expenditures.
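
A back-of-the-envelope sketch, using entirely invented numbers, shows why tailoring consumption to demand matters: owning capacity means paying for peak capacity around the clock, while a consumption model charges only for what is actually used each hour.

```python
# Illustrative comparison of owned vs. consumed infrastructure cost.
# All figures are hypothetical; real cloud pricing varies widely.

UNIT_COST = 0.10  # assumed cost per server-hour

# Assumed demand over a 24-hour day, in servers needed each hour.
demand = [20, 18, 15, 12, 12, 15, 30, 55, 70, 80, 85, 90,
          88, 85, 80, 75, 70, 65, 60, 50, 40, 35, 30, 25]

# Owned model: provision for the peak, pay for it every hour of the day.
peak = max(demand)
owned_spend = peak * len(demand) * UNIT_COST

# Consumption model: pay only for the servers actually used each hour.
cloud_spend = sum(demand) * UNIT_COST

waste = 1 - cloud_spend / owned_spend
print(f"Owned (provisioned for peak): ${owned_spend:.2f}")
print(f"Consumed (pay per use):       ${cloud_spend:.2f}")
print(f"Idle capacity paid for:       {waste:.0%}")
```

With these made-up numbers, owning peak capacity costs nearly twice as much as paying for actual usage, which is the capital-expenditure-to-operations-expenditure shift described above.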

While this concept is attractive in theory, many enterprises are facing challenges in capturing the value in reality. Enterprises estimate that around 30 percent of their cloud spend is wasted. Furthermore, around 80 percent of enterprises consider managing cloud spend a challenge. Thus, even though more than 70 percent of enterprises cite optimizing cloud spend as a major goal, realizing value remains elusive.

In our experience, a major driver of value capture is transforming the approach to sourcing and consuming cloud. Enterprises that approach this task with a traditional sourcing and infrastructure-consumption mindset are likely to be surprised by the bill. The flexibility to consume cloud as needed and cost-effectively places responsibility on enterprises to maintain a real-time view of their needs and continuously make deliberate decisions on how best to adjust consumption. more>


Updates from ITU

G20: Call to action on international standards
ITU – Organizers of the Riyadh International Standards Summit held on 4 November 2020 issued a call to action for the recognition, support and adoption of international standards. This is the first ever summit on standardization held within G20-related activities.

The Riyadh International Standards Summit was initiated by Saudi Standards, Metrology and Quality Organization (SASO) and was organized with the International Electrotechnical Commission (IEC), International Organization for Standardization (ISO), International Telecommunication Union (ITU), Saudi Communications & Information Technology Commission (CITC), and Saudi Food and Drug Authority (SFDA). The event was hosted by SASO and the G20 Saudi Secretariat as part of the International Conferences Programme honouring the G20 Saudi presidency year, 2020. It forms part of the Kingdom of Saudi Arabia’s efforts, during its presidency, to enhance cooperation between countries of the world in various fields.

Originally intended to take place in the Kingdom of Saudi Arabia, which currently holds the G20 Presidency, the Summit instead took place virtually in light of the global pandemic and welcomed participants from all over the world.

The Riyadh International Standards Summit concluded with the call to action for “each country to recognize, support, and adopt international standards to accelerate digital transformation in all sectors of the economy to help overcome global crises, such as COVID-19, and contribute towards the achievement of the United Nations Sustainable Development Goals (SDGs)”. more>


Updates from Chicago Booth

Would you trust a machine to pick a vaccine?
Machine learning is being tasked with an increasing number of important decisions. But the answers it generates involve a degree of uncertainty.
By Emily Lambert – Back before COVID-19, Chicago Booth’s Sanjog Misra was on vacation in Italy with his son, who has a severe nut allergy. They were at a restaurant and couldn’t read the menu, so Misra opened an app that translates Italian to English, and pointed his phone at the menu to find out if one of the dishes was peanut free. The app said it was.

But as he prepared to order, Misra had a thought: Since the app was powered by machine learning, how much could he trust its response? The app didn’t indicate if the conclusion was 99 percent certain to be correct, or 80 percent certain, or just 51 percent. If the app was wrong, the consequences could be dire for his son.
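
For illustration, here is a minimal sketch of what exposing that uncertainty could look like: a softmax turns a model’s raw scores into probabilities, so an app could report its confidence alongside its answer. The scores, labels, and two-class setup are invented; this is not how any particular translation app works.

```python
# Sketch: a classifier that reports confidence with its prediction.
# Hypothetical scores and labels, for illustration only.
import math

def softmax(logits):
    """Convert raw model scores into probabilities that sum to 1."""
    exps = [math.exp(x) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

# Invented raw model scores for "contains nuts" vs. "nut free".
labels = ["contains nuts", "nut free"]
logits = [1.2, 2.0]

probs = softmax(logits)
best = max(range(len(labels)), key=lambda i: probs[i])
print(f"Prediction: {labels[best]} (confidence {probs[best]:.0%})")
```

With these numbers the model answers “nut free” at only about 69 percent confidence, which is exactly the kind of information a user weighing a severe allergy would want to see rather than a bare answer.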

Machine learning is increasingly ubiquitous. It’s inside your Amazon Alexa. It directs self-driving cars. It makes medical decisions and diagnoses. In the past few years, machine-learning methods have come to dominate data analysis in academia and industry. Some teachers are using ML to read students’ assignments and grade homework. There is evidence that machine learning outperforms dermatologists at diagnosing skin cancer. Researchers have used ML to mine research papers for information that could help speed the development of a COVID-19 vaccine.

They’re also using ML to predict the shapes of proteins, an important factor in drug development. “All of our work begins in a computer, where we run hundreds of millions of simulations. That’s where machine learning helps us find things quickly,” says Ian Haydon, the scientific communications manager for the Institute for Protein Design at the University of Washington. Some scientists there are developing vaccines, and others are trying to develop a drug compound that would stop the novel coronavirus from replicating. more>