Category Archives: Science

5 Techniques for Accelerating Engineering Development

By Jacob Beningo – Whether it's a parts company, a software supplier, a system integrator, or even a consultant, no one seems immune to the pressure to decrease costs and shorten time to market while improving product quality.

We want to do more at the same or better quality level, while also decreasing the resources we use to achieve our end goals.

That is not to say this is an impossible goal. In fact, it's quite attainable. In many cases it all comes down to engineering development time and costs.

Here are my top five techniques for accelerating engineering development. They are just a few examples of the low-hanging fruit that companies and developers can consider when trying to speed up their work.

  1. Master Your Defects
    Embedded software developers on average spend 20–40% of their time debugging their software. That sounds outrageous, but if you look at the Aspencore 2017 Embedded Survey results or speak to developers at embedded systems conferences, you'll find that figure is accurate!
  2. Have the Right Tools for the Job
    If you want to go fast, you need to have the right tools.
  3. Focus on Your Value; Outsource the Rest
    For engineers (and any business for that matter), it’s important to recognize what value you are bringing to the table.
  4. Leverage Existing Software Platforms
    Leveraging existing software platforms, even ones that are certified, can dramatically accelerate engineering development.
  5. Leverage Existing Hardware Platforms
    For many embedded products, the core hardware features tend to be the same. In fact, probably 80% is the same or similar guts and the remaining 20% is where companies differentiate.

more>

Updates from Ciena

Reducing resourcing challenges by out-tasking multi-vendor network infrastructure projects
In today’s increasingly complex multi-vendor network environments, many businesses are compelled to out-task their multi-vendor operations to a single provider of specialized network services. Ciena’s Atura Bavisi details the qualities needed when looking for the right multi-vendor services partner.
By Atura Bavisi – Businesses today are constantly changing, often in unique ways shaped by market-specific conditions, but they all share something in common: a complex network environment. Operators are always looking for ways to optimize their networks, reducing complexity while adding the flexibility to handle rapidly growing traffic demands.

These conditions often create a need for multi-vendor networks. If a business wants to reduce its OPEX and at the same time improve network performance without significantly increasing its IT resources, then buying network equipment from multiple vendors, and leveraging vendor-specific services to implement and maintain that disparate equipment, becomes critical.

However, multi-vendor projects come with their own set of challenges. For example, the multi-vendor approach often reduces visibility across the network, making it difficult to plan effectively or to provision resources quickly for new services. What's more, working with multiple suppliers and in-house service teams to design and deploy solutions can be prohibitively expensive and a logistical challenge, and it often requires multiple custom interfaces.

Very often, corporations are unable to recruit the highly specialized personnel needed to meet all the technical requirements of a multi-vendor network, and most vendors focus only on their own products and solutions. more>


Updates from Siemens

New technology in industry is creating a platform economy
By Frank_Fang – Twenty years ago, product-centric companies dominated a list of the most valuable companies in the world. The list was a Who’s Who of automotive, manufacturing, oil and gas, and brick-and-mortar retailers.

Today, platform-based businesses rule.

This new economy forces product-centric manufacturing companies to rethink how they transform digitally to survive and thrive in a data-rich market. It’s no secret that new technology and new approaches eventually supersede the old.

We’re witnessing one of these periods now. As manufacturers look for ways to radically redefine processes through the hype of the sharing economy, online platforms, the end of money, and all the other buzzwords people use today, the evolution of the digital twin will lead to a platform economy, a state Viktor Mayer-Schönberger foresees in his book Reinventing Capitalism in the Age of Big Data.

Digital twins, which evolved from decades of simulation and analysis in engineering, are high-fidelity models of actual physical objects such as a product or a production process. Using computer-aided design, model-based systems engineering, and multiphysics simulation tools, a designer or engineer creates a digital representation of a physical object or process.
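
As a rough, hypothetical sketch of the idea (not Siemens' tooling or any specific product), a digital twin can be thought of as a model that is kept in sync with telemetry from its physical counterpart and can then be simulated ahead of the real asset. In the Python sketch below, the class names, the first-order thermal model, and the sensor fields are all assumptions made purely for illustration.

# Hypothetical digital-twin sketch (illustrative names and parameters only):
# a first-order thermal model of a motor that is re-synchronized with sensor
# telemetry and can then be simulated ahead of the physical asset.
from dataclasses import dataclass

@dataclass
class MotorTelemetry:
    timestamp_s: float     # when the reading was taken, in seconds
    winding_temp_c: float  # measured winding temperature, in degrees C

class MotorDigitalTwin:
    def __init__(self, ambient_c=25.0, heat_gain_c_per_s=0.05, cooling_rate_per_s=0.002):
        self.temp_c = ambient_c                       # current model state
        self.ambient_c = ambient_c
        self.heat_gain_c_per_s = heat_gain_c_per_s    # heating rate at full load
        self.cooling_rate_per_s = cooling_rate_per_s  # Newtonian cooling constant
        self.last_update_s = 0.0

    def sync(self, reading: MotorTelemetry) -> None:
        """Overwrite model state with the latest measurement from the real motor."""
        self.temp_c = reading.winding_temp_c
        self.last_update_s = reading.timestamp_s

    def predict(self, horizon_s: float, load_fraction: float, dt_s: float = 1.0) -> float:
        """Simulate ahead of the physical asset without disturbing the synced state."""
        temp = self.temp_c
        elapsed = 0.0
        while elapsed < horizon_s:
            heating = self.heat_gain_c_per_s * load_fraction
            cooling = self.cooling_rate_per_s * (temp - self.ambient_c)
            temp += (heating - cooling) * dt_s
            elapsed += dt_s
        return temp

# Usage: synchronize with a sensor reading, then look one hour ahead at 90% load.
twin = MotorDigitalTwin()
twin.sync(MotorTelemetry(timestamp_s=0.0, winding_temp_c=48.0))
print(round(twin.predict(horizon_s=3600.0, load_fraction=0.9), 1))

The pattern in miniature: synchronize the twin with the latest measurement, then ask the model a question the physical motor cannot yet answer, such as how hot the windings will be after another hour at 90% load.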

The digital twin is no longer science fiction. For example, NASA used this approach to design, engineer, and produce two Mars missions: the Curiosity rover and the InSight lander.

Since you can’t build a Mars environment on earth, you simply bring Mars to the computer and digitally test your Mars rover. more>


To end poverty, think like a spy

By Paul M. Bisca – For anyone working to end poverty, fragile states call for the ultimate juggling act. Countries in conflict seldom control their territories, and even when most areas are at peace, others may still be engulfed by violence for decades to come.

The intensity of civil wars can ebb and flow, while forcibly displaced people cross borders in search of shelter. Politicians and warlords can shift alliances abruptly and neighboring states often interfere militarily to prop up local proteges.

When geopolitics is not at play, internal disputes over land, water, or other scarce resources can ignite fighting between local populations. To make sense of all these moving parts, even the most knowledgeable experts must look for new ways to comprehend the world.

What can be done?

To better manage the unknown, development professionals might want to take a leaf from the intelligence community’s book and draw inspiration from how spies try to predict the future. Reduced to its simplest terms, the CIA defines intelligence as “knowledge and foreknowledge of the world around us—the prelude to decisions by policymakers.”

Other definitions emphasize the collection, processing, integration, analysis, and interpretation of available information from closed and open sources.

Development practitioners are not spies, nor should they aspire to be. Further, the idea that project managers and economists should behave like spies is bound to raise eyebrows for professionals driven by the quest for sustainability and equity.

Yet, the methodology of intelligence is well-suited to paint in our minds the interplay of actions, information, and analysis needed to navigate the complex, uncertain, and downright dangerous environments where extreme poverty stubbornly persists.

This approach is not about acting like James Bond, but rather about thinking like him. more>

Updates from Georgia Tech

Signals from Distant Lightning Could Help Secure Electric Substations
By John Toon – Side channel signals and bolts of lightning from distant storms could one day help prevent hackers from sabotaging electric power substations and other critical infrastructure, a new study suggests.

By analyzing electromagnetic signals emitted by substation components using an independent monitoring system, security personnel could tell if switches and transformers were being tampered with in remote equipment. Background lightning signals from thousands of miles away would authenticate those signals, preventing malicious actors from injecting fake monitoring information into the system.
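
To make the authentication idea concrete, here is a minimal, hypothetical sketch in Python. It is not the published RFDIDS algorithm; the function names, tolerance, and threshold are assumptions for illustration. The premise is simply that lightning impulses (“sferics”) seen in the substation’s independent RF feed should line up in time with those reported by a distant reference receiver, while a replayed or fabricated feed will not.

# Hypothetical illustration of lightning-based authentication of an RF
# monitoring feed (not the published RFDIDS algorithm). Lightning impulses
# ("sferics") are visible both to the substation's independent RF monitor and
# to a distant reference receiver; a replayed or fabricated feed will not
# contain the same impulses at the same times.

def match_fraction(feed_events_s, reference_events_s, tolerance_s=0.002):
    """Fraction of reference lightning impulses that also appear in the feed,
    within +/- tolerance_s seconds."""
    if not reference_events_s:
        return 0.0
    matched = sum(
        1 for ref_t in reference_events_s
        if any(abs(ref_t - feed_t) <= tolerance_s for feed_t in feed_events_s)
    )
    return matched / len(reference_events_s)

def feed_is_authentic(feed_events_s, reference_events_s, threshold=0.8):
    """Trust the RF feed only if most reference sferics are present in it."""
    return match_fraction(feed_events_s, reference_events_s) >= threshold

# Usage: timestamps (in seconds) of lightning impulses seen by each receiver.
live_feed = [0.103, 2.451, 5.009, 7.772]
replayed_feed = [0.500, 3.100, 6.250, 9.900]   # recorded earlier, wrong sferics
reference = [0.104, 2.450, 5.010, 7.771]

print(feed_is_authentic(live_feed, reference))      # True  -> feed reflects the present
print(feed_is_authentic(replayed_feed, reference))  # False -> possible spoofing or replay

In a real deployment the comparison would run over continuous waveforms rather than hand-picked timestamps, but the sketch captures the core check: trust the monitoring feed only when nature’s own broadcast shows up inside it.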

The technique, developed by engineers at the Georgia Institute of Technology, has been tested at substations operated by two different electric utilities, as well as through extensive modeling and simulation. Known as the radio frequency-based distributed intrusion detection system (RFDIDS), it was described February 26 at the 2019 Network and Distributed System Security Symposium (NDSS) in San Diego.

“We should be able to remotely detect any attack that is modifying the magnetic field around substation components,” said Raheem Beyah, Motorola Foundation Professor in Georgia Tech’s School of Electrical and Computer Engineering and co-founder of Fortiphyd Logic, Inc. “We are using a physical phenomenon to determine whether a certain action at a substation has occurred or not.”

Opening substation breakers to cause a blackout is one potential power grid attack, and in December 2015, that technique was used to shut off power to 230,000 people in Ukraine. Attackers opened breakers in 30 substations and hacked into monitoring systems to convince power grid operators that the grid was operating normally. Topping that off, they also attacked call centers to prevent customers from telling operators what was happening. more>


Updates from Chicago Booth

Purely evidence-based policy doesn’t exist
By Lars Peter Hansen – Recently, I was reminded of the commonly used slogan “evidence-based policy.”

Except for pure marketing purposes, I find this terminology to be a misnomer, a misleading portrayal of academic discourse and the advancement of understanding. While we want to embrace evidence, the evidence seldom speaks for itself; typically, it requires a modeling or conceptual framework for interpretation.

Put another way, economists—and everyone else—need two things to draw a conclusion: data, and some way of making sense of the data.

That’s where modeling comes in. Modeling is used not only to aid our basic understanding of phenomena, but also to capture how we view any implied trade-offs for social well-being. The latter plays a pivotal role when our aim is to use evidence in policy design.

This is intuitive if you think about the broad range of ideas and recommendations surrounding macroeconomic policy and the spirited, sometimes acrimonious way in which they’re debated.

If everything were truly evidence based, to the extent we can agree on the accuracy of the evidence, why would there be such heterogeneity of opinion? The disagreement stems from the fact that people are using different models or conceptual frameworks, each with its own policy implications.
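
A toy example makes the point concrete. In the Python sketch below (entirely made-up data and models, not drawn from Hansen’s work), the same observations on tax rates and revenue are read through two different models: a linear model implies that raising rates always raises revenue, while a quadratic model implies a revenue-maximizing rate beyond which further increases backfire. The evidence is identical; the policy conclusions are not.

# Made-up data: the shared 'evidence' that both analysts see.
import numpy as np

rng = np.random.default_rng(0)
tax_rate = np.linspace(0.10, 0.40, 30)                              # observed policy variable
revenue = 100 * tax_rate * (1 - tax_rate) + rng.normal(0, 0.5, 30)  # noisy observed outcomes

# Model A: revenue is linear in the tax rate -> conclusion: raising rates always helps.
slope_a = np.polyfit(tax_rate, revenue, 1)[0]

# Model B: revenue is quadratic in the tax rate -> conclusion: there is a peak rate.
b2, b1, _b0 = np.polyfit(tax_rate, revenue, 2)
peak_rate_b = -b1 / (2 * b2)

print(f"Model A: revenue rises by about {slope_a:.1f} per unit of tax rate -> keep raising rates")
print(f"Model B: revenue peaks near a {peak_rate_b:.0%} tax rate -> stop raising rates there")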

Each of them might be guided by evidence, but policy conclusions can rarely be drawn directly from the evidence itself. more>


Updates from Ciena

Following the 3-pillar approach to effective security strategy
Large-scale data breaches are reported in the press almost daily, with devastating consequences for the organizations and individuals involved. A multi-layer security strategy minimizes cybersecurity risks for your organization and streamlines the compliance journey in the run-up to upcoming legislation.
By Paulina Gomez – Technology innovation – the continued evolution of cloud computing, the rapid growth of the Internet of Things (IoT), and the rise of Artificial Intelligence (AI) – is expected to drive a 100x increase in connected devices and a 1,000x increase in data traffic by 2020 (2016 Mobility Report, November 2016, Ericsson). Each new device doesn’t just drive traffic; it also dramatically expands the network attack surface, giving cybercriminals more openings to exploit with increasingly sophisticated methods.

In response to the rapidly evolving cybersecurity threat landscape, regulations around the world are upping the pressure on organizations to protect their sensitive customer and operational data. The maximum fine for a data breach under the upcoming European General Data Protection Regulation (GDPR), for example, could be up to 4% of global revenues, enough to put even large organizations out of business.

How can an organization minimize its security risks? It’s about more than just encryption and firewalls. A comprehensive, multi-layer security strategy is vital to an effective defense.

By following these three key pillars to achieve the confidentiality, integrity, and availability of data in your network, you will be protecting your data, your customers, and your business. more>

A new generation of young managers is reshaping how we work

By Stephane Kasriel – No matter where you look, so much rapid change is happening that even how companies manage their talent strategy is shifting. Gone are the days of HR managing workforce planning with an Excel spreadsheet. To remain not only competitive but relevant, more companies are turning to detailed workforce plans, and younger generations of managers are much more likely to be putting these plans in place. As they do, and as they ascend to more senior roles, they’re reshaping the future of work.

More than half of the younger-generation managers polled see future workforce planning as a “top priority” for their departments, nearly three times the share of their baby boomer counterparts, according to my company Upwork’s 2019 Future Workforce Report.

Whereas baby boomers are known for keeping their employees close, millennials, who now make up the largest generation in the U.S. workforce, overwhelmingly desire “flexible and fluid” work settings.

Younger generation managers are also more likely to see it as an individual’s right to work remotely. After all, they’ve grown up in the digital era. They do not understand why someone should be tethered to a desk nine-to-five if modern technology frees them to work anytime, anywhere, and from any connected device.

In fact, many believe they are more productive working remotely than they would be in rigid office environments with all of their distractions. more>

Why We Stink at Tackling Climate Change

By David P. Barash – What’s wrong with us? Not us Democrats, Republicans, or Americans. Rather, what’s wrong with our species, Homo sapiens?

If human beings are as Hamlet suggested, “noble in reason, infinite in faculty,” then why are we facing so many problems?

In many ways, people are better off than ever before: reduced infant mortality, longer lifespans, less poverty, fewer epidemic diseases, even fewer deaths per capita due to violence.

And yet global threats abound and by nearly all measures they are getting worse: environmental destruction and wildlife extinction, ethnic and religious hatred, the specter of nuclear war, and above all, the disaster of global climate change.

For some religious believers, the primary culprit is original sin. For ideologues of left, right, and otherwise, it’s ill-functioning political structures.

From my biological perspective, it’s the deep-seated disconnect between our slow-moving, inexorable biological evolution and its fast-moving cultural counterpart—and the troublesome fact we are subject to both, simultaneously.

It seems inevitable that as these cultural skills developed and provided leverage over the material and natural world—not to mention over other human beings, less adroit at these things—natural selection favored those individuals most able to take advantage of such traits. Up to a point, our biological and cultural evolution would have been mutually reinforcing. We are now past that point.

There is no reason for our biological and cultural evolution to proceed in lockstep, and many reasons for them to have become disconnected. more>

When the monsoon goes away

By Sunil Amrith – More than 70 per cent of total rainfall in South Asia occurs during just three months each year, between June and September. Within that period, rainfall is not consistent: it is compressed into a total of just 100 hours of torrential rain, spread across the summer months. Despite advances in irrigation, 60 per cent of Indian agriculture remains rain-fed, and agriculture employs around 60 per cent of India’s population. No comparable number of human beings anywhere in the world depend on such seasonal rainfall.

Both before and after independence, the imperious power of the monsoon troubled India’s rulers. In the first decade of the 20th century, the finance minister in the imperial government declared that ‘every budget is a gamble on the rains’ – a statement that is still quoted regularly in the Indian media.

In the late 1960s, India’s prime minister Indira Gandhi said: ‘For us in India, scarcity is only a missed monsoon away.’ The foreboding remains.

The monsoon system exists at a scale far beyond human intervention. If technology could intervene, it was on the landscape, in the form of infrastructure. By the early 20th century, engineers around the world were confident that they could neutralize the risk of climatic variability by constructing dams that would fuse water storage, flood control, irrigation, and power generation. India was no exception. more>