Category Archives: Science

Are We Living Through Climate Change’s Worst-Case Scenario?

By Robinson Meyer – The year 2018 was not an easy one for planet Earth.

In the United States, carbon emissions leapt back up, making their largest year-over-year increase since the end of the Great Recession. This matched the trend across the globe. According to two major studies, greenhouse-gas emissions worldwide shot up in 2018—accelerating like a “speeding freight train,” as one scientist put it.

Many economists expect carbon emissions to drop somewhat throughout the next few decades. But maybe they won’t. If 2018 is any indication, meekly positive energy trends will not handily reduce emissions, even in developed economies like the United States. It raises a bleak question:

Are we currently on the worst-case scenario for climate change?

When climate scientists want to tell a story about the future of the planet, they use a set of four standard scenarios called “representative concentration pathways,” or RCPs. RCPs are ubiquitous in climate science, appearing in virtually any study that uses climate models to investigate the 21st century. They’ve popped up in research about subjects as disparate as southwestern mega-droughts, future immigration flows to Europe, and poor nighttime sleep quality.

Each RCP is assigned a number that describes how the climate will fare in the year 2100. Generally, a higher RCP number describes a scarier fate: It means that humanity emitted more carbon dioxide into the atmosphere during the 21st century, further warming the planet and acidifying the ocean. The best-case scenario is called RCP 2.6. The worst case is RCP 8.5.
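For context, the RCP numbers are not arbitrary labels: each one is the radiative forcing, in watts per square meter, that the scenario reaches by 2100. A minimal sketch of the mapping, with informal paraphrases of each pathway (not official IPCC text):

    # Radiative forcing (W/m^2) in 2100 that defines each RCP scenario.
    # The one-line descriptions are informal paraphrases, not official text.
    RCP_FORCING = {
        "RCP2.6": (2.6, "aggressive mitigation; emissions peak early, then decline"),
        "RCP4.5": (4.5, "moderate mitigation; emissions stabilize mid-century"),
        "RCP6.0": (6.0, "limited mitigation; emissions peak late in the century"),
        "RCP8.5": (8.5, "little to no mitigation; emissions keep rising"),
    }

    def describe(scenario: str) -> str:
        forcing, summary = RCP_FORCING[scenario]
        return f"{scenario}: {forcing} W/m^2 of forcing by 2100 ({summary})"

    print(describe("RCP8.5"))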

“God help us if 8.5 turns out to be the right scenario,” Rob Jackson, an earth scientist at Stanford, told me. more>

Updates from Chicago Booth

How to make money on Fed announcements—with less risk
By Dee Gill – Andreas Neuhierl and Michael Weber find that investors who bought or shorted markets in the roughly 40 days before and after Federal Open Market Committee (FOMC) announcements that ran counter to market expectations earned gains of about 4.5 percent. The findings suggest investors can make money on these “surprises” even if they did not take positions before the announcements.

Markets routinely forecast the content of FOMC announcements, which reveal the Fed’s new target interest rates, and usually react when the Fed does not act as expected. An FOMC announcement is an expansionary surprise when its new target rate is lower than the market forecasts and contractionary when it’s higher than expectations.

Share prices moved predictably ahead of and following both types of surprises, the study notes. Prices began to rise about 25 days ahead of an expansionary surprise, for about a 2.5 percent gain during that time. Before a contractionary surprise, prices generally fell. The researchers find that the movements occurred in all industries except mining, where contractionary surprises tended to push share prices higher. more>
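As a rough sketch of the mechanics described above, the toy Python below classifies a surprise and compounds a hypothetical pre-announcement drift. Every number and function name here is invented for illustration; in practice the expected rate would come from fed funds futures or survey data.

    # Hypothetical illustration of the pre-FOMC drift described above.
    # All numbers are made up for illustration.

    def classify_surprise(announced_rate: float, expected_rate: float) -> str:
        """Expansionary if the new target is below expectations, contractionary if above."""
        if announced_rate < expected_rate:
            return "expansionary"
        if announced_rate > expected_rate:
            return "contractionary"
        return "no surprise"

    def cumulative_return(daily_returns):
        """Compound a series of simple daily returns."""
        total = 1.0
        for r in daily_returns:
            total *= 1.0 + r
        return total - 1.0

    # A 0.1% average daily gain over the ~25 days the study says prices
    # drift upward ahead of an expansionary surprise compounds to ~2.5%.
    print(classify_surprise(announced_rate=2.00, expected_rate=2.25))  # expansionary
    print(f"{cumulative_return([0.001] * 25):.1%}")                    # ~2.5%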

Related>

The Future of Machine Learning: Neuromorphic Processors

By Narayan Srinivasa – Machine learning has emerged as the dominant tool for implementing complex cognitive tasks, resulting in machines that have demonstrated, in some cases, superhuman performance. However, these machines require training with large amounts of labeled data, and this energy-hungry training process has often been prohibitive in the absence of costly supercomputers.

The way animals and humans learn is far more efficient, driven by the evolution of a different processor in the form of a brain, one that simultaneously optimizes the energy cost of computation and the efficiency of information processing. The next generation of computers, called neuromorphic processors, will strive to strike this delicate balance between the efficiency of computation and the energy it requires.

The foundation for the design of neuromorphic processors is rooted in our understanding of how biological computation is very different from the digital computers of today (Figure).

The brain is composed of noisy analog computing elements, including neurons and synapses. Neurons operate as relaxation oscillators. Synapses are implicated in memory formation in the brain and can resolve only about three to four bits of information each. The brain operates using a plethora of rhythms but without any global clock (i.e., it is clock-free), and the dynamics of these elements unfold asynchronously. more>
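To make the contrast with clocked digital logic concrete, here is a minimal leaky integrate-and-fire neuron, a standard simplification used in neuromorphic work. The parameters are illustrative rather than taken from any particular chip, and the 4-bit weights echo the three-to-four-bit synapse estimate above.

    import random

    LEAK = 0.95          # membrane potential decays each step (the relaxation)
    THRESHOLD = 1.0      # potential at which the neuron fires a spike
    WEIGHT_LEVELS = 16   # 4-bit synapses, echoing the 3-4 bits cited above

    def quantize(weight: float) -> float:
        """Snap a synaptic weight in [0, 1] to one of 16 discrete levels."""
        return round(weight * (WEIGHT_LEVELS - 1)) / (WEIGHT_LEVELS - 1)

    def simulate(input_spikes, weight, noise=0.02):
        """Integrate noisy, weighted input spikes; fire and reset at threshold."""
        w = quantize(weight)
        potential, output = 0.0, []
        for spike in input_spikes:
            potential = potential * LEAK + spike * w + random.gauss(0.0, noise)
            if potential >= THRESHOLD:
                output.append(1)   # spike
                potential = 0.0    # reset after firing
            else:
                output.append(0)
        return output

    print(simulate([1, 1, 1, 0, 1, 1, 1, 1], weight=0.4))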

Updates from Ciena

Fiber densification without complexity the goal of two new Ciena platforms
By Helen Xenos – Providing a first-class experience is the primary driver of densification – allowing network providers to deliver higher-capacity connectivity that preserves content quality even as new high-capacity applications arrive. Ease of deployment and operations is also critical, with the quick rollout of network resources giving end-users faster access to the latest digital services they rely on.

While network providers are starting to see that this is the destination they must reach, the question is how to get there.

Today, Ciena comes ready to enable network providers to adapt to these evolving networking requirements with two new products – the 6500 Reconfigurable Line System (RLS) and the 8180 Coherent Networking Platform. With these new products, Ciena is tackling fiber densification challenges head-on to drive a better customer experience in metro and long-haul Data Center Interconnect (DCI), cable access modernization, and 4G/5G infrastructure builds.

The 6500 Reconfigurable Line System (RLS) is a programmable, open, modular line system that scales to support the highest bandwidth requirements of metro and long-haul DCI as well as cable access applications. more>

Related>

Updates from Siemens

Manufacturing Execution System
Siemens – Siemens’ Manufacturing Execution System (MES) ensures that quality and efficiency are built into the manufacturing process and that they are proactively and systematically enforced.

The Manufacturing Execution System connects multiple plants and sites, and integrates easily with equipment, controllers, product lifecycle and enterprise business applications, quality management and laboratory management systems, and many other applications. The result is complete visibility, control and optimization of production and processes across the enterprise.

Benefits of Siemens Manufacturing Execution System:

  • Proactive Control & Quality
  • Granular Enterprise Visibility
  • True Continuous Improvement
  • Brand Risk Reduction
  • Improved Profit Margin

more>

Updates from Ciena

5 Ways DCI Growth is Driving New Innovations in Transport Networking

By Kent Jordan – Data center interconnect (DCI) is at the heart of new global business models, cloud adoption, and digital content delivery and services. Cloud, ICP, and colocation operators are dominating DCI sales, and DCI is becoming more crucial for other industries as well.

According to the Equinix Global Interconnection Index, global interconnection bandwidth is forecast to grow to over 8,200 Tbps by 2021, substantially higher than last year’s projection.

Telecommunications, manufacturing, and banking are all expected to be large contributors to total interconnect bandwidth by 2021. Smaller traffic areas, such as wholesale, retail, and healthcare, are also expected to grow at double-digit rates leading to the need for higher capacity services over time.

Purpose-built, compact, modular systems have sprouted up, offering massive scalability to enable global deployments while reducing operational expenses related to data center space, power, and cooling. These systems offer modularity and pay-as-you-grow scalability for lower traffic scenarios, so enterprises can cost-effectively scale connectivity for cloud services and applications. more>

Related>

Updates from Siemens

Aerospace and Defense Verification Management
Siemens – Our aerospace and defense verification management solution helps companies achieve faster time to certification by providing a single, integrated environment that ensures all product verification events, whether simulation modeling and analysis or physical tests, are driven by requirements, planned and executed in the correct sequence, linked to the necessary resources, and fully traceable.
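As a hedged illustration of what requirements-driven verification with full traceability implies as a data model, here is an invented sketch; the class, field, and requirement names are made up for this example and are not Siemens’ schema.

    from dataclasses import dataclass
    from typing import List, Optional

    # Invented data model: each verification event is driven by requirements
    # and linked to the resources it needs.
    @dataclass
    class VerificationEvent:
        name: str
        kind: str                      # "simulation" or "physical test"
        requirement_ids: List[str]     # requirements this event verifies
        resources: List[str]           # rigs, models, facilities it needs
        passed: Optional[bool] = None  # None until executed

    def uncovered(all_reqs, events):
        """Requirements with no verification event attached – a traceability gap."""
        covered = {r for e in events for r in e.requirement_ids}
        return set(all_reqs) - covered

    events = [
        VerificationEvent("wing-load FEA", "simulation", ["REQ-101"], ["FE model"]),
        VerificationEvent("static load test", "physical test", ["REQ-101", "REQ-102"], ["test rig"]),
    ]
    print(uncovered({"REQ-101", "REQ-102", "REQ-103"}, events))  # {'REQ-103'}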

For commercial aircraft development and certification and military development and qualification, increasing global competition puts contractors under pressure to win new orders and to deliver on time and at cost. Aerospace and defense companies must also demonstrate, in an auditable and efficient manner, that program requirements are achieved through successful test definition, simulation, planning and execution.

Successful product launches and customer acceptance require manufacturers to verify that product requirements have been fulfilled throughout the design and development of the product. more>

Related>

Updates from Ciena

The Story Behind the First Reliable Trans-Atlantic Submarine Cable Laid 150 Years Ago
By Brian Lavallée – As mentioned in a previous blog, undersea cable networks deployed around the world carry close to 100% of all intercontinental communications traffic, but they’re not a new phenomenon by any means. In fact, this week is the 150th anniversary of the first reliable trans-Atlantic telegraph cable, which was put into service way back in 1866. You’re not hallucinating; it was indeed a century and a half ago!

The 1866 submarine cable snaked along the Atlantic Ocean seabed to connect Telegraph Field at Foilhommerum Bay on Valentia Island (Ireland) to Heart’s Content in Newfoundland (now part of Canada). The 1866 cable wasn’t actually the first trans-Atlantic submarine cable, though; it was the fourth attempt and the first to succeed, after failures in 1857, 1858, and 1865. If at first you don’t succeed, try, try, and try again – and they did.

The first message sent successfully across a trans-Atlantic cable went through on August 16, 1858, ushering in an era of drastically reduced communication times.

The first repeatered trans-Atlantic cable was TAT-1 deployed nearly a century later in 1956, which used such newfangled technologies as coaxial cable, polyethylene insulation instead of gutta-percha tree sap, reliable vacuum tubes in submerged repeaters instead of newly introduced (and untrusted) transistors, as well as other engineering improvements in the 1950s. TAT-1 was a submerged fossil by today’s standards, but an absolutely critical step to where we are today.

What will future generations think of the submarine cables that we’re so proud and fond of today? Will today’s cables be viewed in the future the same way we view 8-track tapes today? more>

Related>

How a World Order Ends

And What Comes in Its Wake
By Richard Haass – A stable world order is a rare thing. When one does arise, it tends to come after a great convulsion that creates both the conditions and the desire for something new. It requires a stable distribution of power and broad acceptance of the rules that govern the conduct of international relations. It also needs skillful statecraft, since an order is made, not born. And no matter how ripe the starting conditions or strong the initial desire, maintaining it demands creative diplomacy, functioning institutions, and effective action to adjust it when circumstances change and buttress it when challenges come.

Eventually, inevitably, even the best-managed order comes to an end. The balance of power underpinning it becomes imbalanced. The institutions supporting it fail to adapt to new conditions. Some countries fall, and others rise, the result of changing capacities, faltering wills, and growing ambitions. Those responsible for upholding the order make mistakes both in what they choose to do and in what they choose not to do.

But if the end of every order is inevitable, the timing and the manner of its ending are not. Nor is what comes in its wake. Orders tend to expire in a prolonged deterioration rather than a sudden collapse. And just as maintaining the order depends on effective statecraft and effective action, good policy and proactive diplomacy can help determine how that deterioration unfolds and what it brings. Yet for that to happen, something else must come first: recognition that the old order is never coming back and that efforts to resurrect it will be in vain.

As with any ending, acceptance must come before one can move on.

Although the Cold War itself ended long ago, the order it created came apart in a more piecemeal fashion—in part because Western efforts to integrate Russia into the liberal world order achieved little. One sign of the Cold War order’s deterioration was Saddam Hussein’s 1990 invasion of Kuwait, something Moscow likely would have prevented in previous years on the grounds that it was too risky. Although nuclear deterrence still holds, some of the arms control agreements buttressing it have been broken, and others are fraying.

The liberal order is exhibiting its own signs of deterioration. Authoritarianism is on the rise not just in the obvious places, such as China and Russia, but also in the Philippines, Turkey, and eastern Europe. Global trade has grown, but recent rounds of trade talks have ended without agreement, and the World Trade Organization (WTO) has proved unable to deal with today’s most pressing challenges, including nontariff barriers and the theft of intellectual property.

Resentment over the United States’ exploitation of the dollar to impose sanctions is growing, as is concern over the country’s accumulation of debt. more>

Updates from Ciena

How will Adaptive IP change your IP networks?
By James Glover – Over the last several years, network operators have been searching for ways to control costs and accelerate innovation while avoiding heavily integrated solutions that lock them into a single vendor. This search has led to explosive growth and innovation in the “open source” software and hardware communities, which facilitate increased choice among best-in-breed network solutions and services.

Disaggregation, programmability, and open Application Programming Interfaces (APIs) are together playing a major role in disrupting legacy network designs by shifting innovation from hardware to software. The software-based virtualization of network functions and services allows for improved scaling and flexibility via a new approach to designing, deploying, and managing network architecture.

Today there is a disruptive trend toward virtualization of applications and services and disaggregation (separation of hardware and software) of infrastructure. Combine this trend with the never-ending need for bandwidth, scaling, and flexibility in network deployments, and we see an entirely new approach to network architecture emerging.

The entire end-to-end network must be agile, providing compute, storage, and networking resources when and where they are required by leveraging programmable resources that don’t demand physical reconfiguration to accommodate evolving service demands. This has led to open source discussions around open APIs, such as NETCONF/YANG, routing protocol extensions and enhancements, path computation, remote procedure calls, and so on.
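As one concrete example of what such an open API looks like in practice, here is a minimal NETCONF fetch using the open-source Python ncclient library; the device address and credentials below are placeholders.

    from ncclient import manager  # open-source NETCONF client (pip install ncclient)

    # Placeholder address and credentials – substitute your device's values.
    with manager.connect(
        host="192.0.2.1",        # documentation address, not a real device
        port=830,                # standard NETCONF-over-SSH port
        username="admin",
        password="admin",
        hostkey_verify=False,    # acceptable in a lab sketch; verify keys in production
    ) as m:
        # Retrieve the running configuration as a YANG-modeled XML document.
        running = m.get_config(source="running")
        print(running.data_xml[:500])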

But what about OPEX? How can operators address costs that scale linearly as the network grows?

Through automation and orchestration. more>

Related>