Category Archives: Net

Updates from Ciena

Pluggables: Their role in coherent technology evolution

By Patricia Bower – In the optical networking industry, pluggable client optical modules are a dominant trend for very short links within buildings and campus networks. The market economics driving the proliferation of these pluggables include simplicity, interoperability and volume-driven cost. And in the domain of short-reach (sub-10km), point-to-point fiber optic connections, these advantages of small form-factor, pluggable modules shine through.

This is particularly so in the case where transport of high-speed Ethernet client signals is the primary requirement. Connectivity within and between data centers has grown at a very rapid rate over the last few years, both from the perspective of transmission speed and number of connections. The use of optical signaling to transport these high-speed Ethernet signals has proven to be very efficient.

The optical networking industry has a well-established and large ecosystem of vendors bringing small form-factor client modules to market. Many of these are supported by MSAs (Multi-Source Agreements), which are of two types: those that define optical transmission specifications and those that define mechanical forms.

More recently, the data rates supported by pluggable form factors have increased. The 100G Lambda MSA group, of which Ciena is a member, has exhibited live demonstrations of interoperable Ethernet modules from member companies. The 100G Lambda MSA specifies 100 Gb/s links over 2 km and 10 km of single-mode fiber (SMF), and 400 Gb/s links over 2 km of SMF. These modules are based on PAM-4 coding to reach a data rate of 100 Gb/s per wavelength. more>
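As a rough sanity check on that arithmetic: PAM-4 carries log2(4) = 2 bits per symbol, and the line coding used in these Ethernet specs (256b/257b transcoding plus RS(544,514) FEC) expands the payload by a factor of 17/16, which is how a 100 Gb/s lane lands at the 53.125 GBd symbol rate. A minimal sketch (the function name is ours; the 17/16 overhead is the standard IEEE 802.3 factor, and other framings would shift the result slightly):

```python
import math

def pam_symbol_rate_gbd(payload_gbps: float, levels: int,
                        overhead: float = 17 / 16) -> float:
    """Symbol (baud) rate needed to carry a payload after line-coding
    and FEC expansion; 17/16 is the 256b/257b + RS(544,514) factor."""
    bits_per_symbol = math.log2(levels)   # PAM-4 -> 2 bits/symbol
    return payload_gbps * overhead / bits_per_symbol

print(pam_symbol_rate_gbd(100, 4))   # 53.125 GBd per wavelength
print(pam_symbol_rate_gbd(100, 2))   # 106.25 GBd if NRZ were used instead
```

The comparison with NRZ (2 levels) shows why PAM-4 matters here: halving the symbol rate keeps the signal within the bandwidth of cost-optimized short-reach optics.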

Related>

Updates from Adobe

The Art of the Unnatural
By Brendan Seibel – When he was a kid, Jason DeMarte enjoyed visiting natural history museums to see the dioramas filled with taxidermy wildlife and carefully positioned plants.

As an adult, he determined that those scenes are intended more to capture the imagination than to document reality. And while the diorama designers’ motives may be pure, there is a darker side.

DeMarte saw a correlation between the museums’ “perfect, pristine snippets of nature” and product photography of the sort he’d done for Toys ’R’ Us to pay the bills while earning an MFA in photography. That experience of creating flawless images of merchandise exemplified photography’s role in cultivating consumer desire for false perfection through manipulation and good lighting.

“I started thinking about nature in a different way, as a commodity, as a way of packaging, promoting, and selling a commodifiable object,” DeMarte says.

The disconnect between manufactured perceptions of nature and the imperfect reality has been DeMarte’s artistic focus ever since. His work is a commentary on the artifice underpinning our concept of the world, as well as our constant desire for something “better.” more>

Related>

Updates from ITU

Monitoring our changing planet
By Houlin Zhao – The Earth is a fragile planet with finite resources to sustain the world’s growing population. As we work together to build a sustainable global economy, spaceborne remote sensors are poised to play an increasingly important role in achieving the United Nations’ Sustainable Development Goals (SDGs).

Indeed, ITU Member States and the global community now see the potential for using Earth observations and geospatial information as fundamental inputs for achieving the SDGs. Remote sensing provides critical information across a wide range of applications, including air quality, disaster management, public health, agriculture, water availability, coastal zone management, and the health of the Earth’s ecosystems.

For example, spaceborne sensing data is used to assess the impact of natural disasters and to be better prepared for hazardous events around the globe. Data from spaceborne remote sensors is also increasingly used to guide efforts to minimize the damage that urban growth has on the environment.

These are just a few examples of how remote sensing measurements — and the science they enable — provide a great service to humanity. This edition of the ITU News Magazine provides more such examples and a wealth of insight into how ITU’s work helps realize the social and economic benefits of Earth observation from space. more (pdf)>

Related>

Tools for thinking: Isaiah Berlin’s two concepts of freedom

By Maria Kasmirli – ‘Freedom’ is a powerful word.

We all respond positively to it, and under its banner revolutions have been started, wars have been fought, and political campaigns are continually being waged.

But what exactly do we mean by ‘freedom’?

The fact that politicians of all parties claim to believe in freedom suggests that people don’t always have the same thing in mind when they talk about it.

Might there be different kinds of freedom and, if so, could the different kinds conflict with each other? Could the promotion of one kind of freedom limit another kind? Could people even be coerced in the name of freedom?

The 20th-century political philosopher Isaiah Berlin (1909-97) thought that the answer to these questions was ‘Yes’, and in his essay ‘Two Concepts of Liberty’ (1958) he distinguished two kinds of freedom (or liberty; Berlin used the words interchangeably), which he called negative freedom and positive freedom.

Negative freedom is freedom from interference. You are negatively free to the extent that other people do not restrict what you can do. If other people prevent you from doing something, either directly by what they do, or indirectly by supporting social and economic arrangements that disadvantage you, then to that extent they restrict your negative freedom.

Berlin stresses that it is only restrictions imposed by other people that count as limitations of one’s freedom. Restrictions due to natural causes do not count. The fact that I cannot levitate is a physical limitation but not a limitation of my freedom. more>

Updates from Ciena

The implications behind service and content provider requirements for coherent optical solutions
By Helen Xenos – In 2007, I was asked to write a white paper about this really cool new “coherent technology” our team was working on and explain how the coherent receiver would completely revolutionize optical networks as we knew them. As I tried to get started, I quickly learned that the only sources of content were the engineers actually working on the project – scrolling through pages upon pages of Google search results netted zero information.

In the end, I wrote the paper by transcribing what a dear co-worker and mentor, Michel Belanger, who was one of the designers, patiently explained to me (it took several hours). He made sure I understood the significance of coherent technology and how it would change the game in optical networks.

Fast forward a dozen years – there is no shortage of information pertaining to coherent technology, and there are about a dozen coherent module and system suppliers. Coherent optical systems have become the networking foundation that underpins the digital economy that we know today.

Network providers are ubiquitously deploying coherent to scale networks for capacity, reduce transport costs and provide a better end-user experience to their customers. In fact, they are now looking at expanding the role that coherent technology plays in the network, deploying it in space- and power-optimized applications in addition to traditional infrastructure, submarine and data center interconnect (DCI) build-outs.

As coherent technology plays an increasingly critical role for successful network evolution, we must step back and ask ourselves:

  • What do network providers need from their coherent solution partners to succeed?
  • What are the implications of divergent customer and networking requirements for the suppliers of the technology?

more>

Related>

Artificial Intelligence and the Future of Humans

Experts say the rise of artificial intelligence will make most people better off over the next decade, but many have concerns about how advances in AI will affect what it means to be human, to be productive and to exercise free will.
By Janna Anderson, Lee Rainie and Alex Luchsinger – Digital life is augmenting human capacities and disrupting eons-old human activities. Code-driven systems have spread to more than half of the world’s inhabitants in ambient information and connectivity, offering previously unimagined opportunities and unprecedented threats.

As emerging algorithm-driven artificial intelligence (AI) continues to spread, will people be better off than they are today?

Some 979 technology pioneers, innovators, developers, business and policy leaders, researchers and activists answered this question in a canvassing of experts conducted in the summer of 2018.

The experts predicted networked artificial intelligence will amplify human effectiveness but also threaten human autonomy, agency and capabilities. They spoke of wide-ranging possibilities: that computers might match or even exceed human intelligence and capabilities on tasks such as complex decision-making, reasoning and learning, sophisticated analytics and pattern recognition, visual acuity, speech recognition and language translation. They said “smart” systems in communities, in vehicles, in buildings and utilities, on farms and in business processes will save time, money and lives and offer opportunities for individuals to enjoy a more-customized future.

Yet, most experts, regardless of whether they are optimistic or not, expressed concerns about the long-term impact of these new tools on the essential elements of being human. more>

Updates from Siemens

Why fulfilling airworthiness requirements means going digital
By Dave Chan and John Cunneen – Any organization that must consistently prove compliance with airworthiness requirements can relate to the frustrating task of locating and providing proof that its products will perform in accordance with standards, rules and laws in a myriad of countries.

Nowhere is this more true than in the aerospace industry, where everything is built on safety. Every rule, every design requirement has blood on it. These rules exist because someone was or could be hurt, a plane could crash, or any number of catastrophic incidents could occur.

This is why there are rigorous standards in place to ensure anything that can take off and land, from the smallest glider and helicopter to the largest commercial airliner and military jet, must receive and maintain an airworthiness certificate. The process of aircraft certification can be daunting, simply because many organizations don’t take a proactive approach from the development phase through delivery. more>

Related>

Updates from ITU

Time to eliminate the password: New report on next-generation authentication for digital financial services
By ITU News – “We don’t want digital financial services to be built on the wrong foundation, which is the password,” says Abbie Barbir, Rapporteur for ITU standardization work on ‘Identity management architecture and mechanisms’ (Q10/17).

Over 3 billion usernames and passwords were stolen in 2016, and the number of data breaches recorded in 2017 was 44.7 per cent higher than in 2016.

“We are moving away from the ‘shared secret’ model of authentication,” says digital ID strategist and standards expert, Andrew Hughes of InTurn Consulting, referring principally to the username-password model of authentication.

“Considering the prevalence of data breaches, there are no secrets anymore,” says Hughes.

Designed to overcome the limitations of passwords, specifications developed by the FIDO Alliance (‘Fast Identity Online’) enable users to authenticate locally to their device using biometrics, with the device then authenticating the user online with public key cryptography.

This model is not susceptible to phishing, man-in-the-middle attacks or other forms of attacks targeting user credentials.
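The model described above is, at its core, a public-key challenge-response: the server stores only a public key and sends a fresh random challenge; the device, after the local biometric check, signs the challenge with a private key that never leaves it. A deliberately toy sketch of that round trip follows, using a tiny textbook-RSA keypair for readability. This is an illustration of the idea only, not the FIDO protocol itself: real authenticators use full-strength keys (e.g. ECDSA P-256) in secure hardware, and CTAP/WebAuthn carries additional metadata.

```python
import hashlib
import secrets

# Deliberately tiny textbook-RSA keypair (illustration only; never use
# keys this small, and real authenticators keep the private key in
# secure hardware, unlocked by the local biometric check).
P, Q = 104729, 104723              # two small primes
N = P * Q                          # public modulus
E = 65537                          # public exponent
D = pow(E, -1, (P - 1) * (Q - 1))  # private exponent (Python 3.8+)

def device_sign(challenge: bytes) -> int:
    """Device side: after local user verification, sign the challenge."""
    h = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % N
    return pow(h, D, N)

def server_verify(challenge: bytes, signature: int) -> bool:
    """Server side: check the signature against the stored public key."""
    h = int.from_bytes(hashlib.sha256(challenge).digest(), "big") % N
    return pow(signature, E, N) == h

# One authentication round: the server holds no shared secret, so a
# phished or breached credential database yields nothing replayable.
challenge = secrets.token_bytes(32)
sig = device_sign(challenge)
assert server_verify(challenge, sig)
assert not server_verify(secrets.token_bytes(32), sig)  # stale signature fails
```

Because each challenge is fresh, a captured signature is useless for a later login attempt, which is what removes the phishing and man-in-the-middle replay value that stolen passwords have under the shared-secret model.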

“This is the biggest transformation we have seen in authentication in 20 years,” says Jeremy Grant, Managing Director of Technology Business Strategy at Venable. more>

Related>

A New Americanism

Why a Nation Needs a National Story
By Jill Lepore – Carl Degler issued a warning: “If we historians fail to provide a nationally defined history, others less critical and less informed will take over the job for us.”

The nation-state was in decline, said the wise men of the time. The world had grown global. Why bother to study the nation?

Francis Fukuyama is a political scientist, not a historian. But his 1989 essay “The End of History?” illustrated Degler’s point. Fascism and communism were dead, Fukuyama announced at the end of the Cold War.

Fukuyama was hardly alone in pronouncing nationalism all but dead. A lot of other people had, too. That’s what worried Degler.

Nation-states, when they form, imagine a past. That, at least in part, accounts for why modern historical writing arose with the nation-state.

But in the 1970s, studying the nation fell out of favor in the American historical profession. Most historians started looking at either smaller or bigger things, investigating the experiences and cultures of social groups or taking the broad vantage promised by global history.

But meanwhile, who was doing the work of providing a legible past and a plausible future—a nation—to the people who lived in the United States? Charlatans, stooges, and tyrants.

The endurance of nationalism proves that there’s never any shortage of blackguards willing to prop up people’s sense of themselves and their destiny with a tissue of myths and prophecies, prejudices and hatreds, or to empty out old rubbish bags full of festering resentments and calls to violence.

When historians abandon the study of the nation, when scholars stop trying to write a common history for a people, nationalism doesn’t die. Instead, it eats liberalism.

Maybe it’s too late to restore a common history, too late for historians to make a difference. But is there any option other than to try to craft a new American history—one that could foster a new Americanism? more>

Updates from Chicago Booth

The safest bank the Fed won’t sanction – A ‘narrow bank’ offers security against financial crises
By John H. Cochrane – One might expect that those in charge of banking policy in the United States would celebrate the concept of a “narrow bank.” A narrow bank takes deposits and invests only in interest-paying reserves at the Fed. A narrow bank cannot fail unless the US Treasury or Federal Reserve fails. A narrow bank cannot lose money on its assets. It cannot suffer a run. If people want their money back, they can all have it, instantly. A narrow bank needs essentially no asset risk regulation, stress tests, or anything else.

A narrow bank would fill an important niche. Right now, individuals can have federally insured bank accounts, but large businesses need to handle amounts of cash far above deposit insurance limits. For that reason, large businesses invest in repurchase agreements, short-term commercial paper, and all the other forms of short-term debt that blew up in the 2008 financial crisis. These assets are safer than bank accounts, but, as we saw, not completely safe.

A narrow bank is completely safe without deposit insurance. And with the option of a narrow bank, the only reason for companies to invest in these other arrangements is to try to harvest a little more interest. Regulators can feel a lot more confident shutting down run-prone alternatives if narrow bank deposits are widely available. more>

Related>