Category Archives: Telecom industry

Updates from Ciena

Reducing resourcing challenges by out-tasking multi-vendor network infrastructure projects
In today’s increasingly complex multi-vendor network environments, many businesses are compelled to out-task their multi-vendor operations to a single provider of specialized network services. Ciena’s Atura Bavisi details the qualities needed when looking for the right multi-vendor services partner.
By Atura Bavisi – Businesses today are constantly changing, often in ways unique to their market-specific conditions, but they all share something in common: a complex network environment. Operators are always looking for ways to optimize their networks, reducing complexity while adding the flexibility to handle rapidly growing traffic demands.

These conditions often create a need for multi-vendor networks. If a business wants to reduce its OPEX and at the same time improve network performance without significantly increasing its IT resources, then buying network equipment from multiple vendors, and leveraging vendor-specific services to implement and maintain this disparate equipment, becomes critical.

However, multi-vendor projects come with their own set of challenges. For example, the multi-vendor approach often reduces visibility across the network, making it difficult to plan effectively or to provision resources to support new services rapidly. What’s more, the cost of working with multiple suppliers and in-house service teams to design and deploy solutions can be prohibitive and a logistical challenge, as well as requiring multiple custom interfaces.

Very often, corporations lack the ability to recruit the highly specialized personnel needed to meet all the technical requirements stemming from a multi-vendor network, and most vendors focus only on their own products and solutions. more>


Updates from Ciena

Extending the 100G Edge
Introducing Ciena’s new 5171 Service Aggregation Switch and Service Aggregation Platform, bringing more capacity closer to the network edge, enabling deployment into outdoor street cabinets and other uncontrolled locations.
By Wayne Hickey – Streaming applications like Amazon Prime, Facebook, Netflix, and YouTube (… and the list goes on) are driving Internet content consumption at a torrid pace, with no end in sight. To keep up with the high-speed pace of today’s business, enterprises use technology to change what they do and how they do it. Think about healthcare, hospitality, financial, manufacturing, and education organizations, to name a few – all are taking advantage of digital transformation to push more and more data.

Additionally, in the next few years, the promise of 5G is expected to swell both the number of connected devices and the bandwidth they consume. The network edge is key to delivering much faster download speeds and latency of just a few milliseconds.
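To see why the edge matters for a few-millisecond latency budget, consider propagation delay alone. The sketch below uses only textbook physics (the fiber group index of roughly 1.468 is a typical assumption, not a figure from Ciena) to compute a lower bound on round-trip time over a fiber span, before any queuing or processing delay is added.

```python
# Round-trip propagation delay in optical fiber: a lower bound on latency,
# ignoring queuing, processing, and protocol overhead. Illustrative only.
C_KM_PER_S = 299_792.458   # speed of light in vacuum, km/s
FIBER_INDEX = 1.468        # typical group index of silica fiber (assumption)

def rtt_ms(distance_km: float) -> float:
    """Round-trip time in milliseconds over a fiber span of the given length."""
    one_way_s = distance_km * FIBER_INDEX / C_KM_PER_S
    return 2 * one_way_s * 1000

for d_km in (10, 100, 1000):
    print(f"{d_km:>5} km span -> {rtt_ms(d_km):.2f} ms RTT")
```

A 1,000 km backhaul alone costs roughly 9.8 ms round trip, nearly the whole budget, while a 100 km edge site stays under 1 ms – which is why low-latency 5G services push aggregation and compute toward the edge.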

Coherent packet-optical technology is playing an increasingly critical role in solving networking business challenges: capacity, cost reduction, and competitiveness. Increasingly, network providers look to coherent packet-optical technology to solve the scalability and cost per bit challenges in their network.

Here are a few ways packet switching with integrated coherent DWDM is helping network providers … more>


Updates from ITU

Earth observation for weather prediction – solving the interference problem
By ITU News – “Today, several dozen satellites contribute to the accumulation of critical knowledge about the Earth’s system, enabling scientists to describe specific links between a major natural disturbance in the upper atmosphere, and changes in the weather thousands of miles away,” says Mario Maniewicz, Director of the ITU Radiocommunication Bureau.

“As accurate weather predictions need to start from the best possible estimate of the current state of the atmosphere, it is crucial that meteorologists have real-time, accurate global observations about what is happening in the Earth’s atmosphere over land and oceans. And for this, they rely on space sensing.”

Space sensing relies on the deployment of sensors to obtain data critical for Earth observation from space. Active sensors are radar systems on spaceborne platforms. They obtain data through the transmission and reception of radiowaves. Passive sensors, meanwhile, are very sensitive receivers that measure the electromagnetic energy emitted and scattered by the Earth, and the chemical constituents in the Earth’s atmosphere. They require protection from radio-frequency interference.

Spaceborne sensors measure the background natural radiative emission floor; therefore, any man-made signal (e.g. communications, radar) that rises above this natural emission floor will likely interfere with the measurements. This interference can be tolerated only if its energy is well below the sensor’s sensitivity. more>
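The “well below the sensor sensitivity” condition can be sketched as a simple link-budget check. The transmitter EIRP, distance, frequency, sensitivity level, and protection margin below are illustrative assumptions chosen for the example, not ITU-R protection criteria.

```python
import math

# Hedged sketch: does an interfering emitter stay safely below a passive
# sensor's sensitivity? Uses the free-space path loss model (km/GHz form).
# All numeric figures are illustrative assumptions, not ITU-R limits.
def received_power_dbw(eirp_dbw: float, distance_km: float, freq_ghz: float) -> float:
    """Received power = EIRP minus free-space path loss."""
    fspl_db = 92.45 + 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz)
    return eirp_dbw - fspl_db

SENSOR_SENSITIVITY_DBW = -160   # assumed level of the natural emission floor
PROTECTION_MARGIN_DB = 20       # interference must sit this far below it

p_rx = received_power_dbw(eirp_dbw=10, distance_km=800, freq_ghz=24)
tolerable = p_rx <= SENSOR_SENSITIVITY_DBW - PROTECTION_MARGIN_DB
print(f"interference at sensor: {p_rx:.1f} dBW, tolerable: {tolerable}")
```

With these assumed numbers the interferer arrives at about -168 dBW, above the -180 dBW threshold, so its emissions would corrupt the measurement even though they are far below the sensor’s own noise-free reading of the Earth.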


Updates from Ciena

The implications behind service and content provider requirements for coherent optical solutions
By Helen Xenos – In 2007, I was asked to write a white paper about this really cool new “coherent technology” our team was working on and explain how the coherent receiver would completely revolutionize optical networks as we knew them. As I tried to get started, I quickly learned that the only sources for content were the engineers actually working on the project – scrolling through page after page of Google search results netted zero information.

In the end, I wrote the paper by transcribing what a dear co-worker and mentor, Michel Belanger, who was one of the designers, patiently explained to me (it took several hours). He made sure I understood the significance of coherent technology and how it would change the game in optical networks.

Fast forward a dozen years – there is no shortage of information pertaining to coherent technology, and there are about a dozen coherent module and system suppliers. Coherent optical systems have become the networking foundation that underpins the digital economy that we know today.

Network providers are ubiquitously deploying coherent technology to scale networks for capacity, reduce transport costs, and provide a better end-user experience to their customers. In fact, they are now looking to expand the role that coherent technology plays in the network and deploy it in space- and power-optimized applications, in addition to traditional infrastructure, submarine, and data center interconnect (DCI) build-outs.

As coherent technology plays an increasingly critical role for successful network evolution, we must step back and ask ourselves:

  • What do network providers need from their coherent solution partners to succeed?
  • What are the implications of these divergent customer and networking requirements for the suppliers of the technology?

more>


Updates from Ciena

4 critical requirements for the next-gen photonic layer
By Paulina Gomez – Today’s market dynamics are making it harder for network providers to effectively compete in an environment where revenue per bit is declining, and network bandwidth requirements are exploding. In the face of these business challenges, network providers are realizing they must evolve and transform their networks towards a more programmable infrastructure that can scale and respond on demand, to meet changing customer expectations and unpredictable traffic requirements.

While coherent optics are a critical element in enabling a programmable optical infrastructure, alone they are not enough to fulfill operators’ requirements for successful network transformation.

So what else is needed?

The photonic layer is the foundation of this programmable infrastructure, leveraging the latest coherent optical technology to deliver maximum scale at the lowest cost per bit. When examining the requirements of metro and long-haul infrastructure applications, including global data center interconnect (DCI) networks, there is a growing need for an agile, resilient and intelligent photonic layer.

This Reconfigurable Add-Drop Multiplexer (ROADM)-based optical foundation leverages flexible, instrumented photonics and Layer 0 software control to scale the network for maximum capacity at the lowest space, power, and cost per bit. more>


Updates from Ciena

On the Submarine Network Horizon in 2019
By Brian Lavallée – The submarine networking industry is truly fascinating from technology, social, economic, political, and even historical perspectives. All of these facets are intertwined, as new cables are planned and deployed, and when the unthinkable occurs and cables must be repaired.

The undersea cable network is critical infrastructure, and there is no Plan B for this part of the global internet. Associated technological innovation must therefore continue to evolve at a frenetic pace, so that the industry can not only keep pace with voracious growth in demand, but also ensure that the enormous capacity carried today, and the ever-increasing amount of tomorrow, is protected and continuously optimized – securing a stable and viable financial future for submarine cable operators.

Several technologies and visions at the forefront of submarine network innovation were hot topics of discussion in 2018 and will undoubtedly be even hotter in 2019. I highlight some notable examples below.

If submarine cable networks are to continue evolving alongside their terrestrial counterparts, these issues will continue to be critical topics of conversation in our industry throughout 2019. more>


Updates from Datacenter.com

Why can’t a data center guarantee the uptime of your environment?
Datacenter.com – One of the main reasons for choosing a data center is to limit the risk of downtime and unavailability of the company’s critical environment. A data center offers redundant power feeds, multiple power sources (main grid and emergency generators), and redundant fiber paths to make sure that one feed/source and path will always be available. To that extent, a data center can guarantee a certain uptime. The guaranteed uptime often covers the availability of at least one feed/source or path; in terms of data center design: N.

Does that ensure the uptime of your environment? To maximize your environment’s uptime, you must actually use the redundant resources a data center delivers.

When choosing a high-standard data center, the equipment you deploy there must be able to use the safeguards the data center offers. The infrastructure of power, fiber paths, and cooling is only as strong as its weakest link.

For example, when a server is connected to only one feed, the uptime guaranteed across the two power feeds no longer applies to the power on that server. When using a fiber connection on only one fiber path, that fiber path becomes a single point of failure, even though the data center has two redundant fiber paths. The same holds for an ATS: the power to the ATS is dual-feed, but the power path behind the ATS will be a single point of failure. To achieve the highest uptime, you must use the safeguards a data center offers as fully as possible. more>
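The weakest-link argument can be made concrete with a back-of-the-envelope availability model. The per-component availabilities below are assumptions chosen for illustration, not figures quoted by Datacenter.com or any data center operator.

```python
# Minimal sketch of serial vs. redundant availability. The 0.999 figures
# are illustrative assumptions, not real data center specifications.
def parallel(*avail: float) -> float:
    """Redundant components: the group fails only if every member fails."""
    p_fail = 1.0
    for a in avail:
        p_fail *= (1 - a)
    return 1 - p_fail

def serial(*avail: float) -> float:
    """Chained components: the chain fails if any member fails."""
    p = 1.0
    for a in avail:
        p *= a
    return p

feed = 0.999        # availability of one power feed (assumed)
server_psu = 0.999  # availability of the server's own power path (assumed)

dual_feed_server = serial(parallel(feed, feed), server_psu)   # draws from A+B
single_feed_server = serial(feed, server_psu)                 # ignores feed B

print(f"dual-feed server:   {dual_feed_server:.6f}")
print(f"single-feed server: {single_feed_server:.6f}")
```

The redundant A+B feeds only help if the server actually draws from both: connect it to one feed and the combined availability collapses back to the serial product, exactly the weakest-link effect described above.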


Updates from Ciena

The evolving coherent optical networking landscape: a deep dive
By Helen Xenos – Over the past decade, network providers have used coherent technology to increase traffic carrying capacity by orders of magnitude over existing assets.

Meanwhile, new network initiatives are generating new and divergent requirements for coherent optical solutions, beyond the need to efficiently scale for bandwidth growth.

In order to retain and grow their customer base, service providers are investing to offer innovative services – like delivering original video content (AT&T acquiring Time Warner) and enabling connectivity of “smart” devices both in the home and in a mobile setting (Bell’s managed security IoT service).

They are also evaluating and upgrading to new, simpler, scalable access architectures, to be able to offer new services unlocked with 5G. One notable example of spending shifting to the edge is Verizon’s announced $1B spend over 3 years for fiber from Corning, as well as their purchase of WOW’s Chicago fiber-based infrastructure.

Challenged with a multi-vendor infrastructure consisting of various technology generations, service providers are working to streamline operations and increase network automation to accelerate service delivery and improve customer satisfaction. At the same time, they are looking to increase operational efficiencies and reduce costs with a more open, programmable infrastructure that can quickly respond to new bandwidth demands with less deployed hardware.

Consistent among all network providers is the need for a more responsive, automated, and self-optimizing network. Technologies such as advanced coherent optics, alongside a flexible photonic layer and open application programmable interfaces (APIs) play a starring role in making this possible. more>


Updates from ITU

New ITU standards bring broadband to places as remote as Mount Everest
ITU News – New ITU standards aim to bring high-speed broadband services to rural communities with lightweight, terabit-capable optical cable that can be deployed on the ground’s surface with minimal expense and environmental impact.

The standards are giving developing countries the confidence to consider the roll-out of optical networks in some of the world’s most challenging conditions.

Nepal, for example, has highlighted its intention to use ITU-standardized lightweight optical cable to connect places as remote as Mount Everest Base Camp and Annapurna Trekking Trail.

Why lightweight optical cable?

Satellite communications are characterized by high latency, struggling to support the interactive services associated with broadband. Radiocommunications can provide ‘last-mile’ connectivity. But in the broadband era, optical infrastructure is indispensable – rural communities are often many, many kilometers away from core networks.

The Editor of the new standards, Haruo Okamura of Waseda University, offers a compelling example: “Optical cable is becoming an absolute must for telemedicine. Only optical cable provides capacity high enough and latency low enough for the live transmission of HD medical imagery to remote medical professionals.”

The installation of ultra-high speed optical networks, however, comes with a great deal of cost and complexity.

“Today the costs of optical cable installation are typically 70 to 80 per cent of the entire CAPEX of the network,” says Okamura. “The designs of conventional optical cables are specific to their installation environment – whether duct, directly buried, lashed aerial or submerged – with installation methods relying on specialized machinery and skilled labor.”

This challenge is made even greater by the low densities of remote rural communities, where fiber roll-outs demand a disproportionate level of initial capital investment relative to the potential return on such investment.
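That density argument is easy to quantify. The route lengths, household counts, and per-kilometer installation cost below are purely illustrative assumptions, not figures from the ITU standards or from Nepal's roll-out plans.

```python
# Back-of-the-envelope CAPEX per connected home for a fiber roll-out.
# All inputs are illustrative assumptions, not ITU or operator figures.
def capex_per_home(route_km: float, homes: int, cost_per_km: float) -> float:
    """Installed-cable capital cost divided across the homes it reaches."""
    return route_km * cost_per_km / homes

urban = capex_per_home(route_km=10, homes=5000, cost_per_km=20_000)
rural = capex_per_home(route_km=50, homes=200, cost_per_km=20_000)
print(f"urban: ${urban:,.0f}/home, rural: ${rural:,.0f}/home")
```

At the same assumed cost per kilometer, the rural home carries over a hundred times the capital burden of the urban one – which is why a low-cost, surface-deployable cable that slashes installation expense changes the investment equation.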

New ITU standards aim to change that equation by providing a low-cost ‘do-it-yourself’ solution able to be deployed in even the world’s most remote areas. more>


Updates from ITU

ITU brings new clarity to 5G transport
ITU – 2018 has seen the launch of a major ITU drive to define the requirements of IMT-2020/5G systems as they relate to transport networks, the extremely high-capacity optical networks that form the ‘backbone’ of the ICT ecosystem.

These 5G transport projects have built strong momentum, drawing on the expertise of a wide range of working groups within ITU’s standardization expert group for ‘transport, access and home’, ITU-T Study Group 15.

The baseline for this work was established in February 2018 with the release of an influential ITU Technical Report placing emerging 5G radio requirements in the context of their demands on transport networks.

The second version of this Technical Report was agreed in October 2018. Download the report… more>