Category Archives: Communication industry

Updates from Ciena

The Adaptive Network: Why automation alone isn’t enough
By Keri Gilder – Just imagine that, instead of 70, your heart rate was 100 beats per minute. This could be a warning sign that you are on the verge of a heart attack.

If your doctor were to get this information in real time, they could check the readings against your medical records and see that this is completely out of the norm and then warn you to seek medical assistance immediately.

However, if your personal trainer received that same information, would they reach the same conclusion as your doctor? Your trainer has access to a different database, which might show your resting heart rate as well as the rate during high-intensity training. Knowing that you are likely exercising, they would instead conclude that there is no need to go to the hospital after all.

This clearly demonstrates that simply accepting raw data, without filtering and proper analysis, is no longer good enough and can have serious repercussions. Instead, it is critical that we bring diversity of thought to how we interpret data.
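To make the distinction concrete, here is a minimal sketch of context-aware interpretation: the same raw reading is checked against a different baseline depending on who is looking at it. The thresholds and labels are purely illustrative assumptions, not figures from the article.

```python
# Minimal sketch: one raw reading, two conclusions, depending on the
# contextual baseline it is checked against. All ranges are hypothetical.
BASELINES = {
    "doctor": (60, 80),     # resting range from medical records
    "trainer": (60, 160),   # range spanning rest and high-intensity training
}

def interpret_heart_rate(bpm: int, context: str) -> str:
    """Classify a heart-rate reading against a context-specific baseline."""
    low, high = BASELINES[context]
    if bpm < low or bpm > high:
        return "out of range: seek medical attention"
    return "within expected range: no action needed"

print(interpret_heart_rate(100, "doctor"))   # out of range: seek medical attention
print(interpret_heart_rate(100, "trainer"))  # within expected range: no action needed
```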

This is not just true for our health or other day-to-day scenarios, but can also be applied to the communication networks that carry and house our information. more>

Related>

Where Did Qualcomm Go Wrong?

By Bolaji Ojo – It’s a justifiable question. The Qualcomm–NXP trip was an expensive sortie: Qualcomm paid $2 billion in mandatory break-up fees to NXP, but the bill for the hidden costs may be much higher. For nearly two years, the communications IC and IP supplier and its target endured prolonged uncertainty. Even now, the spasms from customer disruptions remain strong, while many employees, though heaving a sigh of relief, must figure out where they truly belong in the enterprise.

Qualcomm is moving on resolutely from the NXP debacle. It must. However, the implications and lessons — if any — are industry-wide. One of the largest acquisitions in the history of the semiconductor industry foundered because of opposition on various fronts, including from customers who might have benefited from it. Simply dumping the blame on nebulous factors and faceless regulators will leave the industry learning nothing from the experience. Perhaps the transaction was destined to fail. Perhaps it could have been managed better and completed successfully. A thorough assessment of why this deal collapsed would offer lessons that can be applied to future deals.

There are no signs that Qualcomm will conduct a detailed analysis of why and how the bid unraveled. It is easier — again — to simply toss more money at stakeholders and move on. NXP’s management and shareholders who had tendered their equity could slake their thirst with $2 billion in Qualcomm’s money. more>

Guidelines to Achieve Digital Transformation

GSR-18 BEST PRACTICE GUIDELINES ON NEW REGULATORY FRONTIERS TO ACHIEVE DIGITAL TRANSFORMATION
itu.int – Digitization is increasingly and fundamentally changing societies and economies, disrupting many sectors in what has been termed the 4th Industrial Revolution. Meanwhile, ICT regulation has evolved steadily around the globe over the past ten years.

As regulators, we need to keep pace with advances in technology, address the new regulatory frontiers and create the foundation upon which digital transformation can achieve its full potential. Being prepared for digital transformation and emerging technologies such as Artificial Intelligence (AI), the Internet of Things (IoT), Machine to Machine communications (M2M) and 5G is fundamental.

Advances in technology are creating new social phenomena and business models that impact every aspect of our personal and professional lives – and that challenge regulatory paradigms. M2M, cloud computing, 5G, AI and IoT are all bringing further profound change. Recognizing the potential of emerging technologies and the impact that policy and regulatory frameworks can have on their success, regulators should encourage a regulatory paradigm that pushes frontiers and enables digital transformation. more> draft doc (pdf)

Updates from Ciena

Why the Secret Behind Strong Early Adoption of 400G Technology is … 200G

By Helen Xenos – This month, we shipped our 5,000th 400G-capable coherent optical transponder, confirming our prediction that adoption of 400G technology is ramping three times faster than 100G did. What may come as a surprise, however, is that the dominant application driving 400G deployments is not 400G, but 200G (long-haul data center interconnect, to be precise).

Why? The technology that enables 400G wavelengths has a lot to do with expanding the application space for 200G as well.
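One way to picture that relationship is the capacity-versus-reach tradeoff in coherent optics: the same 400G-capable modem can run at its top rate over short distances, or drop to a lower rate such as 200G to reach much farther. The sketch below illustrates the idea; the specific rates, modulation formats, and reaches are rough assumptions for illustration, not Ciena specifications.

```python
# Illustrative capacity-vs-reach tradeoff for a 400G-capable coherent modem.
# The figures are hypothetical examples, not vendor specifications.
OPERATING_MODES = [
    # (rate_gbps, modulation, approx_reach_km)
    (400, "16QAM", 300),    # short-reach metro / DCI
    (300, "8QAM", 1200),    # regional
    (200, "QPSK", 3000),    # long-haul, e.g. data center interconnect
]

def best_mode_for_route(distance_km: float):
    """Pick the highest line rate whose reach covers the route."""
    feasible = [mode for mode in OPERATING_MODES if mode[2] >= distance_km]
    return max(feasible) if feasible else None

print(best_mode_for_route(250))    # (400, '16QAM', 300)  -> 400G for short DCI
print(best_mode_for_route(2500))   # (200, 'QPSK', 3000)  -> 200G for long haul
```

Seen this way, every 400G-capable modem deployed is also a more capable 200G modem, which is why long-haul 200G can be the first big beneficiary of 400G technology.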

To fully understand the demand drivers for 400G, it’s important to clarify the various ways 400G is defined. The term “400G” is quite popular in today’s optical networking conversations, but it can mean different things depending on the context in which it is used.

So, which applications are driving 400G deployments? We hear so much about the fast-growing metro WDM market, 400ZR, and the need to maximize capacity for short-reach DCI applications that, intuitively, you would think this is the “sweet spot” application.

In fact, the most popular use case we see for early 400G adoption is to support the rise of 200G long-haul for aggressive DCI network builds. more>

Updates from Ciena

4 Data Center Interconnect Developments You Need to Know
By Kent Jordan – The Data Center Interconnect (DCI) market is evolving rapidly and new compact, modular devices have been introduced to help network operators quickly and easily deploy new capacity to keep up with demand. But, as the adage says, “the more things change, the more they stay the same”. So, what’s new with DCI and what hasn’t changed in the last year?

One thing that hasn’t changed is the need for more interconnect capacity. Interconnect bandwidth is still growing rapidly: by 2020 it is forecast to reach as much as 5,000 Tbps, with double-digit growth rates across a variety of industry segments, from Cloud and IT to Healthcare and Energy. All are poised for large capacity growth in the coming years, which means many of the past year’s challenges still exist.

Network operators are challenged to keep up with growing demand while offering content and services globally. They also need automation to speed bandwidth activation and improve their customers’ quality of experience. Ongoing operational costs remain a challenge as well, driving the need to reduce footprint and power consumption. more>

Updates from Ciena

5 key wireline network improvements needed for 5G
By Brian Lavallee – Ask end-users how their phones connect to the network, and they’ll likely talk only about cellular or wireless technology. That is also where most of the current 5G industry hype is focused, and for good reason: it is the first part of the network to be upgraded. In reality, however, the RAN (Radio Access Network) makes up only a small portion of the end-to-end path that data from a connected device must travel. The rest of the path is primarily a fiber-optic transport network.

With 5G coming soon, featuring data rates as much as 100 times faster than what’s currently available, the wireline infrastructure that connects end-users (human and machine) to content residing in data centers must be ready to support upwards of 1,000 times more data flowing across it.
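The rough arithmetic behind that multiplier: aggregate traffic scales with both per-device data rates and the number of connected devices. A back-of-the-envelope sketch, where the 10x device-growth factor is an illustrative assumption rather than a figure from the article:

```python
# Back-of-the-envelope: why 100x faster per-device rates can translate into
# ~1,000x more aggregate wireline traffic. The device-growth factor below is
# an illustrative assumption, not a figure from the article.
rate_multiplier = 100     # 5G per-device data rates vs. today (from the article)
device_multiplier = 10    # assumed growth in connected end-users and machines

aggregate_multiplier = rate_multiplier * device_multiplier
print(f"Aggregate traffic multiplier: ~{aggregate_multiplier:,}x")  # ~1,000x
```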

How can network operators prepare? Well, here are five key areas within the wireline network that will need to be upgraded and modernized to support 5G.

  1. Fronthaul
  2. Scalability
  3. Densification
  4. Virtualization
  5. Network Slicing

The move to 5G won’t be a simple network upgrade. It’s a long journey with a high-performance wireline network as the critical component to commercial success for both 4G strategies and the evolution toward 5G. more>

An Inside Look at Smart Cities

Bank of America Merrill Lynch – Countless people and technologies keep our cities safe, clean, and efficient; some we interact with in plain sight, and others operate beneath the surface, improving our lives in ways we don’t fully realize.

But for all the richness of cities, urban living can be filled with challenges, from traffic jams to taxed energy systems to overcrowded sidewalks and transit. Many of these difficulties are rooted in dated infrastructure – so as the number of people living in cities continues to rise, investing in and modernizing city infrastructure becomes critical.

The ultimate goal? Creating a “smart city” – one that leverages technology to improve quality of life for its residents, and creates better systems and structures to support it. One that looks ahead to future generations and starts the work now to meet those needs. Investing in the “smartness” of a city not only modernizes it, but creates a stronger, more sustainable place to live and work.

The good news is that the challenge of creating a smart city presents great opportunities. In fact, the smart city market could grow from an estimated US$1 trillion in 2017 to US$3.5 trillion by the mid-2020s. This means opportunities for companies, investors and, of course, the residents themselves. How do you uncover those opportunities?

Step one is imagining what it might be like to live in a “smart city”. more>

Updates from datacenter.com

How Internet of Things (IoT) will change data centers
datacenter.com – The world of the Internet is steadily merging with the world of physical ‘things’. Out of this convergence the Internet of Things (IoT) has arisen: a giant global network connecting all web-enabled things, including people, in the world. From your fridge to your car to a cosmonaut orbiting the Earth, our virtual world will connect billions of smart devices with each other, creating an ecosystem in which these ‘things’ have the ability to sense, interact and communicate with each other and influence actions with or without human intervention.

As of 2011, there have been more Internet-enabled devices in the world than human beings.

What is the impact on Data Centers?

  • Higher capacity per cabinet
  • Security and data privacy
  • Data center location does matter

Research firms such as IDC and Gartner estimate that global spending on the Internet of Things (IoT) will reach US$772.2B in 2018, an increase of 14.6% over the US$674B spent in 2017. more>
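A quick sanity check of the quoted growth rate, using only the two figures above:

```python
# Verify the year-over-year growth implied by the quoted IoT spending figures.
spend_2017 = 674.0    # US$ billions (2017)
spend_2018 = 772.2    # US$ billions (2018 forecast)

growth_pct = (spend_2018 / spend_2017 - 1) * 100
print(f"Year-over-year growth: {growth_pct:.1f}%")   # ~14.6%
```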

Related>

It’s Time to Protect Identity Like We Protect Critical Infrastructure

By Andre Durand – After the past year’s record-breaking wave of breaches, it is now safe to assume that most Americans have had their personal identity information exposed—and analysis of the Hudson’s Bay breach has confirmed that this information is now being traded in dark markets.

The long-term ramifications will affect every public- and private-sector organization that uses our identities to conduct business and to provide access to critical systems, disrupting our day-to-day activities and even our way of life.

Because business and government institutions that handle personal information are vital to our society, it’s time to designate “identity” as a new segment in the nation’s critical infrastructure, a set of 16 sectors the Homeland Security Department deems essential to the nation’s well-being.

The entire identity chain must be strengthened to prevent these criminal activities. Birth certificates, which can be used to open bank accounts, are still administered by hospitals that are ill-equipped to manage security. Our government devotes huge resources to ensuring that currency can’t be counterfeited, yet it pays scant attention to documents that can be used to obtain multiple forms of ID. Every physical document we use to prove our identity should be made far harder to duplicate.

We can then move onto our digital systems. more>

Related>

Updates from Georgia Tech

Human Factors Research Helps Accelerate Mission Planning
By John Toon – The key to a successful flight mission is planning – sometimes several hours of it. Georgia Tech Research Institute (GTRI) specialists in human factors and human-computer interfaces are working with NAVAIR PMA-281, Strike Planning and Execution Systems in Patuxent River, Maryland, to streamline the current mission planning process and identify user interface requirements supporting multi-domain mission management in next-generation naval planning capabilities.

With guidance from the GTRI researchers, the project will improve usability of the mission planning software tools, creating a more consistent and intuitive screen design that’s easier to learn and more logical to follow. This effort could benefit all Department of Defense (DoD) agencies for collaborative mission planning.

“We are working with Navy and Marine Corps aviators to identify areas in mission planning where workflow can be streamlined, reducing the time required to mission plan,” said Marcia Crosland, project director for GTRI’s Joint Mission Planning System (JMPS) User Interface Design and Usability efforts. “Our task has been to define the user interface concepts and decision-making tools to help reduce the time required for mission planning. We’ve created detailed designs and specifications to direct current and future development of mission planning systems.” more>

Related>