Category Archives: Telecom industry

Updates from Ciena

Virtualizing the World of Cable
By Wayne Hickey – When cable operators saw huge demand for linear video, Video-on-Demand (VoD) and high-speed data services, and were faced with an aging analog infrastructure, they moved to a Converged Cable Access Platform (CCAP) to increase capacity and throughput. CCAP consolidates headend functions into a single architecture by combining the Edge Quadrature Amplitude Modulation (EQAM) and Cable Modem Termination System (CMTS) functions.

Back in June 2011, CableLabs created CCAP by blending two competing platforms: the Comcast-backed Converged Multiservice Access Platform (CMAP) and Time Warner Cable’s Converged Edge Services Access Router (CESAR) platform. CCAP products were introduced the following year and deployed the year after that.

Fast forward to today: cable operators are looking to implement software-based access platforms, migrate away from commonly deployed centralized, purpose-built CCAP equipment, and virtualize CCAP (vCCAP), beginning the shift to a Distributed Access Architecture (DAA). Developed by CableLabs, vCCAP is the latest cable technology to combine functions including the CMTS and EQAM.

Virtualizing and distributing MAC and PHY functions enables digital combining, replaces analog optics with cost-effective 10G Ethernet transport, and converts analog fiber nodes into digital, IP-enabled optical devices. DAA makes it easier to push fiber deeper toward the edge of the network, and along with the ability to support denser wavelengths per fiber, digital optics greatly improves Carrier-to-Noise Ratio (CNR), which will enable higher orders of QAM on the coax and higher-performance DOCSIS technologies. more> https://goo.gl/EoPwPL
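
To see why higher QAM orders matter for coax throughput, here is a minimal sketch relating constellation size to raw bits per symbol and raw channel bit rate. The ~6 MHz channel width and the roughly 5.36 Msym/s symbol rate are illustrative assumptions typical of North American single-carrier DOCSIS downstream channels, not figures from the article.

    # Raw (pre-FEC) throughput of a single downstream channel at different QAM orders.
    # Symbol rate and channel width are illustrative DOCSIS-style assumptions.
    import math

    SYMBOL_RATE = 5.36e6  # symbols per second in one ~6 MHz downstream channel (assumed)

    def raw_throughput_bps(qam_order: int, symbol_rate: float = SYMBOL_RATE) -> float:
        """Raw bit rate before FEC and framing overhead."""
        return math.log2(qam_order) * symbol_rate

    for m in (64, 256, 1024, 4096):
        print(f"{m:>5}-QAM: {math.log2(m):.0f} bits/symbol, "
              f"~{raw_throughput_bps(m) / 1e6:.1f} Mb/s raw per channel")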


Updates from Ciena

Optic Zoo Networks Keeps Vancouver’s Data Traveling at Blistering Speeds with Ciena

By Tony Ross – Optic Zoo Networks is a recognized brand throughout metro Vancouver due to our extensive carrier-grade dark fiber network and infrastructure. Based on demand, and to further accelerate our growth and better serve Tier 1 service providers, we knew it was time to take our offerings to the next level.

Our customers need to support bandwidth-hogging applications like virtual and augmented reality, as well as the Internet of Things (IoT). For data to continue to flow with ease, we needed to ensure that Optic Zoo Networks was ready to support that growth. That meant offering new Carrier Ethernet Services (CES), which in turn required that we build a Carrier Ethernet Network (CEN).

To continue to support top-echelon service providers, however, we needed to build a CEN that could scale instantaneously and meet the needs of organizations in a range of industries, including finance, healthcare, and education.

For example, customers that previously wanted to upgrade to higher levels of bandwidth had to go through inefficient processes, such as ordering a network loop, which could take weeks. With our CEN, today’s 1G customers can easily upgrade to 10G tomorrow with a simple software upgrade. more> https://goo.gl/fh54t3
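
As a rough illustration of what such a software-driven upgrade can look like, the sketch below models a MEF-style bandwidth-profile change (raising the committed information rate on an existing Ethernet Virtual Connection). Every identifier and value is hypothetical; this is not Optic Zoo Networks’ or Ciena’s actual provisioning interface.

    # Hypothetical sketch of a MEF-style bandwidth-profile change on an existing
    # Ethernet Virtual Connection (EVC); every identifier and value is illustrative.
    evc = {
        "evc_id": "EVC-1001",          # hypothetical service identifier
        "uni": "van-dc-01/port-3",     # hypothetical customer-facing UNI port
        "cir_mbps": 1_000,             # committed information rate: 1G today
        "eir_mbps": 0,                 # excess information rate
    }

    def upgrade_bandwidth(service: dict, new_cir_mbps: int) -> dict:
        """Return an updated service record; in practice the operator's management
        software pushes this change to the network, with no new physical loop."""
        updated = dict(service)
        updated["cir_mbps"] = new_cir_mbps
        return updated

    print(upgrade_bandwidth(evc, 10_000))  # the same customer at 10G tomorrow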


Updates from Ciena

#Ciena25: The Story Behind the Founding of Ciena

By Bruce Watson – The company that would eventually become Ciena began its life as an inspiration inside the head of David Huber. The former General Instrument engineer had an idea for how to help cable companies squeeze more television channels through their lines to end consumers. In 1992, he set out to turn those ideas into a reality, and on November 8, 1992, the paperwork was officially filed in Delaware for the new company.

Huber immediately began searching for venture capital funding. In late 1993, Huber was introduced to Pat Nettles, a veteran leader of several telecom companies. By early 1994, Nettles was brought on board to run the business side of things and was soon the company’s first CEO (though, holding a doctorate in particle physics, Nettles was no stranger to the technology side of things himself).

Nettles quickly convinced Huber that it was the long-distance phone companies, not the cable TV industry, that would be the best target for Huber’s invention.

The introduction between the two was orchestrated by Jon Bayless, a venture capitalist whose firm Sevin Rosen Funds provided $3 million in start-up funding for the business in February 1994. more> https://goo.gl/ZdVzLE

Updates from Ciena

Future of 5G
By Susan Friedman, Brian Lavallée – 5G is coming, and with it comes the expectation of wireless speeds that are 100X or more than what we experience today with 4G. In fact, one of the goals of 5G is to achieve maximum download speeds of 10 Gbps per user. This influx of traffic won’t come without a cost to the underlying networks that support it.

To succeed, mobile network operators (MNOs) will need more than just a new radio access network; they will also need fiber, and lots of it, to manage the massive increase in bandwidth that will come as billions more users, both human and machine, join the network.
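
As a back-of-the-envelope illustration of why so much fiber is needed, the sketch below estimates the aggregate backhaul that a cluster of 5G cells could generate. Every input is an assumption made for illustration, not a figure from the article.

    # Back-of-the-envelope 5G backhaul estimate; every input is an assumption.
    peak_per_user_gbps = 10        # 5G target peak download speed per user
    active_users_per_cell = 20     # assumed simultaneously active users per cell
    utilization = 0.05             # assumed average fraction of peak each user drives
    cells_per_fiber_hub = 50       # assumed small cells aggregated onto one fiber hub

    per_cell_gbps = peak_per_user_gbps * active_users_per_cell * utilization
    hub_gbps = per_cell_gbps * cells_per_fiber_hub
    print(f"~{per_cell_gbps:.0f} Gb/s per cell, ~{hub_gbps:.0f} Gb/s at the aggregation point")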

5G is expected to be deployed strategically in different locations, especially in the early days. Consumers expecting all 3G and 4G networks to be replaced with 5G will be disappointed; 5G is expected to complement 3G/4G where it makes sense. And depending on where service providers believe applications and use cases will be most lucrative, they can roll out speeds of up to 10 Gbps.

This means that if you’re in a rural community, chances are you won’t get 5G in the early days. In cities and metro areas you’ll see potential applications like enhanced mobile broadband, self-driving cars, video broadcast services, and other use cases that require high bandwidth and/or low latency. So service providers will deploy 5G in geographic areas where it makes economic sense. more> https://goo.gl/kmxQSs

What Happens When Quantum Physics Meets Cryptography?


By Paulina Gomez – In today’s world of ever-increasing security threats and breaches, encryption is a common technique used to protect critical information from getting into the wrong hands. In cryptography, encryption is the process of encoding a plaintext message in such a way that only authorized parties can access it. The result of this process is encrypted information, also known as ciphertext. But how is this done exactly? The plaintext message is transformed using an algorithm (or cipher) to make it unreadable to anyone except those possessing special knowledge, which is referred to as the key.
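
As a concrete, if simplified, illustration of plaintext, key, and ciphertext, the sketch below uses the AES-GCM primitive from the widely used Python cryptography package. It is a generic example of symmetric encryption, not Ciena’s WaveLogic Encryption implementation.

    # Minimal symmetric-encryption example using the 'cryptography' package
    # (pip install cryptography). A generic illustration only.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # the secret "special knowledge"
    nonce = os.urandom(12)                      # must be unique per message
    plaintext = b"only authorized parties should read this"

    aesgcm = AESGCM(key)
    ciphertext = aesgcm.encrypt(nonce, plaintext, None)   # encode with the key
    recovered = aesgcm.decrypt(nonce, ciphertext, None)   # decode with the same key

    assert recovered == plaintext
    print(ciphertext.hex())  # unreadable to anyone without the key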

Today’s state-of-the-art secure communications use advanced mathematics to protect in-flight data with highly secure algorithms, such as in Ciena’s WaveLogic Encryption solution. Even though many cryptographic algorithms used today are publicly available, such as the popular Advanced Encryption Standard (AES), they are very difficult to crack in a reasonable amount of time given the computational power of today’s computers. In fact, the keys used in modern cryptography are so large that breaking the AES-256 standard would require “fifty supercomputers that could check a billion billion (10^18) AES keys per second [and] would, in theory, require about 3×10^51 years.”
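
A quick back-of-the-envelope check of that scale, using the same assumed rate of fifty machines each testing 10^18 keys per second:

    # Rough brute-force estimate for AES-256, using the assumptions quoted above.
    keyspace = 2 ** 256                # number of possible AES-256 keys
    keys_per_second = 50 * 10 ** 18    # fifty machines, a billion billion keys/s each
    seconds_per_year = 3.15e7

    years = keyspace / keys_per_second / seconds_per_year
    # Prints a figure on the order of 10**50 years; published estimates vary with
    # the exact assumptions, but every variant is astronomically long.
    print(f"~{years:.1e} years to exhaust the key space")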

The field of Quantum Cryptography is an area of security research and development focused on the introduction of new technologies that will offer more resistance to the computing power of quantum computers. Quantum cryptography draws its strength from the unpredictable nature of photons – the smallest particles in the universe. more> https://goo.gl/FTh77p
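
One widely cited quantum-cryptography technique is quantum key distribution. The toy simulation below sketches the BB84 protocol, in which randomly chosen photon measurement bases let two parties sift out a shared secret key; it is a generic textbook illustration on an idealized, eavesdropper-free channel, not a description of any Ciena product.

    # Toy BB84 quantum key distribution simulation (idealized, no eavesdropper).
    import secrets

    N = 32  # number of photons sent

    # Alice picks random bits and random bases ('+' rectilinear, 'x' diagonal).
    alice_bits  = [secrets.randbelow(2) for _ in range(N)]
    alice_bases = [secrets.choice("+x") for _ in range(N)]

    # Bob measures each photon in a randomly chosen basis.
    bob_bases = [secrets.choice("+x") for _ in range(N)]
    bob_bits = [
        bit if a_basis == b_basis else secrets.randbelow(2)  # wrong basis -> random outcome
        for bit, a_basis, b_basis in zip(alice_bits, alice_bases, bob_bases)
    ]

    # They publicly compare bases (not bits) and keep positions where bases matched.
    alice_key = [a for a, ab, bb in zip(alice_bits, alice_bases, bob_bases) if ab == bb]
    bob_key   = [b for b, ab, bb in zip(bob_bits,  alice_bases, bob_bases) if ab == bb]

    assert alice_key == bob_key  # identical in this idealized, eavesdropper-free run
    print("shared key bits:", "".join(map(str, alice_key)))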

Net neutrality 2.0: Perspectives on FCC regulation of internet service providers

By Stuart N. Brotman – The final outcome of this high-profile, high-impact proceeding will not be apparent until sometime late in 2017, at the earliest. Congress may also become more seriously involved at some point on the legislative front.

But without a doubt, as Chairman Pai noted in his Newseum speech, a “fierce debate” lies ahead for a number of months at least. And if past is prologue, the FCC may well receive an avalanche of comments in response to these proposed changes; the record in the Title II Order shows that over four million comments were filed by interested parties and the general public combined.

There will be no lack of political discourse, to be sure.

As we move into 2016, an unresolved national communications policy dilemma remains: whether the public-switched telephone network and the internet are parallel systems or parts of a larger ubiquitous network environment. Determining which characterization will be followed has profound consequences for regulatory treatment.

Given the emerging dominance of mobile over fixed service, if the FCC can’t regulate both, it may win the battle but lose the war. Given that a further appeal is likely regardless of which side prevails, including possible review by the U.S. Supreme Court, Congress may find itself re-emerging as the best source of guidance for the FCC. Legislative action can definitively clarify whether Congress intends for the telephone network and internet to be joined at the hip, or should continue to function in parallel with differing regulatory treatment. more> https://goo.gl/f4x8Uh


What’s Wrong With America’s Current Approach to Cybersecurity?

By Gregory Michaelidis – Go behind the headlines of the latest megahack, and what you’ll find is a growing public-safety and national-security crisis.

We are barely discussing how to help people help themselves in the digital world, let alone do their part in protecting our major networks and critical infrastructure.

Until we embrace a vision of public cybersecurity that sees people, at all ranges of skill, as essential to our collective security, there will be no widespread cybersecurity.

Right now, America’s collective cybersecurity effort is headed toward near-certain failure for reasons within our own control. In less than a decade, thanks to the influx of dollars and high-level policy and press attention, cybersecurity has been transformed from what is actually a “people problem with a technology component” into its exact opposite.

Official Washington and Silicon Valley have adopted a set of faulty assumptions about cybersecurity and internalized them to such a degree it’s practically a new religion, somewhere between late-19th-century technological determinism and medieval alchemy. more> https://goo.gl/elH8r2

90 years later, the broadcast public interest standard remains ill-defined

By Jack Karsten – The public interest standard has governed broadcast radio and television since Congress passed the Radio Act of 1927. However, decades of successive court cases and updated telecommunications laws have done little to clarify what falls into the public interest.

Prior to the public interest standard, free speech advocates argued with the broadcasting industry over who should have editorial control over content. Industry groups opposed a common carrier approach that would have allowed anyone to buy airtime. The resulting compromise established a short-term renewable licensing regime, overseen by the Federal Communications Commission since 1934, which required broadcasters to act on behalf of all others who did not receive a license. Congress granted the FCC the flexibility to revise its interpretation of the public interest standard to reflect changing circumstances. Since its founding, the FCC has repeatedly refused to set forth its own concrete definition of the public interest.

The Telecommunications Act of 1996 updated the 1934 Communications Act, but did not address the public interest standard beyond maintaining the status quo. more> https://goo.gl/AfmULj

ITU releases annual global ICT data and ICT Development Index country rankings

ITU – The Measuring the Information Society Report is widely recognized as the repository of the world’s most reliable and impartial global data and analysis on the state of global ICT development and is extensively relied upon by governments, international organizations, development banks and private sector analysts and investors worldwide.

“To bring more people online, it is important to focus on reducing overall socio-economic inequalities,” said ITU Secretary-General Houlin Zhao. “Education and income levels are strong determinants of whether or not people use the Internet.”

An increasingly ubiquitous, open, fast and content-rich Internet has changed the way many people live, communicate, and do business, delivering great benefits for people, governments, organizations and the private sector. However, many people are still not using the Internet, and many users do not fully benefit from its potential.

  • Most people have access to Internet services but many do not actually use them.
  • The full potential of the Internet remains untapped.
  • Access to the Internet is not enough; policy-makers must address broader socio-economic inequalities and help people acquire the necessary skills to take full advantage of the Internet.
  • Many people still do not own or use a mobile phone.
  • Affordability is the main barrier to mobile-phone ownership.
  • Asia and the Pacific has the lowest average purchasing power parity (PPP) $ price for mobile-cellular services of all regions.
  • Fixed-broadband prices continued to drop significantly in 2015 but remain high – and clearly unaffordable – in a number of Least Developed Countries (LDCs).
  • Mobile-broadband is cheaper and more widely available than fixed-broadband, but is still not deployed in the majority of LDCs.

Mobile phone adoption has largely been monitored based on mobile-cellular subscription data since these are widely available and regularly collected and disseminated by regulators and operators.

At the end of 2016, there were almost as many mobile-cellular subscriptions as people on earth, and 95% of the global population lived in an area covered by a mobile-cellular signal. However, since many people have multiple subscriptions or devices, other metrics need to be produced to accurately assess mobile uptake, such as the number of mobile phone users or mobile phone owners. more> https://goo.gl/L3Nh90


Updates from GE

By Dorothy Pomerantz – America’s largest machine — the power grid — has been pumping lifeblood electricity from power plants to our homes and businesses for more than a century.

The vast network of wires, switches, transformers and other technology has gone through periodic upgrades, but the infrastructure is aging and increasingly prone to blackouts. Unfortunately, the stress on the network is starting to show at exactly the time when we need it to shoulder and move thousands of megawatts from new wind farms and solar installations popping up all over the country.

Ever since Westinghouse and Tesla beat Edison in the “current wars,” alternating current (AC) has been the dominant method of shipping power over long distances. But Clean Line is now working with GE — the company Edison co-founded — to change that.

HVDC (high-voltage direct current) is a much more efficient way to move energy over long distances than alternating current. “If you used AC, you would need more wires in the air to get the same amount of power the same distance,” says Neil Kirby, HVDC business development manager at Grid Solutions from GE Energy Connections.

Wind turbines will send electricity to GE-built stations, where it will be converted from AC to DC. The electricity will flow across HVDC lines and will be converted back into AC before it goes into homes and factories. This method conserves more energy and is more economical and environmentally friendly than transporting the electricity as AC the entire way. more> https://goo.gl/GUZbMp
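
A simplified way to see why higher transmission voltage improves efficiency: for a fixed power delivery, resistive line loss scales as I²R, so raising the voltage lowers the current and the loss. The sketch below uses made-up power, voltage, and resistance figures purely for illustration, and it ignores the reactive and skin-effect losses that make AC lines even less favorable over long distances; it is not GE or Clean Line engineering data.

    # Illustrative I^2 * R line-loss comparison; all numbers are assumptions.
    power_w = 4_000e6            # 4,000 MW of delivered power (assumed)
    line_resistance_ohm = 2.0    # assumed total conductor resistance of the line

    for label, voltage_v in [("345 kV (illustrative)", 345e3),
                             ("600 kV (illustrative)", 600e3)]:
        current_a = power_w / voltage_v
        loss_w = current_a ** 2 * line_resistance_ohm
        print(f"{label}: {current_a / 1e3:.1f} kA, "
              f"resistive loss ~{loss_w / 1e6:.0f} MW ({loss_w / power_w:.1%})")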