Telecom industry

Updates from Ciena

Future of 5G
By Susan Friedman, Brian Lavallée – 5G is coming, and with it comes the expectation of wireless speeds 100X or more than what we experience today with 4G. In fact, one of the goals of 5G is to achieve maximum download speeds of 10 Gbps per user. This influx of traffic won’t come without a cost to the underlying networks that support it.
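
As a rough sanity check on those figures, a 10 Gbps peak implies a 4G baseline of about 100 Mbps, and the gap is easiest to feel as download time. A minimal back-of-the-envelope sketch in Python (the file size and the 100 Mbps baseline are illustrative assumptions, not figures from the article):

    # Back-of-envelope: what a 100X speedup means for a download.
    # Illustrative assumptions: 4G peak ~100 Mbps, 5G target 10 Gbps.
    FILE_GB = 5                          # e.g. an HD movie, in gigabytes
    file_bits = FILE_GB * 8e9            # gigabytes -> bits

    for label, rate_bps in [("4G ~100 Mbps", 100e6), ("5G 10 Gbps", 10e9)]:
        seconds = file_bits / rate_bps
        print(f"{label}: {seconds:,.0f} s")   # ~400 s vs ~4 s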

To succeed, mobile network operators (MNOs) will need more than just a new radio access network; they will also need fiber, and lots of it, to manage the massive increase in bandwidth that will come as billions more users, both human and machine, join the network.

5G is expected to be deployed strategically in different locations, especially in the early days. If consumers are expecting all 3G and 4G networks to be replaced with 5G, they’ll be disappointed; 5G is expected to complement 3G/4G where it makes sense. And depending on where service providers believe applications and use cases will be most lucrative, they can roll out speeds of up to 10 Gbps.

This means if you’re in a rural community, chances are you won’t get 5G in the early days. In cities and metro areas you’ll see potential applications like enhanced mobile broadband, self-driving cars, video broadcast services, and other use cases that require high bandwidth and/or low latency. So service providers will deploy 5G in geographic areas where it makes economic sense. more> https://goo.gl/kmxQSs

What Happens When Quantum Physics Meets Cryptography?

By Paulina Gomez – In today’s world of ever-increasing security threats and breaches, encryption is a common technique used to protect critical information from getting into the wrong hands. In cryptography, encryption is the process of encoding a plaintext message in such a way that only authorized parties can access it. The result of this process is encrypted information, also known as ciphertext. But how is this done exactly? The plaintext message is transformed using an algorithm (or cipher) to make it unreadable to anyone except those possessing special knowledge, which is referred to as the key.
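
The plaintext/cipher/key relationship is straightforward to see in code. Here is a minimal sketch using AES-256 via the third-party Python cryptography package; it illustrates the general idea only and is in no way Ciena’s implementation:

    # Encrypt and decrypt with AES-256-GCM: only holders of the key can
    # recover the plaintext from the ciphertext.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    key = AESGCM.generate_key(bit_length=256)   # the "special knowledge"
    nonce = os.urandom(12)                      # must be unique per message
    plaintext = b"critical information"

    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)  # unreadable bytes
    recovered = AESGCM(key).decrypt(nonce, ciphertext, None)  # same key required
    assert recovered == plaintext

Ciena’s solution applies the same principle to in-flight data at the optical transport layer; the snippet above is just the textbook software analogue.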

Today’s state-of-the-art secure communications use advanced mathematics to protect in-flight data, leveraging highly secure algorithms such as those in Ciena’s WaveLogic Encryption solution. Even though many cryptographic algorithms used today are publicly available, such as the popular Advanced Encryption Standard (AES), they are very difficult to crack in a reasonable amount of time given the computational power of today’s computers. In fact, the keys used in modern cryptography are so large that breaking the AES-256 standard would require “fifty supercomputers that could check a billion billion (10^18) AES keys per second [and] would, in theory, require about 3×10^51 years.”
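
That claim can be sanity-checked to within a few orders of magnitude. The quote’s exact assumptions aren’t given, so treat the following as a back-of-the-envelope version rather than a reproduction of the quoted figure:

    # Order-of-magnitude brute-force time against AES-256.
    keyspace = 2**256                # ~1.2e77 possible keys
    rate = 50 * 1e18                 # 50 machines checking 1e18 keys/s each
    seconds = keyspace / rate
    years = seconds / 3.15e7         # ~3.15e7 seconds in a year
    print(f"about {years:.0e} years")   # ~7e49 years - astronomically long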

The field of quantum cryptography is an area of security research and development focused on the introduction of new technologies that will offer more resistance to the computing power of quantum computers. Quantum cryptography draws its strength from the unpredictable nature of individual photons, the elementary quanta of light. more> https://goo.gl/FTh77p
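
The best-known way photons are used this way is quantum key distribution, for example the BB84 protocol. The following is a toy classical simulation of BB84’s logic (real systems use actual photon polarizations and hardware; the 32-bit run length is an arbitrary choice for illustration):

    # Toy BB84: Alice encodes random bits in randomly chosen photon bases;
    # Bob measures in his own random bases. Where the bases happen to match,
    # Bob reads Alice's bit; elsewhere his result is random noise.
    import random

    n = 32
    alice_bits  = [random.randint(0, 1) for _ in range(n)]
    alice_bases = [random.choice("+x") for _ in range(n)]   # rectilinear/diagonal
    bob_bases   = [random.choice("+x") for _ in range(n)]

    bob_bits = [bit if ab == bb else random.randint(0, 1)
                for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

    # They publicly compare bases (never bits) and keep matching positions.
    key = [bit for bit, ab, bb in zip(alice_bits, alice_bases, bob_bases)
           if ab == bb]
    print("shared secret bits:", key)

Because an eavesdropper must also guess a basis, interception disturbs some photons; Alice and Bob can detect this by sacrificing and comparing a sample of the shared bits.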

Net neutrality 2.0: Perspectives on FCC regulation of internet service providers

By Stuart N. Brotman – The final outcome of this high-profile, high-impact proceeding will not be apparent until sometime late in 2017, at the earliest. Congress may also become more seriously involved at some point on the legislative front.

But without a doubt, as Chairman Pai noted in his Newseum speech, a “fierce debate” lies ahead for a number of months at least. And if past is prologue, the FCC may well receive an avalanche of comments in response to these proposed changes; the record in the Title II Order shows that over four million comments were filed by interested parties and the general public combined.

There will be no lack of political discourse, to be sure.

As we move into 2016, an unresolved national communications policy dilemma remains: whether the public-switched telephone network and the internet are parallel systems or parts of a larger ubiquitous network environment. Determining which characterization will be followed has profound consequences for regulatory treatment.

Given the emerging dominance of mobile over fixed service, if the FCC can’t regulate both, it may win the battle but lose the war. Since a further appeal is likely regardless of which side prevails, including possible review by the U.S. Supreme Court, Congress may find itself re-emerging as the best source of guidance for the FCC. Legislative action can definitively clarify whether Congress intends for the telephone network and the internet to be joined at the hip or to continue functioning in parallel under differing regulatory treatment. more> https://goo.gl/f4x8Uh

Related>

What’s Wrong With America’s Current Approach to Cybersecurity?

By Gregory Michaelidis – Go behind the headlines of the latest megahack, and what you’ll find is a growing public-safety and national-security crisis.

We are barely discussing how to help people help themselves in the digital world, let alone do their part in protecting our major networks and critical infrastructure.

Until we embrace a vision of public cybersecurity that sees people, at all ranges of skill, as essential to our collective security, there will be no widespread cybersecurity.

Right now, America’s collective cybersecurity effort is headed toward near-certain failure for reasons within our own control. In less than a decade — thanks to the influx of dollars and high-level policy and press attention — cybersecurity has transformed what is actually a “people problem with a technology component” into its exact opposite.

Official Washington and Silicon Valley have adopted a set of faulty assumptions about cybersecurity and internalized them to such a degree it’s practically a new religion, somewhere between late-19th-century technological determinism and medieval alchemy. more> https://goo.gl/elH8r2

90 years later, the broadcast public interest standard remains ill-defined

By Jack Karsten – The public interest standard has governed broadcast radio and television since Congress passed the Radio Act of 1927. However, decades of successive court cases and updated telecommunications laws have done little to clarify what falls into the public interest.

Prior to the public interest standard, free speech advocates argued with the broadcasting industry over who should have editorial control over content. Industry groups opposed a common carrier approach that would have allowed anyone to buy airtime. The resulting compromise established a short-term renewable licensing regime, overseen by the Federal Communications Commission since 1934, which required broadcasters to act on behalf of all others who did not receive a license. Congress granted the FCC the flexibility to revise its interpretation of the public interest standard to reflect changing circumstances. Since its founding, the FCC has repeatedly refused to set forth its own concrete definition of the public interest.

The Telecommunications Act of 1996 updated the 1934 Communications Act, but did not address the public interest standard beyond maintaining the status quo. more> https://goo.gl/AfmULj

ITU releases annual global ICT data and ICT Development Index country rankings

ITU – The Measuring the Information Society Report is widely recognized as the repository of the world’s most reliable and impartial global data and analysis on the state of global ICT development and is extensively relied upon by governments, international organizations, development banks and private sector analysts and investors worldwide.

“To bring more people online, it is important to focus on reducing overall socio-economic inequalities,” said ITU Secretary-General Houlin Zhao. “Education and income levels are strong determinants of whether or not people use the Internet.”

An increasingly ubiquitous, open, fast and content-rich Internet has changed the way many people live, communicate, and do business, delivering great benefits for people, governments, organizations and the private sector. However, many people are still not using the Internet, and many users do not fully benefit from its potential.

  • Most people have access to Internet services but many do not actually use them.
  • The full potential of the Internet remains untapped.
  • Access to the Internet is not enough; policy-makers must address broader socio-economic inequalities and help people acquire the necessary skills to take full advantage of the Internet.
  • Many people still do not own or use a mobile phone.
  • Affordability is the main barrier to mobile-phone ownership.
  • Asia and the Pacific has the lowest average price for mobile-cellular services of all regions, measured in purchasing power parity (PPP) dollars.
  • Fixed-broadband prices continued to drop significantly in 2015 but remain high – and clearly unaffordable – in a number of least developed countries (LDCs).
  • Mobile broadband is cheaper and more widely available than fixed broadband, but is still not deployed in the majority of LDCs.

Mobile phone adoption has largely been monitored based on mobile-cellular subscription data since these are widely available and regularly collected and disseminated by regulators and operators.

At the end of 2016, there were almost as many mobile-cellular subscriptions as people on earth, and 95% of the global population lived in an area covered by a mobile-cellular signal. However, since many people have multiple subscriptions or devices, other metrics need to be produced to accurately assess mobile uptake, such as the number of mobile phone users or mobile phone owners. more> https://goo.gl/L3Nh90
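
The arithmetic behind that caveat is simple. A sketch with hypothetical numbers (the population, subscription count, and SIMs-per-user figures below are invented purely to illustrate the metric gap):

    # Why subscription counts overstate mobile uptake: multiple SIMs.
    # Hypothetical country - every number below is invented for illustration.
    population    = 10_000_000
    subscriptions = 12_000_000        # 120% penetration on paper
    sims_per_user = 1.8               # assumed average subscriptions per user

    users = subscriptions / sims_per_user
    print(f"subscription penetration: {subscriptions / population:.0%}")
    print(f"people actually using a phone: {users / population:.0%}")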

Related>

Updates from GE

By Dorothy Pomerantz – America’s largest machine — the power grid — has been pumping lifeblood electricity from power plants to our homes and businesses for more than a century.

The vast network of wires, switches, transformers and other technology has gone through periodic upgrades, but the infrastructure is aging and increasingly prone to blackouts. Unfortunately, the stress on the network is starting to show at exactly the time when we need it to shoulder and move thousands of megawatts from new wind farms and solar installations popping up all over the country.

Ever since Westinghouse and Tesla beat Edison in the “current wars,” alternating current (AC) has been the dominant method of shipping power over long distances. But Clean Line is now working with GE — the company Edison co-founded — to change that.

HVDC (high-voltage direct current) is a much more efficient way to move energy over long distances than alternating current. “If you used AC, you would need more wires in the air to get the same amount of power the same distance,” says Neil Kirby, HVDC business development manager at Grid Solutions from GE Energy Connections.

Wind turbines will send electricity to GE-built stations, where it will be converted from AC to DC. The electricity will flow across HVDC lines and will be converted back into AC before it goes into homes and factories. This method loses less energy and is more economical and environmentally friendly than transporting the electricity as AC the entire way. more> https://goo.gl/GUZbMp
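
The efficiency argument reduces to Ohm’s law: resistive loss is I²R, and for a fixed power delivery the current, and hence the loss, falls with the square of the line voltage. A sketch with illustrative numbers (the line resistance and voltages below are assumptions, not Clean Line’s specifications):

    # Resistive line loss for a fixed power delivery: loss = (P/V)^2 * R.
    # Illustrative figures only - real lines and loads differ widely.
    P = 1e9            # 1 GW delivered
    R = 10.0           # total line resistance in ohms

    for label, volts in [("345 kV line", 345e3), ("600 kV line", 600e3)]:
        current = P / volts                 # amperes
        loss = current**2 * R               # watts dissipated as heat
        print(f"{label}: {loss / P:.1%} of the power lost")

The pure I²R effect above is driven by voltage; DC adds further savings because it carries no reactive power and suffers no skin effect, which is what Kirby’s “more wires in the air” comparison refers to.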

Only governments can safeguard the openness of the internet

By Rufus Pollock – On 6 October 1536, in the prison yard of Vilvoorde castle near modern-day Brussels, a man named William Tyndale was strangled then burnt at the stake. His crime? To translate the Latin Bible into English, his native tongue.

A priest and scholar, Tyndale was an information freedom-fighter, whose mission was to open up the scripture for ordinary men and women.

The internet’s low-cost transmission can just as easily create information empires and robber barons as it can digital democracy and information equality. The growing value of being able to mine and manipulate huge data-sets, to generate predictions about consumers’ behaviour and desires, creates a self-reinforcing spiral of network effects. Data begets more data, locked down behind each company’s walls where their proprietary algorithms can exploit it for profit.

But in an alternative, more open world, how would we pay to create information in the first place? After all, it costs real money and real resources to make new software, movies or drugs.

What matters is who owns information, not just the infrastructure by which it is distributed. Digital technology must be combined with concrete actions that protect openness across the spectrum, from maps to medicines, from software to schools.

Better that we do it through public institutions, instead of relying on mavericks and martyrs. more> https://goo.gl/hBYcS6

Politicians Overreact to AT&T-Time Warner Deal

By Paula Dwyer – To understand the folly of blocking this takeover, think back to 1974 and the original AT&T antitrust case, which also grew out of a fear of vertical integration. Back then, the concern was that a single company controlled all the local landlines and the company that made the equipment.

For sure, AT&T had a monopoly, but it was created and sanctioned by the federal government. All that was needed was a government deregulation order and an assurance that AT&T wouldn’t block competitors.

Instead, the U.S. sued to break up Ma Bell.

After eight years of courtroom battles, AT&T consented in 1982 to be broken up into seven regional Baby Bells plus a remaining AT&T that could offer only long-distance service. Many mergers later, one of AT&T’s offspring, Southwestern Bell, had acquired four of its siblings plus the old AT&T, and took the AT&T name.

The two remaining Baby Bells joined with GTE and became Verizon. The result is even more concentration than before.

If the U.S. had simply deregulated plain old telephone service, any one of these technologies (fiber optics, the Arpanet, cellular networks) could have forced AT&T to adjust or disappear. more> https://goo.gl/XLVsfB

Related>

We’ve Reached the End of Global Trade

By Rana Foroohar – Globalization is usually defined as the free movement of people, goods and capital. It’s been the most important economic force of modernity.

Until the financial crisis of 2008, global trade grew twice as fast as the global economy itself. Yet, thanks to both economics and politics, globalization as we have known it is changing fast.

The question is: Have we reached peak trade?

“If you think about globalization in traditional terms, in terms of old-line trade in goods, for example, then yes,” says McKinsey Global Institute research director Susan Lund.

“But if you think of it in terms of the flow of digital data and ideas, then no—it’s actually increasing.” Indeed, the cross-border flow of digital data—e-commerce, web searches, online video, machine-to-machine interactions—has grown 45 times larger since 2005 and is projected to grow much faster than the global economy over the next few years. more> https://goo.gl/5xcMO5
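
For scale, that growth figure implies a compound annual rate of roughly 40% (assuming, as a reading of the report’s timeframe, that the 45x span covers 2005 to 2016):

    # Implied compound annual growth from "45 times larger since 2005",
    # assuming the figure spans 2005-2016.
    growth, years = 45, 2016 - 2005
    cagr = growth ** (1 / years) - 1
    print(f"about {cagr:.0%} per year")   # roughly 41% annual growth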