Future of 5G
By Susan Friedman, Brian Lavallée – 5G is coming, and with it comes the expectation of wireless speeds 100X or more than what we experience today with 4G. In fact, one of the goals of 5G is to achieve maximum download speeds of 10 Gbps per user. This influx of traffic won’t come without a cost to the underlying networks that support it.
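As a quick sanity check, the 100X figure is consistent with the 10 Gbps target if today’s 4G peak is taken to be roughly 100 Mbps (an assumed baseline, not a figure from the article):

```python
# Rough sanity check of the "100X" claim against the 10 Gbps per-user target.
# The 100 Mbps 4G/LTE peak below is an assumption, not from the article.
lte_peak_bps = 100e6            # assumed 4G peak: 100 Mbps
target_bps = lte_peak_bps * 100 # "100X or more"
assert target_bps == 10e9       # 10 Gbps per user
```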
To succeed, mobile network operators (MNOs) will need more than just a new radio access network; they will also need fiber—and lots of it—to manage the massive increase in bandwidth that will come as billions more users, both human and machine, join the network.
5G is expected to be deployed strategically in different locations, especially in the early days. Consumers expecting all 3G and 4G networks to be replaced with 5G will be disappointed; 5G is expected to complement 3G/4G where it makes sense. And depending on where service providers believe applications and use cases will be most lucrative, they can roll out speeds of up to 10 Gbps.
This means if you’re in a rural community, you probably won’t get 5G in the early days. In cities and metro areas you’ll see potential applications like enhanced mobile broadband, self-driving cars, video broadcast services, and other use cases that require high bandwidth and/or low latency. So, service providers will deploy 5G in geographic areas where it makes economic sense. more> https://goo.gl/kmxQSs
Posted in Broadband, Business, Communication industry, Economic development, Economy, Net, Technology, Telecom industry
Tagged 5G, Broadband, Ciena, Internet, Net evolution, Technology, Wireless
Autodesk Highlights Next-Gen Storytelling & Collaboration Tools at SIGGRAPH 2017
Autodesk – Leading up to SIGGRAPH 2017, Autodesk released a series of updates for its media and entertainment tools, including Autodesk Media & Entertainment Collection, Autodesk Maya, Shotgun, Arnold, Autodesk 3ds Max, and Autodesk Flame. Engineered to streamline and accelerate production on films, TV shows, games and immersive experiences, the new releases include improvements and user-requested enhancements that connect creative workflows and teams, helping them bring engaging stories to life for a worldwide audience.
“The continued growth of AR and VR and steady flow of new productions from Netflix, Amazon and others, mean animation and VFX houses are in more demand than ever. We’re focused on helping our customers create, connect and compute faster and more efficiently so they can balance their increasing project loads with tighter schedules and budgets,” Chris Bradshaw, Senior Vice President, Media & Entertainment, Autodesk, stated. “Everything we’re showing at SIGGRAPH streamlines production and equips artists with the tools to handle nearly any creative scenario.” more> cadinnovation.com
Posted in Broadband, Business, Economic development, Economy, Education, How to, Media, Net, Technology
Tagged Autodesk, Broadband, Industrial economy, Internet, Jobs, Skills
By Derek Thompson – Before getting to the future, let’s start with the present of television. Pay TV—that is, the bundle of channels one can buy from Comcast or DirecTV—is in a ratings free fall among all viewers born since the Nixon administration.
This has created a business crisis for entertainment companies like Disney. Old Disney’s television strategy was: Focus on making great content and then sell it to distribution companies, like Comcast and DirecTV. This worked brilliantly when practically the entire country subscribed to the same television product.
Thanks to the virtuous cycle of bundling, separating content and distribution used to be the obvious play for Disney.
But New Disney is looking for a fresh play. Now that young households are cutting the cord, it wants to own both content and distribution.
There aren’t many great examples of legacy media empires successfully transitioning to the digital age without a few disasters along the way, or at least a long period of readjustment. Just look at American newspapers, or the music labels at the beginning of the 2000s. more> https://goo.gl/jfcC64
Posted in Broadband, Business, Economy, History, Media, Net, Technology
Tagged Broadband, Digital transformation, Entertainment, Internet, Pay television, Technology
By Paulina Gomez – In today’s world of ever-increasing security threats and breaches, encryption is a common technique used to protect critical information from getting into the wrong hands. In cryptography, encryption is the process of encoding a plaintext message in such a way that only authorized parties can access it. The result of this process is encrypted information, also known as ciphertext. But how is this done exactly? The plaintext message is transformed using an algorithm (or cipher) to make it unreadable to anyone except those possessing special knowledge, which is referred to as the key.
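The process described above—transforming plaintext into ciphertext with a cipher and a key—can be sketched in a few lines of Python. This toy XOR stream cipher built from SHA-256 hashes is purely illustrative and not secure (real systems use vetted ciphers such as AES), but it shows the roles of plaintext, key, and ciphertext:

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream by hashing the key with a counter.
    # A toy construction for illustration only -- NOT cryptographically secure.
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Encryption and decryption are the same operation: XOR with the keystream.
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

plaintext = b"only authorized parties can access it"
key = b"special knowledge"
ciphertext = xor_cipher(plaintext, key)   # unreadable without the key
recovered = xor_cipher(ciphertext, key)   # same key recovers the plaintext
```

Only a party possessing the key can reverse the transformation, which is exactly the “special knowledge” the excerpt refers to.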
Today’s state-of-the-art secure communications use advanced mathematics to protect in-flight data, leveraging highly secure algorithms such as those in Ciena’s WaveLogic Encryption solution. Even though many cryptographic algorithms used today are publicly available, such as the popular Advanced Encryption Standard (AES), they are very difficult to crack in a reasonable amount of time given the computational power of today’s computers. In fact, the keys used in modern cryptography are so large that breaking the AES-256 standard would require “fifty supercomputers that could check a billion billion (10¹⁸) AES keys per second [and] would, in theory, require about 3×10⁵¹ years.”
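The quoted figure can be sanity-checked with a few lines of Python, assuming an aggregate checking rate of 10¹⁸ keys per second:

```python
# Brute-force time to exhaust the AES-256 key space at a rate of
# 10**18 (a billion billion) keys per second, per the figure quoted above.
KEYS = 2 ** 256               # size of the AES-256 key space
RATE = 10 ** 18               # assumed aggregate keys checked per second
SECONDS_PER_YEAR = 365.25 * 24 * 3600

years = KEYS / RATE / SECONDS_PER_YEAR
# years comes out on the order of 3.7e51 -- "about 3×10^51 years"
```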
The field of Quantum Cryptography is an area of security research and development focused on the introduction of new technologies that will offer more resistance to the computing power of quantum computers. Quantum cryptography draws its strength from the unpredictable quantum behavior of individual photons, the fundamental particles of light. more> https://goo.gl/FTh77p
Posted in Business, Communication industry, Economy, Education, Nature, Net, Product, Science, Technology, Telecom industry
Tagged Broadband, Cryptography, Internet, Physics, Quantum Computing, Technology
By Tom Wheeler – The tremor in Silicon Valley emerged from Brussels, not the San Andreas Fault. The European Union’s decision on Google’s search practices makes clear the absence of domestic regulation has opened the door for policies to be decided by foreign governments.
It should be a worry – and a wake-up – for all the companies whose platforms drive internet services.
Thanks to the interconnectedness of the internet, imposing rules in one major market necessarily impacts operations in other markets. While the internet platform companies may celebrate how they have avoided regulation at home, it does not mean they have avoided government oversight – just that such policies come from other governments. And because the effects of a keystroke can circle the world in seconds, policy imposed by the EU, for instance, can be felt far beyond the European continent.
While protecting consumers and competition is their goal, it would be an unnatural act for foreign regulators not to take into consideration the effect the internet giants have on companies in their own countries.
Thus, the question arises whether the success of the U.S. internet giants in keeping their own government at arm’s length is not actually counter-productive.
Rather than the U.S. setting the international standard for appropriate oversight of the platforms of the internet – and in doing so advancing and protecting American economic influence, consumer interests and innovation – the U.S. internet companies’ actions have defaulted the leadership to other countries with perhaps other goals. more> https://goo.gl/XFu73j
Posted in Broadband, Business, Communication industry, CONGRESS WATCH, Economic development, Economy, Education, FCC, Leadership, Net, Regulations
Tagged EU, FCC, Internet, Regulation, Standards, United States
By David Kully – The source of the increasing concentration in many markets, in the view of some commentators, was a shift that began in the 1970s in how antitrust enforcers and the courts view the role of antitrust enforcement.
At that time, economists in the “Chicago School” led an evolution away from concern about protecting small competitors from larger competitors to a current enforcement paradigm that emphasizes “consumer welfare” and calls for intervention by the government only if a merger or alleged anticompetitive practice is likely to harm consumers – through higher prices, lower output, poorer quality products or services, or diminished incentives to innovate. This shift, according to critics, made antitrust enforcers less likely to go to court to block large mergers or take on monopolies, with the result being the concentrated marketplaces we see today.
The nostalgia for the antitrust enforcement of the past, however, ignores important concerns about an approach predicated on attacking large firms merely because of their size. The evolution in antitrust thinking that began with the Chicago School was driven by economic research establishing that some mergers and certain practices that antitrust law previously forbade offer tangible benefits to society. Critics offer no countervailing reason to believe these benefits would survive a return to past thinking. more> https://goo.gl/rt1ZSQ