Coherent optical turns 10: Here’s how it was made
By Bo Gowan – This is the story of how a team of over 100 people in Ciena’s R&D labs pulled together an impressive collection of technology innovations that created a completely new way of transporting data over fiber…and in the process helped change the direction of the entire optical networking industry.
Back in 2008, many in the industry had serious doubts that commercializing coherent fiber optic transport was even possible, much less the future of optical communications. That left a team of Ciena engineers to defy the naysayers and hold the torch of innovation.
“What we first began to see at Telecom 99 was that we could achieve these high speeds the brute force way, but it was really, really painful,” said Dino DiPerna in an interview. Dino, along with many on his team, was brought on by Ciena as part of the company’s 2010 acquisition of Nortel’s optical business. He now serves as Ciena’s Vice President of Packet-Optical Platforms R&D and is based in Ottawa.
By ‘brute force’ Dino is referring to the traditional time-division multiplexing (TDM) method that had been used until then to speed up optical transmission – basically turning the light on and off at increasingly faster speeds (also called the baud or symbol rate). “But once you start pushing past 10 billion times per second, you begin running into significant problems,” said DiPerna.
Those complexities had to do with the underlying boundaries of what you can do with light. The fundamental issue at hand was the natural spread and propagation of light as it travels along the fiber – created by two phenomena called chromatic dispersion and polarization mode dispersion, or PMD. As you push past 10G speeds, tolerance to chromatic dispersion falls with the square of the baud rate. Due to PMD and noise from optical amplifiers, a 40 Gbaud stream will lose at least 75% of its reach compared to a 10 Gbaud stream.
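The square-law penalty described above can be put into rough numbers. A minimal sketch, assuming a hypothetical baseline dispersion tolerance for a 10 Gbaud signal (the 1,000 ps/nm figure is an illustrative placeholder, not a value from the article):

```python
def dispersion_tolerance(baud_gbaud, ref_baud=10.0, ref_tolerance_ps_nm=1000.0):
    """Chromatic dispersion tolerance falls with the square of the baud rate.

    ref_tolerance_ps_nm is an assumed illustrative baseline at ref_baud.
    """
    return ref_tolerance_ps_nm * (ref_baud / baud_gbaud) ** 2

# Quadrupling the baud rate (10 -> 40 Gbaud) cuts dispersion tolerance 16x.
for baud in (10, 20, 40):
    print(f"{baud} Gbaud: ~{dispersion_tolerance(baud):.1f} ps/nm")
```

This is why brute-force TDM scaling hit a wall: a 16x drop in dispersion tolerance translates directly into shorter uncompensated reach, hence the regenerators and premium fiber discussed next.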
This reach limitation had two consequences. First, it meant adding more costly regenerators to the network. Second, it meant that the underlying fiber plant required more expensive, higher-quality fiber to operate properly at 40G transmission speeds. more>
Posted in Broadband, Communication industry, Economy, History, Net, Product, Science, Technology, Telecom industry
Tagged Broadband, Business improvement, Ciena, Fiber optics, Internet, Physics, Technology
The good and bad of blockchain
By Rose Jacobs – There’s a drawback: blockchains have the potential to increase collusion, according to Chicago Booth’s Lin William Cong and Zhiguo He.
The researchers’ modeling, part of their research into how blockchains might affect competition, suggests that the way blockchains work as a decentralized ledger involves distributing more information, which could make it easier for competitors to quietly and often tacitly collude to keep prices high, ultimately to the detriment of consumers. But Cong and He propose a few potential remedies.
Blockchain is less well-known than Bitcoin but may have more staying power. Its main functionality is providing “decentralized consensus,” say Cong and He. In most societies and economies, parties in a contract rely on a government, court, or other third-party arbitrator to essentially oversee and enforce rules in private contracts—to provide consensus, as the researchers put it. Blockchain provides that function in a more decentralized manner by generating, storing, and distributing the record of rules and regulations.
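The “record” a blockchain generates, stores, and distributes is, at its core, a hash chain: each entry commits to the one before it, so any participant can verify the history without a third-party arbitrator. A minimal sketch of that idea (illustrative only; real systems add consensus protocols, signatures, and peer-to-peer replication):

```python
import hashlib
import json

def add_block(chain, record):
    """Append a record that commits to the previous block's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev_hash": prev_hash}
    digest = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append({**body, "hash": digest})
    return chain

def verify(chain):
    """Any holder of the ledger can check it was not tampered with."""
    for i, block in enumerate(chain):
        body = {"record": block["record"], "prev_hash": block["prev_hash"]}
        expected = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["hash"] != expected:
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

ledger = []
add_block(ledger, "Alice pays Bob 5")
add_block(ledger, "Bob pays Carol 2")
print(verify(ledger))  # True
ledger[0]["record"] = "Alice pays Bob 500"
print(verify(ledger))  # False: tampering breaks the chain
```

Because every participant holds and can verify the same record, the ledger itself supplies the consensus a court or arbitrator would otherwise provide – which is also why, as Cong and He note, it spreads so much information among competitors.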
Blockchain idealists would have all transactions stored on one chain—the one that already exists, thanks to Bitcoin. This would create a massive, democratic, stable, and unified public record. But most companies don’t buy into this vision.
Critics say this is because they want to control the chains, keeping out new competitors by using private or “permissioned” blockchains. more>
Posted in Banking, Broadband, Business, Economy, History, Net, Technology
Tagged BitCoin, Blockchain, Broadband, Business, cartel, Internet, Technology
Simulation Supports Program to Help Pilots in Degraded Visual Environments
By John Toon – A degraded visual environment occurs when helicopters landing on loose soil, such as desert terrain, stir up dust, creating brownout conditions that make it difficult for pilots to see obstacles on the ground. The simulation will support the development of a multi-sensor system designed to give U.S. Army rotorcraft pilots better situational awareness during these challenging conditions.
GTRI researchers are developing different ways to show fused sensor images to pilots during brownout conditions. In an Army cockpit simulator lab, experienced rotorcraft pilots will use the simulations to determine how information should be presented during high-stress approach, landing and takeoff conditions. The pilot feedback will assist the Army in defining the Pilot Vehicle Interface for the new Degraded Visual Environment (DVE) system that will be used on Black Hawk and Chinook helicopters. It will also be used to inform a milestone decision for integration into the Army aviation platforms.
The simulation project is challenging because the data comes from different sources, at different data rates and different resolutions. The emulator must work accurately under varying conditions, including daytime and nighttime operations. Because the system is used to analyze pilot interaction with the new sensors, the provided solution includes flexibility to easily reconfigure various parameters such as symbology sets, types of sensors, sensor performance characteristics, and symbology color. more>
- Researchers Determine Routes of Respiratory Infectious Disease Transmission on Aircraft, John Toon
- A Future Colorfully Lit by the Mystifying Physics of Paint-On Semiconductors, Ben Brumfield
- Turbocharging Fuel Cells with a Multifunctional Catalyst, Ben Brumfield
- The Minds of the New Machines, T.J. Becker
- Modernizing Information Systems to Support a New Generation of Army Families, Josh Brown
- New Insights Could Pave The Way For Self-Powered Low Energy Devices, Josh Brown
- Easy as 1, 2, 3! Really?, A. Maureen Rouhi
- Comparison Shows Value of DNA Barcoding in Selecting Nanoparticles, John Toon
- More Startups Join Engage, Laura Diamond
- Deep Learning Can Now Help Prevent Heart Failure, Kristen Perez
- The Next Frontier in Molecular Engineering, Georgia Parmelee
- Georgia Tech Researchers Bring Transparency to Telephone Blacklists, Tess Malone
Posted in Broadband, Business, Energy & emissions, Net, Science, Technology, Transportation
Tagged Broadband, Business improvement, Georgia Tech, Health, Nature, Technology
datacenter.com – Cloud direct connect allows enterprises to access public cloud services (e.g. Amazon, Google, Microsoft, TencentCloud) over a dedicated, private connection rather than over the public Internet. The benefits of direct cloud connections to your own network include greater reliability, better performance (higher speed, lower latency) and higher security than typical connections over the Internet.
The costs of WAN and public Internet connectivity can be significant. The cost of moving large volumes of data to your cloud provider varies by provider, but is often high. By using a neutral data center, you have access to multiple carriers who can provide you the necessary public Internet connections and direct cloud connect services. By segmenting the various network workloads, you can often realize savings in bandwidth and reduce the costs of moving data to your public cloud provider.
By replacing a “best effort” network, such as the public Internet, with a direct connection to your cloud provider, you gain consistency in throughput and performance. As the mathematical principle states, “the shortest distance between two points is a straight line.” By using cloud direct connect services you’re connecting to the cloud provider in a straight line and increasing your performance. more>
Posted in Broadband, Business, Economy, Net, telecom
Tagged Broadband, Business improvement, Data center, Internet, Super regions, Technology, Wireline