Tag Archives: Broadband

The information arms race can’t be won, but we have to keep fighting

By Cailin O’Connor – Arms races happen when two sides of a conflict escalate in a series of ever-changing moves intended to outwit the opponent. In biology, a classic example comes from cheetahs and gazelles. Over time, these species have evolved for speed, each responding to the other’s adaptations.

One hallmark of an arms race is that, at the end, the participants are often just where they started. Sometimes, the cheetah catches its prey, and sometimes the gazelle escapes. Neither wins the race because, as one gets better, so does its opponent. And, along the way, each side expends a great deal of effort. Still, at any point, the only thing that makes sense is to keep escalating.

Arms races happen in the human world too. The term arms race, of course, comes from warring countries that literally amass ever-more sophisticated and powerful weapons. But some human arms races are more subtle.

As detailed in the Mueller report – though widely known well before its release – in the lead-up to the 2016 presidential election in the United States, the Russian government (via a group called the Internet Research Agency) engaged in large-scale efforts to influence voters and to polarize the US public. In the wake of this campaign, social-media sites and research groups have scrambled to protect the US public from misinformation on social media.

What is important to recognize about such a situation is that whatever tactics are working now won’t work for long. The other side will adapt. In particular, we cannot expect to be able to put a set of detection algorithms in place and be done with it. Whatever efforts social-media sites make to root out pernicious actors will regularly become obsolete.

The same is true for our individual attempts to identify and avoid misinformation. Since the 2016 US election, ‘fake news’ has been widely discussed and analyzed. And many social-media users have become more savvy about identifying sites mimicking traditional news sources. But the same users might not be as savvy, for example, about sleek conspiracy theory videos going viral on YouTube, or about deep fakes – expertly altered images and videos.

What makes this problem particularly thorny is that internet media changes at dizzying speed. more>

Updates from Ciena

Latest trends in optical networks – straight from NGON & DCI World
By Helen Xenos – “If you are not sitting at the edge of your seat, you are taking up too much space.”

I heard this quote from a friend recently and thought it was an apt description of the optical networking industry these days. No one has time to sit back. Technology is evolving at an incredibly fast pace, new developments arrive at a regular cadence, and network providers are constantly evaluating different architecture approaches for evolving their networks.

While attending the 21st Annual NGON & DCI World event in beautiful Nice last week, I had an opportunity to take the pulse of the latest topics and trends driving change in the optical networking landscape.

A popular topic at all optical events – and NGON was no exception – is the discussion of the next technology breakthrough that will bring new levels of capacity scale and cost reduction to transport networks.

If we look at coherent optical shipments, capacity, and average selling price data over the past decade, how have network providers kept up with exponentially increasing bandwidth demands while holding transport costs relatively flat? Through coherent technology innovations that enable higher throughput at lower cost.

So, how will we get to the next level of cost reduction?

The consistent response to this question in multiple sessions at NGON was higher baud, which means coherent optical solutions that have a higher symbol rate and can process more data per second, resulting in more fiber capacity with less equipment. more>
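As a rough back-of-the-envelope illustration (ours, not from the NGON sessions), per-wavelength capacity is roughly the symbol rate times the bits per symbol times the number of polarizations, before FEC overhead. That is why raising the baud lifts throughput without forcing denser, more noise-sensitive constellations:

```python
def wavelength_capacity_gbps(baud_gbd, bits_per_symbol, polarizations=2):
    """Approximate raw capacity of one coherent wavelength (before FEC).

    baud_gbd        -- symbol rate in gigabaud
    bits_per_symbol -- set by the modulation: e.g. 4 for 16QAM, 6 for 64QAM
    polarizations   -- coherent systems typically transmit on two
    """
    return baud_gbd * bits_per_symbol * polarizations

# A 32 Gbaud, 16QAM, dual-polarization signal carries roughly 256 Gb/s raw
print(wavelength_capacity_gbps(32, 4))   # 256
# Doubling the symbol rate to 64 Gbaud doubles throughput with the same
# constellation: more capacity per wavelength, less equipment per bit
print(wavelength_capacity_gbps(64, 4))   # 512
```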

Related>

The Surveillance Threat Is Not What Orwell Imagined

By Shoshana Zuboff – George Orwell repeatedly delayed crucial medical care to complete 1984, the book still synonymous with our worst fears of a totalitarian future — published 70 years ago this month.

Since 1984’s publication, we have assumed with Orwell that the dangers of mass surveillance and social control could only originate in the state. We were wrong. This error has left us unprotected from an equally pernicious but profoundly different threat to freedom and democracy.

For 19 years, private companies practicing an unprecedented economic logic that I call surveillance capitalism have hijacked the Internet and its digital technologies. Invented at Google beginning in 2000, this new economics covertly claims private human experience as free raw material for translation into behavioral data. Some data are used to improve services, but the rest are turned into computational products that predict your behavior.

These predictions are traded in a new futures market, where surveillance capitalists sell certainty to businesses determined to know what we will do next. This logic was first applied to finding which ads online will attract our interest, but similar practices now reside in nearly every sector — insurance, retail, health, education, finance and more — where personal experience is secretly captured and computed for behavioral predictions. By now it is no exaggeration to say that the Internet is owned and operated by private surveillance capital.

In the competition for certainty, surveillance capitalists learned that the most predictive data come not just from monitoring but also from modifying and directing behavior. more>

Updates from Ciena

Reimagining Ethernet Service Delivery with Intelligent Automation
By Thomas DiMicelli – Communications service providers introduced Ethernet-based services almost 20 years ago as a more flexible and cost-effective alternative to TDM-based services. These services have been continuously enhanced over the years and are widely deployed today; however, traditional Ethernet service activation processes are increasingly out of alignment with market requirements.

I asked Andreas Uys, CTO at Dark Fibre Africa (DFA), an innovative open-access fibre optic company that operates in South Africa, to outline some of the issues concerning Ethernet service activation, and how CSPs can overcome them.

“The limitations of traditional Ethernet service activation processes are quite significant,” Andreas said. “Some of this is due to the way SPs are organized, and some is due to the reliance on manual operations; taken together, these issues dramatically slow the order-to-service process and delay time to revenue.”

Andreas continued: “Ethernet service activation naturally involves different departments… customer service reps generate work orders, engineering designs the services, and the operations team provisions and manages the services. Each department has its own ‘siloed’ systems and relies on emails and spreadsheets to track workflow progress. This results in a time-consuming process, even to generate a simple quote.”

“Engineers design the service using stale data from multiple offline inventory systems,” Andreas added, “which results in non-optimal designs that waste network resources. Similarly, the operations team uses multiple tools to manually configure each element or domain in the service path, which adds cost and the potential for errors into the process.”

With fragmented systems and workflows, offline service-design tools, and error-prone manual provisioning, it is clear that the Ethernet service activation process needs to be updated. So what is the way forward? more>
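To make the contrast concrete, here is a minimal, purely hypothetical sketch of what a single automated order-to-service pipeline could look like, designing against live inventory and provisioning through one API. Every class and method name below is our own illustration, not an API from DFA, Ciena, or any real product:

```python
from dataclasses import dataclass

@dataclass
class ServiceOrder:
    customer: str
    a_end: str             # customer site A
    z_end: str             # customer site Z
    bandwidth_mbps: int

class LiveInventory:
    """Stand-in for a real-time inventory, replacing stale offline copies."""
    def __init__(self, links):
        self.links = links                 # {(a_end, z_end): available Mb/s}

    def path_with_capacity(self, a, z, mbps):
        # Toy single-hop lookup; a real system would run path computation
        return [(a, z)] if self.links.get((a, z), 0) >= mbps else None

def activate(order, inventory, configure):
    # 1. Design the service against live inventory, not offline spreadsheets
    path = inventory.path_with_capacity(order.a_end, order.z_end,
                                        order.bandwidth_mbps)
    if path is None:
        raise RuntimeError("insufficient capacity for this order")
    # 2. Provision every domain in the path through a single API
    for link in path:
        configure(link, order.bandwidth_mbps)
    # 3. Hand back a tracking record automatically, with no email hand-offs
    return {"customer": order.customer, "path": path, "status": "active"}

inv = LiveInventory({("JHB", "CPT"): 10_000})
print(activate(ServiceOrder("Acme", "JHB", "CPT", 1_000), inv,
               lambda link, mbps: print(f"provisioning {mbps} Mb/s on {link}")))
```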

Related>

Updates from Ciena

GeoMesh Extreme: Release the Kraken!
Submarine cables are critical to global communications, and Ciena’s new GeoMesh Extreme allows submarine cable operators to integrate several technology advancements to enable an open submarine network solution with greater choice and performance.
By Brian Lavallée – Now that I’ve got your attention, what exactly is a kraken?

It’s a legendary sea monster that terrorized ships sailing the North Atlantic Ocean. It was an unknown danger that dwelled in the ocean deep and could attack without warning, resulting in untold mayhem.

Whether the kraken legend originates from a giant squid or octopus sighting is debatable, but it terrorized sailors nonetheless, as they never knew if or when the kraken could be encountered. Legends die hard, but there are real dangers that lurk beneath the oceans of the world, and this is precisely where submarine cables live and work.

Hundreds of years ago, when the kraken was terrifying sailors crisscrossing the world’s oceans, ships were the only method of sharing information between continents separated by thousands of kilometers of water. That remained true until the first reliable transoceanic submarine cable entered service over 150 years ago, way back in 1866.

This pioneering telegraph cable transmitted at rates that we’d scoff at today, but it was undoubtedly a monumental performance leap when compared to sending handwritten letters back and forth between continents, which could take weeks and even months. Imagine you waited months to receive an important letter, but couldn’t read the sender’s handwriting?! Oh, the horror!

Most modern submarine cables are based on coherent optical transmission technology, which enables colossal capacity improvements over the early telegraph cables of yesteryear, and can reliably carry multiple terabits of data each second.

We’ve come a long way in improving how much data we can cram into these optical fibers the size of a human hair, housed in cables the size of a common garden hose and laid upon the world’s seabeds for thousands of kilometers. We’ve also come a long way in being utterly and completely dependent upon this critical infrastructure, which now carries $10 trillion – yes, TRILLION – worth of transactions every day and over 95% of all intercontinental traffic, and is experiencing over 40% CAGR traffic growth worldwide.
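That 40% CAGR compounds faster than intuition suggests; a quick bit of arithmetic (ours, using only the growth rate quoted above) shows why:

```python
# Compound annual growth: traffic after n years = today's * (1 + rate)**n
rate = 0.40                                # the ~40% CAGR cited above
for years in (1, 2, 5, 10):
    print(f"after {years:2d} years: {(1 + rate) ** years:5.1f}x today's traffic")
# after  1 years:   1.4x ... after  5 years:   5.4x ... after 10 years:  28.9x
```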

This network infrastructure will become more critical, if that’s even possible! more>

Related>

Updates from ITU

How AI can improve agriculture for better food security
ITU News – Roughly half of the 821 million people considered hungry by the United Nations are those who dedicate their lives to producing food for others: farmers.

This is largely attributed to the vulnerability of farmers to agricultural risks, such as extreme weather, conflict, and market shocks.

Smallholder farmers, who produce some 60-70% of the world’s food, are particularly vulnerable to risks and food insecurity.

Emerging technologies such as Artificial Intelligence (AI), however, have been particularly promising in tackling challenges such as lack of expertise, climate change, resource optimization and consumer trust.

AI assistance can, for instance, enable smallholder farmers in Africa to more effectively address scourges such as viruses and the fall armyworm that have plagued the region over the last 40 years despite extensive investment, said David Hughes, Co-Founder of PlantVillage and Assistant Professor at Penn State University, at a session on AI for Agriculture at last week’s AI for Good Global Summit. more>

Related>

Updates from Datacenter.com

Private Cloud vs Public Cloud: what is the best solution?
Datacenter.com – Cloud computing spans a range of classifications, types and architecture models. The transformative networked computing model can be categorized into three major types: Public Cloud, Private Cloud and Hybrid Cloud.

Hybrid IT has rapidly proven that it offers the flexibility for delivering new software applications and enhanced features quickly – critical agility in the age of digital business. With that in mind, enterprises now need to identify the best distribution of services and applications, and their strategy for connecting to the clouds they use.

This article explores the key differences between Public and Private cloud environments.

Public Cloud refers to the cloud computing model with which the IT services are delivered across the Internet. The computing functionality may range from common services such as email, apps and storage to the enterprise-grade OS platform or infrastructure environments used for software development and testing. The cloud vendor is responsible for developing, managing and maintaining the pool of computing resources shared between multiple tenants from across the network.

The advantages of Public Cloud solutions for business customers include:

  • No investments required to deploy and maintain the IT infrastructure;
  • Flexible pricing options based on different SLA offerings.

However, there are disadvantages as well, including:

  • The total cost of ownership (TCO) can climb steeply with large-scale usage, particularly for midsize to large enterprises (the sketch below illustrates the break-even point)
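A toy comparison (illustrative numbers only, not Datacenter.com's figures) shows the crossover behind that caveat: public cloud cost grows roughly linearly with usage, while a private cloud pairs a large fixed build-out with a lower per-unit running cost:

```python
def public_tco(vm_hours):
    return 0.10 * vm_hours             # hypothetical pay-as-you-go rate

def private_tco(vm_hours):
    return 250_000 + 0.03 * vm_hours   # hypothetical build-out + run rate

for hours in (100_000, 1_000_000, 10_000_000):
    pub, priv = public_tco(hours), private_tco(hours)
    winner = "public" if pub < priv else "private"
    print(f"{hours:>10,} VM-hours: public ${pub:>12,.0f}, "
          f"private ${priv:>12,.0f} -> {winner} is cheaper")
# Break-even here is ~3.6M VM-hours: below it the public cloud wins,
# above it ownership costs less
```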

more>

Related>

Updates from Ciena

Avoid outage outrage: Why AI-assisted operations is the next big thing for networks
By Kailem Anderson – You can’t go far in the broader tech industry these days without coming across a conversation about the future of artificial intelligence (AI). I’ve been talking about AI and machine learning (ML) in telecom networks for quite a while, and for good reason: network operators desperately want and need AI to help simplify their complex network operations.

Troubleshooting and resolving issues in today’s increasingly complex and dynamic networks has become a major operational burden, complicated by multiple management systems and a flood of raw network data and alarms.

This results in two major challenges. First, the flood of raw data obfuscates true insight into the state of the network, making it difficult to detect indications of potential network outages before customers have been impacted. Second, the “trouble-to-resolve” process becomes slow and tedious as the team struggles to identify, isolate, and rectify the issue’s root cause.

These challenges can result in network troubles that last for weeks or even months. In North America alone, an IHS Markit report from 2016 estimated that network outages cost enterprises $700 billion a year in lost revenues and productivity.
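One common first step for surfacing outage precursors in that flood of raw data is statistical anomaly detection on telemetry streams. The sketch below is our own minimal illustration of the idea, not a description of Blue Planet's actual algorithms:

```python
from collections import deque
from statistics import mean, stdev

def detect_anomalies(samples, window=30, threshold=3.0):
    """Yield (index, value) for samples more than `threshold` standard
    deviations above the rolling mean of the previous `window` samples."""
    history = deque(maxlen=window)
    for i, value in enumerate(samples):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and (value - mu) / sigma > threshold:
                yield i, value                 # candidate outage precursor
        history.append(value)

# e.g. packet-loss readings: steady noise, then a pre-outage spike
readings = [0.1, 0.12, 0.09, 0.11] * 10 + [0.8]
print(list(detect_anomalies(readings)))        # flags the 0.8 spike at index 40
```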

To address these challenges, Blue Planet has today introduced a comprehensive solution built around AI and advanced analytics. more>

Related>

How European telcos are monitoring our online activity

The security and privacy of personal data are being jeopardized as Deep Packet Inspection is deployed by internet service providers.
By Katherine Barnett – Europe has not escaped the global move towards ‘surveillance capitalism’. Numerous pieces of legislation that put online freedoms and privacy at risk are under consideration; the UK’s Online Harms white paper is just one example.

The European Digital Rights (EDRi) organization recently discovered that European telcos were monitoring internet connections and traffic through a technique known as Deep Packet Inspection (DPI).

European telcos have so far escaped penalization for their use of DPI, on the grounds that it counts as ‘traffic management’. Under current net-neutrality law, it is technically allowed for purposes of network optimization—but its use for commercial or surveillance purposes is banned.

In January, however, the EDRi produced a report outlining how as many as 186 European ISPs had been violating this constraint, using DPI to adjust the pricing of certain data packages and to slow down internet services running over capacity. Alongside 45 other NGOs and academics, it is pushing for the use of DPI to be terminated, having sent an open letter to EU authorities warning of the dangers.

Deep Packet Inspection is a method of inspecting traffic sent across a user’s network. It allows an ISP to see the contents of unencrypted data packets and grants it the ability to reroute or block traffic.

Data packets sent over a network are conventionally filtered by examining the ‘header’ of each packet, meaning the content of data traveling over the network remains private. Packets work like letters, with simple packet filtering allowing ISPs to see only the ‘address’ on the envelope but not the contents.

DPI however gives ISPs the ability to ‘open the envelope’ and view the contents of data packets. It can also be used to block or completely reroute data.
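In code, the envelope analogy maps directly onto packet structure. The simplified sketch below (our illustration, using a hand-built IPv4/TCP packet) shows how shallow filtering stops at the headers while DPI reads on into the payload:

```python
import socket
import struct

def inspect(packet: bytes):
    # IPv4: version/IHL share byte 0; source/dest addresses sit at bytes 12-19
    ihl = (packet[0] & 0x0F) * 4                   # IP header length in bytes
    src, dst = packet[12:16], packet[16:20]
    tcp = packet[ihl:]
    sport, dport = struct.unpack("!HH", tcp[:4])   # TCP ports: first 4 bytes
    tcp_len = ((tcp[12] >> 4) & 0x0F) * 4          # TCP header length, byte 12

    # Conventional filtering stops here: only the 'address on the envelope'
    print(f"header:  {socket.inet_ntoa(src)}:{sport} -> "
          f"{socket.inet_ntoa(dst)}:{dport}")

    # Deep Packet Inspection 'opens the envelope' and reads the contents
    payload = tcp[tcp_len:]
    print(f"payload: {payload!r}")   # readable whenever traffic is unencrypted

# Hand-built IPv4 + TCP packet carrying an unencrypted HTTP request
ip_hdr = struct.pack("!BBHHHBBH4s4s", 0x45, 0, 0, 0, 0, 64, 6, 0,
                     socket.inet_aton("192.0.2.1"),
                     socket.inet_aton("198.51.100.7"))
tcp_hdr = struct.pack("!HHIIBBHHH", 49152, 80, 0, 0, 5 << 4, 0, 0, 0, 0)
inspect(ip_hdr + tcp_hdr + b"GET /private-page HTTP/1.1\r\n")
```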

Regulators have so far turned a blind eye to this blatant disregard for net-neutrality law and telcos are pushing for DPI to be fully legalized.

This sparks major concerns about user privacy and security, as DPI renders visible all unencrypted data sent across a user’s connection, allowing ISPs to see browsing activity. more>

Updates from Ciena

Learn about the technology behind Ciena’s WaveLogic 5
By Kacie Levy – If you are like me, your to-do list gets longer every day, so finding the time to stay up to date on industry trends can be a challenge. That is why we created Ciena’s Chalk Talk Video series. These videos give you a chance to spend a few minutes with our experts and learn more about the future of networking.

We recently introduced Ciena’s WaveLogic 5 to the market, our next-gen 800G-capable coherent optical chipset, which includes two distinct solutions to address the divergent requirements network operators and Internet Content Providers are encountering:

  • WaveLogic 5 Extreme: will deliver 800G of capacity over a single wavelength, tunable down to 200G, supporting customers who need maximum capacity and performance from their networks.
  • WaveLogic 5 Nano: will bring the strength of Ciena’s coherent optical technology and expertise to footprint-optimized 100G-400G solutions, targeting applications where space and power are the primary considerations.

As Ciena’s Scott McFeely said during the unveiling, there was a lot to unpack in the announcement.

So, in the Chalk Talk Videos below, Joe Shapiro, the product manager responsible for Ciena’s WaveLogic Coherent solutions, provides an overview of each WaveLogic 5 solution, its key technological features, and its benefits. more>

Related>