Tag Archives: Ciena

Updates from Ciena

Updated: 800G – nothing but the facts
If you have been following Ciena, you know 800G adoption is underway. With that comes a lot of interest and questions. Ciena’s Helen Xenos sat down to share insights from 800G customer deployments to help you understand the facts.
By Helen Xenos – 800G is a hot topic of discussion in the optical industry today – it’s everywhere! And as is customary when a new technology emerges, there are various opinions and speculation as to the value and expected rate of adoption, especially these days when lab access and live trials pose a unique challenge. Who has real product? Is anyone going to deploy 800G in the near term? Are there technological and operational barriers that still need to be overcome?

As the only vendor with commercially available 800G product shipping today (since April 2020), we are in the unique and fortunate position here at Ciena where we don’t need to speculate.

Curious to know the facts around 800G deployments?

In just over nine months of commercial availability of WaveLogic 5 Extreme, Ciena has shipped more than 6,000 coherent modems to over 75 customers around the globe, all of whom are actively deploying the technology in their networks. The rate of early technology adoption is impressive – more than twice as fast as the ramp of competitive 600G solutions, as can be seen from the Cignal AI graph below (source: Transport Applications Report).

In this blog, I’ll share details of these deployments, and insights behind the strong ramp, so you can cut through the hype and get to the facts about 800G. more>


Updates from Ciena

How quickly can you activate new MPLS services?
MPLS tunnels are the go-to technology to deliver network services. But provisioning and activation can take days or even weeks. Blue Planet’s Mitch Auster details how intelligent automation can solve the many complexities of MPLS service activation.
By Mitch Auster – When it comes to delivering high quality services between geographically distributed locations, providers across the world have a go-to technology they rely on – MPLS tunnels. Each service request from a customer comes with unique requirements – a bank may require gold priority paths with redundancy, a television network may demand temporary network connectivity to stream an event, a federal agency might want to send traffic excluding certain countries, or a customer could ask for a high-bandwidth, low-latency path for data traversing between headquarters and their data center.

MPLS service activation in weeks

The current approach for provisioning and activating an MPLS service with such unique customer requirements can take days or even weeks. Under the present mode of operation, a provider must have access to the current network topology of an ever-changing network, evaluate the performance metrics of each device, link and path between multiple source and destination pairs, use manual, legacy offline planning tools to compute the new path, and manually configure all the routers along the new path.
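The core path-computation step that automation takes over can be sketched roughly. The following is a minimal illustration, not Blue Planet's actual algorithm: a latency-weighted shortest-path search over a topology that honors an exclusion constraint (for example, routers in disallowed countries). All router names and link latencies are hypothetical.

```python
import heapq

def constrained_shortest_path(graph, src, dst, excluded=frozenset()):
    """Dijkstra over a latency-weighted topology, skipping excluded nodes.

    graph: {node: [(neighbor, latency_ms), ...]}
    excluded: nodes the computed path must avoid.
    Returns (path, total_latency) or (None, inf) if no path exists.
    """
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    visited = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in visited:
            continue
        visited.add(node)
        if node == dst:
            # Reconstruct the path by walking predecessors back to src.
            path = [dst]
            while path[-1] != src:
                path.append(prev[path[-1]])
            return list(reversed(path)), d
        for nbr, latency in graph.get(node, []):
            if nbr in excluded or nbr in visited:
                continue
            nd = d + latency
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heap.append((nd, nbr))
    return None, float("inf")

# Toy topology: hypothetical routers A-E with per-link latency in ms.
topology = {
    "A": [("B", 5), ("C", 12)],
    "B": [("D", 7), ("E", 20)],
    "C": [("D", 4)],
    "D": [("E", 3)],
}
# Exclude router B, e.g. because it sits in a disallowed region.
path, latency = constrained_shortest_path(topology, "A", "E", excluded={"B"})
# path == ["A", "C", "D", "E"], latency == 19
```

An automation platform would run this kind of computation against live topology and performance data, then push the resulting configuration to each router on the path, collapsing days of manual planning into minutes.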

Add multi-vendor devices or multiple autonomous systems to this mix, and the overall cost in terms of both OPEX and effort can be quite high. But with the competition waiting with improved offers, customers may not be willing to wait for weeks, or longer, while the provider searches for the most efficient path that meets customer constraints. more>

Updates from Ciena

Network Edge: Enterprises are ready for a more comprehensive approach – but can telcos deliver?
Over the last few years, enterprises have begun embracing more automation and virtualization in the wide area network (WAN), says Ciena’s Artur Kwiatkowski. As their IT architectures migrate towards (multi) cloud centricity, their network environment – and especially the network edge – must evolve to be more flexible and increasingly self-configurable by the end user. To accelerate this evolution, enterprises across many industries are deploying virtual network functions such as virtual routers, firewalls, and software-defined WAN (SD-WAN). For many of them, that last application has been the starting point towards a virtualized network environment.
Was SD-WAN overhyped?
By Artur Kwiatkowski – Originally, part of its promise was commercial: it offered a more attractive cost structure for the enterprise WAN. This was to result from increased reliance on cheap underlying network transport technologies (e.g. dedicated internet access services rather than MPLS). What has proved more transformational, however – especially in European markets where price deltas for underlay services were not that great in the first place – is the increased control that enterprises gain with SD-WAN bundles over the performance of their networks, and the ability to decouple the overlay (management and policy) function from the underlay (transport) function.

Very quickly, SD-WAN became a hyped (possibly even over-hyped) concept, and the vast majority of communications service providers (CSPs) active in the B2B space scrambled to pull together an SD-WAN market offering. In many cases, these boiled down to a managed service delivered by the SD-WAN providers / equipment vendors themselves, and then white-labelled by the telco as they were resold to the enterprise end user.

It also soon transpired that SD-WAN was not a one-size-fits-all application. As a result, the majority of larger CSPs today have multiple SD-WAN solutions in their product catalogue, aimed at various market segments, from small businesses to global enterprises. This is not a problem in itself, but many of them end up siloed and isolated in the context of the wider service portfolio. They also often rely on manual processes for operational aspects such as service turn-up. The resulting image of CSPs is that of a bevy of swans swimming upstream – looking distinguished and graceful above the waterline, all the while paddling frantically underneath where no one can see, just to keep moving forward.

Change seems to be on the horizon, however. more>


Updates from Ciena

For years we’ve been hearing that 2020 would be the year that 5G networks would begin to be deployed. Well, it’s finally here, and MNOs are indeed starting to roll out 5G services. But beyond new phones and RAN technology, it’s going to be those that embrace automation who will ultimately drive faster transitions to 5G. To that end, Blue Planet has unveiled new capabilities for 5G automation.
By Kailem Anderson – As the standard came together, 5G set some lofty expectations for the performance gains its networks would deliver to users over 4G. These included things like 10 to 100 times faster speeds, 1000 times the bandwidth, support for 10 to 100 times more devices, 99.999% availability, and latency as low as 1 millisecond.

This vastly improved speed, capacity and latency opens up all kinds of new use cases for mobile network operators (MNOs). The increase in users and use cases also means the number of network service connections required of 5G networks is unprecedented and, more importantly, the speed at which these services need to be created and managed, typically in a multi-vendor environment, is significantly faster than what today’s OSS, NMS, and manual processes can handle. This velocity and volume will affect the entire network lifecycle, including planning, designing and deploying services, and day-to-day operations. Automation will play a critical role in helping operators meet these challenges to speed the delivery of 5G networks and derive new revenues.

Finally, with 5G still an emerging technology, its associated standards are themselves still evolving. To keep pace with them, MNOs need a cloud-native 5G solution that is designed around openness and works in a multi-vendor network with no vendor lock-in.

As 5G scales, automation will, in turn, increasingly rely on AI and ML (machine learning) to fully automate some operational processes, including predicting situations like a network fault before it occurs and taking corrective actions before it impacts customers, or understanding when specific network resources are near capacity and scaling them up to meet the growing requirements of the services that rely on them. Of course, this type of AI-assisted operations is a topic I’ve been discussing for quite some time.
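The capacity-prediction idea above can be illustrated with a deliberately simple sketch, a stand-in for the far richer ML models an automation platform would actually apply: fit a linear trend to daily utilization samples and estimate how many days remain before a resource crosses a scale-up threshold. The link capacity, samples, and 80% threshold are all illustrative assumptions.

```python
def days_until_exhaustion(samples, capacity, threshold=0.8):
    """Fit a least-squares line to daily utilization samples and estimate
    how many days remain until usage crosses threshold * capacity.

    Returns None when usage is flat or declining (no exhaustion predicted).
    """
    n = len(samples)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(samples) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, samples))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    if slope <= 0:
        return None  # no upward trend, nothing to scale
    intercept = mean_y - slope * mean_x
    limit = threshold * capacity
    # Days from the most recent sample until the fitted line hits the limit.
    return max(0.0, (limit - intercept) / slope - (n - 1))

# Daily utilization in Gb/s on an assumed 100 Gb/s link, trending upward.
forecast = days_until_exhaustion([50, 54, 58, 62, 66], capacity=100)
# forecast == 3.5 days until the 80 Gb/s threshold is reached
```

An automated system would use a forecast like this to trigger scale-up (or an operator alert) before the services riding on that resource are affected, rather than after.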

The promises of 5G, automation and AI are great, but the path to get there is filled with many technical hurdles. Here on the Blue Planet team, we’ve been working hard to deliver an intelligent 5G automation solution that helps MNOs lessen the bumps. more>

Updates from Ciena

Cable operators – the move to edge compute
Almost 60% of cable executives surveyed by Broadband Success Partners said improved customer experience or enablement of new revenue streams is the most important driver at their company for moving to edge compute. Learn more about the research in this Q&A.
Ciena – The cable industry is deploying Distributed Access Architectures (DAA) and extending fiber closer to the customer as we move toward 10G. Does this mean the cable industry is well positioned for edge compute – moving compute and storage closer to the edge? What are the drivers, use cases, challenges, and investment areas as we evaluate moving to edge compute? These are some of the questions Broadband Success Partners discussed recently with executives at cable operators in North America.

We had an opportunity to further discuss the drivers, challenges, technology enablers and investment areas with David Strauss, Principal at Broadband Success Partners, and Fernando Villarruel, Chief Architect, MSO Practice at Ciena.

What insights did you get from cable executives regarding drivers to move to edge compute?

David: We asked executives in network engineering and business services what the top drivers are to move to edge compute – almost 60% noted either improved customer experience (29%) or enablement of new revenue streams (also 29%) as the most important driver at their company. For tier 1 operators the financial factors, new revenues and cost savings, were deemed most important. For tier 2 operators customer experience and scalability were identified as most important. Network engineering executives value all the drivers somewhat equally, while business services executives place a premium on customer experience and new revenues.

The reasons why these executives chose the driver they did are varied – ranging from “an improved customer experience due to lower latency for gaming and video optimization” to “choosing something that’s scalable is key so as to not augment later.” more>


Updates from Ciena

Adaptive Learning is the Future of Education. Are Education Networks Ready?
Educators are increasingly leaning on EdTech and Adaptive Learning tools that personalize and improve the student learning experience. Ciena’s Daniele Loffreda details the critical role the network plays in making these disruptive new learning tools a reality.
By Daniele Loffreda – As teachers and administrators strive to improve student performance and graduation rates, they’re increasingly leveraging new Educational Technology (EdTech) to deliver a higher quality learning experience. Digital applications such as streaming video, mixed-reality, gamification, and online global collaboration enable a “learning beyond the classroom walls” environment.

However, educators are quickly realizing that even with EdTech innovations, the traditional “one-size-fits-all” approach to education fails to make the grade. Student populations are increasingly diverse in terms of culture, location, economic background, and learning styles. Educators are increasingly aware that not everyone can absorb the lesson plan in the same way, and that teaching needs to be more personalized to the individual student. To provide a more personalized learning experience to students, while ensuring adherence to government performance standards, educators are turning to Adaptive Learning systems.

Adaptive Learning uses computer artificial intelligence algorithms that adjust the educational content to the student’s learning style and pace. Based upon a student’s reaction to content, algorithms detect patterns and respond in real-time with prompts, revisions, and interventions based upon the student’s unique needs and abilities. Combining Adaptive Learning platforms with predictive analytics and other EdTech applications helps to transform the learning experience for both the student and the teacher. more>
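The adjust-to-the-student loop described above can be sketched in miniature. Real Adaptive Learning platforms use far richer pattern-detection models; this toy rule, with illustrative thresholds and difficulty levels, only shows the basic shape: sustained success raises difficulty, sustained struggle lowers it and triggers a review intervention.

```python
def next_difficulty(level, recent_results, step=1, lo=1, hi=10):
    """Choose the next content difficulty from a window of recent answers.

    recent_results: list of booleans, True for a correct answer.
    Returns (new_level, action) where action is one of
    "advance", "review", or "continue".
    """
    ratio = sum(recent_results) / len(recent_results)
    if ratio >= 0.8:                      # mastering: raise difficulty
        return min(hi, level + step), "advance"
    if ratio <= 0.4:                      # struggling: ease off, revisit
        return max(lo, level - step), "review"
    return level, "continue"              # on pace: keep current level

# A student at level 5 answers 4 of the last 5 items correctly.
level, action = next_difficulty(5, [True, True, False, True, True])
# level == 6, action == "advance"
```

In a real platform this decision would run after every response, feeding prompts, revisions, and interventions back to the student in real time, and its outputs would also feed the predictive analytics mentioned above.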


Updates from Ciena

Why you need the highest level of trust in your encryption solution
In Europe, full certification from BSI is one of the top security accreditations you can get, and Ciena’s optical encryption solutions have achieved it – just one more reason why Ciena is the ideal partner to help you protect your in-flight data and streamline your compliance strategy, whether for the European General Data Protection Regulation (GDPR) or any other data regulation. Jürgen Hatheier, CTO EMEA at Ciena, explains why.
By Jürgen Hatheier – Several recent headlines about large GDPR-related financial penalties for data breach violations continue to spark massive investments in security compliance initiatives across all industries. However, these programs are sometimes limited to protecting data held within the organization – with little or no consideration for ‘in-flight’ data traveling between locations across the Wide Area Network (WAN).

This is no longer adequate, as large amounts of data are transported over high-capacity wavelengths across fiber optic networks and cybercriminals are exploiting any potential gaps within an organization’s security strategy. It’s not uncommon, for example, for hackers to use ‘wiretap’ devices to steal data as it travels over optical fiber connections.

The frequent lack of physical security makes this kind of attack relatively easy, allowing hackers to access fiber and install wiretaps in cabinets in the street, under manhole covers, and in other easy-to-breach locations. These wiretaps can also be left in place for long periods without being detected, leading to large quantities of data being stolen, with no indication of when the breach even started.

The only way to prevent the theft of data using wiretaps is to encrypt ‘in-flight’ data as it travels over WAN connections. The ability to do this effectively has now become a key requirement in tender processes for network rollouts, and is critical for protecting customers and their data in the GDPR era. more>


Updates from Ciena

Rethinking NaaS as a journey to openness and automation
NaaS can feel like an abstract concept, and various misconceptions abound on what it is and what is possible. But as Blue Planet’s Kailem Anderson explains, NaaS has measurable and quantifiable benefits that are achievable today.
By Kailem Anderson – What is Network as a Service (NaaS)? It’s a simple enough question, but there is a lot of confusion in the marketplace about the answer.

Some common misconceptions or myths about NaaS are that it is just a new way for Communications Service Providers (CSPs) to sell virtualized services to enterprises, that it’s only about operations support system transformation through open and programmable APIs, or that it means the same thing as software-defined networking (SDN).

Perhaps the biggest misconception, however, is that NaaS isn’t real – that it is a futuristic goal. While NaaS is, indeed, a ‘future state’ vision for CSPs, they can use it, and are using it, in production environments today.

I like to think of NaaS as an evolutionary journey toward a network, operations and business architecture that is open, agile and automated. Successful completion of this journey will result in digital transformation that allows CSPs to take back control of their networks, save on operational costs, increase innovation, accelerate time to market, and improve customer experience. more>

Updates from Ciena

Accelerate mission response with a simpler, Adaptive Network
Jim Westdorp, Ciena Government Chief Technologist, outlines how a holistic, end-to-end networking approach can help agencies meet growing digital and cybersecurity demands.
By Jim Westdorp – The rapid transition to remote work and constituent demands for improved user experiences are challenging government agencies to digitize services—from tax payments to employee benefits. At the same time, government databases are increasingly becoming major targets for individual and nation-backed attackers. Budgetary constraints and diminishing tech expertise only complicate matters as agencies struggle to balance cost- and performance-optimization alongside cyber resiliency.

So how can government agencies accelerate digital transformation, defend against hackers, and support legacy applications and complex infrastructures?

The answer: a network infrastructure that is simpler to manage. Modern IT and communications can enable automation, improve performance, and help assure cyber resiliency at a time when government agencies are under unprecedented pressure to deliver services quickly and securely.

It takes more than technology, though, to simplify a network. A foundational step in any modernization effort is to conduct an inventory of a network’s physical assets, from routers to servers, and determine both the network elements and attached management software used to construct it. more>
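The inventory step described above amounts to a simple data-modeling exercise: record each element and the management software attached to it, then group by manager to see where the operational silos are. The sketch below is a minimal illustration with made-up element and system names, not any particular agency's tooling.

```python
from collections import defaultdict

# Hypothetical inventory records: (element name, element type, manager).
assets = [
    ("rtr-dc1-01", "router", "vendor-a-nms"),
    ("rtr-dc1-02", "router", "vendor-a-nms"),
    ("srv-dc1-01", "server", "cmdb"),
    ("sw-edge-01", "switch", "vendor-b-ems"),
]

# Group network elements by the management system that controls them -
# a first cut at mapping the network to its attached management stack.
by_manager = defaultdict(list)
for name, kind, manager in assets:
    by_manager[manager].append(name)
```

Even a grouping this crude makes the fragmentation visible: every extra key in `by_manager` is another management silo that a simplification effort has to consolidate or integrate.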


Updates from Ciena

Can utilities have their multi-layered cake and eat it too?
Utilities are facing increasing bandwidth demands on their communications networks. Ciena’s Mitch Simcoe explains how modernizing networks to a private packet-optical fiber architecture can help utilities scale to support new smart grid applications.
By Mitch Simcoe – Utilities are increasingly in the eye of the storm these days. Whether it’s having to deal with hurricanes in the Gulf Coast over the last few months or wildfires on the West Coast, utilities have had to put more sensors out in the field to keep abreast of changing weather conditions and potential risks to their power grids. The increasing demand for utilities to show that they are carbon-free is also changing the way they generate and distribute energy. The one common denominator is that utilities have more data to collect and backhaul from their power grids, which is driving increasing demand on their communications networks.

Many utilities may not realize it, but recent advancements have resulted in several bandwidth-intensive applications and processes driving up demand on their networks:

  1. Video Surveillance
    Security continues to be top of mind for utilities, and security surveillance in the past has been more “after the fact”: video footage is stored locally at the substation and only accessed after a security breach. Today’s approach is to backhaul all security video footage to a centralized data center and apply artificial intelligence (AI) techniques to proactively determine if a security breach is occurring. In those cases, security personnel can be dispatched on site in near-real time. Each video camera at a substation can generate 9 Gigabytes of data per day, and a typical substation could have a dozen video cameras to surveil.
  2. Synchrophasors
    Prior to the big power outage of 2003 in the Northeast United States (where 50 million people lost power for two days), sensors on the power grid using SCADA (Supervisory Control and Data Acquisition) would sample the state of the grid about once every four seconds. This significant outage could have been avoided had the grid been sampling data more frequently. To address this, a device called a synchrophasor (not the Star Trek type!) was introduced, which samples the state of the grid 30 to 60 times per second. This has made the grid more reliable but produces significantly more data to backhaul and process. Each synchrophasor PMU (Phasor Measurement Unit) can generate 15 Gigabytes of data per day, and all of that must be backhauled to a central data center for analysis.
  3. Smart Meters
    In the US, over 50% of households are now serviced by a smart meter that measures your household’s power consumption every 15 minutes. Beyond their billing function, they help utilities track power consumption hotspots during peak usage. For a utility of 1 million households, which would be the middle range for most US Investor-owned Utilities (IOUs), this can generate 1 terabyte of data per day that needs to be backhauled to a central data center for processing.
  4. Internet of Things (IoT) devices
    These include what we mentioned earlier: weather sensors and sensors on power equipment to proactively identify issues. Smart thermostats in homes are another growing trend, which utilities are using to offer “on-demand” billing plans: you allow the utility to raise your thermostat during periods of peak usage in the hot summer months in exchange for a lower price per kWh.

For the first three categories above, a utility of 1 million households would face a daily data-backhaul requirement of 6 to 8 terabytes. With this amount of data to backhaul and process, it is no wonder utilities are exhausting the available capacity of their legacy communications networks.
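The per-source figures quoted above can be combined into a rough back-of-the-envelope total. The per-camera, per-PMU, and smart-meter volumes come from the article; the substation and PMU counts below are illustrative assumptions for a 1-million-household utility, not figures from the article.

```python
GB = 1            # work in gigabytes
TB = 1000 * GB

# Per-source volumes quoted above.
camera_gb_per_day = 9          # per video camera
cameras_per_substation = 12    # "a dozen video cameras" per substation
pmu_gb_per_day = 15            # per synchrophasor PMU
smart_meter_gb_per_day = 1 * TB  # per 1M households of smart meters

# Assumed fleet sizes (illustrative only):
substations = 50
pmus = 100

video = substations * cameras_per_substation * camera_gb_per_day  # 5,400 GB
synchrophasor = pmus * pmu_gb_per_day                             # 1,500 GB
meters = smart_meter_gb_per_day                                   # 1,000 GB

total_tb = (video + synchrophasor + meters) / TB
# total_tb == 7.9, inside the 6-8 TB/day range cited above
```

Under these assumed fleet sizes, the total lands at 7.9 TB per day, consistent with the 6-8 TB range; video surveillance dominates, which is why centralized AI analysis of camera footage is such a large driver of backhaul demand.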

The Information Technology (IT) group in a utility is tasked with managing many of these new applications associated with a smarter grid. Some utilities have for many years leased copper-based TDM services from service providers for smart grid, IT, and substation traffic. The cost of this approach has been onerous and only grows as service providers migrate their networks away from copper toward fiber and wireless options. more>