Updates from Ciena

AI Ops: Let the data talk
The catalysts and ROI of AI-powered network analytics for automated operations were the focus of discussion for service providers at the recent FutureNet conference in London. Blue Planet’s Marie Fiala details the conversation.
By Marie Fiala – Do we need perfect data? Or is ‘good enough’ data good enough? Certainly, there is a need to find a pragmatic approach or else one could get stalled in analysis-paralysis. Is closed-loop automation the end goal? Or is human-guided open loop automation desired? If the quality of data defines the quality of the process, then for closed-loop automation of critical business processes, one needs near-perfect data. Is that achievable?

These issues were discussed and debated at last week's FutureNet conference in London, where the show focused on solving network operators' toughest challenges. Industry presenters and panelists stayed true to the themes of AI and automation, all touting the necessity of these interlinked software technologies, yet there were varied opinions on approaches. Network and service providers such as BT, Colt, Deutsche Telekom, KPN, Orange, Telecom Italia, Telefonica, Telenor, Telia, Telus, Türk Telekom, and Vodafone weighed in on the discussion.

On one point, most service providers were in agreement: there is a need to identify a specific business use case with measurable ROI, as an initial validation point when introducing AI-powered analytics into operations. more>
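The open-loop versus closed-loop question above can be made concrete. The following is a minimal, hypothetical sketch (not any Blue Planet product logic) of gating automated remediation on a data-quality score: near-perfect data closes the loop automatically, while merely "good enough" data keeps a human in the loop. The function name, threshold, and return strings are all illustrative assumptions.

```python
# Hypothetical sketch: gate closed-loop remediation on data quality.
# Names and thresholds are illustrative, not from any vendor product.

def remediate(anomaly: str, data_quality_score: float,
              quality_threshold: float = 0.99) -> str:
    """Route an anomaly to closed-loop or human-guided open-loop handling."""
    if data_quality_score >= quality_threshold:
        # Near-perfect data: safe to close the loop automatically.
        return f"auto-remediate:{anomaly}"
    # 'Good enough' data: keep a human in the loop.
    return f"escalate-to-operator:{anomaly}"

print(remediate("link-flap", 0.995))
print(remediate("link-flap", 0.90))
```

In practice the quality score itself would come from data-validation checks (completeness, freshness, consistency), which is exactly where the "perfect versus good enough" debate bites.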

Related>

All the ways recycling is broken—and how to fix them

You may throw a plastic container in the recycling bin and assume it’s going to easily become a new item. But every step of our recycling system—from product design to collection to sorting—has major flaws. Fortunately, promising technology is starting to come online that could revolutionize the process.
By Adele Peters – You may have read that there’s a recycling crisis in the U.S. After years of accepting our used plastic and cardboard, China now won’t take it, which often means there is no place for it to go. Some city recycling programs—unable to find other buyers—have quietly started sending recyclables to incinerators or landfills, news that could make anyone question the point of separating your trash at all.

Each year, by one estimate, Americans throw out around 22 million tons of products that could have been recycled. Tens of millions of homes don’t have access to recycling; for those that do, everything from broken blenders to old clothing still ends up in the trash. If you drop an empty package in a recycling bin and it’s trucked off to a sorting facility, that doesn’t necessarily guarantee it will be recycled. You might have unwittingly tossed something that your local recycling service doesn’t accept, or the package might have been designed in a way that makes it unrecyclable.

Some parts of the system do work. The aluminum in a beer can, for example, can easily be made into new beer cans, over and over again. But a plastic package might be chopped up, melted, mixed with other types of plastic, and “downcycled” into a lower-quality material that can only be used for certain products, like park benches or black plastic planters.

For more than two decades, while the U.S. was sending much of its paper and plastic trash to China, the bales were often so poorly sorted that they contained garbage. The system never extracted the full value from those materials.

When a truck picks up recyclables from curbside bins, it takes them to sorting facilities. Inside these centers, called "MRFs" or materials recycling facilities, people work with automated equipment to sort through the detritus of everyday life. Trucks dump mixed materials into the facility, where they're loaded onto a conveyor belt; typically, in a first step, people standing next to the machine quickly pull out trash and materials like plastic bags that can jam equipment.

As materials move through a facility, the system uses gravity, screens, filters, and other techniques to separate out paper, metal, glass, and plastics; optical sorting equipment identifies each type of plastic. more>

How digital technology is destroying our freedom

“We’re being steamrolled by our devices” —Douglas Rushkoff
By Sean Illing – There’s a whole genre of literature called “technological utopianism.” It’s an old idea, but it reemerged in the early days of the internet. The core belief is that the world will become happier and freer as science and technology develop.

The role of the internet and social media in everything from the spread of terrorist propaganda to the rise of authoritarianism has dampened much of the enthusiasm about technology, but the spirit of techno-utopianism lives on, especially in places like Silicon Valley.

Douglas Rushkoff, a media theorist at Queens College in New York, is the latest to push back against the notion that technology is driving social progress. His new book, Team Human, argues that digital technology in particular is eroding human freedom and destroying communities.

We’re social creatures, Rushkoff writes in his book, yet we live in a consumer democracy that restricts human connection and stokes “whatever appetites guarantee the greatest profit.” If we want to reestablish a sense of community in this digital world, he argues, we’ll have to become conscious users of our technology — not “passive objects” as we are now.

But what does that mean in practical terms? Technology is everywhere, and we’re all more or less dependent upon it — so how do we escape the pitfalls? more>

How to govern a digitally networked world

Because the internet is a network of networks, its governing structures should be too. The world needs a digital co-governance order that engages public, civic and private leaders.
By Anne-Marie Slaughter and Fadi Chehadé – Governments built the current systems and institutions of international cooperation to address 19th- and 20th-century problems. But in today’s complex and fast-paced digital world, these structures cannot operate at ‘internet speed’.

Recognizing this, the United Nations secretary-general, António Guterres, last year assembled a high-level panel—co-chaired by Melinda Gates and the Alibaba co-founder Jack Ma—to propose ways to strengthen digital governance and cooperation. (Fadi Chehadé, co-author of this article, is also a member.) It is hoped that the panel’s final report, expected in June, will represent a significant step forward in managing the potential and risks of digital technologies.

Digital governance can mean many things, including the governance of everything in the physical world by digital means. We take it to mean the governance of the technology sector itself, and the specific issues raised by the collision of the digital and physical worlds (although digital technology and its close cousin, artificial intelligence, will soon permeate every sector).

Because the internet is a network of networks, its governing structures should be, too. Whereas we once imagined that a single institution could govern global security or the international monetary system, that is not practical in the digital world. No group of governments, and certainly no single government acting alone, can perform this task.

Instead, we need a digital co-governance order that engages public, civic and private leaders on the basis of three principles of participation.

First, governments must govern alongside the private and civic sectors in a more collaborative, dynamic and agile way.

Second, customers and users of digital technologies and platforms must learn how to embrace their responsibilities and assert their rights.

Third, businesses must fulfill their responsibilities to all of their stakeholders, not just shareholders. more>

Updates from Ciena

Protecting your business from cyber threats
The phone rings — there’s been a breach. Ciena’s chief security architect Jim Carnes explains how to integrate security into each aspect of your business to mitigate this stressor – and stop fearing that call.

By Jim Carnes – It’s Friday afternoon (it always happens on Friday afternoon) and the phone rings — there’s a breach. Your internet provider has called and malware associated with the latest botnet has been detected coming from your corporate network. The incident response plans are triggered and everyone goes into high alert, looking for the source.

The common thought trajectory goes something like: How could this happen? We use the latest and greatest security products. Did someone open a phishing email? Did a hacker breach our firewall or was a vendor compromised? There goes my weekend.

How can we stop fearing that Friday afternoon call?

Integrating security into each aspect of your business could mitigate this stressor. When people, processes, inventory and technology are coordinated, the fear and uncertainty of security breaches are replaced with straightforward and seamless responses that protect your Friday evening dinner plans.

The conversation should always begin with your business. You need to understand the processes, the people and the vendor and partner relationships. Understanding how the critical aspects of the company function and interact will often point to gaps in security.

Are the tools that facilitate secure business processes in place? Look for:

  • Single-sign solutions to ease integration of people and technology
  • Multi-factor authentication solutions that ease the password management burden on users (according to the 2017 Verizon DBIR, compromised passwords factor into nearly half of organizational breaches)
  • Product suites that integrate business processes and technology solutions
  • Secure supply chains that enumerate the risks to both hardware and software solutions while protecting them (a white paper published by the SANS Institute offers guidance on combating supply chain cyber risk)

Whether your business is delivering software, hardware or services, the development of those solutions should include security from the start. The ability to clearly articulate the purpose of the system, how it will be used, who will be using it and what value it provides will help begin the conversation. Articulating these key factors will help define the threat environment, the adversaries and the controls necessary to mitigate the attacks.

Mitigations will therefore have context and be able to address real threats, rather than generic ones. more>

Related>

Updates from ITU

What do ‘AI for Social Good’ projects need? Here are 7 key components.
By Anna Bethke – At their core, ‘AI for Social Good’ projects use artificial intelligence (AI) hardware and software technologies to positively impact the well-being of people, animals or the planet – and they span most, if not all, of the United Nations Sustainable Development Goals (SDGs).

The range of potential projects continues to grow as the AI community advances our technology capability and better understands the problems being faced.

Our team of AI researchers at Intel achieved success by working with partners to understand the problems, collecting the appropriate data, retraining algorithms, and molding them into a practical solution.

At its core, an AI for Social Good project requires the following elements:

  1. A problem to solve, such as improving water quality, tracking endangered species, or diagnosing tumors.
  2. Partners to work together in defining the most complete view of the challenges and possible solutions.
  3. Data with features that represent the problem, accurately labeled, with privacy maintained.
  4. Compute power that scales for both training and inference, no matter the size and type of data, or where it lives. An example of hardware choice is at ai.intel.com/hardware.
  5. Algorithm development, which is the fun part! There are many ways to solve a problem, from a simple logistic regression algorithm to complex neural networks. Algorithms match the problem, type of data, implementation method, and more.
  6. Testing to ensure the system works in every way we think it should, like driving a car in rain, snow, or sleet over a variety of paved and unpaved surfaces. We want to test for every scenario to prevent unanticipated failures.
  7. Real-world deployment, which is a critical and complicated step that should be considered right from the start. Tested solutions need a scalable implementation system in the real world, or their benefits risk never seeing the light of day.
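Item 5 above names logistic regression as the simple end of the algorithm spectrum. As an illustration of that baseline, here is a self-contained sketch that fits a logistic regression classifier by batch gradient descent on synthetic data (the "water quality" framing and all numbers are made-up stand-ins, not from Intel's projects):

```python
import math
import random

random.seed(0)

# Toy data: two hypothetical water-sample features; label 1 ("unsafe")
# when x1 + 2*x2 > 0. Entirely synthetic, for illustration only.
data = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(200)]
labels = [1.0 if x1 + 2 * x2 > 0 else 0.0 for x1, x2 in data]

w1 = w2 = b = 0.0
lr = 0.5
for _ in range(500):  # batch gradient descent on the log-loss
    g1 = g2 = gb = 0.0
    for (x1, x2), y in zip(data, labels):
        p = 1 / (1 + math.exp(-(w1 * x1 + w2 * x2 + b)))  # sigmoid
        g1 += (p - y) * x1
        g2 += (p - y) * x2
        gb += (p - y)
    n = len(data)
    w1 -= lr * g1 / n
    w2 -= lr * g2 / n
    b -= lr * gb / n

accuracy = sum(
    (w1 * x1 + w2 * x2 + b > 0) == (y == 1.0)
    for (x1, x2), y in zip(data, labels)
) / len(data)
print(f"training accuracy: {accuracy:.2f}")
```

The same scaffolding (data, training loop, evaluation) is what items 3, 5 and 6 in the list describe; swapping the model for a neural network changes the middle step, not the shape of the project.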

At the end of May, Intel AI travels to Geneva, Switzerland, for the UN’s AI for Good Global Summit hosted by ITU and will speak to each of these elements in a hands-on workshop. more>

Related>

Updates from Ciena

The benefits of an integrated C&L-band photonic line system
Network providers are looking for new alternatives to unlock additional network capacity. Ciena’s Kent Jordan explains how upgrading to the L-band can help – if done in the right way.
By Kent Jordan – The photonic layer is the foundation for high capacity networks. Whether the network application is to increase connectivity between data centers, deliver bandwidth-intensive content, or to move business applications into the cloud, the photonic layer provides the mechanism to efficiently light the fiber by assigning and routing wavelengths across the optical spectrum. However, today’s photonic layer systems utilize only a portion of the usable spectrum within the fiber, and operators are increasingly looking at expansion into the L-band to increase capacity.

There are a few factors driving the desire for L-band. First, and foremost, is traffic demand. Networks with high bandwidth applications and sustained bandwidth growth are quickly faced with capacity exhaustion. Once existing capacity is consumed, lighting additional fiber pairs is required. If the cost of laying or leasing new fiber is too prohibitive, then alternatives to unlocking additional capacity are needed.

The L-band is one such solution, and it can be used to double the fiber capacity. But, for operators to consider deploying L-band solutions, they must be simple to plan and deploy, and the upgrade to L-band must not impact existing traffic in the C-band.
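The "double the capacity" claim follows from simple spectrum arithmetic. Below is a back-of-envelope sketch; the band widths, 50 GHz grid and per-channel rate are illustrative assumptions, not Ciena specifications:

```python
# Back-of-envelope sketch of why adding the L-band roughly doubles fiber
# capacity. All figures below are illustrative assumptions, not vendor specs.

C_BAND_HZ = 4.8e12         # assumed usable C-band spectrum (~4.8 THz)
L_BAND_HZ = 4.8e12         # assumed L-band spectrum of similar width
CHANNEL_SPACING_HZ = 50e9  # assumed 50 GHz fixed-grid channel spacing
GBPS_PER_CHANNEL = 200     # assumed data rate per wavelength

def band_capacity_gbps(band_hz: float) -> int:
    """Capacity of one band: channel count times per-channel rate."""
    channels = int(band_hz // CHANNEL_SPACING_HZ)
    return channels * GBPS_PER_CHANNEL

c_only = band_capacity_gbps(C_BAND_HZ)
c_plus_l = c_only + band_capacity_gbps(L_BAND_HZ)
print(f"C-band only: {c_only / 1000:.1f} Tb/s")
print(f"C + L bands: {c_plus_l / 1000:.1f} Tb/s")
```

Under these assumptions the C-band alone carries 19.2 Tb/s and C plus L carries 38.4 Tb/s: the same fiber, twice the capacity, which is exactly the trade-off against lighting new fiber pairs.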

Building the foundation for a scalable network infrastructure isn’t just about knowing what building blocks to use. It also includes selecting the appropriate architecture and understanding how the pieces fit together, so when it is time to increase capacity, there aren’t any surprises, performance hits, or suboptimal capacity limits. more>

Related>

Updates from Chicago Booth

How local productivity growth affects workers near and far
One city’s boom can be felt across a nation
Chicago Booth – When big cities experience an economic boom, you expect an upsurge in wages and growth in those areas. But there’s some nuance: according to Chicago Booth’s Richard Hornbeck and University of California at Berkeley’s Enrico Moretti, one area’s surge particularly benefits low-skilled workers locally—and high-skilled workers elsewhere.

Using total factor productivity (TFP) as a measure of local productivity growth, Hornbeck and Moretti analyzed two decades of data from major US cities to quantify the direct effects on people living in booming cities and the indirect effects on people elsewhere. Allowing for trade-offs between salary and cost-of-living increases, as well as unequal distribution of benefits across different groups, the researchers find that low-skilled workers gained the most from local productivity growth.
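For readers unfamiliar with TFP: it is conventionally measured as the Solow residual, the part of output growth not explained by growth in capital and labor inputs. The sketch below illustrates that calculation under a Cobb-Douglas production function; all numbers are invented for illustration and are not from Hornbeck and Moretti's data.

```python
# Illustrative Solow-residual calculation of total factor productivity (TFP)
# under a Cobb-Douglas production function: Y = A * K^alpha * L^(1 - alpha).
# All numbers are made up for illustration.

def tfp(output: float, capital: float, labor: float, alpha: float = 0.33) -> float:
    """Back out TFP (A) as the residual: A = Y / (K^alpha * L^(1 - alpha))."""
    return output / (capital ** alpha * labor ** (1 - alpha))

# A hypothetical city in two periods: output grows faster than inputs,
# so measured TFP rises.
a_before = tfp(output=100.0, capital=300.0, labor=50.0)
a_after = tfp(output=130.0, capital=320.0, labor=52.0)
print(f"TFP growth: {(a_after / a_before - 1) * 100:.1f}%")
```

A city whose TFP rises this way is the "boom" in the study; the paper's contribution is tracing who captures the gains, locally and in other cities.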

But gains extended further afield: a boom in San Diego or Los Angeles, say, was also felt in other cities. And high-skilled workers gained more from productivity growth in other cities. more>

Related>

Updates from Siemens

Perceptible differences that drive top-line growth
Siemens – 150 million times a day…

…someone, somewhere in the world, chooses a Unilever product.

Unilever’s brand portfolio spans 14 categories of home, personal care and food products and includes world favorites such as Lipton, Knorr, Dove and Omo. The company employs 179,000 people in 100 countries worldwide. Its products are sold in the Americas, Europe and Asia/Africa in roughly equal distribution.

Innovation is critical to sustaining Unilever’s growth. “We see product innovation as one of the key drivers of top-line growth,” says Huw Evans, R&D director of information in Unilever’s Home and Personal Care Division. Unilever defines product innovation this way:

“Product innovation means providing the consumer with a product that delivers a perceivable benefit that is differentiated from those of our competitors and that differentiation drives the choice to purchase and use that product,” explains Evans.

“You can change products to improve their price differentials, for example, but if the consumer is not really experiencing a difference, then we wouldn’t classify that as innovation. Innovation is about consumer-perceptible benefits that drive choice. To help achieve this Unilever invests €1 billion every year in research and development, which includes support for five major laboratories around the world that explore new thinking and techniques to help develop our products.” more>

Related>

Updates from Datacenter.com

The staffing challenge in data center market becomes a quality opportunity
The data center industry is growing rapidly. The number of vacancies for specialized data center personnel in the Amsterdam area, for example, is rising, and a shortage of skilled data center personnel is expected: a staffing challenge. How can the data center market respond? Will it result in decreasing quality of data center services? We believe it is an opportunity to improve.

datacenter.com – The data center industry is expected to grow rapidly, resulting in a shortage of skilled IT personnel.

Almost all data centers have plans to expand, and cloud providers are considering or starting to operate their own data centers. As both the number and the size of these companies grow, more specialists are needed to design, maintain and operate these high-tech buildings and services. The employees that are hardest to find are those with broad knowledge of power, cooling and construction. And because of this shortage, skilled employees become more expensive.

Instead of trying to do everything yourself, you can rely on the experts. The companies that build technical infrastructures and buildings are the ones that (mostly) have in-depth knowledge within their field of expertise. Instead of engineering your data center with a few multi-talented employees and asking the experts to fill in the blanks, let the experts do the design, with a project manager in the field connecting them. more>

Related>