Category Archives: Broadband

Investing in People to Build Human Capital

World Bank – Scientific and technological advances are transforming lives: they are even helping poorer countries close the gap with rich countries in life expectancy. But poorer countries still face tremendous challenges: almost a quarter of children under five are malnourished, and 60 percent of primary school students are failing to achieve even a rudimentary education.

In fact, more than 260 million children and youth in poorer countries are receiving no education at all.

“Human capital” – the potential of individuals – is going to be the most important long-term investment any country can make for its people’s future prosperity and quality of life.

In many countries, the workforce is unprepared for the future that is fast unfolding.

This is a key insight from the World Bank’s forthcoming World Development Report 2019: The Changing Nature of Work. The frontier for skills is moving faster than ever before. Countries need to gear up now to prepare their workforces for the tremendous challenges and opportunities that are being driven by technological change. more>

Updates from Autodesk

ElumTools™ is a fully integrated lighting calculation Add-in for Autodesk® Revit®.
ElumTools performs accurate point-by-point lighting calculations for both interior and exterior lighting directly within the Revit model, using our industry-standard radiosity calculation engine.

ElumTools leverages existing Revit geometry and luminaire families, using the Revit spatial elements Rooms, Spaces, Regions, and Areas as calculation boundaries.

Surface reflectance and color can be interpreted from the Revit material properties and mapped to more suitable values if necessary. Reflectances can also be assigned by Revit Category to encompass, for example, all ceilings, walls, and floors.

ElumTools uses lighting fixture families already present in Revit, and photometric file associations can easily be created where missing. IES files from a wide variety of prominent manufacturers are available with a few clicks from our cloud-based “Instabase.”

Each calculation is presented in an interactive visualization window with complete 3D walk-through capability. more>

Updates from Ciena

The Adaptive Network: Why automation alone isn’t enough
By Keri Gilder – Just imagine that your heart rate was at 100 beats per minute instead of 70. This could be a warning sign that you are on the verge of having a heart attack.

If your doctor were to get this information in real time, they could check the readings against your medical records, see that this is completely out of the norm, and warn you to seek medical assistance immediately.

However, if your personal trainer received that same information, would they reach the same conclusion as your doctor? Your trainer has access to a different database, which might show your resting heart rate as well as the rate during high-intensity training. Knowing that you are likely exercising, they would instead conclude that there is no need to go to the hospital after all.

This clearly demonstrates that just accepting raw data without filtering and proper analysis is no longer good enough and can potentially have serious repercussions. Instead, it is critical that we have diversity of thought when it comes to how we interpret data.

This is not just true for our health or other day-to-day scenarios, but can also be applied to the communication networks that carry and house our information. more>

Related>

Where Did Qualcomm Go Wrong?

By Bolaji Ojo – It’s a justifiable question. The Qualcomm–NXP trip was an expensive sortie: Qualcomm has paid $2 billion in mandatory break-off fees to NXP, but the bill for the hidden costs may be much higher. For nearly two years, the communications IC and IP supplier and its target endured prolonged uncertainties. Even now, the spasms from customer disruptions remain strong while many employees, though heaving a sigh of relief, must figure out where they truly belong in the enterprise.

Qualcomm is moving on resolutely from the NXP debacle. It must. However, the implications and lessons — if any — are industry-wide. One of the largest acquisitions in the history of the semiconductor industry foundered because of opposition from various fronts, including customers who might have benefited from it. Simply dumping the blame on nebulous factors and faceless regulators will result in the industry learning nothing from the experience. Perhaps the transaction was destined to fail. Perhaps it could have been managed better, and to a successful close. A thorough assessment of why this deal collapsed would offer lessons that can be applied to future deals.

There are no signs that Qualcomm will conduct a detailed analysis of why and how the bid unraveled. It is easier — again — to simply toss more money at stakeholders and move on. NXP’s management and shareholders who had tendered their equity could slake their thirst with $2 billion in Qualcomm’s money. more>

Supercomputers

By Laura Panjwani – High-performance computers (HPC), also known as supercomputers, give scientists the power to solve extremely complex or data-intensive problems by concentrating the processing power of multiple parallel computers.

Performance of a supercomputer is measured in floating-point operations per second (FLOPS) instead of million instructions per second (MIPS), the measurement used in conventional computing.
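The FLOPS metric above can be illustrated with a minimal sketch: time a fixed number of floating-point operations and divide by the elapsed time. The function name and loop here are my own illustration, not from the article, and a pure-Python loop measures far below a machine's true peak (real HPC codes use vectorized, parallel kernels).

```python
import time

# Illustrative sketch (not a rigorous benchmark): estimate sustained
# floating-point throughput by timing a loop of multiply-adds.
def estimate_flops(n=1_000_000):
    x = 0.0
    start = time.perf_counter()
    for i in range(n):
        x += 1.000001 * i        # one multiply + one add per iteration
    elapsed = time.perf_counter() - start
    return 2 * n / elapsed       # floating-point operations per second

rate = estimate_flops()
print(f"~{rate / 1e6:.0f} MFLOPS in pure Python")
```

For scale, a top supercomputer is measured in petaFLOPS (10^15), many orders of magnitude beyond what a single interpreted loop can sustain.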

The technology has a plethora of applications—including quantum mechanics, climate research, oil and gas exploration, chemistry, aerospace and automotive technologies, and much more.

In addition to environmental applications, supercomputers are also key to many up-and-coming technologies, including autonomous vehicles. In our article, “Using Deep Learning, AI Supercomputing, NVIDIA Works to Make Fully Self-Driving Cars a Reality,” we highlighted Xavier, a complete system-on-chip (SoC) that integrates a new graphics processing unit (GPU) architecture called Volta, a custom eight-core CPU architecture, and a new computer vision accelerator. It features 9 billion transistors and a processor that will deliver 30 trillion operations per second (TOPS) of performance, while consuming only 30 watts of power.

This technology is the most complex SoC ever created and is a key part of the NVIDIA DRIVE Pegasus AI computing platform, the world’s first AI car supercomputer designed for fully autonomous Level 5 robotaxis. more>

Updates from Datacenter.com

Cabinet airflow management done right
By Hans Vreeburg – Let’s start with some basic understanding of airflows within data centers. Nowadays almost all data centers apply hot and cold corridors to optimize cooling. In the ideal situation, cold air goes straight to the servers’ inlets, and the hot air exiting the servers is returned directly to the cooling unit. This setup enables systems to run at the highest possible efficiency, using the least amount of power. The cooling setup has a big influence on the PUE (Power Usage Effectiveness): a lower PUE means lower total energy consumption for the data center, which indirectly benefits the environment and lowers OPEX. Could a small gap in your server rack really have that much influence?
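The PUE metric mentioned above has a simple definition: total facility power divided by the power drawn by the IT equipment alone. A minimal sketch, with illustrative numbers of my own (not from the article):

```python
# PUE (Power Usage Effectiveness) = total facility power / IT equipment power.
# A perfect facility approaches 1.0; everything above that is cooling,
# power distribution, lighting, and other overhead.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

# Example: 1,300 kW total draw for 1,000 kW of IT load means 0.3 kW of
# overhead for every kW of compute.
print(pue(1300.0, 1000.0))  # → 1.3
```

This is why airflow matters: wasted cold air shows up directly in the numerator as extra cooling energy, pushing PUE up.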

As said above, the ideal setup is cold air entering the servers while hot air exits. Gaps can force the cooling system to supply more cold air than the servers actually require. Think of it as a large water pipe that normally carries a specific amount of water: once you make holes in the pipe, you must pump in more water to get the same amount out at the end. more>
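The leaky-pipe analogy can be put in numbers: if some fraction of the supplied cold air bypasses the servers through gaps, the supply must grow proportionally to deliver the same airflow to the inlets. The function and figures below are my own illustration of that relationship, not from the article.

```python
# Toy model of the leaky-pipe analogy: with a given bypass fraction,
# the cooling units must supply demand / (1 - bypass) to get the full
# demand to the server inlets.
def required_supply(server_demand_cfm: float, bypass_fraction: float) -> float:
    if not 0.0 <= bypass_fraction < 1.0:
        raise ValueError("bypass fraction must be in [0, 1)")
    return server_demand_cfm / (1.0 - bypass_fraction)

# 10,000 CFM of server demand with 15% of cold air lost through gaps:
print(round(required_supply(10_000, 0.15)))  # → 11765
```

So a modest 15% leak already forces nearly 18% more airflow, and with it more fan and cooling energy.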

Related>

The Western Illusion of Chinese Innovation

By Zhang Jun – In the West, many economists and observers now portray China as a fierce competitor for global technological supremacy. They believe that the Chinese state’s capacity is enabling the country, through top-down industrial policies, to stand virtually shoulder-to-shoulder with Europe and the US.

This is a serious misrepresentation.

While it is true that digital technologies are transforming China’s economy, this reflects the implementation of mobile-Internet-enabled business models more than the development of cutting-edge technologies, and it affects consumption patterns more than, say, manufacturing.

In fact, Western observers – not just the media, but also academics and government leaders, including US President Donald Trump – have fundamentally misunderstood the nature and exaggerated the role of China’s policies for developing strategic and high-tech industries. Contrary to popular belief, these policies do little more than help lower the entry cost for firms and enhance competition. In fact, such policies encourage excessive entry, and the resulting competition and lack of protection for existing firms have been constantly criticized in China. Therefore, insofar as China relies on effective industrial policies, they do not create much unfairness in terms of global rules.

Clearly, there is a big difference between applying digital technologies to consumer-oriented business models and becoming a world leader in developing and producing hard technology. more>

How to govern AI to make it a force for good

In the interview, Gasser identifies three things policymakers and regulators should consider when developing strategies for dealing with emerging technologies like AI.

Urs Gasser – “Everyone is talking about Artificial Intelligence and its many different applications, whether it’s self-driving cars or personal assistance on the cell phone or AI in health,” he says. “It raises all sorts of governance questions, questions about how these technologies should be regulated to mitigate some of the risks but also, of course, to embrace the opportunities.”

One of the largest challenges to AI is its complexity, which results in a divide between the knowledge of technologists and that of the policymakers and regulators tasked to address it, Gasser says.

“There is actually a relatively small group of people who understand the technology, and there are potentially a very large population affected by the technology,” he says.

This information asymmetry requires a concerted effort to increase education and awareness, he says.

“How do we train the next generation of leaders who are fluent enough to speak both languages and understand engineering enough as well as the world policy and law enough and ethics, importantly, to make these decisions about governance of AI?”

Another challenge is to ensure that new technologies benefit all people in the same way, Gasser says.

Increasing inclusivity requires efforts on the infrastructural level to expand connectivity and also on the data level to provide a “data commons” that is representative of all people, he says. more>

AI and quantum computing: The third and fourth exponentials

By Pete Singer – Dr. John E. Kelly, III, Senior Vice President, Cognitive Solutions and IBM Research, with 40 years of experience in the industry, recalled how the first era of computing began with mechanical computers 100 years ago and then transitioned into the programmable era of computing.

In 1980, Kelly said, “we were trying to stack two 16-kilobit DRAMs to get a 32-kilobit stack and we were trying to cram a thousand transistors into a microprocessor.” Microprocessors today have 15 billion transistors. “It’s been a heck of a ride,” he said.

A third exponential is now upon us, Kelly said. “The core of this exponential is that data is doubling every 12 to 18 months. In fact, in some industries like healthcare, data is doubling every six months,” he said.
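Kelly's doubling-period figures compound quickly, which is easy to check with a line of arithmetic: doubling every t months multiplies volume by 2^(m/t) after m months. The helper below is my own illustration of that calculation, not from the article.

```python
# Growth factor for data doubling every `doubling_period_months`,
# evaluated after `months` have elapsed: 2 ** (months / period).
def growth_factor(months: float, doubling_period_months: float) -> float:
    return 2.0 ** (months / doubling_period_months)

# Over five years (60 months):
print(round(growth_factor(60, 18)))  # ~10x at Kelly's 18-month doubling
print(round(growth_factor(60, 6)))   # 1024x at the 6-month healthcare pace
```

The two orders of magnitude between those figures is the gap between "manageable" and "impossible without machine help", which motivates Kelly's next point.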

The challenge is that the data is useless unless it can be analyzed. “Our computers are lousy in dealing with that large unstructured data and frankly there aren’t enough programmers in the world to deal with that explosion of data and extract value,” Kelly said. “The only way forward is through the use of machine learning and artificial intelligence to extract insights from that data.”

Quantum computing, which Kelly described as a fourth exponential, is also coming, and it will in turn dwarf all of the previous ones. “Beyond AI, this is going to be the most important thing I’ve ever seen in my career. Quantum computing is a complete game changer,” he said. more>

Guidelines to Achieve Digital Transformation

GSR-18 BEST PRACTICE GUIDELINES ON NEW REGULATORY FRONTIERS TO ACHIEVE DIGITAL TRANSFORMATION
itu.int – Digitization is increasingly and fundamentally changing societies and economies and disrupting many sectors in what has been termed the 4th Industrial Revolution. Meanwhile, ICT regulation has evolved globally over the past ten years and has experienced steady transformation.

As regulators, we need to keep pace with advances in technology, address the new regulatory frontiers and create the foundation upon which digital transformation can achieve its full potential. Being prepared for digital transformation and emerging technologies such as Artificial Intelligence (AI), the Internet of Things (IoT), Machine to Machine communications (M2M) and 5G is fundamental.

Advances in technology are creating new social phenomena and business models that impact every aspect of our personal and professional lives – and which challenge regulatory paradigms. M2M, cloud computing, 5G, AI and IoT are all bringing further profound change. Recognizing the potential of emerging technologies and the impact that policy and regulatory frameworks can have on their success, regulators should encourage a regulatory paradigm pushing frontiers and enabling the digital transformation. more> draft doc (pdf)