World Bank – Scientific and technological advances are transforming lives: they are even helping poorer countries close the gap with rich countries in life expectancy. But poorer countries still face tremendous challenges, as almost a quarter of children under five are malnourished, and 60 percent of primary school students are failing to achieve even a rudimentary education.
In fact, more than 260 million children and youth in poorer countries are receiving no education at all.
“Human capital” – the potential of individuals – is going to be the most important long-term investment any country can make for its people’s future prosperity and quality of life.
In many countries, the workforce is unprepared for the future that is fast unfolding.
This is a key insight from the World Bank’s forthcoming World Development Report 2019: The Changing Nature of Work. The frontier for skills is moving faster than ever before. Countries need to gear up now to prepare their workforces for the tremendous challenges and opportunities that are being driven by technological change. more>
Posted in Broadband, Business, Economic development, Economy, Education, History, Leadership, Science, Technology
Tagged Broadband, Capital, Government, Internet, Jobs, Leadership, Skills, Technology
ElumTools™ is a fully integrated lighting calculation Add-in for Autodesk® Revit®.
ElumTools™ – ElumTools performs accurate point-by-point lighting calculations for both interior and exterior lighting directly within the Revit model using our industry-standard radiosity calculation engine.
ElumTools is able to leverage existing Revit geometry and luminaire families, utilizing Revit spatial elements of Rooms, Spaces, Regions and Areas as calculation boundaries.
The surface reflectance and color can be interpreted from the Revit materials properties and mapped to more suitable values if necessary. Reflectances can also be assigned using Revit Categories to encompass all ceilings, walls and floors for example.
Lighting fixture families already present in Revit are used by ElumTools, and photometric file associations can be easily created if not already present. A wide variety of prominent manufacturers' IES files are available with a few clicks from our cloud-based “Instabase.”
Each calculation is presented in an interactive visualization window with complete 3D walk-through capability. more>
The Adaptive Network: Why automation alone isn’t enough
By Keri Gilder – Just imagine that, instead of 70, your heart rate was 100 beats per minute. This could be a warning sign that you are on the verge of a heart attack.
If your doctor were to get this information in real time, they could check the readings against your medical records and see that this is completely out of the norm and then warn you to seek medical assistance immediately.
However, if your personal trainer received that same information, would they reach the same conclusion as your doctor? Your trainer has access to a different database, which might show your resting heart rate as well as the rate during high-intensity training. Knowing that you are likely exercising, they would instead conclude that there is no need to go to the hospital after all.
This clearly demonstrates that just accepting raw data without filtering and proper analysis is no longer good enough and can potentially have serious repercussions. Instead, it is critical that we have diversity of thought when it comes to how we interpret data.
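The heart-rate scenario above can be sketched as a minimal context-aware check. The thresholds and function names here are hypothetical, purely for illustration of how the same raw reading yields different conclusions under different contexts:

```python
def interpret_heart_rate(bpm, resting_bpm, exercising):
    """Classify a heart-rate reading using context, not raw data alone.

    Hypothetical rule: a reading well above the resting rate is alarming
    only when the person is not known to be exercising.
    """
    if exercising:
        # The trainer's context: an elevated rate is expected during training.
        return "ok"
    if bpm > resting_bpm * 1.3:
        # The doctor's context: this reading is far outside the norm.
        return "seek medical advice"
    return "ok"

# Same reading of 100 bpm, two different conclusions:
print(interpret_heart_rate(100, 70, exercising=False))  # doctor's view
print(interpret_heart_rate(100, 70, exercising=True))   # trainer's view
```

The point is not the specific thresholds but that the interpretation layer, not the raw number, carries the meaning.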
This is not just true for our health or other day-to-day scenarios, but can also be applied to the communication networks that carry and house our information. more>
Posted in Broadband, Business, Communication industry, Economy, Net, Science, Technology
Tagged Broadband, Business, Business improvement, Ciena, Internet, Net evolution, United States
By Bolaji Ojo – It’s a justifiable question. The Qualcomm–NXP trip was an expensive sortie: Qualcomm has paid $2 billion in mandatory break-up fees to NXP, but the bill for the hidden costs may be much higher. For nearly two years, the communications IC and IP supplier and its target endured prolonged uncertainties. Even now, the spasms from customer disruptions remain strong, while many employees, though heaving a sigh of relief, must figure out where they truly belong in the enterprise.
Qualcomm is moving on resolutely from the NXP debacle. It must. However, the implications and lessons — if any — are industry-wide. One of the largest acquisitions in the history of the semiconductor industry foundered because of opposition from various fronts, including customers who might have benefited from it. Simply dumping the blame on nebulous factors and faceless regulators will result in the industry learning nothing from the experience. Perhaps the transaction was destined to fail. Perhaps it could have been managed better, and successfully too. A thorough assessment of why this deal collapsed would offer lessons that can be applied to future deals.
There are no signs that Qualcomm will conduct a detailed analysis of why and how the bid unraveled. It is easier — again — to simply toss more money at stakeholders and move on. NXP’s management and shareholders who had tendered their equity could slake their thirst with $2 billion in Qualcomm’s money. more>
Posted in Broadband, Business, Communication industry, Economy, Net, telecom
Tagged Broadband, Business, Capital, Manufacturing, NXP Semiconductors, Qualcomm
By Laura Panjwani – High-performance computers (HPC), also known as supercomputers, give scientists the power to solve extremely complex or data-intensive problems by concentrating the processing power of multiple, parallel computers.
Performance of a supercomputer is measured in floating-point operations per second (FLOPS) instead of million instructions per second (MIPS), the measurement used in conventional computing.
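As a rough illustration of the FLOPS metric (not a formal benchmark such as LINPACK), one can estimate a machine's sustained floating-point rate by timing a dense matrix multiply, which performs approximately 2·n³ floating-point operations. The function name and default size here are arbitrary choices for the sketch:

```python
import time
import numpy as np

def estimate_flops(n=512):
    """Rough FLOPS estimate from timing an n x n matrix multiply.

    A dense n x n matmul performs about 2 * n**3 floating-point
    operations (n multiplies and n-1 adds per output element).
    """
    a = np.random.rand(n, n)
    b = np.random.rand(n, n)
    start = time.perf_counter()
    a @ b
    elapsed = time.perf_counter() - start
    return 2 * n**3 / elapsed

print(f"~{estimate_flops() / 1e9:.1f} GFLOPS")
```

Supercomputer rankings use far more rigorous benchmarks, but the underlying idea is the same: count floating-point operations completed per second of wall-clock time.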
The technology has a plethora of applications—including quantum mechanics, climate research, oil and gas exploration, chemistry, aerospace and automotive technologies, and much more.
In addition to environmental applications, supercomputers are also key to many up-and-coming technologies, including autonomous vehicles. In our article, “Using Deep Learning, AI Supercomputing, NVIDIA Works to Make Fully Self-Driving Cars a Reality” we highlighted Xavier, a complete system-on-chip (SoC) that integrates a new graphics processing unit (GPU) architecture called Volta, a custom 8 core CPU architecture, and a new computer vision accelerator. It features 9 billion transistors and a processor that will deliver 30 trillion operations per second (TOPS) of performance, while consuming only 30 watts of power.
This technology is the most complex SoC ever created and is a key part of the NVIDIA DRIVE Pegasus AI computing platform, the world’s first AI car supercomputer designed for fully autonomous Level 5 robotaxis. more>
Posted in Broadband, Business, Economic development, Economy, Education, Net, Science, Technology
Tagged Broadband, High-performance computers, HPC, Internet, Supercomputing, Technology
Cabinet airflow management done right
By Hans Vreeburg – Let’s start with some basic understanding of airflow within data centers. Nowadays almost all data centers apply hot and cold corridors to optimize their cooling capabilities. In the ideal situation, cold air goes straight to the servers’ inlets, and the hot air exiting the servers is returned directly to the cooling unit. This setup enables systems to run at the highest possible efficiency, using the least amount of power. The cooling setup has a big influence on the PUE (Power Usage Effectiveness): a lower PUE means lower total energy consumption for the data center, which benefits the environment and lowers OPEX. Could a small gap in your server rack really have that much influence?
As said above, the ideal setup is cold air entering the servers while hot air exits. Gaps can lead to a higher demand for cold air than the servers actually require. Think of it as a large water pipe that must deliver a specific amount of water: if you make holes in the pipe, you have to pump in more water to get the same amount out at the end. more>
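The PUE metric mentioned above has a simple definition: total facility power divided by the power consumed by the IT equipment alone. A minimal sketch, with example figures chosen purely for illustration:

```python
def pue(total_facility_kw, it_equipment_kw):
    """Power Usage Effectiveness: total facility power / IT equipment power.

    1.0 is the theoretical ideal (every watt reaches the IT equipment);
    anything above 1.0 reflects cooling, power distribution, and other
    facility overhead.
    """
    return total_facility_kw / it_equipment_kw

# Example: a facility drawing 1,500 kW to support 1,000 kW of IT load
print(pue(1500, 1000))  # 1.5
```

Sealing gaps in racks reduces the cold air (and thus cooling power) needed for the same IT load, lowering the numerator and improving the PUE.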
GSR-18 BEST PRACTICE GUIDELINES ON NEW REGULATORY FRONTIERS TO ACHIEVE DIGITAL TRANSFORMATION
itu.int – Digitization is increasingly and fundamentally changing societies and economies and disrupting many sectors in what has been termed the 4th Industrial Revolution. Meanwhile, ICT regulation has undergone steady transformation worldwide over the past ten years.
As regulators, we need to keep pace with advances in technology, address the new regulatory frontiers and create the foundation upon which digital transformation can achieve its full potential. Being prepared for digital transformation and emerging technologies such as Artificial Intelligence (AI), the Internet of Things (IoT), Machine to Machine communications (M2M) and 5G is fundamental.
Advances in technology are creating new social phenomena and business models that impact every aspect of our personal and professional lives – and which challenge regulatory paradigms. M2M, cloud computing, 5G, AI and IoT are all bringing further profound change. Recognizing the potential of emerging technologies and the impact that policy and regulatory frameworks can have on their success, regulators should encourage a regulatory paradigm pushing frontiers and enabling the digital transformation. more> draft doc (pdf)
Posted in Broadband, Business, Communication industry, Economic development, Economy, Education, Energy, Healthcare, Net, Science, Technology, Telecom industry
Tagged Broadband, Business improvement, Internet, ITU, Net evolution, Regulations, Technology
By Steve Denning – The article isn’t suggesting that firms embracing Agile are either angels or devils. I have yet to see a firm espousing Agile that has no flaws: those flaws must be seen for what they are and they need to be addressed.
If not addressed, they will cause serious financial, economic or social problems. Some of the flaws need to be addressed by the firms themselves and will be reinforced by the marketplace. Others may require government intervention.
Among the flaws for which the marketplace will by itself tend to generate corrective action are:
- Failure to continue innovating
- Sweat-shop workplaces
- Share buybacks
- A single-minded focus on “maximizing shareholder value”
- Abuse of monopoly power and privacy
We need to see Agile by the clear light of day, neither through rose-colored spectacles in which everything is kumbaya, nor through a glass darkly in which everything is evil.
The saying “you can’t have it both ways” doesn’t mean that we can’t walk and chew gum at the same time. more>
Posted in Broadband, Business, CONGRESS WATCH, Economy, Education, Energy & emissions, History, Leadership, Media, Net, Science, Technology
Tagged Agile management, Broadband, Business improvement, Congress Watch, Government, Internet, Leadership, Productivity
Why the Secret Behind Strong Early Adoption of 400G Technology is … 200G
By Helen Xenos – This month, we shipped our 5,000th 400G-capable coherent optical transponder, confirming our prediction that the use of 400G technology is ramping three times faster than 100G. What may come as a surprise, however, is that the dominant application driving 400G deployments is not 400G, but 200G (long-haul data-center interconnect, to be precise).
Why? The technology that enables 400G wavelengths has a lot to do with expanding the application space for 200G as well.
To fully understand the demand drivers for 400G, it’s important to clarify the various ways 400G is defined. The term “400G” is quite popular in today’s optical networking conversations, but can also have different meanings depending on the context in which it is being used.
So, which applications are driving 400G deployments? We hear so much about the fast-growing metro WDM market, 400ZR and the need to maximize capacity for short reach DCI applications, that intuitively you would think this is the “sweet spot” application.
In fact, the most popular use case we see for early 400G adoption is to support the rise of 200G long-haul for aggressive DCI network builds. more>
Posted in Broadband, Business, Communication industry, Economy, Net, Science, Technology, Telecom industry
Tagged 400G, Broadband, Ciena, Fiber optics, Internet, Net evolution
Variable Fonts Are the Future of Web Type
By Mandy Michael – A variable font is a single file that acts like multiple fonts. Variable fonts can improve page-load times, but their appeal goes way beyond that: Site visitors get an improved reading experience, and designers get greater creative freedom.
While it’s still early days, some software applications—including the latest Illustrator and Photoshop—and many web browsers do support the technology, and more will follow. It’s a good time to understand how variable fonts work and how to use them in your web designs.
Inventive type designers aren’t restricting themselves to expected variations, such as weight, width, or italic. They’re creating variations that address effect, readability, and style. more>
Posted in Broadband, Business, Education, History, How to, Net, Product, Technology
Tagged Adobe, Broadband, Business improvement, Internet, Jobs, Net evolution, Skills