By Kalyan Sundhar – The standards that dictate how 5G systems should work and interoperate were released earlier this year by the 3rd Generation Partnership Project (3GPP) in an eagerly awaited update. The new telecommunications standards cleared the way for those planning to develop, build, or leverage 5G technology.
It is clear that a great deal of thought went into the latest versions of the 5G standards, which aim to spur the growth of the 5G market and deliver new opportunities. Technology that follows these standards will make networks far more reliable as the industry fills in new market gaps.
This new version of the standards opens the door for stand-alone (SA) 5G networks that do not rely on 4G for 5G signaling, kicking off a frantic rush to own the 5G market. Companies that do not have an existing 4G infrastructure can now build their 5G deployments from scratch, while a section of the standards governing 4G handovers lets operators interweave 5G cells with existing 4G deployments for added support.
The standards are only the foundation that will support the development of the 5G industry; there is still plenty of work needed by companies to get it right. What that will look like is open to individual interpretation, as there are gaps in the guidelines that make up the new standards. Interoperability will continue to be a challenge as organizations implement proprietary visions for 5G within those gaps. more>
Posted in Broadband, Business, Economy, Education, How to, Net, Product, Science, Technology
Tagged 5G, Broadband, Business improvement, Internet, Productivity, Technology, Test & measurement
By George Bradt – Most understand, on a theoretical level, the need to follow up and monitor progress. Yet there are few guidelines on how frequently you should do so. Let me suggest that it varies with the nature of what you’re monitoring, ranging from daily or even more frequently for tasks to annually for strategic plans.
Ben Harkin discussed the value of monitoring and reporting in Psychological Bulletin. His headline is “Frequently Monitoring Progress Toward Goals Increases Chance of Success” – especially if you make the results public. While he was more focused on personal habits and goals, the findings are applicable to organizational behavior as well.
Here’s my current best thinking on the right frequency of monitoring. The main determinant is the nature of the work and the level of the people doing it, with tighter, more frequent monitoring of tactical efforts and looser, less frequent monitoring of more strategic efforts.
- Daily or more frequently – Tasks
- Weekly – Projects
- Monthly – Programs
- Quarterly – Business Reviews, adjustments
- Annually – Strategic/Organizational/Operational processes
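The cadence list above can be captured as a simple lookup table. This is a hypothetical sketch, not from the article; the names and day counts are illustrative choices for turning the author's tiers into something a review scheduler could use.

```python
# Hypothetical sketch: the author's monitoring cadences as a lookup
# table. Names and intervals are illustrative, not from the article.
MONITORING_CADENCE_DAYS = {
    "task": 1,              # daily or more frequently
    "project": 7,           # weekly
    "program": 30,          # monthly
    "business_review": 90,  # quarterly
    "strategic_plan": 365,  # annually
}

def review_interval(work_type):
    """Return how many days to wait between progress reviews."""
    return MONITORING_CADENCE_DAYS[work_type]

print(review_interval("project"))  # prints 7
```

Encoding the tiers this way makes the tactical-to-strategic gradient explicit: the interval grows by roughly an order of magnitude at each step up.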
Posted in Book review, Business, How to, Leadership, Product, Science, Technology
Tagged Business improvement, Leadership, monitoring, Organization, Productivity, Technology, Test & measurement
By Alison Gillespie – Researchers at the National Institute of Standards and Technology (NIST) have produced and precisely measured a spectrum of X-rays using a new, state-of-the-art machine. The instrument they used to measure the X-rays took 20 years to develop, and will help scientists working at the agency make some of the world’s most accurate measurements of materials for use in everything from bridges to pharmaceuticals.
The process of building the instrument for making the new measurements was painstaking. “This new specialized precision instrument required both a tremendous amount of mechanical innovation and theoretical modeling,” said James Cline, project leader of the NIST team that built the machine.
“That we were able to dedicate so many years and such high-level scientific expertise to this project is reflective of NIST’s role in the world of science.” more> https://goo.gl/e0zrET
Posted in Economic development, Education, Energy & emissions, Nature, Science, Technology
Tagged Copper X-ray emission spectrum, Electronics, Industrial economy, NIST, Physics, Technology, Test & measurement
By Lynnette Reese – The IEEE 802.3bs standard for 400Gbps is on track to be ratified and released late this year. Higher speed technologies tend to get driven to adoption as soon as they are available.
In 2004, 10Gbps was the leading edge. In 2010, 40Gbps and 100Gbps Ethernet were introduced. How did we get this far, so fast?
The present group is leveraging a parallel lane structure to get to 400Gbps. For electrical interfaces, the fastest per-lane speed in the spec will be 50Gbps. For optical fiber transmission, the variant used depends on the distance one requires.
Technically, 400Gbps is not possible without switching away from non-return-to-zero (NRZ) encoding, the scheme that everyone thinks of when they visualize Ethernet communication and other serial data transmission schemes.
NRZ data is encoded into a binary pattern with fixed voltage levels: the lower voltage level represents a binary 0, and the higher level represents a binary 1. In 1000BASE-T Ethernet, the stream of 0s and 1s is driven at 1,000 megabits per second (1Gbps).
At present, the physical “wall” of streaming 0s and 1s for single lane electrical interfaces is 25 Gbps, found in the standards as 802.3bj across backplanes and cables, and 802.3bm across chip-to-chip and chip-to-module interfaces.
In May 2016, an IEEE 802.3 task force formed to develop a single-lane 50 Gbps Ethernet standard. The 802.3bs standard, which defines 400Gbps in aggregate, will use an encoding scheme called PAM4 (4-Level Pulse Amplitude Modulation) to reach 50Gbps per channel. PAM4 is an encoding scheme that doubles the bit rate by providing four signal levels in the space of the two that NRZ presently provides. PAM4 cleverly divides the least significant bit (LSB) signal level in half and adds it to the signal of the most significant bit (MSB). more> https://goo.gl/fcDF8f
Posted in Broadband, Communication industry, Economy, Education, Net, Product, Science, Technology
Tagged Broadband, Encoding, Ethernet, Internet, Signal, Technology, Test & measurement
By Paul Blanchard – Up to now, our technological progress has largely been a matter of trial and error. We make something new, evaluate its performance, then alter some part of the fabrication process and see whether it performs better or worse, all without direct knowledge of what is changing at the atomic level.
But if we could see what’s going on at that scale—if we could map out each individual atom and understand the role that it plays—we could create new and better materials not through blind experimentation, but through design.
For all that we’ve been able to accomplish while ignoring them, the fact is that individual atoms matter. The speed of a transistor, the efficiency of a solar cell, and the strength of an I-beam are ultimately determined by the configuration of the atoms inside. Today, new and improved microscopy techniques are getting us closer and closer to the goal of being able to see each and every atom within the materials we make—a very exciting prospect.
Over the past three years, I’ve been lucky enough to be part of a team working with one such new and improved microscopy technique, a method called 3-D atom probe tomography, or APT for short. APT is very different from conventional microscopy—at least, the sort of microscopy that I’m accustomed to. In conventional microscopy, we shine a beam of light particles or electrons on our specimen, whatever it is we want to look at, and create a magnified image using lenses or by mapping how our beam bounces off it.
In atom probe tomography, on the other hand, we don’t just look at our specimen—we literally take it apart, atom-by-atom. more> https://goo.gl/c0VdE3
Posted in Economic development, Education, How to, Nature, Science, Technology
Tagged 3-D atom probe tomography, APT, Electronics, Physics, Technology, Test & measurement
This Ship Has Sailed: U.S. Navy Commissions An All-Electric Stealth Destroyer Zumwalt For Service
By Tomas Kellner – The U.S. Navy has commissioned for service the USS Zumwalt, its largest and most advanced stealth destroyer. The ceremony took place in Baltimore on Saturday (Oct 15).
The Navy estimates the 15,600-ton vessel can hit a target at a range of more than 60 miles. It also has a wave-piercing tumblehome design and a unique superstructure that make it less visible to enemy radar at sea.
The ship is equally innovative below deck. Traditionally, the Navy powered its vessels with gas turbines driving controllable pitch propellers through large and complex gearboxes. But the new destroyer has on board a 78-megawatt power station supplying electricity to an advanced integrated power system (IPS). GE Marine designed the system, which powers giant GE induction motors connected directly to the propeller shafts and routes electricity to a vast array of sensors, weapons, radar and other critical systems on board.
As a result, the ship will have nearly 10 times as much available power as its predecessors. In fact, the Zumwalt could become the first ship carrying next-generation weapons such as electromagnetic railguns, which use electromagnetic force rather than gunpowder to fire projectiles. “We’re no longer restricting the engines to provide propulsion power only,” Adam Kabulski, director for naval accounts at GE Power Conversion, told GE Reports.
“This design allows you to send electric power wherever you need it. You can access many megawatts in a short amount of time and convert it into energy. It’s instantaneous.”
The system is also highly redundant. Instead of the typical three-phase motors, the Zumwalt’s advanced induction motors have 15 phases. Kabulski said that by simply reversing the direction of the rotating magnetic field in the motor, for example, the shaft can turn in the opposite direction to give astern power. more> https://goo.gl/9WVdnz
Posted in Economic development, Economy, Energy, Science, Technology, Transportation
Tagged Business improvement, Electric propulsion, GE, Manufacturing, Technology, Test & measurement, Zumwalt
Shhh… The New 737 MAX Redefines a Quiet Airplane
Boeing – “The noise safety standards are becoming increasingly stringent as people start to live closer to airports,” said Barry St. Germaine, a Boeing Test & Evaluation (BT&E) pilot.
“We will not be able to fly into certain airports if we don’t meet the noise requirements, so tests like this ensure we continue to maintain the current market and gain access to new markets around the world,” he said.
The 737 MAX is designed to be 40 percent quieter than today’s Next-Generation 737. Community noise testing is intended to validate that design. more> https://goo.gl/DtYV7H
Posted in Economic development, Economy, Education, Product, Regulations, Science, Technology, Transportation
Tagged Boeing, Business improvement, Industrial economy, Noise, Technology, Test & measurement
The KC-46A: Air Refueling in 3D
Boeing – On January 24th, Boeing and U.S. Air Force aircrews successfully completed the KC-46A tanker’s first refueling flight in the skies above Washington state. Following takeoff from Boeing Field in Seattle, the KC-46A test team worked through a series of test points before smoothly offloading 1,600 pounds of fuel to an F-16 fighter aircraft flying at 20,000 feet.
The KC-46A that accomplished the refueling milestone will soon begin refueling a number of other military aircraft as well, including a C-17, F/A-18, A-10 and AV-8B. Also known as EMD-2, the tanker made its first flight September 25, 2015 and has now completed 32 flights.
The KC-46A is a multirole tanker Boeing is building for the U.S. Air Force that can refuel all allied and coalition military aircraft compatible with international aerial refueling procedures and can carry passengers, cargo and patients. Overall, Boeing plans to build 179 KC-46 aircraft for the U.S. Air Force. more> boeing.com/innovation/
Posted in Economic development, Economy, Energy, Net, Product, Science, Technology, Transportation
Tagged Boeing, Business improvement, KC-46A, Productivity, Technology, Test & measurement
By Charles Murray – An expanded hardware-in-the-loop (HIL) test system promises to offer a better way to test engineering’s most complex products, from defense aircraft to autonomous cars.
Introduced at this week’s NIWeek 2016 show in Austin, TX, National Instruments’ turnkey HIL Simulators product essentially consists of a new chassis that works with the company’s existing HIL platform. Its open, modular architecture is targeted at high-current applications that need to simulate complex sensor data from such sources as cameras, radar, and RF signals.
For big original equipment manufacturers, such as the automakers, the new HIL simulators offer a more cost-effective way to test their most complex products. “The alternative to HIL is field testing,” Noah Reding, senior business segment manager for real-time test at National Instruments, told Design News.
“In field testing, they need prototype vehicles, which can cost a half-million dollars. And they need test tracks, which can cost $100,000 per day. HIL can significantly reduce cost, because it allows them to do it in the lab.” more> http://goo.gl/QygUSD
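The cost argument in the quote above can be turned into back-of-the-envelope arithmetic using the two figures Reding cites: roughly $500k per prototype vehicle and $100k per day of test-track time. The function below is a purely hypothetical comparison helper, not anything from National Instruments.

```python
# Back-of-the-envelope sketch using the figures quoted in the article:
# ~$500k per prototype vehicle and ~$100k per test-track day.
PROTOTYPE_COST = 500_000      # per prototype vehicle (from article)
TRACK_COST_PER_DAY = 100_000  # test-track rental (from article)

def field_test_cost(prototypes, track_days):
    """Estimated cost of a field-test campaign at the quoted rates."""
    return prototypes * PROTOTYPE_COST + track_days * TRACK_COST_PER_DAY

# Two prototypes and ten track days already reach $2M, before staffing
print(field_test_cost(2, 10))  # prints 2000000
```

Even a modest field campaign runs into millions, which is the gap a lab-based HIL rig is meant to close.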
Posted in Broadband, Business, Communication industry, Economic development, Economy, Education, Net, Product, Technology
Tagged Broadband, Business improvement, Industrial economy, Internet, Super regions, Technology, Test & measurement
By Elizabeth Montalbano – Emeritus Professor Wanda Lewis in the School of Engineering at the University of Warwick has been studying how nature relieves stress for 25 years, taking an approach called “form-finding,” a process of shaping an object or a structure by the loads applied to it.
This process is different from engineering methods that start from an assumed shape and then check the stresses and displacements in the structure under an applied load, she said.
Form-finding enables the design of rigid structures that follow a strong natural form: structures sustained by pure compression or tension, without bending stresses, Lewis said. Bending stresses are the main points of weakness in structures; they are what cause bridges to buckle under load and suffer damage or even collapse.
“In form-finding, we go in the opposite direction — the shape of the structure is not known initially, it is found by the application of load and involves repetitive calculations to find a shape that is in equilibrium with all forces,” she said. more> http://goo.gl/xHtqjI
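Lewis's description of repetitive calculations converging on an equilibrium shape can be illustrated with a toy one-dimensional case: a hanging chain with pinned ends, where each interior node is repeatedly moved to the position where the tension in its two segments balances the applied load. All parameters below are hypothetical; this is a sketch of the iterative idea, not Lewis's actual method.

```python
# Illustrative sketch of form-finding for a hanging chain: the shape
# is not assumed up front. Each interior node is repeatedly settled
# at the point where segment tensions balance the downward load,
# until the whole chain is in equilibrium. Parameters are hypothetical.
def find_form(n_nodes=9, load=0.1, iterations=2000):
    y = [0.0] * n_nodes  # start flat; the two end nodes stay pinned
    for _ in range(iterations):
        for i in range(1, n_nodes - 1):
            # equilibrium: neighbours' average minus a load-driven sag
            y[i] = (y[i - 1] + y[i + 1]) / 2.0 - load
    return y

shape = find_form()
# the converged form is symmetric and sags most at midspan
print(min(shape) == shape[4])
```

The converged shape is the funicular of the load (here a parabola-like sag), carrying the load in pure tension with no bending, which is exactly the property form-finding seeks.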
Posted in Construction, Economic development, Economy, Education, How to, Science, Technology
Tagged Business improvement, Construction, Form-finding, Physics, Technology, Test & measurement