If you don’t test your back-up power, don’t expect it to work!
Datacenter.com – Last week we again conducted a black building test, this time including the power expansion that will soon go live. We have full confidence in our emergency power setup, but it is good to test it regularly. After a series of power outages at various data centers, we want to make sure the power is there when you really need it.
A power outage, short or long, is merely annoying when it interrupts your favorite TV show. But if it takes down your critical IT infrastructure, the consequences for your company are far-reaching. So how does Datacenter.com handle a power outage?
For companies, organizations and governments, a reliable power supply is one of the most important reasons for placing their IT equipment in a data center. Recent high-profile data center failures have brought the issue of reliability to the fore. Historically, data center power cuts have hit many blue-chip banks and telecoms providers, so no one can claim immunity from such problems.
To guarantee the reliability of that power supply, we test our emergency power supply monthly. During such a test, we activate the generators to verify that the redundant power path works properly: a transfer switch starts our redundant set of emergency generators, synchronizes them with the mains, and delivers power to the IT equipment. During regular generator tests, the generator is paralleled with the grid, so equipment that is not behind a UPS stays on. This means that the cooling system and the lighting continue to work throughout the test. more>
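The key property of such a closed-transition test is that the generator breaker closes before the utility breaker opens, so the load is never unpowered. A minimal sketch of that sequence (illustrative only, not Datacenter.com's actual switchgear logic):

```python
# Illustrative model of a closed-transition generator test: the generator
# breaker closes before the utility breaker opens, so loads outside the
# UPS never lose power.

def closed_transition_test():
    utility_closed, generator_closed = True, False
    history = []

    def record(step):
        # The load stays powered while at least one source breaker is closed.
        history.append((step, utility_closed or generator_closed))

    record("normal operation on utility")
    generator_closed = True    # generator started, synchronized, paralleled
    record("generator paralleled with the grid")
    utility_closed = False     # generator carries the full load
    record("load carried by generator alone")
    utility_closed = True      # re-synchronize and re-parallel
    record("re-paralleled with the grid")
    generator_closed = False   # back to normal operation
    record("back on utility")
    return history

history = closed_transition_test()
for step, powered in history:
    assert powered, "load dropped during: " + step
print("load powered through all", len(history), "steps")
```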
Posted in Broadband, Business, Economic development, Economy, Education, How to, Net, Technology
Tagged Broadband, Business improvement, datacenter.com, Internet, Skills, Technology, Test & measurement
By Arthur Pini – The eye diagram is a general-purpose tool for analyzing serial digital signals. It shows the effects of vertical noise, horizontal jitter, duty cycle distortion, inter-symbol interference, and crosstalk, all of which can close the "eye." While engineers have used eye diagrams for decades, oscilloscopes continually gain new features that increase the diagram's value.
Oscilloscopes form eye diagrams—the separation between the two binary data states “1” and “0”—by overlaying multiple single clock periods on a persistence display. The accumulation shows the history of multiple acquisitions.
Additive noise tends to close the eye vertically, while timing jitter and uncertainty close it horizontally. Duty cycle distortion (DCD) and inter-symbol interference (ISI) change the shape of the eye. The channel will fail if the eye closes to the point where the receiver can no longer distinguish the "0" and "1" states.
In the days of analog oscilloscopes, the eye diagram was formed by triggering the oscilloscope with the serial data clock and acquiring multiple bits over time on a persistence or storage display. This technique adds the trigger uncertainty, or trigger jitter, to the eye diagram on every acquisition. Digital oscilloscopes instead form the eye by acquiring a very long record containing many serial bits.
The clock period is determined, and the waveform is broken up or “sliced” into multiple single-bit acquisitions overlaid in the persistence display. In this way, all the data is acquired with a single value of trigger jitter that’s eliminated by using differential time measurements within the eye. more>
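The slicing step described above can be sketched in a few lines of NumPy. The function name and the synthetic NRZ signal below are illustrative, not a real oscilloscope API:

```python
import numpy as np

def fold_into_eye(waveform, samples_per_ui, uis_per_trace=2):
    """Slice one long serial-data record into overlapping traces of
    `uis_per_trace` unit intervals (UIs) each; overlaying the rows on a
    persistence display forms the eye."""
    step = samples_per_ui
    width = samples_per_ui * uis_per_trace
    n_traces = (len(waveform) - width) // step + 1
    return np.array([waveform[i * step : i * step + width]
                     for i in range(n_traces)])

# Synthetic NRZ record: 1000 random bits, 8 samples per bit, a little noise.
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 1000)
signal = np.repeat(bits.astype(float), 8) + rng.normal(0, 0.05, 8000)

traces = fold_into_eye(signal, samples_per_ui=8)
print(traces.shape)  # each row is one two-UI slice of the same record
```

Because every slice comes from a single acquisition, all rows share one value of trigger jitter, which is what lets the scope remove it with differential time measurements.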
By Kalyan Sundhar – The standards that dictate how 5G systems should work and interoperate were released earlier this year by the 3rd Generation Partnership Project (3GPP) in an eagerly awaited update. The new telecommunications standards cleared the way for those planning to develop, build, or leverage 5G technology.
It is clear that a great deal of thought went into the latest versions of the 5G standards, to spur the growth of the 5G market and deliver new opportunities. Technology that follows these standards will make these networks much more reliable as they fill new market gaps.
This new version of the standards opens the door for stand-alone (SA) 5G networks that do not rely on 4G for 5G signaling, kicking off a frantic rush to own the 5G market. While 4G networks are still available for added support, companies that do not have an existing 4G infrastructure can now build their 5G deployments from scratch. A section of the standards still governs 4G handovers, interweaving 5G cells with existing 4G deployments.
The standards are only the foundation that will support the development of the 5G industry, but there is still plenty of work needed by companies to get it right. What that will look like is up to individual interpretation as there are gaps in the guidelines that make up the new standards. Interoperability will continue to be a challenge as organizations implement proprietary visions for 5G within those gaps. more>
Posted in Broadband, Business, Economy, Education, How to, Net, Product, Science, Technology
Tagged 5G, Broadband, Business improvement, Internet, Productivity, Technology, Test & measurement
By George Bradt – Most understand the need to follow up and monitor progress on a theoretical level, yet there are few guidelines on how frequently you should do so. Let me suggest that it varies with the nature of what you're monitoring, ranging from daily (or even more often) for tasks to annually for strategic plans.
Ben Harkin discussed the value of monitoring and reporting in Psychological Bulletin. His headline is "Frequently Monitoring Progress Toward Goals Increases Chance of Success" – especially if you make the results public. While he focused on personal habits and goals, the findings apply to organizational behavior as well.
Here’s my current best thinking on the right frequency of monitoring. The main discriminant is the nature of the work and the level of the people doing it, with tighter, more frequent monitoring of tactical efforts and looser, less frequent monitoring of more strategic efforts.
- Daily or more frequently – Tasks
- Weekly – Projects
- Monthly – Programs
- Quarterly – Business Reviews, adjustments
- Annually – Strategic/Organizational/Operational processes
Posted in Book review, Business, How to, Leadership, Product, Science, Technology
Tagged Business improvement, Leadership, monitoring, Organization, Productivity, Technology, Test & measurement
By Alison Gillespie – Researchers at the National Institute of Standards and Technology (NIST) have produced and precisely measured a spectrum of X-rays using a new, state-of-the-art machine. The instrument they used to measure the X-rays took 20 years to develop, and will help scientists working at the agency make some of the world’s most accurate measurements of materials for use in everything from bridges to pharmaceuticals.
The process of building the instrument for making the new measurements was painstaking. “This new specialized precision instrument required both a tremendous amount of mechanical innovation and theoretical modeling,” said James Cline, project leader of the NIST team that built the machine.
“That we were able to dedicate so many years and such high-level scientific expertise to this project is reflective of NIST’s role in the world of science.” more> https://goo.gl/e0zrET
Posted in Economic development, Education, Energy & emissions, Nature, Science, Technology
Tagged Copper X-ray emission spectrum, Electronics, Industrial economy, NIST, Physics, Technology, Test & measurement
By Lynnette Reese – The IEEE 802.3bs standard for 400Gbps is on track to be ratified and released late this year. Higher speed technologies tend to get driven to adoption as soon as they are available.
In 2004, 10Gbps was the leading edge. In 2010, 40Gbps and 100Gbps Ethernet were introduced. How did we get this far, so fast?
The present group is leveraging a parallel-lane structure to get to 400Gbps. For electrical interfaces, the fastest speed in the spec will be 50Gbps per lane. For optical fiber transmission, the variant used depends on the distance one requires.
Technically, 400Gbps is not possible without switching away from non-return-to-zero (NRZ) encoding, the scheme everyone pictures when they visualize Ethernet communication and other serial data transmission.
NRZ data is encoded as a binary pattern with fixed voltage levels: the lower voltage level represents a binary 0, the higher a binary 1. In 1000BASE-T Ethernet, the stream of 0s and 1s is driven at 1000 megabits per second (1Gbps).
At present, the physical “wall” of streaming 0s and 1s for single lane electrical interfaces is 25 Gbps, found in the standards as 802.3bj across backplanes and cables, and 802.3bm across chip-to-chip and chip-to-module interfaces.
In May 2016, an IEEE 802.3 task force formed to develop a single-lane 50 Gbps Ethernet standard. The 802.3bs standard, which defines 400Gbps in aggregate, will use an encoding scheme called PAM4 (4-Level Pulse Amplitude Modulation) to reach 50Gbps per channel. PAM4 is an encoding scheme that doubles the bit rate by providing four signal levels in the space of the two that NRZ presently provides. PAM4 cleverly divides the least significant bit (LSB) signal level in half and adds it to the signal of the most significant bit (MSB). more> https://goo.gl/fcDF8f
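The MSB-plus-half-LSB mapping described above can be sketched directly. This follows the article's description only; real 802.3bs transceivers also apply gray coding and normalize the four levels differently:

```python
# Illustrative sketch of NRZ vs. PAM4 symbol mapping (not taken from any
# standard document): PAM4 packs two bits per symbol by adding half of the
# LSB's swing to the MSB, yielding four evenly spaced levels.

def nrz_encode(bits):
    """One bit per symbol: 0 -> 0.0, 1 -> 1.0."""
    return [float(b) for b in bits]

def pam4_encode(bits):
    """Two bits per symbol: level = MSB + 0.5 * LSB, so
    00 -> 0.0, 01 -> 0.5, 10 -> 1.0, 11 -> 1.5."""
    assert len(bits) % 2 == 0
    return [bits[i] + 0.5 * bits[i + 1] for i in range(0, len(bits), 2)]

bits = [0, 0, 0, 1, 1, 0, 1, 1]
print(nrz_encode(bits))   # 8 symbols at 2 levels
print(pam4_encode(bits))  # 4 symbols at 4 levels: same bits, half the symbol rate
```

The payoff is visible in the lengths: the same bit stream needs half as many PAM4 symbols, which is how 50Gbps per channel fits in the bandwidth of a 25Gbaud lane.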
Posted in Broadband, Communication industry, Economy, Education, Net, Product, Science, Technology
Tagged Broadband, Encoding, Ethernet, Internet, Signal, Technology, Test & measurement
By Paul Blanchard – Up to now, our technological progress has largely been a matter of trial and error. We make something new, evaluate its performance, then alter some part of the fabrication process and see whether it performs better or worse, all without direct knowledge of what is changing at the atomic level.
But if we could see what’s going on at that scale—if we could map out each individual atom and understand the role that it plays—we could create new and better materials not through blind experimentation, but through design.
For all that we’ve been able to accomplish while ignoring them, the fact is that individual atoms matter. The speed of a transistor, the efficiency of a solar cell, and the strength of an I-beam are ultimately determined by the configuration of the atoms inside. Today, new and improved microscopy techniques are getting us closer and closer to the goal of being able to see each and every atom within the materials we make—a very exciting prospect.
Over the past three years, I’ve been lucky enough to be part of a team working with one such new and improved microscopy technique, a method called 3-D atom probe tomography, or APT for short. APT is very different from conventional microscopy—at least, the sort of microscopy that I’m accustomed to. In conventional microscopy, we shine a beam of light particles or electrons on our specimen, whatever it is we want to look at, and create a magnified image using lenses or by mapping how our beam bounces off it.
In atom probe tomography, on the other hand, we don’t just look at our specimen—we literally take it apart, atom-by-atom. more> https://goo.gl/c0VdE3
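Taking a specimen apart atom by atom ends in time-of-flight mass spectrometry: each field-evaporated ion is identified by how long it takes to reach the detector. A minimal sketch of that identification step, using the generic energy-conservation relation n·e·V = ½·m·(d/t)² rather than any NIST code, with illustrative numbers:

```python
# Rough sketch of the time-of-flight identification step in atom probe
# tomography (APT): an ion of mass m and charge state n, accelerated
# through voltage V, covers flight distance d in time t, so
# m/n = 2*e*V*(t/d)^2.

E_CHARGE = 1.602176634e-19   # elementary charge, C
AMU = 1.66053906660e-27      # atomic mass unit, kg

def mass_to_charge_amu(voltage_v, flight_m, tof_s):
    """Mass-to-charge ratio (amu per elementary charge) of an ion
    accelerated through `voltage_v` volts that covers `flight_m`
    meters in `tof_s` seconds."""
    return 2.0 * E_CHARGE * voltage_v * (tof_s / flight_m) ** 2 / AMU

# Example: 5 kV specimen voltage, 10 cm flight path, 529 ns flight time.
print(mass_to_charge_amu(5000.0, 0.1, 5.29e-7))  # ~27: singly charged aluminum
```

Binning millions of such ratios into a mass spectrum, together with each ion's detector hit position, is what lets APT rebuild the specimen atom by atom in 3-D.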
Posted in Economic development, Education, How to, Nature, Science, Technology
Tagged 3-D atom probe tomography, APT, Electronics, Physics, Technology, Test & measurement
This Ship Has Sailed: U.S. Navy Commissions An All-Electric Stealth Destroyer Zumwalt For Service
By Tomas Kellner – The U.S. Navy has commissioned for service the USS Zumwalt, its largest and most advanced stealth destroyer. The ceremony took place in Baltimore on Saturday (Oct 15).
The Navy estimates the 15,600-ton vessel can hit a target at a range of more than 60 miles. It also has a wave-piercing tumblehome design and a unique superstructure that make it less visible to enemy radar at sea.
The ship is equally innovative below deck. Traditionally, the Navy powered its vessels with gas turbines driving controllable pitch propellers through large and complex gearboxes. But the new destroyer has on board a 78-megawatt power station supplying electricity to an advanced integrated power system (IPS). GE Marine designed the system, which powers giant GE induction motors connected directly to the propeller shafts and routes electricity to a vast array of sensors, weapons, radar and other critical systems on board.
As a result, the ship will have nearly 10 times as much available power as its predecessors. In fact, the Zumwalt could become the first ship carrying next-generation weapons such as electromagnetic railguns, which use electromagnetic force rather than gunpowder to shoot projectiles. “We’re no longer restricting the engines to provide propulsion power only,” Adam Kabulski, director for naval accounts at GE Power Conversion, told GE Reports.
“This design allows you to send electric power wherever you need it. You can access many megawatts in a short amount of time and convert it into energy. It’s instantaneous.”
The system is also highly redundant. Instead of the typical three-phase motors, the Zumwalt’s advanced induction motors have 15 phases. Kabulski said that by simply reversing the direction of the rotating magnetic field in the motor, for example, the shaft can turn in the opposite direction to give astern power. more> https://goo.gl/9WVdnz
Posted in Economic development, Economy, Energy, Science, Technology, Transportation
Tagged Business improvement, Electric propulsion, GE, Manufacturing, Technology, Test & measurement, Zumwalt
Shhh… The New 737 MAX Redefines a Quiet Airplane
Boeing – “The noise safety standards are becoming increasingly stringent as people start to live closer to airports,” said Barry St. Germaine, a Boeing Test & Evaluation (BT&E) pilot.
“We will not be able to fly into certain airports if we don’t meet the noise requirements, so tests like this ensure we continue to maintain the current market and gain access to new markets around the world,” he said.
The 737 MAX is designed to be 40 percent quieter than today’s Next-Generation 737. Community noise testing is intended to validate that design. more> https://goo.gl/DtYV7H
Posted in Economic development, Economy, Education, Product, Regulations, Science, Technology, Transportation
Tagged Boeing, Business improvement, Industrial economy, Noise, Technology, Test & measurement
The KC-46A: Air Refueling in 3D
Boeing – On January 24th, Boeing and U.S. Air Force aircrews successfully completed the KC-46A tanker’s first refueling flight in the skies above Washington state. Following takeoff from Boeing Field in Seattle, the KC-46A test team worked through a series of test points before smoothly offloading 1,600 pounds of fuel to an F-16 fighter aircraft flying at 20,000 feet.
The KC-46A that accomplished the refueling milestone will soon begin refueling a number of other military aircraft as well, including a C-17, F/A-18, A-10 and AV-8B. Also known as EMD-2, the tanker made its first flight September 25, 2015 and has now completed 32 flights.
The KC-46A is a multirole tanker Boeing is building for the U.S. Air Force that can refuel all allied and coalition military aircraft compatible with international aerial refueling procedures and can carry passengers, cargo and patients. Overall, Boeing plans to build 179 KC-46 aircraft for the U.S. Air Force. more> boeing.com/innovation/
Posted in Economic development, Economy, Energy, Net, Product, Science, Technology, Transportation
Tagged Boeing, Business improvement, KC-46A, Productivity, Technology, Test & measurement