Category Archives: Technology

How To Improve Results With The Right Frequency Of Monitoring

By George Bradt – Most understand the need to follow up and monitor progress on a theoretical level. Yet there are few guidelines on how frequently you should do that. Let me suggest that it varies with the nature of what you’re monitoring, ranging from daily or even more frequently for tasks to annually for strategic plans.

Ben Harkin discussed the value of monitoring and reporting in Psychological Bulletin. His headline: “Frequently Monitoring Progress Toward Goals Increases Chance of Success” – especially if you make the results public. While he focused more on personal habits and goals, the findings apply to organizational behavior as well.

Here’s my current best thinking on the right frequency of monitoring. The main discriminant is the nature of the work and the level of the people doing it, with tighter, more frequent monitoring of tactical efforts and looser, less frequent monitoring of more strategic efforts.

  • Daily or more frequently – Tasks
  • Weekly – Projects
  • Monthly – Programs
  • Quarterly – Business Reviews, adjustments
  • Annually – Strategic/Organizational/Operational processes



Four things that matter more than the Paris Agreement

In a new report, “Undiplomatic Action: A practical guide to the new politics and geopolitics of climate change,” David Victor and Bruce Jones write:

“Without confidence in new technologies and the policy and investment support that follows from that confidence, even the most advanced and elaborated global diplomatic agreements can only produce an ever-wider chasm between stated goals and realistically achievable outcomes.”

They contend that “real world” actions on the ground, not global goals, will drive energy transitions at the local level and in the private sector.

In the paper, they outline four key factors they believe matter even more than the global agreement:

  1. Facilitate leadership through small groups
  2. Focus on near-term emissions reductions
  3. Invest in technological innovation
  4. Demonstrate success and enable better governance


Source: Four things that matter more than the Paris Agreement


Updates from GE

Leading The Charge: As Battery Storage Sweeps The World, GE Finds Its Place In The Sun
By Tomas Kellner – The “duck curve” has two distinct peaks — one in the morning and the other after sunset — connected by a sagging belly pulled down by the deluge of renewable energy generated by the millions of solar panels sprinkled across California’s roofs and fields.

On a sunny Sunday, this glut of solar output can even lead to oversupply, a situation where wholesale energy prices drop so low that producers pay utilities to take their energy.

The problem reverses when the sun sinks into the Pacific. Power producers must quickly crank up their plants – many of them burning gas or coal – to replace those missing solar electrons with 11,000 megawatts to keep the state’s homes and businesses humming.

“The peak for solar power generation is at noon,” says Eric Gebhardt, vice president of strategic technology for GE Power. “What if you could store this energy and release it six hours later when the sun goes down and people come home, start cooking dinner and watch TV?”

That’s precisely the point of GE’s Reservoir, a new grid-scale energy storage system the company unveiled today. The grid has to be perfectly balanced, meaning that power supply and demand match, to prevent it from crashing.

The Reservoir will allow producers to “decouple when energy is produced and when it is consumed,” Gebhardt says. “Without it, if you have too much solar during the day, the only option you have is to curtail production.”
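
To make that decoupling concrete, here is a minimal sketch of how grid-scale storage soaks up a midday solar surplus and releases it into the evening ramp. The numbers and the dispatch rule are invented for illustration; this is not the Reservoir’s actual control logic.

```python
# Toy battery dispatch: charge on surplus (negative net load), discharge on deficit.
# All figures are invented; real systems also model efficiency, degradation and markets.
def dispatch(net_load_mw, capacity_mwh, power_mw):
    """net_load_mw: hourly demand minus solar (negative = surplus solar)."""
    stored = 0.0
    residual = []                                    # what conventional plants still supply
    for net in net_load_mw:
        if net < 0:                                  # surplus: charge the battery
            charge = min(-net, power_mw, capacity_mwh - stored)
            stored += charge
            residual.append(net + charge)            # anything left over is curtailed
        else:                                        # deficit: discharge the battery
            discharge = min(net, power_mw, stored)
            stored -= discharge
            residual.append(net - discharge)
    return residual

# Midday surplus (-30 MW) shifted into the evening peak (35-40 MW):
print(dispatch([10, -30, -30, 5, 40, 35], capacity_mwh=50, power_mw=25))
# -> [10, -5, -5, 0, 15, 15]: less curtailment at noon, a smaller ramp after sunset
```

The battery flattens both ends of the duck curve: less midday curtailment, and a smaller evening ramp for gas and coal plants to cover.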

The rise of the electric car unleashed innovation in the battery space, and the spread of solar power has brought costs down 50 percent over the last four years, says Keith Longtin, product breakout leader at GE Global Research in Niskayuna, New York. “You are now getting to a point where energy storage starts to make sense,” he says. more>


Updates from Ciena

What is Fiber Densification?
By Helen Xenos – The term “network densification” is being used more often in relation to wireless network deployments, and more recently, “fiber densification” has become a hot topic of discussion. So, what exactly is densification?

Densification simply describes the goal or end state of supporting more capacity within the same area or footprint. It is born of network providers’ need not only to keep up with the growth in bandwidth demand they are seeing, but also to sharpen their competitive edge by delivering a better end-user experience to their customers.

Cable or Multi-Service Operators (MSOs) are undergoing a multi-year upgrade of their Hybrid Fiber Coax (HFC) access infrastructure. To provide a better quality of experience to subscribers, they are delivering higher capacity to smaller groups of homes and pushing fiber closer to the edge of the network.

HFC fiber nodes, which on average serve 500 homes each, are being replaced with 10 to 12 digital fiber nodes apiece. These new nodes serve 40 to 64 homes each, are pushed deeper into the access network, and increase per-user capacity.
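
To see why splitting nodes raises per-user capacity, here is a back-of-the-envelope sketch. The 10 Gb/s feed per node is a hypothetical figure, not from Ciena, and the 10-way split is one point in the 10-to-12 range cited above.

```python
# Illustrative arithmetic only: how a node split changes the shared capacity per home.
HOMES_PER_HFC_NODE = 500                 # legacy HFC node serves ~500 homes on average
NEW_NODES_PER_HFC_NODE = 10              # each HFC node replaced by 10-12 digital fiber nodes
homes_per_new_node = HOMES_PER_HFC_NODE / NEW_NODES_PER_HFC_NODE   # 50, within the 40-64 range

NODE_FEED_GBPS = 10                      # hypothetical capacity feeding each node
before_mbps = NODE_FEED_GBPS * 1000 / HOMES_PER_HFC_NODE    # ~20 Mb/s shared per home
after_mbps = NODE_FEED_GBPS * 1000 / homes_per_new_node     # ~200 Mb/s shared per home
print(before_mbps, after_mbps)
```

With the same feed per node, a tenfold node split gives each home roughly ten times the shared capacity, which is the point of pushing fiber deeper.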

An enormous number of digital fiber nodes is expected to be deployed in the next few years, from tens to hundreds of thousands globally in 2018 and 2019. Fiber densification, the ability to pack as much capacity as possible over the limited fiber resources available, is critically important to achieving these business objectives.

Finally, the simplest example of fiber densification is the hyperscale data center interconnect application. Global content providers are deploying huge amounts of fiber between massive data centers to maintain their aggressive pace of innovation and keep up with the doubling of bandwidth they are seeing on a yearly basis. more>


Updates from Siemens

PLM ALM Integration using Teamcenter Linked Data Framework

By Jatish Mathew – Reports from the field indicate that the power window system in a particular car model has a defect. The anti-pinch feature does not work all the time. Customer service files a high priority incident report.

Representatives from different engineering teams meet and try to find the root cause of the problem.

The problem may be due to a hardware failure such as a stuck button, it may lie in the embedded software, or it may be a combination of the two. Each team analyzes the problem using its own tools and processes, but when these teams need to coordinate, what do they do?

The biggest worry for engineers working across teams is that the practices, processes, and tools each team uses are different. How do they ensure that teams collaborate effectively without losing the processes and systems that work well for them?

In this post, we will explore how hardware (PLM domain) and software (ALM domain) teams work together to solve the power window problem. The automotive company in our example uses the Linked Data Framework (Customer Only Access) to integrate and collaborate across domains. It is a framework for integrating different enterprise information systems, such as Product Lifecycle Management (PLM) and Application Lifecycle Management (ALM) systems.

PLM ALM integration using the Linked Data Framework helps with the following business problems (a rough sketch of the linked-data idea follows the list):

  • How do you implement a process such as change management across different domains such as PLM and ALM?
  • How do you avoid creating new applications, and avoid user training?
  • How do you enable ALM users to access PLM data without learning PLM concepts or new tools?
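
As a rough illustration of how linked data keeps each team in its own tooling, assume a hypothetical setup in which the PLM change notice and the ALM problem report stay in their own systems and are connected only by URIs, in the style of OSLC-type linked data. The resource shapes, identifiers, and URLs below are invented for illustration and are not Teamcenter’s actual data model.

```python
# Hypothetical cross-domain links: each domain keeps its own artifact and simply
# points at the other via a URI, so neither team has to adopt the other's tools.
plm_change_notice = {
    "@id": "https://plm.example.com/changes/CN-4711",            # invented PLM URL
    "title": "Rework anti-pinch sensor bracket",
    "affects": "https://plm.example.com/parts/window-lift",       # hardware item in PLM
    "implementedBy": "https://alm.example.com/issues/PR-982",     # link into the ALM domain
}

alm_problem_report = {
    "@id": "https://alm.example.com/issues/PR-982",               # invented ALM URL
    "title": "Anti-pinch detection intermittently fails",
    "relatedChangeRequest": "https://plm.example.com/changes/CN-4711",  # back-link to PLM
}
```

Because only links cross the domain boundary, an ALM user can follow a URI to see the PLM change in context without learning PLM concepts, and a process such as change management can span both domains.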




How Bitcoin Ends

By Douglas Rushkoff – Bitcoin was a clever idea. Idealistic, even. But it isn’t working out quite as its developers imagined. In fact, once all the coin has been mined, bitcoin will simply reinforce the very banking system it was invented to disrupt.

Watching the bitcoin phenomenon is a bit like watching the three-decade decline of the internet from a playspace for the counterculture to one for venture capitalists. We thought the net would break the monopoly of top-down, corporate media. But as business interests took over, it has become primarily a delivery system for streaming television to consumers, and consumer data to advertisers.

Likewise, bitcoin was intended to break the monopoly of the banking system over central currency and credit. But, in the end, it will turn into just another platform for the big banks to do the same old extraction they always have. Here’s how.

Central currency is not the only kind of money that ever existed. For many centuries, gold and other precious metals served as money.

In essence, bitcoin is money built and maintained by nerds, based on the premise that good nerds will outnumber the bad nerds. Sure, bad actors can dedicate all of their processing power to fake transactions, but they will be outnumbered by those who want the token to work properly.
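
For readers unfamiliar with what “dedicating processing power” means here, the toy proof-of-work sketch below shows why faking transactions is expensive: finding a valid block requires brute-force hashing, and whoever controls the most hashing power determines which history wins. This is an illustration only, not Bitcoin’s actual protocol parameters.

```python
import hashlib

def mine(block_data: str, difficulty_zeros: int = 4) -> int:
    """Find a nonce whose SHA-256 hash starts with the required number of zero hex digits.
    A toy stand-in for proof-of-work; Bitcoin uses double SHA-256 and a far harder target."""
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith("0" * difficulty_zeros):
            return nonce
        nonce += 1

# Cheap at toy difficulty, prohibitively expensive at real difficulty:
print(mine("alice pays bob 1 BTC"))
```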

What is the incentive for people to spend millions of dollars on computers and power once there’s no more kickback of coin? more>


The future of political warfare: Russia, the West, and the coming age of global digital competition

By Alina Polyakova and Spencer Phipps Boyer – The Kremlin’s political warfare against democratic countries has evolved from overt to covert influence activities. But while Russia has pioneered the toolkit of asymmetric measures for the 21st century, including cyberattacks and disinformation campaigns, these tools are already yesterday’s game. Technological advances in artificial intelligence (AI), automation, and machine learning, combined with the growing availability of big data, have set the stage for a new era of sophisticated, inexpensive, and highly impactful political warfare.

In the very near term, it will become more difficult, if not impossible, to distinguish between real and falsified audio, video, or online personalities. Malicious actors will use these technologies to target Western societies more rapidly and efficiently. As authoritarian states such as Russia and China invest resources in new technologies, the global competition for the next great leap in political warfare will intensify.

As the battle for the future shifts to the digital domain, policymakers will face increasingly complex threats against democracies. The window to mount an effective “whole-of-society” response to emerging asymmetric threats is quickly narrowing. more>


Updates from GE

Industrial Medicine: Cell Therapy Scales Up
By Maggie Sieger – Cell therapy is a new way to treat serious diseases like cancer by extracting living cells from a donor or a patient, changing them so they can recognize and attack diseased cells or deliver treatment, and returning them to the patient’s body. But manufacturing the cells is a costly and time-consuming endeavor. A single dose can cost hundreds of thousands of dollars to make.

That’s because in the more than 900 ongoing regenerative medicine trials worldwide — a 19 percent jump since 2016 — researchers generally manufacture each patient’s dose of bio-engineered cells by hand. The individualized nature of cell therapy makes it not only prohibitively pricey, but also difficult to scale into commercial production.

That hasn’t been a problem while cell therapy was still confined to research labs. But as medical science advances and regulators approve a growing number of modified cell therapies for general use, handcrafting doses won’t be enough. “It’s relatively easy to do 15 or 20 doses by hand, but it’s nearly impossible to efficiently make thousands,” says GE Healthcare’s Aaron Dulgar-Tulloch, director of cell therapy research and development at the Centre for Advanced Therapeutic Cell Technologies (CATCT) in Toronto.

One way to speed the process is GE Healthcare’s FlexFactory for cell therapy. Cellular Biomedicine Group Inc. (CBMG) will be the first company to install this closed, semi-automated system for manufacturing bio-engineered cells in its Shanghai plant and use it to create cell therapies to treat various blood and solid tumor cancers. more>


Updates from GE

Making Waves: GE Unveils Plans To Build An Offshore Wind Turbine The Size Of A Skyscraper, The World’s Most Powerful
By Tomas Kellner – These turbines come with a 12-megawatt generator sitting 150 meters above the waves. Each will be capable of powering 16,000 homes and producing 67 gigawatt-hours per year, based on wind conditions on a typical German North Sea site — that’s 45 percent more energy than any other offshore wind turbine available today.

“We asked ourselves ‘What is the biggest rotor we would still feel comfortable with?’ and then we pushed ourselves some more,” Vincent Schellings recalls. “From a technology perspective, it seems like a stretch. But we know it’s doable. The beauty of the turbine is that it gives an edge over the competition. There’s nothing like this. Not even close.”

The size matters. The huge rotor allows the engineers to catch a lot more wind and ramp up what the industry calls “capacity factor.” This number describes the amount of energy the turbine actually produces per year at a given site, versus the energy it could have generated had it run at full power all the time.

GE’s Haliade-X clocks in at 63 percent, “five to seven points higher than the competition,” Schellings says. “Basically, every point of capacity factor is worth $7 million per 100 megawatts for our customers. That’s a nice upside.” more>
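
As a sanity check on those figures, capacity factor is simply annual energy delivered divided by what the turbine would produce running flat out all year. The short sketch below reproduces roughly the quoted 63 percent from the 12 MW rating and 67 GWh per year cited above, and multiplies out the stated customer value of a 5-to-7-point edge.

```python
# Capacity factor = actual annual energy / (rated power * hours in a year).
rated_power_mw = 12
annual_energy_mwh = 67_000           # 67 GWh per year at a typical German North Sea site
hours_per_year = 8_760

capacity_factor = annual_energy_mwh / (rated_power_mw * hours_per_year)
print(f"{capacity_factor:.0%}")      # ~64%, in line with the quoted 63 percent

# Quoted value: each point of capacity factor is worth ~$7M per 100 MW installed,
# so a 5-to-7-point advantage is roughly $35M-$49M per 100 MW.
print(5 * 7, "to", 7 * 7, "million dollars per 100 MW")
```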


Updates from Georgia Tech

Robot Monitors Chicken Houses and Retrieves Eggs
By John Toon – “Today’s challenge is to teach a robot how to move in environments that have dynamic, unpredictable obstacles, such as chickens,” said Colin Usher, a research scientist in GTRI’s Food Processing Technology Division.

“When busy farmers must spend time in chicken houses, they are losing money and opportunities elsewhere on the farm. In addition, there is a labor shortage when it comes to finding workers to carry out manual tasks such as picking up floor eggs and simply monitoring the flocks. If a robot could successfully operate autonomously in a chicken house 24 hours a day and seven days a week, it could then pick up floor eggs, monitor machinery, and check on birds, among other things. By assigning one robot to each chicken house, we could also greatly reduce the potential for introductions of disease or cross-contamination from one house to other houses.”

The autonomous robot is outfitted with an ultrasonic localization system similar to GPS but better suited to an indoor environment where GPS might not be available. This system uses low-cost ultrasonic beacons that indicate the robot’s orientation and its location in a chicken house. The robot also carries a commercially available time-of-flight camera, which provides three-dimensional (3D) depth data by emitting light signals and then measuring how long they take to return. Together, the localization and 3D data allow the robot’s software to plan navigation around chickens to perform its tasks. more>
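
As a rough illustration of the time-of-flight principle mentioned above, depth is recovered from how long the emitted light takes to bounce back: the light travels the distance twice, so the one-way depth is half the round trip multiplied by the speed of light. This generic sketch is not GTRI’s actual sensor pipeline.

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def depth_from_round_trip(round_trip_s: float) -> float:
    """One-way distance to the surface: the light covers the path twice."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2

# A return after ~13.3 nanoseconds corresponds to a surface about 2 meters away.
print(depth_from_round_trip(13.3e-9))   # ~1.99 m
```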