Tag Archives: Business improvement

How to Win When You’re Under Attack in a Meeting


Just Listen, Author: Mark Goulston.

By Art Petty – For high-stakes topics involving strategy and investments, you’re in competition with others for attention and resources, and not everyone wants you to win. When faced with a direct or passive-aggressive attack on your ideas and character, your response speaks volumes about your maturity and leadership to everyone involved.

Learn to navigate meeting room confrontations with diplomacy, grace, and a good bit of psychology, and you will go far.

For all sorts of good reasons, we’re wired as humans to recognize dangerous situations quickly and respond accordingly. Our brains shift precious resources away from the slower, reasoning centers and trigger a flood of chemicals preparing us for fight or flight. Drunk on adrenaline, we’re apt either to lash out or to look for the first exit, including shrinking and withdrawing.

Dr. Goulston suggests we run through a simple mantra that allows us to derail the amygdala hijack and maintain our presence of mind.

Your goal is to gain a few precious seconds and work your reboot process. more>


How To Improve Results With The Right Frequency Of Monitoring

By George Bradt – Most understand the need to follow up and monitor progress on a theoretical level. Yet there are few guidelines on how frequently you should do so. Let me suggest that it varies with the nature of what you’re monitoring, ranging from daily (or even more frequently) for tasks to annually for strategic plans.

Ben Harkin examined the value of monitoring and reporting in Psychological Bulletin. The headline finding, “Frequently Monitoring Progress Toward Goals Increases Chance of Success,” holds especially if you make the results public. While he was more focused on personal habits and goals, the findings are applicable to organizational behavior as well.

Here’s my current best thinking on the right frequency of monitoring. The main discriminant is the nature of the work and the level of the people doing it: tighter, more frequent monitoring for tactical efforts, and looser, less frequent monitoring for more strategic efforts.

  • Daily or more frequently – Tasks
  • Weekly – Projects
  • Monthly – Programs
  • Quarterly – Business Reviews, adjustments
  • Annually – Strategic/Organizational/Operational processes
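The cadence list above can be sketched as a simple schedule. This is a hypothetical illustration, not from the article; the interval values and function names are assumptions.

```python
# Hypothetical sketch: encode the monitoring cadences above as review
# intervals, so a tool could compute when the next check-in is due.
from datetime import date, timedelta

# Review intervals in days, mirroring the cadence list (values are assumptions;
# "daily or more frequently" is approximated as once per day).
CADENCE_DAYS = {
    "task": 1,              # daily or more frequently
    "project": 7,           # weekly
    "program": 30,          # monthly
    "business_review": 91,  # quarterly
    "strategic_plan": 365,  # annually
}

def next_review(kind: str, last_review: date) -> date:
    """Return the date the next check-in is due for a given kind of work."""
    return last_review + timedelta(days=CADENCE_DAYS[kind])

print(next_review("project", date(2018, 4, 2)))  # one week later
```

The point of making the cadence explicit is the same one Harkin makes: monitoring that happens on a known, visible schedule is harder to quietly skip.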



Updates from Ciena

What is Fiber Densification?
By Helen Xenos – The term “network densification” is being used more often in relation to wireless network deployments, and more recently “fiber densification” has become a hot topic of discussion. So, what exactly is densification?

Densification simply describes the goal or end state of supporting more capacity within the same area or footprint. It is born of network providers’ need not only to keep up with the growth in bandwidth demand they are seeing, but also to sharpen their competitive edge by delivering a better end-user experience to their customers.

Cable or Multi-Service Operators (MSOs) are undergoing a multi-year upgrade of their Hybrid Fiber Coax (HFC) access infrastructure. To provide a better quality of experience to subscribers, they are delivering higher capacity to smaller groups of homes and pushing fiber closer to the edge of the network.

HFC fiber nodes, which on average serve 500 homes per node, are being replaced with 10 to 12 Digital Fiber nodes each. These new nodes serve 40 to 64 homes apiece, are pushed deeper into the access network, and increase per-user capacity.

An enormous number of digital fiber nodes is expected to be deployed in the next few years, from tens to hundreds of thousands globally in 2018 and 2019. Fiber densification, the ability to pack as much capacity as possible over the limited fiber resources available, is of critical importance to achieving those business objectives.
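A quick back-of-the-envelope check ties the article's node-splitting figures together; the 500-home and 10-to-12-node numbers come from the text, while the function name is just illustrative.

```python
# Splitting one 500-home HFC node into N digital fiber nodes: how many
# homes does each new node serve?
HOMES_PER_HFC_NODE = 500  # average figure cited in the article

def homes_per_digital_node(split_count: int) -> float:
    """Homes served by each digital fiber node after splitting one HFC node."""
    return HOMES_PER_HFC_NODE / split_count

print(homes_per_digital_node(10))  # 50.0 homes per node
print(homes_per_digital_node(12))  # ~41.7 homes per node
```

Both results land inside the 40-to-64-home range the article quotes, and the roughly tenfold reduction in homes per node is where the per-user capacity gain comes from.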

Finally, the simplest example of fiber densification is the hyperscale data center interconnect application. Global content providers are deploying huge amounts of fiber between massive data centers to maintain their aggressive pace of innovation and keep up with the doubling of bandwidth they are seeing on a yearly basis. more>


Think Like a Gambler: Innovation Is About Making Bets


Thinking in Bets, Author: Annie Duke.

By Alan Pentz – As humans we are often overconfident in our decision-making and even if we are unsure, we become more confident after a decision has been made. Studies of confirmation bias show that we seek information confirming our views and filter out evidence to the contrary. That’s a great strategy to feel good in the short term but isn’t going to lead to the best outcomes for your organization in the long term.

Thinking in bets (or thinking probabilistically) forces us out of that framework. Duke points out that people who are asked probabilistic questions are less sure and tend to hedge. It’s easy to say, “I’m 100 percent sure about this,” when nothing is really on the line, but if I ask you how much would you bet that you are right, suddenly the calculus changes.

So how does this impact government innovation? more>


Updates from Siemens

PLM ALM Integration using Teamcenter Linked Data Framework

By Jatish Mathew – Reports from the field indicate that the power window system in a particular car model has a defect. The anti-pinch feature does not work all the time. Customer service files a high priority incident report.

Representatives from different engineering teams meet and try to find the root cause of the problem.

The problem may be due to hardware failure such as a stuck button, it can be in the embedded software, or it can be a combination of hardware-software. Each team analyzes the problem using their tools and processes but when these teams need to coordinate what do they do?

The biggest worry for engineers, when they work with different teams, is that the practices, processes, and tools they use are diverse. How do they ensure that teams effectively collaborate without losing the processes and systems that work well for them?

In this post, we will explore how hardware (PLM domain) and software (ALM domain) teams work together to solve the power window problem. The automotive company in our example uses the Linked Data Framework (Customer Only Access) to integrate and collaborate across domains. It is a framework for integrating different enterprise information systems, such as Product Lifecycle Management (PLM) and Application Lifecycle Management (ALM) systems.

PLM ALM integration using Linked Data Framework helps with the following business problems:

  • How do you implement a process such as change management across different domains such as PLM and ALM?
  • How do you avoid creating new applications, and avoid user training?
  • How do you enable ALM users to access PLM data without learning PLM concepts or new tools?




More Democracy At Work? Do We Need That?

By Peter Scherrer – It is, in my view, more necessary now than ever before to put the fight for more democracy at work on the political agenda. Yet it is an issue to which neither the general public nor the EU political élite pays much, if any, attention, even though it is of great importance for millions of working people. The European Trade Union Confederation (ETUC), at its Executive Committee meeting this week, acted in that spirit and adopted a strategy to that end.

ETUC members are deeply convinced that a European approach to democracy at work can directly improve working life, collective labor rights and the concrete participation of workers in society and the economy.

The performance of EU Member States like Sweden, Denmark, Germany and Austria demonstrates that extending workers’ participation rights in companies and in administration is not an obstacle to a productive and profitable economy.

Many EU member countries have developed fair rights to information and consultation and a significant number have workers’ representation on company boards. The active involvement of trade unionists and workers’ representatives contributes to economic success and employment stability.

A glance at the current situation shows that democracy at work is being eroded by, for example, the increasing centralization of company decision-making and the growing concealment of real ownership. This widening gap could be partly closed by European legislation on workers’ participation.

But a huge danger to the options for more democratic labor/industrial relations comes from the rapid growth in the proportion of ‘digital’ workers and employees in the so-called sharing economy. more>


Updates from Ciena

Densifying the Network, One Small Cell at a Time

By Wayne Hickey – Mobile network usage is growing at an astounding 42% CAGR as data rates rise, driven by an insatiable customer appetite for video, gaming, social media, and live streaming. With the omnipresence of smartphones, the advance toward 5G, and mobile data as the major use case, MNOs (Mobile Network Operators) struggle to keep up with growing customer demands.

There are three primary ways that MNOs can add capacity to their wireless network:

  1. Buy more spectrum
  2. Improve spectral efficiency, to get more capacity out of existing spectrum
  3. Densify the network, by adding more cell sites, while reusing available spectrum

A mobile network must be designed to physically reach the intended number of subscribers and adapt to the changing capacity needs of those subscribers. To do so, MNOs segment their networks by base-station coverage, using macro cells and small cells (e.g. micro cells, pico cells, nano cells, femtocells, and even WiFi cells, or hotspots).

Macro cells cover large geographic areas while the various types of small cells cover much smaller and varied geographic areas serving fewer end-users, both indoor and outdoor.

Macro cell sites use high-powered radios, generally for large coverage areas. Small cells use much lower-power radios, require less space, and increase data capacity through proliferation, or densification, of the network. Densification means deploying lots of small cells to enable more overall users, lower latency, better mobile device battery life, and expanded coverage. The approach is basically to reuse spectrum over and over, keeping each coverage area small and managing interference between cells with a variety of techniques. more>
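The spectrum-reuse idea above can be illustrated with a tiny model. This is a sketch under assumed numbers (the 20 MHz channel, the spectral efficiency, and the cell counts are not from the article), but it shows why densification multiplies aggregate capacity.

```python
# Illustrative sketch: if every cell reuses the same spectrum, aggregate
# capacity scales with the number of cells, not with the spectrum owned.
def aggregate_capacity_mbps(cells: int, bandwidth_mhz: float,
                            spectral_efficiency_bps_per_hz: float) -> float:
    """Total network capacity in Mb/s when all cells reuse the same channel."""
    per_cell_mbps = bandwidth_mhz * 1e6 * spectral_efficiency_bps_per_hz / 1e6
    return cells * per_cell_mbps

# One macro cell vs. the same spectrum reused across ten small cells
# (assumed: a 20 MHz channel at 2 bit/s/Hz).
macro_only = aggregate_capacity_mbps(cells=1, bandwidth_mhz=20,
                                     spectral_efficiency_bps_per_hz=2.0)
densified = aggregate_capacity_mbps(cells=10, bandwidth_mhz=20,
                                    spectral_efficiency_bps_per_hz=2.0)
print(macro_only, densified)  # 40.0 Mb/s vs 400.0 Mb/s
```

In practice the gain is less than linear, because neighboring cells interfere; managing that interference is exactly the engineering work the article alludes to.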


Updates from GE

Industrial Medicine: Cell Therapy Scales Up
By Maggie Sieger – Cell therapy is a new way to treat serious diseases like cancer by extracting living cells from a donor or a patient, changing them so they can recognize and attack diseased cells or deliver treatment, and returning them to the patient’s body. But manufacturing the cells is a costly and time-consuming endeavor. A single dose can cost hundreds of thousands of dollars to make.

That’s because in the more than 900 ongoing regenerative medicine trials worldwide — a 19 percent jump since 2016 — researchers generally manufacture each patient’s dose of bio-engineered cells by hand. The individualized nature of cell therapy makes it not only prohibitively pricey, but also difficult to scale into commercial production.

That hasn’t been a problem while cell therapy was still confined to research labs. But as medical science advances and regulators approve a growing number of modified cell therapies for general use, handcrafting doses won’t be enough. “It’s relatively easy to do 15 or 20 doses by hand, but it’s nearly impossible to efficiently make thousands,” says GE Healthcare’s Aaron Dulgar-Tulloch, director of cell therapy research and development at the Centre for Advanced Therapeutic Cell Technologies (CATCT) in Toronto.

One way to speed the process is GE Healthcare’s FlexFactory for cell therapy. Cellular Biomedicine Group Inc. (CBMG) will be the first company to install this closed, semi-automated system for manufacturing bio-engineered cells in its Shanghai plant and use it to create cell therapies to treat various blood and solid tumor cancers. more>


Ten Keys To Launching An Agile Transformation In A Large Firm

By Steve Denning – The successful Agile transformations that I have seen in large organizations have typically begun without authority or budget resources. That’s because at the outset the organization usually doesn’t understand what Agile is or what it is getting into. This can lead to some despair among Agile coaches as to whether Agile transformation is even possible in large organizations.

In fact, a comprehensive survey of successful organizational change in large organizations by Larry Prusak and Tom Davenport back in 2003 concluded that deep change rarely begins at the very top of a large organization. In part, that’s because the CEO is usually too busy to understand what’s involved or give it the commitment that it needs. It’s also because, if the change is led from the top, it risks being perceived as “just another command-and-control brainwave.”

In theory, the change could also be led by someone at the lowest level of the organization, though it can be hard for people at that level to see what’s going on beyond their own unit, or to acquire the organizational knowledge or the social capital to mobilize broader support.

So typically, the change begins at the middle, or upper-middle, of the organization and follows a certain pattern.

The pattern is similar to what I saw in a large and very change-resistant organization, the World Bank, where I was working in the late 1990s and where I, quixotically, set out to effect a change in its strategy without any budget resources or authority to do so. The organizational transformation in question wasn’t Agile, but it was a big, deep change involving a shift in organizational culture.

The dynamic that I experienced at the World Bank (the whips and scorns, the opposition, the skullduggery) is something I’ve seen play out in many organizations implementing Agile. If your challenge is an Agile transformation in a large organization, here are ten fundamental characteristics that you are likely to encounter. more>


Updates from GE

Making Waves: GE Unveils Plans To Build An Offshore Wind Turbine The Size Of A Skyscraper, The World’s Most Powerful
By Tomas Kellner – These turbines come with a 12-megawatt generator sitting 150 meters above the waves. Each will be capable of powering 16,000 homes and producing 67 gigawatt-hours per year, based on wind conditions on a typical German North Sea site — that’s 45 percent more energy than any other offshore wind turbine available today.

“We asked ourselves ‘What is the biggest rotor we would still feel comfortable with?’ and then we pushed ourselves some more,” Vincent Schellings recalls. “From a technology perspective, it seems like a stretch. But we know it’s doable. The beauty of the turbine is that it gives an edge over the competition. There’s nothing like this. Not even close.”

The size matters. The huge rotor allows the engineers to catch a lot more wind and ramp up what the industry calls “capacity factor.” This number describes the amount of power the turbine can produce per year at a given site, versus the energy it could have generated had it run full power all the time.

GE’s Haliade-X clocks in at 63 percent, “five to seven points higher than the competition,” Schellings says. “Basically, every point of capacity factor is worth $7 million per 100 megawatts for our customers. That’s a nice upside.” more>
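The capacity-factor definition given above can be checked against the numbers quoted for the Haliade-X (a 12 MW rating and 67 GWh per year on a typical German North Sea site); the function name is just illustrative.

```python
# Capacity factor: annual energy actually produced, divided by what a
# machine of that rating would produce running at full power all year.
HOURS_PER_YEAR = 8760

def capacity_factor(annual_energy_gwh: float, rated_mw: float) -> float:
    """Fraction of theoretical maximum annual output actually produced."""
    max_energy_gwh = rated_mw * HOURS_PER_YEAR / 1000  # MWh -> GWh
    return annual_energy_gwh / max_energy_gwh

cf = capacity_factor(annual_energy_gwh=67, rated_mw=12)
print(f"{cf:.1%}")  # ~63-64%, in line with the 63 percent quoted
```

The small gap from the quoted 63 percent is just rounding in the 67 GWh figure; the arithmetic confirms the article's numbers are internally consistent.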