Effective supply chain management relies on everyone involved in the supply chain network providing timely, accurate and consistent information to identify issues and streamline the process.
If supply chain managers want to transform their supply chains in 2019, they must work to drive information transparency, tracking of goods, and optimization. As the supply chain becomes more fragmented, digital transformation and bringing stakeholders onto a common SCM platform will provide significant benefits.
One research team is trying to tackle the growing problem of plastic waste ending up in the ocean.
Purdue University researchers have created a new chemical conversion technique that could turn 90 percent of polyolefin waste, a common form of plastic, into more beneficial products like clean fuels, pure polymers, naphtha and monomers.
“Our strategy is to create a driving force for recycling by converting polyolefin waste into a wide range of valuable products, including polymers, naphtha [a mixture of hydrocarbons], or clean fuels,” Linda Wang, the Maxine Spencer Nichols Professor in the Davidson School of Chemical Engineering at Purdue University and leader of the research team developing this technology, said in a statement.
“Our conversion technology has the potential to boost the profits of the recycling industry and shrink the world’s plastic waste stock.”
The world is facing two related long-term crises: climate change and the accelerating depletion of coal, oil, and natural gas reserves.
Scientists who monitor global climate conditions recommend reducing CO2 emissions, but many countries are failing to do so.
World Energy Council data on fossil fuel consumption rates, measured against known reserves, indicate that most countries will run out over the next 100 years. Still, coal is proving difficult to abandon.
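The "measured against known reserves" comparison above is essentially a reserves-to-production (R/P) ratio. A minimal sketch of that arithmetic, using illustrative placeholder figures rather than actual World Energy Council data:

```python
# Reserves-to-production (R/P) ratio: a rough estimate of how many
# years a reserve lasts at the current rate of consumption, assuming
# flat consumption and no new discoveries. The figures below are
# hypothetical placeholders, not real World Energy Council numbers.

def years_remaining(reserves, annual_consumption):
    """Return the R/P ratio in years."""
    return reserves / annual_consumption

# Hypothetical reserves and annual consumption in the same units.
fuels = {
    "coal": (700_000, 5_300),
    "oil": (240_000, 4_500),
    "natural gas": (180_000, 3_100),
}

for fuel, (reserves, consumption) in fuels.items():
    print(f"{fuel}: ~{years_remaining(reserves, consumption):.0f} years")
```

Real R/P estimates are more involved (consumption rates change, reserves are revised), but the ratio is the standard first-order measure behind "will run out over the next 100 years" claims.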
The difficulty in replacing nonrenewable energy sources comes down to cost.
We propose the United States undertake a $1B Manhattan-like project (herein called “Manhattan 2”) that brings together about one thousand of the brightest engineering minds to develop low-cost alternatives to existing infrastructure over a five-year period.
The goal is to provide a means for every solar-suitable home to more fully utilize solar energy, and to dramatically reduce energy consumed by each home.
When faced with difficult business decisions, you may have to choose the riskier option.
Courage is not a word often applied to technology management. That's unfortunate, because it aptly describes the difficult decisions managers face.
What seems like straightforward progress can run into nasty surprises that few could have predicted. Let’s take a look at some that have caught my attention, and then I’ll tell you of one such situation while I was CEO of Data Translation.
The interlocking toy system was the brainchild of Godtfred Kirk Christiansen, son of a Danish toymaker. His father Ole started the company in 1932 and named it Lego—a twist of the Danish words leg godt, meaning “play well.”
Their first plastic bricks, modeled on an earlier British design, were not very popular until Godtfred hit upon the idea of inventing a system of compatible toys.
Christiansen first received a U.S. patent for a “toy building brick” in 1961. That original design of a rectangular plastic piece with eight “primary projections” (studs) on the top and three “secondary projections” (tubes) underneath has remained virtually unchanged for nearly six decades.
IBM unveiled new technology to reduce power outages by helping energy companies predict where trees and other vegetation may threaten power lines.
IBM worked with Oncor, the largest utility company in Texas and the fifth largest in the U.S., to develop a solution tailored to the energy and utility industry that helps improve operations and provide reliable electric service for millions of customers across the state.
The Weather Company Vegetation Management – Predict is built on IBM PAIRS Geoscope, a technology developed by IBM Research.
The system processes massive, complex geospatial and time-based datasets collected by satellites, drones, aerial flights, millions of Internet of Things (IoT) sensors, and weather models.
The concept of the digital twin has been around for several years, but it is only now coming to the forefront thanks to the Internet of Things (IoT). The digital twin has a wide range of uses, from validating models with real-world data to planning production processes and predicting failures in the field.
Shankar Raman, director of portfolio development for digital manufacturing at Siemens PLM, notes there are three things about the digital twin that make the case for its value.
“I have been to multiple shows and exhibit floors and people are still asking what the digital twin is and how it enables digitization,” Raman told Design News.
“The key is that there are the three factors that matter in the digital twin. The digital twin allows us to understand, to predict, and to optimize. Those three components of the digital twin drive positive business outcomes.”
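One of the uses named above, predicting failure in the field, boils down to comparing live measurements against the values the twin predicts and flagging drift. A minimal sketch of that idea, with entirely hypothetical sensor names, values, and thresholds:

```python
# Illustrative digital-twin divergence check: compare field readings
# against the twin's predicted values and flag readings that drift
# beyond a relative tolerance, which may indicate impending failure.
# All data and the 5% threshold here are hypothetical.

def flag_divergence(predicted, measured, tolerance=0.05):
    """Return indices where a measurement deviates from the twin's
    prediction by more than the given relative tolerance."""
    flagged = []
    for i, (p, m) in enumerate(zip(predicted, measured)):
        if abs(m - p) > tolerance * abs(p):
            flagged.append(i)
    return flagged

# Twin-predicted bearing temperatures vs. readings from the field.
predicted = [70.0, 71.2, 72.5, 73.1]
measured = [70.3, 71.0, 78.9, 80.2]  # last two readings drift high

print(flag_divergence(predicted, measured))  # → [2, 3]
```

Production systems use far richer physics-based or statistical models, but the understand/predict/optimize loop Raman describes rests on exactly this kind of model-versus-reality comparison.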
Our studies have found leading companies were more likely to use digital tools at every phase of the research and development (R&D) process. The laggards might use digital tools for technical activities but not for management activities, such as planning and project management.
Those in the vanguard were particularly likely to use product data management (PDM) and product lifecycle management (PLM) systems, citing the benefits such systems provide in reducing both product and development costs.
Though advanced industries, and R&D in particular, have been slower to digitize than many other sectors, there is a sometimes-overlooked silver lining. Simply introducing technology and transparency into decision-making and operations often leaves engineers and R&D staff more empowered, with a better experience and more time to focus on what they like most: designing and building.
In an earlier essay, we examined the challenges and opportunities that come with tech-enabled transformations. Chief among the challenges is that companies often take a piecemeal approach, leading to suboptimal outcomes.
Maybe quantum computing is a job for artificial intelligence.
To call quantum computing complicated is a gross understatement. Rather than any single complex challenge, quantum computing is a series of obstacles all superimposed (pun intended) onto each other.
Even though quantum processors based on superconducting circuits already exist in labs today, they don’t compare in speed or processing power to today’s typical desktop, laptop, and tablet computers.
Even if you can settle on materials, a physical architecture, and a form factor for your quantum device, you’re still faced with the very real difficulties of actually measuring quantum signals so you can take advantage of the processing and storage enhancements offered by quantum computing.