BOM Management: An introduction
By Susan Zimmerlee – What exactly is BOM Management? Is that the same as BOM Configuration Management? Or product variability management? Or Master Data Management? Or a PLM BOM???
The answer seems to be that it depends on who you ask!
BOM management is a tough topic because those words mean something different to each company that I work with. Even within a single company, you could ask different departments and get different answers.
Which bill of materials management or BOM management solution is best for you? I’ve sat on both the selling and buying end of this discussion, and there is no single answer for everybody. It’s like asking – which vehicle is best?
The answer depends on whether you’re hauling heavy loads or trying to get somewhere fast. The BOM management discussion needs to be similar – what do you need your BOM management system to do for you? Whether you make paper towels or spaceships, at a basic level, BOM management is a critical element that takes you from an idea to a delivered product. To have more detailed discussions about BOM management, we need to establish a baseline of some of the key elements involved:
- Part: Managing a part bill of materials, also known as the physical product definition or product master, is commonly the main topic of Master Data Management (MDM) discussions.
- Design: In a design BOM (often called the virtual product definition), mechanical designers and engineering are usually focused on generating the 3D components that make up the product.
- DMU: Digital mock up (or DMU) refers to the ability to virtually view and interrogate your configured BOM throughout its lifecycle.
- BOM Configuration Management: BOM configuration management is the discipline of managing the content of the product definition throughout its lifecycle.
- Variability: Product variability is part of BOM configuration management.
- Architecture: To better manage configuration and product variability, product architectures help to organize similar content across several products.
- Coordinated Change: Coordinating product change across various product representations is an issue that is gaining more and more visibility as products grow more and more complex.
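The elements above can be made concrete with a small data-model sketch: an overloaded (“150%”) BOM in which each usage line may be gated by a variant option, which configuration then resolves into a single variant’s parts list. This is a minimal illustration, not any particular PLM system’s API – the class and option names are hypothetical.

```python
from dataclasses import dataclass, field
from typing import List, Optional, Set, Tuple

@dataclass
class Part:
    """Master part record -- the 'physical product definition'."""
    number: str
    description: str

@dataclass
class BOMLine:
    """One usage of a child item, optionally gated by a variant option."""
    child: "BOMItem"
    quantity: int = 1
    option: Optional[str] = None   # None = present in every variant

@dataclass
class BOMItem:
    part: Part
    lines: List[BOMLine] = field(default_factory=list)

    def configure(self, options: Set[str]) -> List[Tuple[str, int]]:
        """Resolve the overloaded BOM to one variant's flat parts list."""
        parts = [(self.part.number, 1)]
        for line in self.lines:
            if line.option is None or line.option in options:
                for number, qty in line.child.configure(options):
                    parts.append((number, qty * line.quantity))
        return parts
```

For example, a bicycle whose rack appears only in a hypothetical TOURING variant would list the rack on a line with `option="TOURING"`; configuring with an empty option set yields the base product, while configuring with `{"TOURING"}` pulls the rack in.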
Posted in Broadband, Business, Education, How to, Product, Technology
Tagged Business improvement, Industrial economy, Jobs, PLM, Productivity, Siemens PLM, Skills, Technology
By Jay Shambaugh, Ryan Nunn, and Becca Portman – It is difficult to overstate the importance of technological progress for living standards. Consider the example of Argentina and Austria, as shown in figure A. These countries have roughly the same level of per capita inputs (labor and capital), but there is a vast gulf between them in economic output: Austria’s per capita income is more than double Argentina’s.
Labor and capital play vital roles in generating economic output and helping to explain differences in national incomes, but large disparities in per capita national income—in other words, national living standards—are due to the various ways that economies use their resources, and not just to the quantities of resources available.
In the language of growth accounting, total factor productivity (TFP) is the measure of how effective an economy is at producing economic output with a given amount of inputs. Across developed and developing economies, the majority of per capita income differences are due to total factor productivity variation (Hall and Jones 1999; Klenow and Rodríguez-Clare 1997).
In other words, most of per capita income differences are not explained by differences in available capital and labor. Moreover, sustained growth over time in per capita incomes requires growth in TFP (Solow 1957). Without technological progress, increases in labor and capital have a bounded potential to raise per capita income. more>
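The growth-accounting idea can be sketched with the standard Cobb–Douglas decomposition, Y = A·K^α·L^(1−α), where TFP (A) is the residual left after accounting for capital and labor. The sketch below uses illustrative numbers – the capital share α = 0.3 and the input levels are assumptions, not data from the cited studies:

```python
def tfp(output: float, capital: float, labor: float, alpha: float = 0.3) -> float:
    """Solow residual: A = Y / (K**alpha * L**(1 - alpha))."""
    return output / (capital ** alpha * labor ** (1 - alpha))

# Two stylized economies with identical inputs but different output,
# echoing the Argentina/Austria comparison in figure A.
a_low = tfp(output=100.0, capital=50.0, labor=80.0)
a_high = tfp(output=210.0, capital=50.0, labor=80.0)
```

With equal inputs, the ratio of outputs equals the ratio of TFPs – the entire income gap shows up in A, which is the point of the growth-accounting argument.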
Posted in Book review, Business, Economic development, Economy, History, Intellectual Property, Media, Net
Tagged Business improvement, Capital, Industrial economy, Patent, Technology
Boeing and subsidiary Liquid Robotics team up to explore deeper possibilities for autonomous systems
By Dan Raley – Created by Boeing subsidiary Liquid Robotics, the maritime innovation known as the Wave Glider was originally intended to record the songs of migrating whales. When integrated with Boeing’s advanced sensors for defense applications, the Wave Glider can locate undersea vehicles at substantial distances, hunt for mines, monitor land radar, and gather and relay data to other systems, all while operating on solar and wave power for months at a time.
“It’s a hidden treasure,” said Jim Bray, Boeing autonomous systems technology integrator in St. Louis. “There’s a lot going on under the sea.”
Covered with fiberglass panels and small antennas topside and tethered to a wing-like propulsion system beneath it called a sub, the Wave Glider communicates by low-Earth-orbit satellite through a command-and-control unit and surface radio modem, similarly to someone sending a text message by smartphone.
“It’s revolutionary stuff,” said Scott Willcox, Liquid Robotics technology lead. “It’s like reinventing the sail — fundamentally, it’s a new way to get around the ocean. What you can do with it is almost limitless.”
In Ventura, Calif., in July, seven months after Boeing acquired Liquid Robotics, the companies teamed to test new Wave Glider capabilities in the ocean that would be presented to a customer for the first time. The testing demonstrated how transponders placed on the ocean floor by the Wave Glider conceivably could provide an oceanic GPS. An unmanned undersea vehicle in need of updating its location could use these underwater acoustics to determine where it is and never have to surface. more>
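The “oceanic GPS” idea rests on acoustic ranging: a vehicle measures its distance to several transponders at known seafloor positions and solves for its own position. Below is a minimal 2-D least-squares sketch under simplifying assumptions (the function name and positions are illustrative; a real system would also estimate depth and correct for the sound-speed profile):

```python
import numpy as np

def acoustic_fix(beacons, ranges):
    """Position fix from ranges to transponders at known positions.

    Subtracting the first range equation from the others cancels the
    quadratic terms, leaving a linear system solved by least squares.
    """
    beacons = np.asarray(beacons, dtype=float)
    ranges = np.asarray(ranges, dtype=float)
    p0, r0 = beacons[0], ranges[0]
    A = 2.0 * (beacons[1:] - p0)
    b = (np.sum(beacons[1:] ** 2, axis=1) - np.sum(p0 ** 2)
         - (ranges[1:] ** 2 - r0 ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

With three or more transponders in view, the vehicle gets an unambiguous fix from ranges alone – which is why it would never need to surface to update its position.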
Posted in Communication industry, EARTH WATCH, Nature, Net, Technology, Transportation
Tagged Boeing, Business improvement, Net evolution, Ocean, Sensors, Technology
By George Mattathil –
In a nutshell, the current situation with cyber security is the direct result of developments during the “internet bubble” of the 1990s.
The divestiture and the collapse of Bell Labs left a vacuum in network technology leadership, which was filled by the hype surrounding the “internet mania.” As a result, the current network industry is operating on flawed foundational principles.
This added to the deficiencies in economic decision systems for (network) technology adoption, with the results we are seeing today: cyber security challenges, internet malware attacks, and political controversies.
One consequence of the flawed network foundations is that broadband adoption (which includes IoT) is progressing much more slowly than it could.
Another side effect is that ongoing network deployments are architecturally incoherent, resulting in enhanced complexity and cost. more>
Posted in Broadband, Business, Communication industry, CONGRESS WATCH, Economic development, Economy, Education, FCC, History, Net, net neutrality, Telecom industry
Tagged Broadband, Business improvement, Government, Internet, Technology, Technology adoption, United States
Imaging Technique Unlocks the Secrets of 17th Century Artists
By John Toon – The secrets of 17th century artists can now be revealed, thanks to 21st century signal processing. Using modern high-speed scanners and advanced signal processing techniques, researchers at the Georgia Institute of Technology are peering through layers of pigment to see how painters prepared their canvases, applied undercoats, and built up layer upon layer of paint to produce their masterpieces.
The images they produce using the terahertz scanners and the processing technique – which was mainly developed for petroleum exploration – provide an unprecedented look at how artists did their work three centuries ago. The level of detail produced by this terahertz reflectometry technique could help art conservators spot previous restorations of paintings, highlight potential damage – and assist in authenticating the old works.
Beyond old art, the nondestructive technique also has potential applications for detecting skin cancer, ensuring proper adhesion of turbine blade coatings and measuring the thickness of automotive paints.
Without the signal processing, researchers might only be able to identify layers 100 to 150 microns thick. But using the advanced processing, they can distinguish layers just 20 microns thick. Paintings done before the 18th century have been challenging to study because their paint layers tend to be thin, Citrin said. Individual pigments cannot be resolved by the technique, though the researchers hope to be able to obtain that information in the future. more>
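Those resolution figures follow from simple time-of-flight geometry: the terahertz pulse reflects off a layer’s front and back faces, and the delay between the two echoes encodes the thickness. A minimal sketch of the conversion (the refractive index of 2 for a paint layer is an assumed, illustrative value):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def layer_thickness_um(echo_spacing_ps: float, refractive_index: float) -> float:
    """Layer thickness from the delay between front- and back-face echoes.

    The pulse crosses the layer twice (down and back) at speed c/n,
    so d = c * dt / (2 * n).
    """
    dt = echo_spacing_ps * 1e-12                    # ps -> s
    d_m = C * dt / (2.0 * refractive_index)
    return d_m * 1e6                                # m -> microns
```

Under these assumptions, an echo spacing of about 0.27 ps in a layer with n ≈ 2 corresponds to roughly 20 microns – so resolving the thin layers of pre-18th-century paintings hinges on separating echoes only fractions of a picosecond apart, which is what the advanced processing enables.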
- Wearable Computing Ring Allows Users to Write Words and Numbers with Thumb, Jason Maderer
- When Physics Gives Evolution a Leg Up by Breaking One, Ben Brumfield
- Advancing the Path to Organic Electronics Beyond Cell Phone Screens, Ben Brumfield
- A Popular Tool to Trace Earth’s Oxygen History Can Give False Positives, Ben Brumfield
- Contribution statements and author order on research studies still leave readers guessing, Josh Brown
- Apprenticeship Program Helps Students Gain Skills, Péralte C. Paul
- Transfer Technique Produces Wearable Gallium Nitride Gas Sensors, John Toon
- Student Teams Compete in Service Academies Swarm Challenge – with GTRI Assistance, John Toon
- The Kendeda Building for Innovative Sustainable Design Launches on Campus, Lance Wallace
- Creating the Next Code Composers, Stacy Braukman
- Astrobiology Rising at Georgia Tech, A. Maureen Rouhi
- “Instant Replay” for Computer Systems Shows Cyber Attack Details, John Toon
- “Combosquatting” Attack Hides in Plain Sight to Trick Computer Users, John Toon
- Rousing Masses to Fight Cancer with Open Source Machine Learning, Ben Brumfield
- 80 years of the Georgia Tech Research Corporation
- The Force is with Muscle Spindles
- Google Plugs In Georgia Tech Chemistry Team’s Software for its Quantum Computing Product
- National group honors research using lasers and AI to automatically assess health of highway pavement and catalog road signs
Posted in EARTH WATCH, Economic development, Economy, Education, Energy & emissions, Healthcare, History, Science, Technology
Tagged Earth, Georgia Tech, Health, Physics, Space, Technology
By Jake Schwartz – Bureau of Labor Statistics estimates suggest, for example, that there will be 1 million more computing jobs than applicants to fill them by 2020.
Of course, the skills gap is about more than just supply and demand. It stems from what economists call “friction,” exacerbated by megatrends like the shrinking shelf life of skills and persistent equity gaps in K-12 and higher education systems struggling to keep up with the pace of change. But it also reflects decades of self-inflicted wounds within corporate America.
I’ve observed three troubling drivers of the economic friction fueling the skills gap:
- a surprising lack of visibility and long-term planning around concrete skill and talent needs within the enterprise;
- incredible inertia around and adherence to old-school hiring practices that perpetuate growing equity gaps through a search for new skills in conventional places; and
- a tendency to misplace hope that our higher education and workforce development systems can somehow “solve” the problem with minimal corporate involvement or responsibility.
And yet, corporate training fads, from an obsession with online training (it’s cheaper) to a belief that all employees should spend their off-hours being “self-guided learners,” only widen the delta between average investments in talent acquisition ($20,000 to $40,000 per head) and corporate training ($1,000 per person per year).
Imagine the possibilities if just a fraction of that talent-acquisition spending were allocated to investments in re-skilling existing workers. more>
Posted in Broadband, Business, CONGRESS WATCH, Economic development, Economy, Education, Net, Science, Technology
Tagged Business improvement, Government, Jobs, Skills, Technology, Training
Matrix Reimagined: Brand New GE Startup Is Developing Novel Ways To Draw Blood
By Tomas Kellner – Drawbridge, a new business founded by GE Ventures, is building an easy-to-use blood collection device that could be used anywhere — at a clinic in San Francisco, in a remote village in Borneo or potentially even at home. Users will be able to apply the device to the upper arm and activate it. It will then store and stabilize the sample in a special cartridge.
The playing field is huge. The global blood collection market stands at $7 billion, and health professionals in the U.S. alone draw more than 1 billion blood samples every year. Handling blood is also an important factor in treating patients — blood test results reportedly influence 70 percent of clinical decisions.
The blood stabilization technology inside the device, a high-tech paper-like material known as “the matrix,” was originally developed by scientists at GE Global Research, leveraging knowledge and expertise from the GE Healthcare team.
The collection device will draw a small amount of blood and channel it onto the matrix, which stores the sample for later extraction and testing. The matrix also stabilizes the collected blood sample and eliminates the need to refrigerate it, which will simplify transporting it to the lab.
When GE Ventures learned about the technology, Stack and her colleagues thought they could build a business around it, as they did with other companies they’ve launched. more>
By Akshat Rathi – The optimism surrounding renewable energy masks some harsh realities. Despite decades of progress, about 80% of the world’s energy still comes from fossil fuels—the same as in the 1970s. Since then, we’ve kept adding renewable capacity, but it hasn’t outpaced the growth of the world’s population and its demand for energy.
Today, about 30% of total world energy (and 40% of the world’s electricity) is supplied by coal, which emits more carbon dioxide per unit of energy produced than nearly any other fuel source.
The hugely valuable oil and gas industries, accounting for 33% and 24% of total world energy use, respectively, are also entrenched. “Based on what we know now, we would need major technological breakthroughs or weak world growth, including for large emerging and developing economies, for oil demand to peak in the next 20 years,” says Gian Maria Milesi-Ferretti of the International Monetary Fund. Despite the growth in electric vehicles, most oil companies agree that peak oil is “not in sight.”
If you’re still not convinced, consider this: there are a handful of industries essential to the modern way of life that generate large amounts of carbon dioxide as a side product of the chemistry of their manufacturing process. These carbon-intensive industries—including cement, steel, and ethanol—produce about 20% of all global emissions.
If we want to keep using these products and reach zero emissions, the only option is to have these industries deploy carbon capture. more>
Posted in Business, EARTH WATCH, Economic development, Economy, Energy & emissions, History, Leadership, Media, Net, Science, Technology, Transportation
Tagged Carbon capture and storage, Climate change, Earth, Financial crisis, Super regions, Technology
15 middle-class jobs that can’t be automated—a CBR thought experiment
By Howard R. Gold – A much-publicized 2013 study by Oxford University researchers Carl Benedikt Frey and Michael A. Osborne estimates that “about 47 percent of total US employment is at risk” from advances in computerization, particularly machine learning, robotics, and artificial intelligence. Using US Bureau of Labor Statistics data, Frey and Osborne rated 702 occupations on a scale of 0 to 100 percent for risk of displacement by emerging computer technologies. Workers in heavily blue-collar industries such as production, construction, transportation, maintenance and repair, and farming and fisheries face the highest risk, along with white-collar employees in service and sales.
The job categories at lowest risk, according to Frey and Osborne: management; computer, engineering, and science; education, legal, arts, and media; and, of course, health care. The latter accounted for half of the 20 occupations to which Frey and Osborne give the lowest probability of replacement by computerization.
Core skills such as “originality,” “social perceptiveness,” “assisting and caring for others,” “persuasion,” and “negotiation” are the most difficult for computers to replicate, Frey and Osborne determine. (For more, see “If robots take our jobs, will they make it up to us?” July 2017.) more>