By Tom Wheeler – “Here’s how the telecom industry plans to defang their regulators,” a September 12, 2013 Washington Post headline announced. “[T]elecom giants including Verizon, AT&T and Comcast have launched multiple efforts to shift regulation of their broadband business to other agencies that don’t have nearly as much power as the FCC,” the article explained.
The companies’ goal: to move regulatory jurisdiction from the Federal Communications Commission to the Federal Trade Commission (FTC). Strategically, it is a brilliant sleight of hand since the FTC has no rulemaking authority and no telecommunications expertise, yet the companies and the policymakers who support them can trot out the line that the FTC will protect consumers.
With this vote, the FCC walked away from over a decade of bipartisan efforts to oversee the fairness and openness of companies such as Comcast, AT&T, Charter, and Verizon. These four companies control over 75 percent of the residential internet access in America, usually through a local monopoly. Henceforth, they will be able to make their own rules, subject only to very limited after-the-fact review.
The assertion that the FTC will be able to provide that protection adequately is an empty promise. The people at the FTC are good people, but they have neither network expertise nor the authority to make rules. more>
Posted in Broadband, Business, Communication industry, CONGRESS WATCH, Economy, FCC, History, Net, net neutrality, Regulations, telecom
Tagged Broadband, Congress Watch, FCC, Government, Internet, Net Neutrality, United States, Wireline
BOM Management: An introduction
By Susan Zimmerlee – What exactly is BOM Management? Is that the same as BOM Configuration Management? Or product variability management? Or Master Data Management? Or a PLM BOM?
The answer seems to be that it depends on who you ask!
BOM management is a tough topic because those words mean something different to each company that I work with. Even within a single company, you could ask different departments and get different answers.
Which bill of materials management or BOM management solution is best for you? I’ve sat on both the selling and buying end of this discussion, and there is no single answer for everybody. It’s like asking – which vehicle is best?
The answer depends on whether you’re hauling heavy loads or trying to get someplace really fast. The BOM management discussion needs to be similar – what do you need your BOM management system to do for you? Whether you make paper towels or spaceships, at a basic level, BOM management is a critical element that takes you from an idea to a delivered product. To have more detailed discussions about BOM management, we need to establish a baseline of some of the key elements involved:
- Part: Managing a part bill of materials, also known as the physical product definition or product master, is commonly the main topic of Master Data Management (MDM) discussions.
- Design: In a design BOM (often called the virtual product definition), mechanical designers and engineering are usually focused on generating the 3D components that make up the product.
- DMU: Digital mock-up (DMU) refers to the ability to virtually view and interrogate your configured BOM throughout its lifecycle.
- BOM Configuration Management: BOM configuration management is the discipline of managing the content of the product definition throughout its lifecycle.
- Variability: Product variability is part of BOM configuration management.
- Architecture: To better manage configuration and product variability, product architectures help to organize similar content across several products.
- Coordinated Change: Coordinating product change across various product representations is an issue that is gaining visibility as products grow increasingly complex.
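At its core, a BOM is a tree of parts with quantities, and the most basic operation a BOM management system performs is “rolling up” the total quantity of every component needed to build a product. Here is a minimal sketch of that idea; the part numbers, names, and structure are purely hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class Part:
    """A node in a bill of materials: a part plus its child parts with quantities."""
    number: str
    description: str
    children: list[tuple["Part", int]] = field(default_factory=list)  # (child, qty per parent)

def flatten(part: Part, qty: int = 1) -> dict[str, int]:
    """Roll up the total quantity of every part needed to build `qty` units."""
    totals = {part.number: qty}
    for child, per_parent in part.children:
        for num, n in flatten(child, qty * per_parent).items():
            totals[num] = totals.get(num, 0) + n
    return totals

# Hypothetical product structure: a cart with two axles, each carrying two wheels.
wheel = Part("W-100", "Wheel")
axle = Part("A-200", "Axle", [(wheel, 2)])
cart = Part("C-300", "Cart", [(axle, 2)])

print(flatten(cart))  # → {'C-300': 1, 'A-200': 2, 'W-100': 4}
```

Real systems layer effectivity dates, revisions, and variant rules on top of this tree, which is where the configuration management and variability disciplines above come in.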
Posted in Broadband, Business, Education, How to, Product, Technology
Tagged Business improvement, Industrial economy, Jobs, PLM, Productivity, Siemens PLM, Skills, Technology
By Jay Shambaugh, Ryan Nunn, and Becca Portman – It is difficult to overstate the importance of technological progress for living standards. Consider the example of Argentina and Austria, as shown in figure A. These countries have roughly the same level of per capita inputs (labor and capital), but there is a vast gulf between them in economic output: Austria’s per capita income is more than double Argentina’s.
Labor and capital play vital roles in generating economic output and helping to explain differences in national incomes, but large disparities in per capita national income—in other words, national living standards—are due to the various ways that economies use their resources, and not just to the quantities of resources available.
In the language of growth accounting, total factor productivity (TFP) is the measure of how effective an economy is at producing economic output with a given amount of inputs. Across developed and developing economies, the majority of per capita income differences are due to total factor productivity variation (Hall and Jones 1999; Klenow and Rodríguez-Clare 1997).
In other words, most of per capita income differences are not explained by differences in available capital and labor. Moreover, sustained growth over time in per capita incomes requires growth in TFP (Solow 1957). Without technological progress, increases in labor and capital have a bounded potential to raise per capita income. more>
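The growth-accounting logic can be made concrete with a standard Cobb-Douglas sketch, where TFP is backed out as the Solow residual. The numbers below are illustrative only, not data from the article:

```python
# Cobb-Douglas production function: Y = A * K**alpha * L**(1 - alpha).
# TFP (the Solow residual) is recovered as A = Y / (K**alpha * L**(1 - alpha)).

alpha = 0.3  # capital's share of income, a conventional assumption

def tfp(output: float, capital: float, labor: float) -> float:
    """Back out total factor productivity from observed output and inputs."""
    return output / (capital ** alpha * labor ** (1 - alpha))

# Two hypothetical economies with identical inputs but different output,
# echoing the Argentina/Austria comparison in the text:
low = tfp(output=100.0, capital=300.0, labor=50.0)
high = tfp(output=220.0, capital=300.0, labor=50.0)

print(high / low)  # with identical inputs, the output gap is entirely a TFP gap: 2.2
```

Because inputs are held fixed, the entire income gap between the two economies shows up as a difference in A, which is the point Hall and Jones make empirically.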
Posted in Book review, Business, Economic development, Economy, History, Intellectual Property, Media, Net
Tagged Business improvement, Capital, Industrial economy, Patent, Technology
Lost money? Reinvest!
By Erik Kobayashi-Solomon – Investors sometimes play a psychological trick on themselves when they lose money, research suggests—and that mental accounting trick may help improve their investment performance.
According to Cary D. Frydman and David H. Solomon at the University of Southern California and Chicago Booth’s Samuel Hartzmark, investors who sell a losing investment often avoid the psychological pain by immediately reinvesting in another stock. By doing so, instead of thinking of the action as realizing a loss, they frame it as rolling capital into a related investment. The reference point used to compute gains and losses is linked to the amount paid for the original asset.
That mental accounting trick may help them avoid an often-made mistake. A key insight of behavioral finance is that investors, to avoid the pain of realizing a loss, fall prey to the disposition effect: they tend to be more likely to sell winners than losers. But the act of reinvesting makes investors more willing to sell a losing stock and realize a loss sooner. more>
Boeing and subsidiary Liquid Robotics team up to explore deeper possibilities for autonomous systems
By Dan Raley – Created by Boeing subsidiary Liquid Robotics, this maritime innovation known as the Wave Glider was originally intended to record the songs of migrating whales. When integrated with Boeing’s advanced sensors for defense applications, the Wave Glider can locate undersea vehicles at substantial distances, hunt for mines, monitor land radar, and gather and relay data to other systems, all while operating on solar and wave power for months at a time.
“It’s a hidden treasure,” said Jim Bray, Boeing autonomous systems technology integrator in St. Louis. “There’s a lot going on under the sea.”
Covered with fiberglass panels and small antennas topside and tethered to a wing-like propulsion system beneath it called a sub, the Wave Glider communicates by low-Earth-orbit satellite through a command-and-control unit and surface radio modem, similarly to someone sending a text message by smartphone.
“It’s revolutionary stuff,” said Scott Willcox, Liquid Robotics technology lead. “It’s like reinventing the sail — fundamentally, it’s a new way to get around the ocean. What you can do with it is almost limitless.”
In Ventura, Calif., in July, seven months after Boeing acquired Liquid Robotics, the companies teamed to test new Wave Glider capabilities in the ocean that would be presented to a customer for the first time. The testing demonstrated how transponders placed on the ocean floor by the Wave Glider conceivably could provide an oceanic GPS. An unmanned undersea vehicle in need of updating its location could use these underwater acoustics to determine where it is and never have to surface. more>
Posted in Communication industry, EARTH WATCH, Nature, Net, Technology, Transportation
Tagged Boeing, Business improvement, Net evolution, Ocean, Sensors, Technology
By George Mattathil –
In a nutshell, the current cyber security situation is a direct result of developments during the “internet bubble” of the 1990s. The collapse of Bell Labs permitted the unchecked growth of the bubble and its attendant hype.
The divestiture and the collapse of Bell Labs left a vacuum in network technology leadership, which was filled by the hype surrounding the “internet mania.” As a result, the network industry today operates on flawed foundational principles.
This compounded the deficiencies in economic decision systems for (network) technology adoption, with the results we see today: cyber security challenges, internet malware attacks, and political controversies.
One consequence of the flawed network foundations is that broadband adoption (which includes IoT) is progressing much more slowly than it could.
Another side effect is that ongoing network deployments are architecturally incoherent, resulting in increased complexity and cost. more>
Posted in Broadband, Business, Communication industry, CONGRESS WATCH, Economic development, Economy, Education, FCC, History, Net, net neutrality, Telecom industry
Tagged Broadband, Business improvement, Government, Internet, Technology, Technology adoption, United States
Imaging Technique Unlocks the Secrets of 17th Century Artists
By John Toon – The secrets of 17th century artists can now be revealed, thanks to 21st century signal processing. Using modern high-speed scanners and advanced signal processing techniques, researchers at the Georgia Institute of Technology are peering through layers of pigment to see how painters prepared their canvases, applied undercoats, and built up layer upon layer of paint to produce their masterpieces.
The images they produce using the terahertz scanners and the processing technique – originally developed for petroleum exploration – provide an unprecedented look at how artists did their work three centuries ago. The level of detail produced by this terahertz reflectometry technique could help art conservators spot previous restorations of paintings, highlight potential damage – and assist in authenticating the old works.
Beyond old art, the nondestructive technique also has potential applications for detecting skin cancer, ensuring proper adhesion of turbine blade coatings and measuring the thickness of automotive paints.
Without the signal processing, researchers might only be able to identify layers 100 to 150 microns thick. But using the advanced processing, they can distinguish layers just 20 microns thick. Paintings done before the 18th century have been challenging to study because their paint layers tend to be thin, Citrin said. Individual pigments cannot be resolved by the technique, though the researchers hope to be able to obtain that information in the future. more>
- Wearable Computing Ring Allows Users to Write Words and Numbers with Thumb, Jason Maderer
- When Physics Gives Evolution a Leg Up by Breaking One, Ben Brumfield
- Advancing the Path to Organic Electronics Beyond Cell Phone Screens, Ben Brumfield
- A Popular Tool to Trace Earth’s Oxygen History Can Give False Positives, Ben Brumfield
- Contribution statements and author order on research studies still leave readers guessing, Josh Brown
- Apprenticeship Program Helps Students Gain Skills, Péralte C. Paul
- Transfer Technique Produces Wearable Gallium Nitride Gas Sensors, John Toon
- Student Teams Compete in Service Academies Swarm Challenge – with GTRI Assistance, John Toon
- The Kendeda Building for Innovative Sustainable Design Launches on Campus, Lance Wallace
- Creating the Next Code Composers, Stacy Braukman
- Astrobiology Rising at Georgia Tech, A. Maureen Rouhi
- “Instant Replay” for Computer Systems Shows Cyber Attack Details, John Toon
- “Combosquatting” Attack Hides in Plain Sight to Trick Computer Users, John Toon
- Rousing Masses to Fight Cancer with Open Source Machine Learning, Ben Brumfield
- 80 years of the Georgia Tech Research Corporation
- The Force is with Muscle Spindles
- Google Plugs In Georgia Tech Chemistry Team’s Software for its Quantum Computing Product
- National group honors research using lasers and AI to automatically assess health of highway pavement and catalog road signs
Posted in EARTH WATCH, Economic development, Economy, Education, Energy & emissions, Healthcare, History, Science, Technology
Tagged Earth, Georgia Tech, Health, Physics, Space, Technology