By Tom Wheeler – “Here’s how the telecom industry plans to defang their regulators,” a September 12, 2013 Washington Post headline announced. “[T]elecom giants including Verizon, AT&T and Comcast have launched multiple efforts to shift regulation of their broadband business to other agencies that don’t have nearly as much power as the FCC,” the article explained.
The companies’ goal: to move regulatory jurisdiction from the Federal Communications Commission to the Federal Trade Commission (FTC). Strategically, it is a brilliant sleight of hand since the FTC has no rulemaking authority and no telecommunications expertise, yet the companies and the policymakers who support them can trot out the line that the FTC will protect consumers.
With this vote, the FCC walked away from over a decade of bipartisan efforts to oversee the fairness and openness of companies such as Comcast, AT&T, Charter, and Verizon. These four companies control over 75 percent of the residential internet access in America, usually through a local monopoly. Henceforth, they will be able to make their own rules, subject only to very limited after-the-fact review.
The assertion that the FTC will be able to provide that protection adequately is an empty promise. The people at the FTC are good people, but they have neither network expertise nor the authority to make rules. more>
Posted in Broadband, Business, Communication industry, CONGRESS WATCH, Economy, FCC, History, Net, net neutrality, Regulations, telecom
Tagged Broadband, Congress Watch, FCC, Government, Internet, Net Neutrality, United States, Wireline
BOM Management: An introduction
By Susan Zimmerlee – What exactly is BOM Management? Is that the same as BOM Configuration Management? Or product variability management? Or Master Data Management? Or a PLM BOM???
The answer seems to be that it depends on who you ask!
BOM management is a tough topic because those words mean something different to each company that I work with. Even within a single company, you could ask different departments and get different answers.
Which bill of materials management or BOM management solution is best for you? I’ve sat on both the selling and buying end of this discussion, and there is no single answer for everybody. It’s like asking – which vehicle is best?
The answer depends on whether you’re hauling heavy loads or trying to get someplace really fast. The BOM management discussion needs to be similar – what is it that you need your BOM management system to do for you? Whether you make paper towels or space ships, at a basic level, BOM management is a critical element that takes you from an idea to a delivered product. To have more detailed discussions about BOM management, we need to establish a baseline of some of the key elements involved:
- Part: Managing a part bill of materials, also known as the physical product definition or product master, is commonly the main topic of Master Data Management (MDM) discussions.
- Design: In a design BOM (often called the virtual product definition), mechanical designers and engineering are usually focused on generating the 3D components that make up the product.
- DMU: Digital mock up (or DMU) refers to the ability to virtually view and interrogate your configured BOM throughout its lifecycle.
- BOM Configuration Management: BOM configuration management is the discipline of managing the content of the product definition throughout its lifecycle.
- Variability: Product variability – managing the options and variants offered across a product line – is part of BOM configuration management.
- Architecture: To better manage configuration and product variability, product architectures help to organize similar content across several products.
- Coordinated Change: Coordinating product change across various product representations is an issue that is gaining more and more visibility as products grow more and more complex.
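To make the part-BOM idea above concrete, here is a minimal sketch of a multi-level bill of materials and a "flatten" (BOM explosion) pass that totals leaf-part quantities. The part numbers, structure, and function names are invented for illustration and do not come from any particular PLM system:

```python
# A minimal part-centric BOM sketch: each assembly maps to a list of
# (child part number, quantity). All part numbers here are hypothetical.
from collections import Counter

bom = {
    "CART-100": [("FRAME-10", 1), ("WHEEL-20", 4), ("BOLT-5", 8)],
    "WHEEL-20": [("HUB-21", 1), ("TIRE-22", 1), ("BOLT-5", 4)],
}

def flatten(part, qty=1, totals=None):
    """Explode a multi-level BOM into total quantities of leaf parts."""
    totals = Counter() if totals is None else totals
    for child, n in bom.get(part, []):
        if child in bom:                      # subassembly: recurse into it
            flatten(child, qty * n, totals)
        else:                                 # leaf part: accumulate quantity
            totals[child] += qty * n
    return totals

# BOLT-5 is used directly (8) and inside each of 4 wheels (4 * 4 = 16),
# so the exploded BOM should show 24 in total.
totals = flatten("CART-100")
```

Coordinated change, in these terms, is what happens when `WHEEL-20` is revised: every assembly that references it must see a consistent, versioned definition.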
Posted in Broadband, Business, Education, How to, Product, Technology
Tagged Business improvement, Industrial economy, Jobs, PLM, Productivity, Siemens PLM, Skills, Technology
By Jay Shambaugh, Ryan Nunn, and Becca Portman – It is difficult to overstate the importance of technological progress for living standards. Consider the example of Argentina and Austria, as shown in figure A. These countries have roughly the same level of per capita inputs (labor and capital), but there is a vast gulf between them in economic output: Austria’s per capita income is more than double Argentina’s.
Labor and capital play vital roles in generating economic output and helping to explain differences in national incomes, but large disparities in per capita national income—in other words, national living standards—are due to the various ways that economies use their resources, and not just to the quantities of resources available.
In the language of growth accounting, total factor productivity (TFP) is the measure of how effective an economy is at producing economic output with a given amount of inputs. Across developed and developing economies, the majority of per capita income differences are due to total factor productivity variation (Hall and Jones 1999; Klenow and Rodríguez-Clare 1997).
In other words, most of per capita income differences are not explained by differences in available capital and labor. Moreover, sustained growth over time in per capita incomes requires growth in TFP (Solow 1957). Without technological progress, increases in labor and capital have a bounded potential to raise per capita income. more>
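The growth-accounting argument above can be made concrete with the standard Cobb-Douglas decomposition, where TFP is the residual output not explained by capital and labor. The numbers below are purely illustrative (they are not Argentina's or Austria's actual data), and the capital share `alpha` is a conventional assumption:

```python
# Growth-accounting sketch: TFP (the Solow residual) is the part of
# output not explained by measured inputs, from Y = A * K**a * L**(1-a).
# All figures are hypothetical, and alpha = 0.33 is a standard assumption.

def tfp(output, capital, labor, alpha=0.33):
    """Solve Y = A * K**alpha * L**(1 - alpha) for A."""
    return output / (capital ** alpha * labor ** (1 - alpha))

# Two stylized economies with identical per capita inputs but
# different output, echoing the Argentina/Austria comparison.
a = tfp(output=100.0, capital=300.0, labor=50.0)
b = tfp(output=220.0, capital=300.0, labor=50.0)

# With the same K and L, the entire 2.2x output gap shows up as TFP.
ratio = b / a
```

Because the inputs are identical, the full income gap between the two stylized economies is attributed to TFP, which is exactly the Hall-Jones point.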
Posted in Book review, Business, Economic development, Economy, History, Intellectual Property, Media, Net
Tagged Business improvement, Capital, Industrial economy, Patent, Technology
Lost money? Reinvest!
By Erik Kobayashi-Solomon – Investors sometimes play a psychological trick on themselves when they lose money, research suggests—and that mental accounting trick may help improve their investment performance.
According to Cary D. Frydman and David H. Solomon at the University of Southern California and Chicago Booth’s Samuel Hartzmark, investors who sell a losing investment often avoid the psychological pain by immediately reinvesting in another stock. By doing so, instead of thinking of the action as realizing a loss, they frame it as rolling capital into a related investment. The reference point used to compute gains and losses is linked to the amount paid for the original asset.
That mental accounting trick may help them avoid an often-made mistake. A key insight of behavioral finance is that investors, to avoid the pain of realizing a loss, fall prey to the disposition effect: they tend to be more likely to sell winners than losers. But the act of reinvesting makes investors more willing to sell a losing stock and realize a loss sooner. more>
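The reference-point framing described above can be sketched with simple arithmetic. All prices here are hypothetical, chosen only to show how rolling the proceeds changes the mental frame:

```python
# Sketch of the "rolled" reference point described above: when a losing
# stock is sold and the proceeds go straight into a new stock, the mental
# reference price stays anchored to the original purchase, not to the new
# stock's cost basis. Prices are hypothetical.

original_cost = 100.0    # price paid for stock A
sale_price = 80.0        # A is sold at a loss
new_price_later = 105.0  # replacement stock B, bought at 80, seen later

# Framed as two separate trades, the investor realized a 20-point loss.
realized_loss = sale_price - original_cost

# Framed as one rolled position, the episode ends as a gain relative to
# the original reference point, which softens the pain of selling.
rolled_outcome = new_price_later - original_cost
```

Under the rolled frame the sale never "counts" as a loss, which is why, per the research, reinvesting makes investors more willing to sell losers sooner.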
By George Mattathil –
In a nutshell, the current situation with cyber security is a direct result of developments during the “internet bubble” of the 1990s. The collapse of Bell Labs permitted the unchecked growth of the bubble and its related hype.
The divestiture and the collapse of Bell Labs left a vacuum in network technology leadership, a vacuum that was filled by the hype surrounding the “internet mania.” As a result, the network industry today operates on flawed foundational principles.
This compounded the deficiencies in economic decision systems for (network) technology adoption, with the results we are seeing today: cyber security challenges, internet malware attacks and political controversies.
One consequence of these flawed network foundations is that broadband adoption (which includes IoT) is progressing much more slowly than it could.
Another side effect is that ongoing network deployments are architecturally incoherent, resulting in added complexity and cost. more>
Posted in Broadband, Business, Communication industry, CONGRESS WATCH, Economic development, Economy, Education, FCC, History, Net, net neutrality, Telecom industry
Tagged Broadband, Business improvement, Government, Internet, Technology, Technology adoption, United States
By Andrew Soergel – As recently as Wednesday, President Donald Trump was quoted during a Cabinet meeting as saying he sees “no reason why we don’t go to 4 percent, 5 percent and even 6 percent” gross domestic product expansion in the months and years ahead.
Economists have broadly doubted these claims – though few quibble with the idea that the GOP-constructed tax plan would have a modestly positive impact on markets and the economy over the near term. Analyses from the Joint Committee on Taxation, the Tax Policy Center and the University of Pennsylvania’s Wharton Budget Model have all predicted that a final bill, in a best-case scenario, would add a few fractions of a percentage point to the country’s GDP growth rate over the course of the next 10 years.
A growing number of experts are using the term “sugar high” to describe what the tax bill is likely to do to the U.S. economy – provide some short-term energy for growth before petering out or, even worse, pushing the country toward a crash. more>
By Jake Schwartz – Bureau of Labor Statistics estimates suggest, for example, that there will be 1 million more computing jobs than applicants to fill them by 2020.
Of course, the skills gap is about more than just supply and demand. It stems from what economists call “friction,” exacerbated by megatrends like the shrinking shelf life of skills and persistent equity gaps in K-12 and higher education systems struggling to keep up with the pace of change. But it also reflects decades of self-inflicted wounds within corporate America.
I’ve observed three troubling drivers of the economic friction fueling the skills gap:
- a surprising lack of visibility and long-term planning around concrete skill and talent needs within the enterprise;
- entrenched inertia around old-school hiring practices that perpetuate growing equity gaps by searching for new skills in conventional places; and
- a tendency to misplace hope that our higher education and workforce development systems can somehow “solve” the problem with minimal corporate involvement or responsibility.
Imagine the possibilities if just a fraction of that spending were allocated to investments in re-skilling existing workers.
And yet, corporate training fads, from an obsession with online training (it’s cheaper), to a belief that all employees should spend their off-hours being “self-guided learners,” only exacerbate the delta between average investments in talent acquisition ($20,000 to $40,000 per head) and corporate training ($1,000 per person per year). more>
Posted in Broadband, Business, CONGRESS WATCH, Economic development, Economy, Education, Net, Science, Technology
Tagged Business improvement, Government, Jobs, Skills, Technology, Training
Matrix Reimagined: Brand New GE Startup Is Developing Novel Ways To Draw Blood
By Tomas Kellner – Drawbridge, a new business founded by GE Ventures, is building an easy-to-use blood collection device that could be used anywhere — at a clinic in San Francisco, in a remote village in Borneo or potentially even at home. Users will be able to apply the device to the upper arm and activate it. It will then store and stabilize the sample in a special cartridge.
The playing field is huge. The global blood collection market stands at $7 billion, and health professionals in the U.S. alone draw more than 1 billion blood samples every year. Handling blood is also an important factor in treating patients — blood test results reportedly influence 70 percent of clinical decisions.
The blood stabilization technology inside the device, a high-tech paper-like material known as “the matrix,” was originally developed by scientists at GE Global Research, leveraging knowledge and expertise from the GE Healthcare team.
The collection device will draw a small amount of blood and channel it onto the matrix, which stores the sample for later extraction and testing. The matrix also stabilizes the collected blood sample and eliminates the need to refrigerate it, which will simplify transporting it to the lab.
When GE Ventures learned about the technology, Stack and her colleagues thought they could build a business around it, as they did with other companies they’ve launched. more>