Category Archives: Communication industry

The internet of (economic) things

By Jonathan Sallet – Robert Gordon argues that, with the exception of a decade starting in the mid-1990s, information networks have not driven productivity the way electricity transformed the American manufacturing sector in the 20th century. But some now believe that the internet of things (IoT) can boost productivity growth by increasing the efficiency of traditional business operations such as manufacturing, transportation, and retail. Whether the United States can return to historical productivity growth levels is critical to the American economy.

IoT standards raise a series of policy questions: Are industry standards being set in a pro-competitive fashion?

Are companies complying with their obligations under standards (a question featured in an analogous context in the recent Federal Trade Commission complaint against Qualcomm)?

And what kind of role should government play in establishing the standards at the outset? more> https://goo.gl/p3Zwph

All The Things Wrong With the Web Today, According to its Inventor

By Joon Ian Wong – Tim Berners-Lee isn’t particularly pleased with the way things have gone with his creation.

Advertising’s pernicious effect on the news. The web is cleaving into the haves and have-nots of news readership. Wealthy readers will pay to opt out of advertising; less privileged readers will have to stick with news that’s ad-supported.

Social networks are ignoring their responsibility to the truth. Social networks absorb their users’ personal data, but wind up “disempowering” those same users by isolating them from the wider web.

Online privacy is a “human right” that’s being trampled. Government surveillance and corporate monetization of personal data threaten web users’ right to privacy. more> https://goo.gl/kqgTNp

3 Ways Exponential Technologies are Impacting the Future of Learning


By Sveta McShane – Exponential technologies have a tendency to move from a deceptively slow pace of development to a disruptively fast pace. We often disregard or don’t notice technologies in the deceptive growth phase, until they begin changing the way we live and do business. Driven by information technologies, products and services become digitized, dematerialized, demonetized and/or democratized and enter a phase of exponential growth.

In the past three decades, jobs requiring routine manual or routine cognitive skills have declined as a percent of the labor market. On the other hand, jobs requiring solving unstructured problems, communication, and non-routine manual work have grown.

The best chance of preparing young people for decent paying jobs in the decades ahead is helping them develop the skills to solve these kinds of complex tasks. more> https://goo.gl/UemBt9

What’s Wrong With America’s Current Approach to Cybersecurity?

By Gregory Michaelidis – Go behind the headlines of the latest megahack, and what you’ll find is a growing public-safety and national-security crisis.

We are barely discussing how to help people help themselves in the digital world, let alone do their part in protecting our major networks and critical infrastructure.

Until we embrace a vision of public cybersecurity that sees people, at all ranges of skill, as essential to our collective security, there will be no widespread cybersecurity.

Right now, America’s collective cybersecurity effort is headed toward near-certain failure for reasons within our own control. In less than a decade — thanks to the influx of dollars and high-level policy and press attention — cybersecurity has been transformed from what is actually a “people problem with a technology component” into its exact opposite.

Official Washington and Silicon Valley have adopted a set of faulty assumptions about cybersecurity and internalized them to such a degree it’s practically a new religion, somewhere between late-19th-century technological determinism and medieval alchemy. more> https://goo.gl/elH8r2

90 years later, the broadcast public interest standard remains ill-defined

By Jack Karsten – The public interest standard has governed broadcast radio and television since Congress passed the Radio Act of 1927. However, decades of successive court cases and updated telecommunications laws have done little to clarify what falls into the public interest.

Prior to the public interest standard, free speech advocates argued with the broadcasting industry over who should have editorial control over content. Industry groups opposed a common carrier approach that would have allowed anyone to buy airtime. The resulting compromise established a short-term renewable licensing regime, overseen by the Federal Communications Commission since 1934, which required broadcasters to act on behalf of all others who did not receive a license. Congress granted the FCC the flexibility to revise its interpretation of the public interest standard to reflect changing circumstances. Since its founding, the FCC has repeatedly refused to set forth its own concrete definition of the public interest.

The Telecommunications Act of 1996 updated the 1934 Communications Act, but did not address the public interest standard beyond maintaining the status quo. more> https://goo.gl/AfmULj

How Ethernet Will Get to 400Gbps

By Lynnette Reese – The IEEE 802.3bs standard for 400Gbps is on track to be ratified and released late this year. Higher-speed technologies tend to be adopted as soon as they become available.

In 2004, 10Gbps was the leading edge. In 2010, 40Gbps and 100Gbps Ethernet were introduced. How did we get this far, so fast?

The present group is leveraging a parallel-lane structure to get to 400Gbps. For electrical interfaces, the fastest per-lane speed in the spec will be 50Gbps. For optical fiber transmission, the lane configuration depends on the distance one requires.

Technically, 400Gbps is not possible without switching away from non-return-to-zero (NRZ) encoding, the scheme that everyone thinks of when they visualize Ethernet communication and other serial data transmission.

NRZ data is encoded into a binary pattern with fixed voltage levels. A binary 0 is represented by the lower voltage level; the higher voltage level indicates binary 1. In 1000BASE-T Ethernet, the stream of 0s and 1s is driven at a 1000 megabits per second (1Gbps) transmission rate.

At present, the physical “wall” of streaming 0s and 1s for single lane electrical interfaces is 25 Gbps, found in the standards as 802.3bj across backplanes and cables, and 802.3bm across chip-to-chip and chip-to-module interfaces.

In May 2016, an IEEE 802.3 task force formed to develop a single-lane 50 Gbps Ethernet standard. The 802.3bs standard, which defines 400Gbps in aggregate, will use an encoding scheme called PAM4 (4-Level Pulse Amplitude Modulation) to reach 50Gbps per channel. PAM4 is an encoding scheme that doubles the bit rate by providing four signal levels in the space of the two that NRZ presently provides. PAM4 cleverly divides the least significant bit (LSB) signal level in half and adds it to the signal of the most significant bit (MSB). more> https://goo.gl/fcDF8f
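The doubling that PAM4 achieves can be sketched as follows. The function name, the integer level set {-3, -1, 1, 3}, and the straight binary bit-pair mapping are illustrative assumptions for clarity — real PAM4 links typically Gray-code the two bits onto the levels:

```python
# Illustrative sketch of PAM4: each symbol carries TWO bits, chosen from
# four amplitude levels, doubling the bit rate at the same symbol rate.
LEVELS = [-3, -1, 1, 3]  # four equally spaced levels (hypothetical units)

def pam4_encode(bits):
    """Consume bits in (MSB, LSB) pairs; each pair selects one of four levels."""
    assert len(bits) % 2 == 0, "PAM4 consumes bits two at a time"
    symbols = []
    for i in range(0, len(bits), 2):
        msb, lsb = bits[i], bits[i + 1]
        # Straight binary mapping: the MSB picks the half, the halved LSB
        # contribution picks the level within it (the MSB + LSB/2 construction).
        symbols.append(LEVELS[(msb << 1) | lsb])
    return symbols

# Eight bits become four symbols: twice the bits of NRZ per unit time.
print(pam4_encode([0, 0, 0, 1, 1, 0, 1, 1]))  # [-3, -1, 1, 3]
```

The trade-off is that the four levels are packed into the same voltage swing that NRZ splits in two, so each level has one third of the eye opening, making PAM4 more sensitive to noise.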

The Future of Work — 3 Mega-Trends

By Graham Brown-Martin – Technology is just part of a broader spectrum of human activity, and social change is driven by society rather than machines; that is, we have agency to act independently and make free choices.

The path of innovation and its social consequences are almost entirely shaped by society, as a result of numerous social factors such as culture, politics, regulatory mechanisms, and economic arrangements. The last of these is particularly apposite given the post-WWII obsession with neoclassical economics, as taught in most universities.

Political decisions supported by economic frameworks have excluded citizens from the discourse and, as a result, are now unraveling across the western world. It turns out that the things we value most are the things that are difficult or impossible to measure.

This obsession with economics — measuring what could be measured and ignoring what couldn’t — gave us global agencies such as the World Bank, the IMF, and the OECD.

But these organizations have been unable to apply their frameworks, wedded as they are to the single metric of GDP, to the world’s most pressing challenges, such as climate change, a growing population, and rising inequality; rather, they have exacerbated them. more> https://goo.gl/DywzVb

The future of the open internet — and our way of life — is in your hands

By Quincy Larson – So far, the story of the internet has followed the same tragic narrative that’s befallen other information technologies over the past 160 years:

  • the telegraph
  • the telephone
  • cinema
  • radio
  • television

Each of these had roughly the same story arc:

  1. Inventors discovered the technology.
  2. Hobbyists pioneered the applications of that technology, and popularized it.
  3. Corporations took notice. They commercialized the technology, refined it, and scaled it.
  4. Once the corporations were powerful enough, they tricked the government into helping them lock the technology down. They installed themselves as “natural monopolies.”
  5. After a long period of stagnation, a new technology emerged to disrupt the old one. Sometimes this would dislodge the old monopoly. But sometimes it would only further solidify it.

And right now, we’re in step 4 of the open internet’s narrative. We’re surrounded by monopolies.

The problem is that we’ve been in step 4 for decades now. And there’s no step 5 in sight. The creative destruction that the economist Joseph Schumpeter first observed in the early 1900s has yet to materialize. more> https://goo.gl/dFd7MK

Updates from Georgia Tech

New Projects Create a Foundation for Next-Gen Flexible Electronics
By Josh Brown – Four projects set to move forward at the Georgia Institute of Technology aim to lay the groundwork for manufacturing next-generation flexible electronics, which have the potential to make an impact on industries ranging from health care to defense.

Researchers at Georgia Tech are partnering with Boeing, Hewlett Packard Enterprise, General Electric, and DuPont, as well as other research institutions such as Binghamton University and Stanford University, on the projects.

Flexible electronics are circuits and systems that can be bent, folded, stretched or conformed without losing their functionality. The systems are often created using machines that can print components such as logic, memory, sensors, batteries, antennas, and various passives using conductive ink on flexible surfaces. Combined with low-cost manufacturing processes, flexible hybrid electronics unlock new product possibilities for a wide range of electronics used in the health care, consumer products, automotive, aerospace, energy and defense sectors.

“Flexible electronics will make possible new products that will help us address problems associated with food supply, clean water, clean energy, health, infrastructure, and safety and security,” said Suresh Sitaraman, a professor in the George W. Woodruff School of Mechanical Engineering, who is leading Georgia Tech’s flexible electronics activities. more> https://goo.gl/qjx3UT

Related>

The identity threat

By Teri Takai – The big problem for many government agencies is that most of them still rely on declarative legacy roles, rubber-stamping certifications and manual processes to manage identities and roles — all of which expose them to continual and multiple access risks. External threat actors compromise identities to evade detection from existing defenses, while insiders work under the radar to access data for exfiltration.

To provide a robust defense and protect the identity-based perimeter, government agencies must consider new thinking and approaches.

The core issue is that security leaders are not attacking the evolving security landscape through proactive planning and change management. Instead, they are stuck in a reactive mode.

It is not hard to understand why: the user profile is 24-7, global, instantaneous, and rich in consumer-driven IT. more> https://goo.gl/X59JUA