
Updates from Datacenter.com

Private Cloud vs Public Cloud: what is the best solution?
Datacenter.com – Cloud computing spans a range of classifications, types and architecture models. The transformative networked computing model can be categorized into three major types: Public Cloud, Private Cloud and Hybrid Cloud.

Hybrid IT has rapidly proven that it offers the flexibility to deliver new software applications and enhanced features quickly, a critical form of agility in the age of digital business. With that in mind, enterprises now need to identify the best distribution of services and applications and their strategy for connecting to the clouds they use.

This article explores the key differences between Public and Private cloud environments.

Public Cloud refers to the cloud computing model in which IT services are delivered across the Internet. The computing functionality may range from common services such as email, apps and storage to enterprise-grade OS platforms or infrastructure environments used for software development and testing. The cloud vendor is responsible for developing, managing and maintaining the pool of computing resources shared between multiple tenants across the network.

The advantages of Public Cloud solutions for business customers include:

  • No investments required to deploy and maintain the IT infrastructure;
  • Flexible pricing options based on different SLA offerings.

However, there are disadvantages as well, including:

  • The total cost of ownership (TCO) can rise exponentially for large-scale usage, especially for midsize to large enterprises, as the rough sketch below illustrates.
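To see how that crossover can happen, here is a minimal back-of-the-envelope sketch: a flat public cloud price per instance against an on-premises model with high fixed cost but low marginal cost. All prices are invented placeholders, not vendor quotes.

```python
# Hypothetical TCO comparison: public cloud vs. on-premises.
# All prices below are invented placeholders, not vendor quotes.

CLOUD_PER_INSTANCE = 150.0    # assumed monthly public cloud price per VM
ONPREM_FIXED = 20_000.0       # assumed monthly amortized capex + facilities
ONPREM_PER_INSTANCE = 40.0    # assumed marginal monthly cost per VM on owned hardware

def monthly_tco(instances: int) -> tuple[float, float]:
    """Return (public_cloud, on_premises) monthly cost for a given fleet size."""
    cloud = instances * CLOUD_PER_INSTANCE
    onprem = ONPREM_FIXED + instances * ONPREM_PER_INSTANCE
    return cloud, onprem

for n in (10, 100, 500):
    cloud, onprem = monthly_tco(n)
    cheaper = "cloud" if cloud < onprem else "on-prem"
    print(f"{n:>4} instances: cloud ${cloud:>9,.2f} vs on-prem ${onprem:>9,.2f} -> {cheaper}")
```

With these made-up numbers, public cloud wins at small fleet sizes and loses past roughly 180 instances; that is the shape of the TCO argument above, though real break-even points depend entirely on actual workloads and contracts.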

more>

Related>

What’s Great Power Competition? No One Really Knows

By Katie Bo Williams – More than a year since the new National Defense Strategy refocused the U.S. military away from counterinsurgency and back towards the country’s greatest strategic competitors, some policy and strategy experts say the Pentagon hasn’t yet figured out how to “compete” with Russia and China.

In fact, it hasn’t even settled on a definition for the “competition” in “great power competition.”

The uncertainty has left former officials scratching their heads about how, specifically, the Defense Department plans to counter China and Russia beneath the threshold of armed conflict. It also appears to be pulling the Pentagon’s policy planners beyond their traditional purview of fighting and winning wars.

“The NDS has two pieces to it: it says you have to compete with China and Russia and prepare for conflict with China and Russia,” said Mara Karlin, a former deputy assistant defense secretary for strategy and force development. “Those are different. The way you would manage and develop your force is different depending on which one you are biasing towards.” more>

Updates from Ciena

500G transpacific. Yep, we did that!
The news from SubOptic? Let’s start with our successful single-wavelength 500G field trial over a 9,000km transpacific cable. Ciena’s Brian Lavallée explains more about this milestone as well as other highlights from this important technical conference.
By Brian Lavallée – SubOptic 2019 has recently come to a close, and as the locals say, “laissez les bons temps rouler”, or let the good times roll – and they did.

We shared the news of a successful single-wavelength 500G field trial over a 9,000km transpacific cable, which was completed just before the event. Of course, this means we can also do 500G single-wavelength transmission across much shorter transatlantic distances too. The transpacific field trial leveraged our very latest WaveLogic 5 Extreme coherent optical technology, which truly takes our pioneering submarine networking solution, GeoMesh Extreme, to the extreme.

In just under a decade, we’ve leaped from 10G to 500G transpacific – a truly impressive feat.

How did we achieve such performance?

By leveraging advanced Digital Signal Processing (DSP) capabilities, 95Gbaud operation, Probabilistic Constellation Shaping (PCS), throughput-optimized FEC, and nonlinear mitigation techniques. more>
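As a quick sanity check on how those numbers fit together, here is a back-of-the-envelope sketch. It assumes dual-polarization coherent transmission, which the article does not state explicitly; the calculation is illustrative, not Ciena’s published link budget.

```python
# Rough check: what does 500G on a single 95 Gbaud wavelength imply?
# Assumption (not stated in the article): dual-polarization coherent transmission.

NET_RATE_GBPS = 500        # reported net single-wavelength rate
SYMBOL_RATE_GBAUD = 95     # reported symbol rate
POLARIZATIONS = 2          # assumed dual polarization

bits_per_symbol = NET_RATE_GBPS / (SYMBOL_RATE_GBAUD * POLARIZATIONS)
print(f"Net bits per symbol per polarization: {bits_per_symbol:.2f}")
# ~2.63 net bits/symbol/pol: consistent with PCS shaping a denser QAM
# constellation down to a reach-optimized operating point, with the rest
# of the raw symbol capacity spent on FEC overhead.
```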

Related>

Updates from Datacenter.com

Planning a hybrid cloud implementation? Don’t forget the importance of the network
Almost every company is working on some form of cloud transformation, and we’ve noticed that almost everyone is pursuing a hybrid cloud strategy. Because hybrid puts a wide range of on-premises, hosted and cloud-based services side by side, it will only be cost-effective if you can establish reliable, secure connectivity between the various elements of your hybrid architecture.
Datacenter.com – Today, everything has to be on demand.

The network is often forgotten when IT teams are planning hybrid cloud transformation projects. Without properly dimensioned legacy-to-public cloud connectivity, transformation projects can be compromised and run into serious customer experience problems.

“Transformation projects can be paralyzed without properly dimensioned legacy-to-public cloud connectivity”

This is why organizations are now trying to request and order on-demand capacity from the network as they need it, reducing traditional constraints such as capex, delays at external suppliers and long project timelines.

From one to multiple networks

By connecting on demand, you can also adjust the bandwidth up or down to suit your project. For example, if you perform a major update on your cloud platform, or if your IT services go into production as quickly as they are built during the development and testing phase of your project, you can adjust your bandwidth based on the (temporary) need.

“By connecting on demand, you can also adjust the bandwidth up or down to suit your project” more>
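As a sketch of what ordering capacity on demand can look like from an operations script, here is a hypothetical example. The endpoint, port identifier, field names and token are all invented for illustration; this is not a real Datacenter.com or carrier API.

```python
# Hypothetical on-demand bandwidth adjustment around a temporary workload.
# The API, endpoint, and field names below are invented for illustration.
import requests

API = "https://api.example-fabric.net/v1"     # placeholder endpoint
HEADERS = {"Authorization": "Bearer <token>"} # placeholder credential

def set_bandwidth(port_id: str, mbps: int) -> None:
    """Request a new committed bandwidth on an existing cloud-connect port."""
    resp = requests.patch(f"{API}/ports/{port_id}",
                          json={"bandwidth_mbps": mbps}, headers=HEADERS)
    resp.raise_for_status()

set_bandwidth("port-123", 10_000)  # burst to 10 Gbps for a migration window
# ... run the bulk transfer or platform update ...
set_bandwidth("port-123", 1_000)   # drop back to 1 Gbps steady state
```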

Related>

Updates from Ciena

Tomorrow’s cities: evolving from “smart” to Adaptive
Cities are going smart – trying to deal with the proliferation of people, sensors, automobiles and a range of devices that demand network access and generate mind-boggling amounts of data. However, being smart is not a single moment in time, and a “smart city” is not static. To be worthy of the name, a smart city must continually evolve and stay ahead of demand. This is only possible if the city’s underlying network is just as smart and can adapt to its constantly changing environment.
By Daniele Loffreda – Cities are constantly in flux. Populations move in; populations move out. Demographics change, economic growth falls and then soars. New leadership steps in and—if you believe all the commercials—technology will make everyone’s life better.

Municipal governments understand the need to consider which smart city applications will best serve the demands of their diverse demographic segments. The City of Austin’s Head of Digital Transformation, Marni White, summed up these challenges stating, “Our problems will continue to change over time, so our solutions also need to change over time.”

The one constant in the smart city is the network running underneath these solutions—and the truly smart city has a network that adapts.

Smart city applications must be aligned with where a city and its citizens want to go. Some municipalities that created model smart cities early on have had to initiate extensive revamping. For example, the City of Barcelona has long been at the cutting edge of using digital devices and the Internet of Things to improve municipal operations; however, in 2017, Mayor Ada Colau gave Barcelona’s CTO, Francesca Bria, a mandate to “rethink the smart city from the ground up.”

This meant shifting from a “technology-first” approach, centered on interconnected devices, to a “citizen-first” focus that responds to the changing needs that residents themselves help define. more>

Related>

Why the US bears the most responsibility for climate change, in one chart

By Umair Irfan – Humans are pumping more carbon dioxide into the atmosphere at an accelerating rate. But climate change is a cumulative problem, a function of the total amount of greenhouse gases that have accumulated in the sky. Some of the heat-trapping gases in the air right now date back to the Industrial Revolution. And since that time, some countries have pumped out vastly more carbon dioxide than others.
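The difference between annual and cumulative emissions is easy to see with a toy calculation; the series below are invented for illustration and are not Carbon Brief’s data.

```python
# Toy illustration: an early emitter that later cuts back can still hold
# the largest cumulative total. Figures are invented, not real data.
from itertools import accumulate

annual_a = [10, 10, 10, 10, 4, 4]   # early emitter that later cut back
annual_b = [1, 2, 4, 8, 10, 12]     # late, fast-growing emitter

print("A cumulative:", list(accumulate(annual_a))[-1])  # 48: larger overall
print("B cumulative:", list(accumulate(annual_b))[-1])  # 37: larger today, smaller overall
```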

The wonderful folks at Carbon Brief have put together a great visual of how different countries have contributed to climate change since 1750. The animation shows the cumulative carbon dioxide emissions of the top emitters and how they’ve changed over time.

What’s abundantly clear is that the United States of America is the all-time biggest, baddest greenhouse gas emitter on the planet.

That’s true, despite recent gains in energy efficiency and cuts in emissions. These relatively small steps now cannot offset more than a century of reckless emissions that have built up in the atmosphere. Much more drastic steps are now needed to slow climate change. And as the top cumulative emitter, the US bears a greater imperative for curbing its carbon dioxide output and a greater moral responsibility for the impacts of global warming.

Yet the United States is now the only country aiming to withdraw from the Paris climate agreement. more>

Notre Dame

By Natasha Frost, Ephrat Livni, Whet Moser, Jessanne Collins, Adam Pasick and Luiz Romero – Far more than a postcard-ready icon of the city of Paris, the 13th-century building is beloved by people of all faiths, is a trove of art and relics, and has been immortalized in numerous works of literature. It has also been through dramatic ups and downs over the years.

But the reverberations of Monday’s fire spread as quickly as the blaze itself, transcending the physical damage. The blaze revealed fault lines in European politics, flaws in social media’s algorithm-driven fact-checking efforts, the usefulness of drones in firefighting, and just how personally humanity can feel the pain of a cultural tragedy.

In so many people’s imaginations, Paris is not supposed to change. Monuments such as Notre Dame are not supposed to be affected by the passage of time; but neither were the National Museum of Brazil, the treasures of Palmyra, the Glasgow School of Art, nor any other cultural treasures we’ve had snatched from us recently. more>

The new spirit of postcapitalism

Capitalism emerged in the interstices of feudalism, and Paul Mason finds a prefiguring of postcapitalism in the lifeworld of the contemporary European city.
By Paul Mason – Raval, Barcelona, March 2019. The streets are full of young people (and not just students)—sitting, sipping drinks, gazing more at laptops than into each other’s eyes, talking quietly about politics, making art, looking cool.

A time traveler from their grandparents’ youth might ask: when is lunchtime over? But it’s never over because for many networked people it never really begins. In the developed world, large parts of urban reality look like Woodstock in permanent session—but what is really happening is the devalorization of capital.

But just 20 years after the roll-out of broadband and 3G telecoms, information resonates everywhere in social life: work and leisure have become blurred; the link between work and wages has been loosened; the connection between the production of goods and services and the accumulation of capital is less obvious.

The postcapitalist project is founded on the belief that inherent in these technological effects lies a challenge to the existing social relations of a market economy and, in the long term, the possibility of a new kind of system that can function without the market and beyond scarcity.

But during the past 20 years, as a survival mechanism, the market has reacted by creating semi-permanent distortions which—according to neoclassical economics—should be temporary.

In response to the price-collapsing effect of information goods, the most powerful monopolies ever seen have been constructed. Seven out of the top ten global corporations by market capitalization are tech monopolies; they avoid tax, stifle competition through the practice of buying rivals and build ‘walled gardens’ of interoperable technologies to maximize their own revenues at the expense of suppliers, customers and (through tax avoidance) the state. more>

All the ways recycling is broken—and how to fix them

You may throw a plastic container in the recycling bin and assume it’s going to easily become a new item. But every step of our recycling system—from product design to collection to sorting—has major flaws. Fortunately, promising technology is starting to come online that could revolutionize the process.
By Adele Peters – You may have read that there’s a recycling crisis in the U.S. After years of accepting our used plastic and cardboard, China now won’t take it, which often means there is no place for it to go. Some city recycling programs—unable to find other buyers—have quietly started sending recyclables to incinerators or landfills, news that could make anyone question the point of separating your trash at all.

Each year, by one estimate, Americans throw out around 22 million tons of products that could have been recycled. Tens of millions of homes don’t have access to recycling; for those that do, everything from broken blenders to old clothing still ends up in the trash. If you drop an empty package in a recycling bin and it’s trucked off to a sorting facility, that doesn’t necessarily guarantee it will be recycled. You might have unwittingly tossed something that your local recycling service doesn’t accept, or the package might have been designed in a way that makes it unrecyclable.

Some parts of the system do work. The aluminum in a beer can, for example, can easily be made into new beer cans, over and over again. But a plastic package might be chopped up, melted, mixed with other types of plastic, and “downcycled” into a lower-quality material that can only be used for certain products, like park benches or black plastic planters.

When the U.S. was sending much of its paper and plastic trash to China, for more than two decades, the bales were often so poorly sorted that they contained garbage. The system never extracted the full value from those materials.

When a truck picks up recyclables from curbside bins, it takes them to sorting facilities. Inside these centers, called “MRFs” or materials recovery facilities, people work with automated equipment to sort through the detritus of everyday life. Trucks dump mixed materials into the facility, where they are loaded onto a conveyor belt; typically, in a first step, people standing next to the machine quickly pull out trash and materials like plastic bags that can jam equipment.

As materials move through a facility, the system uses gravity, screens, filters, and other techniques to separate out paper, metal, glass, and plastics; optical sorting equipment identifies each type of plastic. more>
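A simplified software model of that routing logic is sketched below. The materials, resin codes and routing rules are illustrative assumptions; a real MRF does this with screens, magnets, eddy currents and optical sorters rather than code.

```python
# Simplified model of an MRF line: pull jam risks first, peel off one
# material stream per stage, and key plastics on resin code.
# Materials and routing rules here are illustrative assumptions.

ITEMS = [
    {"material": "film"},                      # plastic bag: a jam risk
    {"material": "paper"},
    {"material": "steel"},
    {"material": "glass"},
    {"material": "plastic", "resin_code": 1},  # PET, e.g. a drink bottle
    {"material": "plastic", "resin_code": 6},  # PS, often has no buyer
]

def route(item: dict) -> str:
    if item["material"] == "film":
        return "manual pre-sort (jams equipment)"
    if item["material"] in ("paper", "steel", "glass"):
        return item["material"] + " stream"
    if item["material"] == "plastic":
        # An optical sorter distinguishes resin types; only some are sellable.
        return "PET bale" if item.get("resin_code") == 1 else "residue/landfill"
    return "residue/landfill"

for item in ITEMS:
    print(item, "->", route(item))
```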

A New Americanism

Why a Nation Needs a National Story
By Jill Lepore – Carl Degler issued a warning: “If we historians fail to provide a nationally defined history, others less critical and less informed will take over the job for us.”

The nation-state was in decline, said the wise men of the time. The world had grown global. Why bother to study the nation?

Francis Fukuyama is a political scientist, not a historian. But his 1989 essay “The End of History?” illustrated Degler’s point. Fascism and communism were dead, Fukuyama announced at the end of the Cold War.

Fukuyama was hardly alone in pronouncing nationalism all but dead. A lot of other people had, too. That’s what worried Degler.

Nation-states, when they form, imagine a past. That, at least in part, accounts for why modern historical writing arose with the nation-state.

But in the 1970s, studying the nation fell out of favor in the American historical profession. Most historians started looking at either smaller or bigger things, investigating the experiences and cultures of social groups or taking the broad vantage promised by global history.

But meanwhile, who was doing the work of providing a legible past and a plausible future—a nation—to the people who lived in the United States? Charlatans, stooges, and tyrants.

The endurance of nationalism proves that there’s never any shortage of blackguards willing to prop up people’s sense of themselves and their destiny with a tissue of myths and prophecies, prejudices and hatreds, or to empty out old rubbish bags full of festering resentments and calls to violence.

When historians abandon the study of the nation, when scholars stop trying to write a common history for a people, nationalism doesn’t die. Instead, it eats liberalism.

Maybe it’s too late to restore a common history, too late for historians to make a difference. But is there any option other than to try to craft a new American history—one that could foster a new Americanism? more>