Category Archives: Net

How evil happens

BOOK REVIEW

Passions and Tempers: A History of the Humours, Author: Noga Arikha.
Eichmann in Jerusalem, Author: Hannah Arendt.
The Science of Evil: On Empathy and the Origins of Cruelty, Author: Simon Baron-Cohen.
Home Fire, Author: Kamila Shamsie.

By Noga Arikha – The ‘sapiens’ in Homo sapiens does not fully describe our species: we are as violent as we are smart.

This might be why we are the only surviving species of the genus Homo in the first place, and why we have been so destructively successful at dominating our planet. But still the question nags away: how are ordinary people capable of such obscene acts of violence?

Today, biology is a powerful explanatory force for much human behavior, though it alone cannot account for horror. Much as the neurosciences are an exciting new tool for human self-understanding, they will not explain away our brutishness. Causal accounts of the destruction that humans inflict on each other are best provided by political history – not science, nor metaphysics. The past century alone is heavy with atrocities of unfathomable scale, albeit fathomable political genesis.

The social neuroscientist Tania Singer at the Max Planck Institute in Leipzig in Germany defines empathy as the ability to ‘resonate’ with the feelings of the other. It develops from babyhood on – as imitation at first, then joint attention – into the ability to adopt the point of view of another, along with a shift in spatial perception from self to other, as if one were literally stepping into another’s shoes.

This requires an ability to distinguish between self and other in the first place, an aspect of the so-called ‘theory of mind’ that one acquires over the first five years of life.

But while empathy ensures the cohesion of a group or a society, it is also biased and parochial. Revenge thrives on it. more>

Updates from Ciena

The Adaptive Network: Why automation alone isn’t enough
By Keri Gilder – Imagine that, instead of 70, your heart rate was 100 beats per minute. This could be a warning sign that you are on the verge of a heart attack.

If your doctor were to get this information in real time, they could check the reading against your medical records, see that it is completely out of the norm, and warn you to seek medical assistance immediately.

However, if your personal trainer received that same information, would they reach the same conclusion as your doctor? Your trainer has access to a different database, which might show your resting heart rate as well as the rate during high-intensity training. Knowing that you are likely exercising, they would instead conclude that there is no need to go to the hospital after all.

This clearly demonstrates that just accepting raw data without filtering and proper analysis is no longer good enough and can potentially have serious repercussions. Instead, it is critical that we have diversity of thought when it comes to how we interpret data.
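As a hypothetical sketch of that idea, the same raw reading can be run through different contexts and yield different conclusions; the function name and thresholds below are invented for illustration only, not medical guidance:

```python
# Illustrative sketch: the same raw reading, interpreted against
# different contexts, yields different conclusions.
def interpret_heart_rate(bpm, resting_bpm=70, exercising=False):
    """Classify a heart-rate reading using contextual data.

    All thresholds are made up for illustration, not medical advice.
    """
    if exercising:
        # Elevated rates are expected during high-intensity training.
        return "normal" if bpm <= 2.5 * resting_bpm else "check with doctor"
    # At rest, a large deviation from baseline is a warning sign.
    return "normal" if bpm <= 1.2 * resting_bpm else "seek medical advice"

print(interpret_heart_rate(100, exercising=False))  # the doctor's view
print(interpret_heart_rate(100, exercising=True))   # the trainer's view
```

The reading alone decides nothing; the contextual database it is checked against does.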

This is not just true for our health or other day-to-day scenarios, but can also be applied to the communication networks that carry and house our information. more>

Where Did Qualcomm Go Wrong?

By Bolaji Ojo – It’s a justifiable question. The Qualcomm–NXP trip was an expensive sortie: Qualcomm has paid NXP a mandatory $2 billion break-up fee, but the bill for the hidden costs may be much higher. For nearly two years, the communications IC and IP supplier and its target endured prolonged uncertainties. Even now, the spasms from customer disruptions remain strong, while many employees, though heaving a sigh of relief, must figure out where they truly belong in the enterprise.

Qualcomm is moving on resolutely from the NXP debacle. It must. However, the implications and lessons — if any — are industry-wide. One of the largest acquisitions in the history of the semiconductor industry foundered because of opposition on various fronts, including from customers who might have benefited from it. Simply dumping the blame on nebulous factors and faceless regulators will result in the industry learning nothing from the experience. Perhaps the transaction was destined to fail. Perhaps it could have been managed better and brought to a successful close. A thorough assessment of why this deal collapsed would offer lessons that can be applied to future deals.

There are no signs that Qualcomm will conduct a detailed analysis of why and how the bid unraveled. It is easier — again — to simply toss more money at stakeholders and move on. NXP’s management and shareholders who had tendered their equity could slake their thirst with $2 billion in Qualcomm’s money. more>

The Progressives’ Plan to Win in 2018

By Elaine Godfrey – Democrats have been grappling with key questions about coalition building since the 2016 election: Should they prioritize winning back the voters they lost to Trump?

Should they attempt to woo the white voters gradually fleeing the party?

Progressives this weekend said, emphatically, no. Theirs is a genuine attempt to remake the Democratic Party at a time when racial and class tensions are the highest they have been since the 1960s—and it has also put them on a collision course with party leaders and other Democrats.

That doesn’t mean ignoring whites and Trump voters, says Anoa Changa. Instead, “it’s rejecting the notion that our way to victory is having a centrist, moderate right-leaning strategy that feels like we could peel off Romney Republicans, versus investing in communities of color, marginalized groups, and progressive white people,” she said. “There is this notion that … we can’t address the issues of race, systemic oppression, because we don’t want to piss these voters off. We have to find a way to do both.” more>

Supercomputers

By Laura Panjwani – High-performance computing (HPC) systems, also known as supercomputers, give scientists the power to solve extremely complex or data-intensive problems by concentrating the processing power of many parallel computers.

A supercomputer’s performance is measured in floating-point operations per second (FLOPS) rather than millions of instructions per second (MIPS), the measure used in conventional computing.
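As a deliberately naive illustration of what the FLOPS metric means, one can time a known number of floating-point operations and divide by the elapsed time; the function name and loop size below are arbitrary, and a pure-Python loop runs many orders of magnitude below any hardware peak:

```python
import time

# Illustrative sketch: estimate FLOPS by timing a known number of
# floating-point additions. This measures interpreter overhead as
# much as arithmetic, so it vastly understates hardware capability.
def estimate_flops(n=1_000_000):
    x = 0.0
    start = time.perf_counter()
    for _ in range(n):
        x += 1.0e-6  # one floating-point addition per iteration
    elapsed = time.perf_counter() - start
    return n / elapsed  # floating-point operations per second

flops = estimate_flops()
print(f"~{flops:.2e} FLOPS (pure-Python loop, far below hardware peak)")
```

Supercomputer rankings use carefully optimized benchmarks rather than a loop like this, but the unit being reported is the same.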

The technology has a plethora of applications—including quantum mechanics, climate research, oil and gas exploration, chemistry, aerospace and automotive technologies, and much more.

In addition to environmental applications, supercomputers are also key to many up-and-coming technologies, including autonomous vehicles. In our article, “Using Deep Learning, AI Supercomputing, NVIDIA Works to Make Fully Self-Driving Cars a Reality,” we highlighted Xavier, a complete system-on-chip (SoC) that integrates a new graphics processing unit (GPU) architecture called Volta, a custom eight-core CPU architecture, and a new computer vision accelerator. It features 9 billion transistors and delivers 30 trillion operations per second (TOPS) of performance while consuming only 30 watts of power.
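Those two figures imply a simple efficiency ratio; a back-of-envelope sketch using only the numbers quoted above:

```python
# Back-of-envelope efficiency for the Xavier figures quoted above:
# 30 trillion operations per second (TOPS) at 30 watts of power.
tops = 30
watts = 30
print(f"{tops / watts} TOPS per watt")  # → 1.0 TOPS per watt
```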

This technology is the most complex SoC ever created and is a key part of the NVIDIA DRIVE Pegasus AI computing platform, the world’s first AI car supercomputer designed for fully autonomous Level 5 robotaxis. more>

What’s More Dangerous, Immigration Or Russian Meddling?

By Robert Reich – What’s the most worrisome foreign intrusion into the United States—unauthorized immigrants, Chinese imports, or interference in our democracy?

For Trump, it’s immigrants and imports. He doesn’t care much about the third.

Yet Trump continues to assert that talk of Russian meddling in American elections is “a big hoax.” And his White House still has no plan for dealing with it.

In fact, Trump has it backwards.

Illegal immigration isn’t the problem he makes it out to be. Illegal border crossings have been declining for years.

And if the Chinese want to continue to send us cheap imports that we pay for with U.S. dollars and our own IOUs, that’s as much of a potential problem for them as it is for us.

But Russian attacks on our democracy are a clear and present threat aimed at the heart of America. more>

Updates from Datacenter.com

Cabinet airflow management done right
By Hans Vreeburg – Let’s start with some basic understanding of airflows within data centers. Nowadays almost all data centers use hot and cold aisles (corridors) to optimize their cooling. In the ideal situation, cold air goes straight to the servers’ inlets, and the hot air exiting the servers is returned directly to the cooling unit. This setup lets systems run at the highest possible efficiency, using the least amount of power. The cooling setup has a big influence on the PUE (Power Usage Effectiveness): a lower PUE means lower total energy consumption for the data center, which reduces both environmental impact and OPEX. Could a small gap in your server rack really have that much influence?

As said above, the ideal setup is cold air entering the servers while hot air exits. Gaps can lead to a higher demand for cold air than the servers actually require. Think of it as a large water pipe: normally it delivers a specific amount of water, but if you make holes in the pipe, you must pump in more water to get the same amount out at the end. more>
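A minimal sketch of the PUE arithmetic mentioned above: PUE is defined as total facility power divided by IT equipment power, so it approaches 1.0 as cooling and other overhead shrink. The power figures below are invented for illustration:

```python
# PUE (Power Usage Effectiveness) = total facility power / IT power.
# The closer to 1.0, the less energy goes to cooling and overhead.
def pue(total_facility_kw, it_equipment_kw):
    return total_facility_kw / it_equipment_kw

# A facility drawing 1500 kW overall to run 1000 kW of IT load:
print(pue(1500, 1000))  # → 1.5
# Sealing rack gaps means less cold air (and cooling power) is
# wasted, lowering total facility power for the same IT load:
print(pue(1300, 1000))  # → 1.3
```

In this invented example, better airflow management cuts total facility power from 1500 kW to 1300 kW for the same IT load, which is exactly the kind of PUE improvement the excerpt describes.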

How to govern AI to make it a force for good

In the interview, Gasser identifies three things policymakers and regulators should consider when developing strategies for dealing with emerging technologies like AI.

Urs Gasser – “Everyone is talking about Artificial Intelligence and its many different applications, whether it’s self-driving cars or personal assistants on the cell phone or AI in health,” he says. “It raises all sorts of governance questions, questions about how these technologies should be regulated to mitigate some of the risks but also, of course, to embrace the opportunities.”

One of the largest challenges to AI is its complexity, which results in a divide between the knowledge of technologists and that of the policymakers and regulators tasked to address it, Gasser says.

“There is actually a relatively small group of people who understand the technology, and there are potentially a very large population affected by the technology,” he says.

This information asymmetry requires a concerted effort to increase education and awareness, he says.

“How do we train the next generation of leaders who are fluent enough to speak both languages and understand engineering enough as well as the world policy and law enough and ethics, importantly, to make these decisions about governance of AI?”

Another challenge is to ensure that new technologies benefit all people in the same way, Gasser says.

Increasing inclusivity requires efforts on the infrastructural level to expand connectivity and also on the data level to provide a “data commons” that is representative of all people, he says. more>

In extremis

By Nabeelah Jaffer – To understand what has led someone to extremism, it is not enough to point to ideology. Ideas alone did not drive Mair to leave his home that morning with a sawn-off shotgun and a seven-inch knife. The accounts that emerged in the weeks after Cox’s murder dwelt on many details of Mair’s previously blameless life.

‘Loneliness is the common ground of terror’ – and not just the terror of totalitarian governments, of which Hannah Arendt was thinking when she wrote those words in The Origins of Totalitarianism. It also generates the sort of psychic terror that can creep up on a perfectly ordinary individual, cloaking everything in a mist of urgent fear and uncertainty.

Totalitarian ideas offer a ‘total explanation’ – a single idea is sufficient to explain everything. Independent thought is rendered irrelevant in the act of joining up to their black-and-white worldview.

Becoming an ‘idealist’ assuaged these fears (the word is perhaps better read as ‘ideologue’). After all, if you sign up to the idea that class struggle, racial competition or civilizational conflict is absolute, then you can achieve meaning and kinship as part of a race, class or civilization without ever requiring two-sided thought – the kind of thought that involves weighing competing imperatives and empathizing with a range of people. more>

What makes people distrust science? Surprisingly, not politics

By Bastiaan T Rutjens – Today, there is a crisis of trust in science. Many people – including politicians and, yes, even presidents – publicly express doubts about the validity of scientific findings. Meanwhile, scientific institutions and journals express their concerns about the public’s increasing distrust in science.

How is it possible that science, the products of which permeate our everyday lives, making them in many ways more comfortable, elicits such negative attitudes among a substantial part of the population?

Understanding why people distrust science will go a long way towards understanding what needs to be done for people to take science seriously.

Political ideology is seen by many researchers as the main culprit of science skepticism. The sociologist Gordon Gauchat has shown that political conservatives in the United States have become more distrusting of science, a trend that started in the 1970s.

From these studies there are a couple of lessons to be learned about the current crisis of faith that plagues science. Science skepticism is quite diverse. Further, distrust of science is not really that much about political ideology, with the exception of climate-change skepticism, which is consistently found to be politically driven.

Additionally, these results suggest that science skepticism cannot simply be remedied by increasing people’s knowledge about science. more>