Tag Archives: Physics

How to play mathematics


The Pearly Gates of Cyberspace, Author: Margaret Wertheim.
Physics on the Fringe, Author: Margaret Wertheim.
African Fractals: Modern Computing and Indigenous Design, Author: Ron Eglash.

By Margaret Wertheim – The world is full of mundane, meek, unconscious things materially embodying fiendishly complex pieces of mathematics. How can we make sense of this? I’d like to propose that sea slugs and electrons, and many other modest natural systems, are engaged in what we might call the performance of mathematics.

Rather than thinking about maths, they are doing it.

In the fibers of their beings and the ongoing continuity of their growth and existence they enact mathematical relationships and become mathematicians-by-practice. By looking at nature this way, we are led into a consideration of mathematics itself not through the lens of its representational power but instead as a kind of transaction.

Rather than being a remote abstraction, mathematics can be conceived of as something more like music or dancing; an activity that takes place not so much in the writing down as in the playing out.

Since at least the time of Pythagoras and Plato, there’s been a great deal of discussion in Western philosophy about how we can understand the fact that many physical systems have mathematical representations: the segmented arrangements in sunflowers, pine cones and pineapples (Fibonacci numbers); the curve of nautilus shells, elephant tusks and rams’ horns (logarithmic spiral); music (harmonic ratios and Fourier transforms); atoms, stars and galaxies, which all now have powerful mathematical descriptors; even the cosmos as a whole, now represented by the equations of general relativity.
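The Fibonacci case is easy to play with directly: the ratio of consecutive Fibonacci numbers converges to the golden ratio φ, the constant behind the spiral counts in sunflowers and pine cones. A minimal sketch (standard mathematics, not from the article):

```python
# Ratios of consecutive Fibonacci numbers converge to the golden
# ratio phi = (1 + sqrt(5)) / 2, the constant behind the spiral
# arrangements seen in sunflowers, pine cones and pineapples.
import math

def fib_ratios(n):
    """Return the first n ratios F(k+1)/F(k) of the Fibonacci sequence."""
    a, b = 1, 1
    ratios = []
    for _ in range(n):
        a, b = b, a + b
        ratios.append(b / a)
    return ratios

phi = (1 + math.sqrt(5)) / 2
print(fib_ratios(10)[-1], phi)  # the two values already agree to ~4 decimals
```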

The physicist Eugene Wigner has termed this startling fact ‘the unreasonable effectiveness of mathematics’.

Why does the real world actualize maths at all? And so much of it?

Even arcane parts of mathematics, such as abstract algebras and obscure bits of topology, often turn out to be manifest somewhere in nature. more> https://goo.gl/ifKV2Z

Updates from Georgia Tech

Pioneer of Modern Electronics
By Michael Baxter – The smartphone you peer into, the LED bulb in your desk lamp, the Blu-ray player that serves up your favorite film – all are here largely because of Russell Dupuis, a professor in electrical and computer engineering at Georgia Tech.

That’s because an essential component of their manufacturing traces back to a process that Dupuis developed in the late 1970s, a process that ushered in a new breed of mass-produced compound semiconductors. These electronic components – particularly those forged of elements from columns III and V in the periodic table – can operate at extremely high frequencies or emit light with extraordinary efficiency. Today, they’re the working essence of everything from handheld laser pointers to stadium Jumbotrons.

The process is known as metalorganic chemical vapor deposition, or MOCVD, and until Dupuis, no one had figured out how to use it to grow high-quality semiconductors using those III-V elements. Essentially, MOCVD works by combining the atomic elements with molecules of organic gas and flowing the mixture over a hot semiconductor wafer. When repeated, the process grows layer after layer of crystals that can have any number of electrical properties, depending on the elements used. more> https://goo.gl/eG2G8e


Developing the APTitude to Design New Materials, Atom-by-Atom

By Paul Blanchard – Up to now, our technological progress has largely been a matter of trial and error. We make something new, evaluate its performance, then alter some part of the fabrication process and see whether it performs better or worse, all without direct knowledge of what is changing at the atomic level.

But if we could see what’s going on at that scale—if we could map out each individual atom and understand the role that it plays—we could create new and better materials not through blind experimentation, but through design.

For all that we’ve been able to accomplish while ignoring them, the fact is that individual atoms matter. The speed of a transistor, the efficiency of a solar cell, and the strength of an I-beam are ultimately determined by the configuration of the atoms inside. Today, new and improved microscopy techniques are getting us closer and closer to the goal of being able to see each and every atom within the materials we make—a very exciting prospect.

Over the past three years, I’ve been lucky enough to be part of a team working with one such new and improved microscopy technique, a method called 3-D atom probe tomography, or APT for short. APT is very different from conventional microscopy—at least, the sort of microscopy that I’m accustomed to. In conventional microscopy, we shine a beam of light particles or electrons on our specimen, whatever it is we want to look at, and create a magnified image using lenses or by mapping how our beam bounces off it.

In atom probe tomography, on the other hand, we don’t just look at our specimen—we literally take it apart, atom-by-atom. more> https://goo.gl/c0VdE3
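As a rough illustration of the idea (not the actual reconstruction algorithm used in APT instruments; the function name, scaling and parameter values below are invented for the sketch), the atom-by-atom disassembly can be pictured as turning an ordered sequence of 2-D detector hits into a 3-D map:

```python
# Toy APT-style reconstruction: each detected ion arrives as a 2-D
# detector hit plus its position in the evaporation sequence. Scaling
# the hit back to specimen coordinates and assigning depth from the
# running count of removed atoms yields a 3-D point.
def reconstruct(hits, magnification=1e6, atoms_per_layer=100,
                layer_thickness=0.3):
    """Map 2-D detector hits, in order of arrival, to 3-D points (nm).

    All parameter values here are made up for illustration.
    """
    points = []
    for i, (dx, dy) in enumerate(hits):
        x = dx / magnification                         # project hit back to the tip
        y = dy / magnification
        z = (i // atoms_per_layer) * layer_thickness   # depth from sequence position
        points.append((x, y, z))
    return points

points = reconstruct([(1.5e6, -0.5e6), (2e6, 1e6)])
print(points[0])  # (1.5, -0.5, 0.0)
```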

Updates from Georgia Tech

The Health Informatics Revolution
By John Toon – Using massive data sets, machine learning, and high-performance computing, health analytics and informatics is drawing us closer to the holy grail of health care: precision medicine, which promises diagnosis and treatment tailored to individual patients. The information, including findings from the latest peer-reviewed studies, will arrive on the desktops and mobile devices of clinicians in health care facilities large and small through a new generation of decision-support systems.

“There are massive implications over the coming decade for how informatics will change the way care is delivered, and probably more so for how care is experienced by patients,” said Jon Duke, M.D., director of Georgia Tech’s Center for Health Analytics and Informatics.

“By providing data both behind the scenes and as part of efforts to change behavior, informatics is facilitating our ability to understand patients at smaller population levels. This will allow us to focus our diagnostic paths and treatments much better than we could before.”

Georgia Tech’s health informatics effort combines academic researchers in computing and the biosciences, practitioners familiar with the challenges of the medical community, extension personnel who understand the issues private companies face, and engineers and data scientists with expertise in building and operating secure networks tapping massive databases.

“It takes all of these components to really make a difference in an area as complex as health informatics,” said Margaret Wagner Dahl, Georgia Tech’s associate vice president for information technology and analytics.

“This integrated approach allows us to add value to collaborators as diverse as pharmaceutical companies, health care providers, large private employers, and federal agencies.” more> https://goo.gl/63pIZd


Updates from GE

No Laughing Matter: The World Is Running Out Of Helium, But It Won’t Hold These MRI Engineers Down
By Tomas Kellner and Dorothy Pomerantz – MRI machines explore the body by using powerful magnets and pulsing radio frequency signals. For the magnets to work, MRI manufacturers such as GE use liquid helium to cool them to minus 452 degrees Fahrenheit (minus 269 Celsius), just above absolute zero. At that temperature, they lose all electrical resistance and become superconducting.
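The quoted figures are easy to sanity-check with the standard Fahrenheit-to-Celsius conversion:

```python
# Sanity-check the quoted operating temperature: -452 degrees F should be
# about -269 degrees C, only a few kelvins above absolute zero (-273.15 C).
def f_to_c(f):
    """Convert degrees Fahrenheit to degrees Celsius."""
    return (f - 32) * 5 / 9

t_c = f_to_c(-452)
print(round(t_c))            # -269
print(t_c - (-273.15))       # kelvins above absolute zero
```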

“When you power up a super-cooled magnet, it can produce the same magnetic field for a thousand years with no more power required,” MR engineer and inventor Trifon Laskaris told GE Reports. The problem is that some machines need as much as 8,000 liters of helium, and the world is running out of it, to the chagrin of radiologists and party-store owners alike.

After the fall of the Soviet Union, the Helium Privatization Act of 1996 got the government out of the business of producing the gas. But sales from the huge U.S. helium reserve stored in porous rock deep underneath Amarillo, Texas, kept down prices and gave private producers few incentives to enter the market. The shortage followed. more> https://goo.gl/emDpN3

Established Technology Nodes: The Most Popular Kid at the Dance

By Michael White – The functionality we crave, such as smart power management for longer battery life, and Wi-Fi and Bluetooth for more connectivity, is more cost-effective when implemented at established nodes between 40 nm and 180 nm. Consequently, the high consumer demand for these capabilities is driving increased demand for ICs manufactured using these processes.

In a nutshell, the nodes that best support radio frequency (RF) and mixed-signal IC designs with low power, low cost and high reliability are seeing a much higher demand than in the past.

The other dynamic driving the longer-than-expected life of established nodes – 40/45 nm and 32/28 nm in particular – is the wafer cost trend at 20 nm and below. Nodes at 20 nm and below are well suited for advanced CPUs, application processors, and the like, but from a price/performance perspective they are generally a poor fit for sensors, connectivity, and analog mixed-signal (AMS) applications.

Although you wouldn’t necessarily know it from reading press releases each week, designs at 65 nm and larger still account for approximately 43% of all wafer production and 48% of wafer fab capacity. Even more significant, nodes 65 nm and larger account for approximately 85% of all design starts (Figure). Clearly, established nodes are not fading away any time soon. more> http://goo.gl/n9QKCj

Must science be testable?


Answers for Aristotle: How Science and Philosophy Can Lead Us to A More Meaningful Life, Author: Massimo Pigliucci.
Conjectures and Refutations, Author: Karl Popper.

By Massimo Pigliucci – The general theory of relativity is sound science; ‘theories’ of psychoanalysis, as well as Marxist accounts of the unfolding of historical events, are pseudoscience. This was the conclusion reached a number of decades ago by Karl Popper, one of the most influential philosophers of science. Popper was interested in what he called the ‘demarcation problem’, or how to make sense of the difference between science and non-science, and in particular science and pseudoscience.

You might have heard of string theory. It’s something that the fundamental physics community has been playing around with for a few decades now, in their pursuit of what Nobel physicist Steven Weinberg grandly called ‘a theory of everything.’

It isn’t really a theory of everything, and in fact, technically, string theory isn’t even a theory – at least not if by that name one means a mature conceptual construction such as the theory of evolution or the theory of continental drift.

In fact, string theory is better described as a general framework – the most mathematically sophisticated one available at the moment – to resolve a fundamental problem in modern physics: general relativity and quantum mechanics are highly successful scientific theories, and yet, when they are applied to certain problems, like the physics of black holes, or that of the singularity that gave rise to the universe, they give us sharply contrasting predictions. more> http://goo.gl/DAJ33Q

We Might Live in a Virtual Universe — But It Doesn’t Really Matter

By Maxim Roubintchik – The first thing to realize is this: Our perception of reality is already separate from reality itself.

To paraphrase Morpheus from the movie The Matrix, reality is simply an electrical impulse being interpreted by your brain. We experience the world indirectly and imperfectly. If we could see the world as it is, there would be no optical illusions, no color blindness and no mind tricks.

Further, we only experience a simplified version of all this mediated sensory information. The reason? Seeing the world as it is requires too much processing power — so our brain breaks it into heuristics (or simplified but still useful representations). Our mind is constantly looking for patterns in our world and will match them with our perception.

From this we can conclude the following:

Our perception of reality is already different from reality itself. What we call reality is our brains’ attempt to process the incoming flood of sensory data.

If our perception of reality is dependent on a simplified flow of information, it doesn’t matter what the source of this information is — whether it’s the physical world or a computer simulation feeding us the same information. more> http://goo.gl/fIwQEb

Nature’s Way to Relieve Stress Inspires Designs for Nearly Indestructible Bridges

By Elizabeth Montalbano – Emeritus Professor Wanda Lewis in the School of Engineering at the University of Warwick has been studying how nature relieves stress for 25 years, taking an approach called “form-finding,” a process of shaping an object, or a structure, by the loads applied to it.

This process differs from conventional engineering methods, which start from an assumed shape and then check the stresses and displacements in the structure under an applied load, she said.

Form-finding enables the design of rigid structures that follow a strong natural form – structures sustained by pure compression or tension, with no bending stresses, Lewis said. Bending stresses are the main points of weakness in structures; they are what cause bridges to buckle under load, suffer damage, or even collapse.

“In form-finding, we go in the opposite direction — the shape of the structure is not known initially, it is found by the application of load and involves repetitive calculations to find a shape that is in equilibrium with all forces,” she said. more> http://goo.gl/xHtqjI
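Lewis’s actual procedure isn’t spelled out in the excerpt, but the core idea can be sketched with the classic funicular (hanging-cable) construction: choose a horizontal thrust H, apply the loads, and compute the shape that is in equilibrium with them; inverted, the same shape carries those loads in pure compression, bending-free. A toy sketch (node count, loads and H are made-up values):

```python
# Toy form-finding: the equilibrium shape of a cable hanging between two
# supports under point loads. The shape is found from the loads, not
# assumed in advance; flipped upside down it is a pure-compression arch.
def funicular(n=11, span=10.0, w=1.0, H=5.0):
    """Equilibrium ordinates y_i of a hanging cable at n equally spaced nodes.

    Each interior node carries a vertical point load w; H is the chosen
    horizontal thrust. Supports sit at y = 0; negative y means sag.
    """
    V = w * (n - 2) / 2          # vertical reaction at each support
    dx = span / (n - 1)          # horizontal node spacing
    ys = [0.0]
    shear = V                    # vertical force carried by the next segment
    for i in range(1, n):
        ys.append(ys[-1] - shear / H * dx)  # segment slope = -shear / H
        shear -= w               # each node's load reduces the shear
    return ys

ys = funicular()
print(min(ys))  # mid-span sag
```

Because the shape was generated by the loads, every interior node is in equilibrium: H times the discrete curvature at each node exactly balances the applied load, which is the "no bending" property the article describes.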

Updates from Georgia Tech

Roadmap for Advanced Cell Manufacturing Shows Path to Cell-Based Therapeutics
By John Toon – An industry-driven consortium has developed a national roadmap designed to chart the path to large-scale manufacturing of cell-based therapeutics for use in a broad range of illnesses including cancer, neuro-degenerative diseases, blood and vision disorders and organ regeneration and repair.

Over the past decade, new and emerging cell-based medical technologies have been developed to manage and possibly cure many conditions and diseases. In 2012 alone, these technologies treated more than 160,000 patients. Before these treatments can be more widely available, however, the cell therapeutics community will have to develop the capability for advanced, large-scale manufacturing of high-quality and consistent living cells.

To advance that goal, the Georgia Research Alliance (GRA) and the Georgia Institute of Technology (Georgia Tech) have launched the National Cell Manufacturing Consortium (NCMC), an industry-academic-government partnership that recently released the National Roadmap for Advanced Cell Manufacturing. Establishment of the consortium and development of this 10-year national roadmap were sponsored by the National Institute of Standards and Technology (NIST).

The roadmap was announced June 13 at the White House Organ Summit. more> http://goo.gl/bjvzQr