Category Archives: Energy

The right to hydrogen

By Jorgo Chatzimarkakis – Even one year ago, hydrogen would have been regarded as an interesting niche technology – more science fiction than a future cornerstone of the European Green Deal. As with many other important historical disruptions, the pandemic has led to this technology being viewed from a different angle at a global level. This became visible this very week, when John Kerry visited Berlin to prepare a joint approach with the Europeans towards the next big climate conference: the former US Secretary of State consistently mentioned green hydrogen on a par with electrification. This mirrors the importance that Europeans have attributed to hydrogen for a year now, starting with the launch of the European hydrogen strategy in July last year.

Including renewably produced hydrogen in the energy, mobility and industry scenario leads to major shifts in policies, investments and overall expectations. And this is good, because we have to use whichever technology helps us quickly achieve palpable results in the “battle” for the zero emissions we all strive for. But this situation unequivocally leads to distribution struggles. The questions on the agenda are: Who should be favored by which legislation? Who should be invited to the conferences preparing big decisions? Who is entitled to receive public funding? Who will be favored by private investment?

Picking up a thought mentioned earlier: we do not need to pit technology A against technology B, but should maintain technological openness and focus on those solutions that allow us to achieve our goals in the best way. I believe that even the advocates of an all-electric scenario have rightly understood that hydrogen will most probably become the other leg of the energy, mobility and industry transition, next to electrification. more>

Updates from McKinsey

Streaming and royalties in mining: Let the music play on
Renewed growth sentiment among miners’ management teams, combined with the rise of streaming-and-royalty financing over the past ten years, suggests that this particular type of alternative financing could be set for significant expansion over the next decade.
By Scott Crooks, Siddharth Periwal, Oliver Ramsbottom, Elijah Saragosa, and Jessica Vardy – Following the commodity downturn in 2014, many miners were forced to focus on cost-out initiatives, deleveraging balance sheets and returning cash to shareholders who had become disillusioned with the industry’s track record. Growth projects were inevitably over budget (and often behind schedule), and M&A deals were often completed at lofty premiums—but, in hindsight, they often were executed at the top of the market, resulting in value destruction. In the post-boom environment, many mining companies found it challenging to raise capital from either the public-debt or public-equity markets. As a result, many industry commentators predicted the emergence of private debt and private equity. While the growth in private debt and equity has been below expectations, one form of alternative financing that has blossomed has been streaming-and-royalty financing. Expansion in this form of alternative financing, coupled with increasing focus on growth by management teams, leads us to believe that streaming-and-royalty financing is poised for strong growth over the next decade.

Metal streaming-and-royalty contracts are transactions under which mining companies sell future production or revenues in return for an up-front cash payment. There are some distinct differences between the two types. Streaming deals are normally focused on specific commodities produced by a particular project, such as precious-metal by-products from a base-metals project. In return for this up-front cash payment (the “deposit balance”), the streaming partner secures a share of future production at an agreed-upon discounted price, which may be fixed or alternatively a floating percentage of the prevailing spot price. Thus, miners receive payment on delivery for streamed physical volumes. In contrast, royalty deals are normally commodity agnostic and based on overall project revenues; the royalty company never actually “sees” the commodities that the mine produces, but rather just receives a share of the revenue generated (the royalty). In effect, streaming deals are settled by the physical transfer of metal while royalty deals are settled with cash.
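The difference in settlement mechanics can be made concrete with a toy model. This is a minimal sketch with entirely hypothetical figures (prices, volumes, and the 4 percent royalty rate are illustrative assumptions, not terms from any real deal):

```python
# Toy comparison of streaming vs. royalty cash flows (all figures hypothetical).

SPOT = 25.0          # prevailing spot price of the streamed metal, $/oz (assumed)
STREAM_PRICE = 5.0   # agreed-upon discounted price paid on delivery, $/oz (assumed)

def streaming_payout(ounces_delivered: float) -> float:
    """Streaming partner takes physical metal and pays the discounted price
    per ounce; its economic gain is the spread between spot and stream price."""
    return ounces_delivered * (SPOT - STREAM_PRICE)

def royalty_payout(project_revenue: float, royalty_rate: float = 0.04) -> float:
    """Royalty holder never 'sees' the metal; it simply receives a fixed
    share of overall project revenue, whatever commodities generated it."""
    return project_revenue * royalty_rate

# A mine delivers 100,000 oz of by-product silver and books $50m total revenue:
print(streaming_payout(100_000))   # value captured via physical metal transfer
print(royalty_payout(50_000_000))  # value captured as a cash share of revenue
```

Both contracts can deliver similar economics to the financier; the distinction the article draws is in what changes hands, metal versus cash, and in whether the exposure is to one commodity or to the whole project's revenue.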

Royalty ownership in the mining industry is generally agreed to have originated with Franco-Nevada in the mid-1980s. The mining company’s first royalty investment in 1986 involved spending half the corporate treasury to acquire 4 percent of the revenues from a mine in Nevada owned by Western State Minerals. Following this initial transaction, Franco-Nevada went on to purchase royalties in various other commodities, further developing the mining sector’s royalty business model. The arrival of the precious-metals streaming business model is often attributed to Wheaton River: while seeking to raise funds in 2004 to expand its core business of gold mining, the company conceived the idea of streaming silver by-product from the San Dimas gold mine in Mexico to a new subsidiary company, Silver Wheaton. In the world’s first streaming agreement, Silver Wheaton purchased yet-to-be-produced silver from Wheaton River’s operations in Mexico in return for an up-front payment and additional payments on delivery of the silver. New players have emerged in the past decade in the streaming-and-royalty sector, including Triple Flag in 2016, Nomad Royalty in 2019, and Deterra Royalties in 2020. However, the industry remains very consolidated, with the top three players—Wheaton Precious Metals, Franco-Nevada Corporation, and Royal Gold—representing approximately 80 percent of the total value of streaming-and-royalty contracts as defined by volume of gold equivalent ounces (GEOs). more>

Related>

Optimizing Thermoelectric Energy Generation

By Elizabeth Montalbano – Deriving energy from the heat electronic devices emit so they can provide their own sustainable sources of power is a Holy Grail for scientists developing power sources for sensors that will drive the future of healthcare devices as well as the Internet of Things.

Researchers in Japan have now come up with a new thermoelectric generator that converts temperature differences to electricity that can be used to power small, flexible devices.

Scientists at Osaka University developed the device in the form of a bismuth telluride semiconductor on a thin polymer film; it weighs less than a paperclip and is smaller than an adult fingernail.

Yet packed into the tiny device is a maximum output power density of 185 milliwatts per square centimeter, which “meets standard specifications for portable and wearable sensors,” said Tohru Sugahara, an associate professor at the university’s Institute of Scientific and Industrial Research, in a press statement. more>
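To put that power density in context, here is a back-of-the-envelope calculation; the 185 mW/cm² figure is from the article, but the 1 cm² device area is an assumption based on the "smaller than a fingernail" description:

```python
# Rough output estimate for the generator (area is an assumption, not a reported spec).
power_density_mw_per_cm2 = 185.0   # reported maximum output power density
device_area_cm2 = 1.0              # assumed: roughly fingernail-sized

max_power_mw = power_density_mw_per_cm2 * device_area_cm2
print(max_power_mw)  # maximum output in milliwatts at the stated density
```

Typical wearable sensors draw on the order of a few milliwatts, so an output in this range comfortably clears that bar, which is the point of the quoted claim.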

Updates from Adobe

December 2019 Giveaways
Adobe – Experiencing good design, illustration, photography, motion graphics, and video is like a gift—and it’s a gift that you, the creative community, give us at Adobe every day. So in return, we’re giving you gifts.

They range from Photoshop actions to lettering sets to texture packs and more. They’re all high quality, free, and copyright-free, and you can use them in any project, personal or commercial.

All we ask is that you don’t re-distribute them. more>

Related>

Universe in a bubble

Maybe we don’t have to speculate about what life is like inside a bubble. It might be the only cosmic reality we know.
By J Richard Gott – The explanation for the accelerating cosmic expansion, surprising as it was at first, was readily available from the theoretical toolbox of physicists. It traced back to an idea from Albert Einstein, called the cosmological constant. Einstein invented it in 1917, as part of a failed attempt to produce a static Universe based on his general theory of relativity. At that time, the data seemed to support such a model.

In 1922, the Russian mathematician Alexander Friedmann showed that relativity in its simplest form, without the cosmological constant, seemed to imply an expanding or contracting Universe. When Hubble’s observations showed conclusively that the Universe was expanding, Einstein abandoned the cosmological constant, but the possibility that it existed never went away.

Then the Belgian physicist Georges Lemaître showed that the cosmological constant could be interpreted in a physical way as the vacuum of empty space possessing a finite energy density accompanied by a negative pressure. That idea might sound rather bizarre at first. We are accustomed, after all, to thinking that the vacuum of empty space should have a zero energy density, since it has no matter in it. But suppose empty space had a finite but small energy density – there’s no inherent reason why such a thing could not be possible.

Negative pressure has a repulsive gravitational effect, but at the same time the energy itself has an attractive gravitational effect, since energy is equivalent to mass. (This is the relationship described by E=mc², another implication of special relativity.) Operating in three directions – left-right, front-back, and up-down – the negative pressure creates repulsive effects three times as potent as the attractive effects of the vacuum energy, making the overall effect repulsive. We call this vacuum energy dark energy, because it produces no light. Dark energy is the widely accepted explanation for why the expansion rate of the Universe is speeding up.
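The "three times as potent" bookkeeping can be written out explicitly. In general relativity the gravitational source term combines energy density with pressure counted once per spatial direction, and for vacuum energy the pressure equals minus the energy density (in energy units):

```latex
\[
\rho_{\text{eff}} \;=\; \rho + \frac{3p}{c^{2}},
\qquad
p_{\text{vac}} = -\rho_{\text{vac}}\,c^{2}
\;\Longrightarrow\;
\rho_{\text{eff}} \;=\; \rho_{\text{vac}} - 3\rho_{\text{vac}} \;=\; -2\rho_{\text{vac}} \;<\; 0 .
\]
```

A negative effective source density means gravity pushes outward rather than pulling inward: the single attractive contribution from the energy is outweighed by the three repulsive contributions from the negative pressure, exactly as the paragraph describes.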

Distant galaxies will flee from us because of the stretching of space between us and them. After a sufficient number of doublings, the space between them and us will be stretching so fast that their light will no longer be able to cross this ever-widening gap to reach us. Distant galaxies will fade from view and we will find ourselves seemingly alone in the visible Universe. more>

Updates from ITU

If we want to solve climate change, water governance is our blueprint
By Elizabeth Taylor – The phrase “fail to prepare, prepare to fail” comes to mind as we enter an era in which governments and communities must band together to mitigate climate change. Part of what makes our next steps so uncertain is knowing we must work together in ways that we have – so far – failed to do. We either stall, or offer up “too little, too late” strategies.

These strategies include cap-and-trade economic incentive programs, like those under the Kyoto Protocol and other international treaties. Insightful leaders have drawn attention to the issue, but lukewarm political will means they have only been able to defer greenhouse gas emissions-reduction targets into the future. A global crisis demands global commitment. How can we work together to face a universal threat? What of the complex challenges that demand unified monitoring and responses?

One principal impediment is the lack of coherent technical infrastructure.

Currently, our arsenal for facilitating collective action is understocked. Our policies are unable to bring about tide-turning change because they lack a cohesive infrastructure. In the absence of satisfactory tools to make them happen, our policies and pledges become feel-good initiatives rather than fully effective ones.

What tools might lead us to act collectively against climate change? It’s easy to focus on the enormous scale of global cooperation needed, or the up-front investments it will take to mitigate the crisis. But as the writer E.L. Doctorow reminded us, we can’t be intimidated by the process: “Writing a novel is like driving a car at night,” he said. “You can see only as far as your headlights, but you can make the whole trip that way.”

We don’t have to possess all the answers as we set out to save our communities. We don’t have to know exactly what we will meet along the way. At a minimum, we must only understand how to use our headlights to see the first few feet ahead of us.

So what is the first step on our path?

It is the substance that underpins our industry, health and survival. It remains a central source of conflict around the world, yet it also creates partnerships. Our first step is water.

Water challenges us with issues of scarcity, quality and distribution. It may seem to be a local issue, but combined with local tensions and a globalized economy, water governance is set to become one of our greatest tests of diplomatic finesse and technological synergy.

If we can properly align local and global water governance and management, we can prepare the tools, the organizational blueprint and the political momentum needed to solve climate change. more>

Related>

Why the US bears the most responsibility for climate change, in one chart

By Umair Irfan – Humans are pumping more carbon dioxide into the atmosphere at an accelerating rate. But climate change is a cumulative problem, a function of the total amount of greenhouse gases that have accumulated in the sky. Some of the heat-trapping gases in the air right now date back to the Industrial Revolution. And since that time, some countries have pumped out vastly more carbon dioxide than others.
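The distinction between current and cumulative emissions is easy to see in a toy calculation. The numbers below are made up purely to illustrate the arithmetic; they are not real national emissions data:

```python
# Toy illustration: rankings by current vs. cumulative emissions can differ.
# Annual emissions over four periods (entirely made-up numbers).
annual = {
    "A": [10, 10, 10, 10],  # long-industrialized emitter, steady output
    "B": [0, 0, 10, 20],    # late, fast-growing emitter
}

current = {country: series[-1] for country, series in annual.items()}
cumulative = {country: sum(series) for country, series in annual.items()}

print(current)     # B is the bigger emitter today
print(cumulative)  # but A has put more into the atmosphere overall
```

Because warming tracks the cumulative total rather than the latest annual figure, a country can fall behind in today's rankings while still bearing the larger historical responsibility.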

The wonderful folks at Carbon Brief have put together a great visual of how different countries have contributed to climate change since 1750. The animation shows the cumulative carbon dioxide emissions of the top emitters and how they’ve changed over time.

What’s abundantly clear is that the United States of America is the all-time biggest, baddest greenhouse gas emitter on the planet.

That’s true, despite recent gains in energy efficiency and cuts in emissions. These relatively small steps now cannot offset more than a century of reckless emissions that have built up in the atmosphere. Much more drastic steps are now needed to slow climate change. And as the top cumulative emitter, the US bears a greater imperative for curbing its carbon dioxide output and a greater moral responsibility for the impacts of global warming.

Yet the United States is now the only country aiming to withdraw from the Paris climate agreement. more>

Updates from Georgia Tech

Neuroscientists Team with Engineers to Explore how the Brain Controls Movement
By Carol Clark – Scientists have made remarkable advances into recording the electrical activity that the nervous system uses to control complex skills, leading to insights into how the nervous system directs an animal’s behavior.

“We can record the electrical activity of a single neuron, and large groups of neurons, as animals learn and perform skilled behaviors,” says Samuel Sober, an associate professor of biology at Emory University who studies the brain and nervous system. “What’s missing,” he adds, “is the technology to precisely record the electrical signals of the muscles that ultimately control that movement.”

The Sober lab is now developing that technology through a collaboration with the lab of Muhannad Bakir, a professor in Georgia Tech’s School of Electrical and Computer Engineering.

The technology will be used to help understand the neural control of many different skilled behaviors to potentially gain insights into neurological disorders that affect motor control.

“By combining expertise in the life sciences at Emory with the engineering expertise of Georgia Tech, we are able to enter new scientific territory,” Bakir says. “The ultimate goal is to make discoveries that improve the quality of life of people.” more>

Related>

Guidelines to Achieve Digital Transformation

GSR-18 BEST PRACTICE GUIDELINES ON NEW REGULATORY FRONTIERS TO ACHIEVE DIGITAL TRANSFORMATION
itu.int – Digitization is increasingly and fundamentally changing societies and economies and disrupting many sectors in what has been termed the 4th Industrial Revolution. Meanwhile, ICT regulation has evolved globally over the past ten years and has experienced steady transformation.

As regulators, we need to keep pace with advances in technology, address the new regulatory frontiers and create the foundation upon which digital transformation can achieve its full potential. Being prepared for digital transformation and emerging technologies such as Artificial Intelligence (AI), the Internet of Things (IoT), Machine to Machine communications (M2M) and 5G is fundamental.

Advances in technology are creating new social phenomena and business models that impact every aspect of our personal and professional lives – and which challenge regulatory paradigms. M2M, cloud computing, 5G, AI and IoT are all bringing further profound change. Recognizing the potential of emerging technologies and the impact that policy and regulatory frameworks can have on their success, regulators should encourage a regulatory paradigm pushing frontiers and enabling the digital transformation. more> draft doc (pdf)

Updates from Siemens

Closed Loop Quality Management for Electronics
Siemens – Optimize and simplify business processes by standardizing and unifying quality related processes and workflows throughout your entire organization.

Quality planning begins during the engineering and design process of your product, and continues with quality control during the manufacturing of the product.

By collecting quality data from design and production, you are able to initiate the problem-solving process and improve your product and your manufacturing processes continuously and sustainably.

The Plan-Do-Check-Act (PDCA) cycle describes the four phases of the continuous improvement process (CIP) and is the basis for the Siemens PLM quality philosophy. more>

Related>