5 Tips for Building Fog Networks

By Chuck Byers – Fog computing was conceived as a way to enable applications in high-throughput, high-compute ecosystems that require real-time processing. In the world of the Internet of Things, fog is already supporting deployments on a global scale.

The OpenFog Consortium defines fog computing as: “A horizontal, system-level architecture that distributes computing, storage, control and networking functions closer to the users along a Cloud-to-Thing continuum.”

In IoT, for example, applications are sometimes difficult to predict or pin down. It is often helpful to define the application space of a network or network element in terms of a three-layer taxonomy: the vertical markets served, the use cases in those verticals and the specific applications within those use cases.
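The three-layer taxonomy described above can be sketched as a simple nested data structure. A minimal sketch in Python follows; all of the vertical, use-case and application names below are hypothetical illustrations, not definitions from the OpenFog Consortium:

```python
# Illustrative vertical -> use case -> application taxonomy.
# Every name here is a made-up example, not an OpenFog definition.
taxonomy = {
    "smart_cities": {                      # vertical market
        "traffic_management": [            # use case
            "adaptive_signal_timing",      # specific applications
            "incident_detection",
        ],
        "public_safety": [
            "video_analytics",
        ],
    },
    "manufacturing": {
        "predictive_maintenance": [
            "vibration_monitoring",
        ],
    },
}

def applications(taxonomy):
    """Flatten the taxonomy into (vertical, use_case, application) triples."""
    return [
        (vertical, use_case, app)
        for vertical, use_cases in taxonomy.items()
        for use_case, apps in use_cases.items()
        for app in apps
    ]

print(len(applications(taxonomy)))  # 4 triples in this toy taxonomy
```

Enumerating the taxonomy this way makes it easy to check which applications a shared fog node would need to serve across verticals.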

Fog nodes are fundamental processing elements that enable high-compute operations in close proximity to end nodes. Fog nodes may serve multiple verticals, use cases or applications in an efficient, combined network. Once these high-level requirements are well understood, the design of the network and its elements can commence. more>

Updates from Adobe

BRIT(ISH): Visualizing the UK with Type
By Isabel Lea – I’ve always believed the role of a designer to be like that of a translator. For as long as I can remember, I’ve been obsessed with words and languages, and so I’ve naturally gravitated towards a kind of design that allows me to translate these things from the written to the visual, bringing them to life.

As the first Adobe Creative Resident based in the United Kingdom, I’ve been afforded the opportunity to spend a year visualizing culture and language in our everyday lives through design. BRIT(ISH) is my starting project for the residency. The project is an attempt to explore insights and ideas about being young and British during this turbulent time. I aimed to visualize often-intangible emotions in a playful way that other people can understand. Each object in the collection directly responds to quotations, insights, and stories I collected from the environment around me in the UK.

For me, projects that respond to stories and insights are often the most interesting because they add a level of unpredictability to the process and can result in something much more authentic. more>

Related>

The Tragedy of the Commons: How Elinor Ostrom Solved One of Life’s Greatest Dilemmas

By David Sloan Wilson – As an evolutionary biologist who received my PhD in 1975, I grew up with Garrett Hardin’s essay “The Tragedy of the Commons,” published in Science magazine in 1968. His parable of villagers adding too many cows to their common pasture captured the essence of the problem that my thesis research was designed to solve.

The farmer who added an extra cow gained an advantage over the other farmers in his village, but his decision also contributed to an overgrazed pasture. The biological world is full of similar examples in which individuals who behave for the good of their groups lose out in the struggle for existence with more self-serving individuals, resulting in overexploited resources and other tragedies of non-cooperation.

Is the so-called tragedy of the commons ever averted in the biological world and might this possibility provide solutions for our own species?

Evolutionary theory’s individualistic turn coincided with individualistic turns in other areas of thought. Economics in the postwar decades was dominated by rational choice theory, which used individual self-interest as a grand explanatory principle. The social sciences were dominated by a position known as methodological individualism, which treated all social phenomena as reducible to individual-level phenomena, as if groups were not legitimate units of analysis in their own right. And UK Prime Minister Margaret Thatcher became notorious for saying in 1987 that “there is no such thing as society; only individuals and families.” It was as if the entire culture had become individualistic and the formal scientific theories were obediently following suit. more>

Updates from Chicago Booth

Why Bitcoin and blockchain may stumble
By Alex Verkhivker – In mid-May, the Bitcoin Gold market suffered what’s known as a 51 percent attack. A market participant with sufficient computing power was able to take control of the underlying ledger and commit fraud, Quartz reported. Other cryptocurrencies have reportedly been similarly attacked.

Could this sort of thing sink cryptocurrency markets completely?

Even those who dismiss Bitcoin as a fad often praise blockchain, the open-source digital-ledger technology underlying it, as a breakthrough in electronic record keeping. The innovation of Bitcoin’s founder, Satoshi Nakamoto, was to create a process in which people have trust in a database that lacks a centralized authority such as a government, court, or bank; rather, records are verified by anonymous “miners,” who create a verified trail, or chain, of transactions.

When bitcoins are exchanged, information about the transactions is grouped together into a block. The miners race each other to solve a computationally intense puzzle, and the winning miner adds a block to the chain, while other miners verify that the new transactions are accurate. All miners keep a copy of the chain of transactions, making the blockchain a verifiable and trusted but ultimately decentralized database.
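The puzzle-racing step described above can be illustrated with a toy proof-of-work sketch. This is a deliberately simplified model, not Bitcoin’s actual algorithm or parameters: real mining hashes a structured block header double-SHA-256 against a much harder numeric target.

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> int:
    """Search for a nonce whose SHA-256 hash of (data + nonce) starts
    with `difficulty` hex zeros -- a toy version of the miners' puzzle."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

def verify(block_data: str, nonce: int, difficulty: int = 4) -> bool:
    """Any other miner can cheaply check a claimed solution."""
    digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
    return digest.startswith("0" * difficulty)

# Finding a nonce is expensive; verifying it is one hash.
nonce = mine("alice->bob:1BTC")
print(verify("alice->bob:1BTC", nonce))  # True
```

The asymmetry shown here (slow search, instant verification) is what lets every miner keep an agreed copy of the chain without a central authority, and it is also why controlling a majority of the hashing power enables the 51 percent attacks mentioned above.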

This process was a significant computer-science innovation, but how does it work, economically speaking? In thinking that through, Eric Budish crafts a worrying argument about the future of Bitcoin. more>

Related>

Updates from Georgia Tech

Looking Back in Time to Watch for a Different Kind of Black Hole
By John Toon – Black holes form when stars die, allowing the matter in them to collapse into an extremely dense object from which not even light can escape. Astronomers theorize that massive black holes could also form at the birth of a galaxy, but so far nobody has been able to look far enough back in time to observe the conditions creating these direct collapse black holes (DCBH).

The James Webb Space Telescope, scheduled for launch in 2021, might be able to look far enough back into the early Universe to see a galaxy hosting a nascent massive black hole. Now, a simulation done by researchers at the Georgia Institute of Technology has suggested what astronomers should look for if they search the skies for a DCBH in its early stages.

DCBH formation would be initiated by the collapse of a large cloud of gas during the early formation of a galaxy, said John H. Wise, a professor in Georgia Tech’s School of Physics and the Center for Relativistic Astrophysics. But before astronomers could hope to catch this formation, they would have to know what to look for in the spectra the telescope can detect, which are principally infrared.

Black holes take about a million years to form, a blip in galactic time. In the DCBH simulation, that first step involves gas collapsing into a supermassive star as much as 100,000 times more massive than our sun. The star then undergoes gravitational instability and collapses into itself to form a massive black hole. Radiation from the black hole then triggers the formation of stars over a period of about 500,000 years, the simulation suggested. more>

Related>

In the future, you’ll never have to leave your neighborhood

By Layla McCay – In the city’s center, people stroll in landscaped gardens, enjoying the positive impact of nature, exercise, and socialization on their mental health and well-being. But for those living on the outskirts, that epicenter can feel distant, separated by slashes of motorways.

Public transportation often points inward in a spoke-and-wheel configuration, emphasizing that there is just one truly desirable destination. People of the peripheries must commute back and forth, below ground and along highways, on trains and buses, losing time for friends and family, relaxation, leisure, culture, and sports. The fable of city life is out of reach, lost in the sprawl.

Instead of focusing on city centers, we should reconfigure the infrastructure of the outskirts. The result could see the end of such epicenters: a future where we identify as much with our hyper-local neighborhoods as we do with the greater metropolis.

We can see this in the growing trend of placemaking. This is a planning and design approach that works with communities to understand, imagine, and deliver solutions that meet their local needs, rather than relying on the whims of a grand city plan. more>

Related>

Updates from Autodesk

GTXRaster CAD Series: Incorporated into “Full” AutoCAD
GTX – With over 34 years in the Technical Imaging industry, GTX has provided engineering professionals with the best Windows®-based products on the market for working with scanned paper drawings, whether black-and-white or color images. GTX will significantly enhance your ability to handle scanned drawings, maps and other raster images and bring them into your modern CAD, EDM or GIS environments.

These new product releases, the GTXRaster CAD 2019 Series* (known as the “AutoCAD for raster”) and the Windows standalone GTXImage CAD™ Version 21 Series, are a unique, cost-effective solution for bringing legacy drawings into your digital environment. You can modify and enhance scanned raster archives with the speed and flexibility of both raster and vector editing techniques. Whether your requirement is raster cleanup, 2D CAD drafting, hybrid raster/vector editing or fully automatic raster-to-vector conversion with intelligent text recognition, these product series provide total flexibility. more>

Updates from datacenter.com

Distribution basics
Hans Vreeburg – One of the most important factors in choosing a data center is the right distribution to your cabinet: you need the best and safest way to get your basic requirements delivered. Each room should be its own compartment, so that an event in one room has no effect on the other rooms in the data center.

Let’s start with the most familiar one: power. Whether you use a single feed or multiple feeds, you need a way to get the power to your equipment. I think all data centers should use busbars: the only system that offers flexibility and is future-proof. Ideally, the power distribution should be installed above the cabinets. This creates even more flexibility and prevents human error: we have all witnessed the wrong cabinet being disconnected in the dimly lit, cramped environment underneath a raised floor. Speaking of the raised floor: it is an outdated way to get the required power to a cabinet.

A raised floor can cause a lot of confusion because it’s dark and offers limited working space, raising the risk of human error. But why do these errors occur? Most of the time, something had to be done quickly and the wrong black power cable was traced. An elevated position gives you more control over who can reach the power distribution. You can even use advanced camera software to prevent and/or control access to it. more>

Related>

Updates from Siemens

Using NX and Learning Advantage to enable students to develop the professional skills required by industry
Siemens – With 1,700 employees and 15,000 students, Luleå University of Technology in northern Sweden is a thriving center of teaching and research that collaborates with businesses, educational institutions and public bodies across the world.

The Department of Engineering Sciences and Mathematics is home to a range of engineering courses that encompass materials, mechanics, power and sustainable energy. For engineering students within this department, the study of computer-aided design (CAD) is a basic requirement. However, students from other departments can select CAD as an optional subject. These include electrical engineers and space engineers, plus those studying subjects such as business administration and computer science. According to Peter Jeppsson, senior lecturer at Luleå University, CAD is a very popular choice.

The department has well-equipped workshops with a range of tooling machinery. Jeppsson describes the ethos of the department: “At the university we teach CAD software and engineering theory at the same time, not as separate subjects. We give students the opportunity to solve real-world problems and make better products by considering overall function, performance, production and lifecycle. We use computer-aided design and simulation for every aspect of a product.” more>

Updates from GE

Can You Hear Me Now? New GE Voices Site Gives Employees, Partners A Place To Learn And Speak Up
By Maureen O’Hagan – When William “Mo” Cowan was named GE’s president of global government affairs and policy in August, he came with a unique perspective, forged through experience that few can claim: He had served for a time as a U.S. senator, filling John Kerry’s empty seat when Kerry became secretary of state.

Cowan now has a powerful engagement tool at his disposal. The company just relaunched GE Voices, an online hub where employees, suppliers and others connected to GE can learn more about — and speak up about — some of the key policy issues affecting the company today. Subscribers — there are more than 75,000 of them — can access explainers to see how hot policy issues like tax reform and tariffs affect them personally.

Front and center on the site is an interactive map showing the company’s broad presence in the United States. The first thing you’ll notice is that the GE family is everywhere you look, with dots representing GE’s manufacturing and research facilities, suppliers, educational partners and venture companies stretching from Maine to Florida, New York to California, Alaska to Hawaii. There are dots in all 50 states. more>