Can utilities have their multi-layered cake and eat it too?
Utilities are facing increasing bandwidth demands on their communications networks. Ciena’s Mitch Simcoe explains how modernizing networks to a private packet-optical fiber architecture can help utilities scale to support new smart grid applications.
By Mitch Simcoe – Utilities are increasingly in the eye of the storm these days. Whether it’s dealing with hurricanes along the Gulf Coast over the last few months or wildfires on the West Coast, utilities have had to put more sensors out in the field to keep abreast of changing weather conditions and potential risks to their power grids. The increasing pressure on utilities to show that they are carbon-free is also changing the way they generate and distribute energy. The common denominator is more data to collect and backhaul from the power grid, which is driving increasing demand on utilities’ communications networks.
Many utilities may not realize it, but recent advancements have resulted in several bandwidth-intensive applications and processes driving up demand on their networks:
- Video Surveillance
Security continues to be top of mind for utilities. In the past, security surveillance was largely “after the fact”: video footage was stored locally at the substation and accessed only after a security breach. Today’s approach is to backhaul all security video footage to a centralized data center and apply artificial intelligence (AI) techniques to determine proactively whether a breach is in progress. In those cases, security personnel can be dispatched on site in near-real time. Each video camera at a substation can generate 9 gigabytes of data per day, and a typical substation could have a dozen video cameras to surveil.
- Synchrophasors
Prior to the big power outage of 2003 in the Northeast United States (when roughly 50 million people lost power for up to two days), sensors on the power grid using SCADA (Supervisory Control and Data Acquisition) sampled the state of the grid about once every four seconds. This significant outage could have been avoided had the grid been sampled more frequently. To address this, a device called a synchrophasor (not the Star Trek type!) was introduced, which samples the state of the grid 30 to 60 times per second. This has made the grid more reliable but produces significantly more data to backhaul and process. Each synchrophasor PMU (Phasor Measurement Unit) can generate 15 gigabytes of data per day, all of which must be backhauled to a central data center for analysis.
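As a rough back-of-envelope check using the sampling rates quoted above (one SCADA sample every four seconds versus 30 to 60 PMU samples per second), the jump in per-device sample volume looks like this:

```python
# Back-of-envelope comparison of grid sampling volumes per device.
# Rates from the article: SCADA ~1 sample per 4 s; a PMU samples 30-60 times/s.
SECONDS_PER_DAY = 24 * 60 * 60

scada_samples = SECONDS_PER_DAY / 4        # ~21,600 samples/day
pmu_samples_low = SECONDS_PER_DAY * 30     # 2,592,000 samples/day
pmu_samples_high = SECONDS_PER_DAY * 60    # 5,184,000 samples/day

print(f"SCADA: {scada_samples:,.0f} samples/day")
print(f"PMU:   {pmu_samples_low:,} to {pmu_samples_high:,} samples/day")
print(f"Increase: {pmu_samples_low / scada_samples:.0f}x "
      f"to {pmu_samples_high / scada_samples:.0f}x")
```

That is a 120x to 240x increase in sample count per device, which is why the per-PMU backhaul figure is so much larger than anything the old SCADA polling produced.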
- Smart Meters
In the US, over 50% of households are now served by a smart meter that measures the household’s power consumption every 15 minutes. Beyond their billing function, smart meters help utilities track power consumption hotspots during peak usage. For a utility of 1 million households, the middle of the range for most US Investor-Owned Utilities (IOUs), this can generate 1 terabyte of data per day that needs to be backhauled to a central data center for processing.
- Internet of Things (IoT) devices
These include the devices mentioned earlier: weather sensors and sensors on power equipment that proactively identify issues. Smart thermostats in homes are another growing trend, which utilities use to offer smart “on-demand” billing plans: you allow the utility to raise your thermostat setting during periods of peak usage in the hot summer months in exchange for a lower price per kWh.
For the first three categories above, a utility of 1 million households faces a daily data backhaul requirement of 6 to 8 terabytes. With this amount of data to backhaul and process, it is no wonder utilities are exhausting the available capacity of their legacy communications networks.
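The per-device rates above can be combined into a rough estimate. The per-camera, per-PMU, and smart-meter figures come from this article; the substation and PMU counts below are illustrative assumptions for a mid-sized utility, not figures from the article:

```python
# Rough daily-backhaul estimate for a utility of ~1 million households.
# Per-device rates are from the article; device counts are assumed.
GB = 1
TB = 1000 * GB

substations = 50                      # assumed
cameras_per_substation = 12           # from the article
video_per_camera = 9 * GB             # GB/day per camera, from the article
pmus = 100                            # assumed
pmu_rate = 15 * GB                    # GB/day per PMU, from the article
smart_meter_total = 1 * TB            # GB/day for 1M households, from the article

video_total = substations * cameras_per_substation * video_per_camera
pmu_total = pmus * pmu_rate
total = video_total + pmu_total + smart_meter_total
print(f"Video: {video_total / TB:.1f} TB/day, PMUs: {pmu_total / TB:.1f} TB/day, "
      f"Meters: {smart_meter_total / TB:.1f} TB/day, Total: {total / TB:.1f} TB/day")
```

Under these assumed device counts the total comes out near 8 TB/day, consistent with the 6-to-8-terabyte range cited above; the dominant term is video surveillance backhaul.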
The Information Technology (IT) group in a utility is tasked with managing many of these new applications associated with a smarter grid. Some utilities have been leasing copper-based TDM services for many years from service providers for smart grid, IT and substation traffic. The cost of this approach has been onerous and only gets more expensive as service providers migrate their networks away from copper toward fiber and wireless options.
Posted in Broadband, Business, Communication industry, Economy, Education, History, How to, Net, Regulations, Science, Technology
Tagged Broadband, Business improvement, Ciena, Fiber optics, Internet, Skills, Technology
Designing large scale automation and robotic systems using Solid Edge
By David Chadwick – Precision Robotics and Automation Ltd (PARI) is a leading global developer of automation and robotic systems. Their customers in the automotive sector include established giants like Ford, Chrysler, PSA, Daimler-Benz, Tata Motors, Mahindra, and significant new players like VinFast. PARI designs, manufactures and installs complete, automated systems including multi-station lines for machining and assembly of powertrain components and assemblies.
PARI has been a major user of Solid Edge for 15 years with 160 licenses deployed at their headquarters near Pune in India. Typical automation solutions deployed by PARI incorporate a wide variety of robots, actuators and sensors and other mechatronic items. These systems can comprise over 25,000 unique components.
Mangesh Kale, Managing Director of PARI, describes their design process: “If a six-axis robot is required for a specific application then we use robots from major suppliers like FANUC, ABB and Kuka, or other makes specified by the customer. We typically receive 3D models from these manufacturers and we integrate these into our automation system designs. However, many applications demand gantry type robots that we design and manufacture ourselves. In a typical solution, about 60% of the design uses PARI’s standardized commodities, while custom parts typically make up the remaining 40%. For example, the gripper sub-assembly for any material handling solution is typically a custom design. This design meets specific application needs to handle components at different stages in the machining or assembly process. The customization required for assembly processes is even higher. We find that Solid Edge is a very powerful and flexible solution for designing these sub-systems.”
Posted in Business, Economic development, Economy, Education, History, How to, Net, Product, Science, Technology
Tagged Automation, Business improvement, Manufacturing, PLM, Robotics, Siemens, Skills, Technology
By Michael Strevens – Modern science has a whole lot going for it that Ancient Greek or Chinese science did not: advanced technologies for observation and measurement, fast and efficient communication, and well-funded and dedicated institutions for research. It also has, many thinkers have supposed, a superior (if not always flawlessly implemented) ideology, manifested in a concern for objectivity, openness to criticism, and a preference for regimented techniques for discovery, such as randomized, controlled experimentation. I want to add one more item to that list, the innovation that made modern science truly scientific: a certain, highly strategic irrationality.
‘Experiment is the sole judge of scientific “truth”,’ declared the physicist Richard Feynman in 1963. ‘All I’m concerned with is that the theory should predict the results of measurements,’ said Stephen Hawking in 1994. And dipping back a little further in time, we find the 19th-century polymath John Herschel expressing the same thought: ‘To experience we refer, as the only ground of all physical enquiry.’ These are not just personal opinions or propaganda; the principle that only empirical evidence carries weight in scientific argument is widely enforced across the scientific disciplines by scholarly journals, the principal organs of scientific communication. Indeed, it is widely agreed, both in thought and in practice, that science’s exclusive focus on empirical evidence is its greatest strength.
Yet there is more than a whiff of dogmatism about this exclusivity. Feynman, Hawking and Herschel all insist on it: ‘the sole judge’; ‘all I’m concerned with’; ‘the only ground’. Are they, perhaps, protesting too much? What about other considerations widely considered relevant to assessing scientific hypotheses: theoretical elegance, unity, or even philosophical coherence? Except insofar as such qualities make themselves useful in the prediction and explanation of observable phenomena, they are ruled out of scientific debate, declared unpublishable. It is that unpublishability, that censorship, that makes scientific argument unreasonably narrow. It is what constitutes the irrationality of modern science – and yet also what accounts for its unprecedented success.
Posted in Book review, Business, Economic development, Economy, Education, History, How to, Nature, Science, Technology
Tagged Business improvement, Internet, Physics, Science, Skills, Technology