Existing power grids were designed to transmit electricity over relatively short distances. Increasingly, however, grids are required to supply major cities from remote offshore wind farms while also integrating local generation. With generators feeding variable amounts of energy from renewable sources into the grid at all voltage levels, it is more difficult to balance supply and demand, and the risks of overloads and fluctuations increase.
It is estimated that by 2020 there will be over 50 billion smart devices connected to the internet, creating vast quantities of data that can be harnessed to develop smart systems for managing electricity networks at both local and national level, reducing the cost of balancing the electricity system.
The management and operation of the future power system and its components – particularly active power distribution grids and microgrids – will require new control functionality, including the following key functions and services:
- Advanced monitoring and diagnostics: Monitoring and state estimation capabilities and real-time condition monitoring of components in the medium and low-voltage distribution grids, including self-diagnostic capabilities.
- Optimisation/self-optimisation capabilities: Fluctuating electricity generation from renewable sources requires the ability to (self-) optimise operations in medium- and low-voltage grids, including effective integration of flexible loads and storage systems.
- Automatic grid (topology) reconfiguration: Support of automatic or semiautomatic adjustment of the distribution grid topology in response to optimisation processes, fault management or power system restoration.
- Adaptive protection: Automatic or semiautomatic adaptation of protection devices (eg protection relays and breakers) with respect to the actual power grid conditions (eg adaptation of the protection system settings due to the bidirectional power flow caused by DERs).
- Distributed power system management: Distributed control with automatic decision-making processes and proactive fault prevention must be provided for power system and infrastructure operators in medium- and low-voltage grids.
- Islanding possibilities/microgrids: Local operation of islands/microgrids can improve the availability of the electricity supply in the event of failures at higher voltage levels.
- Distributed generation/distributed energy resources with ancillary services: Use of ancillary services provided by DER (eg local voltage or frequency control and virtual inertia) improves power grid optimisation.
- Demand response/energy management support: Electric loads, energy storage systems and demand response provide additional flexibility in power system operation.
- Advanced forecasting support: Forecasting of (distributed) generation and load profiles for optimised grid operation.
- Self-healing: Automatic or semiautomatic restoration of grid operation in case of component/grid faults helps power system and infrastructure operators.
- Asset management/condition-dependent power system maintenance: Preventive maintenance according to component/device conditions and remaining lifetime.
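To make the "advanced forecasting support" function above concrete, the sketch below shows the simplest possible baseline: a seasonal-naive forecast that repeats yesterday's hourly load profile, the kind of reference model any smarter forecasting service would need to beat. The function name and the load figures are hypothetical, not drawn from any real system.

```python
# Illustrative sketch only: a naive "same hour yesterday" baseline forecast
# for distribution-grid load. All names and numbers are hypothetical.

def seasonal_naive_forecast(hourly_load, horizon=24, season=24):
    """Forecast the next `horizon` hours by repeating the last full season."""
    if len(hourly_load) < season:
        raise ValueError("need at least one full season of history")
    last_season = hourly_load[-season:]
    return [last_season[h % season] for h in range(horizon)]

# 24 hours of hypothetical load history, in MW
history = [310, 295, 280, 275, 290, 340, 420, 510,
           560, 575, 580, 570, 555, 550, 545, 555,
           590, 640, 670, 650, 600, 520, 430, 360]
forecast = seasonal_naive_forecast(history)
print(forecast[:4])   # → [310, 295, 280, 275]
```

Grid operators typically judge forecasting tools by their improvement over exactly this kind of naive baseline.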
Relying on traditional linear mathematical models to manage these processes is not feasible: both the manpower required to encode the models and the computing power required to solve them would be extremely large. A more real-time approach is required.
Using AI, an efficient and adaptive framework can be developed that looks across multiple assets and infrastructure and, given all the operational parameters, intelligently optimises their behaviour.
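As a minimal sketch of what "optimising behaviour across multiple assets" can mean in practice, the hypothetical example below allocates a required power adjustment to the cheapest flexible assets first (a simple merit-order dispatch). A real framework would also account for forecasts, network constraints and asset dynamics; all names and figures here are invented.

```python
# Hypothetical sketch: minimal "merit order" dispatch across flexible
# assets -- the cheapest flexibility is used first until the required
# power adjustment is met. Names and figures are illustrative only.

def dispatch(assets, required_mw):
    """Greedily allocate a required power adjustment to the cheapest assets."""
    plan, remaining = {}, required_mw
    for name, available_mw, cost in sorted(assets, key=lambda a: a[2]):
        take = min(available_mw, remaining)
        if take > 0:
            plan[name] = take
            remaining -= take
    return plan, remaining   # remaining > 0 means the requirement is unmet

assets = [("battery", 5.0, 40.0),          # (name, available MW, cost/MWh)
          ("demand_response", 8.0, 25.0),
          ("peaker", 20.0, 90.0)]
plan, unmet = dispatch(assets, 10.0)
print(plan)   # → {'demand_response': 8.0, 'battery': 2.0}
```

Even this toy version shows the shape of the problem: the value of each asset depends on what every other asset in the portfolio is doing.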
AI for transmission system management
Earlier this year, National Grid was reported to be in early discussions with DeepMind, Google’s machine-learning company, to determine the role AI could have in managing the grid, particularly in relation to better integrating renewables by using machine learning to predict peaks in demand and supply.
“It would be amazing if you could save 10% of the country’s energy usage without any new infrastructure, just from optimisation,”
– Demis Hassabis, chief executive of DeepMind.
DeepMind’s algorithms have enabled Google to cut the amount of energy used by the cooling systems in its data centres by 40%, cutting their overall electricity consumption by 15%.
Meanwhile in the US, the Department of Energy’s SLAC National Accelerator Laboratory is sponsoring a project known as GRIP, for Grid Resilience and Intelligence Project, to combine artificial intelligence with large amounts of data to identify places where the electricity grid is vulnerable to disruption so these areas can be reinforced in advance and recover faster when failures do occur.
The eventual goal is an autonomous grid that seamlessly absorbs routine power fluctuations from clean energy sources like solar and wind and quickly responds to disruptive events – from major storms to eclipse-induced dips in solar power – with minimal intervention from humans.
The project will use both machine learning, where computers ingest large amounts of data and teach themselves how a system behaves, and artificial intelligence, which uses the knowledge the machines have acquired to solve problems. The data analytics platform will initially be tested with a major California utility:
“The idea is to populate the platform with information about what your particular part of the grid looks like, in terms of things like solar and wind power sources, batteries where energy is stored, and how it’s laid out to distribute power to homes and businesses. Then you begin to look for anomalies – things that could be configured better,”
– said Sila Kiliccote, director of SLAC’s Grid Integration, Systems and Mobility lab, GISMo, and principal investigator for the project.
For example, a grid can be divided into “islands,” or microgrids, that can be isolated to prevent a power disruption from spreading and taking the whole system down.
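The islanding idea above can be sketched as a graph problem: once faulted lines are removed, the connected components of what remains are the candidate islands that can keep operating independently. The topology and substation names below are hypothetical.

```python
# Sketch (hypothetical topology): after removing faulted lines, the
# remaining connected components of the network graph are the candidate
# "islands" that can keep running independently of the fault.

from collections import defaultdict

def islands(lines, faulted):
    """Return connected components of the grid after removing faulted lines."""
    graph = defaultdict(set)
    for a, b in lines:
        if (a, b) not in faulted and (b, a) not in faulted:
            graph[a].add(b)
            graph[b].add(a)
    seen, components = set(), []
    for node in graph:
        if node in seen:
            continue
        stack, comp = [node], set()
        while stack:                      # depth-first traversal
            n = stack.pop()
            if n not in comp:
                comp.add(n)
                stack.extend(graph[n] - comp)
        seen |= comp
        components.append(comp)
    return components

lines = [("sub1", "sub2"), ("sub2", "sub3"), ("sub3", "sub4")]
print(islands(lines, faulted={("sub2", "sub3")}))
# → two islands: {sub1, sub2} and {sub3, sub4}
```

The hard part in practice is not finding the components but deciding which islands can actually balance their own generation and load.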
Siemens is coordinating a major research project in Germany, designed to determine the extent to which existing control centre technology can be exploited to better understand grid fluctuations, and to identify when and where entirely new structures and architectures will be needed.
At the moment, control centre operators have low visibility of grid fluctuations – they can only see the amount of electricity being transported at each location, and whether a line is overloaded. The “DynaGridCenter” project will transmit measurement data to the control centre to be analysed in real time. This data is captured using existing measurement technology which, although installed at various points on the grid, has not historically been fed to the control centre.
Phasor Measurement Units, for example, measure the extent and phase angle of current and voltage every 10-20 milliseconds, and can be configured to enable values from different transformer substations to be directly compared, providing real time data on grid fluctuations.
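A simple illustration of what comparing PMU values across substations can yield: a growing phase-angle difference between two synchronised measurement streams indicates stress on the corridor between them. The threshold, sample values and function names below are illustrative assumptions, not DynaGridCenter specifics.

```python
# Hypothetical sketch: comparing synchronised phasor angles from two
# substations. A widening angle difference between PMU streams is an
# early indicator of stress on the corridor between them.
# The 30-degree limit and all sample values are invented.

def angle_difference(theta_a_deg, theta_b_deg):
    """Smallest signed difference between two phase angles, in degrees."""
    return (theta_a_deg - theta_b_deg + 180.0) % 360.0 - 180.0

def flag_stress(stream_a, stream_b, limit_deg=30.0):
    """Return the sample indices where the angle difference exceeds the limit."""
    return [i for i, (a, b) in enumerate(zip(stream_a, stream_b))
            if abs(angle_difference(a, b)) > limit_deg]

# Hypothetical angle samples (degrees), one every 20 ms, per substation
sub_a = [10.0, 12.0, 15.0, 48.0, 50.0]
sub_b = [ 8.0,  9.0, 10.0, 11.0, 12.0]
print(flag_stress(sub_a, sub_b))   # → [3, 4]
```

The wrap-around handling in `angle_difference` matters: a jump from 350° to 10° is a 20° change, not 340°.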
AI models supporting DSR
AI models are being developed by energy aggregators to make real-time decisions about how consumer sites and assets should be managed to deliver the most value: when, where and how much flexibility exists; when to charge and discharge a battery; and when to use, store or export energy generated by solar panels and other behind-the-meter generators.
Aggregator Open Energi is developing such an AI model. It collects between 10,000 and 25,000 messages per second relating to 30 different data points and performs tens of millions of switches per year. This data is being used to train a deep learning model which combines asset-level constraints from a bottom-up approach with portfolio-level modelling that can tweak the outputs to improve the aggregated solution.
The model can look at a sequence of actions leading to the rescheduling of power consumption and make grid-scale predictions identifying the costs of various actions.
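The bottom-up side of this approach can be sketched very simply: each asset reports its own operating constraints, and the portfolio level sums only the flexibility that those constraints allow right now. The constraint fields and figures below are hypothetical, not Open Energi's actual model.

```python
# Illustrative only: aggregating asset-level flexibility (the bottom-up
# view) into a portfolio figure the grid can actually call on.
# Constraint names and numbers are hypothetical.

def portfolio_flexibility(assets, now_hour):
    """Sum the flexible power of assets whose constraints permit a shift now."""
    total = 0.0
    for a in assets:
        within_window = a["avail_from"] <= now_hour < a["avail_to"]
        rested = now_hour - a["last_switch_hour"] >= a["min_rest_hours"]
        if within_window and rested:
            total += a["flex_kw"]
    return total

assets = [
    {"flex_kw": 120.0, "avail_from": 8, "avail_to": 20,    # industrial pump
     "last_switch_hour": 9, "min_rest_hours": 2},
    {"flex_kw": 80.0, "avail_from": 0, "avail_to": 24,     # site battery
     "last_switch_hour": 13, "min_rest_hours": 1},
]
print(portfolio_flexibility(assets, now_hour=14))   # → 200.0 (both qualify)
```

A learned model replaces these hard-coded rules with behaviour inferred from the stream of asset data, but the aggregation structure is the same.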
An early entrant in this field is Swiss-German firm Alpiq, whose intelligent GridSense system, launched in 2014, aims to make imperceptible changes to the energy consumption of the domestic or commercial equipment into which it is integrated (such as boilers, heat pumps and charging stations for electric vehicles), based on measurements such as grid load, weather forecasts and electricity charges.
This article from law firm Osborne Clarke describes the technology developed by Alpiq, which has filed a patent application for a method of programming energy flow in an accumulator of an electric vehicle. By recording information on past usage of the accumulator at a point of connection to the grid, it is possible to locally estimate its future usage, and to programme the energy flow between the grid and the accumulator – in either direction – on the basis of that estimate.
This allows the charging process to be optimised, using machine learning and data mining techniques in order to determine the optimal level of charging when the vehicle is connected to the grid, avoiding the need for full charging if only a small amount of energy is needed for the next trip.
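A hedged sketch of the idea described above: estimate the energy the next trip will need from recent usage history, then charge only to that level plus a safety margin rather than always charging to full. The averaging rule, margin and figures are illustrative assumptions, not the method claimed in Alpiq's patent.

```python
# Hedged sketch of need-based EV charging: estimate next-trip energy from
# recent usage history and charge only to that level plus a safety margin.
# The 7-day window, 1.25x margin and trip figures are invented.

def target_charge_kwh(trip_history_kwh, battery_kwh, margin=1.25):
    """Charge target = recent average trip energy times a safety margin."""
    recent = trip_history_kwh[-7:]                 # last week of trips
    estimate = sum(recent) / len(recent)
    return min(battery_kwh, estimate * margin)

trips = [6.0, 7.5, 5.0, 6.5, 8.0, 6.0, 7.0]        # kWh used per day
target = target_charge_kwh(trips, battery_kwh=60.0)
print(round(target, 1))   # → 8.2, far below a full 60 kWh charge
```

Charging to roughly 8 kWh instead of 60 kWh frees the remaining battery headroom for grid services, which is the commercial point of the scheme.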
The complexity of the electricity system is increasing: new technologies such as renewable generation, storage and electric vehicles are changing the fundamental dynamics of the grid, while the growth of demand-side response, changing consumer behaviours and the emergence of connected devices are creating a major shift in the interactions affecting the system. AI is likely to play an increasing role in managing and optimising these interactions, helping system operators adapt to the new grid landscape without falling back on expensive reinforcement projects.
“It would be amazing if you could save 10% of the country’s energy usage without any new infrastructure, just from optimisation,”
Yes, it would.
And why not create a beautiful single point of failure, and attack vector, for the whole grid, so it can be shut down by some PFY hacker in Afghanistan?
The grid’s already vulnerable to that as evidenced by the attack on the Ukrainian power grid in 2015 where the SCADA systems were hacked…