How much electricity will you need tomorrow? Answering that question is a lot like looking ahead to your morning commute — somewhat predictable, but by no means ironclad.
To manage the inherent uncertainty in predicting power needs and avoid surprises, electric grid operators rely on computer models that help estimate everything from power demand to weather patterns.
This challenge of factoring in the certain and the unknown to deliver electricity under all kinds of scenarios involves a series of incredibly complex math problems. With the assistance of artificial intelligence (AI), researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory are developing new ways to extract insights from reams of data on the electric grid, with the goal of ensuring greater reliability, resilience and efficiency. The work combines Argonne’s long-standing grid expertise with its advanced computing facilities and experts.
Grid operators have always dealt with challenges and some degree of uncertainty from factors ranging from extreme weather to equipment failures. Now, fluctuating supplies of renewable energy, some of it flowing from consumers whose rooftop solar panels are outfitted with smart meters, are increasing the number of variables grid operators must consider.
Argonne researchers are working on optimization models that use machine learning, a form of AI, to simulate the electric system and the severity of various problems. In a region with 1,000 electric power assets, such as generators and transformers, an outage of just three assets can produce well over 100 million possible failure combinations. Which of those possibilities will merit the most attention?
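The scale of that combinatorial explosion is easy to verify with a back-of-the-envelope count. The sketch below counts only which assets fail, not when or how — the article does not specify how scenarios are enumerated, so that counting convention is an assumption:

```python
from math import comb

# How many distinct ways can up to three of 1,000 assets fail at once?
# comb(n, k) counts the combinations of k failed assets out of n.
n_assets = 1000
scenarios = sum(comb(n_assets, k) for k in range(1, 4))
print(f"{scenarios:,}")  # 166,667,500
```

Even under this simplest convention, the count runs to hundreds of millions — far too many to simulate exhaustively one by one.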
Working in Parallel
Solving such a complex model is time-consuming. With resources such as the Argonne Leadership Computing Facility (ALCF), researchers can simulate multiple scenarios in parallel, moving the process along more quickly.
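As a minimal sketch of that parallel pattern — evaluating many scenarios at once and keeping the worst — here is a toy version using Python's standard thread pool. The `simulate` function is a placeholder standing in for a real grid simulation (a CPU-heavy simulation would instead use process pools or MPI across ALCF nodes):

```python
from concurrent.futures import ThreadPoolExecutor

def simulate(scenario_id: int) -> float:
    # Placeholder "severity score" standing in for a real simulation.
    return (scenario_id * 37) % 101 / 100.0

def rank_scenarios(n: int) -> int:
    """Evaluate n scenarios in parallel and return the worst one."""
    with ThreadPoolExecutor() as pool:
        scores = list(pool.map(simulate, range(n)))
    # max returns the first index achieving the highest severity
    return max(range(n), key=scores.__getitem__)

print(rank_scenarios(1000))
```

The pool submits scenarios concurrently instead of looping over them serially; on a cluster, each worker would be a full simulation job rather than a function call.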
“The idea is to generate a large number of scenarios and train the machine learning model to tell us the answer,” said Kibaek Kim, an assistant computational mathematician at Argonne. “Instead of solving a number of difficult optimization models over several hours or days, we train the model ahead of time and then get the answer right away.”
The researchers train the machine by feeding it a set of data that includes the solutions, as if the machine were studying previous “exams” before trying new ones. This is called supervised learning. Another technique, unsupervised learning, involves feeding a computer raw data and allowing it to sift out patterns without telling it any “answers.”
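The "study previous exams" idea can be illustrated with a deliberately tiny supervised model: store (scenario, solution) pairs solved offline, then answer new scenarios instantly by reusing the most similar past solution. All numbers here are invented, and a nearest-neighbor lookup is far simpler than Argonne's actual models:

```python
# Past "exams": (peak demand in MW, reserve in MW from a solved
# optimization). In practice these come from hours of offline solves.
training = [(900, 90), (1000, 100), (1200, 130), (1500, 170)]

def predict_reserve(demand_mw: float) -> float:
    """1-nearest-neighbor: reuse the solution of the most similar
    previously solved scenario."""
    nearest = min(training, key=lambda pair: abs(pair[0] - demand_mw))
    return nearest[1]

print(predict_reserve(1150))  # closest training scenario is 1200 MW
```

The expensive optimization runs happen ahead of time; at decision time the lookup is effectively free, which is the speedup Kim describes.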
In one study, Kim and his colleagues used a type of model called a graph convolutional neural network to recommend optimal controls that would prevent transmission lines from overloading if there were a problem with any of the lines. They found this model, which used machine learning to quickly find a solution, generated far fewer errors than more conventional ones. The work was conducted using Argonne’s Laboratory Computing Resource Center (LCRC) and its Joint Laboratory for System Evaluation. In addition to the LCRC, Kim’s work involves collaboration across Argonne.
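The article does not publish the model itself, but the building block of a graph convolutional network is compact enough to sketch. A power grid maps naturally onto a graph — buses as nodes, transmission lines as edges — and one GCN layer (in the common Kipf–Welling normalization, an assumption here) mixes each node's features with its neighbors':

```python
import numpy as np

def gcn_layer(adj, features, weights):
    """One graph convolution: propagate features along edges with
    symmetric normalization, then apply a linear map and ReLU.
    adj: (n, n) adjacency; features: (n, f_in); weights: (f_in, f_out)."""
    a_hat = adj + np.eye(adj.shape[0])        # add self-loops
    deg = a_hat.sum(axis=1)
    d_inv_sqrt = np.diag(deg ** -0.5)         # D^(-1/2)
    a_norm = d_inv_sqrt @ a_hat @ d_inv_sqrt  # normalized propagation
    return np.maximum(0.0, a_norm @ features @ weights)

# Toy grid: three buses in a line, 0 -- 1 -- 2, one feature per bus
# (say, local load); weights are untrained placeholders.
adj = np.array([[0., 1., 0.], [1., 0., 1.], [0., 1., 0.]])
x = np.array([[1.0], [0.0], [2.0]])
w = np.ones((1, 1))
print(gcn_layer(adj, x, w))
```

Stacking such layers lets information about a disturbance on one line influence control recommendations at electrically nearby buses, which is what makes the architecture a natural fit for transmission problems.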
Kim’s team is working to make such models even more robust, giving grid operators stronger guidance that can inform more reliable planning and operations for contingent events such as storms, equipment malfunctions and big fluctuations in renewable energy generation.
Speeding Up Calculations
Other work at Argonne involves applying AI to speed up the daily calculations that go into regional electric system planning. One such calculation is the security constrained unit commitment (SCUC), which helps grid operators set a schedule for daily and hourly power generation.
“In power systems, this SCUC problem is solved multiple times a day,” said Feng Qiu, principal computational scientist at Argonne. “Since this problem is solved repeatedly, we can accumulate a lot of data and discover patterns that could be used to solve the next round.”
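Real SCUC is a large mixed-integer program with network security constraints; the decision at its core, though, can be shown in miniature: choose which generators to switch on so committed capacity covers demand at minimum cost. Every number below is invented for illustration:

```python
from itertools import product

generators = [  # (name, capacity MW, startup cost $, cost $ per MWh)
    ("coal",  400, 500, 20),
    ("gas",   300, 100, 40),
    ("hydro", 200,  50, 10),
]
demand = 450  # MW to cover this hour

best = None
for on in product([0, 1], repeat=len(generators)):  # all on/off patterns
    cap = sum(g[1] for g, u in zip(generators, on) if u)
    if cap < demand:
        continue  # infeasible: committed units can't cover demand
    # Dispatch committed units cheapest-first (merit order)
    cost, remaining = 0, demand
    for g, u in sorted(zip(generators, on), key=lambda p: p[0][3]):
        if not u:
            continue
        cost += g[2]            # pay the startup cost once
        mw = min(g[1], remaining)
        cost += mw * g[3]       # pay for the energy produced
        remaining -= mw
    if best is None or cost < best[0]:
        best = (cost, on)

print(best)
```

Brute force works for three units but blows up exponentially with fleet size — which is why the real problem takes hours and why reusing patterns from previous solves, as Qiu describes, is so attractive.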
Rather than replacing current analytics with machine learning, said Qiu, the idea is to bolster existing ones using machine learning to offer “hints” learned from previous solutions. Making use of LCRC’s Bebop cluster, a team led by Argonne postdoctoral appointee Alinson Santos Xavier developed AI that can solve SCUC 12 times faster, on average, than conventional methods. An early version of the method was used successfully in tests at Midcontinent Independent System Operator (MISO), which oversees electricity delivery across 15 states and one Canadian province.
“All this can lead to a more efficient market and more cost-effective electricity production,” Qiu said. “For long-term planning, it could help grid operators consider more scenarios and make better expansion plans.”
Modernized grids increasingly incorporate sensors that can monitor conditions throughout the system, and these too offer opportunities for enhanced data processing. Devices placed on transmission lines and at substations, for example, serve as sentinels that alert grid operators to equipment problems when they occur.
Argonne scientists have evaluated a year’s worth of sensor data from ComEd, a utility that serves nearly four million customers in the U.S. Midwest. This time, the researchers used unsupervised learning, feeding the data to the machine and asking it to look for anomalies in the sensor outputs.
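A minimal version of that unsupervised idea is to flag readings that deviate strongly from the statistics of past behavior, with no labeled "answers" at all. The readings and threshold below are invented; the actual ComEd analysis is far richer:

```python
import statistics

def find_anomalies(readings, threshold=2.5):
    """Return indices of readings more than `threshold` standard
    deviations from the mean of the series."""
    mean = statistics.fmean(readings)
    stdev = statistics.stdev(readings)
    return [i for i, r in enumerate(readings)
            if abs(r - mean) / stdev > threshold]

# Per-unit voltage readings from one hypothetical sensor; the 1.45
# spike is the kind of surprise an operator would want flagged.
voltage = [1.00, 1.01, 0.99, 1.02, 0.98, 1.01, 1.00, 1.45, 0.99, 1.01]
print(find_anomalies(voltage))  # [7]
</n```

The key property is that nothing told the code what "bad" looks like — the definition of normal is learned from the data itself, which is what lets the approach surface problems the operator did not anticipate.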
Alerting on the Unexpected
“It’s not always known to the operator when things do not work as they should,” said Mihai Anitescu, senior computational mathematician at Argonne, who worked on the project. “Our approach decides whether the current conditions of the system are expected based on past behavior, or whether something is new and different. This information can be used to alert the operator that they may have something they don’t expect on the grid.”
This sort of classification work can also be applied to weather forecasting for renewable energy, correcting for underestimates of wind resources near bodies of water, for example, and combining numerical models with physical measurements to improve accuracy.
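The bias-correction part of that idea can be sketched with ordinary least squares: learn a systematic offset from past (forecast, observed) pairs, then apply it to new forecasts. The numbers are invented, and real work blends full numerical weather models with field measurements rather than a single linear fit:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y ~ a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
    return a, my - a * mx

# Hypothetical wind-speed forecasts (m/s) vs. measurements near a
# lake, where the numerical model systematically underestimates.
forecast = [4.0, 5.0, 6.0, 7.0]
observed = [5.0, 6.2, 7.4, 8.6]
a, b = fit_linear(forecast, observed)
print(round(a, 2), round(b, 2))   # learned correction: slope, offset
print(round(a * 5.5 + b, 2))      # corrected value for a new forecast
```

Feeding the corrected forecast into generation planning gives operators a more realistic picture of how much wind power to expect.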
Much AI work, Anitescu said, involves pure data — recognizing speech patterns, for example, or analyzing a picture: “There aren’t many physical rules.”
That’s not the case for large real-world systems like the weather or the electric grid. “You really have to reconcile data, even if there’s a lot of it, with the physical information,” he said. “This is very much in its infancy, and it’s really where supercomputers are necessary.”