About two years ago, the Pacific Northwest Smart Grid Demonstration Project flipped the switch on the country's biggest test of a technology concept that could transform distributed energy-grid integration: transactive energy.

The goal was to show that near-real-time energy and pricing data could provide the right signals for utility-controlled distributed energy assets and demand-response platforms to better balance electricity supply and demand. The testing ground was the Bonneville Power Administration (BPA) grid system, with its combination of firm hydropower, growing wind and solar power, and an increasing focus on demand reduction.

Now the experiment is over, the nodes mostly shut down, and the findings have been released in a report from Pacific Northwest National Laboratory (PNNL). The report (PDF) lays out evidence that this kind of data sharing will spur big improvements to grid operations -- but only if several key “capability enhancements” are put in place first.

“We basically validated that the technology works, which is important,” PNNL’s Ron Melton, director of the project, said in an interview this week. At the same time, “We’re trying to push technology forward on the transactive side, and there’s some work that needs to be done on that,” he added.

The project spanned five states, 11 utilities and 60,000 metered customers, linked up to 27 different “nodes” in the Pacific Northwest’s power grid. Every five minutes, those nodes communicated the delivered cost of electricity at that moment, plus a prediction of how much electricity they would need over the coming minutes, hours and days.

These data points included wind power forecasts, marginal costs at different generation plants, transmission system congestion data, and other such factors that go into keeping the transmission system stable. At the same time, they provided a signal that participating utilities could communicate to distribution grid-connected assets -- smart thermostats and adjustable water heaters, demand-response-enabled industrial sites, or grid-responsive energy storage systems.
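
To make the shape of that exchange concrete, here is a minimal Python sketch of what a node’s five-minute message might carry -- the field names and units are assumptions for illustration, not the project’s actual signal schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

# Illustrative only: the structure and field names here are assumptions,
# not the demonstration project's actual transactive signal format.
@dataclass
class TransactiveSignal:
    node_id: str                     # one of the demonstration's grid "nodes"
    timestamp: datetime              # when the signal was published
    delivered_cost: float            # delivered cost of electricity right now ($/MWh)
    load_forecast: dict = field(default_factory=dict)  # {lookahead timedelta: predicted MW}

def publish_five_minute_signal(node_id: str, cost: float, forecast: dict) -> TransactiveSignal:
    """Bundle the data a node would exchange every five minutes."""
    return TransactiveSignal(
        node_id=node_id,
        timestamp=datetime.now(timezone.utc),
        delivered_cost=cost,
        load_forecast=forecast,
    )

# Hypothetical example: current delivered cost plus need over minutes, hours and days
signal = publish_five_minute_signal(
    "node-07",
    cost=42.50,  # $/MWh, invented value
    forecast={
        timedelta(minutes=5): 118.0,
        timedelta(hours=1): 125.0,
        timedelta(days=1): 140.0,
    },
)
```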

The first big question was whether the transactive system would be capable of providing an accurate picture of what was happening on the grid. On that front, “BPA challenged us with some circumstances or events they had observed during the course of the project [in order] to see how the transactive system dealt with those,” Melton said.

Participating vendors took on different tasks, with IBM building the infrastructure to disseminate the signal and integrate it with the utilities’ responsive assets; Spirae providing the control platform for distributed energy assets; QualityLogic ensuring interoperability; 3TIER (now Vaisala) doing wind forecasting; and Alstom Grid defining and calculating the simulated wholesale power price.

“We were able to verify that the transactive system, with the data we had at hand, did correctly provide an incentive signal that represented the types of changes that were happening in the system,” he said, with each utility node “recommending local load changes that were the correct type of load changes for the circumstance.” So far, so good. 

Where it worked, where it didn’t

One of the key factors in PNNL’s latest project was its inclusion of predicted future responses on the grid. That’s a big departure from PNNL’s past transactive energy pilots, such as the 2006 GridWise project on Washington state’s Olympic Peninsula, or 2013’s demonstration with AEP Ohio. Those previous projects were based on a “double-auction market technique” that simply captured real-time energy prices to tell home appliances when to buy or not buy energy based on certain grid factors.
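
For readers unfamiliar with the term, a double auction simply matches the highest-bidding buyers with the lowest-offering sellers until their prices no longer cross. The sketch below is a generic textbook version of that clearing step, not the GridWise or AEP Ohio implementation:

```python
def clear_double_auction(bids, offers):
    """Find a uniform clearing price from (price, quantity) bids and offers.

    Generic illustration of double-auction clearing; quantities and prices
    are arbitrary and this is not the pilots' actual market code."""
    bids = sorted(bids, key=lambda b: -b[0])     # buyers, highest willingness to pay first
    offers = sorted(offers, key=lambda o: o[0])  # sellers, cheapest first

    i = j = 0
    bid_left = offer_left = 0.0
    price = None
    while True:
        if bid_left == 0.0:                      # advance to the next buyer
            if i >= len(bids):
                break
            bid_price, bid_left = bids[i]; i += 1
        if offer_left == 0.0:                    # advance to the next seller
            if j >= len(offers):
                break
            offer_price, offer_left = offers[j]; j += 1
        if bid_price < offer_price:              # no more mutually acceptable trades
            break
        traded = min(bid_left, offer_left)
        bid_left -= traded
        offer_left -= traded
        price = (bid_price + offer_price) / 2.0  # midpoint of the marginal matched pair
    return price

# A thermostat "buys" comfort only while the price stays below its bid:
clearing_price = clear_double_auction(
    bids=[(0.12, 3.0), (0.09, 2.0)],    # ($/kWh, kW), hypothetical
    offers=[(0.07, 2.5), (0.10, 4.0)],
)
```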

"For this project, we recognized that we were taking a much broader look at the system,” said Melton. The Pacific Northwest project enlisted about 50 megawatts of responsive loads, enough to cause their own supply-demand disruptions in the course of responding to transactive signals -- unless they’re informed on how those real-time decisions will affect future energy and grid costs.

“What we’re trying to do here is the convergence of an economic approach with the control and coordination of distributed energy resources. If all you do is broadcast a price, but you don’t look at what the response is going to be, then you’ve got an open-loop system from a control point of view.”

Bringing the future into view helps mitigate the oscillations and disruptions that could arise in this situation, he said. “It’s in the future where we’re doing this negotiation between demand and supply in this technique. When the loads respond, they may change the cost factors. […] If that changes, we have to continue the negotiations, until we come to a stable point.” 
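
Conceptually, that negotiation behaves like a fixed-point iteration: quote a price, let the loads respond, re-quote, and repeat until neither side moves. The toy supply and demand curves below are invented for illustration and are not the project’s transactive algorithm:

```python
def delivered_cost(load_mw: float) -> float:
    """Toy supply curve: delivered cost rises with load ($/MWh)."""
    return 25.0 + 0.4 * load_mw

def responsive_load(price: float, baseline_mw: float = 120.0) -> float:
    """Toy demand curve: responsive load backs off as the price rises."""
    return max(0.0, baseline_mw - 0.8 * price)

def negotiate(baseline_mw: float = 120.0, tol: float = 0.01, max_rounds: int = 50):
    """Iterate price <-> load until the two sides stop moving (a stable point)."""
    load = baseline_mw
    for rounds in range(max_rounds):
        price = delivered_cost(load)                     # supply side quotes a cost for this load
        new_load = responsive_load(price, baseline_mw)   # loads adjust to that price
        if abs(new_load - load) < tol:                   # converged: neither side changes
            return price, new_load, rounds
        load = new_load
    return price, load, max_rounds

price, load, rounds = negotiate()
# A single "open-loop" price broadcast could leave tens of megawatts of responsive
# load overshooting and oscillating; the closed loop settles on one consistent
# price/load pair after a handful of rounds.
```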

Some of the project’s future-looking features worked well. For example, the wind forecasting piece of the puzzle yielded accurate data on when the region’s wind farms were going to ramp up and down.

The system also caught when the Columbia Generating Station, Washington state’s sole nuclear power plant, unexpectedly powered down.

“In the zone above the power plant, the incentive signal jumped up when the information showing the trip began to arrive in the system, and as you got geographically further away from the zone, the incentive signal didn’t rise as fast, which is what we would have expected,” noted Melton.

Forecasting didn’t yield such solid results in areas where data was lacking or where the techniques for collecting it weren’t yet up to speed.

While participating utilities deployed more than 30,000 smart meters to increase data on the load side, more work needs to be done to create better local load forecasts and find opportunities to take advantage of home energy management systems and other devices, said Melton.

A similar data problem occurred when trying to get distributed resources to help solve transmission system congestion problems, he said. “We had challenges getting the proper location of all of the different resources and determining them relative to our defined transmission zones. So we were just not able to get to the point where we were able to represent that in our regional model.”

Bridging the gap between traditional and transactive grids

PNNL’s project was also limited by a mismatch between the way grid assets are managed today and how a fully transactive system would have them operate.

For example, while utilities pledged a number of flexible loads to participate, they were primarily relying on conventional demand-response programs, which allow only a limited number of events per month. If a resource had already tapped out its monthly event quota or its daily duration limit, it couldn’t respond to a transactive signal, no matter how lucrative or pressing that signal was.
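
The constraint is easy to picture in code: a conventional demand-response resource checks its remaining monthly events and daily hours before it can respond, whatever the signal says. The program limits below are hypothetical:

```python
from dataclasses import dataclass

# Hypothetical program terms; actual utility DR tariffs vary.
@dataclass
class ConventionalDRResource:
    events_used_this_month: int
    max_events_per_month: int
    hours_used_today: float
    max_hours_per_day: float

    def can_respond(self, requested_hours: float) -> bool:
        """A resource responds only if it has events and hours left."""
        return (self.events_used_this_month < self.max_events_per_month
                and self.hours_used_today + requested_hours <= self.max_hours_per_day)

resource = ConventionalDRResource(events_used_this_month=4, max_events_per_month=4,
                                  hours_used_today=0.0, max_hours_per_day=4.0)
# Once the quota is spent, even a very high-value transactive signal gets no response:
print(resource.can_respond(requested_hours=1.0))  # False
```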

“This is one of the challenges going forward -- to have the business or operational practices at the distribution-utility level change, such that they’re able to provide a more continuous response,” said Melton. “That, of course, means different ways of interacting with their customers.”

Similar issues emerged at the bulk-generation level. While BPA observed the transactive system in action, it didn’t use the resulting data to inform its day-to-day procurement of generation resources. Even so, IBM used grid-modeling tools to demonstrate how BPA could feed the transactive system’s data into its forecast model and economic dispatch in order to do things like pick available demand reduction instead of bringing more generation online.

“Since we had no way to affect the dispatch stack on the bulk side, we couldn’t do that as part of the live experiment,” he said.
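
The substitution IBM modeled amounts to treating available demand reduction as one more resource in the merit order and filling a shortfall from the cheapest options first. The costs and quantities below are invented for illustration; this is not BPA’s dispatch model:

```python
def dispatch(shortfall_mw: float, resources: list[dict]) -> list[tuple[str, float]]:
    """Meet a shortfall from the cheapest resources first, generation or demand reduction."""
    schedule = []
    for r in sorted(resources, key=lambda r: r["cost"]):   # merit order: cheapest first
        if shortfall_mw <= 0:
            break
        take = min(r["available_mw"], shortfall_mw)
        schedule.append((r["name"], take))
        shortfall_mw -= take
    return schedule

# Hypothetical resource stack ($/MWh costs and MW quantities are invented)
resources = [
    {"name": "gas peaker",           "cost": 85.0, "available_mw": 200.0},
    {"name": "transactive DR block", "cost": 60.0, "available_mw": 50.0},  # from the signal
    {"name": "hydro re-dispatch",    "cost": 30.0, "available_mw": 40.0},
]
print(dispatch(120.0, resources))
# Hydro and the demand-reduction block are taken before the peaker is brought online.
```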

But IBM’s model showed that with 30 percent penetration of transactive control on the distribution side and 30 percent wind penetration on the bulk power side, the engineers could expect an 8 percent reduction in bulk-system peak load. Those are the kinds of results that could encourage utilities and grid operators to take additional steps toward putting transactive energy concepts into practice.