It’s not easy being among the first states and utilities to integrate distributed energy resources into a century-old grid paradigm. But whatever the mistakes and missteps that may result, at least the rest of the country can learn from them.
That’s the gift being presented to state regulators in a new report from the Interstate Renewable Energy Council (IREC), titled Optimizing the Grid: A Regulator’s Guide to Hosting Capacity Analyses for Distributed Energy Resources. The report takes a deep dive into the pros and cons of the approaches of four key states -- California and Hawaii, where rising amounts of DERs are creating grid integration challenges today, and New York and Minnesota, which are pushing ahead with regulatory reforms to manage the DERs to come.
No state has put together an end-to-end DER integration framework, but hosting capacity analyses, or HCAs, are an important first step on that path, the report notes. In simple terms, an HCA “evaluates a variety of circuit operational criteria -- typically thermal, power quality/voltage, protection, and safety/reliability -- under the presence of a given level of DER penetration and identifies the limiting factor or factors for DER interconnections.”
Today’s standard interconnection screens, by comparison, rely on conservative engineering assumptions, so the limiting factors they identify may or may not represent true conditions on the grid. And the costs of adding the capacity to accommodate new DERs are generally put on the developer, in what IREC calls a “cost-causer pays” system.
An HCA, by contrast, can pull together real data on a feeder’s load curves, design, and physical and operational characteristics to “allow utilities, regulators and electric customers to make more efficient and cost-effective choices about deploying DERs on the grid,” the report notes.
And, if properly designed -- or as IREC puts it, “adopted with intention,” with an eye toward getting all stakeholders involved and including a state’s broader energy and environmental policy goals -- they can also bridge data gaps between developers, customers and utilities to make their grid interactions more efficient and economical.
But getting this design right is hard. States that are doing it have a mixed track record -- at least according to the perspective of IREC, which counts such DER providers as SolarCity (now Tesla), Sunrun, Stem, Engie and Green Mountain Power as members.
According to its analysis, “the failure to consider the use cases prior to selecting the methodologies has resulted in a potential need to revise the methodologies in California,” the state that’s arguably furthest ahead in both real-world DERs and policy initiatives to integrate them. “In addition, stakeholders have voiced concerns about whether the methodologies used in Minnesota and New York will actually be able to achieve those states’ goals.”
Interconnection’s tough choices: Streamlining vs. getting the facts right
To delve into these discrepancies, let’s turn to the practical use cases for HCAs today, which fall into two main categories -- better DER interconnection and better distribution planning.
Today, the interconnection use case is the more pressing, since utilities and state regulators are already struggling to accommodate the number of DERs trying to interconnect to the system, as we’ve seen in Hawaii and New York.
HCAs could really help, by giving utilities and developers more accurate data on DER limits down to the individual circuit level. We’ve seen this principle put into play in Hawaii, the first state to see utility limits on new rooftop solar installations due to hosting capacity concerns. It's also an important part of California’s mandated integrated capacity analysis measurements, DER hosting capacity maps, and other emerging data tools.
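To picture what that circuit-level data can look like, here is a hypothetical per-feeder record of the kind a hosting capacity map or integrated capacity analysis portal might publish. The field names and values are invented for illustration; each utility’s actual published format differs.

```python
# Hypothetical per-feeder hosting capacity record, loosely modeled on the
# kind of data exposed through hosting capacity maps and integrated
# capacity analyses. All names and values are illustrative.
feeder_record = {
    "feeder_id": "EXAMPLE-1234",        # made-up identifier
    "substation": "Example Substation",
    "voltage_kv": 12.47,
    "hosting_capacity_mw": {
        "minimum": 1.2,                 # most constrained node on the feeder
        "maximum": 4.8,                 # least constrained node on the feeder
    },
    "limiting_criterion": "thermal",    # thermal, voltage, protection, or safety/reliability
    "existing_der_mw": 0.9,
    "last_updated": "2018-03-01",
}
```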
The challenge for state regulators seeking to enable this kind of circuit-by-circuit visibility is that it takes a lot of data, and a lot of computing power, to yield accurate results. And in an ironic twist, the goal of streamlining DER interconnection has led some regulators to choose a more streamlined method of making these measurements that could come back to haunt them down the road, according to IREC.
In general terms, IREC divides the types of methodologies now being used by utilities to analyze hosting capacity into three categories: stochastic, iterative and streamlined. Let’s look at the latter two, since the stochastic method is covered only in a special appendix on mid-Atlantic utility Pepco.
- The streamlined method applies a set of simplified algorithms for each power system limitation (typically thermal, safety/reliability, power quality/voltage, and protection) to approximate the DER capacity limit at nodes across the distribution circuit.
- The iterative method directly models DERs on the distribution grid to identify hosting capacity limitations. A power flow simulation is run iteratively at each node on the distribution system until a violation of one of the four power system limitations is identified. The iterative method is also sometimes referred to as the detailed method.
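To make the iterative method just described more concrete, here is a minimal sketch of that loop written with the open-source pandapower power-flow library (an illustrative choice -- the utilities’ actual tools differ) and its built-in CIGRE example feeder. The voltage and thermal thresholds, the step size, and the decision to check only those two criteria are assumptions for illustration; real analyses also model protection and safety/reliability limits, time-varying load, and smart inverter behavior.

```python
# Minimal sketch of the "iterative" hosting capacity method: at each bus,
# add DER generation in small increments and re-run a power flow until a
# limit is violated. Thresholds and step size are illustrative assumptions.
import pandapower as pp
import pandapower.networks as pn

VOLTAGE_MAX_PU = 1.05     # illustrative power quality/voltage limit
LINE_LOADING_MAX = 100.0  # illustrative thermal limit (% of line rating)
STEP_MW = 0.1             # DER increment added per iteration
MAX_MW = 10.0             # stop searching beyond this level


def hosting_capacity(net, bus):
    """Return the largest DER injection (MW) at `bus` that causes no violations."""
    sgen = pp.create_sgen(net, bus, p_mw=0.0, q_mvar=0.0)  # trial DER
    capacity, mw = 0.0, STEP_MW
    while mw <= MAX_MW:
        net.sgen.at[sgen, "p_mw"] = mw
        try:
            pp.runpp(net)  # power flow at this DER level
            ok = (net.res_bus.vm_pu.max() <= VOLTAGE_MAX_PU
                  and net.res_line.loading_percent.max() <= LINE_LOADING_MAX)
        except Exception:  # a non-converging power flow also counts as a violation
            ok = False
        if not ok:
            break
        capacity = mw
        mw += STEP_MW
    net.sgen.drop(sgen, inplace=True)  # remove the trial DER
    return capacity


# Sweep every bus on a small example feeder that ships with pandapower.
net = pn.create_cigre_network_mv(with_der=False)
print({bus: round(hosting_capacity(net, bus), 2) for bus in net.bus.index})
```

The analyses utilities actually run are far heavier than this sketch: each feeder has hundreds or thousands of nodes, multiple load and DER scenarios, and all four limit categories, which is why the iterative method’s computing cost is a real consideration.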
There are some areas of overlap between these methods, as well as differences between implementations of the same method. But in simple terms, streamlined methods are easier and require less computing power, while yielding less accurate results.
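For contrast, a streamlined method trades that node-by-node simulation for closed-form screens computed from a handful of feeder attributes. The sketch below is a generic illustration of the idea, not EPRI’s DRIVE algorithm or any utility’s actual screen: it estimates a thermal limit from equipment ratings and minimum load, and a voltage limit from a simple voltage-rise approximation.

```python
# Generic illustration of a "streamlined" hosting capacity screen built from
# feeder attributes alone (no power flow). Formulas and thresholds are
# simplified assumptions for illustration, not EPRI's DRIVE methodology.

def streamlined_hosting_capacity(transformer_rating_mva: float,
                                 minimum_load_mw: float,
                                 feeder_resistance_ohm: float,
                                 nominal_kv: float,
                                 v_rise_limit_pu: float = 0.03) -> float:
    """Rough hosting capacity estimate (MW) from a few feeder attributes."""
    # Thermal screen: DER backfeed (generation minus the minimum load it
    # serves locally) must stay within the transformer rating, assuming
    # unity power factor.
    thermal_limit_mw = transformer_rating_mva + minimum_load_mw

    # Voltage screen: steady-state voltage rise dV (per unit) ~= P * R / V^2
    # must stay under a fixed threshold, so P <= limit * V^2 / R.
    v_nom = nominal_kv * 1e3  # volts
    voltage_limit_mw = v_rise_limit_pu * v_nom ** 2 / feeder_resistance_ohm / 1e6

    # The binding (smaller) screen sets the estimated hosting capacity.
    return min(thermal_limit_mw, voltage_limit_mw)


# Example: a 12.47 kV feeder with a 10 MVA transformer, 2 MW of minimum load,
# and 1.5 ohms of resistance to the interconnection point (made-up numbers).
print(streamlined_hosting_capacity(10.0, 2.0, 1.5, 12.47))  # ~3.1 MW, voltage-limited
```

Because it only evaluates a few formulas per circuit, a screen like this can be run across thousands of feeders quickly -- which is precisely its appeal, and its limitation.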
And while going the cheaper, faster route may be appealing to utilities and regulators, a working group of stakeholders in California decided that utilities needed to use the more expensive, yet more accurate, iterative method to model circuits accurately enough to serve as a guide for streamlining interconnection.
“The iterative method optimizes precision because it measures the actual technical capacity of the system, and it proved to be particularly well suited to complex feeders,” the group noted.
Based on side-by-side testing of the two methods, the California Public Utilities Commission eventually picked the iterative methodology systemwide for the interconnection use case, the report states. This decision will doubtless increase the computing costs to the state’s three big investor-owned utilities, which are tasked with creating the new models for their Integrated Capacity Analysis pilot projects and are eventually expected to extend them across their distribution networks.
But those models will also provide more mathematically robust data for automating a process whose results, at the end of the day, still have to be trusted to keep the grid safe and stable. Hawaii’s main investor-owned utility is using a method similar to California’s iterative method to analyze its island grids, as part of a broad-reaching effort to integrate customer-owned solar, batteries, electric vehicles and demand response into its operations, the report noted.
In New York, by contrast, utilities are moving ahead with a “streamlined” method for calculating their circuits’ hosting capacities, in the form of the DRIVE software developed by the Electric Power Research Institute (EPRI). IREC isn’t happy about this, saying there’s “considerable uncertainty” about whether it can serve as a reliable source of real-world interconnection data.
The limitations of the streamlined approach were outlined in a use case for Xcel Energy in Minnesota. Last year, Xcel used EPRI’s DRIVE tool to apply a small distributed resource scenario analysis to more than 1,000 distribution feeders. But “owing to limitations in the DRIVE tool, Xcel did not include in its analysis existing or forecasted DERs, and it did not apply mitigations to determine if hosting capacity could be increased,” the report noted. And the minimum and maximum hosting capacity data it published for each feeder came with no map or downloadable data.
Planning for what?
IREC noted that Minnesota is still in the early stages of its distributed energy regulatory reforms, and hasn’t yet decided on use cases for its hosting capacity analyses. But in New York, which has made its Reforming the Energy Vision (REV) initiative a centerpiece of its clean energy and climate goals, regulators and utilities failed to figure out up front what the HCAs were supposed to be able to do, according to IREC.
While the state’s utilities set a goal of creating HCA maps to identify optimal interconnection locations for large-scale solar PV, they “declined to clearly identify and define interconnection as a use case,” and united on the use of EPRI’s DRIVE tool, despite the record from California.
And despite comments from stakeholders like IREC asking to “clearly define use cases” and look into whether the utilities’ HCA methods were yielding data “accurate and reliable enough to meet those use cases,” the New York Public Service Commission declined to investigate the matter.
This instance underscores a key difference between California and New York on the matter of hosting capacity analyses -- stakeholder involvement. In the California working group that ended up recommending the iterative method for HCA, utility and non-utility members alike shared “productive, iterative, and ongoing negotiations, with the utilities fielding stakeholder questions, responding to recommendations and concerns,” and communicating in person, online and over the phone.
“By contrast, stakeholders in New York’s Reforming the Energy Vision engagement groups reported that utilities had already made critical decisions before talking to stakeholders at engagement group meetings,” the report noted.
When non-utility concerns were raised, “utilities did not report back during the working group process about what input would or would not be taken into account,” foreclosing the iteration and discussion that could lead to consensus. “As a result, the meetings seemed to serve more as an opportunity to inform stakeholders of utilities’ plans than a meaningful opportunity for stakeholders to help shape the outcome of the process.”
That doesn’t mean the streamlined method is useless, of course. While “it can provide only a rough approximation of hosting capacity levels due to its reliance on abstract algorithms,” the report notes, “it is less data-intensive and thus could allow more simulations to be run in a timely manner.” That could make the method well suited to the second category of HCA use -- as a tool to help the aforementioned states plan for DERs as part of their multibillion-dollar distribution system investment needs.
IREC and Sandia National Laboratories have come up with the concept of an Integrated Distribution Planning process to help define this use case. The four steps are:
- Mapping the hosting capacity of the system
- Forecasting DER growth and load growth
- Identifying and prioritizing grid upgrade needs by comparing growth to available circuit hosting capacities
- Proactively pursuing grid solutions, including non-wires alternatives, to meet identified needs and integrate and optimize DERs on the grid
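As a toy illustration of the third step, here is a minimal sketch that compares invented per-feeder DER growth forecasts against invented hosting capacity figures and ranks feeders by projected shortfall; a real planning analysis would draw these inputs from the HCA and forecasting steps above.

```python
# Toy sketch of step 3: compare each feeder's forecasted DER additions
# against its available hosting capacity and rank feeders by projected
# shortfall. Feeder names and figures are invented for illustration.

feeders = {
    # feeder_id: (hosting_capacity_mw, forecast_der_mw)
    "FDR-101": (4.0, 6.5),
    "FDR-102": (8.0, 3.0),
    "FDR-103": (2.5, 3.5),
}

shortfall_mw = {
    name: max(0.0, forecast - capacity)
    for name, (capacity, forecast) in feeders.items()
}

# Feeders with the largest projected shortfall become the first candidates
# for grid upgrades or non-wires alternatives (step 4).
priority = sorted(shortfall_mw, key=shortfall_mw.get, reverse=True)
print(priority)       # ['FDR-101', 'FDR-103', 'FDR-102']
print(shortfall_mw)   # {'FDR-101': 2.5, 'FDR-102': 0.0, 'FDR-103': 1.0}
```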
So far, New York and California have yet to specify how they’re planning to use HCAs in their grid planning. But to get the most value out of them, the states will need to identify the key data they need, how accurate and granular it will need to be, and how to keep the models up to date with revised load forecasts and the natural increase in DERs on the system.
“Regulators should ensure that the HCA methodology is scalable so that, even under an incremental approach, the full grid and range of DERs can eventually be analyzed,” the report added. After all, while utilities may be analyzing only a subset of pilot feeders today, eventually, they’re expected to extend this visibility to every feeder on their system, and to DERs in hard-to-predict combinations.