Updated research assesses the performance of utility-scale PV in the United States, and how it has changed as plants age

March 15, 2022

Berkeley Lab is pleased to announce the release of a new report, titled “Plant-level performance and degradation of 31 GWDC of utility-scale PV in the United States.” This report updates an earlier analysis first published two years ago, with the benefit of a 50% larger sample than in the original study, as well as two additional years of performance to evaluate. Specifically, we assess the plant-level performance of a fleet of 631 utility-scale PV plants totaling 31.0 GWDC of capacity that achieved commercial operations in the United States from 2007-2018, and thus have been operating for at least two (2019 and 2020) and as many as thirteen (2008-2020) full calendar years. In aggregate, these 631 plants contributed >50% of all solar electricity generated in the United States in 2019 (across all sectors—residential, commercial, and utility-scale—and including concentrating solar thermal power) and 40% of all solar electricity generated in 2020.

Using detailed information on individual plant characteristics (e.g., fixed-tilt vs. tracking, azimuth, tilt, DC:AC ratio, location), in conjunction with satellite-based, modeled irradiance data for each site, we assess the extent to which actual first-year plant-level performance has lived up to expected (as modeled) performance. We then analyze fleet-wide degradation in energy output in subsequent years, by employing a "fixed effects" regression model to statistically isolate the impact of age on plant performance. Our results are similar to those of our original study, which speaks to the robustness of the findings.
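The report does not publish its estimation code, but the fixed-effects idea can be sketched in a few lines of Python: each plant gets its own dummy variable (absorbing plant-specific differences in design, location, and irradiance), so the coefficient on age reflects only within-plant changes over time. The data below are simulated with a known degradation rate purely for illustration; all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_plants, n_years = 50, 8
true_decay = -0.012  # hypothetical -1.2%/year degradation, built into the fake data

# Each plant has its own baseline log capacity factor (the plant fixed effect)
baseline = rng.normal(np.log(0.25), 0.1, n_plants)

plant, age, y = [], [], []
for p in range(n_plants):
    for a in range(n_years):
        plant.append(p)
        age.append(a)
        # log performance = plant effect + age effect + noise
        y.append(baseline[p] + true_decay * a + rng.normal(0, 0.005))

plant = np.array(plant)
age = np.array(age, dtype=float)
y = np.array(y)

# Design matrix: one dummy column per plant, plus a linear age term
# whose coefficient is the fleet-wide degradation rate.
X = np.zeros((len(y), n_plants + 1))
X[np.arange(len(y)), plant] = 1.0
X[:, -1] = age

coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(f"estimated degradation: {coef[-1] * 100:.2f}%/year")
```

Because the plant dummies soak up all time-invariant differences between plants, the recovered age coefficient lands close to the -1.2%/year built into the simulation. (The report uses age fixed effects rather than a single linear term, which lets the age profile be non-linear; a linear term keeps this sketch short.)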

On average, these plants’ first-year performance has fallen short of modeled expectations to a modest degree, with the largest underperformers concentrated among newer plants (Figure 1a) located in Texas, California, the Southeast, and the Northeast (Figure 1b).

Figure 1. Actual vs. modeled first-year capacity factor by (a) year and (b) region
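For readers unfamiliar with the metric compared in Figure 1: capacity factor is net generation over a period divided by the maximum possible generation over that period. A minimal sketch, assuming the standard definition (the report expresses capacity factors in DC terms; the plant size and output below are hypothetical):

```python
def capacity_factor(annual_mwh: float, capacity_mw: float) -> float:
    """Annual net generation divided by maximum possible annual generation."""
    return annual_mwh / (capacity_mw * 8760)  # 8760 hours per year

# Hypothetical 100 MW plant generating 240,000 MWh in its first year:
print(f"{capacity_factor(240_000, 100):.1%}")  # ~27.4%
```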

Moving beyond the first year, the subsequent fleet-wide degradation rate of -1.2%/year (±0.1%) is essentially the same as the -1.3%/year (±0.2%) found in our original study (Figure 2), and is of greater magnitude than is commonly assumed. 
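A rate of -1.2%/year compounds meaningfully over a plant's life. A quick back-of-the-envelope calculation (compounding geometrically from first-year output; the horizons chosen are illustrative, not from the report):

```python
rate = -0.012  # fleet-wide degradation rate from the updated study
for years in (5, 10, 25):
    remaining = (1 + rate) ** years
    print(f"after {years} years: {remaining:.1%} of first-year output")
```

At this rate, output falls to roughly 89% of first-year levels after 10 years, versus about 93% under a commonly assumed -0.7%/year.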

Figure 2. Comparison of updated (blue) to original (orange) age fixed effects

We emphasize, however, that these fleet-wide estimates reflect both "recoverable" (i.e., correctable performance issues, like soiling or tracker malfunction) and "unrecoverable" (non-correctable, like delamination or cell hot spots) degradation across the entire plant, and so will naturally be of greater magnitude than module- or cell-level studies, and/or studies that focus only on "unrecoverable" degradation. Moreover, applying the fixed effects model to a variety of sub-samples in an attempt to tease out potential degradation drivers suggests that newer and larger plants with higher DC:AC ratios (i.e., plants that more closely resemble those being built today) have experienced lower-magnitude degradation of -0.7%/year (±0.4%), which is more in line with other estimates from the recent literature.

The updated study, along with a summary slide deck and a 20-minute pre-recorded webinar, can all be freely downloaded from here: https://emp.lbl.gov/publications/plant-level-performance-and

We thank the Solar Energy Technologies Office (SETO) within the Office of Energy Efficiency and Renewable Energy (EERE) at the U.S. Department of Energy (DOE) for funding this work.





