Sandia Labs benchmarks PV software providers in first-ever blind comparison analysis

March 20, 2026 at 7:49 AM
Emiliano Bellini
pv magazine (International)


Summary

Sandia National Laboratories conducted the first-ever blind comparison of seven commercial PV modeling software, revealing that differences in weather handling, system modeling, derates, and assumptions grow as system complexity increases. The study emphasizes that software choice should consider project complexity, workflow, and modeling features rather than relying on rankings alone.

A group of scientists from the U.S. Department of Energy's Sandia National Laboratories has conducted a comprehensive assessment of seven PV modeling software tools – 3E SynaptiQ, PlantPredict, PVsyst, RatedPower, SAM, SolarFarmer, and Solargis Evaluate – and has found that their performance diverges as system complexity increases.
“This is the first-ever blind and independent comparison of commercially used PV software, with predictions submitted directly by the software providers,” the research's corresponding author, Marios Theristis, told pv magazine. “We did not rank tools; instead, we focused on how different modeling features and assumptions affect predictions.”
“We compiled summary tables of software features and then compared the predictions made by the providers,” he went on to say. “We observed that results align closely for simple systems, by which we mean fixed-tilt, flat-terrain, small-scale, monofacial systems, while differences increase as systems become more complex and are linked to specific modeling choices and software features.”
In the study “Feature review of photovoltaic modeling software utilizing blind performance assessment” (https://www.sciencedirect.com/science/article/pii/S0038092X25009703#fig0005), published in Solar Energy, the Sandia group explained that, unlike earlier studies that focused on small systems, single locations, or anonymous participants, their work presents a transparent, feature-level comparison of widely used PV modeling software supporting both pre-construction and post-construction activities.
The scientists categorized software features into weather and irradiance, DC system modeling, AC system modeling, and derates. The software tools were tested using one year of data from two fixed-tilt, monofacial, south-facing systems – one in Albuquerque, United States, and one at an undisclosed site in Germany – with capacities of 15.4 kW and 14.5 MW, respectively.
Measurements were performed independently, with instrument details withheld from the software providers, enabling an unbiased blind comparison of PV modeling performance across software platforms, according to the research team. Weather and irradiance data were also filtered before distribution to the software providers.
The analysis showed that PV modeling results vary significantly across software due to differences in weather handling, system modeling, inverter assumptions, and user-specified derates, highlighting the critical influence of both software design and user choices on predicted energy outcomes. Moreover, the blind modeling comparison revealed differences across software in plane-of-array (POA) irradiance transposition, module temperature, DC/AC power, and derates.
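As a rough illustration of why user-specified derates matter, system losses are typically combined multiplicatively, so modest differences in individual loss assumptions compound in the final energy estimate. The loss categories and values below are hypothetical examples, not figures from the study:

```python
# Illustrative derate chain. All loss values below are hypothetical,
# not taken from the Sandia study; they only show how assumptions compound.

def combined_derate(losses):
    """Return the overall multiplicative derate for fractional losses."""
    factor = 1.0
    for loss in losses:
        factor *= (1.0 - loss)
    return factor

# Two users of the same software with slightly different loss assumptions
losses_a = {"soiling": 0.02, "dc_wiring": 0.02, "mismatch": 0.01, "inverter": 0.03}
losses_b = {"soiling": 0.03, "dc_wiring": 0.02, "mismatch": 0.02, "inverter": 0.04}

derate_a = combined_derate(losses_a.values())  # ~0.9223
derate_b = combined_derate(losses_b.values())  # ~0.8943
print(f"Energy prediction gap: {100 * (derate_a - derate_b):.1f}%")  # ~2.8%
```

Even one-percentage-point tweaks to a few loss categories shift the predicted annual energy by several percent, which is the same order as the software-to-software differences the study observed for complex systems.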
Weather modeling, for example, varied due to different libraries, transposition models, and assumptions about air mass, albedo, and location, with median POA residuals ranging from 14.65 to 6.06. DC system features were generally consistent, but shading and temperature models varied. By contrast, AC system modeling differed in inverter efficiency, clipping handling, and curtailment adjustments. Furthermore, shading approaches were found to vary with stringing, irradiance decomposition, and terrain assumptions, creating additional uncertainty.
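The transposition step mentioned above converts measured horizontal irradiance into plane-of-array irradiance, and packages differ in which sky model they apply. A minimal sketch of the simplest option, the isotropic sky model, is shown below as an illustration only; commercial tools typically offer more sophisticated models (e.g. Perez or Hay–Davies), which is one source of the POA differences the study reports. The input values are hypothetical:

```python
import math

def poa_isotropic(dni, dhi, ghi, tilt_deg, aoi_deg, albedo=0.2):
    """Plane-of-array irradiance (W/m^2) under the isotropic sky model.

    dni/dhi/ghi: direct-normal, diffuse-horizontal, global-horizontal irradiance
    tilt_deg:    array tilt from horizontal
    aoi_deg:     angle of incidence of the beam on the array plane
    """
    tilt = math.radians(tilt_deg)
    aoi = math.radians(aoi_deg)
    beam = dni * max(math.cos(aoi), 0.0)              # direct beam component
    sky_diffuse = dhi * (1 + math.cos(tilt)) / 2      # isotropic sky diffuse
    ground = ghi * albedo * (1 - math.cos(tilt)) / 2  # ground-reflected
    return beam + sky_diffuse + ground

# Hypothetical clear-sky-like inputs for a 30-degree fixed-tilt array
print(round(poa_isotropic(dni=800, dhi=100, ghi=600, tilt_deg=30, aoi_deg=20), 1))
```

Swapping the sky-diffuse term for an anisotropic model, or changing the albedo assumption, shifts the POA result and everything downstream of it, which is why the study traces inter-software differences back to these individual modeling choices.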
“Our findings underscore the need for continuous, independent, and rigorous validation of modeling methods, comparing software tools against complex, real-world systems,” Theristis said. “Ultimately, the ‘right’ tool depends on project complexity, workflow, and the surrounding software ecosystem.”
