We've partnered with the CSIRO Energy Centre in Newcastle, NSW, a world-class solar research facility, and currently have 11 panels from leading brands on test. This includes initial lab testing of the new panels, then testing in CSIRO's outdoor facility where their 'real world' performance is closely monitored. We report the results after three months and again at 12 months.
CHOICE maintains a highly professional NATA-accredited laboratory and the vast majority of our product testing is done in-house. However, some products, including solar panels, demand particular expertise and equipment that we don't have, so in these cases we engage an accredited external lab to do the testing according to our requirements.
For this test, we've partnered with the CSIRO PV Performance Laboratory in Newcastle, NSW. It was the first facility in the southern hemisphere to be accredited to the international quality standard for measurements on solar photovoltaics (solar cells and panels). Principal research scientist Dr Chris Fell believes the facility will play an important role in the development of Australian solar technologies and in the nation's transition to a clean, renewable energy supply.
With so many to choose from, what makes us choose one solar panel to test over another? As with most of our product testing, our aim is to test the most popular brands and types on the market and what you are most likely to see at retailers.
We contact manufacturers and suppliers to find out about their range of products, and where possible we check market sales information, to make sure we test models that are typical of what consumers are actually getting. From this information we put together a final list for buying. Usually CHOICE then buys the products through retail channels, but for this project, the CSIRO managed the purchasing on our behalf, to streamline the process and get the products sent directly to them. Solar panels aren't typically bought one or two at a time so our usual purchasing methods weren't appropriate for this project.
In our first solar panel test from 2015 to 2017, we tested 15 panels. For our current test we have 11 panels, but have included two samples of each panel. This helps make sure any individual sample variations are detected.
Out of interest, we've also included two samples of a new type, the Yingli YL285CG2530F-1 bifacial panel. We're not including it in the review yet because industry standard measurement methods for this type have not yet been agreed on. The results for the first year are encouraging and indicate that when used in a well-designed installation, bifacial modules can indeed deliver more power than an equivalent single-faced panel.
Indoor lab test
The first phase of the test involves measurements in the laboratory, including electroluminescence tests to identify any cracks or flaws in the panel that are invisible to the naked eye.
A visual inspection is also performed: tiny imperfections were found on some panels but these don't appear to affect those panels' power output, at least so far.
This phase also includes flash testing, which determines the power output of the panel when brand new. It's important that a panel really does produce at least its minimum claimed power output under standard test conditions. Allowing a margin of error, all the panels are within specification at this stage, though a few look like they might be borderline. The table shows each panel's claimed power tolerance – this is the potential difference between the claimed power rating and actual performance (under standard conditions).
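As a rough illustration of the spec check described above, the sketch below tests whether a flash-test reading falls inside a panel's claimed power tolerance band. The panel ratings and readings are hypothetical examples, not our test results.

```python
def within_spec(claimed_w: float, tolerance_pct: tuple, measured_w: float) -> bool:
    """Return True if the measured flash-test output falls inside the
    claimed power tolerance band (given as min%, max% of the rating)."""
    lo = claimed_w * (1 + tolerance_pct[0] / 100)
    hi = claimed_w * (1 + tolerance_pct[1] / 100)
    return lo <= measured_w <= hi

# A 300W panel with a 0/+5% tolerance must deliver 300-315W when new.
print(within_spec(300, (0, 5), 303.2))  # True: inside the band
print(within_spec(300, (0, 5), 297.5))  # False: below the minimum claim
```

A panel with a tolerance of -3/+3% could legitimately deliver slightly less than its nameplate rating, which is why the claimed tolerance matters when reading the table.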
We're testing two samples of each panel, and averaging the results for their performance. This helps detect any variations between samples (which are usually minimal, but we do see some small differences in some cases).
After the initial lab tests, the panels were mounted on racks in CSIRO's outdoor facility, where they're open to the sun and weather just as they would be on a typical rooftop. Outdoor testing tells us how the panels perform in real-world conditions.
Key data including the panels' current and voltage plus sunlight and weather conditions are recorded every 10 minutes. This data helps the lab analyse how the panel would perform in other locations with different weather and sunlight conditions.
We report on their outdoor performance after three months (April-June 2018) and again after a full 12 months.
Our tests show that in the field (or on your roof) you can usually expect a lower power yield than the nominal power rating on the label or what we measure in the lab. The field values differ from the lab values because in the field the panels are subject to elevated temperatures, clouds and changing sun angles, and each panel responds differently to these variations. That's why our outdoor test results are lower than the panels' nominal output: on average, the panels' outputs after 12 months are 11% below their initial output measured in the lab. To allow for the variations in outdoor conditions, our field measurements for each panel have been combined to give its average power output under a standard amount of sunlight, 1000 watts (W) per square metre. This allows a fairer comparison to the lab test results.
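In its simplest form, normalising a field reading to the standard 1000W per square metre looks like the sketch below. The linear scaling is a simplifying assumption for illustration; the lab's actual analysis uses each panel's measured irradiance response.

```python
def normalised_power(measured_w: float, irradiance_w_m2: float,
                     standard_w_m2: float = 1000.0) -> float:
    """Scale a field power reading to standard test irradiance,
    assuming output is roughly proportional to incident sunlight."""
    return measured_w * standard_w_m2 / irradiance_w_m2

# A reading of 180W under 750W/m2 of sunlight scales to 240W.
print(normalised_power(180, 750))
```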
We have calculated a performance score for each panel, based on the actual power yield you could expect from a theoretical 1kW array of each model, using our measurements after three months outdoors. It's theoretical because while you can use 4 x 250W panels to make a 1kW array, you'd need 3.33 x 300W panels or 3.06 x 327W panels, which clearly isn't possible. But it would be unfair to compare a 250W panel directly with a 300W panel on energy yield alone, so we calculate on the basis of a 1kW array.
The score is scaled so that the closer a panel's theoretical 1kW array gets to the maximum possible 1000W output, the higher the score.
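The scoring idea above can be sketched as follows. The figures here are invented for illustration, not our actual results, and the exact scaling CHOICE uses may differ.

```python
def performance_score(rated_w: float, avg_field_w: float) -> float:
    """Score a panel as the average normalised output of a theoretical
    1kW array, expressed as a percentage of the 1000W maximum."""
    n_panels = 1000.0 / rated_w            # e.g. 3.33 x 300W panels
    array_output_w = n_panels * avg_field_w
    return array_output_w / 1000.0 * 100

# A 300W panel averaging 267W in the field scores 89%.
print(round(performance_score(300, 267), 1))
```

Dividing by the rated power first is what makes a 250W panel and a 327W panel directly comparable.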
As you'll see in the solar panel review there are differences, with some models performing better than the rest. However they are all performing well and within the levels we would expect based on their specifications.
For more technical details behind our testing and assessment methodology, see here.
The CSIRO lab's sophisticated measurements of the panels and of the solar conditions in the test area allow them to determine several key factors relating to the test conditions and the performance of each model. For the technically minded, these are primarily:
- the solar irradiance, or the amount of sunlight actually falling on the panels at any given time
- temperature coefficient (how the panel's output depends on its temperature)
- thermal coefficient (what temperature the panel runs at for a given irradiance and ambient temperature)
- irradiance response (the relationship between output power and the incident irradiance)
This information, together with meteorological data for various locations around Australia, allows them to calculate how each panel would perform in those locations, assuming optimum installation (i.e. north facing and at the correct angle).
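A greatly simplified version of such a calculation is sketched below, using only the temperature coefficient and linear irradiance scaling. The coefficient and conditions are illustrative values typical of crystalline panels, not CSIRO's model, which also uses each panel's measured thermal coefficient and full irradiance response.

```python
def output_at_conditions(rated_w: float, irradiance_w_m2: float,
                         cell_temp_c: float,
                         temp_coeff_pct_per_c: float = -0.4) -> float:
    """Estimate panel output from irradiance (assumed linear) and cell
    temperature (typical crystalline panels lose about 0.4% of power
    per degree C above the 25C standard test temperature)."""
    power = rated_w * irradiance_w_m2 / 1000.0
    power *= 1 + temp_coeff_pct_per_c / 100 * (cell_temp_c - 25)
    return power

# A 300W panel under 800W/m2 of sun with cells at 55C:
print(round(output_at_conditions(300, 800, 55), 1))  # 211.2
```

Feeding a year of hourly irradiance and temperature data for, say, Hobart or Darwin through a model like this (but with the measured coefficients) is what lets the lab compare locations.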
Interestingly, the test showed that the relative performance of each panel is generally similar regardless of the location. While you would get a lower power output for a given panel in Hobart than the same panel in Darwin, due to Hobart's lower levels of solar irradiance, the best performers in our test will still be the best in either location, and likewise for the lower-ranked models.
Solar panel testing requires a very specific laboratory, as described above in How we test. While CHOICE does have high-quality NATA-accredited laboratories, we don't have the necessary equipment or skills to test solar panels to the Australian standard. So instead, for this test, we've partnered with the CSIRO as explained above.
PHOTO: CSIRO Energy Centre