When the Grid Fails: How Digital MRV Monitors Solar PV Performance, Community Electricity Access, and SDG 7 Clean Energy Outcomes in South Africa
dMRV AI Carbon Credits
Project Type: Renewable Energy | Solar PV | Energy Industries (Renewable Sources)
Location: Gauteng, South Africa
Methodology: AMS-I.F. (Renewable Electricity Generation for Captive Use and Mini-Grid)

South Africa's Electricity Crisis and Why Solar PV Carbon Projects Matter
​
To understand why a small-scale solar PV grouped project in Gauteng matters for the global carbon market, you need to understand what has been happening to electricity supply in South Africa.
​
Eskom, the national utility that supplies the overwhelming majority of South Africa's electricity, has been implementing load shedding — structured, rotating power cuts — since 2007. The system was designed as a temporary emergency measure to manage demand when generating capacity is insufficient. It was never supposed to become a permanent feature of South African life. By 2022, it had become exactly that. In 2022, both the frequency and duration of power cuts reached the highest levels since load shedding began — at its worst, households and businesses were facing cuts of eight to twelve hours per day.
​
The cause is infrastructure. South Africa's coal-fired generating fleet is ageing. Plant breakdowns are frequent. Maintenance backlogs have accumulated over years of underinvestment. New generating capacity has not been built fast enough to compensate for the declining reliability of existing plants. The result is a structural electricity deficit that affects every household, business, school, clinic, and community facility in the country — but hits low-income communities and small businesses hardest, because they have the least ability to absorb the cost of backup systems or the economic disruption of unpredictable power cuts.
​
Into this context, small-scale solar PV installations offer something that money alone cannot easily buy: reliable electricity when the grid goes down. A household or business with a functioning solar PV system and battery storage can maintain lighting, refrigeration, phone charging, and basic appliances through a load shedding event. For a spaza shop that loses stock when refrigeration fails, or a small business that loses productive hours every time the power cuts, reliable solar electricity is a direct livelihood benefit.
​
The carbon project described in this case study provides new, small-scale solar PV installations across Gauteng under the VCS AMS-I.F. methodology. It generates verified carbon credits by demonstrating that the solar installations are displacing grid electricity that would otherwise have been generated primarily from coal. But the project's value extends beyond the carbon calculation — and the monitoring framework needs to capture all of it.

Project Brief: Small-Scale Grouped Solar PV in Gauteng
​
The project commissions new small-scale solar PV installations across the Gauteng province of South Africa. Each installation provides a clean, renewable source of electricity for the household or business it serves. The grouped project structure means that multiple individual installations are monitored and verified together under a single VCS project framework — which is the standard approach for small-scale solar projects where individual installations are too small to justify standalone VCS registration.
​
The methodology is AMS-I.F. — Renewable Electricity Generation for Captive Use and Mini-Grid — which is the appropriate VCS methodology for solar PV installations that generate electricity primarily for the use of the site where they're installed, rather than exporting to a grid. The carbon calculation is based on the difference between the clean solar electricity generated and the emissions that would have been produced by the South African national grid to provide the same electricity — a grid that, given Eskom's coal-heavy generation mix, has a relatively high emissions factor per kWh.
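In arithmetic terms, the baseline calculation reduces to generation multiplied by a grid emission factor. A minimal Python sketch follows — the 0.95 tCO2e/MWh factor is an illustrative placeholder, not the official South African grid figure, which real projects must take from the published value for the applicable monitoring period:

```python
# Illustrative AMS-I.F.-style baseline emission calculation.
# The grid emission factor below is an assumed placeholder, NOT an
# official figure for the South African grid.
GRID_EMISSION_FACTOR = 0.95  # tCO2e per MWh (assumed placeholder)

def emission_reductions_tco2e(generation_kwh: float) -> float:
    """Baseline grid emissions displaced by on-site solar generation."""
    generation_mwh = generation_kwh / 1000.0
    return generation_mwh * GRID_EMISSION_FACTOR

# A grouped project generating 8,000 MWh/year at this placeholder factor
# would claim roughly 7,600 tCO2e -- the same order of magnitude as the
# 7,662 credits estimated for this project.
print(round(emission_reductions_tco2e(8_000_000)))
```

The real methodology adds monitoring and eligibility conditions on top of this core multiplication, but the sketch shows why the grid's coal-heavy mix drives the credit volume.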
​
At an estimated 7,662 VCS credits per year, this is a modest project by volume. But small-scale solar projects are exactly the type that the carbon market needs to work better for — community-scale interventions with direct development benefits that often struggle to justify the overhead of traditional MRV approaches. Digital community-reported monitoring changes that equation substantially, making rigorous ongoing verification economically viable for projects at this scale.
​
Why Solar PV dMRV Needs Three Monitoring Layers
​
A VCS AMS-I.F. solar project has three distinct monitoring requirements that correspond to three distinct types of evidence.
​
The first is performance data. The carbon calculation depends on knowing how much electricity the solar installations are actually generating. Monthly kWh readings, compared to expected performance, establish whether the systems are delivering the emission reductions claimed. System downtime, panel condition, and maintenance records all feed into the permanence argument — a poorly maintained system that degrades rapidly is not delivering the long-term emission reductions that the carbon credit represents.
​
The second is community impact data. AMS-I.F. projects that are claiming co-benefits — improved energy access, reduced energy costs, income generation impacts — need community-reported evidence to support those claims. How many hours per day do beneficiary households and businesses actually have electricity? How often do they experience outages? Has the solar installation helped with income generation? These questions can only be answered by the people living and working with the installations.
​
The third is SDG impact data. Increasingly, carbon buyers and impact investors want projects to document their SDG contributions with specific, auditable evidence rather than general assertions. SDG 7 — Affordable and Clean Energy — is the most directly relevant SDG for a solar electricity project. Documenting community electricity access status, supply reliability, affordability perceptions, payment methods, and whether access has changed since the project maps directly to SDG 7 indicator 7.1.1 (proportion of population with access to electricity) and related sub-indicators.
​
The three CitizenClimate surveys described below address each layer in sequence.

The Community Electricity User Survey: What Households and Businesses Actually Experience
​
The Community Electricity User Survey is the community impact layer — the set of questions that establishes what electricity access actually means for the households and businesses receiving solar installations, in their own words and from their own experience.
​
Survey configuration: GPS location enabled. Orange colour coding with Home icon — a deliberate visual choice that immediately communicates this survey is about homes and households rather than technical systems. Continuous survey type, allowing ongoing monitoring of electricity experience as conditions change. 10 reward points per submission. Currency reward of 5 units per submission — providing direct incentive for consistent reporting from community members. Timeframe September to December 2025.
​
Badge progression: First Voice (5 surveys), Regular Reporter (20), Community Advocate (40), Usage Expert (60), Technical Helper (100). The progression from First Voice — your initial participation matters — through to Technical Helper, which acknowledges that regular reporters develop genuine expertise in documenting their electricity experience, is thoughtfully designed for a community that is being asked to report on a technical service over time.
​
Survey Field 1 — Household/Business Location (Text, Required): A required text field for the location of the household or business being surveyed. Combined with the GPS location attached to every submission, this creates a precise spatial record of which installations are being monitored and where. For a grouped project spread across Gauteng, the location record is essential for mapping coverage and identifying areas where monitoring is sparse or where issues are concentrated.
​
Survey Field 2 — Number of people in household (Value, optional): An optional numeric field for household size. Optional rather than required — not every submission needs this context, and making it optional removes a potential barrier to submission for community members who are uncertain or prefer not to share this detail. When provided, household size contextualises the electricity access experience — a solar installation serving eight people has a different impact than one serving two.
​
Survey Field 3 — How many hours per day do you have electricity? (Value, Required): A required numeric field. Hours of electricity per day is one of the most direct measures of energy access quality. For communities dealing with load shedding, this number fluctuates significantly — a household might have 16 hours of electricity on a good day and 8 on a bad one. Tracking this over time across multiple reporting households creates a picture of actual electricity availability that is far more granular than what any official data source provides.
​
This field also creates a direct baseline comparison. A household that reports 8 hours per day before solar installation and 20 hours per day after — because the solar system bridges the load shedding gaps — is documenting a transformation in energy access that the carbon calculation alone doesn't capture.
​
Survey Field 4 — What times of day do you typically have power? (Text, Required): A required open text field. Time-of-day availability matters enormously for how useful electricity actually is. Electricity only during daylight hours when solar generation is active might not cover evening cooking or lighting. Electricity that covers morning and evening peaks but cuts out during the day may be adequate for some uses but not others. This open text field captures the texture of electricity access that structured multiple-choice options would miss.
​
Survey Field 5 — How often do you experience power outages? (Multiple Choice): Five frequency options: Never, Rarely (1-2 times per month), Sometimes (3-5 times per month), Often (more than 5 times per month), Daily.
​
For a project operating in the South African load shedding context, this field is particularly revealing. A community member reporting Daily outages before solar installation, and Rarely after, is documenting exactly the energy security improvement that the project is designed to deliver. Tracking this across all reporting households creates a population-level picture of outage frequency reduction that is directly relevant to the project's energy security co-benefit claims.
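As a sketch of how that population-level picture might be assembled, here is a minimal Python aggregation over Field 5 responses. The sample data and function name are invented for illustration:

```python
from collections import Counter

# Hypothetical sketch: aggregating Field 5 ("How often do you experience
# power outages?") responses into a population-level outage profile.
# The sample responses below are invented for illustration.
before = ["Daily", "Daily", "Often", "Sometimes", "Daily"]
after = ["Rarely", "Never", "Rarely", "Sometimes", "Never"]

def outage_profile(responses):
    """Share of reporting households selecting each outage frequency."""
    counts = Counter(responses)
    total = len(responses)
    return {option: counts[option] / total for option in counts}

print(outage_profile(before))  # e.g. Daily dominating pre-installation
print(outage_profile(after))   # shifting towards Rarely/Never
```

Comparing the two profiles over time is the shape of the outage-frequency-reduction evidence described above.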
​
Survey Field 6 — When outages occur, how long do they typically last? (Text, optional): An optional open text field. Duration of outages, combined with frequency, gives the full picture of electricity disruption. Short frequent outages have different impacts than long infrequent ones. For a solar system with battery storage, outage duration relative to battery capacity determines whether the system actually bridges the gap — an outage longer than the battery can sustain will still affect the community member. This field captures that nuance.
​
Survey Field 7 — What do you use electricity for? (Text, optional): An optional open text field. End-use documentation is valuable for two reasons. First, it contextualises the impact — electricity used for refrigeration has different livelihood implications than electricity used for phone charging. Second, it provides evidence for the types of productive use that the project is enabling, which supports co-benefit claims around economic activity and livelihood improvement.
​
Survey Field 8 — How would you rate the reliability of your electricity supply? (Multiple Choice): The options begin with Excellent — always available when needed — and Good — mostly reliable. This structured reliability rating creates a comparable, trackable metric across all reporting households — a community-generated reliability index for the project's electricity supply, updated continuously as submissions come in.
​
Survey Field 9 — Has access to electricity helped with income-generating activities? (Yes/No, optional): An optional binary question. This is the livelihood impact question — the direct link between electricity access and economic activity. For a spaza shop owner who can now keep a fridge running, or a seamstress who can now use an electric sewing machine in the evening, the answer is yes — and that yes is a piece of co-benefit evidence that the carbon calculation never captures. Optional status protects community members who prefer not to share this information, whilst still generating a meaningful evidence base from those who do.
​
Survey Field 10 — What is your biggest challenge with the current electricity situation? (Text, optional): An open text field for the most pressing electricity challenge. This is the qualitative intelligence layer — the things that don't fit any structured response option. Answers might include the cost of electricity, the unpredictability of load shedding schedules, appliance damage from voltage fluctuations, or concerns about the reliability of the solar installation itself. This field captures issues that project managers need to know about but wouldn't think to ask directly.
​
Survey Field 11 — Additional comments or suggestions (Text, optional): A final open feedback field. For a community-led project, this is the space where communities can speak directly to the project — raise concerns, suggest improvements, acknowledge benefits, or flag problems that don't fit anywhere else in the survey. It is the formal channel for community voice in the ongoing monitoring process.
​
The Solar PV Performance and Maintenance Survey: Monthly Generation and System Health
​
The Solar PV Performance and Maintenance Survey is the technical carbon verification instrument. It collects the quantitative performance data and condition assessments that feed directly into the AMS-I.F. emission reduction calculation and provide the evidence base for permanence arguments.
​
Survey configuration: GPS location enabled. Blue colour coding with Star icon — visually distinct from the orange household survey, signalling a technical rather than community context. Continuous survey type. 15 reward points per submission — slightly higher than the community user survey, reflecting the greater technical complexity and time required. Currency reward of 5 units per submission. Timeframe September to December 2025.
​
Badge progression: First Inspector (5), Equipment Guardian (20), Power Tracker (40), Problem Solver (60), Safety First (100). The badge names map directly to the role being performed — from initial inspector to safety-conscious expert. Problem Solver acknowledges that effective monitoring isn't just passive data collection; it involves identifying issues and supporting resolution. Safety First at the 100-survey level recognises that sustained, expert engagement with electrical equipment requires safety consciousness.
​
Survey Field 1 — Current electricity generation reading (kWh) (Value, Required): A required numeric field for the current meter reading in kWh. This is the primary carbon verification data point. The cumulative kWh generation reading is the direct input to the AMS-I.F. emission reduction calculation — every unit of solar electricity generated displaces a unit that would otherwise have come from the grid. Required status ensures no inspection record is submitted without the fundamental generation measurement.
​
Survey Field 2 — Previous month's generation (kWh) (Value, Required): A required numeric field for last month's reading. The combination of current and previous month readings creates the month-on-month generation delta — the actual kWh generated in the period. This is more robust than relying on a single cumulative reading because it catches data entry errors, identifies months where generation dropped unexpectedly, and creates a time-series record of system performance that verifiers can examine for trends and anomalies.
​
Survey Field 3 — Expected vs. actual performance (Multiple Choice): The options begin with Above expected (>105%) and Meeting expectations (95-105%). This performance classification field contextualises the raw kWh numbers — a generation reading means more when you know whether it represents over-performance, expected performance, or underperformance relative to the system's design specification. For a VCS verifier reviewing generation records, consistent underperformance flags are an important audit signal that prompts deeper investigation.
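The arithmetic behind Fields 1–3 can be sketched in a few lines. The function names, the negative-delta check, the below-95% label, and the expected-generation figure are illustrative assumptions, not part of the survey specification:

```python
# Sketch of the checks a platform or verifier might run on Fields 1-3:
# a month-on-month delta with a data-entry sanity check, plus an
# expected-vs-actual classification using the survey's 95%/105% bands.

def monthly_delta(current_kwh: float, previous_kwh: float) -> float:
    """kWh generated this month; a negative delta flags a data-entry error."""
    delta = current_kwh - previous_kwh
    if delta < 0:
        raise ValueError("current reading below previous -- check entry")
    return delta

def classify_performance(actual_kwh: float, expected_kwh: float) -> str:
    """Map actual generation against design expectation (assumed bands)."""
    ratio = actual_kwh / expected_kwh
    if ratio > 1.05:
        return "Above expected (>105%)"
    if ratio >= 0.95:
        return "Meeting expectations (95-105%)"
    return "Below expected (<95%)"

delta = monthly_delta(current_kwh=12_450, previous_kwh=11_800)  # 650 kWh
print(classify_performance(delta, expected_kwh=700))
```

A cumulative reading that falls below the previous month's is physically impossible for a generation meter, which is why the delta check is a cheap but effective anomaly filter.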
​
Survey Field 4 — System downtime this month (hours) (Value, Required): A required numeric field for hours of system downtime in the current month. Downtime directly reduces the emission reductions delivered — a system that was offline for 72 hours in a month generated fewer kWh than its design specification assumes. Required status ensures downtime is always recorded, including zero — distinguishing a month with no downtime from a month where downtime wasn't measured.
​
Survey Field 5 — Reason for any downtime (Multiple Choice, custom input allowed): Five options: No downtime, Weather-related, Equipment malfunction, Grid connection issues, Maintenance activities.
​
The downtime cause matters as much as the duration for permanence assessment. Weather-related downtime is expected and doesn't reflect on system quality. Equipment malfunction is a flag for maintenance intervention. Grid connection issues in a South African context may relate to load shedding interactions with the solar system. Maintenance activities represent planned downtime that should be documented separately from unplanned failures. Custom input is allowed because not every downtime cause fits the preset categories.
​
Survey Field 6 — Solar panel condition (Multiple Choice, Required): Four options: Excellent — clean, no damage; Good — minor dust/debris; Fair — noticeable soiling; Poor — damaged panels observed.
​
Panel condition affects generation performance directly. Dust and soiling on solar panels reduce their output — a panel with significant soiling may generate 10-15% less electricity than a clean panel under the same irradiance. Regular condition monitoring identifies panels that need cleaning, which is a simple maintenance action with a direct positive effect on generation. Poor — damaged panels observed is the critical flag: a physically damaged panel may be generating significantly below specification, and the damage may worsen over time if not addressed. Required status ensures panel condition is always documented.
​
Survey Field 7 — Any repairs or replacements made. If yes, describe (Text, optional): An optional open text field for repair and maintenance records. When repairs are made, this field creates the maintenance log that demonstrates active system stewardship — that the project is not just installing panels and leaving them to degrade, but actively maintaining the assets whose long-term performance underpins the carbon credit permanence argument.
​
Survey Field 8 — Photos taken during inspection (Photo/Video, Low quality 640x480, optional): GPS-tagged photographic evidence of the installation at time of inspection. Low-quality compression (640x480 at 50%) is optimised for South African mobile data costs and connectivity constraints, which are a real barrier to data submission in lower-income communities. A GPS-tagged photo of a panel in Good condition, dated and located, is a powerful verification asset — it ties the condition assessment to a specific installation at a specific point in time. Optional status removes a barrier for inspectors in situations where photography isn't practical, while still encouraging visual documentation where possible.
​
The SDG 7 Survey: Documenting Affordable and Clean Energy Access
​
The SDG 7: Affordable and Clean Energy Survey is the impact attribution layer — the set of questions that maps community electricity experience directly to the UN SDG 7 framework and produces evidence that the project is contributing to official energy access targets, not just generating carbon credits.
​
Survey configuration: GPS location enabled. Yellow colour coding with Star icon — visually distinct from both the orange household survey and the blue technical survey, and appropriately energetic for the clean energy theme. Continuous survey type. 20 reward points — the highest point value in the three-survey package, reflecting the importance and complexity of this survey. No currency reward, consistent with the principle that SDG impact data should be reported honestly without heavy financial incentivisation distorting responses.
​
Survey Field 1 — Does your household currently have access to electricity? (Multiple Choice, Required): Three options: Yes, No, Not sure. This is the baseline SDG 7.1.1 indicator question — the most fundamental measure of energy access. Required status ensures every SDG 7 survey submission includes an electricity access status record. The Not sure option captures genuine uncertainty — a household that has inconsistent access due to load shedding may legitimately be unsure how to answer a binary yes/no question.
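As a minimal sketch of how indicator 7.1.1 might be estimated from these submissions — the sample responses and the choice to exclude Not sure answers are illustrative assumptions, and a real calculation would weight by household size:

```python
# Illustrative sketch: estimating SDG indicator 7.1.1 (proportion of
# population with access to electricity) from Field 1 submissions.
# Sample data is invented; "Not sure" responses are excluded here,
# which is an assumption rather than an official indicator rule.
submissions = ["Yes", "Yes", "No", "Yes", "Not sure", "Yes"]

def access_proportion(responses):
    """Share answering Yes among those giving a definite answer."""
    definite = [r for r in responses if r in ("Yes", "No")]
    return sum(r == "Yes" for r in definite) / len(definite)

print(f"{access_proportion(submissions):.0%}")  # 4 of 5 definite answers
```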
​
Survey Field 2 — If yes, would you describe your electricity supply as reliable? (Multiple Choice, Required): Four options: Yes, always reliable; Sometimes reliable; Not reliable at all; Not applicable (no electricity access).
​
This reliability question goes beyond the binary access measure to capture quality of access — which is the dimension that matters most in the South African load shedding context. A household that technically has electricity access but experiences it as unreliable is not experiencing the energy security that SDG 7 intends. The Not applicable option ensures the question doesn't create a nonsensical requirement for households that answered No to Field 1. Required status means every submission includes a reliability assessment.
​
Survey Field 3 — How do you mainly pay for your electricity? (Multiple Choice, custom input, Required): Four options: Pre-paid system/tokens, Monthly bills, Other (please specify), Not applicable (no electricity access).
​
Payment method is an important dimension of energy affordability and access. Pre-paid token systems — which are common in South African low-income communities — have different affordability dynamics than monthly bills. They can act as a de facto rationing mechanism: households buy tokens when they can afford them and go without electricity when they can't. Monthly bills create a different financial burden but provide more predictable access. Custom input captures payment arrangements that don't fit the preset categories.
​
Survey Field 4 — Do you consider the cost of electricity to be: (Multiple Choice, Required): Four options: Affordable, Somewhat affordable, Unaffordable, Not applicable (no electricity access).
​
SDG 7 specifically includes affordability as a dimension of clean energy access — "affordable and clean energy" is the full name of the goal. This field directly addresses the affordability component. For a solar project that may be reducing electricity costs by displacing expensive grid electricity or eliminating the need to purchase paraffin or candles for lighting, community-reported affordability assessments track whether those cost reductions are being felt by beneficiaries.
​
Survey Field 5 — Has your household's access to electricity changed since the project? (Multiple Choice, Required): Five options: Improved, Worsened, Stayed the same, Not applicable (did not have electricity before or after), Not sure.
​
This is the direct impact attribution question for the SDG 7 survey — the equivalent of the conservation project impact question in the SDG 1 poverty survey. Has the project actually changed electricity access for this household?
​
The inclusion of Worsened as a response option is methodologically important. A solar installation that is underperforming, badly maintained, or poorly matched to household needs could theoretically leave a household worse off than before — if, for instance, it replaced a functioning grid connection with an unreliable solar system, or if the cost of the installation created financial strain. The willingness to capture and report that answer is what makes positive responses credible to sophisticated carbon buyers and SDG validators. Not sure acknowledges that some households genuinely can't yet assess whether their situation has changed.

What the Three-Survey Package Looks Like for AMS-I.F. Verifiers
​
A VCS verification body reviewing this project's monitoring data has access to a substantially more complete evidence package than standard small-scale solar project MRV typically provides.
​
The performance layer includes monthly kWh generation readings from each installation, with current and previous month figures creating a verifiable time-series record of actual generation. Expected versus actual performance classifications flag underperforming systems for follow-up. System downtime hours and causes are recorded per month. Panel condition assessments document physical system health over time. GPS-tagged inspection photos tie all of this to specific installations at specific points in time.
​
The community impact layer includes household and business electricity access reports documenting hours of electricity per day, outage frequency, reliability ratings, and income generation impacts — a continuous, GPS-tagged record of what electricity access actually means to the people receiving it. This is the evidence that makes the co-benefit claims in the project documentation real rather than assumed.
​
The SDG impact layer provides structured, comparable data on electricity access status, supply reliability, payment methods, affordability perceptions, and project-attributed access changes — mapped to SDG 7 indicators and updated continuously as community members submit surveys. For carbon buyers who are asking "does this project actually contribute to SDG 7, or does it just claim to?" — this is the answer.

Using This Model for Your Solar PV Carbon Project
The three-survey approach developed for this project is directly replicable for other AMS-I.F. grouped solar projects, community solar initiatives, and small-scale renewable energy carbon projects across Sub-Saharan Africa and beyond.
​
The generation monitoring fields — current kWh, previous month kWh, expected vs actual, downtime — work for any solar PV installation regardless of system size or geography. The downtime cause categories adapt to local contexts. The panel condition assessment works universally.
​
The community electricity survey adapts to different energy access contexts. In an off-grid rural setting, the outage frequency question and load shedding context would be replaced by questions about generator use, battery storage availability, and hours of solar generation. In a mini-grid context, the reliability questions take on different dimensions. The core structure — location, household size, hours of electricity, reliability, income generation impact, challenges — translates across contexts.
​
The SDG 7 survey is directly transferable to any clean energy project anywhere. SDG 7.1.1 is a universal indicator. The electricity access, reliability, affordability, and change-since-project questions apply whether the project is in Gauteng, rural Bihar, or off-grid Senegal.
​
For project developers working with AMS-I.F. or similar methodologies at the small-scale grouped level, where traditional MRV overhead can consume a disproportionate share of project economics, this three-survey community-reported approach makes rigorous ongoing monitoring viable at a cost that small-scale projects can actually sustain.

Your Project Could Work Like This
If you're working on a climate or environmental project that needs verified community data, you're probably facing similar questions to the ones in this case study.
​
How do you prove your project is working beyond just the technical metrics? What data do your funders need for carbon credits or ESG reporting? How do you catch problems on the ground before they undermine your results? Most importantly — how do you ensure the people affected by your project actually understand and benefit from it?
​
The difference between projects that succeed and ones that struggle often comes down to whether you're measuring the right things. Carbon calculations tell you about emissions. Community feedback tells you whether the intervention is actually working in practice. Education ensures that feedback is informed, not just reactive.
​
We've built the survey systems, education modules, and geotagged monitoring tools that made this project work. The same approach adapts to your context—different activities, different locations, different communities, different objectives.
​
What you get:
- Custom education modules that teach participants about what they're monitoring and why it matters
- Multilingual surveys designed for offline use in areas with limited connectivity
- GPS-tagged responses that show location-specific patterns and problems
- Anonymous feedback systems that protect privacy whilst collecting honest data
- Verified data packages that meet carbon credit, MRV, and ESG reporting requirements
- Operational insights that help you fix problems before they become failures
​
What your project needs:
- A climate, environmental, or development initiative (planning stage or already operating)
- Community members whose participation and feedback would strengthen your project
- Funders or stakeholders who want proof of impact alongside technical metrics
​
The platform works whether you're monitoring 10 hectares or 10,000, whether you're in a remote village or an urban centre, whether your participants speak Spanish, English, French, Hindi, Indonesian, or Ukrainian.
​
Get Started
If you're working on a project that needs more than just technical data—where community engagement and verified feedback actually matter—let's talk about how this approach could work for you.
​
Or if you're not sure whether this approach fits your situation, send us a quick message describing what you're trying to achieve. We'll tell you honestly whether education-based community monitoring makes sense for your context.
​
Email us: nick@citizenclimate.net