Overview
ClimateVision currently produces segmentation masks with no explainability. NGOs and government agencies cannot trust black-box predictions for high-stakes environmental decisions. We need pixel-level attribution showing which spectral bands and spatial regions drove each classification.
Scope
- src/climatevision/inference/pipeline.py
- src/climatevision/governance/explainability.py with: explain_prediction(model, image, analysis_type) → returns SHAP values + heatmap
- GET /api/explain/{run_id} endpoint returning SHAP values as GeoJSON/PNG
- notebooks/06_explainability.ipynb with visual examples
Acceptance Criteria
- Every prediction can return a SHAP heatmap overlay
- Band attribution is returned in API response
- Notebook demonstrates explainability across all 3 analysis types
- Tests pass in CI
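A minimal sketch of the band-attribution idea, under stated assumptions: the function name `band_attribution`, the occlusion strategy, and the toy `toy_predict` model are illustrative only and are not the repo's code. A real implementation would wrap the U-Net forward pass and could use SHAP proper (e.g. `shap.GradientExplainer`) instead of the occlusion loop shown here.

```python
import numpy as np

def band_attribution(predict_fn, image, baseline=0.0):
    """Occlusion-based band attribution: the score drop when each band
    is replaced by a baseline value. `predict_fn` maps an array of shape
    (bands, H, W) to a scalar confidence score."""
    base_score = predict_fn(image)
    attributions = np.zeros(image.shape[0])
    for b in range(image.shape[0]):
        occluded = image.copy()
        occluded[b] = baseline  # knock out one band
        attributions[b] = base_score - predict_fn(occluded)
    return attributions

# Toy stand-in for the model: mean activation weighted toward band 0.
def toy_predict(img):
    weights = np.array([0.7, 0.2, 0.1])[: img.shape[0]]
    return float((weights[:, None, None] * img).mean())

img = np.ones((3, 4, 4))
attr = band_attribution(toy_predict, img)
```

With the toy model above, band 0 carries the largest weight, so it receives the largest attribution; the per-band scores are what the acceptance criterion "band attribution is returned in API response" would expose.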
Resources
- src/climatevision/models/unet.py — U-Net architecture
- src/climatevision/inference/pipeline.py — inference flow
- SHAP library docs: https://shap.readthedocs.io/
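For the endpoint work, a hedged sketch of what a JSON-serialisable response for the explain route could look like. The function name `build_explain_payload`, the field names, and the example values are assumptions for illustration; GeoJSON/PNG encoding of the heatmap is out of scope here, so raw values are returned instead.

```python
import json
import numpy as np

def build_explain_payload(run_id, band_names, attributions, heatmap):
    """Shape a JSON-serialisable response body for a hypothetical
    GET /api/explain/{run_id} endpoint: per-band attribution scores
    plus the spatial heatmap as nested lists."""
    return {
        "run_id": run_id,
        "band_attribution": dict(zip(band_names, map(float, attributions))),
        "heatmap": np.asarray(heatmap).round(4).tolist(),
    }

payload = build_explain_payload(
    "run-42", ["red", "nir"], np.array([0.6, 0.4]), np.zeros((2, 2))
)
```

Returning plain floats and nested lists keeps the payload framework-agnostic; whatever web layer the repo uses can serialise it directly with `json.dumps`.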
Difficulty: Intermediate
Owner: Linda Oraegbunam (@obielin)
Labels: good first issue, governance, explainability, backend