This repository contains the public denoising code used in the thesis work on Electronic Speckle Pattern Interferometry (ESPI), together with the curated V4/V5 result package used for the final thesis interpretation.
The repository focuses on the denoising stage of the broader workflow. It includes the main DnCNN-Lite variants with Efficient Channel Attention (ECA), lightweight plotting utilities for the final thesis figures, and canonical CSV result tables for downstream comparison, robustness, and latency analysis.
It should be read as the public denoising component of the thesis, with V3 retained for historical baseline context and V4/V5 retained as the final curated thesis evidence.
The full thesis spans three code components:
- Pseudo-noisy data generation for supervision and controlled ablations.
- DnCNN-ECA denoising, which is the scope of this repository.
- Classification and evaluation, maintained in a separate repository.
In practical terms, this repository corresponds to the denoising component plus the final V4/V5 thesis result package.
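The attention mechanism named throughout this README is Efficient Channel Attention (ECA). As a point of reference, the general ECA idea can be sketched in PyTorch as below; this is an illustrative sketch only, not the repository's exact implementation (the module name, kernel-size heuristic, and hyperparameters are assumptions):

```python
import math
import torch
import torch.nn as nn

class ECA(nn.Module):
    """Efficient Channel Attention sketch: global average pooling
    followed by a 1D convolution across the channel axis."""

    def __init__(self, channels: int, gamma: int = 2, b: int = 1):
        super().__init__()
        # Common ECA heuristic: derive an odd kernel size from the
        # channel count (assumed here, may differ in the actual scripts).
        t = int(abs((math.log2(channels) + b) / gamma))
        k = t if t % 2 else t + 1
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, C, H, W) -> per-channel descriptor (N, C)
        y = x.mean(dim=(2, 3))
        # 1D conv over channels: (N, 1, C) -> (N, 1, C)
        y = self.conv(y.unsqueeze(1))
        # Sigmoid gate reshaped to (N, C, 1, 1), broadcast over H and W
        w = torch.sigmoid(y).transpose(1, 2).unsqueeze(-1)
        return x * w
```

The appeal of ECA in a lightweight denoiser is that the channel gate adds only a single small 1D convolution, keeping parameter count and latency low.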
- Historical baseline script: espi_dncnn_lite_eca.py
- Fair-ablation and robustness-oriented V4 script: espi_dncnn_lite_eca_FULL_PATCH_v4.py
- Extended research-oriented V5 script: espi_dncnn_lite_eca_FULL_PATCH_v5.py
- Canonical thesis result tables in results/v4v5_final/
- Plotting scripts in scripts/
- Supporting notes, changelogs, and thesis mapping documents
| Purpose | File |
|---|---|
| Lightweight baseline / core DnCNN-Lite ECA script | espi_dncnn_lite_eca.py |
| Stable v4 comparison script with fair ECA vs no-ECA controls | espi_dncnn_lite_eca_FULL_PATCH_v4.py |
| Extended v5 research script with dual pooling and advanced ECA options | espi_dncnn_lite_eca_FULL_PATCH_v5.py |
| Downstream result figure generation | scripts/plot_downstream_v4v5.py |
| Robustness figure generation | scripts/plot_robustness.py |
| Latency figure generation | scripts/plot_latency.py |
The final thesis conclusions are tied to the curated package in results/v4v5_final/.
Key conclusions supported by that package include:
- The supervision regime matters more than architecture complexity alone.
- Models trained on pseudo-noisy synthetic supervision can hurt downstream classification, even when denoising metrics appear favorable.
- Models trained on real-aligned pairs improve downstream classification performance and support the final system-level thesis conclusion.
- The lightweight V4R ECA configuration is the best practical balance of downstream performance, robustness, and computational cost in the final thesis package.
- The more aggressive V5 design is preserved as a higher-cost exploratory extension rather than the definitive best model.
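The summary CSVs in results/v4v5_final/ can be inspected with pandas for this kind of comparison. A minimal sketch using a stand-in table (the column names, model names, and values below are placeholders for illustration, not the thesis results):

```python
import io
import pandas as pd

# Placeholder stand-in for a summary table such as
# results/v4v5_final/downstream_summary.csv; the real file's
# columns and values may differ.
csv_text = """model,accuracy,macro_f1
model_a,0.91,0.90
model_b,0.89,0.88
"""

df = pd.read_csv(io.StringIO(csv_text))
# Rank configurations by macro-F1 for a quick downstream comparison.
best = df.sort_values("macro_f1", ascending=False).iloc[0]["model"]
print(best)  # -> model_a
```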
.
|-- README.md
|-- REPRODUCE.md
|-- MODEL_CARD.md
|-- DNCNN_VERSIONS_COMPARISON_REPORT.md
|-- V4_CHANGELOG_AND_EXPECTED_IMPACT.md
|-- V5_CHANGELOG.md
|-- CITATION.cff
|-- requirements.txt
|-- docs/
| |-- REPOSITORY_SCOPE.md
| `-- THESIS_RESULTS_NOTES.md
|-- experiments/
| `-- manifests/
| `-- TEMPLATE_run_manifest.yaml
|-- results/
| `-- v4v5_final/
| |-- README_RESULTS.md
| |-- downstream_summary.csv
| |-- robustness_3seed_summary.csv
| |-- latency_params_summary.csv
| |-- plots_data_accuracy_macrof1.csv
| `-- plots_data_robustness.csv
`-- scripts/
|-- plot_downstream_v4v5.py
|-- plot_robustness.py
`-- plot_latency.py
Install dependencies with `pip install -r requirements.txt`. Requirements are intentionally minimal and centered on the PyTorch training and plotting stack.
See REPRODUCE.md for command-line examples aligned with the public scripts.
For thesis-file mapping, see:
- docs/REPOSITORY_SCOPE.md
- docs/THESIS_RESULTS_NOTES.md
- results/v4v5_final/README_RESULTS.md
The repository preserves version-comparison and changelog documents for traceability:
- DNCNN_VERSIONS_COMPARISON_REPORT.md
- V4_CHANGELOG_AND_EXPECTED_IMPACT.md
- V5_CHANGELOG.md
These notes are useful for understanding architecture evolution. In particular, V3 is retained as historical baseline context, while the canonical final thesis evidence is the curated V4/V5 package in results/v4v5_final/.
The thesis codebase is split across the following repositories:
- DnCNN-ECA denoising (this repository): https://github.com/GeorgeSpy/ESPI-DnCNN-ECA
- ESPI classification and evaluation: https://github.com/GeorgeSpy/espi-classification-models_2
- Pseudo-noisy data generation: https://github.com/GeorgeSpy/ESPI-pseydonoisy-generator
This repository is released under the MIT License. See LICENSE for details.