AAI_2025_Capstone_Chronicles_Combined
● Phases 2–5 (60 epochs total): All EfficientNet layers were unfrozen. We switched to focal loss (α = 0.8) to address class imbalance, with γ set to 2.0 initially and then reduced to 1.5 in later phases to soften the penalty on confident predictions and stabilize convergence. We trained for 61 total epochs using the Adam optimizer with a learning rate of 2e-6 and a batch size of 8. Each phase used early stopping based on validation binary accuracy. Performance metrics included binary accuracy and recall, with validation scores improving well into the final epochs.

During model optimization, we tuned the focal loss hyperparameters (γ and α), progressively froze and unfroze layers, and introduced per-class threshold calibration based on validation F1 scores. This threshold-tuning stage proved especially impactful for recall, allowing us to shift model sensitivity on underrepresented classes such as hernia and cardiac conditions.

Hybrid CNN Model Training Methodology

We developed a custom hybrid CNN architecture capable of processing both image data and tabular metadata from the NIH Chest X-ray dataset. The tabular features, such as patient age and gender, were included for their clinical relevance, as these variables can meaningfully influence radiographic presentation; for example, women tend to have smaller lungs on average, which can complicate diagnostic interpretation. Our training process was iterative: we experimented with multiple model configurations and evaluation routines to identify the design that yielded the most reliable performance across all diagnostic categories.
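The focal loss used in Phases 2–5 can be sketched as follows. This is a minimal NumPy version of the standard binary focal loss (Lin et al., 2017), not the report's actual training code; `alpha = 0.8` weights the positive class, and lowering `gamma` (as done in later phases) reduces how strongly confident predictions are down-weighted.

```python
import numpy as np

def binary_focal_loss(y_true, y_pred, alpha=0.8, gamma=2.0, eps=1e-7):
    """Binary focal loss: cross-entropy scaled by (1 - p_t)^gamma.

    alpha weights the positive class (0.8 here, matching the report);
    gamma controls how much easy, confident predictions are suppressed.
    """
    y_pred = np.clip(y_pred, eps, 1.0 - eps)
    # p_t is the predicted probability assigned to the true class
    p_t = np.where(y_true == 1, y_pred, 1.0 - y_pred)
    alpha_t = np.where(y_true == 1, alpha, 1.0 - alpha)
    return float(np.mean(-alpha_t * (1.0 - p_t) ** gamma * np.log(p_t)))
```

With `gamma = 0` this reduces to alpha-weighted cross-entropy; raising `gamma` shrinks the loss contribution of already-confident predictions, which is why the report adjusts it between phases.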
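The per-class threshold calibration step can be illustrated with a small sketch. This is a hypothetical implementation, assuming a simple grid search over candidate thresholds that maximizes validation F1 for one class; in the multi-label setting it would be run once per diagnostic class.

```python
import numpy as np

def calibrate_threshold(y_true, y_prob, grid=None):
    """Return the decision threshold (and its F1) that maximizes F1
    on validation labels/probabilities for a single class."""
    if grid is None:
        grid = np.linspace(0.05, 0.95, 19)
    best_t, best_f1 = 0.5, -1.0
    for t in grid:
        y_hat = (y_prob >= t).astype(int)
        tp = int(np.sum((y_hat == 1) & (y_true == 1)))
        fp = int(np.sum((y_hat == 1) & (y_true == 0)))
        fn = int(np.sum((y_hat == 0) & (y_true == 1)))
        denom = 2 * tp + fp + fn
        f1 = 2 * tp / denom if denom else 0.0
        if f1 > best_f1:
            best_t, best_f1 = float(t), f1
    return best_t, best_f1
```

For a rare class whose predicted probabilities cluster well below 0.5, the calibrated threshold drops, which is the mechanism by which this step raised recall on underrepresented findings such as hernia.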
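The fusion idea behind the hybrid architecture, combining CNN image features with tabular metadata such as age and gender, can be sketched as a dense head over concatenated features. The shapes below are illustrative assumptions (e.g., 1280-dimensional pooled image features and 14 output classes), not the report's actual dimensions.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hybrid_head(img_features, tabular, W_fuse, b_fuse, W_out, b_out):
    """Concatenate pooled CNN image features with tabular metadata,
    then apply a small dense head with sigmoid multi-label outputs.
    All weight shapes here are hypothetical, for illustration only."""
    fused = np.concatenate([img_features, tabular], axis=-1)
    hidden = relu(fused @ W_fuse + b_fuse)
    logits = hidden @ W_out + b_out
    return 1.0 / (1.0 + np.exp(-logits))  # sigmoid per class
```

Concatenation before the dense head lets the classifier condition its predictions on patient metadata, which is the stated motivation for including age and gender in the hybrid model.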