Please use this identifier to cite or link to this item: http://10.1.7.192:80/jspui/handle/123456789/12460
Title: Interpretability of Diabetic Retinopathy images for EfficientNet
Authors: Patel, Yashesh
Keywords: Computer 2022
Project Report
Project Report 2022
Computer Project Report
22MCE
22MCED
22MCED13
CE (DS)
DS 2022
Issue Date: 1-Jun-2024
Publisher: Institute of Technology
Series/Report no.: 22MCED13;
Abstract: Diabetic Retinopathy (DR) emerges as a consequence of either type-1 or type-2 diabetes, and early detection of its complications is crucial to prevent visual problems such as retinal detachment, vitreous hemorrhage, and glaucoma. The interpretability of automated classifiers in medical diagnoses such as diabetic retinopathy is of paramount importance. The primary challenge lies in extracting meaningful insights from these classifiers, given their inherent complexity. In recent years, considerable effort has been devoted to transforming deep learning classifiers from opaque statistical black boxes into models that are self-explanatory. A persistent concern is the effective preprocessing of data before classification: despite the proven efficacy of supervised machine learning schemes in practice, challenges remain in dealing with data redundancy, feature selection, and reliance on human expert intervention. Consequently, we propose a combinatorial deep learning approach for interpretable DR detection. Our method integrates the SHapley Additive exPlanations (SHAP) and Local Interpretable Model-Agnostic Explanations (LIME) techniques to analyze the output of deep learning models effectively. The outcomes of our experiments demonstrate that our proposed approach surpasses existing schemes in the accurate detection of DR.
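
As an illustrative sketch only (not code from the report), the interpretation step described in the abstract could be approximated in Python roughly as follows: a Keras EfficientNet classifier is explained with SHAP's gradient explainer and with LIME's image explainer. The 5-class DR grading, image shape, and placeholder data are assumptions for demonstration, not details taken from the report.

import numpy as np
import shap
from lime import lime_image
from tensorflow.keras.applications import EfficientNetB0

# Hypothetical 5-class DR grading model (No DR ... Proliferative DR); a trained
# model would be loaded here instead of this untrained placeholder.
model = EfficientNetB0(weights=None, classes=5)

# Placeholder batch of preprocessed fundus images with shape (N, 224, 224, 3).
images = np.random.rand(16, 224, 224, 3).astype("float32")

# SHAP: gradient-based explainer over a small background sample; shap_values
# gives per-pixel attributions for each DR class.
shap_explainer = shap.GradientExplainer(model, images[:8])
shap_values = shap_explainer.shap_values(images[8:9])

# LIME: perturb superpixels of one image and fit a local surrogate model
# around the classifier's prediction for that image.
lime_explainer = lime_image.LimeImageExplainer()
explanation = lime_explainer.explain_instance(
    images[8], model.predict, top_labels=1, num_samples=200
)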
URI: http://10.1.7.192:80/jspui/handle/123456789/12460
Appears in Collections:Dissertation, CE (DS)

Files in This Item:
File: 22MCED13.pdf
Description: 22MCED13
Size: 1.05 MB
Format: Adobe PDF
