Ocular Disease Detection Using Advanced Neural Network Based Classification Algorithms
Early screening and diagnosis of ocular diseases from fundus images is one of the most challenging tasks ophthalmologists face. Manual diagnosis of ocular diseases is difficult, time-consuming, and prone to error, which is why a computer-aided system for the automated early detection of ocular diseases from fundus images is needed. The enhanced image classification capabilities of deep learning algorithms make such a system feasible. In this study, we present four deep learning-based models for ocular disease detection. We trained cutting-edge image classification architectures (ResNet-34, EfficientNet, MobileNetV2, and VGG-16) on the ODIR dataset, which consists of 5000 fundus images belonging to 8 classes, each representing a different ocular disease. The VGG-16 model achieved an accuracy of 97.23%, ResNet-34 reached 90.85%, MobileNetV2 provided 94.32%, and EfficientNet achieved 93.82%. All of these models can serve as the basis for a real-time ocular disease diagnosis system.