A Two-Stage Deep Learning Framework for Skin Lesion Detection and Classification Using ResNet18 and EfficientNet-B4
Abstract
Skin diseases encompass a wide range of conditions that require early and accurate diagnosis for effective treatment. This paper presents a two-stage deep learning framework for automated skin lesion detection and classification using deep convolutional neural networks. The first stage uses a ResNet18 model to detect the presence of a lesion in dermoscopic images. If a lesion is detected, the image is passed to an EfficientNet-B4 model for multiclass classification. Our approach integrates data augmentation, hair removal preprocessing, learning rate scheduling, and early stopping to enhance model performance and robustness. The framework is trained and evaluated on the HAM10000 dataset, addressing challenges such as class imbalance, model fine-tuning, and overfitting. Experimental results demonstrate the effectiveness of this method in accurately identifying and categorizing skin lesions, contributing to the advancement of deep learning-based dermatological diagnosis.
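The two-stage decision logic described in the abstract can be sketched as follows. This is a minimal, framework-agnostic illustration: the `detector` and `classifier` callables stand in for the trained ResNet18 and EfficientNet-B4 models, and the 0.5 decision threshold is an illustrative assumption, not a value reported in the paper. The class ordering follows the seven standard HAM10000 diagnostic categories.

```python
# Minimal sketch of the two-stage pipeline: stage 1 decides whether a
# lesion is present; stage 2 runs only when stage 1 fires.
# The backbones are injected as callables so any models can be plugged in.

HAM10000_CLASSES = ["akiec", "bcc", "bkl", "df", "mel", "nv", "vasc"]

class TwoStagePipeline:
    def __init__(self, detector, classifier, threshold=0.5):
        self.detector = detector      # image -> lesion probability in [0, 1]
        self.classifier = classifier  # image -> list of 7 class probabilities
        self.threshold = threshold    # illustrative cut-off, not from the paper

    def predict(self, image):
        # Stage 1: binary lesion detection (ResNet18 in the paper).
        lesion_prob = self.detector(image)
        if lesion_prob < self.threshold:
            return {"lesion": False, "class": None}
        # Stage 2: multiclass classification (EfficientNet-B4 in the paper).
        probs = self.classifier(image)
        best = max(range(len(probs)), key=probs.__getitem__)
        return {"lesion": True, "class": HAM10000_CLASSES[best]}

# Usage with stub models in place of trained networks:
pipeline = TwoStagePipeline(
    detector=lambda img: 0.9,
    classifier=lambda img: [0.0, 0.0, 0.0, 0.0, 1.0, 0.0, 0.0],
)
print(pipeline.predict(None))  # {'lesion': True, 'class': 'mel'}
```

Injecting the two models as callables keeps the gating logic independent of the backbone choice, which mirrors how the paper's detection stage filters images before the heavier classification stage is invoked.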
Keywords:
Deep Learning, Convolutional Neural Network, Skin Lesion Detection, Skin Lesion Classification, Image Preprocessing, ResNet18, EfficientNet-B4, Data Augmentation, HAM10000 Dataset, Dermatology, Computer-Aided Diagnosis, Medical Image Analysis
License
Copyright (c) 2025 International Journal on Emerging Research Areas

All published work in this journal is licensed under the Creative Commons Attribution 4.0 International License (CC BY 4.0). This license permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Similar Articles
- Fr Jins Sebastian, Manu Tom Sebastian, Minnu Elsa Baby, Niya Mary Viby, Image Encryption Using Different Cryptographic Algorithms: A Survey Paper, International Journal on Emerging Research Areas: Vol. 3 No. 1 (2023): IJERA
- Dr. Nitha C Vellayudan, Akshay K.P, Muhamed Adhil P.M, C.A Sivasankar, Crop Yield and Price Prediction, International Journal on Emerging Research Areas: Vol. 3 No. 1 (2023): IJERA
- Juby Mathew, Maria Jojo, Neha Ann Samson, Noell Biju Michael, Ron T Alumkal, PulseSync: IoT-Enabled Monitoring and Predictive Analytics for Healthcare, International Journal on Emerging Research Areas: Vol. 4 No. 1 (2024): IJERA
- Nihal Anil, Ms. Nighila Abhish, Jesila Joy, Noora Sajil, P R Vishnuraj, Augmented NEAT Algorithm for Enhanced Cognitive Interaction (NEAT-X), International Journal on Emerging Research Areas: Vol. 4 No. 1 (2024): IJERA
- Amina Manaf, Ance Maria Joseph, Angel Joy, Anjaly Anilkumar, K S Rekha, Driver Drowsiness Detection Using Python, International Journal on Emerging Research Areas: Vol. 4 No. 1 (2024): IJERA
- Dr. Indu John, Gauri Santhosh, Jesna Susan Reji, Abdul Musawir, Glady Prince, Detection of Autism Spectrum Disorder in Toddlers using Machine Learning, International Journal on Emerging Research Areas: Vol. 4 No. 1 (2024): IJERA
- Dipjyoti Deka, Rituparna Seal, Shubham Banik, Unmasking Fraudulent Job Ads: A Critical Review of Machine Learning Techniques for Detecting Fake Jobs, International Journal on Emerging Research Areas: Vol. 3 No. 1 (2023): IJERA
- Avinash Krishnan, Belda Ben Thomas, Fr Siju John, Bava Kurian Varghese, Ajumon C Thampi, Computer Aided Carbon Footprint Estimation in Educational Institutions, International Journal on Emerging Research Areas: Vol. 5 No. 1 (2025): IJERA
- Akhil Mohan, E R Sreema, Leshma Mohandas, P U Prabath, Saeedh Mohammed, Virtual Air Canvas, International Journal on Emerging Research Areas: Vol. 4 No. 1 (2024): IJERA
- Muhammed Saalim O.S, Fathima Parvin M.A, Albiya Hameed, Hiba Fathima T.S, Amritha Soloman, AGRISEN Precise Irrigation System and Smart Health Monitoring System, International Journal on Emerging Research Areas: Vol. 4 No. 1 (2024): IJERA
