Effects of objects and image quality on melanoma classification using deep neural networks


Akkoca Gazioğlu B. S., Kamaşak M. E.

BIOMEDICAL SIGNAL PROCESSING AND CONTROL, vol. 67, 2021 (SCI-Expanded)

  • Publication Type: Article / Full Article
  • Volume: 67
  • Publication Date: 2021
  • DOI: 10.1016/j.bspc.2021.102530
  • Journal Name: BIOMEDICAL SIGNAL PROCESSING AND CONTROL
  • Journal Indexes: Science Citation Index Expanded (SCI-EXPANDED), Scopus, Compendex, EMBASE, INSPEC
  • Affiliated with İstanbul Teknik Üniversitesi: Yes

Abstract

Melanoma is a type of skin cancer with a high mortality rate. Early and accurate diagnosis of melanoma is critically important for its prognosis. Recently, deep learning models have come to dominate computer-aided diagnosis (CAD) systems for the classification of potential melanoma lesions. In clinical settings, capturing impeccable skin images is not always possible: the images can be blurry, noisy, or low in contrast, or they can contain external objects. The aim of this work is to investigate the effects of external objects (ruler, hair) and image quality (blur, noise, contrast) on melanoma classification using commonly used Convolutional Neural Network (CNN) models: ResNet50, DenseNet121, VGG16, and AlexNet. We applied data augmentation, trained the four models separately, and tested them on our six datasets. In our experiments, melanoma images could be classified with higher accuracy under contrast changes than benign images, and we recommend the ResNet model when image contrast is an issue. Noise degrades the classification accuracy of melanoma significantly more than that of benign lesions, and both classes are sensitive to blur. The best accuracy on the blurred and noisy datasets is obtained with DenseNet. Images that contain a ruler decrease the accuracy, and ResNet achieves the highest accuracy on this set. The highest accuracy is obtained on hairy skin images, which form the largest subset of the overall dataset. The resulting accuracies are 89.22% for the hair set, 86% for the ruler set, and 88.81% for the none set. We infer that DenseNet can be used for melanoma classification in the presence of image distortions and degradations.
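The following is a minimal sketch of the kind of degradation-robustness experiment the abstract describes, not the authors' actual pipeline: it applies illustrative blur, noise, and contrast perturbations to a lesion image and runs a CNN classifier on each variant. It assumes PyTorch/torchvision; the kernel size, noise level, contrast factor, the input path `lesion.jpg`, and the use of a ResNet50 with a binary head are all assumptions for illustration, not values from the paper.

```python
# Sketch: classify a lesion image under illustrative degradations.
# Assumptions (not from the paper): torchvision, example parameter values,
# and a ResNet50 with a 2-class (melanoma vs. benign) head whose weights
# would come from fine-tuning on a dermoscopy training set.
import torch
import torch.nn as nn
from torchvision import models, transforms
from torchvision.transforms import functional as TF
from PIL import Image

def degrade(img, kind):
    """Return a degraded copy of a PIL image; parameters are illustrative."""
    if kind == "blur":
        return TF.gaussian_blur(img, kernel_size=9, sigma=3.0)
    if kind == "low_contrast":
        return TF.adjust_contrast(img, contrast_factor=0.5)
    if kind == "noise":
        t = TF.to_tensor(img)
        t = (t + 0.05 * torch.randn_like(t)).clamp(0.0, 1.0)
        return TF.to_pil_image(t)
    return img  # "none" set: unmodified image

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

model = models.resnet50(weights=None)          # fine-tuned weights assumed
model.fc = nn.Linear(model.fc.in_features, 2)  # melanoma vs. benign
model.eval()

img = Image.open("lesion.jpg").convert("RGB")  # hypothetical input path
for kind in ["none", "blur", "noise", "low_contrast"]:
    x = preprocess(degrade(img, kind)).unsqueeze(0)
    with torch.no_grad():
        prob_melanoma = torch.softmax(model(x), dim=1)[0, 1].item()
    print(f"{kind:>13}: P(melanoma) = {prob_melanoma:.3f}")
```

Swapping `models.resnet50` for `models.densenet121`, `models.vgg16`, or a corresponding AlexNet would reproduce the model comparison at this sketch level; the ruler and hair conditions in the paper come from separate image subsets rather than synthetic perturbations.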