| Reference | Year | Methods Compared | Best Algorithm | Accuracy |
|-----------|------|------------------|----------------|----------|
| [1] | 2019 | MLP | MLP | 96% |
| [2] | 2019 | CNN | CNN | 77.30% |
| [3] | 2020 | RF, KNN, NB, MLP, J48 Trees, LR | MLP | 99.8% |
| [4] | 2020 | CNN | CNN | 97.89% |
| [5] | 2019 | C5.0, RF, RPART, KNN, SVM | C5.0 | 97.00% |
| [6] | 2021 | DT, RF, XGBoost | DT | 93% |
| [7] | 2021 | CNN | CNN | 93.33% |
| [8] | 2020 | RF, KNN | KNN | 94% |
| [9] | 2022 | RF, XGBoost, CNN | CNN | 92% |
| [10] | 2019 | R-CNN, Mask R-CNN | CNN | 94% |
| [11] | 2021 | ResNet50 | ResNet50 | 74.04% |
| [12] | 2020 | CNN4 | CNN4 | 80.03% |
| [13] | 2018 | Artificial Neural Networks | ANN | 95% |
| [14] | 2021 | SVM | SVM | 93.33% |
| [15] | 2022 | SMOTE Tomek | SMOTE Tomek | 97.72% |
| [16] | 2022 | DT, RF, SVM | SVM | 84% |
| [17] | 2019 | Deep Learning, CNN | CNN | 97.70% |
| [18] | 2019 | KNN, DT, RF | KNN | 93.30% |
| [19] | 2017 | MLP, Bayes Net | MLP | 95% |
| [20] | 2019 | Bayes Net, RF, SVM | SVM | 96.38% |
| [21] | 2023 | KNN | KNN | 99.41% |
| [22] | 2021 | CNN, ResNet50 | ResNet50 | 74.04% |
| [23] | 2019 | CNN | CNN | 97.70% |
| [24] | 2021 | SVM | SVM | 96% |
| [25] | 2020 | CNN | CNN | 99.40% |
| [26] | 2019 | MLP | MLP | 96.20% |
| [27] | 2019 | CNN | CNN | 77% |
| [28] | 2019 | C5.0, RF, RPART, SVM, KNN | SVM | 99.77% |
| [29] | 2021 | XGBoost | XGBoost | 93.33% |
| [30] | 2020 | SVM | SVM | 99.30% |
| [31] | 2021 | LR, J48 Trees, DT, SVM, RF, KNN, NB, MLP | SVM | 95% |
| [32] | 2020 | DL | DL | 93% |
| [33] | 2023 | LR, J48 Trees, DT, SVM, RF, KNN, NB, MLP, Compact VGG | KNN | 99.99% |
| [34] | 2018 | DL | DL | 90% |
| [35] | 2017 | LR, DT, SVM, RF, KNN, NB, MLP, J48 Trees | MLP | 88% |
| [36] | 2016 | DL | DL | 87% |
| [37] | 2015 | LR, J48 Trees, DT, SVM, RF, KNN, NB, MLP | SVM | 85% |
| [38] | 2014 | DL | DL | 83% |
| [39] | 2013 | LR, J48 Trees, DT, SVM, RF, KNN, NB, MLP | MLP | 82% |
| [40] | 2012 | DL | DL | 80% |
| [41] | 2021 | Compact VGG | VGG | 81.70% |
| [42] | 2021 | SVM | SVM | 95.00% |
| [43] | 2019 | CNN | CNN | 93.33% |
| [44] | 2019 | SVM, VGG | SVM | 94% |
| [45] | 2019 | SVM | SVM | 89.35% |
| [46] | 2017 | KNN, ANN | KNN | 88% |
| [47] | 2023 | MLP, SVM, KNN | KNN | 99% |
| [48] | 2015 | SVM, ANN | SVM | 85.39% |
| [49] | 2019 | AUC, ROC | AUC | 90.11% |
| [50] | 2019 | AUC, ROC | AUC | 92% |
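To make the "Methods Compared", "Best Algorithm", and "Accuracy" columns concrete, the sketch below shows one way such a comparison could be run with scikit-learn. It is only an illustrative sketch: the dataset (scikit-learn's built-in breast cancer data), the single train/test split, and the default hyperparameters are assumptions for demonstration, not the setups used in any of the cited studies.

```python
# Illustrative sketch only: compares several classifiers named in the table
# (LR, DT, RF, KNN, NB, SVM, MLP) by hold-out accuracy. The dataset and
# hyperparameters are placeholders, not those used in the cited studies.
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Placeholder dataset; each surveyed paper used its own data.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

models = {
    "LR": LogisticRegression(max_iter=1000),
    "DT": DecisionTreeClassifier(random_state=42),
    "RF": RandomForestClassifier(random_state=42),
    "KNN": KNeighborsClassifier(),
    "NB": GaussianNB(),
    "SVM": SVC(),
    "MLP": MLPClassifier(max_iter=1000, random_state=42),
}

results = {}
for name, model in models.items():
    # Scale features so distance- and gradient-based models are not disadvantaged.
    pipeline = make_pipeline(StandardScaler(), model)
    pipeline.fit(X_train, y_train)
    results[name] = accuracy_score(y_test, pipeline.predict(X_test))

for name, acc in sorted(results.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{name}: {acc:.2%}")

best = max(results, key=results.get)
print(f"Best algorithm on this split: {best} ({results[best]:.2%})")
```

Note that a single hold-out accuracy, as most rows in the table report, can vary with the particular split; cross-validation over several splits gives a more stable estimate when comparing classifiers.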