Research on Dermatological Disease Diagnosis Based on Swin Transformer

DOI: 10.12677/aam.2025.142049, PP. 34-39

Keywords: Swin Transformer, Self-Attention Mechanism, Dermatological Diagnosis


Abstract:

Dermatological diseases are among the most common diseases worldwide. Accurate, timely, and effective classification of skin-lesion images is of great significance for dermatological diagnosis, and high-precision classification algorithms are an active and challenging research topic in this field. In recent years, deep learning algorithms have shown great potential in dermatological diagnosis, with broad application prospects. This paper adopts the Swin Transformer, an improvement on the original Transformer architecture, to construct a Swin Transformer-based dermatological diagnosis model, which is experimentally validated on the HAM10000 dataset. The results show that the model significantly improves the accuracy of dermatological diagnosis, is expected to push the field toward greater efficiency and precision, and provides stronger experimental support for clinical practice.
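The shifted-window self-attention named in the keywords is the core mechanism that distinguishes the Swin Transformer from the original Transformer: self-attention is computed inside small non-overlapping windows, and alternating blocks cyclically shift the feature map so that windows in the next block straddle the previous block's window boundaries, letting information flow across windows at low cost. The following is a minimal NumPy sketch of that partitioning step only, not the paper's code; the feature-map size, channel count, and window size are illustrative choices.

```python
import numpy as np

def window_partition(x, window_size):
    """Split a (H, W, C) feature map into non-overlapping windows.

    Returns shape (num_windows, window_size * window_size, C); in Swin,
    self-attention is computed independently within each window.
    """
    H, W, C = x.shape
    x = x.reshape(H // window_size, window_size, W // window_size, window_size, C)
    return x.transpose(0, 2, 1, 3, 4).reshape(-1, window_size * window_size, C)

def cyclic_shift(x, shift):
    """Cyclically roll the feature map so the next block's windows
    straddle the previous block's window boundaries (the 'shifted
    window' trick that connects otherwise isolated windows)."""
    return np.roll(x, shift=(-shift, -shift), axis=(0, 1))

# Toy 8x8 feature map with 3 channels; 4x4 windows -> 4 windows of 16 tokens.
feat = np.arange(8 * 8 * 3, dtype=np.float32).reshape(8, 8, 3)
regular = window_partition(feat, window_size=4)               # block i
shifted = window_partition(cyclic_shift(feat, 2), 4)          # block i + 1
print(regular.shape)  # (4, 16, 3)
```

In the full model these partitioned windows are fed through multi-head self-attention with a mask that hides tokens wrapped around by the shift; the sketch above covers only the window geometry.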

