Gastric Cancer Diagnosis with Mask R-CNN

Guitao Cao, Wenli Song, Zhenwei Zhao

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

33 Scopus citations

Abstract

Pathological sections provide a reliable basis for diagnosing gastric cancer. Applying deep learning to these medical images as an auxiliary diagnostic tool can improve the speed and accuracy of doctors' diagnoses and reduce both misdiagnosis and missed diagnosis. At the start of this research, Mask R-CNN was the state-of-the-art method in the field; it is mainly used to segment objects in everyday scenes, where it achieves good results. Medical images, however, differ greatly from everyday scenes, and detection performance degrades accordingly. We apply Mask R-CNN to pathological sections of gastric cancer to detect and segment the cancer nest, and then optimize the model by tuning its parameters. The final method achieves an AP of 61.2 when detecting these medical images.

Original language: English
Title of host publication: Proceedings - 2019 11th International Conference on Intelligent Human-Machine Systems and Cybernetics, IHMSC 2019
Publisher: Institute of Electrical and Electronics Engineers Inc.
Pages: 60-63
Number of pages: 4
ISBN (Electronic): 9781728118598
DOIs
State: Published - Aug 2019
Event: 11th International Conference on Intelligent Human-Machine Systems and Cybernetics, IHMSC 2019 - Hangzhou, China
Duration: 24 Aug 2019 – 25 Aug 2019

Publication series

Name: Proceedings - 2019 11th International Conference on Intelligent Human-Machine Systems and Cybernetics, IHMSC 2019
Volume: 1

Conference

Conference: 11th International Conference on Intelligent Human-Machine Systems and Cybernetics, IHMSC 2019
Country/Territory: China
City: Hangzhou
Period: 24/08/19 – 25/08/19

Keywords

  • Gastric Cancer
  • Instance Segmentation
  • Mask R-CNN
  • Pathological Section
