TUHH Open Research
Self-supervised U-Net for segmenting flat and sessile polyps

Publication Type
Conference Paper
Date Issued
2022-03
Language
English
Author(s)
Bhattacharya, Debayan  
Betz, Christian  
Eggert, Dennis  
Schlaefer, Alexander  
Institute
Medizintechnische und Intelligente Systeme E-1  
TORE-URI
http://hdl.handle.net/11420/13138
First published in
Progress in Biomedical Optics and Imaging - Proceedings of SPIE  
Number in series
12033
Start Page
1
End Page
6
Article Number
120333F
Citation
Progress in Biomedical Optics and Imaging - Proceedings of SPIE 12033: 120333F, 1-6 (2022)
Contribution to Conference
SPIE Medical Imaging 2022  
Publisher DOI
10.1117/12.2611364
Scopus ID
2-s2.0-85132820584
Publisher
SPIE
Colorectal cancer (CRC) poses a great risk to public health; it is the third most common cause of cancer in the US. The development of colorectal polyps is one of the earliest signs of the disease, and early detection and resection of polyps can increase the survival rate to 90%. Because polyps vary in color, shape, size, texture and appearance, manual inspection is prone to misdetections. To this end, computer-aided diagnosis (CADx) systems have been proposed that detect polyps by processing colonoscopic videos. Such a system acts as a secondary check that helps clinicians reduce misdetections so that polyps can be resected before they transform into cancer. Despite the prominence of CADx solutions, the miss rate of polyps remains between 6% and 27%, and sessile and flat polyps with a diameter of less than 10 mm are especially likely to go undetected. Convolutional neural networks (CNNs) have shown promising results in polyp segmentation. However, these works all take a supervised approach and are limited by the size of the dataset; smaller datasets have been observed to reduce the segmentation accuracy of ResUNet++. Self-supervision is a stronger alternative to fully supervised learning, especially in medical image analysis, since it redresses the limitations posed by small annotated datasets. The self-supervised approach proposed by Jamaludin et al. shows that pretraining a network on a proxy task helps it extract meaningful representations from the underlying data, which can then improve the performance of the final downstream supervised task. In summary, we train a U-Net to inpaint randomly dropped-out pixels in the image as a proxy task, pretraining on the Kvasir-SEG dataset, followed by supervised training on the limited Kvasir-Sessile dataset.
Our experimental results demonstrate that, given a limited annotated dataset and a larger unlabeled dataset, a self-supervised approach is a better alternative than a fully supervised one. Specifically, our self-supervised U-Net outperforms five segmentation models trained in a supervised manner on the Kvasir-Sessile dataset.
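The inpainting proxy task described above, corrupting an image by zeroing out random pixels and training the network to reconstruct them, can be sketched as follows. This is an illustrative NumPy sketch, assuming a per-pixel Bernoulli dropout mask and a mean-squared-error loss restricted to the dropped pixels; the paper's exact dropout rate and loss formulation are not reproduced here, and the U-Net itself is omitted.

```python
import numpy as np


def random_pixel_dropout(image, drop_rate=0.25, rng=None):
    """Zero out a random fraction of pixels in a (H, W) image.

    Returns the corrupted image and a boolean mask that is True at
    the dropped locations, so the inpainting loss can be restricted
    to exactly the pixels the network must reconstruct.
    """
    rng = np.random.default_rng() if rng is None else rng
    mask = rng.random(image.shape) < drop_rate  # Bernoulli dropout mask
    corrupted = image.copy()
    corrupted[mask] = 0.0
    return corrupted, mask


def inpainting_loss(prediction, target, mask):
    """MSE computed only over the dropped-out pixels."""
    diff = (prediction - target) ** 2
    return float(diff[mask].mean())
```

During pretraining, the corrupted image would be fed to the U-Net and the loss computed between its output and the original image at the masked locations; the pretrained weights then initialize the supervised segmentation stage.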
Subjects
MLE@TUHH
DDC Class
610: Medicine
More Funding Information
This work is funded partially by Technische Universität Hamburg and by the Free and Hanseatic City of Hamburg (Interdisciplinary Graduate School - Innovative Technologies in Cancer Diagnostics and Therapy).