TUHH Open Research
Needle tracking in low-resolution ultrasound volumes using deep learning

Citation Link: https://doi.org/10.15480/882.13574
Publication Type
Journal Article
Date Issued
2024-07-13
Language
English
Author(s)
Grube, Sarah
Medizintechnische und Intelligente Systeme E-1
Latus, Sarah
Medizintechnische und Intelligente Systeme E-1
Behrendt, Finn
Medizintechnische und Intelligente Systeme E-1
Riabova, Oleksandra
Neidhardt, Maximilian
Medizintechnische und Intelligente Systeme E-1
Schlaefer, Alexander
Medizintechnische und Intelligente Systeme E-1
TORE-DOI
10.15480/882.13574
TORE-URI
https://hdl.handle.net/11420/49821
Journal
International journal of computer assisted radiology and surgery  
Volume
19
Issue
10
Start Page
1975
End Page
1981
Citation
International Journal of Computer Assisted Radiology and Surgery 19 (10): 1975-1981 (2024)
Publisher DOI
10.1007/s11548-024-03234-8
Scopus ID
2-s2.0-85198466630
Publisher
Springer
Abstract
Purpose: Clinical needle insertion into tissue, commonly assisted by 2D ultrasound imaging for real-time navigation, faces the challenge of precisely aligning the needle and the probe to reduce out-of-plane movement. Recent studies investigate 3D ultrasound imaging together with deep learning to overcome this problem, focusing on acquiring high-resolution images to create optimal conditions for needle tip detection. However, high resolution also requires considerable time for image acquisition and processing, which limits real-time capability. Therefore, we aim to maximize the ultrasound volume rate at the cost of low image resolution. We propose a deep learning approach to directly extract the 3D needle tip position from sparsely sampled ultrasound volumes.
Methods: We design an experimental setup with a robot inserting a needle into water and chicken liver tissue. In contrast to manual annotation, we assess the needle tip position from the known robot pose. During insertion, we acquire a large data set of low-resolution volumes using a 16 × 16 element matrix transducer at a volume rate of 4 Hz. We compare the performance of our deep learning approach with conventional needle segmentation.
Results: Our experiments in water and liver show that deep learning outperforms the conventional approach while achieving sub-millimeter accuracy. We achieve mean position errors of 0.54 mm in water and 1.54 mm in liver for deep learning.
Conclusion: Our study underlines the strengths of deep learning in predicting 3D needle positions from low-resolution ultrasound volumes. This is an important milestone for real-time needle navigation, simplifying the alignment of needle and ultrasound probe and enabling 3D motion analysis.
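To illustrate the general idea of regressing a 3D needle tip position directly from a low-resolution ultrasound volume, the following minimal PyTorch sketch shows a small 3D convolutional network with a coordinate regression head. The architecture, input volume size, and names used here are assumptions for illustration only and do not reproduce the network described in the paper.

# Hypothetical sketch: a small 3D CNN that regresses the needle tip
# position (x, y, z) from a low-resolution ultrasound volume.
# Architecture and volume size are illustrative assumptions, not the
# authors' actual model.
import torch
import torch.nn as nn

class NeedleTipRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(1, 16, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.MaxPool3d(2),
            nn.Conv3d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool3d(1),  # collapse spatial dimensions
        )
        self.head = nn.Linear(64, 3)  # (x, y, z) in volume coordinates

    def forward(self, volume):
        # volume: (batch, 1, depth, height, width), e.g. a sparsely
        # sampled volume acquired with a matrix transducer
        feat = self.features(volume).flatten(1)
        return self.head(feat)

if __name__ == "__main__":
    model = NeedleTipRegressor()
    dummy = torch.randn(2, 1, 32, 32, 64)  # assumed low-resolution volume size
    print(model(dummy).shape)  # torch.Size([2, 3])

Such a model would typically be trained with a mean squared error loss against tip positions derived from the known robot pose, as described in the Methods section of the abstract.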
Subjects
Deep learning
Needle tip detection
Real-time
Sparse feature learning
Volumetric ultrasound imaging
MLE@TUHH
DDC Class
617: Surgery, Regional Medicine, Dentistry, Ophthalmology, Otology, Audiology
Funding(s)
Projekt DEAL  
License
https://creativecommons.org/licenses/by/4.0/
Name
s11548-024-03234-8-1.pdf
Type
Main Article
Size
632.28 KB
Format
Adobe PDF