Please use this identifier to cite or link to this item: https://doi.org/10.15480/882.4884
DC Field | Value | Language
dc.contributor.author | Neidhardt, Maximilian | -
dc.contributor.author | Mieling, Till Robin | -
dc.contributor.author | Bengs, Marcel | -
dc.contributor.author | Schlaefer, Alexander | -
dc.date.accessioned | 2023-01-20T18:21:41Z | -
dc.date.available | 2023-01-20T18:21:41Z | -
dc.date.issued | 2023-01-10 | -
dc.identifier.citation | Scientific Reports 13 (1): 506 (2023-12) | de_DE
dc.identifier.issn | 2045-2322 | de_DE
dc.identifier.uri | http://hdl.handle.net/11420/14604 | -
dc.description.abstract | Robotic assistance in minimally invasive surgery offers numerous advantages for both patient and surgeon. However, the lack of force feedback in robotic surgery is a major limitation, and accurately estimating tool-tissue interaction forces remains a challenge. Image-based force estimation offers a promising solution without the need to integrate sensors into surgical tools. In this indirect approach, interaction forces are derived from the observed deformation, with learning-based methods improving accuracy and real-time capability. However, the relationship between deformation and force is determined by the stiffness of the tissue. Consequently, both deformation and local tissue properties must be observed for an approach applicable to heterogeneous tissue. In this work, we use optical coherence tomography, which can combine the detection of tissue deformation with shear wave elastography in a single modality. We present a multi-input deep learning network for processing of local elasticity estimates and volumetric image data. Our results demonstrate that accounting for elastic properties is critical for accurate image-based force estimation across different tissue types and properties. Joint processing of local elasticity information yields the best performance throughout our phantom study. Furthermore, we test our approach on soft tissue samples that were not present during training and show that generalization to other tissue properties is possible. | en
dc.language.iso | en | de_DE
dc.publisher | Macmillan Publishers Limited, part of Springer Nature | de_DE
dc.relation.ispartof | Scientific reports | de_DE
dc.rights.uri | https://creativecommons.org/licenses/by/4.0/ | de_DE
dc.subject.ddc | 600: Technology | de_DE
dc.subject.ddc | 610: Medicine | de_DE
dc.title | Optical force estimation for interactions between tool and soft tissues | de_DE
dc.type | Article | de_DE
dc.identifier.doi | 10.15480/882.4884 | -
dc.type.dini | article | -
dcterms.DCMIType | Text | -
tuhh.identifier.urn | urn:nbn:de:gbv:830-882.0209036 | -
tuhh.oai.show | true | de_DE
tuhh.publisher.doi | 10.1038/s41598-022-27036-7 | -
tuhh.publication.institute | Medizintechnische und Intelligente Systeme E-1 | de_DE
tuhh.identifier.doi | 10.15480/882.4884 | -
tuhh.type.opus | (scientific) article | -
dc.type.driver | article | -
dc.type.casrai | Journal Article | -
tuhh.container.issue | 1 | de_DE
tuhh.container.volume | 13 | de_DE
dc.identifier.pmid | 36627354 | de_DE
dc.rights.nationallicense | false | de_DE
dc.identifier.scopus | 2-s2.0-85146106594 | de_DE
tuhh.container.articlenumber | 506 | de_DE
local.status.inpress | false | de_DE
local.type.version | publishedVersion | de_DE
datacite.resourceType | Article | -
datacite.resourceTypeGeneral | JournalArticle | -
item.mappedtype | Article | -
item.openairetype | Article | -
item.languageiso639-1 | en | -
item.grantfulltext | open | -
item.cerifentitytype | Publications | -
item.creatorOrcid | Neidhardt, Maximilian | -
item.creatorOrcid | Mieling, Till Robin | -
item.creatorOrcid | Bengs, Marcel | -
item.creatorOrcid | Schlaefer, Alexander | -
item.creatorGND | Neidhardt, Maximilian | -
item.creatorGND | Mieling, Till Robin | -
item.creatorGND | Bengs, Marcel | -
item.creatorGND | Schlaefer, Alexander | -
item.fulltext | With Fulltext | -
item.openairecristype | http://purl.org/coar/resource_type/c_6501 | -
crisitem.author.dept | Medizintechnische und Intelligente Systeme E-1 | -
crisitem.author.dept | Medizintechnische und Intelligente Systeme E-1 | -
crisitem.author.dept | Medizintechnische und Intelligente Systeme E-1 | -
crisitem.author.dept | Medizintechnische und Intelligente Systeme E-1 | -
crisitem.author.orcid | 0000-0002-5107-0864 | -
crisitem.author.orcid | 0000-0003-0262-2519 | -
crisitem.author.orcid | 0000-0002-2229-9547 | -
crisitem.author.parentorg | Studiendekanat Elektrotechnik, Informatik und Mathematik (E) | -
crisitem.author.parentorg | Studiendekanat Elektrotechnik, Informatik und Mathematik (E) | -
crisitem.author.parentorg | Studiendekanat Elektrotechnik, Informatik und Mathematik (E) | -
crisitem.author.parentorg | Studiendekanat Elektrotechnik, Informatik und Mathematik (E) | -
Appears in Collections:Publications with fulltext
Files in This Item:
File | Description | Size | Format
s41598-022-27036-7.pdf | Publisher's version | 2.72 MB | Adobe PDF
This item is licensed under a Creative Commons Attribution 4.0 International License (CC BY 4.0).