Navigation system for a developed endoscopic surgical robot system

Asaki Hattori, Naoki Suzuki, Mitsuhiro Hayashibe, Shigeyuki Suzuki, Yoshito Otake, Kazuki Sumiyama, Hisao Tajiri, Susumu Kobayashi

Research output: Contribution to journal › Article › peer-review

5 Citations (Scopus)

Abstract

In this paper, we describe a navigation system for an endoscopic surgical robot system. We have been developing an endoscopic robot system that performs surgery on the gastric tube. The system has two manipulators, one on each side of the endoscope's tip. Using these manipulators, surgeons can perform surgical procedures as in open surgery. We applied a data fusion system to the endoscopic robot system for image-guided surgery. The data fusion system uses two devices: a magnetic 3D location sensor and a graphics workstation (GWS). The magnetic location sensor is attached to the endoscope's tip and measures the tip's 3D position and orientation. The GWS uses the sensor data to transform the coordinate system of a 3D model of the patient's organ and superimposes the organ model onto the captured endoscope image in real time. We used this system during an endoscopic mucosal resection (EMR) on a pig. In this experiment, the surgeon was able to observe the inner structure of the animal's organ.
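The data-fusion step described above, in which the magnetic sensor's pose brings the 3D organ model into the endoscope's camera frame so it can be overlaid on the video, can be illustrated with a minimal sketch. The Python/NumPy code below is not the authors' implementation; the sensor pose format, the fixed sensor-to-camera offset, and the pinhole camera intrinsics are all hypothetical assumptions chosen for illustration.

```python
import numpy as np

def pose_to_matrix(position, rotation):
    """Build a 4x4 homogeneous transform from a sensor pose
    (3D position plus a 3x3 rotation matrix)."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = position
    return T

def overlay_points(model_pts, T_world_sensor, T_sensor_camera, K):
    """Map organ-model vertices (in patient/world coordinates) into the
    endoscope camera frame and project them with a pinhole model so
    they can be drawn over the captured video frame."""
    # Camera pose in world coordinates: sensor pose composed with a
    # (calibrated) sensor-to-camera offset.
    T_world_camera = T_world_sensor @ T_sensor_camera
    # Invert to express model points in camera coordinates.
    T_camera_world = np.linalg.inv(T_world_camera)

    pts_h = np.hstack([model_pts, np.ones((len(model_pts), 1))])
    pts_cam = (T_camera_world @ pts_h.T).T[:, :3]

    # Keep only points in front of the camera, then project.
    pts_cam = pts_cam[pts_cam[:, 2] > 1e-6]
    uv = (K @ pts_cam.T).T
    uv = uv[:, :2] / uv[:, 2:3]
    return uv  # pixel coordinates of the superimposed model points

# Hypothetical values: one sensor reading and a camera calibration.
sensor_pos = np.array([10.0, 5.0, 30.0])      # mm, tracker frame
sensor_rot = np.eye(3)                        # identity orientation, for brevity
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])         # pinhole intrinsics
model = np.array([[12.0, 6.0, 80.0],
                  [15.0, 4.0, 82.0]])         # organ-model vertices

T_ws = pose_to_matrix(sensor_pos, sensor_rot)
pixels = overlay_points(model, T_ws, np.eye(4), K)
print(pixels)
```

In a real-time loop, the sensor pose would be re-read for every video frame and the projected model redrawn over the endoscope image, which is the behavior the abstract attributes to the GWS.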

Original language: English
Pages (from-to): 539-544
Number of pages: 6
Journal: International Congress Series
Volume: 1268
Issue number: C
DOIs
Publication status: Published - 2004 Jun 1

Keywords

  • Data fusion
  • Endoscopic robot
  • Image-guided surgery
