A Testbed for Design and Performance Evaluation of Visual Localization Technique inside the Small Intestine

Wireless video capsule endoscopy (VCE) plays an increasingly important role in assisting clinical diagnoses of gastrointestinal (GI) diseases. It provides a non-invasive way to examine the entire small intestine, where conventional endoscopic instruments can barely reach. Existing VCE examination systems cannot track the location of an endoscopic capsule, which prevents the physician from identifying the exact location of a disease. During the eight-hour examination, the video capsule continuously captures images at a frame rate of up to six frames per second, so motion information can be extracted from the image sequence. Many attempts have been made to develop computer vision algorithms that detect the motion of the capsule from the small changes between consecutive video frames and thereby trace the capsule's location. However, validating those algorithms remains challenging because experiments on the human body are extremely difficult to conduct owing to individual differences and legal constraints. In this thesis, two validation approaches for VCE motion tracking are presented in detail. One approach is to build a physical testbed with a plastic pipe and an endoscopy camera; the other is to build a virtual testbed by creating a three-dimensional virtual small intestine model and simulating the motion of the capsule. Based on the virtual testbed, a physiological factor, intestinal contraction, is studied in terms of its influence on vision-based localization algorithms, and a geometric model for measuring the degree of contraction is proposed and validated on the virtual testbed. The empirical results support the performance evaluation of other research on vision-based VCE localization algorithms.
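The abstract describes estimating capsule motion from small changes between consecutive frames. As a minimal illustration of that idea (not the thesis's actual algorithm, which is not specified here), the sketch below recovers an integer frame-to-frame translation by exhaustive block matching in NumPy; the function name and parameters are hypothetical.

```python
import numpy as np

def estimate_shift(prev, curr, max_shift=3):
    """Estimate the dominant integer translation (dy, dx) between two
    grayscale frames by trying every shift within +/- max_shift and
    keeping the one that minimizes mean squared error over the
    overlapping region. A toy stand-in for frame-to-frame motion cues;
    real VCE localization uses richer vision methods."""
    h, w = prev.shape
    best, best_err = (0, 0), np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Region of prev and curr that overlaps under shift (dy, dx),
            # assuming curr[y, x] == prev[y - dy, x - dx] where valid.
            a = prev[max(0, -dy):h + min(0, -dy), max(0, -dx):w + min(0, -dx)]
            b = curr[max(0, dy):h + min(0, dy), max(0, dx):w + min(0, dx)]
            err = np.mean((a.astype(float) - b.astype(float)) ** 2)
            if err < best_err:
                best, best_err = (dy, dx), err
    return best

# Synthetic check: shift a random frame down 2 px and right 1 px,
# then recover that motion from the pair of frames.
rng = np.random.default_rng(0)
frame = rng.random((32, 32))
shifted = np.roll(frame, shift=(2, 1), axis=(0, 1))
print(estimate_shift(frame, shifted))  # (2, 1)
```

Summing such per-pair displacements over the whole image sequence is the basic principle behind tracing the capsule's path from video content alone.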

Language
  • English
Identifier
  • etd-050114-133712
Defense date
  • 2014
Date created
  • 2014-05-01