March 7, 2017
Invasive surgical techniques - cutting through the breastbone for open-heart surgery or making a large incision to inspect an abdominal tumor - allow physicians to treat disease effectively, but they can cause serious complications and dramatically slow a patient's healing.
Scientists instead want to deploy dozens, or even thousands, of tiny robots that travel the body's venous system to deliver drugs or a self-assembled interventional tool. Researchers from the University of Houston and Houston Methodist Hospital are developing control algorithms, imaging technology, ultrafast computational methods and human-machine immersion methods to harness the magnetic force of a magnetic resonance imaging (MRI) scanner to both image and steer millimeter-sized robots through the body.
"We want to move from science fiction to science feasibility," said Aaron Becker, assistant professor of electrical and computer engineering at UH and principal investigator for a $608,000 Synergy Award from the National Science Foundation to develop prototypes for testing.
To tackle this unprecedented challenge, the award involves two additional investigators: Nikolaos Tsekos, associate professor of computer science and director of the Medical Robotics Laboratory at UH, who has expertise in MRI and computational methods, and Dipan J. Shah, a cardiologist and director of cardiovascular MRI at Houston Methodist Hospital, who brings expertise in clinical MRI and focuses the effort on finding solutions that are clinically necessary and valuable.
While MRI has traditionally been used for noninvasive diagnosis, the next frontier is its use as a tool to offer noninvasive or minimally invasive treatment.
The milli-robot development and control work is an outgrowth of Becker's previous research, which was funded in part with an NSF CAREER award and demonstrated the theory behind the proposal. This grant, awarded through NSF's Cyber-Physical Systems (CPS) program, will fund work to build a prototype suitable for animal testing. The MRI control and computational methods follow a previous CPS award in image-guided robotic surgeries led by Tsekos and Shah.
Their current models are up to two centimeters long; Becker said the goal is robots ranging from 0.5 millimeters to two millimeters. The average human hair, by comparison, is about 0.08 millimeters wide.
An MRI scanner provides enough magnetic force to steer the robots through the body's blood vessels, but not enough for them to penetrate tumors or other tissue. To address that problem, the project is working with two designs, both powered by the MRI scanner: one based on the principle of mechanical resonance and the second modeled after a self-assembling surgical tool, a Gauss gun.
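To give a sense of the forces involved, the rough estimate below is purely illustrative and is not drawn from the project itself; the gradient strength, robot size and material values are all assumptions chosen to be typical.

# Back-of-the-envelope estimate (illustrative only) of the steering force a
# clinical MRI gradient can exert on a small ferromagnetic robot.
# Assumptions: a 1 mm diameter sphere of magnetically saturated steel and a
# typical clinical gradient strength of 40 mT/m.
import math

GRADIENT = 0.040   # T/m, assumed clinical gradient strength
M_SAT = 1.36e6     # A/m, approximate saturation magnetization of chrome steel (assumed)
radius = 0.5e-3    # m, a 1 mm diameter robot body

volume = (4.0 / 3.0) * math.pi * radius**3   # m^3
force = volume * M_SAT * GRADIENT            # N, F = V * M_sat * dB/dx for a saturated body

print(f"Volume: {volume:.2e} m^3")
print(f"Gradient force: {force * 1e6:.1f} micronewtons")
# Roughly 29 micronewtons: enough to redirect a free-floating robot in flowing
# blood, but orders of magnitude too weak to push it through tumor or other tissue.

Forces on that order are ample for steering but not for cutting, which is why the resonance-based and Gauss-gun designs aim to concentrate stored energy into a stronger impact than the gradient alone can deliver.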
A key issue is real-time control, Becker said. Blood vessels move within the body, so it is crucial to see both the anatomy and the robot as it moves in order to keep it on course. Even the fastest current MRI scans are too slow for such control, and there is a time lag before the information is available. Developing such a system is a multidisciplinary task that must seamlessly integrate sensing from the MRI scanner with milli-robot control, closing the loop by using the scanner itself to drive the milli-robots.
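As an illustration only, and not the project's actual software, that closed loop might be sketched as alternating imaging and propulsion steps, with a simple predictor compensating for the lag between image acquisition and the position estimate; every name, gain and timing value below is an assumption.

# Minimal sketch of a sense -> predict -> actuate loop for MRI-guided steering.
# All parameters are hypothetical, chosen only to illustrate the structure.
import numpy as np

DT = 0.05        # s, assumed control period (one image plus one propulsion pulse)
LATENCY = 0.10   # s, assumed delay before the localized position becomes available
GAIN = 2.0       # proportional steering gain (illustrative)

def control_step(measured_pos, est_velocity, target_pos):
    """One iteration: predict the robot's current position from a stale
    measurement, then command a gradient direction toward the target."""
    predicted_pos = measured_pos + est_velocity * LATENCY   # compensate for imaging lag
    error = target_pos - predicted_pos
    gradient_cmd = GAIN * error                              # map position error to a gradient request
    return gradient_cmd

# Hypothetical usage with 2-D positions in meters:
measured = np.array([0.010, 0.002])   # last localized robot position (stale)
velocity = np.array([0.005, 0.000])   # estimated robot velocity, m/s
target = np.array([0.030, 0.004])     # waypoint along the vessel
print(control_step(measured, velocity, target))

The prediction step is the part the article's "time lag" concern points to: without it, the controller would be steering toward where the robot used to be rather than where it is.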
Ultimately, Becker said, the goal is to use the power of an MRI to steer large numbers of robots throughout the body. While one milli-robot could target a single lesion, delivering chemotherapy or another intervention, that isn't practical for a late-stage cancer, for example.
"Targeting delivery with dozens of microsurgeons is my goal," he said. In this case, those "microsurgeons" would be robots, guided by a physician.