Telerobotic systems already allow surgeons in one location to control robotic surgical tools in another, so they can perform operations at a distance. A new proximity-sensing system, however, could make such procedures safer and more precise than ever.
In typical telerobotic surgical setups, the surgeon views the incision on a video screen, moving their fingers to correspondingly move robotic manipulator “fingers” or other implements in the remotely located operating room.
Not only does this technology make it possible for a surgeon in one city to operate on a patient in another, but it can also be used on patients at the surgeon’s own location, helping to smooth out hand tremors during delicate procedures. These systems often incorporate haptic feedback, which lets the operator feel the amount of force they’re applying to the patient’s bodily tissues via vibrations delivered to their fingertips.
That said, in the case of particularly fragile tissue, the surgeon may already be applying too much pressure when they first remotely “touch” it. And it was with this problem in mind that a team at Texas A&M University created the experimental new system.
In its current form, the system incorporates optical distance sensors applied to the inside of the fingers of a robotic gripper, which is remotely controlled by a human operator. As that device closes its fingers to grasp an object, the sensors measure the decreasing distance between themselves and the item.
This data is transmitted to a control glove worn by the operator, which delivers mild electrical pulses to their fingertips. The frequency of those pulses increases as the gripper’s fingers get closer to the object. As a result, the operator can finely modulate the amount of pressure that they’re about to apply to the item, before they actually touch it.
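The published article doesn’t specify how distance maps to pulse frequency, but the behavior described, with frequency rising as the gap closes, can be sketched roughly as follows. All parameter names and values here (sensing range, frequency limits, the linear ramp itself) are illustrative assumptions, not details from the actual system:

```python
def pulse_frequency(distance_mm, d_max=30.0, f_min=5.0, f_max=50.0):
    """Map a sensed fingertip-to-object distance to a stimulation
    pulse frequency in Hz.

    Hypothetical parameters (not from the published system):
      d_max - distance beyond which no pulses are delivered (mm)
      f_min - frequency at the edge of the sensing range (Hz)
      f_max - frequency just before contact (Hz)
    """
    if distance_mm >= d_max:
        return 0.0          # object out of range: no feedback
    if distance_mm <= 0.0:
        return f_max        # at contact: maximum pulse rate
    # Linear ramp: frequency rises as the gap closes.
    return f_min + (f_max - f_min) * (1.0 - distance_mm / d_max)

# As the gripper approaches, the operator feels faster pulses:
for d in (30.0, 20.0, 10.0, 0.0):
    print(d, pulse_frequency(d))
```

A real controller might instead use a nonlinear (e.g. logarithmic) ramp so that resolution is concentrated near contact, where fine pressure control matters most; the source doesn’t say which the researchers chose.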
In lab tests, 11 volunteers used the system to remotely perform an object-grasping task. Each person did so twice guided only by video of the gripper, and two more times guided by both video and the haptic feedback. With the feedback enabled, they were able to reduce their initial contact force by approximately 70 percent.
Ultimately, it is hoped that the technology could be used to minimize patient risk in telerobotic surgeries, and to do so in a manner that isn’t distracting.
“Our goal was to come up with a solution that would improve the accuracy in proximity estimation without increasing the burden of active thinking needed for this task,” says the lead scientist, Asst. Prof. Hangue Park. “When our technique is ready for use in surgical settings, physicians will be able to intuitively know how far their robotic fingers are from underlying structures, which means that they can keep their active focus on optimizing the surgical outcome of their patients.”
A paper on the research was recently published in the journal Scientific Reports.
Source: Texas A&M University