Instinctively Controlling a Complex Surgical Machine with Vera

Background

Vicarious Surgical’s Vera robot is a novel surgical tool that promises to be easier to use and more capable than existing robotic surgery platforms. The key to this lies in the robot’s physical form: the operating assembly resembles a small human, with fully articulated arms and a binocular camera. Vera promises to perform dozens of types of abdominal surgeries quickly, all through a single entry “port”. Other robotic surgery platforms support multiple types of surgeries, but they require multiple entry ports (and thus multiple incisions) and lack Vera’s flexibility and mobility.

Working with the Vera Human Centered Design team, I took on the challenge of finding the most intuitive way to control Vera, so that we could unlock the potential of this incredibly capable robot. With nearly 30 points of articulation to coordinate, it was a difficult task, but ultimately we were able to create a control scheme that allowed even novice users to move the robot safely and precisely.

Simulating Vera

Iteration time on a robot as complex as Vera is on the order of weeks. Worse, mathematical or software errors risk breaking physical components of the robot, costing the rest of the development team both time and money. Seeing this, I took it upon myself to create a simulated version of Vera using the Unity game engine. The simulation allowed me to experiment with new methods of controlling Vera without putting development timelines or hardware at risk. It accurately modeled the robot’s points of articulation and let me write custom code to drive those virtual joints however I saw fit. It also interfaced with our work-in-progress Surgeon Console, so users could get a sense of what it felt like to drive the robot with its actual controls.
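
To make that concrete, here is a minimal sketch, in Unity C#, of the kind of component that can drive a single virtual joint: the control scheme requests a target angle, and the joint steps toward it each frame within its limits and rate cap. The names, limits, and rates here are illustrative assumptions, not Vera’s actual values or code.

```csharp
using UnityEngine;

// Hypothetical sketch: steps one simulated revolute joint toward a commanded
// angle, respecting a joint limit and a maximum angular velocity. Values are
// illustrative, not Vera's.
public class SimulatedRevoluteJoint : MonoBehaviour
{
    [SerializeField] private Vector3 rotationAxis = Vector3.right; // local hinge axis
    [SerializeField] private float minAngleDeg = -90f;             // illustrative joint limit
    [SerializeField] private float maxAngleDeg = 90f;
    [SerializeField] private float maxDegreesPerSecond = 45f;      // rate cap to mimic real actuators

    private float currentAngleDeg;
    private float targetAngleDeg;

    // Called by the control-scheme code (e.g., from console input) to request a new pose.
    public void SetTargetAngle(float angleDeg)
    {
        targetAngleDeg = Mathf.Clamp(angleDeg, minAngleDeg, maxAngleDeg);
    }

    private void Update()
    {
        // Move toward the commanded angle no faster than the rate cap,
        // then apply the result as a local rotation about the hinge axis.
        currentAngleDeg = Mathf.MoveTowardsAngle(
            currentAngleDeg, targetAngleDeg, maxDegreesPerSecond * Time.deltaTime);
        transform.localRotation = Quaternion.AngleAxis(currentAngleDeg, rotationAxis);
    }
}
```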

I intentionally built the simulation to match our current robot architecture. My goal was to help bridge some of the communication gap between the engineering and design teams by making sure our designs could be easily translated to work within the engineering structure.
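
As a rough illustration of that bridging, the simulation can expose the same kind of joint-command interface the engineering stack does, so a design prototype maps onto the real robot almost one-to-one. The joint names and interface shape below are hypothetical, not Vera’s actual architecture.

```csharp
// Hypothetical sketch: control schemes write joint targets through a shared
// interface, whether the receiver is the simulated robot or the real one.
// Joint names and the interface shape are assumptions for illustration.
public enum JointId { ShoulderPitch, ShoulderYaw, Elbow, WristPitch, WristYaw, Grasper /* ... */ }

public interface IJointCommandSink
{
    // Request that a named joint move to an angle (degrees); the receiver
    // (simulated or real) is responsible for limit and rate enforcement.
    void CommandJoint(JointId joint, float targetAngleDeg);
}
```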

Formative Research

After creating the simulation and designing several alternative control schemes for the robot, I helped run two formative user research studies to determine the best option and eventually refine it into a shippable concept. During the studies, participants performed a series of tasks using the simulated robot I had created. As they went through the exercises, we asked participants how they thought the robot was behaving, to gauge their understanding of the complex system. We also gathered data to help determine how difficult it was to control the robot with precision and fluidity. Our goal for these studies was to find the control scheme with the least cognitive overhead, one that allowed surgeons to control the robot instinctively.
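
As a flavor of that data, here is one simple, hypothetical metric the simulation could log: the RMS deviation of the tool tip from a straight-line reference path during a guided task. This illustrates the kind of precision measure such a study might use, not the study’s actual instrument.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Illustrative only: quantifies precision by logging the tool tip each frame
// and computing its RMS distance from a straight-line reference path.
public static class PrecisionMetrics
{
    public static float RmsDeviation(IReadOnlyList<Vector3> toolTipSamples, Vector3 pathStart, Vector3 pathEnd)
    {
        if (toolTipSamples.Count == 0) return 0f;

        Vector3 segment = pathEnd - pathStart;
        float segLenSq = Mathf.Max(segment.sqrMagnitude, 1e-6f); // guard against a degenerate path

        float sumSquared = 0f;
        foreach (Vector3 sample in toolTipSamples)
        {
            // Distance from the sample to its closest point on the reference segment.
            float t = Mathf.Clamp01(Vector3.Dot(sample - pathStart, segment) / segLenSq);
            Vector3 closest = pathStart + t * segment;
            sumSquared += (sample - closest).sqrMagnitude;
        }
        return Mathf.Sqrt(sumSquared / toolTipSamples.Count);
    }
}
```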

Backed by the study data and enthusiastic reviews from many of our surgeon participants, we recommended shipping Vera with one of the control schemes I had originally created. Though the details are protected by an NDA, the control scheme allows surgeons to maneuver easily around the abdominal space while maintaining safe control of the robot’s arms and head. Importantly, it is easy to learn and does not require a deep understanding of the robot’s mechanics to use effectively.