Vera Surgical Robot Controls

Vicarious Surgical’s Vera robot is a novel surgical tool that promises to be easier to use and more capable than existing robotic surgery platforms. I designed its control scheme.

The key to Vera’s potential ease of use lies in the robot’s physical form. The operating assembly resembles a small human, with fully articulated arms and a binocular camera. Vera promises to perform dozens of types of abdominal surgery quickly, all through a single entry “port”. Other robotic surgery platforms support multiple types of surgery, but they require multiple entry ports (and thus multiple incisions) and lack Vera’s flexibility and mobility.

Working with the Vera Human Centered Design team, I took on the challenge of finding the most intuitive way to control Vera, so that surgeons could unlock the potential of this incredibly capable robot. With nearly 30 points of articulation, it was a difficult task, but ultimately we created a control scheme that allowed even novice users to move the robot safely and precisely.

Simulating Vera

Experimenting with control concepts was time-consuming, expensive, and risky. I created a simulation to speed up and de-risk the process.

Iteration time on a robot as complex as Vera was on the order of weeks. Additionally, mathematical or software errors risked breaking physical components of the robot, costing the rest of the development team both time and money. Seeing this, I took it upon myself to create a simulated version of Vera using the Unity game engine. The simulation allowed me to experiment with new methods of controlling Vera without putting development timelines or hardware at risk. It accurately modeled the robot’s points of articulation and let me write custom code to drive those virtual joints as I saw fit. It also interfaced with our work-in-progress Surgeon Console, so users could get a sense of what it felt like to drive the robot using its actual controls.
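The actual joint-driving code lives in Unity and isn’t public, but the core safety idea behind driving a virtual joint can be sketched in a few lines: each frame, the joint slews toward its commanded angle at a capped rate, so no single input can snap the robot into a new pose. The following Python sketch is purely illustrative; the names and limits are hypothetical, not Vera’s actual implementation:

```python
import math

def step_joint(current, target, max_rate, dt):
    """Advance a joint angle toward its commanded target, capped at
    max_rate (radians per second). Rate limiting like this keeps a
    simulated (or physical) joint from jumping dangerously in a
    single frame, no matter how large the commanded change is."""
    max_step = max_rate * dt
    delta = target - current
    if abs(delta) <= max_step:
        return target                          # close enough: snap to target
    return current + math.copysign(max_step, delta)

# A joint commanded from 0 to 1 rad at 0.5 rad/s moves 0.5 rad per second:
angle = step_joint(0.0, 1.0, max_rate=0.5, dt=1.0)   # 0.5
angle = step_joint(angle, 1.0, max_rate=0.5, dt=1.0)  # reaches 1.0
```

Calling this once per simulation frame (with `dt` as the frame time) gives smooth, bounded motion regardless of how erratic the input device is.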

I intentionally built the simulation to match the robot’s architecture, complete with the individual drives that inserted and extracted the robot’s three main tools, and virtual “control points” representing the important frames of reference that Vera’s software engineers used to control the robot. That parity between simulation and reality helped me bridge the communication gap between the company’s engineering and design teams. Using Unity’s runtime editor, I could quickly demonstrate how components moved in response to different inputs, and whether a given control concept was viable. My knowledge of computational geometry also helped me translate ideas among the project’s stakeholders, with the simulation serving as an excellent visualization aid.
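To give a concrete sense of what those “control points” are: in a serial kinematic chain, each joint defines a frame of reference, and the position of every downstream frame follows from the joint angles via forward kinematics. Vera’s real chain and its Unity implementation are far more involved (and covered by NDA), but a planar two-link sketch in Python conveys the idea; all names and numbers here are hypothetical:

```python
import math

def control_points(link_lengths, joint_angles):
    """Forward kinematics for a planar serial chain: returns the (x, y)
    position of each frame along the chain, base to tip. Each of these
    frames is a candidate "control point" a control scheme could target."""
    points = [(0.0, 0.0)]            # base frame at the origin
    x = y = theta = 0.0
    for length, angle in zip(link_lengths, joint_angles):
        theta += angle               # joint rotations accumulate down the chain
        x += length * math.cos(theta)
        y += length * math.sin(theta)
        points.append((x, y))
    return points

# Two unit links: first joint rotated up 90 degrees, second back down 90.
pts = control_points([1.0, 1.0], [math.pi / 2, -math.pi / 2])
# The elbow frame sits at (0, 1); the tip frame extends out to (1, 1).
```

A control scheme can then be framed as the inverse problem: given where the surgeon wants a particular control point to go, solve for the joint angles that put it there.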

Formative Research

After narrowing a broad set of ideas down to a single, testable concept, I helped gather and synthesize feedback in two formative user research studies.

During the studies, surgeons and surgical techs performed a series of tasks using the simulated robot I created. As they went through the exercises, we asked participants how they thought the robot was behaving, to gauge their understanding of the complex system. We also gathered data to help determine how difficult it was to control the robot with precision and fluidity. Our goal in these studies was to find the control scheme with the least cognitive overhead, one that let surgeons control the robot instinctively.

With the support of study data and enthusiastic reviews from many of our surgeon participants, we made a recommendation to ship Vera with one of the control schemes I originally created. Though the details are protected by an NDA, the control scheme allows surgeons to maneuver around the abdominal space easily while maintaining safe control of the robot’s arms and head. Importantly, it is easy to learn, and does not require a deep understanding of the robot’s mechanics to use effectively.