
Speedy Collision Detector Could Make Robots Better Human Assistants

A team of engineers at UC San Diego developed Fastron, a faster collision detection algorithm that could enable robots to perform assistive tasks more fluidly in the operating room. Photo credit: David Baillot/UC San Diego Jacobs School of Engineering

By:

  • Liezel Labios


Electrical engineers at the University of California San Diego have developed a faster collision detection algorithm that uses machine learning to help robots avoid moving objects and weave through complex, rapidly changing environments in real time. The algorithm, dubbed “Fastron,” runs up to 8 times faster than existing collision detection algorithms.

A team of engineers, led by Michael Yip, a professor of electrical and computer engineering and member of the Contextual Robotics Institute at UC San Diego, will present the new algorithm at the first annual Conference on Robot Learning, Nov. 13 to 15 at Google headquarters in Mountain View, Calif. The invitation-only conference brings together top machine learning scientists, and Yip’s team will deliver one of the long talks during the three-day event.

The team envisions that Fastron will be broadly useful for robots that operate in human environments, where they must be able to work fluidly with moving objects and people. One application they are exploring in particular is robot-assisted surgery using the da Vinci Surgical System, in which a robotic arm would autonomously perform assistive tasks (suction, irrigation or pulling tissue back) without getting in the way of the surgeon-controlled arms or the patient’s organs.

“This algorithm could help a robot assistant cooperate in surgery in a safe way,” Yip said.

The team also envisions that Fastron could be used for robots that work in the home for assisted living applications, as well as in computer graphics for the gaming and movie industries, where collision checking is often a computational bottleneck.

A problem with existing collision detection algorithms is that they are very computation-heavy. They spend a lot of time specifying all the points in a given space—the specific 3D geometries of the robot and obstacles—and performing collision checks on every single point to determine whether two bodies are intersecting at any given time. The computation gets even more demanding when obstacles are moving.
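For illustration, the brute-force approach described above can be sketched as a pairwise distance test between points sampled on the robot and on each obstacle. The Python function below is only a sketch; the point-cloud representation, the `clearance` threshold and the function name are assumptions for illustration, not details from the study.

```python
import numpy as np

def geometric_collision_check(robot_points, obstacle_points, clearance=0.05):
    """Naive geometric check: test every sampled robot point against every obstacle point.

    robot_points, obstacle_points: (N, 3) and (M, 3) arrays of surface samples.
    Returns True when any pair is closer than `clearance`, treated here as a collision.
    This O(N*M) pairwise test must be redone every time an obstacle moves.
    """
    diffs = robot_points[:, None, :] - obstacle_points[None, :, :]  # (N, M, 3) differences
    dists = np.linalg.norm(diffs, axis=-1)                          # (N, M) distances
    return bool(np.any(dists <= clearance))
```

The cost of this kind of check grows with the complexity of the geometry and has to be paid again for every motion of the robot or the obstacles, which is the load Fastron is designed to avoid.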

To lighten the computational load, Yip and his team in the Advanced Robotics and Controls Lab (ARClab) at UC San Diego developed a minimalistic approach to collision detection. The result was Fastron, an algorithm that uses machine learning strategies—which are traditionally used to classify objects—to classify collisions versus non-collisions in dynamic environments. “We actually don’t need to know all the specific geometries and points. All we need to know is whether the robot’s current position is in collision or not,” said Nikhil Das, an electrical engineering Ph.D. student in Yip’s group and the study’s first author.
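A minimal sketch of that idea, under assumed interfaces: the expensive geometric checker is wrapped as a black-box oracle that returns only a binary in-collision/collision-free label for a given robot configuration. The `forward_kinematics` and `in_collision` callables and the sign convention are illustrative assumptions, not the paper's notation.

```python
from typing import Callable
import numpy as np

# Sign convention used in these sketches: +1 means "in collision", -1 means "collision-free".
CollisionOracle = Callable[[np.ndarray], int]

def make_label_oracle(forward_kinematics: Callable[[np.ndarray], np.ndarray],
                      in_collision: Callable[[np.ndarray], bool]) -> CollisionOracle:
    """Wrap any geometric checker as a black-box oracle that returns only a label.

    `forward_kinematics(q)` maps a configuration (e.g. joint angles) to sampled
    points on the robot's body, and `in_collision(points)` is a geometric test
    such as the pairwise-distance sketch above; both are assumed interfaces.
    The learner never sees the geometry, only the +1 / -1 answer.
    """
    def oracle(q: np.ndarray) -> int:
        return 1 if in_collision(forward_kinematics(q)) else -1
    return oracle
```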

The name Fastron comes from combining “fast” and “perceptron,” a machine learning technique for performing classification. An important feature of Fastron is that it updates its classification boundaries very quickly to accommodate moving scenes, something that has been challenging for the machine learning community in general.
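To make the perceptron connection concrete, here is a simplified kernel-perceptron classifier in the same spirit. The Gaussian kernel, the `gamma` width and the exact update rule are assumptions made for illustration; this is not the published Fastron formulation.

```python
import numpy as np

def rbf_kernel(q1, q2, gamma=10.0):
    """Gaussian kernel over configurations; gamma is an assumed width parameter."""
    return float(np.exp(-gamma * np.sum((np.asarray(q1) - np.asarray(q2)) ** 2)))

class KernelPerceptronProxy:
    """Sketch of a kernel-perceptron proxy collision classifier.

    Misclassified configurations become weighted support points; the sign of the
    weighted kernel score predicts +1 (collision) or -1 (collision-free).
    """
    def __init__(self, gamma=10.0):
        self.support, self.weights, self.gamma = [], [], gamma

    def score(self, q):
        return sum(w * rbf_kernel(q, s, self.gamma)
                   for w, s in zip(self.weights, self.support))

    def predict(self, q):
        return 1 if self.score(q) > 0 else -1

    def update(self, q, true_label):
        # Perceptron-style correction: only add q when the current model gets it wrong.
        if self.predict(q) != true_label:
            self.support.append(np.asarray(q, dtype=float))
            self.weights.append(float(true_label))
```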

Fastron’s active learning strategy works using a feedback loop. It starts out by creating a model of the robot’s configuration space, or C-space, which is the space showing all possible positions the robot can attain. Fastron models the C-space using just a sparse set of points, consisting of a small number of so-called collision points and collision-free points. The algorithm then defines a classification boundary between the collision and collision-free points—this boundary is essentially a rough outline of where the abstract obstacles are in the C-space. As obstacles move, the classification boundary changes. Rather than performing collision checks on each point in the C-space, as is done with other algorithms, Fastron intelligently selects checks near the boundaries. Once it classifies the collisions and non-collisions, the algorithm updates its classifier and then continues the cycle.
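The feedback loop described above might be sketched roughly as follows, reusing the classifier sketch from earlier. The rule for choosing which configurations to check (those the model scores closest to its boundary) and the per-cycle budget `n_checks` are illustrative assumptions, not the paper's exact active learning strategy.

```python
import numpy as np

def fastron_style_cycle(model, oracle, samples, n_checks=50):
    """One pass of the feedback loop (illustrative sketch).

    model   : proxy classifier with score()/update(), e.g. the kernel-perceptron sketch
    oracle  : ground-truth collision checker, q -> +1 (collision) / -1 (free)
    samples : (N, d) array of sparse configuration-space samples
    """
    # 1. Rank the sparse samples by how close the model thinks they sit to its boundary
    #    (small |score| = uncertain, i.e. near the classification boundary).
    scores = np.array([abs(model.score(q)) for q in samples])
    near_boundary = np.argsort(scores)[:n_checks]
    # 2. Spend the expensive geometric checks only on those boundary samples.
    for idx in near_boundary:
        model.update(samples[idx], oracle(samples[idx]))
    # 3. The refreshed model is queried cheaply until the next cycle, as obstacles move.
    return model
```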

Because Fastron’s models are simpler, the researchers set its collision checks to be more conservative. Since just a few points represent the entire space, Das explained, it’s not always certain what’s happening in the space between two points, so the team designed the algorithm to predict a collision in that space. “We leaned toward making a risk-averse model and essentially padded the workspace obstacles,” Das said. This means the robot can be tuned to be more conservative in sensitive environments such as surgery, or for robots that work at home for assisted living.
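One simple way to express that risk-averse bias, under the same assumed sign convention as the sketches above, is to shift the classifier’s decision threshold so that borderline configurations are reported as collisions. The `margin` parameter here is a hypothetical tuning knob, not a value from the paper, and the published algorithm may realize the padding differently.

```python
def conservative_predict(model, q, margin=0.1):
    """Risk-averse variant of the proxy prediction (illustrative only).

    With the +1 = collision convention, lowering the decision threshold below
    zero makes the classifier report a collision even when the score is only
    slightly on the "free" side, which acts like padding the obstacles in
    configuration space.
    """
    return 1 if model.score(q) > -margin else -1
```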

So far, the team has demonstrated the algorithm in computer simulations of robots and obstacles. Moving forward, the researchers are working to further improve Fastron’s speed and accuracy. Their goal is to implement it in robotic surgery and home care robot settings.

Paper title: “Fastron: An Online Learning-Based Model and Active Learning Strategy for Proxy Collision Detection.” Authors of the study are Nikhil Das, Naman Gupta and Michael Yip in the Advanced Robotics and Controls Lab (ARClab) at UC San Diego.
