Robot Reboot

Lilly Nekervis is modifying a robot dog to be a guide for people who are visually impaired.

October 9th, 2024

In Fall 2023, Lilly Nekervis was taking a web development class in Sitterson Hall when she glanced at a building map and noticed a robotics lab on the second floor. Intrigued, she decided to check it out — and found a headless, tailless, four-legged dog-like machine lying on the floor.

“It was a Spot robot!” Nekervis exclaims when retelling the story. “And I wanted to work on it.”

She knocked on the door of Carolina computer scientist Daniel Szafir, who leads the Interactive Robotics and Novel Technologies (IRON) Lab. After an interview, Szafir offered Nekervis a position as an undergraduate research assistant.

Originally created by Boston Dynamics, Spot robots are equipped with cameras and sensors that enable them to comprehend and navigate their environments. They can be programmed to complete a range of tasks, from inspecting hazardous areas to aiding in search and rescue operations to transporting equipment. But little research has been done on how they can be used to support individuals with disabilities.

This past summer, Nekervis — a senior majoring in information and library science — began a research project to enhance the robot’s capabilities to assist individuals with visual impairments. She is modifying and testing existing software and hardware, and creating new components, to transform the motorized animal into a guide dog.

And thanks to a Summer Undergraduate Research Fellowship, Nekervis stayed in Chapel Hill to finish the robot’s reboot.

“I hope to bring awareness to those with visual impairments, create more assistive technology, and bridge the gap between robotics and assistive technology,” Nekervis explains.

Innovative adaptations

Nekervis’ goal isn’t to replace guide dogs but rather to provide people with an additional mobility aid.

“Spot is loud — it makes noise — and it would bring attention to people with visual impairments,” she says. “It also allows people to create this sort of autonomous system if they are allergic to dogs, can’t have dogs around, or are not able to take care of a dog.”

Transforming Spot into a guide dog required Nekervis to tackle several challenges. First, she needed to develop software that lets the robot safely maneuver around an obstacle while a person walks beside it. Next, she integrated cameras to help it detect its environment, a high-performance computing platform to quickly process data, and a speaker and microphone to facilitate interactions between the user and Spot.

“We had to connect all these things together using several different nodes to let the robot interpret and analyze the information coming into it to create a seeing eye dog,” Nekervis explains.
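The article doesn’t name the software stack, but the mention of “nodes” suggests a publish/subscribe robotics framework along the lines of ROS 2. The sketch below is illustrative only — the topic names, message types, and guidance logic are hypothetical — and shows how one such node could tie a camera feed to the robot’s motion commands.

```python
# Minimal sketch of one perception/guidance node, assuming a ROS 2 (rclpy) setup.
# Topic names and the guidance logic are placeholders, not the project's actual code.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from geometry_msgs.msg import Twist


class ObstacleGuidanceNode(Node):
    """Subscribes to camera frames and publishes velocity commands."""

    def __init__(self):
        super().__init__('obstacle_guidance')
        # Hypothetical topics; a real setup would match Spot's camera and motion drivers.
        self.subscription = self.create_subscription(
            Image, '/camera/front/image_raw', self.on_image, 10)
        self.cmd_pub = self.create_publisher(Twist, '/cmd_vel', 10)

    def on_image(self, msg: Image) -> None:
        # Placeholder: a real node would run obstacle detection here and adjust
        # the commanded path so the person walking beside the robot stays clear.
        cmd = Twist()
        cmd.linear.x = 0.5  # steady forward pace in m/s (illustrative value)
        self.cmd_pub.publish(cmd)


def main():
    rclpy.init()
    node = ObstacleGuidanceNode()
    rclpy.spin(node)
    node.destroy_node()
    rclpy.shutdown()


if __name__ == '__main__':
    main()
```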

With the assistance of Jim Mahaney, director of the Experimental Engineering Lab, she built a specialized aluminum harness using a lathe, drill press, and welding equipment in the lab’s machine shop.

When testing the modifications, participants wore blindfolds and held onto the harness while being guided by Spot from one location to another. Participants also wore a QR code patch on their pant legs, allowing the robot to track their position.
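The article doesn’t describe how the QR patch is read. As one possibility, OpenCV’s built-in QR detector can locate the code’s corners in each camera frame and estimate where the walker is relative to the robot. The snippet below is a rough sketch of that idea; the camera source and the offset logic are stand-ins.

```python
# Sketch of tracking a QR patch on the walker's pant leg, assuming OpenCV.
# The camera index and the offset interpretation are illustrative assumptions.
import cv2

detector = cv2.QRCodeDetector()
cap = cv2.VideoCapture(0)  # stand-in for the robot's side-facing camera

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    # detect() returns a success flag and the four corner points of the code.
    found, points = detector.detect(frame)
    if found and points is not None:
        xs = points.reshape(-1, 2)[:, 0]
        center_x = float(xs.mean())
        # Compare the patch position to the frame center to estimate whether
        # the person is drifting ahead of, behind, or away from the robot.
        offset = center_x - frame.shape[1] / 2
        print(f"walker offset from expected position: {offset:.1f} px")

cap.release()
```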

“The most exciting part is getting to work with Spot,” Nekervis shares. “I got to learn so much for this project — everything from working in the machine shop to working in [programming languages] Python, C, and C++.”

Nekervis will continue her research this fall. She hopes to pursue a PhD in computer science with a focus on robotics, though she remains open to industry roles that align with her interests.

“I love research too much to let go of it,” she says.

Lilly Nekervis is a fourth-year student within the UNC School of Information and Library Science and an undergraduate research assistant in the Interactive Robotics and Novel Technologies (IRON) Lab.