Autonomous robots have been deployed at airports, on college campuses, and even in suburban neighborhoods, but their use remains limited. A group of researchers at the University of Texas at Austin now wants to find out how robots can be better enmeshed in a community.
Starting next year, the researchers plan to deploy two autonomous robots on UT Austin’s campus. Students and professors will be able to order free supplies such as wipes and hand sanitizer via a smartphone app, and the robots will deliver them door-to-door on campus. The robots will encounter people along the way, and the researchers will observe how the robots and humans behave and interact.
The goal of the research, which received a $3.6 million grant from the National Science Foundation, is to identify the services that would make the robots useful and to adapt the robots to their community, said Luis Sentis, a professor at UT Austin’s Cockrell School of Engineering and the leader of the project. The findings will be used to develop standards for the safety, behavior, and communication of these robotic systems, which will be useful for commercial efforts, he said.
“This study that we’re performing is great timing because it’s happening before commercially successful, large-scale deployment,” Sentis said. “[W]e’re trying to be anticipatory and ahead of the markets.”
While Amazon and FedEx are pulling back from their delivery robot ambitions, the global market for autonomous mobile robots is expected to grow from $5 billion to $18 billion by 2030, according to Precedence Research, a market research firm.
What data will be collected from the human-robot encounters in public
Autonomous robots are designed to travel on their own from point A to point B without colliding or falling, but they focus only on the task at hand, Sentis said. Although the robots are autonomous, he said, a human will always be supervising in case a robot does something unexpected.
The researchers will collect data in two ways. First, they will monitor the interactions between the robots and their human supervisors, with the goal of improving oversight for a fleet of robots. These human chaperones will wear headbands with brain sensors that can detect, for instance, when their stress levels increase, said Sentis.
Second, the researchers will observe and interview people who encounter the robots in a variety of contexts, such as night versus day and crowded versus sparse areas. The robots will also emit different sounds, and the researchers will gather information about how those affect people, said Sentis. The observations will help the research team make recommendations, such as how much distance robots should keep from people.
One of the modes the researchers are creating for the robots is gameplay. Each robot will always carry a ball; in gameplay mode, people can stop the robot in the middle of the delivery task for a game of fetch.
The study will use two types of dog-like robots: one from Boston Dynamics and the ANYmal model from ANYbotics AG, which has a more industrial look. Both robots have four legs, like a dog, which allows them to move up and down stairs, said Sentis. The biggest technical challenge for autonomous robots is navigating crowded areas, because people tend to move erratically, he said.
Autonomous robots haven’t always been embraced by the public. In 2020, the New York Police Department used a Boston Dynamics robotic dog to reach places that were too dangerous to send officers. This led to a backlash from critics over surveillance, bias, and privacy.