News Release

Shuran Song receives NSF CAREER Award

Roboticist’s proposed framework to enable robots to learn on their own and adapt to new environments could revolutionize everything from home service robots to emergency response systems

Grant and Award Announcement

Columbia University School of Engineering and Applied Science


image: Shuran Song, assistant professor of computer science, received the award to create a framework--machine learning algorithms--that will enable robots to explore their environment and decide how to accomplish a task based on the situation they face.

Credit: Columbia Engineering

Shuran Song, assistant professor of computer science, has won an NSF CAREER award for her proposal to create a framework--machine learning algorithms--that will enable robots to explore their environment and decide how to accomplish a task based on the situation they face. 

The five-year, $600,000 grant will support her project, “Active Scene Understanding By and For Robot Manipulation,” which builds upon her previous research in 3D scene understanding and self-supervised learning frameworks. The new project will focus on creating a framework she calls “active scene understanding,” in which the agent leverages its ability to interact with the world to better understand what it sees--from discovering new objects to deciphering their physical dynamics--and exploits the learned knowledge to accomplish the task at hand.

“Despite significant progress, most robot perception systems today remain limited to seeing what they are asked to see,” said Song, who joined Columbia Engineering in 2018 and directs the Columbia Artificial Intelligence and Robotics (CAIR) Lab. Song’s long-term research goal is to build robust perception systems that enable robots to be useful outside of labs and factories and to assist people in our dynamic and unstructured open world. 

The major challenge Song hopes to solve is how to build a unified framework that can handle a diverse set of complex environments without needing to be re-engineered for each new task or scenario. Successful algorithms for inferring scene representations through active interaction could transform workflows in a range of applications, from field and space robotics to home robots.

Song’s proposed “active scene understanding” framework improves a robot’s fundamental perception and planning capabilities, making it robust, flexible, and resourceful in unstructured environments. Using the framework, robots will be able to gather observations about their environment that would otherwise be difficult or impossible to obtain.

For example, when cooking in a new kitchen, we might open the fridge to catalog available ingredients, turn bottles to read their labels, or gently bend a spatula to gauge its stiffness. In these examples, we use actions (open, turn, bend) to retrieve relevant information (ingredients, labels, stiffness) for planning future actions (cooking). Song’s research goal is to enable robots to automatically infer such information-seeking strategies whenever they deem them possible and necessary.

Another application could be in emergency response, where robots need to devise their own exploration strategies to rapidly assess an environment--a collapsed building, for instance--react to the evolving situation, and find people in the rubble.

Song noted, “Our framework, if successful, could transform life around the world--there are all kinds of things that robots should be able to do. It’s really exciting to see how we can advance how robots can learn on their own, and quickly adapt to new environments and tasks.” 
