image: Yen-Ling Kuo, an assistant professor of computer science and faculty affiliate of UVA's Link Lab, earned a $665,000 grant to advance "theory of mind" concepts in robots.
Credit: Photo by Matt Cosner, UVA Engineering
As automation becomes part of everyday life, University of Virginia computer scientist Yen-Ling Kuo wants to help robots become better social partners. Her goal: to design machines that understand not just our words, but also our movements, goals and the unspoken cues that shape human behavior.
For this work, the National Science Foundation has awarded Kuo, an assistant professor and Anita Jones Faculty Fellow in the School of Engineering’s Department of Computer Science, its prestigious CAREER Award, which recognizes early-career faculty who demonstrate the potential to serve as academic role models as they lead research advances in their field.
Kuo’s five-year, $665,000 grant supports her research in bridging artificial intelligence and cognitive science, with a focus on making interactions between humans and robots more intuitive and useful. These capabilities could improve how robots support people in the real world, from navigating sidewalks to assisting in homes, hospitals and factories.
"My goal is to build AI systems that think and learn more like people do," Kuo said. "We want robots that can reason, communicate and collaborate the way humans naturally do through shared understanding, social reasoning and adaptable behavior."
Kuo conducts her research in the Link Lab, UVA Engineering’s cyber-physical systems research center. Her recent work explores how robots can learn from human feedback, follow natural language commands and respond with socially meaningful actions. Her team develops computational models that help robots interpret multiple types of signals, from language to gaze and movement.
"A lot of human communication is unspoken — we move, we look, we infer. If robots can pick up on those signals, they become far more useful partners," Kuo said. "We want to give them the social intelligence to act appropriately and the flexibility to adapt as situations change."
With support from the CAREER Award, Kuo will expand this work to develop shared representations between humans and robots, such as understanding spatial relationships or aligning on task goals, helping both sides build a shared mental model for reasoning about each other’s actions and intentions.
A central part of her research will focus on building "theory of mind" into robots so they have the ability to infer what others know, want or intend to do. "It’s like giving robots a way to put themselves in someone else’s shoes," she said.
The CAREER Award adds to Kuo’s growing list of honors, including the Outstanding Women in Robotics and Automation (WiRA) Early Career Contribution Award from the Institute of Electrical and Electronics Engineers Robotics and Automation Society (IEEE RAS) presented at the 2025 International Conference on Robotics and Automation (ICRA); an Outstanding Paper Award from the 2024 Annual Meeting of the Association for Computational Linguistics (ACL); and a Young Faculty Researcher grant from the Toyota Research Institute.
Before joining UVA in 2023, Kuo earned her Ph.D. in computer science, with a minor in cognitive science, from the Massachusetts Institute of Technology. She was also a research intern at the MIT-IBM Watson AI Lab and previously worked as a software engineer at Google.
In addition to her research, Kuo is deeply committed to mentorship and student involvement, a key component of the NSF CAREER Award. Her lab has published or submitted more than a half-dozen research articles in just the first half of 2025.
"It’s exciting to be working on questions that sit at the intersection of AI, robotics and human behavior," Kuo said. "We want to build robots that people want to work with — and that starts with helping them engage with the rich behavior humans have."