Planning for Human-Robot Systems under Augmented Partial Observability

Date: 2/10/2022

Speaker: Shiqi Zhang

Location: 122 Gates Hall and Zoom

Time: 2:40 p.m.-3:30 p.m.

Abstract: The real world is partially observable to both people and robots. To estimate the world state, a robot needs a perception model to interpret sensory data. How does a robot plan its behaviors without such perception models? I will present our recent research on learning algorithms that help robots perceive and plan in stochastic worlds. With humans in the loop, robot planning becomes more difficult, because people and robots need to estimate not only the world state but also each other's state. The second half of the talk will cover frameworks for human-robot communication and collaboration. I will share our work on leveraging AR/VR visualization strategies for transparent human-robot teaming and effective collaboration.

About the Speaker: Dr. Shiqi Zhang is an Assistant Professor in the Department of Computer Science at the State University of New York (SUNY) at Binghamton. Before that, he was an Assistant Professor at Cleveland State University after working as a postdoc at UT Austin. He received his Ph.D. in Computer Science from Texas Tech University in 2013, and his M.S. and B.S. degrees from Harbin Institute of Technology. He leads an NSF NRI project on knowledge-based robot decision making. He received the Best Robotics Paper Award at AAMAS in 2018, a Ford URP Award (2019-2022), and an OPPO Faculty Research Award in 2020.