What we talk about when we talk about tooling

Cornell Robotics Grad Students


Location: 122 Gates Hall

Time: 2:40 p.m.

Abstract: Join RGSO for our first homegrown seminar: a discussion on tooling. A few of our very own students are ready to talk about their workflows and share tips and tricks for how they get stuff done, whether it’s programming or research reviews. Come ready to listen, learn, and, if you have a cool workflow of your own, share it.

Certifiable Outlier-Robust Geometric Perception: Robots that See through the Clutter with Confidence

Heng Yang, Massachusetts Institute of Technology


Location: 122 Gates Hall

Time: 2:40 p.m.

Abstract: Geometric perception is the task of estimating geometric models from sensor measurements and priors. The ubiquitous presence of outliers (measurements that convey little or no information about the models to be estimated) makes estimation with guaranteed optimality theoretically intractable. Despite this intractability, safety-critical robotic applications still demand trustworthiness and performance guarantees from perception algorithms. In this talk, I present certifiable outlier-robust geometric perception, a new paradigm for designing tractable algorithms that enjoy rigorous performance guarantees: they commonly return an optimal estimate with a certificate of optimality, but declare failure and provide a measure of suboptimality on worst-case instances. In particular, I present three algorithms in the certifiable perception toolbox: (i) a pruner that uses graph theory to filter out gross outliers and boost robustness against over 95% outliers; (ii) an estimator that leverages graduated non-convexity to compute the optimal estimate with high probability of success; and (iii) a certifier that employs sparse semidefinite programming (SDP) relaxation and a novel SDP solver to endow the estimator with an optimality certificate, or to escape local minima otherwise. I showcase certifiable outlier-robust perception on real robotic applications such as scan matching, satellite pose estimation, and vehicle pose and shape estimation.
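To give a flavor of step (ii) of the pipeline, here is a toy, hedged sketch of graduated non-convexity (GNC) with a Geman-McClure robust cost, applied to the simplest possible estimation problem: a scalar mean with gross outliers. The function name, parameters, and annealing schedule below are illustrative assumptions for exposition, not the speaker's implementation; the key idea is that a surrogate parameter mu starts in a convex regime and anneals toward the true non-convex robust cost while weighted least squares alternates with weight updates.

```python
import numpy as np

def gnc_gm_mean(y, c=0.5, mu_factor=1.4, iters=100):
    """Toy sketch: robust scalar mean via GNC with a Geman-McClure cost.

    Illustrative assumptions: parameter names (c, mu_factor) and the
    annealing schedule are chosen for exposition, not from any paper's code.
    """
    y = np.asarray(y, dtype=float)
    x = y.mean()                              # initial (non-robust) estimate
    r2 = (y - x) ** 2                         # squared residuals
    mu = max(1.0, 2.0 * r2.max() / c**2)      # start in the convex regime
    for _ in range(iters):
        w = (mu * c**2 / (r2 + mu * c**2)) ** 2   # GM-style weight update
        x = np.sum(w * y) / np.sum(w)             # weighted least-squares step
        r2 = (y - x) ** 2
        if mu <= 1.0:                             # reached the true GM cost
            break
        mu = max(1.0, mu / mu_factor)             # anneal toward mu = 1

    return x

inliers = np.full(20, 3.0)
data = np.concatenate([inliers, [100.0, -50.0]])  # two gross outliers
est = gnc_gm_mean(data)
```

As mu anneals down, the outliers' weights collapse toward zero and the estimate converges to the inlier consensus, without needing an initial guess inside the basin of attraction of the robust cost.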

Bio: Heng Yang is a Ph.D. candidate in the Department of Mechanical Engineering and the Laboratory for Information & Decision Systems at the Massachusetts Institute of Technology, working with Prof. Luca Carlone. His research interests include large-scale convex optimization, semidefinite relaxation, robust estimation, and machine learning, applied to robotics and trustworthy autonomy. His work includes developing certifiable outlier-robust machine perception algorithms, large-scale semidefinite programming solvers, and self-supervised geometric perception frameworks. Heng Yang is a recipient of the Best Paper Award in Robot Vision at the 2020 IEEE International Conference on Robotics and Automation (ICRA), a Best Paper Award Honorable Mention from the 2020 IEEE Robotics and Automation Letters (RA-L), and a Best Paper Award Finalist at the 2021 Robotics: Science and Systems (RSS) conference. He is a Class of 2021 RSS Pioneer.


Formalizing the Structure of Multiagent Domains for Autonomous Robot Navigation in Human Spaces

Christoforos Mavrogiannis, University of Washington


Location: 122 Gates Hall

Time: 2:40 p.m.

Abstract: Pedestrian scenes pose great challenges for robots due to the lack of formal rules regulating traffic, the lack of explicit coordination among agents, and the high dimensionality of the underlying space of outcomes. However, humans navigate with ease and comfort through a variety of complex multiagent environments, such as busy train stations, crowded malls or academic buildings. Human effectiveness in such domains can be largely attributed to cooperation, which introduces structure to multiagent behavior. In this talk, I will discuss how we can formalize this structure through the use of representations from low-dimensional topology. I will describe how these representations can be used to build prediction and planning algorithms for socially compliant robot navigation in pedestrian domains and show how their machinery may transfer to additional challenging environments such as uncontrolled street intersections.
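One concrete example of the low-dimensional topological structure mentioned above is the pairwise winding of two agents' trajectories, which distinguishes qualitatively different joint outcomes (e.g., passing on the left versus the right). The sketch below is a hedged toy illustration; the function and the example trajectories are hypothetical and not drawn from the speaker's codebase.

```python
import numpy as np

def winding_number(traj_a, traj_b):
    """Toy sketch: winding of agent A's trajectory around agent B's.

    The angle of the relative displacement is unwrapped over time; the
    total turning divided by 2*pi is a topological descriptor of how
    the two agents moved around each other.
    """
    rel = np.asarray(traj_a) - np.asarray(traj_b)    # relative positions
    theta = np.unwrap(np.arctan2(rel[:, 1], rel[:, 0]))
    return (theta[-1] - theta[0]) / (2 * np.pi)

# Hypothetical example: agent A traverses a half-circle around a
# stationary agent B at the origin, winding halfway around it.
t = np.linspace(0, np.pi, 50)
a = np.stack([np.cos(t), np.sin(t)], axis=1)
b = np.zeros_like(a)
```

Because such descriptors are discrete and low-dimensional, they can label entire homotopy classes of multiagent motion rather than individual high-dimensional trajectories.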

Bio: Christoforos (Chris) Mavrogiannis is a postdoctoral research associate in the Paul G. Allen School of Computer Science & Engineering at the University of Washington, working with Prof. Siddhartha Srinivasa. His interests lie at the intersection of motion planning, multiagent systems, and human-robot interaction. He is particularly interested in the design and evaluation of algorithms for multiagent domains in human environments. To this end, he employs tools from motion planning and machine learning, and often seeks insights from (algebraic) topology and the social sciences. Chris has been a Best Paper Award finalist at the ACM/IEEE International Conference on Human-Robot Interaction (HRI) and has been selected as a Pioneer at the HRI and RSS conferences. He has also led open-source initiatives (Openbionics, MuSHR), for which he has been a finalist for the Hackaday Prize and a winner of the Robotdalen International Innovation Award. Chris holds M.S. and Ph.D. degrees from Cornell University, and a Diploma in mechanical engineering from the National Technical University of Athens.

Welcome to the Fall 2021 Robotics Seminar!

Tapomayukh Bhattacharjee and Claire Liang


Location: 122 Gates Hall

Time: 2:40 p.m.

Hey everyone! Welcome back for the semester. Robotics seminar is starting a new era and is (officially) a class again. The first seminar will cover the logistics of what to expect from this semester’s seminar/class and serve as an introduction to Cornell Robotics as a community. We will announce some new resources (such as the new Robot Library) and take feedback on what everyone would like to see in the future. The Robotics Graduate Student Organization will also cover some of what is to come for graduate students. If you’re new to the Cornell Robotics community, be sure to come to this week’s seminar!

P.S. Unfortunately, since Cornell is at a yellow COVID level, we will not have snacks for the foreseeable future.