Workshops

2019 Cornell Workshop on Robotics (November 8, 2019)

The inaugural Cornell Workshop on Robotics was held on Friday, November 8, 2019, in Upson Hall. Industry engineers and scientists joined Cornell faculty, postdocs, and students. Companies represented included Boeing, Facebook, Fetch, Honda, Microsoft, Moog, Raymond, and the Toyota Research Institute.

Speakers in Session I, “Embracing the Messy World: Robotics at Cornell,” formed a panel to address audience questions about robotics research at Cornell. From left to right: Prof. Dan Lee, Prof. Hadas Kress-Gazit, Prof. Bharath Hariharan, PhD student Maura O’Neill (representing Prof. Rob Shepherd’s Organic Robotics Lab), and Prof. Keith Green.

The workshop presentations and panels revolved around the theme “Embracing the Messy World” – a research frontier recently articulated by Cornell faculty. The aim of this paradigm is to more thoughtfully co-design and integrate robots into existing ecosystems, and to create entirely new ones. Robots that “embrace the messy world” are actively supported by their surroundings rather than opposed by them, and collaborate with those surroundings to achieve desired outcomes. Example ecosystems were featured in the subsequent panels: “Session II – Digital Agriculture,” “Session III – Factory and Office,” and “Session IV – Public and Private Lives.”

Kress-Gazit summed up Cornell’s overall research in robotics: “We have a lot of expertise in autonomy, in design, and in interaction – human-robot integration, robot-robot interaction, those kinds of systems – but what is unique here is actually the connections between those,” she said. “It’s not that someone does autonomy and that’s it. We have a lot of collaborations between labs and within labs around these areas.” To demonstrate this breadth and interdisciplinary collaboration, tours were offered of several Cornell robotics labs.

The Session II panelists described their work in the Messy World ecosystem of Digital Agriculture (left to right): Prof. Kirstin Petersen (Cornell), Chris Layer (Moog), Dr. Anand Mishra (Cornell), Dr. Anirudh Badam (Microsoft), and Prof. Wendy Ju (Cornell Tech).

Dr. Anirudh Badam, principal research scientist at Microsoft, presented his work with agricultural drones and said the workshop was an opportunity to meet some of the faculty and students his company has been collaborating with via the Cornell Initiative on Digital Agriculture.

“And I thought this would be the right place to learn from others on what they’re doing in the space of digital agriculture,” said Badam, who added he also enjoyed learning more about Cornell Robotics. “The first session gave me a sense of how broad the robotics program is, from physical to digital systems, and the panelists talked about things that are important for society including implications for robotics; I really like that perspective.”

Prof. Hadas Kress-Gazit gives an overview of the Autonomous Systems Lab during the lab tours session. Tours of seven Cornell robotics labs were offered.

Robots in a Messy World: Embracing the “messy world” is an opportunity at the forefront of robotics. It is a new frontier that demands more of robots while alleviating the constraints imposed by their surroundings, which have historically presented barriers and challenges that robots must circumvent.

Most robots today adapt to a world that wasn’t designed for them: they perceive information, make models and assumptions, and act accordingly. By instead integrating insights from the environment, and allowing robots to work with surrounding agents and to modify and optimize their environment, we can extend robots’ reach, productivity, and adoption.

Understanding how people and other surrounding agents will respond to robots in these new ecosystems, and designing the robots, their behaviors, and the way they are deployed accordingly, will be central to how these ecosystems develop and whether they are accepted or rejected.

2019 CORNELL ROBOTICS WORKSHOP AGENDA

8:15am to 9:00am – Badge Pick-up and Continental Breakfast

9:00am to 9:10am – Welcome – Kavita Bala, Chair of the Computer Science Department, and David Erickson, Director of the Sibley School of Mechanical and Aerospace Engineering 

SESSION I: Embracing the Messy World: Robotics at Cornell

Session Chair: Jim Ballingall 

9:10am to 9:20am  – Overview of Robotics at Cornell – Prof. Hadas Kress-Gazit, Cornell

9:20am to 9:30am – Tasks and Decisions – Prof. Dan Lee, Cornell Tech

9:30am to 9:40am  – Sensors and Perception – Prof. Bharath Hariharan, Cornell  

9:40am to 9:50am – Materials and Actuators – Maura O’Neill, Cornell

9:50am to 10:00am – Humans and Robots – Prof. Keith Green, Cornell

10:00am to 10:30am – Q/A Panel – Session I Speakers (5)

10:30am to 11:00am – NETWORKING BREAK

SESSION II: Robotics in 2030 – Messy World Ecosystems – Lightning Talks & Panel – “Robotics and Digital Agriculture”

Session Chair: Prof. Hadas Kress-Gazit

11:00am to 11:05am – Digital Ag – Prof. Kirstin Petersen, Cornell

11:05am to 11:10am – Digital Ag – Anirudh Badam, Microsoft  

11:10am to 11:15am – Digital Ag – Anand Mishra, Cornell 

11:15am to 11:20am – Digital Ag – Chris Layer, Moog

11:20am to 11:25am – Digital Ag – Prof. Wendy Ju, Cornell Tech

11:25am to 12:00pm — Q/A Panel – Session II Speakers (5)

12:00pm to 1:00pm – NETWORKING LUNCH

SESSION III: Robotics in 2030 – Messy World Ecosystems – Lightning Talks & Panel – “Robotics in the Factory and Office”

Session Chair: Prof. Guy Hoffman  

1:00pm to 1:05pm – Factory – Prof. Hadas Kress-Gazit, Cornell

1:05pm to 1:10pm – Warehouses – David Dymesich, Fetch Robotics

1:10pm to 1:15pm – Workplace/Classroom – Prof. Ross Knepper, Cornell

1:15pm to 1:20pm – Simulation – Evan Drumwright, Toyota Research Institute

1:20pm to 2:00pm – Q/A Panel – Session III Speakers (4)  

2:00pm to 3:00pm – ROBOTICS LAB TOURS (see the lab overviews below the agenda)

  1. Autonomous Systems Lab – Prof. Hadas Kress-Gazit 
  2. Autonomous Systems Lab – Lexus RX Tour – Carlos Diaz-Ruiz
  3. Robotic Personal Assistants Lab – Prof. Ross Knepper
  4. Collective Embodied Intelligence Lab – Prof. Kirstin Petersen
  5. Robotics and Biomechanics Lab – Prof. Andy Ruina
  6. Organic Robotics Lab – Maura O’Neill
  7. Human-robot Collaboration & Companionship Lab – Prof. Guy Hoffman

3:00pm to 3:30pm – NETWORKING BREAK

SESSION IV: Robotics in 2030 – Messy World Ecosystems – Lightning Talks & Panel – “Robotics in Our Public and Private Lives” 

Session Chair: Prof. Dan Lee  

3:30pm to 3:35pm – Home – Prof. Guy Hoffman, Cornell

3:35pm to 3:40pm – Home – Randy Gomez, Honda Research Institute Japan

3:40pm to 3:45pm – Public Spaces – Negar Khojasteh, Cornell

3:45pm to 3:50pm – Health – Prof. Keith Green, Cornell

3:50pm to 3:55pm – Home – Sonia Chernova, Facebook AI Research & Georgia Tech

3:55pm to 4:30pm – Q/A Panel – Session IV Speakers (5)

4:40pm to 4:45pm – Wrap-up – Prof. Hadas Kress-Gazit

4:45pm to 7:00pm – NETWORKING RECEPTION

Cornell Robotics Labs Overviews:

ASL: Cornell’s Autonomous Systems Lab (ASL), directed by Professors Mark Campbell and Hadas Kress-Gazit, focuses on algorithms and hardware implementations that enable a variety of applications in the general area of autonomous/semi-autonomous robotic systems, including modular robots, swarms, multi-robot systems, and autonomous driving. Topics studied include perception, estimation, control, high-level decision making, and human-robot interaction. The lab is equipped with a variety of robots, including Aldebaran Naos, a KUKA youBot, Jackal robots, HEBI modular robots, and other small robots, in addition to a fully autonomous Chevrolet Tahoe (Skynet), one of the six finishers of the 2007 DARPA Urban Challenge.

ASL Lexus RX Tour: The Autonomous Systems Lab performs self-driving-vehicle research with its Lexus RX. The car is fitted with four 16-laser Velodyne lidars, Ibeo lidars, multiple cameras, an inertial navigation system, and an NVIDIA Drive PX. The team focuses on sensor fusion and probabilistic perception. Projects include 3D point-cloud semantic segmentation and vision-only 3D vehicle and pedestrian tracking. We also study the anticipation of vehicle motion in unstructured environments, such as garages, and the anticipation of pedestrians by incorporating scene context, such as intersection crosswalks and pedestrians’ awareness of oncoming traffic.

RPAL: Cornell’s Robotic Personal Assistants Lab (RPAL), directed by Prof. Ross Knepper, researches robotics technologies that allow robots to work alongside human beings and act as peers in ordinary tasks. Our philosophy is that robots should adapt to human intuition and practices so that people don’t need to learn how to operate them. Using software and hardware solutions, we study both the technical and social aspects of problems such as teamwork, assembly, coordination, and pedestrian navigation. We build complete robot systems that help people perform tasks like building Ikea furniture, finding their way through public spaces, assessing environmental conditions, and playing games.

ORL: Cornell’s Organic Robotics Lab (ORL), directed by Prof. Rob Shepherd, focuses on using the organic chemistry of soft material composites to give robots new capabilities. Step into our lab space to see how these soft composite materials can be integrated into distributed sensing and actuation systems for cardiac devices, prosthetic limbs, and wearable sensing networks. The laboratory is highly interdisciplinary, with dedicated spaces for material synthesis, hardware development, and advanced manufacturing.

RBL: Cornell’s Robotics and Biomechanics Lab (RBL), directed by Prof. Andy Ruina, has many ongoing projects, active or pending funding, including:

WALKING ROBOTS: We aim for the humanoid world record for energy effectiveness. Despite the lab’s reputation, we no longer frame things in terms of passive dynamics.

ROBOTIC BICYCLES: Using bicycle dynamics models in model-based control, we are working on autonomous bicycles and steer-by-wire bicycles.

ROBOTIC SAILBOATS: We would like to make 1 m boats that can sail the oceans for months.

CONTACT MODELING: We are developing faster yet adequately accurate contact models for use in model-based robotic control design.

VIDEO GAMES AS A PROXY FOR ROBOTICS: We are trying to master the game QWOP as a proxy control problem.

HRC2: Cornell’s Human-Robot Collaboration & Companionship Lab (HRC2), directed by Prof. Guy Hoffman, studies the computational, engineering, and social aspects of interactions between humans and robots. Research interests include: human-robot teamwork and collaboration; personal robotic companions; non-anthropomorphic robot design; AI and machine learning for human-robot interaction (HRI); human outcomes for robots in the workplace; robot assistants for cognitive tasks; timing and fluency in HRI and multi-agent MDPs; entertainment, theater, and musical performance robots; robot improvisation; and nonverbal communication in HRI.

CEIL: Cornell’s Collective Embodied Intelligence Lab (CEIL), directed by Prof. Kirstin Petersen, focuses on the design and coordination of large robot collectives able to achieve complex behaviors beyond the reach of single-robot systems, and on corresponding studies of how social insects do so in nature. Major research topics include swarm intelligence, embodied intelligence, autonomous construction, bio-cyber-physical systems, human-swarm interaction, and soft robots.