Enabling Local-To-Global Behaviors Through a Scalable, Deformable Collective

12/10/2019

Location: Upson 106 Conference Room Next to the Lounge

Time: 3:00 p.m.

Abstract: Modular self-reconfigurable robots are typically composed of homogeneous units executing a set of programmed interactions with their neighbors based on a deterministic rule set. Some stochastic modular robotic systems take advantage of their physical design to arrive at a desired state and offer great potential for scalability. We propose combining an innovative hardware design with a control algorithm that enables intermodule interactions with some inherent randomness to ensure successful attachment through permanent magnets. We present the FOAMbots, a scalable, planar, modular robot composed of inflatable units, capable of onboard processing, actuation, sensing, and communication. Each module contains a poro-elastic foam that provides structural integrity while allowing fluid to flow through its volume. Pairs of permanent magnets along the modules’ perimeters enable attachment to adjacent modules, and low-cost, stretchable strain sensors allow modules to communicate and sense their surroundings. This presentation will introduce the hardware, present characterizations of the modular robot’s mechanical and locomotion properties, and discuss the algorithms currently being implemented to achieve local-to-global changes in the collective’s mechanical properties.

Heterogeneous Team of Robots: Sampling in Aquatic Environments

Alberto Quattrini Li, Dartmouth College

12/3/2019

Location: Upson 106 Conference Room Next to the Lounge

Time: 3:00 p.m.

Abstract: How can robots effectively explore, monitor, and sample in large-scale aquatic environments? This talk presents a recent interdisciplinary project funded by the National Science Foundation on monitoring cyanobacterial blooms in lakes with a team of heterogeneous robots. I will present a sample of solutions that involve the development and deployment of aquatic robotic systems for data collection. First, I show our efficient multirobot algorithms for a team of Autonomous Surface Vehicles governed by Dubins vehicle dynamics to cover large areas of interest. Field trials with a custom-modified motorized kayak are presented, providing insights for improvements.

Second, I discuss the use of a heterogeneous team of robots that exploits their complementary capabilities to reduce operational cost and increase mission time for environmental monitoring and water sampling. Using machine learning techniques to model the distribution of the observed phenomena, we developed adaptive exploration and sampling strategies that account for reduction in uncertainty. Results from several field experiments, together with some lessons learned, will be presented.

The talk will conclude with a discussion of open problems that still need to be fully addressed for a robust multirobot system capable of tackling environmental problems, such as ensuring high-quality data and providing recovery mechanisms, and of current work toward the long-term goal of a ubiquitous collaborative multiagent/multirobot system for accomplishing large-scale real-world tasks.

Bio: Alberto Quattrini Li is an assistant professor in the Department of Computer Science at Dartmouth College and co-director of the Dartmouth Reality and Robotics Lab. He was a postdoctoral fellow and research assistant professor in the Autonomous Field Robotics Laboratory (AFRL), led by Professor Ioannis Rekleitis, at the University of South Carolina from 2015 to 2018. During 2014, he was a visiting PhD student in the Robotic Sensor Networks Lab, directed by Professor Volkan Isler, at the Department of Computer Science and Engineering, University of Minnesota. He received an M.Sc. (2011) and a Ph.D. (2015) in Computer Science and Engineering from Politecnico di Milano, working with Professor Francesco Amigoni. His main research interests (currently funded by the National Science Foundation) include autonomous mobile robotics and active perception, applied to the aquatic domain, dealing with problems that span from multirobot exploration and coverage to multisensor-fusion-based state estimation. He has worked with many ground and marine robots, including Autonomous Surface Vehicles and Autonomous Underwater Vehicles.

 

Robotics Day

Date: 12/10/2019

Time: 10:00 am – 4:00 pm

Location: Duffield Hall Atrium

Join us for a day of robotics celebration and competition, with interactive exhibits by our award-winning project and research teams. All are welcome!

Robotic Maze Runners
ECE 3400: Intelligent Physical Systems
10:00 am to 12:00 pm

Cube Crazy Robots
MAE 3780: Mechatronics
1:00 pm to 4:00 pm

Formal Verification of End-to-End Deep Reinforcement Learning

Yasser Shoukry, University of California – Irvine

11/26/2019

Location: Upson 106 Conference Room Next to the Lounge

Time: 3:00 p.m.

Abstract: From simple logical constructs to complex deep neural network models, Artificial Intelligence (AI) agents are increasingly controlling physical/mechanical systems. Self-driving cars, drones, and smart cities are just a few examples of such systems. However, despite the explosion in the use of AI across a multitude of cyber-physical systems (CPS) domains, the safety and reliability of these AI-enabled CPS remain understudied problems. Mathematically based techniques for the specification, development, and verification of software and hardware systems, also known as formal methods, hold the promise of providing appropriately rigorous analysis of the reliability and safety of AI-enabled CPS. In this talk, I will discuss our work on applying formal methods to verify the safety of autonomous vehicles controlled by end-to-end machine learning models and to synthesize certifiable end-to-end neural network architectures.

Bio: Yasser Shoukry is an Assistant Professor in the Department of Electrical Engineering and Computer Science at the University of California, Irvine, where he leads the Resilient Cyber-Physical Systems Lab. Before joining UCI, he spent two years as an assistant professor at the University of Maryland, College Park. He received his Ph.D. in Electrical Engineering from the University of California, Los Angeles in 2015. Between September 2015 and July 2017, Yasser was a joint postdoctoral researcher at UC Berkeley, UCLA, and UPenn. His current research focuses on the design and implementation of resilient cyber-physical systems and IoT. His work in this domain was recognized by the NSF CAREER Award, the Best Demo Award from the International Conference on Information Processing in Sensor Networks (IPSN) in 2017, the Best Paper Award from the International Conference on Cyber-Physical Systems (ICCPS) in 2016, and the Distinguished Dissertation Award from the UCLA EE department in 2016. In 2015, he led the UCLA/Caltech/CMU team to win the NSF Early Career Investigators (NSF-ECI) research challenge. His team represented the NSF-ECI in the NIST Global Cities Technology Challenge, an initiative designed to advance the deployment of Internet of Things (IoT) technologies within a smart city. He is also the recipient of the 2019 George Corcoran Memorial Award for his contributions to teaching and educational leadership in the field of CPS and IoT.

Can Science Fiction Help Real Robots?

Deanna Kocher and Ross Knepper

11/19/2019

Location: Upson 106 Conference Room Next to the Lounge

Time: 3:00 p.m.

Abstract: With creative license, science fiction envisions a future in which robots operate among humans. Stories like Blade Runner and Star Trek help us to imagine the ways in which robots could bring out both the best and the worst in humanity. As researchers and companies develop real robots, we notice that they operate on a different plane of assumptions than sci-fi robots. For instance, Isaac Asimov’s three laws of robotics tacitly assume an accurate human detector. In the real world, the three laws are useless to a robot that cannot reliably distinguish a person from a piece of furniture. Science fiction authors are not technologists, for the most part, but do they have something useful to contribute to us? We will lead a group discussion about how the two separate planes of real robotics and fantasy robots can be made to intersect. We ask how we roboticists could utilize science fiction, which has a rich history of considering the ethical dilemmas that may one day arise from robots. And we ask what roboticists can do for science fiction authors and society at large to create a better understanding of robot capabilities and limitations.

Robotics Collaboration Speed Dating

11/5/2019

Location: Upson 106 Conference Room Next to the Lounge

Time: 3:00 p.m.

Abstract: Collaboration opportunities abound in robotics. Today, we will do an activity to speculatively explore connections between people’s research areas. We will pair you up with other people in the group for short amounts of time, and the goal of each encounter is to find a common theme, idea, or project that the two of you could work on together. If you don’t currently do research in robotics, you can shadow somebody else or make up a project on the spot. After we are done brainstorming projects in pairs, we will have an opportunity to share our best ideas with the group.

Transience, Replication, and the Paradox of Social Robotics

Guy Hoffman, Cornell University

10/29/2019

Location: Upson 106 Conference Room Next to the Lounge

Time: 3:00 p.m.

Abstract: As we continue to develop social robots designed for connectedness, we struggle with paradoxes related to authenticity, transience, and replication. In this talk, I will attempt to link together 15 years of experience designing social robots with 100-year-old texts on transience, replication, and the fear of dying. Can there be meaningful relationships with robots who do not suffer natural decay? What would our families look like if we all choose to buy identical robotic family members? Could hand-crafted robotics offer a relief from the mass-replication of the robot’s physical body and thus also from the mass-customization of social experiences?

Robots, Language, and Human Environments: Approaches to Modeling Linguistic Human-Robot Interactions

Cynthia Matuszek, University of Maryland

10/22/2019

Location: Upson 106 Conference Room Next to the Lounge

Time: 3:00 p.m.

Abstract: As robots move from labs and factories into human-centric spaces, it becomes progressively harder to predetermine the environments, tasks, and human interactions they will need to be able to handle. Letting these robots learn from end users via natural language is an intuitive, versatile approach to handling novel situations robustly. Grounded language acquisition is concerned with learning the meaning of language as it applies to the physical world. At the same time, physically embodied agents offer a way to learn to understand natural language in the context of the world to which it refers. In this presentation, I will give an overview of our recent work on joint statistical models that learn the grounded semantics of natural language describing objects, spaces, and actions, and I will present some open problems.

Bio: Cynthia Matuszek is an assistant professor of computer science and electrical engineering at the University of Maryland, Baltimore County. Dr. Matuszek directs UMBC’s Interactive Robotics and Language lab, in which research is focused on robots’ acquisition of grounded language, including work in human-robot interfaces, natural language, machine learning, and collaborative robot learning. She has developed a number of algorithms and approaches that make it possible for robots to learn about their environment and how to follow instructions from interactions with non-technical end users. She received her Ph.D. in computer science and engineering from the University of Washington in 2014. Dr. Matuszek has published in artificial intelligence, robotics, and human-robot interaction venues, and was named in the most recent IEEE bi-annual “10 to watch in AI.”

Who Must Adapt to Whom?

A conversation with Keith Green, Cornell DEA/MAE, and Chajoong Kim, Cornell Visiting Professor

10/8/2019

Location: Upson 106 Conference Room Next to the Lounge

Time: 3:00 p.m.

Abstract: Robots are engineered products designed to perform a task. In industrial robot deployments, such as factory and warehouse settings, the robot’s environment is often engineered to simplify the robot’s task. As robots begin to be deployed in our daily lives around untrained human users, the question becomes: who must adapt to whom? In this panel, we discuss the following questions:

Anthropomorphism and Bio-inspiration: Should robots have their own look and behavior, or must they reference familiar living things?

If robots can’t (yet) do all that we’d like them to do in a given physical environment (e.g. a hospital, a school, a workplace), might we change the physical environment to better fit the (current and near future) capacities of robots, or should we focus our efforts on advancing the robot to fit the human environments we already have?

Bio: Keith Evan Green is professor of design (DEA) and mechanical engineering (MAE) at Cornell University. He addresses problems and opportunities of an increasingly digital society by developing and evaluating interactive and adaptive physical environments and, more broadly, novel robotic manipulators. For Green, the built environment—furniture to metropolis—is a next frontier at the interface of robotics, design, and psychology.

Bio: Chajoong (“CJ”) Kim is associate professor at the Graduate School of Creative Design Engineering, Ulsan National Institute of Science and Technology, South Korea. Dr. Kim investigates how affective experiences in human-product interactions influence user well-being. During this sabbatical year at Cornell, Dr. Kim is studying “the functions of experiencing diverse positive emotions in de-accelerating hedonic adaptation and promoting subjective well-being in the context of consumer product use.”

Autonomous Matter – Bridging the Robotics and Material Composites Communities

Robert Shepherd, Cornell University

10/1/2019

Location: Upson 106 Conference Room Next to the Lounge

Time: 3:00 p.m.

Abstract: The robotics community has been more aggressively incorporating new materials for improved performance; some call this Robotic Materials. These systems are, essentially, smaller versions of existing robots that are sometimes used in swarms; an example of this concept is “Smart Dust.” Concurrently, the materials community has been applying its knowledge toward autonomic responses; it calls these Autonomous or Smart Materials. Such materials have a feed-forward response to an applied stimulus; examples of these responses are self-healing and swelling with humidity. Our research group, the Organic Robotics Laboratory, works at the intersection of these two approaches. We are building toward the concept of Autonomous Matter, in which sensing, computation, actuation, and power are part of a composite material. I will show and discuss examples of how we are moving toward the complexity and size scales at which a system with these abilities can be considered a material.