PLENARY/KEYNOTE TALKS

PLENARY TALKS

HRI in the era of Robot Foundation Models

Vincent Vanhoucke, Senior Director, Google DeepMind

Foundation Models are quickly becoming the key enabler of modern AI. What distinguishes them from previous generations of AI approaches to robotics is that they jointly incorporate natural language, multimodality, and robot actuation as core facets of the system. I will make the case that this unique combination of capabilities opens up fantastic new avenues for HRI research in real-world environments. I will also highlight the many ways in which improved HRI can directly impact the pace of progress and development of this new generation of Embodied AI capabilities.

Vincent Vanhoucke is a Distinguished Scientist and Senior Director of Robotics at Google DeepMind. His research has spanned many areas of artificial intelligence and machine learning, from speech recognition to deep learning, computer vision, and robotics. His Udacity lecture series has introduced over 100,000 students to deep neural networks. He is President of the Robot Learning Foundation, which organizes the Conference on Robot Learning, now in its seventh year. He holds a doctorate from Stanford University and a diplôme d'ingénieur from the École Centrale Paris.

Human-Centered Design for the Practical Deployment of Socially Assistive Robots for Healthcare

Dr. Ayanna Howard, Dean of the College of Engineering, Ohio State University

It is estimated that over 240 million children worldwide are living with a disability. In the United States, an estimated 1 in 4 adults lives with a disability. With recent advances in robotics and artificial intelligence (AI), intervention protocols using robots are now ideally positioned to make an impact in the healthcare domain. Numerous challenges must still be addressed, though, to enable successful interaction between patients, clinicians, and robots: developing intelligence methods that enable personalized adaptation to the needs of the patient; ensuring equitable outcomes and mitigating possible healthcare inequities that derive from the use of AI learning methods; and ensuring that the system can provide engaging and emotionally appropriate feedback to the user. In this presentation, I will discuss the role of socially interactive robotics in healthcare and possible ways to enhance our interactions with them.

Dr. Ayanna Howard is the Dean of Engineering at The Ohio State University and Monte Ahuja Endowed Dean’s Chair. Previously she was the Linda J. and Mark C. Smith Endowed Chair in Bioengineering and Chair of the School of Interactive Computing at the Georgia Institute of Technology. Dr. Howard’s research encompasses advancements in artificial intelligence (AI), assistive technologies, and robotics, and has resulted in over 275 peer-reviewed publications. She is a Fellow of IEEE, AAAI, AAAS, and the National Academy of Inventors (NAI). She is also an elected member of the American Academy of Arts and Sciences and recipient of the Richard A. Tapia Achievement Award and NSBE Janice Lumpkin Educator of the Year Award. To date, Dr. Howard’s unique accomplishments have been highlighted through a number of other public recognitions, including being recognized as one of the 23 most powerful women engineers in the world by Business Insider and one of the Top 50 U.S. Women in Tech by Forbes. In 2013, she also founded Zyrobotics, which developed STEM educational products to engage children of all abilities. Prior to Georgia Tech, Dr. Howard was at NASA’s Jet Propulsion Laboratory where she held the title of Senior Robotics Researcher and Deputy Manager in the Office of the Chief Scientist.


Keynote Talks

Socially assistive robots: how we can do better

Shelly Levy-Tzedek, Associate Professor, Ben-Gurion University

Shelly Levy-Tzedek is an associate professor and the director of the Cognition, Aging & Rehabilitation Laboratory at Ben-Gurion University.

Prof. Levy-Tzedek completed her undergraduate studies, summa cum laude, at UC Berkeley, where she won the Bioengineering departmental citation medal. At the Massachusetts Institute of Technology (MIT), she completed her M.S. and her Ph.D. degrees as an MIT Presidential Fellow and a Howard Hughes Medical Institute fellow, in the Biomedical Engineering department.

Chosen as one of Israel's most promising 40-under-40 by The Marker Magazine for 2016, she also won the 2016 award from the Paedagogica Foundation's special program entitled "Initiative for Excellence in the Negev". In 2018, she won the Toronto Prize for excellence in research. In 2019 she participated in the Dagstuhl Seminar on Verification and Synthesis of Human-Robot Interaction. In the academic year 2018-19, Prof. Levy-Tzedek was a guest professor at the University of Freiburg in Germany as part of the Marie S. Curie FRIAS COFUND Fellowship Program, supported by the European Union through the Horizon 2020 research and innovation programme, and in 2023 she was a scholar-in-residence at Sweden’s KTH.

She takes a multi-disciplinary approach to her studies, combining engineering, rehabilitation, cognition and sociology. Her lab team studies how robotics can facilitate efficient rehabilitation processes for conditions such as stroke and Parkinson's disease, as well as the various ways in which interaction with social robots can benefit users, such as reducing pain or performing cognitive training. She has been a strong advocate for the responsible deployment of social robots, especially when working with vulnerable populations.


On designing robots to be invisible-in-use

Leila Takayama, VP of Design and Human-Robot Interaction, Robust.AI

In a time when there are so many flashy robot demos being promoted in the media, it is worth remembering that sometimes those robots are seen as threats, especially when deployed in workplaces. When robots are unwanted, people find ways to shove them aside -- sometimes subtly, sometimes violently. Drawing from human-computer interaction research, this talk will explore ways to make robots more invisible-in-use, offering examples of how we might invent a future in which people are the stars of the show, not robots. Sometimes, the most useful robots are the ones that hide in plain sight.

Leila Takayama is a Human-Robot Interaction specialist in the psychology of people's interactions with robotic products and services. She is VP of Design and Human-Robot Interaction at Robust.AI, where the team is developing collaborative mobile robots that support warehouse operations. Prior to joining Robust.AI, she was a tenured associate professor at the University of California, Santa Cruz. She also worked as a full-time researcher at Willow Garage and GoogleX.

In her consulting work at Hoku Labs, she translates human-robot interaction research into actionable recommendations for the design of robotic products and services. Her clients include Fortune 100 companies, tech startups, and non-profit organizations. She has contributed to over a dozen robot products that have shipped, including drones in the air; industrial, service, and consumer robots on the ground; and uncrewed robotic systems in the ocean. More info: https://www.hokulabs.com/


Maya Cakmak

Associate Professor, University of Washington

Maya Cakmak is an Associate Professor at the Paul G. Allen School of Computer Science & Engineering, University of Washington, where she directs the Human-Centered Robotics lab. Her research interests are in assistive robotics, end-user programming, and human-robot interaction. Her work aims to develop robots that can be programmed and controlled by a diverse group of users with unique needs, preferences, and abilities to do useful tasks. Most recently she collaborated with Hello Robot, Inc. and UIUC researchers to deploy a Stretch mobile manipulator in the home of a quadriplegic individual for several weeks over the course of three summers. Her team developed an accessible and customizable interface for the user to control the robot for tasks ranging from self-care to social participation. Maya is also the PI of AccessComputing — an NSF-funded BPC Alliance that aims to broaden participation of people with disabilities in computing education and careers. She is currently serving on the CRA-WP board as a co-director of the DREU program. She received an NSF CAREER award, a Sloan Research Fellowship, Early Career Spotlights at RSS and IJCAI, and the CRA Anita Borg Early Career Award.

Robots that can assist with everyday tasks in the home have the potential to enable aging in place for older adults, increase independence for people with physical limitations, and improve quality of life for all. Deploying robots in homes is, however, extremely challenging given the vast diversity of environments, tasks, and users in homes. In this talk I discuss strategies we have pursued to enable physically-assistive mobile manipulator robots at home in the near term, including: (1) customizing robots to the specific home/user, (2) having users in the loop rather than aiming for full autonomy, and (3) structuring the environment to make it more robot-friendly. I share results and lessons learned from an interdisciplinary project in which we deployed a mobile manipulator in the home of an older adult with severe motor impairments for a month in each of three consecutive summers. I also share how we have adapted our research process and methods to engage potential early adopters of physically-assistive robots and their caregivers in different roles, from participant to fellow researcher, in all phases of our research.