Seed Funding Grant Helps Humanities and Social Sciences Build Better Robots

by Sarah Holland

Rydia Weiland, a human factors and applied cognition major and URSP-funded student researcher working with VR programming, receives a handshake from Aibo, the robotic dog produced by Sony.

In 2004, the science-fiction film I, Robot explored the quandary of human/robot relationships on the big screen. While we’re not quite at the level of fully self-driving cars and humanoid robotic servants, our connection to and reliance on computer agents and artificial intelligence accelerate each year: Roombas, facial recognition, smart home devices, deepfakes, and ChatGPT, to name only a few. And with each new development, we wonder about the morality of these inventions, the ethics of their use, and how our own biases and worldviews shape the development of smart tools.

Elizabeth “Beth” Phillips, assistant professor of psychology in human factors and applied cognition, is working with collaborators from labs around the country to answer these pressing questions about artificial intelligence and robotics in her role as principal investigator of the Applied Psychology and Autonomous Systems lab (ALPHAS). One of the lab’s projects, “Improving Human-Machine Collaborations with Heterogeneous Multiagent Systems,” was a recipient of the 2022 Seed Funding Initiative from Mason’s Office of Research Innovation and Economic Impact (ORIE), in support of the Institute for Digital Innovation (IDIA). The project will be conducted in collaboration with co-investigators Dr. Ewart de Visser and Lt. Col. Chad Tossell, PhD, along with Air Force Academy cadets slated to commission as officers in the U.S. Space Force.

ALPHAS studies how to make robots, computer agents, and artificial intelligence (AI) into better teammates, partners, and companions. While computer science may seem like the obvious field for this work, the social sciences and humanities are critical to developing effective, ethical, and efficient machines that humans can work with and alongside long term. ALPHAS also engages in research on other facets of human/robot interaction and relationship-building: student and faculty researchers are exploring the benefits of social robotic animals as therapeutic tools, how robots might learn socialization norms to better fit into our lives, and how robots might define and justify morality in their actions.

“You have to know a lot about humans in order to create successful robotics,” Phillips explained. “How humans interact, how they work together, how they build trust in each other: these are the same building blocks in understanding how humans might interact with machines, and can help us better establish how people might build trust in a variety of autonomous machines.” 

Lydia Melles, MA Human Factors and Applied Cognition, works with Baxter, an industrial robot by Rethink Robotics.

In space travel, for example, humans will need to rely on a team of machines and AI. Research shows that human trust in machines depends on every machine in the group working perfectly; a single error by one machine rapidly erodes trust in all of them. By understanding human psychology, Phillips explains, roboticists can incorporate methods and strategies for building and maintaining trust in their machine counterparts. And in experimentation, the social sciences are critical to creating ethical methodology. “Often roboticists don’t have a lot of training in how to test their robots with human participants. So social scientists are often asked to join projects to work with the human participants of trials, as well as to help the roboticists understand and interpret the resulting data. Robotics is an excellent domain for engaging in truly interdisciplinary work,” Phillips explained. The lab’s work with the Air Force Academy aims to find solutions to this challenge of trust in machines.

The humanities also provide perspective on ethics and morality in relation to robotics: what could it mean for a robot to be a moral agent? “The rules of moral philosophy vary based on culture and structure,” Phillips said. “When we think about robots as moral agents, what we’re really talking about is being able to translate a system of morality and ethics into an algorithm. It’s up to the humanities and social sciences to define these systems, as well as consider the implications of favoring one moral and ethical system over another.”

In paving the way for the future of space travel and human/robot partnerships, Phillips also feels strongly about making space for underrepresented scholars in the field. As a first-generation student and a woman in the predominantly male field of computer science, Phillips prioritizes accessibility and support for underrepresented groups. At her request, the psychology department has ensured that the lab is equipped with the tools and systems necessary for all students, regardless of income, to conduct their research. The lab also supports its female and female-identifying students in attending exclusive events such as the Women and Robotics Workshop at the Robotics Science and Systems Conference. Whether on Earth, the Moon, or Mars, “it’s important that they have opportunities to network and meet others in the field who look like them, and to increase the visibility of marginalized and minority groups in this field,” Phillips said.