University of Arizona researchers have been awarded $7.5 million to create an artificial intelligence agent that can understand social cues and human interactions, and use that information to help teams achieve their goals. The grant comes from the Defense Advanced Research Projects Agency and is part of DARPA's Artificial Social Intelligence for Successful Teams program.

The UArizona project, funded over four years, is called ToMCAT, which stands for Theory of Mind-Based Cognitive Architecture for Teams. It is a collaboration among the School of Information in the College of Social and Behavioral Sciences, the Department of Computer Science in the College of Science and the Department of Family Studies and Human Development in the College of Agriculture and Life Sciences' Norton School of Family and Consumer Sciences.
Data collection for the ToMCAT project will take place in the Lang Lab for Family and Child Observational Research in the Norton School's Frances McClelland Institute for Children, Youth and Families.
Text exchanges between the humans and the AI agent will be analyzed by researchers in the university's Computational Language Understanding Lab, using natural language processing to determine what the humans are doing and how they feel about each other and the mission.
In addition to aiding AI development, the experiment will provide valuable data on how humans interact with one another. That will be the primary focus of co-principal investigator Emily Butler, a professor of family studies and human development and lead social scientist on the ToMCAT project.
"My area of interest is interpersonal coordination and how people get themselves organized as a more dynamic system. It's going beyond the individual to think of how a whole group of people coordinate their efforts," Butler said. "In this case, we have a chance to look at multiperson teams, and the most exciting thing will be this multiperson brain scanning that has only really been possible for about 15 years. Being able to get full brain activity from multiple people in real time as they interact will provide us with rich data that we're hoping to be able to use to understand complex interpersonal coordination, both with regard to cognition and emotions."