CMU Puts AI To Work in New NSF-funded Institutes
Carnegie Mellon University researchers will contribute to four of the 11 new National Artificial Intelligence Research Institutes announced today by the National Science Foundation. Each funded with $20 million over five years, the new institutes will advance AI-based technologies in fields ranging from agriculture to wireless networks.
The wide reach of these institutes illustrates the role AI will play in the next wave of transformational innovation and in improving daily life. CMU’s inclusion in these efforts to expand the use of AI underscores the university’s depth and breadth of expertise in the field, spanning deep learning, robotics, interaction, ethics and engineering. The university is one of the birthplaces of AI and continues to push its boundaries.
AI-CARING
Reid Simmons, a research professor in the Robotics Institute (RI) and the Computer Science Department (CSD), will lead CMU’s involvement in the AI Institute for Collaborative Assistance and Responsive Interaction for Networked Groups (AI-CARING), which seeks to develop AI systems for caregiving environments.
AI-CARING will be led by the Georgia Institute of Technology with partial funding from Amazon and Google. The institute will initially focus on assisting in the care of the elderly by seeking to understand and manage the interaction between humans and AI agents. Researchers will develop methods to teach AI systems to learn a person’s needs, preferences and caregiving network, and enable the systems to adapt as those change over time. The tasks will involve not only coordinating a variety of AI agents — like voice-controlled assistants, smart devices or chatbots — but also managing human caregivers ranging from medical and assistance providers to family.
“What we’re trying to do is collaborate. We’re not replacing caregivers; we’re augmenting their capabilities,” Simmons said. “The AI has to adapt to the individual and even more so, it has to adapt to the network the individual is in. We’re intending that the AI is going to be assisting people over a long time, and while there will be plenty of time for the system to learn, there will also be plenty to learn.”
The CMU team consists of about a dozen faculty members spanning RI, CSD, the Human-Computer Interaction Institute, the Dietrich College of Humanities and Social Sciences and the Tepper School of Business. The team includes Henny Admoni, who researches nonverbal human-robot interaction; Jodi Forlizzi, who studies human-AI interaction; Aaron Steinfeld, who works on assistive technology; Dave Touretzky, who teaches AI to K-12 students; Alex London, a professor of ethics and philosophy in Dietrich; and Tepper’s Anita Williams Woolley, who studies human-AI teaming and collaboration.
The institute’s CMU ties even extend to its director, Sonia Chernova. She earned her Ph.D. in computer science from CMU and now heads the Robot Autonomy and Interactive Learning Lab at Georgia Tech.
AI-EDGE
Gauri Joshi, assistant professor of electrical and computer engineering, will manage a team of CMU researchers, including Ameet Talwalkar, assistant professor in CMU’s Machine Learning Department, for the AI Institute for Future Edge Networks and Distributed Intelligence (AI-EDGE). AI-EDGE is led by The Ohio State University and partially funded by the Department of Homeland Security.
“The overarching research mission of the AI-EDGE Institute will be to design next-generation intelligent edge networks that are efficient, reliable, robust and secure,” said Joshi. “The focus will be on edge networks that consist of diverse components including mobile phones, sensors, robots, self-driving cars that are connected to backhaul networks, and data centers.”
AI-EDGE will develop new AI tools and techniques to ensure that wireless edge networks are self-healing and self-optimized. These networks will make AI more efficient, interactive and privacy-preserving for applications in sectors such as intelligent transportation, remote health care, distributed robotics and smart aerospace.
CMU is a globally recognized leader in machine learning, artificial intelligence and networked computing systems. The confluence of these areas puts the team in a unique position to make a lasting impact on next-generation AI edge networks. Joshi and Talwalkar are leaders in the emerging field of federated learning, a framework that trains machine learning models using data collected by edge devices.
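Federated learning can be sketched in a few lines. In the toy example below (the data, model and hyperparameters are invented for illustration and are not part of AI-EDGE), each edge device runs a few steps of training on data it keeps locally, and a central server simply averages the locally trained models.

# Minimal federated averaging sketch for linear regression.
# Illustrative only: device data, model and hyperparameters are made up
# and do not come from the AI-EDGE project.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

# Each "edge device" holds its own private data; only model updates are shared.
def make_device_data(n=50):
    X = rng.normal(size=(n, 2))
    y = X @ true_w + 0.1 * rng.normal(size=n)
    return X, y

devices = [make_device_data() for _ in range(5)]

def local_update(w, X, y, lr=0.1, epochs=5):
    """Run a few gradient-descent steps on one device's local data."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

w_global = np.zeros(2)
for _ in range(20):
    # Each device trains locally; raw data never leaves the device.
    local_models = [local_update(w_global.copy(), X, y) for X, y in devices]
    # The server averages the locally trained models.
    w_global = np.mean(local_models, axis=0)

print("learned weights:", w_global)  # approaches true_w

Because only model parameters travel over the network, the data collected by each device stays on that device, which is what makes the approach privacy-preserving.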
AI-EDGE will create a research, education, knowledge transfer and workforce development environment that will help establish U.S. leadership in next-generation edge networks and distributed AI for many decades to come.
AIIRA
George Kantor, a research professor in RI, will lead CMU’s work in the USDA-NIFA AI Institute for Resilient Agriculture (AIIRA), which is focused on AI and robotics in agriculture.
AIIRA will be led by Iowa State University and is funded by the U.S. Department of Agriculture’s National Institute of Food and Agriculture. The institute aims to use AI to develop digital twins to simulate and predict how a crop will perform in climate scenarios that do not yet exist. It will also enable education and workforce development through formal and informal educational activities and drive knowledge transfer through partnerships with industry, producers, and federal and state agencies.
“Imagine a robotic manipulator pushing on a branch. We can record how much force was used to make the branch move and how much the branch moved. Fusing that with other data will help us create a predictive model,” Kantor said. “This is going to be a big opportunity to bring intelligent manipulation into the field.”
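A toy version of the kind of predictive model Kantor describes might look like the following; the force and displacement values and the simple linear fit are invented for illustration, not project data.

# Hypothetical sketch: estimate a branch's stiffness from recorded push
# forces and the resulting displacements, then predict future motion.
# All numbers are invented for illustration.
import numpy as np

force_n = np.array([0.5, 1.0, 1.5, 2.0, 2.5])                   # applied force (newtons)
displacement_m = np.array([0.011, 0.019, 0.032, 0.041, 0.052])  # branch motion (meters)

# Least-squares fit of displacement = compliance * force
compliance, _, _, _ = np.linalg.lstsq(force_n.reshape(-1, 1), displacement_m, rcond=None)
stiffness = 1.0 / compliance[0]
print(f"estimated branch stiffness: {stiffness:.1f} N/m")

# The fitted model predicts how far the branch will move for a new push.
print("predicted displacement for a 3 N push:", 3.0 * compliance[0], "m")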
Seven faculty members from RI and the Machine Learning Department make up CMU’s contribution to the institute. Part of the challenge for the CMU team is designing robotic tools that have never before been used in the field. A robotic manipulator that will interact with crops must be gentle to avoid damaging the plants, so CMU’s Nancy Pollard is designing soft manipulators for the project. She will use AI and video capture techniques from her Graphics Lab to build a hand optimized for the task, whether grabbing a leaf or pulling on a stem. Other CMU team members include Oliver Kroemer, who is designing techniques that will allow a robot to insert sensors into a plant’s stem and the ground; and Katia Sycara, who studies multi-agent systems and human-robot interaction. Other members of the team will tackle novel imaging techniques, machine learning for decision support, and field robotics.
While the initial work will focus on using robotics and AI to gather and process data, the tools and techniques developed for those tasks could translate to other work on the farm, such as pollinating flowers, which is done by hand in some instances.
AgAID
Wenzhen Yuan, an assistant professor in the Robotics Institute, will lead CMU’s contribution to the Institute for Agricultural AI for Transforming Workforce and Decision Support (AgAID). Both AIIRA and AgAID are focused on AI and robotics in agriculture.
AgAID will be led by Washington State University and is also funded by the USDA-NIFA. The institute will integrate AI methods into agriculture operations for prediction, decision support and robotics-enabled agriculture to address complex agricultural challenges. The institute involves farmers, workers, managers and policy makers in the development of these solutions, as well as in AI training and education, which promotes equity by increasing the technological skill levels of the next-generation agricultural workforce.
Yuan, who heads the RoboTouch Lab at CMU, will bring her research on robotic tactile perception to the institute. Yuan has developed tactile sensors that enable a robot to determine the hardness of an object. She has previously used these sensors in her lab to determine whether tomatoes, avocados and mangos are ripe, but imagines them employed by robots in the field.
“Robots can touch the corn or other fruits and determine if they are ripe or not,” Yuan said. “I’m excited to take this technology out of the lab and into the field.”
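A minimal sketch of how such a tactile reading could drive a ripeness decision appears below; the hardness values, labels and threshold rule are hypothetical and not the RoboTouch Lab’s actual method.

# Toy sketch of ripeness classification from tactile hardness readings.
# The readings, labels and threshold approach are hypothetical illustrations.
import numpy as np

# Hardness readings (arbitrary units) from squeezing sample fruit.
hardness = np.array([0.82, 0.75, 0.40, 0.35, 0.78, 0.30])
ripe     = np.array([False, False, True, True, False, True])  # softer fruit is ripe

# Place the decision threshold midway between the ripe and unripe groups.
threshold = (hardness[ripe].max() + hardness[~ripe].min()) / 2

def is_ripe(reading):
    """Classify a new tactile reading: below the hardness threshold means ripe."""
    return reading < threshold

print("threshold:", threshold)
print("0.5 ->", is_ripe(0.5), "| 0.9 ->", is_ripe(0.9))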