Carnegie Mellon Transforms Everyday Objects Into AI-Powered Assistants

TL;DR: Carnegie Mellon University researchers have developed a system that uses large language models and wheeled robotic platforms to transform everyday objects like staplers and mugs into proactive assistants that can predict when humans need them and move autonomously to help.

Researchers at Carnegie Mellon University’s Human-Computer Interaction Institute have combined artificial intelligence with robotic mobility to give ordinary household and office items the ability to anticipate human needs. The system transforms items such as staplers, mugs, plates and utensils into intelligent assistants that observe human behaviour, predict when help will be needed and move across surfaces to provide it at precisely the right moment.

Context and Background

The research team, led by Alexandra Ion, an assistant professor who heads the Interactive Structures Lab, presented their work on unobtrusive physical AI at the 2025 ACM Symposium on User Interface Software and Technology in Busan, Korea. The system uses computer vision and large language models to reason about a person’s goals and predict what they may need next.

A ceiling-mounted camera tracks the environment and object positions, translating visual information into text-based scene descriptions. An LLM then infers the person’s likely goals and determines which actions would provide the most assistance. Finally, the system transfers these predicted actions to the physical objects, enabling seamless help with everyday tasks like cooking, organising and office work.
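The pipeline above — camera tracking, text-based scene description, LLM goal inference, then action transfer — can be sketched in miniature. This is a hypothetical illustration only: the function names (`describe_scene`, `infer_next_action`), the `MockLLM` stand-in and the prompt format are assumptions for clarity, not the CMU team's actual code or API.

```python
# Hypothetical sketch of the perception -> reasoning -> actuation loop.
# A MockLLM stands in for a real large-language-model call.
from dataclasses import dataclass


@dataclass
class SceneObject:
    name: str
    position: tuple  # (x, y) tabletop coordinates from the ceiling camera


def describe_scene(objects, activity):
    """Translate tracked object positions into a text scene description."""
    items = ", ".join(f"{o.name} at {o.position}" for o in objects)
    return f"Person is {activity}. Objects on the table: {items}."


def infer_next_action(scene_description, llm):
    """Ask an LLM which object should move to help, as 'object -> person'."""
    prompt = (
        "Given this scene, name ONE object that should move toward the "
        f"person, formatted as 'object -> person'.\nScene: {scene_description}"
    )
    return llm(prompt)


class MockLLM:
    """Keyword-rule stand-in for a real LLM; assumed for this sketch."""
    def __call__(self, prompt):
        if "stapling" in prompt:
            return "stapler -> person"
        return "none"


objects = [SceneObject("stapler", (0.2, 0.8)), SceneObject("mug", (0.5, 0.1))]
scene = describe_scene(objects, "stapling a stack of papers")
action = infer_next_action(scene, MockLLM())
print(action)  # -> stapler -> person
```

In the real system the final step would hand the predicted action to the wheeled platform under the chosen object; here it is simply printed.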

Key Insight: The objects operate without explicit user commands—they sense what users need and perform tasks autonomously, representing a shift from reactive to proactive physical AI assistance.

Looking Forward

The Interactive Structures Lab is studying ways to expand unobtrusive physical AI throughout homes and offices. Ion envisions scenarios where shelves automatically fold out from walls when someone arrives home with groceries, or where objects rearrange themselves to prevent accidents before they occur.

Ph.D. student Violet Han, working with Ion, emphasises that the team chose to enhance everyday objects because users already trust them. The goal is to create adaptive systems that blend into daily life whilst dynamically responding to human needs, bringing safe and reliable physical assistance to homes, hospitals, factories and other spaces.
