Reading the Robot’s Mind: How Can Machines Signal Their Intent?
New research explores how different communication methods – from gestures to lights and sounds – help humans understand and predict the actions of robots sharing our spaces.

New research details a hierarchical framework for predicting both human movements and actions, paving the way for robots that can proactively assist their human partners.

A new study reveals widespread adoption of generative AI by MBA students, highlighting both the perceived benefits and the crucial need for critical evaluation of AI-assisted writing.

New research combines electroadhesive clutches with pneumatic actuators to enable precise, adaptable shape control in soft robotic systems.

New research demonstrates how automatically refining the instructions given to teams of AI agents can dramatically improve their ability to gather complex information and produce insightful reports.

New research demonstrates how robots can intelligently combine visual perception with force/torque sensing to perform more nuanced and reliable manipulation tasks.

New research shows middle school students can grasp core artificial intelligence concepts by learning a fundamental problem-solving technique within the context of science education.

As AI agents increasingly contribute to software development, a critical need arises for standardized and transparent evaluation methodologies.

As generative AI automates core data analysis tasks, the most valuable skills for future data scientists are shifting toward uniquely human capabilities.

New research demonstrates a shared autonomy framework that blends human guidance with AI-powered control, allowing robots to navigate complex urban environments more effectively.