Robots That See Their Way to Success

A new control framework combines vision, learning, and robust control to enable mobile robots to navigate complex environments and reach goals with centimeter-level accuracy.

Researchers have developed an open-source framework that leverages artificial intelligence to dramatically accelerate the design and optimization of electronic-photonic systems.

A new framework empowers fleets of robots to simultaneously explore and inspect large, complex 3D environments even with restricted communication.

New research introduces a dataset and benchmark for training robots to interpret natural language feedback, paving the way for more intuitive human-robot interaction.

Researchers have engineered microswimmers capable of autonomously navigating complex environments by sensing the chemical traces of their own movements.

Researchers are charting a course toward large-scale artificial intelligence systems powered by photonics, a shift that demands a complete overhaul of design methodologies.

A new system combines large language models with robotic platforms, allowing robots to understand instructions, map environments, and autonomously locate and grasp objects.

A new framework, HiGR, leverages advanced generative techniques to move beyond simple item lists and create more diverse and satisfying recommendation slates for users.

Researchers have developed a new framework that allows robots to react to visual instructions in real-time, achieving smoother and more efficient manipulation.

A new framework, OpenOneRec, leverages the power of generative AI to build more flexible and performant recommendation systems.