Author: Denis Avetisyan
New research explores how synchronizing a robot’s movements with a user’s breathing patterns can create a stronger sense of connection and control during teleoperation.

Synchronizing robot motion with human breathing enhances user embodiment and body ownership in remote control scenarios.
While effective telepresence often prioritizes visuomotor feedback, a crucial element of embodied experience – internal physiological cues – remains largely unexplored in human-robot interaction. This research, detailed in ‘Breathe with Me: Synchronizing Biosignals for User Embodiment in Robots’, investigates the impact of ‘embreathment’ – synchronizing a robot’s movements with a user’s breathing – on the sense of body ownership during remote control. Results demonstrate that this synchronization significantly enhances embodiment, suggesting that physiological signals can serve as a novel interoceptive pathway for human-robot systems. Could harnessing these internal cues unlock more intuitive and immersive forms of robotic control and collaboration?
The Illusion of Control: Deconstructing Embodiment
The foundation of many embodiment studies rests upon the principle of visuomotor congruence, powerfully demonstrated by the Rubber Hand Illusion. This phenomenon, where the sensation of touch on a visible but fake hand is mapped onto the participant’s own hidden hand, suggests that the brain readily integrates visual and tactile information to construct a unified sense of body ownership. Essentially, if what one sees aligns with what one feels, the brain can be tricked into perceiving the artificial limb as part of itself. However, this reliance on matching sensory inputs presents a limitation; it implies embodiment is primarily driven by superficial correspondence rather than a deeper, more complex interplay of predictive processing and sensorimotor integration, potentially overlooking crucial factors in establishing true agency and ownership – especially when dealing with dynamic robotic extensions that don’t perfectly mimic natural movement.
The prevailing research on embodiment often prioritizes visual feedback as the primary driver of feeling connected to an artificial limb or robotic extension, a perspective largely shaped by the success of illusions like the Rubber Hand. However, this emphasis overlooks the crucial role of other sensory modalities and the complex interplay of predictive processing within the brain. Feeling ‘one’ with a robotic extension isn’t simply about seeing movement that corresponds to one’s own intentions; it requires a cohesive integration of proprioceptive signals – the sense of body position and effort – tactile feedback, and even auditory cues. The brain constantly predicts sensory consequences of actions, and discrepancies between these predictions and actual sensations can disrupt the feeling of ownership, even if the visual input is perfectly aligned. Consequently, a purely visual-centric approach may fall short in replicating the nuanced, multi-sensory experience necessary for robust and lasting embodiment, particularly when the robotic extension is actively engaged in dynamic, real-world interactions.
Despite advancements in robotic prosthetics and extensions, fostering a genuine sense of agency and body ownership during active interactions remains a significant hurdle. Current techniques, often relying on synchronized movements or visual feedback, frequently falter when confronted with the complexities of real-world dynamics – unpredictable delays, varying forces, and the need for continuous adaptation. The brain doesn’t simply register a matching signal; it expects a predictable and reliable causal link between intention and action. When robotic extensions introduce even slight discrepancies in this loop, the feeling of control diminishes, and the illusion of embodiment breaks down. This is particularly pronounced in dynamic scenarios, where quick reactions and nuanced movements are required, highlighting the need for more sophisticated methods that account for the inherent uncertainties of physical interaction and prioritize a seamless, intuitive connection between the user and the robotic system.
The success of future human-robot teams, particularly in assistive roles, hinges on establishing a robust sense of embodiment – the feeling that the robot extension is part of the user’s own body. This isn’t merely about tricking the brain, as demonstrated by illusions like the Rubber Hand; it requires a deeper integration allowing for seamless, intuitive interaction. When a user genuinely feels agency over a robotic limb, they experience reduced cognitive load and increased efficiency, crucial for tasks demanding precision and sustained effort. Consider applications ranging from robotic prosthetics to exoskeletons assisting with rehabilitation or supporting physically demanding jobs – in each scenario, a strong feeling of embodiment directly translates to improved performance, enhanced safety, and a more natural, comfortable experience for the user. Consequently, ongoing research prioritizes developing methods to foster this sense of ownership, moving beyond simple sensory congruence to encompass predictive coding, complex movement dynamics, and personalized adaptation to individual user needs.

Embreathment: Physiological Synchronization for Embodied Control
Embreathment is a proposed method for enhancing robotic embodiment by directly linking a robotic arm’s movements to a user’s respiratory cycle. This synchronization is achieved by utilizing the user’s respiration signal as a control input, effectively modulating the robotic arm’s motion in time with each breath. Unlike traditional teleoperation systems reliant on visual feedback, Embreathment aims to establish a more fundamental connection between the user and the robot, grounding the sense of embodiment in a shared physiological rhythm. The premise is that aligning external robotic action with internal bodily processes – specifically, the cyclical nature of breathing – can create a stronger and more intuitive feeling of ownership and control over the robotic limb.
Interoceptive synchrony, the alignment of external events with internal physiological rhythms, forms the basis of Embreathment. By mirroring the user’s breathing pattern in the robot’s actions, the system matches external robotic motion to an intrinsic, cyclical bodily process, which is hypothesized to foster a sense of embodiment. This differs from traditional teleoperation, which relies heavily on visual feedback and cognitive processing; Embreathment instead seeks a more direct sensorimotor connection grounded in the user’s internal state and autonomic nervous system activity.
Traditional robotic teleoperation and control systems heavily rely on visual feedback to establish a sense of embodiment for the user. However, this reliance creates a cognitive load and can feel detached from natural human experience. Embreathment, by contrast, shifts the focus from external visual cues to internal physiological signals, specifically respiration. This grounding in interoception, the sense of the internal state of the body, provides a more fundamental and intuitive connection between the user and the robotic system. By directly linking robotic movements to the user’s breathing rhythm, the system bypasses the need for conscious visual monitoring and promotes a feeling of bodily ownership and control that is rooted in an inherent, pre-cognitive awareness of the body’s internal state.
Utilizing a user’s Respiration Signal as direct input for a robotic arm establishes a physiological connection between the user and the robotic system. This is achieved by directly mapping parameters derived from the respiration signal – such as inhalation/exhalation phases, amplitude, and frequency – to control the robotic arm’s degrees of freedom. Consequently, the arm’s movements are no longer solely determined by external commands or visual feedback, but become inherently tied to the user’s involuntary breathing cycle, creating a closed-loop system where the robotic action is intrinsically linked to the user’s internal physiological state and perceived as an extension of the self.
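As a rough illustration of the mapping described above, the sketch below converts a normalized respiration trace into a small oscillation of a single joint angle. The function names, the normalization step, and the 5-degree amplitude are illustrative assumptions; the paper's actual control mapping is not reproduced here.

```python
import numpy as np

def normalize_breath(signal):
    """Scale a raw respiration trace (e.g. from a chest belt) to [-1, 1]
    around its mean. Hypothetical preprocessing; the paper does not
    detail its signal pipeline."""
    centered = signal - np.mean(signal)
    peak = np.max(np.abs(centered))
    return centered / peak if peak > 0 else centered

def breath_to_joint(breath_norm, rest_deg=90.0, amplitude_deg=5.0):
    """Map normalized breathing onto a small oscillation of one arm
    joint: inhalation raises the joint slightly, exhalation lowers it."""
    return rest_deg + amplitude_deg * breath_norm

# Synthetic demo: a 0.25 Hz breath (15 breaths/min) sampled at 50 Hz.
fs = 50.0
t = np.arange(0.0, 20.0, 1.0 / fs)
breath = np.sin(2.0 * np.pi * 0.25 * t)
angles = breath_to_joint(normalize_breath(breath))
```

Because the robot's joint excursion is bounded by the mapping, the arm stays near its rest pose while still moving in lockstep with each inhalation and exhalation.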
Quantifying Ownership: Assessing the Embodiment Response
The assessment of body ownership utilized the Virtual Embodiment Questionnaire (VEQ), a validated instrument comprised of statements evaluating the feeling of body ownership over a virtual or robotic limb. Participants rate their agreement with these statements on a Likert scale, providing a quantifiable measure of embodiment. The VEQ assesses multiple facets of embodiment, including feelings of agency, sensory congruence, and self-attribution, allowing for a nuanced understanding of how strongly users perceive the robotic extension as part of their own body. Established norms and reliability data for the VEQ were referenced to ensure the validity and comparability of the collected data.
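As a simple illustration of how Likert-scale instruments like the VEQ are typically scored, the sketch below averages hypothetical item responses into subscale scores. The item wording, grouping, and 1–7 scale shown are illustrative assumptions; consult the validated VEQ for the actual items.

```python
from statistics import mean

# Hypothetical VEQ-style responses on a 1-7 Likert scale, grouped by
# subscale. Items shown are illustrative, not the validated instrument.
responses = {
    "ownership": [6, 5, 6, 7],  # e.g. "The robot arm felt like my arm"
    "agency":    [5, 6, 5, 6],  # e.g. "I caused the arm's movements"
}

def subscale_scores(resp):
    """Average each subscale's items into a single score."""
    return {scale: mean(items) for scale, items in resp.items()}

scores = subscale_scores(responses)
print(scores)  # {'ownership': 6.0, 'agency': 5.5}
```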
Teleoperation studies were conducted to evaluate the degree to which participants incorporated the robotic arm as an extension of their own body during task performance. These studies assessed metrics such as movement smoothness, reaction time, and error rates while subjects controlled the robotic arm to manipulate objects. Specifically, researchers examined how closely the robotic arm’s movements mirrored the user’s intended actions, quantifying the latency and precision of the control interface. Lower latency and increased precision indicated a more seamless integration, suggesting a stronger sense of embodiment and agency over the robotic extension. Data from these studies were then correlated with subjective measures of body ownership to provide a comprehensive understanding of the user experience.
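Movement smoothness, one of the metrics mentioned above, is commonly quantified with jerk-based measures. The sketch below computes root-mean-square jerk for a 1-D trajectory; this is a standard smoothness proxy, not necessarily the exact metric used in the study.

```python
import numpy as np

def rms_jerk(position, fs):
    """Root-mean-square jerk (third time derivative) of a 1-D
    trajectory sampled at fs Hz. Lower values mean smoother motion."""
    vel = np.gradient(position) * fs
    acc = np.gradient(vel) * fs
    jerk = np.gradient(acc) * fs
    return float(np.sqrt(np.mean(jerk ** 2)))

# Compare a gentle reach with the same reach plus a small 15 Hz tremor.
fs = 100.0
t = np.arange(0.0, 2.0, 1.0 / fs)
smooth = np.sin(np.pi * t / 2.0)
jittery = smooth + 0.01 * np.sin(2.0 * np.pi * 15.0 * t)
```

Even a millimetre-scale tremor dominates the jerk measure, because jerk amplifies high-frequency components by the cube of their frequency.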
Quantitative analysis revealed a statistically significant enhancement in the feeling of body ownership when utilizing Embreathment. Participants demonstrated an average increase of 0.47 in Body Ownership scores compared to conditions without synchronization. This difference was assessed using a t-test with 25 degrees of freedom ($t(25)=3.01$, $p=.0029$), and remained significant after applying the Holm correction for multiple comparisons ($p_{\text{holm}}=.0265$). The effect size, as measured by Cohen’s d, was 0.59, indicating a medium-to-large difference in perceived body ownership between the Embreathment and non-synchronized conditions.
Analysis using the Rubber Hand Illusion (RHI) questionnaire revealed a statistically significant increase in reported ownership when utilizing Embreathment. Specifically, RHI scores increased by an average of 0.36 points in the Embreathment condition compared to the control condition. A t-test with 25 degrees of freedom yielded $t(25)=3.05$, $p=.0027$, and a Holm-corrected $p_{\text{holm}}=.0265$. The calculated effect size, Cohen’s d, was 0.60, indicating a moderate-to-large effect of Embreathment on the feeling of body ownership as measured by the RHI.
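The reported effect sizes are consistent with a paired-samples design: for a paired t-test, Cohen's d can be recovered as $d = t/\sqrt{n}$. Assuming the 25 degrees of freedom correspond to $n = 26$ paired observations (an inference from the reported statistics, not stated explicitly here), the sketch below reproduces both reported values.

```python
import math

def cohens_d_from_t(t_stat, n):
    """Cohen's d for a paired-samples t-test: d = t / sqrt(n)."""
    return t_stat / math.sqrt(n)

# Reported statistics from the two questionnaires (df = 25 -> n = 26,
# an assumption about the design rather than a stated fact).
d_veq = cohens_d_from_t(3.01, 26)  # Body Ownership (VEQ): ~0.59
d_rhi = cohens_d_from_t(3.05, 26)  # Rubber Hand Illusion: ~0.60
```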
Beyond Control: The Expanding Implications of Embodied Interaction
The emerging field of Embreathment offers a novel approach to prosthetic control, promising more seamless and intuitive operation of assistive robotic limbs. Traditional prosthetic interfaces often rely on complex motor signals or cumbersome controls, creating a disconnect between intention and action. Embreathment, however, leverages the natural coordination between breathing and movement, integrating respiratory patterns into the control scheme. This allows users to manipulate prosthetics by modulating their breath – a subtle, innate action – fostering a stronger sense of embodiment and potentially reducing the cognitive load associated with operation. Early research indicates that this biofeedback-driven control system can significantly enhance the user’s ability to perform complex tasks with greater precision and fluidity, ultimately improving quality of life and promoting wider acceptance of prosthetic technology.
A heightened sense of body ownership, facilitated by technologies like Embreathment, directly correlates with increased self-efficacy in human-robot interaction. When an individual perceives a robotic limb or avatar as an integral part of their own body, their confidence in controlling and interacting with it grows substantially. This perception isn’t merely psychological; it impacts performance, allowing for more fluid and natural movements. Consequently, user acceptance of assistive devices and virtual embodiments increases, as the technology feels less like an external tool and more like an extension of the self. The resulting improvement in perceived control minimizes frustration and maximizes the potential benefits of human-robot collaboration, paving the way for more effective and intuitive interfaces.
The core tenets of Embreathment, initially explored within the realm of robotic control, possess a remarkable adaptability extending into virtual and augmented reality design. This approach, centered on subtly synchronizing visual stimuli with a user’s breathing patterns, can effectively cultivate a heightened sense of presence and embodiment within digital environments. By mirroring natural physiological rhythms, developers can move beyond purely visual or auditory cues, fostering a deeper, more convincing illusion of ‘being there’. This is particularly relevant as VR and AR strive for greater realism; subtle cues like breath synchronization can dramatically reduce simulator sickness, enhance immersion, and allow for more intuitive interactions with virtual objects and spaces – ultimately leading to more compelling and engaging user experiences.
Ongoing research endeavors are dedicated to refining Embreathment techniques to ensure accessibility and effectiveness across a wider spectrum of individuals, accounting for variations in age, physical ability, and neurological differences. This optimization process involves tailoring the sensory stimuli and feedback mechanisms to suit specific user needs and preferences. Beyond enhanced human-robot interaction, investigations are also underway to explore the therapeutic potential of Embreathment, particularly in areas such as rehabilitation following stroke or spinal cord injury, and in addressing conditions related to body image disturbance or phantom limb sensation. Preliminary studies suggest that fostering a stronger sense of body ownership through this technology may facilitate motor recovery, reduce chronic pain, and improve psychological well-being, opening avenues for novel interventions in both physical and mental healthcare.
The pursuit of seamless human-robot interaction, as explored in this research, hinges on establishing a predictable and reproducible connection, a principle Donald Davies articulated when he stated, “A system is only as good as its ability to be understood and reproduced.” The synchronization of breathing patterns between human and robot isn’t merely about mimicking life; it’s about creating a deterministic link that fosters a stronger sense of embodiment. By ensuring the robot’s actions are reliably tied to the user’s physiological state, the research effectively addresses the need for predictable system behavior, a cornerstone of robust and trustworthy teleoperation, enhancing the user’s feeling of body ownership and control.
Where Do We Go From Here?
The demonstrated correlation between synchronized robotic motion and a user’s perception of embodiment, while intriguing, merely shifts the fundamental question. It is not sufficient to induce a feeling of presence; the underlying mechanisms governing this sensation demand rigorous formalization. The current work establishes a phenomenological link, but lacks a predictive model capable of specifying, with mathematical certainty, the parameters of synchronization – respiratory rate, amplitude, and phase – necessary to achieve a quantifiable level of body ownership. To claim success requires more than subjective reports; it demands a provable relationship.
Future investigations should prioritize the development of such a model, potentially drawing from principles of predictive processing and Bayesian inference. The notion of ‘interoceptive prediction error’ – the discrepancy between expected and actual physiological states – offers a potential avenue for formalization, but its application to robotic teleoperation remains largely unexplored. Simply ‘matching’ breathing patterns is a heuristic; a true theory would specify why this synchronization facilitates embodiment, and under what conditions it would fail.
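As a toy illustration of how interoceptive prediction error might be formalized, the sketch below measures a precision-weighted squared circular distance between a predicted and an observed breathing phase. Every element here, including the function name, the circular-distance choice, and the precision weight, is an assumption for illustration, not a model proposed in the paper.

```python
import math

def prediction_error(observed_phase, predicted_phase, precision=1.0):
    """Toy interoceptive prediction error: precision-weighted squared
    circular distance between predicted and observed breathing phase
    (radians). A hypothetical formalization for illustration only."""
    diff = math.atan2(math.sin(observed_phase - predicted_phase),
                      math.cos(observed_phase - predicted_phase))
    return precision * diff ** 2

# Perfect synchrony yields zero error; larger phase lags cost more.
e_sync = prediction_error(0.0, 0.0)
e_lag = prediction_error(0.5, 0.0)
e_anti = prediction_error(math.pi, 0.0)
```

Under such a formulation, a theory of embodiment would predict that ownership degrades as this error grows, and specify the threshold at which the illusion breaks.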
Ultimately, the field must move beyond the pursuit of convincing illusions. The goal is not to simulate embodiment, but to understand the computational principles that give rise to it. Until these principles are expressed in a language amenable to formal verification, the quest for truly shared autonomy will remain, at best, an elegantly engineered conjecture.
Original article: https://arxiv.org/pdf/2512.14952.pdf
Contact the author: https://www.linkedin.com/in/avetisyan/
2025-12-18 16:18