<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Active | Living with Assistive and Interactive Robots (LAIR) Lab</title><link>https://lair-lab.github.io/tag/active/</link><atom:link href="https://lair-lab.github.io/tag/active/index.xml" rel="self" type="application/rss+xml"/><description>Active</description><generator>Hugo Blox Builder (https://hugoblox.com)</generator><language>en-us</language><lastBuildDate>Thu, 01 May 2025 00:00:00 +0000</lastBuildDate><image><url>https://lair-lab.github.io/media/icon_hu6536440024406128664.png</url><title>Active</title><link>https://lair-lab.github.io/tag/active/</link></image><item><title>Variable Stiffness Robotic Finger</title><link>https://lair-lab.github.io/project/bistablegripper/</link><pubDate>Thu, 01 May 2025 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/project/bistablegripper/</guid><description></description></item><item><title>Virtual reality and psHRI</title><link>https://lair-lab.github.io/project/vr/</link><pubDate>Wed, 01 Jan 2025 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/project/vr/</guid><description>&lt;p>The goal is to use virtual reality as a tool for studying psHRI by achieving contextually-rich but low-cost interactions.&lt;/p>
</description></item><item><title>Robot-to-Human (R2H) Grasping</title><link>https://lair-lab.github.io/project/graspr2h/</link><pubDate>Tue, 30 May 2023 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/project/graspr2h/</guid><description>&lt;p>During interactions between a human and a robot, there may be times when the robot must purposefully make contact with or grasp the human (e.g., grasping a human’s hand to physically guide them through a task, to teach a motion, or to provide stability and support). Depending on how the robot grasps the human (e.g., grasp location, orientation, force, and open/closed grip), different grasp types may elicit different emotional responses. The goal of this project is to investigate which robot-to-human (R2H) contact/grasp factors affect the perceived safety and comfort of the interaction, and how these differ from similar human-to-human (H2H) contact/grasping.&lt;/p>
</description></item><item><title>Sensor Observability Analysis</title><link>https://lair-lab.github.io/project/soa/</link><pubDate>Sat, 01 Jan 2022 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/project/soa/</guid><description>
&lt;p>Sensor Observability Analysis, akin to the kinematic manipulability index, is a novel performance metric for articulated robotic mechanisms.
The goal is to evaluate how well robot-mounted, distributed directional or axial sensors observe specific task-space axes as a function of joint configuration.
For example, joint torque sensors in serial robot manipulators are often assumed to be fully capable of estimating end-effector forces, but certain joint configurations can render one or more task-space axes unobservable because of how the joint torque sensors are aligned.
The proposed sensor observability analysis quantifies how well a given robot configuration can cumulatively observe the task space.
The resulting metrics can then be used in optimization and in null-space control to avoid sensor observability singular configurations or to maximize sensor observability in particular directions.
Parallels are drawn between sensor observability and the traditional kinematic Jacobian for the particular case of joint torque sensors in serial robot manipulators.
Compared to kinematic analysis using the Jacobian in serial manipulators, sensor observability analysis is more generalizable: it handles non-joint-mounted sensors and can potentially be applied to sensor types beyond force sensing, e.g., link-mounted proximity sensors.
We demonstrate the utility and importance of sensor observability in physical interactions using simulations and experiments with a custom 3-DOF robot and the Baxter robot.&lt;/p>
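&lt;p>As a rough, hypothetical sketch of the idea above (not the paper's exact formulation): in a planar 2-link arm with joint torque sensors, each sensor observes end-effector forces only along the direction of its own Jacobian column (since tau = J.T F). Summing the absolute unit-direction components of all sensors along each task axis flags configurations where an axis becomes unobservable. The function names and link lengths below are illustrative.&lt;/p>

```python
# Toy illustration only: per-axis "observability" of joint torque sensors
# in a planar 2R arm, in the spirit of sensor observability analysis.
import numpy as np

L1, L2 = 1.0, 0.8  # link lengths (hypothetical values)

def jacobian(q1, q2):
    """Translational Jacobian of a planar 2R arm; tau = J.T @ F."""
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def axis_observability(q1, q2):
    """Cumulative observability of each task axis: sum over sensors of the
    absolute component of each sensor's unit force direction on that axis."""
    J = jacobian(q1, q2)
    cols = J / np.linalg.norm(J, axis=0)  # unit direction seen by each sensor
    return np.abs(cols).sum(axis=1)       # one value per task-space axis

# Fully extended arm: forces along x produce no joint torque, so the
# x axis is unobservable by the torque sensors at this configuration.
print(axis_observability(0.0, 0.0))       # x component is ~0
print(axis_observability(0.0, np.pi / 2))
```

&lt;p>A null-space controller in this toy setting would steer joints away from configurations where any entry of this vector approaches zero, which mirrors the "avoid sensor observability singular configurations" use described above.&lt;/p>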
&lt;p>&lt;strong>Related Research Items&lt;/strong>:&lt;/p>
&lt;ul>
&lt;li>C. Y. Wong and W. Suleiman, &amp;ldquo;Sensor Observability Analysis for Maximizing Task-Space Observability of Articulated Robots,&amp;rdquo; in &lt;em>IEEE Transactions on Robotics&lt;/em>, 2024. (Accepted)
&lt;ul>
&lt;li>&lt;a href="https://www.researchgate.net/publication/370687950_Sensor_Observability_Analysis_for_Maximizing_Task-Space_Observability_of_Articulated_Robots" target="_blank" rel="noopener">ResearchGate Link&lt;/a>&lt;/li>
&lt;/ul>
&lt;/li>
&lt;li>C. Y. Wong and W. Suleiman, &amp;ldquo;Sensor Observability Index: Evaluating Sensor Alignment for Task-Space Observability in Robotic Manipulators,&amp;rdquo; &lt;em>2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)&lt;/em>, Kyoto, Japan, 2022, pp. 1276-1282.
&lt;ul>
&lt;li>&lt;a href="https://ieeexplore.ieee.org/document/9982209" target="_blank" rel="noopener">IEEE &lt;em>Xplore&lt;/em> Link&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.researchgate.net/publication/362629254_Sensor_Observability_Index_Evaluating_Sensor_Alignment_for_Task-Space_Observability_in_Robotic_Manipulators" target="_blank" rel="noopener">ResearchGate Link&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.youtube.com/watch?v=W8IQpi4CBZg" target="_blank" rel="noopener">IROS 2022 Presentation (Kyoto, Oct 2022)&lt;/a>&lt;/li>
&lt;/ul>
&lt;/li>
&lt;/ul></description></item></channel></rss>