<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Living with Assistive and Interactive Robots (LAIR) Lab</title><link>https://lair-lab.github.io/</link><atom:link href="https://lair-lab.github.io/index.xml" rel="self" type="application/rss+xml"/><description>Living with Assistive and Interactive Robots (LAIR) Lab</description><generator>Hugo Blox Builder (https://hugoblox.com)</generator><language>en-us</language><lastBuildDate>Mon, 24 Oct 2022 00:00:00 +0000</lastBuildDate><image><url>https://lair-lab.github.io/media/icon_hu6536440024406128664.png</url><title>Living with Assistive and Interactive Robots (LAIR) Lab</title><link>https://lair-lab.github.io/</link></image><item><title>Example Event</title><link>https://lair-lab.github.io/event/example/</link><pubDate>Sat, 01 Jun 2030 13:00:00 +0000</pubDate><guid>https://lair-lab.github.io/event/example/</guid><description>&lt;p>Slides can be added in a few ways:&lt;/p>
&lt;ul>
&lt;li>&lt;strong>Create&lt;/strong> slides using Wowchemy&amp;rsquo;s &lt;a href="https://docs.hugoblox.com/managing-content/#create-slides" target="_blank" rel="noopener">&lt;em>Slides&lt;/em>&lt;/a> feature and link to them using the &lt;code>slides&lt;/code> parameter in the front matter of the talk file&lt;/li>
&lt;li>&lt;strong>Upload&lt;/strong> an existing slide deck to &lt;code>static/&lt;/code> and link to it using the &lt;code>url_slides&lt;/code> parameter in the front matter of the talk file&lt;/li>
&lt;li>&lt;strong>Embed&lt;/strong> your slides (e.g. Google Slides) or presentation video on this page using &lt;a href="https://docs.hugoblox.com/writing-markdown-latex/" target="_blank" rel="noopener">shortcodes&lt;/a>.&lt;/li>
&lt;/ul>
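&lt;p>For instance, the first two options might look like this in a talk file&amp;rsquo;s front matter (the deck name &lt;code>my-deck.pdf&lt;/code> and slug &lt;code>example&lt;/code> are illustrative):&lt;/p>

```yaml
---
title: Example Event
# Option 1: link slides created with the Slides feature by their slug
slides: example
# Option 2: link a deck uploaded to static/
url_slides: my-deck.pdf
---
```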
&lt;p>Further event details, including page elements such as image galleries, can be added to the body of this page.&lt;/p></description></item><item><title>Research Projects</title><link>https://lair-lab.github.io/projects/</link><pubDate>Mon, 19 May 2025 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/projects/</guid><description/></item><item><title>Variable Stiffness Robotic Finger</title><link>https://lair-lab.github.io/project/bistablegripper/</link><pubDate>Thu, 01 May 2025 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/project/bistablegripper/</guid><description>
</description></item><item><title>Virtual reality and psHRI</title><link>https://lair-lab.github.io/project/vr/</link><pubDate>Wed, 01 Jan 2025 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/project/vr/</guid><description>&lt;p>The goal is to use virtual reality as a tool for studying psHRI by achieving contextually-rich but low-cost interactions.&lt;/p>
</description></item><item><title>Sensor Observability Analysis for Maximizing Task-Space Observability of Articulated Robots</title><link>https://lair-lab.github.io/publication/wong-sensor-observability-analysis-2024/</link><pubDate>Thu, 01 Aug 2024 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/wong-sensor-observability-analysis-2024/</guid><description/></item><item><title>Vision- and Tactile-Based Continuous Multimodal Intention and Attention Recognition for Safer Physical Human-Robot Interaction</title><link>https://lair-lab.github.io/publication/wong-vision-tactile-based-continuous-2024/</link><pubDate>Mon, 01 Jul 2024 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/wong-vision-tactile-based-continuous-2024/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Perspectives on Robotic Systems for the Visually Impaired</title><link>https://lair-lab.github.io/publication/wong-perspectives-robotic-systems-2024/</link><pubDate>Mon, 01 Jan 2024 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/wong-perspectives-robotic-systems-2024/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Roboethics for everyone - a hands-on teaching module for K-12 and beyond</title><link>https://lair-lab.github.io/publication/ananto-roboethics-everyone-handson-2024-a/</link><pubDate>Mon, 01 Jan 2024 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/ananto-roboethics-everyone-handson-2024-a/</guid><description>&lt;p>Add the &lt;strong>full 
text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Stigma and Service Robots</title><link>https://lair-lab.github.io/publication/akiyama-stigma-service-robots-2024-a/</link><pubDate>Mon, 01 Jan 2024 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/akiyama-stigma-service-robots-2024-a/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Robot-to-Human (R2H) Grasping</title><link>https://lair-lab.github.io/project/graspr2h/</link><pubDate>Tue, 30 May 2023 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/project/graspr2h/</guid><description>&lt;p>During interactions between a human and a robot, there may be times when the robot must purposefully make contact with or grasp the human (e.g., grasping a human’s hand to physically guide them through a task, to teach a motion, or to provide stability and support). Depending on how the robot grasps the human (e.g., grasp location, orientation, force, and open/closed grip), different grasp types may elicit different emotional responses. The goal of this project is to investigate which robot-to-human (R2H) contact/grasp factors affect the perceived safety and comfort of the interaction, and how these effects differ from those of comparable human-to-human (H2H) contact and grasping.&lt;/p>
</description></item><item><title>Contact</title><link>https://lair-lab.github.io/contact/</link><pubDate>Mon, 24 Oct 2022 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/contact/</guid><description/></item><item><title>People</title><link>https://lair-lab.github.io/people/</link><pubDate>Mon, 24 Oct 2022 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/people/</guid><description/></item><item><title>Tour</title><link>https://lair-lab.github.io/tour/</link><pubDate>Mon, 24 Oct 2022 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/tour/</guid><description/></item><item><title>Sensor Observability Index: Evaluating Sensor Alignment for Task-Space Observability in Robotic Manipulators</title><link>https://lair-lab.github.io/publication/wong-sensor-observability-index-2022/</link><pubDate>Sat, 01 Oct 2022 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/wong-sensor-observability-index-2022/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Human-Humanoid Robot Cooperative Load Transportation: Model-based Control Approach</title><link>https://lair-lab.github.io/publication/rahem-human-humanoid-robot-cooperative-2022/</link><pubDate>Sat, 01 Jan 2022 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/rahem-human-humanoid-robot-cooperative-2022/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Sensor Observability Analysis</title><link>https://lair-lab.github.io/project/soa/</link><pubDate>Sat, 01 Jan 2022 00:00:00 
+0000</pubDate><guid>https://lair-lab.github.io/project/soa/</guid><description>&lt;!-- Sensor Observability Analysis, akin to the kinematic manipulability index, aims to quantify the quality of sensor observations of task-space quantities based on the robot configuration for optimization purposes. -->
&lt;p>Sensor Observability Analysis, akin to the kinematic manipulability index, is a novel performance metric for articulated robotic mechanisms.
The goal is to analyse and evaluate how well robot-mounted, distributed directional or axial sensors can observe specific task-space axes as a function of joint configuration.
For example, joint torque sensors in serial robot manipulators are often assumed to be perfectly capable of estimating end-effector forces, but certain joint configurations may leave one or more task-space axes unobservable because of how the joint torque sensors are aligned.
Sensor observability analysis provides a method to quantify the cumulative ability of a robot configuration to observe the task space.
The resulting metrics can then be used in optimization and in null-space control to avoid sensor-observability-singular configurations or to maximize sensor observability in particular directions.
Parallels are drawn between sensor observability and the traditional kinematic Jacobian for the particular case of joint torque sensors in serial robot manipulators.
Compared to kinematic analysis using the Jacobian in serial manipulators, sensor observability analysis is more general: it can handle non-joint-mounted sensors and can potentially be applied to sensor types beyond force sensing, e.g., link-mounted proximity sensors.
We demonstrate the utility and importance of sensor observability in physical interactions through simulations and experiments with a custom 3-DOF robot and the Baxter robot.&lt;/p>
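&lt;p>As a minimal sketch of the underlying idea (not the papers&amp;rsquo; exact formulation), one can stack the task-space direction each sensor observes at a given configuration into a matrix and use its smallest singular value as a crude observability score; a value near zero flags a configuration in which some task-space axis is (nearly) unobservable:&lt;/p>

```python
import numpy as np

def observability_score(sensor_axes):
    """Crude observability score for a set of task-space sensing directions.

    sensor_axes: array of shape (n_sensors, 3), where row i is the unit
    task-space direction that sensor i observes at the current joint
    configuration. Returns the smallest singular value of the stacked
    matrix: near zero means some task-space axis is nearly unobservable.
    """
    axes = np.asarray(sensor_axes, dtype=float)
    return np.linalg.svd(axes, compute_uv=False)[-1]

# Three sensors spanning all three translational axes: well observable.
good = observability_score([[1, 0, 0], [0, 1, 0], [0, 0, 1]])

# All sensors aligned with x: the y and z axes are unobservable.
bad = observability_score([[1, 0, 0], [1, 0, 0], [1, 0, 0]])
```

A null-space controller could, for instance, steer the arm away from configurations where this score approaches zero.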
&lt;p>&lt;strong>Related Research Items&lt;/strong>:&lt;/p>
&lt;ul>
&lt;li>C. Y. Wong and W. Suleiman, &amp;ldquo;Sensor Observability Analysis for Maximizing Task-Space Observability of Articulated Robots,&amp;rdquo; in &lt;em>IEEE Transactions on Robotics&lt;/em>, 2024. (Accepted)
&lt;ul>
&lt;li>&lt;a href="https://www.researchgate.net/publication/370687950_Sensor_Observability_Analysis_for_Maximizing_Task-Space_Observability_of_Articulated_Robots" target="_blank" rel="noopener">ResearchGate Link&lt;/a>&lt;/li>
&lt;/ul>
&lt;/li>
&lt;li>C. Y. Wong and W. Suleiman, &amp;ldquo;Sensor Observability Index: Evaluating Sensor Alignment for Task-Space Observability in Robotic Manipulators,&amp;rdquo; &lt;em>2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)&lt;/em>, Kyoto, Japan, 2022, pp. 1276-1282.
&lt;ul>
&lt;li>&lt;a href="https://ieeexplore.ieee.org/document/9982209" target="_blank" rel="noopener">IEEE &lt;em>Xplore&lt;/em> Link&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.researchgate.net/publication/362629254_Sensor_Observability_Index_Evaluating_Sensor_Alignment_for_Task-Space_Observability_in_Robotic_Manipulators" target="_blank" rel="noopener">ResearchGate Link&lt;/a>&lt;/li>
&lt;li>&lt;a href="https://www.youtube.com/watch?v=W8IQpi4CBZg" target="_blank" rel="noopener">IROS 2022 Presentation (Kyoto, Oct 2022)&lt;/a>&lt;/li>
&lt;/ul>
&lt;/li>
&lt;/ul></description></item><item><title>Touch Semantics for Intuitive Physical Manipulation of Humanoids</title><link>https://lair-lab.github.io/publication/wong-touch-semantics-intuitive-2022/</link><pubDate>Sat, 01 Jan 2022 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/wong-touch-semantics-intuitive-2022/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Dynamic modelling, simulation and experiments of a micro-cutter with applications to cell perforation</title><link>https://lair-lab.github.io/publication/bahadur-dynamic-modelling-simulation-2021-a/</link><pubDate>Fri, 01 Jan 2021 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/bahadur-dynamic-modelling-simulation-2021-a/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Placeholder News 1</title><link>https://lair-lab.github.io/post/20-12-02-icml-best-paper/</link><pubDate>Wed, 02 Dec 2020 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/post/20-12-02-icml-best-paper/</guid><description>&lt;p>Placeholder News 1&lt;/p>
&lt;p>Lorem ipsum dolor sit amet, consectetur adipiscing elit. Integer tempus augue non tempor egestas. Proin nisl nunc, dignissim in accumsan dapibus, auctor ullamcorper neque. Quisque at elit felis. Vestibulum ante ipsum primis in faucibus orci luctus et ultrices posuere cubilia curae; Aenean eget elementum odio. Cras interdum eget risus sit amet aliquet. In volutpat, nisl ut fringilla dignissim, arcu nisl suscipit ante, at accumsan sapien nisl eu eros.&lt;/p>
&lt;p>Sed eu dui nec ligula bibendum dapibus. Nullam imperdiet auctor tortor, vel cursus mauris malesuada non. Quisque ultrices euismod dapibus. Aenean sed gravida risus. Sed nisi tortor, vulputate nec quam non, placerat porta nisl. Nunc varius lobortis urna, condimentum facilisis ipsum molestie eu. Ut molestie eleifend ligula sed dignissim. Duis ut tellus turpis. Praesent tincidunt, nunc sed congue malesuada, mauris enim maximus massa, eget interdum turpis urna et ante. Morbi sem nisl, cursus quis mollis et, interdum luctus augue. Aliquam laoreet, leo et accumsan tincidunt, libero neque aliquet lectus, a ultricies lorem mi a orci.&lt;/p>
&lt;p>Mauris dapibus sem vel magna convallis laoreet. Donec in venenatis urna, vitae sodales odio. Praesent tortor diam, varius non luctus nec, bibendum vel est. Quisque id sem enim. Maecenas at est leo. Vestibulum tristique pellentesque ex, blandit placerat nunc eleifend sit amet. Fusce eget lectus bibendum, accumsan mi quis, luctus sem. Etiam vitae nulla scelerisque, eleifend odio in, euismod quam. Etiam porta ullamcorper massa, vitae gravida turpis euismod quis. Mauris sodales sem ac ultrices viverra. In placerat ultrices sapien. Suspendisse eu arcu hendrerit, luctus tortor cursus, maximus dolor. Proin et velit et quam gravida dapibus. Donec blandit justo ut consequat tristique.&lt;/p></description></item><item><title>Placeholder News 2</title><link>https://lair-lab.github.io/post/20-12-01-wowchemy-prize/</link><pubDate>Tue, 01 Dec 2020 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/post/20-12-01-wowchemy-prize/</guid><description>&lt;p>Placeholder News 2&lt;/p>
&lt;p>Lorem ipsum dolor sit amet, consectetur adipiscing elit. Integer tempus augue non tempor egestas. Proin nisl nunc, dignissim in accumsan dapibus, auctor ullamcorper neque. Quisque at elit felis. Vestibulum ante ipsum primis in faucibus orci luctus et ultrices posuere cubilia curae; Aenean eget elementum odio. Cras interdum eget risus sit amet aliquet. In volutpat, nisl ut fringilla dignissim, arcu nisl suscipit ante, at accumsan sapien nisl eu eros.&lt;/p>
&lt;p>Sed eu dui nec ligula bibendum dapibus. Nullam imperdiet auctor tortor, vel cursus mauris malesuada non. Quisque ultrices euismod dapibus. Aenean sed gravida risus. Sed nisi tortor, vulputate nec quam non, placerat porta nisl. Nunc varius lobortis urna, condimentum facilisis ipsum molestie eu. Ut molestie eleifend ligula sed dignissim. Duis ut tellus turpis. Praesent tincidunt, nunc sed congue malesuada, mauris enim maximus massa, eget interdum turpis urna et ante. Morbi sem nisl, cursus quis mollis et, interdum luctus augue. Aliquam laoreet, leo et accumsan tincidunt, libero neque aliquet lectus, a ultricies lorem mi a orci.&lt;/p>
&lt;p>Mauris dapibus sem vel magna convallis laoreet. Donec in venenatis urna, vitae sodales odio. Praesent tortor diam, varius non luctus nec, bibendum vel est. Quisque id sem enim. Maecenas at est leo. Vestibulum tristique pellentesque ex, blandit placerat nunc eleifend sit amet. Fusce eget lectus bibendum, accumsan mi quis, luctus sem. Etiam vitae nulla scelerisque, eleifend odio in, euismod quam. Etiam porta ullamcorper massa, vitae gravida turpis euismod quis. Mauris sodales sem ac ultrices viverra. In placerat ultrices sapien. Suspendisse eu arcu hendrerit, luctus tortor cursus, maximus dolor. Proin et velit et quam gravida dapibus. Donec blandit justo ut consequat tristique.&lt;/p></description></item><item><title>Cell extraction automation in single cell surgery using the displacement method</title><link>https://lair-lab.github.io/publication/wong-cell-extraction-automation-2019/</link><pubDate>Tue, 01 Jan 2019 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/wong-cell-extraction-automation-2019/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Gravity Compensation for Impedance Control of Legged Robots Using Optimizationless Proportional Contact Force Estimation</title><link>https://lair-lab.github.io/publication/wong-gravity-compensation-impedance-2018-b/</link><pubDate>Mon, 01 Jan 2018 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/wong-gravity-compensation-impedance-2018-b/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Theory of series inductance matching to transducer at premechanical resonance zone in ultrasonic vibration 
cutting</title><link>https://lair-lab.github.io/publication/jiang-theory-series-inductance-2018/</link><pubDate>Mon, 01 Jan 2018 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/jiang-theory-series-inductance-2018/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Automation of Single Cell Manipulation for Embryo Biopsy</title><link>https://lair-lab.github.io/publication/wong-automation-single-cell-2017/</link><pubDate>Sun, 01 Jan 2017 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/wong-automation-single-cell-2017/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Dynamic modelling and embryo zona pellucida perforation experiments with piezoelectric actuated micro-needles</title><link>https://lair-lab.github.io/publication/bahadur-dynamic-modelling-embryo-2017/</link><pubDate>Sun, 01 Jan 2017 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/bahadur-dynamic-modelling-embryo-2017/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Automation and optimization of multipulse laser zona drilling of mouse embryos during embryo biopsy</title><link>https://lair-lab.github.io/publication/wong-automation-optimization-multipulse-2016/</link><pubDate>Fri, 01 Jan 2016 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/wong-automation-optimization-multipulse-2016/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown 
formatting.&lt;/p></description></item><item><title>Cleavage-stage embryo rotation tracking and automated micropipette control: Towards automated single cell manipulation</title><link>https://lair-lab.github.io/publication/wong-cleavagestage-embryo-rotation-2016-a/</link><pubDate>Fri, 01 Jan 2016 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/wong-cleavagestage-embryo-rotation-2016-a/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Multi-Pulse laser ablation modeling with applications to automated zona removal</title><link>https://lair-lab.github.io/publication/wong-multi-pulse-laser-ablation-2015/</link><pubDate>Thu, 01 Jan 2015 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/wong-multi-pulse-laser-ablation-2015/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Posture Reconfiguration and Navigation Maneuvers on a Wheel-Legged Hydraulic Robot</title><link>https://lair-lab.github.io/publication/wong-posture-reconfiguration-navigation-2015/</link><pubDate>Thu, 01 Jan 2015 00:00:00 +0000</pubDate><guid>https://lair-lab.github.io/publication/wong-posture-reconfiguration-navigation-2015/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item><item><title>Posture reconfiguration and step climbing maneuvers for a wheel-legged robot</title><link>https://lair-lab.github.io/publication/wong-posture-reconfiguration-step-2014-a/</link><pubDate>Wed, 01 Jan 2014 00:00:00 
+0000</pubDate><guid>https://lair-lab.github.io/publication/wong-posture-reconfiguration-step-2014-a/</guid><description>&lt;p>Add the &lt;strong>full text&lt;/strong> or &lt;strong>supplementary notes&lt;/strong> for the publication here using Markdown formatting.&lt;/p></description></item></channel></rss>