HAPTICS/VR

VIST (Visual-Inertial Skeleton Tracking) for Hand Tracking

State-of-the-art technologies for hand (and finger) tracking do not always provide accurate and robust tracking: occlusions can degrade vision sensors, electromagnetic interference affects inertial measurement units (IMUs) and compasses, and ambiguous mechanical contact confounds soft sensors. We propose visual-inertial skeleton tracking (VIST) for robust and accurate hand tracking in a variety of real-world scenarios. VIST comprises a sensor glove with multiple IMUs and passive visual markers, together with a head-mounted stereo camera. It uses tightly coupled filtering-based visual-inertial fusion to estimate hand motion while simultaneously auto-calibrating hand/glove parameters under hand anatomical constraints. VIST has the potential to enrich not only human-robot interaction but also user experience in VR/AR/MR. (Science Robotics 2021, https://www.science.org/doi/10.1126/scirobotics.abe1315)
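As a rough illustration of the tightly coupled fusion, here is a minimal extended-Kalman-filter sketch for a single hypothetical 1-D joint, fusing fast IMU rate predictions with slower marker-position updates. The actual VIST filter estimates the full hand-skeleton state, IMU biases, and glove parameters under anatomical constraints; the rates, noise values, and fingertip measurement model below are illustrative assumptions only.

```python
import numpy as np

# Minimal EKF sketch of tightly coupled visual-inertial fusion for one joint.
dt = 0.005          # IMU period (200 Hz, assumed)
L = 0.04            # phalanx length [m] (assumed)

x = np.array([0.0, 0.0])          # state: [joint angle, angular rate]
P = np.eye(2) * 0.1               # state covariance
Q = np.diag([1e-6, 1e-4])         # process noise (tuning assumption)
R = np.diag([1e-4, 1e-4])         # marker position noise (assumption)
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-rate motion model

def predict(x, P, gyro):
    """IMU propagation: integrate the measured angular rate."""
    x = np.array([x[0] + gyro * dt, gyro])
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, marker_xy):
    """Vision update: a marker at the fingertip, h(x) = L*[cos, sin](angle)."""
    h = L * np.array([np.cos(x[0]), np.sin(x[0])])
    H = np.array([[-L * np.sin(x[0]), 0.0],
                  [ L * np.cos(x[0]), 0.0]])        # Jacobian of h
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
    x = x + K @ (marker_xy - h)
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Fuse: many IMU predictions per (slower) camera frame.
for k in range(40):
    x, P = predict(x, P, gyro=1.0)      # fake 1 rad/s gyro reading
    if k % 8 == 0:                      # camera at 25 Hz (assumed)
        true_angle = 1.0 * (k + 1) * dt
        z = L * np.array([np.cos(true_angle), np.sin(true_angle)])
        x, P = update(x, P, z)
print("estimated angle:", x[0])
```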

Real-Time Accurate Simulation with Multi-Contact

Fast, accurate, and stable simulation with multi-contact and tight tolerance is crucial to data-driven approaches such as deep reinforcement learning, yet current robot simulators (e.g., ODE, Vortex, Bullet) do not provide this capability. We propose a novel data-driven contact clustering based on an interaction network trained with real experimental data. Combining this with our passive mid-point integrator (PMI, IJRR17), we attain real-time, experimentally validated simulation of peg-in-hole and bolting tasks with multi-contact and very tight tolerance, which no other current simulator can achieve. (ICRA19)
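To illustrate the role contact clustering plays, the sketch below greedily groups nearby raw contact points into a few representatives before they reach the contact solver. Note that the ICRA19 method learns this clustering with an interaction network trained on real data; the distance-based heuristic and the 2 mm radius here are only stand-ins.

```python
import numpy as np

# Stand-in for learned contact clustering: hundreds of raw contacts from a
# tight-tolerance collision query are reduced to a few representatives.
def cluster_contacts(points, normals, radius=0.002):
    """Greedy grouping: contacts within `radius` share one representative."""
    reps = []                           # (mean point, mean normal) per cluster
    assigned = np.zeros(len(points), dtype=bool)
    for i in range(len(points)):
        if assigned[i]:
            continue
        d = np.linalg.norm(points - points[i], axis=1)
        members = (d < radius) & ~assigned
        assigned |= members
        p = points[members].mean(axis=0)
        n = normals[members].mean(axis=0)
        reps.append((p, n / np.linalg.norm(n)))
    return reps

rng = np.random.default_rng(0)
pts = rng.normal(scale=0.001, size=(200, 3))     # 200 raw contacts, ~1 mm spread
nrm = np.tile([0.0, 0.0, 1.0], (200, 1))
print(len(cluster_contacts(pts, nrm)), "representative contacts")
```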

Passivity-Based Haptic Rendering

We propose a novel PMI-based haptic rendering and interactive simulation framework, which enforces discrete-time passivity of the simulation for mechanical systems in both maximal and generalized coordinates, while fully incorporating multi-point Coulomb frictional contact via a PMI-LCP formulation. We apply the framework to several illustrative examples that manifest its advantages: 1) haptic rendering of a peg-in-hole task, where very light/stiff articulated objects are simulated with multi-point contact; 2) haptic interaction with a flexible beam, where marginally stable/lossless behavior (i.e., vibration) is stably emulated; and 3) under-actuated tendon-driven hand grasping, where mixed maximal-generalized coordinates are used with very light/stiff fingers. (IJRR17)
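The passivity property behind PMI can be glimpsed in the following toy sketch: the implicit midpoint rule evaluates forces at the state midpoint, so a lossless 1-DOF oscillator shows essentially no numerical energy gain even over long horizons. This is only an illustration of the integrator idea; the IJRR17 framework extends it to constrained multibody systems with LCP-based Coulomb friction.

```python
import numpy as np

# Implicit midpoint step for a 1-DOF mass-spring(-damper); b=0 (lossless)
# exposes any spurious numerical energy drift.
m, k, b, dt = 1.0, 500.0, 0.0, 0.001

def midpoint_step(q, v):
    # Solve the linear implicit-midpoint equations for (q+, v+):
    #   (q+ - q)/dt   = (v + v+)/2
    #   m (v+ - v)/dt = -k (q + q+)/2 - b (v + v+)/2
    A = np.array([[1.0, -dt / 2.0],
                  [k * dt / (2.0 * m), 1.0 + b * dt / (2.0 * m)]])
    rhs = np.array([q + dt / 2.0 * v,
                    v - dt / (2.0 * m) * (k * q + b * v)])
    q_next, v_next = np.linalg.solve(A, rhs)
    return q_next, v_next

q, v = 0.01, 0.0
E0 = 0.5 * m * v**2 + 0.5 * k * q**2
for _ in range(100_000):                 # 100 s of simulation
    q, v = midpoint_step(q, v)
E1 = 0.5 * m * v**2 + 0.5 * k * q**2
print(f"energy drift after 100 s: {abs(E1 - E0):.2e} J")  # ~0: no energy gain
```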

Wearable Cutaneous Haptic Interface

We propose a wearable/mobile cutaneous haptic interface (WCHI) that provides 3-DOF fingertip contact force using three wire-driven actuators, with FSR and soft sensors for feedback control and an IMU plus soft sensors for real-time finger/hand pose tracking for VR rendering. By combining the IMU, FSR, and soft sensors, the WCHI tracks multi-DOF finger motions while avoiding motor-IMU interference. Its feedback control also allows for precise 3-DOF contact force display while addressing variability among human users. (T-MECH 2018, collaboration with the SNU Soft Robotics and Bionics Laboratory)
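A minimal sketch of one force-feedback loop, assuming a hypothetical first-order actuator and FSR model with illustrative PI gains; the actual WCHI closes three such loops (one per wire) to display a full 3-DOF fingertip force.

```python
# Per-wire force feedback loop: track a desired fingertip contact force from
# VR rendering using the FSR/soft-sensor measurement. Plant, gains, and rates
# below are illustrative assumptions, not the T-MECH 2018 identified values.
dt, kp, ki = 0.001, 2.0, 50.0          # 1 kHz loop, assumed PI gains
integral, tension = 0.0, 0.0           # controller state, wire tension [N]

def fsr_measure(tension):
    """Hypothetical sensor model: measured force is attenuated wire tension."""
    return 0.9 * tension

f_des = 1.5                             # desired normal force from VR [N]
for _ in range(2000):                   # 2 s of closed-loop control
    err = f_des - fsr_measure(tension)
    integral += err * dt
    cmd = kp * err + ki * integral      # tension command to the motor
    tension += (cmd - tension) * dt / 0.01   # first-order actuator (10 ms)
print(f"steady-state force: {fsr_measure(tension):.2f} N (target {f_des} N)")
```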

Multi-User Haptic Interaction

Our goal is to allow geographically distributed users to haptically interact with each other in a shared VR cyberspace. For this, we adopt a peer-to-peer (P2P) consensus architecture to provide instantaneous and consistent haptic stimuli to the users. Passivity is utilized throughout, from rendering to consensus and virtual coupling, to ensure system stability, particularly against information delay over the Internet. A wearable/mobile cutaneous haptic device with IMU-based finger tracking is also adopted for a rich, untethered haptic experience. (T-RO 2013)
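A toy sketch of the consensus idea: each peer integrates a local copy of the shared virtual object, and an all-to-all consensus coupling keeps the copies consistent while one user pushes the object. Gains, topology, and rates are illustrative assumptions, and the delay-robust passivity machinery of T-RO 2013 is omitted.

```python
import numpy as np

# Each peer simulates its own copy of the shared virtual object; consensus
# coupling on position and velocity drives the copies to agreement.
n_peers, dt, m = 3, 0.001, 0.5
kc, bc = 500.0, 5.0                     # consensus coupling gains (assumed)
x = np.array([0.0, 0.1, -0.05])        # initially inconsistent local copies
v = np.zeros(n_peers)
user_force = np.array([1.0, 0.0, 0.0]) # only peer 0 pushes the object

for _ in range(5000):                   # 5 s at 1 kHz
    coupling = kc * (x.mean() - x) + bc * (v.mean() - v)
    v += dt * (user_force + coupling) / m
    x += dt * v
print("local copies after 5 s:", np.round(x, 3))  # agree to ~mm while moving
```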


Multi-Modal Perception in VR

A wearable/mobile cutaneous haptic device with integrated IMU-based finger tracking is promising for duplicating, in VR, rich real-world tasks that rely on finger/hand dexterity. However, with its 3-DOF actuation miniaturized on the finger and its reliance on commercially viable IMU sensors, neither its actuation nor its sensing can be perfect. To address this limitation of wearable/mobile haptic/VR devices, we explore ways to exploit the (limited) multi-modal perception ability of human users. In particular, we found that: (1) pseudo-haptics can double the perceived strength of the cutaneous haptic actuation; and (2) a fingertip tracking error of around 3 cm is within the detection threshold of users in VR with cutaneous haptic feedback. (HS2014, WHC2015)
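As a sketch of the pseudo-haptics idea behind finding (1): scaling the control-display ratio makes the virtual finger move less than the real finger, a discrepancy users attribute to a stiffer contact. The 0.5 ratio and stiffness values below are illustrative, not the thresholds measured in the studies.

```python
# Pseudo-haptics via control-display (C/D) ratio scaling: show the virtual
# fingertip penetrating less than the real fingertip to amplify perceived
# stiffness beyond what the cutaneous actuator alone renders.
def displayed_penetration(real_finger_depth, cd_ratio=0.5):
    """Virtual finger moves cd_ratio times the real finger's motion."""
    return real_finger_depth * cd_ratio

k_rendered = 200.0                       # N/m displayed by the actuator (assumed)
for depth in (0.005, 0.010):             # real fingertip penetration [m]
    shown = displayed_penetration(depth)
    # apparent stiffness ~ actuator force / displayed penetration
    print(f"real {depth*1000:.0f} mm -> shown {shown*1000:.1f} mm, "
          f"apparent stiffness ~{k_rendered * depth / shown:.0f} N/m")
```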