Chang et al., 2024 - Google Patents
Real-time arm motion tracking and hand gesture recognition based on a single inertial measurement unit
- Document ID: 7562671982407907817
- Authors: Chang T, Wu Y, Han C, Chang C
- Publication year: 2024
- Publication venue: 2024 11th International Conference on Internet of Things: Systems, Management and Security (IOTSMS)
Snippet
With the development of virtual reality (VR) and augmented reality (AR) devices, handheld controllers and camera-based hand tracking are the most common methods for interacting with the virtual world. This paper proposes arm motion tracking and hand gesture …
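As a rough illustration only (the snippet does not disclose the authors' actual algorithm), the sketch below shows the kind of processing a single-IMU arm tracker typically performs: fusing gyroscope and accelerometer samples into a tilt estimate with a simple complementary filter. All function names, sample values, and the filter itself are illustrative assumptions, not material from the paper.

```python
import math

def complementary_tilt(gyro_rate_dps, accel_xyz, prev_angle_deg, dt, alpha=0.98):
    """Illustrative complementary filter (not the paper's method):
    blends an integrated gyroscope rate with the tilt angle implied by
    the accelerometer's gravity reading for one arm-segment axis."""
    ax, ay, az = accel_xyz
    # Long-term reference: tilt angle from the gravity direction (degrees)
    accel_angle = math.degrees(math.atan2(ay, az))
    # Short-term estimate: integrate the gyroscope rate over one sample period
    gyro_angle = prev_angle_deg + gyro_rate_dps * dt
    # Weighted blend: trust the gyro at high rates, the accelerometer over time
    return alpha * gyro_angle + (1.0 - alpha) * accel_angle

# Hypothetical 100 Hz IMU samples (rate in deg/s, acceleration in g)
angle = 0.0
for gyro, accel in [(2.0, (0.0, 0.17, 0.98)), (1.5, (0.0, 0.20, 0.97))]:
    angle = complementary_tilt(gyro, accel, angle, dt=0.01)
```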
Classifications
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/014—Hand-worn input/output arrangements, e.g. data gloves
- G06F3/03545—Pens or stylus
- G06F3/012—Head tracking input arrangements
- G06F3/0304—Detection arrangements using opto-electronic means
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06K9/00355—Recognition of hand or arm movements, e.g. recognition of deaf sign language
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- A61B5/1121—Determining geometric values, e.g. centre of rotation or angular range of movement
Similar Documents
| Publication | Title |
|---|---|
| US10860091B2 (en) | Motion predictions of overlapping kinematic chains of a skeleton model used to control a computer system |
| Fang et al. | 3D human gesture capturing and recognition by the IMMU-based data glove |
| EP3707584B1 (en) | Method for tracking hand pose and electronic device thereof |
| US11474593B2 (en) | Tracking user movements to control a skeleton model in a computer system |
| Zhang et al. | Fine-grained and real-time gesture recognition by using IMU sensors |
| Fang et al. | A novel data glove using inertial and magnetic sensors for motion capture and robotic arm-hand teleoperation |
| Lu et al. | Immersive manipulation of virtual objects through glove-based hand gesture interaction |
| Fang et al. | Robotic teleoperation systems using a wearable multimodal fusion device |
| US11175729B2 (en) | Orientation determination based on both images and inertial measurement units |
| US10976863B1 (en) | Calibration of inertial measurement units in alignment with a skeleton model to control a computer system based on determination of orientation of an inertial measurement unit from an image of a portion of a user |
| US11079860B2 (en) | Kinematic chain motion predictions using results from multiple approaches combined via an artificial neural network |
| Fang et al. | Development of a wearable device for motion capturing based on magnetic and inertial measurement units |
| CN108279773B (en) | A data glove based on MARG sensor and magnetic field positioning technology |
| Chang et al. | Real-time arm motion tracking and hand gesture recognition based on a single inertial measurement unit |
| WO2020009715A2 (en) | Tracking user movements to control a skeleton model in a computer system |
| Placidi et al. | Data integration by two-sensors in a LEAP-based Virtual Glove for human-system interaction |
| Abualola et al. | Flexible gesture recognition using wearable inertial sensors |
| Osawa et al. | Telerehabilitation system based on OpenPose and 3D reconstruction with monocular camera |
| CN105068657A (en) | Gesture identification method and device |
| CN113496168B (en) | Sign language data acquisition method, device and storage medium |
| WO2022228056A1 (en) | Human-computer interaction method and device |
| Tsekleves et al. | Wii your health: a low-cost wireless system for home rehabilitation after stroke using Wii remotes with its expansions and blender |
| Fang et al. | A novel data glove for fingers motion capture using inertial and magnetic measurement units |
| Oh | A Study on MTL Device Design and Motion Tracking in Virtual Reality Environments |
| Xie et al. | Enhanced Recognition for Finger Gesture-Based Control in Humanoid Robots Using Inertial Sensors |