CN108815804A - VR upper-limb rehabilitation training platform and method based on MYO armband and mobile terminal - Google Patents
VR upper-limb rehabilitation training platform and method based on MYO armband and mobile terminal
- Publication number
- CN108815804A (application CN201810602719.0A)
- Authority
- CN
- China
- Prior art keywords
- gesture
- virtual environment
- unity
- myo
- upper limb
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63B—APPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
- A63B23/12—Exercising apparatus specially adapted for upper limbs or related muscles, e.g. chest, upper back or shoulder muscles
- A63B23/14—… for wrist joints
- A63B23/16—… for hands or fingers
- A63B71/0622—Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
- A63B2071/0638—Displaying moving images of recorded environment, e.g. virtual environment
- A63B2071/0647—Visualisation of executed movements
- A63B2071/0666—Position or arrangement of display arranged on the user, worn on the head or face, e.g. combined with goggles or glasses
- A63B2220/803—Motion sensors
- A63B2230/085—Other bio-electrical signals of the user used as a control parameter for the apparatus
- A63B2230/625—User posture used as a control parameter for the apparatus
Landscapes
- Health & Medical Sciences (AREA)
- Orthopedic Medicine & Surgery (AREA)
- General Health & Medical Sciences (AREA)
- Physical Education & Sports Medicine (AREA)
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Human Computer Interaction (AREA)
- Rehabilitation Tools (AREA)
Abstract
The invention discloses a VR upper-limb rehabilitation training platform and method based on a MYO armband and a mobile terminal, comprising the MYO armband, a Unity virtual environment, a mobile terminal platform and a VR module. The MYO armband contains one nine-axis inertial sensor unit, eight surface EMG sensors and a Bluetooth transceiver. The Unity virtual environment contains a 3D upper-limb model, real-time visual feedback of the EMG information, and the scenes and training modes of the rehabilitation training games. The mobile terminal platform is the phone or tablet that carries the MYO-based virtual environment; the gesture and arm-posture information collected by the MYO armband is mapped into the VR environment and visualized for the user, making the system easy to carry and use. The VR module feeds the virtual scene built in the Unity environment back to the user through VR glasses, showing the state of the user's hand movements and enhancing immersion and human-computer interaction. Compared with the prior art, the invention has the advantages of low cost, easy operation, good practicality, portability, strong immersion and good interactivity.
Description
Technical field
The present invention relates to a low-cost virtual-environment 3D arm training platform developed to help stroke patients with upper-limb rehabilitation, and in particular to a VR upper-limb rehabilitation training platform based on a MYO armband and a mobile terminal.
Background art
The prevalence of stroke in China has risen in recent years, making it a principal disease endangering public health. About 85% of stroke patients suffer impaired upper-limb function, and roughly 55%–75% of patients still have upper-limb dysfunction six months after onset. This is also a main cause of impaired hand function, and the treatment and rehabilitation of post-stroke hand hemiplegia have become a research hotspot of modern rehabilitation medicine and rehabilitation engineering.
The innovative MYO armband released by the Canadian startup Thalmic Labs reads surface EMG (sEMG) signals. It can be worn above the elbow joint of either arm to acquire the sEMG signals produced by the arm muscles; it has eight channels arranged at equal spacing. The MYO armband acquires raw sEMG signals and transmits them over low-power Bluetooth with little interference, good signal quality and low price; as an input device it is low-cost, comfortable to wear and practical.
The patent with application number 201610379614.4 discloses a prosthetic-hand control method based on the MYO armband, and the patent with application number 201710168821.X discloses a prosthetic-hand control system based on the MYO armband, including a signal acquisition module, an STM32 module, a fuzzy controller module, a prosthetic-hand module, a grip-force feedback module and a PC for offline training. The prosthetic-hand control described there suffers from high system cost, poor grasping flexibility, awkward operation and poor practicality, and may be uncomfortable for the patient to wear.
To address the high price and poor comfort of real EMG-controlled prosthetic hands, the patent with application number 201611073067.3 discloses a virtual prosthetic-hand training platform based on the MYO armband and eye tracking, together with its training method, including the MYO armband, a Unity virtual environment, an eye-tracking unit and a vibration armband. That training system runs mainly on a PC, so it is inconvenient to carry and its operating range is limited; and although it introduces a Unity virtual environment, its immersion is weak and its training mode is monotonous.
The present invention therefore uses the MYO armband to parse the user's motion and gesture information and maps it onto a 3D upper-limb model in the Unity environment. Software algorithms let the EMG posture information interact with the virtual environment: the surface EMG signals collected by the MYO armband undergo data processing and recognition, an observation-imitation training scene is added in the Unity environment, the 3D arm is controlled in real time to complete movements such as grasping, and the relevant EMG information is shown in a GUI. On the hardware side, a Bluetooth module provides communication between the MYO armband and the mobile platform, and VR glasses enhance the user's immersion in the environment.
Summary of the invention
To overcome the drawbacks of the prior art described above, the purpose of the present invention is to provide a VR upper-limb rehabilitation training platform and method based on a MYO armband and a mobile terminal, building a platform that is low-cost, easy to operate, practical, portable, immersive and interactive, and that provides convenience for the rehabilitation training and daily life of patients with upper-limb disabilities.
The present invention is realized by the following technical solutions.
A VR upper-limb rehabilitation training platform based on a MYO armband and a mobile terminal according to the invention comprises:
A computer, which uses the Unity3D software to build the Unity virtual environment, including the 3D upper-limb model, real-time visual feedback of the EMG information, the virtual treatment environment, and the scenes and training modes of the rehabilitation training games;
The MYO armband, which includes one nine-axis inertial sensor, eight surface EMG sensors and a Bluetooth transceiver. The nine-axis inertial sensor detects the motion trajectory, orientation and posture of the arm; the surface EMG sensors detect the EMG signals of different gestures and the corresponding gesture information; the Bluetooth transceiver provides communication with the Unity virtual environment and the mobile terminal platform;
A mobile device, the phone or tablet that carries the Unity virtual environment; the gesture and arm-posture information collected by the MYO armband is mapped into the VR glasses and visualized for the user;
VR glasses, worn on the head as the carrier of the mobile device; through the VR glasses the user observes his or her own gesture and arm-posture information fed back in the Unity virtual environment, for human-computer interaction;
The MYO armband and VR glasses worn by the user acquire the arm posture of the user's hand; the EMG signals are parsed to recognize the gesture, which is then converted into the corresponding hand movement in the Unity virtual environment, and the user performs upper-limb rehabilitation training by operating the mobile device.
The invention further provides a VR upper-limb rehabilitation training method for the above MYO-armband and mobile-terminal platform, comprising the following steps:
Step 1: the user wears the MYO armband and the VR glasses, and confirms that communication with the Unity virtual environment on the computer succeeds and that the Bluetooth connection with the mobile device succeeds;
Step 2: with a wrist-inward (varus) gesture, the user watches on the mobile device the initial position of the arm in the 3D upper-limb model in the Unity virtual environment, and calibrates this initial position to face the mobile-device screen carried by the VR glasses;
Step 3: a data-reading and posture-synchronization program for the MYO armband, written in the Unity environment, synchronizes the spatial position of the arm in the user's upper-limb model from the gyroscope data of the nine-axis inertial sensor on the MYO armband, while the current gesture is parsed from the EMG signals of the user's hand;
Step 4: in the basic phase, with treatment as the scene, the hand gestures are sent to the 3D upper-limb model in the Unity virtual environment to execute different hand movements, and the EMG information collected by the MYO armband is displayed in real time on the mobile-device interface;
Step 5: in the training phase, with a game as the scene, virtual interactive objects are added. In the observation-imitation training mode of the Unity virtual environment scene, an animation first demonstrates the grasp-and-place movement as an observation scene; the user then actively moves the upper limb to imitate the observed animation and completes the movement with his or her own limb;
If a normal hand grip on the virtual interactive object is detected, the color of the object's image changes from A to B; if the color does not change, the user is not in contact with the object;
If an open-hand release of the virtual interactive object is detected and the object is placed at the specified position, the color of the object's image changes from B to C;
The above movements are repeated, and the training effect of the user is fed back through the mobile device.
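The grab-and-place feedback rule of Step 5 amounts to a small state machine over the object's image color. A minimal sketch, assuming the color codes A/B/C and the gesture names stand in for whatever assets the implementation uses (Unity scripts would be C#; Python is used here only for illustration):

```python
class InteractiveObject:
    """Virtual object whose image color reflects the grasp state (sketch)."""

    def __init__(self):
        self.color = "A"       # A: untouched, B: grasped, C: placed at target
        self.grasped = False

    def on_gesture(self, gesture, at_target=False):
        # A -> B when a grip (fist) is detected on the untouched object
        if gesture == "fist" and self.color == "A":
            self.color = "B"
            self.grasped = True
        # B -> C when the hand opens while the object is at the target position
        elif gesture == "fingers_spread" and self.grasped and at_target:
            self.color = "C"
            self.grasped = False
        # otherwise no contact: the color stays unchanged
        return self.color
```

Repeating the grasp-release cycle and logging the resulting A→B→C transitions is what the mobile device would aggregate into the training-effect feedback.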
Further, for the real-time display of the EMG information collected by the MYO armband on the mobile-device interface, a GUI is added to the Unity virtual environment. In the basic phase, the EMG information is shown intuitively through the arm rotation angle, acceleration data and position data collected by the MYO armband; in the training phase it is shown indirectly through the score, movement accuracy and speed of the training process, so that it is fed back to the user in real time.
Further, in the scenes of the Unity virtual environment, movement-observation and movement-imitation rehabilitation training scenes are realized on the mobile device: observing the virtual movement scene induces the patient's intention to move, while in the gamified training the patient actively imitates the virtual movement scene to carry out rehabilitation training.
Further, the MAC address of the corresponding mobile device is added to the script of the Unity virtual environment on the computer, so that the Bluetooth of the MYO armband connects successfully to the Bluetooth of the mobile device; the Unity virtual environment is then deployed from the computer to the mobile device, which carries it for rehabilitation training.
Further, when the VR glasses are in use, the gyroscope built into the VR glasses senses the dynamic change of eye position; according to the VR lens spacing and the interpupillary distance, and matching the screen size of the mobile device, an adapted binocular virtual-reality camera is built in the scene to realize the 3D immersive effect. On the computer, the gaze module in the Unity virtual environment monitors the object currently faced.
Further, in Step 2, the 3D upper-limb model is imported into the Unity virtual environment, the initialization script of the MYO armband is attached to the 3D upper-limb model, and it is judged whether the gesture has changed. At each calibration, the wrist-inward gesture is used as the start-calibration gesture, and the forearm of the 3D upper limb facing the mobile-device screen after calibration is the initial gesture. At the first calibration, the forearm's normal vector and rotation direction must be compensated: the rotation direction of the forearm is adjusted to agree with the coordinate-axis directions of the MYO armband on the mobile device, so that the upper arm of the 3D upper limb stays fixed while the forearm rotates, until at the final calibration the forearm faces into the screen.
Further, in the arm calibration, Euler angles describe the spatial rotation of the arm. The Euler angles of the upper arm of the 3D upper limb are fixed in the absolute coordinate system and the forearm of the 3D upper limb is aligned to the center of the picture. Then, in the relative coordinate system of the forearm, the Z axis is defined perpendicular to the plane of the two forearms, the Y axis lies in the horizontal plane along the forearm's horizontal direction, and the X axis lies in the horizontal plane along the forearm's vertical direction. A yaw rotation about the Z axis adjusts the angle between the forearm and the upper arm to the normal physiological position; a roll rotation about the Y axis adjusts the Euler angles of the forearm; and a pitch rotation about the X axis adjusts the Euler angles of the wrist of the 3D upper limb. After the limb orientation is located by the measuring device, the orientation is reset so that the initial coordinates in the virtual and real-world coordinate systems are synchronized.
Further, in Step 4, the default components of the MYO armband are first attached to the virtual 3D upper limb so that it has all the attributes of the armband; then the target objects of each joint of the 3D upper limb, the movement flags, the gesture completion state, and the finger speed and time span are defined. Each time a frame of the Unity virtual environment is updated, it is first detected whether the gesture has changed since the last gesture; if a change is detected, the gesture is set to the currently detected gesture, and if no updated gesture is detected, the gesture is set to the relaxed rest gesture.
Compared with the prior art, the present invention has the following advantages. Relative to a PC platform, the VR upper-limb rehabilitation training platform and method based on the MYO armband and mobile terminal is portable and practical, and the Cardboard VR glasses provide strong immersion. The games in the training scenes are easy to operate and engaging; observation-imitation game training can improve the patient's active participation; and the visualization of the EMG information lets the patient observe the training data more intuitively, enhancing human-computer interaction. The invention therefore provides convenience for the rehabilitation training and daily life of patients with upper-limb disabilities.
Detailed description of the invention
The drawings described herein provide a further understanding of the present invention and constitute part of this application; they do not constitute an inappropriate limitation of the present invention. In the drawings:
Fig. 1 is the general framework of the VR upper-limb rehabilitation training platform based on the MYO armband and mobile terminal according to the present invention;
Fig. 2 is algorithm flow chart of the invention;
Fig. 3 is rehabilitation training interactive mode schematic diagram of the invention.
Specific embodiment
The technical solutions in the embodiments of the present invention will be described clearly and completely below with reference to the drawings. Obviously, the described embodiments are only some of the embodiments of the present invention rather than all of them. Based on the embodiments of the present invention, all other embodiments obtained by a person of ordinary skill in the art without creative effort shall fall within the protection scope of the present invention.
As shown in Fig. 1, the VR upper-limb rehabilitation training platform based on the MYO armband and mobile terminal comprises a computer, the MYO armband, a mobile device and VR glasses, wherein:
The computer uses the Unity3D software to build the Unity virtual environment, including the 3D upper-limb model, real-time visual feedback of the EMG information, the virtual treatment environment, and the scenes and training modes of the rehabilitation training games.
The Unity virtual environment is created on the computer with Unity3D, and the rehabilitation training platform relies mainly on this environment for the related training. This includes device acquisition for the MYO armband, the writing of the data-reading and posture-synchronization programs, the writing of the program that builds the adapted binocular virtual-reality camera in the Cardboard scene, the creation of the 3D upper-limb model, the GUI display of the EMG information, the construction of the medical-treatment Unity virtual environment for the basic phase and of the rehabilitation-game scenes for the training phase, and the programming of the training-game algorithms.
For the EMG visual feedback of the Unity virtual environment, a GUI is added to the Unity virtual environment. In the basic phase the arm rotation angle, acceleration data and position data collected by the MYO armband are shown intuitively; in the training phase they are shown indirectly through the score, movement accuracy and speed of the training process, so that the EMG information is fed back to the user in real time.
In the scenes of the Unity virtual environment, movement-observation and movement-imitation rehabilitation training scenes under the virtual-reality environment are realized on the mobile terminal: observing the virtual movement scene induces the user's intention to move, while in the gamified training the user actively imitates the virtual movement scene to carry out rehabilitation training.
The MYO armband includes one nine-axis inertial sensor, eight surface EMG sensors and a Bluetooth transceiver. The nine-axis inertial sensor unit contains a three-axis gyroscope, a three-axis accelerometer and a three-axis magnetometer; the nine-axis inertial sensor detects the motion trajectory, orientation and posture of the arm, while the surface EMG sensors detect the EMG signals of different gestures and the corresponding gesture information. The Bluetooth transceiver provides communication with the Unity virtual environment and the mobile terminal platform. The MYO armband is an innovative device produced and released by the Canadian startup Thalmic Labs for reading surface EMG signals. Since the hand signals are generated by the forearm muscle groups, the armband can be worn above the elbow joint of either arm to acquire the bioelectric signals generated by the arm muscles. It has eight equally spaced channels, and its acquisition frequency is its maximum data-output frequency of 200 Hz.
The mobile device is the phone or tablet that carries the Unity virtual environment; the gesture and arm-posture information collected by the MYO armband is mapped into the VR glasses and visualized for the user.
The mobile device refers to an Android device carrying the MYO-based Unity virtual environment. The gesture and arm-posture information collected by the MYO armband is mapped into the Unity virtual environment; by placing the Android phone in the Cardboard VR glasses case, the device becomes the equivalent of a head-mounted display, visualized for the user and easy to carry and use.
The VR glasses are worn on the head as the carrier of the mobile device; through the VR glasses the user observes his or her own gesture and arm-posture information fed back in the Unity virtual environment, for human-computer interaction.
The VR glasses refer mainly to Cardboard VR glasses, whose carton contains cardboard, biconvex lenses, magnets, hook-and-loop fasteners, a strap and an NFC tag; they are a cheap virtual-reality device that lets the user experience virtual reality through a phone. After the related algorithm is added, the virtual scene built in the Unity virtual environment is shown on the phone in split-screen mode; the parallax between the two eyes produces a stereoscopic effect, and the Cardboard VR glasses feed back the state of the user's hand movements, enhancing immersion and human-computer interaction.
The MYO armband and VR glasses worn by the user acquire the arm posture of the user's hand; the EMG signals are parsed to recognize the gesture, which is then converted into the corresponding hand movement in the Unity virtual environment, and the user performs upper-limb rehabilitation training by operating the mobile device.
Fig. 2 shows the method flow of the VR upper-limb rehabilitation training platform based on the MYO armband and mobile terminal. The user wears the MYO armband and the Cardboard VR glasses; the device acquires the EMG signals of the hand, the data-reading and posture-synchronization algorithms process and parse the gesture movements, and the recognition results are transmitted over Bluetooth 4.0 LE to the Unity virtual environment, where they are converted into the corresponding hand movements. The user performs gamified training by wearing the VR glasses with the Android phone inserted. The method specifically comprises the following steps:
Step 1: the user wears the MYO armband and the Cardboard VR glasses, and confirms that communication with the Unity virtual environment on the computer succeeds and that the Bluetooth connection to the Android device succeeds.
The MYO armband is worn on the left arm in front of the elbow joint:
The MYO armband communicates with the Unity virtual environment through the Bluetooth transceiver. To connect the MYO armband to the Android phone, the MAC address of the phone must be added to the acquisition algorithm in the Unity virtual environment, and the virtual-reality training scene is exported from the computer as an App generated on the Android phone. After the phone's Bluetooth is started, opening this App immediately connects to the MYO armband over Bluetooth; waving the hand then shows the motion trajectory of the hand on the phone screen.
The phone is placed in the Cardboard VR glasses box and fixed on the head with the strap.
The Cardboard VR glasses require the VR module to be added to the Unity virtual environment. According to the Cardboard VR lens spacing and the interpupillary distance, and matching the screen size of the Android device, an adapted binocular virtual-reality camera is built in the scene, so that the plane environment is seen as left and right halves of the phone screen. The gyroscope on the VR glasses senses the dynamic change of eye position, equivalent to the gaze function in the Unity virtual environment, and observes the object currently faced; wearing the VR glasses gives the scene a 3D stereoscopic effect and enhances the user's immersion.
Step 2: with the wrist-inward gesture, the initial position of the forearm of the 3D upper limb in the virtual environment is calibrated to face into the screen.
The initial position of the forearm of the 3D upper limb must be calibrated so that the 3D upper limb appears within the field of view: the user first turns the left arm toward the inside of the screen, then turns the wrist inward and swings it to the right, which brings the forearm of the 3D upper limb to the front position:
The 3D upper-limb model is imported into the Unity virtual environment, and the initialization script of the MYO armband is attached to the model. By checking whether the recognized gesture differs from the previous one, the system judges whether the gesture has changed. At each calibration the wrist-varus gesture serves as the starting calibration gesture, and the forearm position facing into the screen is taken as the initial posture after calibration.
At the first calibration, the forearm normal vector and rotation direction must be compensated: the rotation direction of the forearm is adjusted to be consistent with the arm's motion direction on the Android device, i.e. the coordinate axes of the virtual forearm are kept aligned with the MYO coordinate frame on the Android device. This allows the upper arm of the 3D model to be held fixed while the forearm rotates.
The rotation of the forearm in space is described with Euler angles. The Euler angles of the upper arm are fixed in the absolute coordinate system and the forearm is aligned with the center of the picture. Then, in the relative coordinate system of the forearm, the Z axis is defined perpendicular to the plane containing the two forearm axes, the Y axis lies in the horizontal plane along the forearm, and the X axis lies in the horizontal plane perpendicular to the forearm. A Yaw rotation about the Z axis adjusts the angle between forearm and upper arm; a Roll rotation about the Y axis first adjusts the Euler angles of the forearm to the normal physiological position; a Pitch rotation about the X axis then adjusts the Euler angles of the wrist of the 3D model. After the limb orientation has been determined by the measuring device, the orientation is reset so that the initial coordinates of the virtual and real-world frames are synchronized, strengthening the sensation that the virtual limb moves in unison with the patient's limb. When calibration is finished, the forearm faces into the screen.
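The calibration described above amounts to recording a reference orientation while the wrist-varus gesture is held, and reporting later armband readings relative to it. A minimal sketch in Python (the class, its names, and the plain Euler-triple representation are illustrative assumptions, not the patent's actual Unity script):

```python
# Minimal forearm-calibration sketch: capture the armband's Euler angles
# (yaw, pitch, roll in degrees) while the wrist-varus gesture is held,
# then report all later orientations relative to that reference so the
# virtual forearm starts out facing into the screen.

class ForearmCalibrator:
    def __init__(self):
        self.offset = None  # Euler-angle triple captured at calibration

    def calibrate(self, yaw, pitch, roll):
        """Store the orientation held during the wrist-varus gesture."""
        self.offset = (yaw, pitch, roll)

    def relative(self, yaw, pitch, roll):
        """Return the orientation relative to the calibrated reference,
        wrapped into (-180, 180] degrees per axis."""
        if self.offset is None:
            raise RuntimeError("calibrate() must be called first")

        def wrap(a):
            return (a + 180.0) % 360.0 - 180.0

        oy, op, orr = self.offset
        return (wrap(yaw - oy), wrap(pitch - op), wrap(roll - orr))

cal = ForearmCalibrator()
cal.calibrate(30.0, -10.0, 5.0)        # wrist-varus pose becomes the origin
print(cal.relative(30.0, -10.0, 5.0))  # same pose -> (0.0, 0.0, 0.0)
print(cal.relative(45.0, -10.0, 5.0))  # 15 degrees of yaw relative to it
```

In the actual system this subtraction would be applied to the quaternion stream of the nine-axis sensor inside Unity; the sketch only shows the offset-and-wrap idea.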
The third step: an algorithm program for reading data from the MYO armband and synchronizing posture is written in the Unity virtual environment. The gyroscope data of the nine-axis inertial sensor in the MYO armband synchronize the spatial position of the patient's hand and limb, while the current gesture is parsed from the EMG signals.
The MYO armband has five built-in hand gestures: wrist varus, wrist valgus, fist, fingers spread, and a double tap of thumb and middle finger. Apart from the wrist-varus gesture used during calibration, the gestures most commonly used in the setup and training stages are fist and spread, which drive the corresponding hand motions of the 3D upper-limb model.
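The five built-in gestures and the subset that actually drives the virtual hand can be captured in a small enumeration. A sketch (the identifier names are assumptions echoing the Myo SDK's pose vocabulary, not code from the patent):

```python
from enum import Enum

class MyoGesture(Enum):
    WAVE_IN = "wrist varus"                 # used for calibration
    WAVE_OUT = "wrist valgus"
    FIST = "fist"                           # drives the grasp motion
    FINGERS_SPREAD = "spread"               # drives the open motion
    DOUBLE_TAP = "thumb-middle double tap"
    REST = "relaxed"                        # no gesture detected

# Only fist and spread move the hand of the 3D model during training.
TRAINING_GESTURES = {MyoGesture.FIST, MyoGesture.FINGERS_SPREAD}

def drives_hand(g: MyoGesture) -> bool:
    return g in TRAINING_GESTURES

print(drives_hand(MyoGesture.FIST))     # True
print(drives_hand(MyoGesture.WAVE_IN))  # False
```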
The fourth step: in the setup stage, with a hospital treatment room as the scene, gesture signals are sent to the 3D upper-limb model to execute different hand motions such as grasping and opening, while the data acquired by the MYO armband are displayed in real time on the Android interface.
Against the hospital background, when the left hand spreads its five fingers and the motion is detected, the MYO armband produces a brief vibration and the fingers of the 3D model slowly open in response; when the left hand then makes a fist, the armband again vibrates briefly and the fingers of the 3D model slowly close. Other motions produce no change in either the armband or the fingers of the 3D model.
In the Unity virtual environment on the computer, the default MYO armband component is first attached to the virtual 3D upper-limb model so that it carries all of the armband's attributes; then each joint target object of the 3D model, the motion flag, the gesture completion state, and the finger speed and time span are defined. On every frame update of the environment, it is first detected whether the gesture has changed since the previous gesture; if a change is detected, the gesture is set to the currently detected gesture, and if no updated gesture is detected, the gesture is set to the relaxed rest pose.
The joint target objects include the proximal and distal phalanx joints of the thumb, index, middle, ring, and little fingers of the 3D upper-limb model; finger motion is a rotation about these joint objects, so different objects are selected when different gestures are executed.
The motion flag represents the two different action gestures: a flag of 1 represents the fist gesture and a flag of 2 represents the spread gesture.
The gesture completion state represents three different states: a state of 0 indicates that the gesture is at its initial position, 1 indicates that the gesture is in progress, and 2 indicates that the gesture motion has been completed.
The finger speed adjusts the agility of the finger motion of the 3D model and keeps the rhythm of the grasp training; the time span determines the duration of the finger motion and, combined with the finger speed, guarantees that the finger's range of motion is constrained to normal physiological joint limits, establishing a mapping between EMG signal intensity and motion speed.
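The four attributes attached to the model (joint targets, motion flag, completion state, finger speed and time span) can be grouped into one record. A hypothetical sketch using the numeric conventions just defined:

```python
from dataclasses import dataclass, field

# Motion flag: 1 = fist, 2 = spread.  Completion state: 0 = initial,
# 1 = in progress, 2 = completed (the values used in the description).
@dataclass
class HandState:
    joint_targets: dict = field(default_factory=dict)  # joint name -> rotation target
    motion_flag: int = 0
    completion: int = 0
    finger_speed: float = 1.0  # scales finger agility
    time_span: float = 4.0     # seconds allotted to one gesture motion

state = HandState()
state.joint_targets["thumb_proximal"] = 15.0  # degrees, illustrative value
print(state.motion_flag, state.completion)    # 0 0
```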
If a fist gesture is detected in the Unity virtual environment, the previous motion was not a fist, and the gesture is at its initial position or in a completed state (i.e. the motion flag is not 1 and the completion state is not 1), then the current gesture motion is set to the fist state and the completion state is set to in-progress: the motion flag becomes 1, the completion state becomes 1, and timing starts from that moment. Likewise, if a spread gesture is detected, the previous motion was not a spread, and the gesture is at its initial position or in a completed state (the motion flag is not 2 and the completion state is not 1), then the current gesture motion is set to the spread state and the completion state is set to in-progress: the motion flag becomes 2, the completion state becomes 1, and timing starts from that moment.
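The transition rules above form a small per-frame state machine. A sketch under the stated flag conventions (1 = fist, 2 = spread; completion 0/1/2); the class and method names are illustrative:

```python
FIST, SPREAD = 1, 2
INITIAL, IN_PROGRESS, DONE = 0, 1, 2

class GestureMachine:
    def __init__(self):
        self.flag = 0             # current motion flag (0 = none yet)
        self.completion = INITIAL
        self.start_time = None    # timing begins when a new gesture starts

    def on_gesture(self, detected, now):
        """Apply the description's transition rules for one frame."""
        # A new fist starts only if the last motion was not a fist and the
        # gesture is at its initial position or already completed.
        if detected == FIST and self.flag != FIST and self.completion != IN_PROGRESS:
            self.flag, self.completion, self.start_time = FIST, IN_PROGRESS, now
        # Symmetric rule for the spread gesture.
        elif detected == SPREAD and self.flag != SPREAD and self.completion != IN_PROGRESS:
            self.flag, self.completion, self.start_time = SPREAD, IN_PROGRESS, now

m = GestureMachine()
m.on_gesture(FIST, now=0.0)
print(m.flag, m.completion)  # 1 1 (fist, in progress)
m.on_gesture(FIST, now=0.5)  # repeated fist is ignored while in progress
print(m.start_time)          # 0.0
```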
If the gesture is in the fist state and its motion has not yet completed (i.e. the motion flag is 1 and the completion state is not 2), then once the timed duration reaches the prescribed length of 4 seconds the gesture is set to the completed state (the completion state becomes 2) and the motion of each finger joint of the 3D model is executed at a fixed speed ratio. Similarly, if the gesture is in the spread state and its motion has not yet completed (the motion flag is 2 and the completion state is not 2), once the timed duration reaches the prescribed 4 seconds the gesture is set to the completed state (the completion state becomes 2) and each finger joint moves at the fixed speed ratio. The speed ratio across the three phalanx joints of the thumb of the 3D model is 1:4:6, and the joint speeds of the remaining fingers are kept equal to the speed of the fastest thumb joint. This cycle repeats on every frame update.
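The 4-second completion rule and the 1:4:6 thumb-joint speed distribution can be sketched as follows (the joint naming and the `base_speed` parameter are illustrative assumptions):

```python
GESTURE_DURATION = 4.0  # seconds before a gesture is marked completed

# Speed ratio over the thumb's three phalanx joints; every other finger's
# joints run at the fastest thumb-joint speed.
THUMB_RATIO = (1, 4, 6)

def joint_speeds(base_speed, finger_joint_count=3):
    """Distribute a base speed over all finger joints per the 1:4:6 rule."""
    fastest = base_speed * max(THUMB_RATIO)
    speeds = {"thumb": tuple(base_speed * r for r in THUMB_RATIO)}
    for finger in ("index", "middle", "ring", "little"):
        speeds[finger] = (fastest,) * finger_joint_count
    return speeds

def completed(start_time, now):
    """A gesture is marked completed once 4 s have elapsed."""
    return now - start_time >= GESTURE_DURATION

s = joint_speeds(1.0)
print(s["thumb"])           # (1.0, 4.0, 6.0)
print(s["index"])           # (6.0, 6.0, 6.0)
print(completed(0.0, 4.0))  # True
```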
The Android interface is designed as a GUI that adds explanatory text together with the rotation angles acquired by the gyroscope, the acceleration data acquired by the accelerometer, and the position data of the three orientations, so as to display the data acquired by the MYO armband; this gives the Android-based setup stage real-time, visual feedback.
The fifth step: in the training stage, with a ball-shooting game as the scene, an object such as a small ball is added. If a normal grip on the ball is detected, the ball changes from grey (A) to blue (B); the ball is then moved above the basket, and when an open hand releasing the ball is detected, the ball changes from blue (B) to green (C). The above motions are repeated, and the Android device feeds back the user's training result through the score (see Fig. 3).
The Android device realizes action-observation and motion-imitation rehabilitation training scenes in the Unity virtual environment: observing the virtual motion scene induces the patient's intention to move actively, while the gamified training leads the patient to actively imitate the virtual motion and carry out rehabilitation training. In the ball-shooting scene, an animation first demonstrates the grasp-move-place observation scene; the patient then actively observes the animation and imitates it with his or her own limb, first grasping the small ball with the left hand, then moving it, and finally releasing it. In this way the fist and spread motions of the user's hand are observed, and the scoring event of the ball dropping into the basket can be seen in real time:
In the Unity virtual environment on the computer, the boundary collision detection of the Unity3D physics system is enabled, and three colors, grey (A), blue (B), and green (C), are defined in the ball's material, with grey as the initial color. When the 3D upper limb is moved above the ball and the hand of the model is detected executing a fist motion while in contact with the boundary of the ball, the ball's color is set to blue; if a fist is detected without touching the ball's boundary, the ball remains grey. When the hand, gripping the ball, moves it directly above the basket, an open-hand motion is detected, and the ball collides with the bottom of the basket within the prescribed time, the ball's color is set to green and the ball is then destroyed; if the hand opens without the ball contacting the bottom of the basket, the ball's color remains blue and the ball is destroyed.
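The grey-to-blue-to-green ball logic is a two-step transition gated by contact tests. A sketch with the collision checks stubbed as booleans, since Unity's physics callbacks are not reproduced here:

```python
GREY, BLUE, GREEN = "A", "B", "C"

class Ball:
    def __init__(self):
        self.color = GREY  # priming color

    def on_fist(self, touching_ball):
        """Grasp attempt: the ball turns blue only on contact."""
        if self.color == GREY and touching_ball:
            self.color = BLUE

    def on_release(self, over_basket, hit_basket_bottom):
        """Release attempt: the ball turns green only if it was gripped
        (blue) and falls onto the basket bottom in time."""
        if self.color == BLUE and over_basket and hit_basket_bottom:
            self.color = GREEN
        return self.color  # the ball is destroyed after release either way

b = Ball()
b.on_fist(touching_ball=True)
print(b.color)  # B (blue)
print(b.on_release(over_basket=True, hit_basket_bottom=True))  # C (green)
```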
A score module, an accuracy module, and a speed module are added to the GUI of the Unity virtual environment. Score module: five points are counted when the ball turns green, and only two points when the ball merely turns blue; one group of training requires more than 20 repeated motions, and training ends after 20 groups have been finished, so the training effect of different users, or of the same user across different training periods, is judged on a 100-point scale. Accuracy module: within one group of 20 motions, the total number of balls that finally turned green is divided by 20 to express completion of the whole set, and the grasp accuracy of different users is judged as a percentage. Speed module: the number of ball grasps within 3 minutes is counted, again counting only balls that turned green, in units of grasps per minute, to judge the grasp reaction speed of different users. At the same time a training log is saved for each user, recording the training process, dynamically monitoring the rehabilitation course, and judging the effectiveness of this training system.
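The three feedback modules reduce to simple arithmetic over per-ball outcomes. A sketch using the point values from the description (5 for green, 2 for blue); the outcome labels are illustrative:

```python
def score(outcomes):
    """Score module: 5 points per green ball, 2 per ball that only
    turned blue.  `outcomes` holds 'green', 'blue', or 'miss'."""
    return sum(5 if o == "green" else 2 if o == "blue" else 0 for o in outcomes)

def accuracy(outcomes, group_size=20):
    """Accuracy module: greens divided by the 20 motions of one group,
    expressed as a percentage."""
    return 100.0 * sum(o == "green" for o in outcomes) / group_size

def grasp_rate(green_count, minutes=3):
    """Speed module: green balls per minute over a 3-minute window."""
    return green_count / minutes

outcomes = ["green"] * 15 + ["blue"] * 3 + ["miss"] * 2
print(score(outcomes))     # 15*5 + 3*2 = 81
print(accuracy(outcomes))  # 75.0
print(grasp_rate(15))      # 5.0
```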
The present invention is not limited to the above embodiment. On the basis of the technical solution disclosed herein, those skilled in the art can, without creative labor, make substitutions and variations to some of the technical features according to the disclosed content, and such substitutions and variations all fall within the protection scope of the invention.
Claims (9)
1. A VR upper-limb rehabilitation training platform based on a MYO armband and a mobile terminal, characterized by comprising:
a computer, which uses Unity3D software to build a Unity virtual environment, the Unity virtual environment including real-time visual feedback of the EMG information of a 3D upper-limb model, and the virtual-environment scenes and training modes of treatment and rehabilitation-training games;
a MYO armband, comprising a nine-axis inertial sensor, eight surface EMG sensors, and a Bluetooth receiver, wherein the nine-axis inertial sensor detects the motion trajectory, orientation, and posture information of the arm, the surface EMG sensors detect the EMG signals and gesture information of different gestures, and the Bluetooth receiver realizes communication with the Unity virtual environment and the mobile-terminal platform;
a mobile device, a mobile phone or tablet computer carrying the Unity virtual environment, which maps the gesture information and arm posture information acquired by the MYO armband into the VR glasses for visualization by the user;
VR glasses, worn as the carrier of the mobile device, through which the user observes the Unity virtual environment, receives feedback of his or her gesture and arm posture information, and carries out human-computer interaction;
wherein the MYO armband and VR glasses worn by the user acquire the arm posture information of the user's hand, parse the EMG signals to identify the gesture information, and convert them into the corresponding hand motions in the Unity virtual environment, and the user carries out upper-limb rehabilitation training by operating the mobile device.
2. A VR upper-limb rehabilitation training method using the MYO armband and mobile-terminal platform of claim 1, characterized by comprising the following steps:
The first step: the user wears the MYO armband and VR glasses, and it is confirmed that communication with the Unity virtual environment on the computer and the Bluetooth connection with the mobile device have both succeeded;
The second step: using the user's wrist-varus gesture, the initial position of the arm of the 3D upper-limb model in the Unity virtual environment is viewed from the mobile device and calibrated to face the screen of the mobile device carried by the VR glasses;
The third step: a data-reading and posture-synchronization algorithm program for the MYO armband is written in the Unity environment; the gyroscope data of the nine-axis inertial sensor on the MYO armband synchronize the spatial position of the arm of the user's upper-limb model, while the current gesture is parsed from the EMG signals of the user's hand;
The fourth step: in the setup stage, with treatment as the scene, gesture signals are sent to the 3D upper-limb model in the Unity virtual environment to execute different hand motions, and the EMG information acquired by the MYO armband is displayed in real time on the mobile-device interface;
The fifth step: in the training stage, with a game as the scene, a virtual interactive object is added and the observe-imitate training mode of the Unity virtual environment scene is used: an animation first demonstrates the grasp-move-place observation scene, and the user then actively moves the upper limb to imitate the observed animation and complete the motion with his or her own limb;
if a normal grip on the virtual interactive object is detected, the image color of the object changes from A to B; if the color of the object does not change, the user is not in contact with the object;
if an open hand releasing the virtual interactive object is detected and the object is placed at the specified position, the image color of the object changes from B to C;
the above motions are repeated, and the user's training result is fed back through the mobile device.
3. The method according to claim 2, characterized in that, in displaying the EMG information acquired by the MYO armband in real time on the mobile-device interface, a GUI is added in the Unity virtual environment; in the setup stage the arm rotation angles, acceleration data, and position data acquired by the MYO armband display the EMG information intuitively, while in the training stage the score, motion accuracy, and speed of the training process display the EMG information indirectly, so that it is fed back to the user in real time.
4. The method according to claim 2, characterized in that the scene of the Unity virtual environment realizes action-observation and motion-imitation rehabilitation training on the mobile device: observing the virtual motion scene induces the patient's intention to move actively, and the patient simultaneously and actively imitates the virtual motion in the gamified training to carry out rehabilitation training.
5. The method according to claim 2, characterized in that the MAC address codes of the MYO armband and of the mobile device are added to the script of the Unity virtual environment on the computer to establish a successful Bluetooth connection between the armband and the mobile device, and the Unity virtual environment is transferred from the computer to the mobile device, which is then carried for rehabilitation training.
6. The method according to claim 2, characterized in that, when the VR glasses are used, the gyroscope built into the VR glasses senses the dynamic change of eye position; according to the VR lens spacing and the interpupillary distance, the screen size of the mobile device is matched and a binocular virtual-reality camera adapted to the scene is constructed to realize the 3D immersive effect; the Gaze module of the Unity virtual environment is used on the computer to monitor the object currently being faced.
7. The method according to claim 2, characterized in that, in the second step, the 3D upper-limb model is imported into the Unity virtual environment, the initialization script of the MYO armband is attached to the model, and it is judged whether the gesture has changed; at each calibration the wrist-varus gesture serves as the calibration gesture, and the forearm of the 3D model facing the screen of the mobile device serves as the initial posture after calibration; at the first calibration the forearm normal vector and rotation direction must be compensated, and the rotation direction of the forearm is adjusted to be consistent with the coordinate-axis directions of the MYO armband on the mobile device, so that the upper arm of the 3D model is fixed while the forearm rotates, and the forearm faces into the screen when calibration is finished.
8. The method according to claim 7, characterized in that, during arm calibration, Euler angles describe the rotation of the arm in space; the Euler angles of the upper arm of the 3D model are fixed in the absolute coordinate system, the forearm of the model is aligned with the center of the picture, and then, in the relative coordinate system of the forearm, the Z axis is defined perpendicular to the plane containing the two forearm axes, the Y axis lies in the horizontal plane along the forearm, and the X axis lies in the horizontal plane perpendicular to the forearm; a Yaw rotation about the Z axis adjusts the angle between forearm and upper arm, a Roll rotation about the Y axis first adjusts the Euler angles of the forearm to the normal physiological position, and a Pitch rotation about the X axis then adjusts the Euler angles of the wrist of the 3D model; after the limb orientation is determined by the measuring device, the orientation is reset to synchronize the initial coordinates of the virtual and real-world frames.
9. The method according to claim 2, characterized in that, in the fourth step, the default MYO armband component is first attached to the virtual 3D upper limb so that it carries all the armband's attributes, and then each joint target object of the 3D upper limb, the motion flag, the gesture completion state, and the finger speed and time span are defined; on each frame update of the Unity virtual environment, it is first detected whether the gesture has changed since the previous gesture; if a change is detected, the gesture is set to the currently detected gesture, and if no updated gesture is detected, the gesture is set to the relaxed rest pose.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201810602719.0A CN108815804B (en) | 2018-06-12 | 2018-06-12 | VR upper limb rehabilitation training platform and method based on MYO arm ring and mobile terminal |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201810602719.0A CN108815804B (en) | 2018-06-12 | 2018-06-12 | VR upper limb rehabilitation training platform and method based on MYO arm ring and mobile terminal |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN108815804A true CN108815804A (en) | 2018-11-16 |
| CN108815804B CN108815804B (en) | 2020-06-09 |
Family
ID=64144900
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201810602719.0A Active CN108815804B (en) | 2018-06-12 | 2018-06-12 | VR upper limb rehabilitation training platform and method based on MYO arm ring and mobile terminal |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN108815804B (en) |
Cited By (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109701224A (en) * | 2019-02-22 | 2019-05-03 | 重庆市北碚区中医院 | An Augmented Reality AR Wrist Rehabilitation Evaluation and Training System |
| CN110232963A (en) * | 2019-05-06 | 2019-09-13 | 中山大学附属第一医院 | A kind of upper extremity exercise functional assessment system and method based on stereo display technique |
| CN110227243A (en) * | 2019-06-11 | 2019-09-13 | 刘简 | Table tennis practice intelligent correcting system and its working method |
| CN110298286A (en) * | 2019-06-24 | 2019-10-01 | 中国科学院深圳先进技术研究院 | Virtual reality recovery training method and system based on surface myoelectric and depth image |
| CN110624217A (en) * | 2019-09-23 | 2019-12-31 | 孙孟雯 | Rehabilitation glove based on multi-sensor fusion and implementation method thereof |
| CN110706776A (en) * | 2019-09-20 | 2020-01-17 | 广东技术师范大学 | Apoplexy rehabilitation training system based on virtual reality technology and using method thereof |
| CN111714334A (en) * | 2020-07-13 | 2020-09-29 | 厦门威恩科技有限公司 | Upper limb rehabilitation training robot and control method |
| CN111840920A (en) * | 2020-07-06 | 2020-10-30 | 暨南大学 | A virtual reality-based upper limb intelligent rehabilitation system |
| CN111991762A (en) * | 2020-09-02 | 2020-11-27 | 冼鹏全 | Psychotherapy-based wearable upper limb rehabilitation device for stroke patient and cooperative working method |
| CN113101137A (en) * | 2021-04-06 | 2021-07-13 | 合肥工业大学 | An upper limb rehabilitation robot based on motion mapping and virtual reality |
| CN113181621A (en) * | 2021-06-09 | 2021-07-30 | 张彤 | Utilize supplementary training equipment of VR and force feedback arm |
| CN114377358A (en) * | 2022-02-22 | 2022-04-22 | 南京医科大学 | A Home Rehabilitation System for Upper Limbs Based on Sphero Spherical Robot |
| CN114469465A (en) * | 2021-12-28 | 2022-05-13 | 山东浪潮工业互联网产业股份有限公司 | A control method, device and medium based on intelligent prosthesis |
| CN114637395A (en) * | 2022-02-14 | 2022-06-17 | 上海诠视传感技术有限公司 | A method for training hand-eye coordination through AR glasses |
| CN115047979A (en) * | 2022-08-15 | 2022-09-13 | 歌尔股份有限公司 | Head-mounted display equipment control system and interaction method |
| CN115691756A (en) * | 2022-10-18 | 2023-02-03 | 中国人民解放军陆军军医大学 | Remote home treatment real-time monitoring system |
| CN116312947A (en) * | 2023-03-13 | 2023-06-23 | 北京航空航天大学 | Immersive ankle and foot rehabilitation training method based on upper limb movement signals and electronic equipment |
| CN116510249A (en) * | 2023-05-09 | 2023-08-01 | 福州大学 | A hand virtual rehabilitation training system and training method based on electromyographic signals |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103230664A (en) * | 2013-04-17 | 2013-08-07 | 南通大学 | Upper limb movement rehabilitation training system and method based on Kinect sensor |
| CA2933053A1 (en) * | 2013-12-20 | 2015-06-25 | Integrum Ab | System for neuromuscular rehabilitation |
| CN106530926A (en) * | 2016-11-29 | 2017-03-22 | 东南大学 | Virtual hand prosthesis training platform and training method thereof based on Myo armband and eye tracking |
| CN106621287A (en) * | 2017-02-07 | 2017-05-10 | 西安交通大学 | Upper limb rehabilitation training method based on brain-computer interface and virtual reality technology |
| CN107544675A (en) * | 2017-09-08 | 2018-01-05 | 天津大学 | Brain control formula virtual reality method |
| CN107626040A (en) * | 2017-10-24 | 2018-01-26 | 杭州易脑复苏科技有限公司 | It is a kind of based on the rehabilitation system and method that can interact virtual reality and nerve electric stimulation |
| CN107694034A (en) * | 2017-12-04 | 2018-02-16 | 陈林 | Neck trainer based on virtual reality |
- 2018-06-12 CN CN201810602719.0A patent/CN108815804B/en active Active
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103230664A (en) * | 2013-04-17 | 2013-08-07 | 南通大学 | Upper limb movement rehabilitation training system and method based on Kinect sensor |
| CA2933053A1 (en) * | 2013-12-20 | 2015-06-25 | Integrum Ab | System for neuromuscular rehabilitation |
| CN106530926A (en) * | 2016-11-29 | 2017-03-22 | 东南大学 | Virtual hand prosthesis training platform and training method thereof based on Myo armband and eye tracking |
| CN106621287A (en) * | 2017-02-07 | 2017-05-10 | 西安交通大学 | Upper limb rehabilitation training method based on brain-computer interface and virtual reality technology |
| CN107544675A (en) * | 2017-09-08 | 2018-01-05 | 天津大学 | Brain control formula virtual reality method |
| CN107626040A (en) * | 2017-10-24 | 2018-01-26 | 杭州易脑复苏科技有限公司 | It is a kind of based on the rehabilitation system and method that can interact virtual reality and nerve electric stimulation |
| CN107694034A (en) * | 2017-12-04 | 2018-02-16 | 陈林 | Neck trainer based on virtual reality |
Cited By (27)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109701224A (en) * | 2019-02-22 | 2019-05-03 | 重庆市北碚区中医院 | An Augmented Reality AR Wrist Rehabilitation Evaluation and Training System |
| CN109701224B (en) * | 2019-02-22 | 2024-02-23 | 重庆市北碚区中医院 | Augmented reality AR wrist joint rehabilitation evaluation and training system |
| CN110232963B (en) * | 2019-05-06 | 2021-09-07 | 中山大学附属第一医院 | A system and method for evaluating upper limb motor function based on stereoscopic display technology |
| CN110232963A (en) * | 2019-05-06 | 2019-09-13 | 中山大学附属第一医院 | A kind of upper extremity exercise functional assessment system and method based on stereo display technique |
| CN110227243A (en) * | 2019-06-11 | 2019-09-13 | 刘简 | Table tennis practice intelligent correcting system and its working method |
| CN110298286A (en) * | 2019-06-24 | 2019-10-01 | 中国科学院深圳先进技术研究院 | Virtual reality recovery training method and system based on surface myoelectric and depth image |
| CN110298286B (en) * | 2019-06-24 | 2021-04-30 | 中国科学院深圳先进技术研究院 | A virtual reality rehabilitation training method and system based on surface electromyography and depth images |
| CN110706776A (en) * | 2019-09-20 | 2020-01-17 | 广东技术师范大学 | Apoplexy rehabilitation training system based on virtual reality technology and using method thereof |
| CN110624217A (en) * | 2019-09-23 | 2019-12-31 | 孙孟雯 | Rehabilitation glove based on multi-sensor fusion and implementation method thereof |
| CN111840920A (en) * | 2020-07-06 | 2020-10-30 | 暨南大学 | A virtual reality-based upper limb intelligent rehabilitation system |
| CN111714334A (en) * | 2020-07-13 | 2020-09-29 | 厦门威恩科技有限公司 | Upper limb rehabilitation training robot and control method |
| CN111714334B (en) * | 2020-07-13 | 2022-08-05 | 厦门威恩科技有限公司 | Upper limb rehabilitation training robot and control method |
| CN111991762A (en) * | 2020-09-02 | 2020-11-27 | 冼鹏全 | Psychotherapy-based wearable upper limb rehabilitation device for stroke patient and cooperative working method |
| CN113101137A (en) * | 2021-04-06 | 2021-07-13 | 合肥工业大学 | An upper limb rehabilitation robot based on motion mapping and virtual reality |
| CN113101137B (en) * | 2021-04-06 | 2023-06-02 | 合肥工业大学 | A robot for upper limb rehabilitation based on motion mapping and virtual reality |
| CN113181621A (en) * | 2021-06-09 | 2021-07-30 | 张彤 | Utilize supplementary training equipment of VR and force feedback arm |
| CN113181621B (en) * | 2021-06-09 | 2024-03-08 | 张彤 | Auxiliary training equipment using VR and force feedback mechanical arm |
| CN114469465A (en) * | 2021-12-28 | 2022-05-13 | 山东浪潮工业互联网产业股份有限公司 | A control method, device and medium based on intelligent prosthesis |
| CN114637395A (en) * | 2022-02-14 | 2022-06-17 | 上海诠视传感技术有限公司 | A method for training hand-eye coordination through AR glasses |
| CN114377358A (en) * | 2022-02-22 | 2022-04-22 | 南京医科大学 | A Home Rehabilitation System for Upper Limbs Based on Sphero Spherical Robot |
| CN114377358B (en) * | 2022-02-22 | 2025-01-28 | 南京医科大学 | An upper limb home rehabilitation system based on Sphero spherical robot |
| CN115047979A (en) * | 2022-08-15 | 2022-09-13 | 歌尔股份有限公司 | Head-mounted display equipment control system and interaction method |
| CN115047979B (en) * | 2022-08-15 | 2022-11-01 | 歌尔股份有限公司 | Head-mounted display equipment control system and interaction method |
| CN115691756A (en) * | 2022-10-18 | 2023-02-03 | 中国人民解放军陆军军医大学 | Remote home treatment real-time monitoring system |
| CN116312947B (en) * | 2023-03-13 | 2023-11-24 | 北京航空航天大学 | Immersive ankle and foot rehabilitation training method based on upper limb movement signals and electronic equipment |
| CN116312947A (en) * | 2023-03-13 | 2023-06-23 | 北京航空航天大学 | Immersive ankle and foot rehabilitation training method based on upper limb movement signals and electronic equipment |
| CN116510249A (en) * | 2023-05-09 | 2023-08-01 | 福州大学 | A hand virtual rehabilitation training system and training method based on electromyographic signals |
Also Published As
| Publication number | Publication date |
|---|---|
| CN108815804B (en) | 2020-06-09 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN108815804A (en) | VR rehabilitation training of upper limbs platform and method based on MYO armlet and mobile terminal | |
| US20210349529A1 (en) | Avatar tracking and rendering in virtual reality | |
| US11262841B2 (en) | Wireless wrist computing and control device and method for 3D imaging, mapping, networking and interfacing | |
| US20190265802A1 (en) | Gesture based user interfaces, apparatuses and control systems | |
| US7918808B2 (en) | Assistive clothing | |
| CN103488291B (en) | Immersion virtual reality system based on motion capture | |
| JP2021525431A (en) | Image processing methods and devices, image devices and storage media | |
| WO2018098961A1 (en) | Sports and health management platform and smart sports equipment | |
| US20150004581A1 (en) | Interactive physical therapy | |
| CN108883335A (en) | Wearable electronic multisensory interfaces for man-machine or man-man | |
| CN107433021A (en) | A kind of VR rehabilitation systems based on mirror neuron | |
| CN113571153A (en) | Passive training perception system and its client for physical rehabilitation of patients with cerebral palsy | |
| US20240302908A1 (en) | Virtual, Augmented and Mixed Reality Systems with Physical Feedback | |
| US20230023609A1 (en) | Systems and methods for animating a simulated full limb for an amputee in virtual reality | |
| CN113035000A (en) | Virtual reality training system for central integrated rehabilitation therapy technology | |
| JP6664778B1 (en) | Information processing apparatus, information processing method, and program | |
| Baskar | Holopham: An augmented reality training system for upper limb myoelectric prosthesis users | |
| US12001605B2 (en) | Head mounted display with visual condition compensation | |
| US12299196B2 (en) | Sensors for accurately interacting with objects in an artificial-reality environment, and systems and methods of use thereof | |
| US20250321639A1 (en) | Sensors for accurately interacting with objects in an artificial-reality environment, and systems and methods of use thereof | |
| CN120014904A (en) | A teaching simulation device for intermittent gastric tube insertion based on virtual reality | |
| Kao et al. | Variability in head movement during gait transitions | |
| NAN | DEVELOPMENT OF A COMPUTER PROGRAM TO ASSIST UPPER LIMB REHABILITATION USING KINECT | |
| JP2023126252A (en) | Information processing device, information processing method, and program | |
| Vogiatzaki | for Game-Based Training in Mixed-Reality |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| PB01 | Publication | ||
| TA01 | Transfer of patent application right |
Effective date of registration: 20181112 Address after: 518103 Fuhai Street Ocean Development Zone, Baoan District, Shenzhen City, Guangdong Province Applicant after: Shenzhen Medical Technology Co., Ltd. Address before: 710049 Department of Instrument Science and Precision Manufacturing, School of Machinery, Xi'an Jiaotong University, 28 Xianning Road, Beilin District, Xi'an City, Shaanxi Province Applicant before: Wang Jing |
|
| TA01 | Transfer of patent application right | ||
| SE01 | Entry into force of request for substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||
| GR01 | Patent grant |