Disclosure of Invention
The present invention aims to solve at least one of the technical problems in the prior art, and provides a whole-body posture tracking and haptic device and a virtual reality system.
In a first aspect, embodiments of the present disclosure provide a whole-body posture tracking and haptic device comprising a main body structure, at least one motion detection unit, a control unit, and at least one haptic control unit, wherein:
the main body structure is configured to be worn by a user on a limb;
the motion detection unit is arranged on the main body structure and is configured to detect the limb motions of the user and generate motion posture information of the user;
the control unit is configured to process the received motion posture information, generate a first control signal, and transmit the first control signal to the VR device so that the VR device controls the limb actions of a virtual user, and, after receiving a touch signal fed back by the VR device indicating that the virtual user is touched, generate a second control signal according to the touch signal;
the haptic control unit is configured to feed back a real haptic sensation to the user's limb through the main body structure in accordance with the second control signal.
In some examples, the motion posture information includes a rotation angle of a motion joint, and the motion detection unit includes a first detection module and a second detection module;
the first detection module is configured to detect an electromyographic signal generated by skeletal muscles of the user and process the electromyographic signal;
the second detection module is configured to detect a skin surface tension signal of the user and process the skin surface tension signal;
the control unit is configured to obtain the rotation angle of the motion joint of the user from the processed electromyographic signal and skin surface tension signal through a first preset algorithm, and to obtain the motion posture information of the user from the rotation angle of the motion joint through a second preset algorithm, so as to generate the first control signal.
In some examples, the first detection module includes an electromyographic signal electrode and a first band-pass amplifier, and the second detection module includes a skin surface tension strain gauge and a second band-pass amplifier;
the electromyographic signal electrode is configured to detect the electromyographic signal generated by the skeletal muscle of the user;
the first band-pass amplifier is configured to amplify the electromyographic signal and transmit it to the control unit;
the skin surface tension strain gauge is configured to detect the skin surface tension signal of the user;
the second band-pass amplifier is configured to amplify the skin surface tension signal and transmit it to the control unit.
In some examples, the control unit includes an analog-to-digital conversion module, a first calculation module, and a first control module;
the analog-to-digital conversion module is configured to convert the processed electromyographic signals and the skin surface tension signals into a first digital signal and a second digital signal respectively;
the first calculation module is configured to obtain the rotation angle of the motion joint of the user according to the first digital signal and the second digital signal through a first preset algorithm, and obtain the motion gesture information of the user through a second preset algorithm;
the first control module is configured to generate the first control signal according to the motion posture information of the user and transmit it to the VR device so that the VR device can control the limb actions of the virtual user, and, after receiving the touch signal fed back by the VR device indicating that the virtual user is touched, generate the second control signal according to the touch signal.
In some examples, the first calculation module is specifically configured to obtain, from the first digital signal and the second digital signal through the first preset algorithm, the rotation angle of the motion joint of the user, and to calculate, according to a pre-established kinematic model, the spatial pose coordinates of the current position point of each motion joint relative to its initial point, so as to obtain the motion posture information of the user, where the initial point corresponds to the angle information of each motion joint in the initial posture of the user.
In some examples, the main body structure is woven from a plurality of hollow conduits, and the haptic control unit comprises a first switch module, a second switch module, and a touch module;
the first switch module and the second switch module are both connected to the hollow conduit and are used for controlling gas entering and leaving the hollow conduit according to the second control signal; the touch module is arranged on the section of hollow conduit delimited by the first switch module and the second switch module, and is positioned on the side of the hollow conduit close to the human body when the user wears the main body structure.
In some examples, the control unit includes a second control module and a second calculation module;
the second calculation module is configured to calculate the position information of the actually touched position on the user's body according to the touch signal fed back by the VR device indicating that the virtual user is touched;
the second control module is configured to generate the second control signal according to that position information.
In some examples, the material of the hollow conduit comprises a fibrous material.
In some examples, the first and second switch modules include valves.
In some examples, the touch module includes a striking hammer.
In some examples, the haptic control unit further includes a suction assembly configured to fill the hollow conduit with gas and to discharge gas from the hollow conduit.
In some examples, the suction assembly is an air pump.
In a second aspect, embodiments of the present disclosure also provide a virtual reality system, including the above-described whole body gesture tracking and haptic device and a VR device, where the VR device is communicatively connected to the whole body gesture tracking and haptic device.
In some examples, the whole-body posture tracking and haptic device and the VR device are connected by Wi-Fi or Bluetooth.
In some examples, the VR device includes a VR headset.
Detailed Description
In order that those skilled in the art may better understand the technical solution of the present invention, the present invention is described in further detail below with reference to the drawings and specific embodiments.
Unless defined otherwise, technical or scientific terms used in this disclosure should be given the ordinary meaning as understood by one of ordinary skill in the art to which this disclosure belongs. The terms "first," "second," and the like, as used in this disclosure, do not denote any order, quantity, or importance, but rather are used to distinguish one element from another. Likewise, the terms "a," "an," or "the" and similar terms do not denote a limitation of quantity, but rather denote the presence of at least one. The words "comprising," "comprises," and the like mean that the element or item preceding the word encompasses the elements or items listed after the word and their equivalents, without excluding other elements or items. The terms "connected," "coupled," and the like are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "Upper," "lower," "left," "right," etc. are used merely to indicate relative positional relationships, which may change accordingly when the absolute position of the described object changes.
Existing virtual reality (VR) technology generally requires that a VR device seal a person's vision and hearing off from the outside and guide the user into a sense of being present in a virtual environment. VR devices 2 that are currently common on the market include VR glasses, VR helmets, and the like.
A VR helmet generally comprises lenses, a display screen, and a cable. The lenses, as one of the most critical components, link the user's eyes to the VR content and to a great extent determine the visual effect perceived by the user.
In addition, a high-performance display gives the VR helmet sufficient pixel density to show clear images and smooth motion pictures in VR. High-end VR helmets also use dual-screen displays to provide a stereoscopic 3D effect: each screen shows a slightly offset image to each eye, and the brain then automatically fuses the two into one image, creating an illusion of depth in the process.
Furthermore, a VR helmet is provided with built-in sensors so that the virtual character changes correspondingly with the user's head, giving a more accurate display. However, most VR devices 2 in the prior art still track only the head and hands, which makes interaction in a virtual environment monotonous and leaves the virtual reality lacking in realism. Even though tracking may be performed with a tracker or with a constellation of specific shapes on the tracked object, such a system is cumbersome and can hardly track detailed postures. Meanwhile, how to make the user feel the touch applied to different body parts of the virtual user, and how to blend this virtual touch into the virtual reality so as to improve the real experience, remains an important problem in the field.
In view of this, an embodiment of the present disclosure provides a whole-body posture tracking and haptic device 1, which detects the limb motion of a user and generates motion posture information of the user through the motion detection unit 20, thereby implementing whole-body posture tracking of the user, and which implements virtual touch through the main body structure 10 and the haptic control unit 40, while the VR device 2 remains responsible for general head position and posture tracking. This greatly improves the real experience of virtual reality and promotes the development of realistic virtual reality systems.
The whole-body posture tracking and haptic device of the embodiments of the present disclosure is described below with reference to the accompanying drawings and specific embodiments.
In a first aspect, the disclosed embodiments provide a whole-body posture tracking and haptic device 1 that may be used for information interaction with a VR device 2 worn on the head of a user. The whole-body posture tracking and haptic device 1 comprises a main body structure 10, at least one motion detection unit 20, a control unit 30, and at least one haptic control unit 40. The main body structure 10 is configured to be worn by the user on a limb, and the motion detection unit 20 is mounted on the main body structure 10 and configured to detect the limb motion of the user and generate motion posture information of the user. The control unit 30 is configured to process the received motion posture information, generate a first control signal, and transmit it to the VR device 2 so that the VR device 2 can control the limb motion of the virtual user, and, after receiving a touch signal fed back by the VR device 2 indicating that the virtual user is touched, generate a second control signal according to the touch signal. The haptic control unit 40 is configured to feed back a real touch perception to the limb of the user through the main body structure 10 according to the second control signal.
It should be noted that, in the embodiments of the present disclosure, the main body structure 10 may be a garment: the user wears the main body structure 10 when using the whole-body posture tracking and haptic device 1, and the main body structure 10 can be fixed relative to the user's limbs (for example, as tights) so as to fit the skin of the human body. The motion detection unit 20 is fixed on the main body structure 10; when the user wears the main body structure 10, the position of the motion detection unit 20 generally corresponds to a skeletal muscle of the user and is located on the side of the main body structure 10 away from the skin of the user.
In the embodiments of the disclosure, since the whole-body posture tracking and haptic device 1 includes the main body structure 10, at least one motion detection unit 20, the control unit 30, and at least one haptic control unit 40, the control unit 30 can process the real motion posture information of the user detected by the motion detection unit 20, generate a first control signal, and transmit it to the VR device 2 so that the VR device 2 can control the limb motion of the virtual user; after receiving the touch signal fed back by the VR device 2 indicating that the virtual user is touched, the control unit 30 generates a second control signal according to the touch signal, and the haptic control unit 40 can feed back a real haptic sensation to the limb of the user through the main body structure 10 according to the second control signal.
In some examples, the motion gesture information may be a rotation angle of the motion joint. The motion detection unit 20 includes a first detection module 21 and a second detection module 22. Specifically, the first detection module 21 is configured to detect and process an electromyographic signal generated by skeletal muscle of the user, the second detection module 22 is configured to detect and process a skin surface tension signal of the user, and the control unit 30 is configured to obtain a rotation angle of a motion joint of the user according to the processed electromyographic signal and skin surface tension signal and through a first preset algorithm, and obtain motion gesture information of the user according to the rotation angle of the motion joint of the user through a second preset algorithm, so as to generate a first control signal.
Further, the first detection module 21 comprises an electromyographic signal electrode 211 and a first band-pass amplifier 212, and the second detection module 22 comprises a skin surface tension strain gauge 221 and a second band-pass amplifier 222. The electromyographic signal electrode 211 is configured to detect the electromyographic signal generated by the skeletal muscle of the user, the first band-pass amplifier 212 is configured to amplify the electromyographic signal and transmit it to the control unit 30, the skin surface tension strain gauge 221 is configured to detect the skin surface tension signal of the user, and the second band-pass amplifier 222 is configured to amplify the skin surface tension signal and transmit it to the control unit 30.
It should be noted that the motion detection units 20 are mounted on the main body structure 10 so that their projections on the main body structure 10 cover the skeletal muscles of the whole body, and the numbers of electromyographic signal electrodes 211, first band-pass amplifiers 212, skin surface tension strain gauges 221, and second band-pass amplifiers 222 in each motion detection unit 20 can be adjusted for different body parts.
Specifically, when a muscle contracts, the electromyographic signal electrode 211 detects an electromyographic current; the greater the degree of contraction of the muscle, the greater the electromyographic current generated. The degree of contraction can therefore be deduced by detecting the magnitude of the electromyographic current, from which the joint rotation angle is calculated; the motion of the virtual user can then be controlled through the motion angles of the joints of the human body, completing the whole-body posture tracking. The skin surface tension strain gauge 221 detects the change in skin stress during muscle contraction and relaxation: when the muscle contracts, the skin is squeezed and the strain gauge 221 is subjected to pressure; when the muscle expands, the skin is under surface tension and the strain gauge detects the tension. By matching the skin stress signal with the electromyographic signal, the joint rotation angle can be calculated more accurately.
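The mapping from EMG and strain readings to a joint rotation angle described above can be sketched as follows. The moving-average envelope window and the linear gains are hypothetical placeholders standing in for the first preset algorithm, which the source does not specify; a real device would calibrate them per user and per muscle.

```python
def emg_envelope(samples, window=5):
    """Rectify the raw EMG samples and smooth them with a moving
    average to estimate muscle activation (a common simple envelope)."""
    rectified = [abs(s) for s in samples]
    out = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)
        out.append(sum(rectified[lo:i + 1]) / (i + 1 - lo))
    return out

def estimate_joint_angle(emg_level, strain_level,
                         k_emg=60.0, k_strain=30.0, rest_angle=0.0):
    """Map muscle activation and skin-strain readings to a joint
    rotation angle in degrees.  The linear gains k_emg and k_strain
    are hypothetical calibration constants, not values from the source."""
    return rest_angle + k_emg * emg_level + k_strain * strain_level

# Example: a burst of EMG activity plus mild skin strain.
emg = [0.0, 0.2, -0.3, 0.4, -0.5, 0.5]
activation = emg_envelope(emg)[-1]
angle = estimate_joint_angle(activation, strain_level=0.1)
```

Combining the two channels as a weighted sum is only one plausible fusion rule; the point is that the strain signal corrects the EMG-only estimate, as the paragraph above describes.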
The first band-pass amplifier 212 and the second band-pass amplifier 222 are used to filter and amplify the electromyographic signal and the skin surface tension signal, respectively. The signal amplifier is not limited to a band-pass amplifier, and its type is not limited here, as long as the detected signal can be effectively amplified.
Further, the control unit 30 comprises an analog-to-digital conversion module 31, a first calculation module and a first control module, wherein the analog-to-digital conversion module 31 is configured to convert the processed electromyographic signals and skin surface tension signals into a first digital signal and a second digital signal, respectively. The first calculation module is configured to obtain the rotation angle of the motion joint of the user according to the first digital signal and the second digital signal through a first preset algorithm, and obtain the motion gesture information of the user through a second preset algorithm. The first control module is configured to generate a first control signal according to motion gesture information of a user, transmit the first control signal to the VR device 2, so that the VR device 2 can control limb actions of the virtual user, and generate a second control signal according to the touch signal after receiving the touch signal fed back by the VR device 2 when the virtual user is touched.
In some examples, the first calculation module is specifically configured to obtain, from the first digital signal and the second digital signal through the first preset algorithm, the rotation angle of the motion joint of the user, and to calculate, according to a pre-established kinematic model, the spatial pose coordinates of the current position point of each motion joint relative to its initial point, thereby obtaining the motion posture information of the user, where the initial point corresponds to the angle information of each motion joint in the initial posture of the user.
Further, the first calculation module comprises a multimedia application processor (MAP) and an inertial measurement unit (IMU), and is configured to process the first digital signal and the second digital signal through the MAP to obtain the rotation angle of the motion joint of the user, and then to calculate the motion posture information of the user through kinematic formulas according to that rotation angle.
Specifically, when the system is used for the first time, virtual-real body calibration is required; that is, the physical size, initial posture, and hand and foot end-point spatial coordinates of the user's body must be made to correspond to those of the virtual user. The physical size of the body is obtained by taking a whole-body photograph through the VR device 2, and the initial posture is the posture of the user when starting the system. After startup, the system tracks and records the initial posture through whole-body posture tracking; during tracking, the joint angles of the initial posture serve as initial points, and the spatial coordinates of the ends of the hands and feet can be calculated by kinematics.
A fixed point O1 is established together with an X1O1Y1 coordinate system, the position of O1 relative to the VR device 2 remaining unchanged; the relative attitude is acquired and calculated by the VR device 2 and the IMU. The position to be calculated on the user is taken as O2, an X2O2Y2 coordinate system is established, the coordinate transformation matrix of X2O2Y2 relative to X1O1Y1 is calculated, and the posture of the user is finally obtained by calculating the coordinates of the user's limbs relative to O1.
Fig. 1 is a schematic diagram of a human body kinematic model. The flow of controlling the virtual user through limb actions is as follows: the user makes the corresponding action according to the virtual picture, the whole-body posture tracking and haptic device 1 performs posture tracking, the control unit 30 sends the body posture coordinates to the VR device 2, and the VR device 2 controls the action of the virtual user through the user's posture. Specifically, as shown in fig. 1, taking the left arm of the human body in the figure as an example, a kinematic model is established and the relative spatial pose coordinates of point O2 with respect to point O1 are calculated. The position of point O1 relative to the VR device 2 is fixed, the relative attitude can be obtained and calculated by the IMUs of the VR device 2 and the control unit 30, the angle information of each joint of the left arm can be obtained through whole-body posture tracking, and the pose coordinates of point O2 relative to point O1 are calculated as:
T = | cθ2  -sθ2  x2 |
    | sθ2   cθ2  y2 |
    |  0     0    1 |

In the above formula:
1) T denotes the pose coordinates of O2 relative to O1, with (x2, y2) the translation of O2 in the X1O1Y1 system;
2) cθ2 is cos θ2;
3) sθ2 is sin θ2.
In the above calculation process, T is the coordinate transformation matrix of the hand coordinate system X2O2Y2 relative to the central coordinate system X1O1Y1 of the control unit 30, and the whole-body posture information of the human body can be obtained by calculating the coordinates of the four limbs relative to O1. In this process, the user controls the virtual user through limb motion: the user makes the corresponding motion according to the virtual picture, the whole-body posture tracking and haptic device 1 performs posture tracking, and the control unit 30 sends the user's pose coordinates to the VR device 2, so that the user controls the motion of the virtual user.
It will be appreciated that, in the above calculation, the placement of the hand coordinate system X2O2Y2 and the central coordinate system X1O1Y1 of the control unit 30 is not strictly constrained. The control unit 30 is usually located at the center of gravity of the human body, which makes the calculation of the whole-body posture information more convenient; however, the control unit 30 need not be located at the center of gravity, and placing it at any other position on the body does not affect the tracking and calculation of the whole-body posture.
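As an illustration of the kinematic calculation above, the following sketch chains planar homogeneous transforms (using the cθ/sθ shorthand, cθ = cos θ and sθ = sin θ) to obtain the pose of the hand frame relative to O1. The joint angles and link lengths are assumed example values, not figures from the source.

```python
import math

def transform(theta_deg, link_length):
    """Homogeneous transform of one planar link: rotate by the joint
    angle, then translate along the rotated link axis."""
    c = math.cos(math.radians(theta_deg))
    s = math.sin(math.radians(theta_deg))
    return [[c, -s, link_length * c],
            [s,  c, link_length * s],
            [0,  0, 1]]

def matmul(a, b):
    """3x3 matrix product, composing two planar transforms."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Left-arm example: shoulder angle 30 deg, elbow angle 45 deg, with
# hypothetical upper-arm (0.30 m) and forearm (0.25 m) lengths.
t01 = transform(30.0, 0.30)   # O1 frame -> elbow frame
t12 = transform(45.0, 0.25)   # elbow frame -> hand frame X2O2Y2
t02 = matmul(t01, t12)        # pose of the hand frame relative to O1
hand_x, hand_y = t02[0][2], t02[1][2]
```

Composing one such transform per joint, out to each hand and foot, yields the end-point spatial coordinates relative to O1 that the text describes.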
In some examples, after receiving the touch signal fed back by the VR device 2 indicating that the virtual user is touched, the control unit 30 generates a second control signal according to the touch signal, and the haptic control unit 40 may feed back a real haptic sensation to the user's limb through the main body structure 10 according to the second control signal.
In some examples, the control unit 30 includes a second control module and a second calculation module. The second calculation module is configured to calculate, according to the touch signal fed back by the VR device 2, the position information of the actually touched position on the user's body; the second control module is configured to generate the second control signal according to that position information, and the haptic control unit 40 may feed back a real haptic sensation to the user's limb through the main body structure 10 according to the second control signal.
It should be noted that the haptic control unit 40 may operate independently of the control unit 30; that is, the control unit 30 need not directly control the realization of the virtual haptic sensation. Alternatively, the haptic control unit 40 may be controlled by the control unit 30, the two units being communicatively connected to realize the virtual haptic function.
Further, the haptic control unit 40 includes a first switch module 41, a second switch module 42, and a touch module 43. The first switch module 41 and the second switch module 42 are connected to the hollow conduit and control gas entering and leaving the hollow conduit according to the second control signal; the touch module 43 is disposed on the section of hollow conduit delimited by the first switch module 41 and the second switch module 42, and is located on the side of the hollow conduit close to the human body when the user wears the main body structure 10.
Specifically, the main body structure 10 is woven from a plurality of hollow conduits. Fig. 2b is a schematic diagram of the woven part I in fig. 2a; as shown in fig. 2b, the hollow conduits form a net-like interlaced structure whose internal air pressure is adjustable. The flexibility of the hollow conduits changes with the internal air pressure, so the overall flexibility can be changed by filling or discharging gas; the local air pressure of the hollow conduits is likewise controllable, and virtual touch is realized by adjusting this local air pressure.
In some examples, the material of the hollow conduit includes a fibrous material, i.e., a hollow fiber conduit. Fibrous material has high strength, light weight, and good air permeability, so the main body structure 10 woven from hollow fiber conduits is suitable to be worn on the human body. However, the material of the hollow conduit is not limited to a fibrous material; any material that meets the functional requirements of the main body structure 10 may be used, preferably one that is lightweight and high-performance.
In some examples, the first and second switch modules 41, 42 include, but are not limited to, valves. The first and second switch modules 41 and 42 are used to control the local pressure of the hollow conduit, and thus the form thereof is not limited.
In some examples, the touch module 43 includes a striking hammer. When the haptic control unit 40 receives the second control signal, the first switch module 41 and the second switch module 42 on the two sides of the striking hammer are controlled to close, so that the local air pressure of the hollow conduit decreases and the striking hammer, driven by the pressure difference, strikes the corresponding point on the skin surface of the human body, thereby realizing a virtual pain sensation.
It should be noted that the striking hammer need not be driven by closing the first switch module 41 and the second switch module 42 on its two sides; the descent of the striking hammer may instead be controlled directly to realize the virtual touch.
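The two striking schemes described above can be sketched as a small command generator for one touch module. The command names and dictionary layout are illustrative assumptions, not an interface defined by the source.

```python
def hammer_strike_command(touched, direct_drive=False):
    """Return valve/actuator commands for one touch module.

    Valve-based scheme: the two switch modules flanking the striking
    hammer close, the local conduit pressure drops, and the pressure
    difference drives the hammer against the skin.  direct_drive
    models the alternative mentioned in the text, where the hammer is
    actuated directly and the valves stay open."""
    if not touched:
        return {"valve_1": "open", "valve_2": "open", "hammer": "idle"}
    if direct_drive:
        return {"valve_1": "open", "valve_2": "open", "hammer": "strike"}
    return {"valve_1": "closed", "valve_2": "closed", "hammer": "strike"}

# A touch event under the default, valve-based scheme.
cmd = hammer_strike_command(touched=True)
```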
In some examples, the haptic control unit 40 further includes a suction assembly 44 configured to fill the hollow conduit with gas and to discharge gas from the hollow conduit. Virtual touch can be realized by adjusting the air pressure inside the hollow conduit.
In some examples, the suction assembly 44 is an air pump, which fills the hollow conduit with air and discharges the air from it. Specifically, air is led through an air duct into an air reservoir and from there into the hollow conduits; at the same time, air from the reservoir is led through an air duct into a pressure regulating valve fixed on the air pump, thereby controlling the air pressure in the reservoir. When the pressure in the reservoir reaches the pressure set by the regulating valve, the air in the reservoir pushes the valve open, enters the air passage communicating with the valve, and holds the air inlet of the pump normally open, so that the pump runs unloaded. When leakage causes the reservoir pressure to fall below the set pressure, a return spring resets the valve, the control passage of the pump is disconnected, and the pump resumes pumping. When the air pump stops, it automatically discharges air.
The form and number of the suction assemblies 44 are not limited here, as long as each is configured to fill the hollow conduit with gas and discharge gas from it; for example, a blower may also be used as the suction assembly 44. Of course, since the whole-body posture tracking and haptic device 1 provided by the present disclosure is wearable on the body of a user, the weight and performance of the suction assembly 44 should be considered: the lighter and smaller the suction assembly 44 while still achieving its function, the better.
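The pump/regulator cycle described above behaves like a bang-bang controller with hysteresis: pump until the setpoint is reached, unload, and resume once leakage drops the pressure far enough. The following sketch models that loop; the setpoint, hysteresis band, and crude reservoir model are invented for illustration only.

```python
def pump_step(pressure_kpa, setpoint_kpa, hysteresis_kpa=5.0, pumping=True):
    """One step of the reservoir pressure loop: at or above the
    regulator setpoint the pump is unloaded (runs empty); once leakage
    drops the pressure below the hysteresis band, the return spring
    re-engages pumping.  Returns the new pumping state."""
    if pressure_kpa >= setpoint_kpa:
        return False          # regulator opens, pump unloaded
    if pressure_kpa < setpoint_kpa - hysteresis_kpa:
        return True           # pressure lost, resume pumping
    return pumping            # inside the hysteresis band: keep state

# Simulate a few control steps with an assumed reservoir model:
# +10 kPa per step while pumping, -2 kPa leakage while unloaded.
states = []
p, pumping = 180.0, True
for _ in range(5):
    pumping = pump_step(p, setpoint_kpa=200.0, pumping=pumping)
    p = p + 10.0 if pumping else p - 2.0
    states.append(round(p, 1))
```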
In order to make the working principle of the whole-body posture tracking and haptic device 1 provided by the embodiments of the present disclosure clearer, a description is given below with reference to specific examples.
Fig. 2a shows the whole-body posture tracking and haptic device in use, fig. 3 illustrates virtual reaction force in virtual touch, fig. 4 illustrates virtual pain in virtual touch, fig. 5 shows the whole-body posture tracking flow, and fig. 6 shows the virtual touch flow. Referring to figs. 2a, 3, 4, 5, and 6, the motion detection unit 20 of the whole-body posture tracking and haptic device 1 is mounted on the main body structure 10 and configured to detect the limb motion of the user. When the user moves, the motion detection unit 20 detects the limb motion, generates the user's motion posture information, and transmits it to the control unit 30. The control unit 30 processes the received motion posture information, generates a first control signal, and transmits it to the VR device 2 so that the VR device 2 controls the limb motion of the virtual user, thereby realizing whole-body posture tracking. When the user moves in the VR environment and the virtual user is touched by a virtual person, the VR device 2 feeds back the corresponding touch signal, the control unit 30 generates a second control signal according to the touch signal, and the haptic control unit 40 feeds back a real haptic sensation to the user's limb through the main body structure 10 according to the second control signal.
Specifically, as shown in fig. 5, when the system is used for the first time, virtual-real body calibration is performed; that is, the physical size, initial posture, and hand and foot end-point spatial coordinates of the user's body are made to correspond to those of the virtual user. The physical size of the body is obtained by taking a whole-body photograph through the VR device 2, and the initial posture is the posture of the user when starting the system. After startup, the system tracks and records the initial posture through whole-body posture tracking; during tracking, the joint angles of the initial posture serve as initial points, and the spatial coordinates of the ends of the hands and feet can be calculated by kinematics.
After the user completes the virtual-real body calibration, whole-body posture tracking starts. Fig. 7 is a schematic diagram of human elbow joint posture tracking; as shown in fig. 7, for the elbow joint, motion detection units 20 are attached to the outside of the triceps brachii and the outside of the biceps brachii, respectively. Fig. 8 is a schematic block diagram of motion detection; as shown in fig. 8, each motion detection unit 20 includes a first detection module 21 and a second detection module 22, wherein the first detection module 21 includes an electromyographic signal electrode 211 and a first band-pass amplifier 212, and the second detection module 22 includes a skin surface tension strain gauge 221 and a second band-pass amplifier 222. When a muscle contracts, the electromyographic signal electrode 211 detects an electromyographic current; the greater the contraction, the greater the current, so the degree of contraction can be deduced from the magnitude of the current and the joint rotation angle calculated from it, allowing the motion of the virtual user to be controlled through the motion angles of the joints of the human body. The skin surface tension strain gauge 221 detects the change in skin stress during muscle contraction and relaxation: when the muscle contracts, the skin is squeezed and the strain gauge 221 is subjected to pressure; when the muscle expands, the skin is under surface tension and the strain gauge detects the tension. Matching the skin stress signal with the electromyographic signal allows the joint rotation angle to be calculated more accurately, completing the whole-body posture tracking.
As shown in fig. 3, when the user tries to push an object with the arm of the virtual user in the virtual scene, the user swings the arm in the real environment, and the virtual user simultaneously pushes the object in the virtual scene according to the user's action. In a real scene, when pushing an object we feel its reaction force on the hand or arm; the present disclosure reproduces this effect in the virtual scene. When the user swings the arm to push against an object along the movement shown in fig. 3, the air pressure in the hollow tube on the outside of the upper arm is reduced, generating a contraction force F1 in the hollow tube; under the action of F1, the swinging forearm feels a reaction force F2. The user thus feels the reaction force borne by the virtual user, realizing virtual touch.
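One way to picture this force-feedback step is to compute the tube pressure that yields a target reaction force. The model below, in which the contraction force scales with the pressure drop below ambient times the tube cross-section, and its default parameters are assumptions for illustration only; the disclosure gives no pressure-force relationship.

```python
def tube_pressure_for_force(target_force_n,
                            tube_area_m2=2.0e-4,
                            ambient_kpa=101.3):
    """Pressure (kPa) to set inside the hollow tube so its contraction
    produces roughly the target reaction force.

    Assumed linear model: force F1 = (ambient - internal) * area, so the
    required pressure drop is F / A. The cross-section default is a
    made-up figure; the result is clamped at vacuum (0 kPa absolute).
    """
    drop_kpa = target_force_n / (tube_area_m2 * 1000.0)  # kPa = kN/m^2
    return max(0.0, ambient_kpa - drop_kpa)
```

Under this model, a 5 N reaction force calls for a 25 kPa drop below ambient in the assumed tube; the haptic control unit would command the suction assembly accordingly.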
As shown in fig. 6, when the virtual user is hit, the user feels the pain as follows: after the virtual user is hit, the VR device 2 records the body coordinate position at which the virtual user was struck and transmits the position information to the haptic control unit 40, and the haptic control unit 40 adjusts the pressure at the corresponding position on the user's body to produce a slight pain.
Specifically, as shown in fig. 4, when the user plays a fighting game, a virtual opponent hits the virtual user; to make the virtual experience closer to a real scene, the user should also feel pain when the virtual user is hit. When the virtual user's body is hit, the corresponding hit point on the real user's skin surface is struck: the air pressure of the hollow tubes on both sides of the first switch module 41 and the second switch module 42 at the hit point is reduced, so that the first switch module 41 and the second switch module 42 are pulled by tensile forces F1 and F2, the touch module 43 strikes the hit point on the skin surface under pressure F3, and the user feels a slight pain, thereby realizing virtual pain.
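The hit-feedback sequence above (locate the hit point, vent the tubes on both sides of switch modules 41 and 42, strike with touch module 43) can be sketched as a small dispatch routine. The module table and command names below are hypothetical; the disclosure specifies the valve and hammer roles but no command protocol.

```python
def trigger_pain(hit_xy, modules):
    """Pick the touch module nearest the reported hit coordinate and
    return the commands the haptic control unit would issue: vent the
    tube segment behind each neighbouring valve (switch modules 41 and
    42), then fire the strike hammer (touch module 43).
    """
    nearest = min(modules,
                  key=lambda m: (m["x"] - hit_xy[0]) ** 2
                              + (m["y"] - hit_xy[1]) ** 2)
    return [("vent", nearest["valve_1"]),    # first switch module 41
            ("vent", nearest["valve_2"]),    # second switch module 42
            ("strike", nearest["hammer"])]   # touch module 43

# Hypothetical layout of two haptic cells on the body surface.
modules = [
    {"x": 0.0, "y": 0.0, "valve_1": "V1a", "valve_2": "V1b", "hammer": "H1"},
    {"x": 1.0, "y": 0.5, "valve_1": "V2a", "valve_2": "V2b", "hammer": "H2"},
]
commands = trigger_pain((0.9, 0.6), modules)
```

The VR device's role ends at supplying `hit_xy`; everything after that is local to the haptic control unit, which matches the division of labour in the text.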
In a second aspect, embodiments of the present disclosure further provide a virtual reality system. Fig. 9 is a schematic diagram of a virtual reality system; as shown in fig. 9, the system includes the above-described whole body gesture tracking and haptic device 1 and a VR device 2, and the VR device 2 is communicatively connected with the whole body gesture tracking and haptic device 1.
In order to make the specific working principle of the virtual reality system provided by the embodiments of the present disclosure clearer, the following description is made with reference to specific examples.
In one example, referring to fig. 9, the virtual reality system provided by the present disclosure includes a whole body gesture tracking and haptic device 1 and a VR device 2; the whole body gesture tracking and haptic device 1 mainly performs the whole-body posture tracking and virtual haptic functions, while the VR device 2 mainly performs conventional six-degree-of-freedom head tracking. Whole-body posture tracking is mainly performed by the motion detection units 20 arranged over the skeletal muscles of the whole body. Each motion detection unit 20 mainly comprises an electromyographic signal electrode 211 and a skin surface tension strain gauge 221. When the human body moves, the electromyographic signal electrode 211 and the skin surface tension strain gauge 221 generate weak currents; these are filtered and amplified by a band-pass amplifier, converted into digital signals by the analog-to-digital conversion module 31, and the current values are then converted into the corresponding joint angles through a MAP, thereby realizing whole-body posture tracking. Virtual touch is realized by tights woven from hollow fiber tubes: by adjusting the pressure inside local fiber tubes, the tights achieve local pressure control. Each local pressure is controlled by a separate switch module, and the suction assembly 44 adjusts the local pressure by admitting or exhausting gas, realizing virtual touch; virtual pain is realized by the touch module 43. Meanwhile, an IMU records the posture information of the main control box.
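The signal chain described above (band-pass amplification, analog-to-digital conversion by module 31, then a MAP from current value to joint angle) can be sketched as follows. The 12-bit resolution, reference voltage, and calibration table are illustrative assumptions; the disclosure names the MAP step but does not give its contents.

```python
import bisect

def adc(voltage, vref=3.3, bits=12):
    """Quantise the amplified sensor voltage, as module 31 would.
    Resolution and reference voltage are assumed values."""
    full_scale = (1 << bits) - 1
    code = int(voltage / vref * full_scale)
    return max(0, min(full_scale, code))

# Hypothetical calibration MAP: ADC code -> joint angle in degrees,
# as would be recorded during virtual-real body calibration.
CODES  = [0,   500,  1000, 2000, 3000, 4095]
ANGLES = [0.0, 15.0, 40.0, 80.0, 120.0, 150.0]

def code_to_angle(code):
    """Piecewise-linear lookup in the calibration MAP."""
    i = bisect.bisect_right(CODES, code) - 1
    if i >= len(CODES) - 1:
        return ANGLES[-1]
    frac = (code - CODES[i]) / (CODES[i + 1] - CODES[i])
    return ANGLES[i] + frac * (ANGLES[i + 1] - ANGLES[i])
```

The control unit would run one such lookup per sensor channel, then hand the resulting joint angles to the VR device 2 as the first control signal.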
The control unit 30 of the whole body posture tracking and haptic device 1 is communicatively connected with the VR device 2; the control unit 30 transmits the user's body posture information to the VR device 2 for processing, and the VR device 2 transmits the haptic point coordinates back to the haptic control unit 40, thereby realizing the virtual haptic function. In this example, the first switch module 41 and the second switch module 42 are valves, the touch module 43 is a striking hammer, and the suction assembly 44 is an air pump.
In some examples, the whole body gesture tracking and haptic device 1 and the VR device 2 are connected by Wi-Fi or Bluetooth, thereby enabling information interaction.
In some examples, the VR device 2 comprises a VR head-mounted display, although the VR device 2 may also be any other VR device worn on the user's head, such as VR glasses.
It is to be understood that the above embodiments are merely exemplary embodiments adopted to illustrate the principles of the present invention, and the invention is not limited thereto. Various modifications and improvements may be made by those skilled in the art without departing from the spirit and substance of the invention, and such modifications and improvements are also considered to be within the scope of the invention.