
CN115202471B - A whole body posture tracking and tactile device and virtual reality system - Google Patents


Info

Publication number
CN115202471B
Authority
CN
China
Prior art keywords
user
signal
module
motion
control unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210704377.XA
Other languages
Chinese (zh)
Other versions
CN115202471A (en)
Inventor
杜伟华
张浩
陈丽莉
韩鹏
何惠东
石娟娟
秦瑞峰
姜倩文
赵砚秋
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
BOE Technology Group Co Ltd
Beijing BOE Display Technology Co Ltd
Original Assignee
BOE Technology Group Co Ltd
Beijing BOE Display Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by BOE Technology Group Co Ltd, Beijing BOE Display Technology Co Ltd
Priority to CN202210704377.XA
Publication of CN115202471A
Priority to PCT/CN2023/091411 (WO2023246305A1)
Application granted
Publication of CN115202471B
Legal status: Active
Anticipated expiration


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/015Input arrangements based on nervous system activity detection, e.g. brain waves [EEG] detection, electromyograms [EMG] detection, electrodermal response detection
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/016Input arrangements with force or tactile feedback as computer generated output to the user

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Dermatology (AREA)
  • Neurosurgery (AREA)
  • Neurology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract


The present disclosure provides a whole-body posture tracking and tactile device and a virtual reality system, belonging to the technical field of virtual reality augmented display. The whole-body posture tracking and tactile device of the present disclosure includes a main body structure, at least one motion detection unit, a control unit, and at least one tactile control unit. The main body structure is configured to be worn by the user on the limbs; the motion detection unit is installed on the main body structure and is configured to detect the user's limb movements and generate user motion posture information; the control unit is configured to process the received motion posture information, generate a first control signal, and transmit it to the VR device so that the VR device can control the limb movements of the virtual user, and, after receiving a touch signal fed back by the VR device indicating that the virtual user has been touched, to generate a second control signal according to the touch signal; the tactile control unit is configured to feed back real tactile sensation to the user's limbs through the main body structure according to the second control signal.

Description

Whole-body gesture tracking and haptic device and virtual reality system
Technical Field
The disclosure belongs to the technical field of virtual reality augmented display, and particularly relates to a whole-body gesture tracking and haptic device and a virtual reality system.
Background
Virtual reality (VR) technology is a practical technology developed in the 20th century. It combines computer, electronic information, and simulation technologies; its basic implementation is a computer-simulated virtual environment that gives the user a sense of immersion. With the continuing development of productivity and of science and technology, demand for VR technology from all kinds of industries keeps growing.
Most mobile VR headsets offer only rotational tracking (3DoF): you can look up or down and tilt your head left or right, but if you try to move your head's position, that movement is not tracked. Current virtual reality is also built only around our eyes and ears.
Disclosure of Invention
The present disclosure aims to solve at least one of the technical problems in the prior art, and provides a whole-body gesture tracking and haptic device and a virtual reality system.
In a first aspect, embodiments of the present disclosure provide a whole-body gesture tracking and haptic device comprising a body structure, at least one motion detection unit, a control unit, and at least one haptic control unit, wherein:
the body structure is configured to be worn by a user on a limb;
the motion detection unit is mounted on the body structure and is configured to detect the limb motions of the user and generate motion gesture information of the user;
the control unit is configured to process the received motion gesture information, generate a first control signal, and transmit it to the VR device so that the VR device can control the limb actions of the virtual user, and, after receiving a touch signal fed back by the VR device indicating that the virtual user has been touched, to generate a second control signal according to the touch signal;
the haptic control unit is configured to feed back a real haptic sensation to the user's limb through the body structure in accordance with the second control signal.
In some examples, the motion gesture information includes a rotation angle of a motion joint, and the motion detection unit includes a first detection module and a second detection module;
the first detection module is configured to detect an electromyographic signal generated by the user's skeletal muscles and process it;
The second detection module is configured to detect a skin surface tension signal of a user and process the skin surface tension signal;
The control unit is configured to obtain a rotation angle of a motion joint of a user according to the processed electromyographic signals and the skin surface tension signals through a first preset algorithm, and obtain motion gesture information of the user according to the rotation angle of the motion joint of the user through a second preset algorithm so as to generate a first control signal.
In some examples, the first detection module includes an electromyographic signal electrode and a first bandpass amplifier, the second detection module includes a skin surface tension strain gauge and a second bandpass amplifier;
The electromyographic signal electrode is configured to detect an electromyographic signal generated by skeletal muscle of a user;
the first band-pass amplifier is configured to amplify the electromyographic signals and transmit the electromyographic signals to the control unit;
The skin surface tension strain gauge is configured to detect a skin surface tension signal of a user;
the second band pass amplifier is configured to amplify the skin surface tension signal and transmit it to the control unit.
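The detect-filter-transmit pipeline above can be sketched in software. The Python stand-in below replaces the analog band-pass amplifier with a crude FFT band-pass filter; the 20-450 Hz pass band is an assumption (a typical surface-EMG band), not a value from the patent:

```python
import numpy as np

def bandpass(signal, fs, low_hz=20.0, high_hz=450.0):
    """Crude FFT band-pass, standing in for the band-pass amplifier stage.
    The cutoff band is an assumption (typical surface-EMG band)."""
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(signal.size, d=1.0 / fs)
    spectrum[(freqs < low_hz) | (freqs > high_hz)] = 0.0
    return np.fft.irfft(spectrum, n=signal.size)

# Synthetic raw trace: a 100 Hz "EMG" component plus 2 Hz baseline drift.
fs = 2000.0
t = np.arange(0, 1.0, 1.0 / fs)   # exactly 1 s, so both tones fall on FFT bins
raw = np.sin(2 * np.pi * 100 * t) + 0.5 * np.sin(2 * np.pi * 2 * t)
clean = bandpass(raw, fs)          # drift removed, EMG band kept
```

A real implementation would filter in analog hardware before the ADC; the digital filter here only illustrates the pass-band behavior.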
In some examples, the control unit includes an analog-to-digital conversion module, a first calculation module, and a first control module;
the analog-to-digital conversion module is configured to convert the processed electromyographic signals and the skin surface tension signals into a first digital signal and a second digital signal respectively;
the first calculation module is configured to obtain the rotation angle of the motion joint of the user according to the first digital signal and the second digital signal through a first preset algorithm, and obtain the motion gesture information of the user through a second preset algorithm;
the first control module is configured to generate a first control signal according to the motion gesture information of the user and transmit it to the VR device so that the VR device can control the limb actions of the virtual user, and, after receiving a touch signal fed back by the VR device indicating that the virtual user has been touched, to generate a second control signal according to the touch signal.
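The analog-to-digital conversion stage can be illustrated with a minimal quantizer; the 0-3.3 V input range and 12-bit resolution below are illustrative assumptions, not values from the patent:

```python
def adc_convert(voltage, v_min=0.0, v_max=3.3, bits=12):
    """Quantize an analog voltage to an ADC code. The voltage range and
    12-bit resolution are illustrative assumptions."""
    levels = (1 << bits) - 1
    clamped = min(max(voltage, v_min), v_max)   # saturate out-of-range input
    return round((clamped - v_min) / (v_max - v_min) * levels)
```

The amplified electromyographic and skin-tension signals would each pass through such a stage to become the first and second digital signals.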
In some examples, the first calculation module is specifically configured to obtain the rotation angle of each motion joint of the user from the first digital signal and the second digital signal through a first preset algorithm, and to calculate, according to a pre-established kinematic model, the spatial pose coordinates of the current position of each motion joint relative to its initial point, thereby obtaining the motion pose information of the user, where the initial points are the joint angles of the user's initial pose.
In some examples, the body structure is woven from a plurality of hollow conduits; the haptic control unit includes a first switch module, a second switch module, and a touch module;
the first switch module and the second switch module are both connected with the hollow conduit and are used for controlling gas entering and leaving the hollow conduit according to the second control signal; the touch module is disposed on the section of hollow conduit delimited by the first switch module and the second switch module, on the side of the conduit close to the human body when the user wears the main body structure.
In some examples, the control unit includes a second control module and a second calculation module;
the second calculation module is configured to calculate the position information of the actual touched position of the user according to the touch signal, fed back by the VR device, indicating that the virtual user has been touched;
the second control module is configured to generate the second control signal according to the position information actually touched by the user.
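One way to picture the second calculation and control modules is as a nearest-site lookup: map the touched position reported by the VR device to the closest actuation site on the garment, and address that site in the second control signal. All site names and coordinates below are hypothetical illustrations, not data from the patent:

```python
import math

# Hypothetical actuation sites: site name -> (x, y) position on the garment
# surface in its own 2-D coordinates (illustrative values only).
HAMMER_SITES = {
    "left_forearm": (0.10, 0.40),
    "left_upper_arm": (0.15, 0.70),
    "chest": (0.50, 0.80),
}

def second_control_signal(touch_xy):
    """Pick the actuation site closest to the touched position and emit
    a 'second control signal' addressing that site's switch modules."""
    site = min(HAMMER_SITES,
               key=lambda name: math.dist(HAMMER_SITES[name], touch_xy))
    return {"site": site, "action": "strike"}
```

In the device, the chosen site would determine which pair of switch modules is operated.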
In some examples, the material of the hollow conduit comprises a fibrous material.
In some examples, the first and second switch modules include valves.
In some examples, the touch module includes a striking hammer.
In some examples, the haptic control unit further includes a suction assembly configured to fill the hollow conduit with gas and to vent gas from it.
In some examples, the suction assembly is an air pump.
In a second aspect, embodiments of the present disclosure also provide a virtual reality system, including the above-described whole body gesture tracking and haptic device and a VR device, where the VR device is communicatively connected to the whole body gesture tracking and haptic device.
In some examples, the whole-body pose tracking and haptic device and the VR device are connected by Wi-Fi or Bluetooth.
In some examples, the VR device includes a VR headset.
Drawings
FIG. 1 is a schematic diagram of a human kinematic model;
FIG. 2a is a diagram of the whole-body posture tracking and haptic device in use;
FIG. 2b is a schematic diagram of the weave detail of part I of FIG. 2a;
FIG. 3 is a schematic diagram of virtual reaction forces in a virtual haptic;
FIG. 4 is a schematic illustration of virtual pain sensation in a virtual haptic;
FIG. 5 is a flowchart of whole body pose tracking;
FIG. 6 is a virtual haptic flow chart;
FIG. 7 is a schematic view of the gesture tracking of an elbow joint of a human body;
FIG. 8 is a schematic block diagram of motion detection;
fig. 9 is a schematic diagram of a virtual reality system.
Detailed Description
To give those skilled in the art a better understanding of the technical solutions of the present disclosure, the disclosure is described in further detail below with reference to the drawings and specific embodiments.
Unless defined otherwise, technical or scientific terms used in this disclosure should be given the ordinary meaning as understood by one of ordinary skill in the art to which this disclosure belongs. The terms "first," "second," and the like, as used in this disclosure, do not denote any order, quantity, or importance, but rather are used to distinguish one element from another. Likewise, the terms "a," "an," or "the" and similar terms do not denote a limitation of quantity, but rather denote the presence of at least one. The word "comprising" or "comprises", and the like, means that elements or items preceding the word are included in the element or item listed after the word and equivalents thereof, but does not exclude other elements or items. The terms "connected" or "connected," and the like, are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. "upper", "lower", "left", "right", etc. are used merely to indicate relative positional relationships, which may also be changed when the absolute position of the object to be described is changed.
Existing virtual reality (VR) technology generally requires a VR device that seals the user's vision and hearing off from the outside world and guides the user into the sensation of being in a virtual environment. VR devices 2 currently common on the market include VR glasses, VR helmets, and the like.
A VR helmet generally comprises lenses, a display screen, and cables. The lens, as one of the most critical components, links the user's eyes to the VR content and determines the visual effect to a great extent.
In addition, a high-performance display gives the VR helmet sufficient pixel density to show clear images and smooth motion in VR. High-end VR helmets also use dual-screen displays to provide a stereoscopic 3D effect: each screen shows a slightly offset image to one eye, and the brain automatically fuses the two into one image, creating an illusion of depth.
A VR helmet is further provided with built-in sensors so that the virtual character changes along with the user's head and the display becomes more accurate. Still, most VR devices 2 in the prior art track only the head and hands, which leaves interaction in the virtual environment limited and the virtual reality lacking in realism. Even though tracking may be performed with a tracker, or by using constellations of specific shapes on the tracked object, such a system is cumbersome and struggles to track detailed poses. Meanwhile, how to let the user feel touch on different body parts of the virtual user, blending virtual touch into virtual reality so as to improve the sense of real experience, remains an important problem in the field.
In view of this, embodiments of the present disclosure provide a whole-body gesture tracking and haptic device 1 that detects the user's limb motion and generates the user's motion gesture information through the motion detection unit 20, thereby implementing whole-body gesture tracking of the user, and that implements virtual touch through the main body structure 10 and the haptic control unit 40, while the VR device 2 remains responsible for ordinary head position and gesture tracking. This greatly improves the real experience of virtual reality and helps drive the development of more realistic virtual reality systems.
The whole-body gesture tracking and haptic device of the embodiments of the present disclosure is described below with reference to the accompanying drawings and specific embodiments.
In a first aspect, the disclosed embodiments provide a whole-body gesture tracking and haptic device 1 that may be used for information interaction with a VR device 2 worn on the user's head. The whole-body gesture tracking and haptic device 1 comprises a main body structure 10, at least one motion detection unit 20, a control unit 30, and at least one haptic control unit 40. The main body structure 10 is configured to be worn on a limb by the user; the motion detection unit 20 is mounted on the main body structure 10 and is configured to detect the user's limb motion and generate the user's motion gesture information; the control unit 30 is configured to process the received motion gesture information, generate a first control signal, and transmit it to the VR device 2 so that the VR device 2 can control the limb motion of the virtual user, and to generate a second control signal after receiving the touch signal, fed back by the VR device 2, that the virtual user has been touched; and the haptic control unit 40 is configured to feed back real touch perception to the user's limb through the main body structure 10 according to the second control signal.
It should be noted that, in the embodiment of the present disclosure, the main body structure 10 may be a garment, and the user wears the main body structure 10 when using the whole body posture tracking and haptic device 1, and the main body structure 10 can be relatively fixed on the limb (for example, tights) of the user, so as to fit the skin of the human body. The motion detection unit 20 is fixed on the main body structure 10, and when the user wears the main body structure 10, the position of the motion detection unit 20 generally corresponds to skeletal muscle of the user and is located on one side of the main body structure 10 away from skin of the user.
In the embodiment of the disclosure, since the whole body gesture tracking and haptic device 1 includes the main structure 10, at least one motion detection unit 20, the control unit 30, and at least one haptic control unit 40, the control unit 30 can process the received real motion gesture information of the user detected by the motion detection unit 20, generate a first control signal, and transmit the first control signal to the VR device 2, so that the VR device 2 can control the limb motion of the virtual user, and after receiving the touch signal that the virtual user is touched and fed back by the VR device 2, generate a second control signal according to the touch signal, the haptic control unit 40 can feed back the real haptic sensation to the limb of the user through the main structure 10 according to the second control signal.
In some examples, the motion gesture information may be a rotation angle of the motion joint. The motion detection unit 20 includes a first detection module 21 and a second detection module 22. Specifically, the first detection module 21 is configured to detect and process an electromyographic signal generated by skeletal muscle of the user, the second detection module 22 is configured to detect and process a skin surface tension signal of the user, and the control unit 30 is configured to obtain a rotation angle of a motion joint of the user according to the processed electromyographic signal and skin surface tension signal and through a first preset algorithm, and obtain motion gesture information of the user according to the rotation angle of the motion joint of the user through a second preset algorithm, so as to generate a first control signal.
Further, the first detection module 21 comprises an electromyographic signal electrode 211 and a first bandpass amplifier 212, the second detection module 22 comprises a skin surface tension strain gauge 221 and a second bandpass amplifier 222, the electromyographic signal electrode 211 is configured to detect an electromyographic signal generated by skeletal muscle of the user, the first bandpass amplifier 212 is configured to amplify the electromyographic signal and transmit to the control unit 30, the skin surface tension strain gauge 221 is configured to detect a skin surface tension signal of the user, and the second bandpass amplifier 222 is configured to amplify the skin surface tension signal and transmit to the control unit 30.
It should be noted that the motion detection units 20 are mounted on the main body structure 10 such that their projections onto the main body structure 10 cover the skeletal muscles of the whole body, and the numbers of electromyographic signal electrodes 211, first band-pass amplifiers 212, skin surface tension strain gauges 221, and second band-pass amplifiers 222 in a motion detection unit 20 can be adjusted for different body parts.
Specifically, when a muscle contracts, the electromyographic signal electrode 211 detects a myoelectric current; the greater the degree of contraction, the greater the current generated. The degree of muscle contraction can therefore be inferred from the magnitude of the myoelectric current, and the joint rotation angle calculated from it; the motion of the virtual user can then be controlled through the motion angles of the joints of the human body, completing whole-body gesture tracking. The skin surface tension strain gauge 221 detects changes in skin stress during muscle contraction and relaxation: when the muscle contracts, the skin is squeezed and the strain gauge senses pressure; when the muscle expands, the skin is under surface tension, which the strain gauge likewise detects. Matching the skin-stress signal against the electromyographic signal allows the joint rotation angle to be calculated more accurately.
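The fusion just described (larger EMG current implies stronger contraction, corrected by the skin-tension reading) might be sketched as a weighted linear mapping. The gains, the linear form, and the clamp below are all assumptions, since the patent does not disclose its "first preset algorithm":

```python
def estimate_joint_angle(emg_rms, tension, emg_gain=90.0, tension_gain=30.0,
                         max_angle=150.0):
    """Illustrative stand-in for the 'first preset algorithm': map a
    normalized EMG amplitude (0..1) and a skin-tension strain reading
    (-1..1, negative = skin squeezed) to a joint rotation angle in
    degrees via a weighted linear fusion, clamped to a plausible range.
    All gains are hypothetical, not values from the patent."""
    angle = emg_gain * emg_rms + tension_gain * tension
    return max(0.0, min(max_angle, angle))
```

A real implementation would calibrate such a mapping per joint and per user; the point here is only that two signals are fused into one angle estimate.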
The first band-pass amplifier 212 and the second band-pass amplifier 222 are used to filter and amplify the electromyographic signal and the skin surface tension signal, respectively. The signal amplifier is not limited to a band-pass amplifier; its type is not restricted here, so long as the detected signal can be effectively amplified.
Further, the control unit 30 comprises an analog-to-digital conversion module 31, a first calculation module and a first control module, wherein the analog-to-digital conversion module 31 is configured to convert the processed electromyographic signals and skin surface tension signals into a first digital signal and a second digital signal, respectively. The first calculation module is configured to obtain the rotation angle of the motion joint of the user according to the first digital signal and the second digital signal through a first preset algorithm, and obtain the motion gesture information of the user through a second preset algorithm. The first control module is configured to generate a first control signal according to motion gesture information of a user, transmit the first control signal to the VR device 2, so that the VR device 2 can control limb actions of the virtual user, and generate a second control signal according to the touch signal after receiving the touch signal fed back by the VR device 2 when the virtual user is touched.
In some examples, the first calculation module is specifically configured to obtain a rotation angle of a motion joint of a user according to a first digital signal and a second digital signal and through a first preset algorithm, and calculate, according to a pre-established kinematic model, spatial pose coordinates of a current position point of each motion joint relative to an initial point of each motion joint, thereby obtaining motion pose information of the user, where the initial point is angle information of each motion joint of an initial pose of the user.
Further, the first computing module comprises a multimedia application processor (MAP) and an inertial measurement unit (IMU), and is configured to process the first digital signal and the second digital signal through the MAP to obtain the rotation angles of the user's motion joints, and then to calculate the user's motion gesture information from those rotation angles through kinematic formulas.
Specifically, when the system is used for the first time, virtual-real body calibration is required: the physical dimensions, initial posture, and hand and foot end-point spatial coordinates of the user's body must be made to correspond to those of the virtual user. The body dimensions are obtained by taking a whole-body photograph with the VR device 2; the initial posture is whatever posture the user adopts when starting the system, which the system tracks and records through whole-body posture tracking. During whole-body posture tracking, the joint angles of the initial posture serve as initial points, and the spatial coordinates of the hand and foot ends are calculated kinematically.
A fixed point O1 is established together with a coordinate system X1O1Y1; the position of O1 relative to the VR device 2 is kept unchanged, and the relative gesture is acquired and calculated by the VR device 2 and the IMU. The position to be calculated for the user is taken as O2, with a coordinate system X2O2Y2; the coordinate transformation matrix of X2O2Y2 relative to X1O1Y1 is calculated, and the user's gesture is finally obtained by calculating the coordinates of the user's limbs relative to O1.
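The chained coordinate systems can be sketched with planar homogeneous transforms. The link length and joint angle below are assumed values for a two-frame example, not parameters from the patent:

```python
import numpy as np

def planar_transform(theta, tx, ty):
    """Homogeneous transform of a child frame rotated by theta and
    translated by (tx, ty) relative to its parent frame."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0.0, 0.0, 1.0]])

# Pose of O2 in the O1 frame: rotate by theta2, then offset by a link
# length l1 along the rotated x-axis (l1 and theta2 are assumed values).
l1, theta2 = 0.3, np.pi / 2
T_12 = planar_transform(theta2, l1 * np.cos(theta2), l1 * np.sin(theta2))

hand_in_O2 = np.array([0.25, 0.0, 1.0])   # a point expressed in X2O2Y2
hand_in_O1 = T_12 @ hand_in_O2            # the same point in X1O1Y1
```

Chaining such matrices along each limb yields the limb end-point coordinates relative to O1, which is the computation the text describes.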
Fig. 1 is a schematic diagram of a human body kinematic model. The flow of controlling a virtual user through limb actions is as follows: the user makes corresponding actions according to the virtual picture, the whole-body gesture tracking and haptic device 1 performs gesture tracking, the control unit 30 sends the body pose coordinates to the VR device 2, and the VR device 2 controls the actions of the virtual user accordingly. Specifically, as shown in Fig. 1, taking the left arm of the human body as an example, a kinematic model is established and the relative spatial pose coordinates of point O2 with respect to point O1 are calculated. The position of O1 relative to the VR device 2 is fixed, the relative pose is obtained by the IMUs of the VR device 2 and the control unit 30, and the angle information of each joint of the left arm is obtained by whole-body pose tracking; the pose coordinates of point O2 relative to point O1 are then calculated through the transformation matrix 1T2, in which:
1) 1T2 gives the pose coordinates of O2 relative to O1;
2) cθ2 is cos θ2;
3) sθ2 is sin θ2.
In the above calculation, 1T2 is the coordinate transformation matrix of the hand coordinate system X2O2Y2 relative to the central coordinate system X1O1Y1 of the control unit 30, and the whole-body posture information of the human body is obtained by calculating the coordinates of the four limbs relative to O1. In this process, the user controls the virtual user through limb motion: the user moves according to the virtual picture, the whole-body gesture tracking and haptic device 1 performs gesture tracking, and the control unit 30 sends the user's pose coordinates to the VR device 2, so that the user controls the virtual user's motion.
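For reference, a coordinate transformation matrix of this kind has the standard planar homogeneous form, written here with the text's abbreviations cθ2 = cos θ2 and sθ2 = sin θ2; the translation entries x and y depend on the link geometry and are left symbolic:

```latex
% Standard planar homogeneous transform of frame X2O2Y2 relative to X1O1Y1.
% The translation entries (x, y) depend on the link lengths and are not
% specified by the surrounding text, so they are left symbolic here.
{}^{1}T_{2} =
\begin{bmatrix}
  c\theta_2 & -s\theta_2 & x \\
  s\theta_2 &  c\theta_2 & y \\
  0 & 0 & 1
\end{bmatrix}
```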
It will be appreciated that, in the above calculation, the placement of the hand coordinate system X2O2Y2 and the central coordinate system X1O1Y1 of the control unit 30 is not strictly constrained. The control unit 30 is usually located at the center of gravity of the human body, which makes the whole-body posture calculation more convenient; however, it need not be, and placing it elsewhere on the body does not affect posture tracking and calculation.
In some examples, after receiving the touch signal that the virtual user fed back by the VR device 2 is touched, the control unit 30 generates a second control signal according to the touch signal, and the haptic control unit 40 may feed back the real haptic sensation to the user's limb through the body structure 10 according to the second control signal.
In some examples, the control unit 30 includes a second control module and a second calculation module. The second calculation module is configured to calculate the position information of the actually touched position of the user according to the touch signal fed back by the VR device 2; the second control module is configured to generate the second control signal according to that position information; and the haptic control unit 40 may then feed back real touch perception to the user's limb through the main body structure 10 according to the second control signal.
It should be noted that the haptic control unit 40 may exist independently of the control unit 30; in that case the control unit 30 does not directly control the realization of the virtual haptic sensation. Alternatively, the haptic control unit 40 may be controlled by the control unit 30, the two units being communicatively connected to realize the virtual haptic function.
Further, the haptic control unit 40 includes a first switch module 41, a second switch module 42, and a touch module 43. The first switch module 41 and the second switch module 42 are connected to the hollow conduit and are used for controlling gas entering and leaving it according to the second control signal; the touch module 43 is disposed on the section of hollow conduit delimited by the first switch module 41 and the second switch module 42, on the side of the conduit close to the human body when the user wears the main body structure 10.
Specifically, the main body structure 10 is woven from a plurality of hollow conduits. Fig. 2b is a schematic diagram of the weave of part I in Fig. 2a. As shown in Fig. 2b, the hollow conduits form a mesh-like interlaced structure whose internal air pressure is adjustable; the flexibility of a conduit changes with its internal pressure, so the overall flexibility can be changed by filling or discharging gas. The local air pressure of a conduit is also controllable, and virtual touch is realized by adjusting it.
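The valve-isolated pressure control described above can be sketched as a toy model of one conduit segment; the pressure values and the linear stiffness-pressure relation below are illustrative assumptions:

```python
class ConduitSegment:
    """Toy model of one woven-conduit segment bounded by two switch
    modules (valves). Values and the stiffness law are illustrative."""

    def __init__(self, pressure_kpa=20.0):
        self.pressure_kpa = pressure_kpa
        self.valve_a_open = True
        self.valve_b_open = True

    def isolate(self):
        """Close both switch modules so the segment's pressure is local."""
        self.valve_a_open = False
        self.valve_b_open = False

    def vent(self, drop_kpa):
        """Release gas from an isolated segment, lowering its stiffness.
        With either valve open, gas equalizes and no local drop occurs."""
        if not (self.valve_a_open or self.valve_b_open):
            self.pressure_kpa = max(0.0, self.pressure_kpa - drop_kpa)

    @property
    def stiffness(self):
        """Assumed linear relation: stiffness proportional to pressure."""
        return 0.05 * self.pressure_kpa
```

Lowering the pressure of an isolated segment is what lets the striking hammer move toward the skin in the virtual-pain mechanism described next.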
In some examples, the material of the hollow conduit includes a fiber material, i.e., the conduit is a hollow fiber tube. Fiber materials are strong, light, and breathable, so the main structure 10 woven from hollow fiber tubes is well suited to being worn on the human body. However, the material of the hollow conduit is not limited to fiber; any material that meets the functional requirements of the body structure 10 may be used, although lightweight, high-performance materials are preferred.
In some examples, the first switch module 41 and the second switch module 42 include, but are not limited to, valves. Because they merely serve to control the local pressure of the hollow conduit, their specific form is not limited.
In some examples, the touch module 43 includes a striking hammer. When the haptic control unit 40 receives the second control signal, the first switch module 41 and the second switch module 42 on either side of the striking hammer are closed, the local air pressure in the hollow conduit drops, and the striking hammer strikes a contact point on the skin surface under the resulting pressure, thereby realizing a virtual pain sensation.
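The tap sequence just described (close the two valves, let the trapped pressure fall, the hammer strikes, then reopen) can be sketched as a small state model. All class names and numeric values below are illustrative assumptions, not taken from the patent.

```python
AMBIENT_KPA = 101.3  # ambient pressure, illustrative value

class Valve:
    def __init__(self):
        self.open = True

class TubeSegment:
    """A segment of hollow conduit bounded by two switch-module valves."""
    def __init__(self, pressure_kpa):
        self.left = Valve()
        self.right = Valve()
        self.pressure_kpa = pressure_kpa

    def actuate_tap(self):
        """Close both valves, vent the isolated segment, and report whether
        the hammer fires (it fires when the segment pressure drops below
        ambient). The valves reopen afterwards so the suit returns to its
        resting stiffness."""
        self.left.open = False
        self.right.open = False
        self.pressure_kpa = AMBIENT_KPA - 20.0  # vented below ambient
        fired = self.pressure_kpa < AMBIENT_KPA
        self.left.open = True
        self.right.open = True
        return fired

fired = TubeSegment(pressure_kpa=130.0).actuate_tap()
# fired is True: the vented segment drops below ambient and the hammer strikes
```

The alternative noted in the text, driving the hammer directly, would simply bypass the pressure step in `actuate_tap`.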
It should be noted that, rather than closing the first switch module 41 and the second switch module 42 on either side of the striking hammer to drive the hammer against the skin surface, the drop of the striking hammer may also be controlled directly to realize the virtual touch.
In some examples, the haptic control unit 40 further includes a suction assembly 44 configured to charge gas into the hollow conduit and to discharge gas from it. Virtual touch can thus be achieved by adjusting the air pressure inside the hollow conduit.
In some examples, the suction assembly 44 is an air pump that fills the hollow conduit with air and discharges the air from it. Specifically, air is led through an air duct into an air reservoir and from there into the hollow conduit; at the same time, the reservoir feeds air through another duct to a pressure regulating valve fixed on the air pump, which controls the air pressure in the reservoir. When the reservoir pressure reaches the value set by the pressure regulating valve, the air in the reservoir pushes the valve open, enters the air channel connected to it, and holds the pump's air inlet open through that channel, so that the pump runs unloaded. When leakage drops the reservoir pressure below the set value, the return spring closes the valve, the pump's control channel is disconnected, and the pump resumes pumping. When the pump stops, it automatically vents its air.
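The unload/restart behaviour of the pump and pressure regulating valve amounts to a bang-bang (hysteresis) controller on reservoir pressure. The following sketch, with invented pressures and rates, shows the idea; it is not the patent's implementation.

```python
def pump_step(pressure, pumping, set_point, restart_margin, fill_rate, leak_rate):
    """One control tick: the pump idles once reservoir pressure reaches
    set_point, and restarts once leakage drops the pressure more than
    restart_margin below set_point (the 'return spring' case)."""
    if pumping and pressure >= set_point:
        pumping = False   # regulator opens: pump runs unloaded
    elif not pumping and pressure < set_point - restart_margin:
        pumping = True    # return spring closes valve: pump resumes
    pressure += fill_rate if pumping else -leak_rate
    return pressure, pumping

pressure, pumping = 100.0, True
history = []
for _ in range(50):
    pressure, pumping = pump_step(pressure, pumping,
                                  set_point=150.0, restart_margin=10.0,
                                  fill_rate=5.0, leak_rate=1.0)
    history.append(pressure)
# After settling, the pressure oscillates inside the hysteresis band
# around the regulator's set point.
```

The `restart_margin` plays the role of the return spring: without it the pump would chatter on and off at the set point.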
The form and number of the suction assembly 44 are not limited here, as long as it can charge gas into the hollow conduit and discharge gas from it; a blower, for example, could also serve as the suction assembly 44. Of course, since the whole body posture tracking and haptic device 1 provided by the present disclosure is worn on the user's body, the weight and performance of the suction assembly 44 should be considered: the lighter and smaller the suction assembly 44, the better, provided it still performs its function.
In order to make the principle of operation of the whole body posture tracking and haptic device 1 provided by the embodiments of the present disclosure clearer, the following description is given with reference to specific examples.
Fig. 2a shows the whole body gesture tracking and haptic device in use; fig. 3 illustrates virtual reaction force; fig. 4 illustrates virtual pain; fig. 5 shows the whole body gesture tracking flow; and fig. 6 shows the virtual haptic flow. Referring to figs. 2a and 3-6, the motion detection unit 20 of the whole body gesture tracking and haptic device 1 is mounted on the body structure 10 and configured to detect the limb motion of the user. When the user moves, the motion detection unit 20 detects the limb motion, generates user movement gesture information, and transmits it to the control unit 30. The control unit 30 processes the received movement gesture information, generates a first control signal, and transmits it to the VR device 2, which uses it to control the limb motion of the virtual user, thereby realizing whole body gesture tracking. When the user moves in the VR environment and the virtual user is touched, the VR device 2 feeds back a touch signal; the control unit 30 generates a second control signal according to that touch signal, and the haptic control unit 40 feeds back a real haptic sensation to the user's limb through the body structure 10 according to the second control signal.
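The two signal paths above (motion out to the VR device, and touch feedback back in to the haptic unit) can be sketched as follows. The message fields and the `body_map` helper are invented for illustration; the patent does not specify a message format.

```python
from dataclasses import dataclass

@dataclass
class MotionSample:
    joint: str
    angle_deg: float

def motion_path(samples):
    """Control unit turns detected joint angles into a first control
    signal for the VR device (here, simply a dict the VR side can apply
    to the virtual user's pose)."""
    return {"type": "first_control",
            "pose": {s.joint: s.angle_deg for s in samples}}

def haptic_path(touch_signal, body_map):
    """Control unit turns a VR touch signal into a second control signal
    addressed to the valve pair covering the touched body location."""
    x, y = touch_signal["position"]
    region = body_map((x, y))
    return {"type": "second_control", "region": region, "action": "tap"}

signal = motion_path([MotionSample("elbow", 42.0)])
cmd = haptic_path({"position": (0.1, 0.6)},
                  body_map=lambda p: "left_forearm")  # stubbed mapping
```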
Specifically, as shown in fig. 5, when the system is used for the first time, virtual-real body calibration is performed; that is, the physical dimensions, initial posture, and hand and foot end-point spatial coordinates of the user's body are matched to those of the virtual user. The physical dimensions of the body are obtained by taking a whole-body photograph with the VR device 2. The initial posture is whatever posture the user adopts when starting the system, and the system records it through whole body posture tracking. During subsequent tracking, the joint angles of the initial posture serve as the initial points, and the spatial coordinates of the hand and foot ends can be calculated by kinematics.
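As a concrete illustration of computing hand-end coordinates from joint angles by kinematics, a planar two-link forward-kinematics sketch follows. The link lengths are placeholder values, and a real implementation would use the 3-D pose of every tracked joint measured from the calibrated initial posture.

```python
import math

def forearm_tip(shoulder_deg, elbow_deg, upper_len=0.30, fore_len=0.25):
    """Planar two-link forward kinematics: hand-end coordinates from
    shoulder and elbow angles (degrees), link lengths in metres."""
    a1 = math.radians(shoulder_deg)
    a2 = math.radians(shoulder_deg + elbow_deg)
    x = upper_len * math.cos(a1) + fore_len * math.cos(a2)
    y = upper_len * math.sin(a1) + fore_len * math.sin(a2)
    return x, y

# Upper arm straight out along x, forearm bent 90 degrees upward:
x, y = forearm_tip(0.0, 90.0)
# x is approximately 0.30 and y approximately 0.25
```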
After the user performs the virtual-real body calibration, whole body posture tracking begins. Fig. 7 is a schematic diagram of human elbow-joint posture tracking; as shown in fig. 7, for the elbow joint the motion detection units 20 are attached to the outside of the triceps brachii and the outside of the biceps brachii, respectively. Fig. 8 is a schematic block diagram of motion detection; as shown in fig. 8, each motion detection unit 20 includes a first detection module 21 and a second detection module 22, where the first detection module 21 includes an electromyographic signal electrode 211 and a first band-pass amplifier 212, and the second detection module 22 includes a skin surface tension strain gauge 221 and a second band-pass amplifier 222. When a muscle contracts, the electromyographic signal electrode 211 detects an electromyographic current; the greater the degree of contraction, the greater the current, so the degree of contraction can be inferred from the current magnitude and the joint rotation angle calculated from it. The motion of the virtual user can then be controlled through the motion angles of all the joints of the human body, completing whole body gesture tracking. Meanwhile, the skin surface tension strain gauge 221 detects the change in skin stress during muscle contraction and relaxation: when the muscle contracts, the skin is squeezed and the strain gauge registers a pressure; when the muscle expands, the skin is under surface tension and the strain gauge registers a tensile force. This skin-stress current can be combined with the electromyographic signal current to calculate the joint rotation angle more accurately, thereby completing whole body gesture tracking.
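One simple way to combine the two channels, as the paragraph suggests, is to normalise each signal to a contraction degree and blend them into a single flexion estimate. The gains, full-scale values, and 140-degree range below are invented for illustration; the patent leaves the first preset algorithm unspecified.

```python
def joint_angle_deg(emg_uV, strain_uE, emg_gain=0.9, strain_gain=0.1,
                    emg_max_uV=500.0, strain_max_uE=800.0, range_deg=140.0):
    """Illustrative sensor fusion: EMG amplitude (microvolts) and skin
    strain (microstrain) are each normalised to [0, 1] (full relaxation
    to full contraction) and blended into one elbow flexion angle over
    an assumed 140-degree range."""
    emg_est = min(max(emg_uV / emg_max_uV, 0.0), 1.0)
    strain_est = min(max(strain_uE / strain_max_uE, 0.0), 1.0)
    fused = emg_gain * emg_est + strain_gain * strain_est
    return fused * range_deg

# Half-scale readings on both channels give half the flexion range:
angle = joint_angle_deg(250.0, 400.0)  # 70.0 degrees
```

A production system would calibrate the full-scale values per user during the virtual-real body calibration step rather than hard-coding them.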
As shown in fig. 3, when a user wants the virtual user's arm to push an object in the virtual scene, the user swings an arm in the real environment, and the virtual user pushes the object in the virtual scene following the user's action. In a real scene, we feel a reaction force on the hand or arm when pushing an object, and the present disclosure aims to reproduce this effect in the virtual scene. When the user swings the arm against the movement shown in fig. 3 to push an object, the air pressure in the hollow conduit on the outer side of the upper arm is reduced, generating a contraction force F1 in the conduit; as the forearm swings, the arm feels a reaction force F2 under the action of F1. In other words, the user feels the reaction force borne by the virtual user, thereby realizing virtual touch.
As shown in fig. 6, when the virtual user is hit, the user feels pain through the following process: after the virtual user is hit, the VR device 2 records the body coordinates of the hit on the virtual user and transmits this position information to the haptic control unit 40, and the haptic control unit 40 adjusts the pressure at the corresponding position on the user's body to produce a slight pain.
Specifically, as shown in fig. 4, when the user plays a fighting game and a virtual opponent lands a hit on the virtual user, the real user should also feel pain so that the virtual experience more closely resembles the real scene. When the virtual user's body is hit, the corresponding hit point on the real user's skin surface is struck: the air pressure in the hollow conduit on both sides of the first switch module 41 and the second switch module 42 at the hit point is reduced, the first switch module 41 and the second switch module 42 come under the tensile forces F1 and F2, and the touch module 43 strikes the hit point on the skin surface under the pressure F3, so that the user feels a slight pain and virtual pain is realized.
In a second aspect, embodiments of the present disclosure further provide a virtual reality system. Fig. 9 is a schematic diagram of the virtual reality system; as shown in fig. 9, the system includes the above-described whole body gesture tracking and haptic device 1 and a VR device 2, with the VR device 2 communicatively connected to the whole body gesture tracking and haptic device 1.
In order to make the specific working principle of the virtual reality system provided by the embodiments of the present disclosure clearer, the following description is made with reference to specific examples.
In one example, referring to fig. 9, the virtual reality system provided by the present disclosure includes a whole body gesture tracking and haptic device 1 and a VR device 2. The whole body gesture tracking and haptic device 1 mainly performs whole body gesture tracking and the virtual haptic function, while the VR device 2 mainly performs conventional six-degree-of-freedom head tracking. Whole body posture tracking is mainly performed by the motion detection units 20 arranged on the outer surface of the skeletal muscles of the whole body; each motion detection unit 20 mainly comprises an electromyographic signal electrode 211 and a skin surface tension strain gauge 221. When the human body moves, the electromyographic signal electrode 211 and the skin surface tension strain gauge 221 generate weak currents, which are filtered and amplified by the band-pass amplifiers, converted into digital signals by the analog-to-digital conversion module 31, and processed through MAP to convert the current values into the corresponding joint angles, thereby realizing whole body posture tracking. Virtual touch is realized by the close-fitting suit woven from hollow fiber tubes: the suit supports local pressure control, each local region is governed by a separate switch module, and the suction assembly 44 adjusts the local pressure by admitting or discharging gas, so that virtual touch is achieved and virtual pain is produced by the touch module 43. Meanwhile, an IMU can record the posture information of the main control box.
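The chain from weak current through band-pass amplification and analog-to-digital conversion to the MAP step and joint angle can be sketched as a quantisation step followed by a table lookup. The 12-bit resolution, 3.3 V reference, and table entries are assumptions, and "MAP" is interpreted here as a piecewise-linear mapping, which the text does not spell out.

```python
def adc_convert(voltage, v_ref=3.3, bits=12):
    """Quantise the band-pass-amplified analog signal to an ADC code
    (assumed 12-bit, 3.3 V full scale)."""
    clipped = max(0.0, min(voltage, v_ref))
    return round(clipped / v_ref * ((1 << bits) - 1))

def map_to_angle(code, table):
    """Piecewise-linear map from ADC code to joint angle, standing in
    for the 'MAP' processing named in the text (table values invented)."""
    (c0, a0), (c1, a1) = table[0], table[-1]
    for (lo_c, lo_a), (hi_c, hi_a) in zip(table, table[1:]):
        if lo_c <= code <= hi_c:
            t = (code - lo_c) / (hi_c - lo_c)
            return lo_a + t * (hi_a - lo_a)
    return a0 if code < c0 else a1  # clamp out-of-range codes

# Calibration table: (ADC code, joint angle in degrees)
TABLE = [(0, 0.0), (2048, 70.0), (4095, 140.0)]
angle = map_to_angle(adc_convert(1.65), TABLE)
```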
The control unit 30 of the whole body posture tracking and haptic device 1 is communicatively connected with the VR device 2: the control unit 30 transmits the user's body posture information to the VR device 2 for processing, and the VR device 2 transmits the coordinates of the touch point back to the haptic control unit 40, thereby realizing the virtual haptic function. In this example, the first switch module 41 and the second switch module 42 are valves, the touch module 43 is a striking hammer, and the suction assembly 44 is an air pump.
In some examples, the whole body gesture tracking and haptic device 1 and the VR device 2 are connected by Wi-Fi or Bluetooth, thereby enabling information interaction.
In some examples, the VR device 2 comprises a VR headset, although the VR device 2 may also be any VR device worn on the user's head, such as VR glasses.
It is to be understood that the above embodiments are merely illustrative of the application of the principles of the present invention, but not in limitation thereof. Various modifications and improvements may be made by those skilled in the art without departing from the spirit and substance of the invention, and are also considered to be within the scope of the invention.

Claims (14)

1. A whole body posture tracking and haptic device, comprising: a main structure, at least one motion detection unit, a control unit, and at least one haptic control unit; wherein the main structure is configured to be worn on a limb by a user; the motion detection unit is installed on the main structure and is configured to detect the user's limb movements and generate user motion posture information; the control unit is configured to process the received motion posture information, generate a first control signal, and transmit the first control signal to a VR device so that the VR device can control the limb movements of a virtual user, and, after receiving a touch signal fed back by the VR device indicating that the virtual user has been touched, to generate a second control signal according to the touch signal; the haptic control unit is configured to feed back real tactile perception to the user's limb through the main structure according to the second control signal; the motion posture information includes the rotation angles of the motion joints; the motion detection unit includes a first detection module and a second detection module; the first detection module is configured to detect and process the electromyographic signal generated by the user's skeletal muscles; the second detection module is configured to detect and process a skin surface tension signal of the user; and the control unit is configured to obtain the rotation angles of the user's motion joints from the processed electromyographic signal and skin surface tension signal through a first preset algorithm, and to obtain the user's motion posture information from those rotation angles through a second preset algorithm, so as to generate the first control signal.

2. The whole body posture tracking and haptic device according to claim 1, wherein the first detection module comprises an electromyographic signal electrode and a first band-pass amplifier, and the second detection module comprises a skin surface tension strain gauge and a second band-pass amplifier; the electromyographic signal electrode is configured to detect the electromyographic signal generated by the user's skeletal muscles; the first band-pass amplifier is configured to amplify the electromyographic signal and transmit it to the control unit; the skin surface tension strain gauge is configured to detect the user's skin surface tension signal; and the second band-pass amplifier is configured to amplify the skin surface tension signal and transmit it to the control unit.

3. The whole body posture tracking and haptic device according to claim 1 or 2, wherein the control unit comprises an analog-to-digital conversion module, a first calculation module, and a first control module; the analog-to-digital conversion module is configured to convert the processed electromyographic signal and skin surface tension signal into a first digital signal and a second digital signal, respectively; the first calculation module is configured to obtain the rotation angles of the user's motion joints from the first digital signal and the second digital signal through the first preset algorithm, and to obtain the user's motion posture information through the second preset algorithm; and the first control module is configured to generate the first control signal according to the user's motion posture information and transmit it to the VR device so that the VR device can control the limb movements of the virtual user, and, after receiving the touch signal fed back by the VR device indicating that the virtual user has been touched, to generate the second control signal according to the touch signal.

4. The whole body posture tracking and haptic device according to claim 3, wherein the first calculation module is specifically configured to obtain the rotation angles of the user's motion joints from the first digital signal and the second digital signal through the first preset algorithm, and to calculate, according to a pre-established kinematic model, the spatial pose coordinates of the current position of each motion joint relative to its initial point, thereby obtaining the user's motion posture information; wherein the initial points are the angles of the user's motion joints in the initial posture.

5. The whole body posture tracking and haptic device according to claim 1, wherein the main structure is woven from a plurality of hollow conduits, and the haptic control unit comprises a first switch module, a second switch module, and a touch module; the first switch module and the second switch module are both connected to the hollow conduit and are used to control gas entering and leaving the hollow conduit according to the second control signal; and the touch module is arranged on the section of hollow conduit delimited by the first switch module and the second switch module and, when the user wears the main structure, is located on the side of the hollow conduit close to the human body.

6. The whole body posture tracking and haptic device according to claim 5, wherein the control unit comprises a second control module and a second calculation module; the second calculation module is configured to calculate the position where the user is actually touched according to the touch signal fed back by the VR device; and the second control module is configured to generate the second control signal according to the position where the user is actually touched.

7. The whole body posture tracking and haptic device according to claim 5, wherein the material of the hollow conduit comprises a fiber material.

8. The whole body posture tracking and haptic device according to claim 5, wherein the first switch module and the second switch module comprise valves.

9. The whole body posture tracking and haptic device according to claim 5, wherein the touch module comprises a striking hammer.

10. The whole body posture tracking and haptic device according to claim 5, wherein the haptic control unit further comprises a suction assembly configured to charge gas into the hollow conduit and to discharge gas from the hollow conduit.

11. The whole body posture tracking and haptic device according to claim 10, wherein the suction assembly is an air pump.

12. A virtual reality system, comprising the whole body posture tracking and haptic device according to any one of claims 1-11 and a VR device, the VR device being communicatively connected with the whole body posture tracking and haptic device.

13. The virtual reality system according to claim 12, wherein the whole body posture tracking and haptic device and the VR device are connected via Wi-Fi or Bluetooth.

14. The virtual reality system according to claim 12, wherein the VR device comprises a VR headset.
CN202210704377.XA 2022-06-21 2022-06-21 A whole body posture tracking and tactile device and virtual reality system Active CN115202471B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN202210704377.XA CN115202471B (en) 2022-06-21 2022-06-21 A whole body posture tracking and tactile device and virtual reality system
PCT/CN2023/091411 WO2023246305A1 (en) 2022-06-21 2023-04-28 Whole-body posture tracking and haptic device and virtual reality system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210704377.XA CN115202471B (en) 2022-06-21 2022-06-21 A whole body posture tracking and tactile device and virtual reality system

Publications (2)

Publication Number Publication Date
CN115202471A CN115202471A (en) 2022-10-18
CN115202471B true CN115202471B (en) 2025-03-14

Family

ID=83576875

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210704377.XA Active CN115202471B (en) 2022-06-21 2022-06-21 A whole body posture tracking and tactile device and virtual reality system

Country Status (2)

Country Link
CN (1) CN115202471B (en)
WO (1) WO2023246305A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115202471B (en) * 2022-06-21 2025-03-14 京东方科技集团股份有限公司 A whole body posture tracking and tactile device and virtual reality system
CN116009687A (en) * 2022-11-30 2023-04-25 北京京东方显示技术有限公司 Virtual display device and virtual display method
CN119045648A (en) * 2023-05-29 2024-11-29 北京京东方显示技术有限公司 Virtual reality interaction method, device, system, storage medium and electronic equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107943282A (en) * 2017-11-06 2018-04-20 上海念通智能科技有限公司 A kind of man-machine interactive system and method based on augmented reality and wearable device
CN113190114A (en) * 2021-04-14 2021-07-30 三峡大学 Virtual scene experience system and method with haptic simulation and emotional perception

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10353532B1 (en) * 2014-12-18 2019-07-16 Leap Motion, Inc. User interface for integrated gestural interaction and multi-user collaboration in immersive virtual reality environments
US10296086B2 (en) * 2015-03-20 2019-05-21 Sony Interactive Entertainment Inc. Dynamic gloves to convey sense of touch and movement for virtual objects in HMD rendered environments
US11106273B2 (en) * 2015-10-30 2021-08-31 Ostendo Technologies, Inc. System and methods for on-body gestural interfaces and projection displays
EP3452183A4 (en) * 2016-05-02 2020-01-15 Blue Goji LLC EXERCISE MACHINE WITH VARIABLE RESISTANCE AND WIRELESS COMMUNICATION TO CONTROL INTELLIGENT DEVICES AND INTERACTIVE SOFTWARE APPLICATIONS
CN106227339A (en) * 2016-08-16 2016-12-14 西安中科比奇创新科技有限责任公司 wearable device, virtual reality human-computer interaction system and method
CN107632699B (en) * 2017-08-01 2019-10-11 东南大学 Human-computer natural interaction system based on multi-sensory data fusion
CN115202471B (en) * 2022-06-21 2025-03-14 京东方科技集团股份有限公司 A whole body posture tracking and tactile device and virtual reality system


Also Published As

Publication number Publication date
CN115202471A (en) 2022-10-18
WO2023246305A1 (en) 2023-12-28


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant