WO2018149267A1 - Augmented reality-based display method and device - Google Patents
Augmented reality-based display method and device
- Publication number
- WO2018149267A1 (PCT/CN2018/073473)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- display
- virtual
- augmented reality
- display mode
- Prior art date
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
Definitions
- the embodiments of the present application relate to the field of augmented reality technologies, and in particular, to a display method and device based on augmented reality.
- Head mounted display is a new technology developed in recent years.
- the specific applications can be divided into virtual reality and augmented reality.
- Human 3D vision is experienced through the collaborative work of both eyes.
- the image seen by the left eye and the image seen by the right eye have subtle differences in perspective.
- the optic nerve of the human brain mixes the images of the left and right eyes to form a 3D vision.
- the head-mounted display can present different images to each eye by placing displays close in front of the eyes.
- the left eye can only see the left display area, and the right eye can only see the right display area, which can also simulate 3D vision. If, during this process, the head-mounted display simultaneously blocks the light and images of the external environment so that the eyes can only see the display screen, a virtual reality effect is produced.
- the principle of augmented reality is to simulate virtual vision through a head-mounted display, superimposed on the user's normal vision.
- the following figure is a typical implementation of an augmented reality head mounted display.
- the augmented reality head-mounted display has two implementations, optical see-through and video see-through; the main difference between them is the optical combiner.
- in the optical see-through implementation, the optical combiner of the head-mounted display is a semi-transparent mirror: part of the ambient light passes through the half mirror, while the virtual object shown on the display is projected onto the half mirror and reflected from its surface into the user's eyes, synthesizing the real and virtual worlds.
- in the video see-through implementation, the optical synthesis is achieved by a camera device and a display screen.
- the camera device captures the real environment, and the video data and the virtual object are superimposed by computer processing and presented to the user through the display screen.
- the system structure is basically the same as that of the virtual reality head-mounted display, and only needs to add a camera that captures the environment and a software module that synthesizes the real and virtual worlds.
- the optical see-through head-mounted display provides a better sense of presence and a better user experience because the user directly sees the real environment.
- the technical problem to be solved by the embodiment of the present application is to provide a display method and device based on augmented reality, which has a sense of presence, a large display area, and privacy.
- the embodiment of the present application provides a display method based on augmented reality, including: emitting a first ray containing a virtual image, the virtual image being an image received from an external device; acquiring a second ray containing a live image of an external scene; and synthesizing the first ray containing the virtual image with the second ray containing the live image of the external scene.
- different from the prior art, the augmented reality-based display method combines an external real image with a virtual image; the virtual image can provide the user with a virtual display screen having a large display range or a virtual mouse and keyboard, and the virtual image can be used in conjunction with a real physical screen, mouse, keyboard, touch screen, buttons, etc., providing a large field of view and privacy.
- FIG. 1a is a schematic structural diagram of a display device based on augmented reality provided by Embodiment 1 of the present application;
- Figure 1b is a schematic view of the see-through light guiding element shown in Figure 1a when it is placed on the head frame;
- Figure 1c is a first relationship diagram between a side view angle and a display brightness of the display module shown in Figure 1a;
- Figure 1d is a second relationship diagram between the side view angle and the display brightness of the display module shown in Figure 1a;
- Figure 1e is a third relationship diagram between a side view angle and a display brightness of the display module shown in Figure 1a;
- FIG. 2a is a schematic diagram showing a positional relationship between a display module and a user's face when the augmented reality-based display device shown in FIG. 1a is worn;
- Figure 2b is a schematic view showing the rotation of the display module shown in Figure 1a;
- FIG. 3 is a schematic diagram of an imaging principle of the augmented reality based display device shown in FIG. 1a;
- FIG. 4 is a schematic view of the augmented reality based display device shown in FIG. 1a when a diopter correction lens is provided;
- FIG. 5 is a schematic diagram showing the distance relationship between the diagonal field of view area and the farthest end of the head frame to the foremost end of the user's head of the augmented reality display device shown in FIG. 1a;
- FIG. 6 is a schematic diagram of the augmented reality based display device shown in FIG. 1a connected to an external device;
- FIG. 7 is a schematic structural diagram of a display device based on augmented reality provided by Embodiment 2 of the present application.
- FIG. 8 is a schematic diagram of the augmented reality based display device shown in FIG. 7 connected to an external device;
- FIG. 9 is still another schematic diagram of the augmented reality based display device shown in FIG. 7 connected to an external device;
- FIG. 10 is a schematic diagram of the operation of the augmented reality based display device shown in FIG. 7;
- FIG. 11 is a schematic diagram of a first display mode in an augmented reality based display method according to a third embodiment of the present application.
- FIG. 12 is a schematic diagram of a second display mode in an augmented reality based display method according to a third embodiment of the present application.
- FIG. 13 is a first application example diagram of a display method based on augmented reality provided by a third embodiment of the present application.
- FIG. 14 is a schematic diagram of a second application example in a display method based on augmented reality provided by a third embodiment of the present application;
- FIG. 15 is a schematic diagram of a third application example in a display method based on augmented reality provided by a third embodiment of the present application.
- an augmented reality based display device provided by the embodiment of the present application has a total weight of less than 350 grams and includes: a head frame 11, two display modules 12, and two see-through light guiding elements 13.
- the see-through light guiding element 13 is a partially transmissive, partially reflective optical synthesizing device.
- the display module 12 and the see-through light guiding elements 13 are all disposed on the head frame 11.
- the head frame 11 fixes the display module 12 and the see-through light guiding element 13.
- the display module 12 is disposed on the upper side of the see-through light guiding element 13, and the light emitted by the display module 12 can be reflected after passing through the see-through light guiding element 13.
- the display module 12 may also be located on the side of the see-through light guiding element 13.
- the augmented reality-based display device further includes a main board 17 disposed on the head frame 11 and located between the two display modules 12.
- the main board 17 is provided with a processor for processing a virtual image signal and displaying the virtual image information on the display module 12.
- the head frame 11 is used for wearing on the head of the user, and each of the see-through light guiding elements 13 has a concave surface which is disposed toward the eyes of the user.
- the first light reflected by the concave surface of one see-through light guiding element 13 enters the left eye of the user, and the other first light reflected by the concave surface of the other see-through light guiding element 13 enters the right eye of the user, so that the vision of a 3D virtual scene is formed in the user's mind.
- the first light is emitted by the display module 12, and the first light includes virtual image information of the left eye and the right eye.
- two see-through light guiding elements 13 are disposed on the head frame 11 and are independently embedded in the head frame 11, respectively.
- alternatively, two regions corresponding to the left and right eyes of the user may be formed on the material used to fabricate the see-through light guiding element, each region having the same shape and size as each see-through light guiding element 13 when independently disposed as described above, and achieving the same effect; the final result is one large see-through light guiding element provided with two areas corresponding to the left and right eyes of the user.
- in this case, the two see-through light guiding elements 13 are integrally formed.
- the see-through light guiding elements provided corresponding to the left and right eye regions of the user are embedded in the head frame 11.
- the display module 12 is detachably mounted on the head frame 11, for example, when the display module is an intelligent display terminal such as a mobile phone or a tablet computer; or the display module is fixedly mounted on the head frame, for example, when the display module and the head frame are of an integrated design.
- Two display modules 12 can be mounted on the headgear frame 11.
- the left eye and the right eye of the user are respectively provided with a display module 12, for example, one display module 12 is configured to emit a first light containing virtual image information of the left eye, and A display module 12 is configured to emit another first light that includes virtual image information of the right eye.
- the two display modules 12 are respectively located above the two see-through light guiding elements 13 in a one-to-one correspondence.
- when the augmented reality-based display device is worn, the two display modules 12 are respectively located one-to-one correspondingly above the user's left and right eyes.
- the display module 12 can also be located on the side of the see-through light guiding element, that is, the two see-through light guiding elements are located between the two display modules; when the augmented reality-based display device is worn on the user's head, the two display modules are respectively located one-to-one correspondingly at the sides of the user's left and right eyes.
- a single display module 12 can also be mounted on the headgear frame 11.
- the single display module 12 has two display areas, one for emitting a first ray containing left-eye virtual image information and the other for emitting another first ray containing right-eye virtual image information.
- the display module includes, but is not limited to, an LCD (Liquid Crystal Display), an OLED (Organic Light-Emitting Diode), an LCOS (Liquid Crystal On Silicon), or the like.
- the horizontal axis represents the side view angle and the vertical axis represents the display brightness.
- when the display module 12 is an LCD, the brightness of the display module 12 varies with the viewing angle of the observer.
- for a general-purpose LCD, the side view angle at which the display brightness falls to 50% is generally large.
- when the LCD is applied to an augmented reality display system, an LCD with a small side view angle is more suitable, and the brightness of such a display module 12 is concentrated in the angular area near the center. Because the augmented reality display system mainly uses the angular area near the center, the brightness of the first light and the second light projected into the user's eyes will be relatively high. Referring to Fig. 1d, for an LCD applied in an augmented reality display system, the side view angle at which the display brightness falls to 50% is generally small. Moreover, the brightness distribution of the first light and the second light emitted by such an LCD is symmetric about the 0 degree side view angle, and the side view angle is less than 60 degrees.
- at a side view angle of 0 degrees, the display brightness of the first light and the second light emitted by the display module 12 is at its maximum; as the user's viewing angle shifts to either side, the display brightness gradually decreases, reaching 0 when the side view angle exceeds 60 degrees.
- alternatively, the brightness distribution of the first light and the second light emitted by the LCD applied in the augmented reality display system may not be symmetric about the 0 degree side view angle, and the side view angle at which the brightness is greatest may not be 0 degrees.
- two display modules 12 are respectively located in a one-to-one correspondence above the two see-through light guiding elements 13.
- the display module 12 forms an angle a with the frontal plane of the user's head; the angle a ranges from 0 to 180 degrees and is preferably an obtuse angle.
- the projection of the display module 12 on the horizontal plane is perpendicular to the normal plane.
- the position of the see-through light guiding element 13 can be rotated by an angle b around an axis of rotation perpendicular to the horizontal plane; the angle b ranges from 0 degrees to 180 degrees, preferably from 0 degrees to 90 degrees.
- the see-through light guiding elements 13 corresponding to the left and right eyes can be adjusted in pitch by the mechanical structure on the head frame 11 to accommodate the user's interpupillary distance, ensuring comfort and imaging quality in use.
- the farthest distance between the edges of the two see-through light guiding elements 13 is less than 150 mm; that is, the distance from the left edge of the see-through light guiding element 13 corresponding to the left eye to the right edge of the see-through light guiding element 13 corresponding to the right eye is less than 150 mm.
- the display modules 12 are connected by a mechanical structure, and the distance between the display modules 12 can also be adjusted, or the same effect can be achieved by adjusting the position of the display content on the display module 12.
- the headgear frame 11 may be an eyeglass frame structure for hanging on the ear and nose of the user, on which the nose pad 111 and the temple 112 are disposed, and the nose pad 111 and the temple 112 are fixed to the user's head.
- the temple 112 is a foldable structure, wherein the nose pad 111 is correspondingly fixed on the nose bridge of the user, and the temple 112 is correspondingly fixed on the user's ear.
- the temples 112 can also be connected by an elastic band, and the elastic band tightens the temples when worn to help the frame to be fixed at the head.
- the nose pad 111 and the temple 112 are telescopic mechanisms that adjust the height of the nose pad 111 and the telescopic length of the temple 112, respectively.
- the nose pad 111 and the temple 112 can also be of a detachable structure, and the nose pad 111 or the temple 112 can be replaced after disassembly.
- the head frame 11 may include a nose pad and a stretch rubber band that is fixed to the user's head by a nose pad and a stretch rubber band; or only a stretch rubber band that is fixed to the user's head by the stretch rubber band.
- the headgear frame 11 may also be a helmet-type frame structure for wearing on the top and nose of the user's head.
- since the main function of the head frame 11 is to be worn on the user's head and to provide support for the optical and electrical components such as the display module 12 and the see-through light guiding element 13, the head frame includes, but is not limited to, the above manners; under the premise of achieving the above main effects, those skilled in the art can make some modifications to the head frame according to the needs of practical applications.
- the display module 12 emits a first light ray 121 containing left-eye virtual image information, and the first light ray 121 reflected by the concave surface 131 of the see-through light guiding element 13 enters the user's left eye 14; similarly, the display module emits another first light containing right-eye virtual image information, and the other first light reflected by the concave surface of the other see-through light guiding element enters the user's right eye, thereby forming the visual perception of a 3D virtual scene in the user's brain.
- compared with the prior art in which a small display screen is directly disposed in front of the user's eye, resulting in a small visual area, here the first light emitted by the display modules is reflected by the two see-through light guiding elements into the user's two eyes respectively, and the visual area is large.
- each of the see-through light guiding elements 13 further has a convex surface disposed opposite to the concave surface; the second light containing external image information, transmitted through the convex and concave surfaces of the see-through light guiding element 13, enters the user's eyes to form a visual blend of the 3D virtual scene and the real scene.
- a see-through light guiding element 13 further has a convex surface 132 disposed opposite the concave surface 131, and the second light 151 containing external image information transmitted through the convex surface 132 and the concave surface 131 of the see-through light guiding element 13 enters the user's left eye 14.
- similarly, the other see-through light guiding element further has a convex surface disposed opposite to its concave surface, and the second light containing external image information transmitted through the convex and concave surfaces of that see-through light guiding element enters the user's right eye.
- the user can thus see the real scene of the outside world, forming a visual experience in which the 3D virtual scene and the real scene are mixed.
- a diopter correcting lens 16 is disposed between the human eye and the see-through light guiding element 13, the diopter correcting lens 16 being disposed perpendicular to the horizontal plane.
- the plane of the diopter correction lens may also be at an angle of 30 degrees to 90 degrees from the horizontal plane.
- different degrees of diopter correcting lenses may be arbitrarily set.
- the display module 12 emits a first light ray 121 containing left-eye virtual image information; the first light ray 121 reflected by the concave surface 131 of the see-through light guiding element 13, and the second light ray 151 containing external image information transmitted through the convex surface 132 and the concave surface 131 of the see-through light guiding element 13, pass through the diopter correction lens 16 before entering the user's left eye 14.
- the diopter correction lens 16 is a concave lens; the first light 121 and the second light 151 passing through it are diverged, so that the focal points of the first light 121 and the second light 151 on the left eye 14 are shifted back.
- the diopter correction lens 16 can also be a convex lens, which converges the first light ray 121 and the second light ray 151 passing through it, so that the focal points of the first light 121 and the second light 151 on the left eye 14 are advanced.
- similarly, the display module emits another first light containing right-eye virtual image information; the other first light reflected by the concave surface of the other see-through light guiding element, and the second light transmitted through the convex and concave surfaces of that see-through light guiding element, also pass through a diopter correction lens.
- with the user's eyeball as the apex, the edges on both sides of the virtual display area of the virtual image seen through the see-through light guiding element 13 form a diagonal field of view region.
- the distance from the farthest end of the head frame to the contact position with the foremost end of the head is c, and the distance length of the c can be adjusted as needed.
- the angular extent of the diagonal field of view region is inversely proportional to the distance from the most distal end of the head frame 11 to the contact position with the foremost end of the head.
- the distance from the farthest end of the head frame to the contact position with the foremost end of the head is less than 80 mm under the premise that the diagonal field of view area is greater than 55 degrees.
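- as a rough illustration (an idealized pinhole model, not taken from the patent), the trade-off between the diagonal field of view and this distance can be written with s denoting the diagonal extent of the virtual display area and d the effective distance from the eye to that area:

```latex
\theta_{\mathrm{diag}} = 2\arctan\!\left(\frac{s}{2d}\right)
```

- under this model, the diagonal field of view shrinks as d grows, matching the inverse relationship stated above; the folded optical path of the actual device changes the constants but not the trend.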
- the display module 12 is connected to the main board 17 by a cable.
- the main board 17 is also provided with a video interface, a power interface, a communication chip, and a memory.
- the video interface is used to connect a computer, a mobile phone, or other device to receive a video signal.
- the video interface may be: HDMI, DisplayPort, Thunderbolt, USB Type-C, Micro USB, MHL (Mobile High-Definition Link), or the like.
- the processor is configured to process data; it mainly decodes the video signal and displays it on the display module 12.
- the power interface is used for external power supply or battery power supply.
- the power interface includes a USB interface or other interfaces.
- the communication chip is configured to exchange data with the outside world through a communication protocol; specifically, it connects to the Internet through a communication protocol such as WiFi, WCDMA, or TD-LTE, and then acquires data through the Internet or connects with other augmented reality-based display devices; or it connects directly to other augmented reality-based display devices through a communication protocol.
- the memory is used for storing data, and is mainly used for storing display data displayed in the display module 12.
- An earphone interface, a sound card chip, or other sound-generating device can also be provided on the main board 17.
- the earbud connector is used to connect the earbuds and transmit audio signals to the earbuds.
- the sound card chip is used to parse the sound signal.
- a speaker can also be disposed on the head frame 11 of the display device based on the augmented reality, and the sound signal parsed by the sound card chip is converted into sound.
- when the augmented reality-based display device includes only the head frame 11, the two display modules 12, the two see-through light guiding elements 13, and the main board 17 as described above, all 3D virtual scene rendering and the generation of the images corresponding to both eyes are performed in an external device connected to the display device.
- the external device includes: a computer, a mobile phone, a tablet computer, and the like.
- the display device based on the augmented reality receives the video signal of the external device through the video interface, and displays it on the display module 12 after decoding.
- the interaction with the user is performed by application software on an external device such as a computer, mobile phone, or tablet computer, and the user can interact with the augmented reality-based display device using the mouse, keyboard, touchpad, or buttons on the external device.
- Augmented reality based display devices can project a display screen at a fixed location within the user's field of view. The user needs to adjust the size, position, and the like of the projection screen through software on the device connected to the augmented reality based display device.
- the display device based on the augmented reality provided by the embodiment of the present application reflects the first rays containing the left-eye virtual image information and the right-eye virtual image information into the user's two eyes via the concave surfaces of the two see-through light guiding elements, forming a visual experience of a 3D virtual scene in the user's brain, with a large visual area.
- a plurality of sensors are disposed to perform sensing on a surrounding environment.
- the total weight of the display device based on the augmented reality is less than 350 grams, and it includes: a head frame 21, two display modules 22, two see-through light guiding elements 23, and a main board 24.
- the display module 22, the see-through light guiding element 23 and the main board 24 are all disposed on the head frame 21.
- the head frame 21 fixes the display module 22, the see-through light guiding element 23 and the main board 24.
- the display module 22 is disposed on the upper side of the see-through light guiding element 23, and the light emitted by the display module 22 can be reflected by the see-through light guiding element 23.
- the main board 24 is located between the two display modules 22.
- the main board 24 is provided with a processor for processing virtual image signals and displaying the virtual image information on the display module 22.
- the specific functions, structures, and positional relationships of the head frame 21, the two display modules 22, the two see-through light guiding elements 23, and the main board 24 are the same as those of the head frame 11, the two display modules 12, the two see-through light guiding elements 13, and the main board 17 described in Embodiment 1, and will not be described again here.
- a diopter correcting lens is disposed between the human eye and the see-through light guiding element 23, the diopter correcting lens being disposed perpendicular to the horizontal plane.
- different degrees of diopter correcting lenses may be arbitrarily set.
- the head frame 21 is further provided with a monocular camera 211, a binocular/multi-view camera 212, an eyeball tracking camera 213, a gyroscope 214, an accelerometer 215, a magnetometer 216, a depth of field sensor 217, an ambient light sensor 218, and/or a distance sensor 219.
- the monocular camera 211, the binocular/multi-view camera 212, the eyeball tracking camera 213, the gyroscope 214, the accelerometer 215, the magnetometer 216, the depth of field sensor 217, the ambient light sensor 218, and/or the distance sensor 219 are all electrically connected to the main board 24.
- the monocular camera 211 is a color monocular camera placed at the front of the head frame 21.
- when the device is worn, the monocular camera 211 faces away from the user's face, and the camera can be used to take photos.
- the camera can be used, with computer vision technology, to detect markers with known positions in the environment to assist the positioning of the augmented reality-based display device.
- the monocular camera 211 can also be a high-resolution camera for taking photos or videos; the captured video can further be overlaid in software with the virtual objects seen by the user, so as to reproduce the content that the user sees through the augmented reality-based display device.
- the binocular/multi-view camera 212 may be a monochrome or color camera disposed on the front or side of the head frame 21, located on one side, both sides, or around the monocular camera 211. Further, the binocular/multi-view camera 212 may be provided with an infrared filter. Using a binocular camera, depth of field information can further be obtained for the image on the basis of the environment image. Using a multi-view camera, the camera's viewing angle can be further expanded to obtain more environment image and depth of field information.
- the environment image and distance information captured by the binocular/multi-view camera 212 can be used to: (1) fuse with the data of the gyroscope 214, the accelerometer 215, and the magnetometer 216 to calculate the pose of the augmented reality-based display device; (2) capture user gestures, palm prints, etc. for human-computer interaction.
- each of the above-mentioned monocular camera or binocular/multi-view camera may be one of an RGB camera, a monochrome camera or an infrared camera.
- the eyeball tracking camera 213 is disposed on one side of the see-through light guiding element 23, and when the user wears the augmented reality based display device, the eyeball tracking camera 213 faces the side opposite to the user's face.
- the eyeball tracking camera 213 is used to track the focus of the human eye and to apply special processing to the specific part of the virtual object or virtual screen that the human eye is looking at. For example, specific information about an object is automatically displayed next to the object the human eye is watching.
- in the area where the human eye is gazing, a high-definition virtual object image can be displayed, while only a low-definition image is displayed in other areas; this can effectively reduce the amount of image rendering computation without affecting the user experience.
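- a minimal sketch of this foveated-rendering idea (not the patent's implementation; Python with NumPy, with the gaze point, region size, and downsampling factor as assumptions):

```python
import numpy as np

def foveate(frame: np.ndarray, gaze_xy: tuple, radius: int = 120, scale: int = 4) -> np.ndarray:
    """Keep the gazed-at region at full resolution; render the rest downsampled.

    frame   -- H x W x 3 image already rendered at full resolution
    gaze_xy -- (x, y) gaze point reported by the eye-tracking camera
    radius  -- half-size in pixels of the high-definition region (assumed)
    scale   -- downsampling factor for the peripheral region (assumed)
    """
    h, w = frame.shape[:2]
    # Cheap peripheral image: downsample, then upsample by nearest neighbour.
    low = frame[::scale, ::scale]
    periphery = np.repeat(np.repeat(low, scale, axis=0), scale, axis=1)[:h, :w]
    out = periphery.copy()
    x, y = gaze_xy
    x0, x1 = max(0, x - radius), min(w, x + radius)
    y0, y1 = max(0, y - radius), min(h, y + radius)
    # Paste the full-resolution patch where the eye is actually looking.
    out[y0:y1, x0:x1] = frame[y0:y1, x0:x1]
    return out
```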
- the gyroscope 214, the accelerometer 215, and the magnetometer 216 are disposed between the two display modules 22.
- the relative pose between the user's head and the initial position of the system can be obtained by fusing the data of the gyroscope 214, the accelerometer 215, and the magnetometer 216.
- the raw data of these sensors can be further fused with the data of the binocular/multi-view camera 212 to obtain the position and attitude of the augmented reality based display device in a fixed environment.
- the depth of field sensor 217 is disposed at the front of the head frame 21 and can directly obtain depth information in the environment. Compared to the binocular/multi-view camera 212, the depth of field sensor can obtain more accurate, higher-resolution depth data. Similarly, these data can be used to: (1) fuse with the data of the gyroscope 214, the accelerometer 215, and the magnetometer 216 to calculate the pose of the augmented reality-based display device; (2) capture user gestures, palm prints, etc. for human-computer interaction; (3) detect three-dimensional information of objects around the user.
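- as an illustration of the gyroscope/accelerometer/magnetometer fusion mentioned above, here is a minimal complementary-filter sketch in Python (one common technique; the patent does not specify the fusion algorithm, and the 0.98 blend weight is an assumption):

```python
import numpy as np

def fuse_orientation(prev_angles, gyro_rates, accel, mag, dt, alpha=0.98):
    """Estimate (roll, pitch, yaw) in radians with a complementary filter.

    prev_angles -- previous (roll, pitch, yaw) estimate as a np.array
    gyro_rates  -- angular rates (rad/s) from the gyroscope, np.array
    accel       -- 3-axis accelerometer reading (any consistent unit)
    mag         -- 3-axis magnetometer reading
    dt          -- time step in seconds
    alpha       -- trust in gyro integration vs. the absolute references
    """
    # Short-term estimate: integrate the gyro rates.
    gyro_angles = prev_angles + gyro_rates * dt

    # Long-term references: gravity gives roll/pitch, magnetic north gives yaw.
    ax, ay, az = accel
    roll_ref = np.arctan2(ay, az)
    pitch_ref = np.arctan2(-ax, np.hypot(ay, az))
    mx, my, _ = mag
    yaw_ref = np.arctan2(-my, mx)  # simplified: ignores tilt compensation

    refs = np.array([roll_ref, pitch_ref, yaw_ref])
    # Blend: gyro tracks fast motion, accel/mag cancel the gyro's drift.
    return alpha * gyro_angles + (1.0 - alpha) * refs
```

- calling this on every IMU sample yields a drift-corrected head orientation, which can then be refined with the camera or depth data as described above.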
- the ambient light sensor 218 is disposed on the head frame 21, and can monitor the intensity of ambient light in real time.
- the display device based on the augmented reality adjusts the brightness of the display module 22 in real time according to the change of the ambient light to ensure the consistency of the display effect under different ambient light.
- the distance sensor 219 is disposed at a position where the augmented reality based display device is in contact with the user's face for detecting whether the augmented reality based display device is worn on the user's head. If the user removes the display device based on the augmented reality, the power can be saved by turning off the display module 22, the processor, or the like.
- the augmented reality-based display device further includes: an infrared/near-infrared LED electrically connected to the main board 24, wherein the infrared/near-infrared LED provides a light source for the binocular/multi-view camera 212. Specifically, the infrared/near-infrared LED emits infrared light; when the infrared light reaches an object within the capture range of the binocular/multi-view camera 212, the object reflects the infrared light, and the photosensitive element on the binocular/multi-view camera 212 receives the reflected infrared light, converts it into an electrical signal, and then performs imaging processing.
- the operations that the augmented reality-based display device can perform when performing human-computer interaction include the following:
- Augmented reality based display devices can project a display screen at a fixed location within the user's field of view. The user can adjust the size, position, and the like of the projection screen through sensors on the augmented reality based display device.
- a remote control with buttons, a joystick, a touchpad, etc. can also be connected to the augmented reality-based display device in a wired or wireless manner as a human-computer interaction interface.
- by adding an audio decoding and power amplifier chip to the main board and integrating an earphone jack, earbuds, or a speaker, together with a microphone, the user can interact with the augmented reality-based display device using voice.
- a video interface and a processor are provided on the motherboard.
- when the augmented reality-based display device includes the head frame 21, the two display modules 22, the two see-through light guiding elements 23, the main board 24, and the plurality of sensors as described above, all of the 3D virtual scene rendering, the generation of images corresponding to both eyes, and the processing of data acquired by the plurality of sensors can be performed in an external device connected to the augmented reality-based display device.
- the external device includes: a computer, a mobile phone, a tablet computer, and the like.
- the display device based on the augmented reality receives the video signal of the external device through the video interface, and displays it on the display module 22 after decoding.
- the external device receives the data acquired by the plurality of sensors on the augmented reality-based display device and processes it, adjusting the images displayed to both eyes according to the data; the result is reflected in the images displayed on the display module 22.
- the processor on the augmented reality based display device is only used to support the transmission and display of video signals and the transfer of sensor data.
- a processor with strong computing power is disposed on the motherboard, and some or all of the computer vision algorithms are completed in the display device based on the augmented reality.
- the display device based on the augmented reality receives the video signal of the external device through the video interface, and displays it on the display module 22 after decoding.
- the external device receives data acquired by some of the sensors on the augmented reality-based display device and processes it, adjusting the images displayed to the two eyes according to the sensor data; the result is reflected in the images displayed on the display module 22.
- the data acquired by the remaining sensors is processed on an augmented reality based display device.
- for example, data acquired by the monocular camera 211, the binocular/multi-view camera 212, the gyroscope 214, the accelerometer 215, the magnetometer 216, and the depth of field sensor 217 are processed in the augmented reality-based display device, while data acquired by the eyeball tracking camera 213, the ambient light sensor 218, and the distance sensor 219 are processed in the external device.
- the processor on the augmented reality based display device is used to support the transmission and display of video signals, the processing of partial sensor data, and the transfer of remaining sensor data.
- alternatively, a high-performance processor and an image processor are provided on the main board, and all operations are performed in the augmented reality-based display device; the augmented reality display device then operates as a stand-alone system without the need to connect to an external device.
- in this case, the augmented reality-based display device processes the data acquired by the sensors, adjusts the images displayed to the two eyes, and displays them on the display module 22 after rendering.
- the processor on the augmented reality based display device is used for decoding processing and display of video signals and processing of sensor data.
- the concave surface of the see-through light guiding element is plated with a reflective film.
- the reflective surface of the see-through light guiding element coated with the reflective film has a reflectance of 20% to 80%.
- alternatively, the concave surface of the see-through light guiding element is plated with a polarizing reflective film, and the angle between the polarization direction of the polarizing reflective film and the polarization direction of the first light is greater than 70° and less than or equal to 90°; for example, when the polarization direction of the polarizing reflective film is perpendicular to the polarization direction of the first light, a reflectivity of approximately 100% for the first light is achieved. In addition, since the second light containing external image information is unpolarized, when the second light passes through the polarizing reflective film, nearly 50% of it enters the user's eyes, and the user can still see the real scene of the outside world.
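- these reflectivity and transmission figures follow from an idealized Malus's-law model (a sketch that ignores absorption and film leakage): with θ the angle between the transmission axis of the polarizing reflective film and the polarization direction of the first light, the reflected fraction of the polarized first light and the transmitted fraction of the unpolarized second light are

```latex
R_{\text{first}} = \sin^{2}\theta ,\qquad
R_{\text{first}}\big|_{\theta = 90^{\circ}} = 1 \approx 100\% ,\qquad
T_{\text{second}} = \left\langle \cos^{2}\phi \right\rangle_{\phi \in [0,\,2\pi)} = \tfrac{1}{2} \approx 50\% .
```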
- the convex surface of the see-through light guiding element is coated with an anti-reflection film.
- alternatively, the concave surface of the see-through light guiding element is provided with a voltage-sensitive reflective film; by changing the magnitude of the voltage applied to the film, its reflectivity can be adjusted between 0 and 100%. When the reflectivity of the film is 100%, the augmented reality-based display device can realize the function of virtual reality.
- further, a voltage-sensitive black sheet is provided on the other surface of the see-through light guiding element disposed opposite the concave surface, and the light transmittance of the black sheet can be adjusted by changing the magnitude of the voltage applied to it.
- the display device based on the augmented reality provided by the embodiment of the present application reflects the first rays containing the left-eye virtual image information and the right-eye virtual image information into the user's two eyes via the concave surfaces of the two see-through light guiding elements, forming a visual experience of a 3D virtual scene in the user's brain, with a large visual area.
- a plurality of sensors are arranged on the display device based on the augmented reality; after the sensors sense the surrounding environment, the perceived results can be reflected in the images displayed in the display module, so that the sense of presence and the user experience are better.
- this embodiment provides a display method based on augmented reality, including: emitting a first ray containing a virtual image; acquiring a second ray containing a live image of the external scene; and combining the first ray containing the virtual image with the second ray containing the live image of the external scene.
- the real-life image of the acquired external scene includes a real-life image of an environment in which the user wearing the augmented reality-based display device is located.
- for example, when the user is in a classroom, the acquired real-life images include the images of the students, desks, chairs, learning tools, and the like in the classroom formed by reflected light; when the user is in an office, the acquired real-life images include the images of objects such as desks, computers, keyboards, and mice formed by reflected light.
- the acquiring the virtual image includes: acquiring the virtual image displayed by the display module 12 after being processed by the processor.
- the virtual image is a virtual image of display data transmitted by a device connected to the augmented reality based display device, the virtual image including: a virtual display screen, a virtual mouse, a virtual keyboard, and the like.
- the virtual display screen is used to display data transmitted by an external device, network data acquired through the Internet, or data stored in a local storage.
- the two display modules 12 emit first light containing the display data of the virtual image, which is combined with the acquired second light containing the external scene image information; the two kinds of light are synthesized by the see-through light guiding elements 13 of the augmented reality based display device and fused within the user's eyes, and through human brain processing, the content of the display data of the virtual image is presented in front of the user's eyes in three dimensions. It can be understood that the augmented reality based display device projects the display data of the virtual image into the live image within the user's field of view.
- the first display mode is a display mode in which neither the relative angle nor the relative position between the virtual image and the real image is fixed.
- the second display mode is a display mode in which the relative angle between the virtual image and the real image and the relative position are fixed.
- the third display mode is a display mode in which the relative angle between the virtual image and the live image is fixed while the relative position is not fixed.
- for example, through the augmented reality based display device, the user projects a virtual screen and a virtual keyboard onto the table, facing the user and placed on the desktop in real space.
- when the user's head moves or rotates, the projected virtual screen and keyboard do not change their position in front of the user's eyes, while their position in real space changes accordingly; this display mode is called the "first display mode".
- the processor combines the second ray including the real scene image of the external scene with the first ray including the virtual image, and displays the image in the first display mode.
- the monocular camera 211 can be used, with computer vision technology, to detect markers with known positions in the environment to help the augmented reality-based display device perform positioning.
- the depth of field sensor 217 obtains depth of field information in the environment.
- the augmented reality based display device may further obtain the depth of field information on the acquired image by using the binocular/multi-view camera 212 on the basis of obtaining the environment image. Then, the display device based on the augmented reality processes the data obtained by the monocular camera 211, the depth of field sensor 217 or the binocular/multi-view camera 212, and the processor uses the computer vision technology to perform 3D modeling on the surrounding environment to identify the real environment in real time.
- the augmented reality based display device can analyze which spaces in the vicinity of the user can better project the virtual screen, virtual keyboard and other display content.
- the augmented reality based display device can also fuse the data obtained by the gyroscope 214, the accelerometer 215, and the magnetometer 216 with the image and depth of field data obtained by the monocular camera 211, the depth of field sensor 217, or the binocular/multi-view camera 212, to calculate the position and attitude of the augmented reality based display device in real space, that is, the relative position and angle relationship T between the coordinate systems F_H and F_I.
- for projected digital content whose position and attitude in the device coordinate system F_H are known, its position and angle in real space (F_I) can be obtained through T; conversely, for projected content fixed in real space, its position and attitude in the device coordinate system F_H can be calculated through the relationship T, and projected content such as a virtual screen is placed there.
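- a minimal sketch of how the relationship T between the device frame F_H and the real-space frame F_I can be applied, using 4x4 homogeneous transforms in Python (illustrative only; the patent does not prescribe this representation):

```python
import numpy as np

def make_T(R: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def to_world(T_world_from_device: np.ndarray, p_device: np.ndarray) -> np.ndarray:
    """Map a point given in the device frame F_H into the world frame F_I."""
    p = np.append(p_device, 1.0)
    return (T_world_from_device @ p)[:3]

def to_device(T_world_from_device: np.ndarray, p_world: np.ndarray) -> np.ndarray:
    """Map a world-frame (F_I) anchor point back into the device frame F_H.

    In the second display mode, the virtual screen keeps a fixed p_world;
    re-evaluating this every frame with the latest T keeps it world-locked.
    """
    p = np.append(p_world, 1.0)
    return (np.linalg.inv(T_world_from_device) @ p)[:3]
```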
- the augmented reality based display device can implement the "second display mode".
- for example, through the augmented reality based display device, the user projects a virtual screen and a virtual keyboard onto the table, facing the user and placed on the desktop in real space. When the second display mode is used, no matter how the user's head moves, the position of the projected virtual keyboard and screen in real space does not change, giving the user the illusion that the screen and keyboard are real objects placed on the desktop.
- the augmented reality-based display device uses the gyroscope, the accelerometer, and the magnetometer to obtain the relative angle between the user's head and the environment, with which the "third display mode" can be realized.
- in the third display mode, the relative angle is fixed, but the relative position can move.
- for example, through the augmented reality based display device, the user projects a virtual screen and a virtual keyboard onto the table, facing the user and placed on the desktop in real space. When the user's head rotates, the relative angle of the projected virtual keyboard and screen in real space does not change; but when the user moves, their relative position in real space changes, following the user's movement.
- the relationship between the first display mode, the second display mode, and the third display mode and the real environment and the user's head is as follows:
- the "first display mode", the "second display mode", or the "third display mode" may be used for different virtual images, as determined by the system software or by the user.
- the "first display mode”, the "second display mode” or the “third mode” is implemented by a two-dimensional code set in a live view image or other manually set auxiliary mark.
- the two-dimensional code set in the real scene is scanned and recognized by the monocular camera 211, the depth of field sensor 217, or the binocular/multi-view camera 212; the two-dimensional code contains information for turning on the first display mode, the second display mode, or the third display mode. After the information in the two-dimensional code is recognized, display proceeds in the mode corresponding to that information. For example, if the scanned two-dimensional code contains information for turning on the first display mode, display is performed in the first display mode; if it contains information for the second or third display mode, display is performed in the second or third display mode.
- similarly, a manually set mark in the live scene may be scanned and recognized by the monocular camera 211, the depth of field sensor 217, or the binocular/multi-view camera 212; the mark contains information for turning on the first, second, or third display mode. For example, if the recognized mark contains information for turning on the first display mode, display is performed in the first display mode; if it contains information for the second or third display mode, display is performed in the second or third display mode.
- the two-dimensional code or other artificial mark set on a two-dimensional plane in the real scene can also be used to assist positioning when the augmented reality-based display device displays in the second display mode: the shape and size of the two-dimensional code or artificial mark, as captured by the monocular camera 211, the depth of field sensor 217, or the binocular/multi-view camera 212, are compared with the actual size and shape of the mark on its two-dimensional plane, and the relative position and angle between the mark and the camera are calculated. Since the position of the mark in the environment is fixed, the relative position and angle relationship T between the augmented reality-based display device and the environment can be calculated from this, thereby implementing the second display mode.
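- a hedged sketch of this marker-based pose calculation, using OpenCV's solvePnP (one standard way to compare the mark's known geometry with its appearance in the camera image; the marker size, detected pixel corners, and camera intrinsics below are placeholder assumptions, not values from the patent):

```python
import numpy as np
import cv2

# Known geometry: corners of a 0.10 m square marker on its own 2D plane (assumed size).
marker_side = 0.10
object_points = np.array([
    [0, 0, 0],
    [marker_side, 0, 0],
    [marker_side, marker_side, 0],
    [0, marker_side, 0],
], dtype=np.float32)

# The same corners as detected in the camera image, in pixels, in the same order
# (example values standing in for a real detector's output).
image_points = np.array([
    [421.0, 310.0],
    [519.0, 305.0],
    [524.0, 402.0],
    [418.0, 398.0],
], dtype=np.float32)

# Placeholder pinhole intrinsics; a real device would use calibrated values.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)  # assume no lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, dist)
if ok:
    R, _ = cv2.Rodrigues(rvec)  # rotation: marker frame -> camera frame
    # Since the marker is fixed in the environment, inverting this pose yields
    # the camera (and thus the head-mounted device) pose in the marker/world frame.
    R_cam_in_world = R.T
    t_cam_in_world = -R.T @ tvec
```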
- the augmented reality-based display device can track the motion of the user's gesture through the monocular camera 211, the depth of field sensor 217, or the binocular/multi-view camera 212, analyze the user's intention, and operate the virtual displayed content.
- for example, the monocular camera 211, the depth of field sensor 217, or the binocular/multi-view camera 212 is used to track the position of the user's finger; after a finger click action is recognized, the operation command corresponding to the finger click action is executed.
- after a drag gesture of the finger is recognized, the virtual screen as a whole, or an object in the virtual screen, is dragged.
- after a zoom gesture of the finger is recognized, the instruction corresponding to the zoom gesture is executed, and the virtual screen, or an object in the virtual screen, is scaled.
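- a minimal sketch of dispatching recognized gestures to the corresponding operation commands (the gesture labels and handler names are hypothetical, not from the patent):

```python
from typing import Callable, Dict

def on_click(target: str) -> None:
    print(f"execute click command on {target}")

def on_drag(target: str) -> None:
    print(f"drag {target} to the tracked finger position")

def on_zoom(target: str) -> None:
    print(f"scale {target} by the pinch amount")

# Dispatch table: gesture label (from the finger-tracking pipeline) -> operation.
GESTURE_HANDLERS: Dict[str, Callable[[str], None]] = {
    "click": on_click,
    "drag": on_drag,
    "zoom": on_zoom,
}

def handle_gesture(label: str, target: str = "virtual screen") -> None:
    handler = GESTURE_HANDLERS.get(label)
    if handler is not None:
        handler(target)  # run the operation command corresponding to the gesture

handle_gesture("click")  # -> execute click command on virtual screen
```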
- in a first application example, the user wears the augmented reality-based display device according to Embodiment 1 or 2 and connects it to a notebook computer; the notebook computer transmits the video signal through a cable to the augmented reality-based display device.
- the cable can also be used for data and audio signal transmission, as well as for powering the head-mounted display. The cable can be single (such as USB Type-C) or multiple (video, audio, and data using separate cables).
- the augmented reality based display device acquires the video signal transmitted by the notebook computer and then displays it on the display module 12 of the augmented reality based display device.
- the virtual image includes: one or several virtual display screens and display content displayed on the display screen, the display content being the content transmitted by the notebook computer.
- the two display modules 12 emit light containing the above virtual image, which is combined with the acquired light containing the real scene image information of the external scene; the two kinds of light are synthesized by the see-through light guiding elements 13 on the augmented reality-based display device and fused within the user's eyes, and after processing by the human brain, the content of the virtual image is presented in front of the user's eyes in three dimensions.
- the augmented reality-based display device projects the virtual display screen and the content displayed on the virtual display screen into the live image of the external scene within the user's field of view. For example, if there are a physical mouse and keyboard in the real scene image of the external scene in the user's field of vision, the physical mouse and keyboard can be connected to the notebook computer, so that the user can input information in the most familiar manner, thereby improving work efficiency.
- in another application example, the user wears the augmented reality-based display device according to Embodiment 1 or 2 and connects it to a mobile phone or other mobile terminal; the mobile phone or other mobile terminal transmits the video signal through a cable to the augmented reality-based display device. The cable can also be used for data and audio signal transmission, as well as for powering the head-mounted display. The cable can be single (such as USB Type-C) or multiple (video, audio, and data using separate cables).
- the augmented reality based display device acquires the video signal of the virtual image transmitted by the mobile phone or other mobile terminal, and then displays it on the display module 12 of the augmented reality based display device.
- the virtual image includes: one or several virtual display screens and display content displayed on the display screen, the display content being content transmitted by a mobile phone or other mobile terminal.
- the two display modules 12 emit light containing the above virtual image, which is combined with the acquired light containing the real scene image information of the external scene; the two kinds of light are synthesized by the see-through light guiding elements 13 on the augmented reality-based display device and fused within the user's eyes, and after processing by the human brain, the content of the virtual image is presented in front of the user's eyes in three dimensions.
- the augmented reality-based display device projects the virtual display screen and the content displayed on the virtual display screen into the live image of the external scene within the user's field of view.
- a portable mouse and keyboard are connected to the mobile phone or other mobile terminal via Bluetooth or other means of communication.
- the real scene image of the external scene in the user's field of vision contains the physical mouse and keyboard, that is, the user wearing the augmented reality-based display device can see the physical mouse and keyboard, so that the user can input information in the most familiar manner, improving work efficiency.
- the user wears the augmented reality-based display device according to Embodiment 1 or 2 and connects it to a mobile phone or other mobile terminal; the mobile phone or other mobile terminal transmits the video signal through a cable to the augmented reality-based display device, which acquires the video signal of the virtual image and then displays it on its display module 12.
- the virtual image includes one or more virtual display screens, the content displayed on those screens, and a virtual mouse and keyboard.
- the position of the user's finger can be tracked using the monocular camera 211, the depth-of-field sensor 217, or the binocular/multi-view camera 212; after the finger's action of clicking the virtual mouse or keyboard is recognized, the operation instruction of the key corresponding to the click is executed, thereby achieving information input.
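As a sketch of the click handling just described, the code below maps a recognized fingertip click to the virtual key it lands on; the `KeyRegion` layout and the tracker outputs (`finger_x`, `finger_y`, `click_detected`) are hypothetical names, since the patent does not specify a software interface.

```python
from dataclasses import dataclass

@dataclass
class KeyRegion:
    """Axis-aligned region of one virtual key in display coordinates
    (hypothetical layout; not specified by the patent)."""
    label: str
    x0: float
    y0: float
    x1: float
    y1: float

    def contains(self, x: float, y: float) -> bool:
        return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

def dispatch_click(keys, finger_x, finger_y, click_detected):
    """When the tracker reports a click gesture, return the label of the
    virtual key under the fingertip, to be injected as an input event."""
    if not click_detected:
        return None
    for key in keys:
        if key.contains(finger_x, finger_y):
            return key.label
    return None

keys = [KeyRegion("A", 0, 0, 40, 40), KeyRegion("B", 45, 0, 85, 40)]
print(dispatch_click(keys, finger_x=50, finger_y=20, click_detected=True))  # -> "B"
```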
- the arrangement of the multiple display screens may be adjusted.
- the position of the user's finger can be tracked using the monocular camera 211, the depth-of-field sensor 217, or the binocular/multi-view camera 212; after the finger's screen-moving gesture is recognized, the corresponding operation instruction is executed to move the display screen.
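A similarly minimal sketch of the screen-rearranging gesture, under the same hypothetical tracker interface: once a drag on a virtual display screen is recognized, the screen's position is translated by the fingertip displacement.

```python
def move_screen(screen_pos, drag_start, drag_end):
    """Translate a virtual display screen by the fingertip displacement
    observed between the start and end of a recognized drag gesture.
    (Illustrative only; coordinate conventions are an assumption.)"""
    dx = drag_end[0] - drag_start[0]
    dy = drag_end[1] - drag_start[1]
    return (screen_pos[0] + dx, screen_pos[1] + dy)

# Example: the finger drags from (100, 200) to (160, 180).
print(move_screen((300, 300), (100, 200), (160, 180)))  # -> (360, 280)
```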
- when the content projected by the augmented reality-based display device carries sound, the sound can be emitted through an external earphone or through a speaker.
- the eye-tracking camera 213 can also track the focal point of the user's eyes and apply special treatment to the specific part of the virtual object or virtual screen on which the user's gaze rests; for example, in the local area the user is observing, annotations and specific information about the observed object are automatically displayed.
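One way this gaze-dependent annotation could work is sketched below, assuming the eye tracker yields a 2D gaze point in display coordinates; the object list, field names, and distance threshold are illustrative assumptions rather than details from the patent.

```python
def annotate_focus(gaze_point, objects, radius=30.0):
    """Return annotations for virtual objects whose on-screen position lies
    within `radius` pixels of the tracked gaze point, so that detailed
    information appears only for what the user is actually looking at."""
    gx, gy = gaze_point
    visible = []
    for obj in objects:
        ox, oy = obj["position"]
        if (ox - gx) ** 2 + (oy - gy) ** 2 <= radius ** 2:
            visible.append(obj["annotation"])
    return visible

# Hypothetical virtual objects with screen positions and annotation text.
objects = [
    {"position": (320, 240), "annotation": "Battery: 87%"},
    {"position": (900, 500), "annotation": "Unread mail: 3"},
]
print(annotate_focus((325, 238), objects))  # -> ["Battery: 87%"]
```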
- unlike the prior art, the augmented reality-based display method combines the real image of the external scene with a virtual image; the virtual image can provide the user with a virtual display screen or a virtual mouse and keyboard having a large display range, and it can be used in conjunction with real physical screens, mice, keyboards, touch screens, buttons, and the like, offering a large field of view and a degree of privacy.
Landscapes
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
Abstract
The invention relates to an augmented reality-based display method, which consists in: emitting first rays (121) including a virtual image, the virtual image being a received image transmitted from an external device; obtaining second rays (151) comprising a live view image of an external scene; and synthesizing the first rays (121) including the virtual image with the second rays (151) comprising the live view image of the external scene. Unlike the prior art, the augmented reality-based display method combines the live view image of the external scene with the virtual image; since the virtual image can provide a user with a virtual screen having a large display range or a virtual mouse and keyboard, it can be used in cooperation with a real mouse and keyboard, a real touch screen, and the like. The invention offers a relatively large viewing range and a degree of privacy.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201710079199.5 | 2017-02-14 | ||
CN201710079199.5A CN108427194A (zh) | 2017-02-14 | 2017-02-14 | 一种基于增强现实的显示方法及设备 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018149267A1 true WO2018149267A1 (fr) | 2018-08-23 |
Family
ID=63155134
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2018/073473 WO2018149267A1 (fr) | 2017-02-14 | 2018-01-19 | Procédé et dispositif d'affichage basés sur la réalité augmentée |
Country Status (2)
Country | Link |
---|---|
CN (1) | CN108427194A (fr) |
WO (1) | WO2018149267A1 (fr) |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110174939A (zh) * | 2019-04-19 | 2019-08-27 | 深圳远为文化有限公司 | 全景扫描融合现实系统 |
CN110362231B (zh) * | 2019-07-12 | 2022-05-20 | 腾讯科技(深圳)有限公司 | 抬头触控设备、图像显示的方法及装置 |
CN111736692B (zh) * | 2020-06-01 | 2023-01-31 | Oppo广东移动通信有限公司 | 显示方法、显示装置、存储介质与头戴式设备 |
CN111664741B (zh) * | 2020-06-08 | 2023-01-06 | 中国人民解放军陆军特种作战学院 | 一种用于射击训练的智能靶标系统的交互方法 |
CN111664742B (zh) * | 2020-06-08 | 2023-01-06 | 中国人民解放军陆军特种作战学院 | 一种基于空气成像的智能靶标系统 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN103019377A (zh) * | 2012-12-04 | 2013-04-03 | 天津大学 | 基于头戴式可视显示设备的输入方法及装置 |
US20130215235A1 (en) * | 2011-04-29 | 2013-08-22 | Austin Russell | Three-dimensional imager and projection device |
CN104915979A (zh) * | 2014-03-10 | 2015-09-16 | 苏州天魂网络科技有限公司 | 跨移动平台实现沉浸式虚拟现实的系统 |
CN105892631A (zh) * | 2015-11-16 | 2016-08-24 | 乐视致新电子科技(天津)有限公司 | 一种简化虚拟现实应用操作的方法和装置 |
CN105955453A (zh) * | 2016-04-15 | 2016-09-21 | 北京小鸟看看科技有限公司 | 一种3d沉浸式环境下的信息输入方法 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2850609A4 (fr) * | 2012-05-16 | 2017-01-11 | Mobile Augmented Reality Ltd. Imagine | Système porté par un utilisateur mobile pour augmenter complètement la réalité par ancrage d'objets virtuels |
US9367960B2 (en) * | 2013-05-22 | 2016-06-14 | Microsoft Technology Licensing, Llc | Body-locked placement of augmented reality objects |
CN105446580B (zh) * | 2014-08-13 | 2019-02-05 | 联想(北京)有限公司 | 一种控制方法及便携式电子设备 |
CN105607730A (zh) * | 2014-11-03 | 2016-05-25 | 航天信息股份有限公司 | 基于眼球追踪的增强显示方法及装置 |
CN104407700A (zh) * | 2014-11-27 | 2015-03-11 | 曦煌科技(北京)有限公司 | 一种移动头戴式虚拟现实和增强现实的设备 |
CN106258004B (zh) * | 2015-04-20 | 2019-01-11 | 我先有限公司 | 虚拟实景装置与操作模式 |
CN105809144B (zh) * | 2016-03-24 | 2019-03-08 | 重庆邮电大学 | 一种采用动作切分的手势识别系统和方法 |
CN105866955A (zh) * | 2016-06-16 | 2016-08-17 | 深圳市世尊科技有限公司 | 智能眼镜 |
- 2017
  - 2017-02-14 CN CN201710079199.5A patent/CN108427194A/zh active Pending
- 2018
  - 2018-01-19 WO PCT/CN2018/073473 patent/WO2018149267A1/fr active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN108427194A (zh) | 2018-08-21 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN206497255U (zh) | 增强现实显示系统 | |
WO2018149267A1 (fr) | Procédé et dispositif d'affichage basés sur la réalité augmentée | |
CN103309037B (zh) | 头部佩戴型显示装置以及头部佩戴型显示装置的控制方法 | |
CN111602082B (zh) | 用于包括传感器集成电路的头戴式显示器的位置跟踪系统 | |
US9122321B2 (en) | Collaboration environment using see through displays | |
US20120249587A1 (en) | Keyboard avatar for heads up display (hud) | |
US10235808B2 (en) | Communication system | |
CN108427193A (zh) | 增强现实显示系统 | |
CN105607255A (zh) | 头部佩戴型显示装置、控制其的方法及计算机程序 | |
WO2019001575A1 (fr) | Dispositif d'affichage portable | |
CN115176194A (zh) | 一种用于可穿戴电子设备的超伸展铰链 | |
US20250008077A1 (en) | Display method and electronic device | |
WO2014128748A1 (fr) | Dispositif, programme et procédé d'étalonnage | |
TW201802642A (zh) | 視線檢測系統 | |
CN108446011A (zh) | 一种基于增强现实的医疗辅助方法及设备 | |
CN110192142B (zh) | 显示装置及其显示方法、显示系统 | |
WO2018045985A1 (fr) | Système d'affichage à réalité augmentée | |
KR20220128726A (ko) | 머리 착용형 디스플레이 장치, 그 장치에서의 동작 방법 및 저장매체 | |
US20200012352A1 (en) | Discrete Type Wearable Input and Output Kit for Mobile Device | |
WO2018149266A1 (fr) | Procédé et dispositif de traitement d'informations basés sur la réalité augmentée | |
CN107111143B (zh) | 视觉系统及观片器 | |
WO2016051429A1 (fr) | Dispositif d'entrée/sortie, programme d'entrée/sortie, et procédé d'entrée/sortie | |
TW201805689A (zh) | 外加式近眼顯示裝置 | |
EP4449185A1 (fr) | Lunettes comprenant un ensemble de lentilles push-pull non uniformes | |
CN108696740A (zh) | 一种基于增强现实的直播方法及设备 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 18754559; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 18754559; Country of ref document: EP; Kind code of ref document: A1 |