
CN110531525B - Device for realizing near-eye display of three-dimensional images - Google Patents


Info

Publication number
CN110531525B
Authority
CN
China
Prior art keywords
waveguide
light
nanograting
virtual
pixel
Prior art date
Legal status
Active
Application number
CN201810513869.4A
Other languages
Chinese (zh)
Other versions
CN110531525A (en)
Inventor
陈林森
张云莉
乔文
黄文彬
花尔凯
Current Assignee
Suzhou University
SVG Tech Group Co Ltd
Original Assignee
Suzhou University
SVG Tech Group Co Ltd
Priority date
Application filed by Suzhou University, SVG Tech Group Co Ltd filed Critical Suzhou University
Priority to CN201810513869.4A
Publication of CN110531525A
Application granted
Publication of CN110531525B


Abstract


The present invention relates to display technology, and in particular to a device for realizing near-eye display of three-dimensional images. According to one aspect of the invention, the device comprises: a light field reproduction unit configured to reconstruct light field information of a target object to reproduce a virtual scene, the light field reproduction unit comprising a spatial light modulator and a phase plate arranged in the light output direction of the spatial light modulator, the phase plate having a diffraction structure configured to project images of different perspectives in the virtual scene to their respective corresponding observation positions; and a virtual-real fusion unit configured to output a three-dimensional image that fuses the virtual scene with the real scene.

Description

Device for realizing near-eye display of three-dimensional images
Technical Field
The invention relates to display technology, in particular to a device for realizing near-eye display of three-dimensional images.
Background
Current three-dimensional displays are mainly based on the binocular-parallax principle of human vision: optical elements such as parallax barriers or lenticular lens arrays deliver different image information to the viewer's left and right eyes.
In the near-eye display devices disclosed in US8989535B2 and US9581820B2, an augmented-reality 3D display based on binocular parallax is realized by combining a micro-projection device with an optical waveguide lens, but this approach has difficulty eliminating the vergence-accommodation conflict and is prone to causing visual fatigue. U.S. patent No. 00795952 discloses a method of realizing three-dimensional light field display by combining optical fiber micro-projection with an optical waveguide lens, but the method is technically demanding. US008014050B2 discloses an optical holographic phase plate for three-dimensional display, described as a nanophase plate comprising a bulk grating structure and a photosensitive material, which enables rapid modulation of the optical field phase by controlling the diffraction efficiency and phase retardation of individual pixel units through a single electrode array. With such an electrode array, however, it is difficult to miniaturize the modulation of individual pixels, and the display effect falls short of the comfort and definition demanded by consumers.
Disclosure of Invention
It is an object of the present invention to provide a device for realizing near-eye display of three-dimensional images that offers low manufacturing cost, simple and convenient design, and a compact form factor.
An apparatus for implementing near-eye display of a three-dimensional image according to one aspect of the present invention includes:
a light field reproduction unit configured to reconstruct light field information of a target object to reproduce a virtual scene, the light field reproduction unit comprising:
a spatial light modulator; and
a phase plate arranged in the light-emitting direction of the spatial light modulator and having a diffraction structure configured to project images of different perspectives in the virtual scene to respective corresponding viewing positions; and
a virtual-real fusion unit configured to output a three-dimensional image fusing the virtual scene and the real scene together.
Preferably, in the above apparatus, the apparatus further includes a projection unit configured to transfer the virtual scene output by the light field reproduction unit to the virtual-real fusion unit.
Preferably, in the above device, the spatial light modulator comprises a plurality of volume pixels, each volume pixel comprising a plurality of sub-pixels, each sub-pixel corresponding to a different viewing angle; and the diffraction structure comprises a plurality of nanostructure elements, each nanostructure element being configured to project the light beams from those sub-pixels of the plurality of volume pixels that correspond to the same viewing angle to the single viewing position associated with that sub-pixel.
Preferably, in the above device, the spatial light modulator comprises a plurality of volume pixels, each volume pixel comprising a plurality of sub-pixels, each sub-pixel corresponding to a different viewing angle; and the diffraction structure comprises a plurality of nanostructure elements, each nanostructure element being configured to project the light beams from those sub-pixels of the plurality of volume pixels that correspond to the same viewing angle to a set of viewing positions associated with that sub-pixel.
Preferably, in the above device, the set of viewing positions is a plurality of viewing positions distributed in a horizontal and/or vertical direction.
Preferably, in the above device, the spatial light modulator is one of a DLP display screen, an LCOS display screen, or a liquid crystal display screen.
Preferably, in the above device, the virtual-real fusion unit includes a waveguide, a first nano grating disposed inside the waveguide, and a second nano grating, wherein the first nano grating diffracts incoming light, the waveguide totally reflects light diffracted by the first nano grating, and the second nano grating diffracts totally reflected light to guide the light from the waveguide to the visible region.
Preferably, in the above device, the virtual-real fusion unit includes three layers of transparent light field lenses stacked together, each layer of transparent light field lenses includes a waveguide, a first nano-grating disposed inside the waveguide, and a second nano-grating, wherein the first nano-grating diffracts incoming light, the waveguide totally reflects light diffracted by the first nano-grating, and the second nano-grating diffracts totally reflected light to guide light from the waveguide to the visible region, wherein the first nano-grating of each layer of transparent light field lenses has different orientation angles and/or periods, and the second nano-grating of each layer of transparent light field lenses has different orientation angles and/or periods.
Preferably, in the above device, the virtual-real fusion unit includes a prism, a waveguide, and a nano-grating disposed inside the waveguide, wherein the prism refracts incident light into the waveguide, the waveguide totally reflects the refracted light, and the nano-grating diffracts the totally reflected light to guide the light from the waveguide to the viewing area.
Preferably, in the above device, the virtual-real fusion unit includes a prism, a waveguide, and a pair of mirrors disposed inside the waveguide, wherein the prism refracts the incident light into the waveguide, the waveguide totally reflects the refracted light, and the mirrors reflect the totally reflected light to guide the light from the waveguide to the viewing area.
Preferably, in the above device, the virtual-real fusion unit includes a first waveguide lens, a second waveguide lens, a first set of nano gratings, a second set of nano gratings, and a third set of nano gratings, where the first set of nano gratings is located between the first and second waveguide lenses, the second set of nano gratings is located on a surface of the second waveguide lens, which is far away from the first waveguide lens, and the third set of nano gratings is located above the second set of nano gratings, and the first to third sets of nano gratings have different orientation angles and/or periods.
Drawings
Fig. 1 is a schematic block diagram of an apparatus for implementing a near-eye display of a three-dimensional image in accordance with one embodiment of the invention.
Fig. 2 is a schematic diagram of a phase plate that may be used in the apparatus of fig. 1.
Fig. 3 is a schematic structural diagram of the phase plate shown in fig. 2 in an X-Y plane and an X-Z plane.
Fig. 4 is a graph of nanostructure distribution of a directional functional thin film implementing single viewpoint convergence.
Fig. 5a and 5b are schematic diagrams of realizing single view and multi-view, respectively, with a light field rendering unit according to another embodiment of the present invention.
Fig. 6 is a schematic structural view of an apparatus for implementing near-eye display of a three-dimensional image according to another embodiment of the present invention.
Fig. 7 is a schematic structural view of an apparatus for implementing near-eye display of a three-dimensional image according to another embodiment of the present invention.
Fig. 8 is a schematic structural view of an apparatus for implementing near-eye display of a three-dimensional image according to another embodiment of the present invention.
Fig. 9 is a top view of a virtual-real fusion unit according to another embodiment of the present invention.
Fig. 10 is a schematic structural view of an apparatus for implementing near-eye display of a three-dimensional image according to another embodiment of the present invention.
Fig. 11a and 11b are schematic diagrams for respectively implementing longitudinal and transverse viewing area expansion using the phase plate as described above.
Fig. 12 is a schematic diagram of a virtual-real fusion unit according to another embodiment of the present invention.
Fig. 13 is a schematic diagram of an apparatus for implementing near-eye display of three-dimensional images according to another embodiment of the present invention.
Fig. 14 is a schematic diagram of an apparatus for implementing near-eye display of three-dimensional images in accordance with another embodiment of the invention.
Detailed Description
The objects of the present invention will be described in detail with reference to the accompanying drawings.
Fig. 1 is a schematic block diagram of an apparatus for implementing a near-eye display of a three-dimensional image in accordance with one embodiment of the invention.
The apparatus 100 for implementing near-eye display of three-dimensional images shown in fig. 1 includes a light field reproduction unit 110, a projection unit 120, and a virtual-real fusion unit 130. In the present embodiment, the light field reproduction unit 110 is configured to reconstruct light field information of a target object to reproduce a virtual scene. The projection unit 120 is optically coupled between the light field reproduction unit 110 and the virtual-real fusion unit 130 and is configured to transfer the virtual scene output by the light field reproduction unit 110 to the virtual-real fusion unit 130, for example by reflection, refraction, or diffraction. The virtual-real fusion unit 130 is configured to output a three-dimensional image that fuses the virtual scene with the real scene.
It is noted that the projection unit 120 is an optional component. Alternatively, the virtual scene reconstructed by the light field reproduction unit 110 may be directly coupled to the virtual-real fusion unit 130 by a suitable design.
In this embodiment, the light field reproduction unit 110 includes a spatial light modulator and a phase plate to achieve reconstruction of the light field.
The spatial light modulator is used for amplitude modulation, i.e. loading the image information of the multi-view mixture. The spatial light modulator may include, for example, a display panel, a driving circuit, a control system, software control, and the like. Spatial light modulators can implement either monochrome or color displays, depending on the particular application field requirements. Preferably, the spatial light modulator may employ one of a DLP display screen, an LCOS display screen, and a liquid crystal display screen. The spatial light modulator may comprise a plurality of volume pixels or amplitude modulated pixels, each volume pixel comprising a plurality of sub-pixels, and each sub-pixel corresponding to a different viewing angle.
The phase plate has a diffraction grating structure comprising a plurality of volume pixels. Each volume pixel of the phase plate comprises a plurality of nanostructure elements, and each nanostructure element is aligned in a matching relationship with a viewing-angle image pixel of the spatial light modulator; that is, the light beams from the sub-pixels of the spatial light modulator's volume pixels that correspond to the same viewing angle are projected to the set of viewing positions associated with that sub-pixel.
Fig. 2 is a schematic diagram of a phase plate that may be used in the apparatus of fig. 1.
Without loss of generality, fig. 2 takes five viewing angles as an example, and only three volume pixels of the phase plate are shown; the embodiment plainly applies to other numbers of viewing angles as well. As shown in fig. 2, phase plate 212 includes volume pixels 212A-212C. Each volume pixel of phase plate 212 takes the form of a pixel cell and contains five nanograting regions, or sub-pixels, having different periods and/or orientation angles. When the light rays from the corresponding sub-pixels of the spatial light modulator arrive, these regions deflect the rays to the different viewing positions or viewing angles 1-5, so that light beams of the same viewing angle are projected to multiple viewing positions, enabling a three-dimensional display with separated viewing angles. Likewise, through the design of the phase plate, a plurality of separated or continuous viewpoints distributed as a dot matrix, linear array, or area array can be realized, so as to achieve an optimal viewing area.
The period and orientation angle of the grating region may be determined according to the following grating equation:
tanφ₁ = sinφ/(cosφ − n·sinθ·(Λ/λ))  (1)
sin²θ₁ = (λ/Λ)² + (n·sinθ)² − 2·n·sinθ·cosφ·(λ/Λ)  (2)
where θ₁ and φ₁ denote the diffraction angle (the angle between the diffracted light and the negative Z axis) and the azimuth angle (the angle between the diffracted light and the positive Y axis) of the diffracted light, θ and λ denote the incidence angle (the angle between the incident light and the negative Z axis) and the wavelength of the light source, Λ and φ denote the period and orientation angle (the angle between the groove direction and the positive X axis) of the nano diffraction grating, and n denotes the refractive index of the medium in which the light wave propagates.
Fig. 3 is a schematic structural diagram of the phase plate shown in fig. 2 in an X-Y plane and an X-Z plane.
After the wavelength, incidence angle, diffraction angle, and diffraction azimuth angle of the incident light are determined, the required grating period and orientation angle can be calculated from the above formulas. The period and orientation angle of a nanostructure unit determine how it modulates the angle and spectrum of the light field, so regulation and conversion of the light field can be realized by designing the variation patterns of the orientation angles and periods of the nanostructure units. The diffraction grating structure shown can be fabricated directly as a functional thin-film layer on, for example, a glass substrate.
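As an illustration of this inverse calculation, the sketch below solves equations (1) and (2) for the period Λ and orientation angle φ given a target diffraction direction. This is a minimal Python sketch of our own, not code from the patent; the momentum-matching form and sign conventions follow the axis definitions above.

```python
import math

def grating_parameters(wavelength, theta_in, theta_d, phi_d, n=1.0):
    """Solve equations (1) and (2) for the grating period and orientation angle.

    wavelength : free-space wavelength (same unit as the returned period)
    theta_in   : incidence angle theta, in radians
    theta_d    : target diffraction angle theta_1, in radians
    phi_d      : target diffraction azimuth phi_1, in radians (from the +Y axis)
    n          : refractive index of the medium
    """
    # Transverse components of the diffracted direction (azimuth from +Y):
    tx = math.sin(theta_d) * math.sin(phi_d)
    ty = math.sin(theta_d) * math.cos(phi_d)
    # First-order momentum matching: the grating vector supplies the difference
    # between the diffracted and incident transverse wave-vector components.
    gx = tx
    gy = ty + n * math.sin(theta_in)
    lam_over_period = math.hypot(gx, gy)   # equals lambda / Lambda
    period = wavelength / lam_over_period
    orientation = math.atan2(gx, gy)       # phi: grating-vector azimuth from +Y,
    return period, orientation             # i.e. groove direction from +X
```

A result can be verified by substituting the returned (Λ, φ) back into equations (1) and (2).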
As described above, each nanograting region can be regarded as a pixel unit or sub-pixel. The orientation angle of a grating region determines its angular modulation of the light field, and its period determines its spectral filtering characteristic. Modulation and transformation of the light field can be achieved by varying the period (spatial frequency) and orientation angle of the nanograting regions from sub-pixel to sub-pixel. Therefore, once a screen surface carries grating regions with the required orientation angles and periods, enough viewpoints can be obtained, and amplitude control alone realizes 3D display at multiple viewing angles.
Fig. 4 is a graph of the nanostructure distribution of a directional functional thin film implementing single-viewpoint convergence. The nanostructure shown is a single off-axis Fresnel structure that makes the image converge at viewpoint 1. In the structure shown in fig. 4, N×M sub-pixels constitute an off-axis Fresnel structure with N×M different focal points. It should be noted that the shape of the sub-pixels in fig. 4 is not limited to a rectangle; it may also be a circle, a hexagon, or the like.
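The off-axis Fresnel idea can be sketched numerically: each sub-pixel needs a local grating period and orientation that steer normally incident light toward a single viewpoint. The following is a hypothetical illustration (function name and geometry are ours, assuming normal incidence and first-order diffraction), not the patent's design procedure:

```python
import math

def fresnel_zone_map(nx, ny, pitch, viewpoint, wavelength):
    """Per-subpixel grating parameters so that every sub-pixel of an
    nx-by-ny array deflects normally incident light toward one viewpoint.

    viewpoint : (xv, yv, zv) relative to the plate centre, zv > 0
    Returns a list of (period, orientation_angle) tuples, row by row.
    """
    xv, yv, zv = viewpoint
    out = []
    for i in range(nx):
        for j in range(ny):
            # sub-pixel centre, with the plate centred at the origin
            x = (i - (nx - 1) / 2) * pitch
            y = (j - (ny - 1) / 2) * pitch
            dx, dy, dz = xv - x, yv - y, zv
            r = math.sqrt(dx*dx + dy*dy + dz*dz)
            s = math.hypot(dx, dy) / r        # sin of required deflection angle
            if s == 0:                        # on-axis sub-pixel: no deflection
                out.append((float('inf'), 0.0))
                continue
            period = wavelength / s           # first-order grating equation
            orientation = math.atan2(dx, dy)  # grating-vector azimuth from +Y
            out.append((period, orientation))
    return out
```

By symmetry, sub-pixels equidistant from the on-axis point get equal periods but different orientations, which is the ring structure of a Fresnel zone pattern.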
Fig. 5a and 5b are schematic diagrams of realizing single view and multi-view, respectively, with a light field rendering unit according to another embodiment of the present invention.
The light field reproduction unit 510 shown in fig. 5a and 5b comprises a spatial light modulator (e.g. a liquid crystal display) 511 and a phase plate 512 (e.g. having the structure of a phase plate as shown in fig. 2) with grating areas of different orientations and/or periods arranged on the surface, wherein the phase plate 512 focuses light rays passing through the spatial light modulator 511 to one or several viewpoints.
Fig. 6 is a schematic structural view of an apparatus for implementing near-eye display of a three-dimensional image according to another embodiment of the present invention.
The apparatus 600 shown in fig. 6 comprises a light field rendering unit 610, a projection unit 620 and a virtual-to-real fusion unit 630.
The light field reproduction unit 610 includes a liquid crystal display 611 and a phase plate 612 positioned on the light emitting surface of the liquid crystal display. The phase plate 612 may take the configuration described above with respect to fig. 2. The light field information of the object reproduced by the light field reproduction unit 610 is coupled to the virtual-real fusion unit 630, for example, by diffraction, refraction, or reflection of the projection unit 620.
In this embodiment, the virtual-real fusion unit 630 includes a waveguide 631, a first nanograting 632a, and a second nanograting 632b. Referring to fig. 6, a first nanograting 632a is disposed inside the waveguide 631 near the position where light is incident into the waveguide, which diffracts the incident light. The light diffracted by the first nanograting 632a is totally reflected inside the waveguide 631. The light reaches the second nanograting 632b after undergoing multiple total reflections, and is then directed to a visual region outside the waveguide by diffraction of the second nanograting 632b, thereby outputting a three-dimensional image fusing the virtual scene with the real scene.
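A quick check of the total-internal-reflection condition behind this in-coupling scheme can be sketched as follows. This is an illustrative simplification assuming normal incidence on the first nanograting and first-order diffraction, not the patent's actual design procedure:

```python
import math

def tir_after_incoupling(wavelength, period, n_wg):
    """Is first-order diffraction from the in-coupling grating trapped in
    the waveguide by total internal reflection? (normal-incidence sketch)

    wavelength : free-space wavelength (m)
    period     : grating period (m)
    n_wg       : waveguide refractive index
    """
    s = wavelength / (n_wg * period)  # sin of diffraction angle inside the waveguide
    if s >= 1.0:
        return False                  # evanescent: no propagating diffracted order
    theta_d = math.asin(s)
    theta_c = math.asin(1.0 / n_wg)   # critical angle at the waveguide/air interface
    return theta_d > theta_c
```

The guided regime thus requires 1/n_wg < λ/(n_wg·Λ) < 1, i.e. the period must fall between λ/n_wg and λ.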
Fig. 7 is a schematic structural view of an apparatus for implementing near-eye display of a three-dimensional image according to another embodiment of the present invention.
The apparatus 700 shown in fig. 7 includes a light field reproduction unit 710, a projection unit 720, and a virtual-real fusion unit 730.
The light field reproduction unit 710 includes a liquid crystal display 711 and a phase plate 712 positioned on the light-emitting surface of the liquid crystal display. The phase plate 712 may take the configuration described above with respect to fig. 2. The light field information of the object reproduced by the light field reproduction unit 710 is coupled to the virtual-real fusion unit 730, for example, by diffraction, refraction, or reflection of the projection unit 720.
The main difference between this embodiment and the embodiment shown in fig. 6 is the structure of the virtual-real fusion unit. Specifically, the virtual-actual fusion unit 730 of the present embodiment includes a prism 731, a waveguide 732, and a nanograting 733 located inside the waveguide. Referring to fig. 7, the projection unit 720 projects a virtual scene from the light field reproduction unit 710 to the prism 731, and enters the waveguide 732 after being refracted by the prism 731. The refracted light rays are totally reflected inside waveguide 732. The light reaches the nano-grating 733 after undergoing multiple total reflections, and is then directed to a visual region outside the waveguide by diffraction of the nano-grating 733, thereby outputting a three-dimensional image fusing the virtual scene and the real scene together.
Fig. 8 is a schematic structural view of an apparatus for implementing near-eye display of a three-dimensional image according to another embodiment of the present invention.
The apparatus 800 shown in fig. 8 comprises a light field rendering unit 810, a projection unit 820 and a virtual-to-real fusion unit 830.
The light field reproduction unit 810 includes a liquid crystal display 811 and a phase plate 812 positioned on the light exit surface of the liquid crystal display. The phase plate 812 may take the configuration described above with respect to fig. 2. The light field information of the object reproduced by the light field reproduction unit 810 is coupled to the virtual-real fusion unit 830, for example, by diffraction, refraction or reflection action of the projection unit 820.
The main difference between this embodiment and the embodiment shown in fig. 6 is the structure of the virtual-real fusion unit. Specifically, the virtual-actual fusion unit 830 of the present embodiment includes a prism 831, a waveguide 832, and a pair of mirrors 833a and 833b located inside the waveguide. Referring to fig. 8, the projection unit 820 projects the virtual scene from the light field reproduction unit 810 to the prism 831, and is refracted by the prism 831 to enter the waveguide 832. The refracted ray undergoes total reflection within the waveguide 832. The light reaches the mirror 833a after undergoing multiple total reflections, a part of the light is reflected by the mirror 833a to be guided to the visible region outside the waveguide, and the rest of the light passes through the mirror 833a to reach the mirror 833b and is reflected by the mirror 833b to be guided to the visible region outside the waveguide, thereby outputting a three-dimensional image in which the virtual scene and the real scene are fused together.
Fig. 9 is a top view of a virtual-real fusion unit according to another embodiment of the present invention.
The virtual-real fusion unit 930 shown in fig. 9 takes the form of a diffractive optical element whose surface comprises a plurality of functional areas 931-933, each containing a pixelated nano diffraction grating. In this embodiment, the nano diffraction grating may be fabricated using photolithography, nanoimprinting, or holographic interference techniques. As shown in fig. 9, the first functional region 931, which has a circular grating, is used for light in-coupling; the shape of the first functional region is not limited to a circle. After coupling through the first functional region 931, the light is directed into the second functional region 932, a wedge-shaped diffraction grating, which expands the image projected onto the virtual-real fusion unit 930 in the X-axis direction. After this expansion, the image is guided to the third functional region 933, which expands the projected image in the Y-axis direction. The human eye can thus observe a virtual image with an enlarged viewing angle through the virtual-real fusion unit 930.
Fig. 10 is a schematic structural view of an apparatus for implementing near-eye display of a three-dimensional image according to another embodiment of the present invention.
The apparatus 1000 shown in fig. 10 includes a light field reproduction unit 1010, a projection unit 1020, and a virtual-real fusion unit 1030.
The light field reproduction unit 1010 includes a liquid crystal display 1011 and a phase plate 1012 positioned on the light exit surface of the liquid crystal display. The phase plate 1012 may take the configuration described above with respect to fig. 2. The light field information of the object reproduced by the light field reproduction unit 1010 is coupled to the virtual-real fusion unit 1030, for example, by diffraction, refraction, or reflection action of the projection unit 1020. The virtual-real fusion unit 1030 includes a plurality of functional areas 1031-1033 similar to the functional areas 931-933 shown in fig. 9, which are not described again here.
Fig. 11a and 11b are schematic diagrams for respectively implementing longitudinal and transverse viewing area expansion using the phase plate as described above. Without loss of generality, only four views are taken as examples herein.
Referring to fig. 11a, a volume pixel of phase plate 1112 is shown (the rectangle divided into 4 areas in the figure); the volume pixel contains 4 sub-pixels (identified by numerals 1-4). Each sub-pixel corresponds to one sub-pixel of the spatial light modulator and comprises three nanograting pixel units, stacked or arranged in a distribution. Because the nanograting pixel units within a sub-pixel have different orientations and/or periods, a plurality of vertically and continuously arranged windows 1a-1c carrying the same viewing-angle information can be formed for each viewpoint, thereby expanding the longitudinal viewing angle.
Referring to fig. 11b, a plurality of volume pixels of phase plate 1112 are likewise shown (rectangles divided into 4 regions in the figure). Taking the leftmost volume pixel as an example, it contains 4 sub-pixels (identified by numerals 1-4). Each sub-pixel corresponds to one sub-pixel of the spatial light modulator and comprises three nanograting pixel units, stacked or arranged in a distribution. Because the nanograting pixel units within a sub-pixel have different orientations and/or periods, a plurality of horizontally and continuously arranged windows carrying the same viewing-angle information can be formed for each viewpoint, thereby expanding the transverse viewing angle.
In the above example, the expanded view angle information is formed in the vertical or horizontal direction of the same viewpoint and is the same as the light field information covered by the original viewpoint, so that the view angle can be increased without reducing the resolution of the image.
Preferably, in order to enable human eyes to observe virtual images in a large view field angle, the pixel structure of the phase plate can also adopt spatial multiplexing stacking or binary optical elements to display three-dimensional images in a wide view field. The image reproduction information after the visual angle expansion is coupled into the virtual-real fusion unit through the projection unit. In practical application, by matching the exit pupil of the projection optical unit and the entrance pupil of the virtual-real fusion lens, an observer can observe a three-dimensional scene with an enlarged field angle through the virtual-real fusion unit without moving.
Fig. 12 is a schematic diagram of a virtual-real fusion unit according to another embodiment of the present invention.
The virtual-to-real fusion unit 1230 shown in fig. 12 includes three subunits 1231-1233 stacked together, each of which may take the configuration of the virtual-to-real fusion unit shown in fig. 6. By having different orientation angles and/or periods of the nanograting therein for each subunit, different images can be generated at different distances from the virtual-real fusion unit, thereby obtaining an image with a sense of depth of field.
Fig. 13 is a schematic diagram of an apparatus for implementing near-eye display of three-dimensional images according to another embodiment of the present invention.
The near-eye display device 1300 shown in fig. 13 can realize color display. To this end, in device 1300, as shown in fig. 13, the light field reproduction unit 1310 includes a spatial light modulator, a phase plate, and a filter between the spatial light modulator and the phase plate. Without loss of generality, each volume pixel of the phase plate (a rectangle divided into four regions in the figure) contains four differently oriented sub-pixels (identified by numerals 1-4 in the figure), each sub-pixel corresponding to one of four viewing angles. In this embodiment, the sub-pixels of the spatial light modulator are matched with the pixels of the filter and the sub-pixels of the phase plate. Further, each sub-pixel of the phase plate is composed of R, G, B sub-pixels, or pixel grating units (shown with different hatching). For each sub-pixel, by designing each pixel grating unit with the proper orientation angle and period, the light passing through the R, G, B sub-pixels can be converged at the same viewpoint. The convergence viewpoint thus contains the color information of the image. The image is reconstructed by the light field reproduction unit 1310, coupled to the virtual-real fusion unit 1330 by the projection unit 1320, and finally presented as a color virtual image through the virtual-real fusion unit 1330. In an actual three-dimensional display, the input of the spatial light modulator can be controlled by a computer, enabling video playback of color images.
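For the R, G, B pixel grating units to deflect their colors to the same viewpoint, the first-order relation sinθ = λ/Λ suggests that each color's period should scale with its wavelength. A minimal sketch of this scaling (the wavelengths and the reference period are illustrative values, not from the patent):

```python
def rgb_periods(reference_period_green, lambdas=(620e-9, 532e-9, 465e-9)):
    """Scale the grating period per colour channel so that the R, G, B
    sub-pixels diffract to the same angle: sin(theta) = lambda/period must
    be identical for all three channels.

    reference_period_green : chosen period of the green channel (m)
    lambdas                : (R, G, B) wavelengths (m), illustrative defaults
    """
    lam_r, lam_g, lam_b = lambdas
    ratio = reference_period_green / lam_g   # common period-to-wavelength ratio
    return {"R": lam_r * ratio,
            "G": reference_period_green,
            "B": lam_b * ratio}
```

The longer the wavelength, the larger the required period, so the red grating unit ends up coarsest and the blue one finest.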
Fig. 14 is a schematic diagram of an apparatus for implementing near-eye display of three-dimensional images in accordance with another embodiment of the invention.
The near-eye display device 1400 shown in fig. 14 can realize a color display. In the apparatus 1400 shown in fig. 14, the virtual-real fusion unit 1430 adopts a structure of a double-layered waveguide lens. Specifically, the virtual-real fusion unit 1430 includes a first waveguide lens 1431, a second waveguide lens 1432, a first set of nano-gratings 1433a, 1433b, a second set of nano-gratings 1434a, 1434b, and a third set of nano-gratings 1435a, 1435b. Referring to fig. 14, a first set of nanograms 1433a, 1433b are located between the first and second waveguide lenses, a second set of nanograms 1434a, 1434b are located on the surface of the second waveguide lens 1432 remote from the first waveguide lens 1431, and a third set of nanograms 1435a, 1435b are located on top of the second set of nanograms 1434a, 1434 b.
In this embodiment, each set of nano-gratings diffracts only light of a specific color (R, G or B). The projection unit 1420 couples light rays from the light field reproduction unit 1410 into the first waveguide lens 1431. Owing to the wavelength selectivity of the grating, light of a specific color (e.g. green) is totally reflected within the first waveguide lens 1431, while light of the other colors (e.g. blue and red) cannot propagate in the first waveguide lens 1431 because its diffraction angle does not satisfy the total reflection condition. Similarly, the second set of nano-gratings 1434a, 1434b and the third set of nano-gratings 1435a, 1435b, which diffract only blue and red light respectively, cause the blue and red light to be totally reflected within the second waveguide lens 1432. Thus, image information of different colors is transmitted through the waveguide lenses without crosstalk, and the virtual-real fusion unit 1430 projects a color converging light field in front of the human eye.
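The total reflection condition invoked above can be made concrete with the first-order grating equation. The sketch below is illustrative only — the refractive index and grating period are assumed values, not parameters from the patent. For normally incident light coupled in by a surface grating, the in-waveguide quantity n·sin(θd) equals λ/Λ for the +1 order, so the light is guided by total internal reflection only when 1 < λ/Λ < n. Note that this geometric condition alone does not reject every unwanted color (a longer wavelength can also satisfy the inequality), which is one motivation for placing wavelength-selective grating sets on separate waveguide layers as in fig. 14.

```python
import math

def first_order_guidance(wavelength_nm, period_nm, n_waveguide):
    """Classify the fate of the +1 diffraction order of a coupling grating
    on a waveguide of refractive index n, for normal incidence from air.

    Grating equation: n * sin(theta_d) = wavelength / period.
    TIR at the waveguide/air interface requires n * sin(theta_d) > 1;
    a propagating order requires n * sin(theta_d) < n.
    """
    s = wavelength_nm / period_nm  # equals n * sin(theta_d)
    if s >= n_waveguide:
        return "evanescent (no propagating +1 order)"
    if s > 1.0:
        theta_d = math.degrees(math.asin(s / n_waveguide))
        return f"guided by TIR (theta_d = {theta_d:.1f} deg)"
    return "diffracted but escapes (below TIR angle)"

# Illustrative numbers only -- not taken from the patent.
n = 1.5          # refractive index of the waveguide lens
period = 500.0   # grating period in nm
for name, wl in [("blue", 450.0), ("green", 532.0), ("red", 633.0)]:
    print(f"{name:5s} {wl:5.0f} nm -> {first_order_guidance(wl, period, n)}")
```

With these assumed numbers, blue light falls below the TIR angle and escapes while green is guided, illustrating how a single grating period sorts wavelengths into guided and non-guided sets.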
Compared with the prior art, the device for realizing near-eye display of three-dimensional images according to the present invention has a number of advantages. For example, it can automatically generate stereoscopic images, is simple in structure and compact, can operate under incoherent light sources without special illumination, and provides continuous parallax and viewpoints to the observer.
The foregoing has described the principles and preferred embodiments of the present invention. However, the invention should not be construed as limited to the particular embodiments discussed. The preferred embodiments described above should be regarded as illustrative rather than restrictive, and it should be appreciated that variations may be made in those embodiments by those skilled in the art without departing from the scope of the present invention as defined by the following claims.

Claims (9)

1. An apparatus for enabling near-to-eye display of a three-dimensional image, comprising:
a light field reproduction unit configured to reconstruct light field information of a target object to reproduce a virtual scene, the light field reproduction unit comprising:
a spatial light modulator; and
a phase plate arranged in the light emitting direction of the spatial light modulator and having a diffraction structure configured to project images of different viewing angles in the virtual scene to respective corresponding viewing positions; and
a virtual-real fusion unit configured to output a three-dimensional image fusing the virtual scene and the real scene together,
wherein each volume pixel of the spatial light modulator comprises a plurality of sub-pixels corresponding to different viewing angles, and
wherein each volume pixel of the phase plate includes sub-pixels corresponding to the sub-pixels of the spatial light modulator, and each sub-pixel of the phase plate includes nano-grating pixel units having different orientations and/or periods, thereby forming, for each viewpoint, a plurality of windows having the same viewing angle information arranged consecutively in a vertical or horizontal direction.
2. The apparatus of claim 1, wherein the apparatus further comprises a projection unit configured to transfer the virtual scene output by the light field reproduction unit to the virtual-real fusion unit.
3. The apparatus of claim 1, wherein the spatial light modulator comprises a plurality of volume pixels, each volume pixel comprising a plurality of sub-pixels, each sub-pixel corresponding to a different viewing angle, the diffractive structure comprising a plurality of nanostructure elements, each nanostructure element configured to project light beams from a sub-pixel of the plurality of volume pixels corresponding to a same viewing angle to a same viewing position associated with the sub-pixel.
4. The apparatus of claim 1, wherein the spatial light modulator is one of a DLP display, LCOS display, or liquid crystal display.
5. The apparatus of claim 1, wherein the virtual-real fusion unit comprises a waveguide, a first nanograting disposed inside the waveguide, and a second nanograting, wherein the first nanograting diffracts incoming light, the waveguide totally reflects light diffracted by the first nanograting, and the second nanograting diffracts totally reflected light to direct light from the waveguide to the viewable area.
6. The apparatus of claim 1, wherein the virtual-real fusion unit comprises three layers of transparent light field lenses stacked together, each layer of transparent light field lenses comprising a waveguide, a first nanograting disposed inside the waveguide, and a second nanograting, wherein the first nanograting diffracts incoming light, the waveguide totally reflects light diffracted by the first nanograting, and the second nanograting diffracts totally reflected light to direct light from the waveguide to the viewable area, wherein the first nanograting of each layer of transparent light field lenses has a different orientation angle and/or period, and the second nanograting of each layer of transparent light field lenses has a different orientation angle and/or period.
7. The apparatus of claim 1, wherein the virtual-real fusion unit comprises a prism, a waveguide, and a nano-grating disposed inside the waveguide, wherein the prism refracts incident light into the waveguide, the waveguide totally reflects the refracted light, and the nano-grating diffracts the totally reflected light to direct the light from the waveguide to the viewable area.
8. The apparatus of claim 1, wherein the virtual-real fusion unit comprises a prism, a waveguide, and a pair of mirrors disposed inside the waveguide, wherein the prism refracts incident light rays into the waveguide, the waveguide totally reflects the refracted light rays, and the mirrors diffract the totally reflected light rays to direct the light rays from the waveguide to the viewable area.
8. The apparatus of claim 1, wherein the virtual-real fusion unit comprises a prism, a waveguide, and a pair of mirrors disposed inside the waveguide, wherein the prism refracts incident light rays into the waveguide, the waveguide totally reflects the refracted light rays, and the mirrors reflect the totally reflected light rays to direct the light rays from the waveguide to the viewable area.
CN201810513869.4A 2018-05-25 2018-05-25 Device for realizing near-eye display of three-dimensional images Active CN110531525B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810513869.4A CN110531525B (en) 2018-05-25 2018-05-25 Device for realizing near-eye display of three-dimensional images

Publications (2)

Publication Number Publication Date
CN110531525A CN110531525A (en) 2019-12-03
CN110531525B (en) 2025-04-11

Family

ID=68656833

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810513869.4A Active CN110531525B (en) 2018-05-25 2018-05-25 Device for realizing near-eye display of three-dimensional images

Country Status (1)

Country Link
CN (1) CN110531525B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111123524A (en) * 2020-01-17 2020-05-08 北京枭龙科技有限公司 Diffraction waveguide capable of expanding pupil and uniformly emitting light
CN115202044B (en) * 2021-04-14 2024-10-29 宁波舜宇车载光学技术有限公司 Optical waveguide device, augmented reality display apparatus, and display method
CN116300135A (en) * 2021-12-20 2023-06-23 宁波舜宇车载光学技术有限公司 Display device, manufacturing method thereof and optical system
CN115494574B (en) * 2022-01-30 2023-07-28 珠海莫界科技有限公司 An optical waveguide module and an AR display device

Citations (3)

Publication number Priority date Publication date Assignee Title
CN105959672A (en) * 2016-05-03 2016-09-21 苏州苏大维格光电科技股份有限公司 Naked eye three-dimensional display device based on active emitting type display technology
CN106501938A (en) * 2016-11-21 2017-03-15 苏州苏大维格光电科技股份有限公司 A kind of wear-type augmented reality three-dimensional display apparatus
CN208805627U (en) * 2018-05-25 2019-04-30 苏州苏大维格光电科技股份有限公司 Apparatus for realizing near-eye display of three-dimensional images

Family Cites Families (5)

Publication number Priority date Publication date Assignee Title
JP3608747B2 (en) * 1995-04-21 2005-01-12 大日本印刷株式会社 Multi-sided hologram and method for producing the same
JP2013231874A (en) * 2012-04-27 2013-11-14 Panasonic Corp Video display device
CN106556966B (en) * 2016-11-17 2019-02-12 苏州苏大维格光电科技股份有限公司 A super viewing angle pointing projection screen with nano-grating pixel structure
CN106526730B (en) * 2016-11-21 2019-07-12 苏州苏大维格光电科技股份有限公司 A kind of wide viewing angle waveguide eyeglass and production method and wear-type three-dimensional display apparatus
CN106773057A (en) * 2017-01-13 2017-05-31 苏州苏大维格光电科技股份有限公司 A kind of monolithic hologram diffraction waveguide three-dimensional display apparatus

Similar Documents

Publication Publication Date Title
JP6845289B2 (en) Methods and systems for generating virtual content displays using virtual or augmented reality devices
US10359630B2 (en) Display apparatus comprising first and second optical phased arrays and method for augmented reality
CN105487239B (en) Directive property colored filter and bore hole 3D display device
US20200225487A1 (en) Near-eye optical imaging system, near-eye display device and head-mounted display device
WO2017107313A1 (en) Naked eye 3d laser display device
KR101819905B1 (en) Stereoscopic imaging method and device employing planar optical waveguide loop
WO2019179136A1 (en) Display apparatus and display method
CN208805627U (en) Apparatus for realizing near-eye display of three-dimensional images
CN109073882A (en) The display based on waveguide with exit pupil extender
US11487117B2 (en) Display apparatus having wide viewing window
US8797620B2 (en) Autostereoscopic display assembly based on digital semiplanar holography
CN106773057A (en) A kind of monolithic hologram diffraction waveguide three-dimensional display apparatus
CN110531525B (en) Device for realizing near-eye display of three-dimensional images
CN103091850A (en) Naked-eye multi-dimensional display assembly and display thereof
CN105425409B (en) A projection-type naked-eye 3D display device and its colorized display device
JP2005504339A (en) Flat projection display
CN107367845A (en) Display system and display methods
CN109521506A (en) Nanometer eyeglass, nearly eye display methods and nearly eye display device
JP2016500829A (en) True 3D display with convergence angle slice
CN108646412B (en) Near-eye display device and near-eye display method
CN106556966A (en) Point to projection screen in a kind of ultraphotic angle containing nanometer grating dot structure
CN112305776B (en) Light field display system based on light waveguide coupling light exit pupil segmentation-combination control
CN110727115A (en) A near-eye display device with super multi-viewpoint based on diffractive optics
KR20220088698A (en) image display device
CN110531528B (en) Three-dimensional display device

Legal Events

Date Code Title Description
PB01 Publication
CB02 Change of applicant information

Address after: 215123 No. 68, Xinchang Road, Suzhou Industrial Park, Jiangsu, China

Applicant after: SUZHOU SUDAVIG SCIENCE AND TECHNOLOGY GROUP Co.,Ltd.

Applicant after: SOOCHOW University

Address before: 215123 No. 68, Xinchang Road, Suzhou Industrial Park, Jiangsu, China

Applicant before: SVG OPTRONICS, Co.,Ltd.

Applicant before: SOOCHOW University

SE01 Entry into force of request for substantive examination
GR01 Patent grant