
CN105704478B - Stereoscopic display method and device for virtual and real scenes, and electronic equipment

Info

Publication number: CN105704478B
Application number: CN201510546495.2A
Authority: CN (China)
Prior art keywords: user, matrix, virtual scene, application program, observation
Legal status: Expired - Fee Related
Other versions: CN105704478A (Chinese)
Inventor: 刘江
Current Assignee: SuperD Co Ltd
Original Assignee: Shenzhen Super Perfect Optics Ltd
Application filed by Shenzhen Super Perfect Optics Ltd
Priority to CN201510546495.2A
Publication of CN105704478A
Application granted
Publication of CN105704478B
Classification: Processing Or Creating Images

Abstract

The present invention provides a stereoscopic display method, device and electronic equipment for virtual and real scenes. The display method includes: acquiring identification information of an application program; determining, according to the identification information, the viewing angle synchronization mode corresponding to the application program; and, according to the determined synchronization mode, constructing and displaying a stereoscopic image of the virtual scene of the application program when the head position of the user of the application program changes, so that the viewing angle of the virtual scene is transformed and the viewing angle in the virtual scene is synchronized with the viewing angle after the change of the user's head position. Embodiments of the present invention can provide different viewing angle synchronization modes for different programs, improving the user's experience of stereoscopic content, and can be applied to 3D display devices.

Description

Stereoscopic display method and device for virtual and real scenes and electronic equipment
Technical Field
The invention relates to the technical field of virtual reality, in particular to a stereoscopic display method and device for virtual and real scenes and electronic equipment.
Background
VR (Virtual Reality) is the short name for virtual reality, meaning that a realistic virtual effect is created by technical means. At present, virtual reality devices have become mature and are widely used in fields such as movies and games, and more and more users choose virtual reality devices to experience 3D (three-dimensional) stereoscopic content.
When a user experiences 3D stereoscopic content through a virtual reality device, synchronization of viewing angles becomes very important for a realistic effect: when the user changes the position of the head and thus the viewing angle of the eyes, the viewing orientation of the 3D stereoscopic content, that is, the viewing angle of the virtual scene, should change accordingly, bringing the user a strong sense of immersion and reality.
In the prior art, most virtual reality devices offer only a single method of synchronizing viewing angles, with no alternatives to select from, which inevitably affects the experience of the 3D stereoscopic content as a whole.
Disclosure of Invention
The invention aims to provide a stereoscopic display method and device for virtual and real scenes that can provide different viewing angle synchronization modes for different programs and effectively improve user experience.
In order to achieve the above object, an embodiment of the present invention provides a stereoscopic display method for virtual and real scenes, including:
acquiring identification information of an application program;
determining an observation visual angle synchronization mode corresponding to the application program according to the acquired identification information;
and according to the determined observation visual angle synchronization mode, when the head position of the user of the application program changes, constructing and displaying a three-dimensional image of the virtual scene of the application program, so that the observation visual angle of the virtual scene is changed, and the observation visual angle under the virtual scene is synchronized with the observation visual angle after the head position of the user changes.
Wherein the viewing angle synchronization mode comprises at least one of: an observation matrix transformation mode, a projection matrix transformation mode and a mouse conversion mode;
in the observation matrix transformation mode:
tracking the head position of a user of the application program, acquiring real-time tracking data of the head position of the user, transforming an original observation matrix of a virtual scene according to the real-time tracking data when the head position of the user changes to obtain a new observation matrix, and constructing and displaying a three-dimensional image of the virtual scene of the application program according to the new observation matrix; or
Tracking the head position of a user of the application program, acquiring real-time tracking data of the head position of the user, determining a displacement matrix of the head of the user according to the real-time tracking data when the head position of the user changes, determining a new transformation matrix according to the displacement matrix of the head of the user, a parallax deflection matrix of a virtual scene and an original transformation matrix of the virtual scene, and constructing and displaying a three-dimensional image of the virtual scene according to the new transformation matrix;
tracking the head position of a user of the application program in the projection matrix transformation mode, acquiring real-time tracking data of the head position of the user, transforming an original projection matrix of a virtual scene according to the real-time tracking data when the head position of the user changes to obtain a new projection matrix, and constructing and displaying a stereoscopic image of the virtual scene of the application program according to the new projection matrix;
tracking the head position of a user of the application program in the mouse conversion mode, and determining the movement amount of the simulated mouse for the virtual scene according to the real-time tracking data of the head position of the user; and modifying the position information of the simulated mouse according to the movement amount of the simulated mouse, so as to generate and display a three-dimensional image of the virtual scene according to the modified position information of the simulated mouse.
The identification information of the application program comprises a Hash value corresponding to the application program or an installation package name of the application program.
Wherein, the step of determining the viewing angle synchronization mode corresponding to the application program according to the acquired identification information comprises:
determining an observation visual angle synchronous mode corresponding to the application program according to the acquired identification information and the corresponding relation between the preset application program identification information and the observation visual angle synchronous mode;
or
Determining the display scene type of the application program according to the acquired identification information;
and determining an observation visual angle synchronous mode corresponding to the application program according to the determined display scene type.
And in the observation matrix conversion mode, the projection matrix conversion mode or the mouse conversion mode, firstly, performing smooth filtering processing on the real-time tracking data to obtain the real-time tracking data after the smooth filtering processing, and then performing subsequent processing by using the real-time tracking data after the smooth filtering processing.
And determining a rotation matrix of the head of the user according to the real-time tracking data in the observation matrix transformation mode, and determining a new observation matrix according to the rotation matrix of the head of the user, the parallax deflection matrix of the virtual scene and the observation matrix of the virtual scene.
And under the observation matrix conversion mode, determining the visual distance information of the virtual scene according to preset configuration or according to setting parameters input by a user, and determining the parallax deflection matrix of the virtual scene according to the determined visual distance information.
And in the observation matrix transformation mode, intercepting an original rendering pipeline of the application program, and modifying the original rendering pipeline according to the new observation matrix, so that parallax images corresponding to the left eye and the right eye of the user of the application program are generated by rendering the new observation matrix, and a three-dimensional image of a virtual scene is constructed and displayed based on the parallax images corresponding to the left eye and the right eye.
And under the projection matrix conversion mode, determining projection position offset information of a three-dimensional image of a virtual scene according to the real-time tracking data, and constructing a new projection matrix according to the determined projection position offset information and the projection matrix of the virtual scene.
And determining the projection position offset information according to the real-time tracking data of the head of the user and the distance between the viewpoint of the original projection matrix and a near projection plane in the projection matrix conversion mode.
And in the projection matrix transformation mode, intercepting an original rendering pipeline of the application program, and modifying the original rendering pipeline according to the new projection matrix, so that parallax images corresponding to the left eye and the right eye of the user of the application program are generated by rendering the new projection matrix, and a stereoscopic image of a virtual scene is constructed and displayed based on the parallax images corresponding to the left eye and the right eye.
And under the mouse conversion mode, determining the real-time rotation angle of the head of the user according to the real-time tracking data of the head position, and determining the movement amount of the simulated mouse for the virtual scene according to the real-time rotation angle of the head of the user.
And under the mouse conversion mode, acquiring the tracking data of the current frame of the head of the user and the tracking data of the previous frame of the head of the user, acquiring the difference value of the tracking data of the current frame and the tracking data of the previous frame, and determining the real-time rotation angle according to the difference value.
An embodiment of the present invention further provides a stereoscopic display device for virtual and real scenes, including:
the acquisition module is used for acquiring the identification information of the application program;
a determining module, configured to determine, according to the obtained identification information, an observation view synchronization mode corresponding to the application program;
and the display module is used for constructing and displaying a three-dimensional image of the virtual scene of the application program when the head position of the user of the application program changes according to the determined observation visual angle synchronization mode, so that the observation visual angle of the virtual scene is changed, and the observation visual angle under the virtual scene is synchronized with the observation visual angle after the head position of the user changes.
An embodiment of the present invention further provides an electronic device for virtual and real scenes, including:
the device comprises a shell, a processor, a memory, a display, a circuit board and a power circuit, wherein the circuit board is arranged in a space enclosed by the shell, and the processor and the memory are arranged on the circuit board; a power supply circuit for supplying power to each circuit or device of the electronic apparatus; the memory is used for storing executable program codes; the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory for performing the steps of:
acquiring identification information of an application program;
determining an observation visual angle synchronization mode corresponding to the application program according to the acquired identification information;
and according to the determined observation visual angle synchronization mode, when the head position of the user of the application program changes, constructing and displaying a three-dimensional image of the virtual scene of the application program, so that the observation visual angle of the virtual scene is changed, and the observation visual angle under the virtual scene is synchronized with the observation visual angle after the head position of the user changes.
The technical scheme of the invention at least has the following beneficial effects:
The stereoscopic display method and device for virtual and real scenes provided by the embodiments of the invention provide technical support for an application program. By acquiring the identification of the application program, the viewing angle synchronization mode supported by or required for the application program is determined; when the head position of the user changes, a stereoscopic image of the application program's virtual scene is then constructed and displayed using the determined synchronization mode, so that the viewing angle of the virtual scene is transformed and synchronized with the user's viewing angle after the head movement. Different viewing angle synchronization modes can thus be provided for different application programs, and each application program can be given a synchronization mode suited to its virtual scene, so that user experience can be effectively improved.
Drawings
Fig. 1 is a flow chart illustrating a stereoscopic display method for virtual and real scenes according to a first embodiment of the present invention;
Fig. 2 is a schematic diagram of user head tracking data in a stereoscopic display method for virtual and real scenes according to the present invention;
Fig. 3 is a schematic projection diagram of a stereoscopic display method for virtual and real scenes according to a third embodiment of the present invention;
Fig. 4 is a schematic structural diagram of a stereoscopic display apparatus for virtual and real scenes according to a fifth embodiment of the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantages of the present invention more apparent, the following detailed description is given with reference to the accompanying drawings and specific embodiments.
Aiming at the prior-art problems that the method of synchronizing viewing angles is single and non-selectable and that user experience is poor, the invention provides a stereoscopic display method and device for virtual and real scenes, offering technical support and a solution for application programs. The viewing angle synchronization mode supported by or required for an application program is determined by acquiring the identification of the application program; when the head position of the user changes, a stereoscopic image of the application program's virtual scene is constructed and displayed using the determined synchronization mode, so that the viewing angle of the virtual scene changes and is synchronized with the user's viewing angle after the head movement. Different viewing angle synchronization modes can be provided for different application programs, and each application program can be given a synchronization mode suited to its virtual scene, so the display effect of the virtual scene is effectively guaranteed and user experience can be effectively improved.
It should be noted that the embodiment of the present invention may be applied to a wearable 3D display scene, and may also be applied to a naked-eye 3D display scene, which is not limited in this respect.
As shown in fig. 1, a method for providing stereoscopic display of virtual and real scenes in accordance with a first embodiment of the present invention includes:
step 11, acquiring identification information of an application program;
the application program can be various 3D games. Of course, the present invention is not limited to this, and may be any other application program that needs to perform 3D display of a virtual scene, and those skilled in the art may arbitrarily select the application program.
Step 12, determining an observation view angle synchronization mode corresponding to the application program according to the acquired identification information;
and step 13, according to the determined observation visual angle synchronization mode, when the head position of the user of the application program changes, constructing and displaying a three-dimensional image of the virtual scene of the application program, so as to change the observation visual angle of the virtual scene, and realize the synchronization of the observation visual angle in the virtual scene and the observation visual angle after the head position of the user changes.
The stereoscopic display method for virtual and real scenes provided by the embodiment of the invention can select a proper observation angle synchronization mode for the application program according to the identification information of the application program to realize the synchronization of the observation angles, namely, when the head position of a user using the application program changes, the determined observation angle synchronization mode is utilized to construct and display the stereoscopic image of the virtual scene of the application program. According to the stereoscopic display method provided by the embodiment of the invention, the observation visual angle synchronization mode of the application program is determined firstly and can be matched with the application program, so that the observation visual angle of the virtual scene is changed, and the synchronization of the observation visual angle in the virtual scene and the observation visual angle after the head position of the user is changed is realized. Therefore, different observation visual angle synchronization modes can be provided for different application programs, the observation visual angle synchronization mode suitable for the virtual scene of the application program can be provided for the application programs, the display effect of the virtual scene is effectively guaranteed, and therefore user experience can be effectively improved.
Specifically, in the embodiment of the present invention, the identification information of the application includes a Hash value corresponding to the application or an installation package name of the application.
Of course, any identification information that can uniquely identify the application program is applicable; the embodiments are not limited in this respect.
Further, in the above embodiment of the present invention, two ways of determining the viewing angle synchronization mode corresponding to the application program may be used. In the first way, step 12 specifically includes:
determining an observation visual angle synchronous mode corresponding to the application program according to the acquired identification information and the corresponding relation between the preset application program identification information and the observation visual angle synchronous mode;
in this way, a technician presets a database in the background or locally in advance, the database stores the corresponding relationship between the identification information of the application program and the observation perspective synchronization mode, that is, the observation perspective synchronization mode applicable to each application program is determined in advance, and the observation perspective synchronization mode corresponding to the application program can be determined by directly searching the corresponding relationship from the database after the identification information is acquired.
In the second way, step 12 specifically includes:
determining the display scene type of the application program according to the acquired identification information;
and determining an observation visual angle synchronous mode corresponding to the application program according to the determined display scene type.
In this way, the display scene type of each application program is predetermined; that is, each application program is classified, and application programs of different display scene types correspond to different viewing angle synchronization modes. The display scene type of the application program is therefore determined from the identification information, and the synchronization mode is then determined from the display scene type, the two having a preset correspondence.
For example, the display scene types of an application program may include a first type that supports rotating the viewing angle with the mouse, a second type in which no part of the displayed scene is missing after the viewing angle rotates (e.g., head-mounted 3D display), and a third type in which part of the displayed scene may be missing after the viewing angle rotates (e.g., naked-eye 3D display); different viewing angle synchronization modes can thus be selected for application programs of different types.
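As a concrete illustration of these two look-up strategies, the sketch below resolves an application's identification information to a synchronization mode, first through a direct correspondence table and, failing that, through a display-scene-type table. It is a minimal sketch: the identifiers, table entries and type names are invented for illustration, and the pairing of types to modes loosely follows the suitability remarks later in the text; the patent only requires that such correspondences be preset.

```python
from enum import Enum, auto

class SyncMode(Enum):
    VIEW_MATRIX = auto()        # observation matrix transformation mode
    PROJECTION_MATRIX = auto()  # projection matrix transformation mode
    MOUSE = auto()              # mouse conversion mode

# First way: a preset identification -> mode correspondence table.
MODE_BY_APP_ID = {
    "com.example.shooter": SyncMode.MOUSE,
    "com.example.viewer": SyncMode.VIEW_MATRIX,
}

# Second way: identification -> display scene type -> mode.
SCENE_TYPE_BY_APP_ID = {
    "com.example.racing": "no_missing_scene",      # e.g. head-mounted 3D
    "com.example.gallery": "scene_may_be_missing", # e.g. naked-eye 3D
}
MODE_BY_SCENE_TYPE = {
    "mouse_rotatable": SyncMode.MOUSE,
    "no_missing_scene": SyncMode.VIEW_MATRIX,
    "scene_may_be_missing": SyncMode.PROJECTION_MATRIX,
}

def resolve_sync_mode(app_id: str) -> SyncMode:
    """Step 12: map an application identification to a synchronization mode."""
    if app_id in MODE_BY_APP_ID:                  # first way: direct table
        return MODE_BY_APP_ID[app_id]
    scene_type = SCENE_TYPE_BY_APP_ID.get(app_id) # second way: via scene type
    return MODE_BY_SCENE_TYPE.get(scene_type, SyncMode.VIEW_MATRIX)
```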
It can be understood that, in the embodiment of the present invention, how to determine the viewing angle synchronization mode corresponding to the application program according to the acquired identification information is not limited, and any reasonable and feasible scheme may be applied thereto, and those skilled in the art may perform any setting.
Specifically, in the foregoing embodiment of the present invention, the viewing angle synchronization mode includes at least one of the following: an observation matrix transformation mode, a projection matrix transformation mode and a mouse conversion mode;
wherein, in the observation matrix transformation mode:
tracking the head position of a user of the application program, acquiring real-time tracking data of the head position of the user, transforming an original observation matrix of a virtual scene according to the real-time tracking data when the head position of the user changes to obtain a new observation matrix, and constructing and displaying a three-dimensional image of the virtual scene of the application program according to the new observation matrix;
or
Tracking the head position of a user of the application program, acquiring real-time tracking data of the head position of the user, determining a displacement matrix of the head of the user according to the real-time tracking data when the head position of the user changes, determining a new transformation matrix according to the displacement matrix of the head of the user, a parallax deflection matrix of a virtual scene and an original transformation matrix of the virtual scene, and constructing and displaying a three-dimensional image of the virtual scene according to the new transformation matrix;
tracking the head position of a user of the application program in the projection matrix transformation mode, acquiring real-time tracking data of the head position of the user, transforming an original projection matrix of a virtual scene according to the real-time tracking data when the head position of the user changes to obtain a new projection matrix, and constructing and displaying a stereoscopic image of the virtual scene of the application program according to the new projection matrix;
tracking the head position of a user of the application program in the mouse conversion mode, and determining the movement amount of the simulated mouse for the virtual scene according to the real-time tracking data of the head position of the user; and modifying the position information of the simulated mouse according to the movement amount of the simulated mouse, so as to generate and display a three-dimensional image of the virtual scene according to the modified position information of the simulated mouse.
According to the three observation visual angle synchronization modes provided by the embodiment of the invention, when the head position of the user changes, the real-time tracking data of the head position of the user is utilized to transform or modify the observation matrix (or transformation matrix), the projection matrix or the position information of the virtual mouse of the virtual scene, so that the modified observation matrix (or transformation matrix), the projection matrix or the position information of the virtual mouse are utilized to construct the stereoscopic image of the virtual scene, and further, the discomfort generated when the user uses virtual and real equipment to synchronously observe visual angles can be relieved to a certain extent. In addition, the embodiment of the invention provides three different observation angle synchronization modes to realize the synchronization of the observation angle in the virtual scene and the observation angle after the head position of the real user is changed, and the three different observation angle synchronization modes can be used for different display scenes, so that the application range is very wide.
In the three modes, the embodiment of the present invention assumes that the viewing rotation angle of the user's head is the same in the virtual scene as in the real scene, and that the positional shift is also the same once a scale factor (Scale) is applied. Under these assumptions, the three viewing angle synchronization modes provided in the embodiments of the present invention transform the viewing orientation in the virtual scene, i.e., synchronize the viewing angles, according to the tracking data of the user's head position in the real scene.
The following describes three viewing angle synchronization modes in detail:
detailed description of the invention
In this embodiment, the viewing angle synchronization mode includes the observation matrix transformation mode or the projection matrix transformation mode. When it is determined that the viewing angle synchronization mode corresponding to the application program is the observation matrix transformation mode, the head position of the user of the application program is tracked and real-time tracking data of the head position is acquired; when the head position changes, the original observation matrix of the virtual scene is transformed according to the real-time tracking data to obtain a new observation matrix, and a stereoscopic image of the virtual scene of the application program is constructed and displayed according to the new observation matrix.
When it is determined that the viewing angle synchronization mode corresponding to the application program is the projection matrix transformation mode, the head position of the user of the application program is tracked and real-time tracking data of the head position is acquired; when the head position changes, the original projection matrix of the virtual scene is transformed according to the real-time tracking data to obtain a new projection matrix, and a stereoscopic image of the virtual scene of the application program is constructed and displayed according to the new projection matrix.
Optionally, if the user watches the 3D content in a wearable manner, the head position of the user may be tracked by sensing devices such as speed or acceleration sensors to obtain real-time tracking data; in a naked-eye 3D scenario, the head position may instead be tracked by a camera. Either may be selected by a person skilled in the art.
Specifically, the real-time tracking data of the head position of the user may include the real-time rotation angle of the head in three-dimensional space. By establishing a three-dimensional coordinate system in advance, the real-time rotation angle of the user's head can be represented uniformly as the rotation angles of the head relative to the X, Y and Z axes. Of course, the real-time tracking data may also include the real-time translation distance of the head in three-dimensional space, handled similarly to the rotation angle; this is not described in detail here.
For example, as shown in fig. 2, the real-time tracking data of the user's head includes the rotation angles of the head in three-dimensional space (Pitch, Yaw, Roll), where Pitch is the rotation angle of the user's head about the x-axis, Yaw the rotation angle about the y-axis, and Roll the rotation angle about the z-axis.
In principle, transforming the observation matrix keeps the observed scene and objects still and changes the observation position and angle of the camera (which represents the user's eyes) to achieve viewing angle synchronization, while transforming the projection matrix keeps the observation position of the camera still and changes the projected position and angle of the scene.
In the two modes, specifically, the original observation matrix or the original projection matrix of the virtual scene is transformed according to the real-time tracking data acquired when the head position of the user changes, and the real-time tracking data can represent the change of the head position of the user.
It should be emphasized that the observation matrix or the projection matrix described in the embodiments of the present invention is a conventional concept in the field of graphic image processing, and those skilled in the art can directly and unambiguously determine this, and the present invention will not be described in detail.
According to the stereoscopic display method for virtual and real scenes provided by the embodiments of the invention, the user's head position is tracked and, when it changes, the real-time tracking data is used to transform the observation matrix or projection matrix of the virtual scene. A stereoscopic image of the virtual scene is constructed with the transformed matrix, so that the viewing angle of the virtual scene is synchronized with the user's viewing angle after the head movement, and the discomfort produced when viewing angles are synchronized on virtual reality devices can be relieved to a certain extent.
Preferably, in order to prevent sudden fluctuations in the head-position data from causing abrupt changes of the viewing angle and hence vertigo, in a specific embodiment of the invention the real-time tracking data is smoothed before the original observation matrix or original projection matrix of the virtual scene is transformed: a smoothing filter is applied to the real-time tracking data, and the transformation is performed with the filtered data. When the stereoscopic image is subsequently constructed and displayed to change the viewing angle, user discomfort is thus reduced to a certain extent and user experience is improved.
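One possible realization of this smoothing step is an exponential moving average over the tracked (Pitch, Yaw, Roll) samples, sketched below; the patent does not prescribe a particular filter, so the filter choice and the coefficient are assumptions.

```python
import numpy as np

def smooth_tracking(samples: np.ndarray, alpha: float = 0.3) -> np.ndarray:
    """Exponentially smooth an (N, 3) array of (Pitch, Yaw, Roll) samples."""
    out = np.empty_like(samples, dtype=float)
    out[0] = samples[0]
    for i in range(1, len(samples)):
        # A small alpha damps sudden spikes in the head-position data,
        # avoiding abrupt viewing-angle jumps that can cause vertigo.
        out[i] = alpha * samples[i] + (1.0 - alpha) * out[i - 1]
    return out
```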
The third specific embodiment:
in this embodiment, the observation perspective synchronization mode includes an observation matrix transformation mode, and when it is determined that the observation perspective synchronization mode corresponding to the application program is the observation matrix transformation mode, this embodiment includes:
and step 31, tracking the head position of the user, and acquiring real-time tracking data of the head position of the user.
Specifically, as shown in fig. 2, the real-time tracking data of the user's head is consistent with the real-time tracking data in the foregoing embodiment, and the description is not repeated here. In this embodiment, the observation matrix is changed according to the real-time tracking data, so that a new observation matrix is obtained.
And step 32, when the head position of the user changes, transforming the original observation matrix of the virtual scene according to the real-time tracking data to obtain a new observation matrix.
In this embodiment, the viewing angle deflection is performed by changing the viewing matrix, thereby realizing synchronization of the viewing angles. The original observation matrix may also be referred to as a current observation matrix, which refers to an observation matrix applied to the current stereoscopic image when the head position of the user changes, that is, an observation matrix applied to the stereoscopic image when the observation angle has not changed. In an embodiment of the present invention, the original observation matrix of the virtual scene may be obtained by an interception technique, which is not important in the present application and is not described in detail herein.
And step 33, constructing and displaying the three-dimensional image of the virtual scene according to the new observation matrix, so as to change the observation angle of the virtual scene and realize the synchronization of the observation angle of the virtual scene and the observation angle of the user after the head position changes.
Specifically, in an embodiment of the present invention, after obtaining the new observation matrix, the new observation matrix is injected into the imaging process of the application program through an injection technique, so as to achieve the purposes of drawing the 3D stereoscopic content and realizing the synchronization of the observation angles.
In the stereoscopic display method for virtual and real scenes provided by this embodiment, the viewing angle synchronization mode is the observation matrix transformation mode. Deflecting the viewing angle by changing the observation matrix is suitable for scenes in which no scene information is missing and correct observation is still possible after the viewing angle rotates (for example, head-mounted 3D display). This way of changing the viewing angle is more rigorous and accurate, and is especially suitable when the scene range of the stereoscopic content, i.e., the stereoscopic image, is small.
Specifically, the step 32 in this embodiment includes:
step 321, determining a rotation matrix of the user head according to the real-time tracking data;
specifically, based on real-time tracking data of the head position of the user, for example, the rotation angle of the head of the user, the rotation matrices of the head of the user are determined to be a rotation matrix of an X axis, a rotation matrix of a Y axis, and a rotation matrix of a Z axis, respectively. It should be noted that the format of the rotation matrix is preset, and the obtained real-time tracking data of the user head is subjected to predetermined processing and then filled into a predetermined position in the preset matrix, thereby forming the rotation matrix of the user head required in this embodiment.
Step 322, determining a new observation matrix according to the rotation matrix of the user's head, the parallax deflection matrix of the virtual scene, and the original observation matrix of the virtual scene.
In the principle of stereoscopic imaging, constructing a stereoscopic image first requires constructing left-eye and right-eye images with horizontal parallax. To generate the parallax images, the camera position must be moved; that is, the camera is shifted on the same horizontal plane to produce left and right cameras corresponding to a person's left and right eyes. The spacing between the left and right cameras is called the view spacing, and the parallax deflection matrix is determined from it.
The rotation matrix of the user's head is in fact the user's viewing-angle deflection matrix; that is, the transformation of the observation matrix includes a viewing-angle deflection and a parallax deflection. A new observation matrix can therefore be obtained from the viewing-angle deflection matrix, the parallax deflection matrix of the virtual scene and the current observation matrix of the virtual scene; the new observation matrix corresponds to the new viewing angle.
Specifically, in the fourth embodiment of the present invention, the generating step of the parallax deflection matrix of the virtual scene includes:
step 323, determining the view spacing information of the virtual scene according to a preset configuration or setting parameters input by the user;
and step 324, determining the parallax deflection matrix of the virtual scene according to the determined view spacing information.
Optionally, the view spacing information sep may be preset by the system or set by the user; the specific value is not limited here. sep can be considered the distance between the left and right cameras, representing the distance between a person's left and right eyes. The parallax deflection matrix likewise has a preset format: it is constructed by performing predetermined processing on the view spacing information and filling the result into the preset format.
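Under the same 4x4 assumption, steps 323 and 324 can be sketched as follows: the view spacing sep is turned into left and right parallax deflection matrices, modelled here as x-axis translations by -sep/2 and +sep/2, matching the camera shift described in the worked example later in this section.

```python
import numpy as np

def parallax_matrices(sep: float):
    """Build V_sep_L and V_sep_R from the view spacing sep (steps 323-324)."""
    v_sep_l = np.eye(4)
    v_sep_l[0, 3] = -sep / 2.0   # shift the camera left by sep/2
    v_sep_r = np.eye(4)
    v_sep_r[0, 3] = +sep / 2.0   # shift the camera right by sep/2
    return v_sep_l, v_sep_r
```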
More specifically, in this embodiment, the parallax deflection matrix includes a first view parallax deflection matrix and a second view parallax deflection matrix, which may also be referred to as a left view parallax deflection matrix and a right view parallax deflection matrix; the new observation matrix comprises a first view observation matrix and a second view observation matrix, which can also be called a left view observation matrix and a right view observation matrix; step 322 specifically includes:
step 3221, determining a first view observation matrix according to the rotation matrix of the head of the user, the first view parallax deflection matrix of the virtual scene and the original observation matrix of the virtual scene; and
step 3222, determining a second view observation matrix according to the rotation matrix of the user head, the second view parallax deflection matrix of the virtual scene, and the original observation matrix of the virtual scene;
Then, step 33 in this embodiment includes:
step 331, generating a first view of the virtual scene by using the first view observation matrix, and generating a second view of the virtual scene by using the second view observation matrix;
that is, left and right eye parallax images are generated.
And step 332, constructing and displaying a stereoscopic image of the virtual scene according to the first view and the second view generated by rendering.
Namely, an original rendering pipeline of the application program is intercepted, the original rendering pipeline is modified according to the new observation matrix, so that parallax images corresponding to the left eye and the right eye of the user of the application program are generated by rendering through the new observation matrix, and the stereoscopic image of the virtual scene is constructed and displayed based on the parallax images corresponding to the left eye and the right eye.
Specifically, in an embodiment of the present invention, after obtaining the new observation matrix, the new observation matrix is injected into the 3D application process by using an injection technique, so as to achieve the purposes of drawing 3D stereoscopic content and realizing synchronization of observation angles.
Specifically, in this embodiment two new observation matrices are obtained, and both need to be injected into the application process to realize synchronization of the viewing angles: the first view and the second view are rendered from the first-view and second-view observation matrices respectively, a stereoscopic image of the virtual scene is generated from the two views in combination with a stereoscopic display technique, and the stereoscopic image is displayed on a naked-eye stereoscopic display screen.
Specifically, the following describes in detail the steps performed in this embodiment with reference to a specific embodiment:
firstly, acquiring an original observation matrix V of a current virtual scene through an interception technology;
then, from the head tracking data Pitch, Yaw and Roll, the rotation matrices of the head about the x, y and z axes, Rotation_Pitch, Rotation_Yaw and Rotation_Roll, can be obtained (the standard three-dimensional rotation matrices);
further, the rotation matrix V_Rotation of the user's head can be obtained:
V_Rotation = Rotation_Pitch * Rotation_Yaw * Rotation_Roll
Of course, in many scenarios (e.g., first-person shooter games) only the rotation of the head about the x-axis and y-axis is of interest, so the rotation matrix about the z-axis can simply be taken as the identity matrix:
Rotation_Roll = E
further, the transformation of the observation matrix comprises: parallax deflection transformation matrix VsepAnd a view angle deflection transformation matrix (i.e., a rotation matrix of the user's head) VRotationWherein the parallax deflection matrix VsepIt is understood that when generating the parallax image, the cameras need to be shifted by ± sep/2 (i.e. moving sep/2 in the first direction to obtain the left camera, and moving sep/2 in the second direction to obtain the right camera), and then the parallax deflection matrices of the left and right cameras are respectively:
existing parallax deflection transformation VsepComprises the following steps:
The transformed observation matrix V′ is then:
V′ = V_Rotation * V_sep * V
Specifically, the transformed first-view observation matrix is V′ = V_Rotation * V_sep_R * V; accordingly, the transformed second-view observation matrix is V′ = V_Rotation * V_sep_L * V. The two transformed matrices V′ are injected into the application program separately, so that stereoscopic images of the virtual scene are generated with the transformed observation matrices, thereby achieving the viewing-angle deflection and realizing synchronization of the observation angles.
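Pulling these pieces together, the sketch below composes the transformed first-view and second-view observation matrices exactly as in the worked example, reusing rot_x, rot_y and parallax_matrices from the earlier sketches; V stands in for the observation matrix obtained by the interception technique, and Rotation_Roll is taken as the identity as in the first-person-shooter case.

```python
import numpy as np

def transformed_view_matrices(v, pitch, yaw, sep):
    """V' = V_Rotation * V_sep * V for the two views (Rotation_Roll = E)."""
    v_rotation = rot_x(pitch) @ rot_y(yaw)
    v_sep_l, v_sep_r = parallax_matrices(sep)
    v_first = v_rotation @ v_sep_r @ v    # first view uses V_sep_R, as above
    v_second = v_rotation @ v_sep_l @ v   # second view uses V_sep_L
    return v_first, v_second

# Usage with an identity stand-in for the intercepted matrix V:
v1, v2 = transformed_view_matrices(np.eye(4), pitch=0.05, yaw=-0.10, sep=0.06)
```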
The fourth specific embodiment:
In this embodiment, the viewing angle synchronization mode includes the observation matrix transformation mode. When it is determined that the viewing angle synchronization mode corresponding to the application program is the observation matrix transformation mode, the head position of the user of the application program is tracked and real-time tracking data of the head position is acquired; when the head position changes, a displacement matrix of the user's head is determined from the real-time tracking data, a new transformation matrix is determined from the displacement matrix, the parallax deflection matrix of the virtual scene and the original transformation matrix of the virtual scene, and a stereoscopic image of the virtual scene is constructed and displayed according to the new transformation matrix.
In many cases (e.g., OpenGL 1.0), only the current global transformation matrix (the product of the world coordinate transformation, the observation coordinate transformation and the projection transformation) can be obtained by the interception technique, so a single one of the world coordinate transformation, the observation coordinate transformation and the projection transformation cannot be modified directly.
As is known, the original transformation matrix is M = P * V * W, where W is the world coordinate transformation, V is the observation coordinate transformation and P is the projection transformation. The viewing transformation as a whole requires modifying the observation coordinate transformation V and the projection matrix P. Let
M′ = P′ * V′ * W
where V′ is the changed observation coordinate transformation and P′ is the changed projection transformation:
V′ = V_Rotation * V_sep * V
where V_Rotation is the viewing-angle deflection matrix and V_sep is the parallax deflection matrix.
Then,
M′ = P′ * V_Rotation * V_sep * V * W
= (P′ * V_Rotation * V_sep * P^-1) * P * V * W
= (P′ * V_sep * V_Rotation * P^-1) * M
where, if the viewing angle is not deflected in the matrix transformation, V_Rotation = E is the identity matrix.
Then
M_L = (P_L * V_sep_L * V_Rotation * P^-1) * M
M_R = (P_R * V_sep_R * V_Rotation * P^-1) * M
where M_L and M_R are the transformation matrices of the deflected left and right cameras respectively, V_sep_L and V_sep_R are the parallax deflection matrices of the left and right cameras respectively, and P_L and P_R are the projection matrices of the deflected left and right cameras respectively. On the premise that the projection matrix is known and the global transformation matrix M can be intercepted, the transformed transformation matrices can be obtained from the parallax deflection matrices and the rotation matrix of the head.
Based on the above principle, in this embodiment, in the observation matrix transformation mode, the head position of the user of the application program is tracked and real-time tracking data is acquired; when the head position changes, a displacement matrix of the user's head is determined from the real-time tracking data, a new transformation matrix is determined from the displacement matrix, the parallax deflection matrix of the virtual scene and the original transformation matrix of the virtual scene, and a stereoscopic image of the virtual scene is constructed and displayed according to the new transformation matrix.
Specifically, the rotation matrix V_Rotation of the user's head is determined from the real-time tracking data; a new transformation matrix is then obtained from V_Rotation, the parallax deflection matrix V_sep, the known projection matrix P and its inverse P^-1, and the stereoscopic image is constructed with the new transformation matrix:
M_L = (P_L * V_sep_L * V_Rotation * P^-1) * M
M_R = (P_R * V_sep_R * V_Rotation * P^-1) * M
It should be understood that the rotation matrix is used for illustration in the above embodiment, but the present invention is not limited to this, and a translation matrix of the head of the user may be constructed when the head of the user is translated, and then a new transformation matrix is obtained by using the translation matrix of the head of the user. In this embodiment, the rotation matrix and the translation matrix may be both referred to as displacement matrices.
How to obtain the rotation matrix of the head, the parallax deflection matrix, and the like are the same as those of the foregoing embodiments, and are not described in detail here.
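A sketch of this embodiment's computation under the same assumptions: when only the combined matrix M = P * V * W can be intercepted, the deflection is folded into M through the known projection matrices and the inverse P^-1, with the operand order following the formulas above; every matrix argument is a stand-in for an intercepted or precomputed value.

```python
import numpy as np

def transformed_global_matrices(m, p, p_l, p_r, v_rotation, v_sep_l, v_sep_r):
    """M_L = (P_L * V_sep_L * V_Rotation * P^-1) * M, and likewise M_R."""
    p_inv = np.linalg.inv(p)
    m_l = p_l @ v_sep_l @ v_rotation @ p_inv @ m   # left-camera matrix
    m_r = p_r @ v_sep_r @ v_rotation @ p_inv @ m   # right-camera matrix
    return m_l, m_r
```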
The fifth specific embodiment:
in this embodiment, the viewing angle synchronization mode includes a projection matrix transformation mode. When it is determined that the viewing angle synchronization mode corresponding to the application program is the projection matrix transformation mode, the stereoscopic display method of the embodiment includes:
step 51, tracking the head position of the user, and acquiring real-time tracking data of the head position of the user;
specifically, as shown in fig. 2, the real-time tracking data of the user's head referred to in this embodiment is consistent with the real-time tracking data of the user's head referred to in the foregoing embodiment, and the description is not repeated here.
Step 52, when the head position of the user changes, transforming the original projection matrix of the virtual scene according to the real-time tracking data to obtain a new projection matrix;
the original projection matrix may also be referred to as a current projection matrix, which is a projection matrix applied to the current stereoscopic image when the head position of the user changes, that is, a projection matrix applied to the stereoscopic image when the observation angle has not changed.
And step 53, constructing and displaying a stereoscopic image of the virtual scene according to the new projection matrix, so as to change the viewing angle of the virtual scene and realize synchronization of the viewing angle of the virtual scene with the viewing angle of the real user after the head position changes.
In principle, the method in this embodiment specifically realizes the deflection of the observation angle by changing the position and angle of the view object (image) without changing the observation position of the camera, thereby realizing the synchronization of the observation angle and optimizing the user experience.
Specifically, step 52 in this embodiment includes:
step 521, determining projection position offset information of a stereoscopic image of a virtual scene according to the real-time tracking data;
and 522, constructing a new projection matrix according to the determined projection position offset information and the original projection matrix of the virtual scene.
In a manner consistent with the manner of obtaining the original observation matrix in the foregoing embodiment, the embodiment obtains the original projection matrix through an interception technique, and a specific process thereof is not described in detail.
Further, in this embodiment, the step 521 includes:
Step 5211: determining the projection position offset information of the stereoscopic image of the virtual scene according to the real-time tracking data and the distance between the viewpoint of the original projection matrix and the near projection plane.
As shown in fig. 3, taking OpenGL as an example, left, right, bottom and top define the size of the clipping window, and Znear and Zfar define the distances from the camera to the near and far clipping planes. These six parameters define a volume bounded by six clipping planes, commonly referred to as the viewing frustum or view volume.
In the present embodiment, the offset of the projection position includes an offset of a position and an offset of an angle, that is, an offset of an angle is calculated from real-time tracking data of the head of the user, and an offset of a position is calculated from a distance (Znear shown in fig. 3) from a viewpoint (e.g., a position of a camera in fig. 3) of the original projection matrix to the current near projection plane.
For example, when the user's head deflects to the left (or up), L and R (or T and B) of the scene volume (image) can be offset simultaneously to the left (or up) by D_H (or D_V) to change the position and angle of projection, where the leftward offset is D_H = Z_near * tan(Pitch) and the upward offset is D_V = Z_near * tan(Yaw);
here Z_near is the distance from the viewpoint of the original projection matrix P to the near projection plane, and Pitch and Yaw are the tracked rotation data (in radians) of the user's head about the x-axis and y-axis respectively.
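These offset formulas transcribe directly into code; a minimal sketch, assuming the angles arrive in radians:

```python
import numpy as np

def projection_offsets(pitch: float, yaw: float, z_near: float):
    """Offsets of the projection window from the tracked head angles."""
    d_h = z_near * np.tan(pitch)   # leftward offset D_H
    d_v = z_near * np.tan(yaw)     # upward offset D_V
    return d_h, d_v
```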
Further, the new projection matrix in this embodiment includes a first view projection matrix and a second view projection matrix, which may also be referred to as a left view projection matrix and a right view projection matrix; in the fifth embodiment of the present invention, the step 522 includes:
step 5221, constructing a first view projection matrix and a second view projection matrix according to the determined projection position offset information and the original projection matrix.
The projection position of the current image is moved by the calculated offset to obtain the shifted projection position. It should be noted that the projection matrix in this embodiment is handled in the same way as the observation matrix in the foregoing embodiments: the required projection matrix is obtained by processing the data and filling it into a preset format. The description is not repeated here.
Further, step 53 in the fifth embodiment of the present invention specifically includes:
Step 531: generating a third view of the virtual scene by using the first view projection matrix, and generating a fourth view of the virtual scene by using the second view projection matrix;
Step 532: constructing and displaying a stereoscopic image of the virtual scene according to the third view and the fourth view generated by rendering. In the fifth embodiment of the present invention, the projection matrix of the shifted image is likewise injected into the application program by an injection technique, so as to shift the observation angle and thereby achieve synchronized deflection of the viewing angle.
In other words, in the projection matrix transformation mode, an original rendering pipeline of the application program is intercepted, and the original rendering pipeline is modified according to the new projection matrix, so that parallax images corresponding to the left eye and the right eye of the user of the application program are generated by rendering with the new projection matrix, and then a stereoscopic image of a virtual scene is constructed and displayed based on the parallax images corresponding to the left eye and the right eye.
Specifically, the following describes the stereoscopic display method provided in this embodiment in detail with reference to a specific embodiment:
for example, the user's head is deflected to the left (up), L, R (T, B) of the scene volume (image) can be simultaneously offset to the left (up) by DH(DV) To change the position and angle of projection, with an offset D to the leftHComprises the following steps: dH=ZnearTan (pitch), upward offset DVComprises the following steps: dV=Znear*tan(Yaw);
Therein is provided with ZnearPitch, Yaw tracks the rotational data (radians) about the x-axis and y-axis for the user's head for the viewpoint-to-near projection plane distance of the original projection matrix P.
The result after transforming the scene volume (the projection of the image) is as follows. Taking OpenGL as an example, as shown in fig. 3, left, right, bottom, and top define the size of the clipping window, while Znear and Zfar define the distances from the camera to the near and far clipping planes. These six parameters define a frustum bounded by six clipping planes, commonly referred to as the view frustum or view volume.
Let L = left, R = right, B = bottom, T = top, Znear = near, and Zfar = far; the projection matrix can then be expressed in the standard OpenGL frustum form:

    P = | 2*near/(R-L)  0             (R+L)/(R-L)             0                      |
        | 0             2*near/(T-B)  (T+B)/(T-B)             0                      |
        | 0             0             -(far+near)/(far-near)  -2*far*near/(far-near) |
        | 0             0             -1                      0                      |
the modified projection matrix P' is
The modified projection matrix P' can then be injected into the application program; that is, the stereoscopic image of the virtual scene is constructed using P', so as to deflect the viewing angle, synchronize the observation angle, and optimize the user experience.
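A minimal sketch of this construction follows, assuming the glFrustum-style matrix shown above; the Mat4 alias and the function names are illustrative, not part of the patent.

    #include <array>

    using Mat4 = std::array<std::array<double, 4>, 4>;

    // Standard OpenGL perspective projection for the clipping window
    // (l, r, b, t) at distance n, with far plane at f (written row by row).
    Mat4 frustum(double l, double r, double b, double t, double n, double f) {
        return {{
            {{ 2*n/(r-l), 0.0,       (r+l)/(r-l),  0.0          }},
            {{ 0.0,       2*n/(t-b), (t+b)/(t-b),  0.0          }},
            {{ 0.0,       0.0,       -(f+n)/(f-n), -2*f*n/(f-n) }},
            {{ 0.0,       0.0,       -1.0,         0.0          }}
        }};
    }

    // P': shift the clipping window left by dH and up by dV, as in the example.
    Mat4 shiftedProjection(double l, double r, double b, double t,
                           double n, double f, double dH, double dV) {
        return frustum(l - dH, r - dH, b + dV, t + dV, n, f);
    }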
It should be noted that, in an embodiment of the present invention, when constructing a stereoscopic image of a virtual scene for an application program, the projection matrix of the application itself may first be obtained through an interception technique; then, after the user's observation angle changes, the intercepted projection matrix is used as the original projection matrix for the subsequent transformation.
The sixth specific embodiment:
in this embodiment, the viewing angle synchronization mode includes a projection matrix transformation mode. When it is determined that the viewing angle synchronization mode corresponding to the application program is the projection matrix transformation mode, the stereoscopic display method of this embodiment includes: tracking the head position of the user and acquiring real-time tracking data of the head position; determining projection position offset information according to the real-time tracking data; determining a new projection matrix according to the projection position offset information; obtaining a new transformation matrix from the new projection matrix, the inverse of the original projection matrix, the parallax deflection matrix, and the original transformation matrix using the following formulas (the principle is described in the fourth specific embodiment); and constructing and displaying the stereoscopic image of the application program using the new transformation matrix.
M_L = (P_L * V_sep_L * V_rotation * P^(-1)) * M
M_R = (P_R * V_sep_R * V_rotation * P^(-1)) * M
where V_rotation = E, the identity matrix.
Obtaining the rotation matrix of the head, the parallax deflection matrix, the new projection matrix, and so on is the same as in the previous embodiments and is not described in detail here.
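For illustration, the composition of the new transformation matrix can be sketched as follows; Pinv is assumed to be the precomputed inverse of the original projection matrix P (e.g., obtained through the interception described above), and all names are illustrative.

    #include <array>

    using Mat4 = std::array<std::array<double, 4>, 4>;  // as in the sketch above

    // Row-by-row 4x4 matrix product C = A * B.
    Mat4 multiply(const Mat4& A, const Mat4& B) {
        Mat4 C{};
        for (int i = 0; i < 4; ++i)
            for (int j = 0; j < 4; ++j)
                for (int k = 0; k < 4; ++k)
                    C[i][j] += A[i][k] * B[k][j];
        return C;
    }

    // M_eye = (P_eye * V_sep_eye * V_rotation * Pinv) * M, with V_rotation = E
    // (the identity) in this mode; call once per eye with P_L/V_sep_L and
    // P_R/V_sep_R to obtain M_L and M_R.
    Mat4 newTransform(const Mat4& P_eye, const Mat4& V_sep_eye,
                      const Mat4& V_rotation, const Mat4& Pinv, const Mat4& M) {
        return multiply(multiply(multiply(multiply(P_eye, V_sep_eye), V_rotation),
                                 Pinv), M);
    }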
The seventh specific embodiment:
in this embodiment, the viewing angle synchronization mode includes a mouse conversion mode. When it is determined that the viewing angle synchronization mode corresponding to the application program is the mouse conversion mode, the stereoscopic display method of the embodiment includes:
and step 71, tracking the head position of the user and acquiring real-time tracking data of the head position of the user.
Optionally, if the user watches the 3D content through a wearable device, the head position of the user can be tracked by a sensing device such as a speed sensor or an acceleration sensor to obtain real-time tracking data of the head position; in a naked-eye 3D scenario, the head position can instead be tracked by a camera. A person skilled in the art may choose either approach freely.
Specifically, the real-time tracking data of the head position of the user may include the real-time rotation angle of the head in three-dimensional space; by establishing a three-dimensional coordinate system in advance, the rotation angles of the user's head relative to the X axis, the Y axis, and the Z axis can be represented uniformly. Of course, the real-time tracking data may also include the real-time translation distance of the head in three-dimensional space, which is handled similarly to the real-time rotation angle and is not described in detail here.
For example, as shown in fig. 2, the real-time tracking data of the user's head includes the rotation angles of the head in three-dimensional space (Pitch, Yaw, Roll), where Pitch is the rotation angle of the user's head relative to the x-axis, Yaw is the rotation angle relative to the y-axis, and Roll is the rotation angle relative to the z-axis.
Step 72, determining the movement amount of the simulated mouse for the virtual scene according to the real-time tracking data of the head position of the user.
The stereoscopic display method provided by this embodiment can be applied to application scenarios that support rotating the viewing angle with the mouse, i.e., where mouse movement controls the deflection of the viewing angle. The real-time tracking data of the user's head position is bound to the movement amount of the mouse, so the movement amount can be determined from the real-time tracking data, such as the real-time rotation angle of the user's head.
Specifically, the mouse may be a virtual mouse or an actual mouse; however, so as not to disturb the overall picture of the 3D content, the mouse referred to in the present application is generally a virtual one, that is, a simulated mouse used in the virtual scene. The simulated mouse corresponds to an actual mouse in the real scene: when the actual mouse moves, the simulated mouse moves correspondingly, and the 3D content moves with it. It should be understood that the simulated mouse may be present in the virtual scene, or may exist only as background data controlling the change of the viewing angle of the virtual scene. In this embodiment, the change in the position of the user's head is associated with the position of the simulated mouse, thereby changing the perspective of the virtual scene.
Step 73, modifying the position information of the simulated mouse according to the movement amount of the simulated mouse, so as to generate and display a three-dimensional image of the virtual scene according to the modified position information; the observation angle of the virtual scene is thereby changed and synchronized with the user's observation angle after the head position changes.
Specifically, after the movement amount of the simulated mouse is obtained, the position information of the simulated mouse is modified so that the application program (for example, a 3D game) treats it as mouse movement; the game picture then changes with the movement of the simulated mouse, achieving synchronization between the game picture and the rotation of the user's head.
Although this method may affect normal mouse operation, it does not touch the game's own pipeline or flow for viewing-angle conversion, so the operation stays closer to the original game itself.
Preferably, to prevent sudden fluctuations in the head position data from abruptly changing the observation angle and causing the user to feel dizzy, in a specific embodiment of the present invention, before the movement amount of the simulated mouse for the virtual scene is determined from the real-time tracking data, the real-time tracking data is subjected to smoothing filtering, and the movement amount of the simulated mouse is then determined from the smoothed real-time tracking data. When the stereoscopic image is subsequently constructed and displayed to change the observation angle, this weakens the user's discomfort to a certain extent and improves the user experience.
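As one illustrative possibility (the patent requires only that the data be smoothed, not a particular filter), the smoothing filtering could be an exponential moving average over the tracked angles:

    // Minimal sketch; struct and class names are assumptions for illustration.
    struct HeadSample { double pitch, yaw, roll; };

    class TrackingSmoother {
    public:
        explicit TrackingSmoother(double alpha) : alpha_(alpha) {}

        // Blend each raw sample into the running state; the first sample
        // initializes the state directly.
        HeadSample filter(const HeadSample& raw) {
            if (!initialized_) { state_ = raw; initialized_ = true; return state_; }
            state_.pitch += alpha_ * (raw.pitch - state_.pitch);
            state_.yaw   += alpha_ * (raw.yaw   - state_.yaw);
            state_.roll  += alpha_ * (raw.roll  - state_.roll);
            return state_;
        }

    private:
        double alpha_;            // 0 < alpha <= 1; smaller = smoother, more lag
        HeadSample state_{};
        bool initialized_ = false;
    };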
Specifically, step 72 in this embodiment may include:
Step 721, determining the real-time rotation angle of the head of the user according to the real-time tracking data.
It will be appreciated that in many scenarios, such as first-person shooter games, only the rotation of the head about the X and Y axes is of interest, so the real-time rotation angle of the user's head can be determined directly from the X-axis and Y-axis data.
Step 722, determining the movement amount of the simulated mouse for the virtual scene according to the real-time rotation angle of the head of the user.
Further, step 71 in this embodiment includes:
step 711, acquiring tracking data of a current frame of the user head and tracking data of a previous frame of the user head;
For example, Pitch (the rotation angle of the head around the x-axis) and Yaw (the rotation angle of the head around the y-axis) in the head tracking data of the current frame are taken as reference values, and the Pitch and Yaw of the previous frame are then obtained. The rotation angle of the user's head can be accurately determined from the tracking data of the current frame and that of the previous frame.
Step 712, obtaining a difference between the tracking data of the current frame and the tracking data of the previous frame, and determining the real-time rotation angle according to the difference.
Specifically, ΔPitch and ΔYaw are obtained by subtracting the Pitch (rotation angle of the head around the x-axis) and Yaw (rotation angle of the head around the y-axis) of the previous frame from those of the current frame. ΔPitch and ΔYaw are the real-time rotation angles of the user's head along the X-axis and the Y-axis, respectively.
Further, in this embodiment, step 722 includes:
step 7221, multiplying the real-time rotation angle of the user's head by a preset conversion coefficient to obtain the movement amount of the simulated mouse for the virtual scene.
The ΔPitch and ΔYaw obtained in step 712 are each multiplied by a preset conversion coefficient, which is the conversion factor between head rotation angle and mouse offset, to obtain Δx and Δy. The mouse position is then changed by Δx and Δy: mouse.x += Δx; mouse.y += Δy. Moving the simulated mouse by Δx along the X-axis and by Δy along the Y-axis deflects the viewing angle, thereby synchronizing the user's observation angle.
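A minimal sketch of this per-frame conversion is given below; the pairing of ΔPitch with Δx and ΔYaw with Δy follows the text above, and the names and coefficients are illustrative assumptions.

    struct MouseState { double x = 0.0, y = 0.0; };

    // pitch/yaw in radians; kPitch/kYaw are the preset conversion coefficients
    // between head rotation angle and mouse offset.
    void applyHeadDelta(MouseState& mouse,
                        double pitchCur,  double yawCur,    // current frame
                        double pitchPrev, double yawPrev,   // previous frame
                        double kPitch,    double kYaw) {
        const double dPitch = pitchCur - pitchPrev;  // step 712
        const double dYaw   = yawCur   - yawPrev;
        mouse.x += dPitch * kPitch;                  // step 7221: mouse.x += dx
        mouse.y += dYaw   * kYaw;                    //            mouse.y += dy
    }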
In the embodiments of the invention, under the various synchronization modes, the original rendering pipeline content of an application program can be intercepted and modified, and the data to be configured can be injected into a configuration file, so that the modified pipeline and content generate the parallax images corresponding to the left and right eyes. After the application program starts, the related configuration information is loaded and applied to the rendering pipeline units at the relevant stages or to other flows, completing the modification and transformation. The injected information includes, for example, stereoscopic display parameters and viewing-angle mode parameters. In actual use, a unique identification of the game (such as a Hash value or an application package name) can be obtained and used to configure different parameters for each application.
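By way of illustration only, the per-application configuration lookup described here might be sketched as follows; the enumeration values and configuration fields are assumptions, not the patent's data structures.

    #include <string>
    #include <unordered_map>

    enum class SyncMode { ObservationMatrix, ProjectionMatrix, MouseConversion };

    struct AppConfig {
        SyncMode mode;
        double parallaxSeparation;  // a stereoscopic display parameter (illustrative)
    };

    // appId: the application's unique identification (Hash value or package name).
    SyncMode modeFor(const std::string& appId,
                     const std::unordered_map<std::string, AppConfig>& table,
                     SyncMode fallback) {
        auto it = table.find(appId);
        return it != table.end() ? it->second.mode : fallback;
    }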
In order to better achieve the above object, as shown in fig. 4, the present invention also provides a stereoscopic display device for virtual and real scenes, comprising:
an obtaining module 61, configured to obtain identification information of an application;
a determining module 62, configured to determine, according to the obtained identification information, an observation view synchronization mode corresponding to the application program;
and a display module 63, configured to construct and display a stereoscopic image of the virtual scene of the application program according to the determined observation perspective synchronization mode when the head position of the user of the application program changes, so as to change the observation perspective of the virtual scene, and implement synchronization between the observation perspective in the virtual scene and the observation perspective after the head position of the user changes.
Specifically, in the fifth embodiment of the present invention, the viewing angle synchronization mode includes at least one of the following modes: observing a matrix transformation mode, a projection matrix transformation mode and a mouse transformation mode;
the display module may include:
the observation matrix transformation module is used for tracking the head position of the user of the application program in the observation matrix transformation mode, acquiring real-time tracking data of the head position of the user, transforming an original observation matrix of the virtual scene according to the real-time tracking data when the head position of the user changes to obtain a new observation matrix, and constructing and displaying a three-dimensional image of the virtual scene of the application program according to the new observation matrix;
or,
tracking the head position of a user of the application program, acquiring real-time tracking data of the head position of the user, determining a displacement matrix of the head of the user according to the real-time tracking data when the head position of the user changes, determining a new transformation matrix according to the displacement matrix of the head of the user, a parallax deflection matrix of the virtual scene and an original transformation matrix of the virtual scene, and constructing and displaying a three-dimensional image of the virtual scene according to the new transformation matrix;
the display module may include:
the projection matrix transformation module is used for tracking the head position of the user of the application program in the projection matrix transformation mode, acquiring real-time tracking data of the head position of the user, transforming an original projection matrix of a virtual scene according to the real-time tracking data when the head position of the user changes to obtain a new projection matrix, and constructing and displaying a stereoscopic image of the virtual scene of the application program according to the new projection matrix;
the display module may include:
the mouse conversion module is used for tracking the head position of the user of the application program in the mouse conversion mode and determining the movement amount of the simulated mouse for the virtual scene according to the real-time tracking data of the head position of the user; and modifying the position information of the simulated mouse according to the movement amount of the simulated mouse so as to generate and display a three-dimensional image of the virtual scene according to the modified position information of the simulated mouse.
Specifically, the identification information of the application includes a Hash value corresponding to the application or an installation package name of the application.
Specifically, the determining module 62 includes:
the first determining submodule is used for determining an observation visual angle synchronous mode corresponding to the application program according to the acquired identification information and the corresponding relation between the preset application program identification information and the observation visual angle synchronous mode;
or
The second determining submodule is used for determining the display scene type of the application program according to the acquired identification information;
and the third determining submodule is used for determining an observation visual angle synchronous mode corresponding to the application program according to the determined display scene type.
Specifically, the observation matrix transformation module includes:
and the first transformation submodule is used for firstly carrying out smooth filtering processing on the real-time tracking data under the observation matrix transformation mode to obtain the real-time tracking data after the smooth filtering processing, and then carrying out subsequent processing by using the real-time tracking data after the smooth filtering processing.
Specifically, in the fifth embodiment of the present invention, the observation matrix transformation module further includes:
and the second transformation submodule is used for determining a rotation matrix of the head of the user according to the real-time tracking data in the observation matrix transformation mode, and determining a new observation matrix according to the rotation matrix of the head of the user, the parallax deflection matrix of the virtual scene and the observation matrix of the virtual scene.
Specifically, in the fifth embodiment of the present invention, the observation matrix transformation module further includes:
and the third transformation submodule is used for determining the visual distance information of the virtual scene according to the preset configuration or the setting parameters input by the user and determining the parallax deflection matrix of the virtual scene according to the determined visual distance information in the observation matrix transformation mode.
Specifically, in the fifth embodiment of the present invention, the observation matrix transformation module further includes:
and the fourth transformation submodule is used for intercepting an original rendering pipeline of the application program in the observation matrix transformation mode, modifying the original rendering pipeline according to the new observation matrix, so that parallax images corresponding to the left eye and the right eye of the user of the application program are generated by rendering the new observation matrix, and further, constructing and displaying a three-dimensional image of a virtual scene based on the parallax images corresponding to the left eye and the right eye.
Specifically, the projection matrix transformation module includes:
and the fifth transformation submodule is used for firstly carrying out smooth filtering processing on the real-time tracking data under the projection matrix transformation mode to obtain the real-time tracking data after the smooth filtering processing, and then carrying out subsequent processing by using the real-time tracking data after the smooth filtering processing.
Specifically, the projection matrix transformation module includes:
and the sixth transformation submodule is used for determining the projection position offset information of the three-dimensional image of the virtual scene according to the real-time tracking data in the projection matrix transformation mode, and constructing a new projection matrix according to the determined projection position offset information and the projection matrix of the virtual scene.
Specifically, the projection matrix transformation module further includes:
and the seventh transformation submodule is used for determining the projection position offset information according to the real-time tracking data of the head of the user and the distance between the viewpoint of the original projection matrix and a near projection plane in the projection matrix transformation mode.
Specifically, the projection matrix transformation module further includes:
and the eighth transformation submodule is used for intercepting an original rendering pipeline of the application program in the projection matrix transformation mode, modifying the original rendering pipeline according to the new projection matrix, so that parallax images corresponding to the left eye and the right eye of the user of the application program are generated by rendering the new projection matrix, and further, constructing and displaying a stereoscopic image of a virtual scene based on the parallax images corresponding to the left eye and the right eye.
Specifically, the mouse conversion module includes:
and the eighth transformation submodule is used for firstly carrying out smooth filtering processing on the real-time tracking data under the mouse transformation mode to obtain the real-time tracking data after the smooth filtering processing, and then carrying out subsequent processing by using the real-time tracking data after the smooth filtering processing.
Specifically, the mouse conversion module includes:
and the ninth transformation submodule is used for determining the real-time rotation angle of the head of the user according to the real-time tracking data of the head position in the mouse transformation mode, and determining the movement amount of the simulated mouse for the virtual scene according to the real-time rotation angle of the head of the user.
Specifically, in the fifth embodiment of the present invention, the mouse transformation module further includes:
and the tenth transformation submodule is used for acquiring the tracking data of the current frame of the head of the user and the tracking data of the previous frame of the head of the user in the mouse transformation mode, acquiring the difference value of the tracking data of the current frame and the tracking data of the previous frame, and determining the real-time rotation angle according to the difference value.
It should be noted that, the stereoscopic display device for virtual and real scenes provided by the embodiment of the present invention is a device to which the above stereoscopic display method for virtual and real scenes is applied, and all the embodiments of the above stereoscopic display method for virtual and real scenes are applicable to the stereoscopic display device for virtual and real scenes, and can achieve the same or similar beneficial effects.
In order to better achieve the above object, a seventh embodiment of the present invention further provides an electronic device for virtual and real scenes, including:
the device comprises a shell, a processor, a memory, a display, a circuit board and a power circuit, wherein the circuit board is arranged in a space enclosed by the shell, and the processor and the memory are arranged on the circuit board; a power supply circuit for supplying power to each circuit or device of the electronic apparatus; the memory is used for storing executable program codes; the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory for performing the steps of:
acquiring identification information of an application program;
determining an observation visual angle synchronization mode corresponding to the application program according to the acquired identification information;
and according to the determined observation visual angle synchronization mode, when the head position of the user of the application program changes, constructing and displaying a three-dimensional image of the virtual scene of the application program, so that the observation visual angle of the virtual scene is changed, and the observation visual angle under the virtual scene is synchronized with the observation visual angle after the head position of the user changes.
The electronic device exists in a variety of forms, including but not limited to:
(1) Mobile communication devices: such devices are characterized by mobile communication capability and are primarily targeted at providing voice and data communication. Such terminals include smart phones (e.g., iPhone), multimedia phones, feature phones, low-end phones, and the like.
(2) Ultra-mobile personal computer devices: such devices belong to the category of personal computers, have computing and processing functions, and generally also have mobile internet access. Such terminals include PDA, MID, and UMPC devices, such as the iPad.
(3) Portable entertainment devices: such devices can display and play multimedia content. This category includes audio and video players (e.g., iPod), handheld game consoles, electronic book readers, smart toys, and portable car navigation devices.
(4) Other electronic devices with data interaction functions.
It should be noted that the electronic device provided in the embodiment of the present invention is an electronic device to which the stereoscopic display method for virtual and real scenes can be applied, and all embodiments of the stereoscopic display method and the beneficial effects thereof are applicable to the electronic device.
While the foregoing is directed to the preferred embodiment of the present invention, it will be understood by those skilled in the art that various changes and modifications may be made without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (14)

1. A stereoscopic display method for virtual and real scenes, comprising:
acquiring identification information of an application program;
determining an observation visual angle synchronization mode corresponding to the application program according to the acquired identification information;
according to the determined observation visual angle synchronization mode, when the head position of the user of the application program changes, a three-dimensional image of the virtual scene of the application program is constructed and displayed, so that the observation visual angle of the virtual scene is changed, the observation visual angle under the virtual scene is synchronized with the observation visual angle after the head position of the user changes,
wherein the viewing perspective synchronization pattern comprises at least one of: observing a matrix transformation mode, a projection matrix transformation mode and a mouse transformation mode;
in the observation matrix transformation mode:
tracking the head position of a user of the application program, acquiring real-time tracking data of the head position of the user, transforming an original observation matrix of a virtual scene according to the real-time tracking data when the head position of the user changes to obtain a new observation matrix, and constructing and displaying a three-dimensional image of the virtual scene of the application program according to the new observation matrix;
or
Tracking the head position of a user of the application program, acquiring real-time tracking data of the head position of the user, determining a displacement matrix of the head of the user according to the real-time tracking data when the head position of the user changes, determining a new transformation matrix according to the displacement matrix of the head of the user, a parallax deflection matrix of a virtual scene and an original transformation matrix of the virtual scene, and constructing and displaying a three-dimensional image of the virtual scene according to the new transformation matrix;
tracking the head position of a user of the application program in the projection matrix transformation mode, acquiring real-time tracking data of the head position of the user, transforming an original projection matrix of a virtual scene according to the real-time tracking data when the head position of the user changes to obtain a new projection matrix, and constructing and displaying a stereoscopic image of the virtual scene of the application program according to the new projection matrix;
tracking the head position of a user of the application program in the mouse conversion mode, and determining the movement amount of the simulated mouse for the virtual scene according to the real-time tracking data of the head position of the user; and modifying the position information of the simulated mouse according to the movement amount of the simulated mouse so as to generate and display a three-dimensional image of a virtual scene according to the modified position information of the simulated mouse.
2. The stereoscopic display method according to claim 1, wherein the identification information of the application program comprises a Hash value corresponding to the application program or an installation package name of the application program.
3. The stereoscopic display method according to claim 1, wherein the step of determining the viewing angle synchronization pattern corresponding to the application program according to the acquired identification information comprises:
determining an observation visual angle synchronous mode corresponding to the application program according to the acquired identification information and the corresponding relation between the preset application program identification information and the observation visual angle synchronous mode;
or
Determining the display scene type of the application program according to the acquired identification information;
and determining an observation visual angle synchronous mode corresponding to the application program according to the determined display scene type.
4. The stereoscopic display method according to claim 1,
and under the observation matrix conversion mode, the projection matrix conversion mode or the mouse conversion mode, firstly, carrying out smooth filtering processing on the real-time tracking data to obtain the real-time tracking data after the smooth filtering processing, and then carrying out subsequent processing by using the real-time tracking data after the smooth filtering processing.
5. The stereoscopic display method according to claim 1,
and under the observation matrix transformation mode, determining a rotation matrix of the head of the user according to the real-time tracking data, and determining a new observation matrix according to the rotation matrix of the head of the user, the parallax deflection matrix of the virtual scene and the observation matrix of the virtual scene.
6. The stereoscopic display method according to claim 5, wherein in the observation matrix conversion mode, the parallax distance information of the virtual scene is determined according to a preset configuration or according to a setting parameter input by a user, and the parallax deflection matrix of the virtual scene is determined according to the determined parallax distance information.
7. The stereoscopic display method according to claim 5, wherein in the observation matrix transformation mode, an original rendering pipeline of the application program is intercepted, and the original rendering pipeline is modified according to the new observation matrix, so as to realize that the parallax images corresponding to the left and right eyes of the user of the application program are generated by rendering with the new observation matrix, and further construct and display the stereoscopic image of the virtual scene based on the parallax images corresponding to the left and right eyes.
8. The stereoscopic display method according to claim 1, wherein in the projection matrix conversion mode, projection position offset information of a stereoscopic image of a virtual scene is determined based on the real-time tracking data, and a new projection matrix is constructed based on the determined projection position offset information and the projection matrix of the virtual scene.
9. The stereoscopic display method according to claim 8, wherein the projection position offset information is determined according to real-time tracking data of the head of the user and a distance from a viewpoint of the original projection matrix to a near projection plane in the projection matrix conversion mode.
10. The stereoscopic display method according to claim 8, wherein in the projection matrix transformation mode, an original rendering pipeline of the application program is intercepted, and the original rendering pipeline is modified according to the new projection matrix, so that parallax images corresponding to left and right eyes of a user of the application program are generated by rendering with the new projection matrix, and then a stereoscopic image of a virtual scene is constructed and displayed based on the parallax images corresponding to the left and right eyes.
11. The stereoscopic display method according to claim 1, wherein in the mouse conversion mode, a real-time rotation angle of the head of the user is determined based on real-time tracking data of the head position, and a movement amount of the simulated mouse for the virtual scene is determined based on the real-time rotation angle of the head of the user.
12. The stereoscopic display method according to claim 11, wherein in the mouse conversion mode, tracking data of a current frame of the head of the user and tracking data of a previous frame of the head of the user are obtained, a difference value between the tracking data of the current frame and the tracking data of the previous frame is obtained, and the real-time rotation angle is determined from the difference value.
13. A stereoscopic display apparatus for virtual and real scenes, comprising:
the acquisition module is used for acquiring the identification information of the application program;
the determining module is used for determining an observation angle synchronization mode which corresponds to the application program and can change the observation angle of the virtual scene watched by the user when the user changes the head position so as to change the observation angle of the eyes;
a display module, configured to construct and display a stereoscopic image of a virtual scene of the application program according to the determined observation angle synchronization mode when the head position of the user of the application program changes, so as to change the observation angle of the virtual scene, and implement synchronization between the observation angle in the virtual scene and the observation angle after the head position of the user changes,
wherein the viewing perspective synchronization pattern comprises at least one of: observing a matrix transformation mode, a projection matrix transformation mode and a mouse transformation mode;
the display module comprises an observation matrix transformation module and/or a projection matrix transformation module and/or a mouse transformation module;
the observation matrix transformation module is used for tracking the head position of a user of the application program in the observation matrix transformation mode, acquiring real-time tracking data of the head position of the user, transforming an original observation matrix of a virtual scene according to the real-time tracking data when the head position of the user changes to obtain a new observation matrix, and constructing and displaying a three-dimensional image of the virtual scene of the application program according to the new observation matrix;
or
Tracking the head position of a user of the application program, acquiring real-time tracking data of the head position of the user, determining a displacement matrix of the head of the user according to the real-time tracking data when the head position of the user changes, determining a new transformation matrix according to the displacement matrix of the head of the user, a parallax deflection matrix of a virtual scene and an original transformation matrix of the virtual scene, and constructing and displaying a three-dimensional image of the virtual scene according to the new transformation matrix;
the projection matrix transformation module is used for tracking the head position of a user of the application program in the projection matrix transformation mode, acquiring real-time tracking data of the head position of the user, transforming an original projection matrix of a virtual scene according to the real-time tracking data when the head position of the user changes to obtain a new projection matrix, and constructing and displaying a three-dimensional image of the virtual scene of the application program according to the new projection matrix;
the mouse transformation module is used for tracking the head position of a user of the application program in the mouse transformation mode and determining the movement amount of the simulated mouse for the virtual scene according to the real-time tracking data of the head position of the user; and modifying the position information of the simulated mouse according to the movement amount of the simulated mouse so as to generate and display a three-dimensional image of a virtual scene according to the modified position information of the simulated mouse.
14. An electronic device for virtual and real scenes, comprising:
the device comprises a shell, a processor, a memory, a display, a circuit board and a power circuit, wherein the circuit board is arranged in a space enclosed by the shell, and the processor and the memory are arranged on the circuit board; a power supply circuit for supplying power to each circuit or device of the electronic apparatus; the memory is used for storing executable program codes; the processor runs a program corresponding to the executable program code by reading the executable program code stored in the memory for performing the steps of:
acquiring identification information of an application program;
according to the acquired identification information, determining an observation visual angle synchronization mode which corresponds to the application program and can change the observation visual angle of the virtual scene watched by the user when the user changes the head position so as to change the observation visual angle of the eyes;
according to the determined observation visual angle synchronization mode, when the head position of the user of the application program changes, a three-dimensional image of the virtual scene of the application program is constructed and displayed, so that the observation visual angle of the virtual scene is changed, the observation visual angle under the virtual scene is synchronized with the observation visual angle after the head position of the user changes,
wherein the viewing perspective synchronization pattern comprises at least one of: observing a matrix transformation mode, a projection matrix transformation mode and a mouse transformation mode;
in the observation matrix transformation mode:
tracking the head position of a user of the application program, acquiring real-time tracking data of the head position of the user, transforming an original observation matrix of a virtual scene according to the real-time tracking data when the head position of the user changes to obtain a new observation matrix, and constructing and displaying a three-dimensional image of the virtual scene of the application program according to the new observation matrix;
or
Tracking the head position of a user of the application program, acquiring real-time tracking data of the head position of the user, determining a displacement matrix of the head of the user according to the real-time tracking data when the head position of the user changes, determining a new transformation matrix according to the displacement matrix of the head of the user, a parallax deflection matrix of a virtual scene and an original transformation matrix of the virtual scene, and constructing and displaying a three-dimensional image of the virtual scene according to the new transformation matrix;
tracking the head position of a user of the application program in the projection matrix transformation mode, acquiring real-time tracking data of the head position of the user, transforming an original projection matrix of a virtual scene according to the real-time tracking data when the head position of the user changes to obtain a new projection matrix, and constructing and displaying a stereoscopic image of the virtual scene of the application program according to the new projection matrix;
tracking the head position of a user of the application program in the mouse conversion mode, and determining the movement amount of the simulated mouse for the virtual scene according to the real-time tracking data of the head position of the user; and modifying the position information of the simulated mouse according to the movement amount of the simulated mouse so as to generate and display a three-dimensional image of a virtual scene according to the modified position information of the simulated mouse.
Granted publication date: 20170718