
CN107808409B - Method and device for performing illumination rendering in augmented reality and mobile terminal


Info

Publication number
CN107808409B
CN107808409B
Authority
CN
China
Prior art keywords
light source
real
scene
real scene
boundary line
Prior art date
Legal status
Active
Application number
CN201610810809.XA
Other languages
Chinese (zh)
Other versions
CN107808409A (en)
Inventor
邵红胜 (Shao Hongsheng)
Current Assignee
ZTE Corp
Original Assignee
ZTE Corp
Priority date
Filing date
Publication date
Application filed by ZTE Corp filed Critical ZTE Corp
Priority to CN201610810809.XA priority Critical patent/CN107808409B/en
Priority to PCT/CN2017/081402 priority patent/WO2018045759A1/en
Publication of CN107808409A publication Critical patent/CN107808409A/en
Application granted granted Critical
Publication of CN107808409B publication Critical patent/CN107808409B/en

Classifications

    • G06T 15/506: Illumination models (3D image rendering; lighting effects)
    • G06T 15/50: Lighting effects (3D image rendering)
    • G06T 15/06: Ray-tracing (3D image rendering)
    • G06T 15/60: Shadow generation (3D image rendering; lighting effects)
    • G06T 19/006: Mixed reality (manipulating 3D models or images for computer graphics)
    • G06T 2200/04: Indexing scheme for image data processing or generation involving 3D image data
    • G06T 2215/12: Shadow map, environment map (indexing scheme for image rendering)
    • G06T 2215/16: Using real world measurements to influence rendering (indexing scheme for image rendering)

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Graphics (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention provides a method, a device and a mobile terminal for performing illumination rendering in augmented reality, wherein the method comprises the following steps: determining the position and brightness of a light source in a real scene; and performing illumination rendering on the virtual object in the virtual scene according to the position and the brightness of the light source in the real scene. The method and the device can acquire the light source information in the real scene of the augmented reality in real time, and dynamically perform illumination rendering on the virtual object in the virtual scene of the augmented reality according to the light source information in the real scene, so that the virtual object in the virtual scene can be better fused with the real scene, and the visual experience of a user is closer to reality.

Description

Method and device for performing illumination rendering in augmented reality and mobile terminal
Technical Field
The invention relates to the technical field of augmented reality, in particular to a method and a device for performing illumination rendering in augmented reality and a mobile terminal.
Background
Augmented Reality (AR) is a technology that calculates the position and angle of the camera image in real time and adds a corresponding virtual model; its goal is to overlay the virtual model on the real world shown on the screen and allow interaction with it. However, the rendering of virtual models has long been a key factor affecting the visual experience in augmented reality, and the illumination rendering of the virtual model has the largest influence of all. In the existing augmented reality technology, the virtual model is simply rendered under a fixed light source, so the realism of the virtual model cannot meet the requirement.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a method, a device and a mobile terminal for performing illumination rendering in augmented reality, so that the fidelity of a virtual object is enhanced, and the visual effect of augmented reality is improved.
The technical scheme adopted by the invention is that the method for performing illumination rendering in augmented reality comprises the following steps:
determining the position and brightness of a light source in a real scene;
and performing illumination rendering on the virtual object in the virtual scene according to the position and the brightness of the light source in the real scene.
Further, the determining the position and the brightness of the light source in the real scene includes:
determining the three-dimensional coordinate position of the light source in the real scene according to the position relation between any real object and the shadow thereof in the real scene;
and calculating the brightness values of all position points in the real scene, wherein the maximum brightness value in the real scene is the brightness of the light source in the real scene.
Further, the determining the three-dimensional coordinate position of the light source in the real scene according to the position relationship between any real object in the real scene and the shadow thereof includes:
determining a solid boundary line and a shadow boundary line of any real object in the real scene;
selecting feature points on the solid boundary line and the shadow boundary line of the real object, respectively;
determining feature vectors according to the correspondence between the feature points on the solid boundary line and the feature points on the shadow boundary line;
and determining the three-dimensional coordinate position of the light source in the real scene according to the position of the intersection point of the extension lines or reverse extension lines of all the feature vectors.
Further, the feature points include: inflection points and extreme points.
Further, each feature vector is a vector pointing from a feature point on the solid boundary line to the corresponding feature point on the shadow boundary line.
Further, the performing illumination rendering on the virtual object in the virtual scene according to the position and brightness of the light source in the real scene includes:
converting a position of a light source in a real scene to a position of the light source in a virtual scene;
and performing illumination rendering on the virtual object in the virtual scene according to the position of the light source in the virtual scene and the brightness of the light source in the real scene.
Further, the converting the position of the light source in the real scene into the position of the light source in the virtual scene includes:
obtaining a real model conversion matrix M according to a parallel tracking and mapping algorithm, and converting the position (x, y, z) of a light source in a real scene into the position (u, v, w) of the light source in a virtual scene by using the real model conversion matrix M; wherein (u, v, w) = M(x, y, z).
The invention also provides a device for performing illumination rendering in augmented reality, which comprises:
the light source determining module is used for determining the position and the brightness of a light source in a real scene;
and the illumination rendering module is used for performing illumination rendering on the virtual object in the virtual scene according to the position and the brightness of the light source in the real scene.
Further, the light source determining module specifically includes:
the position determining unit is used for determining the three-dimensional coordinate position of the light source in the real scene according to the position relation between any real object and the shadow thereof in the real scene;
and the brightness determining unit is used for calculating the brightness values of all the position points in the real scene, wherein the maximum brightness value in the real scene is the brightness of the light source in the real scene.
Further, the position determining unit is specifically configured to:
determine a solid boundary line and a shadow boundary line of any real object in the real scene;
select feature points on the solid boundary line and the shadow boundary line of the real object, respectively;
determine feature vectors according to the correspondence between the feature points on the solid boundary line and the feature points on the shadow boundary line;
and determine the three-dimensional coordinate position of the light source in the real scene according to the position of the intersection point of the extension lines or reverse extension lines of all the feature vectors.
Further, the feature points include: inflection points and extreme points.
Further, each feature vector is a vector pointing from a feature point on the solid boundary line to the corresponding feature point on the shadow boundary line.
Further, the illumination rendering module specifically includes:
a conversion unit for converting the position of the light source in the real scene into the position of the light source in the virtual scene;
and the rendering unit is used for performing illumination rendering on the virtual object in the virtual scene according to the position of the light source in the virtual scene and the brightness of the light source in the real scene.
Further, the conversion unit is specifically configured to:
obtain a real model conversion matrix M according to a parallel tracking and mapping algorithm, and convert the position (x, y, z) of a light source in a real scene into the position (u, v, w) of the light source in a virtual scene by using the real model conversion matrix M; wherein (u, v, w) = M(x, y, z).
The invention also provides a mobile terminal which comprises the device for performing illumination rendering in the augmented reality.
By adopting the technical scheme, the invention at least has the following advantages:
according to the method, the device and the mobile terminal for performing illumination rendering in augmented reality, the light source information in the real scene of augmented reality can be acquired in real time, and the illumination rendering can be performed on the virtual object in the virtual scene of augmented reality dynamically according to the light source information in the real scene, so that the virtual object in the virtual scene can be better fused with the real scene, and the visual experience of a user is closer to reality.
Drawings
Fig. 1 is a flowchart of a method for performing illumination rendering in augmented reality according to a first embodiment of the present invention;
fig. 2 is a flowchart of a method for performing illumination rendering in augmented reality according to a second embodiment of the present invention;
fig. 3 is a schematic structural diagram illustrating a composition of an apparatus for performing illumination rendering in augmented reality according to a third embodiment of the present invention;
fig. 4 is a schematic structural diagram illustrating a composition of an apparatus for performing illumination rendering in augmented reality according to a fourth embodiment of the present invention.
Detailed Description
To further explain the technical means and effects of the present invention adopted to achieve the intended purpose, the present invention will be described in detail with reference to the accompanying drawings and preferred embodiments.
A first embodiment of the present invention provides a method for performing illumination rendering in augmented reality, as shown in fig. 1, including the following specific steps:
step S101: the position and brightness of the light source in the real scene is determined.
Specifically, step S101 includes:
step A1: and determining the three-dimensional coordinate position of the light source in the real scene according to the position relation between any real object and the shadow thereof in the real scene.
Further, step A1 includes:
step A11: determining the solid boundary line and the shadow boundary line of any real object in the real scene.
Step A12: and respectively selecting characteristic points on the solid boundary line and the shadow boundary line of any real object. The feature points include: inflection and extreme points, such as: highest, lowest and highest points on the borderline, etc.
Step A13: and determining a characteristic vector according to the corresponding relation between the characteristic points on the entity boundary line and the characteristic points on the shadow boundary line. The characteristic vector is a vector of any characteristic point on the entity boundary line pointing to a characteristic point corresponding to the characteristic point on the shadow boundary line.
Step A14: and determining the three-dimensional coordinate position of the light source in the real scene according to the intersection point positions of the extension lines or the reverse extension lines of all the feature vectors.
Step A2: and calculating the brightness values of all position points in the real scene, wherein the maximum brightness value in the real scene is the brightness of the light source in the real scene.
Step S102: and performing illumination rendering on the virtual object in the virtual scene according to the position and the brightness of the light source in the real scene.
Specifically, step S102 includes:
step B1: the position of a light source in a real scene is converted into the position of the light source in a virtual scene.
Step B2: and performing illumination rendering on the virtual object in the virtual scene according to the position of the light source in the virtual scene and the brightness of the light source in the real scene.
Further, step B1 specifically includes:
obtaining a real model conversion matrix M according to a parallel tracking and mapping algorithm, and converting the position (x, y, z) of a light source in a real scene into the position (u, v, w) of the light source in a virtual scene by using the real model conversion matrix M; wherein (u, v, w) = M(x, y, z).
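As an illustration of this conversion, the sketch below assumes M is a 4x4 homogeneous model transform of the kind produced by PTAM-style tracking; the concrete matrix and the function name are hypothetical.

import numpy as np

def to_virtual(M, light_xyz):
    # Apply the real model conversion matrix M to the light position in
    # homogeneous coordinates, then normalize out the scale term.
    p = np.append(np.asarray(light_xyz, float), 1.0)
    u, v, w, s = M @ p
    return np.array([u, v, w]) / s

# Hypothetical M: a rotation about the z axis plus a translation.
M = np.array([[0.0, -1.0, 0.0, 0.5],
              [1.0,  0.0, 0.0, 0.0],
              [0.0,  0.0, 1.0, 1.0],
              [0.0,  0.0, 0.0, 1.0]])
print(to_virtual(M, (2.0, 0.0, 3.0)))  # -> [0.5 2.  4. ]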
A second embodiment of the present invention provides a method for performing illumination rendering in augmented reality, as shown in fig. 2, including the following specific steps:
step S201: and acquiring a video image of the real world by using the camera.
Step S202: the real-world video image is decomposed to obtain a series of image frames.
Step S203: the position of the light source in each of the image frames is calculated.
Specifically, step S203 includes:
step C1: determining a solid boundary line and a shadow boundary line of any real object in the image frame.
Step C2: and respectively selecting characteristic points on the entity boundary line and the shadow boundary line.
The feature points include: inflection and extreme points, such as: highest, lowest and highest points on the borderline, etc.
Step C3: and determining a characteristic vector according to the corresponding relation between the characteristic points on the entity boundary line and the characteristic points on the shadow boundary line. The characteristic vector is a vector of any characteristic point on the entity boundary line pointing to a characteristic point corresponding to the characteristic point on the shadow boundary line.
Step C4: determining a position of a light source in the image frame from the feature vector. The position of the light source in the image frame is the intersection position of the extension lines of all the feature vectors or the reverse extension lines.
Step S204: and calculating the brightness of the light source in each image frame.
Specifically, step S204 includes:
and calculating the brightness value of each pixel point in any image frame, wherein the maximum brightness value in any image frame is the brightness of the light source in any image frame.
Step S205: converting a position of a light source in the image frame to a position of the light source in a three-dimensional model of augmented reality.
Specifically, step S205 includes:
obtaining a real model conversion matrix M according to a PTAM (Parallel Tracking and Mapping) algorithm, and converting the position (x, y, z) of the light source in the image frame into the position (u, v, w) of the light source in the three-dimensional model of the augmented reality by using the real model conversion matrix M, wherein (u, v, w) = M(x, y, z).
Step S206: and performing illumination rendering on the virtual object in the three-dimensional model of the augmented reality according to the position of the light source in the three-dimensional model of the augmented reality and the brightness of the light source in the image frame.
Step S207: and placing the virtual object rendered through illumination into the video image of the real world acquired by the camera.
As the camera continuously acquires video images of the real world, the light source information in those images is extracted dynamically, and illumination rendering is performed on the virtual object dynamically according to that information, as the sketch below illustrates.
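A sketch of this per-frame loop, under the same assumptions as the earlier sketches: render_virtual_object is a hypothetical stand-in for the AR engine's shading and compositing step, and the light-position estimate is stubbed out where Steps S203 and S205 would run.

import cv2
import numpy as np

def render_virtual_object(frame, light_pos, brightness):
    # Placeholder: a real renderer would shade the virtual model with the
    # given light and composite it into the frame.
    return frame

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, brightness, _, _ = cv2.minMaxLoc(gray)  # Step S204
    # Steps S203 and S205 would run here: estimate the light position
    # from object/shadow feature points, then map it through the PTAM
    # conversion matrix M.
    light_virtual = np.zeros(3)  # placeholder light position
    out = render_virtual_object(frame, light_virtual, brightness)
    cv2.imshow("AR preview", out)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()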
A third embodiment of the present invention provides an apparatus for performing illumination rendering in augmented reality, as shown in fig. 3, including the following components:
1) a light source determining module 301, configured to determine the position and brightness of a light source in a real scene.
Specifically, the light source determining module 301 includes:
and the position determining unit is used for determining the three-dimensional coordinate position of the light source in the real scene according to the position relation between any real object and the shadow thereof in the real scene.
Further, the position determining unit is specifically configured to:
determine the solid boundary line and the shadow boundary line of any real object in the real scene;
select feature points on the solid boundary line and the shadow boundary line of the real object, respectively, where the feature points include inflection points and extreme points, such as the highest and lowest points on a boundary line;
determine feature vectors according to the correspondence between the feature points on the solid boundary line and the feature points on the shadow boundary line, where each feature vector points from a feature point on the solid boundary line to the corresponding feature point on the shadow boundary line;
and determine the three-dimensional coordinate position of the light source in the real scene according to the position of the intersection point of the extension lines or reverse extension lines of all the feature vectors.
The brightness determining unit is used for calculating the brightness values of all position points in the real scene, wherein the maximum brightness value in the real scene is the brightness of the light source in the real scene.
2) The illumination rendering module 302 is configured to perform illumination rendering on the virtual object in the virtual scene according to the position and brightness of the light source in the real scene.
Specifically, the illumination rendering module 302 includes:
a conversion unit for converting the position of the light source in the real scene into the position of the light source in the virtual scene.
The rendering unit is used for performing illumination rendering on the virtual object in the virtual scene according to the position of the light source in the virtual scene and the brightness of the light source in the real scene.
Further, the conversion unit is specifically configured to:
obtain a real model conversion matrix M according to a parallel tracking and mapping algorithm, and convert the position (x, y, z) of a light source in a real scene into the position (u, v, w) of the light source in a virtual scene by using the real model conversion matrix M; wherein (u, v, w) = M(x, y, z).
A fourth embodiment of the present invention provides an apparatus for performing illumination rendering in augmented reality, as shown in fig. 4, including the following components:
1) a video acquiring module 401, configured to acquire a video image of the real world by using a camera.
2) The video decomposition module 402 is configured to decompose a real-world video image to obtain a series of image frames.
3) A position calculating module 403, configured to calculate a position of the light source in each of the image frames.
Specifically, the position calculating module 403 includes:
a boundary line determining unit, for determining the solid boundary line and the shadow boundary line of any real object in the image frame;
a feature point determining unit, for selecting feature points on the solid boundary line and the shadow boundary line, respectively, where the feature points include inflection points and extreme points, such as the highest and lowest points on a boundary line;
a feature vector determining unit, for determining feature vectors according to the correspondence between the feature points on the solid boundary line and the feature points on the shadow boundary line, where each feature vector points from a feature point on the solid boundary line to the corresponding feature point on the shadow boundary line;
and a light source determining unit, for determining the position of the light source in the image frame as the position of the intersection point of the extension lines or reverse extension lines of all the feature vectors.
4) A brightness calculating module 404, configured to calculate brightness of the light source in each of the image frames.
Specifically, the brightness calculation module 404 is configured to:
calculate the brightness value of each pixel point in any image frame, wherein the maximum brightness value in the image frame is taken as the brightness of the light source in that frame.
5) A position conversion module 405 for converting the position of the light source in the image frame into the position of the light source in the three-dimensional model of augmented reality.
Specifically, the position conversion module 405 is configured to:
and obtaining a reality model conversion matrix M according to a PTAM algorithm, and converting the position (x, y, z) of the light source in the image frame into the position (u, v, w) of the light source in the three-dimensional model of the augmented reality by using the reality model conversion matrix M, wherein (u, v, w) is M (x, y, z).
6) The illumination rendering module 406 is configured to perform illumination rendering on a virtual object in the augmented reality three-dimensional model according to the position of the light source in that model and the brightness of the light source in the image frame.
7) The video restoration module 407 is configured to place the illumination-rendered virtual object into the video image of the real world acquired by the camera.
As the camera continuously acquires video images of the real world, the light source information in those images is extracted dynamically, and illumination rendering is performed on the virtual object dynamically according to that information.
A fifth embodiment of the present invention is a mobile terminal, where the mobile terminal is provided with the apparatus for performing illumination rendering in augmented reality according to the third embodiment of the present invention.
The method, the device and the mobile terminal for performing illumination rendering in augmented reality introduced in the embodiments of the invention can acquire the light source information in the real scene of augmented reality in real time and dynamically perform illumination rendering on the virtual object in the virtual scene according to that information, so that the virtual object is better fused with the real scene and the user has a visual experience closer to reality.
While the invention has been described in connection with specific embodiments, it is to be understood that, as indicated by the appended drawings and description, the invention may be embodied in other specific forms without departing from its spirit or scope.

Claims (11)

1. A method for performing illumination rendering in augmented reality is characterized by comprising the following steps:
determining the three-dimensional coordinate position of a light source in a real scene according to the position relation between any real object in the real scene and the shadow thereof, which comprises: determining a solid boundary line and a shadow boundary line of any real object in the real scene; selecting feature points on the solid boundary line and the shadow boundary line of the real object, respectively; determining feature vectors according to the correspondence between the feature points on the solid boundary line and the feature points on the shadow boundary line; and determining the three-dimensional coordinate position of the light source in the real scene according to the position of the intersection point of the extension lines or reverse extension lines of all the feature vectors, wherein the feature points comprise inflection points and extreme points;
calculating the brightness values of all position points in a real scene, wherein the maximum brightness value in the real scene is the brightness of the light source in the real scene;
and performing illumination rendering on the virtual object in the virtual scene according to the position and the brightness of the light source in the real scene.
2. The method for performing illumination rendering in augmented reality according to claim 1, wherein the feature points comprise: inflection points and extreme points.
3. The method for performing illumination rendering in augmented reality according to claim 1, wherein each feature vector is a vector pointing from a feature point on the solid boundary line to the feature point on the shadow boundary line corresponding to that feature point.
4. The method for performing illumination rendering in augmented reality according to claim 1, wherein the illumination rendering of the virtual object in the virtual scene according to the position and brightness of the light source in the real scene comprises:
converting a position of a light source in a real scene to a position of the light source in a virtual scene;
and performing illumination rendering on the virtual object in the virtual scene according to the position of the light source in the virtual scene and the brightness of the light source in the real scene.
5. The method of claim 4, wherein the transforming the position of the light source in the real scene to the position of the light source in the virtual scene comprises:
obtaining a real model conversion matrix M according to a parallel tracking and mapping algorithm, and converting the position (x, y, z) of a light source in a real scene into the position (u, v, w) of the light source in a virtual scene by using the real model conversion matrix M; wherein (u, v, w) = M(x, y, z).
6. An apparatus for performing illumination rendering in augmented reality, comprising:
a light source determination module comprising: the position determining unit is used for determining the three-dimensional coordinate position of the light source in the real scene according to the position relation between any real object and the shadow thereof in the real scene; the brightness determining unit is used for calculating the brightness values of all position points in a real scene, and the maximum brightness value in the real scene is the brightness of the light source in the real scene;
the position determining unit is specifically configured to: determine a solid boundary line and a shadow boundary line of any real object in the real scene; select feature points on the solid boundary line and the shadow boundary line of the real object, respectively; determine feature vectors according to the correspondence between the feature points on the solid boundary line and the feature points on the shadow boundary line; and determine the three-dimensional coordinate position of the light source in the real scene according to the position of the intersection point of the extension lines or reverse extension lines of all the feature vectors, wherein the feature points comprise inflection points and extreme points;
and the illumination rendering module is used for performing illumination rendering on the virtual object in the virtual scene according to the position and the brightness of the light source in the real scene.
7. The apparatus for performing illumination rendering in augmented reality according to claim 6, wherein the feature points comprise: inflection points and extreme points.
8. The apparatus for performing illumination rendering in augmented reality according to claim 6, wherein each feature vector is a vector pointing from a feature point on the solid boundary line to the feature point on the shadow boundary line corresponding to that feature point.
9. The apparatus for performing illumination rendering in augmented reality according to claim 6, wherein the illumination rendering module specifically includes:
a conversion unit for converting the position of the light source in the real scene into the position of the light source in the virtual scene;
and the rendering unit is used for performing illumination rendering on the virtual object in the virtual scene according to the position of the light source in the virtual scene and the brightness of the light source in the real scene.
10. The apparatus for performing illumination rendering in augmented reality according to claim 9, wherein the conversion unit is specifically configured to:
obtain a real model conversion matrix M according to a parallel tracking and mapping algorithm, and convert the position (x, y, z) of a light source in a real scene into the position (u, v, w) of the light source in a virtual scene by using the real model conversion matrix M; wherein (u, v, w) = M(x, y, z).
11. A mobile terminal, characterized by comprising the device for lighting rendering in augmented reality according to any one of claims 6-10.
CN201610810809.XA 2016-09-07 2016-09-07 Method and device for performing illumination rendering in augmented reality and mobile terminal Active CN107808409B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201610810809.XA CN107808409B (en) 2016-09-07 2016-09-07 Method and device for performing illumination rendering in augmented reality and mobile terminal
PCT/CN2017/081402 WO2018045759A1 (en) 2016-09-07 2017-04-21 Method and device for lighting rendering in augmented reality, and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201610810809.XA CN107808409B (en) 2016-09-07 2016-09-07 Method and device for performing illumination rendering in augmented reality and mobile terminal

Publications (2)

Publication Number Publication Date
CN107808409A CN107808409A (en) 2018-03-16
CN107808409B 2022-04-12

Family

ID=61561344

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201610810809.XA Active CN107808409B (en) 2016-09-07 2016-09-07 Method and device for performing illumination rendering in augmented reality and mobile terminal

Country Status (2)

Country Link
CN (1) CN107808409B (en)
WO (1) WO2018045759A1 (en)

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2569267A (en) * 2017-10-13 2019-06-19 Mo Sys Engineering Ltd Lighting integration
CN108986199B (en) * 2018-06-14 2023-05-16 北京小米移动软件有限公司 Virtual model processing method, device, electronic device and storage medium
CN108877340A (en) * 2018-07-13 2018-11-23 李冬兰 A kind of intelligent English assistant learning system based on augmented reality
CN110021071B (en) * 2018-12-25 2024-03-12 创新先进技术有限公司 Rendering method, device and equipment in augmented reality application
CN110033423B (en) * 2019-04-16 2020-08-28 北京字节跳动网络技术有限公司 Method and apparatus for processing image
US11288844B2 (en) * 2019-10-16 2022-03-29 Google Llc Compute amortization heuristics for lighting estimation for augmented reality
EP4058993A4 (en) 2019-12-06 2023-01-11 Guangdong Oppo Mobile Telecommunications Corp., Ltd. Light source detection for extended reality technologies
CN111145341B (en) * 2019-12-27 2023-04-28 陕西职业技术学院 Virtual-real fusion illumination consistency drawing method based on single light source
CN111402409B (en) * 2020-04-03 2021-03-05 湖北工业大学 Exhibition hall design illumination condition model system
CN111862344B (en) * 2020-07-17 2024-03-08 抖音视界有限公司 Image processing method, apparatus and storage medium
CN112040596B (en) * 2020-08-18 2022-11-08 张雪媛 Virtual space light control method, computer readable storage medium and system
CN112367750A (en) * 2020-11-02 2021-02-12 北京德火科技有限责任公司 Linkage structure of AR immersion type panoramic simulation system and lighting system and control method thereof
CN112837425B (en) * 2021-03-10 2022-02-11 西南交通大学 Mixed reality lighting consistency adjustment method
CN115499576A (en) * 2021-06-18 2022-12-20 华为技术有限公司 Light source estimation method, device and system
CN114004803B (en) * 2021-10-29 2024-11-12 武汉大学 A method based on object lighting editing
CN116310016A (en) * 2021-12-10 2023-06-23 沈阳美行科技股份有限公司 Skybox rendering method, device and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101710429A (en) * 2009-10-12 2010-05-19 湖南大学 Illumination algorithm of augmented reality system based on dynamic light map
CN101802842A (en) * 2007-08-01 2010-08-11 昙盾特视觉科学有限公司 System and method for identifying complex tokens in an image
CN102696057A (en) * 2010-03-25 2012-09-26 比兹摩德莱恩有限公司 Augmented reality systems
WO2013154688A2 (en) * 2012-04-12 2013-10-17 Qualcomm Incorporated Photometric registration from arbitrary geometry for augmented reality
CN104766270A (en) * 2015-03-20 2015-07-08 北京理工大学 Virtual and real lighting fusion method based on fish-eye lens

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8872853B2 (en) * 2011-12-01 2014-10-28 Microsoft Corporation Virtual light in augmented reality
US9418629B2 (en) * 2013-03-15 2016-08-16 Disney Enterprises, Inc. Optical illumination mapping
CN105844714A (en) * 2016-04-12 2016-08-10 广州凡拓数字创意科技股份有限公司 Augmented reality based scenario display method and system


Also Published As

Publication number Publication date
CN107808409A (en) 2018-03-16
WO2018045759A1 (en) 2018-03-15

Similar Documents

Publication Publication Date Title
CN107808409B (en) Method and device for performing illumination rendering in augmented reality and mobile terminal
ATE500578T1 (en) NON-PHOTO REALISTIC REPRESENTATION OF AUGMENTED REALITY
CN105049718A (en) Image processing method and terminal
CN110070551B (en) Video image rendering method and device and electronic equipment
KR20180032750A (en) Method of processing image and display apparatus performing the same
CN110084204B (en) Image processing method and device based on target object posture and electronic equipment
CN101140661A (en) Real time object identification method taking dynamic projection as background
CN108509887A (en) A kind of acquisition ambient lighting information approach, device and electronic equipment
CN106683100A (en) Image segmentation and defogging method and terminal
EP3561776A1 (en) Method and apparatus for processing a 3d scene
CN109600605B (en) Detection method of 4K ultra-high-definition video, electronic device and computer program product
US20080118176A1 (en) Adjusting apparatus for enhancing the contrast of image and method therefor
US8600157B2 (en) Method, system and computer program product for object color correction
CN106331663B (en) A kind of interaction material acquisition system and method for portable equipment
CN116823674A (en) Cross-modal fusion underwater image enhancement method
CN101873506B (en) Image processing method and image processing system for providing depth information
GB2573593A8 (en) Augmented reality rendering method and apparatus
CN114125302A (en) Image adjustment method and device
CN112132871B (en) Visual feature point tracking method and device based on feature optical flow information, storage medium and terminal
US10748331B2 (en) 3D lighting
CN114760422B (en) Backlight detection method and system, electronic equipment and storage medium
WO2011066673A1 (en) Rendering method
CN117274107A (en) End-to-end color and detail enhancement method, device and equipment under low-illumination scene
KR101488647B1 (en) Virtual illumination of operating method and apparatus for mobile terminal
CN112866286B (en) Data transmission method and device, terminal equipment and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant