
CN103761759A - Image rendering method based on radiance spray - Google Patents

Image rendering method based on radiance spraying

Info

Publication number
CN103761759A
CN103761759A (application CN201310751911.3A)
Authority
CN
China
Prior art keywords
pixel
sampled point
value
radiance
weighted
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201310751911.3A
Other languages
Chinese (zh)
Other versions
CN103761759B (en)
Inventor
张大龙
李盼
张丹
董建锋
鲁东明
赵磊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Zhejiang University ZJU filed Critical Zhejiang University ZJU
Priority to CN201310751911.3A priority Critical patent/CN103761759B/en
Publication of CN103761759A publication Critical patent/CN103761759A/en
Application granted
Publication of CN103761759B publication Critical patent/CN103761759B/en
Expired - Fee Related (current status)
Anticipated expiration legal-status Critical

Links

Landscapes

  • Image Generation (AREA)

Abstract

The invention discloses an image rendering method based on radiance spraying. The method samples the scene through the pixels, determines the sampling points of each pixel, computes in reverse, from the radiosity formula, each sampling point's radiance contribution to each pixel, and converts it into radiant exitance. The radiant exitances of the sampling points whose influence weight on a pixel exceeds a set threshold are accumulated, weighted by those weights, into a weighted radiant-exitance sum, which is then used to render the pixel. To improve rendering accuracy, the invention also accumulates the weights above the threshold into a weight sum and uses it to measure rendering precision; if this sum is below the set precision, the pixel is resampled until the precision is reached. The method is simple to implement; the two precision checks effectively improve rendering accuracy, and because the pixels do not influence one another when the radiance contribution of each sampling point is computed, the work can be processed in parallel, which helps improve rendering speed.

Description

Image rendering method based on radiance spraying
Technical field
The present invention relates to the field of image processing, and in particular to an image rendering method based on radiance spraying.
Background technology
Common scene rendering relies on global illumination computation to obtain high-quality results. The goal of global illumination is to simulate the effect of light undergoing multiple inter-reflections. Global illumination is conventionally computed with ray tracing and Monte Carlo sampling, which is very time-consuming.
In recent years a new ray-tracing approach has been proposed. It mainly uses graphics hardware to display the result of global illumination and can reach real-time interactive speed. In this approach the scene is adaptively tessellated, and the radiance incident on each pixel is computed by ray tracing. However, because high-quality rendering requires embedding each surface into many triangles, performance declines. Moreover, this approach focuses only on interactive viewing: for high-quality global illumination, rendering time is not substantially improved.
In precomputed radiance transfer (PRT) methods, the radiance transfer between object surfaces is precomputed offline and represented with spherical harmonics or wavelets. Using this precomputed information, global illumination can be evaluated on the GPU in real time. Although PRT methods allow high-quality relighting in real time, they depend mainly on a time-consuming precomputation, which prevents the global illumination of complex scenes from readily adopting PRT.
In addition, a GPU (Graphics Processing Unit) based method has been proposed for the real-time computation of light scattering. The algorithm depends mainly on sampling points chosen on smooth surfaces. Each sampling point is treated as a pinhole camera that projects the incident light onto a diffuse receiving surface. The method can handle several dynamic light sources and objects, and although it can reach interactive speed, the speed drops quickly once quality is raised.
Different strategies currently exist for the global illumination problem. Several attempts implement the computation of lighting elements with programmable graphics hardware and pixel shaders. These methods share a common shortcoming: the GPU architecture does not support complex data structures such as trees, yet such structures are commonly used for ray-tracing optimization and photon-map storage.
Summary of the invention
The invention provides an image rendering method based on radiance spraying that avoids complex data structures, greatly reduces rendering time, renders efficiently and in real time under global illumination, and is easy to implement.
The image rendering method based on radiance spraying comprises the following steps:
(1) from the viewpoint, emit a ray through each pixel of the image to be rendered into the scene, and determine the intersection area between the ray and the objects in the scene;
(2) take the central point of the intersection area as the sampling point corresponding to this pixel, and generate an information record for each sampling point, the information record comprising the position vector, the normal vector, the radiance value, and the harmonic distance between the sampling point and each pixel;
(3) for any pixel, according to the information record of each sampling point, use the weighting function to compute the weight with which each sampling point influences the radiance value of the current pixel, compute each sampling point's radiance contribution to the current pixel according to the deformed radiosity equation, and from that radiance contribution compute the sampling point's radiant-exitance contribution to the current pixel;
(4) compare each weight with the set threshold while traversing all sampling points, accumulate the weights greater than the set threshold into the weight sum of this pixel, and accumulate the corresponding radiant-exitance contributions, weighted by those weights, into the weighted radiant-exitance sum;
(5) compare the weight sum with the set precision: if the weight sum is less than the set precision, return to step (2), with the difference that a random point in the intersection area is taken as the sampling point corresponding to this pixel; otherwise, divide the pixel's weighted radiant-exitance sum by the weight sum to obtain the indirect illumination value of this pixel;
(6) add the indirect illumination value to the direct illumination value of this pixel to obtain the illumination value of this pixel, and render the pixel with this illumination value.
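The six steps above can be sketched as a small, self-contained Python loop for a single pixel. All names, dict fields, and constants here are my own stand-ins (the patent specifies only the control flow and formulas, not an API), and the gradient terms of the deformed radiosity equation are dropped for brevity:

```python
import random

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def weight(s, px):
    # w_m^k = 1 / (||p_k - p_m|| / R_m^k + 1 - n_k . n_m)
    diff = [a - b for a, b in zip(px["p"], s["p"])]
    dist = dot(diff, diff) ** 0.5
    return 1.0 / (dist / s["R"] + 1.0 - dot(px["n"], s["n"]))

def exit_radiosity(s, rho_d=0.7):
    # Lo = rho_d * E_m^k, with the gradient terms of the deformed
    # radiosity equation dropped (flat-gradient simplification).
    return rho_d * s["E"]

def jitter(s):
    # Stand-in for "take a random point in the intersection area".
    q = dict(s)
    q["p"] = [c + random.uniform(-0.05, 0.05) for c in s["p"]]
    return q

def render_pixel(px, samples, precision=2.0, max_rounds=8):
    threshold = 1.0 / precision              # claim 3: threshold * precision = 1
    for _ in range(max_rounds):
        w_sum = lo_sum = 0.0
        for s in samples:                    # steps (3)-(4): spray and accumulate
            w = weight(s, px)
            if w > threshold:                # discard weakly coupled samples
                w_sum += w
                lo_sum += w * exit_radiosity(s)
        if w_sum >= precision:               # step (5): precision reached
            return px["direct"] + lo_sum / w_sum   # step (6)
        samples = samples + [jitter(s) for s in samples]  # resample
    return px["direct"]                      # precision never reached
```

A nearby, well-aligned sample passes the precision test immediately; a distant, back-facing sample never exceeds the threshold, so only the direct illumination survives.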
The direct illumination value is a known quantity: if the pixel is a light source, its value is set manually and is therefore known; otherwise the direct illumination value is 0.
Each pixel can be regarded as a small region, and because light diverges, the intersection area obtained in the scene is generally larger than the pixel region. Sampling is usually most accurate when the sampling point is the center of this intersection area, so the invention starts sampling from the center, which helps rendering efficiency. When the precision obtained with the center as the sampling point is insufficient, a random point is chosen from the intersection area as the new sampling point, and the precision check is repeated until the precision reaches the set value. Each new sampling point differs from the previous one.
After a sampling point is determined, its position (coordinates) and its own radiance value are available. The position is recorded as the position vector; the normal vector is determined from this position and the surface curvature of the scene object at the sampling point; and the harmonic distance between the sampling point and each pixel is computed from the coordinates of the sampling point and of the pixel (their position vectors).
The invention samples the scene through the pixels and determines the sampling points of each pixel. Because the radiance value of each pixel is contributed jointly by all sampling points, each sampling point's contribution to each pixel's radiance value (the radiance contribution) is computed in reverse from the radiosity formula; this process is called radiance spraying. The method is simple to implement, the pixels do not influence one another, and the computation can be parallelized.
After the sampling points are determined, and considering that different sampling points influence each pixel differently, the weighting function computes the magnitude of each sampling point's influence on the pixel (the weight). Sampling points with small influence are ignored; only sampling points with large influence (weight greater than the set threshold) contribute to the pixel's radiance value, and these sampling points are used to render the pixel. To improve precision further, the precision is calibrated: all the relatively large weights are accumulated into the pixel's weight sum, which serves as the sampling precision of the pixel. If the weight sum does not reach the set precision, the pixel is resampled until it does, after which the pixel's illumination value is computed and rendering is completed. The two precision checks effectively improve sampling precision and hence rendering accuracy.
Steps (1)-(3) are executed by the GPU; steps (4)-(6) are executed by the CPU. Complex large scenes under global illumination have always been hard to render quickly on the GPU. Rendering in this situation is usually computed with ray tracing and Monte Carlo sampling, both of which involve many logical branches or recursive sequences, while the GPU handles branching logic and topologically ordered data poorly. In the present invention, steps (1)-(3) contain only data operations without much branching, and the individual computations do not interfere with one another, so they can be computed on the GPU; parallel processing greatly improves computation speed and rendering efficiency. The more branch-heavy precision comparison is done by the CPU, but this part of the computation is small: it only corrects the subset of pixels that fail the precision requirement, so its effect on total time consumption is small. On the whole, most of the algorithm runs on the GPU, making full use of the GPU's parallelism to complete scene rendering quickly and efficiently.
The invention sets up a radiance buffer in the GPU to store the information record, weight, and radiant exitance of each sampling point. Using this simple data structure greatly reduces storage consumption and the complexity of data access. After the GPU executes steps (1)-(3), the CPU reads the data stored in the radiance buffer (i.e., the buffer data are mapped to the CPU) and carries out the subsequent steps to complete the rendering. Before step (1) is executed, the radiance buffer must be initialized (emptied). After the CPU reads the buffered data, the data are not deleted; they remain in the buffer as a backup for subsequent precision calibration.
The product of the set threshold and the set precision is 1, and in general the precision is set greater than 1.
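Because the threshold and the precision are reciprocals, every weight that is counted exceeds 1/precision, which bounds how many counted samples the precision test can require. A tiny illustrative helper (my own arithmetic, not from the patent):

```python
import math

def min_counted_samples(precision, w):
    """If every counted weight equals w (w must exceed the threshold
    1/precision), this many samples are needed before the weight sum
    reaches the set precision. Illustrative only."""
    assert w > 1.0 / precision, "weights at or below the threshold are discarded"
    return math.ceil(precision / w)
```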
The weighting function in step (3) is as follows:

$$w_m^k = \frac{1}{\|p_k - p_m\| / R_m^k + 1 - n_k \cdot n_m},$$

where $w_m^k$ is the weight of the influence of the m-th sampling point on the radiance value of the k-th pixel, $p_k$ and $n_k$ are the position vector and normal vector of the k-th pixel, $p_m$ and $n_m$ are the position vector and normal vector of the m-th sampling point, $R_m^k$ is the harmonic distance between the m-th sampling point and the k-th pixel, $k = 1, 2, 3, \ldots, K$, $m = 1, 2, 3, \ldots, K$, and K is the total number of pixels.
The position vector of each pixel can be obtained directly, and its normal vector can be computed simply from the obtained position.
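The weighting function translates directly into code, assuming three-component position and normal vectors (the helper name is mine):

```python
import math

def harmonic_weight(p_k, n_k, p_m, n_m, R_mk):
    """w_m^k = 1 / ( ||p_k - p_m|| / R_m^k + 1 - n_k . n_m ).
    p_k, n_k: position and unit normal of pixel k; p_m, n_m: position
    and unit normal of sampling point m; R_mk: their harmonic distance."""
    diff = [a - b for a, b in zip(p_k, p_m)]
    dist = math.sqrt(sum(c * c for c in diff))
    ndot = sum(a * b for a, b in zip(n_k, n_m))
    return 1.0 / (dist / R_mk + 1.0 - ndot)
```

Note how the weight falls off both with distance (scaled by the harmonic distance) and with disagreement between the two normals.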
The deformed radiosity equation in step (3) is as follows:

$$E_m^k = E_m \left( 1 + (n_m \times n_k) \cdot \nabla_r + (p_k - p_m) \cdot \nabla_t \right),$$

where $E_m^k$ is the radiance contribution of the m-th sampling point to the k-th pixel, $E_m$ is the radiance value of the m-th sampling point, $\nabla_r$ is the rotation gradient of sampling point m, and $\nabla_t$ is the offset gradient of sampling point m. The rotation gradient and the offset gradient refer respectively to the rotation gradient and the offset gradient of the position vector of the sampling point.
Step (3) computes, according to the formula

$$L_o = \rho_d E_m^k,$$

the radiant-exitance contribution $L_o$ of the m-th sampling point to the k-th pixel, where $\rho_d$ is the diffuse reflectance of the k-th pixel.
The diffuse reflectance is determined by the material at the sampling point corresponding to this pixel.
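The deformed radiosity equation and the exitance formula above amount to a few lines of arithmetic; the gradient vectors $\nabla_r$ and $\nabla_t$ are taken here as given inputs, since this section does not specify how they are estimated (function names are mine):

```python
def radiance_contribution(E_m, n_m, n_k, p_k, p_m, grad_r, grad_t):
    """E_m^k = E_m * (1 + (n_m x n_k) . grad_r + (p_k - p_m) . grad_t):
    first-order extrapolation of sample m's radiance to pixel k using the
    rotation gradient grad_r and the offset gradient grad_t."""
    cross = [n_m[1] * n_k[2] - n_m[2] * n_k[1],
             n_m[2] * n_k[0] - n_m[0] * n_k[2],
             n_m[0] * n_k[1] - n_m[1] * n_k[0]]
    diff = [a - b for a, b in zip(p_k, p_m)]
    dot = lambda u, v: sum(x * y for x, y in zip(u, v))
    return E_m * (1.0 + dot(cross, grad_r) + dot(diff, grad_t))

def exitance_contribution(rho_d, E_mk):
    """Lo = rho_d * E_m^k: scale by the pixel's diffuse reflectance."""
    return rho_d * E_mk
```

With identical normals the cross product vanishes, so only the offset-gradient term adjusts the sample's radiance before scaling by the reflectance.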
The image rendering method of the invention can render complex scenes with very little time consumption; its rendering complexity depends only on the pixels, so the method suits nearly all complex large scenes. Compared with the prior art, the invention has the following beneficial effects:
(1) after the sampling points are determined, and considering that different sampling points influence each pixel differently, the weighting function computes the magnitude of each sampling point's influence on the pixel (the weight); only the sampling points with large influence (weight greater than the set threshold) are selected to render the pixel, and the precision is then further calibrated, which effectively improves sampling precision and hence rendering accuracy;
(2) in the image rendering method of the invention, the simple computations are executed in parallel by the GPU while the complex logical decisions are executed by the CPU, and a radiance-spraying buffer is set up in the GPU, reducing storage consumption and data-access complexity; time consumption drops greatly and the real-time performance of scene rendering improves markedly.
Embodiment
The image rendering method based on radiance spraying comprises the following steps:
(1) from the viewpoint, emit a ray through each pixel of the image to be rendered into the scene, and determine the intersection area between the ray and the objects in the scene;
(2) take the central point of the intersection area as the sampling point corresponding to this pixel, and generate an information record for each sampling point, the information record comprising the position vector, the normal vector, the radiance value, and the harmonic distance between the sampling point and each pixel;
(3) for any pixel, according to the information record of each sampling point, use the weighting function to compute the weight with which each sampling point influences the radiance value of the current pixel, compute each sampling point's radiance contribution to the current pixel according to the deformed radiosity equation, and from that radiance contribution compute the sampling point's radiant-exitance contribution to the current pixel;
(4) compare each weight with the set threshold while traversing all sampling points, accumulate the weights greater than the set threshold into the weight sum of this pixel, and accumulate the corresponding radiant-exitance contributions, weighted by those weights, into the weighted radiant-exitance sum;
(5) compare the weight sum with the set precision: if the weight sum is less than the set precision, return to step (2), with the difference that a random point in the intersection area is taken as the sampling point corresponding to this pixel; otherwise, divide the pixel's weighted radiant-exitance sum by the weight sum to obtain the indirect illumination value of this pixel;
(6) add the indirect illumination value to the direct illumination value of this pixel to obtain the illumination value of this pixel, and render the pixel with this illumination value.
In this image rendering method the threshold and the precision are set manually; in this embodiment the threshold is 1/a and the precision is a.
In this embodiment, after a sampling point is determined, its position (coordinates) and its own radiance value are available; the position is recorded as the position vector, the normal vector is determined from this position and the surface curvature of the scene object at the sampling point, and the harmonic distance between the sampling point and each pixel is computed from the coordinates of the sampling point and of each pixel in the coordinate system.
In this embodiment the pixel is not a light source, so its direct illumination value is 0.
Steps (1)-(3) of this image rendering method are executed by the GPU; steps (4)-(6) are executed by the CPU.
The weighting function in step (3) is as follows:

$$w_m^k = \frac{1}{\|p_k - p_m\| / R_m^k + 1 - n_k \cdot n_m},$$

where $w_m^k$ is the weight of the influence of the m-th sampling point on the radiance value of the k-th pixel, $p_k$ and $n_k$ are the position vector and normal vector of the k-th pixel, $p_m$ and $n_m$ are the position vector and normal vector of the m-th sampling point, $R_m^k$ is the harmonic distance between the m-th sampling point and the k-th pixel, $k = 1, 2, 3, \ldots, K$, $m = 1, 2, 3, \ldots, K$, and K is the total number of pixels.
The position vector of each pixel can be obtained directly, and its normal vector can be computed simply from the obtained position.
The deformed radiosity equation in step (3) is as follows:

$$E_m^k = E_m \left( 1 + (n_m \times n_k) \cdot \nabla_r + (p_k - p_m) \cdot \nabla_t \right),$$

where $E_m^k$ is the radiance contribution of the m-th sampling point to the k-th pixel, $E_m$ is the radiance value of the m-th sampling point, $\nabla_r$ is the rotation gradient of sampling point m, and $\nabla_t$ is the offset gradient of sampling point m. The rotation gradient and the offset gradient refer respectively to the rotation gradient and the offset gradient of the position vector of the sampling point.
Step (3) computes, according to the formula

$$L_o = \rho_d E_m^k,$$

the radiant-exitance contribution $L_o$ of the m-th sampling point to the k-th pixel, where $\rho_d$ is the diffuse reflectance of the k-th pixel; the diffuse reflectance is determined by the material at the sampling point corresponding to this pixel.
The above is only a specific embodiment of the present invention, but the scope of protection of the invention is not limited thereto; any variation or replacement that a person skilled in the art could readily conceive within the technical scope disclosed by the present invention shall fall within the scope of protection of the present invention.

Claims (7)

1. An image rendering method based on radiance spraying, characterized in that it comprises the following steps:
(1) from the viewpoint, emit a ray through each pixel of the image to be rendered into the scene, and determine the intersection area between the ray and the objects in the scene;
(2) take the central point of the intersection area as the sampling point corresponding to this pixel, and generate an information record for each sampling point, the information record comprising the position vector, the normal vector, the radiance value, and the harmonic distance between the sampling point and each pixel;
(3) for any pixel, according to the information record of each sampling point, use the weighting function to compute the weight with which each sampling point influences the radiance value of the current pixel, compute each sampling point's radiance contribution to the current pixel according to the deformed radiosity equation, and from that radiance contribution compute the sampling point's radiant-exitance contribution to the current pixel;
(4) compare each weight with the set threshold while traversing all sampling points, accumulate the weights greater than the set threshold into the weight sum of this pixel, and accumulate the corresponding radiant-exitance contributions, weighted by those weights, into the weighted radiant-exitance sum;
(5) compare the weight sum with the set precision: if the weight sum is less than the set precision, return to step (2), with the difference that a random point in the intersection area is taken as the sampling point corresponding to this pixel; otherwise, divide the pixel's weighted radiant-exitance sum by the weight sum to obtain the indirect illumination value of this pixel;
(6) add the indirect illumination value to the direct illumination value of this pixel to obtain the illumination value of this pixel, and render the pixel with this illumination value.
2. The image rendering method based on radiance spraying according to claim 1, characterized in that steps (1)-(3) are executed by a GPU and steps (4)-(6) are executed by a CPU.
3. The image rendering method based on radiance spraying according to claim 2, characterized in that the product of the set threshold and the set precision is 1.
4. The image rendering method based on radiance spraying according to claim 3, characterized in that the weighting function in step (3) is as follows:

$$w_m^k = \frac{1}{\|p_k - p_m\| / R_m^k + 1 - n_k \cdot n_m},$$

where $w_m^k$ is the weight of the influence of the m-th sampling point on the radiance value of the k-th pixel, $p_k$ and $n_k$ are the position vector and normal vector of the k-th pixel, $p_m$ and $n_m$ are the position vector and normal vector of the m-th sampling point, $R_m^k$ is the harmonic distance between the m-th sampling point and the k-th pixel, $k = 1, 2, 3, \ldots, K$, $m = 1, 2, 3, \ldots, K$, and K is the total number of pixels.
5. The image rendering method based on radiance spraying according to claim 4, characterized in that the deformed radiosity equation in step (3) is as follows:

$$E_m^k = E_m \left( 1 + (n_m \times n_k) \cdot \nabla_r + (p_k - p_m) \cdot \nabla_t \right),$$

where $E_m^k$ is the radiance contribution of the m-th sampling point to the k-th pixel, $E_m$ is the radiance value of the m-th sampling point, $\nabla_r$ is the rotation gradient of sampling point m, and $\nabla_t$ is the offset gradient of sampling point m.
6. The image rendering method based on radiance spraying according to claim 5, characterized in that step (3) computes, according to the formula

$$L_o = \rho_d E_m^k,$$

the radiant-exitance contribution $L_o$ of the m-th sampling point to the k-th pixel, where $\rho_d$ is the diffuse reflectance of the k-th pixel.
7. The image rendering method based on radiance spraying according to claim 6, characterized in that the diffuse reflectance is determined by the material at the sampling point corresponding to this pixel.
CN201310751911.3A 2013-12-30 2013-12-30 Image rendering method based on radiance spraying Expired - Fee Related CN103761759B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310751911.3A CN103761759B (en) 2013-12-30 2013-12-30 Image rendering method based on radiance spraying

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310751911.3A CN103761759B (en) 2013-12-30 2013-12-30 Image rendering method based on radiance spraying

Publications (2)

Publication Number Publication Date
CN103761759A 2014-04-30
CN103761759B 2016-09-14

Family

ID=50528992

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310751911.3A Expired - Fee Related CN103761759B (en) 2013-12-30 2013-12-30 Image rendering method based on radiance spraying

Country Status (1)

Country Link
CN (1) CN103761759B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105447905A (en) * 2015-11-17 2016-03-30 长春理工大学 Three dimensional scene approximation soft shadow light tracking based on visible smooth filtering

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6246421B1 (en) * 1996-12-24 2001-06-12 Sony Corporation Apparatus and method for parallel rendering of image pixels
US20050088440A1 (en) * 2003-10-22 2005-04-28 Microsoft Corporation Hardware-accelerated computation of radiance transfer coefficients in computer graphics
CN1889128A (en) * 2006-07-17 2007-01-03 北京航空航天大学 Method for precalculating radiancy transfer full-frequency shadow based on GPU
CN101982838A (en) * 2010-11-02 2011-03-02 长春理工大学 3D virtual set ray tracking method for accelerating back light source irradiation
CN103136399A (en) * 2011-12-01 2013-06-05 北京七十二炫信息技术有限公司 System and method for radiation intensity parallel rendering for indoor scene

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6246421B1 (en) * 1996-12-24 2001-06-12 Sony Corporation Apparatus and method for parallel rendering of image pixels
US20050088440A1 (en) * 2003-10-22 2005-04-28 Microsoft Corporation Hardware-accelerated computation of radiance transfer coefficients in computer graphics
CN1889128A (en) * 2006-07-17 2007-01-03 北京航空航天大学 Method for precalculating radiancy transfer full-frequency shadow based on GPU
CN101982838A (en) * 2010-11-02 2011-03-02 长春理工大学 3D virtual set ray tracking method for accelerating back light source irradiation
CN103136399A (en) * 2011-12-01 2013-06-05 北京七十二炫信息技术有限公司 System and method for radiation intensity parallel rendering for indoor scene

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105447905A (en) * 2015-11-17 2016-03-30 长春理工大学 Three dimensional scene approximation soft shadow light tracking based on visible smooth filtering
CN105447905B (en) * 2015-11-17 2018-03-06 长春理工大学 Three-dimensional scenic approximation soft shadows method for drafting based on observability smothing filtering

Also Published As

Publication number Publication date
CN103761759B (en) 2016-09-14

Similar Documents

Publication Publication Date Title
TWI764974B (en) Filtering image data using a neural network
CN105261059B (en) A kind of rendering intent based in screen space calculating indirect reference bloom
CN107452048B (en) The calculation method and device of global illumination
US10614619B2 (en) Graphics processing systems
CN110111408B (en) Large scene rapid intersection method based on graphics
US20150187129A1 (en) Technique for pre-computing ambient obscurance
JP5437475B2 (en) Shading generation method for images
EP2860701A2 (en) Photorealistic rendering of scenes with dynamic content
CN102592306B (en) The method of estimation blocked in virtual environment
DE102018101030A1 (en) Filter image data using a neutral network
CN105488844B (en) The display methods of magnanimity model real-time shadow in a kind of three-dimensional scenic
US20130328875A1 (en) Integration Cone Tracing
CN106023300B (en) A kind of the body rendering intent and system of translucent material
CN104700448A (en) Self adaption photon mapping optimization algorithm based on gradient
CN104318605B (en) Parallel lamination rendering method of vector solid line and three-dimensional terrain
US20120299922A1 (en) Image processing apparatus and method
CN104157004A (en) Method for computing radiosity lighting through fusion of GPU and CPU
CN103544731B (en) A kind of quick reflex method for drafting based on polyphaser
CN103761761A (en) Marine scalar field volume rendering method based on earth sphere model
CN103413346B (en) A kind of sense of reality fluid real-time reconstruction method and system thereof
CN106952328A (en) A method and system for rendering a large-scale macroscopic virtual scene
CN107292946A (en) A kind of image rendering method based on BRDF function lfs
CN101271588A (en) Method capable of reconstructing geometric shadow map
CN103761759A (en) Image rendering method based on radiance spray
CN118642068A (en) Dynamic decision-making method and device based on neural radiation field model

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20160914

CF01 Termination of patent right due to non-payment of annual fee