
WO2018101093A1 - Imaging device, imaging method, and display device - Google Patents

Imaging device, imaging method, and display device

Info

Publication number
WO2018101093A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
light source
unit
display device
imaging
Prior art date
Application number
PCT/JP2017/041564
Other languages
English (en)
Japanese (ja)
Inventor
善光 村橋
崇志 峰
高橋 真毅
伊藤 典男
Original Assignee
シャープ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by シャープ株式会社 filed Critical シャープ株式会社
Publication of WO2018101093A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/25 Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
    • G01N21/27 Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection; circuits for computing concentration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/17 Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N21/55 Specular reflectivity
    • G01N21/57 Measuring gloss
    • G PHYSICS
    • G03 PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B15/00 Special procedures for taking photographs; Apparatus therefor
    • G03B15/02 Illuminating scene
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T15/00 3D [Three Dimensional] image rendering
    • G06T15/50 Lighting effects

Definitions

  • the present invention relates to an imaging device, an imaging method, and a display device.
  • Patent Document 1 discloses an image processing apparatus for obtaining a bidirectional reflectance distribution function representing reflectance characteristics for each pixel of an object.
  • Specifically, Patent Document 1 discloses an image processing apparatus that calculates a bidirectional reflectance distribution function for each pixel from image data obtained by imaging the object while changing the angle between the object and the imaging device.
  • The bidirectional reflectance distribution function described above is a function of six variables: the position (x, y) of the object, the direction of incident light (θ_in, φ_in), and the direction of reflected light (θ_out, φ_out). Therefore, the amount of calculation required for the image processing is large.
  • Meanwhile, a method for directly measuring the reflectance characteristics of an object has been proposed: a plurality of light sources are lit sequentially, one at a time, and the intensity of the light that reaches the imaging device after being reflected at a specific point on the object is observed, thereby obtaining the reflection characteristics at that point.
  • The present invention has been made in view of the above problems, and its first object is to provide an imaging device and an imaging method capable of measuring the reflection characteristics of an object more accurately.
  • However, the technique disclosed in Patent Document 1 calculates a bidirectional reflectance distribution function from previously input image data and generates an image giving the object a texture; there is thus the problem that an observer cannot perceive the texture of the object live.
  • The present invention has been made in view of the above problems, and its second object is to provide a display device and a display method that enable an observer to perceive the texture of an object live.
  • In order to solve the above problem, an imaging device according to one aspect of the present invention includes an imaging unit that images an object irradiated with light in an irradiation pattern obtained by combining a plurality of light sources, and a specifying unit that specifies, from the image captured by the imaging unit, the reflectance of the light of each light source toward the imaging unit.
  • Likewise, an imaging method according to one aspect of the present invention includes an imaging step of imaging an object irradiated with light in an irradiation pattern obtained by combining a plurality of light sources, and a specifying step of specifying, from the image captured in the imaging step, the reflectance of the light of each light source in the imaging direction.
  • A display device according to one aspect of the present invention is a display device that displays an object, and includes a determination unit that determines the position of a virtual light source with respect to the display device according to the position or orientation of the display device, and a display processing unit that displays the object irradiated with light from the virtual light source at the position determined by the determination unit.
  • A display device according to another aspect of the present invention is a display device that displays an object, and includes an input receiving unit that receives an input from a user, a determination unit that determines the position of a virtual light source with respect to the display device with reference to the input received by the input receiving unit, and a display processing unit that displays the object irradiated with light from the virtual light source at the position determined by the determination unit.
  • A display method according to one aspect of the present invention is a display method in which a display device displays an object, and includes a determination step of determining the position of a virtual light source with respect to the display device according to the position or orientation of the display device, and a display processing step of displaying the object irradiated with light from the virtual light source at the determined position.
  • A display method according to another aspect of the present invention is a display method in which a display device displays an object, and includes an input receiving step of receiving an input from a user, a determination step of determining the position of a virtual light source with reference to the input received in the input receiving step, and a display processing step of displaying the object irradiated with light from the virtual light source at the determined position.
  • the observer can perceive the texture of the object live.
  • In the figures, (a) shows a state in which an object is irradiated with a plurality of illuminations, and (b) shows the bidirectional reflectance distribution function in that case.
  • FIG. 1 shows the main configuration of an imaging device according to one embodiment of the present invention. Further figures show: an outline of the imaging method by the imaging device according to one embodiment of the present invention; an outline of the imaging method according to one embodiment of the present invention; an example of a specific irradiation method from a plurality of light sources in the imaging device according to one embodiment of the present invention; and a specific example of irradiation patterns by the light source unit according to one embodiment of the present invention.
  • Embodiment 1: Hereinafter, embodiments of the present invention will be described in detail.
  • the present embodiment relates to a reflection characteristic description model for obtaining the reflection characteristic of an object.
  • An outline of the reflection characteristic description model according to the present embodiment is shown in FIGS.
  • Of the six variables of the bidirectional reflectance distribution function BRDF, namely the position (x, y) of the object K, the direction of incident light (θ_in, φ_in), and the direction of reflected light (θ_out, φ_out), the reflection characteristic description model fixes the direction of reflected light (θ_out, φ_out), yielding a model with the four variables (x, y) and (θ_in, φ_in).
  • FIG. 2A corresponds to the reflection characteristic description model in which light incident on the object K from various directions (θ_in, φ_in) is captured after being reflected in a specific direction (θ_out, φ_out). Specifically, as shown in FIG. 2B, it is a model in which light reflected in the specific direction (θ_out, φ_out) is captured at various positions (x, y) on the object K.
  • As shown in FIG. 2C, when the resolutions of the incident light direction (θ_in, φ_in) are N_θ and N_φ, respectively, the bidirectional reflectance distribution function BRDF at the position (x, y) of the object K can be modeled with a number of parameters M smaller than N_θ × N_φ.
  • A direction in three-dimensional space is indicated using the two angles θ and φ; here, to simplify the explanation, the description is given on a two-dimensional plane, and directions are expressed mainly using θ.
  • FIG. 3 shows a state in which the object K is illuminated.
  • When a point on the object K illuminated by the illuminations L_k is imaged by the imaging device at coordinate x_i on the captured image, the intensity I(x_i) of the light at coordinate x_i is expressed as follows using the reflectance r_ik:

I(x_i) = Σ_k L_k · r_ik
  • The reflection characteristic r_ik captures two dependencies, the index i depending on the coordinate position on the captured image and the index k depending on the direction of illumination, so the appearance of the object K when it receives light of arbitrary intensity from arbitrary directions is determined.
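A minimal sketch of the linear model above (sizes and values are illustrative assumptions, not taken from the patent): observed pixel intensities are a weighted sum of the illumination intensities L_k, with the reflectances r_ik as weights.

```python
import numpy as np

num_pixels, num_sources = 4, 8             # illustrative sizes (our assumption)
rng = np.random.default_rng(0)
r = rng.random((num_pixels, num_sources))  # r[i, k]: reflectance of source k at pixel i
L = rng.random(num_sources)                # L[k]: intensity of illumination L_k

I = r @ L                                  # I[i] = sum_k L[k] * r[i, k]
print(I)
```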
  • FIG. 4 is a diagram illustrating a state in which the object K is illuminated.
  • The coordinate on the captured image when the point P_1 is captured by the image sensor is denoted x_1.
  • The normal direction at the point P_1 is n(x_1).
  • The angle formed by the direction of light incident on the point P_1 and the normal direction n(x_1) is θ_in.
  • The angle formed by the direction of light reflected from the point P_1 and the normal direction n(x_1) is θ_out(x_1).
  • The light incident from the θ_in direction, with θ_out(x_1) as reference, is denoted L(θ_in − θ_out(x_1)).
  • I(x_1) denotes the intensity of light reflected from the point P_1 in the θ_out(x_1) direction and imaged.
  • By displaying the light intensity I(x_1) on the display device under observation, textures such as the glossiness of the object and the roughness of the material can be reproduced.
  • However, the bidirectional reflectance distribution function BRDF is recorded as three-dimensional information in the two-dimensional case (that is, six-dimensional information in the three-dimensional case), and therefore requires a huge amount of information.
  • FIG. 5 shows the difference between the reflected light and the incident light when the object K is illuminated.
  • When the angle of the difference between the reflected light and the incident light is θ_d, it is expressed as:

θ_d = θ_in − θ_out(x_1)

  • Note that the bidirectional reflectance distribution function BRDF then varies only through θ_d and θ_out(x_1) − n(x_1); rewriting on the assumption that θ_out(x_1) − n(x_1) does not change, the third argument of the BRDF becomes unnecessary. That is, when θ_out(x_1) is fixed (the imaging (observation) direction is constant) and n(x_1) is fixed (the position and orientation of the object are constant), the BRDF, which required three arguments, can be reduced to two arguments (equivalently, in the three-dimensional case, six arguments can be reduced to three).
  • Accordingly, the present embodiment records the reflection characteristics of the object and can reflect changes in illumination according to the reproduction environment.
  • Since this recording method does not require grasping the 3D shape of the subject, the reflection characteristics of an actual scene can be recorded easily.
  • FIG. 6A shows a state in which the object K is irradiated with a plurality of illuminations
  • FIG. 6B shows a bidirectional reflectance distribution function BRDF ′ in that case.
  • The bidirectional reflectance distribution function BRDF′ measures the reflection characteristic while changing θ_d, and θ_d is recorded discretely (for example, with resolution N_θd). The intensity I(x_1) of the light reflected at the object and imaged is therefore expressed as:

I(x_1) = Σ_{θ_d} BRDF′(x_1, θ_d) · L(θ_d)

  • In the following, N_θd = 8.
  • When the object K is a mirror-like material, incident light is reflected specularly about the normal vector n of the normal T at the point P_1, the angle of incidence being equal to the angle of reflection. Accordingly, in the case of FIG. 6A, the light whose incident angle equals the angle formed by the direction in which I(x_1) is observed and the normal vector at the point P_1 is reflected most strongly: as shown in FIG. 6B, the reflectance at θ_d3 is the largest, and light at other angles is hardly reflected.
  • FIG. 7A shows a state in which the object K is irradiated with a plurality of illuminations
  • FIG. 7B shows a bidirectional reflectance distribution function BRDF ′ in that case.
  • In this case, the shape of the bidirectional reflectance distribution function BRDF′ is similar, but the point at which the reflectance is largest is shifted. It is therefore considered appropriate to parameterize this shift of the function in the reflection characteristics.
  • FIG. 8A shows a state in which the object K is irradiated with a plurality of illuminations
  • FIG. 8B shows a bidirectional reflectance distribution function BRDF ′ in that case.
  • When the object is a matte material, the reflectance is highest for the component of incident light whose angle of incidence, relative to the normal vector n of the normal T at the point P_3, equals the angle of reflection, and the reflectance gradually decreases away from that component.
  • In this way, the reflection characteristic is unimodal: the reflectance peaks at a certain point as θ_d is changed.
  • the sharpness of the function varies depending on the material of the object, and the point at which the reflectance is maximized varies depending on the shape of the object.
  • Therefore, the bidirectional reflectance distribution function BRDF′ can be represented with a smaller number of parameters by using a parameter indicating the height and a parameter indicating the shift of a reference function. In addition, the reflection characteristics of more materials can be expressed with few parameters by introducing a parameter for the sharpness (spread) of the function, which varies depending on the material of the object.
  • With direct recording there are eight parameters for one coordinate x_1; when expressed based on a unimodal function, three parameters suffice: height, shift, and spread.
  • the following Gaussian function may be used as the unimodal function.
  • g(s, θ) = exp(−θ² / s)
  • Here, s is a parameter indicating the spread; the larger s is, the wider the base of the function becomes.
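As a hedged illustration (function and parameter names are ours, not the patent's), the unimodal description with height A, shift u, and spread s, using the Gaussian g above, could look like:

```python
import numpy as np

def unimodal_brdf(theta_d, height, shift, spread):
    # BRDF'(theta_d) ~ height * g(spread, theta_d - shift), with g(s, t) = exp(-t**2 / s)
    return height * np.exp(-(theta_d - shift) ** 2 / spread)

theta_d = np.linspace(-np.pi / 2, np.pi / 2, 8)  # N_theta_d = 8 discrete angles
print(unimodal_brdf(theta_d, height=1.0, shift=0.2, spread=0.1))
```

Three numbers thus replace the eight discretely recorded reflectance samples.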
  • FIG. 9 shows a bidirectional reflectance distribution function BRDF ′ when a component that is always affected by light in all directions is included.
  • As shown in FIG. 9, a component B that is always influenced by light from all directions, independent of θ_d, may be included.
  • In this case, the bidirectional reflectance distribution function BRDF′ is expressed as:

BRDF′(θ_d) = B + A · g(s, θ_d − u)
  • FIG. 10 shows a bidirectional reflectance distribution function BRDF ′ in which a plurality of peaks appear.
  • The bidirectional reflectance distribution function BRDF′ may also be expressed by a superposition of a plurality (N_i) of unimodal functions:

BRDF′(θ_d) = B + Σ_{i=1}^{N_i} A_i · g(s_i, θ_d − u_i)
  • the bidirectional reflectance distribution function BRDF ′ is formed by superimposing two unimodal functions X and Y.
  • the unimodal function that is the basis of the reflection characteristic description model may be such that a different function is selected for each coordinate x.
  • In this case, returning to the two angular variables, the bidirectional reflectance distribution function BRDF′ is expressed as:

BRDF′(θ_d, φ_d) = B(x, y) + Σ_{i=1}^{N_i} A_i(x, y) · g_{c_i}(s_θi, θ_d − u_θi, s_φi, φ_d − u_φi)

  • With N_i unimodal functions, this is represented by 1 + 6 × N_i parameters.
  • When the unimodal function is limited to a single type, c_i(x, y) becomes unnecessary, and the model is expressed by 1 + 5 × N_i parameters.
  • The unimodal function g_{c_i}(s_θ, θ_d − u_θ, s_φ, φ_d − u_φ) may be, for example, the Gaussian function:

g = exp(−(θ_d − u_θ)² / s_θ − (φ_d − u_φ)² / s_φ)
  • The parameters may also be increased by introducing a cross-correlation coefficient s_θφ; in this way, the anisotropy of the texture with respect to direction can be reproduced more faithfully.
  • When the unimodal function is limited to a Gaussian distribution only, the parameters are B(x, y), A_i(x, y), u_θi(x, y), u_φi(x, y), s_θi(x, y), s_φi(x, y), and s_θφi(x, y); with N_i unimodal functions, the model is represented by 1 + 6 × N_i parameters.
  • For example, when N_x = 1920 and N_y = 1080 with 8 bits per pixel, about 13.4 Tbits are required to record the raw reflection characteristics. The bit depth of each parameter needs further consideration, but at least when the reflection characteristics can be approximated by a superposition of multiple unimodal functions, the amount of information can be compressed far below the resolution at which the incident-light directions were acquired.
  • Embodiment 2: The present embodiment relates to an imaging device and an imaging method for measuring the reflection characteristics of an object.
  • FIG. 12 shows a main configuration of the imaging apparatus according to the present embodiment.
  • the imaging device 30 includes an imaging unit 2, a control unit 6, and a light source unit 8.
  • the control unit 6 further includes a specifying unit 4 and a light source control unit 10.
  • the imaging unit 2 images an object irradiated with light from an irradiation pattern obtained by combining a plurality of light sources.
  • the light source unit 8 includes a plurality of light sources that irradiate light on the object.
  • The "light source" in this specification may be a lighting device or an object that emits light by itself, or an object that reflects light from a lighting device or the like and thereby indirectly irradiates the object with light.
  • the specifying unit 4 of the control unit 6 specifies the reflectance of the light of each light source to the imaging unit 2 (imaging direction) from the captured image captured by the imaging unit 2. Further, the light source control unit 10 of the control unit 6 controls light irradiation by a plurality of light sources of the light source unit 8.
  • FIG. 13 shows a schematic diagram of an imaging method by the imaging device 30 according to the present embodiment.
  • the light source unit 8 has a plurality of light sources L 1 to L 8 .
  • the light source control unit 10 causes the light sources L 1 to L 8 of the light source unit 8 to irradiate the object with light with different irradiation patterns.
  • The specifying unit 4 specifies the reflectance for each light source at a point X on the object from the intensity of the light that is reflected at the point X and appears in the captured image of the imaging unit 2 at that time.
  • First, the light source control unit 10 causes the plurality of light sources L_1 to L_8 of the light source unit 8 to irradiate the object with light according to the irradiation patterns (1), (2), and so on.
  • the imaging unit 2 captures an object irradiated with light.
  • The specifying unit 4 then specifies the reflectance for each light source based on the light intensities I_X(1), I_X(2), ..., I_X(8) that appear in the captured images of the imaging unit 2 for light reflected at the point X on the object.
  • Using the reflectances R_X1, R_X2, ..., R_X8 of the respective light sources at the point X and the incident light intensities {L_1(1), L_2(1), ..., L_8(1)}, {L_1(2), L_2(2), ..., L_8(2)}, ..., {L_1(8), L_2(8), ..., L_8(8)} of the respective irradiation patterns, the light intensities in the captured images are expressed by the simultaneous equations:

I_X(p) = R_X1 · L_1(p) + R_X2 · L_2(p) + ... + R_X8 · L_8(p)   (p = 1, 2, ..., 8)
  • By solving these simultaneous equations, the specifying unit 4 can specify the reflectances R_X1, R_X2, ..., R_X8 of each light source at the point X. The solution reduces to a matrix calculation whose components are the coefficients of the simultaneous equations, but the specific processing is not limited in this embodiment.
  • When there are eight reflectances R_X1, R_X2, ..., R_X8 to be specified, captured images of the object irradiated with at least eight kinds of irradiation patterns of the plurality of light sources L_1 to L_8 are required. That is, captured images with as many irradiation patterns as there are light sources are necessary.
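The following is a minimal sketch (our own construction, not code from the patent) of this identification step. Each pattern here lights every source except one, so more than half the sources emit in each capture, and the eight observations form an invertible linear system that reduces to the matrix calculation mentioned above.

```python
import numpy as np

n = 8
P = np.ones((n, n)) - np.eye(n)   # pattern p: all sources on except source p
rng = np.random.default_rng(0)
R_true = rng.random(n)            # unknown reflectances R_X1..R_X8 at point X
I = P @ R_true                    # intensity observed at X in each of the 8 captures

R = np.linalg.solve(P, I)         # solve the simultaneous equations
print(np.allclose(R, R_true))     # True: reflectances recovered
```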
  • According to this configuration, the image captured by the imaging unit 2 is bright and noise can be reduced; thereby, more accurate reflection characteristics of the object can be measured.
  • the irradiation pattern by the plurality of light sources is preferably obtained by causing more than half of the plurality of light sources L 1 to L 8 to emit light.
  • When more than half of the light sources emit light in each irradiation pattern, the captured image by the imaging unit 2 is brighter, data for the same light source is obtained from a plurality of captured images, and noise can be reduced.
  • the irradiation pattern includes an irradiation pattern obtained by inverting another irradiation pattern.
  • FIG. 14 shows a schematic diagram of the imaging method in this case.
  • the light source unit 8 has a plurality of light sources L 1 to L 8 .
  • As shown in the figure, imaging is performed with a certain irradiation pattern (1+) and with the irradiation pattern (1-) obtained by inverting the irradiation pattern (1+), to obtain captured images.
  • When the intensity of the leakage light at the point X is B_X, the light intensities I_X(1), I_X(2), ..., I_X(8) are expressed as:

I_X(p) = R_X1 · L_1(p) + R_X2 · L_2(p) + ... + R_X8 · L_8(p) + B_X

  • Taking the difference between the image captured with a pattern and the image captured with its inversion cancels B_X, so the light components from the light sources can be separated from the leakage light.
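A small numerical sketch (assumptions ours) of why an inverted pattern separates the leakage light: the ambient term B_X enters both captures identically, so their difference depends only on the controlled sources.

```python
import numpy as np

rng = np.random.default_rng(0)
R = rng.random(8)                  # reflectances of the 8 sources at point X
B_X = 0.3                          # leakage (ambient) light intensity
p_plus = np.array([1, 0, 1, 0, 1, 0, 1, 0], dtype=float)
p_minus = 1.0 - p_plus             # inverted irradiation pattern

I_plus = p_plus @ R + B_X          # capture under pattern (1+)
I_minus = p_minus @ R + B_X        # capture under pattern (1-)
diff = I_plus - I_minus            # B_X cancels in the difference
print(np.isclose(diff, (p_plus - p_minus) @ R))  # True
```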
  • FIG. 15 shows an example of a specific irradiation method from a plurality of light sources in the imaging apparatus 30.
  • For example, when a liquid crystal display is used as the light source unit 8, each block obtained by dividing the liquid crystal display can be used as one light source L. That is, the light source control unit 10 can irradiate the object K with a plurality of irradiation patterns by controlling the lighting of each block of the liquid crystal display.
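A hypothetical sketch (block layout and gradations are our assumptions) of using a 1920x1080 liquid crystal display divided into eight vertical blocks, each serving as one light source L_k:

```python
import numpy as np

def pattern_frame(on_off, width=1920, height=1080):
    """Build the frame shown on the LCD for one irradiation pattern."""
    frame = np.zeros((height, width), dtype=np.uint8)
    block_w = width // len(on_off)
    for k, lit in enumerate(on_off):
        if lit:
            frame[:, k * block_w:(k + 1) * block_w] = 255  # block k emits light
    return frame

frame = pattern_frame([1, 0, 1, 0, 1, 0, 1, 0])
print(frame.shape, int(frame.max()))  # (1080, 1920) 255
```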
  • the irradiation pattern is preferably an irradiation pattern corresponding to the Hadamard base.
  • Here, an irradiation pattern corresponding to the Hadamard basis is obtained by the following steps.
  • Each component of the matrix-represented Hadamard basis is associated with the position on the irradiation pattern corresponding to its position in the matrix.
  • For a component "+1", the position on the irradiation pattern corresponding to the component is set to the light-emitting state (for example, the highest gradation).
  • For a component "−1", the position on the irradiation pattern corresponding to the component is set to the non-light-emitting state (for example, the lowest gradation).
  • In other words, an irradiation pattern corresponding to the Hadamard basis is obtained from the matrix-represented Hadamard basis by setting the position corresponding to each component "+1" to the light-emitting state and the position corresponding to each component "−1" to the non-light-emitting state.
  • In an irradiation pattern corresponding to the Hadamard basis, each light source is either in the light-emitting state or the non-light-emitting state; therefore, by adding and subtracting the light intensities obtained from the captured images, the reflectance of each light source can easily be specified.
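A short sketch of generating such patterns from a Hadamard matrix (the Sylvester construction below and the gradation mapping follow the steps above; the block-to-source mapping is our assumption):

```python
import numpy as np

H = np.array([[1]])
for _ in range(3):                        # Sylvester construction: 8 x 8 matrix
    H = np.kron(H, np.array([[1, 1], [1, -1]]))

patterns = np.where(H == 1, 255, 0)       # +1 -> highest gradation, -1 -> lowest
print(patterns)                           # row p: on/off states of the 8 sources
```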
  • FIG. 17 shows an actual captured image obtained by irradiating the object with light with a plurality of irradiation patterns corresponding to the Hadamard base. In the figure, captured images using 77 irradiation patterns are shown.
  • In the above, the configuration for specifying the reflection characteristics of the object using captured still images has been described. However, the reflection characteristics of the object may instead be specified using a captured moving image.
  • In this case, the imaging unit 2 images the object while periodically repeating N irradiation patterns, and the specifying unit 4 performs motion-compensated prediction to determine the reflection characteristics of each frame. More specifically, the specifying unit 4 specifies the reflection characteristics by the following procedure.
  • First, the imaging unit 2 starts shooting with the object stationary, and the specifying unit 4 specifies the reflection characteristics.
  • Specifically, the light source control unit 10 causes the light sources L_1 to L_8 of the light source unit 8 to irradiate the object with the eight irradiation patterns described above, and the imaging unit 2 images the object irradiated with the light.
  • The specifying unit 4 specifies, for each light source, the reflectance from the light intensity I(t, x, y) in the captured image of the imaging unit 2 for light reflected at the point (x, y) on the object at time t.
  • Using the reflectances R_1(t, x, y), R_2(t, x, y), ..., R_8(t, x, y) of the respective light sources at the point (x, y) and the incident light intensities {L_1(t), L_2(t), ..., L_8(t)} of the irradiation pattern at time t, the light intensity is expressed as:

I(t, x, y) = R_1(t, x, y) · L_1(t) + R_2(t, x, y) · L_2(t) + ... + R_8(t, x, y) · L_8(t)

  • While the object is stationary the reflectances do not change over the eight frames (for example, R_2(0, x, y) = R_2(1, x, y) = ... = R_2(7, x, y)), so by solving the simultaneous equations for t = 0 to 7, the reflectances R_1(t, x, y), R_2(t, x, y), ..., R_8(t, x, y) of each light source can be specified.
  • Next, the specifying unit 4 obtains the light intensity I′(t, x, y) in the predicted image at time t using the reflectances at time t − 1:

I′(t, x, y) = R_1(t − 1, x, y) · L_1(t) + R_2(t − 1, x, y) · L_2(t) + ... + R_8(t − 1, x, y) · L_8(t)
  • Then, the specifying unit 4 performs motion-compensated prediction based on the light intensity I(t, x, y) in the captured image and the light intensity I′(t, x, y) in the predicted image. Specifically, the specifying unit 4 divides the captured image I(t, x, y) into predetermined rectangular areas and obtains, for each rectangular area, the displacement Δx, Δy that minimizes the error between the captured image and the predicted image.
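A hedged sketch of the motion search (standard block matching with a sum-of-squared-differences criterion; the patent's exact expression is not reproduced here):

```python
import numpy as np

def best_offset(captured, predicted, x0, y0, size=16, search=4):
    """Find (dx, dy) aligning a rectangular area of `captured` with `predicted`."""
    block = captured[y0:y0 + size, x0:x0 + size]
    best, best_err = (0, 0), np.inf
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ys, xs = y0 + dy, x0 + dx
            if ys < 0 or xs < 0:
                continue                   # offset leaves the frame
            cand = predicted[ys:ys + size, xs:xs + size]
            if cand.shape != block.shape:
                continue
            err = np.sum((block - cand) ** 2)
            if err < best_err:
                best, best_err = (dx, dy), err
    return best

rng = np.random.default_rng(0)
I_pred = rng.random((64, 64))
I_t = np.roll(I_pred, shift=(2, 3), axis=(0, 1))  # simulate motion between frames
print(best_offset(I_t, I_pred, x0=16, y0=16))     # (-3, -2)
```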
  • The specifying unit 4 then uses the initial values R′_1(t, x, y), R′_2(t, x, y), ..., R′_8(t, x, y) to obtain the light intensities I′_0(x, y), I′_1(x, y), ..., I′_6(x, y) in the predicted images as follows:

I′_0(x, y) = R′_1(t, x, y) · L_1(t − 8) + R′_2(t, x, y) · L_2(t − 8) + ... + R′_8(t, x, y) · L_8(t − 8)
I′_1(x, y) = R′_1(t, x, y) · L_1(t − 7) + R′_2(t, x, y) · L_2(t − 7) + ... + R′_8(t, x, y) · L_8(t − 7)
...
I′_6(x, y) = R′_1(t, x, y) · L_1(t − 2) + R′_2(t, x, y) · L_2(t − 2) + ... + R′_8(t, x, y) · L_8(t − 2)
  • When a value of the light intensities I′_0(x, y), I′_1(x, y), ..., I′_6(x, y) in the predicted images obtained by the above formulas falls below the minimum value, it is corrected to 0; when it exceeds the maximum value, it is corrected to 1.
  • Finally, from the light intensities I′_0(x, y), I′_1(x, y), ..., I′_6(x, y) in the predicted images and the light intensity I(t, x, y) in the captured image, the specifying unit 4 obtains the reflectances R_1(t, x, y), R_2(t, x, y), ..., R_8(t, x, y) by solving the simultaneous equations beginning with:

I′_0(x, y) = R_1(t, x, y) · L_1(t − 8) + R_2(t, x, y) · L_2(t − 8) + ... + R_8(t, x, y) · L_8(t − 8)
  • Embodiment 3: The present embodiment relates to a display device and a display method for expressing the texture of an object live.
  • FIG. 18 shows a main configuration of the display device according to the present embodiment.
  • the display device 40 includes an inclination detection unit 12, a virtual light source data acquisition unit 14, an object image data acquisition unit 16, a control unit 18, and a display unit 20.
  • the control unit 18 further includes a virtual light source position determination unit 181 (determination unit) and a display processing unit 182 (derivation unit).
  • the tilt detection unit 12 detects the tilt of the display device 40.
  • As the inclination detection unit 12, for example, an acceleration sensor can be used.
  • the virtual light source data acquisition unit 14 acquires virtual light source data (light environment image) indicating the light environment in which the texture of the object is to be expressed.
  • the virtual light source data may be data indicating the light environment during actual observation, or may be image data or video data indicating an arbitrary light environment such as a landscape image or a landscape video.
  • the object image data acquisition unit 16 acquires image data of an object for which a texture is to be expressed and reflection characteristic data of the object.
  • the reflection characteristic data may be, for example, reflection characteristic data obtained by the method shown in the first embodiment or the second embodiment.
  • The virtual light source position determination unit 181 of the control unit 18 determines the position of the virtual light source with respect to the display device 40 based on the inclination of the display device 40 detected by the inclination detection unit 12. Specifically, the virtual light source position determination unit 181 refers to the virtual light source data acquired by the virtual light source data acquisition unit 14 and fixes the position of the light source in the light environment indicated by the virtual light source data as the position of the virtual light source in the light environment in which the texture of the object is to be expressed. It then determines the position of the virtual light source with respect to the display device 40 based on the detected inclination. The technical meaning of "the position of the virtual light source with respect to the display device 40" will be described later.
  • The display processing unit 182 of the control unit 18 uses the virtual light source data acquired by the virtual light source data acquisition unit 14 and the object image data and reflection characteristic data acquired by the object image data acquisition unit 16 to generate, for the display device 40, an image showing the object irradiated with light from the virtual light source at the position determined by the virtual light source position determination unit 181.
  • the display unit 20 displays the image generated by the display processing unit 182.
  • FIG. 19 shows a schematic diagram of a display method by the display device 40 according to the present embodiment.
  • the virtual light source position determination unit 181 refers to the virtual light source data acquired by the virtual light source data acquisition unit 14, and the light source in the light environment indicated by the virtual light source data Is fixed as the position of the virtual light source L ′ in the light environment in which the texture of the object is to be expressed. Further, the virtual light source position determination unit 181 determines the position of the virtual light source L ′ with respect to the display device 40 based on the tilt of the display device 40 detected by the tilt detection unit 12.
  • the display processing unit 182 uses the virtual light source data acquired by the virtual light source data acquisition unit 14 and the image data and reflection characteristic data of the object acquired by the object image data acquisition unit 16 for the display device 40. Then, an image indicating the object irradiated with light from the virtual light source L ′ at the position determined by the virtual light source position determining unit 181 is generated and displayed on the display unit 20.
  • When the observer tilts the display device 40, the inclination detection unit 12 re-detects the inclination of the display device 40, and the virtual light source position determination unit 181 re-determines the position of the virtual light source L′ with respect to the display device 40 based on the re-detected inclination.
  • Then, the display processing unit 182 regenerates and displays an image showing the object irradiated with light from the virtual light source L′ at the position re-determined by the virtual light source position determination unit 181 with respect to the display device 40.
  • In this way, the position of the virtual light source L′ with respect to the display device 40 changes when the display device 40 is tilted while the position of the virtual light source L′ itself remains fixed. Therefore, the texture of the object can be expressed live.
  • FIG. 20 is a diagram illustrating a coordinate system including the display device 40 and the virtual light source L ′.
  • the xyz coordinates are taken with the vertical direction (downward direction in the drawing) taken as the negative direction of the z-axis.
  • The center coordinate of the display device 40 is M, its zenith angle is θ_o, and its azimuth angle is φ_o.
  • The virtual light source position determination unit 181 can obtain the zenith angle θ_o and the azimuth angle φ_o of the center coordinate M from the inclination of the display device 40 detected by the inclination detection unit 12. It is further assumed that the display device 40 lies on a plane perpendicular to the straight line from the origin O of the coordinate system to the center coordinate M.
  • The virtual light source position determination unit 181 refers to the virtual light source data acquired by the virtual light source data acquisition unit 14 and fixes the position of the light source in the light environment indicated by the virtual light source data as the position of the virtual light source in the light environment in which the texture of the object is to be expressed.
  • the zenith angle of the virtual light source is ⁇ i and the azimuth angle is ⁇ i .
  • the virtual light source position determination unit 181 can obtain the zenith angle ⁇ i and the azimuth angle ⁇ i of the virtual light source from the virtual light source data.
  • The virtual light source position determination unit 181 determines the position of the virtual light source L′ with respect to the display device 40 once the zenith angle θ_o and azimuth angle φ_o of the center coordinate M of the display device 40 and the zenith angle θ_i and azimuth angle φ_i of the virtual light source are determined.
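A hypothetical sketch (the spherical convention is our assumption) of turning the two angle pairs into vectors, from which the relative geometry of the display and the virtual light source L′ follows:

```python
import numpy as np

def unit_from_angles(zenith, azimuth):
    """Unit vector for a zenith angle (from +z) and azimuth angle (from +x)."""
    return np.array([
        np.sin(zenith) * np.cos(azimuth),
        np.sin(zenith) * np.sin(azimuth),
        np.cos(zenith),
    ])

m = unit_from_angles(np.radians(30), np.radians(45))  # display center M: theta_o, phi_o
l = unit_from_angles(np.radians(60), np.radians(10))  # virtual source L': theta_i, phi_i
angle = np.arccos(np.clip(m @ l, -1.0, 1.0))          # angle between the two directions
print(np.degrees(angle))
```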
  • The image generated by the display processing unit 182 is, in effect, an image in which light from the virtual light source L′ strikes the object and the light reflected by the object is captured at the position of the display device 40. Let L(θ_i, φ_i) be the radiance incident on the object from the virtual light source L′, V_θo,φo the radiance reflected from the object toward the display device 40, and BRDF_θo,φo the reflection characteristic of the light reflected from the object toward the display device 40. When the position of the display device 40 changes, the display processing unit 182 can derive the changed radiance V_θo,φo by obtaining the changed reflection characteristic BRDF_θo,φo using the reflection characteristic data acquired by the object image data acquisition unit 16. In this way, the display device 40 can express the texture of the object live.
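A hedged sketch of the relighting step itself, reusing the linear model I = Σ_k L_k · r_ik from Embodiment 1 (array shapes and the source discretization are our assumptions):

```python
import numpy as np

def relight(reflectances, source_intensities):
    """reflectances: (H, W, K) per-pixel, per-source; source_intensities: (K,).
    Returns the (H, W) image displayed for the current virtual-light setup."""
    return np.tensordot(reflectances, source_intensities, axes=([2], [0]))

r = np.random.default_rng(0).random((4, 4, 8))  # toy per-pixel reflectances
L = np.zeros(8)
L[2] = 1.0                                       # virtual source L' mapped to one direction
print(relight(r, L))
```

When the display is tilted, only the vector of source intensities changes, so the image can be regenerated cheaply from the recorded reflectances.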
  • the virtual light source data may be image data or video data indicating an arbitrary light environment such as a landscape image or a landscape video.
  • the display processing unit 182 derives reproduction light that reproduces the light environment indicated by the virtual light source data, and the display device 40 is irradiated with the reproduction light from the virtual light source at the position determined by the virtual light source position determination unit 181. An image showing the target object is generated.
  • For example, by using image data of a landscape image of a travel destination as the virtual light source data, the light environment of the travel destination can be reproduced.
  • By using a plurality of virtual light source data, the light environment of various places can be reproduced. For example, in online shopping, when a user wishes to confirm the appearance and texture of an object in various places, this embodiment lets the user reproduce the light environment of those places while staying where they are, and the texture of the object can be expressed live. Thus, according to the present embodiment, temporal and spatial restrictions in expressing the texture of an object can be eliminated.
  • FIG. 21 is a diagram illustrating a main configuration of the display device 50 according to the present modification.
  • the display device 50 is different in that it includes a position detection unit 22 instead of the inclination detection unit 12.
  • the position detection unit 22 detects the position of the display device 50.
  • As the position detection unit 22, for example, a position information sensor can be used.
  • the virtual light source position determination unit 181 of the control unit 18 determines the position of the virtual light source with respect to the display device 50 based on the position of the display device 50 detected by the position detection unit 22. Specifically, the virtual light source position determination unit 181 refers to the virtual light source data acquired by the virtual light source data acquisition unit 14 and expresses the position of the light source in the light environment indicated by the virtual light source data as the texture of the object. The position of the virtual light source in the light environment is fixed. Further, the virtual light source position determination unit 181 determines the position of the virtual light source with respect to the display device 50 based on the position of the display device 50 detected by the position detection unit 22.
  • Thereby, the virtual light source position determination unit 181 determines the position of the virtual light source from its distance and angle with respect to the display device 50, and the display processing unit 182 can generate an image showing the object irradiated with light from the virtual light source at the determined position and display it on the display unit 20.
  • FIG. 22 is a diagram illustrating a main configuration of the display device 60 according to the present modification.
  • the display device 60 is different in that it includes an angle detection unit 24 instead of the inclination detection unit 12.
  • The angle detection unit 24 detects the orientation (azimuth) of the display device 60.
  • As the angle detection unit 24, for example, a geomagnetic sensor can be used.
  • the virtual light source position determination unit 181 of the control unit 18 determines the position of the virtual light source with respect to the display device 60 based on the orientation of the display device 60 detected by the angle detection unit 24. Specifically, the virtual light source position determination unit 181 refers to the virtual light source data acquired by the virtual light source data acquisition unit 14 and expresses the position of the light source in the light environment indicated by the virtual light source data as the texture of the object. The position of the virtual light source in the light environment is fixed. Further, the virtual light source position determination unit 181 determines the position of the virtual light source with respect to the display device 60 based on the orientation of the display device 60 detected by the angle detection unit 24.
  • Thereby, the virtual light source position determination unit 181 determines the position of the virtual light source from its angle with respect to the display device 60, and the display processing unit 182 can generate an image showing the object irradiated with light from the virtual light source at the determined position and display it on the display unit 20.
  • FIG. 23 is a diagram showing a main configuration of a display device 70 according to this modification.
  • the display device 70 is different in that it includes an input receiving unit 26 instead of the inclination detecting unit 12.
  • the input receiving unit 26 receives an input from the observer, and receives an input of the position of the virtual light source set by the observer.
  • As the input receiving unit 26, for example, a touch sensor provided in the display unit 20 can be used.
  • For example, an image indicating the positional relationship between the display device 70 and the virtual light source is displayed on the display unit 20, and the observer can input the position of the virtual light source by tracing on the touch sensor of the display unit 20 to move the virtual light source displayed in the image.
  • the virtual light source position determination unit 181 of the control unit 18 determines the position of the virtual light source with respect to the display device 70 based on the position of the virtual light source received by the input reception unit 26. Specifically, the virtual light source position determination unit 181 refers to the virtual light source data acquired by the virtual light source data acquisition unit 14 and expresses the position of the light source in the light environment indicated by the virtual light source data as the texture of the object. The position of the virtual light source in the light environment is fixed. Further, the virtual light source position determining unit 181 determines the position of the virtual light source with respect to the display device 70 based on the position of the virtual light source received by the input receiving unit 26.
  • Thereby, the observer can set the virtual light source at a desired position, and the display processing unit 182 generates an image showing the object irradiated with light from the virtual light source at that position and displays it on the display unit 20.
  • The control blocks of the imaging device 30 and the display devices 40 to 70 may be realized by a logic circuit (hardware) formed on an integrated circuit (IC chip) or the like, or may be realized by software using a CPU (Central Processing Unit).
  • In the latter case, the imaging device 30 and the display devices 40 to 70 include a CPU that executes the instructions of a program, i.e. software realizing each function, a ROM (Read Only Memory) or storage device (referred to as a "recording medium") on which the program and various data are recorded so as to be readable by the computer (or CPU), and a RAM (Random Access Memory) into which the program is loaded.
  • The object of the present invention is achieved when a computer (or CPU) reads the program from the recording medium and executes it.
  • As the recording medium, a "non-transitory tangible medium" such as a tape, a disk, a card, a semiconductor memory, or a programmable logic circuit can be used.
  • the program may be supplied to the computer via an arbitrary transmission medium (such as a communication network or a broadcast wave) that can transmit the program.
  • one embodiment of the present invention can also be realized in the form of a data signal embedded in a carrier wave, in which the program is embodied by electronic transmission.
  • As described above, the imaging device 30 according to aspect 1 of the present invention includes an imaging unit 2 that images an object irradiated with light in an irradiation pattern obtained by combining a plurality of light sources, and a specifying unit 4 that specifies, from the image captured by the imaging unit 2, the reflectance of the light of each light source toward the imaging unit 2.
  • According to this configuration, the image captured by the imaging unit 2 is bright and noise can be reduced; thereby, more accurate reflection characteristics of the object can be measured.
  • In the imaging device according to aspect 2 of the present invention, the irradiation patterns may include irradiation patterns corresponding to a Hadamard basis.
  • In an irradiation pattern corresponding to the Hadamard basis, each light source is either in the light-emitting state or the non-light-emitting state; therefore, by adding and subtracting the light intensities obtained from the captured images, the reflectance of each light source at a specific point can easily be specified.
  • In the imaging device according to aspect 3 of the present invention, the irradiation patterns may include an irradiation pattern obtained by inverting another irradiation pattern.
  • The light components from the plurality of light sources and the leakage light component can be separated by including, among the irradiation patterns, an irradiation pattern obtained by inverting another irradiation pattern.
  • In the imaging device according to aspect 4 of the present invention, each irradiation pattern may be obtained by causing more than half of the plurality of light sources to emit light. According to this configuration, the image captured by the imaging unit 2 is brighter and noise can be reduced.
  • The imaging method according to aspect 5 of the present invention includes an imaging step of imaging an object irradiated with light in an irradiation pattern obtained by combining a plurality of light sources, and a specifying step of specifying, from the image captured in the imaging step, the reflectance of the light of each light source in the imaging direction.
  • The display devices 40 to 60 according to aspect 6 of the present invention are display devices that display an object, and include a determination unit (virtual light source position determination unit 181) that determines the position of a virtual light source with respect to the display devices 40 to 60 according to the position or orientation of the display devices 40 to 60, and a display processing unit 182 that causes the display devices 40 to 60 to display the object irradiated with light from the virtual light source at the position determined by the determination unit.
  • According to this configuration, the texture of the object can be expressed live by displaying an image in which the reflection characteristics of the object with respect to the light from the virtual light source change according to changes in the position of the virtual light source with respect to the display devices 40 to 60.
  • The display device 70 according to aspect 7 of the present invention is a display device that displays an object, and includes an input receiving unit 26 that receives an input from a user, a determination unit (virtual light source position determination unit 181) that determines the position of a virtual light source with respect to the display device 70 with reference to the input received by the input receiving unit 26, and a display processing unit 182 that causes the display device 70 to display the object irradiated with light from the virtual light source at the position determined by the determination unit.
  • the texture of the object can be expressed live.
  • The display devices 40 to 70 according to aspect 8 of the present invention may, in the above-described aspect 6 or 7, further include a derivation unit (display processing unit 182) that refers to a light environment image showing an arbitrary light environment and derives reproduction light reproducing that light environment, and the display processing unit 182 may display the object irradiated with the reproduction light as light from the virtual light source.
  • the light environment of the travel destination can be reproduced by using the image data of the landscape image of the travel destination as the virtual light source data. Further, by using a plurality of virtual light source data, it is possible to reproduce the light environment in various places.
  • Thereby, temporal and spatial restrictions in expressing the texture of an object can be eliminated.
  • The display method according to the ninth aspect of the present invention is a display method in which the display devices 40 to 60 display an object, and includes a determination step of determining the position of a virtual light source with respect to the display devices 40 to 60 according to the position or orientation of the display devices 40 to 60, and a display processing step of displaying, on the display devices 40 to 60, the object irradiated with light from the virtual light source at the position determined in the determination step.
  • The display method according to the tenth aspect of the present invention is a display method in which the display device 70 displays an object, and includes an input receiving step of receiving an input from a user, a determination step of determining the position of a virtual light source with respect to the display device 70 with reference to the input received in the input receiving step, and a display processing step of displaying, on the display device 70, the object irradiated with light from the virtual light source at the position determined in the determination step.
  • The imaging device 30 and the display devices 40 to 70 according to each aspect of the present invention may be realized by a computer. In this case, a program that realizes the imaging device 30 and the display devices 40 to 70 by causing the computer to operate as each unit (software element) of the imaging device 30 and the display devices 40 to 70, and a computer-readable recording medium on which the program is recorded, also fall within the scope of the present invention.
  • 2 Imaging unit
  • 4 Specifying unit
  • 6, 18 Control unit
  • 8 Light source unit
  • 10 Light source control unit
  • 12 Inclination detection unit
  • 14 Virtual light source data acquisition unit
  • 16 Object image data acquisition unit
  • 181 Virtual light source position determination unit
  • 182 Display processing unit
  • 20 Display unit
  • 22 Position detection unit
  • 24 Angle detection unit
  • 26 Input receiving unit
  • 30 Imaging device
  • 40 to 70 Display device

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Immunology (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • General Health & Medical Sciences (AREA)
  • Theoretical Computer Science (AREA)
  • Pathology (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Mathematical Physics (AREA)
  • Computer Graphics (AREA)
  • Studio Devices (AREA)

Abstract

The present invention enables more accurate measurement of the reflection characteristics of an object. An imaging device (30) comprises: an imaging unit (2) for imaging an object irradiated with light in an irradiation pattern obtained by combining a plurality of light sources; and a specifying unit (4) for specifying, from the image captured by the imaging unit (2), the reflectance of the light of the light sources toward the imaging unit (2).
PCT/JP2017/041564 2016-12-01 2017-11-17 Imaging device, imaging method, and display device WO2018101093A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016234514 2016-12-01
JP2016-234514 2016-12-01

Publications (1)

Publication Number Publication Date
WO2018101093A1 (fr) 2018-06-07

Family

ID=62242428

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/041564 WO2018101093A1 (fr) 2016-12-01 2017-11-17 Imaging device, imaging method, and display device

Country Status (1)

Country Link
WO (1) WO2018101093A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007018173A (ja) * 2005-07-06 2007-01-25 Canon Inc 画像処理方法、画像処理装置
JP2007033099A (ja) * 2005-07-25 2007-02-08 Fuji Xerox Co Ltd 光沢特性評価方法および光沢特性評価装置並びにプログラム
JP2013008324A (ja) * 2011-06-27 2013-01-10 Toshiba Corp 画像処理システム、端末装置及び方法
JP2014240830A (ja) * 2013-05-15 2014-12-25 キヤノン株式会社 測定装置およびその制御方法
JP2016038222A (ja) * 2014-08-05 2016-03-22 株式会社リコー 試料測定装置および試料測定プログラム

Similar Documents

Publication Publication Date Title
JP3962588B2 (ja) Three-dimensional image processing method, three-dimensional image processing apparatus, three-dimensional image processing system, and three-dimensional image processing program
JP5795384B2 (ja) Video processing apparatus, illumination processing apparatus, and method thereof
WO2015166684A1 (fr) Image processing apparatus and image processing method
US20180014003A1 (en) Measuring Accuracy of Image Based Depth Sensing Systems
JP5633058B1 (ja) Three-dimensional measuring apparatus and three-dimensional measuring method
JP2013127774A (ja) Image processing apparatus, image processing method, and program
KR101713875B1 (ko) Method and system for realizing a virtual space taking into account the user's viewpoint in a projector projection environment
US9756313B2 (en) High throughput and low cost height triangulation system and method
TW201518694A (zh) Light source brightness detection method and system
US11727658B2 (en) Using camera feed to improve quality of reconstructed images
JP2020008502A (ja) Depth acquisition apparatus using a polarization stereo camera and method therefor
JP6869652B2 (ja) Image processing apparatus, imaging apparatus, image processing method, image processing program, and storage medium
Lin et al. Automatic optimal projected light intensity control for digital fringe projection technique
JP7633790B2 (ja) Texture acquisition system, texture acquisition apparatus, and texture acquisition program
JP5926626B2 (ja) Image processing apparatus, control method therefor, and program
WO2018101093A1 (fr) Imaging device, imaging method, and display device
JP4764963B2 (ja) Image processing apparatus
JP2009236696A (ja) Three-dimensional image measurement method, measurement system, and measurement program for a subject
US11657586B1 (en) Systems and methods for augmented reality viewing based on directly mapped point cloud overlays
Ekstrand et al. High-resolution, high-speed, three-dimensional video imaging with digital fringe projection techniques
JP5506371B2 (ja) Image processing apparatus, image processing method, and program
US12026823B2 (en) Volumetric imaging
JP2019066424A (ja) Apparatus, method, and program for deriving a three-dimensional shape
KR20130028370A (ko) Apparatus and method for acquiring shape information, material information, and illumination information in an image modeling system
JP7565762B2 (ja) Information processing apparatus, imaging apparatus, control method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17876408

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17876408

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP