
WO2018189867A1 - Projector, remote control device, projector system, and distortion correction method - Google Patents


Info

Publication number
WO2018189867A1
WO2018189867A1 PCT/JP2017/015159 JP2017015159W WO2018189867A1 WO 2018189867 A1 WO2018189867 A1 WO 2018189867A1 JP 2017015159 W JP2017015159 W JP 2017015159W WO 2018189867 A1 WO2018189867 A1 WO 2018189867A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
projection
projector
remote controller
distortion correction
Prior art date
Application number
PCT/JP2017/015159
Other languages
English (en)
Japanese (ja)
Inventor
青柳 寿和
Original Assignee
NEC Display Solutions, Ltd.
Priority date
Filing date
Publication date
Application filed by NEC Display Solutions, Ltd.
Priority to PCT/JP2017/015159
Publication of WO2018189867A1

Classifications

    • G: PHYSICS
    • G03: PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03B: APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B21/00Projectors or projection-type viewers; Accessories therefor
    • G03B21/14Details
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/74Projection arrangements for image reproduction, e.g. using eidophor
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N9/00Details of colour television systems
    • H04N9/12Picture reproducers
    • H04N9/31Projection devices for colour picture display, e.g. using electronic spatial light modulators [ESLM]

Definitions

  • the present invention relates to a projector, a remote controller, a projector system, and a distortion correction method.
  • a projector can project an image not only on a screen but also on a structure such as a wall.
  • the projection surface may not be flat.
  • distortion corresponding to the three-dimensional shape of the projection surface occurs in the projection image.
  • the projector is equipped with a distortion correction function for correcting such distortion.
  • the distortion of the projected image varies depending on the position (viewpoint) where the projected image is viewed. For example, when viewed from an oblique direction with respect to the projection plane, the near side of the projected image looks large and the far side appears small.
  • In general, however, the distortion correction function corrects distortion only for a predetermined viewpoint (for example, the position of the projector), and the viewpoint cannot be changed.
  • To address this, a technique for correcting distortion according to an arbitrary viewpoint has been proposed (see Patent Document 1).
  • the projector described in Patent Document 1 includes an image modulation unit, a tilt sensor, a distance measurement unit, and a control unit, and an image formed by the image modulation unit is projected on a screen.
  • the tilt sensor detects the tilt angle of the projector with respect to the horizontal plane.
  • the screen is arranged perpendicular to the horizontal plane.
  • the distance measuring means measures the distance from the projector to the center of the screen (the center of the projection area) and the distance from the center of the screen to the person.
  • the distance measuring means includes a light source, a photodetector, and a right-angle prism.
  • the right angle prism is rotatably provided on the upper surface of the projector casing.
  • the light source outputs pulse-modulated light, and the pulse-modulated light is emitted in the horizontal direction via a right-angle prism. Reflected light from the object enters the photodetector through a right-angle prism. By rotating the right-angle prism, the periphery of the projector is scanned with pulse-modulated light.
  • the distance to the object is calculated based on the phase difference between the pulse modulated light output from the light source and the incident light (reflected light from the object) incident on the photodetector.
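The phase-difference distance calculation described above can be sketched as follows. Since the light travels to the object and back, the round trip covers twice the distance, giving d = c·Δφ/(4π·f). The function name and the example modulation frequency below are illustrative assumptions, not values from Patent Document 1.

```python
import math

C = 299_792_458.0  # speed of light [m/s]

def tof_distance(phase_shift_rad: float, mod_freq_hz: float) -> float:
    """Distance from the phase difference between the pulse-modulated
    light output from the light source and the reflected light incident
    on the photodetector. The round trip covers 2 * d, hence 4 * pi."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# Illustrative example: a pi/2 phase shift at 10 MHz modulation
d = tof_distance(math.pi / 2, 10e6)  # about 3.75 m
```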
  • Based on the tilt angle detected by the tilt sensor, the distance from the projector to the screen, and the distance from the screen to the person, measured by the distance measuring means, the control unit corrects the distortion of the image formed on the image modulation means so that the projected image is rectangular when viewed from the person.
  • As another distortion correction method, there is a method in which a camera is installed at an arbitrary viewpoint position, the image projected on the object is captured by the camera, and distortion correction data is created from the degree of distortion in the captured image. As yet another method, there is a method in which the operator, while observing from an arbitrary position the degree of distortion of the image projected on the object, adjusts the image through a user interface until the distortion is eliminated, thereby creating distortion correction data.
  • Patent Document 1 discloses distance measuring means for detecting the position of the image projection area and the position of the person. However, the distance measuring means of Patent Document 1 merely measures distances to surrounding objects, and it is unclear how the position of the image projection area and the position of the person are actually detected. It is therefore also unclear whether the technique can be applied to distortion correction that takes the position of an arbitrary observer as the viewpoint. In the method of installing a camera at the viewpoint position, the camera must be re-installed every time the viewpoint is changed, which is very troublesome for the user. Likewise, in the method of correcting distortion through a user interface, the user must re-adjust the distortion correction every time the viewpoint is changed, which is also very troublesome.
  • An object of the present invention is to provide a projector, a remote controller, a projector system, and a distortion correction method capable of correcting distortion from an arbitrary viewpoint with a simple operation.
  • According to one aspect of the present invention, there is provided a projector used in combination with a remote controller that projects a predetermined pattern including three or more feature points, the projector comprising: an image forming unit that forms an image; a projection unit that projects the image formed by the image forming unit onto a projection surface; and distortion correction means for correcting the image formed on the image forming unit so that the projected image is not distorted when the projection surface is viewed from the position of the remote controller.
  • The distortion correction means includes: a first acquisition unit that performs three-dimensional measurement of the projection surface to obtain the three-dimensional position of the projection surface in a projector coordinate system, which represents the space into which the projection unit projects images in three-dimensional coordinates; a second acquisition unit that captures the predetermined pattern projected on the projection surface, extracts the three or more feature points from the captured image, and acquires the physical coordinates of the three or more feature points in the projector coordinate system; a storage unit that stores data indicating the radiation direction of each of the three or more feature points of the remote controller; and a remote controller position calculation unit that calculates the position of the remote controller in the projector coordinate system based on the three-dimensional position coordinates of the projection surface, the physical coordinates of the three or more feature points, and the radiation directions of the three or more feature points.
  • According to another aspect of the present invention, there is provided a remote controller comprising: a projection unit that projects a predetermined pattern including three or more feature points; a transmission unit that wirelessly transmits a distortion correction start request signal for requesting the start of distortion correction; first and second operation keys; and a control unit that causes the projection unit to project the predetermined pattern when the first operation key is pressed and causes the transmission unit to transmit the distortion correction start request signal when the second operation key is pressed.
  • According to still another aspect of the present invention, there is provided a projector system having a remote controller that projects a predetermined pattern including three or more feature points, and a projector including an image forming unit that forms an image and a projection unit that projects the image formed by the image forming unit onto a projection surface.
  • According to yet another aspect, there is provided a distortion correction method using such a remote controller, the method comprising correcting the image formed on the image forming unit so that the projected image is not distorted when the projection surface is viewed from the position of the remote controller.
  • The distortion correction includes: performing three-dimensional measurement of the projection surface to obtain the three-dimensional position of the projection surface in a projector coordinate system, which represents the space into which the projection unit projects images in three-dimensional coordinates; capturing the predetermined pattern projected on the projection surface, extracting the three or more feature points from the captured image, and acquiring the physical coordinates of the three or more feature points in the projector coordinate system; and calculating the position of the remote controller in the projector coordinate system based on the three-dimensional position coordinates of the projection surface, the physical coordinates of the three or more feature points, and the radiation directions of the three or more feature points.
  • According to the present invention, distortion correction from an arbitrary viewpoint can be performed with a simple operation.
  • FIG. 1 is a block diagram showing the configuration of a projector according to a first embodiment of the present invention.
  • FIG. 2 is a block diagram showing the configuration of the remote controller used with the projector shown in FIG. 1.
  • FIG. 3 is a schematic diagram for explaining the positional relationship among the detection range of the three-dimensional sensor unit, the imaging range of the camera unit, and the projection area of the projection lens unit.
  • FIG. 4 is a schematic diagram for explaining the positional relationship between the pattern projected from the remote controller and the projection area of the projector.
  • FIG. 5 is a flowchart showing one procedure of the distortion correction data calculation process.
  • FIG. 6 is a schematic diagram for explaining the relationship among the angle of view at the time of projection, the projection optical axis, and the projection center point.
  • FIG. 1 is a block diagram showing a configuration of a projector according to the first embodiment of the present invention.
  • the projector includes a projection unit 2, a reception unit 3, and a distortion correction data calculation unit 4.
  • the projection unit 2 projects an image based on the video signal 1 from the video supply device.
  • the receiving unit 3 receives a remote control signal (including a distortion correction start request and the like) from the remote controller.
  • the distortion correction data calculation unit 4 calculates distortion correction data so that an image without distortion can be visually recognized when the projection plane is viewed from the position of the remote controller.
  • the video supply device is, for example, an information processing device such as a personal computer or a video device such as a recorder.
  • The projection unit 2 includes a video processing unit 5, a distortion correction unit 6, a video mute unit 7, and a projection lens unit 8.
  • the video signal 1 is input to the video processing unit 5.
  • the video processing unit 5 performs processing for converting the resolution of the video signal 1 to the resolution of the display device of the projection lens unit 8, processing for adjusting image quality, and the like.
  • the distortion correction unit 6 corrects the distortion of the projected image when the position of the remote controller is set as the viewpoint, according to the distortion correction coefficient, with respect to the video signal processed by the video processing unit 5.
  • The video mute unit 7 either supplies the video signal corrected by the distortion correction unit 6 to the projection lens unit 8 as it is, or creates a black screen image and outputs it to the projection lens unit 8 instead.
  • The projection lens unit 8 includes a display device that forms an image based on the video signal from the video mute unit 7 and a projection lens that projects the image formed by the display device.
  • The projection lens includes a lens that can move in the direction of the optical axis, a zoom mechanism that changes the angle of view according to the zoom position (the position of the lens on the optical axis), and a lens shift mechanism that shifts the entire projection lens in a direction orthogonal to the optical axis.
  • the projection lens unit 8 supplies zoom / shift position information indicating the zoom position and lens shift position of the projection lens to the distortion correction data calculation unit 4.
  • the display device can be referred to as an image forming element including an image forming surface including a plurality of pixels.
  • a liquid crystal display element, DMD (digital micromirror device), or the like can be used.
  • The distortion correction data calculation unit 4 includes a projector projection design data storage unit 9, a three-dimensional sensor unit 10, a three-dimensional sensor calibration data storage unit 11, a three-dimensional data projector coordinate conversion unit 12, a camera unit 13, a pattern feature point detection unit 14, a camera imaging design data storage unit 15, a pattern physical coordinate calculation unit 16, a camera calibration data storage unit 17, a pattern projector coordinate conversion unit 18, a pattern storage unit 19, a remote controller projection design data storage unit 20, a remote controller position calculation unit 21, and a distortion correction coefficient calculation unit 22.
  • the projector projection design data storage unit 9 stores design data related to the projection of the projector.
  • The design data is data necessary for determining the projector coordinate system, such as the angle of view and the projection center, from the zoom position and the lens shift position.
  • the three-dimensional sensor unit 10 measures the three-dimensional position of the projection surface of the projection object and outputs three-dimensional position data.
  • the three-dimensional sensor unit 10 includes a three-dimensional sensor arranged toward the optical axis direction of the projection lens.
  • the detection range of the three-dimensional sensor includes the entire projectable area of the projection lens.
  • As the three-dimensional sensor, a TOF (Time of Flight) type or a triangulation type can be used, but the sensor is not limited to these methods.
  • the TOF method is a method of performing three-dimensional measurement by projecting light toward an object and measuring the time until the projected light is reflected by the object and returned.
  • Examples of the triangulation method include a passive triangulation method and an active triangulation method.
  • In the passive triangulation method, an object is photographed simultaneously with two cameras arranged side by side, and the distance is measured using the principle of triangulation based on the difference in the object's position between the two captured images; this is also called the stereo camera method.
  • the active triangulation method is a method of irradiating light on an object and performing three-dimensional measurement using the principle of triangulation based on information on reflected light from the object.
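The passive (stereo camera) method described above can be sketched with the standard rectified-stereo relation Z = f·B/d, where f is the focal length in pixels, B the camera baseline, and d the disparity. The function name and the numbers below are illustrative assumptions, not values from this disclosure.

```python
def stereo_depth(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from rectified stereo images: Z = f * B / d, where the
    disparity d is the horizontal shift of the object between the two
    captured images (passive triangulation / stereo camera method)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Illustrative numbers: 800 px focal length, 10 cm baseline, 40 px disparity
z = stereo_depth(800.0, 0.10, 40.0)  # about 2.0 m
```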
  • the 3D sensor calibration data storage unit 11 stores 3D sensor calibration data.
  • the three-dimensional sensor calibration data includes parameters (rotation amount and translation amount) for converting the coordinate system of the three-dimensional sensor into the coordinate system of the projector, a reference zoom position, and a reference lens shift position.
  • the amount of rotation and the amount of translation are data obtained from the result of calibration that measures the positional relationship between the coordinate system of the three-dimensional sensor and the coordinate system of the projector.
  • the reference zoom position and the reference lens shift position are the zoom position and the lens shift position when calibration is performed.
  • The three-dimensional data projector coordinate conversion unit 12 acquires the three-dimensional position data of the projection surface from the three-dimensional sensor unit 10, the three-dimensional sensor calibration data (rotation amount, translation amount, reference zoom position, and reference lens shift position) from the three-dimensional sensor calibration data storage unit 11, the design data from the projector projection design data storage unit 9, and the zoom/shift position information from the projection lens unit 8. Based on the three-dimensional sensor calibration data, the design data, and the zoom/shift position information, the three-dimensional data projector coordinate conversion unit 12 converts the three-dimensional position data of the projection surface into three-dimensional position data in the projector coordinate system with the projection center as the origin.
  • the calibration data and design data can be called coordinate conversion data for converting the coordinate system of the three-dimensional sensor into a projector coordinate system with the projection center as the origin.
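The core of this conversion is a rigid transform built from the calibration rotation and translation. A minimal sketch follows; the identity rotation and the 5 cm sensor offset are illustrative assumptions, not calibration values from the embodiment.

```python
def sensor_to_projector(point, rotation, translation):
    """Rigid transform p_projector = R @ p_sensor + t, with R and t
    taken from the three-dimensional sensor calibration data."""
    return tuple(
        sum(rotation[i][j] * point[j] for j in range(3)) + translation[i]
        for i in range(3)
    )

# Illustrative calibration: identity rotation, sensor mounted 5 cm to
# the right of the projection center (origin of the projector system)
R = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
t = (-0.05, 0.0, 0.0)
p = sensor_to_projector((0.2, 0.1, 2.5), R, t)  # about (0.15, 0.1, 2.5)
```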
  • the camera unit 13 is arranged in the direction of the optical axis of the projection lens, and images the pattern projected from the remote controller onto the projection area.
  • the imaging range of the camera unit 13 includes the entire projectable area of the projection lens.
  • the pattern feature point detection unit 14 extracts a pattern projected by the remote controller from the captured image of the camera unit 13 and detects a plurality of feature points indicated by the pattern.
  • the number of feature points is three or more.
  • the pattern feature point detector 14 outputs pattern feature point information (two-dimensional coordinate information) indicating the coordinates of each feature point on the captured image.
  • the camera imaging design data storage unit 15 stores camera imaging design data related to imaging such as the angle of view of the camera.
  • the camera imaging design data is data necessary for setting a camera coordinate system such as an angle of view and an optical center.
  • the pattern physical coordinate calculation unit 16 acquires camera imaging design data from the camera imaging design data storage unit 15 and acquires pattern feature point information indicating the coordinates of each feature point from the pattern feature point detection unit 14.
  • the pattern physical coordinate calculation unit 16 calculates physical coordinates of each feature point in the camera coordinate system based on the camera imaging design data and the pattern feature point information.
  • the pattern physical coordinate calculation unit 16 outputs pattern feature point physical coordinate data, which is a calculation result of the physical coordinates of each feature point.
  • the physical coordinates will be briefly described.
  • Coordinates on the captured image, which are two-dimensional coordinates, expressed as three-dimensional coordinates on a reference plane placed at an arbitrary distance (based on design information such as the angle of view of the camera) are called physical coordinates.
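A minimal sketch of this definition of physical coordinates, assuming a pinhole camera with the principal point at the image center. This is an illustrative simplification; the real conversion uses the stored camera imaging design data.

```python
import math

def pixel_to_physical(u, v, width, height, hfov_deg, vfov_deg, ref_dist=1.0):
    """Express 2-D captured-image coordinates as 3-D coordinates on a
    reference plane placed ref_dist in front of the camera, derived
    from the camera's angle of view (pinhole model, principal point at
    the image center -- an illustrative assumption)."""
    half_w = ref_dist * math.tan(math.radians(hfov_deg) / 2.0)
    half_h = ref_dist * math.tan(math.radians(vfov_deg) / 2.0)
    x = (2.0 * u / (width - 1) - 1.0) * half_w
    y = (1.0 - 2.0 * v / (height - 1)) * half_h
    return (x, y, ref_dist)

# The image center maps to the point where the optical axis meets the plane
center = pixel_to_physical(959.5, 539.5, 1920, 1080, 60.0, 40.0)
```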
  • The camera calibration data storage unit 17 stores camera calibration data, which includes parameters (rotation amount and translation amount) for converting the camera coordinate system into the projector coordinate system, a reference zoom position, and a reference lens shift position.
  • the rotation amount and the translation amount are data obtained as a result of performing calibration for measuring the positional relationship between the camera coordinate system and the projector coordinate system.
  • the reference zoom position and the reference lens shift position are the zoom position and the lens shift position when calibration is performed.
  • The pattern projector coordinate conversion unit 18 acquires the pattern feature point physical coordinate data from the pattern physical coordinate calculation unit 16, the camera calibration data (rotation amount, translation amount, reference zoom position, and reference lens shift position) from the camera calibration data storage unit 17, the design data from the projector projection design data storage unit 9, and the zoom/shift position information from the projection lens unit 8.
  • Based on the camera calibration data, the design data, and the zoom/shift position information, the pattern projector coordinate conversion unit 18 converts the pattern feature point physical coordinate data into pattern feature point physical coordinate data in the projector coordinate system with the projection center as the origin.
  • the pattern storage unit 19 stores pattern information projected from the remote controller. For example, in the case of a pattern including four feature points, information such as the positional relationship of each feature point is stored in the pattern storage unit 19.
  • The remote controller projection design data storage unit 20 stores design data related to the projection of the remote controller, such as the angle of view of the remote controller's projection. From this design data, the direction (radiation direction) of the light beam toward each feature point can be obtained.
  • the remote controller position calculation unit 21 acquires the three-dimensional position coordinate data of the projection plane of the projector coordinate system from the three-dimensional data projector coordinate conversion unit 12, and the pattern feature point physical coordinate data of the projector coordinate system from the pattern projector coordinate conversion unit 18. , Pattern information is acquired from the pattern storage unit 19, and remote controller projection design data is acquired from the remote controller projection design data storage unit 20.
  • the remote controller position calculation unit 21 calculates the three-dimensional position of the remote controller in the projector coordinate system based on the three-dimensional position coordinate data, the pattern feature point physical coordinate data, the pattern information, and the remote controller projection design data.
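The disclosure does not spell out the solver, but the remote controller's position can be characterized as the point from which the measured feature points subtend the angles fixed by the remote controller's radiation directions. Below is a hedged sketch of such a residual, which a numerical minimizer over candidate positions could drive to zero; all names and the geometry are synthetic illustrations, not the embodiment's actual computation.

```python
import math
from itertools import combinations

def angle(p, a, b):
    """Angle subtended at viewpoint p by 3-D points a and b."""
    u = [a[i] - p[i] for i in range(3)]
    v = [b[i] - p[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(u, v))
    norm = math.sqrt(sum(x * x for x in u)) * math.sqrt(sum(x * x for x in v))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def residual(p, points, design_angles):
    """Mismatch between the angles the measured feature points subtend
    from candidate position p and the angles fixed by the remote
    controller's radiation directions (design data)."""
    return sum((angle(p, points[i], points[j]) - theta) ** 2
               for (i, j), theta in design_angles.items())

# Synthetic check (illustrative geometry): four feature points measured
# on a wall at z = 2 m, remote controller actually at (0.5, 0.5, 0)
points = [(0.0, 0.0, 2.0), (1.0, 0.0, 2.0), (0.0, 1.0, 2.0), (1.0, 1.0, 2.0)]
true_pos = (0.5, 0.5, 0.0)
design = {(i, j): angle(true_pos, points[i], points[j])
          for i, j in combinations(range(4), 2)}
# residual(true_pos, ...) is zero; minimizing over p recovers the position
```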
  • The distortion correction coefficient calculation unit 22 acquires the zoom/shift position information from the projection lens unit 8, the design data from the projector projection design data storage unit 9, the three-dimensional position coordinate data of the projection surface in the projector coordinate system from the three-dimensional data projector coordinate conversion unit 12, and the three-dimensional position data of the remote controller in the projector coordinate system from the remote controller position calculation unit 21.
  • Based on the zoom/shift position information, the design data, the three-dimensional position coordinate data of the projection surface, and the three-dimensional position data of the remote controller, the distortion correction coefficient calculation unit 22 calculates a distortion correction coefficient for correcting the projected image so that, with the position of the remote controller as the viewpoint, no distortion corresponding to the three-dimensional shape of the projection surface appears.
  • the distortion correction coefficient is supplied from the distortion correction coefficient calculation unit 22 to the distortion correction unit 6.
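For the special case of a flat projection surface, correcting the image for a given viewpoint reduces to a projective warp of the projector image; the helper below applies a 3x3 homography to a point. This is an illustrative simplification of the correction step: the method described here handles arbitrary three-dimensional surface shapes, not only planes, and the function is not part of the disclosure.

```python
def apply_homography(h, x, y):
    """Map an image point through a 3x3 homography h (projective warp)."""
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    return ((h[0][0] * x + h[0][1] * y + h[0][2]) / w,
            (h[1][0] * x + h[1][1] * y + h[1][2]) / w)

# Identity homography leaves points unchanged
identity = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
assert apply_homography(identity, 3.0, 4.0) == (3.0, 4.0)
```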
  • FIG. 2 is a block diagram showing the configuration of the remote controller.
  • the remote controller 50 includes an operation unit 51, a control unit 52, a transmission unit 53, and a pattern projection unit 54.
  • the operation unit 51 includes a plurality of operation keys, receives an input operation using these operation keys, and supplies an operation signal corresponding to the input operation to the control unit 52.
  • The operation unit 51 is provided with a pattern projection key and a distortion correction start key.
  • When the pattern projection key is pressed, an operation signal for starting pattern projection is supplied from the operation unit 51 to the control unit 52.
  • When the distortion correction start key is pressed, an operation signal indicating the start of distortion correction is supplied from the operation unit 51 to the control unit 52.
  • the pattern projection unit 54 projects a predetermined pattern including three or more feature points.
  • the predetermined pattern may be any pattern as long as three or more feature points can be specified.
  • the predetermined pattern may be a pattern composed of three or more bright spots, or may be a figure composed of a plurality of lines.
  • a pattern including four feature points is used as the predetermined pattern.
  • the pattern projection unit 54 may have any structure as long as a predetermined pattern can be projected.
  • For example, the pattern projection unit 54 may have a structure, such as that used in a laser pointer, for projecting a pattern composed of a plurality of bright spots or a figure composed of a plurality of lines.
  • For example, the pattern projection unit 54 may include four light sources arranged on a substrate and a lens group that projects the bright spot pattern formed by these light sources. An LD (laser diode) or an LED (light emitting diode) can be used as each light source. The direction of the chief ray from each of the four light sources toward its corresponding feature point is the radiation direction of that feature point.
  • a combination of three or more laser pointers each indicating one bright spot may be used as the pattern projection unit 54. In this case, the emission direction of each laser pointer is the radiation direction of each feature point.
  • The transmission unit 53 transmits a remote control signal (radio signal) using infrared rays or the like. Since such remote control signals are widely used in video equipment such as recorders, a detailed description of the signal structure is omitted here.
  • the control unit 52 controls the operations of the transmission unit 53 and the pattern projection unit 54 in accordance with an operation signal from the operation unit 51. Specifically, the control unit 52 causes the pattern projection unit 54 to project an image of a predetermined pattern in accordance with an operation signal indicating that the pattern projection key is pressed. In addition, the control unit 52 causes the transmission unit 53 to transmit a remote control signal indicating a distortion correction start request in accordance with an operation signal indicating that the distortion correction start key is pressed.
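The key handling just described can be sketched as follows; the class, method, and signal names are illustrative stand-ins for the operation unit 51, transmission unit 53, and pattern projection unit 54, not identifiers from the disclosure.

```python
class RemoteController:
    """Sketch of the FIG. 2 control flow: the pattern projection key
    starts pattern projection, and the distortion correction start key
    sends the start request over the wireless link."""

    def __init__(self, pattern_projection_unit, transmission_unit):
        self.pattern = pattern_projection_unit   # pattern projection unit 54
        self.tx = transmission_unit              # transmission unit 53

    def on_key(self, key: str) -> None:
        # The operation unit 51 reports which key was pressed
        if key == "pattern_projection":
            self.pattern.project()
        elif key == "distortion_correction_start":
            self.tx.send("DISTORTION_CORRECTION_START")
```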
  • FIG. 3 schematically shows the positional relationship among the detection range of the three-dimensional sensor unit 10, the imaging range of the camera unit 13, and the projection area of the projection lens unit 8.
  • the three-dimensional sensor 107, the camera 106, and the projection lens 101a are provided on the same surface of the casing.
  • the three-dimensional sensor 107 and the camera 106 are arranged toward the optical axis direction of the projection lens 101a.
  • the projection area can be enlarged or reduced using the zoom function, and the projection area can be moved up, down, left, and right using the lens shift function.
  • The projection area 110 is the projection area when the zoom is minimized with the lens shifted in the upper right direction.
  • the projection area 111 is a projection area when the zoom is maximized by shifting the lens in the upper right direction.
  • the projection area 112 is a projection area when the zoom is minimized by shifting the lens in the upper left direction.
  • the projection area 113 is a projection area when the lens is shifted in the upper left direction to maximize the zoom.
  • the projection area 114 is a projection area when the zoom is minimized by shifting the lens in the lower right direction.
  • the projection area 115 is a projection area when the zoom is maximized by shifting the lens in the lower right direction.
  • the projection area 116 is a projection area when the zoom is minimized by shifting the lens in the lower left direction.
  • the projection area 117 is a projection area when the zoom is maximized by shifting the lens in the lower left direction.
  • the detection range of the three-dimensional sensor 107 includes the entire projectable area where the image from the projection lens 101a can be projected, that is, the entire projection areas 110 to 117. Therefore, the three-dimensional sensor 107 can measure the three-dimensional position of the three-dimensional object arranged in the projectable area.
  • the three-dimensional sensor unit 10 supplies the three-dimensional position data that is the output of the three-dimensional sensor 107 to the three-dimensional data projector coordinate conversion unit 12.
  • the imaging range of the camera 106 also includes the entire projectable area, that is, the entire projection areas 110 to 117. Therefore, the camera 106 can capture a predetermined pattern projected on the projection plane by the remote controller 50 in the projectable area.
  • the camera unit 13 supplies the captured image that is the output of the camera 106 to the pattern feature point detection unit 14.
  • In this way, both the detection range of the three-dimensional sensor 107 and the imaging range of the camera 106 include the projection area.
  • the user uses the remote controller 50 to specify a position that is a viewpoint for viewing the projected image.
  • the pattern projection unit 54 projects a predetermined pattern on the projection surface.
  • the transmission unit 53 transmits a remote control signal indicating a distortion correction start request.
  • the projector 101 starts distortion correction processing in response to a remote control signal from the remote controller 50.
  • FIG. 4 shows the positional relationship between the pattern projected from the remote controller 50 and the projection area of the projector 101.
  • the remote controller 50 projects a pattern 120 composed of four points onto the projection area 103 on the projection object 104.
  • the projector 101 detects the position of the remote controller 50 based on the pattern 120, and calculates distortion correction data such that an image without distortion is seen when the detected position is used as the viewpoint.
  • FIG. 5 shows a procedure of distortion correction data calculation processing. The operation will be described below with reference to FIGS.
  • the receiving unit 3 determines whether it has received a remote control signal (distortion correction start request) from the remote controller 50 (step S100).
  • the receiving unit 3 notifies the distortion correction data calculating unit 4 to that effect.
  • in response to the distortion correction start request, the distortion correction data calculation unit 4 first causes the three-dimensional sensor 107 to three-dimensionally measure the projection surface of the projection area 103, and the three-dimensional position data obtained as the measurement result is supplied to the three-dimensional data projector coordinate conversion unit 12 (step S101).
  • the three-dimensional position data is point group data indicated by three-dimensional coordinates in the coordinate system of the three-dimensional sensor 107.
  • the three-dimensional data projector coordinate conversion unit 12 converts the three-dimensional position data of the projection plane into the three-dimensional position data of the projector coordinate system with the projection center 102 as the origin (step S102).
  • the projection center 102 which is the origin of the projector coordinate system will be briefly described.
  • FIG. 6A schematically shows the relationship between the angle of view, the projection optical axis, and the projection center when launching is performed.
  • FIG. 6B schematically shows the relationship between the angle of view, the projection optical axis, and the projection center when there is no launch.
  • FIG. 6C schematically shows the relationship between the angle of view, the projection optical axis, and the projection center when the angle of view is enlarged.
  • FIG. 6D schematically shows the relationship between the angle of view, the projection center axis, and the projection center when the lens is shifted upward.
  • the projection optical axis is an axis passing through the center of the image forming surface and perpendicular to the image forming surface.
  • FIGS. 6B to 6D correspond to cross sections in the vertical direction.
  • when the projector is placed on a table, a projection form called launching is used in which the image is projected above the projection optical axis so that the image appears above the height of the table.
  • launching is one form of the lens shift.
  • the projection center 102 is the point at which the straight lines connecting the four corner points of the projection area 103 with the corresponding four corner points of the image forming area of the display device 100 intersect.
  • the projection area 103 is the image of the image forming area of the display device 100 inverted vertically and horizontally. In practice, since the light is refracted by the lens, the lines connecting the four corners of the projection area 103 and the four corners of the image forming area of the display device 100 are not straight lines.
  • the projection center 102 needs to be determined in consideration of the lens configuration of the projection lens.
  • the four corner points of the image forming area of the display device 100 are points A, B, C, and D, and the four corner points of the projection area 103 are points a, b, c, and d, respectively.
  • points a, b, c, and d correspond to points A, B, C, and D, respectively, and the arrangement of points a, b, c, and d is inverted vertically and horizontally with respect to the arrangement of points A, B, C, and D.
  • at the projection center 102, the principal ray that exits from point A and reaches point a via the lens, the principal ray that exits from point B and reaches point b via the lens, the principal ray that exits from point C and reaches point c via the lens, and the principal ray that exits from point D and reaches point d via the lens are shown as intersecting one another.
  • Such an intersection of principal rays can be defined at the center of the aperture stop of the projection lens, for example, and can be calculated based on lens design data.
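The patent only states that the intersection of principal rays can be defined at the aperture stop and calculated from lens design data. As an illustrative sketch (not from the source — the function name and the least-squares formulation are assumptions), the point closest to a bundle of nearly intersecting 3-D lines can be found by linear least squares:

```python
import numpy as np

def nearest_point_to_lines(points, dirs):
    """Least-squares point closest to a set of 3-D lines.

    Line i passes through points[i] with direction dirs[i].  Minimizes
    the sum of squared distances from the returned point to every line.
    """
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for p, d in zip(points, dirs):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # projects onto the plane normal to d
        A += P
        b += P @ p
    return np.linalg.solve(A, b)

# Four synthetic principal rays that all pass through one point:
center = np.array([0.0, 0.1, 0.05])
dirs = np.array([[1.0, 1.0, 5.0], [-1.0, 1.0, 5.0],
                 [1.0, -1.0, 5.0], [-1.0, -1.0, 5.0]])
points = np.array([center + 0.3 * d / np.linalg.norm(d) for d in dirs])
estimate = nearest_point_to_lines(points, dirs)  # recovers `center`
```

With real lens data the rays do not intersect exactly, which is why a least-squares formulation (rather than pairwise intersection) is the natural choice here.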
  • in FIG. 6A, the projection optical axis 109 passes through the center of the lower end of the projection area 103, and the projection center 102 is located above the projection optical axis 109. In this case, the projection center axis does not coincide with the projection optical axis 109.
  • in FIG. 6B, the projection optical axis 109 passes through the center of the projection area 103, and the projection center 102 is located on the projection optical axis 109.
  • the projection center axis coincides with the projection optical axis 109.
  • FIG. 6C is an example in which the angle of view is enlarged as compared with the example of FIG. 6B. As in FIG. 6B, the projection optical axis 109 passes through the center of the projection area 103 and the projection center 102 is located on the projection optical axis 109, but the projection center 102 is arranged closer to the display device 100 than in the example of FIG. 6B.
  • the projection center axis coincides with the projection optical axis 109.
  • FIG. 6D is an example in which the lens shift is performed so that the projection area 103 is shifted upward as compared to the example of FIG. 6B.
  • the projection optical axis 109 passes through the center of the lower end of the projection area 103, and the projection center 102 is located above the projection optical axis 109. In this case, the projection center axis does not coincide with the projection optical axis 109.
  • the projection center 102 changes according to the zoom position and the lens shift position. Accordingly, the projection center 102 needs to be determined according to the current zoom position and lens shift position.
  • for the coordinate conversion, a rotation amount about three mutually orthogonal coordinate axes and a translation amount representing movement along these coordinate axes are used.
  • obtaining the rotation amount and the translation amount is called calibration. Since the position of the projection center 102, which is the origin of the projector coordinate system, is not fixed and moves with zooming and lens shift, the three-dimensional sensor calibration data storage unit 11 stores, together with the rotation amount and the translation amount obtained by calibration, the zoom position and the lens shift position at the time of calibration as the reference zoom position and the reference lens shift position, respectively.
  • the three-dimensional data projector coordinate conversion unit 12 performs coordinate conversion in the following procedure using the rotation amount, the translation amount, the reference zoom position, and the reference lens shift position.
  • (A1) The coordinates of the reference projection center point at the time of calibration are obtained from the design data stored in the projector projection design data storage unit 9, the reference zoom position, and the reference lens shift position.
  • (A2) The coordinates of the current projection center point are obtained in the same manner from the current zoom position and lens shift position supplied from the projection lens unit 8.
  • (A3) The translation amount for converting the coordinates of the reference projection center point into the coordinates of the current projection center point is obtained.
  • (A4) The three-dimensional position data from the three-dimensional sensor 107 is coordinate-converted using the rotation amount and the translation amount stored in the three-dimensional sensor calibration data storage unit 11.
  • (A5) The coordinates are then moved by the translation amount obtained in (A3), from the coordinates of the reference projection center point to the coordinates of the current projection center point, in accordance with the current zoom position and lens shift position.
  • By the above procedures (A1) to (A5), the three-dimensional position data from the three-dimensional sensor 107 is converted into the projector coordinate system with the current projection center point as the origin.
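Steps (A1) to (A5) amount to a rigid transform followed by a re-origin to the current projection center. A minimal NumPy sketch, assuming row-vector points and that both projection-center points are expressed in the calibration-time projector frame (the function and variable names are hypothetical, not from the patent):

```python
import numpy as np

def sensor_to_projector(points_sensor, R, t, c_ref, c_cur):
    """Sketch of steps (A1)-(A5): map 3-D sensor points into the projector
    coordinate system whose origin is the *current* projection center.

    R, t  : rotation and translation from calibration; they map sensor
            coordinates into the projector frame whose origin is the
            reference projection center.
    c_ref : reference projection-center point (design data, reference zoom
            and lens-shift positions), per step (A1).
    c_cur : current projection-center point (current zoom and lens-shift
            positions), per step (A2).
    """
    p = points_sensor @ R.T + t   # step (A4): apply the calibration transform
    return p - (c_cur - c_ref)    # steps (A3)+(A5): re-origin to c_cur

# Example with an identity rotation:
R = np.eye(3)
t = np.array([0.0, 0.0, 1.0])
c_ref = np.zeros(3)
c_cur = np.array([0.0, 0.1, 0.0])  # projection center moved slightly upward
out = sensor_to_projector(np.array([[1.0, 2.0, 3.0]]), R, t, c_ref, c_cur)
# out is [[1.0, 1.9, 4.0]]
```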
  • the camera unit 13 including the camera 106 supplies the video mute unit 7 with a mute start instruction signal to output a black screen video.
  • the video mute unit 7 supplies a black screen video signal to the projection lens unit 8 in response to the mute start instruction signal.
  • the projection lens unit 8 projects a black screen image onto the projection area 103 (step S103). Projecting a black screen image allows the predetermined pattern projected from the remote controller 50 to be extracted accurately.
  • the camera 106 images the pattern 120 projected from the remote controller 50 (step S104).
  • the camera unit 13 including the camera 106 supplies a mute release instruction signal to the video mute unit 7.
  • the video mute unit 7 supplies the video signal from the distortion correction unit 6 to the projection lens unit 8 in response to the mute release instruction signal.
  • the pattern feature point detection unit 14 extracts four feature points of the pattern 120 from the captured image from the camera 106 (step S105).
  • the pattern physical coordinate calculation unit 16 calculates physical coordinates in the camera coordinate system of each of the four feature points (step S106).
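The patent does not describe how the pattern physical coordinate calculation unit 16 derives physical coordinates from pixel positions. A common approach, shown here purely as an assumption, is to back-project pixels through a pinhole model onto the z = 1 plane of the camera coordinate system using the camera's design (intrinsic) parameters fx, fy, cx, cy:

```python
import numpy as np

def pixel_to_camera_coords(uv, fx, fy, cx, cy):
    """Back-project pixel coordinates (u, v) to physical coordinates on the
    z = 1 plane of the camera coordinate system (pinhole model assumed).

    fx, fy : focal lengths in pixels; cx, cy : principal point in pixels.
    """
    uv = np.asarray(uv, dtype=float)
    x = (uv[..., 0] - cx) / fx
    y = (uv[..., 1] - cy) / fy
    return np.stack([x, y, np.ones_like(x)], axis=-1)

# The principal point back-projects onto the optical axis:
p = pixel_to_camera_coords([320.0, 240.0], fx=800.0, fy=800.0, cx=320.0, cy=240.0)
# p is [0.0, 0.0, 1.0]
```

Together with the camera's optical center at the origin, each such point defines the ray toward one pattern feature, which is what the later intersection step needs.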
  • the pattern projector coordinate conversion unit 18 converts the physical coordinates of each feature point of the pattern in the camera coordinates into physical coordinates in the projector coordinate system (step S107).
  • the coordinate transformation in step S107 can also be performed by the same method as the above-described three-dimensional sensor projector coordinate transformation.
  • the pattern projector coordinate conversion unit 18 performs coordinate conversion in the following procedure using the rotation amount, the translation amount, the reference zoom position, and the reference lens shift position.
  • (B1) The coordinates of the reference projection center point at the time of calibration are obtained from the design data stored in the projector projection design data storage unit 9, the reference zoom position, and the reference lens shift position.
  • (B2) The coordinates of the current projection center point are obtained in the same manner from the current zoom position and lens shift position supplied from the projection lens unit 8.
  • (B3) The translation amount for converting the coordinates of the reference projection center point into the coordinates of the current projection center point is obtained.
  • (B4) The physical coordinates of each feature point of the pattern in the camera coordinate system from the pattern physical coordinate calculation unit 16 are coordinate-converted using the rotation amount and the translation amount stored in the camera calibration data storage unit 17.
  • (B5) The coordinates are then moved by the translation amount obtained in (B3), from the coordinates of the reference projection center point to the coordinates of the current projection center point, in accordance with the current zoom position and lens shift position.
  • By the above procedures (B1) to (B5), the physical coordinates of each feature point of the pattern in the camera coordinate system are converted into the projector coordinate system with the current projection center point as the origin.
  • at this time, the origin (0, 0, 0) of the camera coordinate system, which is the optical center of the camera 106, is also converted into the projector coordinate system.
  • the remote controller position calculation unit 21 calculates the three-dimensional position of the remote controller in the projector coordinate system based on the three-dimensional position data of the projection plane in the projector coordinate system (three-dimensional position coordinates of the projection plane) and the physical coordinates (three-dimensional position coordinates) of each feature point of the pattern in the projector coordinate system (step S108).
  • the distortion correction coefficient calculation unit 22 calculates, based on the three-dimensional position data of the projection plane in the projector coordinate system (three-dimensional position coordinates of the projection plane) and the three-dimensional position of the remote controller in the projector coordinate system, a distortion correction coefficient such that distortion corresponding to the three-dimensional shape of the projection plane does not occur in the projected image when the projection plane is viewed from the remote controller position as the viewpoint (step S109).
  • the distortion correction coefficient calculated as described above is supplied to the distortion correction unit 6.
  • the distortion correction unit 6 performs distortion correction on the video signal from the video processing unit 5 according to the distortion correction coefficient.
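The patent does not specify the form of the distortion correction coefficient. For a flat projection surface viewed from an oblique viewpoint, the correction reduces to a projective (homography) warp, so as a hedged illustration the standard direct linear transform (DLT) below estimates a 3×3 homography from four point correspondences; such a matrix could serve as the correction coefficient in the planar case:

```python
import numpy as np

def homography_dlt(src, dst):
    """Direct linear transform: 3x3 homography H with dst[i] ~ H @ src[i]
    (in homogeneous coordinates), from four or more 2-D correspondences."""
    rows = []
    for (x, y), (u, v) in zip(src, dst):
        rows.append([-x, -y, -1.0, 0.0, 0.0, 0.0, u * x, u * y, u])
        rows.append([0.0, 0.0, 0.0, -x, -y, -1.0, v * x, v * y, v])
    # Null vector of the stacked constraint matrix = flattened homography.
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 3)

def apply_h(H, pt):
    """Map a 2-D point through H with homogeneous normalization."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Map the unit square onto an arbitrary convex quadrilateral:
src = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
dst = [(0.1, 0.0), (0.9, 0.1), (1.0, 1.0), (0.0, 0.9)]
H = homography_dlt(src, dst)
# apply_h(H, src[i]) reproduces dst[i] for each corner
```

Pre-warping the video with the inverse of the viewpoint-induced homography is what makes the projected image appear rectangular from the chosen viewpoint; for a non-planar surface the correction would instead be a per-region or per-pixel mapping.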
  • FIG. 7 schematically shows a method for calculating the three-dimensional position of the remote controller 50.
  • in FIG. 7, the coordinate axes of the projector coordinate system are denoted by the XYZ axes, the coordinate axes of the three-dimensional sensor coordinate system are denoted by the X′Y′Z′ axes, and the coordinate axes of the camera coordinate system are denoted by the X″Y″Z″ axes.
  • the origins of the projector coordinate system, the three-dimensional sensor coordinate system, and the camera coordinate system are different from each other.
  • the pattern feature point detection unit 14 extracts the four feature points of the pattern 120 from the captured image 121 from the camera 106, and the pattern physical coordinate calculation unit 16 calculates the physical coordinates 122 of each of the four feature points in the camera coordinate system.
  • the pattern projector coordinate conversion unit 18 converts the physical coordinates 122 of each feature point in the camera coordinate system into physical coordinates in the projector coordinate system. Further, the pattern projector coordinate conversion unit 18 converts the origin (0,0,0) of the camera coordinate system, which is the optical center of the camera 106, into the projector coordinate system.
  • the remote controller position calculation unit 21 obtains four straight lines connecting the physical coordinates of the four feature points and the origin of the camera coordinate system in the projector coordinate system, and each straight line and the projector converted by the three-dimensional data projector coordinate conversion unit 12 An intersection point with the three-dimensional position of the projection plane in the coordinate system is obtained.
  • the remote controller position calculation unit 21 calculates the three-dimensional position of the remote controller 50 in the projector coordinate system from the pattern data stored in the pattern storage unit 19 and the design data stored in the remote controller projection design data storage unit 20, using the condition that the pattern projected from the remote controller 50 falls on the three-dimensional positions of the intersection points.
  • the remote controller position calculation unit 21 acquires the light ray directions respectively directed to the four feature points of the remote controller 50 based on the design data stored in the remote controller projection design data storage unit 20. Then, a unique position of the remote controller is obtained such that light beams traveling toward the four feature points pass through the four intersection points on the projection plane in the projector coordinate system. This obtained position is the three-dimensional position of the remote controller 50 in the projector coordinate system.
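The two geometric steps above — intersecting the camera rays with the projection surface, then finding the unique remote-controller position whose known emission directions pass through those intersection points — can be sketched as follows. This is an illustrative NumPy formulation, not the patent's stated method: a plane stands in for the measured surface, and the linear least-squares solution for the position is an assumption.

```python
import numpy as np

def intersect_ray_plane(origin, through, plane_point, plane_normal):
    """Intersection of the ray from `origin` through `through` with a plane."""
    d = through - origin
    s = np.dot(plane_point - origin, plane_normal) / np.dot(d, plane_normal)
    return origin + s * d

def remote_position(intersections, directions):
    """Position t such that the rays t + s * d_i best pass through the
    points q_i (linear least squares over t)."""
    A = np.zeros((3, 3))
    b = np.zeros(3)
    for q, d in zip(intersections, directions):
        d = d / np.linalg.norm(d)
        P = np.eye(3) - np.outer(d, d)  # residual component normal to d
        A += P
        b += P @ q
    return np.linalg.solve(A, b)

# Synthetic check: a remote at t_true emits four rays onto the plane z = 2.
t_true = np.array([0.5, -0.2, 0.0])
emit_dirs = np.array([[0.2, 0.2, 1.0], [-0.2, 0.2, 1.0],
                      [0.2, -0.2, 1.0], [-0.2, -0.2, 1.0]])
plane_p, plane_n = np.array([0.0, 0.0, 2.0]), np.array([0.0, 0.0, 1.0])
qs = np.array([intersect_ray_plane(t_true, t_true + d, plane_p, plane_n)
               for d in emit_dirs])
recovered = remote_position(qs, emit_dirs)  # recovers t_true
```

With three or more non-parallel emission directions the normal matrix is invertible, which matches the patent's statement that at least three feature points are needed to determine the remote controller position uniquely.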
  • distortion correction can be automatically performed using the position designated by the remote controller 50 as a viewpoint. Therefore, distortion correction corresponding to an arbitrary viewpoint can be performed with a simple operation using a remote controller.
  • the operation has been described taking a pattern including four feature points as an example, but the number of feature points necessary for calculating the position of the remote controller is three or more. A polygonal graphic pattern composed of a plurality of lines may also be used instead of a bright-spot pattern; in this case, each vertex of the polygon is extracted as a feature point.
  • a portion including the distortion correction unit 6 and the distortion correction data calculation unit 4 can be referred to as a distortion correction unit.
  • a portion including the projector projection design data storage unit 9, the three-dimensional sensor unit 10, the three-dimensional sensor calibration data storage unit 11, and the three-dimensional data projector coordinate conversion unit 12 can be referred to as a first acquisition unit.
  • a portion including the camera unit 13, the pattern feature point detection unit 14, the camera imaging design data storage unit 15, the pattern physical coordinate calculation unit 16, the camera calibration data storage unit 17, and the pattern projector coordinate conversion unit 18 can be referred to as a second acquisition unit.
  • the pattern storage unit 19 and the remote controller projection design data storage unit 20 can be called storage units.
  • FIG. 8 is a block diagram showing a configuration of a projector according to the second embodiment of the present invention.
  • the projector is used in combination with a remote controller that projects a predetermined pattern including three or more feature points.
  • the projector includes an image forming unit 200 that forms an image, a projection unit 201 that projects the image formed by the image forming unit 200, and a distortion correction unit 202 that corrects the image formed on the image forming unit 200 so that the projected image is not distorted when the projection surface is viewed from the position of the remote controller.
  • the distortion correction unit 202 includes a first acquisition unit 203, a second acquisition unit 204, a storage unit 205, and a remote controller position calculation unit 206.
  • the first acquisition unit 203 performs three-dimensional measurement of the projection plane to acquire the three-dimensional position coordinates of the projection plane in a projector coordinate system in which the space into which the projection unit 201 projects an image is represented by three-dimensional coordinates.
  • the second acquisition unit 204 captures a predetermined pattern projected on the projection plane, extracts three or more feature points from the captured image of the pattern, and acquires the physical coordinates of the three or more feature points in the projector coordinate system.
  • the storage unit 205 stores data indicating the radiation direction of each of the three or more feature points of the remote controller.
  • the remote controller position calculation unit 206 calculates the position of the remote controller using the three-dimensional position coordinates of the projection plane, the physical coordinates of the three or more feature points, and the radiation directions of the three or more feature points in the projector coordinate system.
  • distortion correction can be automatically performed using the position designated by the remote controller as a viewpoint. Therefore, distortion correction corresponding to an arbitrary viewpoint can be performed with a simple operation using a remote controller.
  • the distortion correction unit 202 may include a distortion correction coefficient calculation unit that calculates a distortion correction coefficient for correcting distortion of the projected image based on the three-dimensional position coordinates of the projection surface and the position of the remote controller, and the image forming unit 200 may be configured to correct the formed image according to the distortion correction coefficient. In the projector according to the present embodiment, the second acquisition unit 204 may cause the image forming unit 200 to form a black screen image and capture the predetermined pattern while the black screen is projected.
  • the projector may further include a receiving unit that receives a distortion correction start request signal transmitted from the remote controller, and the first acquisition unit 203, the second acquisition unit 204, and the remote controller position calculation unit 206 may be configured to operate in response to the distortion correction start request signal.
  • the image forming unit 200 and the projection unit 201 correspond to the projection lens unit 8 shown in FIG.
  • the distortion correction unit 202 corresponds to the distortion correction unit 6 and the distortion correction data calculation unit 4 illustrated in FIG.
  • the first acquisition unit 203 corresponds to the portion including the projector projection design data storage unit 9, the three-dimensional sensor unit 10, the three-dimensional sensor calibration data storage unit 11, and the three-dimensional data projector coordinate conversion unit 12 illustrated in FIG.
  • the second acquisition unit 204 corresponds to the portion including the camera unit 13, the pattern feature point detection unit 14, the camera imaging design data storage unit 15, the pattern physical coordinate calculation unit 16, the camera calibration data storage unit 17, and the pattern projector coordinate conversion unit 18 illustrated in FIG.
  • the storage unit 205 corresponds to the pattern storage unit 19 and the remote controller projection design data storage unit 20 shown in FIG.
  • the remote controller position calculation unit 206 corresponds to the remote controller position calculation unit 21 shown in FIG.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Transforming Electric Information Into Light Information (AREA)
  • Projection Apparatus (AREA)

Abstract

Provided is a projector capable of correcting distortion for an arbitrary viewpoint through a simple operation. The projector comprises: an image forming unit (200); a projection unit (201) that projects an image formed by the image forming unit; and a distortion correction means (202) that corrects the image formed by the image forming unit so that distortion is not generated in the projected image. The distortion correction means comprises: a first acquisition unit (203) for acquiring three-dimensional position coordinates of a projection surface in a projector coordinate system; a second acquisition unit (204) for extracting three or more feature points from a captured image of a prescribed pattern projected onto the projection surface from a remote control device, and acquiring the physical coordinates of each feature point in the projector coordinate system; a storage unit (205) in which data indicating the respective radiation direction of each feature point is stored; and a remote control device position calculation unit (206) for calculating the position of the remote control device on the basis of the three-dimensional position coordinates of the projection surface, the physical coordinates of each feature point, and the radiation direction of each feature point.
PCT/JP2017/015159 2017-04-13 2017-04-13 Projecteur, dispositif de commande à distance, système de projecteur et procédé de correction de distorsion WO2018189867A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/015159 WO2018189867A1 (fr) 2017-04-13 2017-04-13 Projecteur, dispositif de commande à distance, système de projecteur et procédé de correction de distorsion

Publications (1)

Publication Number Publication Date
WO2018189867A1 true WO2018189867A1 (fr) 2018-10-18

Family

ID=63793205

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/015159 WO2018189867A1 (fr) 2017-04-13 2017-04-13 Projecteur, dispositif de commande à distance, système de projecteur et procédé de correction de distorsion

Country Status (1)

Country Link
WO (1) WO2018189867A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004187052A (ja) * 2002-12-04 2004-07-02 Seiko Epson Corp 画像処理システム、プロジェクタ、携帯型装置および画像処理方法
JP2004297401A (ja) * 2003-03-26 2004-10-21 Sharp Corp プロジェクタの自動調整装置
JP2005039518A (ja) * 2003-07-15 2005-02-10 Sharp Corp 投写型プロジェクタ
JP2009206798A (ja) * 2008-02-27 2009-09-10 Seiko Epson Corp 画像表示システム、画像表示方法
JP2011176637A (ja) * 2010-02-24 2011-09-08 Sanyo Electric Co Ltd 投写型映像表示装置
JP2014056030A (ja) * 2012-09-11 2014-03-27 Ricoh Co Ltd 画像投影システム、画像投影システムの運用方法、画像投影装置、及び画像投影システムの遠隔操作装置

Similar Documents

Publication Publication Date Title
JP3960390B2 (ja) 台形歪み補正装置を備えたプロジェクタ
US9835445B2 (en) Method and system for projecting a visible representation of infrared radiation
EP3054693B1 (fr) Appareil d'affichage d'image et son procédé de pointage
JP5401940B2 (ja) 投写光学系のズーム比測定方法、そのズーム比測定方法を用いた投写画像の補正方法及びその補正方法を実行するプロジェクタ
KR102133492B1 (ko) 깊이 카메라와 비행체를 이용한 강건한 디스플레이 시스템 및 방법
US11620732B2 (en) Multi-projection system, image projection method and projector
CN108007344B (zh) 用于可视地表示扫描数据的方法、存储介质和测量系统
JP6804056B2 (ja) 投写型表示装置、投写型表示装置の制御方法、及びプログラム
JP2019215811A (ja) 投影システム、画像処理装置および投影方法
CN104660944A (zh) 图像投影装置及图像投影方法
JP6990694B2 (ja) プロジェクタ、マッピング用データ作成方法、プログラム及びプロジェクションマッピングシステム
JP3742085B2 (ja) 傾斜角度測定装置を有するプロジェクタ
JP2014134611A (ja) 幾何歪み補正装置、プロジェクタ装置、及び幾何歪み補正方法
JP2013152224A (ja) 光学システム
WO2012085990A1 (fr) Dispositif d'affichage à projection et procédé donnant des consignes relatives à une orientation d'installation
JP2005004165A (ja) 傾斜角度測定装置を有するプロジェクタ
WO2018189867A1 (fr) Projecteur, dispositif de commande à distance, système de projecteur et procédé de correction de distorsion
JP4301028B2 (ja) 投影装置、角度検出方法及びプログラム
JP3730982B2 (ja) プロジェクタ
JP2015142157A (ja) 映像投影システム、投影制御装置、投影制御用プログラム
JP3709406B2 (ja) 自動台形歪補正手段を有するプロジェクタ
KR102207256B1 (ko) 화상 표시 장치 및 그의 포인팅 방법
CN114936978B (zh) 投影校正量的获取方法、投影系统以及投影校正方法
JP4535749B2 (ja) 距離傾斜角度測定装置を有するプロジェクタ
WO2024162148A1 (fr) Dispositif de traitement de données optiques, procédé de traitement de données optiques et programme de traitement de données optiques

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17905061

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17905061

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP