WO2018006669A1 - Parallax fusion method and apparatus - Google Patents
- Publication number
- WO2018006669A1 (PCT/CN2017/086950)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- flow field
- field motion
- motion relationship
- relationship
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/275—Image signal generators from 3D object models, e.g. computer-generated stereoscopic image signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/20—Image signal generators
- H04N13/293—Generating mixed stereoscopic images; Generating mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/361—Reproducing mixed stereoscopic images; Reproducing mixed monoscopic and stereoscopic images, e.g. a stereoscopic image overlay window on a monoscopic image background
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N2013/0074—Stereoscopic image analysis
- H04N2013/0088—Synthesising a monoscopic image signal from stereoscopic images, e.g. synthesising a panoramic or high resolution monoscopic image
Definitions
- the present application relates to the field of image processing, and in particular to a parallax fusion method and apparatus.
- the 360-degree panoramic video has gradually become one of the main forms of content in the virtual reality field because it provides users with a more realistic and immersive viewing experience than traditional limited-view video. Because single-lens systems that capture panoramic video are rare, panoramic video is generally stitched from video captured by multiple cameras or multiple lens systems.
- the present application proposes a parallax fusion method and apparatus, which can reduce or eliminate parallax-induced defects such as ghosting, virtual edges, and breaks or misalignments in continuous lines.
- a parallax fusion method, comprising:
- acquiring a first-direction flow field motion relationship and a second-direction flow field motion relationship of the overlap region of a first image and a second image; performing forward and backward deformation transformations on the overlap region of the first image and the overlap region of the second image using the first-direction and second-direction flow field motion relationships, respectively; and fusing the transformed overlap region of the first image with the transformed overlap region of the second image to obtain a final image of the overlap region of the first image and the second image.
- the application also provides a parallax fusion device, the device comprising:
- one or more memories; and
- one or more processors, wherein:
- the one or more memories storing one or more instruction modules configured to be executed by the one or more processors;
- the one or more instruction modules include:
- a motion relationship acquiring module configured to acquire a first direction flow field motion relationship and a second direction flow field motion relationship of the first image and the second image overlap region
- a deformation module configured to perform forward and backward deformation transformation on the overlapping area of the first image and the overlapping area of the second image by using the first direction flow field motion relationship and the second direction flow field motion relationship, respectively;
- a fusion module configured to fuse the overlap region of the first image transformed by the forward and backward deformations and the overlap region of the second image to obtain a final image of the first image and the second image overlap region.
- the present application also proposes a non-transitory computer readable storage medium storing computer readable instructions that cause at least one processor to:
- acquire a first-direction flow field motion relationship and a second-direction flow field motion relationship of the overlap region of a first image and a second image; perform forward and backward deformation transformations on the overlap region of the first image and the overlap region of the second image; and fuse the transformed overlap region of the first image with the transformed overlap region of the second image to obtain a final image of the overlap region of the first image and the second image.
- FIG. 1 is a schematic diagram showing the internal structure of an electronic device in an example
- FIG. 2 is a flow chart of a parallax fusion method in an example
- FIG. 3 is a structural block diagram of a parallax fusion device in an example
- FIG. 4 is a block diagram showing the internal structure of a correction module in an example
- FIG. 5 is a block diagram showing the internal structure of a deformation module in an example
- FIG. 6 is a block diagram showing the internal structure of a fusion module in an example.
- the inventors have found that, for a panoramic video stitched from multiple cameras or multiple lens systems, the optical perspective principle of the lenses implies that the content imaged by the two-dimensional imaging sensors of two non-coaxial cameras always exhibits some parallax in their common field of view. The degree of parallax varies with depth, which ultimately causes visually unacceptable defects such as ghosting, virtual edges, and breaks in continuous lines in the regions of the stitched video where parallax exists.
- for example, without departing from the scope of the present application, a first client may be referred to as a second client, and similarly, a second client may be referred to as a first client.
- Both the first client and the second client are clients, but they are not the same client.
- FIG. 1 is a schematic diagram showing the internal structure of an electronic device in an example.
- the electronic device includes a processor, a non-volatile storage medium, an internal memory, a network interface, a display screen, and an input device connected via a system bus.
- the non-volatile storage medium of the electronic device stores an operating system and a parallax fusion device, and the parallax fusion device is used to implement a parallax fusion method.
- the processor is used to provide computing and control capabilities to support the operation of the entire terminal.
- An internal memory in the electronic device provides an environment for the operation of the parallax fusion device in the non-volatile storage medium; the internal memory can store computer readable instructions that, when executed by the processor, cause the processor to perform a parallax fusion method.
- the network interface is used to communicate with other devices, and the like.
- the display screen of the electronic device may be a liquid crystal display or an electronic ink display, and the input device may be a touch layer covering the display screen, a button, trackball, or touchpad on the outer casing of the electronic device, or an external keyboard, trackpad, or mouse.
- the electronic device can be a cell phone, a personal computer, a tablet, a personal digital assistant, a wearable device, or a server.
- FIG. 1 is only a block diagram of the part of the structure related to the solution of the present application and does not limit the electronic devices to which the solution applies.
- A specific electronic device may include more or fewer components than shown in the figure, combine certain components, or arrange the components differently.
- FIG. 2 is a flow chart of a parallax fusion method in an example. As shown in FIG. 2, a parallax fusion method is implemented on an electronic device, including:
- Step 202: Acquire a first direction flow field motion relationship and a second direction flow field motion relationship of the first image and the second image overlap region.
- the captured panoramic video image is generally captured by multiple cameras or multiple lens systems.
- the video stitching is to stitch the first image and the second image.
- a portion of the overlapping area of the first image and the second image belonging to the first image is referred to as an overlapping area of the first image, and a portion of the overlapping area of the first image and the second image belonging to the second image is referred to as an overlap of the second image Area.
- the pixel coordinates of the overlap region of the first image and the second image, of the overlap region of the first image, and of the overlap region of the second image are identical; that is, the three regions share the same coordinate system.
- the first direction flow field motion relationship of the first image and the second image overlap region is a flow field motion relationship from the first image to the second image direction.
- the second direction flow field motion relationship of the first image and the second image overlap region is a flow field motion relationship from the second image to the first image direction.
- For example, the flow field motion relationship from left to right is denoted Flow_l2r, and the flow field motion relationship from right to left is denoted Flow_r2l.
- l2r is an abbreviation of left2right, marking the left-to-right direction; r2l is an abbreviation of right2left, marking the right-to-left direction.
- the pixel-by-pixel dense matching relationship between the overlap region of the first image and the overlap region of the second image, that is, the first-direction and second-direction flow field motion relationships, may be calculated with a classical optical-flow algorithm.
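As an illustration of this dense-matching step, here is a deliberately naive pixel-by-pixel block matcher in Python (the patent does not prescribe an algorithm; a real system would use a classical optical-flow method such as Farneback or DIS, and the function name, patch size, and search radius below are illustrative assumptions):

```python
import numpy as np

def dense_flow(ref, tgt, patch=3, search=4):
    """Naive dense matching: for every pixel of the reference image, find the
    best-matching patch in the target image within a small search window.
    Returns an (H, W, 2) flow field of (dx, dy) displacements."""
    h, w = ref.shape
    pad = patch // 2
    refp = np.pad(ref.astype(np.float64), pad, mode='edge')
    tgtp = np.pad(tgt.astype(np.float64), pad + search, mode='edge')
    flow = np.zeros((h, w, 2))
    for y in range(h):
        for x in range(w):
            block = refp[y:y + patch, x:x + patch]  # patch centred at (y, x)
            best, bdx, bdy = np.inf, 0, 0
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    cand = tgtp[y + search + dy:y + search + dy + patch,
                                x + search + dx:x + search + dx + patch]
                    cost = np.sum((block - cand) ** 2)  # sum of squared differences
                    if cost < best:
                        best, bdx, bdy = cost, dx, dy
            flow[y, x] = (bdx, bdy)
    return flow
```

Computed once from the first image's overlap region toward the second (Flow_l2r) and once in the opposite direction (Flow_r2l), this yields the two motion relationships of step 202.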
- Step 204: Correct the first-direction flow field motion relationship to obtain a corrected first-direction flow field motion relationship, and correct the second-direction flow field motion relationship to obtain a corrected second-direction flow field motion relationship.
- In an example, correcting the first-direction flow field motion relationship to obtain the corrected first-direction flow field motion relationship includes: performing a transitional joining correction toward the non-overlapping region of the first image on the first-direction flow field motion relationship to obtain the corrected first-direction flow field motion relationship.
- the non-overlapping region of the first image refers to the area of the first image that does not overlap with the second image.
- the transitional joining correction makes the transition between the overlap region and the non-overlapping region smooth after the deformation.
- Correcting the second-direction flow field motion relationship to obtain the corrected second-direction flow field motion relationship includes: performing a transitional joining correction toward the non-overlapping region of the second image on the second-direction flow field motion relationship to obtain the corrected second-direction flow field motion relationship.
- the non-overlapping area of the second image refers to an area in the second image that does not overlap with the first image.
- Step 206: Perform forward and backward deformation transformations on the overlap region of the first image and the overlap region of the second image using the corrected first-direction flow field motion relationship and the corrected second-direction flow field motion relationship, respectively.
- This yields four images: a forward-deformation-transformed image and a backward-deformation-transformed image of the overlap region of the first image, and a forward-deformation-transformed image and a backward-deformation-transformed image of the overlap region of the second image.
- If the deformation transformation uses the corrected first-direction flow field motion relationship, the first image is the reference image and the second image is the target image; if it uses the corrected second-direction flow field motion relationship, the second image is the reference image and the first image is the target image.
- the forward deformation is a deformation transformation from the reference image toward the target image according to the input flow field data, and the backward deformation is a deformation transformation from the target image toward the reference image.
- the purpose of correcting the first-direction and second-direction flow field motion relationships in step 204 is to obtain a better joining effect between the non-overlapping region and the overlap region; in practice, step 204 is optional and need not be performed.
- If the first-direction and second-direction flow field motion relationships obtained in step 202 are not corrected, step 206 can directly use the relationships obtained in step 202 to perform the forward and backward deformation transformations on the overlap region of the first image and the overlap region of the second image.
- Step 208: Merge the overlap region of the first image and the overlap region of the second image after the forward and backward deformation transformations to obtain a final image of the overlap region of the first image and the second image.
- the four deformation-transformed images are fused to obtain the final image of the overlap region of the first image and the second image.
- In this way, the first-direction and second-direction flow field motion relationships are used to perform forward and backward deformation transformations on the overlap region of the first image and the overlap region of the second image, and the transformed overlap regions are fused to obtain the final image of the overlap region.
- Because the flow field data includes occlusion areas caused by transitions between different depth planes, the deformation-transformed images of the overlap region contain different defects, and the defect regions produced by the forward and backward transformations are exactly complementary; the final image obtained after fusion therefore reduces or avoids parallax-induced breaks in continuous lines, virtual edges, ghosting, and the like.
- In an example, correcting the first-direction flow field motion relationship to join with the non-overlapping region of the first image includes: multiplying the horizontal component of the first-direction flow field motion relationship by a first coefficient factor that relates the pixel's horizontal coordinate in the overlap region to the horizontal pixel width of the overlap region, to obtain the horizontal component of the corrected first-direction flow field motion relationship, and taking the vertical component of the first-direction flow field motion relationship unchanged as the vertical component of the corrected relationship.
- the first coefficient factor may be the ratio x/N between the pixel's horizontal coordinate and the horizontal pixel width of the overlap region, or the ratio x/(N-1) between the horizontal coordinate and the horizontal pixel width minus one.
- In an example, the first coefficient factor is x/(N-1), calculated by formula (1): rFlow^h(x, y)_l2r = Flow^h(x, y)_l2r · x/(N-1), rFlow^v(x, y)_l2r = Flow^v(x, y)_l2r.
- rFlow^h(x, y)_l2r is the horizontal component of the corrected first-direction flow field motion relationship, and rFlow^v(x, y)_l2r is its vertical component
- Flow^h(x, y)_l2r is the horizontal component of the first-direction flow field motion relationship before correction
- Flow^v(x, y)_l2r is the vertical component of the first-direction flow field motion relationship before correction
- N is the horizontal pixel width of the overlap region of the first image and the second image
- x is the horizontal coordinate of the image pixel
- y is the vertical coordinate of the image pixel
- the superscript h denotes the horizontal component and the superscript v the vertical component
- l2r is an abbreviation of left2right, marking the direction from the first image to the second image (left to right).
- In another example, the first coefficient factor is the ratio x/N between the pixel's horizontal coordinate and the horizontal pixel width of the overlap region.
- In an example, correcting the second-direction flow field motion relationship to join with the non-overlapping region of the second image includes: multiplying the horizontal component of the second-direction flow field motion relationship by a second coefficient factor that relates the pixel's horizontal coordinate in the overlap region to the horizontal pixel width of the overlap region, to obtain the horizontal component of the corrected second-direction flow field motion relationship, and taking the vertical component of the second-direction flow field motion relationship unchanged as the vertical component of the corrected relationship.
- the second coefficient factor may be the difference between a predetermined constant and the ratio x/N of the pixel's horizontal coordinate to the horizontal pixel width of the overlap region, or the difference between a predetermined constant and the ratio x/(N-1) of the horizontal coordinate to the horizontal pixel width minus one.
- In an example, the second coefficient factor is the difference between a predetermined constant and x/(N-1).
- rFlow^h(x, y)_r2l is the horizontal component of the corrected second-direction flow field motion relationship, and rFlow^v(x, y)_r2l is its vertical component
- Flow^h(x, y)_r2l is the horizontal component of the second-direction flow field motion relationship before correction
- Flow^v(x, y)_r2l is the vertical component of the second-direction flow field motion relationship before correction
- N is the horizontal pixel width of the overlap region of the first image and the second image
- x is the horizontal coordinate of the image pixel
- y is the vertical coordinate of the image pixel
- the superscript h denotes the horizontal component and the superscript v the vertical component
- r2l is an abbreviation of right2left, marking the direction from the second image to the first image (right to left).
- the predetermined constant is 1, and of course, the predetermined constant is not limited to 1.
- In another example, the second coefficient factor is the difference between a predetermined constant and the ratio x/N of the pixel's horizontal coordinate to the horizontal pixel width of the overlap region, calculated using formula (4): rFlow^h(x, y)_r2l = Flow^h(x, y)_r2l · (1 - x/N), rFlow^v(x, y)_r2l = Flow^v(x, y)_r2l.
- That is, the first coefficient factor may be x/(N-1) with the second coefficient factor 1 - x/(N-1), or the first coefficient factor may be x/N with the second coefficient factor 1 - x/N; the coefficient factors are not limited to these.
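A minimal numpy sketch of this horizontal transitional-joining correction, using the x/(N-1) and 1 - x/(N-1) variants (the (H, W, 2) array layout with the horizontal component in channel 0 is an assumption, not part of the patent):

```python
import numpy as np

def correct_flow_l2r(flow):
    """First-direction (l2r) correction: scale the horizontal component by
    x/(N-1) so it vanishes at the left edge, which borders the first
    image's non-overlapping region; the vertical component is unchanged."""
    w = flow.shape[1]
    out = flow.copy()
    out[..., 0] *= (np.arange(w) / (w - 1))[None, :]
    return out

def correct_flow_r2l(flow):
    """Second-direction (r2l) correction: scale the horizontal component by
    1 - x/(N-1) so it vanishes at the right edge, which borders the second
    image's non-overlapping region."""
    w = flow.shape[1]
    out = flow.copy()
    out[..., 0] *= (1.0 - np.arange(w) / (w - 1))[None, :]
    return out
```

After this correction the warped overlap region meets the untouched non-overlapping region with zero displacement at the shared border, which is the smoothing effect the transitional joining aims at.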
- In another example, the first-direction flow field motion relationship is corrected to join with the non-overlapping region of the first image by multiplying the vertical component of the first-direction flow field motion relationship by a third coefficient factor that relates the pixel's vertical coordinate in the overlap region to the vertical pixel height of the overlap region, to obtain the vertical component of the corrected first-direction flow field motion relationship, and taking the horizontal component of the first-direction flow field motion relationship unchanged as the horizontal component of the corrected relationship.
- the third coefficient factor may be the ratio y/M between the pixel's vertical coordinate and the vertical pixel height of the overlap region, or the ratio y/(M-1) between the vertical coordinate and the vertical pixel height minus one.
- In an example, the third coefficient factor is y/(M-1), calculated using formula (5): rFlow^v(x, y)_u2d = Flow^v(x, y)_u2d · y/(M-1), rFlow^h(x, y)_u2d = Flow^h(x, y)_u2d.
- rFlow^h(x, y)_u2d is the horizontal component of the corrected first-direction flow field motion relationship, and rFlow^v(x, y)_u2d is its vertical component
- Flow^h(x, y)_u2d is the horizontal component of the first-direction flow field motion relationship before correction
- Flow^v(x, y)_u2d is the vertical component of the first-direction flow field motion relationship before correction
- M is the vertical pixel height of the overlap region of the first image and the second image
- x is the horizontal coordinate of the image pixel
- y is the vertical coordinate of the image pixel
- the superscript h denotes the horizontal component and the superscript v the vertical component
- u2d is an abbreviation of up2down, marking the direction from the first image to the second image (top to bottom).
- In another example, the third coefficient factor is the ratio y/M between the pixel's vertical coordinate and the vertical pixel height of the overlap region, calculated using formula (6).
- In an example, the second-direction flow field motion relationship is corrected to join with the non-overlapping region of the second image by multiplying the vertical component of the second-direction flow field motion relationship by a fourth coefficient factor that relates the pixel's vertical coordinate in the overlap region to the vertical pixel height of the overlap region, to obtain the vertical component of the corrected second-direction flow field motion relationship, and taking the horizontal component of the second-direction flow field motion relationship unchanged as the horizontal component of the corrected relationship.
- the fourth coefficient factor may be the difference between a predetermined constant and the ratio y/M of the pixel's vertical coordinate to the vertical pixel height of the overlap region, or the difference between a predetermined constant and the ratio y/(M-1) of the vertical coordinate to the vertical pixel height minus one.
- In an example, the fourth coefficient factor is the difference between a predetermined constant and y/(M-1), calculated using formula (7): rFlow^v(x, y)_d2u = Flow^v(x, y)_d2u · (1 - y/(M-1)), rFlow^h(x, y)_d2u = Flow^h(x, y)_d2u.
- rFlow^h(x, y)_d2u is the horizontal component of the corrected second-direction flow field motion relationship, and rFlow^v(x, y)_d2u is its vertical component
- Flow^h(x, y)_d2u is the horizontal component of the second-direction flow field motion relationship before correction
- Flow^v(x, y)_d2u is the vertical component of the second-direction flow field motion relationship before correction
- M is the vertical pixel height of the overlap region of the first image and the second image
- x is the horizontal coordinate of the image pixel
- y is the vertical coordinate of the image pixel
- the superscript h denotes the horizontal component and the superscript v the vertical component
- d2u is an abbreviation of down2up, marking the direction from the second image to the first image (bottom to top).
- the predetermined constant is 1, and of course the predetermined constant is not limited to 1.
- In another example, the fourth coefficient factor is the difference between a predetermined constant and the ratio y/M of the pixel's vertical coordinate to the vertical pixel height of the overlap region.
- That is, the third coefficient factor may be y/(M-1) with the fourth coefficient factor 1 - y/(M-1), or the third coefficient factor may be y/M with the fourth coefficient factor 1 - y/M; the coefficient factors are not limited to these.
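For vertically overlapping images the same pattern applies to the vertical flow component; a sketch with the y/(M-1) and 1 - y/(M-1) variants (an (H, W, 2) array with the vertical component in channel 1 is assumed):

```python
import numpy as np

def correct_flow_u2d(flow):
    """Third-factor (u2d) correction: scale the vertical component by
    y/(M-1); the horizontal component is unchanged."""
    h = flow.shape[0]
    out = flow.copy()
    out[..., 1] *= (np.arange(h) / (h - 1))[:, None]
    return out

def correct_flow_d2u(flow):
    """Fourth-factor (d2u) correction: scale the vertical component by
    1 - y/(M-1)."""
    h = flow.shape[0]
    out = flow.copy()
    out[..., 1] *= (1.0 - np.arange(h) / (h - 1))[:, None]
    return out
```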
- In an example, performing the forward and backward deformation transformations on the overlap region of the first image and the overlap region of the second image using the corrected first-direction and second-direction flow field motion relationships includes the following.
- If the corrected first-direction flow field motion relationship is used for the deformation transformation, the first image is the reference image and the second image is the target image; if the corrected second-direction flow field motion relationship is used for the deformation transformation, the second image is the reference image and the first image is the target image.
- the forward deformation is a deformation transformation from the reference image toward the target image, and the backward deformation is a deformation transformation from the target image toward the reference image.
- the overlap region of the first image is labeled I_L, and the overlap region of the second image is labeled I_R.
- the corrected first-direction flow field motion relationship rFlow(x, y)_l2r and the corrected second-direction flow field motion relationship rFlow(x, y)_r2l are used to transform I_L and I_R, yielding the following data:
- R' is the forward deformation transformed image of the overlapping region I_L of the first image
- R" is the backward deformation transformed image of the overlapping region I_L of the first image
- L' is the forward deformation transformed image of the overlapping region I_R of the second image
- L" is the backward deformation transformed image of the overlapping region I_R of the second image
- Forwardwarp represents the forward deformation transformation
- Backwardwarp represents the backward deformation transformation.
- Forwardwarp refers to the forward deformation transformation of the image I from the reference image to the target image according to the input flow field data flow. Because the flow field data flow includes occlusion areas caused by transitions between different depth planes, the R' and L' obtained by the Forwardwarp transformation contain a first type of defect area, such as void (hole) areas and doubly mapped areas; the first type of defect area is marked region_hole.
- Backwardwarp refers to the backward deformation transformation of the image I from the target image to the reference image according to the input flow field data flow. Since the flow field data flow includes occlusion areas caused by transitions between different depth planes, the R" and L" obtained by the Backwardwarp transformation no longer contain holes, but ghosting occurs instead.
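The contrast between the two transformations can be sketched on a single-channel image as follows (the HOLE sentinel, nearest-pixel rounding, and border clipping are implementation assumptions; the patent only fixes the scatter-versus-gather semantics):

```python
import numpy as np

HOLE = -1.0  # assumed sentinel marking first-type defect (hole) pixels

def forward_warp(img, flow):
    """Forward deformation: scatter every reference pixel to its displaced
    target position (x + dx, y + dy). Positions nothing maps to stay HOLE
    (void areas); colliding writes overwrite (doubly mapped areas)."""
    h, w = img.shape
    out = np.full((h, w), HOLE)
    for y in range(h):
        for x in range(w):
            tx = int(round(x + flow[y, x, 0]))
            ty = int(round(y + flow[y, x, 1]))
            if 0 <= tx < w and 0 <= ty < h:
                out[ty, tx] = img[y, x]
    return out

def backward_warp(img, flow):
    """Backward deformation: gather, for every output pixel, the source pixel
    at (x + dx, y + dy). Every output pixel receives a value, so there are no
    holes, but occluded regions sample the same source twice (ghosting)."""
    h, w = img.shape
    out = np.empty((h, w))
    for y in range(h):
        for x in range(w):
            sx = int(round(min(max(x + flow[y, x, 0], 0), w - 1)))
            sy = int(round(min(max(y + flow[y, x, 1], 0), h - 1)))
            out[y, x] = img[sy, sx]
    return out
```

Scattering leaves unfilled positions (region_hole) and lets collisions overwrite each other, exactly the first-type defect areas of Forwardwarp; gathering fills every pixel but can sample a source pixel twice, which is the ghosting of Backwardwarp.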
- Of the four deformation-transformed overlap-region images R', L', R" and L", each contains defects such as void areas, doubly mapped areas, or ghosting, but the defect areas produced by the Forwardwarp and Backwardwarp transformations are exactly complementary, so the images are fused to obtain the final image of the overlap region.
- In an example, merging the overlap regions of the first image and of the second image after the forward and backward deformation transformations to obtain the final image of the overlap region includes:
- fusing the forward-deformation-transformed image and the backward-deformation-transformed image of the overlap region of the first image to obtain a first fused image, and fusing the forward-deformation-transformed image and the backward-deformation-transformed image of the overlap region of the second image to obtain a second fused image.
- Fusion(x,y) is a fusion function.
- for a pixel whose horizontal coordinate is smaller than Mid, the final image of the overlap region of the first image and the second image takes its value from the first fused image;
- for a pixel whose horizontal coordinate is greater than or equal to Mid, the final image of the overlap region takes its value from the second fused image.
- the calculation formula is as shown in the formula (10).
- I overlap represents the final image.
- Mid indicates the intermediate separation position of the overlap region (the horizontal coordinate of the middle pixel of the overlap region), which can be obtained by taking the center line or by other methods.
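A sketch of the formula-(10) combination, splitting the overlap region at Mid and defaulting to the center line (the array layout is an assumption):

```python
import numpy as np

def combine_halves(fused_first, fused_second, mid=None):
    """Take the first fused image for columns left of the separation
    position Mid and the second fused image from Mid onward."""
    w = fused_first.shape[1]
    if mid is None:
        mid = w // 2  # default separation: the center line
    out = fused_second.copy()
    out[:, :mid] = fused_first[:, :mid]
    return out
```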
- In an example, fusing the forward-deformation-transformed image and the backward-deformation-transformed image of the overlap region of the first image to obtain the first fused image includes:
- at pixels outside the first type of defect region, the first fused image takes the forward-deformation-transformed image of the overlap region of the first image;
- at pixels inside the first type of defect region, the first fused image takes the backward-deformation-transformed image of the overlap region of the first image.
- an implementation of the fusion(x, y) function can be as shown in formula (11), where:
- R'(x, y) is the pixel value of R' at (x, y)
- R"(x, y) is the pixel value of R" at (x, y).
- Similarly, fusing the forward-deformation-transformed image and the backward-deformation-transformed image of the overlap region of the second image to obtain the second fused image includes:
- at pixels outside the first type of defect region, the second fused image takes the forward-deformation-transformed image of the overlap region of the second image;
- at pixels inside the first type of defect region, the second fused image takes the backward-deformation-transformed image of the overlap region of the second image;
- where L'(x, y) is the pixel value of L' at (x, y) and L"(x, y) is the pixel value of L" at (x, y).
- In this way, pixels in the first type of defect region are distinguished pixel by pixel from those outside it, and pixels belonging to the first type of defect region are replaced with pixels from the backward-deformation-transformed image; since the backward-deformation-transformed image contains no first-type defect regions, the defects can be eliminated.
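A sketch of such a fusion(x, y), assuming the hole pixels of the forward-warped image are marked with a sentinel value (how region_hole is represented is an implementation choice the patent leaves open):

```python
import numpy as np

HOLE = -1.0  # assumed sentinel for first-type defect (hole) pixels

def fuse(forward_img, backward_img):
    """Keep the forward-warped pixel where it is valid; inside hole regions
    fall back to the backward-warped pixel, whose defects are complementary."""
    return np.where(forward_img == HOLE, backward_img, forward_img)
```

Applied to (R', R") this yields the first fused image, and applied to (L', L") the second fused image.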
- the above takes the case where the first image and the second image overlap in the horizontal direction as an example; the first image and the second image may instead overlap and be merged in the vertical direction, and the fusion process is the same.
- FIG. 3 is a structural block diagram of a parallax fusion device in an example.
- a disparity fusion device is a device configured to implement the parallax fusion method, and the device includes:
- one or more memories; and
- one or more processors;
- the one or more memories storing one or more instruction modules configured to be executed by the one or more processors;
- the one or more instruction modules include: a motion relationship acquisition module 310, a correction module 320, a deformation module 330, and a fusion module 340, where:
- the motion relationship acquiring module 310 is configured to acquire a first direction flow field motion relationship and a second direction flow field motion relationship of the first image and the second image overlap region.
- the correction module 320 is configured to correct the first direction flow field motion relationship to obtain a corrected first direction flow field motion relationship, and to correct the second direction flow field motion relationship to obtain a corrected second direction flow field motion relationship.
- the deformation module 330 is configured to perform forward and backward deformation transformation on the overlapping area of the first image and the overlapping area of the second image, respectively, by using the corrected first direction flow field motion relationship and the corrected second direction flow field motion relationship.
- the fusion module 340 is configured to fuse the overlap region of the first image transformed by the forward and backward deformations and the overlap region of the second image to obtain a final image of the first image and the second image overlap region.
- the function of the correction module 320 is to obtain a better connection between the non-overlapping area and the overlapping area.
- the correction module 320 is not an essential instruction module; the deformation module 330 may instead directly use the first direction flow field motion relationship and the second direction flow field motion relationship acquired by the motion relationship acquisition module 310 to perform forward and backward deformation transformation on the overlap region of the first image and the overlap region of the second image.
- Figure 4 is a block diagram showing the internal structure of the correction module in one example.
- the correction module 320 includes a first correction unit 3202 and a second correction unit 3204, where:
- the first correction unit 3202 is configured to perform transition-joint correction on the first direction flow field motion relationship with respect to the non-overlapping region of the first image, to obtain the corrected first direction flow field motion relationship;
- the second correction unit 3204 is configured to perform transition-joint correction on the second direction flow field motion relationship with respect to the non-overlapping region of the second image, to obtain the corrected second direction flow field motion relationship.
- the first correction unit 3202 is further configured to multiply the horizontal direction component of the first direction flow field motion relationship by a first coefficient factor relating the pixel's horizontal coordinate to the horizontal pixel width of the overlap region, to obtain the corrected horizontal direction component of the first direction flow field motion relationship, and to use the vertical direction component of the first direction flow field motion relationship as the vertical direction component of the corrected first direction flow field motion relationship;
- the second correction unit 3204 is further configured to multiply the horizontal direction component of the second direction flow field motion relationship by a second coefficient factor relating the pixel's horizontal coordinate to the horizontal pixel width of the overlap region, to obtain the corrected horizontal direction component of the second direction flow field motion relationship, and to use the vertical direction component of the second direction flow field motion relationship as the vertical direction component of the corrected second direction flow field motion relationship.
- the first correction unit 3202 is further configured to multiply the vertical direction component of the first direction flow field motion relationship by a third coefficient factor relating the pixel's vertical coordinate to the vertical pixel height of the overlap region, to obtain the corrected vertical direction component of the first direction flow field motion relationship, and to use the horizontal direction component of the first direction flow field motion relationship as the horizontal direction component of the corrected first direction flow field motion relationship;
- the second correction unit 3204 is further configured to multiply the vertical direction component of the second direction flow field motion relationship by a fourth coefficient factor relating the pixel's vertical coordinate to the vertical pixel height of the overlap region, to obtain the corrected vertical direction component of the second direction flow field motion relationship, and to use the horizontal direction component of the second direction flow field motion relationship as the horizontal direction component of the corrected second direction flow field motion relationship.
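The exact coefficient factors are not given in this excerpt; a plausible reading is a ramp in x/W so that the corrected flow vanishes at the border shared with the non-overlapping region, giving a smooth transition joint. A hedged sketch for the horizontal-overlap case (a linear ramp is an assumption; the function name is illustrative):

```python
import numpy as np

def correct_flow_horizontal(flow_x, flow_y):
    """Sketch of the transition-joint correction for horizontal overlap.
    The horizontal flow component is scaled by a coefficient that depends on
    the pixel's horizontal coordinate x and the overlap width W; a linear
    ramp x / (W - 1) is assumed here. The vertical component is passed
    through unchanged, matching the claim text above."""
    h, w = flow_x.shape
    x = np.arange(w, dtype=float)
    coeff = x / max(w - 1, 1)       # coefficient factor: 0 at the border
    corrected_x = flow_x * coeff    # with the non-overlap area, 1 far side
    return corrected_x, flow_y      # vertical component kept as-is
```

The vertical-overlap case (third and fourth coefficient factors) would scale the vertical component by a ramp in y/H instead, leaving the horizontal component untouched.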
- FIG. 5 is a block diagram showing the internal structure of the deformation module in one example.
- the deformation module 330 includes a first deformation unit 3302, a second deformation unit 3304, a third deformation unit 3306, and a fourth deformation unit 3308, where:
- the first deformation unit 3302 is configured to perform forward deformation transformation on the overlapping region of the first image by using the corrected first direction flow field motion relationship to obtain a forward deformation transformation image of the overlapping region of the first image;
- the second deformation unit 3304 is configured to perform backward deformation transformation on the overlapping region of the first image by using the corrected second-direction flow field motion relationship to obtain a backward deformation transformation image of the overlapping region of the first image;
- the third deformation unit 3306 is configured to perform forward deformation transformation on the overlapping region of the second image by using the corrected second direction flow field motion relationship to obtain a forward deformation transformed image of the overlapping region of the second image;
- the fourth deformation unit 3308 is configured to perform backward deformation transformation on the overlapping region of the second image by using the corrected first direction flow field motion relationship to obtain a backward deformation transformed image of the overlapping region of the second image.
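The distinction between forward and backward deformation used by units 3302–3308 can be illustrated with a minimal nearest-neighbour sketch. The patent does not specify the interpolation, and the flow sign convention and function names here are assumptions: the key point is that the backward (gather) warp fills every output pixel, while the forward (scatter) warp can leave destinations untouched — the region the fusion step later repairs.

```python
import numpy as np

def backward_warp(src, flow_x, flow_y):
    """Backward (gather) warp: each output pixel samples the source at
    (x + flow_x, y + flow_y), rounded to the nearest pixel and clipped to
    the image. Every output pixel receives a value, so no holes appear."""
    h, w = src.shape[:2]
    ys, xs = np.mgrid[0:h, 0:w]
    sx = np.clip(np.rint(xs + flow_x).astype(int), 0, w - 1)
    sy = np.clip(np.rint(ys + flow_y).astype(int), 0, h - 1)
    return src[sy, sx]

def forward_warp(src, flow_x, flow_y, fill=-1):
    """Forward (scatter) warp: each source pixel is pushed to
    (x + flow_x, y + flow_y). Destinations that no source pixel maps to
    stay at `fill` -- these unfilled pixels form the region that the
    backward-warped image is used to repair during fusion."""
    h, w = src.shape[:2]
    dst = np.full_like(src, fill)
    ys, xs = np.mgrid[0:h, 0:w]
    dx = np.clip(np.rint(xs + flow_x).astype(int), 0, w - 1)
    dy = np.clip(np.rint(ys + flow_y).astype(int), 0, h - 1)
    dst[dy, dx] = src
    return dst
```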
- FIG. 6 is a block diagram showing the internal structure of a fusion module in an example.
- the fusion module 340 includes a first fusion unit 3402 and a second fusion unit 3404, where:
- the first fusion unit 3402 is configured to fuse the forward deformation transformed image and the backward deformation transformed image of the overlapping region of the first image to obtain a first fused image;
- the second fusion unit 3404 is configured to fuse the forward deformation transformed image and the backward deformation transformed image of the overlapping region of the second image to obtain a second fused image;
- the final image of the overlapping area of the first image and the second image is the first fused image;
- the final image of the overlapping area of the first image and the second image is the second fused image.
- if the pixel of the overlap region does not belong to the first-type hole region, the first fused image is the forward deformation transformed image of the overlap region of the first image; if the pixel of the overlap region belongs to the first-type hole region, the first fused image is the backward deformation transformed image of the overlap region of the first image;
- if the pixel of the overlap region does not belong to the first-type hole region, the second fused image is the forward deformation transformed image of the overlap region of the second image; if the pixel of the overlap region belongs to the first-type hole region, the second fused image is the backward deformation transformed image of the overlap region of the second image.
- the program, when executed, may include the flow of an example of each of the methods described above.
- the storage medium may be a magnetic disk, an optical disk, a read-only memory (ROM), or the like.
- An example of the present application provides a non-transitory computer readable storage medium storing computer readable instructions that may cause at least one processor to perform the parallax fusion method proposed in the above example, for example: acquiring a first direction flow field motion relationship and a second direction flow field motion relationship of the overlap region of a first image and a second image; performing forward and backward deformation transformation on the first image overlap region and the second image overlap region by using the first direction flow field motion relationship and the second direction flow field motion relationship, respectively; and merging the forward and backward deformation transformed overlap region of the first image and overlap region of the second image to obtain a final image of the overlap region of the first image and the second image.
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Image Processing (AREA)
Abstract
The present application relates to a parallax fusion method and apparatus. The method comprises: acquiring a first direction flow field motion relationship and a second direction flow field motion relationship of an overlap region between a first image and a second image; correcting the first direction flow field motion relationship to obtain a corrected first direction flow field motion relationship, and correcting the second direction flow field motion relationship to obtain a corrected second direction flow field motion relationship; performing forward and backward deformation transformation on the overlap region of the first image and the overlap region of the second image, respectively, by using the corrected first direction flow field motion relationship and the corrected second direction flow field motion relationship; and merging the overlap region of the first image and the overlap region of the second image, which have undergone the forward and backward deformation transformations, to obtain a final image of the overlap region between the first image and the second image.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201610522270.8 | 2016-07-04 | ||
| CN201610522270.8A CN106162143B (zh) | 2016-07-04 | 2016-07-04 | 视差融合方法和装置 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018006669A1 true WO2018006669A1 (fr) | 2018-01-11 |
Family
ID=58061810
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2017/086950 Ceased WO2018006669A1 (fr) | 2016-07-04 | 2017-06-02 | Procédé et appareil de fusion de parallaxe |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN106162143B (fr) |
| WO (1) | WO2018006669A1 (fr) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113159161A (zh) * | 2021-04-16 | 2021-07-23 | 深圳市商汤科技有限公司 | 目标匹配方法和装置、设备及存储介质 |
Families Citing this family (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN106162143B (zh) * | 2016-07-04 | 2018-11-09 | 腾讯科技(深圳)有限公司 | 视差融合方法和装置 |
| CN106815802A (zh) * | 2016-12-23 | 2017-06-09 | 深圳超多维科技有限公司 | 一种图像拼接方法及装置 |
| CN115810033A (zh) * | 2021-09-16 | 2023-03-17 | 北京极感科技有限公司 | 图像配准方法、计算机程序产品、存储介质及电子设备 |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5963664A (en) * | 1995-06-22 | 1999-10-05 | Sarnoff Corporation | Method and system for image combination using a parallax-based technique |
| CN105141920A (zh) * | 2015-09-01 | 2015-12-09 | 电子科技大学 | 一种360度全景视频拼接系统 |
| CN105205796A (zh) * | 2014-06-30 | 2015-12-30 | 华为技术有限公司 | 广域图像获取方法和装置 |
| CN105635808A (zh) * | 2015-12-31 | 2016-06-01 | 电子科技大学 | 一种基于贝叶斯理论的视频拼接方法 |
| CN106162143A (zh) * | 2016-07-04 | 2016-11-23 | 腾讯科技(深圳)有限公司 | 视差融合方法和装置 |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP6303090B2 (ja) * | 2014-03-24 | 2018-04-04 | アルパイン株式会社 | 画像処理装置および画像処理プログラム |
| US20160191795A1 (en) * | 2014-12-30 | 2016-06-30 | Alpine Electronics, Inc. | Method and system for presenting panoramic surround view in vehicle |
| CN105488760A (zh) * | 2015-12-08 | 2016-04-13 | 电子科技大学 | 基于流场的虚拟图像拼接方法 |
- 2016-07-04: CN application CN201610522270.8A filed (granted as patent CN106162143B, active)
- 2017-06-02: PCT application PCT/CN2017/086950 filed (WO2018006669A1, ceased)
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5963664A (en) * | 1995-06-22 | 1999-10-05 | Sarnoff Corporation | Method and system for image combination using a parallax-based technique |
| CN105205796A (zh) * | 2014-06-30 | 2015-12-30 | 华为技术有限公司 | 广域图像获取方法和装置 |
| CN105141920A (zh) * | 2015-09-01 | 2015-12-09 | 电子科技大学 | 一种360度全景视频拼接系统 |
| CN105635808A (zh) * | 2015-12-31 | 2016-06-01 | 电子科技大学 | 一种基于贝叶斯理论的视频拼接方法 |
| CN106162143A (zh) * | 2016-07-04 | 2016-11-23 | 腾讯科技(深圳)有限公司 | 视差融合方法和装置 |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN113159161A (zh) * | 2021-04-16 | 2021-07-23 | 深圳市商汤科技有限公司 | 目标匹配方法和装置、设备及存储介质 |
Also Published As
| Publication number | Publication date |
|---|---|
| CN106162143A (zh) | 2016-11-23 |
| CN106162143B (zh) | 2018-11-09 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2019052534A1 (fr) | Procédé et dispositif d'assemblage d'images et support d'images | |
| EP2328125B1 (fr) | Procédé et dispositif de raccordement d'images | |
| US8818101B1 (en) | Apparatus and method for feature matching in distorted images | |
| WO2020007320A1 (fr) | Procédé de fusion d'images à plusieurs angles de vision, appareil, dispositif informatique, et support de stockage | |
| WO2016074620A1 (fr) | Assemblage vidéo tolérant à la parallaxe avec déformation spatio-temporelle localisée et recherche de joints | |
| US11282232B2 (en) | Camera calibration using depth data | |
| JP5327339B2 (ja) | 画像処理システム、画像処理方法、およびプログラム | |
| WO2018068719A1 (fr) | Procédé et appareil de collage d'image | |
| US20140098296A1 (en) | Method and apparatus for changing a perspective of a video | |
| WO2021258579A1 (fr) | Procédé et appareil d'épissage d'image, dispositif informatique et support de stockage | |
| CN107316273A (zh) | 全景图像采集装置及采集方法 | |
| WO2018006669A1 (fr) | Procédé et appareil de fusion de parallaxe | |
| CN108171735B (zh) | 基于深度学习的十亿像素视频对齐方法及系统 | |
| WO2021017589A1 (fr) | Procédé de fusion d'images basé sur une mise en correspondance de domaine de gradient | |
| US11758101B2 (en) | Restoration of the FOV of images for stereoscopic rendering | |
| KR102450236B1 (ko) | 전자 장치, 그 제어 방법 및 컴퓨터 판독가능 기록 매체 | |
| CN114390262A (zh) | 用于拼接三维球面全景影像的方法及电子装置 | |
| WO2021185036A1 (fr) | Procédé et appareil d'affichage en temps réel et de génération de données de nuage de points, dispositif et support | |
| WO2018058476A1 (fr) | Procédé et dispositif de correction d'image | |
| CN113344789B (zh) | 图像拼接方法及装置、电子设备、计算机可读存储介质 | |
| US9767580B2 (en) | Apparatuses, methods, and systems for 2-dimensional and 3-dimensional rendering and display of plenoptic images | |
| CN114241127A (zh) | 全景图像生成方法、装置、电子设备和介质 | |
| JP2012173858A (ja) | 全方位画像生成方法、画像生成装置およびプログラム | |
| CN114143442B (zh) | 图像虚化方法、计算机设备、计算机可读存储介质 | |
| US20250200768A1 (en) | Image processing method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17823481; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 17823481; Country of ref document: EP; Kind code of ref document: A1 |