CN110741633A - Image processing method, electronic device, and computer-readable storage medium - Google Patents
Image processing method, electronic device, and computer-readable storage medium
- Publication number
- CN110741633A (application number CN201780091703.1A)
- Authority
- CN
- China
- Prior art keywords
- image
- point
- feature point
- feature
- coordinate
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
Landscapes
- Engineering & Computer Science (AREA)
- Multimedia (AREA)
- Signal Processing (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Image Processing (AREA)
- Studio Devices (AREA)
Abstract
The invention discloses an image processing method for an electronic device (100). The image processing method comprises the steps of: (S112) obtaining a first image and a second image, wherein the first image and the second image have a certain similarity; (S114) determining a first feature point of the first image and searching for a second feature point corresponding to the first feature point in the second image; (S116) establishing a coordinate system for the first image and the second image and determining an initial coordinate point of the first feature point and an initial coordinate point of the second feature point; (S118) adjusting the position of the second feature point in a preset manner; (S122) determining an adjusted coordinate point after the position of the second feature point is adjusted; (S124) calculating the relative distance between the first feature point and the second feature point according to the initial coordinate point of the first feature point and the adjusted coordinate point of the second feature point; (S126) determining the position adjustment amount of the second feature point when the relative distance between the first feature point and the second feature point is minimum; and (S128) adjusting the second image according to the position adjustment amount.
Description
The present invention relates to image processing technologies, and in particular, to an image processing method, an electronic device, and a computer-readable storage medium.
In the related art, the left-eye image and the right-eye image of a 3D image may have certain deviations. For example, when a single camera is used to capture the left-eye image and then capture the right-eye image after being translated, the camera may move in the vertical direction or deflect during the translation, so that the left-eye image and the right-eye image cannot be normally combined into a 3D image.
Disclosure of Invention
Embodiments of the present invention provide an image processing method, an electronic device, and a computer-readable storage medium.
The present invention provides an image processing method for an electronic device, the image processing method comprising:
acquiring a first image and a second image, wherein the first image has a certain similarity with the second image;
determining a first feature point of the first image, and searching for a second feature point corresponding to the first feature point in the second image;
establishing a coordinate system for the first image and the second image, and determining an initial coordinate point of the first feature point and an initial coordinate point of the second feature point;
adjusting the position of the second feature point in a preset manner;
determining an adjusted coordinate point after the position of the second feature point is adjusted;
calculating a relative distance between the first feature point and the second feature point according to the initial coordinate point of the first feature point and the adjusted coordinate point of the second feature point;
determining a position adjustment amount of the second feature point when the relative distance between the first feature point and the second feature point is minimum; and
adjusting the second image according to the position adjustment amount.
The present invention provides an electronic device comprising a processor configured to:
acquire a first image and a second image, wherein the first image has a certain similarity with the second image;
determine a first feature point of the first image, and search for a second feature point corresponding to the first feature point in the second image;
establish a coordinate system for the first image and the second image, and determine an initial coordinate point of the first feature point and an initial coordinate point of the second feature point;
adjust the position of the second feature point in a preset manner;
determine an adjusted coordinate point after the position of the second feature point is adjusted;
calculate a relative distance between the first feature point and the second feature point according to the initial coordinate point of the first feature point and the adjusted coordinate point of the second feature point;
determine a position adjustment amount of the second feature point when the relative distance between the first feature point and the second feature point is minimum; and
adjust the second image according to the position adjustment amount.
The present invention provides an electronic device, comprising:
one or more processors;
a memory; and
one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, the programs, when executed by the processors, implementing the steps of the image processing method.
The present invention provides a computer-readable storage medium storing an image processing program that, when executed by at least one processor, performs the steps of the image processing method.
The image processing method, the electronic device, and the computer-readable storage medium of the embodiments of the present invention determine a position adjustment amount from the coordinate points of the first feature point of the first image and the second feature point of the second image and adjust the second image according to the position adjustment amount, thereby making the first image and the second image suitable for synthesizing a 3D image, and further facilitating the synthesis of the first image and the second image into a 3D image.
Additional aspects and advantages of embodiments of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the invention.
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
FIG. 1 is a schematic flow diagram of an image processing method according to an embodiment of the invention;
FIG. 2 is a block diagram of an electronic device according to an embodiment of the invention;
FIG. 3 is a schematic flow chart diagram of a first example of an image processing method according to an embodiment of the invention;
FIG. 4 is a block diagram of a first example of an electronic device according to an embodiment of the invention;
FIG. 5 is a schematic flow chart diagram of a second example of an image processing method according to an embodiment of the present invention;
FIG. 6 is a block diagram of a second example of an electronic device according to an embodiment of the invention;
FIG. 7 is a flowchart illustrating a third example of an image processing method according to an embodiment of the present invention;
FIG. 8 is a flowchart schematically showing a fourth example of the image processing method according to the embodiment of the present invention;
FIG. 9 is a flowchart illustrating a fifth example of an image processing method according to an embodiment of the present invention;
FIG. 10 is a flowchart schematically showing a sixth example of the image processing method according to the embodiment of the present invention;
FIG. 11 is a flowchart schematically showing a seventh example of the image processing method according to the embodiment of the present invention;
FIG. 12 is a flowchart schematically showing an eighth example of the image processing method according to the embodiment of the present invention;
FIG. 13 is a flowchart schematically showing a ninth example of the image processing method according to the embodiment of the present invention;
FIG. 14 is a scene diagram of a first example of a first image and a second image according to an embodiment of the present invention;
FIG. 15 is a scene diagram of a second example of a first image and a second image according to an embodiment of the present invention;
FIG. 16 is a scene diagram of a third example of a first image and a second image according to an embodiment of the present invention;
FIG. 17 is a block diagram of a third example of an electronic apparatus according to an embodiment of the present invention;
FIG. 18 is a schematic connection diagram of an electronic device and a computer-readable storage medium according to an embodiment of the present invention.
Description of reference numerals of main elements:
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention, and are not to be construed as limiting the present invention.
In the description of the present invention, it is to be understood that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated.
In the description of the present invention, it should be noted that, unless otherwise explicitly stated or limited, the terms "mounted" and "connected" shall be construed broadly; for example, they may indicate a fixed connection, a detachable connection, or a physical connection; a mechanical connection, an electrical connection, or mutual communication; a direct connection, or an indirect connection through an intermediate medium; or a connection inside two elements or an interaction relationship between two elements.
The following disclosure provides many different embodiments or examples for implementing different features of the invention. To simplify the disclosure of the present invention, the components and arrangements of specific examples are described below. Of course, they are merely examples and are not intended to limit the present invention. Furthermore, the present invention may repeat reference numerals and/or letters in the various examples, such repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed. In addition, the present invention provides examples of various specific processes and materials, but one of ordinary skill in the art may recognize applications of other processes and/or uses of other materials.
Referring to fig. 1 and fig. 2 together, an image processing method according to an embodiment of the present invention can be applied to the electronic device 100. The image processing method includes:
step S112: acquiring a first image and a second image, wherein the first image and the second image have a certain similarity;
step S114: determining a first feature point of the first image, and searching for a second feature point corresponding to the first feature point in the second image;
step S116: establishing a coordinate system for the first image and the second image, and determining an initial coordinate point of the first feature point and an initial coordinate point of the second feature point;
step S118: adjusting the position of the second feature point in a preset manner;
step S122: determining an adjusted coordinate point after the position of the second feature point is adjusted;
step S124: calculating the relative distance between the first feature point and the second feature point according to the initial coordinate point of the first feature point and the adjusted coordinate point of the second feature point;
step S126: determining a position adjustment amount of the second feature point when the relative distance between the first feature point and the second feature point is minimum; and
step S128: adjusting the second image according to the position adjustment amount.
Referring to fig. 2 again, the electronic device 100 according to the embodiment of the invention includes a processor 10. The processor 10 is configured to:
acquire a first image and a second image, wherein the first image has a certain similarity with the second image;
determine a first feature point of the first image, and search for a second feature point corresponding to the first feature point in the second image;
establish a coordinate system for the first image and the second image, and determine an initial coordinate point of the first feature point and an initial coordinate point of the second feature point;
adjust the position of the second feature point in a preset manner;
determine an adjusted coordinate point after the position of the second feature point is adjusted;
calculate a relative distance between the first feature point and the second feature point according to the initial coordinate point of the first feature point and the adjusted coordinate point of the second feature point;
determine a position adjustment amount of the second feature point when the relative distance between the first feature point and the second feature point is minimum; and
adjust the second image according to the position adjustment amount.
That is, the image processing method according to the embodiment of the present invention can be realized by the electronic apparatus 100 according to the embodiment of the present invention.
The image processing method and the electronic device 100 of the embodiments of the present invention determine the position adjustment amount according to the coordinate points of the first feature point of the first image and the second feature point of the second image and adjust the second image according to the position adjustment amount, so that the first image and the second image are suitable for synthesizing a 3D image, which further facilitates synthesizing the first image and the second image into a 3D image.
The electronic device 100 includes, but is not limited to, a mobile phone, a computer, a camera, and the like.
In some embodiments, the first image has a certain similarity with the second image. It can be understood that the first image and the second image may be the left-eye image and the right-eye image of a 3D image, where the first image may be one of the left-eye image and the right-eye image, and the second image may be the other one of the left-eye image and the right-eye image. It should be noted that the first image and the second image are paired with each other; for example, when the first image is the left-eye image of the 3D image, the second image is the right-eye image of the 3D image. In some cases, the first image and the second image are different images of the same object (such as the same person, the same object, etc.).
In addition, whether the first image and the second image have a certain similarity can be judged by determining whether the proportion of similar feature points between the first image and the second image exceeds a preset proportion. When the proportion of similar feature points between the first image and the second image exceeds the preset proportion, the first image and the second image have a certain similarity. The preset proportion can be preset in the electronic device 100 or set according to user requirements; for example, the preset proportion can be 80%, 90%, or the like, and is not specifically limited herein.
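As an illustration of this similarity check, the sketch below estimates the proportion of first-image feature points that find a counterpart in the second image and compares it with the preset proportion. It is only a sketch under assumptions: the ORB detector, brute-force matching with cross-check, the helper name `images_are_similar`, and the 80% default are illustrative choices, not details taken from the patent.

```python
import cv2

def images_are_similar(first_img, second_img, preset_proportion=0.8):
    """Judge similarity by the proportion of first-image feature points
    that find a counterpart in the second image (illustrative sketch)."""
    orb = cv2.ORB_create()
    kp1, des1 = orb.detectAndCompute(first_img, None)
    kp2, des2 = orb.detectAndCompute(second_img, None)
    if des1 is None or des2 is None or len(kp1) == 0:
        return False
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des1, des2)
    proportion = len(matches) / len(kp1)
    return proportion >= preset_proportion
```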
It should be noted that the image processing method according to the embodiments of the present invention adjusts the position of the second image so that the first image and the second image are suitable for synthesizing a 3D image. It can be understood that, in other embodiments, the first image may also be adjusted, and the adjustment of the first image may be performed before, after, or simultaneously with the adjustment of the second image; the method for adjusting the first image may refer to the method for adjusting the second image, which is not limited herein.
In some embodiments, the first image and the second image are stored in the electronic device 100, so that the electronic device 100 can directly read the first image and the second image and process the first image and the second image by using the image processing method according to the embodiments of the present invention.
In some embodiments, the electronic device 100 includes a communication module, which may be a WiFi module in communication with the cloud or a Bluetooth module in communication with other electronic devices 100, and the electronic device 100 obtains the first image and the second image through the communication module.
Referring to fig. 3 and fig. 4 together, in one embodiment, the electronic device 100 includes two cameras 20, and step S112 includes:
step S1122: controlling the two cameras 20 to acquire the first image and the second image, respectively.
Accordingly, the processor 10 controls the two cameras 20 to acquire the first image and the second image, respectively.
However, in some cases, the first image and the second image acquired by the two cameras 20 cannot be combined into a qualified 3D image due to an insufficient manufacturing process or large errors in mounting the cameras 20. Therefore, the first image and the second image acquired by the two cameras 20 can be adjusted by using the image processing method so that they can be combined into a qualified 3D image. In one example, the first image and the second image are different images of the same object (such as the same person, the same object, etc.) acquired by the two cameras 20.
Referring to fig. 5 and fig. 6 together, in another embodiment, the electronic device 100 includes a single camera 20, and step S112 specifically includes:
step S1124: controlling the camera 20 to acquire the first image; and
step S1126: acquiring the second image after the position of the camera 20 has changed.
In this manner, the camera 20 acquires the first image and the second image at two different positions, respectively; that is, the camera 20 acquires the first image at a first position and acquires the second image at a second position different from the first position.
The processor 10 controls the camera 20 to acquire the first image and to acquire the second image after the position of the camera 20 has changed.
That is, steps S1124 and S1126 may be implemented by the processor 10.
In this manner, the electronic device 100 may acquire the first image and the second image through the single camera 20.
Specifically, in some embodiments, the electronic device 100 has only one camera 20. Therefore, the electronic device 100 can synthesize a 3D image using the first image and the second image by controlling the camera 20 to acquire the first image at the first position and the second image at the second position. Since the changed position of the camera 20 may not be ideal, for example, because the camera 20 moves or rotates by an excessively large angle, the first image and the second image acquired by the single camera 20 of the electronic device 100 may not be able to be synthesized into a qualified 3D image. In this case, the image processing method, i.e., steps S114, S116, S118, S122, S124, S126, S128, and so on, can be used to adjust the second image so that the first image and the second image can be synthesized into a qualified 3D image. In one example, the first image and the second image are different images of the same object (such as the same person, the same object, etc.) acquired by the same camera 20 at different positions.
In some embodiments, the correspondence between the first feature point and the second feature point may refer to the similarity between the first feature point and the second feature point.
In step S114, a feature matching algorithm with rotation invariance, such as the SIFT (Scale-Invariant Feature Transform) algorithm or the SURF (Speeded-Up Robust Features) algorithm, can be used to search for the second feature point corresponding to the first feature point in the second image. The SIFT algorithm is an algorithm for detecting local features in an image: it searches for extreme points of the image in spatial scale and extracts their position, scale, and rotation invariants. The SURF algorithm is a highly robust local feature point detection algorithm that improves on the SIFT algorithm and can be applied to computer vision object recognition and 3D reconstruction. The first feature point is obtained in the first image, and then the first feature point obtained from the first image is used to search for the second feature point corresponding to the first feature point in the second image, so that the first feature point and the second feature point that match each other are obtained.
In some embodiments, the feature points (the first feature point and the second feature point) may be global feature points, such as color features, texture features, and shapes of main objects, or local feature points, such as blobs and corner points. A blob refers to a region whose color or gray level differs from its surroundings, such as a wild flower in weeds or an insect; a corner point refers to a corner of an object in the image or an intersection between lines. The SIFT algorithm and the SURF algorithm mentioned above may be used to detect blobs in the image (the extreme points of the image mentioned above) as feature points. In one example, when the first image and the second image are both face images, the eye corner of a human eye in the first image may be obtained as the first feature point, and then the second feature point corresponding to the eye corner may be searched for in the second image according to the features (such as pixel values, colors, and the like) of the eye corner in the first image, so as to obtain the first feature point and the second feature point that are paired with each other.
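The following sketch shows one way such a pairing could be computed with OpenCV's SIFT implementation and Lowe's ratio test; the function name, the 0.75 ratio threshold, and the grayscale-input assumption are illustrative and not prescribed by the patent.

```python
import cv2
import numpy as np

def match_feature_points(first_img, second_img, ratio=0.75):
    """Detect first feature points in the first image and search for the
    corresponding second feature points in the second image (SIFT sketch)."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(first_img, None)
    kp2, des2 = sift.detectAndCompute(second_img, None)
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    knn_matches = matcher.knnMatch(des1, des2, k=2)
    first_pts, second_pts = [], []
    for pair in knn_matches:
        if len(pair) < 2:
            continue
        m, n = pair
        if m.distance < ratio * n.distance:          # Lowe's ratio test
            first_pts.append(kp1[m.queryIdx].pt)     # (x, y) in the first image
            second_pts.append(kp2[m.trainIdx].pt)    # (x, y) in the second image
    return np.array(first_pts), np.array(second_pts)
```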
Referring to fig. 7, step S118 specifically includes:
step S1182: rotating the second feature point in a preset direction to change the position of the second feature point.
Accordingly, the processor 10 rotates the second feature point in the preset direction to change the position of the second feature point.
In this way, the position of the second feature point can be changed by rotating the second feature point, so that the position adjustment amount of the second feature point is obtained, and the second image is then adjusted according to the position adjustment amount.
Specifically, when there is a positional deviation between the first image and the second image, the second feature point may be rotated in the preset direction, the position adjustment amount may be determined according to the position of the rotated second feature point, and the second image may be adjusted according to the position adjustment amount of the second feature point. The preset direction may be a clockwise direction or a counterclockwise direction.
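A minimal sketch of rotating the second feature point(s) about a chosen rotation center is given below; the counterclockwise sign convention and the helper name are assumptions made for illustration only.

```python
import numpy as np

def rotate_points(points, angle_deg, center):
    """Rotate an (N, 2) array of (x, y) feature points around `center`
    by `angle_deg` degrees (positive = counterclockwise in this sketch)."""
    theta = np.deg2rad(angle_deg)
    c, s = np.cos(theta), np.sin(theta)
    rotation = np.array([[c, -s],
                         [s,  c]])
    pts = np.asarray(points, dtype=float)
    ctr = np.asarray(center, dtype=float)
    return (pts - ctr) @ rotation.T + ctr
```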
Referring to fig. 8, step S118 specifically includes:
step S1184: moving the second feature point up or down in the vertical direction, so that the position of the second feature point is changed.
Accordingly, the processor 10 moves the second feature point upward or downward in the vertical direction, so that the position of the second feature point is changed.
In this way, the position of the second feature point can be changed by moving the second feature point upward or downward in the vertical direction; thus, the position adjustment amount of the second feature point is obtained, and the second image is adjusted according to the position adjustment amount of the second feature point.
Specifically, when there is a positional deviation between the first image and the second image, the second feature point may be moved upward or downward in the vertical direction to change its position, the position adjustment amount of the second feature point may be determined from the position of the moved second feature point, and the second image may be adjusted based on the position adjustment amount of the second feature point.
It should be noted that, in some embodiments, when only a rotational deviation exists between the first image and the second image, the first image and the second image can be adjusted through step S1182 and the subsequent image processing steps so that a qualified 3D image can be synthesized. In some embodiments, when only a translational deviation exists between the first image and the second image, the first image and the second image can be adjusted through step S1184 and the subsequent image processing steps so that a qualified 3D image can be synthesized. In some embodiments, when both a rotational deviation and a translational deviation exist between the first image and the second image, the first image and the second image need to be adjusted through steps S1182 and S1184 and the subsequent image processing steps so that a 3D image can be synthesized; in this case, the rotational deviation can be eliminated through step S1182 and the translational deviation can be eliminated through step S1184, and step S1182 may be performed before or after step S1184.
Referring to fig. 9, step S124 includes:
step S1242: calculating the variance of the difference between the ordinate of the initial coordinate point of the first feature point and the ordinate of the adjusted coordinate point of the second feature point.
Correspondingly, the processor 10 calculates the variance of the difference between the ordinate of the initial coordinate point of the first feature point and the ordinate of the adjusted coordinate point of the second feature point.
In this manner, the relative distance between the first feature point and the second feature point can be measured by the variance of the difference between the ordinate of the initial coordinate point of the first feature point and the ordinate of the adjusted coordinate point of the second feature point.
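This relative-distance measure can be written in a few lines; the sketch below assumes the paired coordinates are stored as (x, y) rows, with the first feature points' initial coordinates in one array and the second feature points' adjusted coordinates in the other.

```python
import numpy as np

def ordinate_difference_variance(first_pts, adjusted_second_pts):
    """Variance of the differences between the ordinates (y values) of the
    first feature points and of the adjusted second feature points."""
    dy = np.asarray(first_pts)[:, 1] - np.asarray(adjusted_second_pts)[:, 1]
    return float(np.var(dy))
```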
Referring to fig. 10, step S126 includes:
step S1262: when the variance of the ordinate differences is minimum, determining that the relative distance between the first feature point and the second feature point is minimum and that the second feature point has been adjusted to the target position;
step S1264: determining a target coordinate point when the second feature point is adjusted to the target position; and
step S1266: calculating the position adjustment amount of the second feature point according to the initial coordinate point and the target coordinate point of the second feature point, wherein the position adjustment amount is at least one of a rotation angle and a moving distance.
Correspondingly, the processor 10 is configured to:
when the variance of the ordinate differences is minimum, determine that the relative distance between the first feature point and the second feature point is minimum and that the second feature point has been adjusted to the target position;
determine a target coordinate point when the second feature point is adjusted to the target position; and
calculate the position adjustment amount of the second feature point according to the initial coordinate point and the target coordinate point of the second feature point, wherein the position adjustment amount is at least one of the rotation angle and the moving distance.
In this way, the position adjustment amount of the second feature point can be obtained.
Specifically, when the variance of the ordinate differences is minimum, it may be determined that the relative distance between the first feature point and the second feature point is minimum, which indicates that the second feature point has been adjusted to the target position. At this time, the target coordinate point of the second feature point at the target position may be obtained, and then the position adjustment amount of the second feature point may be calculated according to the initial coordinate point and the target coordinate point of the second feature point.
It should be noted that, when the position of the second feature point is adjusted in the preset manner in step S118, if the position of the second feature point is rotated in the preset direction, the obtained position adjustment amount is a rotation angle; if the position of the second feature point is moved upward or downward in the vertical direction, the obtained position adjustment amount is a moving distance.
In some embodiments, the second feature point may be rotated a plurality of times at a preset rotation step, for example, by 1 degree each time, and the variance of the difference between the ordinates of the second feature point and the first feature point may be calculated after each rotation; when the variance of the difference between the ordinates of the first feature point and the second feature point is minimum, it is determined that the second feature point has been adjusted to the target position.
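A sketch of this search is shown below; it reuses the `rotate_points` and `ordinate_difference_variance` helpers sketched earlier, and the 1-degree step and ±30-degree search range are illustrative assumptions, since the text only gives 1 degree as an example step.

```python
def best_rotation_angle(first_pts, second_pts, center,
                        step_deg=1.0, max_deg=30.0):
    """Rotate the second feature points step by step and keep the angle
    that minimizes the ordinate-difference variance (illustrative sketch)."""
    best_angle = 0.0
    best_var = ordinate_difference_variance(first_pts, second_pts)
    angle = -max_deg
    while angle <= max_deg:
        rotated = rotate_points(second_pts, angle, center)
        var = ordinate_difference_variance(first_pts, rotated)
        if var < best_var:
            best_angle, best_var = angle, var
        angle += step_deg
    return best_angle   # position adjustment amount expressed as a rotation angle
```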
In some embodiments, a translation direction (upward or downward) may be set, the second feature point may be moved in the translation direction, and it may be calculated whether the variance of the difference between the ordinate of the translated second feature point and the ordinate of the first feature point increases or decreases. If the variance increases, the second feature point may be moved in the direction opposite to the translation direction; if the variance decreases, the second feature point may continue to be moved in the translation direction, and it may be determined that the second feature point is adjusted to the target position when the variance of the difference between the ordinate of the first feature point and the ordinate of the second feature point is minimum.
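The translation search described above might look like the following sketch; it reuses `ordinate_difference_variance` from earlier, starts with an upward direction by assumption, and the 1-pixel step and iteration cap are illustrative values.

```python
import numpy as np

def best_vertical_shift(first_pts, second_pts, step=1.0, max_steps=1000):
    """Move the second feature points along a chosen vertical direction,
    reversing the direction if the variance grows, and stop at the minimum."""
    direction = -1.0                       # assume "up" means decreasing y (pixel rows)
    shift = 0.0
    pts = np.asarray(second_pts, dtype=float)
    best_var = ordinate_difference_variance(first_pts, pts)
    for _ in range(max_steps):
        trial = pts + np.array([0.0, direction * step])
        var = ordinate_difference_variance(first_pts, trial)
        if var < best_var:                 # variance decreases: keep moving this way
            pts, best_var = trial, var
            shift += direction * step
        elif shift == 0.0 and direction < 0:
            direction = 1.0                # variance grew immediately: try the opposite direction
        else:
            break                          # variance no longer decreases: minimum reached
    return shift   # position adjustment amount expressed as a moving distance
```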
Referring to fig. 11, step S128 specifically includes:
step S1282: controlling the second image to rotate around the center of the second image by the rotation angle in the preset direction, so as to adjust the second image to the target position.
Accordingly, the processor 10 is configured to:
control the second image to rotate around the center of the second image by the rotation angle in the preset direction, so as to adjust the second image to the target position.
In this manner, the second image may be rotated according to the rotation angle to adjust the second image to the target position.
Specifically, when the position of the second feature point is rotated in the preset direction in step S1182, if the center of the second image is used as the rotation center during the rotation, step S1282 may rotate the second image around the center of the second image. It is understood that, in other embodiments, the origin of the coordinate system where the second feature point is located may be used as the rotation center during the rotation, or any corner of the second image may be used as the rotation center. At the same time, the rotation center used when rotating the second image should be consistent with the rotation center used when rotating the second feature point, and the rotation direction of the second image should be consistent with the rotation direction of the second feature point; for example, when the rotation direction of the second feature point is clockwise, the rotation direction of the second image is also clockwise.
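A sketch of rotating the second image around its own center by the computed rotation angle, using OpenCV's affine warp, is given below; note that `cv2.getRotationMatrix2D` treats positive angles as counterclockwise, so the sign convention must match the one used when rotating the second feature point (an assumption of this sketch, not a detail from the text).

```python
import cv2

def rotate_second_image(second_img, rotation_angle_deg):
    """Rotate the second image around its center by the rotation angle."""
    h, w = second_img.shape[:2]
    center = (w / 2.0, h / 2.0)
    matrix = cv2.getRotationMatrix2D(center, rotation_angle_deg, 1.0)  # scale = 1.0
    return cv2.warpAffine(second_img, matrix, (w, h))
```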
Referring to fig. 12, step S128 specifically includes:
step S1284: controlling the second image to move upward or downward in the vertical direction by the moving distance, so as to adjust the second image to the target position.
Accordingly, the processor 10 is configured to:
control the second image to move upward or downward in the vertical direction by the moving distance, so as to adjust the second image to the target position.
In this manner, the second image may be translated according to the movement distance to adjust the second image to the target position.
Specifically, when the position of the second feature point is moved up or down in the vertical direction in step S1184, if the second feature point is moved upward, step S1284 may move the second image upward. It is to be understood that, in other embodiments, the second feature point may be moved downward, and in that case the second image is also moved downward.
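Similarly, translating the second image up or down by the moving distance can be sketched with an affine warp; the sign convention (positive values move the image content downward in pixel coordinates) is an assumption of this sketch.

```python
import cv2
import numpy as np

def shift_second_image_vertically(second_img, moving_distance):
    """Move the second image up or down in the vertical direction
    by `moving_distance` pixels."""
    h, w = second_img.shape[:2]
    matrix = np.float32([[1, 0, 0],
                         [0, 1, moving_distance]])   # translate along y only
    return cv2.warpAffine(second_img, matrix, (w, h))
```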
Referring to fig. 13, step S116 includes:
step S1162: establishing the first image and the second image in the same coordinate system.
Accordingly, the processor 10 is configured to:
the th and second images were set up in the same coordinate system.
In this manner, the coordinate points of the feature points of the first image and the second image can be expressed simultaneously using the same coordinate system.
Specifically, when the first image and the second image are established in the same coordinate system, the coordinate system may be established with the center point of the first image as the origin, the direction in which the center point of the first image points to the center point of the second image as the X-axis direction, and the direction perpendicular to the X-axis direction as the Y-axis direction. In this way, the first image and the second image can be represented in the same coordinate system, and therefore the coordinate points of the feature points of the first image and the second image can be rapidly expressed by the coordinate system.
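One possible way to express feature-point pixel coordinates in such a shared coordinate system is sketched below; it assumes the two images have the same size and are laid side by side with no gap, which the text does not specify, so the offsets are purely illustrative.

```python
import numpy as np

def to_shared_coordinates(points, image_width, image_height, is_second_image):
    """Map (x, y) pixel coordinates of either image into a coordinate system
    whose origin is the center of the first image, whose X axis points toward
    the center of the second image, and whose Y axis is perpendicular to it."""
    pts = np.asarray(points, dtype=float)
    offset_x = image_width if is_second_image else 0.0   # second image assumed to the right
    x = pts[:, 0] + offset_x - image_width / 2.0
    y = image_height / 2.0 - pts[:, 1]                   # flip so Y increases upward
    return np.stack([x, y], axis=1)
```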
When the first image and the second image can be normally combined into a 3D image, the first image and the second image only exhibit a left-right shift. For example, in the same coordinate system, the abscissa of the eye of the fish in the first image is different from the abscissa of the eye of the fish in the second image, while their ordinates are the same.
In some embodiments, the images captured by the user may have a deviation. For example, as shown in fig. 15, the fish in the second image on the right has a rotational deviation compared with the fish in the first image on the left. In this case, the image processing method according to the embodiments of the present invention can be used to calculate the rotation angle and rotate the second image, and the rotated second image may directly become the second image shown in fig. 14 (when only the rotational deviation exists) or the second image on the right side of fig. 16 (when both the rotational deviation and the translational deviation exist).
It can be understood that the second image on the right side of fig. 16 is only vertically shifted compared with the second image shown in fig. 14. Therefore, by calculating the moving distance and translating the second image shown in fig. 16 using the image processing method according to the embodiments of the present invention, the first image and the second image shown in fig. 14, which can normally be combined into a 3D image, can be obtained.
Referring to fig. 17, an electronic device 100 according to an embodiment of the present invention includes one or more processors 10, a memory 30, and one or more programs. The one or more programs are stored in the memory 30 and configured to be executed by the one or more processors 10, and the programs, when executed by the processors 10, implement the steps of the image processing method described in any of the above embodiments.
Referring to fig. 18, the computer-readable storage medium 500 according to an embodiment of the present invention stores an image processing program, and the image processing program, when executed by at least one processor 10, performs the steps of the image processing method according to any of the embodiments described above.
Note that the computer-readable storage medium 500 may be a storage medium built in the electronic apparatus 100, or may be a storage medium that can be plugged into the electronic apparatus 100 in a removable manner.
Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present invention have been shown and described above, it is understood that the above embodiments are exemplary and not to be construed as limiting the present invention, and those skilled in the art can make changes, modifications, substitutions and alterations to the above embodiments within the scope of the present invention.
Claims (18)
- An image processing method for an electronic device, the image processing method comprising: acquiring a first image and a second image, wherein the first image has a certain similarity with the second image; determining a first feature point of the first image, and searching for a second feature point corresponding to the first feature point in the second image; establishing a coordinate system for the first image and the second image, and determining an initial coordinate point of the first feature point and an initial coordinate point of the second feature point; adjusting the position of the second feature point in a preset manner; determining an adjusted coordinate point after the position of the second feature point is adjusted; calculating a relative distance between the first feature point and the second feature point according to the initial coordinate point of the first feature point and the adjusted coordinate point of the second feature point; determining a position adjustment amount of the second feature point when the relative distance between the first feature point and the second feature point is minimum; and adjusting the second image according to the position adjustment amount.
- The image processing method according to claim 1, wherein the adjusting the position of the second feature point in a preset manner specifically comprises: rotating the second feature point in a preset direction to change the position of the second feature point.
- The image processing method according to claim 1, wherein the adjusting the position of the second feature point in a preset manner specifically comprises: moving the second feature point up or down in a vertical direction to change the position of the second feature point.
- The image processing method according to claim 2 or 3, wherein the calculating of the relative distance between the first feature point and the second feature point from the initial coordinate point of the first feature point and the adjusted coordinate point of the second feature point comprises: calculating a variance of the difference between the ordinate of the initial coordinate point of the first feature point and the ordinate of the adjusted coordinate point of the second feature point.
- The image processing method according to claim 4, wherein the determining the position adjustment amount of the second feature point when the relative distance between the first feature point and the second feature point is minimum comprises: when the variance of the ordinate differences is minimum, determining that the relative distance between the first feature point and the second feature point is minimum and that the second feature point has been adjusted to a target position; determining a target coordinate point when the second feature point is adjusted to the target position; and calculating the position adjustment amount of the second feature point according to the initial coordinate point and the target coordinate point of the second feature point, wherein the position adjustment amount is at least one of a rotation angle and a moving distance.
- The image processing method according to claim 5, wherein the adjusting the second image according to the position adjustment amount specifically comprises: controlling the second image to rotate around the center of the second image by the rotation angle in a preset direction, so as to adjust the second image to the target position.
- The image processing method according to claim 5, wherein the adjusting the second image according to the position adjustment amount specifically comprises: controlling the second image to move upward or downward in the vertical direction by the moving distance, so as to adjust the second image to the target position.
- The image processing method of claim 1, wherein said establishing a coordinate system for said first image and said second image comprises: establishing the first image and the second image in the same coordinate system.
- An electronic device, comprising a processor configured to: acquire a first image and a second image, wherein the first image has a certain similarity with the second image; determine a first feature point of the first image, and search for a second feature point corresponding to the first feature point in the second image; establish a coordinate system for the first image and the second image, and determine an initial coordinate point of the first feature point and an initial coordinate point of the second feature point; adjust the position of the second feature point in a preset manner; determine an adjusted coordinate point after the position of the second feature point is adjusted; calculate a relative distance between the first feature point and the second feature point according to the initial coordinate point of the first feature point and the adjusted coordinate point of the second feature point; determine a position adjustment amount of the second feature point when the relative distance between the first feature point and the second feature point is minimum; and adjust the second image according to the position adjustment amount.
- The electronic device of claim 9, wherein the processor is configured to rotate the second feature point in a predetermined direction to change a position of the second feature point.
- The electronic device of claim 9, wherein the processor is configured to move the second feature point up or down in a vertical direction to change a position of the second feature point.
- The electronic device of claim 10 or 11, wherein the processor is configured to: calculate a variance of the difference between the ordinate of the initial coordinate point of the first feature point and the ordinate of the adjusted coordinate point of the second feature point.
- The electronic device of claim 12, wherein the processor is configured to: when the variance of the ordinate differences is minimum, determine that the relative distance between the first feature point and the second feature point is minimum and that the second feature point has been adjusted to a target position; determine a target coordinate point when the second feature point is adjusted to the target position; and calculate the position adjustment amount of the second feature point according to the initial coordinate point and the target coordinate point of the second feature point, wherein the position adjustment amount is at least one of a rotation angle and a moving distance.
- The electronic device of claim 13, wherein the processor is configured to: control the second image to rotate around the center of the second image by the rotation angle in a preset direction, so as to adjust the second image to the target position.
- The electronic device of claim 13, wherein the processor is configured to: control the second image to move upward or downward in the vertical direction by the moving distance, so as to adjust the second image to the target position.
- The electronic device of claim 9, wherein the processor is configured to: establish the first image and the second image in the same coordinate system.
- An electronic device, comprising: one or more processors; a memory; and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the one or more processors, and when the programs are executed by the processors, the steps of the image processing method of any one of claims 1 to 8 are implemented.
- A computer-readable storage medium, characterized in that the computer-readable storage medium stores an image processing program which, when executed by at least one processor, performs the steps of the image processing method of any one of claims 1 to 8.
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2017/101309 WO2019047245A1 (en) | 2017-09-11 | 2017-09-11 | Image processing method, electronic device and computer readable storage medium |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN110741633A true CN110741633A (en) | 2020-01-31 |
Family
ID=65633588
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201780091703.1A Pending CN110741633A (en) | 2017-09-11 | 2017-09-11 | Image processing method, electronic device, and computer-readable storage medium |
Country Status (2)
| Country | Link |
|---|---|
| CN (1) | CN110741633A (en) |
| WO (1) | WO2019047245A1 (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114115678A (en) * | 2021-11-30 | 2022-03-01 | 深圳市锐尔觅移动通信有限公司 | Content display control method and related device |
Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102170576A (en) * | 2011-01-30 | 2011-08-31 | 中兴通讯股份有限公司 | Processing method and device for double camera stereoscopic shooting |
| CN102289803A (en) * | 2010-06-02 | 2011-12-21 | 索尼公司 | Image Processing Apparatus, Image Processing Method, and Program |
| CN102868896A (en) * | 2011-07-05 | 2013-01-09 | 瑞昱半导体股份有限公司 | Stereoscopic image processing device and stereoscopic image processing method |
| CN102905147A (en) * | 2012-09-03 | 2013-01-30 | 上海立体数码科技发展有限公司 | Three-dimensional image correction method and apparatus |
| CN105635719A (en) * | 2014-11-20 | 2016-06-01 | 三星电子株式会社 | Method and apparatus for calibrating multi-view images |
| CN105812766A (en) * | 2016-03-14 | 2016-07-27 | 吉林大学 | Vertical parallax subtraction method |
| CN106534833A (en) * | 2016-12-07 | 2017-03-22 | 上海大学 | Space and time axis joint double-viewpoint three dimensional video stabilizing method |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2010004465A (en) * | 2008-06-23 | 2010-01-07 | Fujinon Corp | Stereoscopic image photographing system |
| JP5024410B2 (en) * | 2010-03-29 | 2012-09-12 | カシオ計算機株式会社 | 3D modeling apparatus, 3D modeling method, and program |
| CN102567995A (en) * | 2012-01-04 | 2012-07-11 | 朱经纬 | Image registration method |
| WO2015165037A1 (en) * | 2014-04-29 | 2015-11-05 | 中国科学院自动化研究所 | Cascaded binary coding based image matching method |
| CN106327482B (en) * | 2016-08-10 | 2019-01-22 | 东方网力科技股份有限公司 | A kind of method for reconstructing and device of the facial expression based on big data |
- 2017
- 2017-09-11 WO PCT/CN2017/101309 patent/WO2019047245A1/en not_active Ceased
- 2017-09-11 CN CN201780091703.1A patent/CN110741633A/en active Pending
Patent Citations (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102289803A (en) * | 2010-06-02 | 2011-12-21 | 索尼公司 | Image Processing Apparatus, Image Processing Method, and Program |
| CN102170576A (en) * | 2011-01-30 | 2011-08-31 | 中兴通讯股份有限公司 | Processing method and device for double camera stereoscopic shooting |
| CN102868896A (en) * | 2011-07-05 | 2013-01-09 | 瑞昱半导体股份有限公司 | Stereoscopic image processing device and stereoscopic image processing method |
| CN102905147A (en) * | 2012-09-03 | 2013-01-30 | 上海立体数码科技发展有限公司 | Three-dimensional image correction method and apparatus |
| CN105635719A (en) * | 2014-11-20 | 2016-06-01 | 三星电子株式会社 | Method and apparatus for calibrating multi-view images |
| CN105812766A (en) * | 2016-03-14 | 2016-07-27 | 吉林大学 | Vertical parallax subtraction method |
| CN106534833A (en) * | 2016-12-07 | 2017-03-22 | 上海大学 | Space and time axis joint double-viewpoint three dimensional video stabilizing method |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114115678A (en) * | 2021-11-30 | 2022-03-01 | 深圳市锐尔觅移动通信有限公司 | Content display control method and related device |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2019047245A1 (en) | 2019-03-14 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10515271B2 (en) | Flight device and flight control method | |
| CN114637023B (en) | System and method for laser depth map sampling | |
| JP5580164B2 (en) | Optical information processing apparatus, optical information processing method, optical information processing system, and optical information processing program | |
| CN108574825B (en) | A method and device for adjusting a PTZ camera | |
| CN107680125B (en) | System and method for automatically selecting three-dimensional alignment algorithm in vision system | |
| US10078899B2 (en) | Camera system and image registration method thereof | |
| US20180249144A1 (en) | System and Method for Virtually-Augmented Visual Simultaneous Localization and Mapping | |
| JP2017091079A (en) | Image processing device and method for extracting image of object to be detected from input data | |
| CN110826512B (en) | Ground obstacle detection method, device and computer-readable storage medium | |
| JP2014529727A (en) | Automatic scene calibration | |
| WO2012039139A1 (en) | Pupil detection device and pupil detection method | |
| CN107111598A (en) | Optical flow imaging system and method using ultrasonic depth sensing | |
| CN112307912A (en) | Method and system for determining personnel track based on camera | |
| EP3093822B1 (en) | Displaying a target object imaged in a moving picture | |
| JP2017208606A5 (en) | ||
| CN113129383A (en) | Hand-eye calibration method and device, communication equipment and storage medium | |
| US10346709B2 (en) | Object detecting method and object detecting apparatus | |
| JP2014134856A (en) | Subject identification device, subject identification method, and subject identification program | |
| JP5369873B2 (en) | Judgment program and calibration device | |
| WO2022141123A1 (en) | Movable platform and control method and apparatus therefor, terminal device and storage medium | |
| KR20120108256A (en) | Robot fish localization system using artificial markers and method of the same | |
| JP2020004219A (en) | Apparatus, method, and program for generating three-dimensional shape data | |
| WO2023199583A1 (en) | Viewer control method and information processing device | |
| US10937180B2 (en) | Method and apparatus for depth-map estimation | |
| JP6492603B2 (en) | Image processing apparatus, system, image processing method, and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| CB02 | Change of applicant information | ||
| CB02 | Change of applicant information |
Address after: Building 43, Dayun software Town, No. 8288, Longgang Avenue, Henggang street, Longgang District, Shenzhen City, Guangdong Province Applicant after: Shenzhen Ruoyu Technology Co.,Ltd. Address before: Building 43, Dayun software Town, No. 8288, Longgang Avenue, Henggang street, Longgang District, Shenzhen City, Guangdong Province Applicant before: SHENZHEN ROYOLE TECHNOLOGIES Co.,Ltd. |
|
| RJ01 | Rejection of invention patent application after publication | ||
| RJ01 | Rejection of invention patent application after publication |
Application publication date: 20200131 |