Disclosure of Invention
Embodiments of the invention aim to provide an image template matching method and apparatus, an electronic device, and a storage medium, so as to improve the template matching speed.
In a first aspect, an embodiment of the present invention provides a template matching method for an image, including:
acquiring a template image and an image to be matched;
acquiring feature information of a target template feature point of the template image, wherein the target template feature point is a template feature point of the template image acquired according to a preset rule;
calculating in parallel, according to the feature information, the similarity between the target template feature point and a plurality of corresponding target image points in the image to be matched, wherein the plurality of target image points are image points in the same row or the same column of the image to be matched, and two adjacent target image points are separated by a preset number of columns or rows;
accumulating the similarity between each target image point and the target template feature point into the comprehensive similarity corresponding to the region to be matched in which the target image point is located, wherein the region to be matched in which each target image point is located is the region corresponding to the template image in the image to be matched when the target image point coincides with the target template feature point; and
determining a target matching region from the regions to be matched according to the comprehensive similarity corresponding to each region to be matched.
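The steps of the first aspect can be sketched as follows. This is a minimal, non-limiting illustration in Python/NumPy, assuming grayscale images and a negated absolute difference as the similarity measure (the embodiment does not fix a concrete measure); the function and parameter names are hypothetical.

```python
import numpy as np

def template_match(image, feature_points, threshold, stride=1):
    # feature_points: list of (row, col, value) template feature points,
    # with (row, col) relative to the template's top-left reference point.
    h, w = image.shape
    tpl_h = max(r for r, _, _ in feature_points) + 1
    tpl_w = max(c for _, c, _ in feature_points) + 1
    # one comprehensive-similarity cell per region to be matched (top-left corner)
    n_rows = (h - tpl_h) // stride + 1
    n_cols = (w - tpl_w) // stride + 1
    scores = np.zeros((n_rows, n_cols))
    for r, c, v in feature_points:      # feature information acquired once...
        # ...and matched in parallel against the target image points of ALL
        # regions to be matched (a vectorized strided load stands in for SIMD)
        pts = image[r:r + n_rows * stride:stride,
                    c:c + n_cols * stride:stride].astype(float)
        scores += -np.abs(pts - v)      # accumulate into comprehensive similarity
    ys, xs = np.nonzero(scores >= threshold)
    return [(y * stride, x * stride) for y, x in zip(ys, xs)]
```

A `stride` greater than 1 corresponds to the preset number of columns or rows separating adjacent target image points.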
Optionally, before the determining a target matching region from the regions to be matched according to the comprehensive similarity corresponding to each region to be matched, the method further includes:
acquiring a next target template feature point of the template image according to the preset rule, and returning to the step of acquiring the feature information of the target template feature point of the template image, until all the template feature points of the template image have been acquired.
Optionally, after the template feature points of the template image are all acquired, the method further includes:
moving the template image by a preset number of rows in a first moving direction or by a preset number of columns in a second moving direction, and returning to the step of calculating in parallel, according to the feature information, the similarity between the target template feature point and the plurality of corresponding target image points in the image to be matched, until the template image moves to coincide with a first boundary or a second boundary of the image to be matched;
wherein the first boundary is the boundary, among the boundaries of the image to be matched, corresponding to the first moving direction, and the second boundary is the boundary corresponding to the second moving direction.
Optionally, before the calculating in parallel, according to the feature information, the similarity between the target template feature point and the plurality of corresponding target image points in the image to be matched, the method further includes:
loading, based on single instruction, multiple data (SIMD), image information of the plurality of target image points corresponding to the target template feature point in the image to be matched from a cache;
and the calculating in parallel, according to the feature information, the similarity between the target template feature point and the plurality of corresponding target image points in the image to be matched includes:
calculating in parallel the similarity between each target image point and the target template feature point based on the image information and the feature information.
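A minimal sketch of the load-and-score step described above, using NumPy vectorization as a stand-in for an actual SIMD load from the cache (the names and the absolute-difference similarity are illustrative assumptions, not the embodiment's fixed choices):

```python
import numpy as np

def load_and_score(flat_image, start, stride, lanes, feature_value):
    # Gather `lanes` target image points spaced `stride` apart from the flat
    # image buffer (the analogue of one SIMD vector load), then compute the
    # similarity of every lane against the single feature value in one
    # vector operation instead of a scalar loop.
    pts = flat_image[start:start + lanes * stride:stride].astype(float)
    return -np.abs(pts - feature_value)  # one similarity per target image point
```

Each returned lane is then accumulated into the comprehensive similarity of a different region to be matched.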
Optionally, the storage order of the feature information of the template feature points of the template image in the memory is consistent with the row-column distribution order of the template feature points in the template image;
and the acquiring the feature information of the target template feature point of the template image includes:
acquiring the feature information of the target template feature point according to the storage order of the feature information of the template feature points of the template image in the memory.
Optionally, the feature information of each template feature point includes a template offset value and image feature information of the template feature point;
and the calculating in parallel, according to the feature information, the similarity between the target template feature point and the plurality of corresponding target image points in the image to be matched includes:
determining the plurality of target image points corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, position information of the image point corresponding to the reference point of the template image in the image to be matched, the column width of the image to be matched, and the preset number of columns or rows; and
calculating in parallel the similarity between each target image point and the target template feature point according to the image feature information of the target template feature point and the image information of the plurality of target image points.
Optionally, the determining the plurality of target image points corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, the column width of the image to be matched, and the preset number of columns or rows includes:
determining the target image point currently corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, and the column width or row width of the image to be matched, by using the following formula:
p = i * w + j + offset
wherein p is the target image point currently corresponding to the target template feature point in the image to be matched, i is the row index of the image point corresponding to the reference point of the template image in the image to be matched, w is the column width or row width of the image to be matched, j is the column index of the image point corresponding to the reference point of the template image in the image to be matched, and offset is the template offset value of the target template feature point; and
moving the template image by a preset number of columns in a third moving direction or by a preset number of rows in a fourth moving direction, and returning to the step of determining, according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, and the column width or row width of the image to be matched, the target image point currently corresponding to the target template feature point in the image to be matched by using the above formula, until the template image moves to coincide with a third boundary or a fourth boundary of the image to be matched, wherein the third boundary is the boundary corresponding to the third moving direction among the boundaries of the image to be matched, and the fourth boundary is the boundary corresponding to the fourth moving direction.
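The index formula above can be checked with a short snippet. Here the template offset value is assumed to encode the feature point's (row, column) displacement from the template's reference point as dr * w + dc, which is one encoding consistent with p = i * w + j + offset (the function names are illustrative):

```python
def template_offset(dr, dc, w):
    # offset value stored for a feature point displaced (dr, dc) from the
    # template reference point, for an image of column width w
    return dr * w + dc

def target_point_index(i, j, w, offset):
    # p = i * w + j + offset: flat index of the target image point currently
    # corresponding to the feature point, where (i, j) is the image point
    # matching the template reference point
    return i * w + j + offset
```

For an image 640 columns wide, a reference point mapped to image point (2, 10) and a feature point displaced (3, 4) give p = (2 + 3) * 640 + (10 + 4) = 3214.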
In a second aspect, an embodiment of the present invention provides a template matching apparatus for an image, including:
an image acquisition module, configured to acquire a template image and an image to be matched;
a feature information acquisition module, configured to acquire feature information of a target template feature point of the template image;
a similarity calculation module, configured to calculate in parallel, according to the feature information, the similarity between the target template feature point and a plurality of corresponding target image points in the image to be matched, wherein the plurality of target image points are image points in the same row or the same column of the image to be matched, and two adjacent target image points are separated by a preset number of columns or rows;
a similarity accumulation module, configured to accumulate the similarity between each target image point and the target template feature point into the comprehensive similarity corresponding to the region to be matched in which the target image point is located, wherein the region to be matched in which each target image point is located is the region corresponding to the template image in the image to be matched when the target image point coincides with the target template feature point; and
a target region determination module, configured to determine a target matching region from the regions to be matched according to the comprehensive similarity corresponding to each region to be matched.
Optionally, the apparatus further includes:
a feature point acquisition module, configured to acquire a next target template feature point of the template image according to the preset rule before the target matching region is determined from the regions to be matched according to the comprehensive similarity corresponding to each region to be matched, and return to the step of acquiring the feature information of the target template feature point of the template image, until all the template feature points of the template image have been acquired.
Optionally, the apparatus further includes:
a moving module, configured to: after all the template feature points of the template image have been acquired, move the template image by a preset number of rows in a first moving direction or by a preset number of columns in a second moving direction, and return to the step of calculating in parallel, according to the feature information, the similarity between the target template feature point and the plurality of corresponding target image points in the image to be matched, until the template image moves to coincide with a first boundary or a second boundary of the image to be matched, wherein the first boundary is the boundary corresponding to the first moving direction among the boundaries of the image to be matched, and the second boundary is the boundary corresponding to the second moving direction.
Optionally, the apparatus further includes:
an information loading module, configured to load, based on single instruction, multiple data (SIMD), image information of the plurality of target image points corresponding to the target template feature point in the image to be matched from a cache;
wherein the similarity calculation module is specifically configured to calculate in parallel the similarity between each target image point and the target template feature point based on the image information and the feature information.
Optionally, the storage order of the feature information of the template feature points of the template image in the memory is consistent with the row-column distribution order of the template feature points in the template image;
and the feature information acquisition module is specifically configured to acquire the feature information of the target template feature point according to the storage order of the feature information of the template feature points of the template image in the memory.
Optionally, the feature information of each template feature point includes a template offset value and image feature information of the template feature point;
and the similarity calculation module is specifically configured to determine the plurality of target image points corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, the column width of the image to be matched, and the preset number of columns or rows, and to calculate in parallel the similarity between each target image point and the target template feature point according to the image feature information of the target template feature point and the image information of the plurality of target image points.
Optionally, the similarity calculation module includes:
a similarity calculation submodule, configured to determine the target image point currently corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, and the column width or row width of the image to be matched, by using the following formula:
p = i * w + j + offset
wherein p is the target image point currently corresponding to the target template feature point in the image to be matched, i is the row index of the image point corresponding to the reference point of the template image in the image to be matched, w is the column width or row width of the image to be matched, j is the column index of the image point corresponding to the reference point of the template image in the image to be matched, and offset is the template offset value of the target template feature point; and
a moving submodule, configured to move the template image by a preset number of columns in a third moving direction or by a preset number of rows in a fourth moving direction, and return to the step of determining, according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, and the column width or row width of the image to be matched, the target image point currently corresponding to the target template feature point in the image to be matched by using the above formula, until the template image moves to coincide with a third boundary or a fourth boundary of the image to be matched, wherein the third boundary is the boundary corresponding to the third moving direction among the boundaries of the image to be matched, and the fourth boundary is the boundary corresponding to the fourth moving direction.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a communication interface, a memory, and a communication bus, wherein the processor, the communication interface, and the memory communicate with each other through the communication bus;
the memory is configured to store a computer program; and
the processor is configured to implement the method steps of any one of the above first aspect when executing the program stored in the memory.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium having a computer program stored therein, where the computer program, when executed by a processor, implements the method steps of any one of the above first aspect.
The embodiments of the invention have the following beneficial effects:
With the method provided by the embodiment of the invention, each time the feature information of a target template feature point of the template image is acquired, the similarity between the target template feature point and a plurality of corresponding target image points in the image to be matched can be calculated in parallel according to the feature information; that is, the acquired feature information of one template feature point can be computed simultaneously against the information of the corresponding target image points in a plurality of different image regions of the image to be matched. Therefore, the data of the image points in the same row or the same column of the image to be matched, spaced by the preset number of columns or rows, can be loaded from the memory at once; then, when the similarity between the target template feature point and the plurality of corresponding target image points is calculated, the template feature information does not need to be acquired region by region, and the similarity between the target template feature point and the loaded target image points spaced by the preset number of columns or rows, which correspond to a plurality of different regions of the image to be matched, can be calculated in parallel, thereby effectively improving the template matching speed.
Of course, any product or method implementing the invention does not necessarily need to achieve all of the advantages described above at the same time.
Detailed Description
The following clearly and completely describes the embodiments of the present application with reference to the accompanying drawings. Apparently, the described embodiments are merely some rather than all of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application shall fall within the protection scope of the present application.
Fig. 1 is a schematic diagram of the distribution of template feature information on a template image. As shown in fig. 1, the size of the template image 101 is 7x7, the template feature points of the template image are numbered 0-21, and the information corresponding to the template feature points 0-21 is the template feature information. As can be seen from fig. 1, the distribution of the template feature information on the template image 101 is discrete and irregular. Fig. 2 is a schematic diagram of matching the template feature information with an image to be matched. As shown in fig. 2, the template image 101 is matched with the image to be matched 201 to find out whether there is a region in the image to be matched 201 that matches the template image 101. Because the distribution of the template feature information on the template image is discrete and irregular, the traditional template matching approach can only acquire, one by one from the memory, the information of the image points corresponding to the template feature points in the current image region of the image to be matched and match it with the template feature point information, until the template feature information of all the template feature points has been matched with the information of the corresponding image points in that image region; it then acquires, one by one from the memory, the information of the image points corresponding to the template feature points in the next image region and matches it with the template feature point information, until the image information in every image region of the image to be matched has been matched with the template feature information.
For example, when the template image 101 shown in fig. 2 is matched with the covered image region 202 of the image to be matched, the feature information of template feature point 0 is first acquired, and the similarity between the feature information of feature point 0 and the image information of the image point corresponding to feature point 0 in the image region 202 is calculated; then the feature information of feature point 1 is acquired, and the similarity between the feature information of feature point 1 and the image information of the image point corresponding to feature point 1 in the image region 202 is calculated. In this way, each piece of template feature information is acquired one by one and matched with the image information in the image region 202 until all the template feature information has been matched. The template image is then moved horizontally (or vertically) by one or more columns (or rows) to obtain the next image region of the image to be matched 201 covered by the template image 101, and the template feature information is again matched one by one with the image information in that image region. This continues until the template feature information of the template image 101 has been matched with the image information in every image region of the image to be matched 201, yielding the matching degree of each image region of the image to be matched 201 with the template image 101.
However, with this approach of acquiring, one by one, the image point information corresponding to each template feature point in an image region of the image to be matched and matching it with the template feature information, each image region of the image to be matched is matched with the template image in turn, and the template feature information needs to be loaded one by one for every image region, so the template matching speed is low. In addition, in the process of acquiring the image information of the image points of each image region of the image to be matched from the memory, because of the mapping relationship between the memory and the cache, a large number of cache misses occur when acquiring the image point information of adjacent regions to be matched, so that the image point information of a region to be matched has to be loaded from the memory into the cache repeatedly, which further slows down template matching. The cache is a temporary storage between the CPU and the memory, located in the CPU; its capacity is smaller than that of the memory, but its access speed is faster. A cache miss refers to the phenomenon in which the CPU cannot find the required data in the cache, so that the image information of the image points has to be loaded again.
In order to improve the template matching speed, embodiments of the invention provide an image template matching method and apparatus, an electronic device, a computer-readable storage medium, and a computer program product.
The image template matching method provided by the embodiment of the invention is described first below. The method can be applied to any electronic device with an image processing function, which is not specifically limited herein.
Fig. 3 is a flowchart of a template matching method for an image according to an embodiment of the present invention. As shown in fig. 3, the method includes:
S301, acquiring a template image and an image to be matched.
In the embodiment of the invention, the template image is a target image generated in advance for a specific application scenario. For example, in a license plate matching scenario, the template image is a target license plate image; in a face matching scenario, the template image is a facial feature image, such as a face image including the face contour and/or facial features of a person. The template image includes a plurality of template feature points; as shown in fig. 1, the template image 101 includes template feature points 0-21.
The image to be matched is an image which needs to be matched with the template image.
S302, acquiring feature information of a target template feature point of the template image.
The target template feature point is a template feature point of the template image acquired according to a preset rule.
For example, according to the row-column distribution order of the template feature points in the template image, the first template feature point in the template image may be taken as the starting point and used as the target template feature point whose feature information is acquired; or, in the reverse order, the last template feature point in the template image may be taken as the starting point and used as the target template feature point whose feature information is acquired.
S303, calculating in parallel, according to the feature information, the similarity between the target template feature point and a plurality of corresponding target image points in the image to be matched.
The target image points are image points in the same row or the same column of the image to be matched, and two adjacent target image points are separated by a preset number of columns or rows: when the target image points are image points in the same row of the image to be matched, two adjacent target image points are separated by a preset number of columns; when the target image points are image points in the same column, two adjacent target image points are separated by a preset number of rows.
The preset number of columns and the preset number of rows can be set according to the actual application scenario; for example, the preset number of columns can be set to 2 or 3 columns, and the preset number of rows to 3 or 4 rows.
S304, accumulating the similarity between each target image point and the target template feature point into the comprehensive similarity corresponding to the region to be matched in which the target image point is located.
The region to be matched in which each target image point is located is the region corresponding to the template image in the image to be matched when the target image point coincides with the target template feature point.
S305, determining a target matching region from the regions to be matched according to the comprehensive similarity corresponding to each region to be matched.
Specifically, a region to be matched whose comprehensive similarity is greater than a preset similarity threshold may be selected as the target matching region. The preset similarity threshold may be set according to the actual application scenario.
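Step S305 can be sketched as a simple threshold filter over the accumulated comprehensive similarities (a minimal illustration; the mapping-based representation and the names are assumptions, not part of the embodiment):

```python
def select_target_regions(region_scores, threshold):
    # region_scores: maps each region to be matched (identified by its
    # top-left corner) to its accumulated comprehensive similarity
    return [region for region, score in region_scores.items()
            if score > threshold]
```

Regions whose comprehensive similarity exceeds the preset threshold are returned as target matching regions.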
With the method provided by the embodiment of the invention, each time the feature information of a target template feature point of the template image is acquired, the similarity between the target template feature point and a plurality of corresponding target image points in the image to be matched can be calculated in parallel according to the feature information; that is, the acquired feature information of one template feature point can be computed simultaneously against the information of the corresponding target image points in a plurality of different image regions of the image to be matched. Therefore, the data of the image points in the same row or the same column of the image to be matched, spaced by the preset number of columns or rows, can be loaded from the memory at once; then, when the similarity between the target template feature point and the plurality of corresponding target image points is calculated, the template feature information does not need to be acquired region by region, and the similarity between the target template feature point and the loaded target image points spaced by the preset number of columns or rows, which correspond to a plurality of different regions of the image to be matched, can be calculated in parallel, thereby effectively improving the template matching speed.
In one possible implementation, before the target matching region is determined from the regions to be matched according to the comprehensive similarity corresponding to each region to be matched, the method may further include step A1:
Step A1: acquiring a next target template feature point of the template image according to the preset rule, and returning to the step of acquiring the feature information of the target template feature point of the template image, until all the template feature points of the template image have been acquired.
For example, taking the template image in fig. 1 as an example, according to the row-column distribution order of the template feature points of the template image, the first template feature point "0" in the template image is taken as the starting point and used as the target template feature point. After S302-S304 have been executed for this target template feature point, the next template feature point "3" may be taken as the next target template feature point of the template image, and S302 is executed again for it, until the template feature points 0-21 of the template image have all been acquired.
Alternatively, still taking the template image in fig. 1 as an example, in the reverse of the row-column distribution order of the template feature points, the last template feature point "12" in the template image is taken as the starting point and used as the target template feature point. After S302-S304 have been executed for this target template feature point, the preceding template feature point "11" may be taken as the next target template feature point of the template image, and S302 is executed again for it, until the template feature points 0-21 of the template image have all been acquired.
In one possible implementation manner, after the template feature points of the template image are all acquired, the method may further include step B1:
And B1, moving the template image by a preset line number according to a first moving direction or a preset column number according to a second moving direction, returning to the step of parallelly calculating the similarity between a plurality of corresponding target image points in the image to be matched and the target template feature points according to the feature information until the template image moves to coincide with a first boundary or a second boundary of the image to be matched.
The first boundary is a boundary corresponding to the first moving direction in the boundaries of the images to be matched, and the second boundary is a boundary corresponding to the second moving direction in the boundaries of the images to be matched.
If the first moving direction is from top to bottom, the first boundary is the lower boundary of the image to be matched; if the first moving direction is from bottom to top, the first boundary is the upper boundary. Likewise, if the second moving direction is from left to right, the second boundary is the right boundary of the image to be matched; if the second moving direction is from right to left, the second boundary is the left boundary.
Specifically, the template image may be moved by the preset number of rows in the first moving direction, after which execution returns to the step of calculating in parallel, according to the feature information, the similarity between the target template feature points and the corresponding plurality of target image points in the image to be matched, until the template image has moved to coincide with the first boundary of the image to be matched. Alternatively, the template image may be moved by the preset number of columns in the second moving direction, returning to the same step, until the template image has moved to coincide with the second boundary of the image to be matched.
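The row-by-row movement of step B1 can be sketched as follows; this is a minimal illustration, and the function name and concrete sizes are assumptions rather than part of the embodiment:

```python
# Sketch of step B1: the template slides down by the preset number of rows
# until its lower edge coincides with the lower boundary of the image to be
# matched.  All names and sizes here are illustrative assumptions.
def row_positions(image_rows, template_rows, preset_rows=1):
    """Yield each vertical offset of the template inside the image to be matched."""
    for top in range(0, image_rows - template_rows + 1, preset_rows):
        yield top

positions = list(row_positions(image_rows=14, template_rows=8, preset_rows=1))
# The top edge of an 8-row template visits rows 0..6 of a 14-row image.
```

Moving column by column in the second moving direction is symmetric, with the column count and preset number of columns in place of the row quantities.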
In this embodiment, the template feature points may be acquired sequentially, starting from the first template feature point in the template image, according to the row-column distribution order of the template feature points of the template image, for determining the target matching area. Fig. 4 is another flowchart of a template matching method for an image according to an embodiment of the present invention; as shown in fig. 4, the method includes:
S401, acquiring a template image and an image to be matched, and setting k to 1.
S402, obtaining feature information of a kth template feature point of the template image.
S403, according to the feature information, calculating the similarity between a plurality of corresponding target image points in the image to be matched and the kth template feature point in parallel.
The target image points are image points in the same row in the image to be matched, and two adjacent target image points are separated by a preset column number. The preset number of columns may be set according to the actual matching condition, which is not specifically limited herein.
And S404, accumulating the similarity between each target image point and the kth template feature point to the comprehensive similarity corresponding to the region to be matched where the target image point is located.
The region to be matched where each target image point is located is a region corresponding to the template image in the image to be matched when the target image point coincides with the kth template feature point.
S405, incrementing k by 1, and returning to the step of acquiring the feature information of the kth template feature point of the template image until k exceeds the number of template feature points.
S406, moving the template image by a preset number of rows in a first moving direction, and returning to the step of setting k to 1, until the template image has moved to coincide with the first boundary of the image to be matched.
The first boundary is the boundary of the image to be matched corresponding to the first moving direction. The first moving direction may be set, according to the actual matching situation, to a vertically upward or vertically downward direction along the image columns of the image to be matched. The preset number of rows may be set to 1, 2, or 3 according to the actual matching situation, and is not specifically limited herein.
S407, determining a target matching area from each area to be matched according to the comprehensive similarity corresponding to each area to be matched.
Specifically, the image region to be matched, whose corresponding comprehensive similarity is greater than a preset similarity threshold, may be selected as the target matching region. The preset similarity threshold may be specifically set according to an actual application scenario.
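The flow of S401-S407 can be sketched as follows. This is a hedged illustration: it assumes grayscale images and uses a negated absolute gray-level difference as the similarity, whereas the embodiment leaves the similarity measure open; all names are assumptions.

```python
import numpy as np

def match_template(image, template, feature_points, col_stride=1, row_stride=1):
    """Sketch of S401-S407: for each template feature point, compute its
    similarity against a whole row of strided target image points at once
    (S403) and accumulate into the comprehensive similarity of each region
    to be matched (S404).  `feature_points` holds (dy, dx) offset values.
    The similarity used here (negated absolute difference) is an assumption."""
    H, W = image.shape
    h, w = template.shape
    rows = range(0, H - h + 1, row_stride)        # S406: row positions of the template
    cols = list(range(0, W - w + 1, col_stride))  # strided target points in one row
    score = np.zeros((H - h + 1, W - w + 1))      # comprehensive similarity per region
    for dy, dx in feature_points:                 # S402, S405: one feature point at a time
        t = float(template[dy, dx])
        for i in rows:
            # S403: similarities for a whole row of target image points at once
            strip = image[i + dy, [j + dx for j in cols]].astype(float)
            score[i, cols] += -np.abs(strip - t)  # S404: accumulate per region
    return score
```

The target matching region of S407 is then the region whose accumulated score is highest (or exceeds the preset similarity threshold).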
With the method provided by the embodiment of the present invention, each time the feature information of the kth template feature point of the template image is acquired, the similarity between that feature point and a plurality of corresponding target image points in the image to be matched can be calculated in parallel according to the feature information; that is, the feature information of one template feature point, once acquired, can be used to calculate similarities against the corresponding target image points in a plurality of different regions of the image to be matched. Therefore, the data of the image points in one row of the image to be matched can be loaded from the memory at the preset column interval, and when the similarity between the kth template feature point and the corresponding target image points is calculated, the template feature information does not need to be fetched again point by point: the similarities between the kth template feature point and the loaded target image points, spaced by the preset number of columns, can be calculated simultaneously. Calculating these similarities in parallel effectively improves the template matching speed.
In one possible embodiment, the feature information of each template feature point includes a template offset value and image feature information of the template feature point. The parallel computing of the similarity between the corresponding target image points in the image to be matched and the target template feature points according to the feature information may include the following steps C1-C2:
And step C1, determining, according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, the column width of the image to be matched, and the preset number of columns or rows, a plurality of target image points corresponding to the target template feature point in the image to be matched.
The template offset value represents the offset of a template feature point relative to the reference point of the template image, and may specifically include the horizontal offset and the vertical offset of the template feature point relative to the reference point. The reference point of the template image may be set in a user-defined manner in practical applications. For example, for the template image 101 shown in fig. 1, the reference point may be set as the image point at the upper left corner of the template image 101 (an image point may specifically be a pixel of the image). The horizontal offset of the template feature point 1 in the template image 101 relative to the reference point is then 2 and its vertical offset is 1, so the offset value of the template feature point 1 may be represented as [1,2]; similarly, the horizontal offset of the template feature point 18 relative to the reference point is 6 and its vertical offset is 2, so the offset value of the template feature point 18 may be represented as [2,6].
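With the top-left image point as the reference point, a template offset value is simply the feature point's row and column coordinates within the template; a one-line sketch (the function name is an assumption):

```python
def template_offset(point_row, point_col, ref_row=0, ref_col=0):
    """Offset value [vertical offset, horizontal offset] of a template
    feature point relative to the reference point of the template image."""
    return [point_row - ref_row, point_col - ref_col]

# Feature point 1 sits in row 1, column 2 of template image 101 -> [1, 2];
# feature point 18 sits in row 2, column 6 -> [2, 6], matching the example above.
```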
The image feature information of the template feature point includes at least one of pixel information, luminance information, chromaticity information, gradient information, and position information of the template feature point.
Fig. 5 is a schematic diagram of matching a template image with an image to be matched according to an embodiment of the present invention. As shown in fig. 5, if the image point at the upper left corner of the template image 501 is set as the reference point, the image point corresponding to the reference point in the image 502 to be matched is the image point of the 0 th row and the 1 st column in the image 502 to be matched. The position information of the image point corresponding to the reference point of the template image in the image to be matched can be specifically the position coordinate of the image point in the image to be matched.
In one possible implementation manner, the step may determine, according to a template offset value of a target template feature point, position information of a corresponding image point of a reference point of the template image in the image to be matched, and a column width or a line width of the image to be matched, a current corresponding target image point of the target template feature point in the image to be matched by adopting the following formula:
p=i*w+j+offset
Wherein p is the target image point currently corresponding to the target template feature point in the image to be matched (expressed as a flattened index), i is the ordinal number of the row in which the image point corresponding to the reference point of the template image is located in the image to be matched, j is the ordinal number of the column in which that image point is located, w is the column width or line width of the image to be matched, and offset is the template offset value of the target template feature point flattened with the same stride w. If the target template feature points are acquired from the template image row by row, w is the column width of the image to be matched; if the target template feature points are acquired from the template image column by column, w is the line width of the image to be matched.
For example, as shown in fig. 5, the image 502 to be matched includes 14 rows by 18 columns of image points; the image point at the upper left corner of the template image 501 may be set as the reference point, and the image point corresponding to the reference point in the image 502 to be matched is the image point a2. If the target template feature point is the template feature point 1 among the template feature points 0-21 of the template image 501, the offset value of the template feature point 1 is [1,2], the position coordinate of the image point a2 in the image 502 to be matched is (0,1) (i.e., the row ordinal of a2 is 0 and its column ordinal is 1), and the column width of the image 502 to be matched is 18. Substituting this information into the formula gives p = 0×18 + 1 + (1×18 + 2) = 21; since 21 = 1×18 + 3, the target image point corresponding to the target template feature point in the image 502 to be matched is the image point in row 1, column 3. The row and column indices in the image to be matched start from 0.
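The arithmetic of the formula, using the figures from the example above (the function and variable names are assumptions for illustration):

```python
def target_index(i, j, offset_row, offset_col, w):
    """p = i*w + j + offset, with the offset value [offset_row, offset_col]
    flattened with the same row-major stride w as the image to be matched."""
    return i * w + j + (offset_row * w + offset_col)

# Reference point a2 at row 0, column 1; offset value [1, 2]; column width 18.
p = target_index(i=0, j=1, offset_row=1, offset_col=2, w=18)
row, col = divmod(p, 18)
# p == 21, i.e. row 1, column 3 of the image to be matched.
```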
Then, the template image may be moved by a preset number of columns in a third moving direction, and execution returns to the step of determining, by the above formula, the target image point currently corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, and the column width of the image to be matched, until the template image has moved to coincide with the third boundary of the image to be matched. The third boundary is the boundary of the image to be matched corresponding to the third moving direction.
The third moving direction may be set to the horizontal rightward or horizontal leftward direction along the image rows of the image to be matched according to the actual matching situation. The preset number of columns may be set to 1, 2, or 3 according to the actual matching situation, and is not specifically limited herein.
For example, as shown in fig. 5, if the preset number of columns is set to 1, the third moving direction is set to the horizontal rightward direction along the image rows of the image 502 to be matched, and the target template feature point is the template feature point 0, then after the target image point currently corresponding to the template feature point 0 in the image 502 to be matched is determined to be the image point a5, the template image 501 may be moved by 1 column in the third moving direction, and execution returns to the step of determining, by the above formula, the target image point currently corresponding to the target template feature point, yielding the image point a6. This continues until the right boundary of the template image 501 has moved to coincide with the third boundary of the image 502 to be matched, at which point the target image points corresponding to the template feature point 0 are the image points a5 to a15 in the image 502 to be matched.
As a further example, taking fig. 5 again, the image point at the upper left corner of the template image 501 is still set as the reference point, and the image point corresponding to the reference point in the image 502 to be matched is the image point a2. If the target template feature point is the template feature point 1 among the template feature points 0-21 of the template image 501, the offset value of the template feature point 1 is [1,2], the position coordinate of the image point a2 in the image 502 to be matched is (0,1) (i.e., the row ordinal of a2 is 0 and its column ordinal is 1), and the line width of the image 502 to be matched is 14. Substituting this information into the formula gives p = 0×14 + 1 + (1×14 + 2) = 17; since 17 = 1×14 + 3, the target image point corresponding to the target template feature point in the image 502 to be matched is the image point in the 1st row and 3rd column under the line-width stride. The row and column indices in the image to be matched start from 0.
Then, the template image may be moved by a preset number of rows in a fourth moving direction, and execution returns to the step of determining, by the above formula, the target image point currently corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, and the line width of the image to be matched, until the template image has moved to coincide with the fourth boundary of the image to be matched. The fourth boundary is the boundary of the image to be matched corresponding to the fourth moving direction.
The fourth moving direction may be set to the vertically upward or vertically downward direction along the image columns of the image to be matched according to the actual matching situation. The preset number of rows may be set to 1, 2, or 3 according to the actual matching situation, and is not specifically limited herein.
Taking fig. 5 as an example, if the preset number of rows is set to 1, the fourth moving direction is set to the top-to-bottom direction along the image columns of the image 502 to be matched, and the target template feature point is the template feature point 3, then after the target image point currently corresponding to the template feature point 3 in the image 502 to be matched is determined to be the image point b1, the template image 501 may be moved by 1 row in the fourth moving direction, and execution returns to the step of determining, by the above formula, the target image point currently corresponding to the target template feature point, yielding the image point c1. This continues until the lower boundary of the template image 501 has moved to coincide with the fourth boundary of the image 502 to be matched, at which point the target image points corresponding to the template feature point 3 are the image points b1, c1, d1, e1, f1, g1, h1, and k1 in the image 502 to be matched.
And step C2, calculating in parallel the similarity between each target image point and the target template feature point according to the image feature information of the target template feature point and the image information of the plurality of target image points.
For example, taking fig. 5 as an example, if the template image is moved in the third moving direction with the preset number of columns set to 1, the third moving direction set to the horizontal rightward direction along the image rows of the image 502 to be matched, and the target template feature point being the template feature point 0, the target image points corresponding to the template feature point 0 are the image points a5 to a15 in the image 502 to be matched. In the embodiment of the invention, the similarities between the image points a5 to a15 and the target template feature point can be calculated directly in parallel.
Still taking fig. 5 as an example, if the template image is moved in the fourth moving direction with the preset number of rows set to 1, the fourth moving direction set to the top-to-bottom direction along the image columns of the image 502 to be matched, and the target template feature point being the template feature point 3, the target image points corresponding to the template feature point 3 are the image points b1, c1, d1, e1, f1, g1, h1, and k1 in the image 502 to be matched. In the embodiment of the invention, the similarities between these image points and the target template feature point can be calculated directly in parallel.
The image information of each target image point corresponding to the target template feature point in the image to be matched comprises at least one of pixel information, brightness information, chromaticity information, gradient information and position information of the target image point.
Specifically, this step may calculate any one of the cosine similarity, the Euclidean distance, and the Manhattan distance between the target template feature point and each of its corresponding target image points in the image to be matched as the similarity. Alternatively, at least two of the cosine similarity, the Euclidean distance, and the Manhattan distance may be calculated and averaged, and the average taken as the similarity. Of course, in the embodiment of the present invention, other similarity calculation methods may also be adopted to calculate, according to the image feature information of the target template feature point and the image information of each corresponding target image point, the similarity between the target template feature point and each of its corresponding target image points in the image to be matched, which is not specifically limited herein.
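The three measures named above can each be computed for a whole batch of target image points at once. The following is a hedged sketch, assuming each image point is described by a small feature vector; the function and parameter names are illustrative assumptions:

```python
import numpy as np

def similarities(feature, points, measure="cosine"):
    """Compute the named measure between one template feature vector and a
    batch of target-point feature vectors in parallel (one row per point)."""
    f = np.asarray(feature, dtype=float)
    P = np.asarray(points, dtype=float)
    if measure == "cosine":
        return P @ f / (np.linalg.norm(P, axis=1) * np.linalg.norm(f))
    if measure == "euclidean":
        return np.linalg.norm(P - f, axis=1)   # a distance: smaller means more similar
    if measure == "manhattan":
        return np.abs(P - f).sum(axis=1)       # likewise a distance
    raise ValueError(measure)
```

Averaging two or more of these measures, as the embodiment also allows, amounts to calling the function twice and taking the mean of the results.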
The following embodiment is a specific example of template matching of the template image 501 and the image 502 to be matched shown in fig. 5:
As shown in fig. 5, rows 0 to 13 of the image 502 to be matched include, in order, the image points a1 to a18, b1 to b18, ..., and n1 to n18. The reference point of the template image 501 may be set as the image point at its upper left corner, the third moving direction may be set as the horizontal rightward direction moving from the image point a1 toward a18, and the third boundary is the boundary of the image 502 to be matched corresponding to the third moving direction. The first moving direction may be set as the vertically downward direction moving from the image point a1 toward n1, and the first boundary is the boundary of the image 502 to be matched corresponding to the first moving direction. Both the preset number of columns and the preset number of rows are set to 1.
In this example, when the matching process starts, the image point corresponding to the reference point of the template image 501 in the image to be matched is a2. The feature information of the 1st template feature point (template feature point 0) can be acquired from the memory, the plurality of target image points (a5 to a15) corresponding to the 1st template feature point in the image 502 to be matched are determined according to that feature information, and the similarities between the 1st template feature point and the image points a5 to a15 are then calculated in parallel. The similarity between each target image point (a5 to a15) and the 1st template feature point is accumulated into the comprehensive similarity of the region to be matched in which that target image point is located, where the region to be matched in which each target image point is located is the region of the image 502 to be matched covered by the template image 501 when that target image point coincides with the 1st template feature point. For example, when the image point a5 shown in fig. 5 coincides with the 1st template feature point, the region of the image 502 to be matched currently covered by the template image 501 is the region to be matched in which the image point a5 is located.
Alternatively, in this example, when the matching process starts, the image point corresponding to the reference point of the template image 501 in the image to be matched is a2. The feature information of the 1st template feature point (template feature point 0) can be acquired from the memory, after which the preset number of target image points corresponding to the 1st template feature point in the image 502 to be matched are determined in turn according to that feature information, and the similarities between the 1st template feature point and those target image points are calculated in parallel. The preset number may be set to 8 or 10, etc., and is not specifically limited herein. The number of target image points whose similarities are calculated in parallel at a time does not exceed the preset number.
For example, if the preset number is set to 8, then when the matching process starts, the image point corresponding to the reference point of the template image 501 in the image to be matched is a2, and the feature information of the 1st template feature point (template feature point 0) can be acquired from the memory. According to that feature information, 8 target image points (a5 to a12) corresponding to the 1st template feature point in the image 502 to be matched are determined, and the similarities between the 1st template feature point and the image points a5 to a12 are then calculated in parallel. The similarity between each of the image points a5 to a12 and the 1st template feature point is accumulated into the comprehensive similarity of the region to be matched in which that image point is located. Then, the remaining target image points (a13 to a15) corresponding to the template feature point 0 in the image 502 to be matched are determined according to the feature information of the template feature point 0, and the similarities between the template feature point 0 and the image points a13 to a15 are calculated in parallel. The similarity between each of the image points a13 to a15 and the template feature point 0 is accumulated into the comprehensive similarity of the region to be matched in which that image point is located.
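The batching described above, where no more than the preset number of target image points take part in one round of parallel computation, can be sketched as follows (the function name is an assumption):

```python
def batches(points, preset_number):
    """Split the target image points into groups of at most `preset_number`,
    so each round of parallel similarity computation stays within the cap."""
    return [points[i:i + preset_number] for i in range(0, len(points), preset_number)]

row_points = ["a5", "a6", "a7", "a8", "a9", "a10", "a11", "a12", "a13", "a14", "a15"]
groups = batches(row_points, 8)
# First round handles a5..a12 (8 points), the second round the remaining a13..a15.
```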
Then, the feature information of the 2nd template feature point (template feature point 1) can be acquired in the same manner, and the similarities between the 2nd template feature point and its corresponding target image points in the image 502 to be matched are calculated according to the above flow, updating the comprehensive similarity of each region to be matched with a new round of accumulation. The 3rd to 22nd template feature points (template feature points 2 to 21) are processed in turn in the same way, so that the comprehensive similarity of each region to be matched reflects the accumulated similarities of the 1st to 22nd template feature points. After the template feature points 0-21 of the template image 501 have all participated in the similarity calculation, the template image 501 is moved by 1 row in the first moving direction, so that the image point corresponding to the reference point of the template image 501 in the image 502 to be matched becomes the image point b1, and the similarities between the 1st to 22nd template feature points and their corresponding target image points in the image 502 to be matched are again calculated in turn according to the above flow. This continues until the template image has moved to coincide with the first boundary of the image to be matched, yielding the final comprehensive similarity of every region to be matched after the similarities calculated for the 1st to 22nd template feature points have been accumulated. A target matching region can then be selected from the regions to be matched according to the comprehensive similarity of each region.
Specifically, the region to be matched with the highest comprehensive similarity may be selected as the target matching region. For example, if the region currently covered by the template image 501 in the image 502 to be matched is the region to be matched with the highest comprehensive similarity, that region may be determined as the target matching region; that is, the region of the image to be matched covered by the template image 501 shown in fig. 5 is determined as the target matching region. The target matching region is the image region finally matched to the template image.
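Both selection rules described in this disclosure, taking the single region with the highest comprehensive similarity or every region above a preset similarity threshold, can be sketched as follows (a hedged illustration; the names are assumptions):

```python
import numpy as np

def select_target_region(score, threshold=None):
    """Pick target matching region(s) from the comprehensive-similarity map:
    all regions above a preset threshold if one is given, otherwise the
    single region with the highest comprehensive similarity."""
    if threshold is not None:
        return [tuple(rc) for rc in zip(*np.where(score > threshold))]  # all above threshold
    return np.unravel_index(np.argmax(score), score.shape)              # single best region

scores = np.array([[0.1, 0.9],
                   [0.4, 0.3]])
best = select_target_region(scores)         # (0, 1): highest comprehensive similarity
above = select_target_region(scores, 0.35)  # [(0, 1), (1, 0)]: regions above the threshold
```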
The following embodiment is another specific example of template matching of the template image 501 and the image 502 to be matched shown in fig. 5:
As shown in fig. 5, rows 0 to 13 of the image 502 to be matched include, in order, the image points a1 to a18, b1 to b18, ..., and n1 to n18. The reference point of the template image 501 may be set as the image point at its upper left corner, the fourth moving direction may be set as the vertically downward direction moving from the image point b1 toward n1, and the fourth boundary is the boundary of the image 502 to be matched corresponding to the fourth moving direction. The second moving direction may be set as the horizontal rightward direction moving from the image point a1 toward a18, and the second boundary is the boundary of the image 502 to be matched corresponding to the second moving direction. Both the preset number of columns and the preset number of rows are set to 1.
In this example, when the matching process starts, the image point corresponding to the reference point of the template image 501 in the image to be matched is a2, and the feature information of the 1st template feature point (template feature point 0) can be acquired from the memory. According to that feature information, the plurality of target image points (a5, b5, ..., h5) corresponding to the 1st template feature point in the image 502 to be matched are determined, and the similarities between the 1st template feature point and the image points a5, b5, ..., h5 are then calculated in parallel. The similarity between each target image point (a5, b5, ..., h5) and the 1st template feature point is accumulated into the comprehensive similarity of the region to be matched in which that target image point is located, where the region to be matched in which each target image point is located is the region of the image 502 to be matched covered by the template image 501 when that target image point coincides with the 1st template feature point. For example, when the image point a5 shown in fig. 5 coincides with the 1st template feature point, the region of the image 502 to be matched currently covered by the template image 501 is the region to be matched in which the image point a5 is located.
Alternatively, in this example, when the matching process starts, the image point corresponding to the reference point of the template image 501 in the image to be matched is a2. The feature information of the 1st template feature point (template feature point 0) can be acquired from the memory, after which the preset number of target image points corresponding to the 1st template feature point in the image 502 to be matched are determined in turn according to that feature information, and the similarities between the 1st template feature point and those target image points are calculated in parallel; the number of target image points whose similarities are calculated in parallel at a time does not exceed the preset number.
For example, if the preset number is set to 6, then when the matching process starts, the image point corresponding to the reference point of the template image 501 in the image to be matched is a2, and the feature information of the 1st template feature point (template feature point 0) can be acquired from the memory. According to that feature information, 6 target image points (a5, b5, ..., f5) corresponding to the 1st template feature point in the image 502 to be matched are determined, and the similarities between the 1st template feature point and the image points a5, b5, ..., f5 are then calculated in parallel. The similarity between each of the image points a5, b5, ..., f5 and the 1st template feature point is accumulated into the comprehensive similarity of the region to be matched in which that image point is located. Then, the remaining target image points (g5 and k5) corresponding to the template feature point 0 in the image 502 to be matched are determined according to the feature information of the template feature point 0, and the similarities between the template feature point 0 and the image points g5 and k5 are calculated in parallel. The similarity between each of the image points g5 and k5 and the template feature point 0 is accumulated into the comprehensive similarity of the region to be matched in which that image point is located.
Then, the feature information of the 2nd template feature point (template feature point 1) can be obtained in the same manner, and the similarity between the 2nd template feature point and its corresponding target image points in the image 502 to be matched is calculated according to the above flow, yielding the comprehensive similarity of each region to be matched after a new round of accumulation. The similarity between each of the 3rd to 22nd template feature points (template feature points 2 to 21) and its corresponding target image points in the image 502 to be matched is then calculated in turn according to the same flow, so that the final comprehensive similarity of each region to be matched is obtained after the similarities calculated for the 1st to 22nd template feature points have all been accumulated. After template feature points 0-21 of the template image 501 have participated in the similarity calculation, the template image 501 is moved by 1 column in a second moving direction, so that the reference point of the template image 501 now corresponds to the image point a3 in the image 502 to be matched; the similarity between each of the 1st to 22nd template feature points and its corresponding target image points in the image 502 to be matched is then calculated in sequence according to the above flow, until the template image moves to coincide with a second boundary of the image to be matched. A target matching region can then be selected from the respective regions to be matched according to the comprehensive similarity corresponding to each region to be matched.
Specifically, the region to be matched with the highest comprehensive similarity can be selected as the target matching region. For example, if the region covered by the template image in the image 502 to be matched is the region with the highest comprehensive similarity, that region may be determined as the target matching region; that is, the region of the image to be matched covered by the template image 501 shown in fig. 5 is determined as the target matching region. The target matching region is the image region that finally corresponds to the template image.
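The flow described above — per-feature-point parallel similarity, accumulation into per-region comprehensive similarities, then selection of the highest-scoring region — can be sketched as a minimal Python illustration. This is not the patented implementation: the similarity measure (negative absolute difference), the data layout, and all names are illustrative assumptions.

```python
# Minimal sketch of the matching flow: for each template feature point,
# compute its similarity against the corresponding image point of every
# candidate region, accumulate into per-region comprehensive similarities,
# then pick the region with the highest score.

def match(template_points, image, tmpl_h, tmpl_w):
    img_h, img_w = len(image), len(image[0])
    # every (row, col) where the template fits is a region to be matched
    regions = [(r, c) for r in range(img_h - tmpl_h + 1)
                      for c in range(img_w - tmpl_w + 1)]
    scores = [0.0] * len(regions)          # comprehensive similarities
    for (dr, dc, feat_val) in template_points:
        for k, (r, c) in enumerate(regions):
            # similarity of one feature point with one target image point;
            # the method in the text evaluates many regions in parallel here
            diff = image[r + dr][c + dc] - feat_val
            scores[k] += -abs(diff)        # toy similarity: negative abs diff
    best = max(range(len(regions)), key=lambda k: scores[k])
    return regions[best]                   # top-left corner of target match
```

A real implementation would replace the inner loop with SIMD lanes spanning several regions at once; the sketch only mirrors the accumulate-then-select structure.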
In one possible implementation, before calculating in parallel, according to the feature information, the similarity between the target template feature points and the corresponding target image points in the image to be matched, the method further comprises: loading, based on SIMD (Single Instruction, Multiple Data), the image information of the target image points corresponding to the target template feature points in the image to be matched from a cache.
In the embodiment of the invention, the image information of each image point of the row where the image point corresponding to the target template feature point is located in the image to be matched can be loaded from the memory into a vector register in advance, and cached in the cache at the same time. SIMD is a parallel processing method in which one computer instruction performs the same operation on multiple data items; the cache may be a CPU cache (high-speed buffer memory) or the like.
The step of calculating in parallel, according to the feature information, the similarity between the target template feature points and the corresponding target image points in the image to be matched specifically comprises: calculating, based on the image information and the feature information, the similarity between each target image point and the target template feature points in parallel.
Specifically, the image information of the target image points corresponding to the target template feature points in the image to be matched can be loaded from the cache into a vector register, and the similarity between each target image point and the target template feature points is calculated in parallel using the vector register.
In the embodiment of the invention, the template can be moved by a preset number of columns along a third moving direction or by a preset number of rows along a fourth moving direction, and both the preset number of columns and the preset number of rows are known, so the image points corresponding to the target template feature points in the image to be matched can be obtained in advance when the matching process starts. Accordingly, when the matching process starts, the image information of the target image points corresponding to the target template feature points in the image to be matched can be loaded from the cache into the vector register based on SIMD, and the similarity between the target template feature points and each target image point can then be calculated in parallel.
For example, as shown in fig. 5, if the preset number of columns is set to 1 and it is determined that the similarity between template feature point 0 and the image points a5-a15 in the image 502 to be matched will be calculated in sequence according to the preset number of columns, then the image information of each image point of the row (row 0) where the image point corresponding to template feature point 0 is located in the image to be matched can be loaded from the cache based on SIMD at the beginning of the matching process, after which the similarity between template feature point 0 and the image points a5-a15 is calculated in parallel. Alternatively, if the preset number of rows is set to 1 and it is determined that the similarity between template feature point 0 and the image points a5, b5, …, h5 in the image 502 to be matched will be calculated in sequence, then the image information of each image point of the column (column 5) where the image point corresponding to template feature point 0 is located in the image to be matched can be loaded from the cache based on SIMD at the beginning of the matching process, after which the similarity between template feature point 0 and the image points a5, b5, …, h5 is calculated in parallel.
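As a rough analogue of the batched loading just described, the following Python sketch "loads" one row of the image to be matched once and evaluates one template feature point against several target image points, spaced by the preset number of columns, in a single slicing pass. The lane count, stride, and negative-absolute-difference similarity are illustrative assumptions standing in for true SIMD vector-register operations.

```python
# Rough analogue of the SIMD step: load one row of the image to be matched
# once, then compute the similarity of a single template feature point
# against several target image points (spaced by a preset column stride)
# in one batch, mimicking a vector-register operation.

def batch_similarity(row, start_col, stride, lanes, feat_val):
    # "load" the target image points for this feature point: same row,
    # `stride` columns between adjacent candidates, `lanes` candidates
    targets = row[start_col : start_col + stride * lanes : stride]
    # one elementwise operation over all lanes, as a SIMD unit would do
    return [-abs(v - feat_val) for v in targets]
```

With a stride of 1 this covers consecutive candidate regions in a row, as in the a5-a15 example; with a larger stride it covers regions spaced by the preset number of columns.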
For another example, fig. 6 is a schematic diagram of performing template matching by using a parallel data processing method according to an embodiment of the present invention. Part (1) of fig. 6 shows a template image 601, in which the distribution of template feature points 0-21 is discrete and disordered. As shown in the memory distribution diagram 602 in fig. 6, the storage order of template feature points 0-21 in the memory is 0 to 21. The feature information of each template feature point can be taken out of the memory and placed into the cache; as shown in fig. 6, the feature information of template feature point 0 can be taken out of the memory and placed into the cache. When matching starts, the preset number of columns and the preset number of rows can each be set to 1. The target image points corresponding to template feature point 0 in the image 603 to be matched are all located in row 0 of the image 603 to be matched, and the similarity between template feature point 0 and image points 3-18 of row 0 in the image 603 to be matched is calculated in sequence according to the preset number of columns. At the beginning of the matching process, the image information of each image point of row 0 of the image to be matched can first be loaded from the cache into a vector register based on SIMD, the feature information of template feature point 0 is likewise taken out of the cache and placed into a vector register, and the similarity between template feature point 0 and image points 3-18 of row 0 in the image 603 to be matched is calculated in parallel.
The vector register is the storage space closest to the CPU for temporarily holding data; when calculating the similarity between the template feature points and the corresponding target image points, the CPU accesses data only in the vector registers, so the data in the cache is fetched and temporarily stored in the vector registers. The op in fig. 6 denotes the reserved space in the cache that feeds the vector register.
Therefore, in this embodiment, each time the similarity between a template feature point and its corresponding target image points in the image to be matched is calculated, the positions of the image points corresponding to the template feature point can be predicted according to the preset number of columns and the preset number of rows, so the SIMD technique can be used for data-parallel processing, improving the template matching speed.
In one possible implementation, in order to further increase the template matching speed, the manner in which the feature information of each template feature point in the template image is stored in the memory is improved. Specifically, the storage order in the memory of the feature information of each template feature point may be made consistent with the row-and-column distribution order of the template feature points in the template image. Before the feature information of the target template feature points of the template image is obtained, the method may further comprise: obtaining the feature information of the target template feature points according to the storage order in the memory of the feature information of the template feature points of the template image.
Specifically, the feature information of each template feature point of the template image may be loaded from the memory to the cache according to the storage sequence of the feature information of each template feature point of the template image in the memory, and then after the matching process begins, the feature information of the target template feature point of the template image may be obtained from the cache.
Before this embodiment improves the manner in which the feature information of each template feature point in the template image is stored in the memory, as shown in the memory distribution diagram 602 of template feature points 0-21 in fig. 6, the template feature information is stored in the memory in the order of template feature points 0-21; when template matching is performed, the template feature information is taken out of the memory and placed into the cache according to this memory storage order, and the cache order is consistent with the memory order. With this storage manner, when the template feature information is fetched from the cache in sequence and matched against the image information of the image points on the image to be matched, its distribution positions on the template image are disordered. As a result, while the current region to be matched is traversed and matched against the template image, image point information of the image to be matched that was originally loaded into the cache may be replaced by image point information from distant rows, and when a replaced image point is traversed again, its image information must be reloaded from the memory into the cache. As shown in fig. 6, according to the memory storage order shown in the memory distribution diagram 602, when the region to be matched on the image to be matched is traversed for template feature points 3-6, the image information of the image points in rows 2-5 of the image to be matched can first be fetched from the memory in sequence and placed into the cache.
When the region to be matched is then traversed for template feature point 9, the image information of the image points in rows 2-5 of the image to be matched may already occupy all of the cache space, so the image information of the row-2 image points in the cache may be replaced by the image information of the row-6 image points. Later, when the region to be matched is traversed for template feature point 19, the image information of the row-2 image points has already been replaced and must be reloaded from the memory into the cache in order to calculate the similarity between template feature point 19 and the corresponding image points in row 2. The phenomenon in which the CPU cannot find the required data in the cache, so that the image information of image points is repeatedly loaded, is called a cache miss. The more disordered the distribution positions of the template feature points on the template image, the more severe the cache-miss phenomenon, and the more the template matching speed is affected.
Therefore, in order to reduce cache misses and further improve the template matching speed, this embodiment makes the storage order in the memory of the feature information of each template feature point consistent with the row-and-column distribution order of the template feature points in the template image. Fig. 7 is a schematic diagram of storing template feature information in the memory; the distribution of template feature points 0-21 in the template image may be as shown at 601 in fig. 6. As shown in fig. 7, the storage order in the memory of the feature information of each template feature point in the template image 601 is consistent with the row-and-column distribution order of the template feature points in the template image; that is, the feature information of each template feature point in the template image 601 is stored in the memory in the order of the row and column of that feature point in the template image. After this improvement, the feature information of each template feature point of the template image 601 is loaded from the memory into the cache according to its storage order in the memory, and the caching order of the feature information in the cache is consistent with the storage order in the memory. When templates are matched, the template feature information is taken out of the memory and placed into the cache according to this memory storage order, and the cache order remains consistent with the memory order.
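The improved storage order can be sketched as a simple sort: feature points recorded in an arbitrary (e.g. detection) order, as in diagram 602, are rewritten in row-then-column order before matching begins. The tuple layout below is an illustrative assumption, not the patented memory format.

```python
# Sketch of the improved storage order: feature points are written to
# memory sorted by (row, column) of their position in the template image,
# so that matching walks the image to be matched row by row and rarely
# evicts a cached image row before it is done with it.

def row_major_storage_order(feature_points):
    # feature_points: list of (row, col, feature_info) tuples in an
    # arbitrary (e.g. detection) order, as in diagram 602
    return sorted(feature_points, key=lambda p: (p[0], p[1]))
```

After this pass, all feature points sharing a row are contiguous in memory, so the image rows they touch are visited consecutively rather than revisited.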
With the improved storage manner, when a region to be matched is matched against the template image, the probability that image point information of the image to be matched that was originally loaded into the cache is later replaced by image point information from distant rows can be reduced, thereby reducing cache misses and improving the template matching speed. As shown in fig. 5, according to the memory storage order of the template feature information shown in fig. 7, the image to be matched may first be traversed for the template feature points lying in one row, in the order of the rows in which the feature points are located, and then traversed for the template feature points in the next row. For example, the image information of the row-2 image points of the image 502 to be matched may be fetched from the memory in sequence and placed into the cache, and the image 502 to be matched may then be traversed in sequence for template feature point 3, template feature point 2, template feature point 1, template feature point 21, template feature point 20 and template feature point 19, which lie in the same row. The image 502 to be matched is then traversed in sequence, according to the row order, for template feature point 4 and template feature point 18, which lie in the same row.
When the image 502 to be matched is traversed for template feature point 9, the image information of the image points in rows 2-5 of the image 502 to be matched may occupy all of the cache space, and the image information of the row-2 image points in the cache may then be replaced by the image information of the row-6 image points. However, because under the improved storage order template feature point 19 has already been processed before template feature point 9, there is no longer any subsequent traversal for template feature point 19 that would require the image information of the row-2 image points to be reloaded from the memory into the cache. In this way, by improving the manner in which the feature information of each template feature point in the template image is stored in the memory, so that its storage order is consistent with the row-and-column distribution order of the feature points in the template image, this embodiment reduces cache misses, increases the data access speed, and further improves the template matching speed.
The embodiment of the invention also provides a template matching device of the image corresponding to the template matching method of the image. The template matching device for the image provided by the embodiment of the invention is described below. As shown in fig. 8, a template matching apparatus of an image, the apparatus comprising:
an image acquisition module 801, configured to acquire a template image and an image to be matched;
The feature information obtaining module 802 is configured to obtain feature information of a target template feature point of the template image, where the target template feature point is a template feature point of the template image obtained according to a preset rule;
A similarity calculating module 803, configured to calculate, in parallel, a similarity between a plurality of target image points corresponding to the target template feature points in the image to be matched and the target template feature points according to the feature information, where the plurality of target image points are image points in a same row or column in the image to be matched, and a preset number of columns or rows are spaced between two adjacent target image points;
a similarity accumulating module 804, configured to accumulate similarities between each target image point and the target template feature points to comprehensive similarities corresponding to a region to be matched where the target image point is located, where the region to be matched where each target image point is located is a region corresponding to the template image in the image to be matched when the target image point coincides with the target template feature points;
the target region determining module 805 is configured to determine a target matching region from the respective regions to be matched according to the comprehensive similarity corresponding to each region to be matched.
Therefore, with the device provided by the embodiment of the invention, each time the feature information of a target template feature point of the template image is obtained, the similarity between the target template feature point and its corresponding target image points in the image to be matched can be calculated in parallel according to that feature information; that is, the feature information of one template feature point, once obtained, can be used to calculate its similarity with the target image points corresponding to it in a plurality of different image regions of the image to be matched. The data of the image points in the same row or the same column of the image to be matched can therefore be loaded from the memory according to the preset number of columns or rows; when the similarity between the target template feature point and its corresponding target image points is calculated, the template feature information need not be acquired one target image point at a time, and the similarity between the target template feature point and the loaded target image points, spaced by the preset number of columns or rows, can be calculated in parallel. The similarity between the target template feature point and the target image points corresponding to a plurality of different regions in the image to be matched is thus calculated in parallel, effectively improving the template matching speed.
Optionally, the apparatus further includes:
And a feature point obtaining module (not shown in the figure), configured to obtain the next target template feature point of the template image according to the preset rule before the target matching region is determined from the respective regions to be matched according to the comprehensive similarity corresponding to each region to be matched, and return to the step of acquiring the feature information of the target template feature points of the template image, until the template feature points of the template image have all been acquired.
Optionally, the apparatus further includes:
And a moving module (not shown in the figure), configured to, after the template feature points of the template image have all been acquired, move the template image by a preset number of rows in a first moving direction or by a preset number of columns in a second moving direction, and return to the step of calculating, in parallel according to the feature information, the similarity between the target template feature points and the corresponding target image points in the image to be matched, until the template image moves to coincide with a first boundary or a second boundary of the image to be matched, where the first boundary is the boundary corresponding to the first moving direction among the boundaries of the image to be matched, and the second boundary is the boundary corresponding to the second moving direction among the boundaries of the image to be matched.
Optionally, the apparatus further includes:
An information loading module (not shown in the figure), configured to load, based on Single Instruction, Multiple Data (SIMD), the image information of the target image points corresponding to the target template feature points in the image to be matched from a cache;
The similarity calculating module 803 is specifically configured to calculate, in parallel, a similarity between each target image point and the target template feature point based on the image information and the feature information.
Optionally, the storage sequence of the feature information of each template feature point in the template image in the memory is consistent with the distribution sequence of the row and the column of each template feature point in the template image;
The feature information obtaining module 802 is specifically configured to obtain feature information of the feature points of the target template according to a storage order of feature information of feature points of each template of the template image in the memory.
Optionally, the feature information of each template feature point includes a template offset value and image feature information of the template feature point;
The similarity calculating module 803 is specifically configured to determine a plurality of target image points corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, the column width of the image to be matched, and the preset column number or row number, and calculate the similarity between each target image point and the target template feature point in parallel according to the image feature information of the target template feature point and the image information of the plurality of target image points.
Optionally, the similarity calculating module 803 includes:
a similarity calculation submodule (not shown in the figure), configured to determine the current target image point corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, and the column width or row width of the image to be matched, using the following formula:
p=i*w+j+offset
wherein p is the position of the current target image point corresponding to the target template feature point in the image to be matched, i is the ordinal number of the row of the image point corresponding to the reference point of the template image in the image to be matched, w is the column width or row width of the image to be matched, j is the ordinal number of the column of the image point corresponding to the reference point of the template image in the image to be matched, and offset is the template offset value of the target template feature point;
a moving submodule (not shown in the figure), configured to move the template image by the preset number of columns in a third moving direction or by the preset number of rows in a fourth moving direction, and return to the step of determining, by the above formula, the current target image point corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, and the column width or row width of the image to be matched, until the template image moves to coincide with a third boundary or a fourth boundary of the image to be matched, where the third boundary is the boundary corresponding to the third moving direction among the boundaries of the image to be matched, and the fourth boundary is the boundary corresponding to the fourth moving direction among the boundaries of the image to be matched.
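The flat-index arithmetic used by the submodules above (p = i*w + j + offset, for a row-major image) can be sketched as follows. The helper for precomputing a feature point's offset from its (row, column) displacement is an illustrative assumption consistent with that formula, not a definition from the text.

```python
# Sketch of the row-major index formula p = i*w + j + offset.

def template_offset(dr, dc, w):
    # flat offset of a template feature point lying dr rows and dc columns
    # away from the template's reference point (assumed helper)
    return dr * w + dc

def target_index(i, j, w, offset):
    # i, j: row and column of the image point corresponding to the
    # template's reference point; w: column width (row stride) of the
    # image to be matched; offset: template offset value of the feature point
    return i * w + j + offset
```

For example, with w = 10, a feature point displaced by (2, 3) has offset 23, and when the reference point corresponds to image point (1, 4) the target image point sits at flat index 37, i.e. row 3, column 7 of the image to be matched.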
By adopting the device provided by the embodiment of the invention, each time the feature information of a target template feature point of the template image is obtained, the similarity between the target template feature point and its corresponding target image points in the image to be matched can be calculated in parallel according to that feature information; that is, the feature information of one template feature point, once obtained, can be used to calculate its similarity with the target image points corresponding to it in a plurality of different image regions of the image to be matched. The data of the image points in the same row of the image to be matched can therefore be loaded from the memory according to the preset number of columns or rows; when the similarity between the target template feature point and its corresponding target image points is calculated, the template feature information need not be acquired one target image point at a time, and the similarity between the target template feature point and the loaded target image points, spaced by the preset number of columns or rows, can be calculated simultaneously. The similarity between the target template feature point and the target image points corresponding to a plurality of different regions in the image to be matched is thus calculated in parallel, effectively improving the template matching speed.
In addition, the embodiment of the invention can improve the manner in which the feature information of each template feature point in the template image is stored in the memory, so that the storage order in the memory of the feature information is consistent with the row-and-column distribution order of the template feature points in the template image, thereby reducing the cache-miss phenomenon and further improving the template matching speed.
The embodiment of the present invention also provides an electronic device, as shown in fig. 9, including a processor 901, a communication interface 902, a memory 903, and a communication bus 904, where the processor 901, the communication interface 902, and the memory 903 communicate with each other through the communication bus 904,
A memory 903 for storing a computer program;
A processor 901 for implementing the steps of any of the template matching methods of the images when executing the program stored in the memory 903.
The communication bus mentioned for the above electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be classified as an address bus, a data bus, a control bus, and so on. For ease of illustration, only one thick line is shown in the figure, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic device and other devices.
The memory may include a Random Access Memory (RAM), or may include a Non-Volatile Memory (NVM), such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), and the like; it may also be a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
In yet another embodiment of the present invention, there is also provided a computer readable storage medium having stored therein a computer program which, when executed by a processor, implements the steps of the template matching method of an image of any one of the above.
In yet another embodiment of the present invention, there is also provided a computer program product containing instructions that, when run on a computer, cause the computer to perform the template matching method of any of the images of the above embodiments.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example by wire (e.g., coaxial cable, optical fiber, Digital Subscriber Line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that can be accessed by a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), a semiconductor medium (e.g., Solid State Disk (SSD)), or the like.
It is noted that relational terms such as first and second are used herein solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element.
In this specification, each embodiment is described in a related manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for system, electronic device, computer readable storage medium and computer program product embodiments, the description is relatively simple as it is substantially similar to the method embodiments, as relevant points are found in the partial description of the method embodiments.
The foregoing description is only of the preferred embodiments of the present invention and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention are included in the protection scope of the present invention.