
CN114926670B - Image template matching method, device, electronic device and storage medium - Google Patents

Image template matching method, device, electronic device and storage medium

Info

Publication number
CN114926670B
Authority
CN
China
Prior art keywords
image
template
target
matched
point
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202210590912.3A
Other languages
Chinese (zh)
Other versions
CN114926670A (en)
Inventor
张一凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikrobot Technology Co Ltd
Hangzhou Hikrobot Co Ltd
Original Assignee
Hangzhou Hikrobot Technology Co Ltd
Hangzhou Hikrobot Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikrobot Technology Co Ltd, Hangzhou Hikrobot Co Ltd filed Critical Hangzhou Hikrobot Technology Co Ltd
Priority to CN202210590912.3A priority Critical patent/CN114926670B/en
Publication of CN114926670A publication Critical patent/CN114926670A/en
Application granted granted Critical
Publication of CN114926670B publication Critical patent/CN114926670B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75 Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751 Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06V10/752 Contour matching
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G06V10/96 Management of image or video recognition tasks

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract


Embodiments of the present invention provide an image template matching method, apparatus, electronic device, and storage medium. The method comprises: obtaining a template image and an image to be matched, and obtaining feature information of a target template feature point; calculating in parallel, based on the feature information, the similarity between the target template feature point and a plurality of corresponding target image points; accumulating the similarity between each target image point and the target template feature point into the comprehensive similarity of the region to be matched in which that target image point is located; and determining a target matching region from the regions to be matched based on the comprehensive similarity corresponding to each region. This method enables the similarity between a target template feature point and the target image points of a plurality of different regions in the image to be matched to be calculated in parallel, thereby improving the template matching speed.

Description

Image template matching method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of image processing technologies, and in particular, to a method and apparatus for matching templates of images, an electronic device, and a storage medium.
Background
Template matching of images is the process of traversing an image to be matched and finding, within it, an image region with high similarity to a pre-generated template image. Image template matching technology is widely used in production and daily life, for example in license plate recognition, bar code recognition, and character recognition.
The traditional template matching approach extracts template feature information from the template image, traverses the different image regions of the image to be matched, determines the similarity between the image information of each image region and the template feature information one by one, and then takes the image region with the highest similarity in the image to be matched as the matched target region.
However, the distribution of the template feature information over the template image is discrete and disordered. For example, for a template of size 7x7, the template feature points may be 21 discrete points scattered over the template image, the information of these points being the template feature information; their positions on the template image are therefore discrete and irregular.
When the similarity between the image information of each image region of the image to be matched and the template feature information is determined one by one, the discrete and disordered distribution of the template feature information means that, each time an image region is matched against the template image, the information of the image points corresponding to the template feature points in the current image region can only be fetched from memory one point at a time and matched against the corresponding template feature point. Only after the feature information of all template feature points has been matched against the corresponding image points of that region is the next image region processed, again fetching its image point information from memory one point at a time, until the template feature information has been matched against the image information of every image region of the image to be matched. Because the image point information corresponding to each template feature point must be fetched from memory one by one for every image region, this approach results in a slow template matching speed.
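For illustration only, the conventional one-region-at-a-time scheme described above can be sketched in pure Python; the one-dimensional image layout, the function name `match_traditional`, and the toy absolute-difference similarity are assumptions for the sketch, not details taken from the patent:

```python
def match_traditional(image, img_w, tpl_w, features):
    """Hypothetical sketch of the conventional scheme: for each candidate
    region, fetch the image point for every template feature one by one
    and accumulate the similarity, before moving on to the next region.
    `image` is a flat list, `features` maps template offsets to values."""
    best_region, best_score = -1, float("-inf")
    for j in range(img_w - tpl_w + 1):         # one region at a time
        score = 0.0
        for offset, feat in features.items():  # one feature point at a time
            # toy similarity: closer values score higher, max 1.0
            score += 1.0 / (1.0 + abs(image[j + offset] - feat))
        if score > best_score:
            best_region, best_score = j, score
    return best_region
```

Note that the inner loop re-fetches every feature's image point for every region, which is exactly the access pattern the text identifies as slow.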
Disclosure of Invention
The embodiment of the invention aims to provide an image template matching method and apparatus, an electronic device, and a storage medium, so as to improve the template matching speed.
In a first aspect, an embodiment of the present invention provides a template matching method for an image, including:
acquiring a template image and an image to be matched;
acquiring feature information of a target template feature point of the template image, wherein the target template feature point is a template feature point of the template image acquired according to a preset rule;
calculating, in parallel according to the feature information, the similarity between the target template feature point and a plurality of corresponding target image points in the image to be matched, wherein the plurality of target image points are image points in the same row or column of the image to be matched, and two adjacent target image points are separated by a preset number of columns or rows;
accumulating the similarity between each target image point and the target template feature point into the comprehensive similarity corresponding to the region to be matched in which the target image point is located, wherein the region to be matched of each target image point is the region covered by the template image in the image to be matched when the target image point coincides with the target template feature point;
and determining a target matching region from the regions to be matched according to the comprehensive similarity corresponding to each region to be matched.
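As a rough illustration of the steps above (not the patented implementation), the loop order can be inverted so that one feature point is compared against every candidate region before the next feature point is loaded; the function name `match_template`, the one-dimensional image layout, and the toy similarity measure are assumptions of this sketch, and a real implementation would perform the inner loop with SIMD instructions:

```python
def match_template(image, img_w, tpl_w, features, stride=1):
    """Hypothetical sketch: for each template feature point, accumulate
    its similarity into the score of every candidate region at once,
    regions being anchored `stride` columns apart in the same row."""
    num_regions = (img_w - tpl_w) // stride + 1
    scores = [0.0] * num_regions               # one comprehensive similarity per region
    for offset, feat in features.items():      # feature info loaded once
        # "parallel" step: the same feature is compared against its
        # corresponding target image point in every candidate region
        for r in range(num_regions):
            p = r * stride + offset            # target image point index
            scores[r] += 1.0 / (1.0 + abs(image[p] - feat))
    # the target matching region has the highest comprehensive similarity
    return max(range(num_regions), key=scores.__getitem__)
```

Each feature's information is fetched once and reused across all regions, in contrast to the per-region fetching of the traditional scheme.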
Optionally, before the determining of a target matching region from the regions to be matched according to the comprehensive similarity corresponding to each region to be matched, the method further includes:
acquiring the next target template feature point of the template image according to the preset rule, and returning to the step of acquiring feature information of a target template feature point of the template image, until all template feature points of the template image have been acquired.
Optionally, after the template feature points of the template image are all acquired, the method further includes:
moving the template image by a preset number of rows in a first moving direction or by a preset number of columns in a second moving direction, and returning to the step of calculating in parallel, according to the feature information, the similarity between the target template feature point and a plurality of corresponding target image points in the image to be matched, until the template image has been moved to coincide with a first boundary or a second boundary of the image to be matched;
wherein the first boundary is the boundary of the image to be matched that corresponds to the first moving direction, and the second boundary is the boundary of the image to be matched that corresponds to the second moving direction.
Optionally, before the calculating in parallel, according to the feature information, of the similarity between the target template feature point and its corresponding target image points in the image to be matched, the method further includes:
loading, from a cache based on single-instruction multiple-data (SIMD) processing, the image information of the plurality of target image points corresponding to the target template feature point in the image to be matched;
and the calculating in parallel, according to the feature information, of the similarity between the target template feature point and its corresponding target image points in the image to be matched includes:
calculating, in parallel based on the image information and the feature information, the similarity between each target image point and the target template feature point.
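A scalar stand-in for the SIMD load and compare described above might look as follows; the lane count, stride, and function names are illustrative assumptions, and on real hardware the slice and the comparison would each be a handful of vector instructions over data already resident in the cache:

```python
def load_target_points(image_row, start, stride, lanes):
    """Stand-in for a SIMD load: fetch, in one operation, the image
    points that correspond to one template feature point across
    `lanes` candidate regions spaced `stride` columns apart."""
    return image_row[start:start + stride * lanes:stride]

def vector_similarity(targets, feat):
    """Stand-in for one vectorized compare: the feature value is
    matched against every loaded lane simultaneously (toy measure)."""
    return [1.0 / (1.0 + abs(t - feat)) for t in targets]
```

For example, `load_target_points(row, 1, 2, 3)` picks every second point starting at index 1, yielding one lane per candidate region.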
Optionally, the order in which the feature information of the template feature points of the template image is stored in memory is consistent with the row-and-column distribution order of the template feature points in the template image;
the obtaining of the feature information of the target template feature point of the template image includes:
acquiring the feature information of the target template feature point according to the order in which the feature information of the template feature points of the template image is stored in memory.
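For instance, under the assumption that each feature point carries a row, a column, and a value (a layout not specified by the patent), the feature records can be laid out in memory in the same row-then-column order as the points appear in the template, so that they are read sequentially during matching:

```python
def order_features(points, tpl_w):
    """Hypothetical sketch: sort feature points into row-then-column
    order and store each as (flat template offset, feature value),
    with the offset computed as row * tpl_w + col."""
    ordered = sorted(points, key=lambda p: (p[0], p[1]))
    return [(r * tpl_w + c, v) for r, c, v in ordered]
```

Reading such a list front to back then visits the feature information in the same order as the points are distributed over the template image.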
Optionally, the feature information of each template feature point includes a template offset value and image feature information of the template feature point;
and the calculating in parallel, according to the feature information, of the similarity between the target template feature point and its corresponding target image points in the image to be matched includes:
determining the plurality of target image points corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point in the image to be matched that corresponds to the reference point of the template image, the column width of the image to be matched, and the preset number of columns or rows;
and calculating, in parallel according to the image feature information of the target template feature point and the image information of the plurality of target image points, the similarity between each target image point and the target template feature point.
Optionally, the determining of the plurality of target image points corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point in the image to be matched that corresponds to the reference point of the template image, the column width of the image to be matched, and the preset number of columns or rows includes:
determining the target image point currently corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point in the image to be matched that corresponds to the reference point of the template image, and the column width or row width of the image to be matched, using the following formula:
p = i*w + j + offset
where p is the target image point currently corresponding to the target template feature point in the image to be matched, i is the ordinal number of the row of the image point in the image to be matched that corresponds to the reference point of the template image, w is the column width or row width of the image to be matched, j is the ordinal number of the column of that corresponding image point, and offset is the template offset value of the target template feature point;
and moving the template image by a preset number of columns in a third moving direction or by a preset number of rows in a fourth moving direction, and returning to the step of determining, using the above formula, the target image point currently corresponding to the target template feature point in the image to be matched, until the template image has been moved to coincide with a third boundary or a fourth boundary of the image to be matched, wherein the third boundary is the boundary of the image to be matched that corresponds to the third moving direction, and the fourth boundary is the boundary of the image to be matched that corresponds to the fourth moving direction.
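The formula p = i*w + j + offset translates directly into code; the concrete numbers below (a 640-column image, reference point at row 2, offset 5) are arbitrary examples, not values from the patent:

```python
def target_point_index(i, j, w, offset):
    """Flat row-major index of the target image point, per the formula
    p = i*w + j + offset: (i, j) is the position of the image point
    corresponding to the template's reference point, w is the column
    width of the image to be matched, offset the template offset."""
    return i * w + j + offset

# moving the template column by column shifts j, so successive
# candidate regions produce consecutive (or strided) indices
indices = [target_point_index(2, j, 640, 5) for j in range(4)]
```

The strided indices are what make it possible to fetch the target image points of several regions in one pass.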
In a second aspect, an embodiment of the present invention provides a template matching apparatus for an image, including:
The image acquisition module is used for acquiring a template image and an image to be matched;
The feature information acquisition module is used for acquiring feature information of target template feature points of the template image;
The similarity calculation module is used for calculating, in parallel according to the feature information, the similarity between the target template feature point and a plurality of corresponding target image points in the image to be matched, wherein the plurality of target image points are image points in the same row or column of the image to be matched, and two adjacent target image points are separated by a preset number of columns or rows;
The similarity accumulation module is used for accumulating the similarity between each target image point and the target template feature point into the comprehensive similarity corresponding to the region to be matched in which the target image point is located, wherein the region to be matched of each target image point is the region covered by the template image in the image to be matched when the target image point coincides with the target template feature point;
The target region determining module is used for determining a target matching region from each region to be matched according to the comprehensive similarity corresponding to each region to be matched.
Optionally, the apparatus further includes:
The feature point obtaining module is configured to obtain, before a target matching region is determined from the regions to be matched according to the comprehensive similarity corresponding to each region to be matched, the next target template feature point of the template image according to the preset rule, and to return to the step of obtaining feature information of a target template feature point of the template image, until all template feature points of the template image have been obtained.
Optionally, the apparatus further includes:
The moving module is used for moving the template image, after all template feature points of the template image have been acquired, by a preset number of rows in a first moving direction or by a preset number of columns in a second moving direction, and returning to the step of calculating in parallel, according to the feature information, the similarity between the target template feature point and a plurality of corresponding target image points in the image to be matched, until the template image has been moved to coincide with a first boundary or a second boundary of the image to be matched, wherein the first boundary is the boundary of the image to be matched that corresponds to the first moving direction, and the second boundary is the boundary of the image to be matched that corresponds to the second moving direction.
Optionally, the apparatus further includes:
The information loading module is used for loading, from a cache based on single-instruction multiple-data (SIMD) processing, the image information of the plurality of target image points corresponding to the target template feature point in the image to be matched;
the similarity calculation module is specifically configured to calculate, in parallel, a similarity between each target image point and the target template feature point based on the image information and the feature information.
Optionally, the storage sequence of the feature information of each template feature point in the template image in the memory is consistent with the distribution sequence of the row and the column of each template feature point in the template image;
The feature information acquisition module is specifically configured to acquire the feature information of the target template feature point according to the order in which the feature information of the template feature points of the template image is stored in memory.
Optionally, the feature information of each template feature point includes a template offset value and image feature information of the template feature point;
The similarity calculation module is specifically configured to determine the plurality of target image points corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point in the image to be matched that corresponds to the reference point of the template image, the column width of the image to be matched, and the preset number of columns or rows, and to calculate, in parallel according to the image feature information of the target template feature point and the image information of the plurality of target image points, the similarity between each target image point and the target template feature point.
Optionally, the similarity calculation module includes:
The similarity calculation submodule is used for determining the target image point currently corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point in the image to be matched that corresponds to the reference point of the template image, and the column width or row width of the image to be matched, using the following formula:
p = i*w + j + offset
where p is the target image point currently corresponding to the target template feature point in the image to be matched, i is the ordinal number of the row of the image point in the image to be matched that corresponds to the reference point of the template image, w is the column width or row width of the image to be matched, j is the ordinal number of the column of that corresponding image point, and offset is the template offset value of the target template feature point;
and the moving submodule is used for moving the template image by a preset number of columns in a third moving direction or by a preset number of rows in a fourth moving direction, and returning to the step of determining, using the above formula, the target image point currently corresponding to the target template feature point in the image to be matched, until the template image has been moved to coincide with a third boundary or a fourth boundary of the image to be matched, wherein the third boundary is the boundary of the image to be matched that corresponds to the third moving direction, and the fourth boundary is the boundary of the image to be matched that corresponds to the fourth moving direction.
In a third aspect, an embodiment of the present invention provides an electronic device, including a processor, a communication interface, a memory, and a communication bus, where the processor, the communication interface, and the memory complete communication with each other through the communication bus;
a memory for storing a computer program;
A processor for implementing the method steps of any of the above first aspects when executing a program stored on a memory.
In a fourth aspect, embodiments of the present invention provide a computer-readable storage medium having a computer program stored therein, which when executed by a processor, implements the method steps of any of the first aspects described above.
The embodiment of the invention has the beneficial effects that:
With the method provided by the embodiment of the invention, each time the feature information of a target template feature point of the template image is obtained, the similarity between that feature point and a plurality of corresponding target image points in the image to be matched can be calculated in parallel according to the feature information; that is, the feature information of one template feature point, once obtained, can be compared simultaneously with the information of the corresponding target image points in a plurality of different image regions of the image to be matched. The data of the image points in the same row or column of the image to be matched, spaced by the preset number of columns or rows, can therefore be loaded from memory in one pass. When the similarity between the target template feature point and its corresponding target image points is then calculated, the template feature information does not need to be fetched point by point: the similarities between the loaded target image points and the target template feature point can be computed in parallel across a plurality of different regions of the image to be matched, which effectively improves the template matching speed.
Of course, it is not necessary for any one product or method of practicing the invention to achieve all of the advantages set forth above at the same time.
Drawings
To more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings required by the embodiments or the description of the prior art are briefly introduced below. It is apparent that the drawings in the following description show only some embodiments of the invention; other drawings may be obtained by those skilled in the art from these drawings.
FIG. 1 is a schematic diagram of a distribution of template feature information on a template image according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of matching template feature information with an image to be matched;
FIG. 3 is a flowchart of a method for matching templates of images according to an embodiment of the present invention;
FIG. 4 is another flowchart of a template matching method for an image according to an embodiment of the present invention;
FIG. 5 is a schematic diagram of matching a template image with an image to be matched according to an embodiment of the present invention;
FIG. 6 is a schematic diagram of template matching using parallel data processing according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of template feature information stored in a memory according to an embodiment of the present invention;
FIG. 8 is a schematic structural diagram of an image template matching apparatus according to an embodiment of the present invention;
fig. 9 is a schematic structural diagram of an electronic device according to an embodiment of the present invention.
Detailed Description
The embodiments of the present application are described below clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments obtained by those skilled in the art based on the embodiments of the present application fall within the protection scope of the present application.
Fig. 1 is a schematic diagram of the distribution of template feature information on a template image. As shown in fig. 1, the size of the template image 101 is 7x7, the template feature points of the template image are numbered 0-21, and the information corresponding to template feature points 0-21 is the template feature information. As can be seen from fig. 1, the distribution of the template feature information over the template image 101 is discrete and disordered. Fig. 2 is a schematic diagram of matching the template feature information with an image to be matched. As shown in fig. 2, the template image 101 is matched against the image to be matched 201 to find whether the image to be matched 201 contains a region matching the template image 101. Because the distribution of the template feature information over the template image is discrete and disordered, the traditional template matching approach can only fetch, from memory one point at a time, the information of the image points corresponding to the template feature points in the current image region of the image to be matched and match it with the template feature point information, until the feature information of all template feature points has been matched with the corresponding image points of that region; it then fetches, again one point at a time, the image point information of the next image region, until the image information of every image region of the image to be matched has been matched with the template feature information. For example, when the template image 101 shown in fig. 2 is matched with the covered image region 202 of the image to be matched, the feature information of template feature point 0 is obtained first, and its similarity with the image information of the corresponding image point in image region 202 is calculated; then the feature information of feature point 1 is obtained, and its similarity with the corresponding image point in image region 202 is calculated; and so on, each piece of template feature information is obtained one by one and matched with the image information of image region 202 until the matching of that region is complete. The template image is then moved horizontally (or vertically) by one or more columns (or rows) to obtain the next image region of the image to be matched 201 covered by the template image 101, and the image information of that region is matched with the template feature information in the same one-by-one manner. This continues until the template feature information of the template image 101 has been matched with the image information of every image region of the image to be matched 201, yielding the matching degree between each image region of the image to be matched 201 and the template image 101.
However, because the template feature information must be loaded one by one each time the template image is matched against an image region of the image to be matched, the template matching speed is low. In addition, when the image information of the image points of each image region of the image to be matched is fetched from memory, the mapping relationship between memory and the Cache causes a large number of cache misses while fetching the image point information of adjacent regions to be matched, so that the image point information of the regions to be matched has to be loaded from memory into the Cache repeatedly, further slowing down template matching. The Cache is a temporary store between the CPU core and external memory, located inside the CPU processor; its capacity is smaller than that of main memory, but its access speed is faster. A cache miss is the situation in which the CPU cannot find the required data in the Cache, so that the image information of the image point has to be reloaded from memory.
In order to improve the template matching speed, embodiments of the invention provide an image template matching method and apparatus, an electronic device, a computer-readable storage medium, and a computer program product.
The image template matching method provided by the embodiments of the invention is described first. It can be applied to any electronic device with an image processing function, which is not specifically limited herein.
Fig. 3 is a flowchart of a template matching method for an image according to an embodiment of the present invention, where, as shown in fig. 3, the method includes:
s301, acquiring a template image and an image to be matched.
In the embodiment of the invention, the template image is a target image generated in advance for a specific application scenario. For example, in a license plate matching scenario the template image is a target license plate image, and in a face matching scenario it is a facial feature image, such as a face image containing a person's face contour and/or facial features. The template image includes a plurality of template feature points; as shown in fig. 1, the template image 101 includes template feature points 0-21.
The image to be matched is an image which needs to be matched with the template image.
S302, obtaining feature information of target template feature points of the template image.
The target template feature points are template feature points of a template image acquired according to a preset rule.
For example, according to the row-column distribution order of the template feature points in the template image, the first template feature point may be taken as the starting point and used as the target template feature point whose feature information is obtained; alternatively, the last template feature point in the template image may be taken as the starting point and used as the target template feature point whose feature information is obtained.
S303, according to the feature information, calculating in parallel the similarity between the target template feature point and each of a plurality of corresponding target image points in the image to be matched.
The target image points are image points in the same row or the same column of the image to be matched, with two adjacent target image points separated by a preset number of columns or rows: when the target image points are image points in the same row of the image to be matched, adjacent target image points are separated by a preset number of columns; when they are image points in the same column, adjacent target image points are separated by a preset number of rows.
The preset number of columns and the preset number of rows may be set according to the actual application scenario; for example, the preset number of columns may be set to 2 or 3 columns, and the preset number of rows to 3 or 4 rows.
S304, accumulating the similarity between each target image point and the target template feature point into the comprehensive similarity corresponding to the region to be matched where that target image point is located.
The region to be matched where each target image point is located is a region corresponding to the template image in the image to be matched when the target image point coincides with the target template feature point.
S305, determining a target matching area from each area to be matched according to the comprehensive similarity corresponding to each area to be matched.
Specifically, the image region to be matched, whose corresponding comprehensive similarity is greater than a preset similarity threshold, may be selected as the target matching region. The preset similarity threshold may be specifically set according to an actual application scenario.
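The flow of S301-S305 can be sketched in a few lines. This is a minimal illustrative sketch under assumptions (grayscale pixel values, an absolute-difference similarity, and hypothetical names such as `match_template`), not the patented implementation itself; in particular, the "parallel" step S303 is written here as an ordinary loop, where a real implementation would use SIMD/vector instructions so the per-region similarities are computed simultaneously.

```python
# Minimal sketch of S301-S305: match a template against an image by
# accumulating per-feature-point similarities into per-region scores.
# The similarity computation of S303 is shown as a loop over candidate
# regions; a real implementation would execute these iterations in
# parallel with SIMD instructions.

def match_template(image, template_points, threshold):
    """image: 2-D list of pixel values (0-255); template_points: list of
    ((row_off, col_off), value) pairs; returns matching region origins."""
    rows, cols = len(image), len(image[0])
    t_rows = max(r for (r, _), _ in template_points) + 1
    t_cols = max(c for (_, c), _ in template_points) + 1
    # Comprehensive similarity accumulator, one per region to be matched
    # (regions are identified by the top-left corner they would occupy).
    scores = {(i, j): 0.0
              for i in range(rows - t_rows + 1)
              for j in range(cols - t_cols + 1)}
    for (r_off, c_off), t_val in template_points:  # S302: one feature point
        for (i, j) in scores:                      # S303: over regions
            pixel = image[i + r_off][j + c_off]
            similarity = 1.0 - abs(pixel - t_val) / 255.0
            scores[(i, j)] += similarity           # S304: accumulate
    # S305: regions whose comprehensive similarity exceeds the threshold
    return [pos for pos, s in scores.items() if s > threshold]
```

The key structural point of the method is visible here: the outer loop is over template feature points, not over regions, so one feature point's information is loaded once and scored against many regions.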
With the method provided by the embodiment of the invention, each time the feature information of a target template feature point of the template image is obtained, the similarity between the target template feature point and a plurality of corresponding target image points in the image to be matched can be calculated in parallel according to that feature information; that is, the feature information of one template feature point, once obtained, can be matched simultaneously against the information of the corresponding target image points in a plurality of different image regions of the image to be matched. Therefore, the data of the image points in the same row or the same column of the image to be matched can be loaded from memory according to the preset number of columns or rows, and when the similarities between the target template feature point and the corresponding target image points are calculated, the template feature information does not need to be fetched one by one: the similarities between the target template feature point and the loaded target image points, spaced the preset number of columns or rows apart, can be computed in parallel; that is, the similarities between the target template feature point and the target image points of a plurality of different regions of the image to be matched are computed in parallel, which effectively improves the template matching speed.
In one possible implementation manner, before the target matching area is determined from the respective areas to be matched according to the comprehensive similarity corresponding to each area to be matched, the method may further include step A1:
Step A1, acquiring the next target template feature point of the template image according to the preset rule, and returning to the step of acquiring the feature information of the target template feature point of the template image, until all template feature points of the template image have been acquired.
For example, taking the template image in fig. 1 as an example, according to the row-column distribution order of the template feature points, the first template feature point "0" is taken as the starting point and used as the target template feature point. After S302-S304 have been executed for this target template feature point, the next template feature point "3" may be taken as the next target template feature point of the template image, and S302 is executed again for it. This continues until the template feature points 0-21 of the template image have all been acquired.
Alternatively, taking the template image in fig. 1 as an example, according to the reverse of the row-column distribution order of the template feature points, the last template feature point "12" is taken as the starting point and used as the target template feature point. After S302-S304 have been executed for this target template feature point, the next template feature point "11" may be taken as the next target template feature point of the template image, and S302 is executed again for it. This continues until the template feature points 0-21 of the template image have all been acquired.
In one possible implementation manner, after the template feature points of the template image are all acquired, the method may further include step B1:
Step B1, moving the template image by a preset number of rows in a first moving direction, or by a preset number of columns in a second moving direction, and returning to the step of calculating in parallel the similarity between the target template feature point and the corresponding target image points in the image to be matched according to the feature information, until the template image has moved to coincide with a first boundary or a second boundary of the image to be matched.
The first boundary is a boundary corresponding to the first moving direction in the boundaries of the images to be matched, and the second boundary is a boundary corresponding to the second moving direction in the boundaries of the images to be matched.
If the first moving direction is a top-to-bottom direction, the first boundary is the lower boundary of the image to be matched; if the first moving direction is a bottom-to-top direction, the first boundary is the upper boundary of the image to be matched. If the second moving direction is a left-to-right direction, the second boundary is the right boundary of the image to be matched; if the second moving direction is a right-to-left direction, the second boundary is the left boundary of the image to be matched.
Specifically, the template image may be moved by a preset number of rows in the first moving direction and the step of calculating in parallel the similarity between the target template feature point and the corresponding target image points in the image to be matched according to the feature information may then be repeated, until the template image has moved to coincide with the first boundary of the image to be matched; or the template image may be moved by a preset number of columns in the second moving direction and the same step repeated, until the template image has moved to coincide with the second boundary of the image to be matched.
In this embodiment, each template feature point may be sequentially acquired with the first template feature point in the template image as a starting point according to a row-column distribution order of the template feature points of the template image, for determining the target matching area. Referring specifically to fig. 4, fig. 4 is another flowchart of a template matching method for an image according to an embodiment of the present invention, as shown in fig. 4, where the method includes:
S401, acquiring a template image and an image to be matched, and setting k as 1.
S402, obtaining feature information of a kth template feature point of the template image.
S403, according to the feature information, calculating in parallel the similarity between the kth template feature point and each of a plurality of corresponding target image points in the image to be matched.
The target image points are image points in the same row of the image to be matched, with two adjacent target image points separated by a preset number of columns. The preset number of columns may be set according to the actual matching conditions and is not specifically limited herein.
S404, accumulating the similarity between each target image point and the kth template feature point into the comprehensive similarity corresponding to the region to be matched where that target image point is located.
The region to be matched where each target image point is located is a region corresponding to the template image in the image to be matched when the target image point coincides with the kth template feature point.
S405, incrementing k by 1, and returning to the step of acquiring the feature information of the kth template feature point of the template image, until k is greater than the number of template feature points.
S406, moving the template image by a preset number of rows in a first moving direction, and returning to the step of setting k to 1, until the template image has moved to coincide with the first boundary of the image to be matched.
The first boundary is the boundary of the image to be matched corresponding to the first moving direction. The first moving direction may be set to a vertically upward or vertically downward direction according to the actual matching conditions. The preset number of rows may be set to 1, 2, or 3 according to the actual matching conditions and is not specifically limited herein.
S407, determining a target matching area from each area to be matched according to the comprehensive similarity corresponding to each area to be matched.
Specifically, the image region to be matched, whose corresponding comprehensive similarity is greater than a preset similarity threshold, may be selected as the target matching region. The preset similarity threshold may be specifically set according to an actual application scenario.
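The fig. 4 flow (S401-S407) can likewise be sketched. Function and parameter names, the absolute-difference similarity, and the single-threaded loop standing in for the parallel computation are all assumptions for illustration:

```python
# Sketch of the fig. 4 flow: for each vertical position of the template
# (S406, shifted by row_step rows), loop over template feature points
# (S402/S405, the k index), and for each feature point score the target
# image points in the same row, spaced col_step columns apart (S403),
# accumulating into per-region comprehensive similarities (S404).

def match_rows(image, template_points, t_rows, t_cols,
               col_step=1, row_step=1):
    """image: 2-D list of pixel values (0-255); template_points: list of
    ((row_off, col_off), value) pairs; returns {(top, left): score}."""
    rows, cols = len(image), len(image[0])
    scores = {}
    top = 0
    while top + t_rows <= rows:                        # S406: shift down
        for (r_off, c_off), t_val in template_points:  # S402/S405: kth point
            # S403: these horizontally-spaced similarities are the ones a
            # real implementation computes in parallel.
            for left in range(0, cols - t_cols + 1, col_step):
                pixel = image[top + r_off][left + c_off]
                sim = 1.0 - abs(pixel - t_val) / 255.0
                scores[(top, left)] = scores.get((top, left), 0.0) + sim
        top += row_step
    return scores
```

A region whose accumulated score exceeds the preset similarity threshold would then be selected as a target matching region (S407).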
With the method provided by the embodiment of the invention, each time the feature information of the kth template feature point of the template image is obtained, the similarity between the kth template feature point and a plurality of corresponding target image points in the image to be matched can be calculated in parallel according to that feature information; that is, the feature information of one template feature point, once obtained, can be matched simultaneously against the information of the corresponding target image points in a plurality of different image regions of the image to be matched. Therefore, the data of the image points in the same row of the image to be matched can be loaded from memory according to the preset number of columns, and when the similarities between the kth template feature point and the corresponding target image points are calculated, the template feature information does not need to be fetched one by one: the similarities between the kth template feature point and the loaded target image points, spaced the preset number of columns apart, can be computed simultaneously; that is, the similarities between the kth template feature point and the corresponding target image points in the image to be matched are computed in parallel, which effectively improves the template matching speed.
In one possible embodiment, the feature information of each template feature point includes a template offset value and image feature information of the template feature point. Calculating in parallel the similarity between the target template feature point and the corresponding target image points in the image to be matched according to the feature information may include the following steps C1 and C2:
Step C1, determining a plurality of target image points corresponding to the target template feature point in the image to be matched, according to the template offset value of the target template feature point, the position information in the image to be matched of the image point corresponding to the reference point of the template image, the column width of the image to be matched, and the preset number of columns or rows.
The template offset value represents the offset of the template feature point relative to the reference point of the template image, and may specifically include the horizontal offset and the vertical offset of the template feature point relative to that reference point. The reference point of the template image may be set by the user in practical applications. For example, for the template image 101 shown in fig. 1, the reference point may be set as the image point at the upper-left corner of the template image 101; an image point may specifically be a pixel of the image. The horizontal offset of template feature point 1 relative to the reference point is then 2 and its vertical offset is 1, so the offset value of template feature point 1 may be represented as [1,2]; likewise, the horizontal offset of template feature point 18 relative to the reference point is 6 and its vertical offset is 2, so the offset value of template feature point 18 may be represented as [2,6].
The image feature information of the template feature point includes at least one of pixel information, luminance information, chromaticity information, gradient information, and position information of the template feature point.
Fig. 5 is a schematic diagram of matching a template image with an image to be matched according to an embodiment of the present invention. As shown in fig. 5, if the image point at the upper-left corner of the template image 501 is set as the reference point, the image point corresponding to the reference point in the image 502 to be matched is the image point in row 0, column 1 of the image 502 to be matched. The position information in the image to be matched of the image point corresponding to the reference point of the template image may specifically be the position coordinates of that image point in the image to be matched.
In one possible implementation, this step may determine the target image point currently corresponding to the target template feature point in the image to be matched, according to the template offset value of the target template feature point, the position information in the image to be matched of the image point corresponding to the reference point of the template image, and the column width or row width of the image to be matched, using the following formula:
p=i*w+j+offset
where p is the target image point currently corresponding to the target template feature point in the image to be matched, i is the row index in the image to be matched of the image point corresponding to the reference point of the template image, w is the column width or row width of the image to be matched, j is the column index of that image point, and offset is the template offset value of the target template feature point. If the target template feature points are acquired from the template image row by row, w is the column width of the image to be matched; if they are acquired column by column, w is the row width of the image to be matched.
For example, as shown in fig. 5, the image 502 to be matched contains 14 rows by 18 columns of image points. The image point at the upper-left corner of the template image 501 may be set as the reference point, and the image point corresponding to the reference point in the image 502 to be matched is the image point a2. If the target template feature point is template feature point 1 among the template feature points 0-21 of the template image 501, its offset value is [1,2]; the position coordinates of the image point a2 in the image 502 to be matched are (0, 1), i.e. its row index is 0 and its column index is 1; and the column width of the image 502 to be matched is 18. Substituting this information into the formula gives p = i*w + j + offset = 0*18 + 1 + (1*18 + 2) = (0+1)*18 + (1+2) = [1,3], so the target image point corresponding to the target template feature point in the image 502 to be matched is the image point in row 1, column 3. The row and column indices in the image to be matched start from 0.
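The address calculation above can be written out directly. Here the template offset is stored in linearized form r_off*w + c_off, which matches the worked example, and the function name is hypothetical:

```python
# Linearized address of the target image point: with the reference point at
# row i, column j of the image to be matched and width w, a template offset
# of (r_off, c_off) linearizes to r_off*w + c_off, so that
# p = i*w + j + offset. divmod unpacks p back into (row, column) indices.

def target_point_index(i, j, w, r_off, c_off):
    offset = r_off * w + c_off  # linearized template offset value
    p = i * w + j + offset      # p = i*w + j + offset
    return divmod(p, w)         # (row, column), both 0-based
```

For the worked example in the text (i=0, j=1, w=18, offset [1,2]) this yields row 1, column 3, matching the result above.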
Then the template image is moved by the preset number of columns in a third moving direction, and the step of determining the target image point currently corresponding to the target template feature point in the image to be matched, using the above formula with the template offset value of the target template feature point, the position information in the image to be matched of the image point corresponding to the reference point of the template image, and the column width of the image to be matched, is repeated until the template image has moved to coincide with a third boundary of the image to be matched. The third boundary is the boundary of the image to be matched corresponding to the third moving direction.
The third movement direction may be set to a horizontal right direction or a horizontal left direction along the image line of the image to be matched according to the actual matching situation. The preset number of columns may be set to 1,2, or 3 according to the actual matching situation, which is not specifically limited herein.
For example, as shown in fig. 5, if the preset number of columns is set to 1, the third moving direction is set to the horizontally rightward direction along the image rows of the image 502 to be matched, and the target template feature point is template feature point 0, then after determining that the target image point currently corresponding to template feature point 0 in the image 502 to be matched is the image point a5, the template image 501 may be moved by 1 column in the third moving direction and the above step repeated to determine the next target image point currently corresponding to template feature point 0 in the image 502 to be matched, namely the image point a6. This continues until the right boundary of the template image 501 has moved to coincide with the third boundary of the image to be matched, yielding the target image points corresponding to template feature point 0 in the image 502 to be matched, namely the image points a5-a15.
For a further example, still taking fig. 5 as an example, the image point at the upper-left corner of the template image 501 is set as the reference point, and the image point corresponding to the reference point in the image 502 to be matched is the image point a2. If the target template feature point is template feature point 1 among the template feature points 0-21 of the template image 501, its offset value is [1,2]; the position coordinates of the image point a2 in the image 502 to be matched are (0, 1), i.e. its row index is 0 and its column index is 1; and the row width of the image 502 to be matched is 14. Substituting this information into the formula gives p = i*w + j + offset = 0*14 + 1 + (1*14 + 2) = (0+1)*14 + (1+2) = [1,3], so the target image point corresponding to the target template feature point in the image 502 to be matched is the image point in row 1, column 3. The row and column indices in the image to be matched start from 0.
Then the template image is moved by the preset number of rows in a fourth moving direction, and the step of determining the target image point currently corresponding to the target template feature point in the image to be matched, using the above formula with the template offset value of the target template feature point, the position information in the image to be matched of the image point corresponding to the reference point of the template image, and the row width of the image to be matched, is repeated until the template image has moved to coincide with a fourth boundary of the image to be matched. The fourth boundary is the boundary of the image to be matched corresponding to the fourth moving direction.
The fourth moving direction may be set to a vertically upward direction or a vertically downward direction along the image column of the image to be matched according to the actual matching situation. The preset number of lines may be set to 1,2, or 3 according to the actual matching condition, which is not specifically limited herein.
Taking fig. 5 as an example, if the preset number of rows is set to 1, the fourth moving direction is set to the top-to-bottom direction along the image columns of the image 502 to be matched, and the target template feature point is template feature point 3, then after determining that the target image point currently corresponding to template feature point 3 in the image 502 to be matched is the image point b1, the template image 501 may be moved by 1 row in the fourth moving direction and the above step repeated to determine the next target image point currently corresponding to template feature point 3 in the image 502 to be matched, namely the image point c1. This continues until the lower boundary of the template image 501 has moved to coincide with the fourth boundary of the image to be matched, yielding the target image points corresponding to template feature point 3 in the image 502 to be matched, namely the image points b1, c1, d1, e1, f1, g1, h1 and k1.
Step C2, calculating in parallel the similarity between each target image point and the target template feature point, according to the image feature information of the target template feature point and the image information of the target image points.
For example, taking fig. 5 as an example, if the template image is moved in the third moving direction with the preset number of columns set to 1, the third moving direction set to the horizontally rightward direction along the image rows of the image 502 to be matched, and the target template feature point being template feature point 0, the target image points corresponding to template feature point 0 are the image points a5-a15 in the image 502 to be matched. In the embodiment of the invention, the similarities between the image points a5-a15 and the target template feature point can be calculated directly in parallel.
Still taking fig. 5 as an example, if the template image is moved in the fourth moving direction with the preset number of rows set to 1, the fourth moving direction set to the top-to-bottom direction along the image columns of the image 502 to be matched, and the target template feature point being template feature point 3, the target image points corresponding to template feature point 3 are the image points b1, c1, d1, e1, f1, g1, h1 and k1 in the image 502 to be matched. In the embodiment of the invention, the similarities between these image points and the target template feature point can be calculated directly in parallel.
The image information of each target image point corresponding to the target template feature point in the image to be matched comprises at least one of pixel information, brightness information, chromaticity information, gradient information and position information of the target image point.
Specifically, this step may calculate any one of the cosine similarity, the Euclidean distance and the Manhattan distance between each target image point corresponding to the target template feature point in the image to be matched and the target template feature point, and use it as the similarity. Alternatively, at least two of the cosine similarity, the Euclidean distance and the Manhattan distance between each corresponding target image point and the target template feature point may be calculated and averaged, with the average taken as the similarity. Of course, in the embodiment of the invention, other similarity calculation methods may also be adopted to calculate the similarity between each corresponding target image point and the target template feature point from the image feature information of the target template feature point and the image information of the target image points, which is not specifically limited herein.
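The three measures named above can be computed as follows for two feature vectors, which might hold, for example, pixel, luminance, chromaticity or gradient values as described earlier. This is a generic sketch with illustrative names, not code from the patent:

```python
import math

# Cosine similarity, Euclidean distance and Manhattan distance between the
# feature vector of a template feature point (a) and the information vector
# of a target image point (b).

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def euclidean_distance(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def manhattan_distance(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))
```

Note that cosine similarity grows with similarity while the two distances shrink, so a comprehensive score averaging them (as the text suggests) would need to map them onto a common scale first.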
The following embodiment is a specific example of template matching of the template image 501 and the image 502 to be matched shown in fig. 5:
As shown in fig. 5, rows 0 to 13 of the image 502 to be matched contain, in order, the image points a1-a18, the image points b1-b18, and so on through the image points n1-n18. The reference point of the template image 501 may be set as the image point at its upper-left corner, the third moving direction may be set as the horizontally rightward direction from image point a1 toward a18, and the third boundary is the boundary of the image 502 to be matched corresponding to the third moving direction. The first moving direction may be set as the vertically downward direction from image point a1 toward n1, and the first boundary is the boundary of the image 502 to be matched corresponding to the first moving direction. The preset number of columns and the preset number of rows are both set to 1.
In this example, when the matching process starts, the image point in the image to be matched corresponding to the reference point of the template image 501 is a2. The feature information of the 1st template feature point (template feature point 0) can be obtained from the memory, the target image points (a5-a15) corresponding to the 1st template feature point in the image 502 to be matched are determined according to that feature information, and the similarity between the 1st template feature point and each of the image points a5-a15 is then calculated in parallel. The similarity between each target image point (a5-a15) and the 1st template feature point is accumulated to the comprehensive similarity corresponding to the region to be matched where that target image point is located, where the region to be matched where each target image point is located is the region corresponding to the template image 501 in the image 502 to be matched when that target image point coincides with the 1st template feature point. For example, when the image point a5 shown in fig. 5 coincides with the 1st template feature point, the region corresponding to the template image 501 in the current image 502 to be matched is the region to be matched where the image point a5 is located.
Alternatively in this example, when the matching process starts, the image point in the image to be matched corresponding to the reference point of the template image 501 is a2. The feature information of the 1st template feature point (template feature point 0) can be obtained from the memory, a preset number of target image points corresponding to the 1st template feature point in the image 502 to be matched are then determined in sequence according to that feature information, and the similarity between the 1st template feature point and the preset number of target image points is calculated in parallel. The preset number may be set to 8 or 10, etc., and is not specifically limited herein. The number of target image points whose similarities are calculated in parallel at a time does not exceed the preset number.
For example, if the preset number is set to 8, when the matching process starts, the image point in the image to be matched corresponding to the reference point of the template image 501 is a2. The feature information of the 1st template feature point (template feature point 0) can be obtained from the memory, 8 target image points (a5-a12) corresponding to the 1st template feature point in the image 502 to be matched are determined according to that feature information, and the similarity between the 1st template feature point and each of the image points a5-a12 is calculated in parallel. The similarity between each of the image points a5-a12 and the 1st template feature point is accumulated to the comprehensive similarity corresponding to the region to be matched where that image point is located. Then, the remaining target image points (a13-a15) corresponding to template feature point 0 in the image 502 to be matched are determined according to the feature information of template feature point 0, and the similarity between template feature point 0 and each of the image points a13-a15 is calculated in parallel. The similarity between each of the image points a13-a15 and template feature point 0 is accumulated to the comprehensive similarity corresponding to the region to be matched where that image point is located.
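The chunking described above — at most a preset number of target image points per parallel pass, with a smaller final group handling the remainder — can be sketched as follows (the helper name is hypothetical):

```python
def batches(target_points, preset_number):
    # Split the target image points into groups whose size never exceeds
    # the preset number; each group would have its similarities computed
    # in parallel, and the last (possibly smaller) group is the remainder.
    return [target_points[i:i + preset_number]
            for i in range(0, len(target_points), preset_number)]
```

With the 11 target points a5-a15 and a preset number of 8, this yields one batch of 8 (a5-a12) and one batch of 3 (a13-a15), matching the walkthrough above.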
Then, the feature information of the 2nd template feature point (template feature point 1) can be obtained in the same manner, and the similarity between the 2nd template feature point and its corresponding target image points in the image 502 to be matched is calculated according to the above flow to obtain the comprehensive similarity of each region to be matched after a new round of accumulation. The similarities between the 3rd to 22nd template feature points (template feature points 2 to 21) and their corresponding target image points in the image 502 to be matched are calculated in turn according to the same flow, yielding the final comprehensive similarity of each region to be matched after the similarities calculated for the 1st to 22nd template feature points have all been accumulated. After template feature points 0-21 of the template image 501 have all participated in the similarity calculation, the template image 501 is moved by 1 row in the first moving direction, so that the image point in the image 502 to be matched corresponding to the reference point of the template image 501 becomes b1, and the similarities between the 1st to 22nd template feature points and their corresponding target image points in the image 502 to be matched are again calculated in sequence according to the above flow, until the template image has moved to coincide with the first boundary of the image to be matched, giving the final comprehensive similarity of each region to be matched after all accumulations. Then, a target matching region can be selected from the regions to be matched according to the comprehensive similarity corresponding to each region to be matched.
Specifically, the region to be matched with the highest comprehensive similarity can be selected as the target matching region. For example, if the region corresponding to the template image 501 in the image to be matched 502 is the region to be matched with the highest comprehensive similarity, that region may be determined as the target matching region; that is, the region of the image to be matched covered by the template image 501 shown in fig. 5 is determined as the target matching region. The target matching region is the image region that finally corresponds to the template image.
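The accumulate-then-select flow of this example can be sketched end to end as follows. This is a minimal illustration, not the patent's implementation: all names are hypothetical, and negative absolute gray-value difference stands in for the similarity measure.

```python
def match_template(feature_points, image, step_cols=1):
    # feature_points: (row_offset, col_offset, value) triples inside the
    # template; each candidate region is identified by the position of the
    # template's top-left reference point in the image to be matched.
    h, w = len(image), len(image[0])
    th = 1 + max(r for r, _, _ in feature_points)
    tw = 1 + max(c for _, c, _ in feature_points)
    # one comprehensive-similarity accumulator per region to be matched
    scores = {(y, x): 0.0
              for y in range(h - th + 1)
              for x in range(0, w - tw + 1, step_cols)}
    for r, c, v in feature_points:       # one template feature point at a time
        for (y, x) in scores:            # its target image points, conceptually in parallel
            scores[(y, x)] += -abs(image[y + r][x + c] - v)
    # target matching region = region with the highest comprehensive similarity
    return max(scores, key=scores.get)
```

A region whose pixels exactly reproduce the feature-point values accumulates a score of 0, the maximum under this measure, and is returned as the target matching region.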
The following embodiment is another specific example of template matching of the template image 501 and the image 502 to be matched shown in fig. 5:
As shown in fig. 5, lines 0 to 13 of the image 502 to be matched include, in order, image points a1 to a18, image points b1 to b18, …, and image points n1 to n18. The reference point of the template image 501 may be set as the image point at its upper left corner; the fourth moving direction may be set as the vertical downward direction, moving from image point b1 toward n1, with the fourth boundary being the boundary of the image 502 to be matched corresponding to the fourth moving direction. The second moving direction may be set as the horizontal rightward direction, moving from image point a1 toward a18, with the second boundary being the boundary of the image 502 to be matched corresponding to the second moving direction. The preset column number and the preset row number are both set to 1.
In this example, when the matching process starts, the image point in the image to be matched corresponding to the reference point of the template image 501 is a2, and the feature information of the 1st template feature point (template feature point 0) can be obtained from the memory. According to that feature information, the target image points (a5, b5, …, h5) corresponding to the 1st template feature point in the image 502 to be matched are determined, and the similarity between the 1st template feature point and each of the image points a5, b5, …, h5 is then calculated in parallel. The similarity between each target image point (a5, b5, …, h5) and the 1st template feature point is accumulated to the comprehensive similarity corresponding to the region to be matched where that target image point is located, where the region to be matched where each target image point is located is the region corresponding to the template image 501 in the image 502 to be matched when that target image point coincides with the 1st template feature point. For example, when the image point a5 shown in fig. 5 coincides with the 1st template feature point, the region corresponding to the template image 501 in the current image 502 to be matched is the region to be matched where the image point a5 is located.
Alternatively in this example, when the matching process starts, the image point in the image to be matched corresponding to the reference point of the template image 501 is a2. The feature information of the 1st template feature point (template feature point 0) can be obtained from the memory, a preset number of target image points corresponding to the 1st template feature point in the image 502 to be matched are then determined in sequence according to that feature information, and the similarity between the 1st template feature point and the preset number of target image points is calculated in parallel, the number of target image points whose similarities are calculated in parallel at a time not exceeding the preset number.
For example, if the preset number is set to 6, when the matching process starts, the image point in the image to be matched corresponding to the reference point of the template image 501 is a2, and the feature information of the 1st template feature point (template feature point 0) can be obtained from the memory. According to that feature information, 6 target image points (a5, b5, …, f5) corresponding to the 1st template feature point in the image 502 to be matched are determined, and the similarity between the 1st template feature point and each of the image points a5, b5, …, f5 is calculated in parallel. The similarity between each of the image points a5, b5, …, f5 and the 1st template feature point is accumulated to the comprehensive similarity corresponding to the region to be matched where that image point is located. Then, the remaining target image points (g5 and h5) corresponding to template feature point 0 in the image 502 to be matched are determined according to the feature information of template feature point 0, and the similarity between template feature point 0 and each of the image points g5 and h5 is calculated in parallel. The similarity between each of the image points g5 and h5 and template feature point 0 is accumulated to the comprehensive similarity corresponding to the region to be matched where that image point is located.
Then, the feature information of the 2nd template feature point (template feature point 1) can be obtained in the same manner, and the similarity between the 2nd template feature point and its corresponding target image points in the image 502 to be matched is calculated according to the above flow to obtain the comprehensive similarity of each region to be matched after a new round of accumulation. The similarities between the 3rd to 22nd template feature points (template feature points 2 to 21) and their corresponding target image points in the image 502 to be matched are calculated in turn according to the same flow, yielding the final comprehensive similarity of each region to be matched after the similarities calculated for the 1st to 22nd template feature points have all been accumulated. After template feature points 0-21 of the template image 501 have all participated in the similarity calculation, the template image 501 is moved by 1 column in the second moving direction, so that the image point in the image 502 to be matched corresponding to the reference point of the template image 501 becomes a3, and the similarities between the 1st to 22nd template feature points and their corresponding target image points in the image 502 to be matched are again calculated in sequence according to the above flow, until the template image has moved to coincide with the second boundary of the image to be matched, giving the final comprehensive similarity of each region to be matched after all accumulations. Then, a target matching region can be selected from the regions to be matched according to the comprehensive similarity corresponding to each region to be matched.
Specifically, the region to be matched with the highest comprehensive similarity can be selected as the target matching region. For example, if the region corresponding to the template image 501 in the image to be matched 502 is the region to be matched with the highest comprehensive similarity, that region may be determined as the target matching region; that is, the region of the image to be matched covered by the template image 501 shown in fig. 5 is determined as the target matching region. The target matching region is the image region that finally corresponds to the template image.
In one possible implementation, before the similarities between the target template feature point and its corresponding target image points in the image to be matched are calculated in parallel according to the feature information, the method further comprises: loading, based on SIMD (Single Instruction, Multiple Data), the image information of the target image points corresponding to the target template feature point in the image to be matched from a cache.
In the embodiment of the invention, the image information of every image point in the line where the image point corresponding to the target template feature point in the image to be matched is located can be loaded from the memory into a vector register in advance, and simultaneously buffered in the cache. SIMD is a parallel processing method that performs multiple data operations with a single computer instruction, and the cache may be a CPU cache memory (Cache) or the like.
The calculating, in parallel and according to the feature information, of the similarity between the target template feature point and its corresponding target image points in the image to be matched specifically comprises: calculating, in parallel, the similarity between each target image point and the target template feature point based on the image information and the feature information.
Specifically, the image information of the target image points corresponding to the target template feature point in the image to be matched can be loaded from the cache into a vector register, and the vector register is then used to calculate the similarity between each target image point and the target template feature point in parallel.
In the embodiment of the invention, the template is moved by a preset number of columns along the third moving direction or by a preset number of rows along the fourth moving direction; since the preset column number and the preset row number are known, the image points corresponding to the target template feature points in the image to be matched can be obtained in advance when the matching process starts. Therefore, when the matching process starts, the image information of the target image points corresponding to a target template feature point in the image to be matched can be loaded from the cache into the vector register based on SIMD, and the similarity between the target template feature point and each target image point can then be calculated in parallel.
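A real implementation would use SIMD intrinsics (e.g. SSE/AVX or NEON loads into actual vector registers); the pure-Python sketch below only mirrors the data flow — load a whole row once, then apply one operation across all lanes. The names and the absolute-difference measure are assumptions of this sketch:

```python
def load_row(image, row):
    # stand-in for a SIMD load: the whole row enters a "vector register" once
    return list(image[row])

def parallel_similarity(feat_value, register, start, stop, step_cols):
    # one "instruction" over many lanes: the lanes are the target image
    # points spaced by the preset column number within the preloaded row
    return [-abs(p - feat_value) for p in register[start:stop:step_cols]]
```

Because the preset column number is known, the lane positions are predictable at load time, which is what makes the single load plus one parallel pass possible.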
For example, as shown in fig. 5, if the preset column number is set to 1 and it is determined that the similarities between template feature point 0 and the image points a5-a15 in the image 502 to be matched will be calculated in turn according to the preset column number, the image information of every image point in the line (line 0) where the image point corresponding to template feature point 0 in the image to be matched is located can be loaded from the cache based on SIMD when the matching process starts, and the similarities between template feature point 0 and the image points a5-a15 are then calculated in parallel. Alternatively, if the preset row number is set to 1 and it is determined that the similarities between template feature point 0 and the image points a5, b5, …, h5 in the image 502 to be matched will be calculated in turn, the image information of every image point in the column (column 5) where the image point corresponding to template feature point 0 in the image to be matched is located can be loaded from the cache based on SIMD when the matching process starts, and the similarities between template feature point 0 and the image points a5, b5, …, h5 are then calculated in parallel.
For another example, fig. 6 is a schematic diagram of performing template matching by a parallel data processing method according to an embodiment of the present invention. Part (1) of fig. 6 shows a template image 601, in which the distribution of template feature points 0-21 is discrete and disordered. As shown in the memory distribution diagram 602 of fig. 6, the distribution order of template feature points 0-21 in the memory is 0 to 21. The feature information of each template feature point can be taken out of the memory and placed into the cache; as shown in fig. 6, the feature information of template feature point 0 can be taken out of the memory and placed into the cache. When matching starts, the preset column number and the preset row number can both be set to 1. The target image points corresponding to template feature point 0 in the image 603 to be matched are all located in line 0 of the image 603 to be matched, and the similarities between template feature point 0 and image points 3-18 of line 0 in the image 603 to be matched are calculated in turn according to the preset column number. Therefore, when the matching process starts, the image information of every image point in line 0 of the image to be matched can first be loaded from the cache into a vector register based on SIMD, the feature information of template feature point 0 is also taken out of the cache and placed into the vector register, and the similarities between template feature point 0 and image points 3-18 of line 0 in the image 603 to be matched are calculated in parallel.
The vector register is the storage space closest to the CPU for temporarily holding data; when calculating the similarity between a template feature point and its corresponding target image points, the CPU exchanges data only with the vector register, so the data in the cache is fetched and temporarily stored in the vector register. The "op" in fig. 6 denotes the cache space pointing to the vector register.
Therefore, in this embodiment, for the image to be matched, whenever the similarities between a template feature point and its corresponding target image points are calculated in a cycle, the positions of the image points corresponding to the template feature point can be predicted from the preset column number and the preset row number, so the SIMD technique can be used for data-parallel processing, improving the template matching speed.
In one possible implementation, in order to further increase the template matching speed, this implementation improves the way the feature information of the template feature points of the template image is stored in the memory. Specifically, the storage order of the feature information of the template feature points in the memory may be made consistent with the row-and-column distribution order of those template feature points in the template image. Before the feature information of a target template feature point of the template image is obtained, the method may further comprise: obtaining the feature information of the target template feature point according to the storage order of the feature information of the template feature points of the template image in the memory.
Specifically, the feature information of each template feature point of the template image may be loaded from the memory to the cache according to the storage sequence of the feature information of each template feature point of the template image in the memory, and then after the matching process begins, the feature information of the target template feature point of the template image may be obtained from the cache.
Before this embodiment's improvement to the way the feature information is stored in the memory, the feature information was stored in the memory in the order of template feature points 0-21, as shown in the memory distribution diagram 602 of fig. 6. During template matching, the template feature information has to be taken out of the memory and placed into the cache in that memory order, and the caching order is consistent with the memory order. As a result, when the template feature information is fetched from the cache in sequence and matched against the image information of the image points of the image to be matched, its distribution positions on the template image are disordered. While traversing the current region to be matched and matching it against the template image, the image-point information of the image to be matched that was originally loaded into the cache may therefore be replaced by image-point information from distant positions, and when a replaced image point is traversed again, its image information must be reloaded from the memory into the cache. As shown in fig. 6, according to the memory storage order shown in the memory distribution diagram 602, when the region to be matched on the image to be matched is traversed for template feature points 3-6, the image information of the image points in lines 2-5 of the image to be matched can first be taken out of the memory in turn and placed into the cache.
When the region to be matched is then traversed for template feature point 9, the image information of the image points in lines 2-5 of the image to be matched may already occupy all of the cache space, so the image information of line 2 in the cache is replaced by the image information of line 6. Later, when the region to be matched is traversed for template feature point 19, the image information of line 2 has already been replaced and must be reloaded from the memory into the cache before the similarity between template feature point 19 and the corresponding image points of line 2 can be calculated. This phenomenon, in which the CPU cannot find the required data in the cache and the image information of image points has to be loaded repeatedly, is called a cache miss. The more disordered the distribution positions of the template feature information on the template image, the more severe the cache misses, which can considerably slow down template matching.
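The eviction-and-reload effect just described can be made concrete with a tiny LRU simulation; the capacity and row numbers follow the example above, and the helper name is hypothetical:

```python
def count_row_misses(rows_touched, capacity):
    # Simulate a cache that holds at most `capacity` image rows, with
    # least-recently-used (LRU) eviction.
    cache, misses = [], 0
    for row in rows_touched:
        if row in cache:
            cache.remove(row)          # hit: refresh its LRU position
        else:
            misses += 1                # miss: the row must be loaded from memory
            if len(cache) == capacity:
                cache.pop(0)           # evict the least recently used row
        cache.append(row)
    return misses
```

With a capacity of 4 rows, the disordered sequence [2, 3, 4, 5, 6, 2] incurs 6 misses (row 2 is evicted by row 6 and must be reloaded), while visiting the same rows in sorted, row-major order incurs only 5.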
Therefore, in order to reduce cache misses and further improve the template matching speed, this embodiment makes the storage order of the feature information of the template feature points in the memory consistent with the row-and-column distribution order of those template feature points in the template image. Fig. 7 is a schematic diagram of storing template feature information in a memory; the distribution of template feature points 0-21 in the template image may be as shown at 601 in fig. 6. As shown in fig. 7, the storage order of the feature information of the template feature points of the template image 601 in the memory is consistent with their row-and-column distribution order in the template image; that is, the feature information is stored in the memory in the order of the rows and columns in which the template feature points lie in the template image. After this improvement, the feature information of the template feature points of the template image 601 is loaded from the memory into the cache according to its storage order in the memory, and the caching order is consistent with the memory order; during template matching, the template feature information is likewise taken out of the memory and placed into the cache in that order.
With this improved storage scheme, when a region to be matched is matched against the template image, the probability that the image-point information of the image to be matched originally loaded into the cache is later replaced by image-point information from distant positions can be reduced, which reduces cache misses and improves the template matching speed. As shown in fig. 5, according to the memory storage order of the template feature information shown in fig. 7, the image to be matched may first be traversed for all template feature points of one row, in the order of the rows in which the template feature points are located, and then for the template feature points of the next row. For example, the image information of the image points in line 2 of the image 502 to be matched may first be taken out of the memory in turn and placed into the cache, and the image 502 to be matched may then be traversed in turn for template feature point 3, template feature point 2, template feature point 1, template feature point 21, template feature point 20 and template feature point 19, which lie in the same row. Following the row order, the image 502 to be matched is then traversed in turn for template feature point 4 and template feature point 18, which lie in the same row.
When the image 502 to be matched is traversed for template feature point 9, the image information of the image points in lines 2-5 of the image 502 to be matched may occupy all of the cache space, and the image information of line 2 in the cache is then replaced by the image information of line 6. However, the traversal for template feature point 19 has already been completed before the traversal for template feature point 9, because template feature point 19 precedes template feature point 9 in the improved storage order; the replaced image information of line 2 is therefore no longer needed, and the situation in which it must be reloaded from the memory into the cache is avoided. By making the storage order of the feature information of the template feature points in the memory consistent with their row-and-column distribution order in the template image, this embodiment reduces cache misses and further improves the template matching speed.
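The improved layout amounts to sorting the feature points by (row, column) before writing them to memory; a one-line sketch with hypothetical names (the labels f3/f19/f9 echo the example, where feature point 19 in row 2 comes before feature point 9 in a later row):

```python
def reorder_feature_points(feature_points):
    # feature_points: (row, col, feature_info) triples; storing them in
    # row-major order makes the memory order match the row-and-column
    # distribution order of the feature points on the template image
    return sorted(feature_points, key=lambda p: (p[0], p[1]))
```

The cache then holds the feature information in the same order it will be consumed during traversal, which is what keeps nearby image rows resident.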
The embodiment of the invention also provides a template matching device of the image corresponding to the template matching method of the image. The template matching device for the image provided by the embodiment of the invention is described below. As shown in fig. 8, a template matching apparatus of an image, the apparatus comprising:
an image acquisition module 801, configured to acquire a template image and an image to be matched;
The feature information obtaining module 802 is configured to obtain feature information of a target template feature point of the template image, where the target template feature point is a template feature point of the template image obtained according to a preset rule;
A similarity calculating module 803, configured to calculate, in parallel and according to the feature information, the similarity between a plurality of target image points corresponding to the target template feature points in the image to be matched and the target template feature points, where the plurality of target image points are image points in the same row or column of the image to be matched, and two adjacent target image points are spaced by a preset number of columns or rows;
a similarity accumulating module 804, configured to accumulate similarities between each target image point and the target template feature points to comprehensive similarities corresponding to a region to be matched where the target image point is located, where the region to be matched where each target image point is located is a region corresponding to the template image in the image to be matched when the target image point coincides with the target template feature points;
the target region determining module 805 is configured to determine a target matching region from the respective regions to be matched according to the comprehensive similarity corresponding to each region to be matched.
It can be seen that, with the apparatus provided by the embodiment of the invention, each time the feature information of a target template feature point of the template image is obtained, the similarities between that target template feature point and its corresponding target image points in the image to be matched can be calculated in parallel according to the feature information; that is, the feature information of one template feature point, once obtained, can be used to calculate similarities against the corresponding target image points in a plurality of different image regions of the image to be matched. Therefore, the data of the image points in the same row or column of the image to be matched can be loaded from the memory according to the preset column number or preset row number; when the similarities between the target template feature point and its corresponding target image points are calculated, the template feature information need not be acquired point by point, and the similarities between the target template feature point and the loaded target image points spaced by the preset column number or preset row number — that is, the target image points corresponding to a plurality of different regions of the image to be matched — can be calculated in parallel, which effectively improves the template matching speed.
Optionally, the apparatus further includes:
And a feature point acquiring module (not shown in the figure), configured to, before the target matching region is determined from the regions to be matched according to their comprehensive similarities, acquire the next target template feature point of the template image according to the preset rule and return to the step of acquiring the feature information of the target template feature point of the template image, until all template feature points of the template image have been acquired.
Optionally, the apparatus further includes:
And a moving module (not shown in the figure), configured to, after all template feature points of the template image have been acquired, move the template image by a preset number of rows in a first moving direction or by a preset number of columns in a second moving direction, and return to the step of calculating in parallel, according to the feature information, the similarities between the target template feature point and its multiple corresponding target image points in the image to be matched, until the template image moves to coincide with a first boundary or a second boundary of the image to be matched, where the first boundary is the boundary of the image to be matched corresponding to the first moving direction, and the second boundary is the boundary of the image to be matched corresponding to the second moving direction.
Optionally, the apparatus further includes:
An information loading module (not shown in the figure), configured to load, from a cache based on single-instruction multiple-data (SIMD) instructions, the image information of the multiple target image points corresponding to the target template feature point in the image to be matched;
The similarity calculating module 803 is specifically configured to calculate, in parallel, a similarity between each target image point and the target template feature point based on the image information and the feature information.
Optionally, the storage sequence of the feature information of each template feature point in the template image in the memory is consistent with the distribution sequence of the row and the column of each template feature point in the template image;
The feature information obtaining module 802 is specifically configured to obtain feature information of the feature points of the target template according to a storage order of feature information of feature points of each template of the template image in the memory.
Optionally, the feature information of each template feature point includes a template offset value and image feature information of the template feature point;
The similarity calculating module 803 is specifically configured to determine the multiple target image points corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, the column width of the image to be matched, and the preset number of columns or rows, and to calculate in parallel the similarity between each target image point and the target template feature point according to the image feature information of the target template feature point and the image information of the multiple target image points.
Optionally, the similarity calculating module 803 includes:
a similarity calculation submodule (not shown in the figure) is configured to determine, according to a template offset value of a target template feature point, position information of a corresponding image point in the image to be matched of a reference point of the template image, and a column width or a line width of the image to be matched, a current corresponding target image point in the image to be matched of the target template feature point by adopting the following formula:
p=i*w+j+offset
Wherein p is the target image point currently corresponding to the target template feature point in the image to be matched, i is the ordinal number of the row of the image point corresponding to the reference point of the template image in the image to be matched, w is the column width or row width of the image to be matched, j is the ordinal number of the column of the image point corresponding to the reference point of the template image in the image to be matched, and offset is the template offset value of the target template feature point;
a moving submodule (not shown in the figure), configured to move the template image by a preset number of columns in a third moving direction or by a preset number of rows in a fourth moving direction, and return to the step of determining, by the above formula, the target image point currently corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, and the column width or row width of the image to be matched, until the template image moves to coincide with a third boundary or a fourth boundary of the image to be matched, where the third boundary is the boundary of the image to be matched corresponding to the third moving direction, and the fourth boundary is the boundary of the image to be matched corresponding to the fourth moving direction.
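The index arithmetic of the formula p = i*w + j + offset assumes a row-major image buffer. A minimal sketch (function name and variable names are illustrative; the decomposition offset = vertical offset * w + horizontal offset follows claim 7):

```python
# Sketch of the linear-index formula p = i*w + j + offset for a
# row-major image buffer. offset packs a 2-D template displacement
# as dy*w + dx, per claim 7; all names here are illustrative.

def target_index(i, j, w, dy, dx):
    """Flat index of the image point that template displacement (dy, dx)
    maps to, when the template reference point sits at row i, column j
    of an image whose rows are w pixels wide."""
    offset = dy * w + dx   # template offset value of this feature point
    return i * w + j + offset

# Reference point at (row 2, col 3) of a 10-pixel-wide image; feature
# point 1 row down, 2 columns right of the reference point.
p = target_index(i=2, j=3, w=10, dy=1, dx=2)
print(p)  # 35, i.e. the flat index of pixel (row 3, col 5)
```

Because the offset is precomputed per template feature point, moving the reference point only changes i and j, so the stride between the target image points of neighboring candidate regions is constant, which is what enables the SIMD loads.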
With the apparatus provided by the embodiment of the present invention, each time the feature information of a target template feature point of the template image is acquired, the similarities between that feature point and its multiple corresponding target image points in the image to be matched can be calculated in parallel from the feature information; that is, the feature information of a single template feature point, once acquired, is used to score the corresponding target image points in multiple different image regions of the image to be matched. The data of the image points in one row of the image to be matched can therefore be loaded from memory at the preset column or row spacing, the template feature information need not be fetched point by point, and the similarities between the loaded target image points, spaced the preset number of columns or rows apart, and the target template feature point are calculated simultaneously. Since one template feature point is scored against target image points in multiple different regions in parallel, the template matching speed is effectively improved.
In addition, the embodiment of the present invention improves the way the feature information of the template feature points of the template image is stored in memory, so that the storage order of the feature information is consistent with the row-and-column distribution order of the template feature points in the template image, thereby reducing cache misses and further improving the template matching speed.
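The storage-order improvement can be mimicked in a few lines. This is an illustrative sketch only (the record layout is an assumption): ordering the feature records by (row, column) makes the matching loop read memory sequentially instead of jumping around, which is what reduces cache misses.

```python
# Sketch: store feature-point records in the same row-major order in
# which matching will read them, so consecutive reads touch consecutive
# memory (fewer cache misses). The dict record layout is illustrative.

def row_major_layout(feature_points):
    """feature_points: list of dicts with 'row' and 'col' keys
    (plus whatever per-point payload the matcher needs)."""
    return sorted(feature_points, key=lambda fp: (fp["row"], fp["col"]))

pts = [{"row": 1, "col": 4}, {"row": 0, "col": 2}, {"row": 1, "col": 0}]
ordered = row_major_layout(pts)
print([(fp["row"], fp["col"]) for fp in ordered])  # [(0, 2), (1, 0), (1, 4)]
```

Done once when the template is built, this reordering costs nothing at match time.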
The embodiment of the present invention also provides an electronic device, as shown in fig. 9, including a processor 901, a communication interface 902, a memory 903, and a communication bus 904, where the processor 901, the communication interface 902, and the memory 903 communicate with one another through the communication bus 904;
A memory 903 for storing a computer program;
A processor 901, configured to implement the steps of any of the above image template matching methods when executing the program stored in the memory 903.
The communication bus of the electronic device may be a Peripheral Component Interconnect (PCI) bus, an Extended Industry Standard Architecture (EISA) bus, or the like. The communication bus may be divided into an address bus, a data bus, a control bus, and so on. For ease of illustration, only one bold line is shown in the figure, but this does not mean that there is only one bus or one type of bus.
The communication interface is used for communication between the electronic device and other devices.
The memory may include a random access memory (RAM), or may include a non-volatile memory (NVM), such as at least one disk memory. Optionally, the memory may also be at least one storage device located remotely from the aforementioned processor.
The processor may be a general-purpose processor, including a central processing unit (CPU), a network processor (NP), etc.; it may also be a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
In yet another embodiment of the present invention, there is also provided a computer-readable storage medium having a computer program stored therein, which, when executed by a processor, implements the steps of any of the above image template matching methods.
In yet another embodiment of the present invention, there is also provided a computer program product containing instructions that, when run on a computer, cause the computer to perform any of the image template matching methods of the above embodiments.
In the above embodiments, the implementation may be realized in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, it may be realized in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the flows or functions according to the embodiments of the present invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another, for example by wire (e.g., coaxial cable, optical fiber, digital subscriber line (DSL)) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium accessible to a computer, or a data storage device such as a server or data center that integrates one or more available media. The available medium may be a magnetic medium (e.g., floppy disk, hard disk, magnetic tape), an optical medium (e.g., DVD), or a semiconductor medium (e.g., a solid state disk (SSD)), etc.
It is noted that relational terms such as first and second are used herein solely to distinguish one entity or action from another, and do not necessarily require or imply any actual such relationship or order between such entities or actions. Moreover, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
In this specification, each embodiment is described in a related manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments. In particular, for system, electronic device, computer readable storage medium and computer program product embodiments, the description is relatively simple as it is substantially similar to the method embodiments, as relevant points are found in the partial description of the method embodiments.
The foregoing description is only of the preferred embodiments of the present invention and is not intended to limit the scope of the present invention. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present invention are included in the protection scope of the present invention.

Claims (10)

1. A template matching method for an image, comprising:

acquiring a template image and an image to be matched;

acquiring feature information of a target template feature point of the template image, the target template feature point being a template feature point of the template image acquired according to a preset rule;

for each target template feature point, calculating in parallel, according to the feature information of the target template feature point, similarities between a plurality of target image points corresponding to the target template feature point in the image to be matched and the target template feature point, wherein the plurality of target image points are image points in a same row or column of the image to be matched, and two adjacent target image points are separated by a preset number of columns or rows;

accumulating the similarity between each target image point and the target template feature point into a comprehensive similarity corresponding to the region to be matched in which that target image point is located, wherein the region to be matched in which each target image point is located is: the region of the image to be matched corresponding to the template image when that target image point coincides with the target template feature point; and

determining a target matching region from the respective regions to be matched according to the comprehensive similarity corresponding to each region to be matched.

2. The method according to claim 1, wherein before determining the target matching region from the respective regions to be matched according to the comprehensive similarity corresponding to each region to be matched, the method further comprises:

acquiring a next target template feature point of the template image according to the preset rule, and returning to the step of acquiring the feature information of the target template feature point of the template image, until all template feature points of the template image have been acquired.

3. The method according to claim 2, wherein after all template feature points of the template image have been acquired, the method further comprises:

moving the template image by a preset number of rows in a first moving direction or by a preset number of columns in a second moving direction, and returning to the step of calculating in parallel, according to the feature information, the similarities between the plurality of target image points corresponding to the target template feature point in the image to be matched and the target template feature point, until the template image moves to coincide with a first boundary or a second boundary of the image to be matched;

wherein the first boundary is the boundary of the image to be matched corresponding to the first moving direction, and the second boundary is the boundary of the image to be matched corresponding to the second moving direction.

4. The method according to any one of claims 1-3, wherein before calculating in parallel, for each target template feature point, the similarities between the plurality of target image points corresponding to the target template feature point in the image to be matched and the target template feature point according to the feature information of the target template feature point, the method further comprises:

for each target template feature point, loading, from a cache based on single-instruction multiple-data (SIMD) instructions, image information of the plurality of target image points corresponding to the target template feature point in the image to be matched;

and the calculating in parallel, for each target template feature point, the similarities between the plurality of target image points corresponding to the target template feature point in the image to be matched and the target template feature point according to the feature information of the target template feature point comprises:

for each target template feature point, calculating in parallel, based on the image information and the feature information of the target template feature point, the similarity between each target image point and the target template feature point.

5. The method according to any one of claims 1-3, wherein the order in which the feature information of the template feature points of the template image is stored in memory is consistent with the row-and-column distribution order of the template feature points in the template image;

and the acquiring the feature information of the target template feature point of the template image comprises:

acquiring the feature information of the target template feature point according to the storage order, in memory, of the feature information of the template feature points of the template image.

6. The method according to any one of claims 1-3, wherein the feature information of each template feature point includes a template offset value and image feature information of the template feature point;

and the calculating in parallel, for each target template feature point, the similarities between the plurality of target image points corresponding to the target template feature point in the image to be matched and the target template feature point according to the feature information of the target template feature point comprises:

for each target template feature point, determining the plurality of target image points corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, position information of the image point corresponding to a reference point of the template image in the image to be matched, a column width of the image to be matched, and the preset number of columns or rows; and

calculating in parallel, according to the image feature information of the target template feature point and the image information of the plurality of target image points, the similarity between each target image point and the target template feature point.

7. The method according to claim 6, wherein the determining the plurality of target image points corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, the column width of the image to be matched, and the preset number of columns or rows comprises:

determining, according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, and the column width or row width of the image to be matched, the target image point currently corresponding to the target template feature point in the image to be matched by the following formula:

p=i*w+j+offset

wherein p is the target image point currently corresponding to the target template feature point in the image to be matched, i is the ordinal number of the row of the image point corresponding to the reference point of the template image in the image to be matched, w is the column width or row width of the image to be matched, j is the ordinal number of the column of the image point corresponding to the reference point of the template image in the image to be matched, and offset is the template offset value of the target template feature point, specifically expressed as vertical offset * w + horizontal offset; and

moving the template image by a preset number of columns in a third moving direction or by a preset number of rows in a fourth moving direction, and returning to the step of determining, by the above formula, the target image point currently corresponding to the target template feature point in the image to be matched according to the template offset value of the target template feature point, the position information of the image point corresponding to the reference point of the template image in the image to be matched, and the column width or row width of the image to be matched, until the template image moves to coincide with a third boundary or a fourth boundary of the image to be matched, wherein the third boundary is the boundary of the image to be matched corresponding to the third moving direction, and the fourth boundary is the boundary of the image to be matched corresponding to the fourth moving direction.

8. A template matching apparatus for an image, comprising:

an image acquiring module, configured to acquire a template image and an image to be matched;

a feature information acquiring module, configured to acquire feature information of a target template feature point of the template image, the target template feature point being a template feature point of the template image acquired according to a preset rule;

a similarity calculating module, configured to, for each target template feature point, calculate in parallel, according to the feature information of the target template feature point, similarities between a plurality of target image points corresponding to the target template feature point in the image to be matched and the target template feature point, wherein the plurality of target image points are image points in a same row or column of the image to be matched, and two adjacent target image points are separated by a preset number of columns or rows;

a similarity accumulating module, configured to accumulate the similarity between each target image point and the target template feature point into a comprehensive similarity corresponding to the region to be matched in which that target image point is located, wherein the region to be matched in which each target image point is located is: the region of the image to be matched corresponding to the template image when that target image point coincides with the target template feature point; and

a target region determining module, configured to determine a target matching region from the respective regions to be matched according to the comprehensive similarity corresponding to each region to be matched.

9. An electronic device, comprising a processor, a communication interface, a memory, and a communication bus, wherein the processor, the communication interface, and the memory communicate with one another through the communication bus;

the memory is configured to store a computer program; and

the processor is configured to implement the method steps of any one of claims 1-7 when executing the program stored in the memory.

10. A computer-readable storage medium, wherein a computer program is stored in the computer-readable storage medium, and when the computer program is executed by a processor, the method steps of any one of claims 1-7 are implemented.
CN202210590912.3A 2022-05-27 2022-05-27 Image template matching method, device, electronic device and storage medium Active CN114926670B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210590912.3A CN114926670B (en) 2022-05-27 2022-05-27 Image template matching method, device, electronic device and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210590912.3A CN114926670B (en) 2022-05-27 2022-05-27 Image template matching method, device, electronic device and storage medium

Publications (2)

Publication Number Publication Date
CN114926670A CN114926670A (en) 2022-08-19
CN114926670B true CN114926670B (en) 2025-10-03

Family

ID=82809836

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210590912.3A Active CN114926670B (en) 2022-05-27 2022-05-27 Image template matching method, device, electronic device and storage medium

Country Status (1)

Country Link
CN (1) CN114926670B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110473238A (en) * 2019-06-25 2019-11-19 浙江大华技术股份有限公司 A kind of method for registering images, device, electronic equipment and storage medium
CN112861983A (en) * 2021-02-24 2021-05-28 广东拓斯达科技股份有限公司 Image matching method, image matching device, electronic equipment and storage medium

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101677561B1 (en) * 2010-12-08 2016-11-18 한국전자통신연구원 Image registration device and image registration method thereof
KR102548722B1 (en) * 2018-05-28 2023-06-27 삼성에스디에스 주식회사 Method for matching template and apparatus thereof
US11354883B2 (en) * 2019-12-30 2022-06-07 Sensetime International Pte. Ltd. Image processing method and apparatus, and electronic device
CN112508037B (en) * 2020-11-23 2024-04-02 北京配天技术有限公司 Image template matching method and device and storage device
CN112926695B (en) * 2021-04-16 2024-05-24 动员(北京)人工智能技术研究院有限公司 Image recognition method and system based on template matching
CN113139626B (en) * 2021-06-21 2021-10-15 浙江华睿科技股份有限公司 Template matching method and device, electronic equipment and computer-readable storage medium

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110473238A (en) * 2019-06-25 2019-11-19 浙江大华技术股份有限公司 A kind of method for registering images, device, electronic equipment and storage medium
CN112861983A (en) * 2021-02-24 2021-05-28 广东拓斯达科技股份有限公司 Image matching method, image matching device, electronic equipment and storage medium

Also Published As

Publication number Publication date
CN114926670A (en) 2022-08-19

Similar Documents

Publication Publication Date Title
US9563562B2 (en) Page crossing prefetches
KR102410348B1 (en) Object tagged memory monitoring method and processing device
US9582422B2 (en) Hardware prefetcher for indirect access patterns
CN118171612B (en) Method, device, storage medium and program product for optimizing instruction cache
CN110941989A (en) Image verification method, image verification device, video verification method, video verification device, equipment and storage medium
KR20190038835A (en) Data cache area prefetcher
CN111009001A (en) Image registration method, device, equipment and storage medium
CN115495394A (en) Data prefetching method and data prefetching device
CN113468469A (en) Convolution processing method and device of feature graph executed by computer and electronic equipment
US9697127B2 (en) Semiconductor device for controlling prefetch operation
CN111091572A (en) Image processing method and device, electronic equipment and storage medium
CN118567580B (en) Tensor memory handling method, tensor memory handling device, storage medium and program product
CN112634316A (en) Target tracking method, device, equipment and storage medium
CN105359142B (en) Hash connection method and device
CN114168318A (en) Training method for storage release model, storage release method and device
CN114926670B (en) Image template matching method, device, electronic device and storage medium
CN106649143B (en) Cache access method and device and electronic equipment
US20200327638A1 (en) Connected component detection method, circuit, device and computer-readable storage medium
US12174743B2 (en) Memory access
CN113920337B (en) Ship template matching acceleration method and system based on DSP
CN108762811B (en) A method for obtaining out-of-order memory access behavior patterns of applications based on clustering
CN112233153B (en) Image matching method, device, electronic equipment and storage medium
JP5778983B2 (en) Data processing apparatus, data processing apparatus control method, and program
US10025717B1 (en) Multi-dimensional prefetching
CN113900805A (en) A kind of image processing method and chip based on fixed orientation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 310051 room 304, B / F, building 2, 399 Danfeng Road, Binjiang District, Hangzhou City, Zhejiang Province

Applicant after: Hangzhou Hikvision Robot Co.,Ltd.

Address before: 310051 room 304, B / F, building 2, 399 Danfeng Road, Binjiang District, Hangzhou City, Zhejiang Province

Applicant before: HANGZHOU HIKROBOT TECHNOLOGY Co.,Ltd.

GR01 Patent grant