Image sample library feature representation method based on gray-level distribution statistics
Technical field
The present invention relates to the technical field of image processing, and in particular to an image sample library feature representation method based on gray-level distribution statistics.
Background technology
With the development of machine vision theory and technology, recognizing and understanding image content has become a research focus with a broad application market. In the intelligent transportation field in particular, the demand for recognizing images such as car logos, road signs and traffic signals grows daily. At present, image recognition is generally based on supervised learning, which requires building an image sample library suited to the application and training on it. The key to building a sample library lies in describing, extracting and representing sample features. Feature description expresses image characteristics with image elements such as points, edges, colors and textures; feature extraction applies image processing methods to extract those descriptive elements from the image; feature representation organizes and defines the extracted elements in a formalized way that a computer can process. The feature representation is thus the formalized result of description and extraction, and the image is recognized through it.
Common types of image feature representation include:
1) Statistical features: the color (gray-level) histogram and image moments are the most commonly used statistical image features. The histogram is simple to use but cannot express deeper image information. Image moments include Hu moments, Zernike moments, wavelet moments and so on; they can describe both global and local features, but the computation is heavy and a clear image is required.
2) Point and edge features: feature points are points of stable properties in the image, such as Harris corners and SIFT or SURF feature points. Edges consist of pixels at gray-level and gradient transitions, and can be extracted with classic algorithms such as Canny. However, point and edge features are unsuitable for blurred, low-resolution images.
3) Texture features: texture is another important visual characteristic of an image; its structure reflects the spatial variation of image brightness and exhibits local and global self-similarity. Many texture analysis methods exist, such as the spatial autocorrelation method, the co-occurrence matrix method and the Tamura method, and they are very effective for texture-rich images.
4) Transform-domain features: the image undergoes a mathematical transform, and the transform-domain coefficients serve as image features. Examples include the wavelet, curvelet, Fourier and Hough transforms. These transforms generally require the image to reach a certain resolution.
5) Algebraic features: an image can be represented as a matrix, and algebraic features are extracted from the image matrix using matrix theory, e.g. PCA, LDA, ICA and SVD. Such representations, however, carry large errors for low-resolution images.
Specific images such as car logos, road signs and traffic signals are affected by acquisition conditions and the environment: resolution is low, they are easily soiled, and illumination effects are strong. Such sample images essentially retain only a blurred overall shape structure, from which stable point, edge and texture features are hard to extract. The methods above are therefore not well suited.
Summary of the invention
The object of the invention is to address the deficiencies of the prior art by proposing an image sample library feature representation method based on gray-level distribution statistics. It provides the feature representation needed to build a sample library for specific images, solving feature extraction and representation when sample images have low resolution, are easily soiled, and suffer strong illumination effects. Gray-level relation information is extracted at a large number of relative positions in the sample images, and through comparisons within each class and between classes, the point pair relation sets that statistically reflect the gray-level distribution characteristics of a class are filtered out as its feature representation.
The technical scheme adopted by the invention to solve the technical problem is as follows:
Step 1: For the sample images of each type, select N0 position point pairs. According to the size and structural features of that class of sample image, choose within the sample image extent a position point pair set Pair_c:

Pair_c = { Pair_c,i<P1, P2>, i = 1, 2, ..., N0 }

where the subscript c = 1, 2, ..., C, with C the total number of sample image types in the library; i denotes the i-th position point pair of the class-c sample images, consisting of the two position points P1 = (x1, y1) and P2 = (x2, y2); x, y are relative values after normalizing the sample image plane coordinates; N0 is a natural number.
The position point pairs may be chosen manually or automatically; both modes follow these principles:
1) P1 and P2 must not be adjacent;
2) between any two position point pairs Pair_c,i<P1, P2> and Pair_c,j<P1', P2'>, Pk and Pk' (k = 1, 2) must not both be adjacent at the same time;
3) in at least 20% of the point pairs, one of P1 and P2 lies on a salient structure reflecting the image characteristics and the other falls on the background; this can be satisfied through manual selection;
4) the position points are uniformly distributed over the image plane.
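The automatic selection mode above can be sketched as follows. This is a minimal illustration under the assumption that "adjacent" means a normalized Manhattan distance below a hypothetical `min_dist` parameter (principle 1 only; the manual mode and principles 2 to 4 would supply or constrain pairs directly):

```python
import random

def select_point_pairs(n0, min_dist=0.05, rng=None):
    """Randomly choose n0 position point pairs in normalized [0, 1] x [0, 1]
    image coordinates, rejecting pairs whose two points are adjacent
    (closer than the assumed min_dist threshold)."""
    rng = rng or random.Random(0)
    pairs = []
    while len(pairs) < n0:
        p1 = (rng.random(), rng.random())
        p2 = (rng.random(), rng.random())
        # Principle 1: P1 and P2 must not be adjacent.
        if abs(p1[0] - p2[0]) + abs(p1[1] - p2[1]) < min_dist:
            continue
        pairs.append((p1, p2))
    return pairs
```

Uniform random sampling over the unit square also approximates principle 4 (uniform distribution over the image plane) for large N0.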
Step 2: For all sample images under class c, compute for each sample image s the gray mean I in the neighborhood of each corresponding position point P.
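As an illustration, the neighborhood gray mean of step 2 might be computed as below. The 3×3 neighborhood size follows the embodiment later in this document, and the image is assumed, for this sketch only, to be a row-major list of gray values:

```python
def gray_mean(img, p, half=1):
    """Mean gray level in the (2*half+1) x (2*half+1) neighborhood of the
    normalized position point p = (x, y) in image img (list of pixel rows).
    half=1 gives the 3x3 neighborhood used in the embodiment."""
    h, w = len(img), len(img[0])
    cx = min(int(p[0] * w), w - 1)   # normalized x -> column index
    cy = min(int(p[1] * h), h - 1)   # normalized y -> row index
    vals = [img[y][x]
            for y in range(max(0, cy - half), min(h, cy + half + 1))
            for x in range(max(0, cx - half), min(w, cx + half + 1))]
    return sum(vals) / len(vals)
```

Averaging over a small neighborhood rather than sampling a single pixel is what makes the gray value robust to noise in low-resolution images.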
Step 3: For each sample image s under class c, determine the mutual relation between the points of each position pair, called the "point pair relation" for short. From the gray means I1 and I2 of the i-th position point pair Pair_c,i<P1, P2> of the class-c point pair set in sample image s, compute the point pair relation R_c,s,i of Pair_c,i in sample image s:

R_c,s,i = "P1 > P2" if I1 − I2 > Th; "P1 = P2" if |I1 − I2| ≤ Th; "P1 < P2" if I2 − I1 > Th

where Th is a threshold greater than zero.
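The three-way relation of step 3 can be written as a small function; the threshold value 20 used below is a hypothetical choice for illustration, not one fixed by the method:

```python
def point_pair_relation(i1, i2, th):
    """Three-valued point pair relation from the gray means of P1 and P2.
    th is a positive threshold that absorbs noise and mild illumination
    changes: differences within th count as equality."""
    if i1 - i2 > th:
        return "P1>P2"
    if i2 - i1 > th:
        return "P1<P2"
    return "P1=P2"
```

Applied to the three pairs of the Fig. 1 example with th = 20, this yields "P1>P2" for (250, 40), "P1<P2" for (42, 245), and "P1=P2" for (250, 240).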
Step 4: Determine the point pair relation set R_c of the class-c sample images and compute its confidences. From the different sample images of class c, determine for each position point pair Pair_c,i its class-level point pair relation R_c,i, and compute the confidence Rel_c,i of R_c,i across the different sample images of the class. The confidence reflects how reliable a point pair relation is within that class of sample images; it is computed as follows:
Step 4-1: Set the three counters p1, p2, p3 to zero.
Step 4-2: Choose a position point pair Pair_c,i from the point pair set Pair_c.
Step 4-3: For each sample image s = 1, 2, ..., S in the class sample library, with S the total number of sample images, compute the gray means I1 and I2 in the neighborhoods of the two position points P1 and P2 of Pair_c,i<P1, P2>, and compute the point pair relation R_c,s,i of this pair in sample image s. If R_c,s,i = "P1 > P2", increment p1; if R_c,s,i = "P1 = P2", increment p2; otherwise increment p3.
Step 4-4: Once all sample images in the library have been processed, according to the maximum of the counters p1, p2, p3, confirm the class-level point pair relation R_c,i of Pair_c,i as the corresponding one of the three relations.
Step 4-5: Assign the confidence Rel_c,i of the class-level relation R_c,i of Pair_c,i as MAX(p1, p2, p3)/S.
Step 4-6: If unprocessed position point pairs remain in Pair_c, return to step 4-2; otherwise proceed to step 4-7.
Step 4-7: After the confidences Rel_c,i of all position point pairs Pair_c,i have been computed, select from Pair_c the N1 position point pairs whose confidence Rel_c,i exceeds the threshold Th to form the new point pair set Pair_c':

Pair_c' = { Pair_c,i<P1, P2> | Rel_c,i > Th, i = 1, 2, ..., N1 }
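Steps 4-1 to 4-7 amount to a per-pair majority vote followed by thresholding. A compact sketch, using a counter in place of the explicit p1/p2/p3 tallies (equivalent, since the counter holds the same three counts):

```python
from collections import Counter

def class_relation_and_confidence(relations_per_sample):
    """Majority relation R_c,i over the S sample images of class c and its
    confidence Rel_c,i = MAX(p1, p2, p3) / S (steps 4-3 to 4-5)."""
    counts = Counter(relations_per_sample)
    relation, votes = counts.most_common(1)[0]
    return relation, votes / len(relations_per_sample)

def filter_by_confidence(pair_stats, th):
    """Step 4-7: keep the point pairs whose confidence exceeds the threshold.
    pair_stats maps pair index i -> (R_c,i, Rel_c,i)."""
    return {i: rc for i, rc in pair_stats.items() if rc[1] > th}
```

For example, a pair observed as "P1>P2" in 8 of 10 class samples gets the class-level relation "P1>P2" with confidence 0.8, and survives filtering only if 0.8 exceeds the chosen threshold.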
Step 5: Compute the degree of correlation between the point pair relations of the class-c point pair set Pair_c' and the sample images of the other types. The correlation reflects how well the point pair relations of a pair set discriminate among sample images of different types: the smaller the correlation, the more the relations this pair set produces on other types differ from its own, and the more accurately classification based on this relation set can proceed. Filter out the position point pair set Pair_c'' with the minimum correlation to the other classes. The correlation is computed as follows:
Step 5-1: For each position point pair Pair_c,i<P1, P2> in the class-c point pair set Pair_c', compute the point pair relation R_c',i of its two position points P1, P2 in the sample images of the other types; the method follows steps 2 and 3.
Step 5-2: Obtain the correlation CoRel_c,c' between the class-c point pair set Pair_c' and the sample images of type c', i.e. the proportion of point pairs whose relation on type c' matches their class-c relation:

CoRel_c,c' = |{ i | R_c',i = R_c,i }| / N1
Step 5-3: According to the correlations between the class-c sample images and the sample images of the other types, filter out from Pair_c' the N2 position point pairs forming the new point pair set Pair_c'', such that the correlation of Pair_c'' with the point pair relations of the other types is minimal.
Step 6: Establish the feature representation of each class of samples in the image sample library. For each class of sample images, the feature representation is built from the position point pair set Pair_c'' together with the corresponding point pair relation set R_c and confidence set Rel_c.
The beneficial effects of the invention are as follows: by means of manually chosen and random position point pairs and their mutual relations, the gray-level distribution of the image is described statistically. For images of low resolution but with a certain structural character, the method describes and represents the image features well, resists soiling, blur and illumination effects, and offers high feature extraction efficiency and low dimensionality, which benefits subsequent learning and classification algorithms.
Description of drawings
Fig. 1 shows three position point pairs and the three kinds of mutual relations in a sample of the "Volkswagen" car logo class of the invention.
Detailed description of the embodiments
The invention is further described below with reference to the accompanying drawing.
As shown in Fig. 1, three position point pairs in a sample image s of the "Volkswagen" car logo class c, and their three kinds of mutual relations, are as follows:

Pair_c,1<P1, P2>: I1 = 250, I2 = 40, R_c,s,1 = "P1 > P2"
Pair_c,2<P3, P4>: I3 = 42, I4 = 245, R_c,s,2 = "P3 < P4"
Pair_c,3<P5, P6>: I5 = 250, I6 = 240, R_c,s,3 = "P5 = P6"
Embodiment
This embodiment collects front-view vehicle images at the traffic checkpoints of a certain city, crops car logo samples from them, and builds a car logo sample library with the method of the invention. In total 2126 checkpoint images were collected, covering 65 common car logo classes; each class contains at least 10 samples; the resolution is around 50*50 pixels; both daytime and nighttime illumination conditions are included.
The implementation steps are as follows:
Step 1: Normalize the car logo samples of each type to a unified resolution and select 400 position point pairs, of which 50 are chosen manually and 350 at random.
Step 2: For all samples under class c, compute for each sample s the gray mean I in the 3*3 neighborhood of each corresponding position point P.
Step 3: For each sample s under class c, determine the point pair relations R_c,s,i from the gray means I of the position points in that sample.
Step 4: Compute the point pair relation set R_c of the class-c samples and its confidences. From the different samples of class c, determine the class-level relation R_c,i of each position point pair Pair_c,i and compute the confidence Rel_c,i of R_c,i across the different samples of the class. From Pair_c select the 200 point pairs whose confidence Rel_c,i exceeds the threshold 0.8 to form the new point pair set Pair_c'. If fewer than 200 point pairs qualify, select an additional quantity of point pairs at random and repeat steps 2 to 4.
Step 5: Compute the correlation between the point pair relations of the class-c point pair set Pair_c' and the samples of the other types, and filter out the set Pair_c'' of 100 position point pairs with minimum correlation to the other classes.
Step 6: Establish the feature representation of each class of samples in the car logo sample library. For each class of samples, build the feature representation from the position point pair set Pair_c'' together with the corresponding point pair relation set R_c and confidence set Rel_c.
An Adaboost classifier was trained on the car logo sample library with this feature representation, and 1000 newly collected car logo images were classified: of 500 daytime images, 487 were correctly recognized; of 500 nighttime images, 465 were correctly recognized.