
WO1999036880A1 - A method and a device for matching images of body-specific patterns - Google Patents

A method and a device for matching images of body-specific patterns

Info

Publication number
WO1999036880A1
Authority
WO
WIPO (PCT)
Prior art keywords
images
displacement
displacement positions
values
overlap assessment
Prior art date
Application number
PCT/SE1998/002460
Other languages
English (en)
Inventor
Christer FÅHRAEUS
Ola Hugosson
Petter Ericson
Original Assignee
Precise Biometrics Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from SE9704925A external-priority patent/SE513058C2/sv
Application filed by Precise Biometrics Ab filed Critical Precise Biometrics Ab
Priority to EP98965941A priority Critical patent/EP1042728A1/fr
Priority to AU21948/99A priority patent/AU2194899A/en
Publication of WO1999036880A1 publication Critical patent/WO1999036880A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1365Matching; Classification
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/75Organisation of the matching processes, e.g. simultaneous or sequential comparisons of image or video features; Coarse-fine approaches, e.g. multi-scale approaches; using context analysis; Selection of dictionaries
    • G06V10/751Comparing pixel values or logical combinations thereof, or feature values having positional relevance, e.g. template matching
    • G06V10/7515Shifting the patterns to accommodate for positional errors

Definitions

  • the present invention relates to a method and a device for matching two images of body-specific patterns, each image consisting of a plurality of pixels and having partially overlapping contents, the degree of correspondence between the images being determined for different displacement positions which represent different overlappings of the images.
  • Background of the Invention
  • It has been known for hundreds, maybe even thousands, of years that a person's fingerprints are unique and therefore can be used to identify the person in question.
  • the technique has been used mostly to identify criminals, but it is also known to use it in, for instance, access control systems. When using the technique in an access control system, the person who is to be accepted for access presses his finger against a surface, and an image of the fingerprint is made.
  • This image is compared with a number of prestored fingerprint images for accepted persons in order to decide whether the person in question should be allowed to enter or not.
  • use is normally made of a feature extraction, which means that certain characteristic elements in the print are identified and that the elements and their relative distance are compared with corresponding data for the prestored fingerprints.
  • a difficulty arising in the comparison is that the fingerprint is usually not in the same position in the image that is made when the identification is to be carried out as in the prestored image of the fingerprint.
  • a possible method for matching two images of body-specific patterns is to examine all possible overlap positions between the images and, for each overlap position, to examine every pair of overlapping pixels, to determine a score for each pair of overlapping pixels, the score depending on how well the values of the pixels correspond, and to then determine which overlap position provides the best match on the basis of the total of the scores for the overlapping pixels in each position.
  • However, this procedure is too slow for the application indicated above.
  • Summary of the Invention
  • one object of the present invention is thus to provide a new method for automatic matching of two images of body-specific patterns, which method permits faster matching of the images with a given processor than the method described above.
  • a further object is to provide a device for the implementation of the method.
  • the invention is based on determining the degree of correspondence between two images of body-specific patterns, each image consisting of a plurality of pixels and having partially overlapping contents, for different displacement positions representing different overlappings of the images.
  • the comparison of the contents of the images is effected in a more efficient manner. More specifically, a plurality of numbers are determined for each one of a plurality of displacement positions, each number being formed with the aid of pixel values from both images. The numbers are used to retrieve overlap assessment values for at least two displacement positions simultaneously. These overlap assessment values are subsequently used in determining the degree of correspondence between the contents of the images.
  • the different displacement positions can be examined with a certain degree of parallelism, making it possible to examine the images more quickly than if all the displacement positions are examined sequentially.
  • This parallelism is achieved with the aid of the numbers, which are used to examine at least two displacement positions simultaneously. Since the numbers are based on the contents of each image, it is possible to calculate in advance the overlap assessment values in the cases where the pixel values which make up the numbers overlap completely or partially. These overlap assessment values can be stored and be retrieved with the aid of the numbers when carrying out the matching. Alternatively, it is possible to define one or more formulae which, when said numbers are used as parameters, result in overlap assessment values for at least two displacement positions. As a further alternative, it is possible to use a gate circuit which produces overlap assessment values for at least two displacement positions as a result for input signals consisting of said numbers. Naturally, the efficiency increases the more pixels are included in each number since this increases the parallelism.
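As a minimal illustration of this parallelism (the segment width, the simplified scoring rule, and all names below are illustrative assumptions, not taken from the claims), a lookup table can be precomputed that is indexed by a number formed from the pixel values of short segments of both images, so that a single lookup yields overlap assessment values for several displacement positions at once:

```python
from itertools import product

def assess(p, q):
    # illustrative per-pixel assessment: 1 for matching pixel values, 0 otherwise
    return 1 if p == q else 0

W = 4  # pixels taken from each image to form the number (assumed width)

# table[number] -> assessment values for W displacement positions at once
table = {}
for bits in product((0, 1), repeat=2 * W):
    seg1, seg2 = bits[:W], bits[W:]           # W pixel values from each image
    number = int("".join(map(str, bits)), 2)  # the number built from both images
    table[number] = tuple(
        sum(assess(a, b) for a, b in zip(seg1[k:], seg2))  # seg1 shifted by k
        for k in range(W))

# during matching, one table access replaces W pixel-by-pixel comparison passes:
values = table[int("10110100", 2)]
```

One lookup thus examines W displacement positions at once; as the text notes, the more pixels per number, the higher the parallelism, at the cost of a larger table (2^(2W) rows).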
  • the overlap assessment values are predefined. What this means is that if a pixel in one of the images has a first given value and the corresponding overlapping pixel in the other image has a second given value, a certain predetermined overlap value is always obtained. The same applies when the overlap assessment values relate to several overlapping pixels.
  • the numerical values for different overlap assessment values which are obtained for different combinations of pixel values can be determined optionally.
  • the method furthermore comprises the steps of adding up the overlap assessment values for each of said displacement positions, and of using the totals obtained in this manner to determine which of the displacement positions provides the best possible match between the contents of the images.
  • the overlap assessment values which are added together for a certain displacement position preferably reflect the degree of correspondence between all overlapping pixels for that displacement position.
  • the overlap assessment values are suitably added up in parallel for several displacement positions.
  • the adding-up becomes particularly advantageous if it is carried out in parallel for the overlap assessment values which are retrieved simultaneously with the aid of said numbers.
  • Each overlap assessment value can relate to one or more overlapping pixels. In the latter case, a matching speed increase is achieved by the fact that it is not necessary to add up the assessment values for each overlapping pixel for a certain displacement position, but rather overlap assessment values which have already been added up for two or more overlapping pixels can be retrieved directly.
  • the plurality of displacement positions for which numbers are determined can suitably constitute rough displacement positions, and said at least two displacement positions for which the overlap assessment values are retrieved directly can suitably comprise at least one fine displacement position, representing a smaller displacement from a rough displacement position than the displacement between two rough displacement positions.
  • the second overlap assessment value can relate to the rough displacement position in question or to another fine displacement position.
  • the contents of the images are displaced in relation to each other in one direction only.
  • the method can also be employed when the images are displaced in two different, preferably perpendicular, directions in relation to each other.
  • In that case, the rough displacement positions suitably represent different overlappings of the images in the first direction, for example horizontally, and the method is repeated for different overlappings of the images in the other direction, for example vertically.
  • the rough displacement positions, which thus constitute a subset of the displacement positions examined, are preferably determined by the images being divided into a plurality of rough segments consisting of N x M pixels, where N and M are greater than one, the displacement between two adjoining rough displacement positions consisting of a rough segment.
  • the rough segments can thus be achieved by the images being divided into columns or rows, each having the width and the height of several pixels.
  • the images can be represented in various ways. They can be analogue, but it is preferable that they be digital since this facilitates their processing with the aid of a computer.
  • the pixel values can be represented with different resolutions.
  • the method is preferably intended for images which are represented as bitmaps.
  • the numbers are based on the contents of the two images. In a preferred embodiment the numbers are used as addresses for memory locations, which store the overlap assessment values. In this case, the latter are suitably defined by quite simply being calculated or determined in advance.
  • the addresses are used for addressing a lookup table which, for each address, contains said pre-calculated overlap assessment values for at least two displacement positions.
  • the order in which the pixel values are used in the address is of no importance as long as the same order is used for all addresses and as long as the storing of the overlap assessment values in the lookup table is carried out in a predetermined manner in relation to said order.
  • the method according to the invention can be implemented entirely in hardware.
  • the numbers can, as mentioned above, form input signals for a gate circuit which has been designed in such a way that for each given set of input signals the corresponding overlap assessment values are produced as output signals.
  • the overlap assessment values are defined by the design of the gate circuit.
  • the method is implemented in software with the aid of a processor which works with a predetermined word length.
  • the lookup table comprises a plurality of addressable rows, each of which has the predetermined word length and stores the pre-calculated overlap assessment values. By adjusting the width of the table to the word length of the processor, the best possible utilisation of the capacity of the processor is obtained.
  • the various parameters for the method, i.e. the rough displacement positions, the number of overlap assessment values stored for each address, the number of tables, etc., are suitably determined on the basis of the processor utilised and its cache memory in order to achieve the highest speed possible.
  • the parameters are chosen so that the two images and all of the pre-calculated overlap assessment values can be contained in the cache memory.
  • each number is formed by a first fine segment, which comprises at least two adjoining pixel values from the first image, and by a second fine segment, which overlaps the first fine segment and which comprises as many adjoining pixel values as the first fine segment from the second image, and a third fine segment, which comprises as many adjoining pixel values as the first fine segment from the second image and which overlaps the first fine segment in an adjacent displacement position for which the determination of a plurality of numbers is carried out, i.e. an adjacent rough displacement position.
  • the number will include all pixel values which can overlap in a rough displacement position and in all fine displacement positions between this rough displacement position and the subsequent rough displacement position, as well as in this subsequent rough displacement position. Accordingly, it is possible to retrieve, with the number, pre-calculated overlap assessment values for all of these displacement positions.
  • each address is advantageously divided into a first and a second subaddress, the first subaddress, which consists of the pixel values from the first and the second fine segment, being used to simultaneously retrieve overlap assessment values in a first table for overlapping pixels belonging to the first and the second fine segment, and the second subaddress, which consists of the pixel values from the first and the third fine segment, being used to simultaneously retrieve overlap assessment values in a second table for overlapping pixels belonging to the first and the third segment.
  • the first and the second table preferably store an overlap assessment value for each one of said at least two displacement positions, the sum of the two overlap assessment values for a first displacement position, which is retrieved with the first and second subaddresses of an address, constituting an overlap assessment value for all overlapping pixels of the first, the second, and the third fine segment for said first displacement position.
  • the overlap assessment values are preferably stored in the same order with respect to the displacement positions for each address, so that they can be easily added up.
  • a device has a processing unit which is adapted to implement a method according to any one of claims 1-17.
  • the processing unit can be connected to a unit for recording images and can process the images in real time.
  • the device exhibits the same advantages as the method described above, that is, it permits a quicker matching of the images.
  • the invention is implemented in the form of a computer program which is stored in a storage medium which is readable with the aid of a computer.
  • the method according to the invention can be used to examine all possible displacement positions or only a selection.
  • the invention is applicable to all types of matching of images.
  • the invention is especially applicable when a high matching speed is required.
  • Fig. 1 shows an image consisting of a plurality of pixels, with one rough segment and one fine segment indicated.
  • Fig. 2 shows a hypothetical overlapping of two images.
  • Fig. 3 shows how an address is formed with the aid of pixel values from a plurality of overlapping pixels in two images.
  • Fig. 4 shows how the overlap assessment values for a plurality of different overlap positions are stored and retrieved simultaneously.
  • Fig. 5 shows how the overlap assessment values are calculated for various displacement positions.
  • Fig. 6 shows how overlap assessment values are stored and retrieved in the case where subaddresses are employed.
  • Fig. 7 shows how overlap values for a plurality of different displacement positions are added up simultaneously.
  • a presently preferred embodiment of a method for matching two images of body-specific patterns, the images having partially overlapping contents, will be described below.
  • the purpose of the method is to find the overlap position which provides the best possible correspondence between the contents of the images and to assess the degree of correspondence in this position.
  • a predetermined assessment criterion is employed.
  • the method is implemented in software with the aid of a 32-bit processor with a clock frequency of 100 MHz and with a 16 kB cache memory, in which the images which are to be matched are stored.
  • An example of a processor of this type is StrongARM supplied by Digital.
  • the processor operates under the control of a program which is read into the program memory of the processor.
  • Fig. 1 schematically shows a digital image 1 consisting of a plurality of pixels 2 of which some are schematically indicated as squares.
  • the image is to be matched with a like image with partially the same contents.
  • the image is 55 pixels wide and 76 pixels high. It is stored as a bitmap, each pixel thus having the value one or zero. In this example, the value one represents a black dot and the value zero a white dot.
  • each image is divided into eleven rough segments 3 in the form of vertical bands, each being five pixels wide and 76 pixels high.
  • Each rough segment is divided into fine segments 4, each consisting of a horizontal row of five adjoining pixels.
  • the rough segments 3 are employed to define a plurality of rough displacement positions.
  • Fig. 2 shows a first rough displacement position, in which two images 1a and 1b are displaced in relation to each other in such a way that one rough segment 3 from each image, indicated by slanting lines, overlaps the other.
  • In a second rough displacement position, two rough segments from each image will overlap, and so on, up to an eleventh rough displacement position in which all the rough segments overlap. The difference between two adjoining rough displacement positions is thus one rough segment.
  • the rough displacement positions and the fine displacement positions represent displacements between the images in a first direction, viz. horizontally. If the images can also be displaced vertically in relation to each other, a number of vertical displacement positions are defined, each vertical displacement position representing a displacement by one pixel row vertically.
  • the left part of Fig. 3 shows a vertical displacement position for a first image 1a and a second image 1b, which is indicated by dashed lines in the overlap position.
  • the fine segments 4 are employed to determine a number of 10-bit subaddresses which in turn are employed to retrieve pre-calculated overlap assessment values, each providing a measure of the degree of correspondence between one or more overlapping pixels for a certain displacement position.
  • a first subaddress is formed by the five least significant bits of the address being retrieved from a first fine segment 4a in the first image 1a and the five most significant bits being retrieved from the corresponding overlapping fine segment 4b in the second image 1b.
  • the first subaddress thus represents the value for overlapping pixels which one wishes to compare in order to check the degree of correspondence with respect to contents.
  • Fig. 3 shows an example of how the first fine segment 4a of five bits "10010" is retrieved from the one image 1a and the second fine segment 4b of five bits "01100" is retrieved from the other image 1b, and how these are put together into the address "0110010010".
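The address formation in this example can be written out directly (a sketch; the bit strings are the ones from Fig. 3):

```python
seg_a = "10010"  # first fine segment 4a, from the first image
seg_b = "01100"  # second fine segment 4b, from the second image

# five least significant bits come from the first image,
# five most significant bits from the second image
address = int(seg_b + seg_a, 2)
assert format(address, "010b") == "0110010010"
```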
  • the first subaddresses are employed to address two tables of 1024 rows each (the number of possible different addresses).
  • the tables are shown schematically as Tables 1 and 2 in Fig. 4.
  • Each row of these tables stores scores, i.e. pre-calculated overlap assessment values.
  • This is shown schematically in Fig. 4 by way of an enlargement of a row in each table.
  • the scores are calculated as follows. Two overlapping white pixels equal one point, two overlapping black pixels equal two points, while one white and one black overlapping pixel equal zero points.
  • Fig. 5 shows the scores which are stored in the tables in Fig. 4 in the row with the address "0110010010" and how these are calculated.
  • Score 0 is stored in Table 2 and Scores 1-4 are stored in Table 1. For each overlapping pixel, a score is achieved in accordance with the scoring set out above. The scores for all overlapping pixels are added to arrive at the total score, or the overlap assessment value, which is to be stored in the table in the row with the address in question.
  • Table 2 in Fig. 4 contains, for each address, the score (Score 0) achieved when the two fine segments overlap completely, i.e. the overlapping which is obtained in the rough displacement position. This score is the total of the scores for five overlapping pixels and is stored in one byte.
  • Table 1 contains, for each address, the scores (Scores 1-4) which are achieved when the two fine segments are partially displaced in relation to each other, i.e. corresponding to various fine displacement positions. These scores are stored in one byte each in a 32-bit word and can accordingly be retrieved at the same time with one reading or one table lookup during one clock cycle.
  • Score 1 relates to the score achieved when the fine segments are displaced by one increment in relation to each other, so that only four overlapping pixels are obtained.
  • Score 2 relates to the score achieved when the fine segments are displaced by two increments in relation to each other, so that only three overlapping pixels are obtained, etc.
  • the displacements reflect the overlapping obtained in the fine displacement positions between the rough displacement position in question and the following rough displacement position.
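Under the scoring rule stated above (two white pixels = 1 point, two black = 2, mismatch = 0), the table row for the address "0110010010" can be reproduced in a few lines. The exact alignment convention for the fine shifts is inferred from the figures and is therefore an assumption:

```python
def pixel_score(p, q):
    # two white pixels (0, 0) -> 1 point, two black (1, 1) -> 2, mismatch -> 0
    if p == q:
        return 2 if p == 1 else 1
    return 0

seg_a = [1, 0, 0, 1, 0]  # first fine segment (first image)
seg_b = [0, 1, 1, 0, 0]  # second fine segment (second image)

# Score k: seg_a displaced k pixels past seg_b, leaving 5 - k overlapping pairs
scores = [sum(pixel_score(a, b) for a, b in zip(seg_a[k:], seg_b))
          for k in range(5)]
# scores == [1, 4, 3, 0, 1]: Score 0 = 1 (Table 2), Scores 1-4 = 4, 3, 0, 1 (Table 1)
```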
  • the overlap assessment values which are retrieved using the first subaddress relate only to overlappings between the pixels in the first and the second fine segment for the displacement positions examined.
  • the overlappings which occur in these displacement positions between the pixels in the first fine segment and pixels other than the ones in the second fine segment are not picked up with the aid of the method described above.
  • a second subaddress is formed in addition to the first subaddress.
  • This second subaddress consists of the five pixel values in the first fine segment 4a as well as five pixel values for a third fine segment 4c which adjoins the second fine segment in the second image 1b and which overlaps the first fine segment in the subsequent rough displacement position.
  • Fig. 6 shows an example of how the second subaddress is formed.
  • the pixel values "10010" from the first fine segment 4a in the first image 1a constitute the five most significant bits of the second subaddress, while the pixel values "10101" from the third fine segment 4c in the second image 1b constitute the five least significant bits in the second subaddress.
  • the scores or the overlap assessment values for the pixels in the first and the third fine segment which overlap in different displacement positions are stored in a third table, which is indicated as Table 3 in Fig. 7.
  • the scores are, of course, calculated in the same manner as in the case of Table 1, but the scores are stored in "reverse order". Accordingly, Score 4, which relates to one overlapping pixel of the first and the third fine segment, is stored in the first byte of a table row in Table 3. Score 3, which relates to two overlapping pixels of the first and the third segment, is stored in the second byte, etc.
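The row of Table 3 for the example segments can be reproduced in the same way. In fine displacement position k, the first k pixels of the first fine segment overlap the last k pixels of the third fine segment (again an inferred alignment, with the scoring rule stated earlier):

```python
def pixel_score(p, q):
    # two white pixels (0, 0) -> 1 point, two black (1, 1) -> 2, mismatch -> 0
    if p == q:
        return 2 if p == 1 else 1
    return 0

seg_a = [1, 0, 0, 1, 0]  # first fine segment (first image)
seg_c = [1, 0, 1, 0, 1]  # third fine segment (second image)

# byte k - 1 of the row holds the score for fine displacement position k,
# i.e. the "reverse order" that lines it up with the bytes read from Table 1
table3_row = [sum(pixel_score(a, c) for a, c in zip(seg_a[:k], seg_c[5 - k:]))
              for k in range(1, 5)]
# table3_row == [2, 0, 3, 3], the values retrieved in the example of Fig. 7
```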
  • Fig. 7 shows Tables 1 and 3, a first and a second subaddress employed to address these tables, and the overlap assessment values in separate rows of the table.
  • a first rough displacement position is chosen. For this position, a first pair of overlapping fine segments is chosen. Suppose that the first fine segment in the first image has the pixel values "10010" and that the second fine segment in the second image has the pixel values "01100", as in the example in Fig. 3. These values are used to form the first binary subaddress "0110010010". Moreover, suppose that a third fine segment which adjoins the second fine segment in the second image has the values "10101". These values are used together with the pixel values for the first fine segment to form the second subaddress.
  • the first subaddress is employed to address both the first and the second table.
  • the scores 4, 3, 0, and 1 stored in one word are obtained from the first table and the score 1 is obtained from the second table.
  • the second subaddress is employed to address the third table, from which the scores 2, 0, 3, 3 are obtained in the example given. The scores are added up in parallel, the total scores 6, 3, 3, 4 being obtained.
  • In Fig. 7, the word A represents the word obtained with a first address, consisting of a first and a second subaddress, the word B represents the word obtained with a second address, consisting of a first and a second subaddress, and the word C represents the total obtained.
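The parallel addition can be sketched as follows: the four one-byte scores are packed into one 32-bit word, and a single addition then sums the scores for all four displacement positions at once. This works as long as no per-position total can exceed 255, so that no carry crosses a byte boundary (true here, since each byte is a small sum of per-pixel scores):

```python
def pack(scores):
    # pack four one-byte scores into a 32-bit word, first score in the low byte
    word = 0
    for i, s in enumerate(scores):
        word |= s << (8 * i)
    return word

def unpack(word):
    return [(word >> (8 * i)) & 0xFF for i in range(4)]

a = pack([4, 3, 0, 1])  # word retrieved from Table 1
b = pack([2, 0, 3, 3])  # word retrieved from Table 3
c = a + b               # one addition sums the scores for all four positions
assert unpack(c) == [6, 3, 3, 4]
```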
  • the procedure is repeated for the second and subsequent rough displacement positions until all rough displacement positions have been examined.
  • the method is repeated for each vertical position, the images thus first being displaced one row vertically in relation to each other and subsequently all rough and fine displacement positions being examined, whereupon the images are displaced to the next vertical displacement position and are examined and so on until all vertical displacement positions have been scanned.
  • a score will have been obtained for each position. With the assessment criterion used in this example, the highest score will represent the displacement position which provides the best overlapping of the contents of the images. If a first image is compared with a plurality of second images, the highest score indicates the second image which best corresponds with the first image.
  • an overlap assessment is first carried out in the manner described above with a lower resolution of the images than the one with which they are stored.
  • a resolution of 25 x 30 pixels is used. The purpose of this is the quick selection of relevant displacement positions for closer examination of the correspondence between the contents of the images. Subsequently, the method is repeated for the images in these and adjoining displacement positions at the original resolution.
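The description does not state how the lower-resolution images are produced; as one plausible assumption, each low-resolution pixel could be set by majority vote over the block of original pixels it covers:

```python
def downsample(bitmap, fx, fy):
    # hypothetical reduction: a low-resolution pixel is black (1) if at least
    # half of the fx-by-fy block of original pixels it covers is black
    h, w = len(bitmap), len(bitmap[0])
    out = []
    for y in range(0, h - fy + 1, fy):
        row = []
        for x in range(0, w - fx + 1, fx):
            block = [bitmap[y + j][x + i] for j in range(fy) for i in range(fx)]
            row.append(1 if 2 * sum(block) >= len(block) else 0)
        out.append(row)
    return out
```

The rough matching is then run on the reduced bitmaps, and only the most promising displacement positions are re-examined at the full resolution.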
  • the overlap assessment values are stored in three different tables. This has been done in order to utilise the processor optimally. In the case of other processors, it may instead be suitable to store all overlap assessment values in one table or in more than three tables. This can be determined by the skilled person on the basis of the above description.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Image Processing (AREA)

Abstract

In a method for matching images of body-specific patterns (each image consisting of a plurality of pixels and having partially overlapping contents), the degree of correspondence between the contents of the images is determined for different displacement positions representing different overlappings of the images. More specifically, a plurality of numbers are determined for each one of a plurality of displacement positions. Each number is formed with the aid of pixel values obtained from both images and is used to simultaneously retrieve overlap assessment values for at least two of the displacement positions. The retrieved overlap assessment values are then used to determine the degree of correspondence between the images for the different displacement positions. The method is carried out with the aid of a computer and can be implemented as a computer program.
PCT/SE1998/002460 1997-12-30 1998-12-30 Procede et dispositif pour apparier des images de dessins specifiques a un corps WO1999036880A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP98965941A EP1042728A1 (fr) 1997-12-30 1998-12-30 Procede et dispositif pour apparier des images de dessins specifiques a un corps
AU21948/99A AU2194899A (en) 1997-12-30 1998-12-30 A method for a device for matching images of body-specific patterns

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
SE9704925-8 1997-12-30
SE9704925A SE513058C2 (sv) 1997-12-30 1997-12-30 Sätt och anordning för matchning av bilder av kroppsspecifika mönster
US2423698A 1998-02-17 1998-02-17
US09/024,236 1998-02-17

Publications (1)

Publication Number Publication Date
WO1999036880A1 true WO1999036880A1 (fr) 1999-07-22

Family

ID=26663175

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE1998/002460 WO1999036880A1 (fr) 1997-12-30 1998-12-30 Procede et dispositif pour apparier des images de dessins specifiques a un corps

Country Status (3)

Country Link
EP (1) EP1042728A1 (fr)
AU (1) AU2194899A (fr)
WO (1) WO1999036880A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5054089A (en) * 1988-12-29 1991-10-01 Kabushiki Kaisha Toshiba Individual identification apparatus
US5640468A (en) * 1994-04-28 1997-06-17 Hsu; Shin-Yi Method for identifying objects and features in an image

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6333989B1 (en) 1999-03-29 2001-12-25 Dew Engineering And Development Limited Contact imaging device
WO2016094002A1 (fr) * 2014-12-10 2016-06-16 Intel Corporation Procédé et appareil de détection de visage avec table de consultation
US9460338B2 (en) 2014-12-10 2016-10-04 Intel Corporation Face detection method and apparatus with lookup table

Also Published As

Publication number Publication date
EP1042728A1 (fr) 2000-10-11
AU2194899A (en) 1999-08-02

Similar Documents

Publication Publication Date Title
CA1087315A (fr) Detecteur de minuties dans une image binaire
US6134340A (en) Fingerprint feature correlator
US5616905A (en) Two-dimensional code recognition method
KR930002348B1 (ko) 지문검증의 미뉴시어(minutia) 데이타 추출
US4135147A (en) Minutiae pattern matcher
US4932065A (en) Universal character segmentation scheme for multifont OCR images
US5251265A (en) Automatic signature verification
US4225850A (en) Non-fingerprint region indicator
EP0733240B1 (fr) Procede d'appariement de particularites d'empreintes digitales au moyen de graphes relationnels attribues
JP5183578B2 (ja) 局所的視覚的2次元指紋を用いた、文書コレクション内の文書画像を発見する方法およびシステム
US6005963A (en) System and method for determining if a fingerprint image contains an image portion representing a partial fingerprint impression
US4371865A (en) Method for analyzing stored image details
GB2278945A (en) Fingerprint identification system
US6563951B2 (en) Method and a device for matching images
US4468807A (en) Method for analyzing stored image details
US3831146A (en) Optimum scan angle determining means
US7515741B2 (en) Adaptive fingerprint matching method and apparatus
AU756016B2 (en) A method and a device for matching images
EP1042728A1 (fr) Procede et dispositif pour apparier des images de dessins specifiques a un corps
EP0651337A1 (fr) Procede et appareil de reconnaissance d'objets, et procede et appareil de traitement d'images
US7894642B2 (en) Device and method for fingerprints supervision
US5307424A (en) Character recognition system
US5253303A (en) Character recognizing method and apparatus thereof
GB2310522A (en) Fingerprint identification system
MXPA00006583A (en) A method and a device for matching images

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AT AU AZ BA BB BG BR BY CA CH CN CU CZ CZ DE DE DK DK EE EE ES FI FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SK SL TJ TM TR TT UA UG US UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: KR

WWE Wipo information: entry into national phase

Ref document number: 1998965941

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1998965941

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: CA

WWW Wipo information: withdrawn in national office

Ref document number: 1998965941

Country of ref document: EP