CN112764515A - Biological characteristic recognition device and electronic equipment - Google Patents
- Publication number
- CN112764515A (application CN202011627196.9A)
- Authority
- CN
- China
- Prior art keywords
- visitor
- biometric
- face
- preset
- sensor
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3206—Monitoring of events, devices or parameters that trigger a change in power modality
- G06F1/3231—Monitoring the presence, absence or movement of users
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/64—Three-dimensional objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G—PHYSICS
- G07—CHECKING-DEVICES
- G07C—TIME OR ATTENDANCE REGISTERS; REGISTERING OR INDICATING THE WORKING OF MACHINES; GENERATING RANDOM NUMBERS; VOTING OR LOTTERY APPARATUS; ARRANGEMENTS, SYSTEMS OR APPARATUS FOR CHECKING NOT PROVIDED FOR ELSEWHERE
- G07C9/00—Individual registration on entry or exit
- G07C9/00174—Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys
- G07C9/00563—Electronically operated locks; Circuits therefor; Nonmechanical keys therefor, e.g. passive or active electrical keys or other data carriers without mechanical keys using personal physical data of the operator, e.g. finger prints, retinal images, voice patterns
Abstract
The application provides a biometric recognition apparatus for performing biometric recognition based on three-dimensional data on a visitor in order to verify the visitor's identity. The biometric recognition apparatus includes: a first sensor for acquiring image information of a scene; a second sensor for acquiring depth information of the scene; and a processor for judging, from the acquired image information and/or depth information, whether a preset verification start condition is met, executing identity recognition based on the three-dimensional data when the verification start condition is met, and keeping the identity recognition based on the three-dimensional data in a stopped state when it is not. The application also provides an electronic device incorporating the biometric recognition apparatus.
Description
Technical Field
The application belongs to the field of biological feature recognition, and particularly relates to a biological feature recognition device and electronic equipment.
Background
As technology advances and living standards continue to improve, more and more electronic devices are equipped with biometric apparatuses to identify a user's identity, for example, intelligent door locks with fingerprint or face recognition. When a user approaches, such an intelligent door lock typically lights up its display screen to prompt the user for identification. In many cases, however, a passer-by has no intention of opening the door, yet existing intelligent door locks are easily triggered by passing users, frequently lighting up the screen and activating the biometric apparatus. This not only significantly increases the power consumption of the intelligent door lock, but also easily leaves users with the impression that the product is unintelligent, which harms the brand image of the product.
Disclosure of Invention
The present application provides a biometric apparatus and an electronic device to solve the above-mentioned problems.
The embodiments of the application provide a biometric recognition apparatus for performing biometric recognition based on three-dimensional data on a visitor in order to verify the visitor's identity, the biometric recognition apparatus comprising:
a first sensor for acquiring image information of a scene;
a second sensor for acquiring depth information of the scene; and
a processor for judging, from the acquired image information and/or depth information, whether a preset verification start condition is met, executing identity recognition based on the three-dimensional data when the verification start condition is met, and keeping the identity recognition based on the three-dimensional data in a stopped state when it is not.
In some embodiments, the verification start condition includes a first verification start condition, a second verification start condition, and a third verification start condition, where:
the first verification start condition is that preset biometric information exists in the acquired image information;
the second verification start condition is that the visitor has entered a preset identification range; and
the third verification start condition is that the visitor's face is oriented toward the biometric recognition apparatus.
If any one of the first, second, and third verification start conditions is met, the verification start condition is considered met, and identity recognition based on three-dimensional data is executed;
if the first, second, and third verification start conditions are all met, the verification start condition is considered met, and identity recognition based on three-dimensional data is executed; and
if any combination of two of the first, second, and third verification start conditions is met, the verification start condition is considered met, and identity recognition based on three-dimensional data is executed.
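The three alternatives above amount to requiring that at least a configurable number of sub-conditions hold. A minimal sketch, with hypothetical names not taken from the patent:

```python
# Sketch: evaluating the verification start condition from the three
# sub-conditions described above. `required` selects the alternative:
# 1 (any one), 2 (any two), or 3 (all three).

def verification_start_met(face_in_image: bool,
                           within_range: bool,
                           facing_device: bool,
                           required: int = 3) -> bool:
    """Return True if at least `required` of the three sub-conditions hold."""
    satisfied = sum([face_in_image, within_range, facing_device])
    return satisfied >= required

# Example: first and second conditions hold; with required=2 the
# three-dimensional identification would be started.
print(verification_start_met(True, True, False, required=2))  # True
```

With `required=3` the same call returns `False`, keeping the high-power identification stopped.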
In some embodiments, the preset biometric information is human face feature information, including the facial features (e.g., eyebrows, eyes, ears, nose, and mouth).
In some embodiments, the identification range is defined as at least a portion of a spatial range within a predetermined identification distance threshold from the electronic device, and the identification distance threshold may be 0.3 meters, 0.5 meters, 0.6 meters, 0.8 meters, 1 meter, 1.2 meters, or 1.5 meters.
In some embodiments, the second sensor has a predetermined baseline, and the biometric recognition apparatus defines a horizontal reference plane containing the baseline. A direction in the horizontal reference plane that is perpendicular to the baseline and points outward from the biometric recognition apparatus is defined as the reference direction. If the angle between the projection of the visitor's face orientation onto the horizontal reference plane and the reference direction is smaller than a predetermined orientation angle threshold, the visitor is considered to be facing the biometric recognition apparatus. The orientation angle threshold may be, for example, 15, 20, 30, 45, 50, or 60 degrees.
In some embodiments, the processor calculates the visitor's face orientation from the acquired three-dimensional data of the visitor's face: it extracts feature points of the face, connects the extracted feature points to construct a corresponding face reference plane, and calculates the normal vector of the face reference plane, from the three-dimensional data of those feature points, as the visitor's face orientation.
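The orientation test above can be sketched as follows. The coordinate frame is an assumption for illustration (horizontal reference plane is the x–z plane, reference direction is +z); the feature points, e.g. both eye corners and the mouth, span the face reference plane:

```python
import math

# Sketch (assumed geometry): the normal of the plane through three facial
# feature points serves as the face orientation; its projection onto the
# horizontal reference plane is compared with the reference direction (+z)
# against the orientation angle threshold.

def cross(a, b):
    """Cross product of two 3-D vectors given as tuples."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def facing_device(p1, p2, p3, threshold_deg=30.0):
    """p1..p3: (x, y, z) feature points spanning the face reference plane."""
    normal = cross(tuple(b - a for a, b in zip(p1, p2)),
                   tuple(c - a for a, c in zip(p1, p3)))
    x, _, z = normal
    if z < 0:            # make the normal point toward the device (+z)
        x, z = -x, -z
    # Angle between the projected normal (x, z) and the +z reference direction.
    angle = math.degrees(math.atan2(abs(x), z))
    return angle < threshold_deg

# A face looking straight at the device (normal along +z):
print(facing_device((-1, 0, 0), (1, 0, 0), (0, -1, 0)))  # True
```

A face turned 90 degrees sideways (normal along +x) would fail the check with the same threshold.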
In some embodiments, the second sensor acquires depth information according to one or more of the structured light sensing principle, the time-of-flight sensing principle, and the binocular vision sensing principle.
In some embodiments, the biometric apparatus further includes a proximity sensor for sensing whether a visitor has entered a preset sensing range; after the proximity sensor senses that a visitor has entered the sensing range, the processor activates the first sensor to acquire the image information of the scene. The sensing range is at least a partial spatial range within a preset sensing distance threshold from the biometric apparatus, and the sensing distance threshold may be, for example, 1, 2, 3, 4, or 5 meters.
In some embodiments, the proximity sensor may be a pyroelectric infrared sensor, a capacitive proximity sensor, or an inductive proximity sensor.
In some embodiments, the biometric identification device further comprises a display screen, and maintaining the stopped state of the identification operation based on the three-dimensional data comprises deactivating the second sensor and/or keeping the display screen off.
In some embodiments, the image information may be an infrared two-dimensional image or a near-infrared two-dimensional image.
The present application further provides an electronic device, including the biometric apparatus according to the above embodiment, for performing biometric identification based on three-dimensional data on a visitor to verify the identity of the visitor.
In some embodiments, the electronic device is an intelligent door lock, further comprising:
the lock body is used for locking or unlocking the door body provided with the intelligent door lock; and
the biometric identification device controls the lock body to unlock when the visitor is identified as a preset authorizer.
In one embodiment, the electronic device includes a display screen; the processor turns the display screen on only when the verification start condition is met, and keeps the display screen off when the verification start condition is not met.
The biometric recognition apparatus and electronic device provided by the application sense the visitor's proximity and operational intent in stages graded by power consumption, lighting the screen and starting the highest-power identity recognition based on three-dimensional data only after confirming that the visitor actually intends to perform identification. Long-term operation of high-power components is thereby avoided, the overall power consumption of the electronic device is reduced, the outward behavior of the electronic device better matches the user's actual intent, and the biometric recognition apparatus or electronic device becomes more intelligent.
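The power-graded staging summarized above can be sketched as a short gate pipeline. The stage names and return strings are hypothetical, chosen only to illustrate the ordering from cheapest to most expensive check:

```python
# Sketch: cheap sensors gate the expensive three-dimensional identification,
# so the screen and depth sensor wake only when all earlier checks pass.
# The pipeline stops at the first failing stage.

def staged_verification(proximity_triggered, face_in_image,
                        within_range, facing_device):
    stages = [
        ("proximity",   proximity_triggered),  # e.g. pyroelectric IR, lowest power
        ("2d_face",     face_in_image),        # first sensor, image analysis
        ("range",       within_range),         # second sensor, distance check
        ("orientation", facing_device),        # face toward the device
    ]
    for name, passed in stages:
        if not passed:
            return f"idle (stopped at {name})"
    return "start 3d identification, light screen"

print(staged_verification(True, True, True, False))
print(staged_verification(True, True, True, True))
```

A passer-by who never faces the device stalls at the `orientation` stage, so the screen stays off and the second sensor never performs full identification.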
Additional aspects and advantages of embodiments of the present application will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of embodiments of the present application.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device provided with a biometric identification apparatus according to an embodiment of the present application.
Fig. 2 is a functional block diagram of the biometric apparatus shown in fig. 1.
FIG. 3 is a functional block diagram of another embodiment of the second sensor shown in FIG. 2.
FIG. 4 is a functional block diagram of another embodiment of the first sensor and the second sensor of FIG. 2.
Fig. 5 is a functional block diagram of the biometric apparatus according to an embodiment of the present application.
Fig. 6 is a functional block diagram of the biometric apparatus according to an embodiment of the present application.
Fig. 7 is a schematic diagram of determining whether a visitor is facing a biometric device according to an embodiment of the present application.
Fig. 8 is a functional block diagram of the biometric apparatus according to an embodiment of the present application.
Fig. 9 is a schematic diagram of calculating the face orientation of a visitor according to an embodiment of the present application.
Fig. 10 is a schematic diagram of calculating the face orientation of a visitor according to an embodiment of the present application.
Fig. 11 is a functional block diagram of the biometric apparatus according to an embodiment of the present application.
Fig. 12 is a flowchart of a biometric identification control method according to an embodiment of the present application.
Fig. 13 is a flowchart of a biometric identification control method according to an embodiment of the present application.
Fig. 14 is a flowchart of a biometric identification control method according to an embodiment of the present application.
Fig. 15 is a flowchart of a biometric identification control method according to an embodiment of the present application.
Fig. 16 is a flowchart of the substeps of step S1050 in fig. 15.
Fig. 17 is a flowchart of a biometric identification control method according to an embodiment of the present application.
DETAILED DESCRIPTION OF EMBODIMENT(S) OF THE INVENTION
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application. In the description of the present application, it is to be understood that the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implying any order or number of technical features indicated. Thus, features defined as "first" and "second" may explicitly or implicitly include one or more of the described features. In the description of the present application, "a plurality" means two or more unless specifically limited otherwise.
In the description of the present application, it should be noted that, unless explicitly stated or limited otherwise, the terms "mounted," "connected," and "coupled" are to be construed broadly, e.g., as a fixed connection, a removable connection, or an integral connection; as a mechanical connection, an electrical connection, or a communication connection; and as a direct connection, an indirect connection through an intermediate medium, a connection internal to two elements, or an interactive relationship between two or more elements. The specific meaning of the above terms in the present application can be understood by those of ordinary skill in the art as appropriate.
The following disclosure provides many different embodiments, or examples, for implementing different features of the application. In order to simplify the disclosure of the present application, only the components and settings of a specific example are described below. Of course, they are merely examples and are not intended to limit the present application. Moreover, the present application may repeat reference numerals and/or letters in the various examples, such repeat use is intended to provide a simplified and clear description of the present application and may not in itself dictate a particular relationship between the various embodiments and/or configurations discussed. In addition, the various specific processes and materials provided in the following description of the present application are only examples of implementing the technical solutions of the present application, but one of ordinary skill in the art should recognize that the technical solutions of the present application can also be implemented by other processes and/or other materials not described below.
Furthermore, the described features, structures, or characteristics may be combined in any suitable manner in one or more embodiments. In the following description, numerous specific details are provided to provide a thorough understanding of embodiments of the application. One skilled in the relevant art will recognize, however, that the subject technology can be practiced without one or more of the specific details, or with other structures, components, and so forth. In other instances, well-known structures or operations are not shown or described in detail to avoid obscuring the focus of the application.
Please refer to fig. 1, which is a schematic structural diagram of an electronic device 2 provided with a biometric apparatus 1 according to an embodiment of the present application. The biometric identification device 1 is used for performing biometric identification based on three-dimensional data on a visitor to verify the visitor's identity; the biometric features include, for example but not limited to, the face, iris, fingerprints, and veins. The biometric device 1 can be mounted on any electronic device 2 that requires identity verification to provide the corresponding verification function. The electronic device 2 is, for example but not limited to, an intelligent door lock, a monitoring camera, an access control system, a vehicle, an unstaffed retail store, or a self-service checkout terminal.
Please refer to fig. 2, which is a schematic diagram of functional modules of the biometric apparatus 1 according to an embodiment of the present disclosure. Optionally, in some embodiments, the biometric identification device 1 includes a first sensor 12, a second sensor 14, a processor 16, and a control system 18. The first sensor 12 is used for acquiring image information of a scene. The second sensor 14 is used to acquire depth information of a scene, and the depth information can be used for identification or state perception after being analyzed. The processor 16 is connected to the first sensor 12 and the second sensor 14, respectively, and controls the first sensor 12 and the second sensor 14 by executing functional modules in the control system 18 to realize corresponding biometric functions.
For example, in some embodiments, the processor 16 controls the first sensor 12 to acquire image information of a scene and senses whether preset biometric information of a visitor exists in the acquired image information. The processor 16 activates the second sensor 14 to sense whether the visitor enters a preset identification range after sensing that the visitor with preset biometric information exists in the acquired image information. If the visitor enters the preset identification range, the processor 16 identifies whether the visitor is a preset authorizer by matching and analyzing the difference between the visitor three-dimensional data acquired by the second sensor 14 and the pre-stored authorizer identity feature template.
Optionally, in some embodiments, the depth information comprises the distance between the visitor and the biometric recognition apparatus 1 or the electronic device 2, and three-dimensional data of the visitor's face. Three-dimensional reconstruction of the visitor's face can be realized by logical operations on the three-dimensional face data, so as to construct a three-dimensional point cloud of the face. The biometric recognition device 1 matches the acquired three-dimensional face data of the visitor against a pre-stored authorizer identity feature template and analyzes the difference, for example the difference between the three-dimensional face data, to perform identification. The three-dimensional face data of an authorizer can be obtained by three-dimensionally scanning the authorizer's face with the second sensor 14 and stored as the authorizer's identity template for subsequent identification. In addition, in some embodiments, the second sensor 14 may use infrared light for three-dimensional data collection and recognition, so that the three-dimensional face data of the visitor can be acquired accurately even in a dark environment, giving better adaptability and stability. Moreover, because human skin reflects and absorbs infrared light differently from other materials, the second sensor 14 can also distinguish real human skin from a forged face model or face photograph, improving the reliability of identification.
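A minimal sketch of the template-matching step described above, under assumptions not stated in the patent: both point sets are already aligned in a common frame (a real pipeline would first register the clouds, e.g. with ICP), and the tolerance value is hypothetical:

```python
import math

# Sketch (assumed data and threshold): matching acquired three-dimensional
# face data against a stored authorizer template by the mean per-point
# distance between corresponding points.

def mean_point_distance(acquired, template):
    """Mean Euclidean distance between corresponding (x, y, z) points."""
    assert len(acquired) == len(template)
    total = sum(math.dist(a, t) for a, t in zip(acquired, template))
    return total / len(acquired)

def is_authorizer(acquired, template, tolerance=2.0):
    """Accept if the average deviation stays below the tolerance (mm, say)."""
    return mean_point_distance(acquired, template) < tolerance

template = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0), (5.0, 8.0, 2.0)]
probe    = [(0.1, 0.0, 0.0), (10.0, 0.2, 0.0), (5.0, 8.1, 2.1)]
print(is_authorizer(probe, template))  # True: small average deviation
```

A probe cloud offset by a large distance from the template would fail the same check, which is the point of comparing three-dimensional data rather than a flat photograph.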
The biometric identification device 1 may further include a storage medium 13 and a power source 15. The first sensor 12, the second sensor 14, the processor 16, the power supply 15, the storage medium 13, and the control system 18 may be connected to each other through a bus to transmit data and signals to each other.
The power supply 15 may provide power for each component of the electronic device 2 by connecting with the mains supply and performing corresponding adaptation processing. The power source 15 may also include an energy storage element such as a battery to provide power to the various components via the battery.
The storage medium 13 includes, but is not limited to, flash memory, Electrically Erasable Programmable Read-Only Memory (EEPROM), Programmable Read-Only Memory (PROM), and a hard disk. The storage medium 13 is used for storing the preset identity feature template of an authorizer, intermediate data generated during identification, computer software code implementing the identification and control functions, and the like.
Optionally, in some embodiments, the storage medium 13, the power supply 15, and the processor 16 may also be disposed in the electronic device 2, which is not specifically limited in this application.
The first sensor 12 is used for acquiring image information of a scene, such as a two-dimensional photograph of the area in front of the biometric recognition apparatus 1 or the electronic device 2. By analyzing the image information, it can be judged whether preset biometric information exists in the image, and thus whether a person appears in the scene. The preset biometric information is, for example, human face feature information, and may include the facial features of a human face (eyebrows, eyes, ears, nose, and mouth) and their proportions, sizes, colors, brightness, contours, and so on. Optionally, in some embodiments, the image information may be a visible light image or a non-visible light image, for example an infrared or near-infrared image.
Optionally, in some embodiments, the first sensor 12 includes a first transmitting unit 122 and a first receiving unit 124. The first emitting unit 122 is configured to emit a first light signal in a floodlight mode to illuminate the scene in front of the apparatus, providing the illumination necessary for acquiring image information of the scene. The first optical signal may be visible light or non-visible light, such as infrared or near-infrared light. The first receiving unit 124 is configured to receive the reflected first optical signal to obtain the corresponding image information. Optionally, the first receiving unit 124 is, for example, an image sensor.
It is understood that, in some other embodiments, the first transmitting unit 122 may be omitted, and the first receiving unit 124 obtains the image information of the scene by receiving the reflected ambient light.
The second sensor 14 is used to acquire depth information of the scene. Alternatively, in some embodiments, the second sensor 14 may comprise one or more sets of components corresponding to one or more of the three-dimensional sensing principles employed. Optionally, in some embodiments, the second sensor 14 includes a second emitting unit 142 and a second receiving unit 144, and the second emitting unit 142 is configured to emit a second light signal to illuminate the scene. The second receiving unit 144 is configured to receive the reflected second light signal to obtain depth information of the scene.
Optionally, in some embodiments, if the depth information is acquired according to the structured light sensing principle, the second optical signal emitted by the second emitting unit 142 is a structured light beam, and the second receiving unit 144 is, for example, an image sensor configured to receive the reflected second optical signal to acquire a structured light image of the scene. The acquired structured light image of the scene is compared with a preset structured light reference image to obtain the depth information of the scene.
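The comparison with the reference image can be sketched under the common reference-plane model, which is an assumption here (the patent does not specify the recovery formula): a speckle's pixel disparity d relative to a reference image taken at distance Z0 gives 1/Z = 1/Z0 + d/(f·b), with focal length f in pixels and projector–camera baseline b. The numeric values below are hypothetical:

```python
# Sketch of structured-light depth recovery from the disparity of a speckle
# relative to the preset reference image (reference plane at z0_m).

def structured_light_depth(disparity_px, z0_m=1.0,
                           focal_px=600.0, baseline_m=0.05):
    """Depth in metres of a scene point from its speckle disparity.
    Positive disparity means the point is closer than the reference plane."""
    return 1.0 / (1.0 / z0_m + disparity_px / (focal_px * baseline_m))

# Zero disparity means the point lies exactly on the reference plane:
print(structured_light_depth(0.0))  # 1.0
```

With the assumed f and b, a disparity of 30 px halves the depth to 0.5 m, which shows why even small speckle shifts carry fine depth resolution at close range.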
Optionally, in some embodiments, if the depth information is acquired according to the Time-of-Flight (TOF) sensing principle, the second optical signal sent by the second transmitting unit 142 is a light pulse, and the second receiving unit 144 is configured to receive the reflected second optical signal. The depth information of the scene is obtained by calculating the time difference between the emission and reception of the second optical signal.
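The time-difference calculation reduces to one line: the pulse travels to the target and back, so depth = c·Δt/2. A minimal sketch:

```python
# Sketch of the time-of-flight relation: depth = c * (t_receive - t_emit) / 2.

C = 299_792_458.0  # speed of light, m/s

def tof_depth(t_emit_s: float, t_receive_s: float) -> float:
    """Depth in metres from the emission and reception timestamps."""
    return C * (t_receive_s - t_emit_s) / 2.0

# A round trip of ~6.67 ns corresponds to roughly 1 metre:
print(round(tof_depth(0.0, 6.671e-9), 3))  # 1.0
```

The nanosecond scale of Δt is why TOF sensors need fast timing electronics, and why the principle suits the sub-metre identification ranges discussed above.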
Alternatively, in some embodiments, as shown in fig. 3, if the depth information is acquired according to the binocular vision sensing principle, the second sensor 14 includes at least a second receiving unit 144 and a third receiving unit. The second receiving unit 144 and the third receiving unit are two image sensors located at different angles, respectively, to acquire images from different angles, respectively. The depth information of the scene can be derived by analyzing the images acquired from different angles.
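For the binocular principle, the standard rectified-stereo relation (assumed here; the patent states only that images from two angles are analyzed) gives depth Z = f·b/d from the pixel disparity d between the two views, with focal length f in pixels and baseline b between the two image sensors. The numeric values are hypothetical:

```python
# Sketch of the binocular (stereo) depth relation for two rectified cameras.

def stereo_depth(disparity_px: float,
                 focal_px: float = 700.0, baseline_m: float = 0.1) -> float:
    """Depth in metres from the disparity of a point between the two views."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

print(stereo_depth(70.0))  # 1.0 m with the assumed f and b
```

The inverse relation means nearby visitors produce large, easy-to-measure disparities, matching the close identification ranges described earlier.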
Alternatively, in some embodiments, as shown in fig. 4, because the first sensor 12 and the second sensor 14 may operate at different periods of time, the first sensor 12 and the second sensor 14 may share a common receiving unit 126, and the common receiving unit 126 is an image sensor, for example. As shown, the first sensor 12 includes a first transmitting unit 122 and a common receiving unit 126. The second sensor 14 comprises a second emitting unit 142 and a common receiving unit 126.
Optionally, in some embodiments, the control system 18 includes one or more functional modules 180 including, but not limited to, a setting module 181, an image information acquisition module 182, an image information analysis module 183, a depth information acquisition module 184, a judging module 185, and an identification module 186. The functional modules 180 may be firmware solidified in the corresponding storage medium 13 or computer software code stored in the storage medium 13. The functional modules 180 are executed by the corresponding one or more processors 16 to control the relevant components to implement the corresponding functions, such as the identity recognition function.
It is understood that in other embodiments, corresponding functions of one or more of the functional modules 180 in the control system 18 may be implemented by hardware, for example, any one or combination of the following hardware: a discrete logic circuit having a logic gate circuit for implementing a logic function on a data signal, an application specific integrated circuit having an appropriate combinational logic gate circuit, a Programmable Gate Array (PGA), a Field Programmable Gate Array (FPGA), or the like.
The setting module 181 is configured to preset the various parameter information needed during identification, including, but not limited to, the preset biometric information, the verification start condition used to decide whether to start identity verification, the reference image required for depth information calculation, and the identity feature template required for identity recognition. The parameter information may be stored in the storage medium 13. It is understood that the parameter information may be preset by the manufacturer before the product leaves the factory, or set and adjusted by the user during use.
Optionally, in some embodiments, the verification start condition comprises a first verification start condition and a second verification start condition. The first verification starting condition is as follows: preset biological characteristic information exists in the acquired image information. The second verification start condition is: the visitor has entered a preset identification range. The identification range may be defined by a distance threshold from the electronic device 2 or the biometric recognition apparatus 1. For example, in some embodiments, the identification range may be at least a portion of the spatial range within a predetermined identification distance threshold from the electronic device 2 or the biometric identification device 1. Alternatively, the recognition distance threshold may be 0.3 meters, 0.5 meters, 0.6 meters, 0.8 meters, 1 meter, 1.2 meters, 1.5 meters, or other suitable distance. It will be appreciated that the identification distance is a working distance that facilitates the acquisition of high accuracy depth information by the second sensor 14. Since the second sensor 14 needs to acquire the depth information of the visitor, particularly the three-dimensional data of the face of the visitor, for the subsequent identification, it is advantageous for the second sensor 14 to acquire the depth information with higher accuracy if the visitor is located in a close recognition range.
Alternatively, in some embodiments, the biometric recognition apparatus 1 is configured to perform the identification based on the three-dimensional data only when the first authentication initiation condition and the second authentication initiation condition are satisfied at the same time.
The image information acquiring module 182 is configured to control the first sensor 12 to acquire image information of the front scene. The image information analysis module 183 is configured to analyze whether preset biometric information exists in the acquired scene image information. For example, in some embodiments, the image information analysis module 183 may analyze the acquired image information to obtain feature points in the image, such as boundary contours, and compare the feature points with the preset biometric information, such as the facial features, to judge whether the acquired image information contains the preset biometric information. If the image information contains the preset face feature information, the visitor in the front scene is likely a person who wants to perform identity verification, and the condition for starting depth sensing to perform identity verification may be considered satisfied.
The depth information acquiring module 184 is configured to control the second sensor 14 to acquire depth data of the scene, and to analyze and process the depth data acquired by the second sensor 14 to obtain depth information of the scene. Optionally, in some embodiments, the depth information comprises the distance between the visitor and the biometric recognition apparatus 1 or the electronic device 2, and the three-dimensional data of the visitor's face. The method for acquiring the depth information depends on the three-dimensional sensing principle adopted by the second sensor 14. For example, in some embodiments, the second sensor 14 employs the structured-light sensing principle: it projects a structured light beam, such as irregularly arranged speckle beams, into the space where the scene is located, and acquires the structured light image formed by the structured light beams in that space. The depth information acquiring module 184 then obtains the depth information of the visitor's face or of the scene space by calculating the distortion between the acquired structured light image and a preset reference image. In other embodiments, the second sensor 14 uses the TOF (time-of-flight) principle: it emits light pulses toward the visitor or the space where the visitor is located at a specific frequency or period, and then receives the light pulses reflected back. The depth information acquiring module 184 obtains the depth information of the visitor or the surrounding space by calculating the time difference between the emission and reception of each light pulse. In still other embodiments, the second sensor 14 and the depth information acquiring module 184 may use the binocular vision sensing principle to acquire the depth information of the visitor or the space where the visitor is located.
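The TOF variant above recovers distance from the pulse's round-trip time. A minimal sketch of that arithmetic (illustrative only; the timing values are assumed, not from the patent):

```python
# Sketch of the TOF depth calculation: the sensor measures the delay between
# emitting a light pulse and receiving its reflection; the distance is half
# the round trip at the speed of light.
C = 299_792_458.0  # speed of light in air (approx.), m/s

def tof_distance(emit_time_s: float, receive_time_s: float) -> float:
    """Distance to the target in meters, from pulse emission/reception times."""
    round_trip = receive_time_s - emit_time_s
    return C * round_trip / 2.0

# A round trip of about 4 nanoseconds corresponds to roughly 0.6 m.
d = tof_distance(0.0, 4e-9)
```

In practice the sensor measures phase shift or gated charge rather than raw timestamps, but the depth relation is the same.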
The judging module 185 is configured to judge whether the visitor has entered a preset identification range according to a preset identification distance threshold and the obtained depth information of the visitor.
If the depth information of the visitor shows that the distance between the visitor and the biometric identification device 1 or the electronic device 2 is less than or equal to the preset identification distance threshold, it is determined that the visitor has entered the preset identification range.
If the depth information of the visitor shows that the distance between the visitor and the biometric identification device 1 or the electronic device 2 is greater than the preset identification distance threshold, it is determined that the visitor has not entered the preset identification range.
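Since the identification range is defined as the space within the identification distance threshold, the judgment reduces to a single comparison. A minimal sketch (the threshold is one of the example values listed earlier, chosen here for illustration):

```python
# Identification-range check: the visitor is in range when the measured
# distance is at or below the preset identification distance threshold.
IDENTIFICATION_DISTANCE_THRESHOLD_M = 0.8  # assumed example value from the list

def in_identification_range(visitor_distance_m: float) -> bool:
    """True when the visitor has entered the preset identification range."""
    return visitor_distance_m <= IDENTIFICATION_DISTANCE_THRESHOLD_M
```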
The identification module 186 is used for identifying whether the identity of the visitor is a preset authorizer. Optionally, in some embodiments, the identification module 186 identifies whether the visitor is a predetermined authorizer by matching and analyzing the difference between the three-dimensional data of the visitor's face acquired by the second sensor 14 and the predetermined authorizer identity template. It is understood that the recognition module 186 may send out different signals according to the recognition result after completing the recognition, for example: if the visitor is identified as the preset authorizer, an identification passing signal is sent out, and if the visitor is identified as not the preset authorizer, an authentication failure signal is sent out.
Optionally, in some embodiments, the processor 16 controls the identification module 186 to identify whether the visitor is a preset authorizer according to the three-dimensional data of the visitor's face acquired by the second sensor 14 when the verification start condition is satisfied.
Optionally, in some embodiments, the processor 16 controls the identification module 186 to stop performing identification based on three-dimensional data when the verification start condition is not satisfied.
Optionally, in some embodiments, as shown in fig. 5, the control system 18 further includes an operation module 187. The operation module 187 is configured to control corresponding hardware or run corresponding software to perform corresponding operations according to the result of the identity recognition. For example, payment, unlocking, entering a specific application program or operation interface, obtaining a specific operation permission, and the like may be executed according to the result of the identity recognition.
Optionally, in some embodiments, as shown in fig. 1 and 6, the biometric recognition apparatus 1 further includes a display screen 3, and the control system 18 may further include a display control module 188. The display screen 3 is used for displaying images or characters to prompt or guide the visitor through identity recognition. The display control module 188 is configured to control the display screen 3 to be turned on or off. Optionally, the processor 16 lights up or activates the display screen 3 through the display control module 188 when the verification start condition is satisfied, and keeps the display screen 3 off or dormant through the display control module 188 when the verification start condition is not satisfied. That is, the biometric recognition apparatus 1 activates the display screen 3 and the second sensor 14 to perform identification based on three-dimensional data only when it determines that a visitor is present in front and is close enough for high-accuracy three-dimensional data to be acquired. This markedly reduces the overall power consumption of the biometric recognition apparatus 1, and the screen lights up only when identity recognition is actually needed, making the apparatus more intelligent and benefiting both user experience and brand image.
Optionally, in some other embodiments, the display screen 3 may instead be disposed on the electronic device 2, and the biometric recognition apparatus 1 illuminates or activates the display screen 3 through the display control module 188.
Optionally, in some embodiments, the verification start condition further includes a third verification start condition, namely that the face of the visitor faces the biometric recognition device 1. The biometric recognition apparatus 1 may be configured so that the first, second, and third verification start conditions must be satisfied simultaneously for the verification start condition to be satisfied and identification based on three-dimensional data to be performed. As shown in fig. 7, the second emitting unit 142 and the second receiving unit 144 of the second sensor 14 are arranged along a predetermined baseline L, and the biometric recognition apparatus 1 can define a horizontal reference plane H containing the baseline L. The direction within the horizontal reference plane H that is perpendicular to the baseline L and points outward from the biometric recognition apparatus 1 is defined as the reference direction R. If the included angle α between the projection F1 of the visitor's face orientation F on the horizontal reference plane H and the reference direction R is smaller than a preset orientation angle threshold, the visitor is defined as facing the biometric recognition apparatus 1; in this case the second sensor 14 can acquire more comprehensive three-dimensional face data. If the included angle between the projection of the visitor's face orientation F on the horizontal plane and the reference direction R is greater than the preset orientation angle threshold, the visitor is defined as not facing the biometric recognition apparatus 1. Alternatively, in some embodiments, the orientation angle threshold may be 15 degrees, 20 degrees, 30 degrees, 45 degrees, 50 degrees, 60 degrees, or the like.
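The third condition can be sketched as an angle test between the horizontal projection of the face-orientation vector and the reference direction R. A minimal illustration, assuming vectors given as (x, y, z) with z vertical and R lying in the horizontal plane (the coordinate convention and default R are assumptions, and the threshold is one of the example values):

```python
import math

ORIENTATION_ANGLE_THRESHOLD_DEG = 30.0  # one of the example thresholds

def faces_device(face_orientation, reference_direction=(0.0, 1.0, 0.0)):
    """True when the horizontal projection of the face orientation F lies
    within the preset angle threshold of the reference direction R."""
    fx, fy, _ = face_orientation          # drop vertical component: projection onto H
    rx, ry, _ = reference_direction
    dot = fx * rx + fy * ry
    norm = math.hypot(fx, fy) * math.hypot(rx, ry)
    if norm == 0.0:
        return False                      # no horizontal component to compare
    cos_a = max(-1.0, min(1.0, dot / norm))
    return math.degrees(math.acos(cos_a)) < ORIENTATION_ANGLE_THRESHOLD_DEG
```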
In most cases, a visitor who wants to perform authentication will actively face the biometric recognition apparatus 1, while a visitor whose face deviates significantly from the apparatus usually does not want to perform authentication; for example, the visitor may merely be standing sideways to the biometric recognition device 1 while chatting with others. Therefore, adding the third verification start condition screens more accurately for the situations in which a visitor actually wants to be identified.
Accordingly, in some embodiments, as shown in fig. 8, the control system 18 further includes a face orientation calculation module 189. The face orientation calculation module 189 is configured to calculate the face orientation F of the visitor from the acquired three-dimensional data of the visitor's face. Alternatively, in some embodiments, as shown in fig. 9, the face orientation calculation module 189 extracts feature points of the visitor's face, acquires the three-dimensional data of those facial feature points, connects the extracted feature points to construct a corresponding face reference plane, and calculates the vertical (normal) vector of the face reference plane, from the three-dimensional data of the feature points constructing it, as the vector of the face orientation F. Alternatively, the facial feature points may be set to, but are not limited to, the left eye, the right eye, the tip of the nose, and the corners of the mouth; the left/right eye points may also be the corresponding eye corners.
The face orientation calculating module 189 may calculate the face orientation F of the visitor by connecting three or more facial feature points of the visitor; when more than three points are used, the feature points chosen to construct the face reference plane must lie on a common plane so that the constructed face reference plane is indeed planar. Alternatively, in some embodiments, the face orientation calculation module 189 may construct the face reference plane by connecting three feature points of the visitor's face. For example, as shown in fig. 9, the left eye, the right eye, and the left mouth corner form a face reference plane, and the face orientation F of the visitor can be obtained by calculating the vertical (normal) vector of that plane.
It will be appreciated that the face orientation F calculated from different choices of facial feature points for constructing the face reference plane will vary slightly. Alternatively, in some embodiments, as shown in fig. 10, the face orientations F1 and F2 may be calculated from two different face reference planes respectively, and the direction of the sum of the two vectors taken as the face orientation F. By analogy, the final face orientation F may also be obtained by summing the vertical vectors of several face reference planes constructed from different sets of facial feature points.
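The plane-and-normal construction above can be sketched with a cross product: two edge vectors of the feature-point triangle span the face reference plane, and their cross product is the plane's vertical vector. A small illustration (the feature-point coordinates are invented for demonstration; the sign of the normal depends on point ordering):

```python
import numpy as np

def plane_normal(p0, p1, p2):
    """Unit normal of the plane through three non-collinear 3-D points."""
    p0, p1, p2 = map(np.asarray, (p0, p1, p2))
    n = np.cross(p1 - p0, p2 - p0)
    return n / np.linalg.norm(n)

# Illustrative feature-point coordinates (not real measurements).
left_eye, right_eye = (-3.0, 0.0, 0.0), (3.0, 0.0, 0.0)
left_mouth, nose_tip = (-2.0, -5.0, 1.0), (0.0, -2.0, 2.0)

# Two face reference planes from different feature-point triples,
# with the summed (then renormalized) normal as the final orientation F.
f1 = plane_normal(left_eye, right_eye, left_mouth)
f2 = plane_normal(left_eye, right_eye, nose_tip)
face_orientation = (f1 + f2) / np.linalg.norm(f1 + f2)
```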
Alternatively, in some embodiments, the biometric recognition apparatus 1 may be configured so that satisfying any one of the first, second, and third verification start conditions is considered to satisfy the verification start condition, allowing identification based on three-dimensional data to begin.
Alternatively, in some embodiments, the biometric recognition apparatus 1 may be configured so that satisfying any two of the first, second, and third verification start conditions is considered to satisfy the verification start condition, upon which identification based on three-dimensional data may begin.
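The three configuration variants (all conditions, any one, or any two) amount to a threshold on how many conditions hold. A minimal sketch of that gating policy (the function name and parameter are illustrative):

```python
# Configurable combination of the three verification start conditions:
# require all three, any two, or any one of them to hold.
def verification_start(cond1: bool, cond2: bool, cond3: bool,
                       required: int = 3) -> bool:
    """True when at least `required` of the three start conditions hold."""
    return sum((cond1, cond2, cond3)) >= required
```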
Optionally, in some embodiments, as shown in fig. 11, the biometric recognition device 1 may further include a proximity sensor 17. The proximity sensor 17 is used for sensing whether a visitor enters a preset sensing range. The processor 16 activates the first sensor 12 and/or the second sensor 14 after the proximity sensor 17 senses that a visitor has entered the preset sensing range.
The sensing range may be defined by a distance threshold from the electronic device 2 or the biometric recognition apparatus 1. For example, in some embodiments, the sensing range may be at least a portion of the spatial range within a predetermined sensing distance threshold from the electronic device 2 or the biometric recognition apparatus 1. Alternatively, the sensing distance threshold may be 1 meter, 2 meters, 3 meters, 4 meters, 5 meters, or another suitable distance. That is, when a visitor enters the sensing range, the proximity sensor 17 can sense the visitor and send a corresponding sensing signal. It is understood that the proximity sensor 17 only needs to make a yes-or-no judgment about whether a visitor is within the preset distance, and does not need to obtain the specific distance to the visitor, so a low-power, low-accuracy sensing method can be adopted. Alternatively, in some embodiments, the proximity sensor 17 may be, but is not limited to, a human pyroelectric infrared (PIR) sensor, a capacitive proximity sensor, an inductive proximity sensor, or the like. Because the power consumption of the proximity sensor 17 is low, it can continuously sense whether a visitor enters the sensing range at a preset frequency, for example, any frequency from 1 to 60 times per second.
Accordingly, in some embodiments, the control system 18 further includes a proximity sensing module 189. The setup module 181 may be used to preset the proximity sensing frequency, for example but not limited to any frequency from 1 to 60 times per second. The proximity sensing module 189 is configured to control the proximity sensor 17 to sense, at the preset proximity sensing frequency, whether a visitor enters the preset sensing range, and to send a relevant signal after sensing that a visitor has entered the sensing range.
It is understood that in other embodiments, the proximity sensor 17 may be omitted from the biometric identification device 1, and the proximity sensing module 189 may correspondingly be omitted from the control system 18. In that case, the processor 16 controls the first sensor 12, through the image information acquiring module 182, to acquire image information of the front scene at a preset frequency, for example but not limited to any frequency from 1 to 60 times per second. The processor 16 then controls the image information analysis module 183 to analyze whether preset biometric information exists in the acquired image information, which serves as the condition for starting identity sensing based on three-dimensional data.
Optionally, in some embodiments, as shown in fig. 1, the electronic device 2 is an intelligent door lock. The intelligent door lock 2 comprises a lock body 22 and the biometric identification device 1. The lock body 22 is used for locking or unlocking the door body 4 provided with the intelligent door lock. The biometric device 1 controls the lock body 22 to unlock when recognizing the visitor as a predetermined authorized person. For example: referring to fig. 5, when the verification start condition is satisfied, the processor 16 controls the identification module 186 to identify whether the visitor is a predetermined authorizer according to the three-dimensional data of the visitor's face acquired by the second sensor 14, and controls the lock body to unlock through the operation module 187 when the visitor is identified as the predetermined authorizer.
Optionally, in some embodiments, the smart door lock 2 includes a display screen 3. The display screen 3 is used for displaying images or characters to prompt or guide the visitor through identity recognition. The biometric identification device 1 lights up or activates the display screen 3 when the verification start condition is satisfied, and keeps the display screen 3 off or dormant when the verification start condition is not satisfied.
Referring to fig. 12 and fig. 2, the present embodiment further provides a method for controlling the biometric apparatus 1. The biological characteristic identification control method comprises the following steps:
step S101, image information of a front scene is acquired.
Optionally, in some embodiments, the processor 16 controls the first sensor 12 to acquire image information of the front scene by executing the image information acquisition module 182. By analyzing the image information, it can be judged whether preset biometric information exists in it, and thus whether a person appears in the scene. The preset biometric information is, for example, face feature information, and may include the five sense organs of a human face (eyebrows, eyes, ears, nose, mouth) and their proportion, size, color, brightness, contour, and the like.
Optionally, in some embodiments, the image information may be a visible light two-dimensional image or a non-visible light two-dimensional image, for example: an infrared two-dimensional image or a near-infrared two-dimensional image.
Step S102: analyze whether preset visitor biometric information exists in the acquired image information.
Optionally, in some embodiments, the processor 16 analyzes the acquired image information by executing the image information analysis module 183 to obtain feature points in the image, and compares the feature points in the image with the biometric information to judge whether the acquired image information contains the preset visitor biometric information. The preset visitor biometric information is face feature information, which includes the five sense organs of the face and their proportion, size, color, brightness, and contour.
Step S103: keep the identification operation based on three-dimensional data stopped, and continue to execute step S101: acquire image information of the front scene.
Optionally, in some embodiments, if the acquired image information does not include the preset biometric information, the processor 16 may control the first sensor 12, by executing the image information acquiring module 182, to continue acquiring image information of the front scene at the preset frequency. In other embodiments, the processor 16 may instead determine whether to continue acquiring image information of the front scene according to other control signals.
Keeping the identity recognition operation based on three-dimensional data stopped includes, but is not limited to: stopping the acquisition of the visitor's three-dimensional data, stopping identification through matching analysis of the acquired three-dimensional data of the visitor against the preset authorizer identity feature template, and/or stopping the display of information or keeping the screen off.
Step S104, obtaining depth information of the scene.
Optionally, in some embodiments, if there is preset visitor biometric information in the acquired image information, the processor 16 controls the second sensor 14 to acquire the depth information of the scene by executing the depth information acquiring module 184.
The depth information includes a distance between the visitor and the biometric recognition apparatus 1 or the electronic device 2 and three-dimensional data of the visitor's face. The three-dimensional sensing principle for acquiring the scene depth information includes, but is not limited to, a structural light sensing principle, a TOF principle, and a binocular vision sensing principle.
Step S105, determining whether the visitor has entered a preset identification range.
The identification range may be defined by a distance threshold from the electronic device 2 or the biometric recognition apparatus 1. For example, in some embodiments, the identification range may be at least a portion of the spatial range within a predetermined identification distance threshold from the electronic device 2 or the biometric identification device 1. Alternatively, the recognition distance threshold may be 0.3 meters, 0.5 meters, 0.6 meters, 0.8 meters, 1 meter, 1.2 meters, 1.5 meters, or other suitable distance.
Optionally, in some embodiments, the processor 16 executes the determining module 185 to determine whether the visitor has entered the preset identification range according to the preset identification distance threshold and the acquired depth information of the visitor. If the depth information of the visitor shows that the distance between the visitor and the biometric identification device 1 or the electronic device 2 is less than or equal to the preset identification distance threshold, it is determined that the visitor has entered the preset identification range. If that distance is greater than the preset identification distance threshold, it is determined that the visitor has not entered the preset identification range.
Optionally, in some embodiments, if it is determined that the visitor has not entered the preset identification range, step S103 is executed to keep the identification operation based on three-dimensional data stopped, and then step S101 (acquire image information of the front scene) or step S104 (acquire depth information of the scene) continues to be executed.
Step S106: perform identity recognition based on the three-dimensional data.
Optionally, in some embodiments, if it is determined that the visitor has entered the predetermined identification range, the processor 16 executes the depth information obtaining module 184, the identification module 186 and/or the display control module 188 to implement the identification based on the three-dimensional data.
Optionally, in some embodiments, performing three-dimensional data based identification includes acquiring three-dimensional data of the visitor's face by the second sensor 14, activating or illuminating the display screen by the display control module 188, and/or identifying whether the visitor is a predetermined authorizer by executing the identification module 186 to match and analyze differences between the three-dimensional data of the visitor's face acquired by the second sensor 14 and a predetermined authorizer identity template.
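The matching step above compares the visitor's three-dimensional face data against the preset authorizer's identity template. A hedged sketch of one simple way to frame it (the feature extraction, vector form, and threshold are all illustrative assumptions, not the patent's actual matching algorithm):

```python
import numpy as np

# Illustrative template matching: reduce the visitor's 3-D face data to a
# feature vector, compare it against the stored authorizer template, and
# accept when the difference is below a match threshold.
MATCH_THRESHOLD = 0.35  # assumed value

def is_authorizer(visitor_features, template) -> bool:
    """True (identification-pass signal) when the visitor matches the template."""
    difference = np.linalg.norm(np.asarray(visitor_features) - np.asarray(template))
    return difference < MATCH_THRESHOLD
```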
Optionally, in some embodiments, as shown in fig. 13, after the step S106, the method further includes:
Step S107: perform the corresponding operation according to the identification result.
Optionally, in some embodiments, the processor 16 controls corresponding hardware or runs corresponding software according to the result of the identity recognition by executing the operation module 187, so as to perform the corresponding operation. For example, if the identity recognition passes, authorization operations such as payment, unlocking, entering a specific application program or operation interface, or obtaining a specific operation permission are executed. If the identity recognition fails, a failure prompt is issued, an alarm is raised, and/or a preset authorizer or control center is notified.
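Step S107 is essentially a dispatch on the recognition result. A minimal sketch (the operation names are illustrative placeholders for the actions listed above):

```python
# Dispatch a corresponding operation from the identity recognition result.
def on_identification_result(passed: bool) -> str:
    if passed:
        return "unlock"  # or payment, entering an app, granting a permission
    return "alarm"       # or a failure prompt / notifying the authorizer
```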
Alternatively, in some embodiments, as shown in fig. 14, the steps S104 and S105 may also be omitted, and if the acquired image information includes the preset biometric information, the step S106 is directly performed.
Optionally, in some embodiments, as shown in fig. 15, before step S106, the method further includes:
step S1050, determining whether the face of the visitor faces the biometric recognition apparatus 1, and performing step S106 when the face of the visitor faces the biometric recognition apparatus 1.
Optionally, in some embodiments, if there is preset visitor biometric information in the acquired image information, step S1050 is performed.
Optionally, in some embodiments, if it is determined that the visitor has entered the preset identification range, the step S1050 is executed.
Alternatively, in some embodiments, if it is determined that the face of the visitor is not facing the biometric recognition device, step S103 is executed to keep the identification operation based on three-dimensional data stopped, and then step S101 (acquire image information of the front scene) or step S104 (acquire depth information of the scene) continues to be executed.
Optionally, in some embodiments, as shown in fig. 16, the step S1050 may include the following sub-steps:
Step S1051: extract three-dimensional data of the feature points of the visitor's face. Optionally, in some embodiments, the processor 16 extracts the three-dimensional data of the facial feature points from the acquired three-dimensional data of the visitor's face by executing the face orientation calculation module 189. Alternatively, the facial feature points may be set to, but are not limited to, the left eye, the right eye, the tip of the nose, and the corners of the mouth; the left/right eye points may also be the corresponding eye corners.
In step S1052, the extracted facial feature points are connected to construct a corresponding face reference plane. Optionally, in some embodiments, the processor 16 connects the extracted feature points of the visitor's face by executing the face orientation calculation module 189 to construct the corresponding face reference plane. In this step, three or more facial feature points may be connected, but the feature points used to construct the face reference plane must lie on a common plane so that the constructed face reference plane is indeed planar. Alternatively, in some embodiments, the face reference plane may be constructed by connecting three feature points of the visitor's face. For example, as shown in fig. 9, the left eye, the right eye, and the left mouth corner form a face reference plane, and the face orientation F of the visitor can be obtained by calculating the vertical (normal) vector of that plane.
In step S1053, the vertical vector of the face reference plane is calculated as the vector of the face orientation F. Optionally, in some embodiments, the processor 16 calculates, by executing the face orientation calculation module 189, the vertical vector of the face reference plane from the three-dimensional data of the facial feature points constructing it, as the vector of the face orientation F.
It will be appreciated that the face orientation F calculated from different choices of facial feature points for constructing the face reference plane will vary slightly. Optionally, in some embodiments, the face orientation may be calculated for two different face reference planes, with the direction of the sum of the two vectors taken as the face orientation F. By analogy, the final face orientation F may also be obtained by summing the vertical vectors of several face reference planes constructed from different sets of facial feature points.
Step S1054, comparing the orientation F of the face of the visitor with the reference direction R defined by the biometric device 1 to determine whether the face of the visitor faces the biometric device 1.
The biometric recognition apparatus 1 defines a reference direction R as a direction pointing outward in the horizontal plane perpendicular to the base line L of the second sensor 14. If the included angle between the projection of the face orientation F of the visitor on the horizontal plane and the reference direction R is smaller than the preset orientation angle threshold, the visitor is defined to face the biometric feature recognition apparatus 1, and at this time, the second sensor 14 can acquire more comprehensive three-dimensional face data. If the included angle between the projection of the face orientation F of the visitor on the horizontal plane and the reference direction R is greater than a preset orientation angle threshold, it is defined that the visitor does not face the biometric identification device 1.
Referring to fig. 11 and 17 together, in some other embodiments, the biometric identification control method provided by the present application further includes step S100 executed before step S101:
step S100, sensing whether a visitor enters a preset sensing range, and executing step S101 only when the visitor is sensed to enter the preset sensing range.
Optionally, in some embodiments, the processor 16 executes the proximity sensing module 189 to control the proximity sensor 17 to sense, at a predetermined proximity sensing frequency, whether a visitor enters the predetermined sensing range, and to send a related signal to activate the first sensor 12 after sensing that a visitor has entered the sensing range.
The proximity sensor 17 is used for sensing whether a visitor enters a preset sensing range. The sensing range may be defined by a distance threshold from the electronic device 2 or the biometric recognition apparatus 1. For example, in some embodiments, the sensing range may be at least a portion of the spatial range within a predetermined sensing distance threshold from the electronic device 2 or the biometric identification device 1. Alternatively, the sensing distance threshold may be 1 meter, 2 meters, 3 meters, 4 meters, 5 meters, or other suitable distance.
Alternatively, in some embodiments, the proximity sensor 17 may be, but is not limited to, a human pyroelectric infrared (PIR) sensor, a capacitive proximity sensor, an inductive proximity sensor, or the like. Because the power consumption of the proximity sensor 17 is low, it can continuously sense whether a visitor enters the sensing range at a preset frequency, for example, any frequency from 1 to 60 times per second.
It is understood that the proximity sensor 17 only needs to make a yes-or-no judgment about whether a visitor is within the preset distance, and does not need to obtain the specific distance to the visitor, so a low-power, low-precision measurement principle can be adopted. The proximity sensor 17 consumes significantly less power in operation than the first sensor 12 or the second sensor 14.
The biometric identification device 1 and the corresponding biometric identification control method sense, in stages ordered by power consumption, how close a visitor is and whether the visitor intends to operate the device, and only light up the screen and start the highest-power identification based on three-dimensional data after confirming that the visitor truly wants to be identified. This avoids long-term operation of high-power components, which not only reduces the overall power consumption of the electronic device 2 but also makes the external response of the electronic device 2 better match the user's actual intent, rendering the biometric identification device 1 or the electronic device 2 more intelligent.
In the description herein, references to the description of the terms "one embodiment," "certain embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples," etc., mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Any process or method descriptions in flow charts or otherwise described herein may be understood as representing modules, segments, or portions of code which include one or more executable instructions for implementing specific logical functions or steps of the process, and alternate implementations are included within the scope of the preferred embodiment of the present application in which functions may be executed out of order from that shown or discussed, including substantially concurrently or in reverse order, depending on the functionality involved, as would be understood by those reasonably skilled in the art of the present application.
The logic and/or steps represented in the flowcharts or otherwise described herein, for example an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, a system including a processing module, or another system that can fetch instructions from the instruction execution system, apparatus, or device and execute them. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device. More specific examples (a non-exhaustive list) of the computer-readable medium include: an electrical connection (electronic device) having one or more wires, a portable computer diskette (magnetic device), a random-access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber device, and a portable compact disc read-only memory (CD-ROM). Further, the computer-readable medium may even be paper or another suitable medium upon which the program is printed, as the program can be electronically captured, for instance via optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner if necessary, and then stored in a computer storage medium.
It should be understood that portions of the embodiments of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above embodiments, the various steps or methods may be implemented in software or firmware stored in a storage medium and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, any one or a combination of the following techniques, which are well known in the art, may be used: a discrete logic circuit having logic gates for implementing a logic function on a data signal, an application-specific integrated circuit having appropriate combinational logic gates, a programmable gate array (PGA), a field-programmable gate array (FPGA), or the like.
The above description is only exemplary of the present application and should not be taken as limiting the present application, as any modification, equivalent replacement, or improvement made within the spirit and principle of the present application should be included in the protection scope of the present application.
Claims (10)
1. A biometric apparatus for performing identity recognition based on three-dimensional data of a visitor so as to verify the visitor's identity, the biometric apparatus comprising:
a first sensor for acquiring image information of a scene;
a second sensor for acquiring depth information of the scene; and
a processor for determining, from the acquired image information and/or depth information, whether a preset verification start condition is met, performing the identity recognition based on three-dimensional data when the verification start condition is met, and keeping the identity recognition based on three-dimensional data stopped when the verification start condition is not met.
2. The biometric identification device of claim 1, wherein:
the verification start condition includes a first verification start condition, a second verification start condition, and a third verification start condition, wherein:
the first verification start condition is that preset biometric information exists in the acquired image information;
the second verification start condition is that the visitor has entered a preset recognition range; and
the third verification start condition is that the visitor's face is oriented toward the biometric identification device;
the verification start condition is considered met, and the identity recognition based on three-dimensional data is performed, if any one of the first, second, and third verification start conditions is met;
if all of the first, second, and third verification start conditions are met; or
if any combination of two of the first, second, and third verification start conditions is met.
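The three start conditions and the claimed combinations (any one, any two together, or all three) reduce to a simple counting check. A minimal sketch; the function name and `policy` values are hypothetical:

```python
def verification_start_met(cond_flags, policy="all"):
    """cond_flags: booleans for the first/second/third verification start
    conditions. policy selects which claim-2 variant applies:
      "any" -> any single condition suffices (fastest trigger)
      "two" -> any two conditions must hold together
      "all" -> all three must hold (lowest false-trigger rate)
    """
    met = sum(bool(f) for f in cond_flags)
    required = {"any": 1, "two": 2, "all": len(cond_flags)}[policy]
    return met >= required
```

Stricter policies trade responsiveness for fewer spurious wake-ups of the high-power 3D recognition stage.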
3. The biometric recognition apparatus of claim 2, wherein: the preset biometric information is facial feature information, including the facial features and their proportions, sizes, colors, brightness, and contours.
4. The biometric recognition apparatus of claim 2, wherein: the recognition range is defined as at least a partial spatial range within a preset recognition distance threshold from the electronic device, and the recognition distance threshold may be 0.3, 0.5, 0.6, 0.8, 1, 1.2, or 1.5 meters.
5. The biometric recognition apparatus of claim 2, wherein: the second sensor has a preset baseline, and the biometric recognition device defines a horizontal reference plane containing the baseline; the direction in the horizontal reference plane that is perpendicular to the baseline and points outward from the biometric recognition device is defined as the reference direction; the visitor is considered to be facing the biometric recognition device if the angle between the projection of the visitor's face orientation onto the horizontal reference plane and the reference direction is smaller than a preset orientation angle threshold, which may be 15, 20, 30, 45, 50, or 60 degrees.
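The orientation test of claim 5 amounts to projecting the face-orientation vector onto the horizontal reference plane and measuring its angle against the reference direction. A sketch under assumed axis conventions (x along the baseline, y as the reference direction, z vertical); the function names are hypothetical:

```python
import math

def horizontal_angle_deg(face_dir, ref_dir=(0.0, 1.0)):
    """Angle in degrees between the horizontal projection of the visitor's
    face-orientation vector and the device's reference direction.
    face_dir is (x, y, z) with z vertical; projection drops the z component."""
    fx, fy = face_dir[0], face_dir[1]           # horizontal projection
    norm_f = math.hypot(fx, fy)
    norm_r = math.hypot(*ref_dir)
    cosang = (fx * ref_dir[0] + fy * ref_dir[1]) / (norm_f * norm_r)
    return math.degrees(math.acos(max(-1.0, min(1.0, cosang))))

def faces_device(face_dir, threshold_deg=30.0, ref_dir=(0.0, 1.0)):
    """True if the projected face orientation lies within threshold_deg
    of the reference direction (one of the claimed example thresholds)."""
    return horizontal_angle_deg(face_dir, ref_dir) <= threshold_deg
```

Tilting the head up or down changes only the z component and therefore does not affect the result, which matches the claim's use of a horizontal projection.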
6. The biometric identification device of claim 5, wherein: the processor calculates the visitor's face orientation from the acquired three-dimensional data of the visitor's face by extracting feature points of the face, connecting the extracted feature points to construct a corresponding face reference plane, and computing the vector perpendicular to the face reference plane from the three-dimensional data of the feature points that construct it, this perpendicular vector serving as the visitor's face orientation.
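The normal-vector construction of claim 6 can be illustrated with three landmarks and a cross product. The landmark choice (two eyes and the mouth) and function name are illustrative assumptions, not the patent's prescribed feature points:

```python
import math

def face_plane_normal(p_left_eye, p_right_eye, p_mouth):
    """Unit normal of the plane through three facial landmarks, used as the
    face-orientation vector. Each point is (x, y, z) in the device frame."""
    # Two edge vectors spanning the face reference plane
    u = tuple(b - a for a, b in zip(p_left_eye, p_right_eye))  # eye-to-eye
    v = tuple(b - a for a, b in zip(p_left_eye, p_mouth))      # eye-to-mouth
    # Cross product u x v is perpendicular to both edges, i.e. to the plane
    n = (u[1] * v[2] - u[2] * v[1],
         u[2] * v[0] - u[0] * v[2],
         u[0] * v[1] - u[1] * v[0])
    length = math.sqrt(sum(c * c for c in n))
    return tuple(c / length for c in n)
```

The sign of the normal depends on the landmark ordering; a real implementation would orient it consistently (e.g., toward the device) before the angle test of claim 5.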
7. The biometric apparatus according to any one of claims 1 to 6, wherein: the biometric recognition device further comprises a proximity sensor for sensing whether a visitor has entered a preset sensing range; the processor starts the first sensor to acquire image information of the scene only after the proximity sensor senses that a visitor has entered the sensing range; the sensing range is at least a partial spatial range within a preset sensing distance threshold from the biometric recognition device, and the sensing distance threshold may be 1, 2, 3, 4, or 5 meters.
8. The biometric identification device of claim 7, wherein the proximity sensor is a pyroelectric infrared sensor, a capacitive proximity sensor, or an inductive proximity sensor.
9. The biometric apparatus according to any one of claims 1 to 6, further comprising a display screen, wherein keeping the identity recognition based on three-dimensional data stopped includes stopping the second sensor and/or keeping the display screen off or turning the display screen off.
10. An electronic device, comprising:
the biometric apparatus as claimed in any one of claims 1 to 9, configured to perform identity recognition based on three-dimensional data of a visitor to verify the visitor's identity.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202011627196.9A CN112764515A (en) | 2020-12-31 | 2020-12-31 | Biological characteristic recognition device and electronic equipment |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202011627196.9A CN112764515A (en) | 2020-12-31 | 2020-12-31 | Biological characteristic recognition device and electronic equipment |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN112764515A true CN112764515A (en) | 2021-05-07 |
Family
ID=75699354
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202011627196.9A Withdrawn CN112764515A (en) | 2020-12-31 | 2020-12-31 | Biological characteristic recognition device and electronic equipment |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN112764515A (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103731577A (en) * | 2012-10-15 | 2014-04-16 | 富士施乐株式会社 | Power supply control apparatus, image processing apparatus, and power supply control method |
| CN108427871A (en) * | 2018-01-30 | 2018-08-21 | 深圳奥比中光科技有限公司 | 3D faces rapid identity authentication method and device |
| CN109255282A (en) * | 2017-07-14 | 2019-01-22 | 上海荆虹电子科技有限公司 | A kind of biometric discrimination method, device and system |
| CN111223219A (en) * | 2019-12-31 | 2020-06-02 | 深圳阜时科技有限公司 | A kind of identification method and storage medium |
- 2020-12-31: Application CN202011627196.9A filed; published as CN112764515A (not active, withdrawn)
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN115588509A (en) * | 2022-10-19 | 2023-01-10 | 秦皇岛市惠斯安普医学系统股份有限公司 | Multi-mode mammary gland health detection system |
| CN115588509B (en) * | 2022-10-19 | 2023-05-02 | 秦皇岛市惠斯安普医学系统股份有限公司 | Multi-mode breast health detection system |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US9836648B2 (en) | Iris biometric recognition module and access control assembly | |
| KR101415287B1 (en) | Method, computer-readable storage device and computing device for liveness detercion | |
| US9256794B2 (en) | Systems and methods for face authentication or recognition using spectrally and/or temporally filtered flash illumination | |
| US20080212849A1 (en) | Method and Apparatus For Facial Image Acquisition and Recognition | |
| CN109643366A (en) | Method and system for monitoring the condition of a vehicle driver | |
| KR20170093108A (en) | Control of wireless communication device capability in a mobile device with a biometric key | |
| TWI533234B (en) | Control method based on eye's motion and apparatus using the same | |
| CN103514440A (en) | Facial recognition | |
| TW201327413A (en) | Systems and methods for face authentication or recognition using spectrally and/or temporally filtered flash illumination | |
| US10930126B1 (en) | Motion sensing for electronic devices | |
| CN107479801A (en) | Displaying method of terminal, device and terminal based on user's expression | |
| CN108647504B (en) | Method and system for realizing information safety display | |
| CN111223219A (en) | A kind of identification method and storage medium | |
| CN108629278B (en) | System and method for realizing information safety display based on depth camera | |
| CN212160784U (en) | An identification device and access control equipment | |
| CN108924340A (en) | Authentication method, authentication apparatus, and computer-readable storage medium | |
| CN113313856A (en) | Door lock system with 3D face recognition function and using method | |
| CN112668539A (en) | Biological characteristic acquisition and identification system and method, terminal equipment and storage medium | |
| CN112764515A (en) | Biological characteristic recognition device and electronic equipment | |
| CN111063085A (en) | Identity recognition device and entrance guard equipment | |
| US8682041B2 (en) | Rendering-based landmark localization from 3D range images | |
| CN112632510A (en) | Information protection method and storage medium | |
| CN112784323A (en) | Information protection device and electronic equipment | |
| CN112764516A (en) | Biological characteristic recognition control method and storage medium | |
| CN111144219A (en) | Iris recognition device and method based on 3D structured light |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | | |
| SE01 | Entry into force of request for substantive examination | | |
| WW01 | Invention patent application withdrawn after publication | | |
Application publication date: 2021-05-07