WO2018195979A1 - Tracking control method, device, and aircraft - Google Patents
Tracking control method, device, and aircraft
- Publication number
- WO2018195979A1 WO2018195979A1 PCT/CN2017/082562 CN2017082562W WO2018195979A1 WO 2018195979 A1 WO2018195979 A1 WO 2018195979A1 CN 2017082562 W CN2017082562 W CN 2017082562W WO 2018195979 A1 WO2018195979 A1 WO 2018195979A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- target object
- tracking
- face
- feature
- preset
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/12—Target-seeking control
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/13—Satellite images
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/10—Terrestrial scenes
- G06V20/17—Terrestrial scenes taken from planes or by drones
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/166—Detection; Localisation; Normalisation using acquisition arrangements
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/168—Feature extraction; Face representation
- G06V40/171—Local features and components; Facial parts ; Occluding parts, e.g. glasses; Geometrical relationships
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B64—AIRCRAFT; AVIATION; COSMONAUTICS
- B64U—UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
- B64U2101/00—UAVs specially adapted for particular uses or applications
- B64U2101/30—UAVs specially adapted for particular uses or applications for imaging, photography or videography
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10032—Satellite or aerial image; Remote sensing
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
- G06T2207/30201—Face
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V2201/00—Indexing scheme relating to image or video recognition or understanding
- G06V2201/07—Target detection
Definitions
- the present invention relates to the field of electronic technologies, and in particular, to a tracking control method, apparatus, and aircraft.
- the existing aircraft tracking strategy is to track an object with obvious features as a target object. Usually, in the process of tracking a certain feature part of the target object (such as a human face), the distance between the aircraft and the target object keeps changing, so the size proportion of the target object's tracking frame in the captured image changes as well, which affects the tracking effect.
- for example, when tracking the same feature part of the target object, if the distance between the aircraft and the target object becomes small, the tracking frame of the target object occupies a large proportion of the captured image (such as 10%), which may slow down the tracking, easily cause the tracked target object to be lost, and make the tracking control less robust; if the distance between the aircraft and the target object becomes large, the tracking frame of the target object occupies a small proportion of the captured image (such as 5%), which blurs the features of the tracked target object and makes the tracking control poorly robust.
- the embodiment of the invention discloses a tracking control method, device and aircraft, which can prevent the tracking target object from being lost, and can improve the robustness of the tracking control.
- an embodiment of the present invention provides a tracking control method, where the method includes:
- acquiring a tracking parameter of a target object; determining a feature part of the target object according to the tracking parameter; and tracking the target object by using the feature part.
- an embodiment of the present invention provides a tracking control apparatus, where the apparatus includes:
- the acquisition module is used to obtain the tracking parameters of the target object.
- a determining module configured to determine a feature part of the target object according to the tracking parameter.
- a tracking module configured to track the target object by using the feature part.
- an embodiment of the present invention provides an aircraft, where the aircraft includes:
- a processor and a memory, the processor and the memory being connected by a bus, the memory storing executable program code, and the processor being configured to call the executable program code to perform the tracking control method provided by the first aspect of the embodiments of the present invention.
- through the embodiments of the present invention, a tracking parameter of the target object can be obtained, a feature part of the target object is determined according to the tracking parameter, and the target object is tracked by using the feature part; the feature part used for tracking the target object can be re-determined based on the tracking parameter, which prevents the tracked target object from being lost and improves the robustness of the tracking control.
- FIG. 1a is a schematic flowchart of a process for switching an aircraft from far-field tracking to near-field tracking according to an embodiment of the present invention;
- FIG. 1b is a schematic flowchart of a process for switching an aircraft from near-field tracking to far-field tracking according to an embodiment of the present invention;
- FIG. 2 is a schematic diagram of searching for a face in a tracking frame according to an embodiment of the present invention
- FIG. 3 is a schematic diagram of tracking a face and a shoulder according to an embodiment of the present invention.
- FIG. 4 is a schematic diagram of searching for a human body in a tracking frame according to an embodiment of the present invention.
- FIG. 5 is a schematic diagram of a tracking human body disclosed in an embodiment of the present invention.
- FIG. 6 is a schematic flowchart of a tracking control method according to an embodiment of the present invention.
- FIG. 7 is a schematic block diagram of a tracking control apparatus according to an embodiment of the present invention.
- FIG. 8 is a schematic block diagram of an aircraft disclosed in an embodiment of the present invention.
- the embodiments of the present invention disclose a tracking control method and device, and an aircraft, which are used to re-determine, based on a tracking parameter, the feature part used for tracking a target object; this can prevent the tracked target object from being lost and can improve the robustness of the tracking control. Detailed descriptions are given below.
- according to the actual tracking behavior of the aircraft, the tracking process includes two aspects: one is the process of switching the aircraft from far-field tracking to near-field tracking, and the other is the process of switching the aircraft from near-field tracking to far-field tracking. The two aspects are introduced in detail below.
- near-field tracking and far-field tracking may be defined according to tracking parameters such as the distance between the aircraft and the tracked target object, or the size proportion of the tracking frame in the captured image. For example, near-field tracking refers to tracking performed when the distance between the aircraft and the tracked target object is less than or equal to a preset distance threshold, and far-field tracking refers to tracking performed when the distance between the aircraft and the tracked target object is greater than the preset distance threshold.
- FIG. 1a is a schematic flowchart of a process for switching an aircraft from far-field tracking to near-field tracking according to an embodiment of the present invention, including:
- step S1011: the aircraft determines whether the switching condition for switching from far-field tracking to near-field tracking is satisfied; if yes, step S1012 is executed; if not, the current flow ends.
- during far-field tracking, the aircraft tracks the human body of the target object. When the aircraft approaches the target object, that is, when the aircraft switches from far-field tracking to near-field tracking, the size proportion of the tracking frame in the captured image becomes larger and the tracking slows down, so the aircraft can determine, for near-field tracking, the feature part of the target object to be tracked, so as to improve the tracking efficiency.
- the aircraft can determine whether the switching condition for switching from far-field tracking to near-field tracking is satisfied based on flight parameters such as the tracking speed, the size proportion of the tracking frame in the captured image, and the distance between the target object and the aircraft. When the tracking speed is less than a preset speed threshold, and/or the size proportion is greater than or equal to a preset proportion threshold (e.g., 10%), and/or the distance between the target object and the aircraft is less than or equal to a preset distance threshold (e.g., 3 m), the aircraft may determine that the switching condition for switching from far-field tracking to near-field tracking is satisfied.
- the preset speed threshold, the preset proportion threshold, and the preset distance threshold may be dynamically set by the aircraft according to factors such as the tracking accuracy and the tracking speed (i.e., the tracking efficiency), or may be set by the user according to personal preference, which is not limited in the embodiments of the present invention.
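- as an illustration only, the switching check above can be sketched in a few lines of Python. The function name, parameter names, and default threshold values below are assumptions made for the example (the text leaves the thresholds configurable), and the "and/or" combination of conditions is rendered here as a logical OR:

```python
def should_switch_to_near_field(tracking_speed: float,
                                frame_size_ratio: float,
                                target_distance_m: float,
                                speed_threshold: float = 0.5,
                                ratio_threshold: float = 0.10,
                                distance_threshold: float = 3.0) -> bool:
    # Any one of the disclosed conditions is enough to trigger the switch:
    # slow tracking, a large tracking frame, or a close target.
    return (tracking_speed < speed_threshold
            or frame_size_ratio >= ratio_threshold
            or target_distance_m <= distance_threshold)
```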
- further, the aircraft needs to determine the feature part of the target object for near-field tracking. Since the features of the human face are relatively obvious during near-field tracking, the aircraft can track the face, or the face and shoulders, of the target object to improve the tracking efficiency and the robustness of the tracking control. The specific steps by which the aircraft determines the face, or the face and shoulders, of the target object for near-field tracking are as follows:
- step S1012: the aircraft can perform face recognition in the tracking frame to obtain the face of the target object.
- specifically, the aircraft can perform face recognition in the tracking frame to detect the faces contained in the tracking frame, determine a search center point in the tracking frame, search for faces centered on the search center point, and take the face closest to the search center point as the face of the target object.
- for example, a point that is located on the center line of the tracking frame, and whose distance from the lower boundary of the tracking frame divided by the distance between the upper and lower boundaries of the tracking frame equals a preset ratio threshold, is taken as the search center point, and faces are searched for centered on this point.
- the preset ratio threshold may be set according to actual requirements; for example, it may be any value between 0.5 and 0.9. These values are merely illustrative and are not limited in the embodiments of the present invention.
- for example, with the preset ratio threshold set to 0.9, as shown in FIG. 2, the aircraft may search for faces centered on the black point in the tracking frame and take the face closest to the black point as the face of the target object.
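- a minimal sketch of this search, assuming face detections are given as (x, y, w, h) boxes with the origin at the top-left of the image; the function and parameter names are illustrative, not from the disclosure:

```python
import math

def search_center_point(track_box, ratio_threshold=0.9):
    # Point on the vertical center line of the tracking frame whose distance
    # from the lower boundary is ratio_threshold times the frame height.
    x, y, w, h = track_box          # top-left corner, width, height
    cx = x + w / 2.0
    cy = (y + h) - ratio_threshold * h
    return cx, cy

def face_of_target(faces, track_box, ratio_threshold=0.9):
    # Among the detected faces, return the one whose center lies closest to
    # the search center point; None when no face was detected.
    cx, cy = search_center_point(track_box, ratio_threshold)
    return min(faces,
               key=lambda f: math.hypot(f[0] + f[2] / 2.0 - cx,
                                        f[1] + f[3] / 2.0 - cy),
               default=None)
```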
- it should be noted that when the tracked target object is lost, since it is known that the current tracking target is a face, or a face and shoulders, a targeted search can be performed in the tracking-frame area according to the position information of the face, making it easier to retrieve the face, or the face and shoulders, of the target object.
- step S1013: after the face of the target object is obtained, the range of the face is expanded. Since the face region alone is too small to track well, expanding it allows a combination of more parts, such as the face, shoulders, and neck, to be tracked, which improves the robustness of the tracking control.
- specifically, the features are more obvious when the face is a front face than when it is a side face, so different expansion strategies can be adopted for a front face and for a side face to obtain a suitable expansion range, further improving the robustness of the tracking control.
- a front face means that the face of the target object directly faces the lens of the aircraft's camera; a side face means that the left or right side of the target object's face faces the lens of the camera, that is, a side face includes a face turned to the left and a face turned to the right.
- more specifically, face recognition is performed on the face of the target object to determine whether the face is a front face or a side face. For example, the aircraft may use a face recognition algorithm to detect the face of the target object and obtain the position information of parts such as the left and right eyes, the tip of the nose, and the corners of the mouth, as well as the position information of the center points of the eyes, and then use the symmetry of the two eye center points about the nose tip to determine whether the face of the target object is a front face or a side face. The decision formulas are as follows:
- dist_leye2nose = (x_nose – x_leye) / (x_reye – x_leye)
- dist_reye2nose = (x_reye – x_nose) / (x_reye – x_leye)
- if abs(dist_leye2nose – dist_reye2nose) > 0.35, the face of the target object is a side face; otherwise, the face of the target object is a front face;
- if dist_leye2nose > dist_reye2nose, the face of the target object is turned to the left; otherwise, the face of the target object is turned to the right;
- where dist_leye2nose represents the distance from the left eye to the nose on the target object's face, dist_reye2nose represents the distance from the right eye to the nose, x_nose represents the abscissa of the nose, x_leye represents the abscissa of the left eye, x_reye represents the abscissa of the right eye, and abs() represents the absolute value.
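- the decision above can be sketched as follows, assuming the abscissas of the left eye, right eye, and nose tip are available from a landmark detector; only the two ratio formulas and the 0.35 threshold come from the text, the rest is an illustrative assumption:

```python
def classify_face_orientation(x_leye: float, x_reye: float, x_nose: float):
    # Returns ('front', None), ('side', 'left') or ('side', 'right').
    span = x_reye - x_leye
    dist_leye2nose = (x_nose - x_leye) / span
    dist_reye2nose = (x_reye - x_nose) / span
    # The two ratios sum to 1 and are both close to 0.5 for a symmetric,
    # camera-facing face, so a large difference indicates a side face.
    if abs(dist_leye2nose - dist_reye2nose) > 0.35:
        side = 'left' if dist_leye2nose > dist_reye2nose else 'right'
        return 'side', side
    return 'front', None
```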
- further, the aircraft can expand the current tracking frame to make the resulting tracking frame larger, thereby expanding the range of the face.
- specifically, when the face of the target object is a front face, an expansion factor may be set to expand the current tracking frame so that the tracking frame is stretched both horizontally and vertically. For example, with the expansion factor set to 1.7, the expansion formulas for the current tracking frame are as follows:
- expand = 1.7; box_x = x; box_y = y + h * expand * 0.25; box_w = w * expand * 1.35; box_h = h * expand;
- where expand is the expansion factor, x is the abscissa of the center point of the current tracking frame, y is the ordinate of the center point of the current tracking frame, h is the height of the current tracking frame, w is the width of the current tracking frame, box_x is the abscissa of the center point of the expanded tracking frame, box_y is the ordinate of the center point of the expanded tracking frame, box_h is the height of the expanded tracking frame, and box_w is the width of the expanded tracking frame.
- when the face of the target object is a side face, an expansion factor may likewise be set to expand the current tracking frame, so that when the face is turned to the left the tracking frame is stretched to the left and also stretched vertically, and when the face is turned to the right the tracking frame is stretched to the right and also stretched vertically. For example, with the expansion factor set to 2.0, the expansion formulas for the current tracking frame are as follows:
- expand = 2.0; box_x = x + direction * box_w (direction = -1 when the face is turned to the left, direction = 1 when the face is turned to the right); box_y = y + h * expand * 0.25; box_w = w * expand * 1.5; box_h = h * expand;
- where direction is a weight value for the face orientation.
- when the face is a side face, after the above expansion, the tracking frame may be expanded once more so that the head and shoulders of the target object can be tracked. For example, when the side face is turned to the left, the tracking frame can be doubled to the right in the horizontal direction, and when the side face is turned to the right, the tracking frame can be doubled to the left in the horizontal direction.
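- a minimal sketch of the two expansion strategies, assuming the tracking frame is given by its center point (x, y), width w, and height h; the numeric factors follow the formulas above, while the function shape and the simplified handling of the second side-face expansion are assumptions:

```python
def expand_tracking_box(x, y, w, h, orientation, direction=0):
    # x, y: center of the current tracking frame; w, h: its width and height.
    # orientation: 'front' or 'side'; direction: -1 when the face is turned
    # to the left, +1 when it is turned to the right (unused for 'front').
    if orientation == 'front':
        expand = 1.7
        box_w = w * expand * 1.35
        box_h = h * expand
        box_x = x
        box_y = y + h * expand * 0.25
    else:
        expand = 2.0
        box_w = w * expand * 1.5
        box_h = h * expand
        box_x = x + direction * box_w
        box_y = y + h * expand * 0.25
        # Second side-face expansion from the text, simplified here: double
        # the width so the head and shoulders fall inside the frame.
        box_w *= 2.0
    return box_x, box_y, box_w, box_h
```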
- it should be noted that the larger the expansion factor, the larger the expanded tracking frame, the more human-body features (possibly including parts other than the face) are tracked, and the more robust the tracking control becomes, but the slower the tracking gets. The size of the expansion factor can therefore be set by the aircraft as a trade-off between the robustness of the tracking control and the tracking speed, or by the user according to personal preference. Since the features of the target object's face are more obvious when it is a front face than when it is a side face, a smaller expansion factor (such as 1.8) can be set for a front face, and a larger expansion factor (such as 2.1) for a side face.
- after the tracking frame is expanded as described above, the resulting tracking frame becomes larger; that is, the face of the target object with an expanded range is obtained. The expanded face of the target object may include parts such as the face and shoulders of the target object, and since the head-and-shoulders outline is a very distinctive feature of the upper body, this can greatly improve the robustness of the tracking control.
- step S1014: after obtaining the expanded face of the target object, the aircraft can track the expanded face of the target object.
- specifically, during near-field tracking, the aircraft can track the determined face and shoulders of the target object to improve the tracking efficiency and robustness; as shown in FIG. 3, the solid-line frame is the captured image, the dashed-line frame is the tracking frame, and the face and shoulders of the target object are within the tracking frame.
- FIG. 1b is a schematic flowchart of a process for switching an aircraft from near field tracking to far field tracking according to an embodiment of the present invention, including:
- step S1021: the aircraft determines whether the switching condition for switching from near-field tracking to far-field tracking is satisfied; if yes, step S1022 is executed; if not, the current flow ends.
- during near-field tracking, the aircraft tracks the face, or the expanded face, of the target object. When the aircraft moves away from the target object, that is, when the aircraft switches from near-field tracking to far-field tracking, the size proportion of the tracking frame in the captured image becomes smaller and the tracking accuracy deteriorates, so the aircraft can determine, for far-field tracking, the feature part of the target object to be tracked, so as to improve the tracking accuracy.
- the aircraft can determine whether the switching condition for switching from near-field tracking to far-field tracking is satisfied based on flight parameters such as the tracking accuracy, the size proportion of the tracking frame in the captured image, and the distance between the target object and the aircraft. When the tracking accuracy is less than a preset accuracy threshold, and/or the size proportion is less than or equal to a preset proportion threshold, and/or the distance between the target object and the aircraft is greater than or equal to a preset distance threshold, the aircraft may determine that the switching condition for switching from near-field tracking to far-field tracking is satisfied.
- it should be noted that the preset accuracy threshold, the preset proportion threshold, and the preset distance threshold may be dynamically set by the aircraft according to factors such as the far-field tracking speed (i.e., the tracking efficiency) and the robustness of the tracking control, or may be set by the user according to personal preference. The preset proportion threshold and preset distance threshold used when switching from far-field tracking to near-field tracking and those used when switching from near-field tracking to far-field tracking may be the same or different, which is not limited in the embodiments of the present invention.
- further, the aircraft needs to determine the feature part of the target object for far-field tracking. Since the features of the human body are relatively obvious during far-field tracking, the aircraft can track the human body of the target object to improve the robustness of the tracking control. The specific steps by which the aircraft determines the human body of the target object for far-field tracking are as follows:
- step S1022: the aircraft can determine the human body of the target object.
- specifically, the aircraft can obtain the human bodies in the captured image and the center line of the tracking frame, search for human bodies near the center line of the tracking frame as shown in FIG. 4, and determine, among the human bodies contained in the captured image, the human body closest to the center line as the human body of the target object.
- it should be noted that when the tracked target object is lost, since it is known that the current tracking target is a human body, a targeted search can be performed in the tracking-frame area according to the position information of the human body, making it easier to retrieve the human body of the target object.
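- a minimal sketch of this selection, assuming human-body detections are given as (x, y, w, h) boxes with a top-left origin; the names are illustrative assumptions:

```python
def body_of_target(bodies, track_box):
    # bodies: detected (x, y, w, h) boxes; track_box: the tracking frame.
    x, y, w, h = track_box
    center_line_x = x + w / 2.0     # vertical center line of the frame
    return min(bodies,
               key=lambda b: abs(b[0] + b[2] / 2.0 - center_line_x),
               default=None)
```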
- step S1023: the aircraft can track the determined human body of the target object. Specifically, during far-field tracking, the aircraft tracks the determined human body of the target object to improve the robustness of the tracking control; as shown in FIG. 5, the solid-line frame is the captured image, the dashed-line frame is the tracking frame, and the human body of the target object is within the tracking frame.
- it should be noted that when switching between far-field tracking and near-field tracking, the size of the tracking frame may change somewhat and may not match the size of the tracked target object, so other auxiliary means can be used to determine the switching condition and improve the switching accuracy. For example, using depth-sensor technology and the image projection relationship, the depth at the position of the tracked target object in the captured image can be found, thereby obtaining the distance between the aircraft and the target object; if the distance is too small (e.g., less than or equal to 3 m), the aircraft switches to near-field tracking, and if the distance is too large (e.g., greater than 4 m), it switches to far-field tracking. In addition, the distance between the tracked target object and the aircraft can also be measured directly by ranging (such as binocular stereo ranging, ultrasonic ranging, or lidar ranging) to assist in determining the switching condition.
- for example, when the tracked target object is a person, the aircraft can trigger the tracking-switching thread to detect the tracking-frame size as soon as the target object is detected. If the size proportion of the tracking frame in the captured image is found to be greater than or equal to 30%, the face of the target object can be detected automatically, the detected face can be expanded, and the expanded face (such as the face and shoulders) can be tracked, completing the switch from far-field tracking to near-field tracking; switching from near-field tracking to far-field tracking is similar to the above process.
- as a further example, when a selfie aircraft with a face-scan takeoff function takes off by scanning a face, the tracking-switching thread of the aircraft can start near-field tracking (face tracking, or face-and-shoulders tracking) once the face scan succeeds; when the distance between the aircraft and the person becomes large (such as 3 m), the aircraft can automatically switch to far-field tracking (i.e., human-body tracking), achieving the effect of automatically keeping the user who started the aircraft in focus.
- as another example, when recording video from near to far, the aircraft can take off directly from the user's hand, fly obliquely upward and backward away from the user, track the face at first, and track the person's human body after flying out (i.e., when the distance between the aircraft and the person becomes large).
- specifically, during orbiting video recording, after focusing on the human body, the aircraft records while spiraling outward; when switching between far-field tracking and near-field tracking is used during orbiting recording, the aircraft can take off directly from the user's hand, start a spiral flight while tracking the face, or the face and shoulders, and track the person's human body after flying out (i.e., when the distance between the aircraft and the person becomes large).
- it should be noted that the tracked target object may be a person or an object with distinct features, which is not limited in the embodiments of the present invention.
- in the embodiments of the present invention, switching conditions are used during tracking to determine when to switch between far-field tracking and near-field tracking, which makes the tracking smoother and improves the efficiency of the tracking algorithm and the robustness of the tracking control. Moreover, during tracking, since the tracked object is known to be a human body or a human face, when the tracked object is lost, a targeted search can be performed in the tracking-frame area according to the position information of the human body or face, making it easier to retrieve the tracking target.
- FIG. 6 is a schematic flowchart of a tracking control method according to an embodiment of the present invention.
- the tracking control method described in this embodiment includes:
- step S201: acquire a tracking parameter of the target object.
- the tracking parameter includes: the size proportion of the target object's tracking frame in the captured image and/or the distance between the target object and the aircraft.
- the aircraft may acquire tracking parameters such as the size ratio of the tracking frame of the target object in the captured image and/or the distance between the target object and the aircraft.
- step S202: determine the feature part of the target object according to the tracking parameter.
- the aircraft may determine, according to the tracking parameter, whether it needs to switch from near-field tracking to far-field tracking or from far-field tracking to near-field tracking; when determining that switching is required, the aircraft may determine the feature part of the target object, so that different feature parts of the target object can be tracked in different tracking scenarios, thereby improving the robustness of the tracking control.
- the specific implementation manner of determining the feature part of the target object according to the tracking parameter may include:
- when the tracking parameter satisfies a preset first tracking condition, determining that the feature part of the target object is a first feature part, where the first feature part is the human body of the target object.
- the preset first tracking condition includes: the size proportion is less than or equal to a preset first proportion threshold, and/or the distance between the target object and the aircraft is greater than or equal to a preset first distance.
- when the size proportion is less than or equal to the preset first proportion threshold, and/or the distance between the target object and the aircraft is greater than or equal to the preset first distance, the aircraft may be switching from near-field tracking to far-field tracking, which indicates that the distance between the aircraft and the target object is large; the aircraft can then detect the human body of the target object so that the human body of the target object can be tracked.
- the preset first proportion threshold and the preset first distance may be dynamically set by the aircraft according to factors such as the robustness of far-field tracking control and the tracking speed, or may be set by the user according to personal preference, which is not limited in the embodiments of the present invention.
- determining that the feature part of the target object is the first feature part, where the first feature part is the human body of the target object, may include:
- acquiring the center line of the tracking frame and the human bodies contained in the captured image;
- determining, among the human bodies contained in the captured image, the human body closest to the center line as the human body of the target object;
- taking the human body of the target object as the first feature part, and taking the first feature part as the feature part of the target object.
- the aircraft can obtain the human bodies in the captured image and the center line of the tracking frame, search for human bodies near the center line of the tracking frame, determine the human body closest to the center line as the human body of the target object, take the human body of the target object as the first feature part, and take the first feature part as the feature part of the target object, so that during far-field tracking the aircraft can use the human body of the target object as the tracking target.
- it should be noted that when the tracked object is lost, since it is known that the current tracking target is a human body, a targeted search can be performed in the tracking-frame area according to the position information of the human body, making it easier to retrieve the human body of the target object.
- the specific implementation manner of determining the feature part of the target object according to the tracking parameter may further include:
- when the tracking parameter satisfies a preset second tracking condition, determining a second feature part of the target object according to the face region of the target object, and determining that the feature part of the target object is the second feature part.
- the preset second tracking condition includes: the size proportion is greater than or equal to a preset second proportion threshold, and/or the distance between the target object and the aircraft is less than or equal to a preset second distance.
- when the size proportion is greater than or equal to the preset second proportion threshold, and/or the distance between the target object and the aircraft is less than or equal to the preset second distance, the aircraft may be switching from far-field tracking to near-field tracking, which indicates that the distance between the aircraft and the target object is small; the second feature part of the target object may then be determined according to the face region of the target object, and the feature part of the target object is determined to be the second feature part.
- the preset second proportion threshold and the preset second distance may be dynamically set by the aircraft according to factors such as the robustness of near-field tracking control and the tracking speed, or may be set by the user according to personal preference, which is not limited in the embodiments of the present invention.
- the second feature portion includes a face and a shoulder of the target object.
- the foregoing determining, according to the face area of the target object, the second feature part of the target object, and determining that the feature part of the target object is the second feature part may include:
- Face recognition is performed in the tracking frame to obtain a face region of the target object in the captured image.
- the face area of the target object is expanded according to a preset expansion factor to obtain the extended face area.
- a second feature portion of the target object included in the extended face region is used as a feature portion of the target object.
- the aircraft may perform face recognition in the tracking frame to obtain the face region of the target object in the captured image. Specifically, the aircraft may perform face recognition in the tracking frame to detect the faces contained in the tracking frame, determine a search center point in the tracking frame, search for faces centered on the search center point, and take the face closest to the search center point as the face of the target object.
- for example, a point that is located on the center line of the tracking frame, and whose distance from the lower boundary of the tracking frame divided by the distance between the upper and lower boundaries of the tracking frame equals a preset ratio threshold, is taken as the search center point, and faces are searched for centered on this point. The preset ratio threshold may be set according to actual requirements; for example, it may be any value between 0.5 and 0.9. These values are merely illustrative and are not limited in the embodiments of the present invention.
- it should be noted that when the tracked object is lost, since it is known that the current tracking target is a face, or a face and shoulders, a targeted search can be performed in the tracking-frame area according to the position information of the face, making it easier to retrieve the face, or the face and shoulders, of the target object.
- further, after the face of the target object is obtained, since the face region alone is too small to track well, the range of the face can be expanded so that a combination of more parts, such as the face, shoulders, and neck, can be tracked, thereby improving the robustness of the tracking control.
- the aircraft may expand the face region of the target object according to the preset expansion factor to obtain the expanded face region, and take the second feature part of the target object contained in the expanded face region as the feature part of the target object.
- step S203: track the target object by using the feature part.
- when switching from near-field tracking to far-field tracking, the aircraft can detect whether the currently tracked feature part is a human body; if not, the aircraft can switch from using the face of the target object to using the human body to track the target object, and if so, it continues to track the human body of the target object. When switching from far-field tracking to near-field tracking, the aircraft can detect whether the currently tracked feature part is a face, or a face and shoulders; if not, the aircraft can switch from using the human body of the target object to using the face, or the face and shoulders, to track the target object, and if so, it continues to track the face, or the face and shoulders, of the target object.
- the foregoing specific implementation of tracking the target object by using the feature part may include:
- switching from using the face of the target object to using the first feature part to track the target object.
- when switching from near-field tracking to far-field tracking, the aircraft can switch from using the face of the target object to using the first feature part to track the target object.
- the foregoing specific implementation of tracking the target object by using the feature part may also include:
- switching from using the human body of the target object to using the second feature part to track the target object.
- when switching from far-field tracking to near-field tracking, the aircraft can switch from using the human body of the target object to using the second feature part to track the target object.
- in the embodiment of the present invention, the aircraft may acquire the tracking parameter of the target object, determine the feature part of the target object according to the tracking parameter, and track the target object by using the feature part; the feature part used for tracking the target object can be re-determined based on the tracking parameter, which prevents the tracked target object from being lost and improves the robustness of the tracking control.
- FIG. 7 is a schematic structural diagram of a tracking control apparatus according to an embodiment of the present invention.
- the tracking control device described in this embodiment includes:
- the obtaining module 301 is configured to acquire tracking parameters of the target object.
- the determining module 302 is configured to determine a feature part of the target object according to the tracking parameter.
- the tracking module 303 is configured to track the target object by using the feature part.
- the tracking parameter includes: a size ratio of the tracking frame of the target object in the captured image and/or a distance between the target object and the aircraft.
- the determining module 302 is specifically configured to:
- when the tracking parameter satisfies a preset first tracking condition, determine that the feature part of the target object is a first feature part, where the first feature part is the human body of the target object.
- the preset first tracking condition includes: the size ratio is less than or equal to a preset first ratio threshold, and/or the distance between the target object and the aircraft is greater than or equal to a preset first distance.
- the determining module 302 is specifically configured to:
- when the tracking parameter satisfies a preset second tracking condition, determine a second feature part of the target object according to the face region of the target object, and determine that the feature part of the target object is the second feature part.
- the preset second tracking condition includes: the size proportion is greater than or equal to a preset second proportion threshold, and/or the distance between the target object and the aircraft is less than or equal to a preset second distance.
- the specific manner in which the determining module 302 determines that the feature part of the target object is the first feature part, where the first feature part is the human body of the target object, is:
- acquiring the center line of the tracking frame and the human bodies contained in the captured image;
- determining, among the human bodies contained in the captured image, the human body closest to the center line as the human body of the target object;
- taking the human body of the target object as the first feature part, and taking the first feature part as the feature part of the target object.
- the specific manner in which the determining module 302 determines the second feature part of the target object according to the face region of the target object, and determines that the feature part of the target object is the second feature part, is:
- Face recognition is performed in the tracking frame to acquire a face region of the target object in the captured image.
- the face area of the target object is expanded according to a preset expansion factor to obtain the extended face area.
- a second feature portion of the target object included in the extended face region is used as a feature portion of the target object.
- the second feature portion includes a face and a shoulder of the target object.
- the tracking module 303 is specifically configured to:
- switching from using the face of the target object to using the first feature part to track the target object.
- the tracking module 303 is specifically configured to:
- switch from using the human body of the target object to using the second feature part to track the target object.
- in the embodiment of the present invention, the acquiring module 301 acquires the tracking parameter of the target object, the determining module 302 determines the feature part of the target object according to the tracking parameter, and the tracking module 303 tracks the target object by using the feature part; the feature part used for tracking the target object can be re-determined based on the tracking parameter, which prevents the tracked target object from being lost and improves the robustness of the tracking control.
- FIG. 8 is a schematic block diagram of an aircraft according to an embodiment of the present invention.
- the aircraft in this embodiment, as shown, may include: at least one processor 401, such as a CPU; at least one memory 402; a camera 403; a power system 404; and a communication apparatus 405. The processor 401, the memory 402, the camera 403, the power system 404, and the communication apparatus 405 are connected by a bus 406.
- the power system 404 is used to power the flight of the aircraft
- the communication device 405 is used to send and receive messages
- the camera 403 is used to capture images.
- the memory 402 is used to store instructions, and the processor 401 calls the program code stored in the memory 402.
- specifically, the processor 401 calls the program code stored in the memory 402 to perform the following operations:
- acquiring a tracking parameter of a target object; determining a feature part of the target object according to the tracking parameter; and tracking the target object by using the feature part.
- the tracking parameter includes: a size ratio of a tracking frame of the target object in a captured image and/or a distance between the target object and an aircraft.
- the specific manner in which the processor 401 determines the feature part of the target object according to the tracking parameter is:
- when the tracking parameter satisfies a preset first tracking condition, determining that the feature part of the target object is a first feature part, where the first feature part is the human body of the target object.
- the preset first tracking condition includes: the size ratio is less than or equal to a preset first ratio threshold, and/or the distance between the target object and the aircraft is greater than or equal to a preset first distance.
- the specific manner in which the processor 401 determines the feature part of the target object according to the tracking parameter is:
- when the tracking parameter satisfies a preset second tracking condition, determining a second feature part of the target object according to the face region of the target object, and determining that the feature part of the target object is the second feature part.
- the preset second tracking condition includes: the size proportion is greater than or equal to a preset second proportion threshold, and/or the distance between the target object and the aircraft is less than or equal to a preset second distance.
- the specific manner in which the processor 401 determines that the feature part of the target object is the first feature part, where the first feature part is the human body of the target object, is:
- acquiring the center line of the tracking frame and the human bodies contained in the captured image.
- the human body closest to the center line among the human body included in the captured image is determined as the human body of the target object.
- the human body of the target object is used as a first feature portion, and the first feature portion is used as a feature portion of the target object.
- the specific manner in which the processor 401 determines the second feature part of the target object according to the face region of the target object, and determines that the feature part of the target object is the second feature part, is:
- Face recognition is performed in the tracking frame to acquire a face region of the target object in the captured image.
- the face area of the target object is expanded according to a preset expansion factor to obtain the extended face area.
- a second feature portion of the target object included in the extended face region is used as a feature portion of the target object.
- the second feature portion includes a face and a shoulder of the target object.
- the specific manner in which the processor 401 tracks the target object by using the feature part is:
- switching from using the face of the target object to using the first feature part to track the target object.
- the specific manner in which the processor 401 tracks the target object by using the feature part is:
- switching from using the human body of the target object to using the second feature part to track the target object.
- in the embodiment of the present invention, the processor 401 may acquire the tracking parameter of the target object, determine the feature part of the target object according to the tracking parameter, and track the target object by using the feature part; the feature part used for tracking the target object can be re-determined based on the tracking parameter, which prevents the tracked target object from being lost and improves the robustness of the tracking control.
- a person of ordinary skill in the art can understand that all or part of the steps in the above method embodiments can be completed by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium, and the storage medium may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Multimedia (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- General Health & Medical Sciences (AREA)
- Remote Sensing (AREA)
- Astronomy & Astrophysics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Automation & Control Theory (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Image Analysis (AREA)
Abstract
An embodiment of the present invention discloses a tracking control method and device, and an aircraft. The method includes: the aircraft acquires a tracking parameter of a target object, determines a feature part of the target object according to the tracking parameter, and tracks the target object by using the feature part. Through the embodiments of the present invention, the feature part used for tracking the target object can be re-determined based on the tracking parameter, which prevents the tracked target object from being lost and improves the robustness of the tracking control.
Description
The present invention relates to the field of electronic technologies, and in particular to a tracking control method and device, and an aircraft.

The existing aircraft tracking strategy is to track an object with obvious features as a target object. Usually, in the process of tracking a certain feature part of the target object (such as a human face), since the distance between the aircraft and the target object keeps changing, the size proportion of the target object's tracking frame in the captured image changes accordingly, which affects the tracking effect. For example, when tracking the same feature part of the target object, if the distance between the aircraft and the target object becomes small, the tracking frame of the target object occupies a large proportion of the captured image (such as 10%), which slows down the tracking, easily causes the target object to be lost, and makes the tracking control less robust; if the distance between the aircraft and the target object becomes large, the tracking frame of the target object occupies a small proportion of the captured image (such as 5%), which blurs the features of the tracked target object and makes the tracking control poorly robust.

Therefore, tracking a target object with the existing tracking strategy easily causes the target object to be lost, and the tracking control is poorly robust.
Summary of the Invention
The embodiments of the present invention disclose a tracking control method and device, and an aircraft, which can prevent the tracked target object from being lost and can improve the robustness of the tracking control.

In a first aspect, an embodiment of the present invention provides a tracking control method, the method including:

acquiring a tracking parameter of a target object;

determining a feature part of the target object according to the tracking parameter;

tracking the target object by using the feature part.

In a second aspect, an embodiment of the present invention provides a tracking control device, the device including:

an acquiring module configured to acquire a tracking parameter of a target object;

a determining module configured to determine a feature part of the target object according to the tracking parameter;

a tracking module configured to track the target object by using the feature part.

In a third aspect, an embodiment of the present invention provides an aircraft, the aircraft including:

a processor and a memory, the processor and the memory being connected by a bus, the memory storing executable program code, and the processor being configured to call the executable program code to perform the tracking control method provided by the first aspect of the embodiments of the present invention.

Through the embodiments of the present invention, a tracking parameter of a target object can be acquired, a feature part of the target object is determined according to the tracking parameter, and the target object is tracked by using the feature part; the feature part used for tracking the target object can be re-determined based on the tracking parameter, which prevents the tracked target object from being lost and improves the robustness of the tracking control.
To describe the technical solutions in the embodiments of the present invention more clearly, the following briefly introduces the accompanying drawings required in the embodiments. Obviously, the accompanying drawings described below are merely some embodiments of the present invention, and a person of ordinary skill in the art can derive other drawings from these drawings without creative effort.

FIG. 1a is a schematic flowchart of a process for switching an aircraft from far-field tracking to near-field tracking according to an embodiment of the present invention;

FIG. 1b is a schematic flowchart of a process for switching an aircraft from near-field tracking to far-field tracking according to an embodiment of the present invention;

FIG. 2 is a schematic diagram of searching for a face in a tracking frame according to an embodiment of the present invention;

FIG. 3 is a schematic diagram of tracking a face and shoulders according to an embodiment of the present invention;

FIG. 4 is a schematic diagram of searching for a human body in a tracking frame according to an embodiment of the present invention;

FIG. 5 is a schematic diagram of tracking a human body according to an embodiment of the present invention;

FIG. 6 is a schematic flowchart of a tracking control method according to an embodiment of the present invention;

FIG. 7 is a schematic block diagram of a tracking control device according to an embodiment of the present invention;

FIG. 8 is a schematic block diagram of an aircraft according to an embodiment of the present invention.
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings in the embodiments of the present invention. Obviously, the described embodiments are merely some rather than all of the embodiments of the present invention. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.

The embodiments of the present invention disclose a tracking control method and device, and an aircraft, which are used to re-determine, based on a tracking parameter, the feature part used for tracking a target object; this can prevent the tracked target object from being lost and can improve the robustness of the tracking control. Detailed descriptions are given below.
In the embodiments of the present invention, according to the actual tracking behavior of the aircraft, the tracking process includes two aspects: one is the process of switching the aircraft from far-field tracking to near-field tracking, and the other is the process of switching the aircraft from near-field tracking to far-field tracking. The two aspects are introduced in detail below.

Near-field tracking and far-field tracking may be defined according to tracking parameters such as the distance between the aircraft and the tracked target object, or the size proportion of the tracking frame in the captured image. For example, near-field tracking refers to tracking performed when the distance between the aircraft and the tracked target object is less than or equal to a preset distance threshold, and far-field tracking refers to tracking performed when the distance between the aircraft and the tracked target object is greater than the preset distance threshold.
Referring to FIG. 1a, which is a schematic flowchart of a process for switching an aircraft from far-field tracking to near-field tracking according to an embodiment of the present invention, the process includes:

101. The process of switching the aircraft from far-field tracking to near-field tracking.

S1011. The aircraft determines whether the switching condition for switching from far-field tracking to near-field tracking is satisfied; if yes, step S1012 is executed; if not, the current flow ends.

During far-field tracking, the aircraft tracks the human body of the target object. When the aircraft approaches the target object, that is, when the aircraft switches from far-field tracking to near-field tracking, the size proportion of the tracking frame in the captured image becomes larger, which slows down the tracking, so the aircraft can determine, for near-field tracking, the feature part of the target object to be tracked, so as to improve the tracking efficiency.

The aircraft can determine whether the switching condition for switching from far-field tracking to near-field tracking is satisfied based on flight parameters such as the tracking speed, the size proportion of the tracking frame in the captured image, and the distance between the target object and the aircraft. When the tracking speed is less than a preset speed threshold, and/or the size proportion is greater than or equal to a preset proportion threshold (such as 10%), and/or the distance between the target object and the aircraft is less than or equal to a preset distance threshold (such as 3 m), the aircraft may determine that the switching condition for switching from far-field tracking to near-field tracking is satisfied.

It should be noted that the preset speed threshold, the preset proportion threshold, and the preset distance threshold may be dynamically set by the aircraft according to factors such as the tracking accuracy and the tracking speed (i.e., the tracking efficiency), or may be set by the user according to personal preference, which is not limited in the embodiments of the present invention.

Further, the aircraft needs to determine the feature part of the target object for near-field tracking. Since the features of the human face are relatively obvious during near-field tracking, the aircraft can track the face, or the face and shoulders, of the target object to improve the tracking efficiency and the robustness of the tracking control. The specific steps by which the aircraft determines the face, or the face and shoulders, of the target object for near-field tracking are as follows:

S1012. The aircraft can perform face recognition in the tracking frame to obtain the face of the target object.

Specifically, the aircraft can perform face recognition in the tracking frame to detect the faces contained in the tracking frame, determine a search center point in the tracking frame, search for faces centered on the search center point, and take the face closest to the search center point as the face of the target object. For example, a point that is located on the center line of the tracking frame, and whose distance from the lower boundary of the tracking frame divided by the distance between the upper and lower boundaries of the tracking frame equals a preset ratio threshold, is taken as the search center point, and faces are searched for centered on this point. The preset ratio threshold may be set according to actual requirements; for example, it may be any value between 0.5 and 0.9. These values are merely illustrative and are not limited in the embodiments of the present invention. For example, with the preset ratio threshold set to 0.9, as shown in FIG. 2, the aircraft may search for faces centered on the black point in the tracking frame and take the face closest to the black point as the face of the target object.
It should be noted that when the tracked target object is lost, since it is known that the current tracking target is a face, or a face and shoulders, a targeted search can be performed in the tracking-frame area according to the position information of the face, making it easier to retrieve the face, or the face and shoulders, of the target object.

S1013. After the face of the target object is obtained, the range of the face is expanded.

Specifically, after the face of the target object is obtained, since the face region alone is too small to track well, the range of the face can be expanded so that a combination of more parts, such as the face, shoulders, and neck, can be tracked, thereby improving the robustness of the tracking control. Specifically, the features are more obvious when the face is a front face than when it is a side face, so different expansion strategies can be adopted for a front face and for a side face to obtain a suitable expansion range, further improving the robustness of the tracking control.

A front face means that the face of the target object directly faces the lens of the aircraft's camera; a side face means that the left or right side of the target object's face faces the lens of the camera, that is, a side face includes a face turned to the left and a face turned to the right.
More specifically, face recognition is performed on the face of the target object to determine whether the face is a front face or a side face. For example, the aircraft may use a face recognition algorithm to detect the face of the target object to obtain the position information of parts such as the left and right eyes, the tip of the nose, and the corners of the mouth, as well as the position information of the center points of the eyes, and then use the symmetry of the two eye center points about the nose tip to determine whether the face of the target object is a front face or a side face. The decision formulas are as follows:

dist_leye2nose = (x_nose – x_leye) / (x_reye – x_leye)

dist_reye2nose = (x_reye – x_nose) / (x_reye – x_leye)

if abs(dist_leye2nose – dist_reye2nose) > 0.35

the face of the target object is a side face;

else

the face of the target object is a front face;

if dist_leye2nose > dist_reye2nose

the face of the target object is turned to the left;

else

the face of the target object is turned to the right;

where dist_leye2nose represents the distance from the left eye to the nose on the target object's face, dist_reye2nose represents the distance from the right eye to the nose, x_nose represents the abscissa of the nose, x_leye represents the abscissa of the left eye, x_reye represents the abscissa of the right eye, and abs() represents the absolute value of a quantity.
Further, the aircraft can expand the current tracking frame to make the resulting tracking frame larger, thereby expanding the range of the face.

Specifically, when the face of the target object is a front face, an expansion factor can be set to expand the current tracking frame so that the tracking frame is stretched both horizontally and vertically. For example, with the expansion factor set to 1.7, the expansion formulas for the current tracking frame are as follows:

expand = 1.7;

box_x = x;

box_y = y + h * expand * 0.25;

box_w = w * expand * 1.35;

box_h = h * expand;

where expand is the expansion factor, x is the abscissa of the center point of the current tracking frame, y is the ordinate of the center point of the current tracking frame, h is the height of the current tracking frame, w is the width of the current tracking frame, box_x is the abscissa of the center point of the expanded tracking frame, box_y is the ordinate of the center point of the expanded tracking frame, box_h is the height of the expanded tracking frame, and box_w is the width of the expanded tracking frame.

When the face of the target object is a side face, an expansion factor can likewise be set to expand the current tracking frame, so that when the face is turned to the left the tracking frame is stretched to the left horizontally and also stretched vertically, and when the face is turned to the right the tracking frame is stretched to the right horizontally and also stretched vertically. The expansion factor can be set to 2.0, and the expansion formulas for the current tracking frame are as follows:

expand = 2.0;

box_x = x + direction * box_w; // direction = -1 when the face is turned to the left, direction = 1 when the face is turned to the right

box_y = y + h * expand * 0.25;

box_w = w * expand * 1.5;

box_h = h * expand;

where direction is a weight value for the face orientation. When the face is a side face, after the above expansion, the tracking frame can be expanded once more so that the head and shoulders of the target object can be tracked. For example, when the side face is turned to the left, the tracking frame can be doubled to the right in the horizontal direction; when the side face is turned to the right, the tracking frame can be doubled to the left in the horizontal direction.

It should be noted that the larger the expansion factor, the larger the expanded tracking frame, the more human-body features (possibly including the face and parts other than the face) are tracked, and the more robust the tracking control becomes, but the slower the tracking gets. The size of the expansion factor can therefore be set by the aircraft as a trade-off between the robustness of the tracking control and the tracking speed, or it can be set by the user according to personal preference. Since the features of the target object's face are more obvious when it is a front face than when it is a side face, a smaller expansion factor (such as 1.8) can be set for a front face, and a larger expansion factor (such as 2.1) for a side face.

After the tracking frame is expanded as described above, the resulting tracking frame becomes larger; that is, the face of the target object with an expanded range is obtained. The expanded face of the target object may include parts such as the face and shoulders of the target object, and since the head-and-shoulders outline is a very distinctive feature of the upper body, this can greatly improve the robustness of the tracking control.
S1014. After obtaining the expanded face of the target object, the aircraft can track the expanded face of the target object.

Specifically, during near-field tracking, the aircraft can track the determined face and shoulders of the target object to improve the tracking efficiency and robustness. As shown in FIG. 3, the solid-line frame is the captured image, the dashed-line frame is the tracking frame, and the face and shoulders of the target object are within the tracking frame.
Referring to FIG. 1b, which is a schematic flowchart of a process for switching an aircraft from near-field tracking to far-field tracking according to an embodiment of the present invention, the process includes:

102. The process of switching the aircraft from near-field tracking to far-field tracking.

S1021. The aircraft determines whether the switching condition for switching from near-field tracking to far-field tracking is satisfied; if yes, step S1022 is executed; if not, the current flow ends.

During near-field tracking, the aircraft tracks the face, or the expanded face, of the target object. When the aircraft moves away from the target object, that is, when the aircraft switches from near-field tracking to far-field tracking, the size proportion of the tracking frame in the captured image becomes smaller and the tracking accuracy deteriorates, so the aircraft can determine, for far-field tracking, the feature part of the target object to be tracked, so as to improve the tracking accuracy.

The aircraft can determine whether the switching condition for switching from near-field tracking to far-field tracking is satisfied based on flight parameters such as the tracking accuracy, the size proportion of the tracking frame in the captured image, and the distance between the target object and the aircraft. When the tracking accuracy is less than a preset accuracy threshold, and/or the size proportion is less than or equal to a preset proportion threshold, and/or the distance between the target object and the aircraft is greater than or equal to a preset distance threshold, the aircraft may determine that the switching condition for switching from near-field tracking to far-field tracking is satisfied.

It should be noted that the preset accuracy threshold, the preset proportion threshold, and the preset distance threshold may be dynamically set by the aircraft according to factors such as the far-field tracking speed (i.e., the tracking efficiency) and the robustness of the tracking control, or may be set by the user according to personal preference. The preset proportion threshold and preset distance threshold used in the process of switching from far-field tracking to near-field tracking and those used in the process of switching from near-field tracking to far-field tracking may be the same or different, which is not limited in the embodiments of the present invention.
Further, the aircraft needs to determine the feature part of the target object for far-field tracking. Since the features of the human body are relatively obvious during far-field tracking, the aircraft can track the human body of the target object to improve the robustness of the tracking control. The specific steps by which the aircraft determines the human body of the target object for far-field tracking are as follows:

S1022. The aircraft can determine the human body of the target object.

Specifically, the aircraft can obtain the human bodies in the captured image and the center line of the tracking frame, search for human bodies near the center line of the tracking frame as shown in FIG. 4, and determine, among the human bodies contained in the captured image, the human body closest to the center line as the human body of the target object.

It should be noted that when the tracked target object is lost, since it is known that the current tracking target is a human body, a targeted search can be performed in the tracking-frame area according to the position information of the human body, making it easier to retrieve the human body of the target object.

S1023. The aircraft can track the determined human body of the target object.

Specifically, during far-field tracking, the aircraft can track the determined human body of the target object to improve the robustness of the tracking control. As shown in FIG. 5, the solid-line frame is the captured image, the dashed-line frame is the tracking frame, and the human body of the target object is within the tracking frame.
It should be noted that when switching between far-field tracking and near-field tracking, the size of the tracking frame may change somewhat and may not match the size of the tracked target object, so other auxiliary means can be used to determine the switching condition and improve the switching accuracy. For example, using depth-sensor technology and the image projection relationship, the depth at the position of the tracked target object in the captured image can be found, thereby obtaining the distance between the aircraft and the target object; if the distance is too small (such as less than or equal to 3 m), the aircraft switches to near-field tracking, and if the distance is too large (such as greater than 4 m), it switches to far-field tracking. In addition, the distance between the tracked target object and the aircraft can also be measured directly by ranging (such as binocular stereo ranging, ultrasonic ranging, or lidar ranging) to assist in determining the switching condition.
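As an illustration only, the distance-assisted switching can be sketched as follows. The 3 m and 4 m bounds come from the text and form a hysteresis band in which the current mode is kept, which avoids rapid toggling between the two modes; the type and function names are assumptions made for the example.

```python
from enum import Enum

class TrackingMode(Enum):
    NEAR_FIELD = 1   # face, or face-and-shoulders, tracking
    FAR_FIELD = 2    # human-body tracking

def next_mode(current: TrackingMode, distance_m: float) -> TrackingMode:
    if distance_m <= 3.0:
        return TrackingMode.NEAR_FIELD
    if distance_m > 4.0:
        return TrackingMode.FAR_FIELD
    return current   # between 3 m and 4 m: keep the current mode
```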
For example, when the tracked target object is a person, the aircraft can trigger the tracking-switching thread to detect the tracking-frame size as soon as the target object is detected. If the size proportion of the tracking frame in the captured image is found to be greater than or equal to 30%, the face of the target object can be detected automatically, the detected face can be expanded, and the expanded face (such as the face and shoulders) can be tracked, completing the switch from far-field tracking to near-field tracking; switching from near-field tracking to far-field tracking is similar to the above process.

As a further example, when a selfie aircraft with a face-scan takeoff function takes off by scanning a face, the tracking-switching thread of the aircraft can start near-field tracking (face tracking, or face-and-shoulders tracking) once the face scan succeeds; when the distance between the aircraft and the person becomes large (such as 3 m), the aircraft can automatically switch to far-field tracking (i.e., human-body tracking), achieving the effect of automatically keeping the user who started the aircraft in focus.

As another example, when recording video from near to far, the aircraft can take off directly from the user's hand, fly obliquely upward and backward away from the user, track the face at first, and track the person's human body after flying out (i.e., when the distance between the aircraft and the person becomes large). Specifically, during orbiting video recording, after focusing on the human body, the aircraft records while spiraling outward; when switching between far-field tracking and near-field tracking is used during orbiting recording, the aircraft can take off directly from the user's hand, start a spiral flight while tracking the face, or the face and shoulders, and track the person's human body after flying out (i.e., when the distance between the aircraft and the person becomes large).
It should be noted that the tracked target object may be a person or an object with distinct features, which is not limited in the embodiments of the present invention.

In the embodiments of the present invention, switching conditions are used during tracking to determine when to switch between far-field tracking and near-field tracking, which makes the tracking smoother and improves the efficiency of the tracking algorithm and the robustness of the tracking control. Moreover, during tracking, since the tracked object is known to be a human body or a human face, when the tracked object is lost, a targeted search can be performed in the tracking-frame area according to the position information of the human body or face, making it easier to retrieve the tracking target.
Referring to FIG. 6, which is a schematic flowchart of a tracking control method according to an embodiment of the present invention, the tracking control method described in this embodiment includes:

S201. Acquire a tracking parameter of a target object.

The tracking parameter includes: the size proportion of the target object's tracking frame in the captured image and/or the distance between the target object and the aircraft.

In the embodiment of the present invention, the aircraft can acquire tracking parameters such as the size proportion of the target object's tracking frame in the captured image and/or the distance between the target object and the aircraft.

S202. Determine a feature part of the target object according to the tracking parameter.

In the embodiment of the present invention, the aircraft can determine, according to the tracking parameter, whether it needs to switch from near-field tracking to far-field tracking or from far-field tracking to near-field tracking. When determining that switching is required, the aircraft can determine the feature part of the target object, so that different feature parts of the target object can be tracked in different tracking scenarios, thereby improving the robustness of the tracking control.

As an optional embodiment, the specific implementation of determining the feature part of the target object according to the tracking parameter may include:

when the tracking parameter satisfies a preset first tracking condition, determining that the feature part of the target object is a first feature part, where the first feature part is the human body of the target object.

The preset first tracking condition includes: the size proportion is less than or equal to a preset first proportion threshold, and/or the distance between the target object and the aircraft is greater than or equal to a preset first distance.

In the embodiment of the present invention, when the size proportion is less than or equal to the preset first proportion threshold, and/or the distance between the target object and the aircraft is greater than or equal to the preset first distance, the aircraft may be switching from near-field tracking to far-field tracking, which indicates that the distance between the aircraft and the target object is large; the aircraft can then detect the human body of the target object so that the human body of the target object can be tracked.

It should be noted that the preset first proportion threshold and the preset first distance may be dynamically set by the aircraft according to factors such as the robustness of far-field tracking control and the tracking speed, or may be set by the user according to personal preference, which is not limited in the embodiments of the present invention.

As an optional embodiment, the specific implementation of determining that the feature part of the target object is the first feature part, where the first feature part is the human body of the target object, may include:

acquiring the center line of the tracking frame and the human bodies contained in the captured image;

determining, among the human bodies contained in the captured image, the human body closest to the center line as the human body of the target object;

taking the human body of the target object as the first feature part, and taking the first feature part as the feature part of the target object.

In the embodiment of the present invention, the aircraft can obtain the human bodies in the captured image and the center line of the tracking frame, search for human bodies near the center line of the tracking frame, determine the human body closest to the center line among the human bodies contained in the captured image as the human body of the target object, take the human body of the target object as the first feature part, and take the first feature part as the feature part of the target object, so that during far-field tracking the aircraft can use the human body of the target object as the tracking target.

It should be noted that when the tracked object is lost, since it is known that the current tracking target is a human body, a targeted search can be performed in the tracking-frame area according to the position information of the human body, making it easier to retrieve the human body of the target object.

As an optional embodiment, the specific implementation of determining the feature part of the target object according to the tracking parameter may further include:

when the tracking parameter satisfies a preset second tracking condition, determining a second feature part of the target object according to the face region of the target object, and determining that the feature part of the target object is the second feature part.

The preset second tracking condition includes: the size proportion is greater than or equal to a preset second proportion threshold, and/or the distance between the target object and the aircraft is less than or equal to a preset second distance.

In the embodiment of the present invention, when the size proportion is greater than or equal to the preset second proportion threshold, and/or the distance between the target object and the aircraft is less than or equal to the preset second distance, the aircraft may be switching from far-field tracking to near-field tracking, which indicates that the distance between the aircraft and the target object is small; the second feature part of the target object can then be determined according to the face region of the target object, and the feature part of the target object is determined to be the second feature part.

It should be noted that the preset second proportion threshold and the preset second distance may be dynamically set by the aircraft according to factors such as the robustness of near-field tracking control and the tracking speed, or may be set by the user according to personal preference, which is not limited in the embodiments of the present invention.

The second feature part includes the face and shoulders of the target object.

As an optional embodiment, the specific implementation of determining the second feature part of the target object according to the face region of the target object, and determining that the feature part of the target object is the second feature part, may include:

performing face recognition in the tracking frame to obtain the face region of the target object in the captured image;

expanding the face region of the target object according to a preset expansion factor to obtain the expanded face region;

taking the second feature part of the target object contained in the expanded face region as the feature part of the target object.

In the embodiment of the present invention, the aircraft can perform face recognition in the tracking frame to obtain the face region of the target object in the captured image. Specifically, the aircraft can perform face recognition in the tracking frame to detect the faces contained in the tracking frame, determine a search center point in the tracking frame, search for faces centered on the search center point, and take the face closest to the search center point as the face of the target object. For example, a point that is located on the center line of the tracking frame, and whose distance from the lower boundary of the tracking frame divided by the distance between the upper and lower boundaries of the tracking frame equals a preset ratio threshold, is taken as the search center point, and faces are searched for centered on this point. The preset ratio threshold may be set according to actual requirements; for example, it may be any value between 0.5 and 0.9. These values are merely illustrative and are not limited in the embodiments of the present invention.

It should be noted that when the tracked object is lost, since it is known that the current tracking target is a face, or a face and shoulders, a targeted search can be performed in the tracking-frame area according to the position information of the face, making it easier to retrieve the face, or the face and shoulders, of the target object.

Further, after the face of the target object is obtained, since the face region alone is too small to track well, the range of the face can be expanded so that a combination of more parts, such as the face, shoulders, and neck, can be tracked, thereby improving the robustness of the tracking control. The aircraft can expand the face region of the target object according to the preset expansion factor to obtain the expanded face region, and take the second feature part of the target object contained in the expanded face region as the feature part of the target object.

S203. Track the target object by using the feature part.

In the embodiment of the present invention, when switching from near-field tracking to far-field tracking, the aircraft can detect whether the currently tracked feature part is a human body; if not, the aircraft can switch from using the face of the target object to using the human body to track the target object, and if so, it continues to track the human body of the target object. When switching from far-field tracking to near-field tracking, the aircraft can detect whether the currently tracked feature part is a face, or a face and shoulders; if not, the aircraft can switch from using the human body of the target object to using the face, or the face and shoulders, to track the target object, and if so, it continues to track the face, or the face and shoulders, of the target object.

As an optional embodiment, the specific implementation of tracking the target object by using the feature part may include:

switching from using the face of the target object to using the first feature part to track the target object.

In the embodiment of the present invention, when switching from near-field tracking to far-field tracking, the aircraft can switch from using the face of the target object to using the first feature part to track the target object.

As an optional embodiment, the specific implementation of tracking the target object by using the feature part may include:

switching from using the human body of the target object to using the second feature part to track the target object.

In the embodiment of the present invention, when switching from far-field tracking to near-field tracking, the aircraft can switch from using the human body of the target object to using the second feature part to track the target object.

In the embodiment of the present invention, the aircraft can acquire the tracking parameter of the target object, determine the feature part of the target object according to the tracking parameter, and track the target object by using the feature part; the feature part used for tracking the target object can be re-determined based on the tracking parameter, which prevents the tracked target object from being lost and improves the robustness of the tracking control.
Referring to FIG. 7, which is a schematic structural diagram of a tracking control device according to an embodiment of the present invention, the tracking control device described in this embodiment includes:

an acquiring module 301 configured to acquire a tracking parameter of a target object;

a determining module 302 configured to determine a feature part of the target object according to the tracking parameter;

a tracking module 303 configured to track the target object by using the feature part.

Optionally, the tracking parameter includes: the size proportion of the target object's tracking frame in the captured image and/or the distance between the target object and the aircraft.

Optionally, the determining module 302 is specifically configured to:

when the tracking parameter satisfies a preset first tracking condition, determine that the feature part of the target object is a first feature part, where the first feature part is the human body of the target object.

The preset first tracking condition includes: the size proportion is less than or equal to a preset first proportion threshold, and/or the distance between the target object and the aircraft is greater than or equal to a preset first distance.

Optionally, the determining module 302 is specifically configured to:

when the tracking parameter satisfies a preset second tracking condition, determine a second feature part of the target object according to the face region of the target object, and determine that the feature part of the target object is the second feature part.

The preset second tracking condition includes: the size proportion is greater than or equal to a preset second proportion threshold, and/or the distance between the target object and the aircraft is less than or equal to a preset second distance.

Optionally, the specific manner in which the determining module 302 determines that the feature part of the target object is the first feature part, where the first feature part is the human body of the target object, is:

acquiring the center line of the tracking frame and the human bodies contained in the captured image;

determining, among the human bodies contained in the captured image, the human body closest to the center line as the human body of the target object;

taking the human body of the target object as the first feature part, and taking the first feature part as the feature part of the target object.

Optionally, the specific manner in which the determining module 302 determines the second feature part of the target object according to the face region of the target object, and determines that the feature part of the target object is the second feature part, is:

performing face recognition in the tracking frame to obtain the face region of the target object in the captured image;

expanding the face region of the target object according to a preset expansion factor to obtain the expanded face region;

taking the second feature part of the target object contained in the expanded face region as the feature part of the target object.

The second feature part includes the face and shoulders of the target object.

Optionally, the tracking module 303 is specifically configured to:

switch from using the face of the target object to using the first feature part to track the target object.

Optionally, the tracking module 303 is specifically configured to:

switch from using the human body of the target object to using the second feature part to track the target object.

In the embodiment of the present invention, the acquiring module 301 acquires the tracking parameter of the target object, the determining module 302 determines the feature part of the target object according to the tracking parameter, and the tracking module 303 tracks the target object by using the feature part; the feature part used for tracking the target object can be re-determined based on the tracking parameter, which prevents the tracked target object from being lost and improves the robustness of the tracking control.
Referring to FIG. 8, which is a schematic block diagram of an aircraft according to an embodiment of the present invention, the aircraft in this embodiment may include: at least one processor 401, such as a CPU; at least one memory 402; a camera 403; a power system 404; and a communication apparatus 405. The processor 401, the memory 402, the camera 403, the power system 404, and the communication apparatus 405 are connected by a bus 406.

The power system 404 is used to provide power for the flight of the aircraft, the communication apparatus 405 is used to send and receive messages, and the camera 403 is used to capture images.

The memory 402 is used to store instructions, and the processor 401 calls the program code stored in the memory 402.

Specifically, the processor 401 calls the program code stored in the memory 402 to perform the following operations:

acquiring a tracking parameter of a target object;

determining a feature part of the target object according to the tracking parameter;

tracking the target object by using the feature part.

The tracking parameter includes: the size proportion of the target object's tracking frame in the captured image and/or the distance between the target object and the aircraft.

Optionally, the specific manner in which the processor 401 determines the feature part of the target object according to the tracking parameter is:

when the tracking parameter satisfies a preset first tracking condition, determining that the feature part of the target object is a first feature part, where the first feature part is the human body of the target object.

The preset first tracking condition includes: the size proportion is less than or equal to a preset first proportion threshold, and/or the distance between the target object and the aircraft is greater than or equal to a preset first distance.

Optionally, the specific manner in which the processor 401 determines the feature part of the target object according to the tracking parameter is:

when the tracking parameter satisfies a preset second tracking condition, determining a second feature part of the target object according to the face region of the target object, and determining that the feature part of the target object is the second feature part.

The preset second tracking condition includes: the size proportion is greater than or equal to a preset second proportion threshold, and/or the distance between the target object and the aircraft is less than or equal to a preset second distance.

Optionally, the specific manner in which the processor 401 determines that the feature part of the target object is the first feature part, where the first feature part is the human body of the target object, is:

acquiring the center line of the tracking frame and the human bodies contained in the captured image;

determining, among the human bodies contained in the captured image, the human body closest to the center line as the human body of the target object;

taking the human body of the target object as the first feature part, and taking the first feature part as the feature part of the target object.

Optionally, the specific manner in which the processor 401 determines the second feature part of the target object according to the face region of the target object, and determines that the feature part of the target object is the second feature part, is:

performing face recognition in the tracking frame to obtain the face region of the target object in the captured image;

expanding the face region of the target object according to a preset expansion factor to obtain the expanded face region;

taking the second feature part of the target object contained in the expanded face region as the feature part of the target object.

The second feature part includes the face and shoulders of the target object.

Optionally, the specific manner in which the processor 401 tracks the target object by using the feature part is:

switching from using the face of the target object to using the first feature part to track the target object.

Optionally, the specific manner in which the processor 401 tracks the target object by using the feature part is:

switching from using the human body of the target object to using the second feature part to track the target object.

In the embodiment of the present invention, the processor 401 can acquire the tracking parameter of the target object, determine the feature part of the target object according to the tracking parameter, and track the target object by using the feature part; the feature part used for tracking the target object can be re-determined based on the tracking parameter, which prevents the tracked target object from being lost and improves the robustness of the tracking control.
It should be noted that, for brevity, each of the foregoing method embodiments is described as a series of action combinations, but a person skilled in the art should know that the present invention is not limited by the described order of actions, because according to the present invention some steps may be performed in other orders or simultaneously. Furthermore, a person skilled in the art should also know that the embodiments described in this specification are all preferred embodiments, and the actions and modules involved are not necessarily required by the present invention.

A person of ordinary skill in the art can understand that all or part of the steps in the various methods of the above embodiments can be completed by a program instructing relevant hardware; the program may be stored in a computer-readable storage medium, and the storage medium may include: a flash disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, an optical disc, or the like.

What is disclosed above is merely a preferred embodiment of the present invention, which certainly cannot be used to limit the scope of rights of the present invention. A person of ordinary skill in the art can understand all or part of the processes for implementing the above embodiments, and equivalent changes made according to the claims of the present invention still fall within the scope covered by the invention.
Claims (23)
- A tracking control method, comprising: acquiring tracking parameters of a target object; determining a feature part of the target object according to the tracking parameters; and tracking the target object using the feature part.
- The method according to claim 1, wherein the tracking parameters include: a size ratio of a tracking box of the target object in a captured image and/or a distance between the target object and an aircraft.
- The method according to claim 2, wherein determining the feature part of the target object according to the tracking parameters comprises: when the tracking parameters satisfy a preset first tracking condition, determining that the feature part of the target object is a first feature part, the first feature part being a human body of the target object.
- The method according to claim 3, wherein the preset first tracking condition includes: the size ratio being less than or equal to a preset first ratio threshold, and/or the distance between the target object and the aircraft being greater than or equal to a preset first distance.
- The method according to claim 2, wherein determining the feature part of the target object according to the tracking parameters comprises: when the tracking parameters satisfy a preset second tracking condition, determining a second feature part of the target object according to a face region of the target object, and determining that the feature part of the target object is the second feature part.
- The method according to claim 5, wherein the preset second tracking condition includes: the size ratio being greater than or equal to a preset second ratio threshold, and/or the distance between the target object and the aircraft being less than or equal to a preset second distance.
- The method according to claim 3 or 4, wherein determining that the feature part of the target object is the first feature part, the first feature part being the human body of the target object, comprises: obtaining a center line of the tracking box and the human bodies contained in the captured image; determining, among the human bodies contained in the captured image, the human body closest to the center line as the human body of the target object; and taking the human body of the target object as the first feature part, and taking the first feature part as the feature part of the target object.
- The method according to claim 5 or 6, wherein determining the second feature part of the target object according to the face region of the target object, and determining that the feature part of the target object is the second feature part, comprises: performing face recognition within the tracking box to obtain the face region of the target object in the captured image; expanding the face region of the target object according to a preset expansion factor to obtain an expanded face region; and taking the second feature part of the target object contained in the expanded face region as the feature part of the target object.
- The method according to claim 8, wherein the second feature part includes a face and shoulders of the target object.
- The method according to claim 3 or 4, wherein tracking the target object using the feature part comprises: switching from tracking the target object using a face of the target object to tracking the target object using the first feature part.
- The method according to claim 5 or 6, wherein tracking the target object using the feature part comprises: switching from tracking the target object using the human body of the target object to tracking the target object using the second feature part.
- A tracking control apparatus, comprising: an acquiring module configured to acquire tracking parameters of a target object; a determining module configured to determine a feature part of the target object according to the tracking parameters; and a tracking module configured to track the target object using the feature part.
- The apparatus according to claim 12, wherein the tracking parameters include: a size ratio of a tracking box of the target object in a captured image and/or a distance between the target object and an aircraft.
- The apparatus according to claim 13, wherein the determining module is specifically configured to: when the tracking parameters satisfy a preset first tracking condition, determine that the feature part of the target object is a first feature part, the first feature part being a human body of the target object.
- The apparatus according to claim 14, wherein the preset first tracking condition includes: the size ratio being less than or equal to a preset first ratio threshold, and/or the distance between the target object and the aircraft being greater than or equal to a preset first distance.
- The apparatus according to claim 13, wherein the determining module is specifically configured to: when the tracking parameters satisfy a preset second tracking condition, determine a second feature part of the target object according to a face region of the target object, and determine that the feature part of the target object is the second feature part.
- The apparatus according to claim 16, wherein the preset second tracking condition includes: the size ratio being greater than or equal to a preset second ratio threshold, and/or the distance between the target object and the aircraft being less than or equal to a preset second distance.
- The apparatus according to claim 14 or 15, wherein the determining module determines that the feature part of the target object is the first feature part, the first feature part being the human body of the target object, by: obtaining a center line of the tracking box and the human bodies contained in the captured image; determining, among the human bodies contained in the captured image, the human body closest to the center line as the human body of the target object; and taking the human body of the target object as the first feature part, and taking the first feature part as the feature part of the target object.
- The apparatus according to claim 16 or 17, wherein the determining module determines the second feature part of the target object according to the face region of the target object, and determines that the feature part of the target object is the second feature part, by: performing face recognition within the tracking box to obtain the face region of the target object in the captured image; expanding the face region of the target object according to a preset expansion factor to obtain an expanded face region; and taking the second feature part of the target object contained in the expanded face region as the feature part of the target object.
- The apparatus according to claim 19, wherein the second feature part includes a face and shoulders of the target object.
- The apparatus according to claim 14 or 15, wherein the tracking module is specifically configured to: switch from tracking the target object using a face of the target object to tracking the target object using the first feature part.
- The apparatus according to claim 16 or 17, wherein the tracking module is specifically configured to: switch from tracking the target object using the human body of the target object to tracking the target object using the second feature part.
- An aircraft, comprising: a processor and a memory, the processor and the memory being connected by a bus, the memory storing executable program code, and the processor being configured to call the executable program code to perform the tracking control method according to any one of claims 1 to 11.
Priority Applications (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2017/082562 WO2018195979A1 (zh) | 2017-04-28 | 2017-04-28 | Tracking control method, device, and aircraft |
| CN201780004509.5A CN108475072A (zh) | 2017-04-28 | 2017-04-28 | Tracking control method, device, and aircraft |
| US16/664,804 US11587355B2 (en) | 2017-04-28 | 2019-10-26 | Tracking control method, device, and aircraft |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| PCT/CN2017/082562 WO2018195979A1 (zh) | 2017-04-28 | 2017-04-28 | Tracking control method, device, and aircraft |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/664,804 Continuation US11587355B2 (en) | 2017-04-28 | 2019-10-26 | Tracking control method, device, and aircraft |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2018195979A1 (zh) | 2018-11-01 |
Family ID: 63266467
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/CN2017/082562 Ceased WO2018195979A1 (zh) | 2017-04-28 | 2017-04-28 | Tracking control method, device, and aircraft |
Country Status (3)
| Country | Link |
|---|---|
| US (1) | US11587355B2 (zh) |
| CN (1) | CN108475072A (zh) |
| WO (1) | WO2018195979A1 (zh) |
Families Citing this family (7)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US10168704B2 (en) * | 2017-06-05 | 2019-01-01 | Hanzhou Zero Zero Technology Co., Ltd. | System and method for providing easy-to-use release and auto-positioning for drone applications |
| WO2019144263A1 (zh) * | 2018-01-23 | 2019-08-01 | 深圳市大疆创新科技有限公司 | Control method and device for movable platform, and computer-readable storage medium |
| CN109196438A (zh) * | 2018-01-23 | 2019-01-11 | 深圳市大疆创新科技有限公司 | Flight control method, device, aircraft, system, and storage medium |
| CN111627256A (zh) * | 2019-02-28 | 2020-09-04 | 上海博泰悦臻电子设备制造有限公司 | Unmanned aerial vehicle control method, vehicle-mounted terminal, and computer-readable storage medium |
| CN111602139A (zh) * | 2019-05-31 | 2020-08-28 | 深圳市大疆创新科技有限公司 | Image processing method and apparatus, control terminal, and movable device |
| WO2020258258A1 (zh) * | 2019-06-28 | 2020-12-30 | 深圳市大疆创新科技有限公司 | Target following method and system, readable storage medium, and movable platform |
| US11394865B1 (en) * | 2021-06-28 | 2022-07-19 | Western Digital Technologies, Inc. | Low-power, fast-response machine learning autofocus enhancements |
Family Cites Families (27)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8331674B2 (en) * | 2007-04-06 | 2012-12-11 | International Business Machines Corporation | Rule-based combination of a hierarchy of classifiers for occlusion detection |
| US10645344B2 (en) * | 2010-09-10 | 2020-05-05 | Avigilion Analytics Corporation | Video system with intelligent visual display |
| US9530060B2 (en) * | 2012-01-17 | 2016-12-27 | Avigilon Fortress Corporation | System and method for building automation using video content analysis with depth sensing |
| US9125987B2 (en) * | 2012-07-17 | 2015-09-08 | Elwha Llc | Unmanned device utilization methods and systems |
| US9152243B2 (en) * | 2012-08-23 | 2015-10-06 | Qualcomm Incorporated | Object tracking using background and foreground models |
| CN103634521A (zh) * | 2012-08-30 | 2014-03-12 | 苏州翔合智能科技有限公司 | Camera shooting method and system |
| US9165190B2 (en) * | 2012-09-12 | 2015-10-20 | Avigilon Fortress Corporation | 3D human pose and shape modeling |
| US9070202B2 (en) * | 2013-03-14 | 2015-06-30 | Nec Laboratories America, Inc. | Moving object localization in 3D using a single camera |
| US9367067B2 (en) * | 2013-03-15 | 2016-06-14 | Ashley A Gilmore | Digital tethering for tracking with autonomous aerial robot |
| CN103679142B (zh) * | 2013-12-02 | 2016-09-07 | 宁波大学 | Target human body recognition method based on spatial constraints |
| WO2015104644A2 (en) * | 2014-01-10 | 2015-07-16 | The Eye Tribe Aps | Light modulation in eye tracking devices |
| JP6648411B2 (ja) * | 2014-05-19 | 2020-02-14 | 株式会社リコー | Processing device, processing system, processing program, and processing method |
| JP6370140B2 (ja) * | 2014-07-16 | 2018-08-08 | キヤノン株式会社 | Zoom control device, imaging device, control method for zoom control device, control program for zoom control device, and storage medium |
| CN107703963B (zh) * | 2014-07-30 | 2020-12-01 | 深圳市大疆创新科技有限公司 | Target tracking system and method |
| EP3065042B1 (en) * | 2015-02-13 | 2018-11-07 | LG Electronics Inc. | Mobile terminal and method for controlling the same |
| US20160299229A1 (en) * | 2015-04-09 | 2016-10-13 | Sharper Shape Oy | Method and system for detecting objects |
| EP3101889A3 (en) * | 2015-06-02 | 2017-03-08 | LG Electronics Inc. | Mobile terminal and controlling method thereof |
| CN105139420B (zh) * | 2015-08-03 | 2017-08-29 | 山东大学 | Video target tracking method based on particle filtering and perceptual hashing |
| CN108139757A (zh) * | 2015-09-11 | 2018-06-08 | 深圳市大疆创新科技有限公司 | Systems and methods for detecting and tracking movable objects |
| EP3353706A4 (en) * | 2015-09-15 | 2019-05-08 | SZ DJI Technology Co., Ltd. | SYSTEM AND METHOD FOR SUPPORTING A JUMP-FREE GUIDANCE |
| US9781350B2 (en) * | 2015-09-28 | 2017-10-03 | Qualcomm Incorporated | Systems and methods for performing automatic zoom |
| CN105678809A (zh) * | 2016-01-12 | 2016-06-15 | 湖南优象科技有限公司 | Handheld automatic follow-shot device and target tracking method thereof |
| EP3420428B1 (en) * | 2016-02-26 | 2022-03-23 | SZ DJI Technology Co., Ltd. | Systems and methods for visual target tracking |
| US11255663B2 (en) * | 2016-03-04 | 2022-02-22 | May Patents Ltd. | Method and apparatus for cooperative usage of multiple distance meters |
| CN105894538A (zh) * | 2016-04-01 | 2016-08-24 | 海信集团有限公司 | Target tracking method and apparatus |
| CN106204647B (zh) * | 2016-07-01 | 2019-05-10 | 国家新闻出版广电总局广播科学研究院 | Visual target tracking method based on multiple features and group sparsity |
| CN106292721A (zh) * | 2016-09-29 | 2017-01-04 | 腾讯科技(深圳)有限公司 | Method, device, and system for controlling an aircraft to track a target object |
- 2017-04-28: WO application PCT/CN2017/082562 filed; published as WO2018195979A1 (zh); status: not active, ceased
- 2017-04-28: CN application CN201780004509.5A filed; published as CN108475072A (zh); status: active, pending
- 2019-10-26: US application US16/664,804 filed; granted as US11587355B2 (en); status: active
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2016029170A1 (en) * | 2014-08-22 | 2016-02-25 | Cape Productions Inc. | Methods and apparatus for automatic editing of video recorded by an unmanned aerial vehicle |
| CN105847684A (zh) * | 2016-03-31 | 2016-08-10 | 深圳奥比中光科技有限公司 | Unmanned aerial vehicle |
| CN105979147A (zh) * | 2016-06-22 | 2016-09-28 | 上海顺砾智能科技有限公司 | Intelligent shooting method for unmanned aerial vehicle |
Cited By (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN111144215A (zh) * | 2019-11-27 | 2020-05-12 | 北京迈格威科技有限公司 | Image processing method and apparatus, electronic device, and storage medium |
| CN111144215B (zh) * | 2019-11-27 | 2023-11-24 | 北京迈格威科技有限公司 | Image processing method and apparatus, electronic device, and storage medium |
| CN112037253A (zh) * | 2020-08-07 | 2020-12-04 | 浙江大华技术股份有限公司 | Target tracking method and apparatus thereof |
Also Published As
| Publication number | Publication date |
|---|---|
| US11587355B2 (en) | 2023-02-21 |
| US20200057881A1 (en) | 2020-02-20 |
| CN108475072A (zh) | 2018-08-31 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2018195979A1 (zh) | | Tracking control method, device, and aircraft |
| KR102364993B1 (ko) | | Gesture recognition method, apparatus, and device |
| US11227388B2 (en) | | Control method and device for mobile platform, and computer readable storage medium |
| WO2019061079A1 (zh) | | Focusing processing method and device |
| JP6187817B2 (ja) | | Face detection apparatus, method, and program |
| CN113302907B (zh) | | Shooting method, apparatus, device, and computer-readable storage medium |
| TW201941104A (zh) | | Control method, apparatus, device, and storage medium for smart device |
| JP6340957B2 (ja) | | Object detection device and object detection program |
| WO2018120033A1 (zh) | | Method and apparatus for assisting a user in finding an object |
| EP3429185B1 (en) | | Method, device and apparatus for determining focus window |
| CN105809956B (zh) | | Method and apparatus for obtaining vehicle queue length |
| US9165364B1 (en) | | Automatic tracking image pickup system |
| CN110091866A (zh) | | Parking path acquisition method and apparatus |
| BR112019004631B1 (pt) | | Parking assistance apparatus and method |
| WO2019210599A1 (zh) | | Parking space recognition method and parking method |
| CN108875672A (zh) | | Screen brightness adjustment method and apparatus for smart wearable device |
| WO2019051813A1 (zh) | | Target recognition method, apparatus, and intelligent terminal |
| WO2018090252A1 (zh) | | Method for robot voice instruction recognition and related robot apparatus |
| TW201923499A (zh) | | Moving body control system, moving body control method, and program |
| CN115972955A (zh) | | Automatic vehicle charging method and apparatus based on charging robot |
| CN117376712A (zh) | | Method, apparatus, device, and storage medium for adjusting imaging parameters of a vehicle-mounted camera |
| CN116363693A (zh) | | Automatic following method and apparatus based on depth camera and vision algorithm |
| US10410044B2 (en) | | Image processing apparatus, image processing method, and storage medium for detecting object from image |
| CN114851211B (zh) | | Boom trajectory planning method and apparatus, server, and storage medium |
| CN111506082A (zh) | | Automatic follow-shooting obstacle avoidance system and method |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17907394; Country of ref document: EP; Kind code of ref document: A1 |
| | NENP | Non-entry into the national phase | Ref country code: DE |
| | 122 | Ep: pct application non-entry in european phase | Ref document number: 17907394; Country of ref document: EP; Kind code of ref document: A1 |