WO2018176426A1 - Flight control method for unmanned aerial vehicle, and unmanned aerial vehicle - Google Patents
Flight control method for unmanned aerial vehicle, and unmanned aerial vehicle
- Publication number
- WO2018176426A1 (application PCT/CN2017/079134)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- drone
- wearable device
- target object
- information
- position information
- Prior art date
Classifications
- G — PHYSICS
  - G05 — CONTROLLING; REGULATING
    - G05D — SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
      - G05D1/00 — Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
        - G05D1/0011 — associated with a remote control arrangement
          - G05D1/0016 — characterised by the operator's input device
        - G05D1/10 — Simultaneous control of position or course in three dimensions
          - G05D1/101 — specially adapted for aircraft
- G — PHYSICS
  - G01 — MEASURING; TESTING
    - G01C — MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
      - G01C21/00 — Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
        - G01C21/20 — Instruments for performing navigational calculations
Definitions
- the invention relates to the field of UAV communication, and in particular to a flight control method for a UAV and to a UAV.
- the consumer drone market is currently booming, and most consumer-grade drones are used for aerial photography.
- visual tracking technology is often used to control a drone to automatically follow a target object.
- existing visual tracking technology has difficulty obtaining the position of the target object and recognizing it again after it passes behind an obstacle, so the drone easily loses the target object.
- a remote controller is often used to control the drone, but when the user goes outdoors for activities such as skiing, mountaineering, or mountain biking, carrying a relatively bulky remote controller is inconvenient.
- the technical problem to be solved by the present invention is to provide a flight control method for a drone, and a drone, that address both the tendency of prior-art visual tracking to lose the target object and the inconvenience of carrying a remote controller.
- the first technical solution adopted by the present invention is to provide a flight control method for a drone, comprising: acquiring position information and action information of a wearable device worn by a target object; determining whether the action information of the wearable device matches a preset first action template; and, if it matches the first action template, adjusting, according to the position information of the wearable device, at least one of or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone, such that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device.
- the wearable device is a wristband or a watch worn on the arm of the target object
- the first motion template corresponds to a waving motion of the target object
- the adjusting step includes: adjusting the horizontal flight position of the drone according to the horizontal position information of the wearable device and the horizontal position information of the drone, to adjust the horizontal relative distance between the drone and the wearable device to within a first predetermined distance range; adjusting the flying height of the drone according to the height information of the wearable device and the height information of the drone, to adjust the relative height of the drone and the wearable device to within a second predetermined distance range; and calculating the angle of the line between the drone and the wearable device relative to the horizontal or vertical direction according to the horizontal relative distance and the relative height, and adjusting the shooting angle of the imaging device according to that angle so that the optical axis direction of the imaging device is within a predetermined angle range relative to the line between the drone and the wearable device.
- the method further comprises: visually recognizing the target object from within the captured image of the imaging device.
- the step of visually recognizing the target object from the captured image of the imaging device includes: performing motion recognition on at least two candidate objects in the captured image to respectively acquire motion information of the at least two candidate objects; matching the motion information of each candidate object against the motion information of the wearable device or the first action template; and using the matched candidate object as the target object.
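The candidate-disambiguation step above can be sketched in a few lines. This is a minimal illustration rather than the patent's actual algorithm: it assumes each candidate's motion has already been reduced to an acceleration-magnitude time series aligned in time with the wearable's IMU trace, and it uses normalized cross-correlation as the similarity measure (the text leaves the metric open); the function name `match_candidate` is hypothetical.

```python
import numpy as np

def match_candidate(wearable_accel, candidate_accels):
    """Pick the candidate whose motion best matches the wearable's.

    wearable_accel: 1-D acceleration-magnitude series from the wearable IMU.
    candidate_accels: one 1-D series per on-screen candidate, assumed
    time-aligned and equal in length to wearable_accel.
    Returns the index of the best-matching candidate.
    """
    def ncc(a, b):
        # normalized cross-correlation at zero lag
        a = (a - a.mean()) / (a.std() + 1e-9)
        b = (b - b.mean()) / (b.std() + 1e-9)
        return float(np.mean(a * b))

    scores = [ncc(wearable_accel, c) for c in candidate_accels]
    return int(np.argmax(scores))
```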
- the method further includes: controlling the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition.
- the step of controlling the drone to track the target object according to the subsequently obtained position information of the wearable device and the position information of the target object obtained by visual recognition comprises: adjusting the flight position of the drone according to the position information of the wearable device, and adjusting the shooting angle of the drone according to the position information of the target object obtained by visual recognition.
- the step of controlling the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition further comprises: determining whether the difference between the position information of the wearable device and the position information of the target object obtained by visual recognition is greater than a preset threshold; if it is greater than the preset threshold, re-adjusting, according to the position information of the wearable device, at least one of or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device carried by the drone, such that the drone is within a predetermined distance range of the wearable device and the target object is within the predetermined shooting range of the imaging device.
- after at least one of or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone is re-adjusted according to the position information of the wearable device, the target object is visually recognized again.
- the method further includes: determining whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches a preset second action template; if it matches the second action template, adjusting the orientation of the drone.
- the wearable device is a wristband or a watch worn on the arm of the target object
- the second motion template corresponds to the wrist flipping action of the target object
- the method further includes: acquiring posture information of the wearable device; determining whether the posture information of the wearable device satisfies a preset orientation adjustment trigger condition; and, if the orientation adjustment trigger condition is satisfied, performing the step of determining whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches the preset second action template.
- the step of determining whether the posture information of the wearable device satisfies the preset orientation adjustment trigger condition comprises: determining, according to the posture information of the wearable device and the relative positional relationship between the drone and the wearable device, whether the pointing direction of the limb on which the target object wears the wearable device is within a preset angle range relative to the line between the drone and the wearable device; if it is within the preset angle range, the orientation adjustment trigger condition is satisfied.
- the step of adjusting the orientation of the drone includes: adjusting the relative positional relationship between the drone and the wearable device according to the posture information of the wearable device obtained subsequently.
- the step of adjusting the relative positional relationship between the UAV and the wearable device according to the subsequently obtained posture information of the wearable device includes: adjusting that relative positional relationship so that the pointing direction of the limb on which the target object wears the wearable device is always within a preset angle range relative to the line between the drone and the wearable device.
- the step of adjusting the relative positional relationship between the drone and the wearable device according to the subsequently obtained posture information of the wearable device includes: recording the angle between the pointing direction of the limb on which the target object wears the wearable device and the line between the drone and the wearable device; and adjusting the relative positional relationship between the drone and the wearable device according to the subsequently obtained posture information, so that after the adjustment the pointing direction of that limb coincides with the line between the drone and the wearable device.
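The coincidence adjustment described above reduces to simple plane geometry. The sketch below assumes a local horizontal frame, a limb azimuth already extracted from the wearable's posture information, and the design choice that the drone keeps its current horizontal standoff distance while moving onto the ray along which the limb points; the names and conventions are illustrative, not the patent's.

```python
import math

def reposition_drone(wearable_xy, drone_xy, limb_azimuth_rad):
    """Move the drone onto the ray defined by the limb's pointing direction,
    preserving the current horizontal distance to the wearable device.

    wearable_xy, drone_xy: (x, y) horizontal positions in a local frame.
    limb_azimuth_rad: pointing direction of the limb wearing the device,
    measured from the +x axis (an assumed convention).
    Returns the target (x, y) for the drone.
    """
    dx = drone_xy[0] - wearable_xy[0]
    dy = drone_xy[1] - wearable_xy[1]
    dist = math.hypot(dx, dy)  # keep the current standoff distance
    return (wearable_xy[0] + dist * math.cos(limb_azimuth_rad),
            wearable_xy[1] + dist * math.sin(limb_azimuth_rad))
```

For example, a drone 5 m from the wearer is moved to the point 5 m along the limb's azimuth, so that the limb's pointing and the drone-wearable line coincide.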
- the second technical solution adopted by the present invention is to provide a flight control method for a drone, comprising: acquiring position information of a wearable device worn by the target object; and controlling the drone to track the target object by combining the obtained position information of the wearable device with the position information of the target object obtained by visual recognition.
- the step of controlling the drone to track the target object according to the obtained position information of the wearable device and the position information of the target object obtained by visual recognition comprises: adjusting the flight position of the drone according to the position information of the wearable device, and adjusting the shooting angle of the drone according to the position information of the target object obtained by visual recognition.
- the method further includes: determining whether the difference between the location information of the wearable device and the location information of the target object obtained by visual recognition is greater than a first preset threshold; if it is greater than the first preset threshold, adjusting the flight position of the drone according to the location information of the wearable device so that the drone is within a predetermined distance of the wearable device.
- after the flight position is adjusted, the method further includes: adjusting the shooting angle of the imaging device mounted on the drone so that the target object is within a predetermined shooting range of the imaging device.
- the method further comprises: visually recognizing the target object from within the captured image of the imaging device.
- the method further includes: determining whether the duration for which the difference between the position information of the wearable device and the position information of the target object obtained by visual recognition exceeds the first preset threshold is greater than a second preset threshold; if the duration is greater than the second preset threshold, controlling the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition; otherwise, re-adjusting the flight position of the drone according to the position information of the wearable device.
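The duration test appears in two opposite senses in different passages of this text (likely a translation artifact), so the sketch below follows one plausible reading: normally fuse the wearable and vision position estimates, but if they disagree by more than a distance threshold for longer than a duration threshold, fall back to the wearable's position alone. All names and threshold values are illustrative.

```python
def tracking_mode(diff_m, exceeded_since_s, now_s,
                  dist_threshold_m=2.0, duration_threshold_s=3.0):
    """Choose the tracking source for the current control cycle.

    diff_m: current distance between the wearable-GPS and visual estimates.
    exceeded_since_s: time at which the divergence began, or None.
    now_s: current time in seconds.
    Returns (mode, new_exceeded_since_s), where mode is "fused" or
    "wearable_only".
    """
    if diff_m <= dist_threshold_m:
        return "fused", None          # estimates agree; reset the timer
    if exceeded_since_s is None:
        return "fused", now_s         # divergence just began
    if now_s - exceeded_since_s > duration_threshold_s:
        return "wearable_only", exceeded_since_s  # persistent divergence
    return "fused", exceeded_since_s
```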
- the third technical solution adopted by the present invention is to provide a drone, comprising: a wireless communication circuit, configured to acquire position information and action information of a wearable device worn by a target object; and a processor coupled to the wireless communication circuit, configured to determine whether the action information of the wearable device matches a preset first action template and, when the action information of the wearable device matches the first action template, to adjust, according to the position information of the wearable device, at least one of or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone, such that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device.
- the wearable device is a wristband or a watch worn on the arm of the target object
- the first motion template corresponds to a waving motion of the target object
- the processor's adjusting, according to the position information of the wearable device, at least one of or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone specifically includes: adjusting the horizontal flight position of the drone according to the horizontal position information of the wearable device and the horizontal position information of the drone, to adjust the horizontal relative distance between the drone and the wearable device to within a first predetermined distance range; adjusting the flying height of the drone according to the height information of the wearable device and the height information of the drone, to adjust the relative height of the drone and the wearable device to within a second predetermined distance range; and calculating the angle of the line between the drone and the wearable device relative to the horizontal or vertical direction according to the horizontal relative distance and the relative height, and adjusting the shooting angle of the imaging device according to that angle so that the optical axis direction of the imaging device is within a predetermined angle range relative to the line between the drone and the wearable device.
- the processor is further configured to: visually recognize the target object from within the captured image of the imaging device.
- the visually recognizing the target object from the captured image of the imaging device includes: performing motion recognition on at least two candidate objects in the captured image to respectively acquire motion information of the at least two candidate objects; matching the motion information of each candidate object against the motion information of the wearable device or the first action template; and using the matched candidate object as the target object.
- the processor is further configured to: control the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition.
- the processor's controlling the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition specifically includes: adjusting the flight position of the drone according to the position information of the wearable device, and adjusting the shooting angle of the drone according to the position information of the target object obtained by visual recognition.
- the processor's controlling the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition further includes: determining whether the difference between the position information of the wearable device and the position information of the target object obtained by visual recognition is greater than a preset threshold; if it is greater than the preset threshold, re-adjusting, according to the position information of the wearable device, at least one of or a combination of the flight position of the drone, the shooting angle of the imaging device carried by the drone, and the imaging size of the target object in the imaging device mounted on the drone.
- after re-adjusting, according to the position information of the wearable device, at least one of or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone, the processor is further configured to: visually recognize the target object again.
- the processor is further configured to: determine whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches a preset second action template, and to adjust the orientation of the drone when the obtained action information matches the second action template.
- the wearable device is a wristband or a watch worn on the arm of the target object
- the second motion template corresponds to the wrist flipping action of the target object
- before the processor determines whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches the preset second action template, the wireless communication circuit is further configured to acquire posture information of the wearable device.
- the processor is further configured to determine whether the posture information of the wearable device satisfies a preset azimuth adjustment trigger condition and, when the azimuth adjustment trigger condition is satisfied, to perform the step of determining whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches the preset second action template.
- the determining whether the posture information of the wearable device meets the preset orientation adjustment trigger condition specifically includes: determining, according to the posture information of the wearable device and the relative positional relationship between the drone and the wearable device, whether the pointing direction of the limb on which the target object wears the wearable device is within a preset angle range relative to the line between the drone and the wearable device; when it is within the preset angle range, determining that the posture information of the wearable device satisfies the orientation adjustment trigger condition.
- the processor's adjusting the orientation of the drone specifically includes: adjusting the relative positional relationship between the drone and the wearable device according to the subsequently obtained posture information of the wearable device.
- the adjusting the relative positional relationship between the UAV and the wearable device according to the subsequently obtained posture information of the wearable device includes: adjusting that relative positional relationship so that the pointing direction of the limb on which the target object wears the wearable device is always within a preset angle range relative to the line between the drone and the wearable device.
- the processor's adjusting the relative positional relationship between the drone and the wearable device according to the subsequently obtained posture information of the wearable device includes: recording the angle between the pointing direction of the limb on which the target object wears the wearable device and the line between the drone and the wearable device; and adjusting the relative positional relationship between the drone and the wearable device according to the subsequently obtained posture information, so that after the adjustment the pointing direction of that limb coincides with the line between the drone and the wearable device.
- the fourth technical solution adopted by the present invention is to provide a drone, comprising: a wireless communication circuit, configured to acquire location information of a wearable device worn by the target object; and a processor coupled to the wireless communication circuit, configured to control the drone to track the target object by combining the obtained position information of the wearable device with the position information of the target object obtained by visual recognition.
- the processor's controlling the drone to track the target object by combining the obtained position information of the wearable device with the position information of the target object obtained by visual recognition specifically includes: adjusting the flight position of the drone according to the position information of the wearable device, and adjusting the shooting angle of the drone according to the position information of the target object obtained by visual recognition.
- the processor is further configured to: determine whether the difference between the location information of the wearable device and the location information of the target object obtained by visual recognition is greater than a first preset threshold and, when it is greater than the first preset threshold, adjust the flight position of the drone according to the location information of the wearable device such that the drone is within a predetermined distance of the wearable device.
- after adjusting the flight position of the drone according to the position information of the wearable device, the processor is further configured to: adjust the shooting angle of the imaging device mounted on the drone so that the target object is within the predetermined shooting range of the imaging device.
- the processor is further configured to: visually recognize the target object from within the captured image of the imaging device.
- the processor is further configured to: determine whether the duration for which the difference between the position information of the wearable device and the position information of the target object obtained by visual recognition exceeds the first preset threshold is greater than a second preset threshold; when the duration is not greater than the second preset threshold, control the drone to track the target object by combining the subsequently obtained position information of the wearable device with the position information of the target object obtained by visual recognition; otherwise, re-adjust the flight position of the drone according to the position information of the wearable device.
- by acquiring the position information and the action information of the wearable device worn by the target object, the embodiment of the present invention determines whether the action information of the wearable device matches the preset first action template and, when they match, adjusts at least one of or a combination of the flight position of the drone and the shooting angle of the imaging device mounted on the drone according to the position information of the wearable device, so that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device; the user can thus control the drone's flight through the portable wearable device without carrying a bulky remote controller, improving the convenience of drone control;
- the embodiment of the present invention controls the drone to track the target object by combining the obtained position information of the wearable device with the position information of the target object obtained by visual recognition, thereby using the position information of the wearable device to compensate for unstable visual recognition and improve the accuracy of target tracking.
- FIG. 1 is a schematic flow chart of a first embodiment of a flight control method for a drone of the present invention
- FIG. 2 is a schematic diagram showing the relative position change between the UAV and the target object before and after the execution of step S14 in FIG. 1;
- FIG. 3 is a schematic flow chart of a second embodiment of a flight control method for a drone of the present invention.
- FIG. 4 is a schematic flow chart of a third embodiment of a flight control method for a drone of the present invention.
- Figure 5 is a schematic diagram showing the relative position change between the drone and the target object before and after the step S3211;
- Figure 6 is a schematic diagram showing the relative position change between the drone and the target object before and after the step S3213;
- FIG. 7 is a schematic diagram of a process of determining whether the UAV azimuth adjustment trigger condition is satisfied in step S301;
- Figure 8 is a schematic view showing the structure of an embodiment of the drone of the present invention.
- FIG. 1 is a schematic flow chart of a first embodiment of a flight control method for a drone of the present invention.
- the flight control method of the drone of the present invention includes:
- Step S10 acquiring location information and action information of the wearable device worn by the target object
- the wearable device is a wristband or a wristwatch worn on the arm of the target object; the position information is the GPS position data of the wearable device; the action information is at least one type of data of the wearable device, such as acceleration, angular velocity, or motion trajectory, and is detected by an inertial measurement unit of the wearable device or a sensor such as a magnetometer.
- the wearable device may also be other types of devices such as a ring that is worn on the finger of the target object, and is not specifically limited herein.
- the wearable device worn by the target object is bound to the drone in advance and can communicate with the drone through a wireless communication link, such as a 4G network, Wi-Fi, or Bluetooth.
- Step S12 determining whether the action information of the wearable device matches the preset first action template
- the first action template is preset posture data, and is associated with some control command of the drone in advance.
- the first motion template corresponds to a waving motion of the target object; that is, the first motion template is the motion information generated by the wearable device worn by the target object when the target object waves. When the drone obtains the motion information of the wearable device (for example, the acceleration), it compares that information against the first action template to determine whether they match. In other embodiments, the first action template may correspond to other actions, such as a hand-raising action of the target object.
- the matching determination process may also adopt other methods, such as computing the variance, which is not specifically limited herein.
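As one concrete instance of the matching determination mentioned above — purely illustrative, since the text leaves the metric open — the recorded acceleration trace can be compared with a stored template by mean squared error, with both traces assumed resampled to the same length and units; the function name and threshold are hypothetical.

```python
import numpy as np

def matches_template(action_accel, template_accel, max_mse=0.5):
    """Return True if the recorded trace is close enough to the template.

    action_accel: acceleration trace recorded from the wearable device.
    template_accel: preset first action template (e.g. a waving motion).
    max_mse: similarity threshold; the value here is an arbitrary example.
    """
    a = np.asarray(action_accel, dtype=float)
    t = np.asarray(template_accel, dtype=float)
    mse = float(np.mean((a - t) ** 2))
    return mse <= max_mse
```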
- Step S14 if the action information matches the first motion template, adjusting, according to the position information of the wearable device, at least one of or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone, such that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined imaging range of the imaging device.
- step S14 includes:
- Step S141 adjusting the horizontal flight position of the drone according to the horizontal position information of the wearable device and the horizontal position information of the drone to adjust the horizontal relative distance between the drone and the wearable device to a first predetermined distance range;
- the first predetermined distance range is a preset first distance threshold range, which may be set according to specific requirements, and is not specifically limited herein; the horizontal position information of the wearable device may be through the wearable device.
- the transmitted GPS position data is obtained, and the horizontal position information of the drone can be obtained by the GPS locator of the drone.
- the horizontal position information of the wearable device 20 and the horizontal position information of the drone 10 are obtained; that is, according to the GPS position data of the wearable device 20 and the GPS position data of the drone 10, the horizontal relative distance between the drone 10 and the wearable device 20 is calculated, and the horizontal flight position of the drone 10 is adjusted accordingly, for example, by flying from position A to position B in FIG. 2, so that the horizontal relative distance between the drone 10 and the wearable device 20 is gradually reduced/increased to within the first predetermined distance range (for example, 1-2 meters), as in FIG. 2 where the distance is reduced from X1 to X2.
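The horizontal-distance check of step S141 could be sketched from the two GPS fixes as below; the haversine formula, the function names, and the 1-2 meter bounds are illustrative assumptions, not the claimed implementation.

```python
import math

EARTH_RADIUS_M = 6371000.0

def horizontal_distance_m(lat1, lon1, lat2, lon2):
    """Approximate ground distance between two GPS fixes (haversine)."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2)
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))

def needs_horizontal_adjust(drone_fix, wearable_fix, lo=1.0, hi=2.0):
    """True if the drone must move to bring the distance into [lo, hi]."""
    d = horizontal_distance_m(*drone_fix, *wearable_fix)
    return not (lo <= d <= hi)
```

At the few-meter ranges in the example, GPS error dominates the haversine approximation error, so a flat-earth equirectangular formula would be equally valid.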
- Step S142 adjusting the flying height of the drone according to the height information of the wearable device and the height information of the drone to adjust the relative height of the drone and the wearable device to a second predetermined distance range;
- the second predetermined distance range is a preset second distance threshold range, which may be set according to specific requirements, and is not specifically limited herein.
- the height information of the wearable device 20 may default to the same height as the ground, and the height information of the drone 10 may be obtained by an ultrasonic sensor or a barometer of the drone 10 or the like.
- the height difference between the wearable device 20 and the drone 10 can be calculated, and the flying height of the drone 10 can be adjusted, for example, by flying from position A to position B in FIG. 2, to gradually reduce/increase the relative height between the drone 10 and the wearable device 20 to within the second predetermined distance range (for example, 3-4 meters), as in FIG. 2 where the height is reduced from H1 to H2.
- the height information of the wearable device 20 can also be set to other default height values, or obtained by other sensors; the altitude information of the drone 10 can also be measured by a binocular vision system, which is not specifically limited herein.
- Step S143 Calculate the angle of the line between the drone and the wearable device relative to the horizontal or vertical direction according to the horizontal relative distance and the relative height of the drone and the wearable device, and adjust the shooting angle of the imaging device according to this angle, so that the optical axis direction of the imaging device is within a predetermined angular range relative to the line between the drone and the wearable device.
- the predetermined angle range is a preset angle threshold range, which can be set according to specific requirements, and is not specifically limited herein.
- the angle of the line between the drone 10 and the wearable device 20 (such as the CD line in FIG. 2) relative to the horizontal or vertical direction is calculated, and the shooting angle of the imaging device 101 is adjusted according to this angle, so that the optical axis direction of the imaging device 101 (such as the DE direction in FIG. 2) is within a predetermined angular range (for example, 0-10 degrees) relative to the line between the drone 10 and the wearable device 20 (i.e., the CD line).
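The geometry of step S143 reduces to the arctangent of the relative height over the horizontal distance; a minimal sketch, with hypothetical function names and the 0-10 degree range taken from the example above:

```python
import math

def line_pitch_deg(horizontal_distance, relative_height):
    """Angle of the drone-to-wearable line below the horizontal, degrees."""
    return math.degrees(math.atan2(relative_height, horizontal_distance))

def gimbal_within_range(gimbal_pitch_deg, horizontal_distance,
                        relative_height, max_offset_deg=10.0):
    """Check the optical axis lies within the predetermined angular range
    (e.g. 0-10 degrees) of the line between drone and wearable device."""
    target = line_pitch_deg(horizontal_distance, relative_height)
    return abs(gimbal_pitch_deg - target) <= max_offset_deg
```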
- Step S144 Adjust the focal length of the imaging device according to the horizontal relative distance and the relative height of the drone and the wearable device, so that the proportion of the imaging size of the target object in the imaging device within the entire captured image is within a predetermined ratio range.
- the predetermined ratio range is a preset range for the proportion of the target object's image, which may be set according to specific requirements and is not specifically limited herein.
- when the target object 30 is already within the shooting range of the imaging device 101 mounted on the drone 10, the distance between the drone 10 and the wearable device 20 can be calculated, and the focal length of the imaging device 101 is adjusted accordingly, gradually enlarging/reducing the shooting range so that the proportion of the imaging size of the target object 30 is within the predetermined ratio range, for example, 25% to 35%.
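Under a simple pinhole-camera assumption the subject's image size scales linearly with focal length, so the zoom needed to reach the target proportion can be sketched as follows; the function names and the 30% target are illustrative, not taken from the patent.

```python
def required_focal_length(current_focal, current_ratio, target_ratio=0.30):
    """With a pinhole model, image size scales linearly with focal length,
    so scale the focal length by the ratio of target to current proportion."""
    return current_focal * (target_ratio / current_ratio)

def ratio_in_range(ratio, lo=0.25, hi=0.35):
    """Check the subject's proportion against the 25%-35% example range."""
    return lo <= ratio <= hi

# Subject currently fills 15% of the frame at 24 mm: double the zoom.
print(required_focal_length(24.0, 0.15))  # 48.0
```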
- only one or more of the above steps S141-S144 may be performed in combination; for example, when it is determined that the acquired action information matches the first action template and the horizontal relative distance between the drone and the wearable device is already within the first predetermined distance range, only steps S142 and S143 need to be performed.
- the flight position of the drone may also be adjusted according to the imaging size of the target object in the imaging device, so that the proportion of the imaging size of the target object in the imaging device carried by the drone within the entire captured image is within the predetermined ratio range, while the horizontal relative distance and the relative height between the drone and the wearable device gradually increase/decrease to the predetermined distance ranges, which is not specifically limited herein.
- in this embodiment, the position information and the action information of the wearable device worn by the target object are obtained, it is determined whether the action information of the wearable device matches the preset first action template, and, upon a match, at least one or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone is adjusted according to the position information of the wearable device, so that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device; the user can thus control the drone's flight through a portable wearable device without carrying a bulky controller, improving the control convenience of the drone.
- FIG. 3 is a schematic flow chart of a second embodiment of a flight control method for a drone of the present invention.
- the flight control method of the drone of the present invention includes:
- Step S21 Control the drone to track the target object by combining the obtained position information of the wearable device and the position information of the target object obtained by visual recognition.
- step S21 specifically includes:
- Step S211 Adjust the flight position of the drone according to the position information of the wearable device, and adjust the shooting angle of the drone according to the position information of the target object obtained by the visual recognition.
- the wearable device is pre-bound with the drone, and during the flight of the drone, the flight position of the drone is adjusted according to the position information uploaded by the wearable device, so that the drone is within a predetermined distance range of the wearable device, for example, so that the linear distance between the drone and the wearable device is within the predetermined distance range (for example, 4-5 meters); the shooting angle of the drone is adjusted according to the position information of the target object obtained by visual recognition, so that the target object is within the predetermined shooting range of the imaging device, wherein the predetermined distance range and the predetermined shooting range can be set according to actual needs and are not specifically limited herein.
- before step S21, the method includes:
- Step S20 visually recognize the target object from within the photographing screen of the imaging device.
- the drone may trigger the visual recognition function through certain predetermined action information, or may trigger it automatically, which is not specifically limited herein.
- step S20 includes:
- Step S201 performing motion recognition on at least two candidate objects in the captured image to respectively acquire motion information of at least two candidate objects;
- the drone may first perform contour recognition on the captured image, select at least two candidate objects that are relatively close to the human body contour for motion recognition, and acquire the motion information of the at least two candidate objects, such as motion trajectories and acceleration, by using a visual recognition algorithm.
- at least two objects in the screen may be randomly selected as candidate objects, and other selection manners may be used, which are not specifically limited herein.
- Step S202 Matching the action information of the at least two candidate objects with the action information of the wearable device or the first action template;
- Step S203 The matched candidate object is used as the target object.
- the first action template is preset posture data, and includes at least one type of data such as acceleration, angular velocity, and motion track of the corresponding motion.
- the action information of the at least two candidate objects is matched with the action information uploaded by the wearable device or with the first action template, for example, by determining whether the motion trajectory of the candidate object is the same as the motion trajectory uploaded by the wearable device, or whether the difference between the two is within an allowable range, wherein the allowable range is preset; the matching candidate object is then used as the target object.
- only one candidate object may be selected for motion matching, or one of the objects in the captured image may be randomly selected as a target object, which is not specifically limited herein.
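The candidate matching of steps S202-S203 could be sketched as picking the candidate whose visually tracked trajectory deviates least from the wearable's uploaded trajectory; the `pick_target` name and the 0.5 tolerance are assumptions for illustration only.

```python
import numpy as np

def pick_target(candidates, wearable_traj, tol=0.5):
    """Return the index of the candidate whose visually tracked motion
    trajectory best matches the wearable's uploaded trajectory, or None
    if no candidate is within the allowed mean deviation `tol`."""
    wearable = np.asarray(wearable_traj, dtype=float)
    best, best_err = None, tol
    for i, cand in enumerate(candidates):
        err = float(np.mean(np.abs(np.asarray(cand, dtype=float) - wearable)))
        if err <= best_err:
            best, best_err = i, err
    return best
```

Returning `None` corresponds to the fallback in the text of re-selecting candidates or choosing an object at random.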
- step S21 includes:
- Step S212 determining whether the difference between the location information of the wearable device and the location information of the target object obtained by the visual recognition is greater than a first preset threshold
- the first preset threshold is the maximum allowable error of the preset visual ranging; if it is exceeded, the visual ranging is unreliable; the specific value may be set according to specific requirements and is not specifically limited herein.
- the drone can use a visual ranging algorithm, such as binocular vision ranging, to obtain the distance a between the target object and the drone, and can also use the position information of the wearable device and the position information of the drone to calculate the distance b between the drone and the wearable device; the difference between a and b can thus be calculated, and it can be determined whether the difference is greater than a preset threshold, for example, 0.5 meters.
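The reliability test of step S212 is a simple comparison of the visual distance a against the GPS-derived distance b; a sketch, with the function name assumed and the 0.5 m threshold taken from the example above:

```python
def visual_ranging_reliable(visual_distance_m, gps_distance_m,
                            max_error_m=0.5):
    """Compare the visually measured drone-target distance (a) with the
    GPS-derived drone-wearable distance (b); a gap above the first
    preset threshold means the visual ranging cannot be trusted."""
    return abs(visual_distance_m - gps_distance_m) <= max_error_m

print(visual_ranging_reliable(4.2, 4.0))  # True  (0.2 m gap)
print(visual_ranging_reliable(6.0, 4.0))  # False (2.0 m gap)
```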
- Step S213 If it is greater than the first preset threshold, adjust, according to the position information of the wearable device, at least one or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone, such that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device.
- the step S213 specifically includes:
- Step S2131 If it is greater than the first preset threshold, adjust the flight position of the drone according to the position information of the wearable device, so that the drone is within a predetermined distance range of the wearable device.
- when the drone loses the target object during visual tracking, or when the visual tracking is in error, the drone can be controlled according to the position information of the wearable device, so that the drone is brought back within the predetermined distance range of the wearable device, thereby preventing the drone from losing the target.
- Step S2132 Adjust the shooting angle of the imaging device mounted on the drone so that the target object is within a predetermined shooting range of the imaging device.
- Step S2133 Adjust the focal length of the imaging device according to the horizontal relative distance and the relative height of the drone and the wearable device, so that the proportion of the imaging size of the target object in the imaging device within the entire captured image is within the predetermined ratio range.
- only one of steps S2131, S2132, and S2133, or a combination of any two of them, may also be performed, which is not specifically limited herein.
- Step S214 Perform visual recognition on the target object again.
- to perform visual recognition again, the drone can use a visual detection algorithm, for example, identifying the target object by feature extraction, or randomly selecting an object in the captured image as the target object; the specific process may refer to step S20 above and is not repeated here.
- Step S215 Determine whether the duration for which the difference between the position information of the wearable device and the position information of the target object obtained by visual recognition remains not greater than the first preset threshold exceeds a second preset threshold;
- the second preset threshold is a preset time threshold for the visual ranging; a duration exceeding the second preset threshold indicates that the visual ranging is stable and reliable; the specific value may be set according to specific requirements and is not specifically limited herein.
- Step S216 If the duration is greater than the second preset threshold, control the drone to track the target object by combining the subsequently obtained position information of the wearable device and the position information of the target object obtained by visual recognition; otherwise, adjust the flight position of the drone according to the position information of the wearable device.
- the drone continuously determines whether the difference between the position information of the wearable device and the position information of the target object obtained by visual recognition is not greater than the first preset threshold, and records the duration for which this difference remains not greater than the first preset threshold; if the duration is greater than the second preset threshold (for example, 1 minute), the visual ranging is stable and reliable, that is, the visual tracking of the drone is stable and reliable, and the drone can be controlled to track the target object by combining the subsequently obtained position information of the wearable device and the position information of the target object obtained by visual recognition; otherwise, the visual tracking of the drone is unstable, and the flight position of the drone is adjusted according to the position information of the wearable device.
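The duration logic of steps S215-S216 behaves like a hold-off timer that resets whenever the discrepancy grows; a sketch under assumed class and mode names, with the 0.5 m and 1-minute thresholds taken from the examples:

```python
class TrackingModeSelector:
    """Switch to combined (GPS + visual) tracking only after the
    visual/GPS discrepancy has stayed small for `hold_s` seconds."""

    def __init__(self, max_error_m=0.5, hold_s=60.0):
        self.max_error_m = max_error_m
        self.hold_s = hold_s
        self._ok_since = None  # time the discrepancy first became small

    def update(self, t, visual_d, gps_d):
        """Feed one measurement at time t; return the mode to use."""
        if abs(visual_d - gps_d) <= self.max_error_m:
            if self._ok_since is None:
                self._ok_since = t
            if t - self._ok_since > self.hold_s:
                return "combined"
        else:
            self._ok_since = None  # discrepancy too large: restart timer
        return "gps-only"
```

A usage example: after 61 seconds of agreeing measurements the selector switches to combined tracking, and a single large discrepancy drops it straight back to GPS-only.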
- in this embodiment, by combining the obtained position information of the wearable device and the position information of the target object obtained by visual recognition, the drone is controlled to track the target object, using the position information of the wearable device to compensate for unstable visual recognition and improving the accuracy of target tracking.
- this embodiment may be combined with the first embodiment of the flight control method of the unmanned aerial vehicle of the present invention, and the steps of the present embodiment may be executed after step S10 or step S14.
- FIG. 4 is a schematic flow chart of a third embodiment of a flight control method for a drone of the present invention.
- the flight control method of the drone of the present invention includes:
- Step S31 determining whether the acquired action information of the wearable device or the action information obtained by performing motion recognition on the target object matches the preset second action template;
- the wearable device is a wristband or a watch worn on the arm of the target object, and the second action template corresponds to a wrist-flipping action of the target object.
- the motion information is attitude data of the wearable device, and includes at least one type of data such as an acceleration, an angular velocity, or a motion trajectory, and the motion information is detected by a sensor such as an inertial measurement unit of the wearable device or a magnetometer.
- the wearable device may be another type of device such as a ring worn on the finger of the target object, and the second action template may also correspond to other actions such as raising the hand of the target object, which is not specifically limited herein.
- Step S32 If the action information matches the second action template, perform azimuth adjustment on the drone.
- the wearable device worn by the target object is bound to the drone in advance and uploads its action information to the drone through the wireless communication link; the drone determines whether the acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches the preset second action template, for example, by determining whether the motion trajectory of the wearable device, or the motion trajectory of the target object obtained by visual recognition, is the same as the motion trajectory of the preset second action template, or whether the difference between the two is within an allowable range, wherein the allowable range is preset; if the motion trajectories are the same or the difference is within the allowable range, it is determined that the acquired action information matches the preset second action template, and azimuth adjustment can be performed on the drone; at the same time, the shooting angle of the imaging device carried by the drone can be synchronously adjusted so that the target object is always within the shooting range.
- step S32 specifically includes:
- Step S321 Adjust the relative positional relationship between the drone and the wearable device according to the posture information of the wearable device obtained subsequently.
- the posture information is data such as an azimuth or Euler angle of the wearable device obtained by the magnetometer and/or the inertial measurement unit of the wearable device.
- step S321 includes:
- Step S3211 Adjust the relative positional relationship between the drone and the wearable device according to the subsequently obtained posture information of the wearable device, so that the pointing of the limb of the target object wearing the wearable device relative to the line between the drone and the wearable device always remains within a preset angle range.
- the drone 10 synchronously adjusts its flight position according to the subsequently obtained attitude information of the wearable device 20, such as Euler angles; for example, the deflection angle of the wearable device 20 is acquired according to its subsequently obtained Euler angles, and the flight position is adjusted accordingly.
- step S321 specifically includes:
- Step S3212 Record the angle between the pointing of the limb of the target object wearing the wearable device and the line between the drone and the wearable device;
- the drone 10 can obtain the pointing of the limb 301 (for example, the arm) of the target object 30 wearing the wearable device 20 according to the subsequently obtained posture information of the wearable device 20, such as Euler angles, as shown by the AB direction in FIG. 6; the direction of the line between the drone 10 and the wearable device 20 can be calculated from the position information of the drone 10 and the wearable device 20, that is, the AC direction in FIG. 6, so that the angle θ between the pointing of the arm 301 (the AB direction) and the line between the drone 10 and the wearable device 20 (the AC direction) can be obtained.
- Step S3213 Adjust the relative positional relationship between the drone and the wearable device according to the subsequently obtained attitude information of the wearable device, using this angle as a compensation value, so that the adjusted pointing of the limb of the target object wearing the wearable device coincides with the line between the drone and the wearable device.
- the angle θ is used as a compensation value to synchronously adjust the flight position of the drone 10; for example, the deflection angle α of the wearable device 20 is obtained according to the subsequently obtained Euler angles of the wearable device 20, and the deflection angle α plus the compensation value θ is used as the flight angle of the drone 10; the flight position of the drone 10 is then synchronously adjusted on the horizontal plane where the drone 10 is located, so that the adjusted pointing of the limb 301 of the target object 30 wearing the wearable device 20 (such as the DE direction in FIG. 6) coincides with the line between the drone 10 and the wearable device 20 (such as the DF direction in FIG. 6).
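The compensated flight angle of step S3213 can be sketched as the wearable's deflection angle plus the recorded compensation angle, with the drone repositioned on a circle around the wearable; the flat-ground x-east/y-north convention and the function names are assumptions for illustration:

```python
import math

def target_bearing_deg(wearable_yaw_deg, compensation_deg):
    """Flight bearing = the wearable's deflection angle plus the recorded
    compensation angle between arm direction and drone-wearable line."""
    return (wearable_yaw_deg + compensation_deg) % 360.0

def drone_position(wearable_xy, radius_m, bearing_deg):
    """Place the drone on a circle of `radius_m` around the wearable at
    the compensated bearing (bearing 0 = +y/north, 90 = +x/east)."""
    b = math.radians(bearing_deg)
    x, y = wearable_xy
    return (x + radius_m * math.sin(b), y + radius_m * math.cos(b))
```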
- before step S31, the method includes:
- Step S300 Acquire posture information of the wearable device
- Step S301 determining whether the posture information of the wearable device meets the preset orientation adjustment trigger condition
- Step S302 If the orientation adjustment trigger condition is satisfied, step S31 is performed.
- step S301 includes:
- Step S3011 Determine, according to the posture information of the wearable device and the relative positional relationship between the drone and the wearable device, whether the pointing of the limb of the target object wearing the wearable device relative to the line between the drone and the wearable device is within a preset angle range;
- Step S3012 If it is within the preset angle range, the orientation adjustment trigger condition is satisfied.
- the preset angle range is a preset maximum angle deviation range, and the specific value may be set according to actual requirements, and is not specifically limited herein.
- the drone 10 can obtain the pointing of the limb 301 (for example, the arm) of the target object 30 wearing the wearable device 20 according to the obtained posture information of the wearable device 20, such as Euler angles, as shown by the AB direction in FIG. 7; the direction of the line between the drone 10 and the wearable device 20 can be calculated according to their position information, as shown by the AC direction in FIG. 7; it is then judged whether the angle γ between the pointing of the arm 301 (i.e., the AB direction) and the line between the drone 10 and the wearable device 20 (i.e., the AC direction) is within the preset angle range (for example, within ±20 degrees); if so, the orientation adjustment trigger condition is satisfied and step S31 can be continued, so that the drone 10 can synchronously adjust its flight position following the movement of the arm 301; otherwise, the orientation adjustment is not triggered, that is, step S31 is not performed. During the azimuth adjustment process, if the angle between the pointing of the arm 301 (i.e., the AB direction) and the line between the drone 10 and the wearable device 20 (i.e., the AC direction) exceeds the preset angle range, the orientation adjustment is determined to have ended.
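The trigger test above amounts to comparing two direction vectors (the AB arm direction and the AC drone-wearable line) against the ±20 degree bound; a sketch with hypothetical function names:

```python
import math

def angle_between_deg(v1, v2):
    """Angle between two 2-D direction vectors, in degrees."""
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1 = math.hypot(*v1)
    n2 = math.hypot(*v2)
    # Clamp to [-1, 1] to guard against floating-point overshoot.
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def trigger_azimuth_adjust(arm_dir, drone_line_dir, max_deg=20.0):
    """Trigger condition: arm pointing within ±max_deg of the
    drone-to-wearable line (FIG. 7's AB vs AC directions)."""
    return angle_between_deg(arm_dir, drone_line_dir) <= max_deg

print(trigger_azimuth_adjust((1.0, 0.1), (1.0, 0.0)))  # True
print(trigger_azimuth_adjust((0.0, 1.0), (1.0, 0.0)))  # False
```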
- in this embodiment, by performing orientation adjustment on the drone, the relative position of the drone and the target object can be adjusted, and the shooting angle of the imaging device mounted on the drone can be synchronously adjusted so that the target object is always within the shooting range; flight control of the drone can thus be realized with a portable wearable device without carrying a bulky controller, improving the control convenience of the drone.
- the present embodiment may be combined with the first and/or second embodiment of the flight control method of the unmanned aerial vehicle of the present invention, and the steps of the present embodiment may be executed after step S14 or step S21.
- FIG. 8 is a schematic structural view of an embodiment of the drone of the present invention.
- the drone 80 of the present invention comprises:
- the wireless communication circuit 801 is configured to acquire location information and action information of the wearable device worn by the target object;
- the processor 802 is coupled to the wireless communication circuit 801 and is configured to determine whether the action information of the wearable device matches the preset first action template, and, when it matches, adjust at least one or a combination of the flight position of the drone 80, the shooting angle of the imaging device mounted on the drone 80, and the imaging size of the target object in the imaging device mounted on the drone 80 according to the position information of the wearable device, such that the drone 80 is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device.
- the wearable device is a wristband or a watch worn on the arm of the target object, and the first motion template corresponds to a waving motion of the target object.
- the wearable device may be another type of device such as a ring worn on the finger of the target object, and the first action template may also correspond to other actions such as shaking the arm up and down, which is not specifically limited herein.
- the drone 80 further includes:
- the memory 803 is coupled to the processor 802, and is configured to store instructions and data required for the processor 802 to operate, such as a first action template.
- the locator 804 is coupled to the processor 802 for acquiring location information of the drone 80;
- for the specific process by which the processor 802 determines whether the action information of the wearable device matches the preset first action template and adjusts the flight position of the drone 80, the shooting angle of the imaging device carried by the drone 80, and the imaging size of the target object in the imaging device mounted on the drone 80 according to the position information of the wearable device, reference may be made to the corresponding steps of the first embodiment of the flight control method of the unmanned aerial vehicle of the present invention, which are not repeated here.
- the processor 802 is configured to: visually recognize the target object from the captured image of the imaging device; and control the drone 80 to track the target object by combining the subsequently obtained position information of the wearable device and the position information of the target object obtained by visual recognition; the specific process may refer to the corresponding steps of the second embodiment of the flight control method of the unmanned aerial vehicle of the present invention and is not repeated here.
- the processor 802 is further configured to: determine whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches the preset second action template, and, when it matches the second action template, perform azimuth adjustment on the drone 80.
- the second action template corresponds to a wrist flipping action of the target object.
- the second action template may also correspond to other actions such as raising a hand, which is not specifically limited herein.
- the processor 802 is further configured to: before determining whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches the preset second action template, determine whether the posture information of the wearable device satisfies the preset azimuth adjustment trigger condition; and, when the azimuth adjustment trigger condition is satisfied, perform the step of determining whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches the preset second action template.
- for the specific process by which the processor 802 determines whether the posture information of the wearable device satisfies the preset azimuth adjustment trigger condition, determines whether the subsequently acquired action information of the wearable device, or the action information obtained by performing motion recognition on the target object, matches the preset second action template, and performs azimuth adjustment on the drone 80, reference may be made to the corresponding steps of the third embodiment of the flight control method of the unmanned aerial vehicle of the present invention, which are not repeated here.
- the drone may further include other components such as an ultrasonic sensor, a magnetometer, and the like, which is not specifically limited herein.
- in this embodiment, by acquiring the position information and the action information of the wearable device worn by the target object, the drone determines whether the action information of the wearable device matches the preset first action template and, upon a match, adjusts at least one or a combination of the flight position of the drone, the shooting angle of the imaging device mounted on the drone, and the imaging size of the target object in the imaging device mounted on the drone according to the position information of the wearable device, so that the drone is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device; the user can thus control the drone's flight through a portable wearable device without carrying a bulky controller, improving the control convenience of the drone; in addition, by combining the obtained position information of the wearable device and the position information of the target object obtained by visual recognition, the drone is controlled to track the target object, using the position information of the wearable device to compensate for unstable visual recognition and improving the accuracy of target tracking.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- Automation & Control Theory (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Computing Systems (AREA)
- Mathematical Physics (AREA)
- Theoretical Computer Science (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Image Analysis (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
The present invention relates to a flight control method for an unmanned aerial vehicle, and an unmanned aerial vehicle. The method comprises: acquiring position information and action information of a wearable device worn by a target object (S10); determining whether the action information of the wearable device matches a preset first action template (S12); and, upon a match, adjusting, according to the position information of the wearable device, at least one or a combination of the flight position of the unmanned aerial vehicle, the shooting angle of an imaging device mounted on the unmanned aerial vehicle, and the imaging size of the target object in the imaging device mounted on the unmanned aerial vehicle, such that the unmanned aerial vehicle is within a predetermined distance range of the wearable device and the target object is within a predetermined shooting range of the imaging device (S14). A user can thus control the flight of an unmanned aerial vehicle by means of a portable wearable device, without carrying a bulky controller, thereby improving the ease of control of an unmanned aerial vehicle.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780054731.6A CN109690440B (zh) | 2017-03-31 | 2017-03-31 | 一种无人机的飞行控制方法及无人机 |
CN202210149842.8A CN114510079A (zh) | 2017-03-31 | 2017-03-31 | 一种无人机的飞行控制方法及无人机 |
PCT/CN2017/079134 WO2018176426A1 (fr) | 2017-03-31 | 2017-03-31 | Procédé de commande de vol d'un véhicule aérien sans pilote et véhicule aérien sans pilote |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/079134 WO2018176426A1 (fr) | 2017-03-31 | 2017-03-31 | Procédé de commande de vol d'un véhicule aérien sans pilote et véhicule aérien sans pilote |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018176426A1 true WO2018176426A1 (fr) | 2018-10-04 |
Family
ID=63673997
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/079134 WO2018176426A1 (fr) | 2017-03-31 | 2017-03-31 | Procédé de commande de vol d'un véhicule aérien sans pilote et véhicule aérien sans pilote |
Country Status (2)
Country | Link |
---|---|
CN (2) | CN114510079A (fr) |
WO (1) | WO2018176426A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112989982A (zh) * | 2021-03-05 | 2021-06-18 | 佛山科学技术学院 | 一种无人车图像采集控制方法及系统 |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113986098B (zh) * | 2021-09-07 | 2025-03-04 | 深圳大争智能科技有限公司 | 一种人机交互操作的触发控制方法和装置 |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105676860A (zh) * | 2016-03-17 | 2016-06-15 | 歌尔声学股份有限公司 | 一种可穿戴设备、无人机控制装置和控制实现方法 |
CN105955306A (zh) * | 2016-07-20 | 2016-09-21 | 西安中科比奇创新科技有限责任公司 | 可穿戴设备、基于可穿戴设备的无人机控制方法及系统 |
JP6020872B1 (ja) * | 2016-06-24 | 2016-11-02 | 株式会社アドインテ | 分析システム及び分析方法 |
CN106161953A (zh) * | 2016-08-12 | 2016-11-23 | 零度智控(北京)智能科技有限公司 | 一种跟踪拍摄方法和装置 |
CN106370184A (zh) * | 2016-08-29 | 2017-02-01 | 北京奇虎科技有限公司 | 无人机自动跟踪拍摄的方法、无人机和移动终端设备 |
CN106446837A (zh) * | 2016-09-28 | 2017-02-22 | 湖南优象科技有限公司 | 一种基于运动历史图像的挥手检测方法 |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20120194551A1 (en) * | 2010-02-28 | 2012-08-02 | Osterhout Group, Inc. | Ar glasses with user-action based command and control of external devices |
EP3145811A4 (fr) * | 2014-05-23 | 2018-05-23 | LR Acquisition, LLC | Drone pour la photo et/ou la vidéo |
CN107577247B (zh) * | 2014-07-30 | 2021-06-25 | 深圳市大疆创新科技有限公司 | 目标追踪系统及方法 |
CN105120146B (zh) * | 2015-08-05 | 2018-06-26 | 普宙飞行器科技(深圳)有限公司 | 一种利用无人机进行运动物体自动锁定拍摄装置及拍摄方法 |
CN105843246A (zh) * | 2015-11-27 | 2016-08-10 | 深圳市星图智控科技有限公司 | 无人机跟踪方法、系统及无人机 |
CN105759839B (zh) * | 2016-03-01 | 2018-02-16 | 深圳市大疆创新科技有限公司 | 无人机视觉跟踪方法、装置以及无人机 |
CN106020492A (zh) * | 2016-06-07 | 2016-10-12 | 赵武刚 | 通过手的动作与手势产生遥控无人机及附件的信号的方法 |
CN106155090B (zh) * | 2016-08-29 | 2019-04-19 | 电子科技大学 | 基于体感的可穿戴无人机控制设备 |
CN106454069B (zh) * | 2016-08-31 | 2019-11-08 | 歌尔股份有限公司 | 一种控制无人机拍摄的方法、装置和可穿戴设备 |
CN106444843B (zh) * | 2016-12-07 | 2019-02-15 | 北京奇虎科技有限公司 | 无人机相对方位控制方法及装置 |
2017
- 2017-03-31 WO PCT/CN2017/079134 patent/WO2018176426A1/fr active Application Filing
- 2017-03-31 CN CN202210149842.8A patent/CN114510079A/zh active Pending
- 2017-03-31 CN CN201780054731.6A patent/CN109690440B/zh active Active
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN105676860A (zh) * | 2016-03-17 | 2016-06-15 | 歌尔声学股份有限公司 | 一种可穿戴设备、无人机控制装置和控制实现方法 |
JP6020872B1 (ja) * | 2016-06-24 | 2016-11-02 | 株式会社アドインテ | 分析システム及び分析方法 |
CN105955306A (zh) * | 2016-07-20 | 2016-09-21 | 西安中科比奇创新科技有限责任公司 | 可穿戴设备、基于可穿戴设备的无人机控制方法及系统 |
CN106161953A (zh) * | 2016-08-12 | 2016-11-23 | 零度智控(北京)智能科技有限公司 | 一种跟踪拍摄方法和装置 |
CN106370184A (zh) * | 2016-08-29 | 2017-02-01 | 北京奇虎科技有限公司 | 无人机自动跟踪拍摄的方法、无人机和移动终端设备 |
CN106446837A (zh) * | 2016-09-28 | 2017-02-22 | 湖南优象科技有限公司 | 一种基于运动历史图像的挥手检测方法 |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112989982A (zh) * | 2021-03-05 | 2021-06-18 | 佛山科学技术学院 | 一种无人车图像采集控制方法及系统 |
CN112989982B (zh) * | 2021-03-05 | 2024-04-30 | 佛山科学技术学院 | 一种无人车图像采集控制方法及系统 |
Also Published As
Publication number | Publication date |
---|---|
CN114510079A (zh) | 2022-05-17 |
CN109690440B (zh) | 2022-03-08 |
CN109690440A (zh) | 2019-04-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12387491B2 (en) | Systems and methods for adjusting flight control of an unmanned aerial vehicle | |
US11125563B2 (en) | Systems and methods for autonomous machine tracking and localization of mobile objects | |
US11733692B2 (en) | Systems and methods for controlling an unmanned aerial vehicle | |
US20210173396A1 (en) | System and method for providing easy-to-use release and auto-positioning for drone applications | |
CN110494360B (zh) | 用于提供自主摄影及摄像的系统和方法 | |
CN102355574B (zh) | 机载云台运动目标自主跟踪系统的图像稳定方法 | |
US10636150B2 (en) | Subject tracking systems for a movable imaging system | |
KR102700830B1 (ko) | 무인 이동체를 제어하기 위한 방법 및 전자 장치 | |
CN205610783U (zh) | 一种带自动视觉跟踪的自稳定手持拍照摄像云台 | |
WO2017197729A1 (fr) | Système de suivi et procédé de suivi | |
KR102670994B1 (ko) | 무인 비행체 및 그 제어 방법 | |
WO2019233210A1 (fr) | Lunettes intelligentes, procédé et appareil de poursuite de trajectoire de globe oculaire et support d'enregistrement | |
KR101959366B1 (ko) | 무인기와 무선단말기 간의 상호 인식 방법 | |
WO2021127888A1 (fr) | Procédé de commande, lunettes intelligentes, plateforme mobile, cardan, système de commande et support de stockage lisible par ordinateur | |
WO2019126958A1 (fr) | Procédé de commande d'assiette en lacet, véhicule aérien sans pilote et support d'informations lisible par ordinateur | |
EP3273318A1 (fr) | Système autonome de prise de vues animées par un drone avec poursuite de cible et localisation améliorée de la cible | |
US20210112194A1 (en) | Method and device for taking group photo | |
Hausamann et al. | Positional head-eye tracking outside the lab: an open-source solution | |
WO2020019113A1 (fr) | Procédé de commande de robot mobile et système de robot mobile | |
WO2018176426A1 (fr) | Procédé de commande de vol d'un véhicule aérien sans pilote et véhicule aérien sans pilote | |
KR101599149B1 (ko) | 피사체를 자동으로 추적하는 촬영장치 | |
US11582395B1 (en) | Gimbal device | |
KR102334509B1 (ko) | 무인기와 무선단말기 간의 상호 인식 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application. Ref document number: 17903139; Country of ref document: EP; Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase. Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase. Ref document number: 17903139; Country of ref document: EP; Kind code of ref document: A1 |