WO2018191840A1 - Interactive photography system and method for an unmanned aerial vehicle - Google Patents
- Publication number
- WO2018191840A1 (PCT/CN2017/080738)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- drone
- user
- instruction
- control
- camera
- Prior art date
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D3/00—Control of position or direction
- G05D3/12—Control of position or direction using feedback
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
Definitions
- The invention relates to the technical field of drone control, and in particular to a drone interactive shooting system and method that a user controls directly through body actions.
- Existing shooting by unmanned aerial vehicles (UAVs), or drones, can be divided into two categories: commercial aerial photography and personal entertainment self-portraits.
- Currently, a drone is controlled by a remote controller or by an application on a handheld mobile device.
- When using a drone for personal entertainment selfies, the user often has to attend to both the drone and the remote controller at the same time, which is inconvenient to operate.
- When shooting a group photo with the remote controller in hand, the user often cannot watch the application screen on the handheld mobile device, so a clear face cannot be captured; nor can the user perform an action such as a jump at the moment of shooting, because the hand holding the remote controller cannot complete the action, which degrades the shooting effect.
- Moreover, miniaturized self-timer drones tend to have limited power and short battery life, which reduces the fun of shooting and cannot meet current users' needs.
- The object of the present invention is to provide a UAV interactive shooting system and method with which the user can directly control the flight of the drone and the shooting of the camera assembly through actions, thereby realizing the shooting function and improving the shooting effect.
- An embodiment of the present invention provides a UAV interactive photographing system, the system including a drone, a camera assembly, and a control assembly, one end of the camera assembly being rotatably coupled to one side of the drone;
- the control components include:
- a control instruction library configured to store a preset mapping relationship between various user action features and various control instructions, where a control instruction includes a drone control instruction and/or a camera assembly control instruction;
- an image processing module configured to process a captured image of the camera assembly to acquire a user action feature to be executed in the captured image;
- an instruction determining module configured to search the control instruction library for a corresponding control instruction according to the user action feature to be executed; and
- an instruction execution module configured to control the drone and/or the camera assembly according to the obtained control instruction.
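As an illustrative sketch of the three-stage pipeline formed by these modules (image processing, instruction lookup, instruction execution), the following hypothetical Python code maps detected action features to control instructions; all names and the example mappings are assumptions for illustration, not from the patent:

```python
# Hypothetical sketch of the control-assembly pipeline described above.
# ACTION_TO_COMMAND plays the role of the control instruction library.
ACTION_TO_COMMAND = {
    "palm_open":   "shoot_start",          # camera assembly control instruction
    "palm_closed": "shoot_stop",
    "swipe_left":  "drone_translate_left",  # drone control instruction
    "thumbs_up":   "camera_rotate_up",
}

def dispatch(captured_frame, detect_action_feature, execute):
    """Image processing -> instruction lookup -> instruction execution."""
    feature = detect_action_feature(captured_frame)   # image processing module
    command = ACTION_TO_COMMAND.get(feature)          # instruction determining module
    if command is not None:
        execute(command)                              # instruction execution module
    return command
```

An unrecognized feature simply produces no command, which mirrors the patent's behaviour of acting only on preset action features.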
- the camera assembly includes an imaging device and a camera bracket; the imaging device is disposed in the camera bracket, and one end of the camera bracket is rotatably connected to one side of the drone;
- the system also includes a display device detachably or fixedly mounted to the other end of the camera mount.
- the display device comprises an array display screen and a first display control unit; the first display control unit acquires a captured image of the imaging device and displays it through the array display screen.
- the display device includes a dot matrix display screen and a second display control unit; the second display control unit acquires the control instruction obtained by the instruction determining module and controls the dot matrix display screen to display user prompt information associated with the obtained control instruction.
- one end of the camera bracket is formed as a bump, and one side of the drone is provided with a groove corresponding to the shape of the bump; the bump of the camera bracket is embedded in the groove of the drone;
- the lower surface of the drone is a plane that includes an area corresponding to the camera bracket, and the two sides of the groove of the drone are perpendicular to the lower surface of the drone; and
- the bump of the camera bracket is rotatable in the groove of the drone, so that the camera bracket can rotate within the angular range between the position perpendicular to the lower surface of the drone and the position flush with the corresponding area of the camera bracket.
- the lower surface of the drone further includes an area corresponding to the power storage device, and this area does not intersect with the area corresponding to the camera bracket;
- the system further includes a power storage device detachably or fixedly mounted on a lower surface of the drone, and the power storage device is attached to the corresponding area of the power storage device.
- the camera bracket includes a first arm, a second arm, and a third arm; one side of the first arm is connected to the bump, and the other side of the first arm is provided with a first slot; one end of the second arm and one end of the third arm are respectively connected to the two ends of the first arm, and the second arm and the third arm are both perpendicular to the first arm; the other end of the second arm is provided with a second slot, and the other end of the third arm is provided with a third slot;
- One side of the display device is inserted into the first slot, and the other side of the display device is inserted into the second slot and the third slot.
- the system further includes a voice acquiring device configured to acquire voice data of the user;
- the control instruction library is further configured to store a preset mapping relationship between various voice keywords and various control instructions;
- the control component further includes a voice processing module, where the voice processing module is configured to extract a voice keyword included in the voice data of the user;
- the instruction determining module is further configured to search for a corresponding control instruction in the control instruction library according to the extracted voice keyword.
- the voice processing module is further configured to acquire a voiceprint feature of the user in the voice data of the user, and determine whether the voiceprint feature of the user is a pre-stored specified voiceprint feature;
- when the voiceprint feature of the user is the pre-stored specified voiceprint feature, the instruction determining module extracts a voice keyword included in the voice data of the user and searches the control instruction library for the corresponding control instruction according to the extracted voice keyword;
- otherwise, the instruction determining module ignores the voice data of the user and does not perform the voice keyword extraction process.
- the image processing module is further configured to acquire a physiological feature of the user in the captured image of the camera component, and determine whether the physiological feature of the user is a pre-stored designated physiological feature;
- when the physiological feature of the user is the pre-stored designated physiological feature, the instruction determining module searches the control instruction library for the corresponding control instruction according to the user action feature to be executed;
- otherwise, the instruction determining module ignores the user action feature to be executed and does not perform the control instruction search process.
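Both the voiceprint check and the physiological-feature check above follow the same gating pattern: only when the observed feature matches the pre-stored one does the instruction determining module proceed to the keyword extraction or command lookup. A minimal sketch, with biometric matching reduced to simple equality for illustration (a real system would use proper voiceprint/face matching):

```python
STORED_FACE = "user-123-face"  # stands in for a pre-stored designated feature

def gated_lookup(observed_feature, stored_feature, extract_command):
    """Return a control command only if the user's feature matches;
    otherwise ignore the input entirely (no extraction, no search)."""
    if observed_feature != stored_feature:
        return None
    return extract_command()

cmd = gated_lookup("user-123-face", STORED_FACE, lambda: "shoot_start")
```

This prevents bystanders' gestures or voices from triggering the drone, which is the stated purpose of both checks.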
- the drone control instruction includes at least one of a drone translation instruction, a drone rotation instruction, a drone power-on instruction, and a drone power-off instruction;
- the camera assembly control instruction includes at least one of a camera assembly rotation instruction, a shooting parameter adjustment instruction, a shooting start instruction, and a shooting stop instruction.
- control instruction further includes:
- the instruction determining module searches the control instruction library for a corresponding drone control instruction according to the user action feature, and controls the drone according to the obtained drone control instruction;
- the instruction determining module searches the control instruction library for a corresponding camera assembly control instruction according to the user action feature, and controls the camera assembly according to the obtained camera assembly control instruction.
- control instruction further includes:
- a panoramic mode selection instruction instructing the control assembly to enter a panoramic mode, in which the instruction execution module controls the drone to move continuously within an angular range of (0, θ) at a preset speed, where θ is the preset maximum panoramic shooting angle.
- the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
- the camera assembly detects a location of the user
- the drone takes the user's position as a starting point and rotates θ/n to one side in the same horizontal plane, where n is a first preset split value and n > 1;
- the camera assembly starts shooting, and the drone rotates θ to the other side at a preset speed in the same horizontal plane;
- the camera assembly stops shooting.
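The rotation-based panorama above (pre-rotate θ/n toward one side, then sweep θ toward the other side while shooting) can be sketched as a small planning helper; the function name, return format, and the time calculation are illustrative assumptions, not from the patent:

```python
import math

def panorama_rotation_plan(theta, n, speed):
    """Sketch of the rotation-based panorama plan.

    theta: preset maximum panoramic shooting angle (radians)
    n:     first preset split value, n > 1
    speed: preset rotation speed (radians per second, assumed)
    """
    assert n > 1 and speed > 0
    pre_rotate = theta / n        # initial offset from the user's position
    sweep = theta                 # total angle swept while shooting
    shoot_time = sweep / speed    # time the camera keeps shooting
    return {"pre_rotate": pre_rotate, "sweep": sweep, "shoot_time": shoot_time}
```

For example, with θ = π, n = 4 and a speed of 0.5 rad/s, the drone would pre-rotate π/4 and shoot for 2π seconds while sweeping back through π.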
- the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
- the instruction execution module calculates a distance L between the camera component and a user
- the instruction execution module selects a positioning point between the camera assembly and the user, and uses the positioning point as a center to generate a first sector with an angle of θ and a radius of L/m, the object to be photographed being located on the arc of the first sector, where m is a second preset split value and m > 1;
- the instruction execution module generates a second sector opposite to the first sector; the two sides of the second sector are respectively the opposite extension lines of the two sides of the first sector, and the second sector has a radius of (m-1)L/m and an angle of θ;
- the camera assembly starts shooting, and the drone moves from one end of the arc of the second sector along the trajectory of the arc to the other end of the arc;
- the camera assembly stops shooting.
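The arc-trajectory variant above has a simple geometry: the user stands on the arc of a first sector of radius L/m centred on the positioning point, and the drone flies the opposite arc of radius (m-1)L/m through the same angle θ. The helper below, with assumed names and an assumed waypoint-sampling scheme, illustrates the drone's flight path:

```python
import math

def arc_waypoints(anchor, L, m, theta, k=8):
    """Sketch of the drone's arc trajectory in the sector-based panorama.

    anchor: (x, y) positioning point between camera assembly and user
    L:      distance between camera assembly and user
    m:      second preset split value, m > 1
    theta:  preset maximum panoramic shooting angle (radians)
    k:      number of segments to sample along the arc (illustrative)
    Returns k + 1 (x, y) waypoints on the second sector's arc.
    """
    assert m > 1
    ax, ay = anchor
    r = (m - 1) * L / m                      # radius of the drone's arc
    # The second sector opens opposite the first: centre the sweep at angle pi,
    # assuming the user's sector opens toward angle 0.
    start = math.pi - theta / 2
    return [
        (ax + r * math.cos(start + i * theta / k),
         ay + r * math.sin(start + i * theta / k))
        for i in range(k + 1)
    ]
```

Every waypoint lies at distance (m-1)L/m from the positioning point, so the user stays at a constant distance L from the camera throughout the sweep.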
- the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
- the instruction execution module calculates a distance L between the camera component and a user
- the instruction execution module selects a positioning point between the camera assembly and the user, and uses the positioning point as a vertex to generate a first isosceles triangle with an apex angle of θ and waists of length L/m, the object to be photographed being located on the base of the first isosceles triangle, where m is a second preset split value and m > 1;
- the instruction execution module generates a second isosceles triangle opposite to the first isosceles triangle; the two waists of the second isosceles triangle are respectively the opposite extension lines of the two waists of the first isosceles triangle, and the second isosceles triangle has waists of length (m-1)L/m and an apex angle of θ;
- the camera assembly starts shooting, and the drone moves from one end of the base of the second isosceles triangle along the base to the other end;
- the camera assembly stops shooting.
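The straight-line variant above replaces the arc with the base of the second isosceles triangle. The helper below computes the two endpoints of that base from the geometry stated in the text (apex at the positioning point, waists (m-1)L/m, apex angle θ); names and the orientation convention are illustrative assumptions:

```python
import math

def line_endpoints(anchor, L, m, theta):
    """Sketch of the drone's linear trajectory in the triangle-based panorama.

    anchor: (x, y) positioning point (apex of both triangles)
    L:      distance between camera assembly and user
    m:      second preset split value, m > 1
    theta:  apex angle, i.e. preset maximum panoramic shooting angle (radians)
    Returns the two endpoints of the second triangle's base.
    """
    assert m > 1
    ax, ay = anchor
    w = (m - 1) * L / m           # waist length of the second triangle
    # The second triangle opens opposite the first, assumed to open toward
    # angle 0, so its waists point toward angles pi +/- theta/2.
    a1 = math.pi - theta / 2
    a2 = math.pi + theta / 2
    return ((ax + w * math.cos(a1), ay + w * math.sin(a1)),
            (ax + w * math.cos(a2), ay + w * math.sin(a2)))
```

The base length works out to 2·(m-1)L/m·sin(θ/2), which is the distance the drone travels while shooting.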
- control instruction further includes:
- a third mode selection instruction instructing the control component to enter a third mode, in the third mode, the instruction execution module controls the camera component to perform shooting after a preset waiting time.
- control instruction further includes:
- a fourth mode selection instruction instructing the control component to enter a fourth mode
- the instruction execution module detects the position of the user through the camera assembly, and controls the drone and the camera assembly to move automatically according to the position of the user, so that the camera assembly continues to capture the user.
- the instruction execution module acquires a position change acceleration of the user, and when the position change acceleration of the user exceeds a preset acceleration threshold, the instruction execution module sends an alarm signal to the outside.
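The tracking-mode alarm above can be sketched as a finite-difference check: estimate the user's position-change acceleration from successive position samples and compare it with the preset threshold. This one-dimensional helper is purely illustrative; the patent does not specify how the acceleration is estimated:

```python
def position_alarm(positions, dt, threshold):
    """Estimate position-change acceleration from three successive samples
    taken dt seconds apart; report whether it exceeds the preset threshold."""
    x0, x1, x2 = positions
    v1 = (x1 - x0) / dt          # velocity over the first interval
    v2 = (x2 - x1) / dt          # velocity over the second interval
    accel = abs(v2 - v1) / dt    # magnitude of the change in velocity
    return accel > threshold, accel
```

A sudden change (e.g. the user falling or sprinting) produces a large acceleration estimate and would trigger the outward alarm signal.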
- the UAV is further provided with at least one distance sensor
- the control component further includes an obstacle calculation module
- the obstacle calculation module is configured to acquire obstacle detection data of the distance sensor
- when the control instruction to be executed includes a drone movement instruction and the obstacle calculation module determines that the distance between an obstacle in the direction of movement and the drone is less than a preset safety threshold, the drone movement instruction is cancelled and a limit reminder signal is issued to the outside.
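The obstacle check above reduces to a simple gate on each movement instruction; the function and signal names below are illustrative assumptions:

```python
def safe_move(move_command, obstacle_distance, safety_threshold):
    """Sketch of the obstacle calculation module's gate: cancel a drone
    movement command, and raise a limit reminder, when an obstacle in the
    direction of motion is closer than the preset safety threshold."""
    if obstacle_distance < safety_threshold:
        return None, "limit_reminder"   # cancel the command, alert the user
    return move_command, None           # pass the command through unchanged
```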
- The invention also provides a drone interactive photographing method using the above UAV interactive photographing system, the method comprising the following steps:
- the camera assembly acquires a captured image
- the image processing module processes the captured image of the camera component to acquire a user action feature to be executed in the captured image
- the instruction determining module searches for a corresponding control instruction in the control instruction library according to the user action feature to be executed;
- the instruction execution module controls the drone and/or the camera assembly according to the obtained control command.
- the control instruction further includes a panoramic mode selection instruction instructing the control assembly to enter a panoramic mode, in which the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting by the following steps:
- the camera assembly detects a location of the user
- the drone takes the user's position as a starting point and rotates θ/n to one side in the same horizontal plane, where n is a first preset split value, n > 1, and θ is the preset maximum panoramic shooting angle;
- the camera assembly starts shooting, and the drone rotates θ to the other side at a preset speed in the same horizontal plane;
- the camera assembly stops shooting.
- the control instruction further includes a panoramic mode selection instruction instructing the control assembly to enter a panoramic mode, in which the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
- the instruction execution module calculates a distance L between the camera component and a user
- the instruction execution module selects a positioning point between the camera assembly and the user, and uses the positioning point as a center to generate a first sector with an angle of θ and a radius of L/m, the object to be photographed being located on the arc of the first sector, where m is the second preset split value, m > 1, and θ is the preset maximum panoramic shooting angle;
- the instruction execution module generates a second sector opposite to the first sector; the two sides of the second sector are respectively the opposite extension lines of the two sides of the first sector, and the second sector has a radius of (m-1)L/m and an angle of θ;
- the camera assembly starts shooting, and the drone moves from one end of the arc of the second sector along the trajectory of the arc to the other end of the arc;
- the camera assembly stops shooting.
- the control instruction further includes a panoramic mode selection instruction instructing the control assembly to enter a panoramic mode, in which the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
- the instruction execution module calculates a distance L between the camera component and a user
- the instruction execution module selects a positioning point between the camera assembly and the user, and uses the positioning point as a vertex to generate a first isosceles triangle with an apex angle of θ and waists of length L/m, the object to be photographed being located on the base of the first isosceles triangle, where m is a second preset split value, m > 1, and θ is the preset maximum panoramic shooting angle;
- the instruction execution module generates a second isosceles triangle opposite to the first isosceles triangle; the two waists of the second isosceles triangle are respectively the opposite extension lines of the two waists of the first isosceles triangle, and the second isosceles triangle has waists of length (m-1)L/m and an apex angle of θ;
- the camera assembly starts shooting, and the drone moves from one end of the base of the second isosceles triangle along the base to the other end;
- the camera assembly stops shooting.
- The invention provides a technical solution in which the user exerts control directly through actions: the camera assembly automatically acquires the captured image, the control assembly automatically analyzes the user action feature to be executed, and the control instruction required by the user is interpreted according to that action feature.
- Thus the user can directly control the flight of the drone and the shooting of the camera assembly, realizing the shooting function, easily meeting shooting needs on any occasion, and improving the user experience.
- FIG. 1 is a block diagram showing the structure of an unmanned aerial vehicle interactive photographing system according to an embodiment of the present invention
- FIG. 2 is a schematic structural diagram of a UAV interactive shooting system using an array display screen according to an embodiment of the present invention
- FIG. 3 is a schematic structural diagram of a UAV interactive photographing system using a dot matrix display screen according to an embodiment of the present invention
- FIG. 4 is a schematic diagram of adjusting the position of a drone according to an embodiment of the present invention.
- FIG. 5 is a schematic diagram of adjusting an angle of a camera assembly according to an embodiment of the invention.
- 6-7 are schematic diagrams of gesture control according to an embodiment of the present invention.
- FIG. 8 is a schematic structural diagram of an external display device according to an embodiment of the present invention.
- FIG. 9 is a schematic structural diagram of a display device when it is stowed according to an embodiment of the present invention.
- FIG. 10 is a schematic bottom view of the unmanned aerial vehicle according to an embodiment of the present invention when not in use;
- FIG. 11 is a schematic structural diagram of an electrical storage device according to an embodiment of the present invention.
- FIG. 12 is a schematic diagram showing a state of a drone when charging according to an embodiment of the present invention.
- FIG. 13 is a flow chart showing a charging process of a drone according to an embodiment of the present invention.
- FIG. 14 is a schematic diagram of controlling the position of a drone by voice according to an embodiment of the present invention.
- 15 is a schematic structural diagram of a UAV interactive shooting system with added voice control according to an embodiment of the present invention.
- 16 is a flow chart of user voiceprint verification according to an embodiment of the present invention.
- 17 is a flow chart of user physiological feature verification according to an embodiment of the present invention.
- 18 to 20 are flowcharts of a method for interactively capturing a drone according to an embodiment of the present invention.
- 21 is a flow chart of panoramic shooting according to an embodiment of the present invention.
- Figure 22 is a schematic view showing the rotation of the drone during panoramic shooting according to an embodiment of the present invention.
- FIG. 23 is a schematic diagram of a drone moving along a circular arc path during panoramic shooting according to an embodiment of the present invention.
- 24 is a schematic diagram of a drone moving along a linear trajectory during panoramic shooting according to an embodiment of the present invention.
- 25 is a flow chart of automatically tracking a user's position by a drone according to an embodiment of the present invention.
- Figure 26 is a flow chart showing the automatic obstacle avoidance of the drone according to an embodiment of the present invention.
- an embodiment of the present invention provides a UAV interactive photographing system, which includes a drone 200, a camera assembly 300, and a control assembly 100.
- One end of the camera assembly 300 is rotatably coupled to the One side of the drone 200;
- the control component 100 includes: a control instruction library 110 for storing a mapping relationship between preset various user action features and various control commands, the control command including a drone a control instruction and/or a camera component control instruction; an image processing module 120, configured to process the captured image of the camera component 300 to acquire a user action feature to be executed in the captured image; and an instruction determining module 130, configured to: And searching for the corresponding control instruction in the control instruction library according to the user action feature to be executed; and the instruction execution module 140, configured to control the drone 200 and/or the Camera assembly 300.
- The user action feature here is preferably a gesture of the user; that is, different gestures can be used to obtain different control instructions.
- Other user action features are also available, such as the user's eyes, nodding, shaking the head, or laughing; for example, laughter can be set as the trigger for capturing a picture, so that automatic capture of the user's smile can be achieved.
- The following embodiments describe control based on user gestures; however, it will be appreciated that the use of other user action features is also within the scope of the present invention.
- FIG. 2 is a schematic structural diagram of an unmanned aerial vehicle interactive photographing system according to an embodiment of the present invention.
- The drone 200 has a camera assembly 300 rotatably mounted on one side; the camera assembly 300 includes an imaging device 320 and a camera bracket 310, the imaging device 320 is disposed in the camera bracket 310, and one end of the camera bracket 310 is rotatably connected to one side of the drone 200.
- the system can also include a display device 330 that is detachably or fixedly mounted to the other end of the camera mount 310.
- The control assembly 100 may be disposed inside the drone 200 or on the surface of the drone 200; all other placements are also within the scope of the invention.
- the instruction execution module 140 can directly communicate with the controller of the drone 200, or can perform wireless communication with the camera assembly 300, thereby implementing delivery and feedback of control commands.
- The display device 330 can display content for the user to view as needed; two arrangements of the display device 330 are given in FIG. 2 and FIG. 3.
- the display device 330 shown in FIG. 2 includes an array display screen and a first display control unit; the first display control unit acquires a captured image of the imaging device 320 and displays it through the array display screen.
- the array display can include, but is not limited to, a color LCD screen, and the user can view the self-timer picture in real time through the display.
- The display device 330 shown in FIG. 3 includes a dot matrix display screen and a second display control unit; the second display control unit acquires the control instruction obtained by the instruction determining module 130 and controls the dot matrix display screen to display user prompt information associated with the obtained control instruction.
- the dot matrix display screen may include, but is not limited to, a dot matrix LED screen, and the user can perform self-photographing preparation and shooting through the LED number arrangement form.
- The user prompt information may be a self-timer countdown: for example, when a five-second countdown to shooting starts, the dot matrix display sequentially shows 5, 4, 3, 2, and 1, and the user can prepare for the self-portrait according to the countdown. The user prompt information can also indicate the current shooting mode; for example, when 2 is displayed, the system is currently in the second mode, and so on.
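The prompt behaviour described above (a mode indication followed by a 5-4-3-2-1 countdown) might be sketched as follows; `prompt_sequence` is a hypothetical name, and the real second display control unit drives a dot-matrix LED screen rather than returning strings:

```python
def prompt_sequence(mode, countdown_seconds):
    """Sketch of the user prompt information shown on the dot matrix screen:
    first the current shooting mode number, then the self-timer countdown."""
    return [f"mode:{mode}"] + [str(s) for s in range(countdown_seconds, 0, -1)]
```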
- The camera assembly automatically acquires the captured image, the control assembly automatically analyzes the user action feature to be executed and interprets the control instruction required by the user according to that feature, and thereby the user can accomplish control of the drone 200 and/or the camera assembly 300.
- The drone control command may include at least one of a drone panning command, a drone rotation command, a drone power-on command, and a drone power-off command.
- the camera component control command may include at least one of a camera component rotation command, a shooting parameter adjustment command, a shooting start command, and a shooting stop command.
- the shooting parameters that can be adjusted here can include focus, fill light, image size, and so on.
- FIG. 4 is a schematic diagram of adjusting the position of the drone 200 according to an embodiment of the present invention. To adjust the position of the drone, the following steps can be used:
- The user 400 observes the self-portrait angle on the display device 330 and finds that the portrait is too far to the left in the display device 330 (the portrait shown by the broken line in FIG. 4); the user 400 then, through a gesture (from the dotted-line state of the user 400 in FIG. 4 to the solid-line state), controls the drone to move to the left until the portrait is in the center of the screen (the portrait shown in solid lines in FIG. 4);
- After the shooting conditions are met, the user 400 performs shooting by gesture control.
- FIG. 5 is a schematic diagram of adjusting the angle of the camera assembly 300 according to an embodiment of the invention. To adjust the camera assembly 300, the following steps can be taken:
- The user 400 observes the self-portrait angle on the display device 330, finds that the drone 200 is too high and the portrait is in a low position (the portrait shown by the dotted line in FIG. 5), and through a gesture (from the dotted-line hand state of the user 400 in FIG. 5 to the solid-line state) controls the camera assembly 300 to flip down, thereby driving the imaging device 320 to flip down until the portrait is in the center of the screen (the portrait shown by the solid line in FIG. 5);
- After the shooting conditions are met, the user 400 performs shooting by gesture control.
- the manner of controlling the drone 200 and the camera assembly 300 can also be flexibly selected.
- The adjustment can also be performed by reducing the height of the drone 200 until the portrait is in the middle of the screen.
- The adjustment manners of the drone 200 and the camera assembly 300 can be distinguished by different preset gesture commands; that is, from a given gesture it can be determined whether the object of control is the drone 200 or the camera assembly 300, and which action of the drone 200 or the camera assembly 300 the gesture controls.
- Examples of gesture control are given in FIG. 6 and FIG. 7.
- the user can also customize the mapping relationship between different gestures and different control commands, and modify it to a gesture that conforms to its usage habits.
- Other action features can also be added. For example, the user nods to confirm the shooting, the user shakes the head to delete the previous captured image, and so on.
- In the embodiment shown in FIG. 8, the camera assembly 300 uses an external display device 340.
- the external display device 340 can further be a user's mobile terminal.
- The external display device 340 and the control assembly 100 can communicate wirelessly or via a USB or other data line.
- One end of the camera bracket 310 is formed as a bump 311, and one side of the drone 200 is provided with a groove 210 corresponding to the shape of the bump; the bump 311 of the camera bracket is embedded in the groove 210 of the drone.
- the camera bracket 310 includes a first arm 312 , a second arm 313 , and a third arm 314 .
- One side of the first arm 312 is connected to the bump 311 .
- a first slot is disposed on the other side of the first arm 312, and one end of the second arm 313 and one end of the third arm 314 are respectively connected to two ends of the first arm 312.
- the second arm 313 and the third arm 314 are both perpendicular to the first arm 312, and the other end of the second arm 313 is provided with a second slot, the third arm 314 The other end is provided with a third slot.
- The external display device 340 can be placed in the camera bracket 310; the upper end of the external display device 340 is inserted into the first slot, and the lower end of the external display device 340 is inserted into the second slot and the third slot, thereby forming a stable and convenient connection between the external display device 340 and the camera bracket 310.
- In this embodiment, the display device 330 is a built-in display device.
- the camera holder 310 is rotated by the cooperation of the bump 311 and the groove 210, and the display device 330 is also rotated together with the camera holder 310.
- The lower surface of the drone 200 is a plane that includes a camera bracket corresponding area 220, and the two sides of the groove 210 of the drone 200 are perpendicular to the lower surface of the drone.
- the camera assembly 300 can be adjusted within a desired range of angles to achieve better shooting results.
- The camera bracket 310 can be folded into the camera bracket corresponding area 220 for convenient folding and carrying.
- the embodiment of the present invention further provides a convenient charging mode.
- the lower surface of the unmanned aerial vehicle 200 further includes a power storage device corresponding area 230, and the corresponding area of the power storage device does not intersect with the camera support corresponding area 220; the system further includes a power storage device 500, The power storage device 500 is detachably or fixedly mounted on a lower surface of the drone 200, and the power storage device 500 is attached to the corresponding region of the power storage device.
- when folding the system, the connection between the external display device and the drone is first disconnected; the external display device can then be removed, or left on the camera holder 310 and folded together with it. If the power storage device 500 is inserted at this time, charging starts; otherwise, the system is directly powered off.
- the power storage device 500 is installed in the power storage device corresponding area 230, and the charging device connects the power storage device 500 to the rechargeable battery of the drone to perform the charging operation.
- the embodiment of the present invention may further include a voice acquiring device 600 configured to acquire voice data of a user. The control instruction library 110 is further configured to store a preset mapping relationship between voice keywords and the various control instructions; the control component 100 further includes a voice processing module 150 configured to extract voice keywords contained in the voice data of the user; and the instruction determining module 130 is further configured to search the control instruction library for the corresponding control instruction according to the extracted voice keyword.
- this embodiment can also implement shooting control by voice. For example, if the keyword "power on" is set to turn on the camera component 300, the camera component 300 is automatically turned on when the words "power on" are detected in the user's voice data; likewise, detecting "drone" and "move to the left" automatically controls the drone to move to the left. Voice control is convenient, is not constrained by other conditions, and can be applied on any occasion without affecting the shooting effect.
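The keyword lookup described above can be sketched as follows. This is a minimal illustration, and the keyword strings, command names, and function name are assumptions for the sketch rather than anything specified in the patent.

```python
# Hypothetical sketch of the voice-keyword lookup: the control instruction
# library stores a preset mapping from keywords to control instructions,
# and recognized speech is scanned for those keywords.
# All names and keyword strings here are illustrative assumptions.

CONTROL_INSTRUCTION_LIBRARY = {
    "power on": "CAMERA_POWER_ON",       # e.g. turns on the camera component
    "move to the left": "DRONE_MOVE_LEFT",
    "move to the right": "DRONE_MOVE_RIGHT",
}

def find_instruction(voice_text):
    """Return the control instruction for the first stored keyword found
    in the recognized speech, or None if no keyword is present."""
    text = voice_text.lower()
    for keyword, instruction in CONTROL_INSTRUCTION_LIBRARY.items():
        if keyword in text:
            return instruction
    return None  # no keyword: the utterance is ignored
```

An utterance with no stored keyword resolves to nothing, matching the screening behavior described for non-command speech.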
- the control component 100 may receive noise from other people's voices or from the environment, so different sounds must be distinguished. That is, the voice processing module is further configured to acquire a voiceprint feature from the user's voice data and determine whether it matches a pre-stored specified voiceprint feature.
- if it matches, the instruction determining module extracts the voice keywords contained in the voice data and searches the control instruction library for the corresponding control instruction; if it does not match, the voice data is not from the specified user and must be screened out, that is, the instruction determining module ignores it and does not extract voice keywords.
- similarly, the camera component 300 may capture the motions of people other than the designated user. The image processing module is therefore further configured to acquire a physiological characteristic of the user from the captured image and determine whether it matches a pre-stored designated physiological feature.
- if it matches, the instruction determining module searches the control instruction library for the control instruction corresponding to the user action feature to be executed; if it does not match, the instruction determining module ignores the user action feature to be executed and does not search for a control instruction.
- the physiological characteristic of the user may be the user's facial features, hair color, hair length, skin color, lip color, and the like, or a combination of several physiological features for more accurate identification; all such variations are within the scope of protection of the present invention.
- an embodiment of the present invention further provides a drone interactive photographing method using the above drone interactive photographing system, the method including the following steps:
- the image processing module processes the captured image of the camera component to acquire the user action feature to be executed in the captured image;
- the instruction determining module searches the control instruction library for the control instruction corresponding to the user action feature to be executed;
- the instruction execution module controls the drone and/or the camera component according to the obtained control instruction.
- the determination process may follow the flow shown in FIG. 19, performing the determinations and control in sequence, but is not limited to this manner. Other orders, such as first determining whether the instruction is a camera component control instruction and then whether it is a drone control instruction, also fall within the scope of protection of the present invention.
- a specific embodiment of the drone interactive photographing method is shown. First, the type of display device is determined; if it is an external display device, the control component must first be connected to it through wireless communication to prepare for subsequent control. Then, according to the correspondence between gestures and control instructions, the corresponding control instruction is found and executed.
- the action features of the present invention are not limited to gestures; different actions of other body parts can also achieve the object of the present invention.
- the control instructions may further include a first mode selection instruction and a second mode selection instruction, instructing the control component to enter a first mode and a second mode, respectively.
- after entering the first mode, a received user action feature is by default interpreted as a drone control instruction: the instruction determining module searches the control instruction library for the corresponding drone control instruction according to the user action feature and controls the drone accordingly, without executing camera component control instructions. After entering the second mode, a received user action feature is by default interpreted as a camera component control instruction: the instruction determining module searches the control instruction library for the corresponding camera component control instruction and controls the camera component accordingly, without executing drone control instructions.
- for example, suppose the same action is defined in both modes: the palm is spread out and moved downward.
- in the first mode, this means the drone is controlled to move downward;
- in the second mode, this means the camera assembly is turned downward. Only one specific embodiment is given here; the scope of protection of the present invention is not limited thereto.
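The mode-dependent interpretation of a single action can be sketched as below; only the first-mode/second-mode behavior comes from the description above, while the action name, command names, and lookup tables are hypothetical.

```python
# Hypothetical sketch: in the first mode an action feature resolves only to
# drone control instructions; in the second mode, only to camera component
# control instructions. The table contents are illustrative assumptions.

DRONE_INSTRUCTIONS = {"palm_spread_down": "DRONE_MOVE_DOWN"}
CAMERA_INSTRUCTIONS = {"palm_spread_down": "CAMERA_TURN_DOWN"}

def resolve_action(action_feature, mode):
    """Map an action feature to a control instruction according to the
    currently selected mode (1 = drone control, 2 = camera control)."""
    if mode == 1:
        return DRONE_INSTRUCTIONS.get(action_feature)
    if mode == 2:
        return CAMERA_INSTRUCTIONS.get(action_feature)
    return None  # unknown mode: no instruction is executed
```

The same gesture thus yields a different instruction in each mode, as in the palm-spread example above.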
- due to its smoothness and controllability during flight, the drone has some irreplaceable advantages over a handheld camera. For example, the drone can take photos with less jitter, which lowers the anti-shake requirements on the camera device. When a user takes a panoramic photo with a camera in hand, the ideal panoramic photo is often lost due to jitter or other factors; this problem can be overcome by a drone.
- the control instructions further include a panoramic mode selection instruction, instructing the control component to enter a panoramic mode; in the panoramic mode, the instruction execution module controls the drone to rotate continuously at a preset speed within the (0, θ) angle range, where θ is the preset maximum panoramic angle.
- the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
- the camera assembly detects a location of the user 400
- the drone 200 takes the position of the user 400 as a starting point and rotates θ/n to one side within the same horizontal plane. This stage is the positioning stage of the drone, during which no shooting is performed; n is a first preset dividing value;
- the camera assembly starts shooting, and the drone 200 rotates θ to the other side at a preset speed within the same horizontal plane, thereby obtaining a panoramic photo with angle θ in which the user is located at the designated position;
- the camera assembly stops shooting.
- the user can be placed in the center of the panoramic photo.
- the angle θ can be set as needed, and the position of the user in the panoramic photo can also be adjusted; for example, to place the user toward the left, the drone can be rotated to one side by θ/4, and so on. This shooting mode is very flexible, and both the success rate and the quality of the panoramic photos are improved.
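The two rotation angles of this first panorama procedure can be computed as below; a sketch under the stated geometry (a positioning rotation of θ/n, then a shooting sweep of θ back across the user), with the function name an assumption for illustration.

```python
def panorama_rotations(theta, n):
    """Return (positioning_rotation, shooting_sweep) in degrees for the
    in-place panorama: the drone first rotates theta/n to one side without
    shooting, then sweeps theta to the other side while shooting, so the
    user ends up theta/n from one edge of the panorama."""
    if n <= 1:
        raise ValueError("n must exceed 1 so the user stays inside the sweep")
    return theta / n, theta
```

With n = 2 the user sits in the middle of the panorama; with n = 4 the user sits a quarter of the way in from one edge, corresponding to the left-position example above.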
- the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
- the instruction execution module calculates a distance L between the camera assembly and the user 400, that is, a distance indicated by a broken line connecting the user 400 and the drone 200 in the figure;
- the instruction execution module selects a positioning point between the camera assembly and the user 400 and, using the positioning point as the center, generates a first sector 701 with radius L/m and angle θ, such that the object to be photographed is located on the arc of the first sector 701, where m is a second preset segmentation value with m > 1, and θ is the preset maximum panoramic angle;
- the instruction execution module generates a second sector 702 opposite to the first sector 701: the two sides of the second sector 702 are respectively the opposite extension lines of the two sides of the first sector 701, the radius of the second sector 702 is (m-1)L/m, and its angle is θ;
- the camera assembly starts shooting, and the drone 200 moves from one end of the arc of the second sector 702 along the trajectory of the arc to the other end of the arc;
- the camera assembly stops shooting.
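The arc trajectory of the second sector can be discretized into waypoints as follows; a sketch under the geometry above (an arc of radius (m-1)L/m centered on the positioning point, spanning θ on the side opposite the user), with the function and parameter names assumed for illustration.

```python
import math

def arc_waypoints(px, py, distance_l, m, theta_deg, away_deg, steps=8):
    """Waypoints along the second sector's arc: radius (m-1)*L/m centered
    on the positioning point (px, py); away_deg is the direction from the
    positioning point away from the user, and the arc spans theta_deg
    symmetrically about that direction."""
    radius = (m - 1) * distance_l / m
    start = away_deg - theta_deg / 2.0
    points = []
    for i in range(steps + 1):
        a = math.radians(start + i * theta_deg / steps)
        points.append((px + radius * math.cos(a), py + radius * math.sin(a)))
    return points
```

The drone would follow these points from one end of the arc to the other while the camera assembly shoots.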
- the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
- the instruction execution module calculates a distance L between the camera assembly and the user 400, that is, a distance indicated by a broken line connecting the user 400 and the drone 200 in the figure;
- the instruction execution module selects a positioning point between the camera assembly and the user 400 and, using the positioning point as the apex, generates a first isosceles triangle 703 with apex angle θ and waist length L/m, such that the object to be photographed is located on the bottom edge of the first isosceles triangle 703, where m is a second preset segmentation value with m > 1, and θ is the preset maximum panoramic angle;
- the instruction execution module generates a second isosceles triangle 704 opposite to the first isosceles triangle 703: the two waists of the second isosceles triangle 704 are respectively the opposite extension lines of the two waists of the first isosceles triangle 703, the waist length of the second isosceles triangle 704 is (m-1)L/m, and its apex angle is θ;
- the camera assembly starts shooting, and the drone 200 moves from one end of the bottom edge of the second isosceles triangle 704 along the trajectory of the bottom edge to the other end of the bottom edge;
- the camera assembly stops shooting.
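The straight-line trajectory of this third method runs between the two bottom-edge endpoints of the second triangle, which can be computed as below; the function and parameter names are assumptions, with away_deg again denoting the direction from the positioning point away from the user.

```python
import math

def bottom_edge_endpoints(px, py, distance_l, m, theta_deg, away_deg):
    """Endpoints of the second isosceles triangle's bottom edge: two waists
    of length (m-1)*L/m leave the positioning point (px, py) at angles of
    +/- theta/2 around away_deg; the drone flies the straight line between
    the two endpoints while shooting."""
    waist = (m - 1) * distance_l / m
    ends = []
    for sign in (-1.0, 1.0):
        a = math.radians(away_deg + sign * theta_deg / 2.0)
        ends.append((px + waist * math.cos(a), py + waist * math.sin(a)))
    return ends
```

Unlike the arc trajectory, the camera-to-subject distance varies along this straight edge, which is the geometric difference between FIG. 23 and FIG. 24.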
- the shooting trajectories in FIG. 23 and FIG. 24 can be selected as needed, forming a panoramic photo by continuous shooting or by synthesizing multiple photos into one panoramic photo; different choices of m and θ yield different shooting ranges, offering more flexibility.
- the drone can move according to the calculated preset trajectory, so that the camera component acquires different shooting positions and shooting angles.
- a shooting countdown may also be set; that is, the control instructions may further include a third mode selection instruction instructing the control component to enter a third mode, in which the instruction execution module controls the camera component to perform shooting after a preset waiting time.
- the countdown time can be displayed by the display device, or the remaining preparation time can be indicated by other display lights or prompts.
- the drone of the present invention can also realize a user automatic tracking shooting function.
- the control instructions may further include a fourth mode selection instruction instructing the control component to enter a fourth mode, in which the instruction execution module detects the position of the user through the camera component and controls the drone and the camera assembly to move automatically according to the user's position, so that the camera assembly continuously captures the user. This enables automatic tracking shooting and ensures that the user is always within the shooting range.
- further, the instruction execution module may acquire the position-change acceleration of the user; when this acceleration exceeds a preset acceleration threshold, the instruction execution module sends an alarm signal to the outside.
- likewise, when the camera component cannot capture the user's position, the instruction execution module sends an alarm signal to the outside.
- on the one hand, the alarm can remind the user to move back into the shooting range of the camera component; on the other hand, fall detection can also be realized.
- when the user falls, the position-change acceleration exceeds the threshold and the alarm signal is sent automatically; if the user does not cancel the alarm signal within a certain period of time, the mobile terminals of other users associated with the user, or an emergency telephone number, can be further notified. While providing high-quality shooting, this also ensures the user's safety during use.
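The position-change acceleration can be estimated from three consecutive position samples by a second difference, as sketched below; the estimator and all names are assumptions, since the description only specifies comparing the acceleration against a preset threshold.

```python
import math

def position_acceleration(p0, p1, p2, dt):
    """Crude acceleration estimate from three consecutive 2-D position
    samples taken dt seconds apart: the second difference (p2 - 2*p1 + p0)/dt^2."""
    return tuple((c - 2.0 * b + a) / (dt * dt) for a, b, c in zip(p0, p1, p2))

def should_alarm(p0, p1, p2, dt, threshold):
    """Send the alarm when the magnitude of the user's position-change
    acceleration exceeds the preset acceleration threshold."""
    ax, ay = position_acceleration(p0, p1, p2, dt)
    return math.hypot(ax, ay) > threshold
```

Steady motion yields near-zero acceleration and no alarm, while an abrupt stop such as a fall produces a large second difference that trips the threshold.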
- At least one distance sensor may be disposed on the drone; the control component further includes an obstacle calculation module configured to acquire obstacle detection data from the distance sensor;
- when the control instruction to be executed includes a drone movement instruction, if the obstacle calculation module determines that the distance between the drone and an obstacle in the moving direction indicated by the drone movement instruction is less than a preset safety threshold, it cancels the drone movement instruction and issues a limit reminder signal to the outside. That is, after an obstacle is detected by the distance sensor, the obstacle calculation module predicts, from the direction indicated by the control instruction, whether executing the drone movement instruction would cause the drone to hit the obstacle; if so, the drone movement instruction is not executed, and the user is reminded that the distance is already below the limit and there is a danger of hitting the obstacle.
- this embodiment is particularly suitable for indoor shooting. Given the constraints of walls and ceilings and the many obstacles such as furniture and furnishings indoors, this method ensures the safety of the drone through reliable calculation and danger prediction. It can equally be applied to outdoor shooting: in open space the drone may move faster, and the user cannot anticipate the danger in time, so this method ensures the stability and reliability of the drone interactive shooting process.
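The safety check before executing a drone movement instruction can be sketched as follows; the direction keys and function name are illustrative assumptions — the description specifies only the comparison against the preset safety threshold and the cancellation of the instruction.

```python
def safe_to_execute(move_direction, obstacle_distances, safety_threshold):
    """Return True if the movement instruction may be executed, or False if
    an obstacle detected in the commanded direction is closer than the
    preset safety threshold (the instruction is then cancelled and a limit
    reminder signal is issued).

    obstacle_distances: mapping from direction name to measured distance."""
    distance = obstacle_distances.get(move_direction)
    if distance is not None and distance < safety_threshold:
        return False  # cancel the instruction and emit the limit reminder
    return True
```

Note that obstacles in other directions do not block the instruction, matching the direction-specific prediction described above.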
- in summary, the present invention provides a technical solution in which the user controls the system directly through actions: the camera component automatically acquires the captured image, the control component automatically analyzes it to obtain the user action feature to be executed, and the control instruction the user needs is interpreted from that feature.
- the user can thereby directly perform flight control of the drone and shooting control of the camera component through actions, realizing the shooting function; shooting that meets the demand can easily be achieved on any occasion, improving the user experience.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Automation & Control Theory (AREA)
- Studio Devices (AREA)
Abstract
The present invention relates to an interactive photographing system and method for an unmanned aerial vehicle. The system comprises an unmanned aerial vehicle, a camera component, and a control component. One end of the camera component is rotatably connected to one side of the unmanned aerial vehicle. The control component comprises a control instruction library, an image processing module, an instruction determining module, and an instruction execution module. The instruction execution module controls the unmanned aerial vehicle and/or the camera component according to the established control instruction. The invention further provides a technical solution for implementing interactive photographing with an unmanned aerial vehicle: a captured image is automatically obtained by the camera component and automatically analyzed by the control component to obtain a user action feature to be executed; the control instruction desired by the user is interpreted according to that action feature; the user can thus directly perform flight control of the unmanned aerial vehicle and photographing control of the camera component through an action, so as to realize the photographing function. The present invention can easily implement photographing that meets requirements on any occasion, improving the user experience.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201780000407.6A CN109121434B (zh) | 2017-04-17 | 2017-04-17 | 无人机交互拍摄系统及方法 |
PCT/CN2017/080738 WO2018191840A1 (fr) | 2017-04-17 | 2017-04-17 | Système et procédé de photographie interactive destinés à un véhicule aérien sans pilote |
TW107111546A TWI696122B (zh) | 2017-04-17 | 2018-04-02 | 無人機互動拍攝系統方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/CN2017/080738 WO2018191840A1 (fr) | 2017-04-17 | 2017-04-17 | Système et procédé de photographie interactive destinés à un véhicule aérien sans pilote |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018191840A1 true WO2018191840A1 (fr) | 2018-10-25 |
Family
ID=63855487
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2017/080738 WO2018191840A1 (fr) | 2017-04-17 | 2017-04-17 | Système et procédé de photographie interactive destinés à un véhicule aérien sans pilote |
Country Status (3)
Country | Link |
---|---|
CN (1) | CN109121434B (fr) |
TW (1) | TWI696122B (fr) |
WO (1) | WO2018191840A1 (fr) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112019744A (zh) * | 2020-08-27 | 2020-12-01 | 新石器慧义知行智驰(北京)科技有限公司 | 一种拍照方法、装置、设备和介质 |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI768630B (zh) * | 2020-12-29 | 2022-06-21 | 財團法人工業技術研究院 | 可移動攝影系統和攝影構圖控制方法 |
US11445121B2 (en) | 2020-12-29 | 2022-09-13 | Industrial Technology Research Institute | Movable photographing system and photography composition control method |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104808799A (zh) * | 2015-05-20 | 2015-07-29 | 成都通甲优博科技有限责任公司 | 一种能够识别手势的无人机及其识别方法 |
CN105138126A (zh) * | 2015-08-26 | 2015-12-09 | 小米科技有限责任公司 | 无人机的拍摄控制方法及装置、电子设备 |
WO2015200209A1 (fr) * | 2014-06-23 | 2015-12-30 | Nixie Labs, Inc. | Véhicules aériens sans pilote portatifs, véhicules aériens sans pilote à lancement commandé, et systèmes et procédés associés |
CN105391939A (zh) * | 2015-11-04 | 2016-03-09 | 腾讯科技(深圳)有限公司 | 无人机拍摄控制方法和装置、无人机拍摄方法和无人机 |
CN105607740A (zh) * | 2015-12-29 | 2016-05-25 | 清华大学深圳研究生院 | 一种基于计算机视觉的无人飞行器控制方法及装置 |
CN105847684A (zh) * | 2016-03-31 | 2016-08-10 | 深圳奥比中光科技有限公司 | 无人机 |
JP2016225872A (ja) * | 2015-06-01 | 2016-12-28 | 日本電信電話株式会社 | 移動装置操作端末、移動装置操作方法及び移動装置操作プログラム |
Family Cites Families (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9678506B2 (en) * | 2014-06-19 | 2017-06-13 | Skydio, Inc. | Magic wand interface and other user interaction paradigms for a flying digital assistant |
CN105338238B (zh) * | 2014-08-08 | 2019-04-23 | 联想(北京)有限公司 | 一种拍照方法及电子设备 |
US9471059B1 (en) * | 2015-02-17 | 2016-10-18 | Amazon Technologies, Inc. | Unmanned aerial vehicle assistant |
CN104865856B (zh) * | 2015-03-30 | 2018-02-06 | 广州势必可赢网络科技有限公司 | 一种适用于无人机的语音控制方法 |
CN106155080B (zh) * | 2015-07-28 | 2020-04-10 | 英华达(上海)科技有限公司 | 无人机 |
CN105677300A (zh) * | 2016-02-04 | 2016-06-15 | 普宙飞行器科技(深圳)有限公司 | 基于手势识别操控无人机的方法、无人机及系统 |
CN106227231A (zh) * | 2016-07-15 | 2016-12-14 | 深圳奥比中光科技有限公司 | 无人机的控制方法、体感交互装置以及无人机 |
CN106200679B (zh) * | 2016-09-21 | 2019-01-29 | 中国人民解放军国防科学技术大学 | 基于多模态自然交互的单操作员多无人机混合主动控制方法 |
CN106444843B (zh) * | 2016-12-07 | 2019-02-15 | 北京奇虎科技有限公司 | 无人机相对方位控制方法及装置 |
-
2017
- 2017-04-17 WO PCT/CN2017/080738 patent/WO2018191840A1/fr active Application Filing
- 2017-04-17 CN CN201780000407.6A patent/CN109121434B/zh active Active
-
2018
- 2018-04-02 TW TW107111546A patent/TWI696122B/zh active
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015200209A1 (fr) * | 2014-06-23 | 2015-12-30 | Nixie Labs, Inc. | Véhicules aériens sans pilote portatifs, véhicules aériens sans pilote à lancement commandé, et systèmes et procédés associés |
CN104808799A (zh) * | 2015-05-20 | 2015-07-29 | 成都通甲优博科技有限责任公司 | 一种能够识别手势的无人机及其识别方法 |
JP2016225872A (ja) * | 2015-06-01 | 2016-12-28 | 日本電信電話株式会社 | 移動装置操作端末、移動装置操作方法及び移動装置操作プログラム |
CN105138126A (zh) * | 2015-08-26 | 2015-12-09 | 小米科技有限责任公司 | 无人机的拍摄控制方法及装置、电子设备 |
CN105391939A (zh) * | 2015-11-04 | 2016-03-09 | 腾讯科技(深圳)有限公司 | 无人机拍摄控制方法和装置、无人机拍摄方法和无人机 |
CN105607740A (zh) * | 2015-12-29 | 2016-05-25 | 清华大学深圳研究生院 | 一种基于计算机视觉的无人飞行器控制方法及装置 |
CN105847684A (zh) * | 2016-03-31 | 2016-08-10 | 深圳奥比中光科技有限公司 | 无人机 |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112019744A (zh) * | 2020-08-27 | 2020-12-01 | 新石器慧义知行智驰(北京)科技有限公司 | 一种拍照方法、装置、设备和介质 |
Also Published As
Publication number | Publication date |
---|---|
CN109121434B (zh) | 2021-07-27 |
TWI696122B (zh) | 2020-06-11 |
TW201839663A (zh) | 2018-11-01 |
CN109121434A (zh) | 2019-01-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US12149819B2 (en) | Autonomous media capturing | |
CN104828256B (zh) | 一种智能多模式飞行拍摄设备及其飞行控制方法 | |
CN110119154A (zh) | 飞行器的控制方法、装置和设备以及飞行器 | |
JP6696118B2 (ja) | 電子機器 | |
WO2018209702A1 (fr) | Procédé de commande de véhicule aérien sans pilote, véhicule aérien sans pilote et support d'informations lisible par machine | |
CN108476288A (zh) | 拍摄控制方法及装置 | |
US20160292886A1 (en) | Apparatus and method for photographing people using a movable remote device | |
CN110692027A (zh) | 用于提供无人机应用的易用的释放和自动定位的系统和方法 | |
EP4195651A1 (fr) | Stabilisateur de dispositif photographique | |
US20200329202A1 (en) | Image capturing apparatus, control method, and recording medium | |
CN106131413A (zh) | 一种拍摄设备的控制方法及拍摄设备 | |
WO2018191840A1 (fr) | Système et procédé de photographie interactive destinés à un véhicule aérien sans pilote | |
WO2019104681A1 (fr) | Procédé et dispositif de capture d'image | |
KR101951666B1 (ko) | 촬영용 드론 및 그 제어 방법 | |
CN109995991A (zh) | 一种拍摄方法、机器人及移动终端 | |
WO2022000138A1 (fr) | Procédé et appareil dfe commande de photographie, cardan et système de photographie | |
CN114710623A (zh) | 基于手持云台的拍摄方法、手持云台及存储介质 | |
CN110337806A (zh) | 集体照拍摄方法和装置 | |
CN112189333B (zh) | 跟随拍摄、云台控制方法、拍摄装置、手持云台和拍摄系统 | |
US20230033760A1 (en) | Aerial Camera Device, Systems, and Methods | |
CN110291776B (zh) | 飞行控制方法及飞行器 | |
CN109447924B (zh) | 一种图片合成方法、装置及电子设备 | |
KR101599149B1 (ko) | 피사체를 자동으로 추적하는 촬영장치 | |
CN112154652A (zh) | 手持云台的控制方法、控制装置、手持云台及存储介质 | |
CN111031202A (zh) | 基于四旋翼的智能拍照无人机、智能拍照系统及其方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 17906581 Country of ref document: EP Kind code of ref document: A1 |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 17906581 Country of ref document: EP Kind code of ref document: A1 |