
WO2018191840A1 - UAV Interactive Shooting System and Method - Google Patents


Info

Publication number
WO2018191840A1
Authority
WO
WIPO (PCT)
Prior art keywords
drone
user
instruction
control
camera
Prior art date
Application number
PCT/CN2017/080738
Other languages
English (en)
French (fr)
Inventor
张景嵩
张凌
戴志宏
Original Assignee
英华达(上海)科技有限公司
英华达(上海)电子有限公司
英华达股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 英华达(上海)科技有限公司, 英华达(上海)电子有限公司, 英华达股份有限公司
Priority to CN201780000407.6A priority Critical patent/CN109121434B/zh
Priority to PCT/CN2017/080738 priority patent/WO2018191840A1/zh
Priority to TW107111546A priority patent/TWI696122B/zh
Publication of WO2018191840A1 publication Critical patent/WO2018191840A1/zh

Classifications

    • GPHYSICS
    • G05CONTROLLING; REGULATING
    • G05DSYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D3/00Control of position or direction
    • G05D3/12Control of position or direction using feedback
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer

Definitions

  • The invention relates to the technical field of drone control, and in particular to a drone interactive shooting system and method in which the user directly controls the drone through actions.
  • Unmanned aerial vehicles are commonly referred to as drones.
  • Existing UAV shooting can be divided into two categories: commercial aerial photography and personal entertainment selfies.
  • Currently, a drone is controlled through a remote controller or an application on a handheld mobile device.
  • When using a drone for personal entertainment selfies, the user often has to attend to both the drone and the remote controller, which is inconvenient to operate.
  • When shooting a group photo at a party, it is often impossible for the user to watch the application screen on the handheld mobile device, so a clear face cannot be captured; likewise, because the remote controller occupies the user's hands, actions such as a jump at the moment of shooting cannot be performed, which affects the shooting result.
  • In addition, miniaturized selfie drones tend to have small batteries and short battery life, which limits the fun of shooting and cannot meet current users' needs.
  • The object of the present invention is to provide a UAV interactive shooting system and method in which the user directly controls the flight of the drone and the shooting of the camera assembly through actions, thereby realizing the shooting function and improving the shooting effect.
  • An embodiment of the present invention provides a UAV interactive photographing system, the system including a drone, a camera assembly, and a control assembly, one end of the camera assembly being rotatably coupled to one side of the drone;
  • the control components include:
  • control instruction library configured to store a preset mapping relationship between various user action features and various control commands, where the control command includes a drone control command and/or a camera component control command;
  • An image processing module configured to process a captured image of the camera component to acquire a user action feature to be executed in the captured image
  • An instruction determining module configured to search for a corresponding control instruction in the control instruction library according to the user action feature to be executed
  • an instruction execution module configured to control the drone and/or the camera assembly according to the obtained control instruction.
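The instruction-library lookup described in the bullets above can be sketched as a plain mapping from recognized action features to control commands. The gesture labels and command names below are illustrative assumptions, not taken from the patent.

```python
# Hypothetical sketch of the control-instruction library: a preset mapping
# from recognized user action features to (target, command) pairs.
GESTURE_COMMANDS = {
    "palm_left": ("drone", "translate_left"),
    "palm_down": ("camera", "tilt_down"),
    "thumbs_up": ("camera", "start_shooting"),
    "fist": ("camera", "stop_shooting"),
}

def lookup_command(action_feature):
    """Return the (target, command) pair for an action feature, or None
    when the feature has no mapping (the feature is then ignored)."""
    return GESTURE_COMMANDS.get(action_feature)
```

The instruction execution module would then dispatch the returned pair to either the drone's flight controller or the camera assembly.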
  • The camera assembly includes an imaging device and a camera bracket; the imaging device is disposed in the camera bracket, and one end of the camera bracket is rotatably connected to one side of the drone;
  • the system also includes a display device detachably or fixedly mounted to the other end of the camera mount.
  • the display device comprises an array display screen and a first display control unit; the first display control unit acquires a captured image of the imaging device and displays through the array display screen.
  • The display device includes a dot matrix display screen and a second display control unit; the second display control unit acquires the control command obtained by the instruction determining module, and controls the dot matrix display screen to display user prompt information associated with the obtained control command.
  • One end of the camera bracket is formed as a bump, and one side of the drone is provided with a groove corresponding to the shape of the bump; the bump of the camera bracket is embedded in the groove of the drone;
  • The lower surface of the drone is a plane that includes a corresponding area for the camera bracket, and the two sides of the groove of the drone are perpendicular to the lower surface of the drone; and
  • The bump of the camera bracket can rotate in the groove of the drone, so that the camera bracket can rotate within the angular range between being perpendicular to the lower surface of the drone and being attached to the corresponding area of the camera bracket.
  • The lower surface of the drone further includes a corresponding area for a power storage device, and the corresponding area of the power storage device does not intersect the corresponding area of the camera bracket;
  • The system further includes a power storage device detachably or fixedly mounted on the lower surface of the drone, attached to the corresponding area of the power storage device.
  • The camera bracket includes a first arm, a second arm, and a third arm; one side of the first arm is connected to the bump, and a first slot is disposed on the other side of the first arm; one end of the second arm and one end of the third arm are respectively connected to the two ends of the first arm, and the second arm and the third arm are both perpendicular to the first arm; the other end of the second arm is provided with a second slot, and the other end of the third arm is provided with a third slot;
  • One side of the display device is inserted into the first slot, and the other side of the display device is inserted into the second slot and the third slot.
  • The system further includes a voice acquiring device, where the voice acquiring device is configured to acquire voice data of the user;
  • the control instruction library is further configured to store a mapping relationship between preset various voice keywords and various control instructions
  • the control component further includes a voice processing module, where the voice processing module is configured to extract a voice keyword included in the voice data of the user;
  • the instruction determining module is further configured to search for a corresponding control instruction in the control instruction library according to the extracted voice keyword.
  • The voice processing module is further configured to acquire a voiceprint feature from the voice data of the user and determine whether it matches a pre-stored designated voiceprint feature;
  • If it matches, the instruction determining module extracts a voice keyword contained in the voice data of the user and searches the control instruction library for the corresponding control command according to the extracted voice keyword;
  • If it does not match, the instruction determining module ignores the user's voice data and does not perform the voice keyword extraction process.
  • The image processing module is further configured to acquire a physiological feature of the user from the captured image of the camera assembly and determine whether it matches a pre-stored designated physiological feature;
  • If it matches, the instruction determining module searches the control instruction library for the corresponding control instruction according to the user action feature to be executed;
  • If it does not match, the instruction determining module ignores the user action feature to be executed and does not perform the control instruction search process.
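Both verification paths above (voiceprint and physiological feature) reduce to the same gate: the extracted feature must match the stored identity before any command lookup takes place. A minimal sketch, with all names assumed:

```python
def gated_lookup(feature, matches_stored_identity, command_table):
    """Search the command table only when the extracted voiceprint or
    physiological feature matched the pre-stored designated one;
    otherwise the input is ignored and no command is returned."""
    if not matches_stored_identity:
        return None  # unverified user: skip keyword/feature processing
    return command_table.get(feature)
```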
  • The drone control instruction includes at least one of a drone translation command, a drone rotation command, a drone power-on command, and a drone power-off command;
  • The camera assembly control instruction includes at least one of a camera assembly rotation command, a shooting parameter adjustment command, a shooting start command, and a shooting stop command.
  • Searching for the control instruction further includes:
  • the instruction determining module searches the control instruction library for the corresponding drone control command according to the user action feature, and controls the drone according to the obtained drone control command;
  • the instruction determining module searches the control instruction library for the corresponding camera assembly control command according to the user action feature, and controls the camera assembly according to the obtained camera assembly control command.
  • The control instruction further includes:
  • a panoramic mode selection instruction instructing the control component to enter a panoramic mode, in which the instruction execution module controls the drone to continuously move within a range of (0, α) angles at a preset speed, where α is the preset maximum panoramic shooting angle.
  • the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
  • the camera assembly detects a location of the user
  • the drone takes the user's position as a starting point and rotates α/n to one side in the same horizontal plane, where n is a first preset division value and n > 1;
  • the camera assembly starts shooting, and the drone rotates α to the other side at a preset speed in the same horizontal plane;
  • the camera assembly stops shooting.
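The rotation sequence above fixes the sweep geometry: pre-rotate α/n to one side of the user's bearing, then sweep α back across to the other side while shooting. A small sketch of the resulting headings, in degrees; the function and parameter names are illustrative:

```python
def rotation_panorama_plan(user_bearing_deg, alpha_deg, n):
    """Headings for the rotation-based panorama: the drone first turns
    alpha/n to one side of the user's bearing (no shooting), then sweeps
    alpha across to the other side while the camera records."""
    assert n > 1
    start = user_bearing_deg + alpha_deg / n  # where the pre-rotation ends
    end = start - alpha_deg                   # where the recorded sweep ends
    return start, end
```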
  • the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
  • the instruction execution module calculates the distance L between the camera assembly and the user;
  • the instruction execution module selects a positioning point between the camera assembly and the user, and uses the positioning point as the center to generate a first sector with an angle of α and a radius of L/m, with the subject to be photographed located on the arc of the first sector, where m is a second preset division value and m > 1;
  • the instruction execution module generates a second sector opposite to the first sector; the two sides of the second sector are respectively the opposite extension lines of the two sides of the first sector, and the second sector has a radius of (m-1)L/m and an angle of α;
  • the camera assembly starts shooting, and the drone moves from one end of the arc of the second sector along the trajectory of the arc to the other end of the arc;
  • the camera assembly stops shooting.
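Under the stated geometry, the radius and endpoints of the drone's flight arc (the second sector) follow directly from L, m, and α. A hypothetical sketch, placing the positioning point at the origin with the subject toward +x, so the drone's sector opens toward −x:

```python
import math

def arc_panorama(L, m, alpha_deg):
    """Radius and arc endpoints of the drone's flight path (second sector).

    The subject lies on the first sector's arc of radius L/m toward +x;
    the drone flies the opposite arc of radius (m-1)*L/m, spanning alpha
    degrees symmetrically about the -x axis.
    """
    assert m > 1
    r = (m - 1) * L / m
    half = math.radians(alpha_deg) / 2
    start = (-r * math.cos(half), -r * math.sin(half))
    end = (-r * math.cos(half), r * math.sin(half))
    return r, start, end
```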
  • the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
  • the instruction execution module calculates the distance L between the camera assembly and the user;
  • the instruction execution module selects a positioning point between the camera assembly and the user, and uses the positioning point as the vertex to generate a first isosceles triangle with an apex angle of α and waists of length L/m, with the subject to be photographed located on the base of the first isosceles triangle, where m is a second preset division value and m > 1;
  • the instruction execution module generates a second isosceles triangle opposite to the first isosceles triangle; the two waists of the second isosceles triangle are respectively the opposite extension lines of the two waists of the first isosceles triangle, the waists of the second isosceles triangle have length (m-1)L/m, and the apex angle is α;
  • the camera assembly starts shooting, and the drone moves from one end of the base of the second isosceles triangle along the base to the other end;
  • the camera assembly stops shooting.
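For this straight-line variant, the length of the drone's path is the base of the second isosceles triangle, which follows from the waist length (m−1)L/m and the apex angle α by elementary trigonometry; a sketch with assumed names:

```python
import math

def linear_panorama_base(L, m, alpha_deg):
    """Length of the drone's straight path: the base of an isosceles
    triangle with waists (m-1)*L/m and apex angle alpha degrees."""
    assert m > 1
    waist = (m - 1) * L / m
    return 2 * waist * math.sin(math.radians(alpha_deg) / 2)
```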
  • control instruction further includes:
  • a third mode selection instruction instructing the control component to enter a third mode, in the third mode, the instruction execution module controls the camera component to perform shooting after a preset waiting time.
  • control instruction further includes:
  • a fourth mode selection instruction instructing the control component to enter a fourth mode;
  • in the fourth mode, the instruction execution module detects the position of the user through the camera assembly, and controls the drone and the camera assembly to move automatically according to the user's position, so that the camera assembly keeps capturing the user.
  • Furthermore, the instruction execution module acquires the acceleration of the user's position change, and when it exceeds a preset acceleration threshold, the instruction execution module sends an alarm signal to the outside.
  • the UAV is further provided with at least one distance sensor
  • the control component further includes an obstacle calculation module
  • the obstacle calculation module is configured to acquire obstacle detection data of the distance sensor
  • When the control instruction to be executed includes a drone movement instruction, and the obstacle calculation module determines that the distance between the drone and an obstacle in the moving direction is less than a preset safety threshold, the drone movement command is cancelled and a limit reminder signal is issued to the outside.
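The obstacle check above amounts to vetting each movement command against the measured clearance before execution; a minimal sketch with assumed names:

```python
def vet_move_command(command, obstacle_distance, safety_threshold):
    """Cancel a drone movement command when the obstacle in the moving
    direction is closer than the preset safety threshold, emitting a
    limit-reminder signal; otherwise pass the command through."""
    if obstacle_distance < safety_threshold:
        return None, "limit_reminder"
    return command, None
```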
  • The invention also provides a drone interactive shooting method using the above UAV interactive shooting system, the method comprising the following steps:
  • the camera assembly acquires a captured image
  • the image processing module processes the captured image of the camera component to acquire a user action feature to be executed in the captured image
  • the instruction determining module searches for a corresponding control instruction in the control instruction library according to the user action feature to be executed;
  • the instruction execution module controls the drone and/or the camera assembly according to the obtained control command.
  • The control instruction further includes a panoramic mode selection instruction instructing the control component to enter a panoramic mode, in which the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting through the following steps:
  • the camera assembly detects a location of the user
  • the drone takes the user's position as a starting point and rotates α/n to one side in the same horizontal plane, where n is a first preset division value, n > 1, and α is the preset maximum panoramic shooting angle;
  • the camera assembly starts shooting, and the drone rotates α to the other side at a preset speed in the same horizontal plane;
  • the camera assembly stops shooting.
  • The control instruction further includes a panoramic mode selection instruction instructing the control component to enter a panoramic mode, in which the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
  • the instruction execution module calculates the distance L between the camera assembly and the user;
  • the instruction execution module selects a positioning point between the camera assembly and the user, and uses the positioning point as the center to generate a first sector with an angle of α and a radius of L/m, with the subject to be photographed located on the arc of the first sector, where m is a second preset division value, m > 1, and α is the preset maximum panoramic shooting angle;
  • the instruction execution module generates a second sector opposite to the first sector; the two sides of the second sector are respectively the opposite extension lines of the two sides of the first sector, and the second sector has a radius of (m-1)L/m and an angle of α;
  • the camera assembly starts shooting, and the drone moves from one end of the arc of the second sector along the trajectory of the arc to the other end of the arc;
  • the camera assembly stops shooting.
  • The control instruction further includes a panoramic mode selection instruction instructing the control component to enter a panoramic mode, in which the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
  • the instruction execution module calculates the distance L between the camera assembly and the user;
  • the instruction execution module selects a positioning point between the camera assembly and the user, and uses the positioning point as the vertex to generate a first isosceles triangle with an apex angle of α and waists of length L/m, with the subject to be photographed located on the base of the first isosceles triangle, where m is a second preset division value, m > 1, and α is the preset maximum panoramic shooting angle;
  • the instruction execution module generates a second isosceles triangle opposite to the first isosceles triangle; the two waists of the second isosceles triangle are respectively the opposite extension lines of the two waists of the first isosceles triangle, the waists of the second isosceles triangle have length (m-1)L/m, and the apex angle is α;
  • the camera assembly starts shooting, and the drone moves from one end of the base of the second isosceles triangle along the base to the other end;
  • the camera assembly stops shooting.
  • The invention provides a technical solution in which the user controls the system directly through actions: the camera assembly automatically acquires the captured image, the control component automatically analyzes the user action feature to be executed, and the control command required by the user is interpreted according to that action feature.
  • Thus the user can directly control the flight of the drone and the shooting of the camera assembly, realizing the shooting function, conveniently meeting shooting needs on any occasion, and improving the user experience.
  • FIG. 1 is a block diagram showing the structure of an unmanned aerial vehicle interactive photographing system according to an embodiment of the present invention
  • FIG. 2 is a schematic structural diagram of an unmanned aerial camera interactive shooting system using an array display screen according to an embodiment of the present invention
  • FIG. 3 is a schematic structural diagram of a UAV interactive photographing system using a dot matrix display screen according to an embodiment of the present invention
  • FIG. 4 is a schematic diagram of adjusting the position of a drone according to an embodiment of the present invention.
  • FIG. 5 is a schematic diagram of adjusting an angle of a camera assembly according to an embodiment of the invention.
  • 6-7 are schematic diagrams of gesture control according to an embodiment of the present invention.
  • FIG. 8 is a schematic structural diagram of an external display device according to an embodiment of the present invention.
  • FIG. 9 is a schematic structural diagram of a display device when it is stowed according to an embodiment of the present invention.
  • FIG. 10 is a schematic bottom view of the unmanned aerial vehicle according to an embodiment of the present invention when not in use;
  • FIG. 11 is a schematic structural diagram of an electrical storage device according to an embodiment of the present invention.
  • FIG. 12 is a schematic diagram showing a state of a drone when charging according to an embodiment of the present invention.
  • FIG. 13 is a flow chart showing a charging process of a drone according to an embodiment of the present invention.
  • FIG. 14 is a schematic diagram of controlling the position of a drone by voice according to an embodiment of the present invention.
  • 15 is a schematic structural diagram of an unmanned aerial camera interactive shooting system with voice control added according to an embodiment of the present invention.
  • 16 is a flow chart of user voiceprint verification according to an embodiment of the present invention.
  • 17 is a flow chart of user physiological feature verification according to an embodiment of the present invention.
  • 18 to 20 are flowcharts of a method for interactively capturing a drone according to an embodiment of the present invention.
  • 21 is a flow chart of panoramic shooting according to an embodiment of the present invention.
  • Figure 22 is a schematic view showing the rotation of the drone during panoramic shooting according to an embodiment of the present invention.
  • FIG. 23 is a schematic diagram of a drone moving along a circular arc path during panoramic shooting according to an embodiment of the present invention.
  • 24 is a schematic diagram of a drone moving along a linear trajectory during panoramic shooting according to an embodiment of the present invention.
  • 25 is a flow chart of automatically tracking a user's position by a drone according to an embodiment of the present invention.
  • Figure 26 is a flow chart showing the automatic obstacle avoidance of the drone according to an embodiment of the present invention.
  • an embodiment of the present invention provides a UAV interactive photographing system, which includes a drone 200, a camera assembly 300, and a control assembly 100.
  • One end of the camera assembly 300 is rotatably coupled to the One side of the drone 200;
  • The control component 100 includes: a control instruction library 110 for storing a mapping relationship between preset user action features and various control commands, the control commands including drone control instructions and/or camera assembly control instructions; an image processing module 120 configured to process the captured image of the camera assembly 300 to acquire the user action feature to be executed in the captured image; an instruction determining module 130 configured to search the control instruction library for the corresponding control instruction according to the user action feature to be executed; and an instruction execution module 140 configured to control the drone 200 and/or the camera assembly 300 according to the obtained control instruction.
  • The user action feature here is preferably a user gesture; that is, different gestures can be mapped to different control commands.
  • Other user action features are also available, such as the user's eyes, nodding, shaking the head, or laughing; for example, the system can be set to capture a picture when the user laughs, so that automatic capture of the user's smile can be achieved, and the like.
  • The following embodiments describe control based mainly on user gestures; however, it will be appreciated that the use of other user action features is also within the scope of the present invention.
  • FIG. 2 is a schematic structural diagram of an unmanned aerial vehicle interactive photographing system according to an embodiment of the present invention.
  • a drone 200, on one side of which a camera assembly 300 is rotatably mounted; the camera assembly 300 includes an imaging device 320 and a camera bracket 310, the imaging device 320 is disposed in the camera bracket 310, and one end of the camera bracket 310 is rotatably connected to one side of the drone 200; further,
  • the system can also include a display device 330 that is detachably or fixedly mounted to the other end of the camera mount 310.
  • The control assembly 100 may be disposed inside the drone 200 or on the surface of the drone 200; all other placements are also within the scope of the invention.
  • the instruction execution module 140 can directly communicate with the controller of the drone 200, or can perform wireless communication with the camera assembly 300, thereby implementing delivery and feedback of control commands.
  • The display device 330 can display content for viewing by the user as needed; two configurations of the display device 330 are given in FIG. 2 and FIG. 3.
  • the display device 330 shown in FIG. 2 includes an array display screen and a first display control unit; the first display control unit acquires a captured image of the imaging device 320 and displays it through the array display screen.
  • the array display can include, but is not limited to, a color LCD screen, and the user can view the self-timer picture in real time through the display.
  • The display device 330 shown in FIG. 3 includes a dot matrix display screen and a second display control unit; the second display control unit acquires the control command obtained by the instruction determining module 130 and controls the dot matrix display screen to display user prompt information associated with the obtained control command.
  • The dot matrix display screen may include, but is not limited to, a dot matrix LED screen; the user can prepare for and time the selfie according to the arrangement of the LED digits.
  • The user prompt information may be a self-timer countdown. For example, when a five-second countdown to shooting starts, the dot matrix display sequentially shows 5, 4, 3, 2, and 1, and the user can prepare for the selfie according to the countdown. The user prompt information can also indicate the current shooting mode; for example, when 2 is displayed, the system is currently in the second mode, and so on.
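The countdown behaviour described above is straightforward to sketch; the function name is illustrative:

```python
def countdown_digits(seconds):
    """Digits shown one per second on the dot-matrix screen during a
    self-timer countdown, e.g. 5, 4, 3, 2, 1 for a five-second timer."""
    return list(range(seconds, 0, -1))
```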
  • In this way, the camera assembly automatically acquires the captured image, the control component automatically analyzes the user action feature to be executed and interprets the control command required by the user accordingly, so that the user can accomplish control of the drone 200 and/or the camera assembly 300.
  • The drone control command may include at least one of a drone panning command, a drone rotation command, a drone power-on command, and a drone power-off command.
  • the camera component control command may include at least one of a camera component rotation command, a shooting parameter adjustment command, a shooting start command, and a shooting stop command.
  • the shooting parameters that can be adjusted here can include focus, fill light, image size, and so on.
  • FIG. 4 a schematic diagram of adjusting the position of the drone 200 according to an embodiment of the present invention is shown. To adjust the position of the drone, you can use the following steps:
  • The user 400 observes the self-portrait angle on the display device 330 and finds that the portrait is at the far-left position of the screen (the portrait shown by the broken line in FIG. 4); the user 400 then controls the drone through a gesture (from the dotted-line state of the user 400 in FIG. 4 to the solid-line state) to move to the left until the portrait is in the center of the screen (the portrait shown in solid lines in FIG. 4);
  • After the shooting conditions are met, the user 400 triggers shooting by gesture control.
  • FIG. 5 it is a schematic diagram of adjusting the angle of the camera assembly 300 according to an embodiment of the invention. To specifically adjust the camera component 300, the following steps can be taken:
  • The user 400 observes the self-portrait angle on the display device 330 and finds that the drone 200 is too high, so the portrait is at the lower position of the screen (the portrait shown by the dotted line in FIG. 5); the user then controls the camera assembly 300 through a gesture (from the dotted-line state of the user's hand in FIG. 5 to the solid-line state) to flip down, thereby driving the imaging device 320 to flip down until the portrait is in the center of the screen (the portrait shown by the solid line in FIG. 5);
  • After the shooting conditions are met, the user 400 triggers shooting by gesture control.
  • The manner of controlling the drone 200 and the camera assembly 300 can also be flexibly selected; for example, the adjustment can instead be performed by reducing the height of the drone 200 until the portrait is in the middle of the screen.
  • The adjustment manners of the drone 200 and the camera assembly 300 can be distinguished by different preset gesture commands; that is, when a gesture is recognized, it is known whether the gesture's control object is the drone 200 or the camera assembly 300, and which action of that object the gesture controls.
  • Examples of gesture control are given in FIG. 6 and FIG. 7.
  • The user can also customize the mapping relationship between different gestures and different control commands, modifying it to gestures that conform to their usage habits.
  • Other action features can also be added. For example, the user nods to confirm the shooting, the user shakes the head to delete the previous captured image, and so on.
  • In another embodiment, the camera assembly 300 uses an external display device 340.
  • The external display device 340 can further be the user's mobile terminal.
  • The external display device 340 and the control component 100 can communicate wirelessly or through a data line such as USB.
  • One end of the camera bracket 310 is formed as a bump 311, and one side of the drone 200 is provided with a groove 210 corresponding to the shape of the bump; the bump 311 of the camera bracket is embedded in the groove 210 of the drone.
  • the camera bracket 310 includes a first arm 312 , a second arm 313 , and a third arm 314 .
  • One side of the first arm 312 is connected to the bump 311 .
  • a first slot is disposed on the other side of the first arm 312, and one end of the second arm 313 and one end of the third arm 314 are respectively connected to two ends of the first arm 312.
  • the second arm 313 and the third arm 314 are both perpendicular to the first arm 312, and the other end of the second arm 313 is provided with a second slot, the third arm 314 The other end is provided with a third slot.
  • the external display device 340 can be placed in the camera holder 310, the upper end of the external display device 340 is inserted into the first slot, and the lower end of the external display device 340 is inserted into the second slot and the third slot. Thereby, a stable and convenient connection between the external display device 340 and the imaging stand 310 is formed.
  • the display device 330 is a built-in display device 330.
  • the camera bracket 310 rotates through the cooperation of the bump 311 and the groove 210, and the display device 330 rotates together with the camera bracket 310.
  • the lower surface of the drone 200 is a plane and includes a camera bracket corresponding area 220; the two side faces of the groove 210 are perpendicular to the lower surface of the drone 200, so that the bump 311 can rotate up and down within the groove 210.
  • the camera assembly 300 can be adjusted within a desired range of angles to achieve better shooting results.
  • the camera bracket 310 can be folded flat against the camera bracket corresponding area 220, making the drone easy to fold and carry.
  • the embodiment of the present invention further provides a convenient charging mode.
  • the lower surface of the drone 200 further includes a power storage device corresponding area 230 that does not overlap the camera bracket corresponding area 220; the system further includes a power storage device 500, detachably or fixedly mounted on the lower surface of the drone 200 and fitting against the power storage device corresponding area.
  • when the display is an external display screen, the connection between the external display screen and the drone is first disconnected; the external display screen can then be removed, or left on the camera bracket 310 and folded together with it. If the power storage device 500 is inserted at this point, charging starts; otherwise the drone simply powers off.
  • the power storage device 500 is installed in the corresponding area of the power storage device.
  • the power storage device 500 is connected to the drone's rechargeable battery to perform the charging operation.
  • the embodiment of the present invention may further include a voice acquisition device 600 configured to acquire the user's voice data; the control command library 110 is further configured to store preset mapping relationships between voice keywords and the various control commands; the control component 100 further includes a voice processing module 150 configured to extract the voice keywords contained in the user's voice data; the instruction determination module 130 is further configured to search the control command library for the control instruction corresponding to the extracted voice keywords.
  • this embodiment also lets the user control shooting by voice. For example, if the keyword "power on" is set to turn on the camera component 300, the camera component 300 is turned on automatically when the words "power on" are detected in the user's voice data; if "drone" and "move to the left" are detected, the drone is automatically controlled to move left. Voice control is convenient, is not constrained by other conditions, and can be used on any occasion without affecting the user's shooting.
  • the control component 100 may receive other people's voices or environmental noise, so different sounds must be distinguished. That is, the voice processing module is further configured to acquire the user's voiceprint feature from the voice data and determine whether it is a pre-stored designated voiceprint feature. If it is, the instruction determination module extracts the voice keywords contained in the voice data and searches the control instruction library for the corresponding control instruction; if it is not, the voice data does not come from the designated user and must be screened out, that is, the instruction determination module ignores it and does not extract voice keywords.
  • the camera component 300 may also capture action features of people other than the designated user. To avoid confusion, the image processing module is further configured to acquire the user's physiological features from the captured image and determine whether they are pre-stored designated physiological features. If they are, the instruction determination module searches the control instruction library for the control instruction corresponding to the user action feature to be executed; if they are not, the instruction determination module ignores the user action feature to be executed and does not search for a control instruction.
  • the physiological features of the user may refer to the user's facial contours, hair color, hair length, skin color, lip color, and so on; several physiological features may also be combined for more accurate identification. All such variants fall within the scope of protection of the present invention.
  • an embodiment of the present invention further provides a method for interactively capturing a UAV, which adopts the UAV interactive photographing system, and the method includes the following steps:
  • the image processing module processes the captured image of the camera component to acquire a user action feature to be executed in the captured image
  • the instruction execution module controls the drone and/or the camera component according to the obtained control instruction.
  • the determination process may follow the flow shown in FIG. 19, performing the determinations and control in sequence, but is not limited to this order; other orders, such as first determining whether the instruction is a camera component control instruction and then whether it is a drone control instruction, also fall within the scope of protection of the present invention.
  • an embodiment of a specific drone interactive shooting method is shown. First, the display type is determined; if it is an external display, the control component must first connect to the external display via wireless communication in preparation for subsequent control. Then the corresponding control instruction is looked up according to the gesture-to-instruction mapping, and the control is executed.
  • the action features of the present invention are not limited to the one of the gestures, and the different actions of other body parts can also achieve the object of the present invention.
  • the control instructions may further include a first mode selection instruction and a second mode selection instruction, instructing the control component to enter the first mode and the second mode respectively.
  • after entering the first mode, received user action features default to drone control instructions: the instruction determination module searches the control instruction library for the corresponding drone control instruction and controls the drone accordingly, and camera component control instructions are no longer executed. After entering the second mode, received user action features default to camera component control instructions: the instruction determination module searches the control instruction library for the corresponding camera component control instruction and controls the camera component accordingly, and drone control instructions are no longer executed.
  • this reduces the number of action features the user must define. For example, the same gesture of an open palm moving downward means, in the first mode, that the drone moves downward and, in the second mode, that the camera assembly tilts downward. Only one specific embodiment is given here; the scope of protection of the present invention is not limited thereto.
  • owing to the smoothness and controllability of the drone in flight, it has some irreplaceable advantages over shooting with a hand-held camera: the drone can take photos with less jitter, lowering the anti-shake requirements on the camera device. When taking a panoramic photo with a camera in hand, the user often fails to obtain an ideal panorama because of jitter or other interference; the drone overcomes this problem.
  • the control instructions further include a panoramic mode selection instruction, instructing the control component to enter a panoramic mode; in the panoramic mode, the instruction execution module controls the drone to rotate continuously at a preset speed within the (0, α) angle range, where α is the preset maximum panoramic angle.
  • the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
  • the camera assembly detects a location of the user 400
  • the drone 200 takes the position of the user 400 as the starting point and rotates α/n to one side in the same horizontal plane; this is the drone's positioning stage, during which no shooting is performed, and n is the first preset division value, with n > 1;
  • the camera assembly starts shooting, and the drone 200 rotates α toward the other side at a uniform preset speed in the same horizontal plane, producing a panoramic photo spanning the angle α with the user at the designated position;
  • after the drone 200 stops rotating, the camera assembly stops shooting.
  • when n is 2, the user is placed at the center of the panoramic photo.
  • the angle ⁇ can be set as needed, and the position of the user in the panoramic photo can also be adjusted. For example, if the user is located in the left position, the drone can be rotated to one side by a/4, etc., and the shooting mode is adopted. Very flexible, and the success rate of taking panoramic photos is better, and taking photos is better.
  • the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
  • the instruction execution module calculates a distance L between the camera assembly and the user 400, that is, a distance indicated by a broken line connecting the user 400 and the drone 200 in the figure;
  • the instruction execution module selects a positioning point between the camera assembly and the user 400 and, with the positioning point as the center and L/m as the radius, generates a first sector 701 with angle α, the object to be photographed lying on the arc of the first sector 701, where m is the second preset division value, m > 1, and α is the preset maximum panoramic angle;
  • the instruction execution module generates a second sector 702 opposite the first sector 701; the two sides of the second sector 702 are the reverse extensions of the two sides of the first sector 701, and the radius of the second sector 702 is (m-1)L/m, with angle α;
  • the camera assembly starts shooting, and the drone 200 moves from one end of the arc of the second sector 702 along the trajectory of the arc to the other end of the arc;
  • after the drone 200 reaches the other end of the arc of the second sector 702, the camera assembly stops shooting.
  • the instruction execution module controls the drone and the camera assembly to perform panoramic photo shooting in the following manner:
  • the instruction execution module calculates a distance L between the camera assembly and the user 400, that is, a distance indicated by a broken line connecting the user 400 and the drone 200 in the figure;
  • the instruction execution module selects a positioning point between the camera assembly and the user 400 and, with the positioning point as the apex and L/m as the waist length, generates a first isosceles triangle 703 with apex angle α, the object to be photographed lying on the base of the first isosceles triangle 703, where m is the second preset division value, m > 1, and α is the preset maximum panoramic angle;
  • the instruction execution module generates a second isosceles triangle 704 opposite the first isosceles triangle 703; the two waists of the second isosceles triangle 704 are the reverse extensions of the two waists of the first isosceles triangle 703, and the waist length of the second isosceles triangle 704 is (m-1)L/m, with apex angle α;
  • the camera assembly starts shooting, and the drone 200 moves from one end of the bottom edge of the second isosceles triangle 704 along the trajectory of the bottom edge to the other end of the bottom edge;
  • after the drone 200 reaches the other end of the base of the second isosceles triangle 704, the camera assembly stops shooting.
  • the shooting trajectories in FIG. 23 and FIG. 24 can be selected as needed: a panorama can be formed by continuous shooting or by stitching several photos into one, and different choices of m and α yield different shooting ranges, giving more flexibility.
  • the drone can move according to the calculated preset trajectory, so that the camera component acquires different shooting positions and shooting angles.
  • a shooting countdown may be set; that is, the control instructions may further include a third mode selection instruction, instructing the control component to enter the third mode, in which the instruction execution module controls the camera component to shoot after a preset waiting time.
  • during the countdown, the remaining preparation time can be shown on the display device or indicated by other indicator lights or prompt sounds.
  • the drone of the present invention can also realize a user automatic tracking shooting function.
  • the control instructions may further include a fourth mode selection instruction, instructing the control component to enter a fourth mode in which the instruction execution module detects the user's position through the camera component and controls the drone and the camera assembly to move automatically according to the user's position, so that the camera assembly keeps shooting the user. This enables automatic tracking, ensuring that the user always stays within the shooting range.
  • the instruction execution module may further acquire a position change acceleration of the user, and when the position change acceleration of the user exceeds a preset acceleration threshold, the instruction execution module sends an alarm signal to the outside.
  • in this way, on the one hand, when the camera component cannot capture the user's position, the alarm alerts the user, who can actively return to the camera component's shooting range; on the other hand, fall detection can also be realized.
  • when the user accidentally falls or faints, an alarm signal can be sent automatically; if the user does not cancel it within a set time, the mobile terminal of another user associated with the user can be notified, or an emergency call made. This provides high-quality shooting while also ensuring the user's safety during use.
  • At least one distance sensor may be disposed on the drone, the control component further includes an obstacle calculation module, and the obstacle calculation module is configured to acquire obstacle detection data of the distance sensor;
  • when the control instruction to be executed contains a drone movement instruction and the obstacle calculation module determines that the distance between the drone and an obstacle in the instruction's movement direction is less than a preset safety threshold, the module cancels the drone movement instruction and sends a limit reminder signal to the outside. That is, once nearby obstacles are known from the distance sensors, the obstacle calculation module predicts, from the direction of the control instruction, whether executing the drone movement instruction might hit an obstacle; if so, the movement instruction is not executed, and the user is reminded that the distance is already below the limit and there is a danger of collision.
  • this embodiment is particularly suitable for indoor shooting: indoors the drone is constrained by walls and the ceiling and by many obstacles such as furniture and furnishings, and reliable calculation and danger prediction keep the drone safe. It also applies outdoors, where in open space the drone may move quickly and the user cannot easily foresee an approaching danger; in this way, the stability and reliability of the drone interactive shooting process are ensured.
  • the present invention provides a technical solution in which the user controls directly through actions: the camera component automatically acquires the captured image, and the control component automatically analyzes it to obtain the user action feature to be executed and interprets from it the control instruction the user needs. The user can thus directly control the drone's flight and the camera component's shooting through actions, realizing the shooting function, easily obtaining the desired shots on any occasion, and improving the user experience.


Abstract

Provided are a drone interactive shooting system and method. The system includes a drone, a camera assembly, and a control assembly, one end of the camera assembly being rotatably connected to one side of the drone; the control assembly includes a control instruction library, an image processing module, an instruction determination module, and an instruction execution module, the instruction execution module controlling the drone and/or the camera assembly according to the control instruction found. A technical solution for drone-based interactive shooting is provided: the camera assembly automatically acquires captured images, and the control assembly automatically analyzes them to obtain the user action feature to be executed and interprets from it the control instruction the user needs, so that the user can directly control the drone's flight and the camera assembly's shooting through actions, realizing the shooting function, easily obtaining the desired shots on any occasion, and improving the user experience.

Description

Drone Interactive Shooting System and Method

Technical Field

The present invention relates to the technical field of drone control, and in particular to a drone interactive shooting system and method controlled directly by the user through actions.
Background Art

A drone, i.e. an unmanned aerial vehicle, is an unpiloted aircraft operated by radio remote control equipment and its own program control. With the rapid development of drone technology in recent years, drones have been widely applied in many fields.

Existing drone shooting falls into two broad categories, commercial aerial photography and personal entertainment self-portraits, both currently operated through a remote controller or an application on a handheld mobile device. When using a drone for personal self-portraits, however, the user often has to attend to both the drone and the remote controller at once, which is inconvenient. For example, when shooting a group photo at a party, the user may fail to capture his or her own face clearly because of the need to watch the application screen on the handheld device; when shooting a jumping action selfie, the remote controller in hand prevents a satisfying pose, degrading the shooting result.

In addition, given the drone's own weight and size, miniaturized selfie drones usually have little battery capacity and short endurance, which spoils the fun of shooting and fails to meet current user needs.
Summary of the Invention

In view of the problems in the prior art, the object of the present invention is to provide a drone interactive shooting system and method with which the user can directly control the drone's flight and the camera assembly's shooting through actions, realizing the shooting function and improving the shooting effect.

An embodiment of the present invention provides a drone interactive shooting system. The system includes a drone, a camera assembly, and a control assembly, one end of the camera assembly being rotatably connected to one side of the drone; the control assembly includes:

a control instruction library for storing preset mapping relationships between various user action features and various control instructions, the control instructions including drone control instructions and/or camera assembly control instructions;

an image processing module for processing the image captured by the camera assembly to obtain the user action feature to be executed in the captured image;

an instruction determination module for searching the control instruction library for the control instruction corresponding to the user action feature to be executed; and

an instruction execution module for controlling the drone and/or the camera assembly according to the control instruction found.
Optionally, the camera assembly includes a camera device and a camera bracket, the camera device being arranged in the camera bracket, and one end of the camera bracket being rotatably connected to one side of the drone;

the system further includes a display device, detachably or fixedly mounted at the other end of the camera bracket.

Optionally, the display device includes an array display screen and a first display control unit; the first display control unit acquires the image captured by the camera device and displays it on the array display screen.

Optionally, the display device includes a dot-matrix display screen and a second display control unit; the second display control unit acquires the control instruction found by the instruction determination module and controls the dot-matrix display screen to display user prompt information associated with the found control instruction.

Optionally, one end of the camera bracket is formed as a bump, and one side of the drone is provided with a groove matching the shape of the bump; the bump of the camera bracket is embedded in the groove of the drone;

the lower surface of the drone is a plane and includes a camera bracket corresponding area; the two side faces of the groove of the drone are perpendicular to the lower surface of the drone, and the bump of the camera bracket can rotate within the groove, so that the camera bracket can rotate within the angle range between being perpendicular to the lower surface of the drone and lying flat against the camera bracket corresponding area.

Optionally, the lower surface of the drone further includes a power storage device corresponding area that does not overlap the camera bracket corresponding area;

the system further includes a power storage device, detachably or fixedly mounted on the lower surface of the drone and fitting against the power storage device corresponding area.

Optionally, the camera bracket includes a first arm, a second arm, and a third arm; one side of the first arm is connected to the bump, and the other side of the first arm is provided with a first slot; one end of the second arm and one end of the third arm are respectively connected to the two ends of the first arm, the second arm and the third arm both being perpendicular to the first arm; the other end of the second arm is provided with a second slot, and the other end of the third arm with a third slot;

one side of the display device is inserted into the first slot, and the other side of the display device into the second slot and the third slot.
Optionally, the system further includes a voice acquisition device for acquiring the user's voice data;

the control instruction library is further used for storing preset mapping relationships between various voice keywords and various control instructions;

the control assembly further includes a voice processing module for extracting the voice keywords contained in the user's voice data;

the instruction determination module is further used for searching the control instruction library for the control instruction corresponding to the extracted voice keywords.

Optionally, the voice processing module is further used for acquiring the user's voiceprint feature from the user's voice data and determining whether it is a pre-stored designated voiceprint feature;

if the user's voiceprint feature is a preset permitted voiceprint feature, the instruction determination module extracts the voice keywords contained in the user's voice data and searches the control instruction library for the corresponding control instruction;

if the user's voiceprint feature is not a preset permitted voiceprint feature, the instruction determination module ignores the user's voiceprint feature and does not extract voice keywords.

Optionally, the image processing module is further used for acquiring the user's physiological features from the image captured by the camera assembly and determining whether they are pre-stored designated physiological features;

if the user's physiological features are pre-stored designated physiological features, the instruction determination module searches the control instruction library for the control instruction corresponding to the user action feature to be executed;

if the user's physiological features are not pre-stored designated physiological features, the instruction determination module ignores the user action feature to be executed and does not search for a control instruction.

Optionally, the drone control instructions include at least one of a drone translation instruction, a drone rotation instruction, a drone power-on instruction, and a drone power-off instruction; the camera assembly control instructions include at least one of a camera assembly rotation instruction, a shooting parameter adjustment instruction, a shooting start instruction, and a shooting stop instruction.
Optionally, the control instructions further include:

a first mode selection instruction, instructing the control assembly to enter a first mode in which the instruction determination module searches the control instruction library for the drone control instruction corresponding to the user action feature and controls the drone according to the drone control instruction found;

a second mode selection instruction, instructing the control assembly to enter a second mode in which the instruction determination module searches the control instruction library for the camera assembly control instruction corresponding to the user action feature and controls the camera assembly according to the camera assembly control instruction found.

Optionally, the control instructions further include:

a panoramic mode selection instruction, instructing the control assembly to enter a panoramic mode in which the instruction execution module controls the drone to move continuously at a preset speed within the (0, α) angle range, α being the preset maximum panoramic angle.

Optionally, in the panoramic mode, the instruction execution module controls the drone and the camera assembly to shoot a panoramic photo as follows:

the camera assembly detects the user's position;

taking the user's position as the starting point, the drone rotates α/n to one side in the same horizontal plane, where n is a first preset division value and n > 1;

the camera assembly starts shooting, and the drone rotates α toward the other side at a uniform preset speed in the same horizontal plane;

after the drone stops rotating, the camera assembly stops shooting.
Optionally, in the panoramic mode, the instruction execution module controls the drone and the camera assembly to shoot a panoramic photo as follows:

the instruction execution module calculates the distance L between the camera assembly and the user;

the instruction execution module selects a positioning point between the camera assembly and the user and, with the positioning point as the center and L/m as the radius, generates a first sector with angle α, the subject to be photographed lying on the arc of the first sector, where m is a second preset division value and m > 1;

the instruction execution module generates a second sector opposite the first sector, the two sides of the second sector being the reverse extensions of the two sides of the first sector, the radius of the second sector being (m-1)L/m and its angle α;

the camera assembly starts shooting, and the drone moves from one end of the arc of the second sector along the arc to its other end;

after the drone reaches the other end of the arc of the second sector, the camera assembly stops shooting.
Optionally, in the panoramic mode, the instruction execution module controls the drone and the camera assembly to shoot a panoramic photo as follows:

the instruction execution module calculates the distance L between the camera assembly and the user;

the instruction execution module selects a positioning point between the camera assembly and the user and, with the positioning point as the apex and L/m as the waist length, generates a first isosceles triangle with apex angle α, the subject to be photographed lying on the base of the first isosceles triangle, where m is a second preset division value and m > 1;

the instruction execution module generates a second isosceles triangle opposite the first, the two waists of the second isosceles triangle being the reverse extensions of the two waists of the first isosceles triangle, the waist length of the second isosceles triangle being (m-1)L/m and its apex angle α;

the camera assembly starts shooting, and the drone moves from one end of the base of the second isosceles triangle along the base to its other end;

after the drone reaches the other end of the base of the second isosceles triangle, the camera assembly stops shooting.
Optionally, the control instructions further include:

a third mode selection instruction, instructing the control assembly to enter a third mode in which the instruction execution module controls the camera assembly to shoot after a preset waiting time.

Optionally, the control instructions further include:

a fourth mode selection instruction, instructing the control assembly to enter a fourth mode in which the instruction execution module detects the user's position through the camera assembly and controls the drone and the camera assembly to move automatically according to the user's position, so that the camera assembly keeps shooting the user.

Optionally, in the fourth mode, the instruction execution module acquires the acceleration of the user's position change and, when this acceleration exceeds a preset acceleration threshold, sends an alarm signal to the outside.

Optionally, at least one distance sensor is further arranged on the drone, and the control assembly further includes an obstacle calculation module for acquiring the obstacle detection data of the distance sensor;

when the control instruction to be executed contains a drone movement instruction and the obstacle calculation module determines that the distance between the drone and an obstacle in the instruction's movement direction is less than a preset safety threshold, the obstacle calculation module cancels the drone movement instruction and sends a limit reminder signal to the outside.
The present invention further provides a drone interactive shooting method using the drone interactive shooting system described above, the method including the following steps:

the camera assembly acquires a captured image;

the image processing module processes the image captured by the camera assembly to obtain the user action feature to be executed in the captured image;

the instruction determination module searches the control instruction library for the control instruction corresponding to the user action feature to be executed; and

the instruction execution module controls the drone and/or the camera assembly according to the control instruction found.

Optionally, the control instructions further include a panoramic mode selection instruction, instructing the control assembly to enter a panoramic mode in which the instruction execution module controls the drone and the camera assembly to shoot a panoramic photo in the following steps:

the camera assembly detects the user's position;

taking the user's position as the starting point, the drone rotates α/n to one side in the same horizontal plane, where n is a first preset division value, n > 1, and α is the preset maximum panoramic angle;

the camera assembly starts shooting, and the drone rotates α toward the other side at a uniform preset speed in the same horizontal plane;

after the drone stops rotating, the camera assembly stops shooting.

Optionally, the control instructions further include a panoramic mode selection instruction, instructing the control assembly to enter a panoramic mode in which the instruction execution module controls the drone and the camera assembly to shoot a panoramic photo as follows:

the instruction execution module calculates the distance L between the camera assembly and the user;

the instruction execution module selects a positioning point between the camera assembly and the user and, with the positioning point as the center and L/m as the radius, generates a first sector with angle α, the subject to be photographed lying on the arc of the first sector, where m is a second preset division value, m > 1, and α is the preset maximum panoramic angle;

the instruction execution module generates a second sector opposite the first sector, the two sides of the second sector being the reverse extensions of the two sides of the first sector, the radius of the second sector being (m-1)L/m and its angle α;

the camera assembly starts shooting, and the drone moves from one end of the arc of the second sector along the arc to its other end;

after the drone reaches the other end of the arc of the second sector, the camera assembly stops shooting.

Optionally, the control instructions further include a panoramic mode selection instruction, instructing the control assembly to enter a panoramic mode in which the instruction execution module controls the drone and the camera assembly to shoot a panoramic photo as follows:

the instruction execution module calculates the distance L between the camera assembly and the user;

the instruction execution module selects a positioning point between the camera assembly and the user and, with the positioning point as the apex and L/m as the waist length, generates a first isosceles triangle with apex angle α, the subject to be photographed lying on the base of the first isosceles triangle, where m is a second preset division value, m > 1, and α is the preset maximum panoramic angle;

the instruction execution module generates a second isosceles triangle opposite the first, the two waists of the second isosceles triangle being the reverse extensions of the two waists of the first isosceles triangle, the waist length of the second isosceles triangle being (m-1)L/m and its apex angle α;

the camera assembly starts shooting, and the drone moves from one end of the base of the second isosceles triangle along the base to its other end;

after the drone reaches the other end of the base of the second isosceles triangle, the camera assembly stops shooting.
The drone interactive shooting system and method provided by the present invention have the following advantage:

The present invention provides a technical solution in which the user controls directly through actions: the camera assembly automatically acquires the captured image, and the control assembly automatically analyzes it to obtain the user action feature to be executed and interprets from it the control instruction the user needs. The user can thus directly control the drone's flight and the camera assembly's shooting through actions, realizing the shooting function, easily obtaining the desired shots on any occasion, and improving the user experience.
Brief Description of the Drawings

Other features, objects, and advantages of the present invention will become more apparent from the detailed description of non-limiting embodiments made with reference to the following drawings:

FIG. 1 is a structural block diagram of a drone interactive shooting system according to an embodiment of the present invention;

FIG. 2 is a schematic structural diagram of a drone interactive shooting system using an array display screen according to an embodiment of the present invention;

FIG. 3 is a schematic structural diagram of a drone interactive shooting system using a dot-matrix display screen according to an embodiment of the present invention;

FIG. 4 is a schematic diagram of adjusting the drone's position according to an embodiment of the present invention;

FIG. 5 is a schematic diagram of adjusting the camera assembly's angle according to an embodiment of the present invention;

FIGS. 6-7 are schematic diagrams of gesture control according to an embodiment of the present invention;

FIG. 8 is a schematic structural diagram using an external display device according to an embodiment of the present invention;

FIG. 9 is a schematic structural diagram of the display device when folded according to an embodiment of the present invention;

FIG. 10 is a schematic bottom view of the drone when not in use according to an embodiment of the present invention;

FIG. 11 is a schematic structural diagram of the power storage device according to an embodiment of the present invention;

FIG. 12 is a schematic diagram of the drone's state while charging according to an embodiment of the present invention;

FIG. 13 is a flowchart of the drone charging process according to an embodiment of the present invention;

FIG. 14 is a schematic diagram of controlling the drone's position by voice according to an embodiment of the present invention;

FIG. 15 is a schematic structural diagram of a drone interactive shooting system with added voice control according to an embodiment of the present invention;

FIG. 16 is a flowchart of user voiceprint verification according to an embodiment of the present invention;

FIG. 17 is a flowchart of user physiological feature verification according to an embodiment of the present invention;

FIGS. 18-20 are flowcharts of a drone interactive shooting method according to an embodiment of the present invention;

FIG. 21 is a flowchart of panoramic shooting according to an embodiment of the present invention;

FIG. 22 is a schematic diagram of the drone rotating during panoramic shooting according to an embodiment of the present invention;

FIG. 23 is a schematic diagram of the drone moving along an arc trajectory during panoramic shooting according to an embodiment of the present invention;

FIG. 24 is a schematic diagram of the drone moving along a straight trajectory during panoramic shooting according to an embodiment of the present invention;

FIG. 25 is a flowchart of the drone automatically tracking the user's position according to an embodiment of the present invention;

FIG. 26 is a flowchart of automatic obstacle avoidance by the drone according to an embodiment of the present invention.
Detailed Description

Example embodiments will now be described more fully with reference to the accompanying drawings. The example embodiments can, however, be implemented in many forms and should not be construed as limited to the embodiments set forth here; rather, these embodiments are provided so that this disclosure will be thorough and complete and will fully convey the concepts of the example embodiments to those skilled in the art. The same reference numerals in the drawings denote the same or similar structures, and their repeated description will be omitted.
As shown in FIG. 1, an embodiment of the present invention provides a drone interactive shooting system. The system includes a drone 200, a camera assembly 300, and a control assembly 100, one end of the camera assembly 300 being rotatably connected to one side of the drone 200. The control assembly 100 includes: a control instruction library 110 for storing preset mapping relationships between various user action features and various control instructions, the control instructions including drone control instructions and/or camera assembly control instructions; an image processing module 120 for processing the image captured by the camera assembly 300 to obtain the user action feature to be executed in the captured image; an instruction determination module 130 for searching the control instruction library for the control instruction corresponding to the user action feature to be executed; and an instruction execution module 140 for controlling the drone 200 and/or the camera assembly 300 according to the control instruction found.

The user action feature here is preferably a user gesture, i.e. different gestures yield different control instructions. In practice, other user action features can also be used, such as the user's gaze, a nod, a head shake, or a laugh; for example, the system can be set to shoot when it captures the user laughing, thereby automatically capturing the user's smile, and so on. The embodiments below are mostly described in terms of gesture control, but it should be understood that using other user action features also falls within the scope of protection of the present invention.

FIG. 2 is a schematic structural diagram of a drone interactive shooting system according to an embodiment of the present invention. It shows a drone 200 with a camera assembly 300 rotatably mounted on one side. The camera assembly 300 includes a camera device 320 and a camera bracket 310, the camera device 320 being arranged in the camera bracket 310 and one end of the camera bracket 310 being rotatably connected to one side of the drone 200. Further, the system may include a display device 330, detachably or fixedly mounted at the other end of the camera bracket 310.

To facilitate control of the drone 200 and/or the camera assembly 300, the control assembly 100 may be arranged inside the drone 200, on the surface of the drone 200, or elsewhere, all of which fall within the scope of protection of the present invention. The instruction execution module 140 may communicate directly with the controller of the drone 200 and may also communicate wirelessly with the camera assembly 300 to transmit and feed back control instructions.

The display device 330 can display content for the user to view as needed; FIGS. 2 and 3 show two configurations of the display device 330.

The display device 330 shown in FIG. 2 includes an array display screen and a first display control unit; the first display control unit acquires the image captured by the camera device 320 and displays it on the array display screen. The array display screen may include, but is not limited to, a color LCD screen, through which the user can observe the selfie picture in real time.

The display device 330 shown in FIG. 3 includes a dot-matrix display screen and a second display control unit; the second display control unit acquires the control instruction found by the instruction determination module 130 and controls the dot-matrix display screen to display user prompt information associated with the found control instruction. The dot-matrix display screen may include, but is not limited to, a dot-matrix LED screen, whose LED light patterns help the user prepare for and take the selfie.

For example, the user prompt information may be a selfie countdown: with a five-second countdown, the dot-matrix display screen shows 5, 4, 3, 2, 1 in turn so that the user can get ready. The prompt information may also indicate the current shooting mode, e.g. displaying 2 to indicate the second mode, and so on.

With the drone interactive shooting system of the present invention, the camera assembly automatically acquires captured images, and the control assembly automatically analyzes them to obtain the user action feature to be executed and interprets the control instruction the user needs, whereby the user can control the drone 200 and/or the camera assembly 300.

When controlling the drone 200 and/or the camera assembly 300, the drone control instructions may include at least one of a drone translation instruction, a drone rotation instruction, a drone power-on instruction, and a drone power-off instruction; the camera assembly control instructions may include at least one of a camera assembly rotation instruction, a shooting parameter adjustment instruction, a shooting start instruction, and a shooting stop instruction. The adjustable shooting parameters may include focus, fill light, image size, and so on.
FIG. 4 is a schematic diagram of adjusting the position of the drone 200 according to an embodiment of the present invention. The position can be adjusted in the following steps:

a. after the drone 200 starts and takes off, it hovers at an initial position;

b. the user 400 observes the selfie angle on the display device 330 and finds the portrait off to the left (the portrait shown in dashed lines in FIG. 4); through a gesture (from the dashed-line to the solid-line hand state of the user 400 in FIG. 4), the user 400 controls the drone to move left until the portrait is centered (the portrait shown in solid lines in FIG. 4);

c. when the shooting conditions are met, the user 400 triggers shooting by gesture.

FIG. 5 is a schematic diagram of adjusting the angle of the camera assembly 300 according to an embodiment of the present invention. The adjustment can proceed as follows:

a. after the drone 200 starts and takes off, it hovers at an initial position;

b. the user 400 observes the selfie angle on the display device 330 and finds the drone 200 too high, with the portrait too low (the portrait shown in dashed lines in FIG. 5); through a gesture, the user controls the camera assembly 300 to tilt downward, driving the camera device 320 to tilt downward until the portrait is centered (the portrait shown in solid lines in FIG. 5);

c. when the shooting conditions are met, the user 400 triggers shooting by gesture.

In addition, the way of controlling the drone 200 and the camera assembly 300 can be chosen flexibly; for example, in FIG. 5, when the portrait is too low, the adjustment can also be made by lowering the height of the drone 200 until the portrait is centered.
Specifically, the adjustments to the drone 200 and the camera assembly 300 can be distinguished by different preset gesture instructions. That is, once a gesture is recognized, it is known whether its control object is the drone 200 or the camera assembly 300, and which action of that object the gesture controls.

FIGS. 6 and 7 show one mapping relationship between user action features and control instructions. The control instructions corresponding to the various gestures are listed in Table 1 below.

Table 1. Gesture-to-control-instruction mapping

Figure PCTCN2017080738-appb-000001

Figure PCTCN2017080738-appb-000002

FIGS. 6 and 7 give only one example of gesture control. In practice, the user can also customize the mapping between different gestures and different control instructions, modifying the gestures to match his or her own habits. Other action features can also be added; for example, a nod confirms the shot, a head shake deletes the previous captured image, and so on.
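The customizable mapping from recognized gestures to control instructions just described behaves like a lookup table. A minimal Python sketch of this idea; the gesture names, control targets, and action strings below are illustrative assumptions, not values taken from the patent:

```python
# Hypothetical control-instruction library: maps a recognized user
# action feature to (control object, action). Entries are invented
# for illustration only.
DEFAULT_GESTURE_MAP = {
    "palm_left": ("drone", "move_left"),
    "palm_right": ("drone", "move_right"),
    "palm_down": ("drone", "move_down"),
    "fist": ("camera", "start_shooting"),
    "nod": ("camera", "confirm_shot"),
    "head_shake": ("camera", "delete_last_image"),
}

class ControlInstructionLibrary:
    def __init__(self, mapping=None):
        # Start from the defaults; users may override entries to
        # match their own habits, as the text describes.
        self.mapping = dict(DEFAULT_GESTURE_MAP)
        if mapping:
            self.mapping.update(mapping)

    def customize(self, gesture, target, action):
        """Remap a gesture to a user-defined instruction."""
        self.mapping[gesture] = (target, action)

    def lookup(self, gesture):
        """Return (target, action), or None for unknown gestures."""
        return self.mapping.get(gesture)

lib = ControlInstructionLibrary()
lib.customize("palm_down", "camera", "tilt_down")  # user override
print(lib.lookup("palm_down"))  # ('camera', 'tilt_down')
print(lib.lookup("fist"))       # ('camera', 'start_shooting')
```

The lookup result tells the instruction execution module both which object to drive and which action to perform, matching the two-part decision described above.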
FIGS. 8 and 9 show two configurations of the camera assembly 300. As shown in FIG. 8, the camera assembly 300 uses an external display device 340, which may further be the user's mobile terminal; the external display device 340 and the control assembly 100 can communicate wirelessly or via a USB or similar data cable. One end of the camera bracket 310 is formed as a bump 311, and one side of the drone 200 is provided with a groove 210 matching the shape of the bump; the bump 311 of the camera bracket is embedded in the groove 210 of the drone.

In the configuration of FIG. 8, the camera bracket 310 includes a first arm 312, a second arm 313, and a third arm 314. One side of the first arm 312 is connected to the bump 311, and the other side of the first arm 312 is provided with a first slot; one end of the second arm 313 and one end of the third arm 314 are respectively connected to the two ends of the first arm 312, both arms being perpendicular to the first arm 312; the other end of the second arm 313 is provided with a second slot, and the other end of the third arm 314 with a third slot. The external display device 340 can be placed in the camera bracket 310 with its upper end inserted into the first slot and its lower end into the second and third slots, forming a stable and easily detachable connection between the external display device 340 and the camera bracket 310.

In the embodiment of FIG. 9, the display device 330 is built in. Likewise, in this configuration the camera bracket 310 rotates through the cooperation of the bump 311 and the groove 210, and the display device 330 rotates with it. The lower surface of the drone 200 is a plane and includes a camera bracket corresponding area 220; the two side faces of the groove 210 are perpendicular to the lower surface of the drone 200, so that the bump 311 can rotate up and down in the groove 210 and the camera bracket 310 can rotate within the angle range between being perpendicular to the lower surface of the drone 200 and lying flat against the camera bracket corresponding area 220. As described above, during use the camera assembly 300 can be adjusted within the required angle range for a better shot; when use is finished or the drone's battery is exhausted, the camera bracket 310 can be folded flat against the camera bracket corresponding area 220 for convenient carrying.

In addition, since drones 200 are generally small, with small battery capacity and short endurance, an embodiment of the present invention further provides a convenient charging scheme. As shown in FIGS. 10-12, the lower surface of the drone 200 further includes a power storage device corresponding area 230 that does not overlap the camera bracket corresponding area 220; the system further includes a power storage device 500, detachably or fixedly mounted on the lower surface of the drone 200 and fitting against the power storage device corresponding area.

FIG. 13 is a flowchart of charging with this structure. When the display is an external display screen, the connection between the external display screen and the drone is first disconnected; the screen can be removed or left on the camera bracket 310 and folded with it. If the power storage device 500 is inserted at this point, charging begins; otherwise the drone simply powers off. To keep the drone 200 light in flight, the power storage device 500 is removed while the drone 200 has power and is in use; when the drone 200 is not in use or its battery is exhausted, the camera bracket 310 can be folded and the power storage device 500 mounted in its corresponding area. The power storage device 500 is connected to the drone's rechargeable battery to perform charging. Folded for charging, the drone 200 occupies a smaller volume and is easy to carry, and it can be used again once charging is complete.
As shown in FIGS. 14 and 15, an embodiment of the present invention may further include a voice acquisition device 600 for acquiring the user's voice data; the control instruction library 110 is further used for storing preset mapping relationships between various voice keywords and various control instructions; the control assembly 100 further includes a voice processing module 150 for extracting the voice keywords contained in the user's voice data; and the instruction determination module 130 is further used for searching the control instruction library for the control instruction corresponding to the extracted voice keywords.

With the voice acquisition device 600, this embodiment also lets the user control shooting by voice. For example, if the keyword "power on" is set to turn on the camera assembly 300, the camera assembly 300 is turned on automatically when the words "power on" are detected in the user's voice data; if "drone" and "move to the left" are detected, the drone is automatically controlled to move left. Voice control is convenient, is not constrained by other conditions, and can be used on any occasion without affecting the shooting result.

Further, as shown in FIG. 16, since the control assembly 100 may receive other people's voices or environmental noise while the user is using voice control, different sounds must be distinguished. That is, the voice processing module is further used for acquiring the user's voiceprint feature from the voice data and determining whether it is a pre-stored designated voiceprint feature.

If the user's voiceprint feature is a preset permitted voiceprint feature, the voice data comes from the designated user and can be acted on: the instruction determination module extracts the voice keywords contained in the voice data and searches the control instruction library for the corresponding control instruction. If it is not, the voice data does not come from the designated user and must be screened out; that is, the instruction determination module ignores it and does not extract voice keywords.

Likewise, as shown in FIG. 17, the camera assembly 300 may also capture action features of people other than the designated user. To avoid confusion, the image processing module is further used for acquiring the user's physiological features from the image captured by the camera assembly and determining whether they are pre-stored designated physiological features.

If the user's physiological features are pre-stored designated physiological features, the captured action is that of the designated user, and the instruction determination module searches the control instruction library for the control instruction corresponding to the user action feature to be executed; if not, the instruction determination module ignores the user action feature to be executed and does not search for a control instruction.

Here, the user's physiological features may refer to the user's facial contours, hair color, hair length, skin color, lip color, and so on; several physiological features may also be combined for more accurate identification. All such variants fall within the scope of protection of the present invention.
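The voiceprint screening described above can be sketched as a gate placed in front of keyword extraction: only the designated speaker's voice data proceeds to the instruction lookup. The speaker identifier, keyword list, and command names below are illustrative assumptions, and `voiceprint_of` stands in for a real speaker-identification component:

```python
# Only voice data whose voiceprint matches the pre-stored designated
# voiceprint is allowed to produce a control command.
ALLOWED_VOICEPRINT = "user-42"          # assumed identifier
KEYWORD_COMMANDS = {                    # assumed keyword mapping
    "power on": "camera_on",
    "move to the left": "drone_move_left",
}

def voiceprint_of(voice_data):
    # Placeholder: a real system would run speaker identification
    # on the raw audio rather than read a field.
    return voice_data.get("speaker")

def process_voice(voice_data):
    """Return a control command, or None if the speaker is not the
    designated user or no known keyword is present."""
    if voiceprint_of(voice_data) != ALLOWED_VOICEPRINT:
        return None  # screen out non-designated speakers
    text = voice_data.get("text", "").lower()
    for keyword, command in KEYWORD_COMMANDS.items():
        if keyword in text:
            return command
    return None

print(process_voice({"speaker": "user-42",
                     "text": "Drone, move to the left"}))
# drone_move_left
print(process_voice({"speaker": "stranger", "text": "power on"}))
# None
```

The same gate structure applies to the physiological-feature check on captured images: verify the designated user first, then look up the instruction.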
As shown in FIG. 18, an embodiment of the present invention further provides a drone interactive shooting method using the drone interactive shooting system described above, including the following steps:

S100: the camera assembly acquires a captured image;

S200: the image processing module processes the image captured by the camera assembly to obtain the user action feature to be executed in the captured image;

S300: the instruction determination module searches the control instruction library for the control instruction corresponding to the user action feature to be executed; and

S400: the instruction execution module controls the drone and/or the camera assembly according to the control instruction found.

When the control instructions include drone control instructions, camera assembly control instructions, or other valid instructions, the determination process may follow the flow shown in FIG. 19, performing the determinations and control in sequence; however, it is not limited to this order, and other orders, such as first determining whether the instruction is a camera assembly control instruction and then whether it is a drone control instruction, also fall within the scope of protection of the present invention.

FIG. 20 shows a specific embodiment of the drone interactive shooting method. First, the display type is determined; if it is an external display, the control assembly must first connect to it via wireless communication in preparation for subsequent control. Then the corresponding control instruction is looked up according to the gesture-to-instruction mapping and executed. As noted above, the action features of the present invention are not limited to gestures; different actions of other body parts can also achieve the object of the present invention.

As described above, drone control instructions and camera assembly control instructions can be distinguished by different action features. They can also be distinguished by different control modes; for example, the control instructions may further include a first mode selection instruction and a second mode selection instruction, instructing the control assembly to enter the first mode and the second mode respectively.

After entering the first mode, received user action features default to drone control instructions: the instruction determination module searches the control instruction library for the corresponding drone control instruction and controls the drone accordingly, and camera assembly control instructions are no longer executed. After entering the second mode, received user action features default to camera assembly control instructions: the instruction determination module searches the control instruction library for the corresponding camera assembly control instruction and controls the camera assembly accordingly, and drone control instructions are no longer executed.

This reduces the number of action features the user must define. For example, the same gesture of an open palm moving downward means, in the first mode, that the drone moves downward and, in the second mode, that the camera assembly tilts downward. Only one specific embodiment is given here; the scope of protection of the present invention is not limited thereto.
Further, owing to the smoothness and controllability of the drone in flight, it has some irreplaceable advantages over shooting with a hand-held camera: the drone can take photos with less jitter, lowering the anti-shake requirements on the camera device. When taking a panoramic photo with a camera in hand, the user often fails to obtain an ideal panorama because of jitter or other interference; the drone overcomes this problem.

As shown in FIGS. 21 and 22, the control instructions further include a panoramic mode selection instruction, instructing the control assembly to enter a panoramic mode in which the instruction execution module controls the drone to rotate continuously at a preset speed within the (0, α) angle range, α being the preset maximum panoramic angle. Optionally, in the panoramic mode, the instruction execution module controls the drone and the camera assembly to shoot a panoramic photo as follows:

the camera assembly detects the position of the user 400;

taking the position of the user 400 as the starting point, the drone 200 rotates α/n to one side in the same horizontal plane; this is the drone's positioning stage, during which no shooting is performed, and n is the first preset division value;

the camera assembly starts shooting, and the drone 200 rotates α toward the other side at a uniform preset speed in the same horizontal plane, producing a panoramic photo spanning the angle α with the user at the designated position;

after the drone 200 stops rotating, the camera assembly stops shooting.

When n is 2, the user is placed at the center of the panoramic photo. In practice, the angle α can be set as needed, and the user's position within the panorama can also be adjusted; for example, to place the user toward the left, the drone is first rotated α/4 to one side. This shooting mode is very flexible, the success rate of panoramic shots is high, and the resulting photos are better.
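The rotation scheme above reduces to simple angle arithmetic: pre-rotate α/n to one side without shooting, then sweep α to the other side while shooting, which leaves the user at a fraction 1/n of the way into the panorama. A sketch, under the assumptions that angles are in degrees and the user initially sits at heading 0:

```python
def panorama_rotation(alpha, n):
    """Return (start_heading, end_heading, user_offset) for the
    rotation-based panorama: pre-rotate alpha/n to one side, then
    sweep alpha to the other side while shooting. user_offset is
    the user's position within the final panorama as a fraction
    measured from the starting edge (0.5 = centered)."""
    if n <= 1:
        raise ValueError("n must be > 1")
    start = alpha / n       # positioning stage, no shooting yet
    end = start - alpha     # sweep alpha toward the other side
    user_offset = 1 / n     # user sits alpha/n into the alpha sweep
    return start, end, user_offset

print(panorama_rotation(180, 2))  # (90.0, -90.0, 0.5): user centered
print(panorama_rotation(180, 4))  # (45.0, -135.0, 0.25): user at left quarter
```

With n = 2 the user lands exactly in the middle of the sweep, matching the text; n = 4 corresponds to the "rotate α/4 first" example for an off-center user.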
FIG. 23 shows another panoramic shooting mode. In the panoramic mode, the instruction execution module controls the drone and the camera assembly to shoot a panoramic photo as follows:

the instruction execution module calculates the distance L between the camera assembly and the user 400, i.e. the distance shown by the dashed line connecting the user 400 and the drone 200 in the figure;

the instruction execution module selects a positioning point between the camera assembly and the user 400 and, with the positioning point as the center and L/m as the radius, generates a first sector 701 with angle α, the subject lying on the arc of the first sector 701, where m is the second preset division value, m > 1, and α is the preset maximum panoramic angle;

the instruction execution module generates a second sector 702 opposite the first sector 701; the two sides of the second sector 702 are the reverse extensions of the two sides of the first sector 701, and the radius of the second sector 702 is (m-1)L/m, with angle α;

the camera assembly starts shooting, and the drone 200 moves from one end of the arc of the second sector 702 along the arc to its other end;

after the drone 200 reaches the other end of the arc of the second sector 702, the camera assembly stops shooting.
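The arc trajectory can be parameterized directly from the quantities in the text: the drone keeps a distance of (m-1)L/m from the positioning point and sweeps the angle α on the side opposite the user's sector. A sketch in planar coordinates; the coordinate convention (the user-side sector opening toward +x, so the drone's arc opens toward -x) is an assumption for illustration:

```python
import math

def arc_waypoints(L, m, alpha_deg, center=(0.0, 0.0), steps=5):
    """Waypoints along the arc of the second sector. The positioning
    point `center` lies between user and camera; the drone stays at
    radius (m-1)/m * L from it, on the side opposite the user's
    sector, sweeping the full angle alpha."""
    if m <= 1:
        raise ValueError("m must be > 1")
    r = (m - 1) * L / m
    half = math.radians(alpha_deg) / 2
    pts = []
    for i in range(steps + 1):
        theta = -half + i * (2 * half) / steps
        # Minus signs: the drone arc points away from the user side.
        pts.append((center[0] - r * math.cos(theta),
                    center[1] - r * math.sin(theta)))
    return pts

wps = arc_waypoints(L=4.0, m=4, alpha_deg=90)
print(len(wps))                         # 6
print(round(math.hypot(*wps[0]), 6))    # 3.0, i.e. (m-1)L/m
```

Feeding these waypoints to the flight controller at a uniform rate reproduces the constant-speed sweep described above.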
FIG. 24 shows a further panoramic shooting mode. In the panoramic mode, the instruction execution module controls the drone and the camera assembly to shoot a panoramic photo as follows:

the instruction execution module calculates the distance L between the camera assembly and the user 400, i.e. the distance shown by the dashed line connecting the user 400 and the drone 200 in the figure;

the instruction execution module selects a positioning point between the camera assembly and the user 400 and, with the positioning point as the apex and L/m as the waist length, generates a first isosceles triangle 703 with apex angle α, the subject lying on the base of the first isosceles triangle 703, where m is the second preset division value, m > 1, and α is the preset maximum panoramic angle;

the instruction execution module generates a second isosceles triangle 704 opposite the first isosceles triangle 703; the two waists of the second isosceles triangle 704 are the reverse extensions of the two waists of the first isosceles triangle 703, and the waist length of the second isosceles triangle 704 is (m-1)L/m, with apex angle α;

the camera assembly starts shooting, and the drone 200 moves from one end of the base of the second isosceles triangle 704 along the base to its other end;

after the drone 200 reaches the other end of the base of the second isosceles triangle 704, the camera assembly stops shooting.
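The straight trajectory of FIG. 24 is fixed by the base of the second isosceles triangle: its two endpoints lie at waist length (m-1)L/m from the apex, opposite the user's triangle. A sketch computing the endpoints; the planar coordinate convention (user-side triangle opening toward +x) is again an assumption:

```python
import math

def line_endpoints(L, m, alpha_deg, apex=(0.0, 0.0)):
    """Endpoints of the straight shooting path, i.e. the base of the
    second isosceles triangle whose waists of length (m-1)/m * L
    extend opposite the user's triangle."""
    if m <= 1:
        raise ValueError("m must be > 1")
    w = (m - 1) * L / m
    half = math.radians(alpha_deg) / 2
    p1 = (apex[0] - w * math.cos(half), apex[1] - w * math.sin(half))
    p2 = (apex[0] - w * math.cos(half), apex[1] + w * math.sin(half))
    return p1, p2

a, b = line_endpoints(L=4.0, m=2, alpha_deg=60)
print(round(math.dist(a, b), 6))  # 2.0, the base length 2*w*sin(alpha/2)
```

Moving at constant speed between the two endpoints gives the linear pass of FIG. 24; like the arc variant, different m and α trade camera distance against covered angle.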
图23和图24中的拍摄轨迹可以根据需要进行选择,通过持续拍摄形成全景照片,或通过拍摄多张照片合成为一张全景照片,m和α的不同选择可以获得不同的拍摄范围,更具有灵活性。无人机可以根据计算得到的预设轨迹移动,使摄像组件获取不同的拍摄位置和拍摄角度。
在使用无人机进行拍摄时,有时需要一定的准备时间,例如可以设置拍摄倒计时,即所述控制指令还可以包括第三模式选择指令,指示所述控制组件进入第三模式,在所述第三模式下,所述指令执行模块控制所述摄像组件在预设等待时间后进行拍摄。倒计时过程中,可以通过显示设备显示倒计时时间,也可以通过其他显示灯或提示音指示剩余准备时间。
As shown in Fig. 25, the drone of the present invention can also implement automatic user tracking. The control instructions may further include a fourth mode selection instruction, which instructs the control assembly to enter a fourth mode in which the instruction execution module detects the user's position through the camera assembly and controls the drone and the camera assembly to move automatically according to the user's position, so that the camera assembly keeps shooting the user, thereby tracking the user automatically and ensuring that the user always remains within the shooting range.
Optionally, in the fourth mode, the instruction execution module may also obtain the acceleration of the user's position change, and when this acceleration exceeds a preset acceleration threshold, the instruction execution module issues an alarm signal to the outside. On the one hand, when the camera assembly cannot capture the user's position, the alarm reminds the user to return to the camera assembly's shooting range; on the other hand, this also enables fall detection: when the user falls accidentally or faints, an alarm signal is issued automatically, and if the user does not cancel it within a certain time, the mobile terminal of another associated user can be notified or an emergency call placed. This provides high-quality shooting while also ensuring the user's safety during use.
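The acceleration check above can be sketched with a finite-difference estimate over three successive position samples. This is a minimal illustration under stated assumptions: the sampling interval, 2-D positions, and threshold value are hypothetical, and a real implementation would filter noisy position estimates before thresholding.

```python
def acceleration_exceeds(p0, p1, p2, dt, threshold):
    """Estimate |acceleration| from three (x, y) samples and test the threshold.

    Uses the central finite difference a ~= (p2 - 2*p1 + p0) / dt^2 per axis,
    then compares the magnitude against the preset alarm threshold.
    """
    ax = (p2[0] - 2 * p1[0] + p0[0]) / dt ** 2
    ay = (p2[1] - 2 * p1[1] + p0[1]) / dt ** 2
    return (ax ** 2 + ay ** 2) ** 0.5 > threshold
```

A sudden downward jump between otherwise steady samples trips the alarm, while a stationary or smoothly moving user does not.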
As shown in Fig. 26, further, considering that the user may cause the drone to hit an obstacle through a misjudged distance or an erroneous operation, to protect the drone itself at least one distance sensor may be provided on the drone, and the control assembly further includes an obstacle calculation module for obtaining obstacle detection data from the distance sensor;
when the control instruction to be executed contains a drone movement instruction and the obstacle calculation module determines that the distance between the drone and an obstacle in the movement direction of that instruction is less than a preset safety threshold, the drone movement instruction is cancelled and a limit warning signal is issued to the outside. That is, having learned from the distance sensors which nearby obstacles might be hit, the obstacle calculation module predicts, from the direction indicated by the control instruction, whether executing the drone movement instruction could cause a collision; if so, the instruction is not executed, and the user is warned that the distance is already below the limit and there is a risk of hitting an obstacle.
This implementation is particularly suitable for indoor shooting. Indoors, the drone is constrained by walls and ceilings and surrounded by many obstacles such as furniture and ornaments; with reliable calculation and risk prediction, this approach guarantees the safety of indoor shooting. It likewise applies outdoors: in open outdoor spaces the drone may move fast and the user cannot well anticipate impending danger, so this approach ensures the stability and reliability of the interactive shooting process.
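The veto logic of the obstacle calculation module can be sketched as a single check per movement instruction. The direction keys, distance units, and tuple return value below are illustrative assumptions, not patent text.

```python
def vet_move_instruction(direction, sensor_distances, safety_threshold):
    """Return (execute, warn) for a drone movement instruction.

    sensor_distances maps a movement direction to the measured distance (m)
    to the nearest obstacle in that direction; a missing key means no
    obstacle was detected there.
    """
    distance = sensor_distances.get(direction, float("inf"))
    if distance < safety_threshold:
        return False, True   # cancel the instruction and emit a limit warning
    return True, False       # safe to execute, no warning

ok, warn = vet_move_instruction("forward", {"forward": 0.4}, safety_threshold=0.5)
```

Here a forward move with an obstacle 0.4 m ahead and a 0.5 m safety threshold is cancelled with a warning, while a move in an obstacle-free direction is executed silently.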
Compared with the prior art, the present invention provides a technical solution in which the user exercises control directly through actions: the camera assembly automatically acquires captured images, the control assembly automatically analyzes them to obtain the user action feature to be executed, and the control instruction the user requires is interpreted from that feature. The user can thus directly control the drone's flight and the camera assembly's shooting through actions, achieving satisfactory shooting with ease on any occasion and improving the user experience.
The above is a further detailed description of the present invention in conjunction with specific preferred embodiments, and the specific implementation of the present invention shall not be deemed limited to these descriptions. For those of ordinary skill in the art to which the present invention belongs, several simple deductions or substitutions may be made without departing from the concept of the present invention, and all shall be regarded as falling within the scope of protection of the present invention.

Claims (24)

  1. A drone interactive shooting system, characterized in that the system comprises a drone, a camera assembly, and a control assembly, one end of the camera assembly being rotatably connected to one side of the drone; wherein the control assembly comprises:
    a control instruction library for storing preset mappings between various user action features and various control instructions, the control instructions comprising drone control instructions and/or camera assembly control instructions;
    an image processing module for processing images captured by the camera assembly to obtain the user action feature to be executed in the captured image;
    an instruction determination module for looking up the corresponding control instruction in the control instruction library according to the user action feature to be executed; and
    an instruction execution module for controlling the drone and/or the camera assembly according to the found control instruction.
  2. The drone interactive shooting system according to claim 1, characterized in that the camera assembly comprises a camera device and a camera bracket, the camera device being disposed in the camera bracket and one end of the camera bracket being rotatably connected to one side of the drone;
    the system further comprises a display device detachably or fixedly mounted at the other end of the camera bracket.
  3. The drone interactive shooting system according to claim 2, characterized in that the display device comprises an array display screen and a first display control unit; the first display control unit obtains images captured by the camera device and displays them on the array display screen.
  4. The drone interactive shooting system according to claim 2, characterized in that the display device comprises a dot-matrix display screen and a second display control unit; the second display control unit obtains the control instruction found by the instruction determination module and controls the dot-matrix display screen to display user prompt information associated with the found control instruction.
  5. The drone interactive shooting system according to claim 2, characterized in that one end of the camera bracket is formed as a projection, and one side of the drone is provided with a recess matching the shape of the projection; the projection of the camera bracket is embedded in the recess of the drone;
    the lower surface of the drone is a plane comprising a camera bracket corresponding area, the two side faces of the recess of the drone are perpendicular to the lower surface of the drone, and the projection of the camera bracket is rotatable in the recess of the drone, so that the camera bracket can rotate within the angle range from being perpendicular to the lower surface of the drone to lying flush against the camera bracket corresponding area.
  6. The drone interactive shooting system according to claim 5, characterized in that the lower surface of the drone further comprises a power storage device corresponding area that does not overlap the camera bracket corresponding area;
    the system further comprises a power storage device detachably or fixedly mounted on the lower surface of the drone and lying flush against the power storage device corresponding area.
  7. The drone interactive shooting system according to claim 5, characterized in that the camera bracket comprises a first arm, a second arm, and a third arm; one side of the first arm is connected to the projection, and the other side of the first arm is provided with a first slot; one end of the second arm and one end of the third arm are connected to the two ends of the first arm respectively, the second arm and the third arm both being perpendicular to the first arm; the other end of the second arm is provided with a second slot, and the other end of the third arm is provided with a third slot;
    one side of the display device is inserted into the first slot, and the other side of the display device is inserted into the second slot and the third slot.
  8. The drone interactive shooting system according to claim 1, characterized in that it further comprises a voice acquisition device for acquiring the user's voice data;
    the control instruction library is further used for storing preset mappings between various voice keywords and various control instructions;
    the control assembly further comprises a voice processing module for extracting voice keywords contained in the user's voice data;
    the instruction determination module is further used for looking up the corresponding control instruction in the control instruction library according to the extracted voice keyword.
  9. The drone interactive shooting system according to claim 8, characterized in that the voice processing module is further used for obtaining the user's voiceprint feature from the user's voice data and determining whether the user's voiceprint feature is a preset permitted voiceprint feature;
    if the user's voiceprint feature is a preset permitted voiceprint feature, the instruction determination module extracts the voice keywords contained in the user's voice data and looks up the corresponding control instruction in the control instruction library according to the extracted keyword;
    if the user's voiceprint feature is not a preset permitted voiceprint feature, the instruction determination module ignores the user's voice data and performs no voice keyword extraction.
  10. The drone interactive shooting system according to claim 1, characterized in that the image processing module is further used for obtaining the user's physiological feature from the image captured by the camera assembly and determining whether the user's physiological feature is a pre-stored designated physiological feature;
    if the user's physiological feature is a pre-stored designated physiological feature, the instruction determination module looks up the corresponding control instruction in the control instruction library according to the user action feature to be executed;
    if the user's physiological feature is not a pre-stored designated physiological feature, the instruction determination module ignores the user action feature to be executed and performs no instruction lookup.
  11. The drone interactive shooting system according to claim 1, characterized in that the drone control instructions comprise at least one of a drone translation instruction, a drone rotation instruction, a drone power-on instruction, and a drone power-off instruction; the camera assembly control instructions comprise at least one of a camera assembly rotation instruction, a shooting parameter adjustment instruction, a shooting start instruction, and a shooting stop instruction.
  12. The drone interactive shooting system according to claim 1, characterized in that the control instructions further comprise:
    a first mode selection instruction, instructing the control assembly to enter a first mode in which the instruction determination module looks up the corresponding drone control instruction in the control instruction library according to the user action feature and controls the drone according to the found drone control instruction;
    a second mode selection instruction, instructing the control assembly to enter a second mode in which the instruction determination module looks up the corresponding camera assembly control instruction in the control instruction library according to the user action feature and controls the camera assembly according to the found camera assembly control instruction.
  13. The drone interactive shooting system according to claim 1, characterized in that the control instructions further comprise:
    a panorama mode selection instruction, instructing the control assembly to enter a panorama mode in which the instruction execution module controls the drone to move continuously within the angle range (0, α) at a preset speed, α being the preset maximum panorama shooting angle.
  14. The drone interactive shooting system according to claim 13, characterized in that in the panorama mode the instruction execution module controls the drone and the camera assembly to shoot a panoramic photo as follows:
    the camera assembly detects the position of the user;
    starting from the position of the user, the drone rotates by α/n to one side within the same horizontal plane, where n is a first preset division value and n>1;
    the camera assembly starts shooting, and the drone rotates by α to the other side at a uniform preset speed within the same horizontal plane;
    after the drone stops rotating, the camera assembly stops shooting.
  15. The drone interactive shooting system according to claim 13, characterized in that in the panorama mode the instruction execution module controls the drone and the camera assembly to shoot a panoramic photo as follows:
    the instruction execution module calculates the distance L between the camera assembly and the user;
    the instruction execution module selects an anchor point between the camera assembly and the user and, taking the anchor point as the center, generates a first sector with radius L/m and angle α, the subject to be shot lying on the arc of the first sector, where m is a second preset division value and m>1;
    the instruction execution module generates a second sector opposite the first sector, the two sides of the second sector being the reverse extensions of the two sides of the first sector, the radius of the second sector being (m-1)L/m and its angle being α;
    the camera assembly starts shooting, and the drone moves from one end of the arc of the second sector along that arc to its other end;
    after the drone reaches the other end of the arc of the second sector, the camera assembly stops shooting.
  16. The drone interactive shooting system according to claim 13, characterized in that in the panorama mode the instruction execution module controls the drone and the camera assembly to shoot a panoramic photo as follows:
    the instruction execution module calculates the distance L between the camera assembly and the user;
    the instruction execution module selects an anchor point between the camera assembly and the user and, taking the anchor point as the apex and L/m as the length of the legs, generates a first isosceles triangle with apex angle α, the subject to be shot lying on the base of the first isosceles triangle, where m is a second preset division value and m>1;
    the instruction execution module generates a second isosceles triangle opposite the first isosceles triangle, the two legs of the second isosceles triangle being the reverse extensions of the two legs of the first isosceles triangle, the legs of the second isosceles triangle having length (m-1)L/m and its apex angle being α;
    the camera assembly starts shooting, and the drone moves from one end of the base of the second isosceles triangle along that base to its other end;
    after the drone reaches the other end of the base of the second isosceles triangle, the camera assembly stops shooting.
  17. The drone interactive shooting system according to claim 1, characterized in that the control instructions further comprise:
    a third mode selection instruction, instructing the control assembly to enter a third mode in which the instruction execution module controls the camera assembly to shoot after a preset waiting time.
  18. The drone interactive shooting system according to claim 1, characterized in that the control instructions further comprise:
    a fourth mode selection instruction, instructing the control assembly to enter a fourth mode in which the instruction execution module detects the user's position through the camera assembly and controls the drone and the camera assembly to move automatically according to the user's position, so that the camera assembly keeps shooting the user.
  19. The drone interactive shooting system according to claim 18, characterized in that in the fourth mode the instruction execution module obtains the acceleration of the user's position change, and when the acceleration of the user's position change exceeds a preset acceleration threshold, the instruction execution module issues an alarm signal to the outside.
  20. The drone interactive shooting system according to claim 1, characterized in that at least one distance sensor is further provided on the drone, and the control assembly further comprises an obstacle calculation module for obtaining obstacle detection data from the distance sensor;
    when the control instruction to be executed contains a drone movement instruction and the obstacle calculation module determines that the distance between the drone and an obstacle in the movement direction of the drone movement instruction is less than a preset safety threshold, the drone movement instruction is cancelled and a limit warning signal is issued to the outside.
  21. A drone interactive shooting method, characterized in that it uses the drone interactive shooting system according to any one of claims 1 to 20 and comprises the following steps:
    the camera assembly acquires a captured image;
    the image processing module processes the image captured by the camera assembly to obtain the user action feature to be executed in the captured image;
    the instruction determination module looks up the corresponding control instruction in the control instruction library according to the user action feature to be executed; and
    the instruction execution module controls the drone and/or the camera assembly according to the found control instruction.
  22. The drone interactive shooting method according to claim 21, characterized in that the control instructions further comprise a panorama mode selection instruction instructing the control assembly to enter a panorama mode, in which the instruction execution module controls the drone and the camera assembly to shoot a panoramic photo through the following steps:
    the camera assembly detects the position of the user;
    starting from the position of the user, the drone rotates by α/n to one side within the same horizontal plane, where n is a first preset division value, n>1, and α is the preset maximum panorama shooting angle;
    the camera assembly starts shooting, and the drone rotates by α to the other side at a uniform preset speed within the same horizontal plane;
    after the drone stops rotating, the camera assembly stops shooting.
  23. The drone interactive shooting method according to claim 21, characterized in that the control instructions further comprise a panorama mode selection instruction instructing the control assembly to enter a panorama mode, in which the instruction execution module controls the drone and the camera assembly to shoot a panoramic photo as follows:
    the instruction execution module calculates the distance L between the camera assembly and the user;
    the instruction execution module selects an anchor point between the camera assembly and the user and, taking the anchor point as the center, generates a first sector with radius L/m and angle α, the subject to be shot lying on the arc of the first sector, where m is a second preset division value, m>1, and α is the preset maximum panorama shooting angle;
    the instruction execution module generates a second sector opposite the first sector, the two sides of the second sector being the reverse extensions of the two sides of the first sector, the radius of the second sector being (m-1)L/m and its angle being α;
    the camera assembly starts shooting, and the drone moves from one end of the arc of the second sector along that arc to its other end;
    after the drone reaches the other end of the arc of the second sector, the camera assembly stops shooting.
  24. The drone interactive shooting method according to claim 21, characterized in that the control instructions further comprise a panorama mode selection instruction instructing the control assembly to enter a panorama mode, in which the instruction execution module controls the drone and the camera assembly to shoot a panoramic photo as follows:
    the instruction execution module calculates the distance L between the camera assembly and the user;
    the instruction execution module selects an anchor point between the camera assembly and the user and, taking the anchor point as the apex and L/m as the length of the legs, generates a first isosceles triangle with apex angle α, the subject to be shot lying on the base of the first isosceles triangle, where m is a second preset division value, m>1, and α is the preset maximum panorama shooting angle;
    the instruction execution module generates a second isosceles triangle opposite the first isosceles triangle, the two legs of the second isosceles triangle being the reverse extensions of the two legs of the first isosceles triangle, the legs of the second isosceles triangle having length (m-1)L/m and its apex angle being α;
    the camera assembly starts shooting, and the drone moves from one end of the base of the second isosceles triangle along that base to its other end;
    after the drone reaches the other end of the base of the second isosceles triangle, the camera assembly stops shooting.
PCT/CN2017/080738 2017-04-17 2017-04-17 无人机交互拍摄系统及方法 WO2018191840A1 (zh)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201780000407.6A CN109121434B (zh) 2017-04-17 2017-04-17 无人机交互拍摄系统及方法
PCT/CN2017/080738 WO2018191840A1 (zh) 2017-04-17 2017-04-17 无人机交互拍摄系统及方法
TW107111546A TWI696122B (zh) 2017-04-17 2018-04-02 無人機互動拍攝系統方法


Publications (1)

Publication Number Publication Date
WO2018191840A1 true WO2018191840A1 (zh) 2018-10-25


Country Status (3)

Country Link
CN (1) CN109121434B (zh)
TW (1) TWI696122B (zh)
WO (1) WO2018191840A1 (zh)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112019744A (zh) * 2020-08-27 2020-12-01 新石器慧义知行智驰(北京)科技有限公司 一种拍照方法、装置、设备和介质

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI768630B (zh) * 2020-12-29 2022-06-21 財團法人工業技術研究院 可移動攝影系統和攝影構圖控制方法
US11445121B2 (en) 2020-12-29 2022-09-13 Industrial Technology Research Institute Movable photographing system and photography composition control method

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104808799A (zh) * 2015-05-20 2015-07-29 成都通甲优博科技有限责任公司 一种能够识别手势的无人机及其识别方法
CN105138126A (zh) * 2015-08-26 2015-12-09 小米科技有限责任公司 无人机的拍摄控制方法及装置、电子设备
WO2015200209A1 (en) * 2014-06-23 2015-12-30 Nixie Labs, Inc. Wearable unmanned aerial vehicles, launch- controlled unmanned aerial vehicles, and associated systems and methods
CN105391939A (zh) * 2015-11-04 2016-03-09 腾讯科技(深圳)有限公司 无人机拍摄控制方法和装置、无人机拍摄方法和无人机
CN105607740A (zh) * 2015-12-29 2016-05-25 清华大学深圳研究生院 一种基于计算机视觉的无人飞行器控制方法及装置
CN105847684A (zh) * 2016-03-31 2016-08-10 深圳奥比中光科技有限公司 无人机
JP2016225872A (ja) * 2015-06-01 2016-12-28 日本電信電話株式会社 移動装置操作端末、移動装置操作方法及び移動装置操作プログラム

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9678506B2 (en) * 2014-06-19 2017-06-13 Skydio, Inc. Magic wand interface and other user interaction paradigms for a flying digital assistant
CN105338238B (zh) * 2014-08-08 2019-04-23 联想(北京)有限公司 一种拍照方法及电子设备
US9471059B1 (en) * 2015-02-17 2016-10-18 Amazon Technologies, Inc. Unmanned aerial vehicle assistant
CN104865856B (zh) * 2015-03-30 2018-02-06 广州势必可赢网络科技有限公司 一种适用于无人机的语音控制方法
CN106155080B (zh) * 2015-07-28 2020-04-10 英华达(上海)科技有限公司 无人机
CN105677300A (zh) * 2016-02-04 2016-06-15 普宙飞行器科技(深圳)有限公司 基于手势识别操控无人机的方法、无人机及系统
CN106227231A (zh) * 2016-07-15 2016-12-14 深圳奥比中光科技有限公司 无人机的控制方法、体感交互装置以及无人机
CN106200679B (zh) * 2016-09-21 2019-01-29 中国人民解放军国防科学技术大学 基于多模态自然交互的单操作员多无人机混合主动控制方法
CN106444843B (zh) * 2016-12-07 2019-02-15 北京奇虎科技有限公司 无人机相对方位控制方法及装置



Also Published As

Publication number Publication date
CN109121434B (zh) 2021-07-27
TWI696122B (zh) 2020-06-11
TW201839663A (zh) 2018-11-01
CN109121434A (zh) 2019-01-01


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17906581

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17906581

Country of ref document: EP

Kind code of ref document: A1