CN107065914B - Flight assistance method and device for unmanned aerial vehicle
Flight assistance method and device for unmanned aerial vehicle
- Publication number
- CN107065914B (application CN201710335356.4A)
- Authority
- CN
- China
- Prior art keywords
- unmanned aerial
- aerial vehicle
- display screen
- information
- observation position
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/08—Control of attitude, i.e. control of roll, pitch, or yaw
- G05D1/0808—Control of attitude, i.e. control of roll, pitch, or yaw specially adapted for aircraft
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/0011—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots associated with a remote control arrangement
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/10—Simultaneous control of position or course in three dimensions
- G05D1/101—Simultaneous control of position or course in three dimensions specially adapted for aircraft
-
- G—PHYSICS
- G08—SIGNALLING
- G08C—TRANSMISSION SYSTEMS FOR MEASURED VALUES, CONTROL OR SIMILAR SIGNALS
- G08C17/00—Arrangements for transmitting signals characterised by the use of a wireless electrical link
- G08C17/02—Arrangements for transmitting signals characterised by the use of a wireless electrical link using a radio link
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Aviation & Aerospace Engineering (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Automation & Control Theory (AREA)
- Computer Networks & Wireless Communication (AREA)
- Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention discloses a flight assistance method and device for an unmanned aerial vehicle, belonging to the field of unmanned aerial vehicles. The method includes: acquiring observation position information, position information of the unmanned aerial vehicle, and flight state information of the unmanned aerial vehicle; obtaining position information of the unmanned aerial vehicle relative to the observation position according to the position information of the unmanned aerial vehicle and the observation position information; and outputting the flight state information of the unmanned aerial vehicle and the position information relative to the observation position. With the method and device provided by the invention, the current flight position and state of the unmanned aerial vehicle are made known to the operator, helping the operator control the flight path of the unmanned aerial vehicle, avoiding blind flight and loss of the aircraft, and improving the operator's flight experience.
Description
Technical Field
The invention relates to unmanned aerial vehicles, and in particular to a flight assistance method and a flight assistance device for an unmanned aerial vehicle.
Background
In recent years, moving bodies such as unmanned aircraft (e.g., fixed-wing aircraft and rotorcraft, including helicopters), motor vehicles, submarines or ships, and satellites, space stations, or airships have been widely used, for example in fields such as detection and search and rescue. These moving bodies are typically operated by a user through a remote control device.
A remotely controlled aircraft, submarine, or motor vehicle may carry a carrier, such as a carrier device holding a camera or a light. For example, an unmanned aerial vehicle may carry a camera for aerial photography.
When operating a moving body such as an unmanned aerial vehicle, because the unmanned aerial vehicle is generally small, it is difficult to see with the naked eye once it has flown far away (for example, four or five hundred meters). In that situation the operator can hardly observe the heading angle of the unmanned aerial vehicle, which amounts to blind flight, and without a flight assistance means the unmanned aerial vehicle is easily lost. In addition, if the user flies in First Person View (FPV) mode, attention is concentrated on the display screen, and it is difficult to watch the FPV image while also keeping track of the position of the unmanned aerial vehicle; the user may end up not knowing the current position of the aircraft, losing orientation, and even losing the aircraft.
Disclosure of Invention
The technical problem to be solved by the invention is to provide a flight assistance method and a flight assistance device for an unmanned aerial vehicle, which help the operator know the current flight position and flight state of the unmanned aerial vehicle and assist in controlling its flight.
The technical scheme adopted by the invention for solving the technical problems is as follows:
according to one aspect of the present invention, there is provided a flight assistance method for an unmanned aerial vehicle, including:
acquiring observation position information, position information of the unmanned aerial vehicle and flight state information of the unmanned aerial vehicle; obtaining the position information of the unmanned aerial vehicle relative to the observation position according to the position information and the observation position information of the unmanned aerial vehicle; and outputting the flight state information of the unmanned aerial vehicle and the position information relative to the observation position.
Preferably, the observation position is the position of the operator and/or the takeoff position of the unmanned aerial vehicle.
Preferably, the position information and the flight state information of the unmanned aerial vehicle are obtained by a state measurement sensor arranged on the unmanned aerial vehicle; when the observation position is the position of the operator, the position information of the operator is obtained by a sensor attached to the operator or an object carried by the operator; when the observation position is the takeoff position of the unmanned aerial vehicle, the takeoff position of the unmanned aerial vehicle is obtained through a state measurement sensor arranged on the unmanned aerial vehicle.
Preferably, the outputting of the flight state information of the unmanned aerial vehicle and the position information relative to the observation position includes: outputting the flight state information of the unmanned aerial vehicle and the position information relative to the observation position to a mobile terminal or a remote controller with a display function.
Preferably, the flight status information includes a heading of the unmanned aerial vehicle and/or altitude information of the unmanned aerial vehicle relative to a ground plane.
Preferably, the outputting of the flight state information of the unmanned aerial vehicle and the position information relative to the observation position includes: displaying the position information of the unmanned aerial vehicle relative to the observation position and the heading of the unmanned aerial vehicle in graphical form on a display screen of the mobile terminal or of a remote controller with a display function.
Preferably, the position information of the unmanned aerial vehicle with respect to the observation position includes a distance between the unmanned aerial vehicle and the observation position and an angle of the unmanned aerial vehicle with respect to the observation position.
Preferably, the method further comprises: acquiring the azimuth angle of a screen reference line of the display screen, and rotating the displayed graphic in the reverse direction by the azimuth angle about the front viewing axis of the display screen.
Preferably, the method further comprises: judging whether the difference between the azimuth angle of the screen reference line of the display screen and the azimuth angle of the unmanned aerial vehicle relative to the observation position is smaller than a predetermined threshold, and whether the difference between the altitude angle of the display screen and the altitude angle of the unmanned aerial vehicle relative to the observation position is also smaller than the predetermined threshold; if so, generating capture target prompt information on the screen.
Preferably, the predetermined threshold is 10 degrees.
According to another aspect of the present invention, there is provided a flight assistance apparatus for an unmanned aerial vehicle including an information acquisition module, an information processing module, and an information output module, wherein:
the information acquisition module is used for acquiring the position information of the observation position, the position information of the unmanned aerial vehicle and the flight state information, transmitting the position information of the observation position and the position information of the unmanned aerial vehicle to the information processing module and transmitting the flight state information to the information output module;
the information processing module is used for obtaining the position information of the unmanned aerial vehicle relative to the observation position according to the position information and the observation position information of the unmanned aerial vehicle;
and the information output module is used for outputting the flight state information of the unmanned aerial vehicle and the position information of the unmanned aerial vehicle relative to the observation position.
Preferably, the observation position is the position of the operator and/or the takeoff position of the unmanned aerial vehicle.
Preferably, the flight status information includes heading of the unmanned aerial vehicle and/or altitude information of the unmanned aerial vehicle relative to a ground plane.
Preferably, the information output module is specifically configured to: and displaying the course of the unmanned aerial vehicle and the position information of the unmanned aerial vehicle relative to the observation position on a display screen of the mobile terminal or a remote controller with a display function in a graphic mode.
Preferably, the position information of the unmanned aerial vehicle relative to the observation position output by the information output module includes a distance between the unmanned aerial vehicle and the observation position and an angle of the unmanned aerial vehicle relative to the observation position.
Preferably, the information acquisition module further comprises a direction acquisition unit, wherein: the direction acquisition unit is used for acquiring the azimuth angle of a screen reference line of the display screen; and the information processing module is further used for rotating the displayed graphic in the reverse direction by the azimuth angle about the front viewing axis of the display screen.
Preferably, the information processing module is further configured to judge whether the difference between the azimuth angle of the screen reference line of the display screen and the azimuth angle of the unmanned aerial vehicle relative to the observation position is smaller than a predetermined threshold, and whether the difference between the altitude angle of the display screen and the altitude angle of the unmanned aerial vehicle relative to the observation position is also smaller than the predetermined threshold, and if so, to generate capture target prompt information on the screen.
Preferably, the predetermined threshold is 10 degrees.
With the method and device provided by the invention, the flight position and flight state of the unmanned aerial vehicle are made known to the operator, helping the operator control the flight path of the unmanned aerial vehicle, avoiding blind flight and loss of the aircraft, and improving the operator's flight experience.
Drawings
Fig. 1 is a flowchart of a flight assistance method for an unmanned aerial vehicle according to an embodiment of the present invention;
FIG. 2 is a flowchart of a flight assistance method for an unmanned aerial vehicle according to a preferred embodiment of the present invention;
FIG. 3 is a schematic illustration of a display screen horizontally disposed in the east-west direction according to a preferred embodiment of the present invention;
FIG. 4 is a schematic view of a display screen horizontally disposed in the north-south direction according to a preferred embodiment of the present invention;
FIG. 5 is a schematic illustration of an azimuth angle provided by an embodiment of the present invention;
FIG. 6 is a schematic diagram of a display screen provided in accordance with a preferred embodiment of the present invention when the azimuth of the screen reference line is coincident with the azimuth of the UAV relative to the viewing position;
FIG. 7 is a schematic view of a display screen in which the front view axis of the display screen is aligned with the UAV;
FIG. 8 is a schematic view of an altitude angle of an unmanned aerial vehicle provided by an embodiment of the present invention;
FIG. 9 is a schematic diagram of an elevation angle of a display screen according to an embodiment of the present invention;
fig. 10 is a schematic structural diagram of a flight assistance device of an unmanned aerial vehicle according to a preferred embodiment of the present invention.
Detailed Description
In order to make the technical problems, technical solutions and advantageous effects to be solved by the present invention clearer and clearer, the present invention is further described in detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Fig. 1 is a flowchart of a flight assistance method for an unmanned aerial vehicle according to an embodiment of the present invention, where the method includes:
s102, obtaining observation position information, position information of the unmanned aerial vehicle and flight state information of the unmanned aerial vehicle.
The position information and the flight state information of the unmanned aerial vehicle are obtained through a state measuring sensor arranged on the unmanned aerial vehicle; the flight position information includes longitude and latitude of the flight position, and the flight state includes pitch, roll, and heading, and may further include altitude information of the unmanned aerial vehicle with respect to the ground plane.
The observation position may be both the position of the operator and the takeoff position of the unmanned aerial vehicle, or either one of them. When the observation position is the position of the operator, the position information of the operator is obtained by a sensor attached to the operator or to an object carried by the operator; when the observation position is the takeoff position of the unmanned aerial vehicle, the takeoff position is obtained through a state measurement sensor arranged on the unmanned aerial vehicle. The position information of the takeoff point of the unmanned aerial vehicle is the position information recorded when the unmanned aerial vehicle first acquires enough positioning satellites after being powered on.
The operator can select either the operator's position or the takeoff position of the unmanned aerial vehicle as the observation position as required, in which case a selection module is needed. Alternatively, the observation position may be set initially and not selected by the operator.
In this step, a Wi-Fi (wireless fidelity) network may be established with the unmanned aerial vehicle for point-to-point communication, and the position information and flight state information of the unmanned aerial vehicle are acquired from the unmanned aerial vehicle over this wireless network; communication may also take place over a 2G, 3G, 4G, or future 5G network.
And S104, obtaining the position information of the unmanned aerial vehicle relative to the observation position according to the position information and the observation position information of the unmanned aerial vehicle.
Specifically, the distance between the unmanned aerial vehicle and the observation position and the angle of the unmanned aerial vehicle relative to the observation position are calculated according to the position information of the unmanned aerial vehicle and the observation position information, and the position information of the unmanned aerial vehicle relative to the observation position is determined.
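As an illustrative sketch only (not part of the patent text), the computation in step S104 can be expressed as follows; the function name, the flat-earth approximation, and the coordinate conventions are assumptions made for this sketch:

```python
import math

def relative_position(obs_lat, obs_lon, obs_alt, uav_lat, uav_lon, uav_alt):
    """Distance, azimuth angle and altitude angle of the UAV relative to the observation position.

    Flat-earth approximation for short ranges; latitudes/longitudes in degrees,
    altitudes in metres, returned angles in degrees.
    """
    R = 6371000.0  # mean Earth radius, metres
    north = math.radians(uav_lat - obs_lat) * R
    east = math.radians(uav_lon - obs_lon) * R * math.cos(math.radians(obs_lat))
    up = uav_alt - obs_alt

    ground = math.hypot(north, east)
    distance = math.hypot(ground, up)
    azimuth = math.degrees(math.atan2(east, north)) % 360.0     # angle A from due north
    altitude_angle = math.degrees(math.atan2(up, ground))       # angle ra above the ground plane
    return distance, azimuth, altitude_angle
```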
And S106, outputting the flight state information of the unmanned aerial vehicle and the position information relative to the observation position.
As a preferred aspect of the present embodiment, the flight state information of the unmanned aerial vehicle and the position information relative to the observation position may be output to a mobile terminal or a remote controller with a display function.
As another preferred aspect of this embodiment, the heading of the unmanned aerial vehicle, the distance between the unmanned aerial vehicle and the observation position, and the angle of the unmanned aerial vehicle relative to the observation position are displayed in graphical form on a display screen of the mobile terminal or of a remote controller with a display function. The angle of the unmanned aerial vehicle relative to the observation position includes an azimuth angle relative to the observation position and an altitude angle relative to the observation position, and the graphic may be planar or stereoscopic. Referring to figs. 3 and 4, in the displayed graphic the position of the arrow indicates the position of the unmanned aerial vehicle, the direction of the arrow indicates its heading, and the center of the circles indicates the observation position. The line connecting the arrow and the circle center represents the projection, onto the ground plane, of the line between the unmanned aerial vehicle and the observation position; its included angle A with due north on the ground plane is the azimuth angle of the unmanned aerial vehicle relative to the observation position. The concentric circles centered on the observation position represent distances from the unmanned aerial vehicle to the observation position from near to far, inside to outside (for example, the radius difference between adjacent circles is 50 meters), and the dashed circle marks the safe flight distance. When the unmanned aerial vehicle is outside the safe distance, the operator can be reminded by voice or text in addition to the graphical display, and can then steer the unmanned aerial vehicle back within the safe range, avoiding blind flight and loss of the aircraft (in the situation shown in fig. 3, the operator only needs to push the stick to the lower left to fly the unmanned aerial vehicle back).
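A minimal sketch of this ring display and safe-distance reminder is given below; the 50-metre ring spacing comes from the example above, while the pixel scale, the safe-distance value, and all names are illustrative assumptions:

```python
import math

def arrow_on_rings(distance, azimuth_deg, metres_per_ring=50.0, pixels_per_ring=40.0,
                   safe_distance=200.0):
    """Screen coordinates of the UAV arrow on the concentric-ring display.

    The observation position is the centre; x grows towards east, y towards due north.
    Returns the pixel position and whether the UAV is outside the dashed safe-distance circle.
    """
    scale = pixels_per_ring / metres_per_ring
    a = math.radians(azimuth_deg)
    x = distance * math.sin(a) * scale
    y = distance * math.cos(a) * scale
    return (x, y), distance > safe_distance

position, outside = arrow_on_rings(distance=260.0, azimuth_deg=225.0)
if outside:
    print("UAV beyond the safe distance - please fly it back")  # could equally be a voice/text reminder
```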
Fig. 2 is a flowchart of a flight assistance method for an unmanned aerial vehicle according to a preferred embodiment of the present invention, the method including:
s202, acquiring observation position information, position information of the unmanned aerial vehicle and flight state information of the unmanned aerial vehicle;
s204, obtaining the distance and the angle of the unmanned aerial vehicle relative to the observation position according to the position information and the observation position information of the unmanned aerial vehicle;
s206, displaying the distance, the angle and the heading of the unmanned aerial vehicle relative to the observation position on a display screen in a graphic mode;
s208, acquiring the azimuth angle of the screen reference line of the display screen, and reversely rotating the pattern relative to the front visual axis of the display screen.
The screen reference line of the display screen is a reference line parallel to one side of the display screen; it may be, for example, the line connecting the midpoints of the upper and lower edges of the display screen, where "upper" and "lower" are taken with the displayed graphic upright, so that the screen reference line points in the upright direction (as shown in figs. 3 and 4). The azimuth angle of the screen reference line of the display screen is the included angle between the projection of the screen reference line on the ground plane and the due-north direction (e.g., angle B shown in fig. 5). The front viewing axis of the display screen is the axis perpendicular to the display screen, which may also be regarded as the axis along which a human eye looks when viewing the screen perpendicularly (as shown in fig. 7).
Specifically, when the display screen of the mobile terminal or of the remote controller with a display function is placed horizontally, this can be implemented with a magnetometer: taking the pointing direction of the magnetometer as a reference, the azimuth angle by which the screen reference line of the display screen has rotated in the horizontal plane relative to the due-north direction is acquired. By rotating the graphic in the reverse direction by this azimuth angle about the front viewing axis of the display screen, the relative position and heading of the unmanned aerial vehicle shown on the screen remain referenced to the ground plane, independent of the horizontal orientation in which the screen is held. Here the azimuth angle of the screen reference line is simply its included angle with the due-north direction. Referring to figs. 3 and 4, the magnetometer can be implemented with a compass, and the position of the unmanned aerial vehicle displayed on the screen in real time is determined with the compass pointing direction as reference. This ensures that, as long as the position and heading of the unmanned aerial vehicle do not change, the angle of the line between the arrow and the circle center with respect to the ground plane remains unchanged no matter how the operator turns the horizontally held mobile terminal or remote controller with a display function; that is, the azimuth angle A of the unmanned aerial vehicle relative to the observation position in figs. 3 and 4 remains constant, and so does the direction in which the arrow points with respect to the ground plane. Without a compass function, angle A cannot be kept constant, but the arrow and the circle center can still reflect the actual relative positions of the unmanned aerial vehicle and the mobile terminal or remote controller with a display function when the display screen is held horizontally in the east-west direction (as shown in fig. 3).
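A sketch of this compensation for a horizontally held screen follows; it only assumes that the magnetometer yields the azimuth of the screen reference line, and all names are illustrative rather than taken from the patent:

```python
def ground_referenced_graphic(uav_azimuth_deg, uav_heading_deg, screen_azimuth_deg):
    """Rotate the drawn angles by minus the screen azimuth about the front viewing axis.

    As long as the UAV's position and heading are unchanged, angle A and the arrow
    direction then stay fixed with respect to the ground plane, however the screen is turned.
    """
    draw_azimuth = (uav_azimuth_deg - screen_azimuth_deg) % 360.0
    draw_heading = (uav_heading_deg - screen_azimuth_deg) % 360.0
    return draw_azimuth, draw_heading
```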
When the display screen of the mobile terminal or of the remote controller with a display function is not placed horizontally, the azimuth angle of the screen reference line of the display screen can be obtained with a magnetometer and an accelerometer; this azimuth angle is the included angle between the projection of the screen reference line on the ground plane and the due-north direction. For example, the magnetometer and the accelerometer can be used to compute the attitude Rbg of the display screen relative to the ground; a vector Pg is obtained as the difference between the position of the unmanned aerial vehicle and the current position of the display screen; and the direction to be shown on the display screen is given by Pb = Rbg · Pg, taking the x and y coordinates of Pb. From this the azimuth angle of the projection of the screen reference line of the display screen on the ground plane relative to the due-north direction is obtained. The displayed graphic is then rotated in the reverse direction by this azimuth angle about the front viewing axis of the display screen, so that the relative position and heading angle of the unmanned aerial vehicle shown on the screen remain referenced to the ground plane, regardless of the horizontal or vertical orientation in which the display screen of the mobile terminal or remote controller with a display function is held.
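The Pb = Rbg · Pg step for a non-horizontally held screen might look like the sketch below; the frame conventions (east-north-up ground frame, Rbg mapping the ground frame to the screen frame) and the names are assumptions of this sketch:

```python
import numpy as np

def arrow_angle_non_horizontal(Rbg, uav_pos_g, screen_pos_g):
    """Direction of the UAV arrow when the display screen is not held horizontally.

    Rbg: 3x3 rotation from the ground frame to the screen (body) frame,
         estimated from magnetometer and accelerometer readings.
    uav_pos_g, screen_pos_g: positions in a local ground frame (east, north, up), metres.
    """
    Pg = np.asarray(uav_pos_g, float) - np.asarray(screen_pos_g, float)
    Pb = Rbg @ Pg                          # the same vector expressed in the screen frame
    x, y = Pb[0], Pb[1]                    # its projection onto the screen plane
    return float(np.degrees(np.arctan2(x, y)))  # rotation of the arrow from the screen reference line
```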
It should be noted that this step is not applicable to the special case when the display screen is vertically placed.
S210, calculating the difference between the azimuth angle of the screen reference line of the display screen and the azimuth angle of the unmanned aerial vehicle relative to the observation position, and the difference between the altitude angle of the display screen and the altitude angle of the unmanned aerial vehicle relative to the observation position.
S212, judging whether the difference between the azimuth angle of the screen reference line of the display screen and the azimuth angle of the unmanned aerial vehicle relative to the observation position is smaller than a predetermined threshold, and whether the difference between the altitude angle of the display screen and the altitude angle of the unmanned aerial vehicle relative to the observation position is also smaller than the predetermined threshold; if so, executing step S214, otherwise returning to step S208.
And S214, generating capture target prompt information on the display screen.
The screen reference line of the display screen is a reference line parallel to one side of the display screen; it may be the line connecting the midpoints of the upper and lower edges of the display screen, where "upper" and "lower" are taken with the displayed graphic upright, so that the screen reference line points in the upright direction (as shown in figs. 3 and 4). The azimuth angle of the screen reference line of the display screen is the included angle between the projection of the screen reference line on the ground plane and the due-north direction (e.g., angle B shown in fig. 5). The altitude angle of the display screen is the included angle between the front viewing axis of the display screen and the ground plane (angle rb shown in fig. 9). The azimuth angle of the unmanned aerial vehicle relative to the observation position is the included angle between the projection, onto the ground plane, of the line connecting the unmanned aerial vehicle and the observation position and the due-north direction (angle A shown in figs. 3, 4 and 5). The altitude angle of the unmanned aerial vehicle is the included angle between the line connecting the unmanned aerial vehicle and the observation position and the ground plane (angle ra shown in fig. 8). The predetermined threshold is, for example, 10 degrees.
For example, the following relationship holds: vv2 = [v2x, v2y, v2z]^T = Rgb · [0, 0, 1]^T.
Here v2 is formed by the first two coordinates of vv2, and v1 = [x2 - x1, y2 - y1]^T. The vector vv2 represents the z-axis of the display screen in the world coordinate system, and its first two components are its projection in the x and y directions; v1 is the horizontal vector pointing from the display screen towards the unmanned aerial vehicle; x1 and y1 are the coordinates of the display screen, x2 and y2 are the coordinates of the unmanned aerial vehicle, T denotes transposition, and Rgb is the attitude matrix of the display screen. When the angle between v2 and v1 is smaller than a certain threshold, the display screen is considered to be facing the unmanned aerial vehicle.
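As a hedged illustration of the v1/v2 relation above (the variable names follow the text, while the frame conventions are assumptions of this sketch), the azimuth part of the capture test could be written as:

```python
import numpy as np

def azimuth_capture(Rgb, x1, y1, x2, y2, threshold_deg=10.0):
    """Azimuth part of the capture-target test.

    Rgb: display screen attitude matrix (screen frame -> world frame).
    (x1, y1): horizontal coordinates of the display screen; (x2, y2): of the UAV.
    """
    vv2 = Rgb @ np.array([0.0, 0.0, 1.0])   # screen z-axis expressed in the world frame
    v2 = vv2[:2]                             # its projection onto the ground plane
    v1 = np.array([x2 - x1, y2 - y1])        # horizontal vector from the screen to the UAV
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return angle < threshold_deg             # the altitude-angle test is checked separately
```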
Referring to fig. 6, when the display screen of the mobile terminal or of the remote controller with a display function is held horizontally, the operator can hold the screen and rotate it in the horizontal plane, following the arrow, to find the unmanned aerial vehicle. When the display screen has been rotated, following the arrow, until the azimuth angle of its screen reference line coincides with the azimuth angle of the unmanned aerial vehicle relative to the observation position (i.e., angle A), the arrow points toward the top of the display screen. Referring to fig. 7, starting from the state of fig. 6 in which the azimuth angle of the screen reference line coincides with the azimuth angle of the unmanned aerial vehicle relative to the observation position, the operator then tilts the display screen in the vertical direction until the altitude angle of the display screen coincides with the altitude angle of the unmanned aerial vehicle relative to the observation position. At that point the front viewing axis of the non-horizontally held display screen points at the position of the unmanned aerial vehicle, and capture target prompt information is generated on the screen; the prompt may be the disappearance of the arrow or a change in its color, which improves the operator's flight experience.
In practical applications, the front viewing axis of the display screen may also be regarded as pointing at the unmanned aerial vehicle when it is merely close to alignment with it (the angle difference is smaller than a predetermined threshold). This can be understood as follows: the circle center in fig. 7 is the vertex of a cone, the line from the circle center to the unmanned aerial vehicle is the central axis of the cone, and the predetermined threshold is the vertex angle of the cone. When the operator holds the display screen and rotates it in the direction of the arrow until the position of the unmanned aerial vehicle falls within the spatial range of this cone, it can be indicated that the display screen faces the unmanned aerial vehicle: the azimuth angle of the screen reference line of the display screen approaches the azimuth angle of the unmanned aerial vehicle relative to the observation position, the altitude angle of the display screen approaches the altitude angle of the unmanned aerial vehicle relative to the observation position, and the front viewing axis of the display screen tends to point at the position of the unmanned aerial vehicle. Capture target prompt information is then also generated on the screen, for example the arrow disappearing or changing color, which improves the operator's flight experience.
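The cone interpretation can be sketched in the same spirit; treating the predetermined threshold as the cone's vertex angle (so the half-angle used for the test is threshold/2) is an assumption of this sketch, as are the names and frame conventions:

```python
import numpy as np

def inside_capture_cone(Rgb, screen_pos, uav_pos, vertex_angle_deg=10.0):
    """True when the UAV lies inside the cone around the screen's front viewing axis."""
    axis = Rgb @ np.array([0.0, 0.0, 1.0])                  # front viewing axis in the world frame
    to_uav = np.asarray(uav_pos, float) - np.asarray(screen_pos, float)
    cos_a = np.dot(axis, to_uav) / (np.linalg.norm(axis) * np.linalg.norm(to_uav))
    angle = np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))
    return bool(angle < vertex_angle_deg / 2.0)             # inside the cone -> capture prompt
```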
Of course, in the method the position information and flight attitude information of the unmanned aerial vehicle can also be given to the operator by voice or by on-screen text, for example in which direction and at how many degrees the unmanned aerial vehicle lies relative to the operator, how many meters away it is, and its altitude angle; for instance, a voice broadcast of "the unmanned aerial vehicle is 30 degrees to your north-east, please raise your head by 50 degrees to see it", thereby further improving the operator's flight experience.
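A simple way to compose such a spoken hint is sketched below; the wording and the eight-sector compass rose are illustrative choices, not requirements of the method:

```python
def voice_hint(azimuth_deg, distance_m, altitude_angle_deg):
    """Compose a prompt such as 'the UAV is to your north-east, 300 metres away, look up 50 degrees'."""
    sectors = ["north", "north-east", "east", "south-east",
               "south", "south-west", "west", "north-west"]
    sector = sectors[int(((azimuth_deg % 360.0) + 22.5) // 45.0) % 8]
    return (f"The unmanned aerial vehicle is to your {sector}, "
            f"about {distance_m:.0f} metres away; "
            f"please raise your head by about {altitude_angle_deg:.0f} degrees to see it.")

print(voice_hint(30.0, 300.0, 50.0))
```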
With the method provided by this embodiment, the operator can easily know where the unmanned aerial vehicle is from the position and heading displayed on the screen and can control its flight path accordingly, so the unmanned aerial vehicle can be controlled freely without the operator's eyes having to leave the display screen, improving the operator's flight experience.
Fig. 10 is a schematic structural diagram of a flight assistance device for an unmanned aerial vehicle according to a preferred embodiment of the present invention, the flight assistance device includes an information acquisition module 10, an information processing module 20, and an information output module 30, where:
the information acquisition module 10 is used for acquiring position information of an observation position, position information of the unmanned aerial vehicle and flight state information;
specifically, the information acquisition module 10 further includes: an observation position information acquisition module 101 and a flight position and state acquisition module 102, wherein:
the observation position information acquiring module 101 is configured to acquire position information of an observation position, where the observation position is a position of an operator and/or a takeoff position of the unmanned aerial vehicle. Since the flight assistance device is embedded in the handheld device, the position of the operator is the same as the position information of the flight assistance device, and the position information can be obtained by the position information of the flight assistance device or by a sensor attached to the operator or an object carried by the operator. When the flight assistance device includes a GPS module, the observation position information acquiring module 101 acquires GPS coordinates of the flight assistance device as position information of the operator through the GPS module. When the flight assistance device does not include a GPS function, the observation position information acquisition module 101 acquires position information of a takeoff point of the unmanned aerial vehicle, where a flight start point GPS coordinate is a GPS coordinate recorded when the unmanned aerial vehicle first searches for enough GPS satellites after being powered on.
The flight position and state acquiring module 102 is configured to acquire current flight position information and flight state information from the unmanned aerial vehicle, where the flight position information includes the longitude and latitude of the flight position. The flight state information includes the pitch, roll, and heading of the unmanned aerial vehicle, and may further include altitude information of the unmanned aerial vehicle relative to the ground plane. The flight position information and the flight state information can be acquired through a wireless network, including but not limited to a Wi-Fi network; communication can of course also take place over other networks, such as 2G, 3G, 4G, and future 5G networks, as long as the unmanned aerial vehicle and the flight assistance device support the corresponding communication protocols.
The information processing module 20 is used for obtaining the position information of the unmanned aerial vehicle relative to the observation position according to the position information and the observation position information of the unmanned aerial vehicle and transmitting the position information to the information output module 30;
specifically, the information processing module 20 is configured to calculate a distance between the unmanned aerial vehicle and the observation position and an angle of the unmanned aerial vehicle with respect to the observation position according to the position information of the unmanned aerial vehicle and the observation position information, and determine the position information of the unmanned aerial vehicle with respect to the observation position.
And the information output module 30 is used for outputting flight state information of the unmanned aerial vehicle and position information of the unmanned aerial vehicle relative to the observation position.
The position information of the unmanned aerial vehicle relative to the observation position output by the information output module 30 includes a distance between the unmanned aerial vehicle and the observation position and an angle of the unmanned aerial vehicle relative to the observation position.
As a preferable scheme of this embodiment, the information output module 30 outputs the flight state information of the unmanned aerial vehicle and the position information of the unmanned aerial vehicle relative to the observation position in a graphical manner through a display screen of the mobile terminal or a remote controller with a display function (see fig. 3 and 4).
As another preferred aspect of the present embodiment, the information acquisition module 10 further includes a direction acquisition unit 103 configured to acquire the azimuth angle of a screen reference line of the display screen. The information processing module 20 is further configured to rotate the displayed graphic in the reverse direction by this azimuth angle about the front viewing axis of the display screen, so that the relative position and heading angle of the unmanned aerial vehicle displayed on the screen remain referenced to the ground plane regardless of the horizontal orientation in which the display screen is held. The screen reference line of the display screen is a reference line parallel to one side of the display screen; it may be the line connecting the midpoints of the upper and lower edges of the display screen, where "upper" and "lower" are taken with the displayed graphic upright, so that the screen reference line points in the upright direction (as shown in figs. 3 and 4). The azimuth angle of the screen reference line of the display screen is the included angle between the projection of the screen reference line on the ground plane and the due-north direction (e.g., angle B shown in fig. 5). The front viewing axis of the display screen is the axis perpendicular to the display screen, which may also be regarded as the axis along which a human eye looks when viewing the screen perpendicularly (as shown in fig. 7).
Specifically, when the display screen is placed horizontally, the direction acquisition unit 103 may be implemented with a magnetometer: taking the pointing direction of the magnetometer as a reference, the azimuth angle by which the screen reference line of the display screen has rotated in the horizontal plane relative to the due-north direction is acquired. The information processing module 20 then rotates the displayed graphic in the reverse direction by this azimuth angle about the front viewing axis of the display screen. Referring to figs. 3 and 4, the magnetometer may be implemented with a compass, and the position of the unmanned aerial vehicle displayed on the screen in real time is determined with the compass pointing direction as reference. This ensures that, as long as the position and heading of the unmanned aerial vehicle do not change, the angle of the line between the arrow and the circle center with respect to the ground plane remains unchanged (i.e., the azimuth angle A of the unmanned aerial vehicle relative to the observation position in figs. 3 and 4 remains unchanged), no matter how the horizontally held mobile terminal or remote controller with a display function is turned.
When the display screen is not placed horizontally, the direction acquisition unit 103 may be implemented with a magnetometer and an accelerometer. For example, the magnetometer and the accelerometer can be used to compute the attitude Rbg of the display screen relative to the ground; a vector Pg is obtained as the difference between the position of the unmanned aerial vehicle and the current position of the display screen; and the direction to be shown on the display screen is given by Pb = Rbg · Pg, taking the x and y coordinates of Pb. From this the azimuth angle of the projection of the screen reference line of the display screen on the ground plane relative to the due-north direction is obtained, so that the relative position and heading angle of the unmanned aerial vehicle displayed on the screen remain referenced to the ground plane, regardless of the horizontal or vertical orientation in which the display screen of the mobile terminal or remote controller with a display function is held.
It should be noted that the preferred solution does not apply to the special case when the display is vertically positioned.
As another preferred aspect of this embodiment, the information processing module 20 is further configured to judge whether the difference between the azimuth angle of the screen reference line of the display screen and the azimuth angle of the unmanned aerial vehicle relative to the observation position is smaller than a predetermined threshold, and whether the difference between the altitude angle of the display screen and the altitude angle of the unmanned aerial vehicle relative to the observation position is also smaller than the predetermined threshold, and if so, to generate capture target prompt information on the screen. The screen reference line of the display screen is a reference line parallel to one side of the display screen; it may be the line connecting the midpoints of the upper and lower edges of the display screen, where "upper" and "lower" are taken with the displayed graphic upright, so that the screen reference line points in the upright direction (as shown in figs. 3 and 4). The azimuth angle of the screen reference line of the display screen is the included angle between the projection of the screen reference line on the ground plane and the due-north direction (e.g., angle B shown in fig. 5). The altitude angle of the display screen is the included angle between the front viewing axis of the display screen and the ground plane (angle rb shown in fig. 9). The azimuth angle of the unmanned aerial vehicle relative to the observation position is the included angle between the projection, onto the ground plane, of the line connecting the unmanned aerial vehicle and the observation position and the due-north direction (angle A shown in figs. 3, 4 and 5). The altitude angle of the unmanned aerial vehicle is the included angle between the line connecting the unmanned aerial vehicle and the observation position and the ground plane (angle ra shown in fig. 8). The predetermined threshold is, for example, 10 degrees.
Referring to fig. 7, the operator can hold the mobile terminal or remote controller with a display function and rotate it in the direction of the arrow displayed on the screen to find the unmanned aerial vehicle. When the screen has been rotated, in the direction of the arrow, until the azimuth angle of its screen reference line coincides with the azimuth angle of the unmanned aerial vehicle relative to the observation position and the altitude angle of the display screen coincides with the altitude angle of the unmanned aerial vehicle relative to the observation position, the front viewing axis of the non-horizontally held display screen points at the position of the unmanned aerial vehicle, and capture target prompt information is generated on the screen. In practical applications, the display screen may also be regarded as facing the unmanned aerial vehicle when the front viewing axis is merely close to alignment with it (the angle difference is smaller than the predetermined threshold). Generating the capture target prompt on the screen improves the operator's flight experience.
Of course, the mobile terminal or the remote controller with a display function may further include a voice module for announcing the position information and flight attitude information of the unmanned aerial vehicle, including in which direction and at how many degrees the unmanned aerial vehicle lies relative to the operator, how many meters away it is, and its altitude angle; for example, a voice broadcast of "the unmanned aerial vehicle is 30 degrees to your north-east, please raise your head by 50 degrees to see it", thereby further improving the operator's flight experience.
The above-described flight assistance method for the unmanned aerial vehicle is also applicable to the flight assistance device for the unmanned aerial vehicle, and details thereof are not repeated.
According to the flight assisting method and device for the unmanned aerial vehicle, provided by the embodiment of the invention, the flight position and the flight state of the unmanned aerial vehicle are mastered, so that an operator is helped to control the flight path of the unmanned aerial vehicle, blind flight and flight loss are avoided, and the flight experience of the operator is improved.
The preferred embodiments of the present invention have been described above with reference to the accompanying drawings, and are not to be construed as limiting the scope of the invention. Those skilled in the art can implement the invention with various modifications, for example using features from one embodiment in another embodiment to yield yet a further embodiment, without departing from the scope and spirit of the invention. Any modification, equivalent replacement, or improvement made within the technical concept of the present invention shall fall within the scope of protection of the present invention.
Claims (15)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201710335356.4A CN107065914B (en) | 2013-07-05 | 2013-07-05 | Flight assistance method and device for unmanned aerial vehicle |
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201310282385.0A CN103344250B (en) | 2013-07-05 | 2013-07-05 | Flight assistance method and device for unmanned aerial vehicle |
| CN201710335356.4A CN107065914B (en) | 2013-07-05 | 2013-07-05 | Flight assistance method and device for unmanned aerial vehicle |
Related Parent Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201310282385.0A Division CN103344250B (en) | 2013-07-05 | 2013-07-05 | Flight assistance method and device for unmanned aerial vehicle |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN107065914A CN107065914A (en) | 2017-08-18 |
| CN107065914B true CN107065914B (en) | 2020-04-28 |
Family
ID=49279066
Family Applications (3)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201710338641.1A Expired - Fee Related CN107168360B (en) | 2013-07-05 | 2013-07-05 | Flight assistance method and device for unmanned aerial vehicle |
| CN201710335356.4A Expired - Fee Related CN107065914B (en) | 2013-07-05 | 2013-07-05 | Flight assistance method and device for unmanned aerial vehicle |
| CN201310282385.0A Active CN103344250B (en) | 2013-07-05 | 2013-07-05 | Flight assistance method and device for unmanned aerial vehicle |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201710338641.1A Expired - Fee Related CN107168360B (en) | 2013-07-05 | 2013-07-05 | Flight assistance method and device for unmanned aerial vehicle |
Family Applications After (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201310282385.0A Active CN103344250B (en) | 2013-07-05 | 2013-07-05 | Flight assistance method and device for unmanned aerial vehicle |
Country Status (1)
| Country | Link |
|---|---|
| CN (3) | CN107168360B (en) |
Families Citing this family (19)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112947510A (en) | 2014-09-30 | 2021-06-11 | 深圳市大疆创新科技有限公司 | System and method for flight simulation |
| CN113628500A (en) | 2014-09-30 | 2021-11-09 | 深圳市大疆创新科技有限公司 | System and method for supporting analog mobility |
| CN105438488B (en) * | 2014-09-30 | 2018-07-17 | 深圳市大疆创新科技有限公司 | Aircraft and its control method and aerocraft system |
| WO2016065513A1 (en) * | 2014-10-27 | 2016-05-06 | 深圳市大疆创新科技有限公司 | Method and apparatus for prompting position of air vehicle |
| CN108227748A (en) * | 2015-05-18 | 2018-06-29 | 深圳市大疆创新科技有限公司 | Unmanned aerial vehicle (UAV) control method and apparatus based on Headless mode |
| JP6657030B2 (en) * | 2015-07-17 | 2020-03-04 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | Unmanned aerial vehicle, flight control method, flight basic program and forced movement program |
| WO2017013840A1 (en) * | 2015-07-17 | 2017-01-26 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカ | Unmanned flight vehicle, flight control method, flight basic program, and forced movement program |
| CN106598063B (en) * | 2015-10-14 | 2021-10-26 | 松下电器(美国)知识产权公司 | Unmanned aerial vehicle and flight control method |
| JP6767802B2 (en) * | 2015-11-30 | 2020-10-14 | パナソニック インテレクチュアル プロパティ コーポレーション オブ アメリカPanasonic Intellectual Property Corporation of America | Unmanned aerial vehicle and its flight control method |
| CN105717930A (en) * | 2016-01-19 | 2016-06-29 | 深圳一电科技有限公司 | Method, device and system for controlling drone |
| CN106843243A (en) * | 2016-03-22 | 2017-06-13 | 北京京东尚科信息技术有限公司 | The management method of UAS and unmanned plane route |
| CN106292799B (en) * | 2016-08-25 | 2018-10-23 | 北京奇虎科技有限公司 | Unmanned plane, remote control and its control method |
| WO2018045654A1 (en) * | 2016-09-09 | 2018-03-15 | 深圳市大疆创新科技有限公司 | Method and system for displaying state of mobile device and control device |
| CN108319274A (en) * | 2017-01-16 | 2018-07-24 | 吕佩剑 | A kind of graphic display method of unmanned vehicle position |
| CN108475064B (en) * | 2017-05-16 | 2021-11-05 | 深圳市大疆创新科技有限公司 | Method, apparatus, and computer-readable storage medium for apparatus control |
| CN108170475A (en) * | 2017-12-21 | 2018-06-15 | 中国船舶重工集团公司第七0七研究所 | A kind of embedded software development method for mobile heading reference equipment |
| CN109001647A (en) * | 2018-10-08 | 2018-12-14 | 成都戎创航空科技有限公司 | A kind of unmanned plane battery capacity early warning system |
| CN113110581B (en) * | 2021-04-19 | 2022-09-13 | 西北工业大学 | A Nonlinear Aircraft Position Keeping Control Method Based on Combination of Main and Auxiliary Systems |
| CN119611683A (en) * | 2024-11-01 | 2025-03-14 | 广船国际有限公司 | Method for testing navigation lights of unmanned aerial vehicle |
Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5716032A (en) * | 1996-04-22 | 1998-02-10 | United States Of America As Represented By The Secretary Of The Army | Unmanned aerial vehicle automatic landing system |
| CN1515914A (en) * | 2001-12-29 | 2004-07-28 | 北京航空航天大学 | A method of operating an antenna tracking device for an unmanned helicopter |
| CN102183941A (en) * | 2011-05-31 | 2011-09-14 | 河北科技大学 | Civil-mobile-phone-network-based ultra-long-range unmanned aerial vehicle control system |
| CN202285045U (en) * | 2011-11-03 | 2012-06-27 | 南京鑫轩电子系统工程有限公司 | Angle tracking system |
| CN102854887A (en) * | 2012-09-06 | 2013-01-02 | 北京工业大学 | A UAV trajectory planning and remote synchronization control method |
| CN202694151U (en) * | 2011-12-16 | 2013-01-23 | 新时代集团国防科技研究中心 | Control terminal device for unmanned aircraft |
Family Cites Families (17)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPS57186111A (en) * | 1981-05-13 | 1982-11-16 | Nissan Motor Co Ltd | Map display device for vehicle |
| JP3412973B2 (en) * | 1995-07-21 | 2003-06-03 | 株式会社東芝 | ISAR image target identification processing device |
| JP2830843B2 (en) * | 1996-05-15 | 1998-12-02 | 日本電気株式会社 | 3D information display method for terminal control |
| JPH11248468A (en) * | 1998-03-05 | 1999-09-17 | Mitsubishi Electric Corp | Map data display device |
| WO2004113836A1 (en) * | 2003-06-20 | 2004-12-29 | Mitsubishi Denki Kabushiki Kaisha | Picked-up image display method |
| US9057627B2 (en) * | 2005-03-15 | 2015-06-16 | Fci Associates, Inc. | Low cost flight instrumentation system |
| FR2887329B1 (en) * | 2005-06-21 | 2007-08-03 | Airbus France Sas | METHOD AND DISPLAY DEVICE FOR AN AIRCRAFT THAT FOLLOWS A FLIGHT PLAN |
| US7873240B2 (en) * | 2005-07-01 | 2011-01-18 | The Boeing Company | Method for analyzing geographic location and elevation data and geocoding an image with the data |
| FR2897975B1 (en) * | 2006-02-28 | 2008-10-17 | Airbus France Sas | METHOD AND DEVICE FOR ASSISTING THE CONTROL OF AN AIRCRAFT. |
| CN101493699B (en) * | 2009-03-04 | 2011-07-20 | 北京航空航天大学 | Aerial unmanned plane ultra-viewing distance remote control method |
| US8515596B2 (en) * | 2009-08-18 | 2013-08-20 | Honeywell International Inc. | Incremental position-based guidance for a UAV |
| CN102445947A (en) * | 2010-10-06 | 2012-05-09 | 鸿富锦精密工业(深圳)有限公司 | Control system and method of unmanned aerial vehicle |
| CN102620731A (en) * | 2011-01-26 | 2012-08-01 | 英华达(南京)科技有限公司 | Handheld apparatus capable of transmitting positioning data and method for transmitting positioning data thereof |
| CN102419171A (en) * | 2011-08-10 | 2012-04-18 | 王桥生 | Disaster detection electronic mapping system based on UAV aerial photography |
| US8798820B2 (en) * | 2011-09-08 | 2014-08-05 | The Boeing Company | Consistent localizer captures |
| WO2013103403A2 (en) * | 2011-09-30 | 2013-07-11 | Aurora Flight Sciences Corporation | Hardware-based weight and range limitation system, apparatus and method |
| CN103176475A (en) * | 2013-02-27 | 2013-06-26 | 广东工业大学 | Ground station for unmanned aerial vehicles |
- 2013-07-05 CN CN201710338641.1A patent/CN107168360B/en not_active Expired - Fee Related
- 2013-07-05 CN CN201710335356.4A patent/CN107065914B/en not_active Expired - Fee Related
- 2013-07-05 CN CN201310282385.0A patent/CN103344250B/en active Active
Patent Citations (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US5716032A (en) * | 1996-04-22 | 1998-02-10 | United States Of America As Represented By The Secretary Of The Army | Unmanned aerial vehicle automatic landing system |
| CN1515914A (en) * | 2001-12-29 | 2004-07-28 | 北京航空航天大学 | A method of operating an antenna tracking device for an unmanned helicopter |
| CN102183941A (en) * | 2011-05-31 | 2011-09-14 | 河北科技大学 | Civil-mobile-phone-network-based ultra-long-range unmanned aerial vehicle control system |
| CN202285045U (en) * | 2011-11-03 | 2012-06-27 | 南京鑫轩电子系统工程有限公司 | Angle tracking system |
| CN202694151U (en) * | 2011-12-16 | 2013-01-23 | 新时代集团国防科技研究中心 | Control terminal device for unmanned aircraft |
| CN102854887A (en) * | 2012-09-06 | 2013-01-02 | 北京工业大学 | A UAV trajectory planning and remote synchronization control method |
Also Published As
| Publication number | Publication date |
|---|---|
| CN107168360A (en) | 2017-09-15 |
| CN103344250B (en) | 2017-06-09 |
| CN107168360B (en) | 2021-03-30 |
| CN107065914A (en) | 2017-08-18 |
| CN103344250A (en) | 2013-10-09 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN107065914B (en) | Flight assistance method and device for unmanned aerial vehicle | |
| CN107256030B (en) | Remote control terminal, flight assistance system and method of unmanned aerial vehicle | |
| CN107203219B (en) | Flight assistance system and method for unmanned aerial vehicle | |
| US10645300B2 (en) | Methods and apparatus for image processing | |
| US10599149B2 (en) | Salient feature based vehicle positioning | |
| US10802509B2 (en) | Selective processing of sensor data | |
| US10475209B2 (en) | Camera calibration | |
| US20200007746A1 (en) | Systems, methods, and devices for setting camera parameters | |
| CN107923567B (en) | Pan-tilt for image capture | |
| CN109219785B (en) | Multi-sensor calibration method and system | |
| CN108351653B (en) | System and method for UAV flight control | |
| JP7055324B2 (en) | Display device | |
| JP6878567B2 (en) | 3D shape estimation methods, flying objects, mobile platforms, programs and recording media | |
| JP2017508166A (en) | Sensor fusion | |
| CN109564434B (en) | System and method for positioning a movable object | |
| CN109032184B (en) | Flight control method and device of aircraft, terminal equipment and flight control system | |
| CN109073386A (en) | A kind of prompt and determining method, controlling terminal in unmanned vehicle orientation |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |
| | CF01 | Termination of patent right due to non-payment of annual fee | Granted publication date: 20200428 |