Disclosure of Invention
The invention provides a method and a device for determining the position and the flight direction of an unmanned aerial vehicle and the unmanned aerial vehicle, so as to improve the positioning accuracy of the unmanned aerial vehicle.
The technical scheme of the invention is realized as follows:
calculating coordinates of m fixed points under a camera coordinate system according to an image which is shot by a camera carried by a cradle head of the unmanned aerial vehicle and contains m fixed points, wherein m is more than or equal to 4;
calculating the conversion relation from the camera coordinate system to the world coordinate system according to the coordinates of the m fixed points under the world coordinate system and the coordinates under the camera coordinate system;
according to the rotation angle of the cradle head relative to the unmanned aerial vehicle when the camera shoots an image containing m fixed points, calculating the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system;
and calculating the position coordinates of the unmanned aerial vehicle in the world coordinate system and the flight direction vectors of the unmanned aerial vehicle in the world coordinate system according to the conversion relation from the camera coordinate system to the world coordinate system and the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system.
Through the embodiment, the position coordinate of the unmanned aerial vehicle in the world coordinate system and the flight direction vector of the unmanned aerial vehicle in the world coordinate system can be calculated simultaneously, so that the position and the flight direction of the unmanned aerial vehicle can be obtained simultaneously, and the positioning precision of the unmanned aerial vehicle is improved.
The calculating the conversion relation from the camera coordinate system to the world coordinate system comprises the following steps:
calculating a rotation matrix R and a translation vector T from a camera coordinate system to a world coordinate system;
the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system is calculated, which comprises the following steps:
calculating a rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system;
the calculating of the position coordinates of the unmanned aerial vehicle in the world coordinate system and the flight direction vector of the unmanned aerial vehicle in the world coordinate system comprises:
the position coordinate of the unmanned aerial vehicle in the world coordinate system is determined to be the translation vector T, and the flight direction vector of the unmanned aerial vehicle in the world coordinate system is calculated by multiplying the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system by the rotation matrix Q and then by the rotation matrix R.
When the direction straight ahead of the unmanned aerial vehicle (the y-axis of the unmanned aerial vehicle coordinate system) is adopted as the flight direction of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system, the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system is [0, 1, 0].
The rotation angle of the cradle head relative to the unmanned aerial vehicle comprises: a roll angle θ, a pitch angle φ and a yaw angle ψ.
The conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system is calculated, which comprises the following steps:
according to the following principle: the unmanned aerial vehicle is first rotated by the angle ψ around the axis corresponding to the yaw angle, then rotated by the angle φ around the axis corresponding to the pitch angle, and finally rotated by the angle θ around the axis corresponding to the roll angle, so as to construct the rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system.
The rotation matrix Q is accordingly the product of the three elemental rotation matrices corresponding to the yaw angle ψ, the pitch angle φ and the roll angle θ, applied in that order.
an apparatus for determining a position and a flight direction of an unmanned aerial vehicle, the apparatus being applied to an unmanned aerial vehicle having a pan-tilt on which a camera and an angle sensor are mounted, the apparatus comprising:
the rotation angle acquisition module is used for acquiring the rotation angle of the cradle head relative to the unmanned aerial vehicle when the camera shoots an image containing m fixed points from the angle sensor, wherein m is more than or equal to 4;
the fixed point coordinate calculation module is used for calculating the coordinates of the m fixed points under a camera coordinate system according to the image which is shot by the camera and contains the m fixed points;
the first coordinate system conversion relation calculation module is used for calculating the conversion relation from the camera coordinate system to the world coordinate system according to the coordinates of the m fixed points under the world coordinate system and the coordinates under the camera coordinate system;
the second coordinate system conversion relation calculation module is used for calculating the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system according to the rotation angle of the camera relative to the unmanned aerial vehicle;
and the unmanned aerial vehicle position and flight direction calculation module is used for calculating the position coordinates of the unmanned aerial vehicle in the world coordinate system and the flight direction vectors of the unmanned aerial vehicle in the world coordinate system according to the conversion relation from the camera coordinate system to the world coordinate system and the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system.
The first coordinate system conversion relation calculating module calculates a conversion relation from a camera coordinate system to a world coordinate system, including:
calculating a rotation matrix R and a translation vector T from a camera coordinate system to a world coordinate system;
the second coordinate system conversion relation calculating module calculates a conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system, including:
calculating a rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system;
the unmanned aerial vehicle position and flight direction calculation module calculates the position coordinate of the unmanned aerial vehicle in the world coordinate system and the flight direction vector of the unmanned aerial vehicle in the world coordinate system, which comprises:
the position coordinate of the unmanned aerial vehicle in the world coordinate system is determined to be the translation vector T, and the flight direction vector of the unmanned aerial vehicle in the world coordinate system is calculated by multiplying the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system by the rotation matrix Q and then by the rotation matrix R.
When the unmanned aerial vehicle position and flight direction calculation module adopts the direction straight ahead of the unmanned aerial vehicle (the y-axis of the unmanned aerial vehicle coordinate system) as the flight direction of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system, the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system is [0, 1, 0].
The rotation angle of the cradle head relative to the unmanned aerial vehicle acquired by the rotation angle acquisition module comprises: a roll angle θ, a pitch angle φ and a yaw angle ψ.
The second coordinate system conversion relation calculating module calculates a conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system, including:
according to the following principle: the unmanned aerial vehicle is first rotated by the angle ψ around the axis corresponding to the yaw angle, then rotated by the angle φ around the axis corresponding to the pitch angle, and finally rotated by the angle θ around the axis corresponding to the roll angle, so as to construct the rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system.
The rotation matrix Q is accordingly the product of the three elemental rotation matrices corresponding to the yaw angle ψ, the pitch angle φ and the roll angle θ, applied in that order.
A non-transitory computer readable storage medium storing instructions that, when executed by a processor, cause the processor to perform the steps of the method for determining the position and the flight direction of an unmanned aerial vehicle described above.
An unmanned aerial vehicle, comprising: a flight controller and a cradle head carrying a camera and an angle sensor, wherein the flight controller performs the method for determining the position and the flight direction of an unmanned aerial vehicle described above.
According to the method, the conversion relation from the camera coordinate system to the world coordinate system is calculated according to the image which is shot by the camera and contains m fixed points, m is more than or equal to 4, and the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system is calculated according to the rotation angle of the camera relative to the unmanned aerial vehicle, so that the position coordinate of the unmanned aerial vehicle in the world coordinate system and the flight direction vector of the unmanned aerial vehicle in the world coordinate system are obtained, the position and the flight direction of the unmanned aerial vehicle can be obtained at the same time, and the positioning precision of the unmanned aerial vehicle is improved.
Detailed Description
The invention will be described in further detail with reference to the accompanying drawings and specific examples.
The related art proposes an unmanned aerial vehicle positioning and target tracking method based on a two-dimensional tag. In the method, a camera and a cradle head are carried on the unmanned aerial vehicle, the camera is aligned with a two-dimensional tag whose position in the world coordinate system is known, the position of the camera in the world coordinate system is calculated from the position of the image of the two-dimensional tag in the camera, and the position of the camera in the world coordinate system is taken as the coordinate of the unmanned aerial vehicle. The method comprises the following steps:
A. Calibrate the camera. An internal parameter matrix of the camera is calibrated according to the images shot by the camera:

    K = | fx  0   u0 |
        | 0   fy  v0 |
        | 0   0   1  |

wherein fx and fy are the focal lengths of the camera and (u0, v0) is the optical center of the camera.
B. Detect the two-dimensional tag. The two-dimensional tag is identified in the image shot by the cradle head camera to obtain its homogeneous coordinates [x, y, 1] in the image, wherein x and y are the horizontal and vertical coordinates expressed in pixel values.
C. Locate the two-dimensional tag. According to the internal parameters of the camera and the coordinates of the two-dimensional tag in the image, the external parameters of the camera are calculated by the following formula:

    s · [x, y, 1]^T = K · | r1  r2  r3  t1 | · [X, Y, Z, 1]^T
                          | r4  r5  r6  t2 |
                          | r7  r8  r9  t3 |

wherein s represents a scale factor and K is the internal parameter matrix from step A; the second matrix on the right of the equation is the external parameter matrix of the camera, i.e. the transformation matrix from the world coordinate system to the image coordinate system, in which r1-r9 represent the rotational relationship between the two coordinate systems and t1-t3 represent the translational relationship between the two coordinate systems; and [X, Y, Z, 1] is the coordinate of the two-dimensional tag in the world coordinate system.
D. Calculate the coordinates of the camera in the world coordinate system, namely the coordinates of the unmanned aerial vehicle, from the external parameters of the camera.
E. Take the moving speed of the two-dimensional tag in the picture and the distance of the two-dimensional tag relative to the unmanned aerial vehicle as inputs, and control the camera so that it always aims at the two-dimensional tag.
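To make steps A-C concrete, the following sketch builds a pinhole internal parameter matrix and projects a world point through assumed external parameters down to pixel coordinates. All numeric values (focal lengths, optical center, pose and tag position) are illustrative examples, not values taken from the related art.

```python
import numpy as np

# Internal parameter matrix K (step A); fx, fy, u0, v0 are example values
# that would normally come from camera calibration.
fx, fy = 800.0, 800.0        # focal lengths in pixels (assumed)
u0, v0 = 320.0, 240.0        # optical center in pixels (assumed)
K = np.array([[fx, 0.0, u0],
              [0.0, fy, v0],
              [0.0, 0.0, 1.0]])

# External parameter matrix [R | t] (step C): identity rotation and a
# camera placed 5 m from the tag along the optical axis (assumed pose).
Rt = np.hstack([np.eye(3), np.array([[0.0], [0.0], [5.0]])])

# World coordinate of the tag [X, Y, Z, 1] (example value).
Xw = np.array([1.0, 0.5, 0.0, 1.0])

p = K @ Rt @ Xw              # s * [x, y, 1]
s = p[2]                     # scale factor s
x, y = p[0] / s, p[1] / s    # pixel coordinates of the tag (step B)
print(x, y)                  # prints 480.0 320.0
```

The scale factor s recovered in the last step is the depth of the tag along the optical axis of the assumed pose.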
The disadvantage of the above method is that the coordinates of the unmanned aerial vehicle are simply equated with the coordinates of the camera, without considering the conversion from the camera direction to the direction of the unmanned aerial vehicle; in practice, the camera direction must be rotated to obtain the direction of the unmanned aerial vehicle.
Fig. 1 is a flowchart of a method for determining a position and a flight direction of an unmanned aerial vehicle according to an embodiment of the present invention, which specifically includes the following steps:
step 101: the unmanned aerial vehicle flight controller controls the cradle head of the unmanned aerial vehicle to rotate, so that a camera carried by the cradle head shoots m preset fixed points in the surrounding environment, and m is more than or equal to 4.
Step 102: and the flight controller of the unmanned aerial vehicle determines that the camera shoots a preset m fixed points, and then obtains the rotation angle of the cradle head relative to the unmanned aerial vehicle from the angle sensor carried by the cradle head.
Step 103: and the flight controller of the unmanned aerial vehicle calculates coordinates of the m fixed points under a camera coordinate system according to the image which is shot by the camera and contains the m fixed points.
Step 104: and the flight controller of the unmanned aerial vehicle calculates the conversion relation from the camera coordinate system to the world coordinate system according to the coordinates of the m fixed points under the world coordinate system and the coordinates under the camera coordinate system.
Step 105: and the flight controller of the unmanned aerial vehicle calculates the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system according to the rotation angle of the camera relative to the unmanned aerial vehicle.
Step 106: and the flight controller of the unmanned aerial vehicle calculates the position coordinates of the unmanned aerial vehicle in the world coordinate system and the flight direction vectors of the unmanned aerial vehicle in the world coordinate system according to the conversion relation from the camera coordinate system to the world coordinate system and the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system.
Therefore, through the images of the preset m fixed points shot by the camera carried by the unmanned aerial vehicle cloud deck and the rotating angle of the cloud deck relative to the unmanned aerial vehicle when the m fixed points are shot, the conversion relation from the camera coordinate system to the world coordinate system and the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system are calculated, so that the position coordinate of the unmanned aerial vehicle in the world coordinate system and the flight direction vector of the unmanned aerial vehicle in the world coordinate system are calculated, the position and the flight direction of the unmanned aerial vehicle can be obtained simultaneously, and the positioning precision of the unmanned aerial vehicle is improved.
The following gives a detailed description of the conversion of the camera coordinate system to the world coordinate system, and the conversion of the unmanned aerial vehicle coordinate system to the camera coordinate system:
fig. 2 is a flowchart of a method for determining a position and a flight direction of an unmanned aerial vehicle according to another embodiment of the present invention, which specifically includes the following steps:
step 201: the coordinates of m fixed points in the surrounding environment under a world coordinate system are stored on a flight controller of the unmanned aerial vehicle in advance, and m is more than or equal to 4.
The "surrounding environment" here is the flight environment of the unmanned aerial vehicle.
Calculating the conversion relations among the world coordinate system, the camera coordinate system and the unmanned aerial vehicle coordinate system means calculating the rotation matrix and the translation vector for the conversion between two coordinate systems, wherein the rotation matrix is a 3×3 matrix and the translation vector is a 3-dimensional vector, i.e. 12 unknown quantities in total need to be obtained; considering that the coordinates of each fixed point are a three-dimensional vector, at least 4 fixed points are required to solve an equation system containing 12 unknowns and obtain a unique solution.
In practical applications, the m fixed points may all be located on one marker; the marker may be planar, for example a piece of paper attached to a wall, a license plate, or a display screen.
Feature points with obvious features on the marker can be directly used as the fixed points, so that the fixed points can be found directly according to their features; alternatively, points having a fixed relationship with the feature points can be used as the fixed points, in which case the feature points are first found according to their features, and the fixed points are then found according to the relationship between the fixed points and the feature points.
For example: 1 or more feature points are marked on the marker, and the positions of the feature points on the marker are accurately indicated. Taking 4 feature points as an example, the 4 feature points may be connected into a quadrilateral, the 4 corners of the quadrilateral being the positions of the 4 feature points, and the 4 feature points may be directly used as the fixed points. Alternatively, one feature point may be represented by one quadrilateral whose diagonal intersection is the position of the feature point; since the diagonal intersection has a fixed relationship with the four vertices of the quadrilateral, the four vertices are taken as the 4 fixed points.
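The diagonal-intersection construction above can be sketched as follows; the helper name and the coordinates are illustrative only.

```python
def diagonal_intersection(a, b, c, d):
    """Intersection of diagonals A-C and B-D of quadrilateral ABCD.

    Each argument is an (x, y) pair; the vertices are given in order
    around the quadrilateral, so the diagonals are A-C and B-D.
    """
    (ax, ay), (bx, by), (cx, cy), (dx, dy) = a, b, c, d
    # Solve A + t*(C - A) = B + u*(D - B) for t with Cramer's rule.
    r1, r2 = cx - ax, cy - ay          # direction of diagonal A-C
    s1, s2 = dx - bx, dy - by          # direction of diagonal B-D
    denom = r1 * s2 - r2 * s1          # zero only for degenerate quadrilaterals
    t = ((bx - ax) * s2 - (by - ay) * s1) / denom
    return (ax + t * r1, ay + t * r2)

# Unit square: diagonals cross at the center.
print(diagonal_intersection((0, 0), (1, 0), (1, 1), (0, 1)))  # prints (0.5, 0.5)
```

Note that a marker viewed under perspective appears as a distorted quadrilateral, but the intersection of the imaged diagonals still corresponds to the imaged feature point, since projective transformations preserve incidence of lines.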
The marker stores the position of each feature point in the world coordinate system by means of text or a two-dimensional code, or carries an index mark or index number corresponding to this information (the unmanned aerial vehicle can acquire the corresponding world coordinate system position information according to the index mark or index number).
The marker also has a feature indicating that it is a positioning marker; this feature may be a special figure, a color, the shape of the marker itself, or the aforementioned two-dimensional code, text or index mark/index number containing the special information.
When the camera on the cradle head of the unmanned aerial vehicle is aligned with the marker, the position information of the feature points in the world coordinate system can be read from the text, the two-dimensional code or the index mark/index number on the marker, so that the world coordinates of the m fixed points are obtained.
Further, a plurality of markers spaced apart from one another may be used in combination for positioning an unmanned aerial vehicle with a larger flight range.
Step 202: when the position and the flight direction of the unmanned aerial vehicle need to be acquired, the flight controller of the unmanned aerial vehicle controls the cradle head to rotate, so that the camera shoots images simultaneously containing m fixed points.
In particular, the flight controller may pre-store the characteristics of the m fixed points, such as shape, color and other identification features, control the cradle head to rotate in each direction in sequence according to a preset step length, and detect in real time whether the m fixed points appear simultaneously in the image shot by the camera; if so, it is determined that the camera has shot an image simultaneously containing the m fixed points.
Step 203: when the camera shoots an image simultaneously comprising m fixed points, the unmanned aerial vehicle's flight controller acquires the rotation angle of the cradle head relative to the unmanned aerial vehicle from the angle sensor carried by the cradle head: roll angle (roll) θ, pitch angle (pitch) phi and yaw angle (yaw)
Step 204: and the flight controller of the unmanned aerial vehicle calculates coordinates of the m fixed points under a camera coordinate system according to the image which is shot by the camera and contains the m fixed points.
This belongs to the prior art: the coordinates of the m fixed points in the camera coordinate system can be calculated from the internal parameters of the camera (such as the focal length), the coordinates of the m fixed points in the world coordinate system and their coordinates on the image.
Step 205: the unmanned aerial vehicle flight controller calculates a rotation matrix R and a translation vector T from the camera coordinate system to the world coordinate system according to the coordinates of the m fixed points in the world coordinate system and the coordinates of the m fixed points in the camera coordinate system.
For example: assuming that the coordinate of any fixed point i (1 ≤ i ≤ m) in the world coordinate system is pos_i and its coordinate in the camera coordinate system is c_i, the following equation holds: pos_i = c_i × R + T, where R is a 3×3 matrix and T is a 3-dimensional vector; after the coordinates of the m fixed points are substituted into the above equation, R and T can be solved.
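The solve in step 205 can be sketched as follows. Because the equation pos = c × R + T acts column-by-column on the world coordinates, each column of R together with the corresponding component of T satisfies a 4×4 linear system built from four points, which has a unique solution when the four points are not coplanar. The rotation, translation and point coordinates below are synthetic example data for illustration.

```python
import numpy as np

# Ground-truth rotation (row-vector convention) and translation, used only
# to synthesize example data for this sketch.
angle = np.deg2rad(30.0)
R_true = np.array([[np.cos(angle), np.sin(angle), 0.0],
                   [-np.sin(angle), np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
T_true = np.array([1.0, 2.0, 3.0])

# Four non-coplanar fixed points in the camera coordinate system.
c = np.array([[0.0, 0.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
pos = c @ R_true + T_true            # their world coordinates: pos_i = c_i R + T

# pos_i = c_i R + T is linear in (R, T): stack [c | 1] and solve per column.
A = np.hstack([c, np.ones((4, 1))])  # 4x4, invertible for non-coplanar points
X = np.linalg.solve(A, pos)          # rows 0-2 give R, row 3 gives T
R_est, T_est = X[:3], X[3]
print(np.round(R_est, 6))
print(np.round(T_est, 6))
```

With noisy measurements or more than four points, the same system would be solved in a least-squares sense instead of exactly.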
Step 206: The flight controller of the unmanned aerial vehicle calculates a rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system according to the rotation angle of the camera relative to the unmanned aerial vehicle: the roll angle θ, the pitch angle φ and the yaw angle ψ.
For example, the following principle can be used: the unmanned aerial vehicle is first rotated by the angle ψ around the axis corresponding to the yaw angle, then rotated by the angle φ around the axis corresponding to the pitch angle, and finally rotated by the angle θ around the axis corresponding to the roll angle, so that the rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system is constructed; the rotation matrix Q is then the product of the three elemental rotation matrices corresponding to these three rotations.
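A sketch of this construction follows. The axis assignment (yaw about the z-axis, pitch about the x-axis, roll about the y-axis, with the y-axis pointing straight ahead) and the row-vector convention (matching pos = c × R + T above) are assumptions for illustration; the actual axes depend on how the cradle head and body frames are defined.

```python
import numpy as np

def rotation_q(theta, phi, psi):
    """Rotation matrix Q for row vectors: v_cam = v_drone @ Q.

    Applies yaw psi, then pitch phi, then roll theta. The axis
    assignment (yaw about z, pitch about x, roll about y, with y
    pointing straight ahead) is an assumed convention.
    """
    c, s = np.cos, np.sin
    Rz = np.array([[c(psi), s(psi), 0],                  # yaw about z
                   [-s(psi), c(psi), 0],
                   [0, 0, 1]])
    Rx = np.array([[1, 0, 0],                            # pitch about x
                   [0, c(phi), s(phi)],
                   [0, -s(phi), c(phi)]])
    Ry = np.array([[c(theta), 0, -s(theta)],             # roll about y
                   [0, 1, 0],
                   [s(theta), 0, c(theta)]])
    return Rz @ Rx @ Ry  # applied left-to-right in row-vector convention

# Zero angles give the identity; a 90-degree yaw turns the "straight
# ahead" vector [0, 1, 0] toward [-1, 0, 0].
Q = rotation_q(0.0, 0.0, np.pi / 2)
print(np.round(np.array([0.0, 1.0, 0.0]) @ Q, 6))
```

Any proper rotation matrix built this way is orthonormal, so Q can be inverted simply by transposing it.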
Step 207: According to the rotation matrix R and the translation vector T from the camera coordinate system to the world coordinate system and the rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system, the flight controller of the unmanned aerial vehicle obtains the rotation matrix from the unmanned aerial vehicle coordinate system to the world coordinate system as Q × R; the position coordinate of the current unmanned aerial vehicle in the world coordinate system is the translation vector T, and the flight direction vector of the current unmanned aerial vehicle in the world coordinate system is dir = [0, 1, 0] × Q × R, wherein [0, 1, 0] is the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system when the direction straight ahead of the unmanned aerial vehicle (the y-axis of the unmanned aerial vehicle coordinate system) is adopted as the flight direction.
Because the camera is fixed on the unmanned aerial vehicle, the translation vector from the camera coordinate system to the world coordinate system is the same as the translation vector from the unmanned aerial vehicle coordinate system to the world coordinate system; and since the position of the unmanned aerial vehicle is the origin of the unmanned aerial vehicle coordinate system, the translation vector T from the unmanned aerial vehicle coordinate system to the world coordinate system is the position coordinate of the unmanned aerial vehicle in the world coordinate system.
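Putting step 207 together, a minimal sketch (with identity matrices standing in for Q and R, and an example T, all assumed values) computes the position and flight direction in the world frame:

```python
import numpy as np

# Example inputs: identity Q and R mean the gimbal and camera are aligned
# with the world axes; in practice Q and R come from steps 205 and 206.
Q = np.eye(3)                          # drone frame -> camera frame
R = np.eye(3)                          # camera frame -> world frame
T = np.array([10.0, 20.0, 5.0])        # example translation vector

position = T                           # drone position in the world frame
forward = np.array([0.0, 1.0, 0.0])    # straight ahead in the body frame (assumed y-forward)
direction = forward @ Q @ R            # flight direction in the world frame
print(position, direction)
```

With real Q and R, direction is the body-frame forward vector rotated through the gimbal angles and then into the world frame.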
When the unmanned aerial vehicle cruises, after the camera is aligned with the m fixed points for the first time, the camera can be controlled to track the m fixed points, i.e. to remain aligned with them at all times; alternatively, the camera can be controlled to align with the m fixed points at intervals, or whenever a positioning instruction for the unmanned aerial vehicle is received. In this way, the position and the flight direction of the unmanned aerial vehicle in the world coordinate system can be acquired in real time, periodically, or on demand through the technical scheme provided by the invention.
Fig. 3 is a schematic structural diagram of an apparatus for determining a position and a flight direction of an unmanned aerial vehicle according to an embodiment of the present invention, where the apparatus is applied to an unmanned aerial vehicle, and the unmanned aerial vehicle has a pan-tilt on which a camera and an angle sensor are mounted, and the apparatus mainly includes: the cradle head control module 31, the rotation angle acquisition module 32, the fixed point coordinate calculation module 33, the first coordinate system conversion relation calculation module 34, the second coordinate system conversion relation calculation module 35 and the unmanned aerial vehicle position and flight direction calculation module 36, wherein:
and the cradle head control module 31 is used for controlling the cradle head of the unmanned aerial vehicle to rotate so that the camera shoots m preset fixed points in the surrounding environment, and m is more than or equal to 4.
The rotation angle obtaining module 32 is configured to obtain, from the angle sensor, a rotation angle of the pan-tilt relative to the unmanned aerial vehicle when the camera captures m fixed points.
The fixed point coordinate calculating module 33 is configured to calculate coordinates of the m fixed points in the camera coordinate system according to an image including the m fixed points captured by the camera.
The first coordinate system conversion relation calculation module 34 is configured to calculate the conversion relation from the camera coordinate system to the world coordinate system according to the pre-stored coordinates of the m fixed points in the world coordinate system and the coordinates of the m fixed points in the camera coordinate system calculated by the fixed point coordinate calculation module 33.
The second coordinate system conversion relation calculating module 35 is configured to calculate a conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system according to the rotation angle of the camera relative to the unmanned aerial vehicle acquired by the rotation angle acquiring module 32.
The unmanned aerial vehicle position and flight direction calculation module 36 is configured to calculate a position coordinate of the unmanned aerial vehicle in the world coordinate system and a flight direction vector of the unmanned aerial vehicle in the world coordinate system according to the conversion relationship from the camera coordinate system to the world coordinate system calculated by the first coordinate system conversion relationship calculation module 34 and the conversion relationship from the unmanned aerial vehicle coordinate system to the camera coordinate system calculated by the second coordinate system conversion relationship calculation module 35.
In practical applications, the first coordinate system conversion relation calculation module 34 calculates a conversion relation from a camera coordinate system to a world coordinate system, including:
calculating a rotation matrix R and a translation vector T from a camera coordinate system to a world coordinate system;
the second coordinate system conversion relation calculation module 35 calculates a conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system, including:
calculating a rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system;
the unmanned aerial vehicle position and flight direction calculation module 36 calculates the position coordinates of the unmanned aerial vehicle in the world coordinate system and the flight direction vector of the unmanned aerial vehicle in the world coordinate system, including:
the position coordinate of the unmanned aerial vehicle in the world coordinate system is determined to be the translation vector T, and the flight direction vector of the unmanned aerial vehicle in the world coordinate system is calculated by multiplying the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system by the rotation matrix Q and then by the rotation matrix R.
When the unmanned aerial vehicle position and flight direction calculation module adopts the direction straight ahead of the unmanned aerial vehicle (the y-axis of the unmanned aerial vehicle coordinate system) as the flight direction of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system, the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system is [0, 1, 0].
In practical applications, the rotation angle of the camera relative to the unmanned aerial vehicle acquired by the rotation angle acquisition module 32 includes: the roll angle θ, the pitch angle φ and the yaw angle ψ.
The second coordinate system conversion relation calculating module 35 calculates a conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system according to the rotation angle of the camera relative to the unmanned aerial vehicle, including:
according to the following principle: the unmanned aerial vehicle is first rotated by the angle ψ around the axis corresponding to the yaw angle, then rotated by the angle φ around the axis corresponding to the pitch angle, and finally rotated by the angle θ around the axis corresponding to the roll angle, so as to construct the rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system.
Wherein the rotation matrix Q is the product of the three elemental rotation matrices corresponding to the yaw angle ψ, the pitch angle φ and the roll angle θ, applied in that order.
In practical application, the cradle head control module 31 is specifically configured to: control the cradle head to rotate according to the preset characteristics of the m fixed points, so that the camera shoots an image simultaneously containing the m fixed points.
The embodiment of the invention also provides an unmanned aerial vehicle, which comprises a flight controller and a cradle head carrying a camera and an angle sensor, wherein the flight controller executes the method for determining the position and the flight direction of the unmanned aerial vehicle in the steps 101-106 or in the steps 201-207.
Embodiments of the present invention also provide a non-transitory computer readable storage medium storing instructions that, when executed by a processor, cause the processor to perform the steps of the method of determining a position and a direction of flight of a drone as described in steps 101-106, or in steps 201-207.
Embodiments of the present invention also provide an electronic device comprising a non-transitory computer readable storage medium as described above, and the above-described processor having access to the non-transitory computer readable storage medium.
The beneficial technical effects of the invention are as follows:
the position and the flight direction of the unmanned aerial vehicle in the world coordinate system can be obtained at the same time, and the positioning precision of the unmanned aerial vehicle is improved, so that the control precision of the unmanned aerial vehicle is improved; and because the GPS technology is not adopted for positioning, the GPS positioning device is not limited by the strength of GPS signals, has wider application range and can be used outdoors and indoors.
The foregoing description covers only preferred embodiments of the invention and is not intended to limit the invention; any modification, equivalent replacement, improvement or the like made within the spirit and principles of the invention shall fall within the scope of protection of the invention.