
CN111192318B - Method and device for determining position and flight direction of unmanned aerial vehicle and unmanned aerial vehicle - Google Patents


Info

Publication number
CN111192318B
CN111192318B (application CN201811356896.1A)
Authority
CN
China
Prior art keywords
coordinate system
aerial vehicle
unmanned aerial
camera
calculating
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811356896.1A
Other languages
Chinese (zh)
Other versions
CN111192318A (en)
Inventor
沈亢伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hangzhou Hikvision Digital Technology Co Ltd
Original Assignee
Hangzhou Hikvision Digital Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hangzhou Hikvision Digital Technology Co Ltd filed Critical Hangzhou Hikvision Digital Technology Co Ltd
Priority to CN201811356896.1A priority Critical patent/CN111192318B/en
Publication of CN111192318A publication Critical patent/CN111192318A/en
Application granted granted Critical
Publication of CN111192318B publication Critical patent/CN111192318B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/70: Determining position or orientation of objects or cameras
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/0088: Control of position, course, altitude or attitude of land, water, air or space vehicles, characterized by the autonomous decision making process, e.g. artificial intelligence, predefined behaviours
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/12: Target-seeking control
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00: Image analysis
    • G06T7/80: Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration

Landscapes

  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Theoretical Computer Science (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Business, Economics & Management (AREA)
  • Health & Medical Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Evolutionary Computation (AREA)
  • Game Theory and Decision Science (AREA)
  • Medical Informatics (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The invention provides a method and a device for determining the position and the flight direction of an unmanned aerial vehicle and the unmanned aerial vehicle. The method comprises the following steps: calculating coordinates of m fixed points under a camera coordinate system according to an image which is shot by a camera carried by a cradle head of the unmanned aerial vehicle and contains m fixed points, wherein m is more than or equal to 4; calculating the conversion relation from the camera coordinate system to the world coordinate system according to the coordinates of the m fixed points under the world coordinate system and the coordinates under the camera coordinate system; according to the rotation angle of the cradle head relative to the unmanned aerial vehicle when the camera shoots an image containing m fixed points, calculating the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system; and calculating the position coordinates of the unmanned aerial vehicle in the world coordinate system and the flight direction vectors of the unmanned aerial vehicle in the world coordinate system according to the conversion relation from the camera coordinate system to the world coordinate system and the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system. The invention improves the positioning precision of the unmanned aerial vehicle.

Description

Method and device for determining position and flight direction of unmanned aerial vehicle and unmanned aerial vehicle
Technical Field
The invention relates to the technical field of unmanned aerial vehicles, in particular to a method and a device for determining the position and the flight direction of an unmanned aerial vehicle and the unmanned aerial vehicle.
Background
Unmanned aerial vehicles have been widely used in fields such as aerial photography, electric power inspection, environmental monitoring, forest fire prevention, disaster investigation, anti-terrorism and lifesaving, military reconnaissance, and battlefield evaluation. They effectively overcome the shortcomings of manned aircraft in aerial work, reduce purchase and maintenance costs, and improve operational safety.
When an unmanned aerial vehicle carries out aerial work, its real-time position needs to be known in order to control the unmanned aerial vehicle.
Disclosure of Invention
The invention provides a method and a device for determining the position and the flight direction of an unmanned aerial vehicle and the unmanned aerial vehicle, so as to improve the positioning accuracy of the unmanned aerial vehicle.
The technical scheme of the invention is realized as follows:
calculating coordinates of m fixed points under a camera coordinate system according to an image which is shot by a camera carried by a cradle head of the unmanned aerial vehicle and contains m fixed points, wherein m is more than or equal to 4;
calculating the conversion relation from the camera coordinate system to the world coordinate system according to the coordinates of the m fixed points under the world coordinate system and the coordinates under the camera coordinate system;
according to the rotation angle of the cradle head relative to the unmanned aerial vehicle when the camera shoots an image containing m fixed points, calculating the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system;
and calculating the position coordinates of the unmanned aerial vehicle in the world coordinate system and the flight direction vectors of the unmanned aerial vehicle in the world coordinate system according to the conversion relation from the camera coordinate system to the world coordinate system and the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system.
Through the embodiment, the position coordinate of the unmanned aerial vehicle in the world coordinate system and the flight direction vector of the unmanned aerial vehicle in the world coordinate system can be calculated simultaneously, so that the position and the flight direction of the unmanned aerial vehicle can be obtained simultaneously, and the positioning precision of the unmanned aerial vehicle is improved.
The calculating the conversion relation from the camera coordinate system to the world coordinate system comprises the following steps:
calculating a rotation matrix R and a translation vector T from a camera coordinate system to a world coordinate system;
the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system is calculated, which comprises the following steps:
calculating a rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system;
the calculating of the position coordinates of the unmanned aerial vehicle in the world coordinate system and the flight direction vector of the unmanned aerial vehicle in the world coordinate system comprises:
the position coordinate of the unmanned aerial vehicle in the world coordinate system is determined to be a translation vector T, and the flight direction vector of the unmanned aerial vehicle in the world coordinate system is calculated as follows: the unmanned aerial vehicle's flight direction vector in the unmanned aerial vehicle coordinate system is multiplied by the rotation matrix Q and the rotation matrix R.
When the direction directly ahead of the unmanned aerial vehicle is adopted as the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system, the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system is [0, 1].
The rotation angle of the cradle head relative to the unmanned aerial vehicle comprises: roll angle θ, pitch angle φ and yaw angle ψ.
The conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system is calculated, which comprises the following steps:
according to the following principle: the unmanned aerial vehicle is first rotated by an angle ψ around the axis corresponding to the yaw angle, then rotated by an angle φ around the axis corresponding to the pitch angle, and finally rotated by an angle θ around the axis corresponding to the roll angle, so as to construct the rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system.
The rotation matrix Q meets the following formula requirement:
an apparatus for determining a position and a flight direction of an unmanned aerial vehicle, the apparatus being applied to an unmanned aerial vehicle having a pan-tilt on which a camera and an angle sensor are mounted, the apparatus comprising:
the rotation angle acquisition module is used for acquiring the rotation angle of the cradle head relative to the unmanned aerial vehicle when the camera shoots an image containing m fixed points from the angle sensor, wherein m is more than or equal to 4;
the fixed point coordinate calculation module is used for calculating the coordinates of the m fixed points under a camera coordinate system according to the image which is shot by the camera and contains the m fixed points;
the first coordinate system conversion relation calculation module is used for calculating the conversion relation from the camera coordinate system to the world coordinate system according to the coordinates of the m fixed points under the world coordinate system and the coordinates under the camera coordinate system;
the second coordinate system conversion relation calculation module is used for calculating the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system according to the rotation angle of the camera relative to the unmanned aerial vehicle;
and the unmanned aerial vehicle position and flight direction calculation module is used for calculating the position coordinates of the unmanned aerial vehicle in the world coordinate system and the flight direction vectors of the unmanned aerial vehicle in the world coordinate system according to the conversion relation from the camera coordinate system to the world coordinate system and the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system.
The first coordinate system conversion relation calculating module calculates a conversion relation from a camera coordinate system to a world coordinate system, including:
calculating a rotation matrix R and a translation vector T from a camera coordinate system to a world coordinate system;
the second coordinate system conversion relation calculating module calculates a conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system, including:
calculating a rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system;
the unmanned aerial vehicle position and flight direction calculation module calculates a position coordinate of the unmanned aerial vehicle in a world coordinate system and a flight direction vector of the unmanned aerial vehicle in the world coordinate system, and the unmanned aerial vehicle position and flight direction vector calculation module comprises:
the position coordinate of the unmanned aerial vehicle in the world coordinate system is determined to be a translation vector T, and the flight direction vector of the unmanned aerial vehicle in the world coordinate system is calculated as follows: the unmanned aerial vehicle's flight direction vector in the unmanned aerial vehicle coordinate system is multiplied by the rotation matrix Q and the rotation matrix R.
When the unmanned aerial vehicle position and flight direction calculation module adopts the direction directly ahead of the unmanned aerial vehicle as the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system, the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system is [0, 1].
The rotation angle of the cradle head relative to the unmanned aerial vehicle acquired by the rotation angle acquisition module comprises: roll angle θ, pitch angle φ and yaw angle ψ.
The second coordinate system conversion relation calculating module calculates a conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system, including:
according to the following principle: the unmanned aerial vehicle is first rotated by an angle ψ around the axis corresponding to the yaw angle, then rotated by an angle φ around the axis corresponding to the pitch angle, and finally rotated by an angle θ around the axis corresponding to the roll angle, so as to construct the rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system.
The rotation matrix Q meets the following formula requirement:
a non-transitory computer readable storage medium storing instructions that, when executed by a processor, cause the processor to perform the steps of the method of determining a position and a direction of flight of a drone as claimed in any one of the preceding claims.
A drone, comprising: a flight controller and a cradle head carrying a camera and an angle sensor, the flight controller performing the method of determining the position and direction of flight of a drone as described in any of the above.
According to the method, the conversion relation from the camera coordinate system to the world coordinate system is calculated according to the image which is shot by the camera and contains m fixed points, m is more than or equal to 4, and the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system is calculated according to the rotation angle of the camera relative to the unmanned aerial vehicle, so that the position coordinate of the unmanned aerial vehicle in the world coordinate system and the flight direction vector of the unmanned aerial vehicle in the world coordinate system are obtained, the position and the flight direction of the unmanned aerial vehicle can be obtained at the same time, and the positioning precision of the unmanned aerial vehicle is improved.
Drawings
FIG. 1 is a flowchart of a method for determining a position and a flight direction of a drone according to an embodiment of the present invention;
FIG. 2 is a flowchart of a method for determining a position and a flight direction of a drone according to another embodiment of the present invention;
fig. 3 is a schematic structural diagram of a device for determining a position and a flight direction of an unmanned aerial vehicle according to an embodiment of the present invention.
Detailed Description
The invention will be described in further detail with reference to the accompanying drawings and specific examples.
Related art proposes an unmanned aerial vehicle positioning and target tracking method based on two-dimensional tags. In this method, a camera and a cradle head are carried on the unmanned aerial vehicle, the camera is aimed at a two-dimensional tag whose position in the world coordinate system is known, the position of the camera in the world coordinate system is calculated from the position of the tag's image in the camera, and this camera position is taken as the coordinate of the unmanned aerial vehicle. The method comprises the following steps:
A. Calibrate the camera. Calibrate the camera's intrinsic parameter matrix K = [f_x, 0, u_0; 0, f_y, v_0; 0, 0, 1] from images shot by the camera, where f_x and f_y are the focal lengths of the camera and (u_0, v_0) is the optical center of the camera.
B. Detect the two-dimensional tag. Identify the two-dimensional tag in the image shot by the cradle head camera to obtain its coordinates [x, y, 1] in the image, where x and y are the horizontal and vertical coordinates expressed in pixel values.
C. Locate the two-dimensional tag. Calculate the extrinsic parameters of the camera from the intrinsic parameters and the coordinates of the tag in the image, using the projection equation:
s · [x, y, 1]^T = K · [r1 r2 r3 t1; r4 r5 r6 t2; r7 r8 r9 t3] · [X, Y, Z, 1]^T
where s is a scale factor; the second matrix on the right of the equation is the extrinsic parameter matrix of the camera, i.e. the transformation matrix from the world coordinate system to the image coordinate system, in which r1-r9 represent the rotational relationship between the two coordinate systems and t1-t3 represent the translational relationship between the two coordinate systems; and [X, Y, Z, 1] is the coordinate of the two-dimensional tag in the world coordinate system.
D. Calculate the coordinates of the camera in the world coordinate system, i.e. the coordinates of the unmanned aerial vehicle, from the extrinsic parameters of the camera (a code sketch of steps B-D follows step E below).
E. Take the moving speed of the two-dimensional tag in the picture and the distance of the tag relative to the unmanned aerial vehicle as inputs, and control the camera so that it always aims at the tag.
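As a concrete illustration of steps B-D, the sketch below recovers the camera pose in the world frame from a detected tag using OpenCV's solvePnP; the tag corner coordinates, intrinsic values, and pixel detections are illustrative placeholders rather than values from the patent.

```python
# Minimal sketch of steps B-D (related art): estimate the camera pose in the
# world frame from a detected two-dimensional tag via OpenCV's solvePnP.
import cv2
import numpy as np

# Intrinsic parameter matrix K (output of step A), assumed already calibrated.
fx, fy, u0, v0 = 800.0, 800.0, 320.0, 240.0
K = np.array([[fx, 0, u0],
              [0, fy, v0],
              [0,  0,  1]], dtype=np.float64)

# Known world coordinates of the tag corners and their detected pixel positions (step B).
world_pts = np.array([[0, 0, 0], [0.2, 0, 0], [0.2, 0.2, 0], [0, 0.2, 0]], dtype=np.float64)
image_pts = np.array([[300, 220], [420, 225], [415, 340], [298, 335]], dtype=np.float64)

# Step C: solve for the extrinsic parameters (world -> camera rotation and translation).
_ok, rvec, tvec = cv2.solvePnP(world_pts, image_pts, K, None)
R_wc, _ = cv2.Rodrigues(rvec)            # 3x3 rotation matrix, world -> camera

# Step D: camera position in the world frame (taken as the drone position in the related art).
cam_pos_world = (-R_wc.T @ tvec).ravel()
print(cam_pos_world)
```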
The disadvantage of the above method is that it treats the camera coordinates as the drone coordinates and does not consider converting the camera direction into the drone direction; in practice, the camera direction must be rotated to obtain the drone's flight direction.
Fig. 1 is a flowchart of a method for determining a position and a flight direction of an unmanned aerial vehicle according to an embodiment of the present invention, which specifically includes the following steps:
step 101: the unmanned aerial vehicle flight controller controls the cradle head of the unmanned aerial vehicle to rotate, so that a camera carried by the cradle head shoots m preset fixed points in the surrounding environment, and m is more than or equal to 4.
Step 102: after determining that the camera has shot the m preset fixed points, the flight controller of the unmanned aerial vehicle obtains the rotation angle of the cradle head relative to the unmanned aerial vehicle from the angle sensor carried by the cradle head.
Step 103: and the flight controller of the unmanned aerial vehicle calculates coordinates of the m fixed points under a camera coordinate system according to the image which is shot by the camera and contains the m fixed points.
Step 104: and the flight controller of the unmanned aerial vehicle calculates the conversion relation from the camera coordinate system to the world coordinate system according to the coordinates of the m fixed points under the world coordinate system and the coordinates under the camera coordinate system.
Step 105: and the flight controller of the unmanned aerial vehicle calculates the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system according to the rotation angle of the camera relative to the unmanned aerial vehicle.
Step 106: and the flight controller of the unmanned aerial vehicle calculates the position coordinates of the unmanned aerial vehicle in the world coordinate system and the flight direction vectors of the unmanned aerial vehicle in the world coordinate system according to the conversion relation from the camera coordinate system to the world coordinate system and the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system.
Therefore, from the image of the m preset fixed points shot by the camera carried on the cradle head of the unmanned aerial vehicle, and the rotation angle of the cradle head relative to the unmanned aerial vehicle when the m fixed points are shot, the conversion relation from the camera coordinate system to the world coordinate system and the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system are calculated, so that the position coordinate of the unmanned aerial vehicle in the world coordinate system and the flight direction vector of the unmanned aerial vehicle in the world coordinate system are obtained. In this way the position and the flight direction of the unmanned aerial vehicle can be obtained at the same time, and the positioning precision of the unmanned aerial vehicle is improved.
The following gives a detailed description of the conversion of the camera coordinate system to the world coordinate system, and the conversion of the unmanned aerial vehicle coordinate system to the camera coordinate system:
fig. 2 is a flowchart of a method for determining a position and a flight direction of an unmanned aerial vehicle according to another embodiment of the present invention, which specifically includes the following steps:
step 201: the coordinates of m fixed points in the surrounding environment under a world coordinate system are stored on a flight controller of the unmanned aerial vehicle in advance, and m is more than or equal to 4.
The "surroundings" here are the flight environment of the unmanned aerial vehicle.
Calculating the conversion relation between any two of the world coordinate system, the camera coordinate system and the unmanned aerial vehicle coordinate system means calculating the rotation matrix and the translation vector between the two coordinate systems. The rotation matrix is a 3×3 matrix and the translation vector is a 3-dimensional vector, so 12 unknown quantities must be obtained in total. Since the coordinates of each fixed point form a three-dimensional vector, each point contributes 3 equations, and at least 4 fixed points are required to solve the system of 12 unknowns and obtain a unique solution.
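Written out in the notation used in step 205 below, the constraint and the counting argument read as follows; this restates the paragraph above rather than adding anything new from the patent:

$$
\mathrm{pos}_r = c_r\,R + T, \qquad r = 1, \dots, m,
$$

where $R$ is the $3\times 3$ rotation matrix (9 unknowns) and $T$ is the 3-dimensional translation vector (3 unknowns); each fixed point contributes 3 scalar equations, so $3m \ge 12$, i.e. $m \ge 4$, is needed for the system to be determined.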
In practical applications, the m fixed points may all be located on one marker; the marker may be planar, for example a piece of paper attached to a wall, a license plate, or a display screen.
Feature points with distinctive features on the marker can be used directly as fixed points, so that the fixed points are found directly from the features of these feature points; alternatively, points having a fixed relation to the feature points are used as fixed points, so that the feature points are first found from their features and the fixed points are then located from the relation between the fixed points and the feature points.
For example: the marker is marked with one or more feature points, and the positions of the feature points on the marker must be accurately indicated. Taking 4 feature points as an example, the 4 feature points may be connected into a quadrilateral whose 4 corners are the positions of the 4 feature points, and the 4 feature points may be used directly as fixed points. Alternatively, one feature point may be represented by one quadrangle whose diagonal intersection is the position of the feature point; since the diagonal intersection has a fixed relationship with the four vertices of the quadrangle, the four vertices are taken as the 4 fixed points.
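As an illustration of the second option, the sketch below computes the diagonal intersection of a quadrangle from its four vertices, i.e. the fixed relation between the four fixed points and the feature point; the coordinate values and the function name are illustrative assumptions, not part of the patent.

```python
import numpy as np

def diagonal_intersection(p1, p2, p3, p4):
    """Intersection of the diagonals p1-p3 and p2-p4 of a quadrangle (vertices given in order)."""
    p1, p2, p3, p4 = map(np.asarray, (p1, p2, p3, p4))
    d1, d2 = p3 - p1, p4 - p2
    # Solve p1 + s*d1 = p2 + t*d2 for the parameters s and t.
    A = np.column_stack((d1, -d2))
    s, _t = np.linalg.solve(A, p2 - p1)
    return p1 + s * d1

# Hypothetical pixel coordinates of the four vertices of a marked quadrangle.
print(diagonal_intersection([0, 0], [4, 0], [4, 4], [0, 4]))  # -> [2. 2.]
```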
The marker stores the world-coordinate-system position of each feature point in text or in a two-dimensional code, or carries an index mark or index number corresponding to this information (the unmanned aerial vehicle can then acquire the corresponding world-coordinate-system position information from the index mark or index number).
The marker has a feature for indicating that the marker is a positioning marker, and the feature may be a special figure, or a color, or a shape of the marker itself, or the two-dimensional code, the text, the index mark/the index number containing special information.
When the camera on the cradle head of the unmanned aerial vehicle is aligned with the marker, the world-coordinate-system position information of the feature points can be read from the characters, two-dimensional codes or index marks/index numbers on the marker, so that the world coordinates of the m fixed points are obtained.
Further, a plurality of markers at a distance may be combined for locating a drone with a greater range of flight.
Step 202: when the position and the flight direction of the unmanned aerial vehicle need to be acquired, the flight controller of the unmanned aerial vehicle controls the cradle head to rotate, so that the camera shoots images simultaneously containing m fixed points.
In particular, the flight controller may pre-store identification features of the m fixed points, such as shape and color. It can then control the cradle head to rotate in each direction in turn by a preset step length, detect in real time whether the m fixed points appear simultaneously in the image shot by the camera, and if so, determine that the camera has shot an image containing the m fixed points at the same time.
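A minimal sketch of this scan-and-detect loop is shown below; the three callables standing in for the gimbal, camera and point detector are hypothetical interfaces not defined in the patent, and the scan ranges and step sizes are illustrative.

```python
from typing import Callable, Sequence, Tuple

def find_view_with_fixed_points(m: int,
                                rotate_gimbal: Callable[[float, float], None],
                                capture_image: Callable[[], object],
                                detect_fixed_points: Callable[[object], Sequence],
                                yaw_step: float = 10.0,
                                pitch_step: float = 10.0) -> Tuple[float, float, Sequence]:
    """Rotate the cradle head over a preset grid until all m fixed points are visible at once.

    The callables are hypothetical placeholders for the flight controller's
    gimbal command, camera capture and feature-based detector.
    """
    yaw = -180.0
    while yaw <= 180.0:
        pitch = -90.0
        while pitch <= 30.0:
            rotate_gimbal(yaw, pitch)                  # command the cradle head
            points = detect_fixed_points(capture_image())
            if len(points) == m:                       # all m fixed points appear simultaneously
                return yaw, pitch, points
            pitch += pitch_step
        yaw += yaw_step
    raise RuntimeError("fixed points not found in any cradle head orientation")
```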
Step 203: when the camera shoots an image simultaneously containing the m fixed points, the flight controller of the unmanned aerial vehicle acquires the rotation angle of the cradle head relative to the unmanned aerial vehicle from the angle sensor carried by the cradle head: roll angle (roll) θ, pitch angle (pitch) φ and yaw angle (yaw) ψ.
Step 204: and the flight controller of the unmanned aerial vehicle calculates coordinates of the m fixed points under a camera coordinate system according to the image which is shot by the camera and contains the m fixed points.
This step belongs to the prior art: the coordinates of the m fixed points in the camera coordinate system can be calculated from the intrinsic parameters of the camera (such as the focal length), the coordinates of the m fixed points in the world coordinate system, and their coordinates in the image.
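The patent treats this step as prior art and gives no formula; one common way to realize it is to map the known world coordinates through a world-to-camera pose recovered from the image (for example by a PnP solver, as in the background sketch). The names below are illustrative.

```python
import numpy as np

def fixed_points_in_camera(world_pts: np.ndarray, R_wc: np.ndarray, t_wc: np.ndarray) -> np.ndarray:
    """Map the m fixed points from world coordinates into the camera coordinate system.

    R_wc and t_wc describe the world-to-camera pose recovered from the image;
    world_pts is an (m, 3) array. Each camera-frame point is R_wc @ X_w + t_wc.
    """
    return world_pts @ R_wc.T + np.asarray(t_wc).reshape(1, 3)
```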
Step 205: the unmanned aerial vehicle flight controller calculates a rotation matrix R and a translation vector T from the camera coordinate system to the world coordinate system according to the coordinates of the m fixed points in the world coordinate system and the coordinates of the m fixed points in the camera coordinate system.
For example: suppose the coordinate of any fixed point r (1 ≤ r ≤ m) in the world coordinate system is pos_r and its coordinate in the camera coordinate system is c_r; then the equation is pos_r = c_r × R + T, where R is a 3×3 matrix and T is a 3-dimensional vector. After substituting the coordinates of the m fixed points into the above equation, R and T can be solved.
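The patent only states that R and T can be obtained by substituting the m point pairs into the equation; one standard least-squares solver for this rigid alignment is the SVD-based (Kabsch) method sketched below, which is an assumed choice rather than the patent's prescribed method.

```python
import numpy as np

def solve_R_T(cam_pts: np.ndarray, world_pts: np.ndarray):
    """Least-squares R (3x3) and T (3,) such that world_pts ≈ cam_pts @ R + T.

    cam_pts and world_pts are (m, 3) arrays of the m fixed points in the camera
    and world coordinate systems (row-vector convention, matching pos_r = c_r x R + T).
    """
    cc, wc = cam_pts.mean(axis=0), world_pts.mean(axis=0)
    H = (cam_pts - cc).T @ (world_pts - wc)                    # 3x3 cross-covariance
    U, _S, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])    # guard against a reflection
    R = U @ D @ Vt                                             # rotation in row-vector form
    T = wc - cc @ R
    return R, T
```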
Step 206: the flight controller of the unmanned aerial vehicle calculates a rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system according to the rotation angle of the camera relative to the unmanned aerial vehicle: roll angle θ, pitch angle φ and yaw angle ψ.
For example, the following principle can be used: first rotate the unmanned aerial vehicle by an angle ψ around the axis corresponding to the yaw angle, then rotate it by an angle φ around the axis corresponding to the pitch angle, and finally rotate it by an angle θ around the axis corresponding to the roll angle, so as to construct the rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system; the rotation matrix Q may meet the following formula requirement:
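The exact matrix form is not recoverable from this text; the sketch below shows one conventional construction consistent with the stated rotation order (yaw ψ first, then pitch φ, then roll θ), where the axis assignment (yaw about z, pitch about y, roll about x) and the sign convention are assumptions rather than the patent's own definition.

```python
import numpy as np

def rotation_Q(roll_theta: float, pitch_phi: float, yaw_psi: float) -> np.ndarray:
    """Rotation built by applying yaw psi, then pitch phi, then roll theta (angles in radians).

    Axis assignment (yaw about z, pitch about y, roll about x) and signs are
    assumed conventions; the patent's own matrix may differ in these details.
    """
    c, s = np.cos, np.sin
    Rz = np.array([[c(yaw_psi), -s(yaw_psi), 0],
                   [s(yaw_psi),  c(yaw_psi), 0],
                   [0,           0,          1]])
    Ry = np.array([[ c(pitch_phi), 0, s(pitch_phi)],
                   [ 0,            1, 0          ],
                   [-s(pitch_phi), 0, c(pitch_phi)]])
    Rx = np.array([[1, 0,             0            ],
                   [0, c(roll_theta), -s(roll_theta)],
                   [0, s(roll_theta),  c(roll_theta)]])
    # Yaw is applied first to a vector, then pitch, then roll: v' = Rx @ Ry @ Rz @ v.
    return Rx @ Ry @ Rz
```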
step 207: according to a rotation matrix R from a camera coordinate system to a world coordinate system and a translation vector T and a rotation matrix Q from an unmanned aerial vehicle coordinate system to the camera coordinate system, the flight controller of the unmanned aerial vehicle obtains that the rotation matrix from the unmanned aerial vehicle coordinate system to the world coordinate system is Q×R, the translation vector T is the position coordinate of the current unmanned aerial vehicle in the world coordinate system is the translation vector T, and the flight direction vector of the current unmanned aerial vehicle in the world coordinate system is dir= [0, 1] ×Q×R, wherein [0, 1] is the flight direction vector of the current unmanned aerial vehicle in the unmanned aerial vehicle coordinate system when the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system is adopted as the right ahead of the unmanned aerial vehicle.
Because the camera is fixed on the unmanned aerial vehicle, the translation vector from the camera coordinate system to the world coordinate system and the unmanned aerial vehicle coordinate system are the same, and the position of the unmanned aerial vehicle is the origin of the unmanned aerial vehicle coordinate system, so that the translation vector T from the unmanned aerial vehicle coordinate system to the world coordinate system is the position coordinate of the unmanned aerial vehicle in the world coordinate system.
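Putting steps 205 to 207 together, a minimal sketch of the final computation is shown below; the row-vector multiplication order follows the text above, and the function name and the forward_body placeholder are illustrative.

```python
import numpy as np

def drone_pose_in_world(R: np.ndarray, T: np.ndarray, Q: np.ndarray,
                        forward_body: np.ndarray):
    """Combine step 205 (R, T: camera -> world) and step 206 (Q: drone -> camera).

    Returns the drone position in the world frame (equal to T, since the camera
    and the drone share the same translation) and its flight direction in the
    world frame, computed as forward_body x Q x R in the patent's row-vector form.
    """
    position_world = T
    direction_world = forward_body @ Q @ R
    return position_world, direction_world

# forward_body is the drone's forward direction expressed in the drone coordinate
# system; the specific unit vector to use is the one stated in the patent text.
```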
When the unmanned aerial vehicle cruises, after the camera has been aligned with the m fixed points for the first time, the camera can be controlled to track them, that is, to remain aligned with the m fixed points at all times; alternatively, the camera can be aligned with the m fixed points at intervals, or whenever a positioning instruction for the unmanned aerial vehicle is received. In this way, the position and flight direction of the unmanned aerial vehicle in the world coordinate system can be acquired in real time, periodically, or on demand through the technical scheme provided by the invention.
Fig. 3 is a schematic structural diagram of an apparatus for determining a position and a flight direction of an unmanned aerial vehicle according to an embodiment of the present invention, where the apparatus is applied to an unmanned aerial vehicle, and the unmanned aerial vehicle has a pan-tilt on which a camera and an angle sensor are mounted, and the apparatus mainly includes: the cradle head control module 31, the rotation angle acquisition module 32, the fixed point coordinate calculation module 33, the first coordinate system conversion relation calculation module 34, the second coordinate system conversion relation calculation module 35 and the unmanned aerial vehicle position and flight direction calculation module 36, wherein:
and the cradle head control module 31 is used for controlling the cradle head of the unmanned aerial vehicle to rotate so that the camera shoots m preset fixed points in the surrounding environment, and m is more than or equal to 4.
The rotation angle obtaining module 32 is configured to obtain, from the angle sensor, a rotation angle of the pan-tilt relative to the unmanned aerial vehicle when the camera captures m fixed points.
The fixed point coordinate calculating module 33 is configured to calculate coordinates of the m fixed points in the camera coordinate system according to an image including the m fixed points captured by the camera.
The first coordinate system conversion relation calculating module 34 is configured to calculate a conversion relation from the camera coordinate system to the world coordinate system according to the m coordinates of the fixed points under the world coordinate system stored in advance and the m coordinates of the fixed points under the camera coordinate system calculated by the fixed point coordinate calculating module 33.
The second coordinate system conversion relation calculating module 35 is configured to calculate a conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system according to the rotation angle of the camera relative to the unmanned aerial vehicle acquired by the rotation angle acquiring module 32.
The unmanned aerial vehicle position and flight direction calculation module 36 is configured to calculate a position coordinate of the unmanned aerial vehicle in the world coordinate system and a flight direction vector of the unmanned aerial vehicle in the world coordinate system according to the conversion relationship from the camera coordinate system to the world coordinate system calculated by the first coordinate system conversion relationship calculation module 34 and the conversion relationship from the unmanned aerial vehicle coordinate system to the camera coordinate system calculated by the second coordinate system conversion relationship calculation module 35.
In practical applications, the first coordinate system conversion relation calculation module 34 calculates a conversion relation from a camera coordinate system to a world coordinate system, including:
calculating a rotation matrix R and a translation vector T from a camera coordinate system to a world coordinate system;
the second coordinate system conversion relation calculation module 35 calculates a conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system, including:
calculating a rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system;
the unmanned aerial vehicle position and flight direction calculation module 36 calculates the position coordinates of the unmanned aerial vehicle in the world coordinate system and the flight direction vector of the unmanned aerial vehicle in the world coordinate system, including:
the position coordinate of the unmanned aerial vehicle in the world coordinate system is determined to be a translation vector T, and the flight direction vector of the unmanned aerial vehicle in the world coordinate system is calculated as follows: the unmanned aerial vehicle's flight direction vector in the unmanned aerial vehicle coordinate system is multiplied by Q and R.
When the unmanned aerial vehicle position and flight direction calculation module adopts the right front of the unmanned aerial vehicle as the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system, the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system is [0, 1].
In practical applications, the rotation angle of the camera relative to the unmanned aerial vehicle acquired by the rotation angle acquisition module 32 includes: roll angle θ, pitch angle φ and yaw angle ψ.
The second coordinate system conversion relation calculating module 35 calculates a conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system according to the rotation angle of the camera relative to the unmanned aerial vehicle, including:
according to the following principle: the unmanned aerial vehicle is first rotated by an angle ψ around the axis corresponding to the yaw angle, then rotated by an angle φ around the axis corresponding to the pitch angle, and finally rotated by an angle θ around the axis corresponding to the roll angle, so as to construct the rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system.
Wherein, the rotation matrix Q meets the following formula requirement:
in practical application, the pan/tilt control module 31 includes:
according to the preset characteristics of m fixed points, the cradle head is controlled to rotate, so that the camera shoots an image containing m fixed points at the same time.
The embodiment of the invention also provides an unmanned aerial vehicle, which comprises a flight controller and a cradle head carrying a camera and an angle sensor, wherein the flight controller executes the method for determining the position and the flight direction of the unmanned aerial vehicle in the steps 101-106 or in the steps 201-207.
Embodiments of the present invention also provide a non-transitory computer readable storage medium storing instructions that, when executed by a processor, cause the processor to perform the steps of the method of determining a position and a direction of flight of a drone as described in steps 101-106, or in steps 201-207.
Embodiments of the present invention also provide an electronic device comprising a non-transitory computer readable storage medium as described above, and the above-described processor having access to the non-transitory computer readable storage medium.
The beneficial technical effects of the invention are as follows:
the position and the flight direction of the unmanned aerial vehicle in the world coordinate system can be obtained at the same time, and the positioning precision of the unmanned aerial vehicle is improved, so that the control precision of the unmanned aerial vehicle is improved; and because the GPS technology is not adopted for positioning, the GPS positioning device is not limited by the strength of GPS signals, has wider application range and can be used outdoors and indoors.
The foregoing description covers only preferred embodiments of the invention and is not intended to limit it; any modification, equivalent replacement or improvement made within the spirit and principles of the invention shall be included in its scope of protection.

Claims (12)

1. A method of determining the position and direction of flight of a drone, the method comprising:
calculating coordinates of m fixed points under a camera coordinate system according to an image which is shot by a camera carried by a cradle head of the unmanned aerial vehicle and contains the m fixed points, wherein m is more than or equal to 4; the m fixed points are simultaneously located on one marker; the m fixed points are feature points with distinctive features on the marker, or points having a fixed relation with the feature points; the fixed points are found directly according to the features of the feature points of the marker, or the feature points are first found according to their features and the fixed points are then found according to the relation between the fixed points and the feature points; and when the camera on the cradle head of the unmanned aerial vehicle is aligned with the marker, the world coordinates of the m fixed points are obtained from characters, two-dimensional codes, index marks or index numbers on the marker;
calculating the conversion relation from the camera coordinate system to the world coordinate system according to the coordinates of the m fixed points under the world coordinate system and the coordinates under the camera coordinate system;
according to the rotation angle of the cradle head relative to the unmanned aerial vehicle when the camera shoots the image containing m fixed points, calculating the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system;
and calculating the position coordinates of the unmanned aerial vehicle in the world coordinate system and the flight direction vectors of the unmanned aerial vehicle in the world coordinate system according to the conversion relation from the camera coordinate system to the world coordinate system and the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system.
2. The method of claim 1, wherein calculating the conversion of the camera coordinate system to the world coordinate system comprises:
calculating a rotation matrix R and a translation vector T from a camera coordinate system to a world coordinate system;
the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system is calculated, which comprises the following steps:
calculating a rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system;
the calculating of the position coordinates of the unmanned aerial vehicle in the world coordinate system and the flight direction vector of the unmanned aerial vehicle in the world coordinate system comprises:
the position coordinate of the unmanned aerial vehicle in the world coordinate system is determined to be a translation vector T, and the flight direction vector of the unmanned aerial vehicle in the world coordinate system is calculated as follows: the unmanned aerial vehicle's flight direction vector in the unmanned aerial vehicle coordinate system is multiplied by the rotation matrix Q and the rotation matrix R.
3. The method according to claim 2, characterized in that when the straight ahead of the unmanned aerial vehicle is adopted as the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system, the flight direction vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system is [0, 1].
4. The method according to claim 1 or 2, wherein the rotation angle of the pan-tilt relative to the drone comprises: roll angle θ, pitch angle φ and yaw angle ψ;
The conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system is calculated, which comprises the following steps:
according to the following principle: the unmanned aerial vehicle is first rotated by an angle ψ around the axis corresponding to the yaw angle, then rotated by an angle φ around the axis corresponding to the pitch angle, and finally rotated by an angle θ around the axis corresponding to the roll angle, so as to construct the rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system.
5. The method of claim 4, wherein the rotation matrix Q meets the following formula requirement:
6. an apparatus for determining a position and a flight direction of an unmanned aerial vehicle, the apparatus being applied to an unmanned aerial vehicle having a pan-tilt on which a camera and an angle sensor are mounted, the apparatus comprising:
the rotation angle acquisition module is used for acquiring, from the angle sensor, the rotation angle of the cradle head relative to the unmanned aerial vehicle when the camera shoots an image containing m fixed points, wherein m is more than or equal to 4; the m fixed points are simultaneously located on one marker; the m fixed points are feature points with distinctive features on the marker, or points having a fixed relation with the feature points; the fixed points are found directly according to the features of the feature points of the marker, or the feature points are first found according to their features and the fixed points are then found according to the relation between the fixed points and the feature points; and when the camera on the cradle head of the unmanned aerial vehicle is aligned with the marker, the world coordinates of the m fixed points are obtained from characters, two-dimensional codes, index marks or index numbers on the marker;
the fixed point coordinate calculation module is used for calculating the coordinates of the m fixed points under a camera coordinate system according to the image containing the m fixed points shot by the camera;
the first coordinate system conversion relation calculation module is used for calculating the conversion relation from the camera coordinate system to the world coordinate system according to the coordinates of the m fixed points under the world coordinate system and the coordinates under the camera coordinate system;
the second coordinate system conversion relation calculation module is used for calculating the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system according to the rotation angle of the camera relative to the unmanned aerial vehicle;
and the unmanned aerial vehicle position and flight direction calculation module is used for calculating the position coordinates of the unmanned aerial vehicle in the world coordinate system and the flight direction vectors of the unmanned aerial vehicle in the world coordinate system according to the conversion relation from the camera coordinate system to the world coordinate system and the conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system.
7. The apparatus of claim 6, wherein the first coordinate system conversion relation calculation module calculates a conversion relation of a camera coordinate system to a world coordinate system comprising:
calculating a rotation matrix R and a translation vector T from a camera coordinate system to a world coordinate system;
the second coordinate system conversion relation calculating module calculates a conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system, including:
calculating a rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system;
the unmanned aerial vehicle position and flight direction calculation module calculates a position coordinate of the unmanned aerial vehicle in a world coordinate system and a flight direction vector of the unmanned aerial vehicle in the world coordinate system, and the unmanned aerial vehicle position and flight direction vector calculation module comprises:
the position coordinate of the unmanned aerial vehicle in the world coordinate system is determined to be a translation vector T, and the flight direction vector of the unmanned aerial vehicle in the world coordinate system is calculated as follows: the unmanned aerial vehicle's flight direction vector in the unmanned aerial vehicle coordinate system is multiplied by the rotation matrix Q and the rotation matrix R.
8. The apparatus of claim 7, wherein when the unmanned aerial vehicle position and direction of flight calculation module employs a direct front of the unmanned aerial vehicle as a direction of flight vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system, the direction of flight vector of the unmanned aerial vehicle in the unmanned aerial vehicle coordinate system is [0, 1].
9. The apparatus according to claim 6 or 7, wherein the rotation angle of the pan-tilt relative to the unmanned aerial vehicle acquired by the rotation angle acquisition module includes: roll angle θ, pitch angle φ and yaw angle ψ;
The second coordinate system conversion relation calculating module calculates a conversion relation from the unmanned aerial vehicle coordinate system to the camera coordinate system, including:
according to the following principle: the unmanned aerial vehicle is first rotated by an angle ψ around the axis corresponding to the yaw angle, then rotated by an angle φ around the axis corresponding to the pitch angle, and finally rotated by an angle θ around the axis corresponding to the roll angle, so as to construct the rotation matrix Q from the unmanned aerial vehicle coordinate system to the camera coordinate system.
10. The apparatus of claim 9, wherein the rotation matrix Q meets the following formula requirement:
11. a non-transitory computer readable storage medium storing instructions which, when executed by a processor, cause the processor to perform the steps of the method of determining a position and a direction of flight of a drone of any one of claims 1 to 5.
12. An unmanned aerial vehicle, comprising: a flight controller and a cradle head carrying a camera and an angle sensor, the flight controller performing the method of determining the position and direction of flight of a drone as claimed in any one of claims 1 to 5.
CN201811356896.1A 2018-11-15 2018-11-15 Method and device for determining position and flight direction of unmanned aerial vehicle and unmanned aerial vehicle Active CN111192318B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811356896.1A CN111192318B (en) 2018-11-15 2018-11-15 Method and device for determining position and flight direction of unmanned aerial vehicle and unmanned aerial vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811356896.1A CN111192318B (en) 2018-11-15 2018-11-15 Method and device for determining position and flight direction of unmanned aerial vehicle and unmanned aerial vehicle

Publications (2)

Publication Number Publication Date
CN111192318A CN111192318A (en) 2020-05-22
CN111192318B (en) 2023-09-01

Family

ID=70710606

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811356896.1A Active CN111192318B (en) 2018-11-15 2018-11-15 Method and device for determining position and flight direction of unmanned aerial vehicle and unmanned aerial vehicle

Country Status (1)

Country Link
CN (1) CN111192318B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112261221B (en) * 2020-09-21 2021-10-26 电子科技大学 Human body falling detection method based on intelligent terminal
CN112995890A (en) * 2021-02-06 2021-06-18 广东特视能智能科技有限公司 Unmanned aerial vehicle positioning method and device, storage medium and unmanned aerial vehicle nest
CN113436276B (en) * 2021-07-13 2023-04-07 天津大学 Visual relative positioning-based multi-unmanned aerial vehicle formation method
CN113686736B (en) * 2021-08-19 2024-02-20 易视智瞳科技(深圳)有限公司 Visual method and visual system for measuring emission direction of injection valve
CN113821052A (en) * 2021-09-22 2021-12-21 一飞智控(天津)科技有限公司 Cluster unmanned aerial vehicle cooperative target positioning method and system and cooperative target positioning terminal
CN114020029B (en) * 2021-11-09 2022-06-10 深圳大漠大智控技术有限公司 Automatic generation method and device of aerial route for cluster and related components
CN115752382A (en) * 2022-11-18 2023-03-07 深圳赛尔智控科技有限公司 Method, device and equipment for correcting offset data of aerial photo and storage medium
CN115793698A (en) * 2023-02-07 2023-03-14 北京四维远见信息技术有限公司 Automatic attitude control system and method

Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103149939A (en) * 2013-02-26 2013-06-12 北京航空航天大学 Dynamic target tracking and positioning method of unmanned plane based on vision
EP2805671A2 (en) * 2013-05-23 2014-11-26 Stiftung caesar - center of advanced european studies and research assoziiert mit der Max-Planck-Gesellschaft Ocular videography system
CN104215239A (en) * 2014-08-29 2014-12-17 西北工业大学 Vision-based autonomous unmanned plane landing guidance device and method
WO2015173256A2 (en) * 2014-05-13 2015-11-19 Immersight Gmbh Method and system for determining a representational position
CN105225241A (en) * 2015-09-25 2016-01-06 广州极飞电子科技有限公司 The acquisition methods of unmanned plane depth image and unmanned plane
CN106326892A (en) * 2016-08-01 2017-01-11 西南科技大学 Visual landing pose estimation method of rotary wing type unmanned aerial vehicle
WO2017049816A1 (en) * 2015-09-24 2017-03-30 北京零零无限科技有限公司 Method and device for controlling unmanned aerial vehicle to rotate along with face
CN106651961A (en) * 2016-12-09 2017-05-10 中山大学 Color stereoscopic calibration object-based unmanned aerial vehicle calibration method and system
CN107194941A (en) * 2017-05-23 2017-09-22 武汉科技大学 A kind of unmanned plane independent landing method, system and electronic equipment based on monocular vision
WO2017181513A1 (en) * 2016-04-20 2017-10-26 高鹏 Flight control method and device for unmanned aerial vehicle
CN107314771A (en) * 2017-07-04 2017-11-03 合肥工业大学 Unmanned plane positioning and attitude angle measuring method based on coded target
CN107831776A (en) * 2017-09-14 2018-03-23 湖南优象科技有限公司 Unmanned plane based on nine axle inertial sensors independently makes a return voyage method
CN107966136A (en) * 2016-10-19 2018-04-27 杭州海康机器人技术有限公司 Slave unmanned plane position display method, apparatus and system based on main unmanned plane vision
CN108007463A (en) * 2017-11-29 2018-05-08 天津聚飞创新科技有限公司 UAV Attitude acquisition methods, device and unmanned plane
CN108022255A (en) * 2017-12-07 2018-05-11 深圳慧源创新科技有限公司 Unmanned plane automatic tracking method, unmanned plane autotracker and unmanned plane
CN108489454A (en) * 2018-03-22 2018-09-04 沈阳上博智像科技有限公司 Depth distance measurement method, device, computer readable storage medium and electronic equipment
CN108681718A (en) * 2018-05-20 2018-10-19 北京工业大学 A kind of accurate detection recognition method of unmanned plane low target

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8514268B2 (en) * 2008-01-22 2013-08-20 California Institute Of Technology Method and device for high-resolution three-dimensional imaging which obtains camera pose using defocusing
KR102085180B1 (en) * 2013-10-08 2020-03-05 삼성전자주식회사 Method of estimating body's orientation, Computer readable storage medium of recording the method and an device
KR102706191B1 (en) * 2016-11-30 2024-09-13 삼성전자주식회사 Unmanned flying vehicle and flying control method thereof

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103149939A (en) * 2013-02-26 2013-06-12 北京航空航天大学 Dynamic target tracking and positioning method of unmanned plane based on vision
EP2805671A2 (en) * 2013-05-23 2014-11-26 Stiftung caesar - center of advanced european studies and research assoziiert mit der Max-Planck-Gesellschaft Ocular videography system
WO2015173256A2 (en) * 2014-05-13 2015-11-19 Immersight Gmbh Method and system for determining a representational position
CN104215239A (en) * 2014-08-29 2014-12-17 西北工业大学 Vision-based autonomous unmanned plane landing guidance device and method
WO2017049816A1 (en) * 2015-09-24 2017-03-30 北京零零无限科技有限公司 Method and device for controlling unmanned aerial vehicle to rotate along with face
CN105225241A (en) * 2015-09-25 2016-01-06 广州极飞电子科技有限公司 The acquisition methods of unmanned plane depth image and unmanned plane
WO2017181513A1 (en) * 2016-04-20 2017-10-26 高鹏 Flight control method and device for unmanned aerial vehicle
CN106326892A (en) * 2016-08-01 2017-01-11 西南科技大学 Visual landing pose estimation method of rotary wing type unmanned aerial vehicle
CN107966136A (en) * 2016-10-19 2018-04-27 杭州海康机器人技术有限公司 Slave unmanned plane position display method, apparatus and system based on main unmanned plane vision
CN106651961A (en) * 2016-12-09 2017-05-10 中山大学 Color stereoscopic calibration object-based unmanned aerial vehicle calibration method and system
CN107194941A (en) * 2017-05-23 2017-09-22 武汉科技大学 A kind of unmanned plane independent landing method, system and electronic equipment based on monocular vision
CN107314771A (en) * 2017-07-04 2017-11-03 合肥工业大学 Unmanned plane positioning and attitude angle measuring method based on coded target
CN107831776A (en) * 2017-09-14 2018-03-23 湖南优象科技有限公司 Unmanned plane based on nine axle inertial sensors independently makes a return voyage method
CN108007463A (en) * 2017-11-29 2018-05-08 天津聚飞创新科技有限公司 UAV Attitude acquisition methods, device and unmanned plane
CN108022255A (en) * 2017-12-07 2018-05-11 深圳慧源创新科技有限公司 Unmanned plane automatic tracking method, unmanned plane autotracker and unmanned plane
CN108489454A (en) * 2018-03-22 2018-09-04 沈阳上博智像科技有限公司 Depth distance measurement method, device, computer readable storage medium and electronic equipment
CN108681718A (en) * 2018-05-20 2018-10-19 北京工业大学 A kind of accurate detection recognition method of unmanned plane low target

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Jin Zhang et al. "Measurement of Unmanned Aerial Vehicle Attitude Angles Based on a Single Captured Image," Sensors, 2018, Vol. 18, No. 18, pp. 1-14. *

Also Published As

Publication number Publication date
CN111192318A (en) 2020-05-22

Similar Documents

Publication Publication Date Title
CN111192318B (en) Method and device for determining position and flight direction of unmanned aerial vehicle and unmanned aerial vehicle
CN106197422B (en) A method for positioning and target tracking of UAV based on two-dimensional tags
CN104298248B (en) Rotor wing unmanned aerial vehicle accurate vision positioning and orienting method
CN105549614B (en) Unmanned plane target tracking
CN111966133A (en) Visual servo control system of holder
CN102967305B (en) Multi-rotor unmanned aerial vehicle pose acquisition method based on markers in shape of large and small square
CN108810473B (en) Method and system for realizing GPS mapping camera picture coordinate on mobile platform
CN109556616A (en) A kind of automatic Jian Tu robot of view-based access control model label builds figure dressing method
CN109212545A (en) Multiple source target following measuring system and tracking based on active vision
CN110595476A (en) Unmanned aerial vehicle landing navigation method and device based on GPS and image visual fusion
CN112955711A (en) Position information determining method, apparatus and storage medium
GB2506239A (en) Projecting maintenance history using optical reference points
CN110009682A (en) A Target Recognition and Localization Method Based on Monocular Vision
US10303943B2 (en) Cloud feature detection
CN113126126A (en) All-time automatic target-scoring system and ammunition drop point positioning method thereof
CN110333735A (en) A system and method for realizing secondary positioning of unmanned aerial vehicle on land and water
WO2021157136A1 (en) Positioning system
WO2020137311A1 (en) Positioning device and moving object
JP2021131762A (en) Information processing equipment, information processing methods, and programs
CN114820725A (en) Target display method and device, electronic equipment and storage medium
US20180012060A1 (en) Detecting and ranging cloud features
CN115790610B (en) Unmanned aerial vehicle accurate positioning system and method
CN111402324B (en) Target measurement method, electronic equipment and computer storage medium
CN115690612A (en) A method, device and medium for quantitative indication of UAV photoelectric image target search
CN211349366U (en) Visual positioning device based on two-dimensional label

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information
CB02 Change of applicant information

Address after: 310051 room 304, B / F, building 2, 399 Danfeng Road, Binjiang District, Hangzhou City, Zhejiang Province

Applicant after: Hangzhou Hikvision Robot Co.,Ltd.

Address before: 310051 5th floor, building 1, building 2, no.700 Dongliu Road, Binjiang District, Hangzhou City, Zhejiang Province

Applicant before: HANGZHOU HIKROBOT TECHNOLOGY Co.,Ltd.

TA01 Transfer of patent application right
TA01 Transfer of patent application right

Effective date of registration: 20230615

Address after: No.555, Qianmo Road, Binjiang District, Hangzhou City, Zhejiang Province

Applicant after: Hangzhou Hikvision Digital Technology Co.,Ltd.

Address before: 310051 room 304, B / F, building 2, 399 Danfeng Road, Binjiang District, Hangzhou City, Zhejiang Province

Applicant before: Hangzhou Hikvision Robot Co.,Ltd.

GR01 Patent grant
GR01 Patent grant