CN101833761A - UAV pose estimation method based on cooperative target feature lines - Google Patents
- Publication number
- CN101833761A (application CN201010151984A)
- Authority
- CN
- China
- Prior art keywords
- coordinate system
- cooperative target
- uav
- angle
- parallel lines
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Landscapes
- Image Analysis (AREA)
Abstract
The invention discloses a UAV pose estimation method based on the feature lines of a cooperative target, characterized by the following steps: (1) perform image preprocessing on the input image to obtain two sets of parallel lines on the cooperative target; (2) from the extracted parallel lines, compute the vanishing-line equation and its slope, from which the roll angle of the UAV is calculated; (3) obtain the yaw angle of the UAV directly from the image, then substitute the four known point coordinates into the relationship equation between the UAV camera coordinate system and the world coordinate system to calculate the remaining four pose parameters. The invention achieves accurate computation of the UAV pose information.
Description
Technical Field
The invention relates to a method in the technical field of image processing, and specifically discloses a UAV pose estimation method based on the feature lines of a cooperative target.
Background
Navigation and landing guidance of unmanned aerial vehicles is most commonly accomplished with a GPS/SINS integrated navigation system combined with an altimeter. GPS, however, is controlled by others: once GPS is switched off, or the received signal is corrupted, the UAV can no longer navigate and land autonomously, and erroneous landing guidance may even destroy the aircraft. An autonomously controlled navigation and landing guidance system is therefore essential; only then can the requirements of wartime UAV navigation and all-weather autonomous precision landing be met. Consequently, to improve the anti-jamming capability of our UAVs, remove the dependence on GPS, and increase UAV survivability, it is necessary to equip them with a guidance system that is strongly jam-resistant and can navigate and land independently. Indeed, all-weather autonomous navigation and landing of UAVs is one of the key problems that many countries urgently need to solve.
With the development of image processing and optical imaging technology, machine vision navigation has become a research hotspot in autonomous UAV navigation. Visual navigation uses a sensor to acquire images and derives the UAV's navigation, positioning, and attitude parameters through image processing. Visual sensors are light, low-power, and compact; moreover, visual navigation avoids the potential danger that GPS is controlled by other countries and is susceptible to interference, while offering moderate accuracy at low cost. The present invention therefore proposes a navigation method based on an infrared cooperative target to realize all-weather autonomous precision landing of the UAV.
Summary of the Invention
The object of the present invention is to propose a UAV pose estimation method based on the feature lines of a cooperative target. Because the positions of the vanishing point and the vanishing line on the image plane are uniquely determined by the direction of the corresponding line or plane, they have a definite relationship with the spatial attitude of the camera, which is used to calculate the attitude information of the UAV.
The present invention is realized by the following technical solution:
A UAV pose estimation method based on cooperative target feature lines, characterized by comprising the following steps:
(1) Perform image preprocessing on the input image to obtain two sets of parallel lines on the cooperative target;
(2) From the extracted parallel lines, compute the vanishing-line equation and its slope, and from the slope calculate the roll angle of the UAV;
(3) Obtain the yaw angle of the UAV directly from the image, then substitute the four known point coordinates into the relationship equation between the UAV camera coordinate system and the world coordinate system to calculate the remaining four pose parameters.
In the aforementioned method, step (1) first detects the edges of the cooperative target with the Sobel edge detection operator, then performs contour chain-code tracking on the edges to extract the two sets of parallel lines.
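The Sobel stage of step (1) can be sketched as follows. This is a minimal NumPy illustration of the edge-detection step only (the contour chain-code tracking and grouping into parallel lines are omitted); the kernels are the standard 3×3 Sobel operators, while the threshold and the synthetic test image are assumptions for illustration, not values from the patent.

```python
import numpy as np

def sobel_edges(img, thresh=100.0):
    """Detect edges with 3x3 Sobel kernels (pure-NumPy sketch).
    `img` is a 2-D float array; returns a boolean edge map."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    # Valid-region 2-D correlation; the one-pixel border stays zero.
    for i in range(1, h - 1):
        for j in range(1, w - 1):
            patch = img[i - 1:i + 2, j - 1:j + 2]
            gx[i, j] = np.sum(patch * kx)
            gy[i, j] = np.sum(patch * ky)
    mag = np.hypot(gx, gy)      # gradient magnitude
    return mag > thresh

# Assumed synthetic target: a bright vertical bar on a dark background,
# standing in for one stripe of the cooperative landing icon.
img = np.zeros((20, 20))
img[:, 8:12] = 255.0
edges = sobel_edges(img)
```

The edge map fires on both sides of the bar and stays quiet in the flat interior, which is the input the chain-code tracking step would then follow.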
The technical effects of the present invention are as follows:
The invention is a method in the technical field of image processing, and specifically discloses a UAV pose estimation method based on the feature lines of a cooperative target. Because the positions of the vanishing point and the vanishing line on the image plane are uniquely determined by the direction of the corresponding line or plane, they have a definite relationship with the spatial attitude of the camera, from which the attitude of the UAV is calculated. Accurate computation of the UAV pose information is realized.
Brief Description of the Drawings
Fig. 1 shows the relationship between the relevant coordinate systems;
Fig. 2 shows the two sets of parallel lines of the cooperative target;
Fig. 3 is a schematic diagram of the yaw angle.
Detailed Description
The present invention is described in further detail below with reference to specific embodiments.
As shown in Fig. 2, compute the vanishing-line equation from the two sets of parallel lines of the cooperative target; from the slope of the vanishing line, calculate the roll angle. Obtain the yaw angle of the UAV as shown in Fig. 3. Finally, substitute the four known point coordinates into the relationship equation between the UAV camera coordinate system and the world coordinate system to calculate the remaining four pose parameters.
As shown in Fig. 1, WCS is the world coordinate system; its origin is a point on the plane containing the landing icon, and the xw–yw plane is parallel to the ground. CCS is the camera coordinate system; its origin is the optical center, the yc axis coincides with the optical axis, and the xc–zc plane is parallel to the image plane. ICS is the image coordinate system, a two-dimensional system whose origin lies at (0, 0, f) in the camera coordinate system, where f is the camera focal length. From these definitions, the transformation between a point (xw, yw, zw) on the landing icon and its projection (x1, z1) on the image plane can be derived. The relationship between the world coordinate system and the camera coordinate system is:
CCS–WCS:  (xc, yc, zc)^T = R · (xw, yw, zw)^T + T    (1)
where R is the rotation matrix and T is the three-dimensional translation vector, i.e. the coordinates of the world-coordinate-system origin expressed in the camera coordinate system. The relationship between the camera coordinate system and the image coordinate system is:
ICS–CCS:  x1 = f · xc / yc,  z1 = f · zc / yc    (2)
The rotation matrix R is determined by θ, φ and ψ, which are the pitch, roll and yaw angles respectively. Denote its nine entries by A–I:
R = | A B C |
    | D E F |
    | G H I |
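The two coordinate transformations above can be sketched numerically. The rotation-axis composition below (pitch about x, roll about the optical y axis, yaw about z, in the order Rx·Ry·Rz) is an assumption for illustration, since the patent's figure defining R is not reproduced in the text; the projection step follows the stated convention that yc is the optical axis.

```python
import numpy as np

def rot(theta, phi, psi):
    """Rotation matrix R(theta, phi, psi); theta = pitch, phi = roll,
    psi = yaw. The composition order is an illustrative assumption --
    only the nine entries A-I matter for the derivation in the text."""
    ct, st = np.cos(theta), np.sin(theta)
    cf, sf = np.cos(phi), np.sin(phi)
    cp, sp = np.cos(psi), np.sin(psi)
    Rx = np.array([[1, 0, 0], [0, ct, -st], [0, st, ct]])
    Ry = np.array([[cf, 0, sf], [0, 1, 0], [-sf, 0, cf]])
    Rz = np.array([[cp, -sp, 0], [sp, cp, 0], [0, 0, 1]])
    return Rx @ Ry @ Rz

def project(pw, R, T, f):
    """Apply the CCS-WCS relation pc = R*pw + T, then the ICS-CCS
    perspective projection with yc as the optical axis:
    x1 = f*xc/yc, z1 = f*zc/yc."""
    xc, yc, zc = R @ np.asarray(pw, dtype=float) + T
    return f * xc / yc, f * zc / yc

# Level camera, 10 m from the landing plane along the optical axis
# (all numbers assumed for illustration).
x1, z1 = project([1.0, 0.0, 2.0], rot(0.0, 0.0, 0.0),
                 np.array([0.0, 10.0, 0.0]), f=0.05)
# With zero attitude this reduces to a pure pinhole scaling by f/yc.
```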
In Fig. 2 the two sets of parallel lines are L11, L12 and L21, L22. Let the vanishing points formed by the two sets on the image plane be v1(x1, z1) (from the set containing L11) and v2(x2, z2) (from the set containing L21); the line connecting these two points is the vanishing line that the plane forms on the image plane.
Let the line equations be expressed as:
L11: y = k1·x + b1, z = 0
L21: y = k2·x + b2, z = 0    (3)
According to the definition of the vanishing point, take a point p(xp, yp, 0) on L11; then:
taking the limit y → ∞ while substituting xp = (yp − b1)/k1 into the CCS–WCS and ICS–CCS relations, the projection of p tends to the vanishing point
v1:  x1 = f·(A + B·k1) / (D + E·k1),  z1 = f·(G + H·k1) / (D + E·k1).
Similarly, for L21 the vanishing point v2 is obtained with k2 in place of k1.
Connecting these two points and substituting the values denoted by A–I yields the vanishing-line equation:
x·sinφ − z·cosφ − f·tanθ = 0, from which the slope of the vanishing line is k = tanφ, giving the roll angle φ = arctan(k). Then substitute the four known point coordinates into the relationship equation (1) between the UAV camera coordinate system and the world coordinate system to calculate the remaining four pose parameters.
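The roll-angle computation just derived — intersect each pair of projected parallel lines to get the vanishing points v1 and v2, take the slope k of the line through them, and set φ = arctan(k) — can be sketched as follows. The slope/intercept pairs below are assumed synthetic data, not measurements from the patent.

```python
import numpy as np

def intersect(l1, l2):
    """Intersection of two image-plane lines given as (slope m,
    intercept c) with z = m*x + c. For the projections of a pair of
    world-parallel lines, this intersection is their vanishing point."""
    (m1, c1), (m2, c2) = l1, l2
    x = (c2 - c1) / (m1 - m2)
    return np.array([x, m1 * x + c1])

def roll_from_parallel_pairs(pair1, pair2):
    """Connect the two vanishing points v1, v2; the slope k of the
    resulting vanishing line satisfies k = tan(phi), so phi = arctan(k)."""
    v1 = intersect(*pair1)
    v2 = intersect(*pair2)
    k = (v2[1] - v1[1]) / (v2[0] - v1[0])
    return np.arctan(k)

# Assumed data: pair1 meets at v1 = (1, 2), pair2 at v2 = (3, 3),
# so the vanishing line has slope k = 0.5.
pair1 = ((1.0, 1.0), (2.0, 0.0))
pair2 = ((0.5, 1.5), (1.0, 0.0))
phi = roll_from_parallel_pairs(pair1, pair2)   # arctan(0.5)
```

In practice the four (slope, intercept) pairs would come from the line-extraction step (1); degenerate cases where a pair is also parallel in the image (m1 = m2) would need separate handling.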
The present invention has been disclosed above by way of preferred embodiments, which are not intended to limit it; all technical solutions obtained by equivalent substitution or equivalent transformation fall within the scope of protection of the present invention.
Claims (4)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN 201010151984 CN101833761A (en) | 2010-04-20 | 2010-04-20 | UAV pose estimation method based on cooperative target feature lines |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN 201010151984 CN101833761A (en) | 2010-04-20 | 2010-04-20 | UAV pose estimation method based on cooperative target feature lines |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN101833761A true CN101833761A (en) | 2010-09-15 |
Family
ID=42717822
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN 201010151984 Pending CN101833761A (en) | 2010-04-20 | 2010-04-20 | UAV pose estimation method based on cooperative target feature lines |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN101833761A (en) |
Cited By (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102320378A (en) * | 2011-06-20 | 2012-01-18 | 北京航空航天大学 | Balanced control distribution method of airplane with multiple control surfaces |
| CN102967305A (en) * | 2012-10-26 | 2013-03-13 | 南京信息工程大学 | Multi-rotor unmanned aerial vehicle pose acquisition method based on markers in shape of large and small square |
| CN103218607A (en) * | 2013-04-11 | 2013-07-24 | 北京航空航天大学 | Cooperative target designing and locating method for unmanned aerial vehicle autonomous landing |
| CN103492899A (en) * | 2011-04-20 | 2014-01-01 | 高通股份有限公司 | Online reference patch generation and pose estimation for augmented reality |
| CN103617622A (en) * | 2013-12-10 | 2014-03-05 | 云南大学 | Pose estimation orthogonal iterative optimization algorithm |
| CN108627142A (en) * | 2018-05-02 | 2018-10-09 | 成都纵横自动化技术有限公司 | A kind of object localization method of combination offline elevation and airborne photoelectric gondola |
| CN109683629A (en) * | 2019-01-09 | 2019-04-26 | 燕山大学 | Unmanned plane electric stringing system based on integrated navigation and computer vision |
| CN109764864A (en) * | 2019-01-16 | 2019-05-17 | 南京工程学院 | A method and system for indoor UAV pose acquisition based on color recognition |
| CN110310331A (en) * | 2019-06-18 | 2019-10-08 | 哈尔滨工程大学 | A Pose Estimation Method Based on the Combination of Line Features and Point Cloud Features |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1916801A (en) * | 2005-10-28 | 2007-02-21 | 南京航空航天大学 | Method for identifying cooperated object for self-landing pilotless aircraft |
| CN101270993A (en) * | 2007-12-12 | 2008-09-24 | 北京航空航天大学 | A long-distance high-precision autonomous integrated navigation and positioning method |
| US20090015674A1 (en) * | 2006-04-28 | 2009-01-15 | Kevin Alley | Optical imaging system for unmanned aerial vehicle |
- 2010-04-20: CN 201010151984 patent/CN101833761A/en, status: Pending
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN1916801A (en) * | 2005-10-28 | 2007-02-21 | 南京航空航天大学 | Method for identifying cooperated object for self-landing pilotless aircraft |
| US20090015674A1 (en) * | 2006-04-28 | 2009-01-15 | Kevin Alley | Optical imaging system for unmanned aerial vehicle |
| CN101270993A (en) * | 2007-12-12 | 2008-09-24 | 北京航空航天大学 | A long-distance high-precision autonomous integrated navigation and positioning method |
Non-Patent Citations (2)
| Title |
|---|
| Aero Weaponry (航空兵器), 2010-02-28, No. 1: Zhang Yong et al., "Research on the extraction of cooperative-target feature points during UAV autonomous landing" 2 * |
| Computer Engineering and Applications (计算机工程与应用), Dec. 2004, No. 9: Liu Shiqing, Hu Chunhua, Zhu Jihong, "A vanishing-line-based pose estimation method for unmanned helicopters" 2 * |
Cited By (15)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN103492899A (en) * | 2011-04-20 | 2014-01-01 | 高通股份有限公司 | Online reference patch generation and pose estimation for augmented reality |
| CN102320378B (en) * | 2011-06-20 | 2013-07-24 | 北京航空航天大学 | Balanced control distribution method of airplane with multiple control surfaces |
| CN102320378A (en) * | 2011-06-20 | 2012-01-18 | 北京航空航天大学 | Balanced control distribution method of airplane with multiple control surfaces |
| CN102967305B (en) * | 2012-10-26 | 2015-07-01 | 南京信息工程大学 | Multi-rotor unmanned aerial vehicle pose acquisition method based on markers in shape of large and small square |
| CN102967305A (en) * | 2012-10-26 | 2013-03-13 | 南京信息工程大学 | Multi-rotor unmanned aerial vehicle pose acquisition method based on markers in shape of large and small square |
| CN103218607A (en) * | 2013-04-11 | 2013-07-24 | 北京航空航天大学 | Cooperative target designing and locating method for unmanned aerial vehicle autonomous landing |
| CN103218607B (en) * | 2013-04-11 | 2016-08-24 | 北京航空航天大学 | A kind of cooperative target for unmanned plane autonomous landing on the ship designs and localization method |
| CN103617622A (en) * | 2013-12-10 | 2014-03-05 | 云南大学 | Pose estimation orthogonal iterative optimization algorithm |
| CN108627142A (en) * | 2018-05-02 | 2018-10-09 | 成都纵横自动化技术有限公司 | A kind of object localization method of combination offline elevation and airborne photoelectric gondola |
| CN108627142B (en) * | 2018-05-02 | 2020-07-17 | 成都纵横自动化技术股份有限公司 | Target positioning method combining offline elevation and airborne photoelectric pod |
| CN109683629A (en) * | 2019-01-09 | 2019-04-26 | 燕山大学 | Unmanned plane electric stringing system based on integrated navigation and computer vision |
| CN109683629B (en) * | 2019-01-09 | 2020-08-21 | 燕山大学 | Unmanned aerial vehicle electric power overhead line system based on combination navigation and computer vision |
| CN109764864A (en) * | 2019-01-16 | 2019-05-17 | 南京工程学院 | A method and system for indoor UAV pose acquisition based on color recognition |
| CN109764864B (en) * | 2019-01-16 | 2022-10-21 | 南京工程学院 | A method and system for indoor UAV pose acquisition based on color recognition |
| CN110310331A (en) * | 2019-06-18 | 2019-10-08 | 哈尔滨工程大学 | A Pose Estimation Method Based on the Combination of Line Features and Point Cloud Features |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN101833761A (en) | UAV pose estimation method based on cooperative target feature lines | |
| Patruno et al. | A vision-based approach for unmanned aerial vehicle landing | |
| CN105021184B (en) | It is a kind of to be used for pose estimating system and method that vision under mobile platform warship navigation | |
| CN102435188B (en) | Monocular vision/inertia autonomous navigation method for indoor environment | |
| CN106326892B (en) | A visual landing pose estimation method for rotary-wing UAV | |
| Zhao et al. | A robust real-time vision system for autonomous cargo transfer by an unmanned helicopter | |
| CN113474677A (en) | Automated method for UAV landing on a pipeline | |
| CN105783913A (en) | SLAM device integrating multiple vehicle-mounted sensors and control method of device | |
| Xu et al. | Use of land’s cooperative object to estimate UAV’s pose for autonomous landing | |
| CN103697883B (en) | A kind of aircraft horizontal attitude defining method based on skyline imaging | |
| Yu et al. | Appearance-based monocular visual odometry for ground vehicles | |
| CN102788580A (en) | Flight path synthetic method in unmanned aerial vehicle visual navigation | |
| US20210272289A1 (en) | Sky determination in environment detection for mobile platforms, and associated systems and methods | |
| CN108122255A (en) | It is a kind of based on trapezoidal with circular combination terrestrial reference UAV position and orientation method of estimation | |
| CN114415736B (en) | A UAV multi-stage visual precision landing method and device | |
| CN108036786A (en) | Position and posture detection method, device and computer-readable recording medium based on auxiliary line | |
| CN108225273A (en) | A kind of real-time runway detection method based on sensor priori | |
| KR102827978B1 (en) | 2D image and 3D point cloud registration method and system | |
| Stuebler et al. | Feature-based mapping and self-localization for road vehicles using a single grayscale camera | |
| Mansur et al. | Real time monocular visual odometry using optical flow: study on navigation of quadrotors UAV | |
| Shang et al. | Vision model-based real-time localization of unmanned aerial vehicle for autonomous structure inspection under GPS-denied environment | |
| Hoang et al. | Combining edge and one-point ransac algorithm to estimate visual odometry | |
| Hoang et al. | Motion estimation based on two corresponding points and angular deviation optimization | |
| Burdziakowski | Towards precise visual navigation and direct georeferencing for MAV using ORB-SLAM2 | |
| Boucheloukh et al. | UAV navigation based on adaptive fuzzy backstepping controller using visual odometry |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| C06 | Publication | ||
| PB01 | Publication | ||
| C10 | Entry into substantive examination | ||
| SE01 | Entry into force of request for substantive examination | ||
| C12 | Rejection of a patent application after its publication | ||
| RJ01 | Rejection of invention patent application after publication |
Application publication date: 20100915 |