
CN105222788B - Self-correction method for aircraft route offset error based on feature matching - Google Patents


Info

Publication number
CN105222788B
CN105222788B
Authority
CN
China
Prior art keywords
image
aircraft
matching
sift
landmark
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201510641575.6A
Other languages
Chinese (zh)
Other versions
CN105222788A (en)
Inventor
陶晓明
刘喜佳
徐洁
葛宁
陆建华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tsinghua University
Original Assignee
Tsinghua University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tsinghua University filed Critical Tsinghua University
Priority to CN201510641575.6A priority Critical patent/CN105222788B/en
Publication of CN105222788A publication Critical patent/CN105222788A/en
Application granted granted Critical
Publication of CN105222788B publication Critical patent/CN105222788B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20: Instruments for performing navigational calculations
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165: Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

A self-correction method for aircraft route offset error based on feature matching, belonging to the technical field of aircraft route position offset correction, divided into an offline part and an online part. In the offline part, satellite or UAV imagery of a selected landmark region is acquired as a prior image; SIFT algorithm software extracts the SIFT feature-point parameters of the prior image, which, together with the geographic location information of the landmark region, form a prior database. The aircraft loads this prior database into its onboard memory. In flight, SIFT algorithm software extracts feature-point parameters from each captured image and matches them against the prior-image SIFT feature points to form feature-point pairs; then, according to a preset threshold on the number of fine-matched feature-point pairs and a preset aircraft yaw accuracy, different offset solution methods are selected. Compared with real-time computation of the aircraft's spatial position from a single-point landmark, the invention removes the influence of altitude and attitude-angle information on the solution accuracy.

Description

Self-correction method for aircraft route offset error based on feature matching

Technical Field

The invention provides a self-correction method for aircraft route offset error based on feature matching, belonging to the field combining image information processing and navigation. According to the feature registration result, a route offset solution method is adaptively selected, so as to obtain the position offset of the aircraft.

Background

During flight an aircraft relies on an Inertial Navigation System (INS) for positioning, but because of the accumulated error of the inertial devices, the position coordinates output by the INS diverge over time. To correct this accumulated error, integrated navigation is generally used to correct the INS output and improve its accuracy. INS combined with a satellite positioning system, such as the Global Positioning System (GPS) or the BeiDou Navigation Satellite System (BDS), is currently the most common form of integrated navigation.

In practice, however, a satellite positioning system is not always available in real time: it is affected by satellite platform availability and by signal interference. When the satellite positioning system fails, other means are needed to correct the INS output. Because visual navigation is highly autonomous, accurate, and real-time, INS/visual navigation has become an important development direction for integrated navigation.

The patent of Ding Wenrui and Kang Chuanbo (application publication number CN 103822635A), a real-time method for computing the spatial position of a UAV in flight from visual information, uses a single-point landmark to compute the aircraft's spatial position in real time; the computation requires the height information of the single-point landmark, so the height error is introduced into the result. The present invention, by contrast, makes full use of the information in the landmark region and uses SIFT-matched feature points for the offset solution: when the number of matched pairs exceeds the preset threshold N_fine, the method of step (5.2) or step (5.3) can be used; the method of step (5.2) eliminates the influence of the height error on the solution accuracy, and the method of step (5.3) eliminates the influence of both attitude-angle and height information on the solution accuracy.

Summary of the Invention

The object of the present invention is to provide a route offset solution method based on feature registration using an image sensor: when the satellite positioning system fails, the inertial navigation error of the aircraft is corrected in real time by collecting image information of known landmarks.

The system is divided into two parts: offline processing and onboard online processing.

The role of offline processing is to select landmark regions and build a prior-information database, in the following steps:

Step (1): Select a landmark region and obtain its prior image information.

Step (2): Extract SIFT feature points from the prior image of the landmark region.

Step (3): Generate the prior database of the landmark region, including the prior image information, the feature-point information, and the geographic location information of the landmark region.

The role of online processing is, during flight, to capture images of the landmark region with the image sensor, compare them against the prior database built offline, detect the landmark region, and compute the aircraft's position information from the pixel coordinates of the landmark region in the captured image. It comprises the following steps:

Step (1): Store the prior database of the landmark region in the aircraft's memory.

Step (2): Capture a real-time image of the selected landmark region and preprocess it:

Because of environmental, sensor, and platform disturbances, the image data acquired by the aircraft contain noise and various forms of interference; to guarantee the quality and performance of the subsequent operations, the acquired images must first be preprocessed.

Step (3): Extract the SIFT feature points of the captured image:

SIFT algorithm software extracts feature points from the preprocessed captured image: the captured image of the landmark region is the input of the SIFT algorithm, and the output is a stable set of SIFT feature-point parameters.

Step (4): Feature matching:

Feature matching registers the SIFT feature-parameter set of the prior image in the landmark region's database against the SIFT feature-parameter set of the image captured by the sensor. It is divided into two parts: coarse matching and fine matching.

Step (4.1): Coarse feature matching:

Coarse matching establishes the mapping between the prior image and the captured image from the Euclidean distances between the description vectors of their SIFT feature points. Its output consists of the feature-point pairs successfully matched between the prior image and the captured image, together with a score for each matched pair; the score is the squared Euclidean distance between the pair's description vectors.

Step (4.2): Count the coarse-matching results:

If the number of feature-point pairs successfully matched in coarse registration is 0, output "solution failed" and return to step (2); if it is greater than zero, record the number N of successfully matched pairs.

Step (4.3): Fine feature matching:

Set the fine-matching threshold N_fine (N_fine > 3) and the required solution accuracy DIST (in metres).

Fine matching further screens the coarse matches with MLESAC (Maximum Likelihood Estimation by Sample and Consensus), removing incorrectly matched feature pairs; the remaining matched pairs are passed to step (5) to solve for the aircraft's position offset.

When the number N of successfully coarse-registered feature-point pairs satisfies N ≤ N_fine, output the pair with the smallest score among the N pairs and solve with the method of step (5.1). When N > N_fine, perform fine registration to screen out the wrong matching pairs and output the correct feature-point pairs, and then test the preset solution accuracy DIST: if DIST ≤ 10 m, go to step (5.3); if DIST > 10 m, go to step (5.2).

Step (5): Route offset solution:

The route offset solution computes the aircraft's position offset from the pixel coordinates of the landmark region in the captured image; this part is the core of the invention.

The invention provides three route offset solution methods, (a), (b), and (c); the appropriate one is selected according to the number N of successfully matched feature-point pairs in fine matching and the preset solution accuracy DIST.

The advantages of the present invention are:

(1) Good adaptability. Without GPS or BeiDou information, the aircraft's spatial position can be determined directly from the coordinates of the known landmark region in the captured image.

(2) Good stability. Three strategies are designed for computing the aircraft's position coordinates; combining the three methods makes full use of the information in the landmark region and improves the solution accuracy, while method (a) can still be used when the information near the landmark is incomplete, giving strong adaptability.

Brief Description of the Drawings

Figure 1 is a flowchart of the offline processing part of the invention.

Figure 2 is a flowchart of the online processing part of the invention.

Figure 3 shows the output after coarse matching during online processing.

Figure 4 shows the output after fine matching during online processing.

Detailed Description of the Embodiments

The specific embodiments of the invention are described in detail below with reference to the accompanying drawings.

The flowchart of the offline processing part of the feature-matching-based self-correction method for aircraft route offset error is shown in Figure 1; it includes the following steps:

Step (1): Select the landmark region and obtain its prior image:

The landmark region is selected along the aircraft's pre-planned flight route. It should have rich, distinctive image features that are hard to confuse, such as skyscrapers, overpasses, or rivers, which are distinctive and difficult to replicate.

Images of the chosen landmark region are collected by satellite or UAV, yielding the prior image of the region and its geographic coordinate information.

Step (2): Extract the SIFT feature points of the prior image:

SIFT algorithm software extracts the feature points of the prior image, and their description vectors form the set V_c of prior-image feature-point description vectors; N_c is the number of SIFT feature points of the prior image.

Step (3): Generate the prior database of the landmark region: store the image information obtained in steps (1) and (2), the geographic coordinates of the landmark region, and the SIFT feature-point parameters in the prior database, completing its construction.

The flowchart of the online processing part of the feature-matching-based self-correction method for aircraft route offset error is shown in Figure 2; it includes the following steps:

Step (1): Store the prior database of the landmark region in the aircraft's memory in advance.

Step (2): Capture images of the selected landmark region in real time and preprocess them:

The captured images from the aircraft's image sensor are preprocessed: denoising with a median filter, image enhancement by grey-level histogram correction, and lens distortion correction using the camera intrinsic parameters obtained by prior calibration.
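The denoising step above can be sketched with a pure-Python 3x3 median filter. This is an illustrative minimal version only (a real system would use an optimised image library, and the histogram correction and distortion correction steps are omitted); the image is represented as a list of equal-length rows of grey values, with borders left unchanged:

```python
def median_filter3(img):
    """3x3 median filter for the denoising step of the preprocessing.
    `img` is a list of equal-length rows of grey values; border pixels
    are copied through unchanged."""
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # sort the 9-pixel neighbourhood and take the middle value
            window = sorted(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1))
            out[y][x] = window[4]
    return out
```

A single bright impulse surrounded by uniform background is replaced by the background value, which is exactly the salt-noise behaviour the median filter is chosen for.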

Step (3): Extract the SIFT features of the captured image of the landmark region:

SIFT algorithm software extracts the feature points of the preprocessed captured image, and their description vectors form the set V_p of captured-image SIFT feature-point description vectors; N_p is the number of SIFT feature points of the captured image.

Step (4): Match the SIFT feature points of the prior image and the captured image:

Feature matching registers the SIFT feature-parameter set of the prior image in the landmark region's database against the SIFT feature-parameter set of the captured image. It is divided into two parts: coarse matching and fine matching.

Step (4.1): Coarse feature matching:

Step (4.1.1): Take the description vector v_p of an arbitrary feature point from the captured-image SIFT feature-point set V_p and coarsely match it against the description vectors of all feature points in the prior-image set V_c, as follows:

Step (4.1.1.1): Compute the following parameters:

Compute the Euclidean distances between v_p and the description vectors of all feature points in V_c, and sort them into an ascending sequence; let d_1 be the unique minimum of this sequence and d_2 its second-smallest value.

Step (4.1.1.2): Judge whether d_1 is sufficiently smaller than d_2:

If d_1 is smaller than the acceptance threshold (the source elides the exact condition, which compares d_1 against a value tied to d_2), the match succeeds, a feature-point pair is obtained, and d_1^2 is output as the pair's score, i.e. score = d_1^2.

Otherwise, the coarse match of this point fails: this feature point of the captured image has no corresponding feature point in the prior image that can be matched.

Step (4.1.1.3): Apply steps (4.1.1.1) and (4.1.1.2) in turn to each remaining feature point in the captured-image SIFT feature-point set V_p, until all N_p SIFT feature points of the captured image have been matched.
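Steps (4.1.1.1) through (4.1.1.3) amount to a nearest/second-nearest distance test over the two descriptor sets. A minimal sketch follows; since the exact acceptance condition is elided in this text, the sketch assumes a Lowe-style ratio test with an illustrative threshold of 0.8:

```python
import math

def coarse_match(desc_p, desc_c, ratio=0.8):
    """For each captured-image descriptor, find the nearest (d1) and
    second-nearest (d2) prior-image descriptors by Euclidean distance and
    accept the pair only if d1 < ratio * d2.  As in step (4.1.1.2), the
    pair's score is d1**2.  The 0.8 ratio is an assumed value."""
    pairs = []
    for i, vp in enumerate(desc_p):
        dists = sorted((math.dist(vp, vc), j) for j, vc in enumerate(desc_c))
        if len(dists) < 2:
            continue
        (d1, j1), (d2, _) = dists[0], dists[1]
        if d1 < ratio * d2:       # ratio test; ties (d1 == d2) are rejected
            pairs.append((i, j1, d1 * d1))  # (captured idx, prior idx, score)
    return pairs
```

An ambiguous point, i.e. one whose two best candidates are nearly equidistant, is rejected, which is what makes the coarse stage discard repeated-texture matches.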

Step (4.2): Count the coarse-matching results:

If the number of successfully matched SIFT points is 0, output "solution failed" and return to step (2); if it is greater than zero, record the number N of successfully matched pairs.

Figure 3 shows the result after coarse matching: the red area is the prior image, red circles are its SIFT feature points, the blue part is the captured image, and green crosses are its SIFT feature points. The coarse-registration matches contain wrongly matched points, as shown in the figure, so fine registration is needed for further screening.

Step (4.3): Fine feature-point matching:

Step (4.3.1): Initialisation:

Set the fine-matching threshold N_define = 8 and the yaw accuracy DIST = 20 m.

Step (4.3.2): Judge whether the number N of coarse-matching successes counted in step (4.2) satisfies N ≤ 8:

If N ≤ 8, execute step (5-a);

If N > 8, execute step (4.3.3).

Step (4.3.3): Perform fine registration with MLESAC (Maximum Likelihood Estimation by Sample and Consensus): analyse the consistency of the coordinate relations among the matched pairs determined by coarse registration, remove the wrongly matched feature pairs, and output the remaining correctly matched feature points.

Figure 4 shows the feature-matching output after fine registration; the wrong matches have been removed.

Step (4.3.4): Judge whether the preset yaw accuracy satisfies DIST ≤ 10 m:

If DIST > 10 m, execute step (5-b);

If DIST ≤ 10 m, execute step (5-c).

Step (5): Compute the aircraft's position offset from the pixel coordinates of the landmark region in the captured image; this is the core of the invention:

The data provided by the aircraft include its current latitude and longitude coordinates (B, L), its current altitude H, and its current attitude angles: yaw α, pitch β, and roll γ.

The three methods for computing the aircraft's position offset are implemented as follows:

Step (5.1): Use the altitude H, the three attitude angles α, β, γ, and the pixel coordinates of the landmark region M in the captured image to compute the aircraft's coordinates in the East-North-Up (ENU) frame whose origin is the position given by the aircraft's inertial navigation.

Let an arbitrary point m of the landmark region M have coordinates (x_m, y_m, z_m) in the camera frame, let its imaging point p have coordinates (p_x, p_y) in the imaging-plane frame, and let p have coordinates (u_p, v_p) in the image (u-v) frame. The relationship between (u_p, v_p) and (p_x, p_y) is given by formula 1-1. By the projection principle, (x_m, y_m, z_m) and (p_x, p_y) are related by formula 1-2, where S_x and S_y are the numbers of pixels per unit physical length along the X and Y axes of the imaging plane, and f is the camera's focal length.

Combining formula 1-1 and formula 1-2 gives formula 1-3.

An ENU frame (X_w Y_w Z_w) is established with the camera's optical centre O as origin; the relationship between the camera frame and this ENU frame is given by formula 1-4, where R_w is the attitude rotation matrix between the camera frame and the camera's ENU frame.

Combining formula 1-3 and formula 1-4 gives formula 1-5.

When a landmark point m is correctly identified, the three attitude angles at that moment (yaw α, pitch β, roll γ) and the shooting height H are available; with z_w = H, the z-axis coordinate z_m of m in the camera frame can be solved, giving the coordinates (x_w, y_w, z_w) of m in the camera's ENU frame.

This method depends on the aircraft's altitude H, the attitude angles, and the landmark's coordinates in the image. The altitude H is measured by an altitude sensor and carries an error, and the attitude angles carry errors as well. Computing the aircraft's position from data that already contain errors carries those errors into the result.
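Step (5.1) amounts to back-projecting the landmark's pixel along the camera ray until it meets the ground plane. The sketch below is a minimal version under stated assumptions: pixel coordinates are already centred on the principal point, R is the camera-to-ENU rotation matrix (playing the role of R_w), and the ground is taken at z_w = -H below the optical centre; the patent's formulas 1-3 to 1-5 fix the exact conventions, which are not recoverable from this text:

```python
def landmark_world_coords(u, v, f, Sx, Sy, H, R):
    """Back-project centred pixel (u, v) of a single landmark onto the
    ground plane.  Camera ray in camera coordinates: (u/(f*Sx), v/(f*Sy), 1).
    R is a 3x3 camera-to-ENU rotation matrix; the ground plane is assumed
    at z_w = -H relative to the optical centre."""
    ray_cam = (u / (f * Sx), v / (f * Sy), 1.0)
    # rotate the ray into the local ENU frame: ray_w = R @ ray_cam
    ray_w = tuple(sum(R[i][k] * ray_cam[k] for k in range(3)) for i in range(3))
    s = -H / ray_w[2]   # scale factor at which the ray meets z_w = -H
    return tuple(s * c for c in ray_w)
```

For a nadir-pointing camera (camera z-axis straight down), the horizontal offsets reduce to H*u/(f*Sx) and H*v/(f*Sy), which shows directly how an error in H scales linearly into the computed position, the weakness discussed above.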

Step (5.2): Following method (5-a), suppose the aircraft's true position has coordinates (x_c, y_c, z_c) in the ENU frame whose origin is the navigation-output position, the landmark point m of the landmark region has pixel coordinates (u_p, v_p) in the captured image, and m has coordinates (x_m, y_m, z_m) in the same ENU frame; then the imaging position (u_p, v_p) of m in the image frame is determined by imaging equation 1-6.

u_p = [(cos α cos γ + sin α sin β sin γ)(x_m - x_c) - cos β sin γ (z_m - z_c) - (cos γ sin α - cos α sin β sin γ)(y_m - y_c)] f S_x
      / [(cos α sin γ - cos γ sin α sin β)(x_m - x_c) - (sin α sin γ + cos α cos γ sin β)(y_m - y_c) + cos β cos γ (z_m - z_c)]

v_p = [sin β (z_m - z_c) + cos α cos β (y_m - y_c) + cos β sin α (x_m - x_c)] f S_y
      / [(cos α sin γ - cos γ sin α sin β)(x_m - x_c) - (sin α sin γ + cos α cos γ sin β)(y_m - y_c) + cos β cos γ (z_m - z_c)]

(1-6)

Here (x_c, y_c, z_c) are the camera's spatial coordinates, (α, β, γ) its spatial attitude angles, and (f, S_x, S_y) its imaging focal length and pixel distribution densities; the latter are built-in camera parameters, provided by the manufacturer or obtained by offline calibration.

The above equations form a non-homogeneous linear system in (x_c, y_c, z_c) given (α, β, γ). Solving for (x_c, y_c, z_c) by the least-squares criterion yields the aircraft's current position coordinates (x_c, y_c, z_c), i.e. the offset of the aircraft's true position relative to the INS output position. This method uses only attitude-angle information for the computation, eliminating the influence of the altitude H measurement error on the final result.
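Cross-multiplying equation 1-6 makes each matched landmark contribute two rows that are linear in the unknown offset (x_c, y_c, z_c), so step (5.2) reduces to a least-squares solve of an overdetermined linear system. A stdlib-only sketch of that solve via the normal equations follows; the construction of the rows from 1-6 is omitted, and A and b here are generic:

```python
def solve_least_squares(A, b):
    """Least-squares solution of an overdetermined system A x = b via the
    normal equations A^T A x = A^T b, then Gaussian elimination with
    partial pivoting on the small (here 3x3) normal system."""
    n = len(A[0])
    M = [[sum(A[r][i] * A[r][j] for r in range(len(A))) for j in range(n)]
         for i in range(n)]                                  # A^T A
    rhs = [sum(A[r][i] * b[r] for r in range(len(A))) for i in range(n)]  # A^T b
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        rhs[col], rhs[piv] = rhs[piv], rhs[col]
        for r in range(col + 1, n):
            k = M[r][col] / M[col][col]
            for j in range(col, n):
                M[r][j] -= k * M[col][j]
            rhs[r] -= k * rhs[col]
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = (rhs[i] - sum(M[i][j] * x[j] for j in range(i + 1, n))) / M[i][i]
    return x
```

With more than N_fine matched pairs the system has more rows than unknowns, which is why the fine-matching threshold N_fine > 3 guarantees this method is well posed.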

Step (5.3): When the number of feature-matched pairs satisfies N ≥ 3, an optimisation function can instead be constructed and the optimisation problem solved iteratively.

The optimisation function is constructed as in formula 1-7: for each successfully fine-matched feature point of the captured image, the corresponding pixel coordinates predicted from the aircraft's candidate position and attitude angles are computed via the imaging equation, as a function of (x_c, y_c, z_c) and (α, β, γ), and compared with the feature point's observed pixel coordinates in the captured image.

Here m is the number of feature-point pairs output by fine matching, i = 1, 2, ..., m.

The aircraft's true position coordinates and attitude angles are obtained by iterative minimisation of the optimisation function J, using MATLAB's built-in fminsearch function, with the iteration count set to 10000 and the iteration tolerance set to 1e-4, to find (x_c, y_c, z_c). The position coordinates computed this way eliminate the influence of both altitude and attitude-angle errors.
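Step (5.3) minimises the reprojection cost J of formula 1-7 with a derivative-free search (fminsearch is a Nelder-Mead simplex method). The sketch below uses a much simpler coordinate-descent minimiser and a toy pure-translation "projection" in place of the full imaging equation; the cost has the structure of 1-7 (summed squared pixel residuals over the fine-matched points), but both the projection model and the minimiser are illustrative stand-ins:

```python
def minimize(cost, x0, step=1.0, shrink=0.5, sweeps=60):
    """Derivative-free coordinate descent: try +/- step along each
    coordinate, keep improvements, shrink the step when a sweep stalls."""
    x = list(x0)
    for _ in range(sweeps):
        improved = False
        for i in range(len(x)):
            for d in (step, -step):
                trial = x[:]
                trial[i] += d
                if cost(trial) < cost(x):
                    x, improved = trial, True
        if not improved:
            step *= shrink
    return x

# Toy instance of formula 1-7: landmarks "project" by pure translation,
# so J is the summed squared pixel residual over the matched points.
landmarks = [(0.0, 0.0), (5.0, 1.0), (2.0, 7.0)]
observed = [(xm - 3.0, ym + 2.0) for xm, ym in landmarks]  # true offset (3, -2)

def J(p):
    return sum((xm - p[0] - ou) ** 2 + (ym - p[1] - ov) ** 2
               for (xm, ym), (ou, ov) in zip(landmarks, observed))

estimate = minimize(J, [0.0, 0.0])
```

Because J depends only on residuals between predicted and observed pixels, no altitude or attitude measurement enters the cost directly, which is the property the text credits to the third method.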

Among the three methods, the first requires both the attitude-angle and the altitude information, so the computed aircraft position coordinates carry both attitude-angle and altitude errors. However, this method places the weakest requirement on the matched feature pairs: in theory it can be used whenever the number of matched pairs satisfies N ≥ 1.

The second method theoretically requires N ≥ 3 matched feature pairs; in an actual design, the corresponding threshold N_define can be set according to the specific system. This method needs no altitude information, only the attitude angles, so the error of the computed aircraft position coordinates is affected only by the attitude-angle error and is smaller than that of the first method.

The third method likewise theoretically requires N ≥ 3 matched feature pairs, with the threshold N_define set according to the actual system. It uses an iterative method that needs neither altitude nor attitude-angle information, and its solution accuracy can reach better than 10 m.

Combining the three methods (a), (b), and (c) makes full use of the information of the landmark region and improves the solution accuracy; at the same time, when the information near the landmark point is incomplete, the position can still be solved with method (a), so the scheme has strong adaptability.
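The selection among the three methods, using the thresholds described above (the coarse-match count N against N_define, and the yaw precision DIST against 10 m), can be sketched as a small dispatch function. The function name and return labels are illustrative, not from the patent, and the final choice between (b) and (c) when both are admissible is left to the system design.

```python
def choose_solver(n_matches, dist_m, n_define=3):
    """Dispatch between the solver variants using the thresholds above:
    coarse-match count N against N_define, yaw precision DIST against
    10 m.  Labels are illustrative; the patent does not name them."""
    if n_matches == 0:
        return "reacquire"     # matching failed: capture a new image
    if n_matches <= n_define:
        return "(a)"           # too few pairs: single-landmark solve, step (5.2)
    if dist_m > 10.0:
        return "(a)"           # coarse precision suffices: step (5.2)
    return "(b)/(c)"           # N > N_define and DIST <= 10 m: steps (5.3)/(5.4)
```

For example, `choose_solver(2, 50.0)` falls back to the single-landmark method, while `choose_solver(10, 5.0)` admits the more accurate least-squares or iterative solvers.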

Claims (1)

1. An automatic correction method for aircraft route offset error based on feature matching, characterized in that the method is divided into two parts, offline ground processing and online aircraft processing:
The offline ground-processing part is divided into the following steps:
Step (1): select a region that has obvious features and is not easily duplicated elsewhere as the landmark region, and obtain a prior image of the landmark region using an unmanned aerial vehicle or a satellite;
Step (2): extract the SIFT features of the prior image:
Using SIFT algorithm software, extract the description vectors of the SIFT features of the prior image, forming the set Vc of prior-image feature-point description vectors, where Nc is the number of SIFT feature points in the prior image;
Step (3): generate the prior database of the landmark region, comprising the prior image, the feature-point information, and the geographic location information of the landmark region;
The online aircraft-processing part is divided into the following steps:
Step (1): the prior database of the landmark region is loaded into the aircraft's memory in advance;
Step (2): capture a real-time image of the selected landmark region and perform image preprocessing:
During real-time flight the aircraft photographs the landmark region to acquire an image; then median filtering is applied for denoising and gray-level histogram adjustment for image enhancement; finally, lens distortion calibration is performed with the preset camera intrinsic parameters;
Step (3): extract the SIFT features of the acquired image:
Using SIFT algorithm software, extract the description vectors of the SIFT features of the acquired image, generating the set Vp of acquired-image feature-point description vectors, where Np is the number of SIFT feature points in the acquired image;
Step (4): perform SIFT feature matching between the prior image and the acquired image:
Step (4.1): coarse feature-point matching:
Step (4.1.1): take the description vector of an arbitrary feature point from the acquired-image SIFT feature-point set Vp and coarsely match it against the description vectors of all feature points in the set Vc of prior-image SIFT feature description vectors according to the following steps:
Step (4.1.1.1): compute the following parameters:
Compute the Euclidean distances between the chosen description vector and the description vectors of all feature points in Vc, and arrange these Euclidean distances from small to large to form an ascending distance sequence; denote by dmin the unique minimum of the sequence and by dsec its second-smallest value;
Step (4.1.1.2): judge whether the ratio dmin/dsec is below the set threshold:
If it is below the threshold, the match succeeds, a feature-point pair is obtained, and the ratio is output as the score of this feature-point pair;
If it is not below the threshold, the coarse match for this point fails, i.e., the feature point of the acquired image has no corresponding feature point to match in the prior image;
Step (4.1.1.3): perform steps (4.1.1.1) to (4.1.1.2) in turn for each remaining feature point in the acquired-image SIFT feature-point set Vp, until all Np SIFT feature points of the acquired image have been matched;
Step (4.2): count the coarse matching results:
If the number of successfully matched SIFT points is 0, output a solution failure and return to step (2); if the number of successfully matched SIFT feature points is greater than zero, count the number N of successfully matched pairs;
Step (4.3): fine feature-point matching:
Step (4.3.1): initialization:
Set the fine-matching threshold N_define, which should satisfy N_define > 3, and the yaw precision DIST, in units of meters;
Step (4.3.2): judge whether the number N of coarse matches counted in step (4.2) satisfies N ≤ N_define:
If N ≤ N_define, perform step (5.2);
If N > N_define, perform step (4.3.3);
Step (4.3.3): perform fine registration using MLESAC (Maximum Likelihood Estimation by Sample and Consensus): analyze the consistency of the coordinate relationship of the matches determined by the coarse registration, remove the erroneously matched feature pairs, and output the remaining correctly matched feature points;
Step (4.3.4): judge whether the set yaw precision satisfies DIST ≤ 10 m:
If DIST > 10 m, perform step (5.2);
If DIST ≤ 10 m, perform step (5.3);
Step (5): compute the position offset of the aircraft from the pixel coordinates of the landmark region in the acquired image:
Step (5.1): the aircraft acquires the following parameters:
the UAV's current latitude and longitude coordinates (B, L), the current altitude H, and the UAV's current attitude angles: yaw angle α, pitch angle β, and roll angle γ;
Step (5.2), whose sub-steps are as follows:
Step (5.2.1): select the matching pair with the smallest score and perform step (5.2.2);
Step (5.2.2): compute the aircraft's position offset successively according to the following steps:
Step (5.2.2.1): establish an East-North-Up coordinate system (XwYwZw) with the optical center O of the camera as origin;
Step (5.2.2.2): taking the position provided by the aircraft's inertial navigation system as the origin, compute the coordinates (xw, yw, zw) of the landmark point m in the established East-North-Up coordinate system as follows:
where (up, vp) are the pixel coordinates of the landmark point m in the acquired image;
zm is the z-axis coordinate of the landmark point m in the camera coordinate system; since zw = H is known, zm can be obtained, from which xw and yw are found;
Rw is the attitude rotation matrix between the camera coordinate system and the camera's East-North-Up coordinate system, and RwT is the transpose of Rw;
Sx and Sy are respectively the numbers of pixels per unit physical size along the X-axis and Y-axis directions of the acquired-image imaging-plane coordinate system, and are intrinsic parameters of the camera;
f is the focal length of the camera, an intrinsic parameter of the camera;
Step (5.3), whose sub-steps are as follows:
Step (5.3.1): let:
(xc, yc, zc) be the coordinates of the aircraft's true position in the East-North-Up coordinate system established with the inertial-navigation output position as origin; (up, vp) be the pixel coordinates on the imaging sensor of the landmark point m of the landmark region in the acquired image; and (xm, ym, zm) be the coordinates of the landmark point m in the East-North-Up coordinate system established with the aircraft's inertial-navigation output position as origin;
Step (5.3.1.1): the imaging equations are:
up = [(cos α cos γ + sin α sin β sin γ)(xm − xc) − cos β sin γ (zm − zc) − (cos γ sin α − cos α sin β sin γ)(ym − yc)] · f · Sx / [(cos α sin γ − cos γ sin α sin β)(xm − xc) − (sin α sin γ + cos α cos γ sin β)(ym − yc) + cos β cos γ (zm − zc)]
vp = [sin β (zm − zc) + cos α cos β (ym − yc) + cos β sin α (xm − xc)] · f · Sy / [(cos α sin γ − cos γ sin α sin β)(xm − xc) − (sin α sin γ + cos α cos γ sin β)(ym − yc) + cos β cos γ (zm − zc)]
Step (5.3.1.2): the above equations form a non-homogeneous linear system in (xc, yc, zc) given the attitude angles (α, β, γ); solving for (xc, yc, zc) by the least-squares criterion yields the UAV's current position coordinates (xc, yc, zc), i.e., the offset of the aircraft's true position relative to the inertial-navigation output position coordinates;
Step (5.4), whose sub-steps are as follows:
Step (5.4.1): compute the optimization function J as follows,
where
m is the number of feature-point pairs output after fine matching, i = 1, 2, ..., m;
the predicted pixel coordinates of the successfully fine-matched feature points in the acquired image, computed from the position and attitude angles of the aircraft through the imaging equations, are functions of (xc, yc, zc) and (α, β, γ);
the observed values are the pixel coordinates of the successfully fine-matched feature points in the acquired image;
Step (5.4.2): solve using the following optimization method:
iterate on J using MATLAB's fminsearch function, with the number of iterations set to 10000 and the iteration tolerance set to 1e-4, to obtain (xc, yc, zc).
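To make the coarse-matching loop of step (4.1.1) concrete, here is a minimal NumPy sketch of the distance-ratio test. The 0.8 threshold and the 3-dimensional toy descriptors are illustrative only: real SIFT descriptors are 128-dimensional, and the patent leaves the threshold value unspecified.

```python
import numpy as np

def coarse_match(desc_p, desc_c, ratio_thresh=0.8):
    """Coarse SIFT matching by the distance-ratio test of step (4.1.1).
    desc_p: descriptors of the acquired image; desc_c: descriptors of the
    prior image.  Returns (index_in_p, index_in_c, score) tuples, where
    score is the ratio of the smallest to the second-smallest Euclidean
    distance (smaller score = more distinctive match).
    ratio_thresh = 0.8 is an illustrative value, not from the patent."""
    matches = []
    for i, d in enumerate(desc_p):
        dists = np.linalg.norm(desc_c - d, axis=1)    # distances to all prior points
        order = np.argsort(dists)                     # ascending distance sequence
        d_min, d_sec = dists[order[0]], dists[order[1]]
        if d_sec > 0 and d_min / d_sec < ratio_thresh:   # ratio test
            matches.append((i, int(order[0]), d_min / d_sec))
    return matches

# Toy descriptors: the first acquired point clearly matches prior point 0;
# the second is equidistant from two prior points and is rejected as ambiguous.
desc_c = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
desc_p = np.array([[0.9, 0.1, 0.0], [0.5, 0.5, 0.0]])
matches = coarse_match(desc_p, desc_c)
```

Step (5.2.1)'s "select the matching pair with the smallest score" then corresponds to picking the tuple with the minimum third element.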
CN201510641575.6A 2015-09-30 2015-09-30 Automatic correction method for aircraft route offset error based on feature matching Active CN105222788B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510641575.6A CN105222788B (en) 2015-09-30 2015-09-30 Automatic correction method for aircraft route offset error based on feature matching

Publications (2)

Publication Number Publication Date
CN105222788A CN105222788A (en) 2016-01-06
CN105222788B true CN105222788B (en) 2018-07-06

Family

ID=54991878

Country Status (1)

Country Link
CN (1) CN105222788B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106444832A (en) * 2016-09-28 2017-02-22 北京航天时代激光导航技术有限责任公司 Navigation method for low-altitude cruising state of antisubmarine aircraft

Families Citing this family (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105825517B (en) * 2016-03-31 2018-09-07 湖北航天技术研究院总体设计所 A kind of image correcting method and system of navigation height error
CN108253940B (en) 2016-12-29 2020-09-22 东莞前沿技术研究院 Positioning method and device
CN108958064B (en) * 2017-05-17 2021-10-01 上海微小卫星工程中心 Attitude guidance law error judgment method, system and electronic device
CN109307510A (en) * 2017-07-28 2019-02-05 广州极飞科技有限公司 Flight navigation method, apparatus and unmanned vehicle
CN109324337B (en) * 2017-07-31 2022-01-14 广州极飞科技股份有限公司 Unmanned aerial vehicle route generation and positioning method and device and unmanned aerial vehicle
CN107843240B (en) * 2017-09-14 2020-03-06 中国人民解放军92859部队 Method for rapidly extracting same-name point information of unmanned aerial vehicle image in coastal zone
CN107976146B (en) * 2017-11-01 2019-12-10 中国船舶重工集团公司第七一九研究所 Self-calibration method and measurement method of linear array CCD camera
CN109344970B (en) * 2018-11-27 2022-03-15 中国电子科技集团公司第二十研究所 Vision target-based dynamic reasoning method on unmanned aerial vehicle
CN109614998A (en) * 2018-11-29 2019-04-12 北京航天自动控制研究所 Landmark database preparation method based on deep learning
CN110113528B (en) * 2019-04-26 2021-05-07 维沃移动通信有限公司 Parameter obtaining method and terminal equipment
CN111815524B (en) * 2019-12-11 2024-04-23 长沙天仪空间科技研究院有限公司 Correction system and method for radiation calibration
CN111899305A (en) * 2020-07-08 2020-11-06 深圳市瑞立视多媒体科技有限公司 A kind of camera automatic calibration optimization method and related system and equipment
CN112033390B (en) * 2020-08-18 2022-07-12 深圳优地科技有限公司 Robot navigation deviation rectifying method, device, equipment and computer readable storage medium
CN112902957B (en) * 2021-01-21 2024-01-16 中国人民解放军国防科技大学 Missile-borne platform navigation method and system
CN113252079B (en) * 2021-07-05 2022-03-29 北京远度互联科技有限公司 Pod calibration method and device for unmanned aerial vehicle, electronic equipment and storage medium
CN114693807B (en) * 2022-04-18 2024-02-06 国网江苏省电力有限公司泰州供电分公司 A mapping data reconstruction method and system for transmission line images and point clouds
CN114578188B (en) * 2022-05-09 2022-07-08 环球数科集团有限公司 Power grid fault positioning method based on Beidou satellite
CN115618749B (en) * 2022-12-05 2023-04-07 四川腾盾科技有限公司 Error compensation method for real-time positioning of large unmanned aerial vehicle

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101532841A (en) * 2008-12-30 2009-09-16 华中科技大学 Aircraft Navigation and Positioning Method Based on Landmark Acquisition and Tracking
CN103411609A (en) * 2013-07-18 2013-11-27 北京航天自动控制研究所 Online composition based aircraft return route programming method

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100250123A1 (en) * 2009-03-30 2010-09-30 Caterpillar Inc. Method and system for dispensing material from machines
US8195393B2 (en) * 2009-06-30 2012-06-05 Apple Inc. Analyzing and consolidating track file data


Similar Documents

Publication Publication Date Title
CN105222788B (en) Automatic correction method for aircraft route offset error based on feature matching
CN113850126B (en) A method and system for target detection and three-dimensional positioning based on unmanned aerial vehicle
CN109341700B (en) Visual auxiliary landing navigation method for fixed-wing aircraft under low visibility
CN109708649B (en) A method and system for determining the attitude of a remote sensing satellite
CN108592950B (en) Calibration method for relative installation angle of monocular camera and inertial measurement unit
CN103679711B (en) A kind of remote sensing satellite linear array push sweeps optics camera outer orientation parameter calibration method in-orbit
CN103674063B (en) A kind of optical remote sensing camera geometric calibration method in-orbit
EP3132231B1 (en) A method and system for estimating information related to a vehicle pitch and/or roll angle
CN107560603B (en) Unmanned aerial vehicle oblique photography measurement system and measurement method
KR102239562B1 (en) Fusion system between airborne and terrestrial observation data
CN111044037B (en) Method and device for geometric positioning of optical satellite images
CN104748750A (en) Model constraint-based on-orbit 3D space target attitude estimation method and system
CN107917699B (en) A method for improving the quality of aerial triangulation in oblique photogrammetry of mountainous landforms
CN108364279B (en) A Method for Determining the Pointing Bias of Geostationary Remote Sensing Satellites
CN115950435B (en) Real-time positioning method for unmanned aerial vehicle inspection image
CN115423863B (en) Camera pose estimation method and device and computer readable storage medium
CN109341685B (en) Fixed wing aircraft vision auxiliary landing navigation method based on homography transformation
CN109883400B (en) Automatic target detection and space positioning method for fixed station based on YOLO-SITCOL
CN116817910A (en) Refused state unmanned aerial vehicle visual navigation method and device
CN119850675A (en) Depth target tracking-based line-of-sight angular rate extraction method
Curro et al. Automated aerial refueling position estimation using a scanning lidar
Skaloud et al. Mapping with MAV: experimental study on the contribution of absolute and relative aerial position control
KR101775124B1 (en) System and method for automatic satellite image processing for improvement of location accuracy
CN115908569B (en) High-orbit large-area array camera on-orbit geometric positioning method and system based on earth profile
CN117191018A (en) Inertial-assisted large-viewing-angle fast scene matching absolute navigation method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant