
CN119845267A - Position estimation method of satellite-free navigation unmanned aerial vehicle capable of predicting flight path

Info

Publication number: CN119845267A
Application number: CN202411890251.1A
Authority: CN (China)
Prior art keywords: unmanned aerial vehicle, satellite, drone, image
Priority / filing date: 2024-12-20
Publication date: 2025-04-18
Legal status: Pending
Other languages: Chinese (zh)
Inventors: 赵帅和, 姜鹏志, 李鑫妮, 范明炀, 马春荣
Current and original assignee: Aerospace Shenzhou Aircraft Co ltd
Application filed by Aerospace Shenzhou Aircraft Co ltd

Landscapes

  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)

Abstract

The present invention provides a position estimation method for a satellite-free navigation unmanned aerial vehicle with a predictable flight path, comprising: S1: acquiring a detection image of the drone's front-down field of view; S2: performing multi-source image fusion on the detection image to obtain a fused image; S3: applying a perspective projection transformation to the fused image and feature-matching it against DEM data to obtain feature point information; S4: determining the drone's current coordinates based on its height above the ground, its altitude, its flight speed, and the feature point information, the current coordinates being used to predict the drone's flight path. Compared with the prior art, this method can estimate the drone's position coordinates when satellite navigation is unavailable, thereby obtaining its current coordinates, facilitating flight-path prediction and autonomous navigation, providing a fail-safe mechanism when the satellite navigation signal is lost, and ensuring that the drone's flight mission proceeds smoothly.

Description

Position estimation method of satellite-free navigation unmanned aerial vehicle capable of predicting flight path
Technical Field
The invention relates to the technical field of unmanned aerial vehicle navigation and positioning, and in particular to a satellite-free navigation unmanned aerial vehicle position estimation method capable of predicting a flight path.
Background
In recent years, unmanned aerial vehicles have been widely applied in the military, agriculture and forestry, mining engineering, and other fields, and flight control and navigation of unmanned aerial vehicles in complex environments have become research hot spots. Owing to advantages such as high positioning precision, low hardware power consumption, and small mass and volume, global navigation satellite systems are widely used as the main positioning system, or are combined with inertial navigation into an integrated navigation and positioning system, during the execution of unmanned aerial vehicle missions.
An unmanned aerial vehicle typically acquires its attitude and position by fusing satellite navigation with inertial navigation. When the satellite navigation signal is jammed or lost, the inertial navigation solution diverges within a very short time: position divergence leaves the unmanned aerial vehicle unable to obtain position information, and attitude divergence endangers flight safety. Interference with satellite navigation signals is quite common in wartime or in certain specific application scenarios.
Currently, if satellite navigation is relied on as the sole source of geographic positioning during actual flight, then when satellite navigation signals become unavailable for any of various reasons (such as interference or spoofing), the unmanned aerial vehicle cannot determine its current coordinates or navigate, may be unable to continue its original mission, and faces potential safety hazards.
Disclosure of Invention
The invention aims to overcome the defects and shortcomings of the prior art by providing a position estimation method for a satellite-free navigation unmanned aerial vehicle capable of predicting a flight path.
One embodiment of the invention provides a satellite-free navigation unmanned aerial vehicle position estimation method capable of predicting a flight path, comprising the following steps:
S1, acquiring a detection image of the front-down field of view of the unmanned aerial vehicle;
S2, performing multi-source image fusion processing on the detection image to obtain a fused image;
S3, performing perspective projection transformation on the fused image and feature matching with DEM data to obtain feature point information;
S4, determining the current coordinates of the unmanned aerial vehicle based on the relative height of the unmanned aerial vehicle above the ground, the altitude of the unmanned aerial vehicle, the flight speed of the unmanned aerial vehicle, and the feature point information, the current coordinates being used to predict the flight path of the unmanned aerial vehicle.
In some optional embodiments, the detection image is acquired by a camera of the unmanned aerial vehicle; the camera faces the front-down field of view of the unmanned aerial vehicle, and the angle of the camera's shooting direction relative to the horizontal plane can be adjusted between 0° and 90°.
In some optional embodiments, the magnitude of the angle of the camera's shooting direction relative to the horizontal plane is determined based on the relative height and the flight speed of the unmanned aerial vehicle.
In some alternative embodiments, the relative height is measured by a laser rangefinder.
In some alternative embodiments, the altitude is measured by a barometric altimeter.
In some alternative embodiments, the multi-source image fusion process is implemented based on the SwinFusion universal image fusion framework.
In some alternative embodiments, the detection image includes at least a visible light image and an infrared image.
In some alternative embodiments, the step S3 includes:
S31, performing perspective projection transformation on the fused image to obtain three-dimensional point cloud data of the fused image;
S32, performing feature matching between the three-dimensional point cloud data of the fused image and the DEM data;
S33, acquiring feature point information, wherein the feature point information includes at least ground feature points.
In some alternative embodiments, while steps S1-S4 are being performed, the unmanned aerial vehicle performs its flight mission based on the primary coordinates of the unmanned aerial vehicle.
Compared with the prior art, the position estimation method for a satellite-free navigation unmanned aerial vehicle capable of predicting a flight path can estimate the position coordinates of the unmanned aerial vehicle when satellite navigation is unavailable, thereby obtaining its current coordinates, facilitating flight-path prediction and autonomous navigation, providing a fail-safe mechanism when the satellite navigation signal is lost, and ensuring that the flight mission of the unmanned aerial vehicle proceeds smoothly.
In order that the invention may be more clearly understood, specific embodiments thereof will be described below with reference to the accompanying drawings.
Drawings
FIG. 1 is a flow chart of a method for estimating a position of a satellite-free navigation unmanned aerial vehicle with a predictable track according to an embodiment of the present invention;
fig. 2 is a flowchart of step S3 according to an embodiment of the present invention.
Detailed Description
The embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. The embodiments described are only some, not all, of the embodiments of the present invention.
Referring to fig. 1, an embodiment of the present invention provides a method for estimating the position of a satellite-free navigation unmanned aerial vehicle with a predictable flight path, including:
S1, acquiring a detection image of the front-down field of view of the unmanned aerial vehicle;
S2, performing multi-source image fusion processing on the detection image to obtain a fused image;
S3, performing perspective projection transformation on the fused image and feature matching with DEM data to obtain feature point information;
S4, determining the current coordinates of the unmanned aerial vehicle based on the relative height of the unmanned aerial vehicle above the ground, the altitude of the unmanned aerial vehicle, the flight speed of the unmanned aerial vehicle, and the feature point information, the current coordinates being used to predict the flight path of the unmanned aerial vehicle. Because steps S1-S4 take a certain amount of time, an offset must be determined from the flight speed, and the coordinates determined from the relative height above the ground, the altitude, and the feature point information are corrected by this offset to obtain the current coordinates of the unmanned aerial vehicle.
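As a minimal illustration of this offset correction, the sketch below applies a constant-velocity dead-reckoning shift to the coordinates resolved from the image captured at the start of S1. The function name, coordinate frame, and heading convention are assumptions for the example, not terminology from the patent:

```python
import numpy as np

def correct_for_processing_delay(matched_xy, ground_speed_mps, heading_rad, elapsed_s):
    """Shift the DEM-matched position by the distance flown while S1-S4 ran.

    matched_xy      : (x, y) coordinates resolved from the image captured at t0
    ground_speed_mps: flight speed over ground, m/s
    heading_rad     : course over ground, radians (0 = north, assumed convention)
    elapsed_s       : wall-clock time spent on steps S1-S4
    """
    offset = ground_speed_mps * elapsed_s
    dx = offset * np.sin(heading_rad)   # east component under the assumed convention
    dy = offset * np.cos(heading_rad)   # north component
    return (matched_xy[0] + dx, matched_xy[1] + dy)

# Example: a 25 m/s drone heading due north, with 1.8 s of processing latency,
# has moved 45 m north since the detection image was taken.
print(correct_for_processing_delay((500.0, 1200.0), 25.0, 0.0, 1.8))
```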
While the unmanned aerial vehicle is flying, satellite signals can be monitored by a detection device, and steps S1-S4 are performed when signal loss or interference is detected. In this way the current position of the unmanned aerial vehicle can be obtained without satellite navigation, facilitating flight-path prediction and autonomous navigation, providing a fail-safe mechanism when the satellite navigation signal is lost, and ensuring that the flight mission of the unmanned aerial vehicle proceeds smoothly.
It should be noted that the front-down field of view refers to the field of view in front of and below the unmanned aerial vehicle. This field of view covers the terrain the unmanned aerial vehicle is flying toward, so the detection image of the front-down field of view can be used to resolve the current coordinates of the unmanned aerial vehicle.
The principle of predicting the flight path of an unmanned aerial vehicle from its coordinates is a technique known to those skilled in the art and will not be described in detail here.
In some optional embodiments, the detection image is acquired by a camera of the unmanned aerial vehicle; the camera faces the front-down field of view of the unmanned aerial vehicle, and the angle of the camera's shooting direction relative to the horizontal plane can be adjusted between 0° and 90°, so that this angle meets the requirements for positioning the coordinates of the unmanned aerial vehicle.
In some optional embodiments, the angle of the camera's shooting direction relative to the horizontal plane is determined based on the relative height and the flight speed of the unmanned aerial vehicle. The faster the flight speed, the smaller the angle, so that more image information ahead of the unmanned aerial vehicle is acquired in advance; the slower the flight speed, the larger the angle, so that more accurate image information near the unmanned aerial vehicle is acquired. Likewise, the higher the relative height, the larger the angle, so that image information over a wider area is acquired; the lower the relative height, the smaller the angle, so that the camera does not point too steeply downward to capture enough image information. A simple heuristic encoding these four rules is sketched below.
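For illustration only, the following sketch expresses the four monotonic rules as a clamped linear blend; the base angle, gains, and limits are assumed values, not parameters from the patent:

```python
def camera_pitch_deg(speed_mps, height_agl_m,
                     base_deg=45.0, speed_gain=0.5, height_gain=0.1,
                     min_deg=10.0, max_deg=80.0):
    """Heuristic pitch of the shooting direction below the horizontal.

    Faster flight -> smaller angle (look farther ahead);
    higher flight -> larger angle (wider ground coverage).
    All constants are illustrative assumptions.
    """
    angle = base_deg - speed_gain * speed_mps + height_gain * height_agl_m
    return max(min_deg, min(max_deg, angle))

# Fast and low: looks far ahead.       Slow and high: looks steeply down.
print(camera_pitch_deg(40.0, 100.0))   # 35.0
print(camera_pitch_deg(5.0, 400.0))    # 80.0 (clamped)
```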
In some alternative embodiments, the relative height is measured by a laser rangefinder. Of course, the relative height may also be measured by other suitable means; this example is not limiting.
In some alternative embodiments, the altitude is measured by a barometric altimeter. Of course, the altitude may also be measured by other suitable means; this example is not limiting.
Multi-source image fusion can improve the precision of feature extraction and thereby the accuracy of flight-path prediction and positioning. In some alternative embodiments, the detection image includes at least a visible light image and an infrared image; combining multiple image sources improves the feature quality of the fused image at night or in low-light environments.
In some alternative embodiments, the multi-source image fusion process is implemented based on the SwinFusion universal image fusion framework. The multi-source image fusion of the detection image can be roughly divided into three steps: feature extraction, feature fusion, and image reconstruction. The principles of feature extraction, feature fusion, and image reconstruction are well known to those skilled in the art and will not be described in detail here. In the feature extraction step, when the detection image includes a visible light image and an infrared image, the local features of the visible light image and of the infrared image are extracted by a CNN, and their global features are extracted by a Swin Transformer.
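To make the three-stage structure concrete, here is a deliberately simplified PyTorch sketch of a CNN-local / Transformer-global fusion network. It is not the published SwinFusion architecture (which uses shifted-window attention and attention-guided cross-domain fusion); all layer sizes and the plain transformer encoder are illustrative assumptions:

```python
import torch
import torch.nn as nn

class ToyFusionNet(nn.Module):
    """Structural sketch: local CNN features + global attention features,
    fused and reconstructed into a single image. Not the real SwinFusion."""
    def __init__(self, channels=32):
        super().__init__()
        # Local feature extractor (shared here; one per modality in practice).
        self.local = nn.Sequential(
            nn.Conv2d(1, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, channels, 3, padding=1), nn.ReLU(),
        )
        # A plain transformer encoder stands in for the Swin global branch.
        self.global_enc = nn.TransformerEncoderLayer(
            d_model=channels, nhead=4, batch_first=True)
        # Feature fusion + reconstruction back to a single-channel fused image.
        self.reconstruct = nn.Sequential(
            nn.Conv2d(2 * channels, channels, 3, padding=1), nn.ReLU(),
            nn.Conv2d(channels, 1, 3, padding=1), nn.Sigmoid(),
        )

    def extract(self, img):
        feat = self.local(img)                      # B,C,H,W local features
        b, c, h, w = feat.shape
        seq = feat.flatten(2).transpose(1, 2)       # B,HW,C token sequence
        glob = self.global_enc(seq).transpose(1, 2).reshape(b, c, h, w)
        return feat + glob                          # local + global features

    def forward(self, visible, infrared):
        fused = torch.cat([self.extract(visible), self.extract(infrared)], dim=1)
        return self.reconstruct(fused)

net = ToyFusionNet()
vis, ir = torch.rand(1, 1, 64, 64), torch.rand(1, 1, 64, 64)
print(net(vis, ir).shape)  # torch.Size([1, 1, 64, 64])
```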
Referring to fig. 2, in some alternative embodiments, the step S3 includes:
S31, performing perspective projection transformation on the fused image to obtain three-dimensional point cloud data of the fused image (a simplified back-projection sketch follows this list);
S32, performing feature matching between the three-dimensional point cloud data of the fused image and the DEM data, where the matching uses a nearest-neighbor search combined with an outlier-rejection algorithm such as RANSAC to improve the accuracy and robustness of the feature matching;
S33, acquiring feature point information, wherein the feature point information includes at least ground feature points, which make it convenient to determine the horizontal-plane coordinates of the unmanned aerial vehicle.
The DEM data is a pre-stored three-dimensional model containing ground elevation information. The current map position of the unmanned aerial vehicle is determined through feature matching, and the current coordinates of the unmanned aerial vehicle can then be accurately calculated by combining the altitude and the relative height. A minimal sketch of this matching step follows.
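As a hedged illustration of step S32, the sketch below aligns the image-derived point cloud with the DEM using a KD-tree nearest-neighbor search inside a translation-only RANSAC loop. A real system would estimate a full pose; the point formats, iteration count, and inlier tolerance are assumptions:

```python
import numpy as np
from scipy.spatial import cKDTree

def match_to_dem(cloud_xyz, dem_xyz, iters=2000, inlier_tol=5.0, rng_seed=0):
    """Estimate the horizontal offset aligning an image point cloud to the DEM.

    cloud_xyz: Nx3 points back-projected from the fused image (vehicle frame)
    dem_xyz  : Mx3 pre-stored DEM points (map frame)
    Returns the (dx, dy) translation with the most nearest-neighbor inliers.
    """
    tree = cKDTree(dem_xyz)
    rng = np.random.default_rng(rng_seed)
    best_offset, best_inliers = np.zeros(2), -1
    for _ in range(iters):
        # Hypothesize: a random cloud point corresponds to a random DEM point.
        i = rng.integers(len(cloud_xyz))
        j = rng.integers(len(dem_xyz))
        offset = dem_xyz[j, :2] - cloud_xyz[i, :2]
        # Score: how many shifted cloud points land near some DEM point.
        shifted = cloud_xyz + np.append(offset, 0.0)
        dists, _ = tree.query(shifted)
        inliers = int((dists < inlier_tol).sum())
        if inliers > best_inliers:
            best_offset, best_inliers = offset, inliers
    return best_offset, best_inliers

# Toy example: the cloud is a patch of the DEM shifted by (30, -20).
dem = np.random.default_rng(1).uniform(0, 1000, size=(500, 3))
cloud = dem[:100] - np.array([30.0, -20.0, 0.0])
print(match_to_dem(cloud, dem))  # offset should come out close to [30., -20.]
```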
In some alternative embodiments, while steps S1-S4 are being performed, the unmanned aerial vehicle performs its flight mission based on its primary coordinates, i.e., the latest coordinates obtained from satellite signals before signal loss or interference was detected. Because steps S1-S4 take a certain amount of time, and feature matching in particular is time-consuming, the unmanned aerial vehicle cannot update its coordinates immediately; it therefore continues flying according to the original flight mission so that mission execution efficiency is not affected. Once the current coordinates have been obtained through steps S1-S4, the unmanned aerial vehicle continues the flight mission based on those coordinates. Until the detection device detects that the satellite signal has recovered, steps S1-S4 are performed in a loop, so that the current coordinates of the unmanned aerial vehicle are continuously updated, the requirements of flight-path prediction and autonomous navigation are met, and the unmanned aerial vehicle can execute its flight mission accurately.
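Pulling the pieces together, a hedged sketch of this fallback control flow might look as follows; the four injected callables are placeholders assumed for the example, not an interface defined by the patent. Before the first cycle completes, the vehicle is assumed to fly on its primary coordinates:

```python
import time
from typing import Callable, Tuple

Coords = Tuple[float, float]

def navigation_fallback_loop(
    satellite_ok: Callable[[], bool],
    capture_and_solve: Callable[[], Coords],          # runs S1-S4, returns raw fix
    apply_motion_offset: Callable[[Coords, float], Coords],
    update_position: Callable[[Coords], None],
) -> None:
    """Loop S1-S4 until the satellite signal recovers (illustrative sketch)."""
    while not satellite_ok():
        t0 = time.monotonic()
        raw = capture_and_solve()                     # steps S1-S4
        # Correct for the time S1-S4 took (see the offset sketch above).
        fix = apply_motion_offset(raw, time.monotonic() - t0)
        update_position(fix)  # feeds flight-path prediction / navigation

# Toy demo: the signal "recovers" after 3 cycles; each fix drifts 25 m east.
state = {"cycles": 0, "pos": (500.0, 1200.0)}
navigation_fallback_loop(
    satellite_ok=lambda: state["cycles"] >= 3,
    capture_and_solve=lambda: state["pos"],
    apply_motion_offset=lambda p, dt: (p[0] + 25.0, p[1]),
    update_position=lambda p: (state.update(pos=p, cycles=state["cycles"] + 1),
                               print("updated fix:", p)),
)
```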
Although embodiments of the present invention have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the principles and spirit of the invention, the scope of which is defined in the appended claims and their equivalents.

Claims (9)

1. A satellite-free navigation unmanned aerial vehicle position estimation method capable of predicting a flight path, characterized by comprising the following steps:
S1, acquiring a detection image of the front-down field of view of the unmanned aerial vehicle;
S2, performing multi-source image fusion processing on the detection image to obtain a fused image;
S3, performing perspective projection transformation on the fused image and feature matching with DEM data to obtain feature point information;
S4, determining the current coordinates of the unmanned aerial vehicle based on the relative height of the unmanned aerial vehicle above the ground, the altitude of the unmanned aerial vehicle, the flight speed of the unmanned aerial vehicle, and the feature point information, the current coordinates being used to predict the flight path of the unmanned aerial vehicle.
2. The method for estimating the position of a satellite-free navigation unmanned aerial vehicle capable of predicting a flight path according to claim 1, wherein the detection image is acquired by a camera of the unmanned aerial vehicle, the camera faces the front-down field of view of the unmanned aerial vehicle, and the angle of the camera's shooting direction relative to the horizontal plane can be adjusted between 0 degrees and 90 degrees.
3. The method for estimating the position of a satellite-free navigation unmanned aerial vehicle with a predictable flight path according to claim 2, wherein the magnitude of the angle of the camera's shooting direction relative to the horizontal plane is determined based on said relative height and the flight speed of the unmanned aerial vehicle.
4. The method for estimating the position of a satellite-free navigation unmanned aerial vehicle with a predictable flight path according to claim 1, wherein said relative height is measured by a laser rangefinder.
5. The method for estimating the position of a satellite-free navigation unmanned aerial vehicle with a predictable flight path according to claim 1, wherein said altitude is measured by a barometric altimeter.
6. The method for estimating the position of a satellite-free navigation unmanned aerial vehicle with a predictable flight path according to any one of claims 1 to 5, wherein the multi-source image fusion process is implemented based on the SwinFusion universal image fusion framework.
7. The method for estimating the position of a satellite-free navigation unmanned aerial vehicle with a predictable flight path according to any one of claims 1 to 5, wherein said detection image comprises at least a visible light image and an infrared image.
8. The method for estimating the position of a satellite-free navigation unmanned aerial vehicle with a predictable flight path according to any one of claims 1 to 5, wherein step S3 comprises:
S31, performing perspective projection transformation on the fused image to obtain three-dimensional point cloud data of the fused image;
S32, performing feature matching between the three-dimensional point cloud data of the fused image and the DEM data;
S33, acquiring feature point information, wherein the feature point information includes at least ground feature points.
9. The method for estimating the position of a satellite-free navigation unmanned aerial vehicle with a predictable flight path according to any one of claims 1 to 5, wherein, while steps S1-S4 are being performed, the unmanned aerial vehicle performs its flight mission based on the primary coordinates of the unmanned aerial vehicle.
CN202411890251.1A — filed 2024-12-20 — Position estimation method of satellite-free navigation unmanned aerial vehicle capable of predicting flight path — Pending — published as CN119845267A (en)

Priority Applications (1)

Application number: CN202411890251.1A — Priority date: 2024-12-20 — Filing date: 2024-12-20 — Title: Position estimation method of satellite-free navigation unmanned aerial vehicle capable of predicting flight path


Publications (1)

Publication number: CN119845267A — Publication date: 2025-04-18

Family

ID: 95359349

Family Applications (1)

CN202411890251.1A — Position estimation method of satellite-free navigation unmanned aerial vehicle capable of predicting flight path — Priority/filing date: 2024-12-20 — Status: Pending

Country Status (1)

CN — CN119845267A (en)

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination