
CN117974809B - Airborne area array camera temporal and spatial calibration method, device, equipment and storage medium - Google Patents

Info

Publication number
CN117974809B
CN117974809B (application CN202410341544.8A)
Authority
CN
China
Prior art keywords
area array
camera
calibration
imu
control field
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410341544.8A
Other languages
Chinese (zh)
Other versions
CN117974809A (en)
Inventor
范永祥
刘龙晖
徐冲
曾宪锋
邝展华
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ji Hua Laboratory
Original Assignee
Ji Hua Laboratory
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ji Hua Laboratory filed Critical Ji Hua Laboratory
Priority to CN202410341544.8A priority Critical patent/CN117974809B/en
Publication of CN117974809A publication Critical patent/CN117974809A/en
Application granted granted Critical
Publication of CN117974809B publication Critical patent/CN117974809B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/80 Analysis of captured images to determine intrinsic or extrinsic camera parameters, i.e. camera calibration
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C25/00 Manufacturing, calibrating, cleaning, or repairing instruments or devices referred to in the other groups of this subclass
    • G01C25/005 Initial alignment, calibration or starting-up of inertial devices
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74 Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10032 Satellite or aerial image; Remote sensing
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30244 Camera pose

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Manufacturing & Machinery (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Image Analysis (AREA)

Abstract

The invention discloses a spatiotemporal calibration method, device, equipment and storage medium for an airborne area array camera. The method comprises: performing coverage imaging in a control field with a preset scale and a known direction of gravitational acceleration to obtain SIFT feature control points; acquiring calibration data comprising multiple frames of area array images and IMU data; obtaining the camera pose at the imaging instant of each area array image by means of the SIFT feature control field; pre-integrating the IMU data between frames to obtain state descriptions of the rotation, velocity and displacement between adjacent area array images; and calibrating the angle elements, time offset and line elements between the camera and the IMU based on the pose information and those state descriptions. Because the direction of gravitational acceleration and the scale of the control field are known, the number of parameters to be estimated during spatiotemporal calibration is reduced and the calibration accuracy is improved.

Description

Airborne area array camera spatiotemporal calibration method, device, equipment and storage medium

Technical Field

The present application relates to the field of airborne remote sensing, and in particular to a spatiotemporal calibration method, device, equipment and storage medium for an airborne area array camera.

Background Art

In airborne remote sensing, fusing area array camera, IMU and GNSS data is an important approach to 3D reconstruction. Reconstruction accuracy depends largely on the accuracy of the relative spatial and temporal parameters between the camera and the IMU: the spatial parameters bridge state conversion between the camera and IMU reference frames, while the time offset aligns the different sensor data streams. To process sensor measurements, each camera image and IMU sample carries a timestamp taken either from the sensor itself or from the operating system of the computer receiving the data. Because of unsynchronized clocks, transmission delays, sensor response and operating-system overhead, there is always a delay between the actual sampling instant and the attached timestamp, and because this delay differs per sensor, the measurement streams from the camera and the IMU are usually inconsistent. If the spatial and temporal parameters are ignored or calibrated incorrectly, 3D reconstruction performance is severely degraded, so the spatiotemporal calibration of camera and IMU in airborne remote sensing equipment merits study.

Some commercial software packages and researchers treat the exterior orientation elements between camera and IMU as parameters to be estimated during the 3D reconstruction adjustment, or ignore those elements and reconstruct directly; clearly, redundant estimated parameters or inaccurate exterior orientation elements can degrade reconstruction accuracy. Flight calibration instead collects flight data over a calibration field with known ground marks and performs spatiotemporal calibration through adjustment between the attitude and trajectory data (obtained by tightly coupling GNSS and IMU data) and the camera data; this approach not only incurs considerable labor and material cost, but errors introduced during attitude and trajectory data fusion may also affect calibration accuracy.

Most existing ground calibration schemes perform online spatiotemporal calibration of a VIO system, using either no control field or only a simple one (such as a ChArUco calibration board). Schemes without a control field must estimate the scale and the direction of gravitational acceleration. The clear imaging distance of an aerial camera is generally greater than 30 meters, at which a simple calibration board cannot be used; moreover, a simple board requires the magnitude of gravitational acceleration to be known and its direction to be estimated. These factors greatly limit calibration accuracy, leading to unsatisfactory spatiotemporal calibration results between camera and IMU.

Summary of the Invention

In view of this, the present application provides a spatiotemporal calibration method, device, equipment and storage medium for an airborne area array camera, to solve the problem of low spatiotemporal calibration accuracy between existing cameras and IMUs.

To solve the above technical problem, one technical solution adopted by the present application is to provide a spatiotemporal calibration method for an airborne area array camera, comprising: performing coverage imaging in a pre-constructed control field and performing 3D reconstruction based on first SIFT features extracted from the resulting images to obtain SIFT feature control points, the scale of the control field being preset and the direction of gravitational acceleration known in advance; performing motion imaging in the control field according to a preset scheme and collecting calibration data comprising multiple frames of area array images and IMU data; extracting second SIFT features from each frame of area array image, matching them against the SIFT feature control points and between different frames, and obtaining from the matching results the camera pose at each frame's imaging instant; pre-integrating the IMU data between frames to obtain state descriptions of rotation, velocity and displacement; and calibrating the angle elements and time offset between camera and IMU based on the pose information and the rotation state description, and the line elements based on the pose information and the velocity and displacement state descriptions.

To solve the above technical problem, another technical solution adopted by the present application is to provide a spatiotemporal calibration device for an airborne area array camera, comprising: a 3D reconstruction module, configured to perform coverage imaging in a pre-constructed control field and perform 3D reconstruction based on first SIFT features extracted from the resulting images to obtain SIFT feature control points, the scale of the control field being preset and the direction of gravitational acceleration known in advance; an acquisition module, configured to perform motion imaging in the control field according to a preset scheme and collect calibration data comprising multiple frames of area array images and IMU data; a pose estimation module, configured to extract second SIFT features from each frame of area array image, match them against the SIFT feature control points and between different frames, and obtain from the matching results the camera pose at each frame's imaging instant;

a pre-integration module, configured to pre-integrate the IMU data between frames to obtain state descriptions of the rotation, velocity and displacement between adjacent area array images;

and a calibration module, configured to calibrate the angle elements and time offset between camera and IMU based on the pose information and the rotation state description, and the line elements based on the pose information and the velocity and displacement state descriptions.

To solve the above technical problem, a further technical solution adopted by the present application is to provide a computer device comprising a processor and a memory coupled to the processor, the memory storing program instructions which, when executed by the processor, cause the processor to perform the steps of any one of the above airborne area array camera spatiotemporal calibration methods.

To solve the above technical problem, a further technical solution adopted by the present application is to provide a storage medium storing program instructions capable of implementing any one of the above airborne area array camera spatiotemporal calibration methods.

The beneficial effects of the present application are as follows: the method obtains SIFT feature control points by performing coverage imaging in a control field of preset scale and known gravitational acceleration direction, collects calibration data (multiple frames of area array images and IMU data) in that field, extracts the camera pose at each frame's imaging instant, pre-integrates the IMU data between frames into state descriptions of rotation, velocity and displacement, and finally calibrates the angle elements, time offset and line elements between camera and IMU from the pose information and those state descriptions. Because the pre-constructed large-scale SIFT feature control field has a known scale and a known direction of gravitational acceleration, clear aerial imaging quality is guaranteed and the gravity direction no longer needs to be estimated during calibration, which reduces the number of parameters to be estimated and thereby improves calibration accuracy.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a schematic flow chart of a spatiotemporal calibration method for an airborne area array camera according to an embodiment of the present invention;

FIG. 2 is a schematic diagram of the sparse point cloud from the 3D reconstruction of the SIFT feature control field in the method according to an embodiment of the present invention;

FIG. 3 is a schematic diagram of misaligned camera and IMU timestamps in the method according to an embodiment of the present invention;

FIG. 4 is a schematic diagram of the functional modules of a spatiotemporal calibration device for an airborne area array camera according to an embodiment of the present invention;

FIG. 5 is a schematic diagram of the structure of a computer device according to an embodiment of the present invention;

FIG. 6 is a schematic diagram of the structure of a storage medium according to an embodiment of the present invention.

DETAILED DESCRIPTION

The technical solutions in the embodiments of the present application will be described clearly and completely below in conjunction with the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present application. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present application without creative effort fall within the scope of protection of the present application.

The terms "first", "second" and "third" in the present application are used for descriptive purposes only and shall not be understood as indicating or implying relative importance or implicitly indicating the number of the technical features concerned; a feature qualified by "first", "second" or "third" may thus explicitly or implicitly include at least one such feature. In the description of the present application, "multiple" means at least two, for example two or three, unless otherwise clearly and specifically defined. All directional indications (such as up, down, left, right, front, back, ...) in the embodiments are used only to explain the relative positional relationship and movement of components under a certain specific posture (as shown in the drawings); if that posture changes, the directional indication changes accordingly. In addition, the terms "include" and "have", and any variations thereof, are intended to cover non-exclusive inclusion; for example, a process, method, system, product or device comprising a series of steps or units is not limited to the listed steps or units, but optionally further includes unlisted steps or units, or other steps or units inherent to that process, method, product or device.

Reference to an "embodiment" herein means that a particular feature, structure or characteristic described in conjunction with the embodiment may be included in at least one embodiment of the present application. The appearance of the phrase in various places in the specification does not necessarily refer to the same embodiment, nor to an independent or alternative embodiment mutually exclusive with other embodiments. Those skilled in the art understand, explicitly and implicitly, that the embodiments described herein may be combined with other embodiments.

FIG. 1 is a schematic flow chart of a spatiotemporal calibration method for an airborne area array camera according to an embodiment of the present invention. It should be noted that, as long as substantially the same result is achieved, the method of the present invention is not limited to the sequence shown in FIG. 1. As shown in FIG. 1, the method includes the following steps:

Step S101: perform coverage imaging in a pre-constructed control field, and perform 3D reconstruction based on the first SIFT features extracted from the resulting images to obtain SIFT feature control points; the scale of the control field is preset and the direction of gravitational acceleration is known in advance.

It should be noted that the control field is the supporting system for camera positioning and orientation. To that end it should have the following characteristics: to guarantee imaging clarity, the imaging distance (the average distance from the camera's imaging position to the control field) should generally exceed 30 meters; the ratio of control field scene depth to imaging distance should exceed 1/30; the ratio of control field extent to the imaging extent of a mainstream camera should exceed 2; the control field must carry control point marks whose positions can be both measured precisely by a total station and located accurately in the images; and at least 8 control marks, distributed fairly evenly across the image, should be visible whenever a mainstream camera images the field. After the control field is laid out, a high-precision total station is set up at a suitable location and, after leveling, the coordinates of all control points in the field are measured precisely; the resulting coordinates obviously have scale 1 and a known gravity direction.
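As a sanity check, the layout rules above can be collected into one small validation routine. The function below is an illustrative sketch; the function and argument names, and the way the rules are bundled, are ours rather than the application's:

```python
def control_field_layout_ok(imaging_distance_m, scene_depth_m,
                            field_extent_m, camera_footprint_m,
                            visible_control_marks):
    """Check the control-field layout rules from the text:
    imaging distance > 30 m, depth/distance ratio > 1/30,
    field extent more than 2x the camera imaging extent, and
    at least 8 control marks visible per image."""
    return (imaging_distance_m > 30.0
            and scene_depth_m / imaging_distance_m > 1.0 / 30.0
            and field_extent_m / camera_footprint_m > 2.0
            and visible_control_marks >= 8)
```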

This embodiment builds the SIFT feature control field via 3D reconstruction, i.e. Structure from Motion (SfM). Specifically, step S101 includes the following steps:

1. Image the control field with full coverage using a high-resolution fixed-focus camera to obtain control field images.

Specifically, during coverage imaging, three imaging stations more than 5 meters apart are first selected from which the control field can be imaged clearly; then, at one station, the handheld camera images the control field following a simulated-flight-line pattern such that the image overlap within and between strips exceeds 50% and the images together cover the whole calibration field; this process is repeated until imaging is completed at all three stations.

2. Associate the pre-measured coordinates of the control points in the control field with the image point coordinates at the corresponding imaging locations of those control points in the control field images.

3. Based on the association between control point coordinates and image point coordinates, obtain the camera's initial exterior orientation elements using the initial camera intrinsics.

Specifically, using the association between control point coordinates and image point coordinates, a direct linear transformation (for example the Kneip P3P method) is performed based on the initial camera intrinsics to obtain the camera's initial exterior orientation elements (i.e. the camera's position and attitude in the calibration field).
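Kneip's P3P solves a closed-form system from three correspondences; as a simpler stand-in that still illustrates recovering pose from 2D-3D associations, the sketch below estimates the projection matrix by a plain direct linear transform and decomposes it, assuming normalized image coordinates (intrinsics already removed). This is an illustration, not the application's implementation:

```python
import numpy as np

def dlt_pose(points_3d, points_2d_norm):
    """Estimate a world-to-camera pose (R, t) from >= 6 2D-3D
    correspondences via the direct linear transform. points_2d_norm
    are normalized image coordinates, i.e. x = K^-1 u."""
    A = []
    for (X, Y, Z), (x, y) in zip(points_3d, points_2d_norm):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -x*X, -x*Y, -x*Z, -x])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -y*X, -y*Y, -y*Z, -y])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    P = Vt[-1].reshape(3, 4)          # null-space solution, up to scale
    if np.linalg.det(P[:, :3]) < 0:   # fix the overall sign ambiguity
        P = -P
    M = P[:, :3]
    scale = np.linalg.det(M) ** (1.0 / 3.0)
    U, _, Vt2 = np.linalg.svd(M / scale)
    R = U @ Vt2                       # project onto SO(3)
    t = P[:, 3] / scale
    return R, t
```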

4. Extract the first SIFT features from each control field image.

Specifically, the first SIFT features and their corresponding descriptors are extracted from the control field images.

5. Match the first SIFT features between different control field images, and retain the image pairs whose number of associated feature points after matching meets a preset threshold.

Specifically, the descriptors are used to match the first SIFT features of different control field images, the fundamental matrix constructed from the initial exterior orientation elements is used to filter out wrong matches, and the control field image pairs whose number of associated feature points after matching meets a preset threshold, preferably 8, are retained.
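A minimal sketch of this matching-and-filtering stage, assuming normalized homogeneous image coordinates so the essential matrix E = [t]x R built from the initial exterior orientation elements plays the role of the fundamental matrix; the function names and the residual threshold are ours:

```python
import numpy as np

def mutual_nn_matches(desc_a, desc_b):
    """Mutual nearest-neighbour matching of descriptor rows."""
    d = np.linalg.norm(desc_a[:, None, :] - desc_b[None, :, :], axis=2)
    ab = d.argmin(axis=1)
    ba = d.argmin(axis=0)
    return [(i, j) for i, j in enumerate(ab) if ba[j] == i]

def epipolar_filter(matches, pts_a, pts_b, R, t, thresh=1e-3):
    """Keep matches consistent with E = [t]x R from the initial
    exterior orientation elements; pts_* are normalized homogeneous
    image coordinates (x, y, 1)."""
    tx = np.array([[0.0, -t[2], t[1]],
                   [t[2], 0.0, -t[0]],
                   [-t[1], t[0], 0.0]])
    E = tx @ R
    return [(i, j) for i, j in matches
            if abs(pts_b[j] @ E @ pts_a[i]) < thresh]
```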

6. Perform 3D reconstruction based on the initial exterior orientation elements, the initial camera intrinsics, and the successfully matched associated feature points of the control field image pairs to obtain the SIFT feature control points.

Specifically, triangulation is first performed: initial 3D coordinates of the successfully matched associated feature points are estimated from the initial exterior orientation elements and the initial camera intrinsics. Bundle adjustment is then run twice, first without and then with camera intrinsics: adjustment without intrinsics optimizes the initial exterior orientation elements and initial 3D coordinates using control point constraints and SIFT feature association constraints, while adjustment with intrinsics additionally optimizes the camera intrinsics under the same constraints. Next, aerial triangulation quality is assessed by checking the adjustment result for abnormal associated feature points (e.g. points with too small a triangulation angle or a large projection error) and points with large marking error; if none exist the data are considered qualified, otherwise the data are corrected and both bundle adjustments are re-run. Finally, if the quality assessment shows no abnormality, the SIFT 3D reconstructed point coordinates and their descriptors are exported as the SIFT feature control points; a typical reconstruction result is shown in FIG. 2.
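The triangulation that seeds the bundle adjustment can be sketched as the standard linear (DLT) construction from two views; this is a generic illustration rather than the application's implementation:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one point from two views.
    P1, P2 are 3x4 projection matrices; x1, x2 are the (x, y)
    image coordinates of the point in each view."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                 # homogeneous 3D point
    return X[:3] / X[3]
```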

It should be noted that, theoretically, the pixel coordinates $(u, v)$ of a point $P^w$ in the calibration field (a control point or a SIFT feature reconstruction point) on the image plane can be expressed as:

$$P^c = \begin{bmatrix} x_c & y_c & z_c \end{bmatrix}^{T} = R^{T}\left(P^w - t\right) \tag{1}$$

$$u = f\,\frac{x_c}{z_c} + c_x, \qquad v = f\,\frac{y_c}{z_c} + c_y \tag{2}$$

where $P^c$ is the coordinate of the control field point $P^w$ in the image space coordinate system, $R$ and $t$ are the rotation and translation from the camera coordinate system to the calibration field coordinate system, $f$ is the camera focal length, and $(c_x, c_y)$ is the camera principal point. From formulas (1) to (2) the residual between a point in the calibration field and its imaging pixel can be constructed, which realizes the bundle adjustment in the above steps and yields the precise coordinate value of each SIFT feature control point in the control field coordinate system.
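A direct transcription of the residual implied by formulas (1) to (2), treating R and t as the camera-to-field rotation and translation as defined in the text (variable names are ours):

```python
import numpy as np

def reprojection_residual(Pw, R, t, f, c, uv_obs):
    """Residual between a calibration-field point Pw and its observed
    pixel uv_obs: Pc = R^T (Pw - t), then u = f*xc/zc + cx,
    v = f*yc/zc + cy. R, t map camera coordinates to field
    coordinates, so the inverse transform is applied here."""
    xc, yc, zc = R.T @ (np.asarray(Pw, dtype=float) - t)
    u = f * xc / zc + c[0]
    v = f * yc / zc + c[1]
    return np.array([uv_obs[0] - u, uv_obs[1] - v])
```

Summing the squares of these residuals over all control points and associated feature points gives the cost minimized by the bundle adjustments described above.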

Step S102: perform motion imaging in the control field according to a preset scheme, and collect calibration data comprising multiple frames of area array images and IMU data.

Specifically, after the large-scale control field has been constructed, motion imaging is performed in it according to a preset scheme and calibration data are collected, the calibration data comprising multiple frames of area array images and IMU data.

It should be noted that existing ground calibration schemes mainly target the exterior orientation elements and IMU initialization of visual-inertial odometry (VIO) and are adapted to high-frequency triggered cameras (above 10 Hz); they therefore assume that the IMU angular velocity bias and acceleration bias remain constant across the calibration period. This does not match the performance of common aerial cameras (trigger frequency generally below 1 Hz), so those assumptions cannot be used. This embodiment therefore acquires the synchronized camera-IMU calibration data in a "static-dynamic-static" pattern, which not only guarantees zero velocity at the initial and final instants but also allows the IMU angular velocity bias at those instants to be read directly from the IMU data; these constraints increase the reliability of the estimation results. Specifically, step S102 includes the following steps:

1. With the payload heading toward the sky and the imaging direction toward the control field, place the payload in an initial static state, and collect at least a first preset number of area array images while simultaneously collecting IMU data.

Specifically, depending on the equipment weight, the device may be handheld or placed on a motion platform for data collection; during collection, the payload heading points toward the zenith and the imaging direction toward the control field. The payload is then placed in an initial static state and at least a first preset number of area array images, preferably at least 2 groups, are collected together with IMU data.

2、控制载荷设备分别绕自身X、Y、Z轴进行顺时针和逆时针旋转,且在绕每个轴旋转时采集第二预设数量的面阵影像且同时采集IMU数据。2. Control the payload device to rotate clockwise and counterclockwise around its own X, Y, and Z axes respectively, and collect a second preset number of array images and IMU data while rotating around each axis.

具体地,控制载荷设备分别绕自身X、Y、Z轴进行顺时针和逆时针旋转,旋转角度大于20度,且每个轴采集至少第二预设数量的面阵影像和IMU数据,该第二预设数量优选设置为10组。Specifically, the payload device is controlled to rotate clockwise and counterclockwise around its own X, Y, and Z axes, with a rotation angle greater than 20 degrees, and each axis collects at least a second preset number of array images and IMU data, and the second preset number is preferably set to 10 groups.

3、将载荷设备置于结束静置状态,采集至少第一预设数量的面阵影像且同时采集IMU数据。3. Place the payload device in the final static state, and collect at least a first preset number of area array images while simultaneously collecting IMU data.

本实施例通过“静—动—静”数据采集模式构建前置约束条件(初始时刻及结束时刻IMU角速度偏差已知,初始时刻及结束时刻速度为0)完成时空标定,且视重力加速度方向已知、大小未知,这些更加准确的假设将提高检校精度,使得最终的标定结果更为准确。This embodiment completes the space-time calibration by constructing pre-constraint conditions (the IMU angular velocity deviation at the initial and end times is known, and the velocity at the initial and end times is 0) through the "static-dynamic-static" data acquisition mode, and the direction of the gravity acceleration is known but the magnitude is unknown. These more accurate assumptions will improve the calibration accuracy and make the final calibration result more accurate.
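上述"静—动—静"采集流程可整理为如下示意代码(Python;静置段2组、每轴10组、旋转角大于20度等数量取自上文,函数与阶段名称均为说明性假设):The "static-dynamic-static" acquisition procedure above can be sketched as follows (Python; the counts — 2 static groups, 10 groups per axis, rotations over 20 degrees — follow the text, and all function and stage names are illustrative assumptions):

```python
# Illustrative sketch only: enumerates the acquisition stages described in
# the text (static start, cw/ccw rotations about X/Y/Z, static end).
def build_acquisition_plan(n_static=2, n_per_axis=10, min_rot_deg=20):
    plan = [("static_start", n_static)]
    for axis in ("X", "Y", "Z"):
        for direction in ("cw", "ccw"):
            plan.append(("rotate_%s_%s_gt%ddeg" % (axis, direction, min_rot_deg),
                         n_per_axis))
    plan.append(("static_end", n_static))
    return plan

plan = build_acquisition_plan()
total_frames = sum(n for _, n in plan)  # area array frames, each paired with IMU data
```

每个阶段同时记录面阵影像与IMU数据,静置段为后续偏差初始化提供约束。Each stage records area array images together with IMU data; the static segments supply the constraints for the later bias initialization.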

步骤S103:分别从每帧面阵影像中提取第二SIFT特征并与SIFT特征控制点进行匹配、且将不同帧面阵影像间的第二SIFT特征进行匹配,并根据匹配结果得到各帧面阵影像成像时相机的位姿信息。Step S103: extracting the second SIFT features from each frame of the array image and matching them with the SIFT feature control points, matching the second SIFT features between different frames of array images, and obtaining the camera posture information when each frame of the array image is imaged according to the matching results.

具体地,在得到面阵影像后,对所有的面阵影像进行第二SIFT特征提取,再将该第二SIFT特征与SIFT特征控制点进行匹配,以及将不同帧的面阵影像的第二SIFT特征进行匹配,最后结合匹配结果得到各帧面阵影像成像时相机的位姿信息。Specifically, after obtaining the area array images, the second SIFT features are extracted for all the area array images, and then the second SIFT features are matched with the SIFT feature control points, and the second SIFT features of the area array images of different frames are matched. Finally, the matching results are combined to obtain the camera posture information when each frame of the area array image is imaged.

进一步,步骤S103具体包括以下步骤:Further, step S103 specifically includes the following steps:

1、分别从每帧面阵影像中提取第二SIFT特征。1. Extract the second SIFT feature from each frame of the array image.

2、将第二SIFT特征与SIFT特征控制点进行匹配,确认影像外方位元素。2. Match the second SIFT feature with the SIFT feature control point to confirm the external orientation elements of the image.

具体地,将第二SIFT特征与SIFT特征控制点进行匹配,通过kneip-P3P+RANSAC(Random Sample Consensus,随机一致性采样)确认影像外方位元素。Specifically, the second SIFT feature is matched with the SIFT feature control point, and the image exterior orientation elements are confirmed through kneip-P3P+RANSAC (Random Sample Consensus).
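作为补充,下面给出RANSAC中按重投影误差统计内点这一核心判据的示意实现(Python/NumPy;kneip-P3P解算器假设由外部库提供,此处直接给定候选位姿,阈值与名称均为说明性假设):As a supplement, a sketch of the reprojection-error inlier test at the core of the RANSAC step (Python/NumPy; the kneip-P3P solver is assumed to come from an external library, a candidate pose is given directly, and the threshold and names are illustrative assumptions):

```python
import numpy as np

def reprojection_inliers(R, t, K, pts3d, pts2d, thresh_px=2.0):
    """Mark control points whose reprojection error is below thresh_px."""
    cam = R @ pts3d.T + t[:, None]   # control-field frame -> camera frame
    uv = K @ cam                     # apply intrinsics
    uv = uv[:2] / uv[2]              # perspective division
    err = np.linalg.norm(uv.T - pts2d, axis=1)
    return err < thresh_px

# Perfect pose: all control points should be classified as inliers.
K = np.array([[800.0, 0.0, 320.0], [0.0, 800.0, 240.0], [0.0, 0.0, 1.0]])
pts3d = np.array([[0.0, 0.0, 5.0], [1.0, 0.0, 5.0], [0.0, 1.0, 6.0]])
R, t = np.eye(3), np.zeros(3)
proj = K @ pts3d.T
pts2d = (proj[:2] / proj[2]).T
mask = reprojection_inliers(R, t, K, pts3d, pts2d)
```

RANSAC随机采样最小点集求解候选位姿后,即以此类内点统计选取最优解。After each RANSAC minimal-sample pose hypothesis, an inlier count of this kind selects the best solution.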

3、将不同帧面阵影像间的第二SIFT特征进行匹配,并利用初始外方位元素对匹配的特征点进行过滤,得到匹配特征点。3. Match the second SIFT features between different frame array images, and use the initial exterior orientation elements to filter the matched feature points to obtain the matched feature points.

4、根据面阵影像间匹配特征点的像素位移确认处于初始静置状态及结束静置状态的静置面阵影像。4. The stationary area array images in the initial stationary state and the final stationary state are confirmed according to the pixel displacement of the matching feature points between the area array images.

具体地,根据面阵影像间匹配特征点的像素位移(即各匹配特征点视差)判断处于初始静置状态及结束静置状态的面阵影像,剔除部分静置面阵影像,使初始阶段及结束阶段各保留两张静态面阵影像。Specifically, the area array images in the initial static state and the final static state are judged according to the pixel displacement of the matching feature points between the area array images (i.e., the parallax of each matching feature point), and some static area array images are eliminated, so that two static area array images are retained in the initial stage and the final stage respectively.
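按匹配特征点视差判断影像对是否静置的做法可示意如下(Python/NumPy;0.5像素的阈值为说明性假设):The parallax-based test for whether an image pair is static can be sketched as follows (Python/NumPy; the 0.5-pixel threshold is an illustrative assumption):

```python
import numpy as np

def is_static_pair(pts_a, pts_b, disp_thresh_px=0.5):
    """Classify an image pair as static when the mean pixel displacement
    (parallax) of its matched feature points is below a threshold."""
    disp = np.linalg.norm(pts_a - pts_b, axis=1)
    return float(disp.mean()) < disp_thresh_px

pts_a = np.array([[100.0, 50.0], [200.0, 80.0], [320.0, 240.0]])
static = is_static_pair(pts_a, pts_a + 0.1)  # sub-pixel drift -> static
moving = is_static_pair(pts_a, pts_a + 5.0)  # clear motion -> not static
```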

5、基于光束法平差对非静置的面阵影像进行位姿估计,最终得到各帧面阵影像成像时相机的位姿信息。5. Based on the bundle adjustment, the pose of the non-stationary array image is estimated, and finally the pose information of the camera when each frame of the array image is imaged is obtained.

具体地,影像位姿数据作为时空标定的输入数据,其中第 $i$ 帧影像的位姿为 $T_i=\{R^w_{c_i},\,p^w_{c_i}\}$,其中 $R^w_{c_i}$、$p^w_{c_i}$ 为第 $i$ 帧影像成像时相机在控制场坐标系下的旋转参数及位置参数。Specifically, the image pose data serve as the input to the spatiotemporal calibration, where the pose of the $i$-th frame is $T_i=\{R^w_{c_i},\,p^w_{c_i}\}$, with $R^w_{c_i}$ and $p^w_{c_i}$ the rotation and position parameters of the camera in the control-field coordinate system at the imaging instant of frame $i$.

步骤S104:对各帧面阵影像间的IMU数据进行预积分处理,得到IMU的旋转、速度和位移的状态描述。Step S104: pre-integrate the IMU data between each frame array image to obtain a state description of the rotation, speed and displacement of the IMU.

需要说明的是,原则上,给定初始位姿及速度,IMU位姿可由IMU测得的角速度 $\tilde{\omega}$ 及加速度值 $\tilde{a}$ 积分获取,但其测量值不仅包含高斯白噪声和游走偏差,且加速度测量值还应减去重力加速度 $g^w$ 的影响。由此,IMU测量模型可描述为:It should be noted that, in principle, given the initial pose and velocity, the IMU pose can be obtained by integrating the angular velocity $\tilde{\omega}$ and acceleration $\tilde{a}$ measured by the IMU; the measurements, however, contain Gaussian white noise and random-walk biases, and the influence of gravity $g^w$ must be removed from the acceleration measurement. The IMU measurement model can therefore be described as:

$$\tilde{\omega}=(R^w_b)^{\mathsf T}\,\omega^w+b_g+n_g,\qquad \tilde{a}=(R^w_b)^{\mathsf T}\big(a^w-g^w\big)+b_a+n_a \tag{3}$$

其中,$\omega^w$ 和 $a^w$ 为IMU在全局坐标系下的角速度及线加速度值,$R^w_b$ 为IMU在世界坐标系下的旋转矩阵,$n_g$ 和 $b_g$ 分别为角速度的高斯白噪声和游走偏差,$n_a$ 和 $b_a$ 分别为线加速度值的高斯白噪声和游走偏差。本实施例采用预积分对相机帧间的IMU数据进行预处理,设某连续相机相邻帧间对应的IMU数据为第 $i$ 至第 $j$ 时刻,则IMU预积分的旋转、速度及位移分别可描述为:Here $\omega^w$ and $a^w$ are the angular velocity and linear acceleration of the IMU in the global frame, $R^w_b$ is the rotation matrix of the IMU in the world coordinate system, $n_g$ and $b_g$ are the Gaussian white noise and random-walk bias of the angular velocity, and $n_a$ and $b_a$ those of the linear acceleration. This embodiment pre-integrates the IMU data between camera frames: let the IMU data between two adjacent camera frames span instants $i$ to $j$; the pre-integrated rotation, velocity and displacement can then be described as:

$$\Delta R_{ij}=\prod_{k=i}^{j-1}\mathrm{Exp}\big((\tilde{\omega}_k-b_g)\,\delta t\big)$$

$$\Delta v_{ij}=\sum_{k=i}^{j-1}\Delta R_{ik}\,(\tilde{a}_k-b_a)\,\delta t$$

$$\Delta p_{ij}=\sum_{k=i}^{j-1}\Big[\Delta v_{ik}\,\delta t+\tfrac{1}{2}\,\Delta R_{ik}\,(\tilde{a}_k-b_a)\,\delta t^{2}\Big]$$

其中,$\Delta R_{ij}$ 表示第 $i$ 至第 $j$ 时刻的旋转差,$\Delta v_{ij}$ 表示第 $i$ 至第 $j$ 时刻的速度差,$\Delta p_{ij}$ 表示第 $i$ 至第 $j$ 时刻的位移差,$k$ 表示第 $i$ 至第 $j$ 时刻之间的时刻,$\delta t$ 表示 $k$ 至第 $k+1$ 时刻的时间差。Here $\Delta R_{ij}$, $\Delta v_{ij}$ and $\Delta p_{ij}$ are the rotation, velocity and displacement differences between instants $i$ and $j$, $k$ denotes an instant between $i$ and $j$, and $\delta t$ is the time difference from instant $k$ to instant $k+1$.
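上述预积分量 $\Delta R$、$\Delta v$、$\Delta p$ 的离散累积可示意如下(Python/NumPy;噪声项省略,数值为合成示例):The discrete accumulation of the pre-integrated quantities $\Delta R$, $\Delta v$ and $\Delta p$ can be sketched as follows (Python/NumPy; noise terms omitted, synthetic numbers):

```python
import numpy as np

def so3_exp(phi):
    """Rodrigues formula: rotation vector -> rotation matrix."""
    theta = np.linalg.norm(phi)
    if theta < 1e-10:
        return np.eye(3)
    a = phi / theta
    A = np.array([[0.0, -a[2], a[1]], [a[2], 0.0, -a[0]], [-a[1], a[0], 0.0]])
    return np.eye(3) + np.sin(theta) * A + (1.0 - np.cos(theta)) * (A @ A)

def preintegrate(gyro, accel, dt, bg, ba):
    """Accumulate Delta R, Delta v, Delta p between two camera frames."""
    dR, dv, dp = np.eye(3), np.zeros(3), np.zeros(3)
    for w, a in zip(gyro, accel):
        acc = dR @ (a - ba)
        dp = dp + dv * dt + 0.5 * acc * dt * dt
        dv = dv + acc * dt
        dR = dR @ so3_exp((w - bg) * dt)
    return dR, dv, dp

# 100 samples at 100 Hz with constant body acceleration and no rotation.
N, dt = 100, 0.01
gyro = np.zeros((N, 3))
accel = np.tile(np.array([1.0, 0.0, 0.0]), (N, 1))
dR, dv, dp = preintegrate(gyro, accel, dt, np.zeros(3), np.zeros(3))
```

注意预积分量本身不含重力项,重力在由预积分恢复世界坐标系状态时才加入。Note that the pre-integrated quantities themselves carry no gravity term; gravity enters only when the world-frame states are recovered from them.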

若考虑到预积分量中角速度偏差 $\delta b_g$ 及加速度偏差 $\delta b_a$,则IMU积分获取状态为:If the angular-velocity bias $\delta b_g$ and acceleration bias $\delta b_a$ in the pre-integrated quantities are taken into account, the states obtained by IMU integration are:

$$R^w_{b_j}=R^w_{b_i}\,\Delta R_{ij}\,\mathrm{Exp}\big(J^{\Delta R}_{b_g}\,\delta b_g\big)$$

$$v^w_j=v^w_i+g^w\,\Delta t_{ij}+R^w_{b_i}\big(\Delta v_{ij}+J^{\Delta v}_{b_g}\,\delta b_g+J^{\Delta v}_{b_a}\,\delta b_a\big)$$

$$p^w_{b_j}=p^w_{b_i}+v^w_i\,\Delta t_{ij}+\tfrac{1}{2}\,g^w\,\Delta t_{ij}^{2}+R^w_{b_i}\big(\Delta p_{ij}+J^{\Delta p}_{b_g}\,\delta b_g+J^{\Delta p}_{b_a}\,\delta b_a\big)$$

其中:$R^w_{b_i}$、$p^w_{b_i}$ 为第 $i$ 时刻IMU在世界坐标系下的旋转矩阵及平移向量;$R^w_{b_j}$、$p^w_{b_j}$ 为第 $j$ 时刻IMU在世界坐标系下的旋转矩阵及平移向量;$v^w_i$ 和 $v^w_j$ 为第 $i$ 和 $j$ 时刻IMU在世界坐标系下的速度;$\Delta t_{ij}$ 为 $i$ 至第 $j$ 时刻的时间差;$J^{\Delta R}_{b_g}$ 为旋转预积分量对 $b_g$ 的导数;$J^{\Delta v}_{b_g}$、$J^{\Delta v}_{b_a}$ 为速度预积分量对 $b_g$、$b_a$ 的导数;$J^{\Delta p}_{b_g}$、$J^{\Delta p}_{b_a}$ 为位移预积分量对 $b_g$、$b_a$ 的导数,以上导数可由预积分过程获取。Here $R^w_{b_i}$, $p^w_{b_i}$ and $R^w_{b_j}$, $p^w_{b_j}$ are the rotation matrices and translation vectors of the IMU in the world coordinate system at instants $i$ and $j$; $v^w_i$ and $v^w_j$ are the IMU velocities in the world coordinate system at those instants; $\Delta t_{ij}$ is the time difference from $i$ to $j$; $J^{\Delta R}_{b_g}$ is the derivative of the rotation pre-integration with respect to $b_g$; $J^{\Delta v}_{b_g}$, $J^{\Delta v}_{b_a}$ and $J^{\Delta p}_{b_g}$, $J^{\Delta p}_{b_a}$ are the derivatives of the velocity and displacement pre-integrations with respect to $b_g$ and $b_a$; all these derivatives can be obtained from the pre-integration process.

进一步的,如图3所示,由于时钟不同步、传输延迟、传感器响应和操作系统开销,这些延时的存在致使传感器采样时刻与测量时刻间存在未对齐情况。若在 $t$ 时刻同时对相机和IMU数据进行采样,则相机和IMU测量值实际时间戳分别为:Furthermore, as shown in Figure 3, clock asynchrony, transmission delay, sensor response and operating-system overhead cause misalignment between the sensor sampling instant and the measurement instant. If the camera and the IMU are both sampled at time $t$, the actual timestamps of the camera and IMU measurements are:

$$t_C=t+t^C_d,\qquad t_I=t+t^I_d$$

其中,$t_C$、$t_I$ 分别为相机和IMU的实际时间戳,$t^C_d$、$t^I_d$ 分别为相机和IMU的延时。由此定义相机及IMU间时间偏移 $t_d$:Here $t_C$ and $t_I$ are the actual timestamps of the camera and IMU, and $t^C_d$ and $t^I_d$ their respective delays. The camera-IMU time offset $t_d$ is accordingly defined as:

$$t_d=t^C_d-t^I_d$$

由此可获取相机及IMU间位姿存在如下关系:From this, the camera and IMU poses satisfy the following relation:

$$T^w_C(t)=T^w_I(t+t_d)\;T^I_C$$

其中,$T^w_C(t)$ 为 $t$ 采样时刻相机位姿,$T^w_I(t+t_d)$ 为 $t+t_d$ 采样时刻IMU的位姿,$T^I_C$ 为相机在IMU坐标系下的位姿。显然,所谓时空标定即对时间偏移量 $t_d$、相机与IMU间相对变换矩阵 $T^I_C$(包含角元素 $R^I_C$、线元素 $p^I_C$)进行估计的过程。因此,步骤S104具体包括以下步骤:Here $T^w_C(t)$ is the camera pose at sampling instant $t$, $T^w_I(t+t_d)$ the IMU pose at instant $t+t_d$, and $T^I_C$ the pose of the camera in the IMU coordinate system. Evidently, the so-called spatiotemporal calibration is the process of estimating the time offset $t_d$ and the relative transformation matrix $T^I_C$ between the camera and the IMU (comprising the angular elements $R^I_C$ and the linear elements $p^I_C$). Therefore, step S104 specifically includes the following steps:

1、构建时间偏移 $t_d$,建立相机和IMU之间的时空同步状态。1. Construct the time offset $t_d$ and establish the spatiotemporal synchronization state between the camera and the IMU.

具体地,本发明将 $t+t_d$ 时刻IMU在世界坐标系下的旋转矩阵及平移向量分别线性内插为:Specifically, the present invention linearly interpolates the rotation matrix and translation vector of the IMU in the world coordinate system at instant $t+t_d$ as:

$$R^w_I(t+t_d)\approx R^w_I(t)\,\mathrm{Exp}\big(\omega(t)\,t_d\big),\qquad p^w_I(t+t_d)\approx p^w_I(t)+v^w(t)\,t_d$$

其中,$\omega(t)$ 为 $t$ 时刻IMU在体坐标系下角速度,$v^w(t)$ 为 $t$ 时刻IMU在世界坐标系下线速度。由此可获取相机与IMU时空同步状态,可描述为:Here $\omega(t)$ is the angular velocity of the IMU in the body frame at instant $t$ and $v^w(t)$ its linear velocity in the world coordinate system. The camera-IMU spatiotemporal synchronization state can thus be described as:

$$T^w_C(t)=\Big\{\,R^w_I(t)\,\mathrm{Exp}\big(\omega(t)\,t_d\big)\,R^I_C,\;\;p^w_I(t)+v^w(t)\,t_d+R^w_I(t+t_d)\,p^I_C\,\Big\}$$
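上述按时间偏移对IMU位姿进行一阶内插的操作可示意如下(Python/NumPy;变量名为说明性假设):The first-order shift of an IMU pose by the time offset can be sketched as follows (Python/NumPy; names are illustrative assumptions):

```python
import numpy as np

def so3_exp(phi):
    """Rodrigues formula: rotation vector -> rotation matrix."""
    theta = np.linalg.norm(phi)
    if theta < 1e-10:
        return np.eye(3)
    a = phi / theta
    A = np.array([[0.0, -a[2], a[1]], [a[2], 0.0, -a[0]], [-a[1], a[0], 0.0]])
    return np.eye(3) + np.sin(theta) * A + (1.0 - np.cos(theta)) * (A @ A)

def shift_pose(R, p, omega, v, t_d):
    """First-order interpolation of an IMU pose by the time offset t_d:
    rotation perturbed by Exp(omega * t_d), translation by v * t_d."""
    return R @ so3_exp(omega * t_d), p + v * t_d

R0, p0 = np.eye(3), np.zeros(3)
omega = np.array([0.0, 0.0, 0.5])   # body angular velocity, rad/s
v = np.array([1.0, 0.0, 0.0])       # world linear velocity, m/s
R1, p1 = shift_pose(R0, p0, omega, v, 0.1)
```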

2、基于时空同步状态对各帧面阵影像间的IMU数据进行预积分处理,得到面阵影像的旋转、速度和位移的状态描述。2. Based on the spatiotemporal synchronization state, the IMU data between each frame of the array image is pre-integrated to obtain the state description of the rotation, speed and displacement of the array image.

具体地,面阵影像旋转、速度、位移状态描述分别为:Specifically, the rotation, velocity and displacement states of the area array images are described as:

$$R^w_{c_i}\,\mathrm{Exp}(\omega_i t_d)\,(R^I_C)^{-1}\,\Delta R_{ij}\,\mathrm{Exp}\big(J^{\Delta R}_{b_g}\,\delta b_g\big)=R^w_{c_j}\,\mathrm{Exp}(\omega_j t_d)\,(R^I_C)^{-1}\tag{13}$$

$$v^w_j=v^w_i+g^w\,\Delta t_{ij}+R^w_{I_i}\big(\Delta v_{ij}+J^{\Delta v}_{b_g}\,\delta b_g+J^{\Delta v}_{b_a}\,\delta b_a\big)\tag{14}$$

$$p^w_{I_j}=p^w_{I_i}+v^w_i\,\Delta t_{ij}+\tfrac{1}{2}\,g^w\,\Delta t_{ij}^{2}+R^w_{I_i}\big(\Delta p_{ij}+J^{\Delta p}_{b_g}\,\delta b_g+J^{\Delta p}_{b_a}\,\delta b_a\big)\tag{15}$$

其中 $R^w_{I_k}=R^w_{c_k}\,\mathrm{Exp}(\omega_k t_d)\,(R^I_C)^{-1}$、$p^w_{I_k}=p^w_{c_k}-R^w_{I_k}\,p^I_C$($k=i,j$)。其中,$R^w_{c_i}$、$R^w_{c_j}$ 表示第 $i$、$j$ 帧影像成像时相机在控制场坐标系下的旋转,$R^I_C$ 表示角元素,$\mathrm{Exp}(\cdot)$ 表示指数映射,$\omega_i$、$\omega_j$ 表示第 $i$、$j$ 时刻IMU在体坐标系下角速度,$t_d$ 表示相机与IMU间的时间偏移,$\Delta R_{ij}$ 表示第 $i$ 至第 $j$ 时刻的旋转差,$J^{\Delta R}_{b_g}$ 表示旋转预积分量对 $b_g$ 的导数,$\delta b_g$ 表示角速度偏差,$v^w_i$、$v^w_j$ 表示第 $i$、$j$ 时刻IMU在世界坐标系下速度,$g^w$ 表示重力加速度,$\Delta t_{ij}$ 表示第 $i$ 至第 $j$ 时刻的时间差,$\Delta v_{ij}$ 表示第 $i$ 至第 $j$ 时刻的速度差,$J^{\Delta v}_{b_g}$、$J^{\Delta v}_{b_a}$ 表示速度预积分量对 $b_g$、$b_a$ 的导数,$\delta b_a$ 表示加速度偏差,$p^w_{c_i}$、$p^w_{c_j}$ 表示第 $i$、$j$ 帧影像成像时相机在控制场坐标系下的位移,$\Delta p_{ij}$ 表示第 $i$ 至第 $j$ 时刻的位移差,$J^{\Delta p}_{b_g}$、$J^{\Delta p}_{b_a}$ 表示位移预积分量对 $b_g$、$b_a$ 的导数,$p^I_C$ 表示线元素。Here $R^w_{I_k}=R^w_{c_k}\,\mathrm{Exp}(\omega_k t_d)\,(R^I_C)^{-1}$ and $p^w_{I_k}=p^w_{c_k}-R^w_{I_k}\,p^I_C$ ($k=i,j$); $R^w_{c_i}$ and $R^w_{c_j}$ denote the camera rotations in the control-field coordinate system at frames $i$ and $j$, $R^I_C$ the angular elements, $\mathrm{Exp}(\cdot)$ the exponential map, $\omega_i$ and $\omega_j$ the IMU body-frame angular velocities at instants $i$ and $j$, $t_d$ the camera-IMU time offset, $\Delta R_{ij}$ the rotation difference from $i$ to $j$, $J^{\Delta R}_{b_g}$ the derivative of the rotation pre-integration with respect to $b_g$, $\delta b_g$ the angular-velocity bias, $v^w_i$ and $v^w_j$ the IMU world-frame velocities, $g^w$ the gravitational acceleration, $\Delta t_{ij}$ the time difference from $i$ to $j$, $\Delta v_{ij}$ the velocity difference, $J^{\Delta v}_{b_g}$ and $J^{\Delta v}_{b_a}$ the derivatives of the velocity pre-integration with respect to $b_g$ and $b_a$, $\delta b_a$ the acceleration bias, $p^w_{c_i}$ and $p^w_{c_j}$ the camera positions in the control-field coordinate system at frames $i$ and $j$, $\Delta p_{ij}$ the displacement difference, $J^{\Delta p}_{b_g}$ and $J^{\Delta p}_{b_a}$ the derivatives of the displacement pre-integration, and $p^I_C$ the linear elements.

步骤S105:基于位姿信息和旋转的状态描述对相机和IMU间的角元素和时间偏差进行标定,且基于位姿信息、速度和位移的状态描述对相机和IMU间的线元素进行标定。Step S105: Calibrate the angular elements and time deviations between the camera and the IMU based on the state description of the posture information and rotation, and calibrate the line elements between the camera and the IMU based on the state description of the posture information, velocity and displacement.

具体地,在得到位姿信息后,利用位姿信息和旋转预积分量即可对相机和IMU间的角元素和时间偏差进行标定,利用位姿信息、速度预积分量和位移预积分量即可对相机和IMU间的线元素进行标定。Specifically, after the pose information is obtained, the angular elements and the time offset between the camera and the IMU can be calibrated from the pose information and the rotation pre-integration, and the linear elements from the pose information together with the velocity and displacement pre-integrations.

进一步的,基于位姿信息和旋转的状态描述对相机和IMU间的角元素和时间偏差进行标定的步骤,具体包括:Furthermore, the step of calibrating the angular element and time deviation between the camera and the IMU based on the state description of the pose information and the rotation includes:

1、基于旋转的状态描述构建残差式。1. Construct the residual formula based on the rotation state description.

具体地,由于本实施例中待处理连续影像间时间差较大,故假设不同时刻IMU角速度偏差不同。在本实施例中,通过公式(13)构建残差式实现标定。公式(13)中旋转矩阵基于四元数可表达为:Specifically, since the time difference between consecutive images to be processed is large in this embodiment, the IMU angular-velocity biases are assumed to differ across instants. Calibration is achieved by constructing a residual from formula (13), whose rotation matrices can be expressed with quaternions as:

$$q^w_{c_i}\otimes q(\omega_i t_d)\otimes (q^I_C)^{*}\otimes \Delta q_{ij}\otimes q\big(J^{\Delta R}_{b_g}\,\delta b_g\big)=q^w_{c_j}\otimes q(\omega_j t_d)\otimes (q^I_C)^{*}\tag{16}$$

其中,$q^I_C$ 表示相机在IMU坐标系下角元素的四元数表示法;$(q^I_C)^{*}$ 为 $q^I_C$ 的共轭四元数,表示 $q^I_C$ 的逆变换;$\otimes$ 表示四元数乘法,$q(\cdot)$ 表示由旋转向量构成的四元数。Here $q^I_C$ is the quaternion representation of the angular elements of the camera in the IMU coordinate system; $(q^I_C)^{*}$ is its conjugate quaternion, representing the inverse transform of $q^I_C$; $\otimes$ denotes quaternion multiplication and $q(\cdot)$ the quaternion of a rotation vector.

四元数 $q$ 可定义为:A quaternion $q$ can be defined as:

$$q=\begin{bmatrix}q_w & q_x & q_y & q_z\end{bmatrix}^{\mathsf T},\qquad \|q\|=1$$

其中,$q_w$、$q_x$、$q_y$、$q_z$ 分别为四元数实部、$x$ 方向虚部、$y$ 方向虚部、$z$ 方向虚部。Here $q_w$, $q_x$, $q_y$ and $q_z$ are the real part and the $x$-, $y$- and $z$-direction imaginary parts of the quaternion.

则四元数左乘运算 $[q]_L$ 和右乘运算 $[q]_R$ 可被定义为:The quaternion left-multiplication $[q]_L$ and right-multiplication $[q]_R$ operators can then be defined as:

$$[q]_L=\begin{bmatrix}q_w & -v^{\mathsf T}\\ v & q_w I_{3\times3}+[v]_\times\end{bmatrix},\qquad [q]_R=\begin{bmatrix}q_w & -v^{\mathsf T}\\ v & q_w I_{3\times3}-[v]_\times\end{bmatrix}$$

其中 $v=[q_x\ q_y\ q_z]^{\mathsf T}$,$[v]_\times$ 为取向量 $v$ 的反对称阵,$I_{3\times3}$ 表示3×3大小的单位阵。Here $v=[q_x\ q_y\ q_z]^{\mathsf T}$, $[v]_\times$ is the antisymmetric (skew) matrix of $v$, and $I_{3\times3}$ is the 3×3 identity matrix.

2、初始化不同时刻的角速度偏差。2. Initialize the angular velocity deviation at different times.

具体地,本实施例所采集数据中第1~2帧面阵影像间、第 $N-1$~$N$ 帧面阵影像间均处于静态($N$ 为影像总帧数),故该期间IMU角速度真值为0,根据IMU测量模型(见公式(3))可知其测量值为角速度游走偏差 $b_g$ 与测量噪声 $n_g$ 之和。取第1~2帧面阵影像间邻近第2帧面阵影像的5帧IMU角速度数据、第 $N-1$~$N$ 帧面阵影像间邻近第 $N-1$ 帧面阵影像的5帧IMU角速度数据,分别取平均值,作为第2、$N-1$ 时刻IMU的角速度偏差,通过线性内插获取第 $k$ 时刻IMU的角速度偏差初值,利用该角速度偏差进行影像帧间IMU数据预积分。需要理解的是,在现有方案中,采用的相机频率高,因此不同时刻相机的偏差可以认定为相同;但是,本发明中,所采用相机不再局限于高频率相机,还可以采用低频率相机,而低频率相机不能简单地认定不同时刻的偏差相同。因此,本发明利用初始及结束时刻的静态数据来对不同时刻的角速度偏差进行初始化。Specifically, in the collected data the interval between area array frames 1-2 and between frames $N-1$ and $N$ (with $N$ the total frame count) is static, so the true IMU angular velocity is 0 during those periods; by the IMU measurement model (see formula (3)) the measurement then equals the sum of the angular-velocity random-walk bias $b_g$ and the measurement noise $n_g$. The 5 IMU angular-velocity samples adjacent to frame 2 within frames 1-2, and the 5 samples adjacent to frame $N-1$ within frames $N-1$ to $N$, are averaged to give the IMU angular-velocity biases at instants 2 and $N-1$; the initial bias at any instant $k$ is obtained by linear interpolation and used for the inter-frame IMU pre-integration. It should be understood that existing schemes use high-frequency cameras, so the biases at different instants can be taken as identical; in the present invention, however, low-frequency cameras are also admissible, for which this cannot simply be assumed. The present invention therefore uses the static data at the initial and final instants to initialize the angular-velocity biases at different instants.

3、基于不同时刻的角速度偏差,利用残差式分别进行角元素标定、时间偏差标定和不同时刻角速度偏差标定。3. Based on the angular velocity deviation at different times, the residual formula is used to perform angular element calibration, time deviation calibration and angular velocity deviation calibration at different times.

具体地,针对角元素 $R^I_C$ 进行标定,通过公式(16)移项可获取第 $i$、$j$ 面阵影像间数据存在等式:Specifically, to calibrate the angular elements $R^I_C$, rearranging formula (16) yields, for the data between area array images $i$ and $j$:

$$\Big([\Delta q^b_{ij}]_L-[\Delta q^c_{ij}]_R\Big)\,q^I_C=0$$

其中 $\Delta q^b_{ij}$ 为IMU预积分相对旋转的四元数,$\Delta q^c_{ij}=(q^w_{c_i})^{*}\otimes q^w_{c_j}$ 为相机相对旋转的四元数。在 $\|q^I_C\|=1$ 条件下,将各影像对的系数矩阵堆叠为 $A$,利用最小二乘法对 $q^I_C$ 的估计为:Here $\Delta q^b_{ij}$ is the quaternion of the IMU pre-integrated relative rotation and $\Delta q^c_{ij}=(q^w_{c_i})^{*}\otimes q^w_{c_j}$ that of the camera relative rotation. Under the condition $\|q^I_C\|=1$, stacking the coefficient matrices of all image pairs into $A$, the least-squares estimate of $q^I_C$ is:

$$\hat q^I_C=\arg\min_{\|q\|=1}\,\|A\,q\|^2$$

即取 $A$ 最小奇异值对应的右奇异向量。其中,$\hat q^I_C$ 为相机在IMU坐标系下角元素的四元数估计,将其转换为旋转矩阵 $R^I_C$ 即完成标定。That is, the right singular vector of $A$ associated with its smallest singular value. Converting the estimate $\hat q^I_C$ into the rotation matrix $R^I_C$ completes the calibration.
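上述对角元素的最小二乘估计可按经典手眼旋转标定方式实现:每对帧的相对旋转给出约束 $([\Delta q^b]_L-[\Delta q^c]_R)\,q=0$,堆叠后取最小奇异值对应的右奇异向量(Python/NumPy示意;四元数约定为 [w, x, y, z],数据为合成验证用):The least-squares estimate of the angular elements above can be realized in the classic hand-eye fashion: each frame pair contributes the constraint $([\Delta q^b]_L-[\Delta q^c]_R)\,q=0$, and the stacked system is solved via the right singular vector of the smallest singular value (Python/NumPy sketch; the quaternion convention is [w, x, y, z] and the data are synthetic for verification):

```python
import numpy as np

def skew(v):
    return np.array([[0.0, -v[2], v[1]], [v[2], 0.0, -v[0]], [-v[1], v[0], 0.0]])

def q_left(q):   # q_left(q) @ p == q (x) p
    w, v = q[0], q[1:]
    M = np.zeros((4, 4))
    M[0, 0], M[0, 1:] = w, -v
    M[1:, 0], M[1:, 1:] = v, w * np.eye(3) + skew(v)
    return M

def q_right(q):  # q_right(q) @ p == p (x) q
    w, v = q[0], q[1:]
    M = np.zeros((4, 4))
    M[0, 0], M[0, 1:] = w, -v
    M[1:, 0], M[1:, 1:] = v, w * np.eye(3) - skew(v)
    return M

def qmul(a, b):
    return q_left(a) @ b

def qconj(q):
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def solve_extrinsic_rotation(imu_dqs, cam_dqs):
    """Stack ([q_imu]_L - [q_cam]_R) q = 0 over all pairs and solve by SVD."""
    A = np.vstack([q_left(qi) - q_right(qc) for qi, qc in zip(imu_dqs, cam_dqs)])
    q = np.linalg.svd(A)[2][-1]
    return q / np.linalg.norm(q)

# Synthetic check: a known extrinsic rotation and three relative camera
# rotations about different axes (q_imu = q_bc (x) q_cam (x) q_bc*).
q_bc = np.array([0.9, 0.1, 0.2, 0.3])
q_bc /= np.linalg.norm(q_bc)
cam_dqs = []
for ang, ax in [(0.4, [1.0, 0, 0]), (0.3, [0, 1.0, 0]), (0.5, [0, 0, 1.0])]:
    ax = np.array(ax)
    cam_dqs.append(np.concatenate(([np.cos(ang / 2)], np.sin(ang / 2) * ax)))
imu_dqs = [qmul(qmul(q_bc, qc), qconj(q_bc)) for qc in cam_dqs]
q_est = solve_extrinsic_rotation(imu_dqs, cam_dqs)
```

解的符号具有二义性,故验证时只需比较到整体正负号。The solution is defined only up to a global sign, so verification compares up to sign.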

针对时间偏差 $t_d$ 进行标定,通过公式(16)移项可获取第 $i$、$j$ 面阵影像间数据存在等式:To calibrate the time offset $t_d$, rearranging formula (16) yields, for the data between area array images $i$ and $j$:

$$q(\omega_i t_d)\otimes(q^I_C)^{*}\otimes\Delta q_{ij}\otimes q\big(J^{\Delta R}_{b_g}\,\delta b_g\big)=\big((q^w_{c_i})^{*}\otimes q^w_{c_j}\big)\otimes q(\omega_j t_d)\otimes(q^I_C)^{*}\tag{21}$$

若令 $A_{ij}$ 为公式(21)中与 $t_d$ 相乘的系数项、$b_{ij}$ 为其余常数项,则整理公式(21)中四元数虚部可得线性方程 $A_{ij}\,t_d=b_{ij}$。利用最小二乘法对时间偏差 $t_d$ 进行标定:Letting $A_{ij}$ collect the coefficients multiplying $t_d$ in formula (21) and $b_{ij}$ the remaining constant terms, rearranging the imaginary parts of the quaternions in formula (21) gives the linear equation $A_{ij}\,t_d=b_{ij}$. The time offset $t_d$ is then calibrated by least squares:

$$\hat t_d=\Big(\sum_{ij}A_{ij}^{\mathsf T}A_{ij}\Big)^{-1}\sum_{ij}A_{ij}^{\mathsf T}\,b_{ij}\tag{23}$$

针对不同时刻角速度偏差 $\delta b_g$ 进行标定:通过公式(16)移项可获取第 $i$、$j$ 面阵影像间数据存在等式:To calibrate the angular-velocity biases $\delta b_g$ at different instants: rearranging formula (16) yields, for the data between area array images $i$ and $j$:

$$q\big(J^{\Delta R}_{b_g}\,\delta b_g\big)=\Delta q_{ij}^{*}\otimes q^I_C\otimes q(\omega_i t_d)^{*}\otimes(q^w_{c_i})^{*}\otimes q^w_{c_j}\otimes q(\omega_j t_d)\otimes(q^I_C)^{*}$$

若令 $\theta_{ij}$ 为上式右端四元数对应的旋转向量,则:Letting $\theta_{ij}$ be the rotation vector corresponding to the quaternion on the right-hand side, then:

$$J^{\Delta R}_{b_g}\,\delta b_g=\theta_{ij}$$

利用最小二乘可得第 $i$ 时刻IMU角速度偏差修正量为:By least squares, the IMU angular-velocity bias correction at instant $i$ is:

$$\delta b_{g,i}=\Big(J^{\Delta R\,{\mathsf T}}_{b_g}\,J^{\Delta R}_{b_g}\Big)^{-1}J^{\Delta R\,{\mathsf T}}_{b_g}\,\theta_{ij}$$

由此修正第 $i$ 时刻角速度偏差,从而对不同时刻角速度偏差进行标定。This correction updates the angular-velocity bias at instant $i$, thereby calibrating the biases at different instants.

4、判断时间偏差是否超过预设时间偏差阈值。4. Determine whether the time offset exceeds the preset time-offset threshold.

具体地,该预设时间偏差阈值预先设置。Specifically, the preset time deviation threshold is preset.

5、若未超过,则确认角元素、时间偏差和不同时刻角速度偏差标定完成。5. If it does not exceed the threshold, the calibration of the angular elements, the time offset and the angular-velocity biases at different instants is confirmed complete.

6、若超过,则循环利用标定得到的时间偏差对IMU数据再次进行预积分处理后进行标定,并将新的标定结果中的时间偏差再次与预设时间偏差阈值进行比较,直至时间偏差低于预设时间偏差阈值为止。6. If it exceeds the threshold, the IMU data are pre-integrated again with the calibrated time offset and the calibration is repeated, the time offset in the new result being compared with the preset threshold again, until the time offset falls below the preset threshold.

进一步的,基于位姿信息、速度和位移的状态描述对相机和IMU间的线元素进行标定的步骤,包括:Further, the step of calibrating the linear elements between the camera and the IMU based on the pose information and the velocity and displacement state descriptions includes:

1、基于角元素、时间偏差和不同时刻角速度偏差的完成标定后的标定数据对IMU数据重新进行预积分,得到新的速度和新的位移的状态描述。1. Based on the calibration data of the angular element, time deviation and angular velocity deviation at different times, the IMU data is re-pre-integrated to obtain the state description of the new velocity and new displacement.

具体地,在对线元素进行标定前,利用已标定的角元素、时间偏差和不同时刻角速度偏差重新进行预积分,公式(14)和(15)可变形为:Specifically, before the linear elements are calibrated, the pre-integration is redone with the calibrated angular elements, time offset and per-instant angular-velocity biases, after which formulas (14) and (15) can be rearranged as:

$$R^w_{I_i}\,\Delta v_{ij}=v^w_j-v^w_i-g^w\,\Delta t_{ij}-R^w_{I_i}\,J^{\Delta v}_{b_a}\,\delta b_a\tag{27}$$

$$R^w_{I_i}\,\Delta p_{ij}=p^w_{I_j}-p^w_{I_i}-v^w_i\,\Delta t_{ij}-\tfrac{1}{2}\,g^w\,\Delta t_{ij}^{2}-R^w_{I_i}\,J^{\Delta p}_{b_a}\,\delta b_a\tag{28}$$

其中 $p^w_{I_k}=p^w_{c_k}-R^w_{I_k}\,p^I_C$($k=i,j$)。本实施例基于该式进行线元素标定,除线元素 $p^I_C$ 外本实施例中还存在未知数重力加速度 $g^w$、加速度偏差 $\delta b_a$,而在本实施例中控制场重力方向已知,且可描述为 $\hat g^w=[0\ 0\ -1]^{\mathsf T}$,故仅对重力加速度范数 $\|g^w\|$ 进行估计,由此可得第 $i$ 时刻公式(27)~(28)表达为线性系统:Here $p^w_{I_k}=p^w_{c_k}-R^w_{I_k}\,p^I_C$ ($k=i,j$). The linear elements are calibrated from these equations; besides the linear elements $p^I_C$, the unknowns also include the gravitational acceleration $g^w$ and the acceleration bias $\delta b_a$. Since the gravity direction of the control field is known and can be described as $\hat g^w=[0\ 0\ -1]^{\mathsf T}$, only the gravity norm $\|g^w\|$ is estimated, so that at instant $i$ formulas (27)-(28) can be written as the linear system:

$$H_{ij}\,x=z_{ij},\qquad x=\big[\,p^{I\,{\mathsf T}}_C\ \ \|g^w\|\ \ v^{w\,{\mathsf T}}_i\ \ v^{w\,{\mathsf T}}_j\ \ \delta b_a^{\mathsf T}\,\big]^{\mathsf T}$$

其中:$H_{ij}$、$z_{ij}$ 由公式(27)~(28)整理获取。Here $H_{ij}$ and $z_{ij}$ are obtained by rearranging formulas (27)-(28).

2、初始化不同时刻的重力加速度偏差。2. Initialize the gravity acceleration deviation at different times.

具体地,本实施例所采集数据中第1~2帧面阵影像间、第 $N-1$~$N$ 帧面阵影像间均处于静态,故该期间IMU加速度真值为0,根据IMU测量模型(见公式(3))可知其测量值为加速度游走偏差 $b_a$、测量噪声 $n_a$ 及重力加速度项之和。取第1~2帧面阵影像间邻近第2帧面阵影像的5帧IMU加速度数据平均值 $\bar a_2$、第 $N-1$~$N$ 帧面阵影像间邻近第 $N-1$ 帧面阵影像的5帧IMU加速度数据平均值 $\bar a_{N-1}$ 后,可计算得第2、$N-1$ 时刻IMU的重力加速度偏差:Specifically, in the collected data the intervals between area array frames 1-2 and frames $N-1$ to $N$ are static, so the true IMU acceleration is 0 during those periods; by the IMU measurement model (see formula (3)) the measurement is then the sum of the acceleration random-walk bias $b_a$, the measurement noise $n_a$ and the gravity term. Taking the mean $\bar a_2$ of the 5 IMU acceleration samples adjacent to frame 2 within frames 1-2, and the mean $\bar a_{N-1}$ of the 5 samples adjacent to frame $N-1$ within frames $N-1$ to $N$, the gravity-acceleration biases of the IMU at instants 2 and $N-1$ are:

$$b_{a,2}=\bar a_2+(R^w_{b_2})^{\mathsf T}g^w,\qquad b_{a,N-1}=\bar a_{N-1}+(R^w_{b_{N-1}})^{\mathsf T}g^w$$

其中:$(R^w_{b_2})^{\mathsf T}$ 和 $(R^w_{b_{N-1}})^{\mathsf T}$ 为第2、$N-1$ 时刻IMU在世界坐标系中旋转矩阵的转置,由相机位姿与已标定的角元素计算得到。通过线性内插获取第 $k$ 时刻IMU的重力加速度偏差初值,利用该重力加速度偏差进行面阵影像帧间IMU数据预积分。需要理解的是,在现有方案中,采用的相机频率高,因此不同时刻相机的偏差可以认定为相同;但是,本发明中,所采用相机不再局限于高频率相机,还可以采用低频率相机,而低频率相机不能简单地认定不同时刻的偏差相同。因此,本发明利用初始及结束时刻的静态数据来对不同时刻的重力加速度偏差进行初始化。Here $(R^w_{b_2})^{\mathsf T}$ and $(R^w_{b_{N-1}})^{\mathsf T}$ are the transposed rotation matrices of the IMU in the world coordinate system at instants 2 and $N-1$, computed from the camera poses and the calibrated angular elements. The initial gravity-acceleration bias at any instant $k$ is obtained by linear interpolation and used for the inter-frame IMU pre-integration. It should be understood that existing schemes use high-frequency cameras, so the biases at different instants can be taken as identical; in the present invention, however, low-frequency cameras are also admissible, for which this cannot simply be assumed. The present invention therefore uses the static data at the initial and final instants to initialize the gravity-acceleration biases at different instants.
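静态段加速度计偏差的初始化与内插可示意如下(Python/NumPy;由公式(3),静态时测量均值等于偏差减去旋转到体坐标系的重力项,重力取 (0, 0, −9.81) 仅为示例值):The static-segment accelerometer-bias initialization and interpolation can be sketched as follows (Python/NumPy; from formula (3), the static-mean measurement equals the bias minus gravity rotated into the body frame; the value (0, 0, −9.81) is only an example):

```python
import numpy as np

G_W = np.array([0.0, 0.0, -9.81])   # gravity in the control field (example value)

def init_accel_bias(static_meas, R_wb, g_w=G_W):
    """Static segment: true acceleration is zero, so the mean reading is
    b_a - R_wb^T g_w; adding back the rotated gravity recovers the bias."""
    return np.mean(static_meas, axis=0) + R_wb.T @ g_w

def interp_bias(b_start, b_end, t, t_start, t_end):
    """Linear interpolation of the bias for an intermediate frame time."""
    s = (t - t_start) / (t_end - t_start)
    return (1.0 - s) * b_start + s * b_end

# Synthetic check: level attitude, true bias (0.02, -0.01, 0.03).
true_b = np.array([0.02, -0.01, 0.03])
meas = np.tile(true_b - np.eye(3).T @ G_W, (5, 1))   # 5 static samples
b0 = init_accel_bias(meas, np.eye(3))
```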

3、在控制场重力加速度方向已知的情况,结合重力加速度方向、不同时刻的重力加速度偏差、新的速度和新的位移的状态描述分别对线元素、重力加速度、速度和加速度偏差进行标定。3. When the direction of gravity acceleration in the control field is known, the line elements, gravity acceleration, velocity and acceleration deviation are calibrated respectively in combination with the state description of gravity acceleration direction, gravity acceleration deviation at different times, new velocity and new displacement.

具体地,针对线元素 $p^I_C$、重力加速度 $\|g^w\|$ 及速度 $v^w$ 进行标定:将 $H_{ij}$ 中 $\delta b_a$ 对应列置0后,堆叠各影像对方程,利用最小二乘法对未知参数进行估计:Specifically, to calibrate the linear elements $p^I_C$, the gravity norm $\|g^w\|$ and the velocities $v^w$: after zeroing the columns of $H_{ij}$ corresponding to $\delta b_a$ and stacking the equations of all image pairs, the unknown parameters are estimated by least squares:

$$\hat x=\big(H^{\mathsf T}H\big)^{-1}H^{\mathsf T}z$$

其中,$H$、$z$ 为堆叠后的系数矩阵与观测向量。由于本实施例数据采集的特殊性,其初始速度 $v^w_1$、结束速度 $v^w_N$ 为0,在以上优化中设置 $H$ 中相应列为0即可。Here $H$ and $z$ are the stacked coefficient matrix and observation vector. Owing to the particular acquisition procedure of this embodiment, the initial velocity $v^w_1$ and final velocity $v^w_N$ are 0, which is imposed in the optimization above by zeroing the corresponding columns of $H$.

针对加速度偏差 $\delta b_a$ 进行标定:将 $H_{ij}$ 中 $p^I_C$、$\|g^w\|$、$v^w$ 对应列置0后,利用最小二乘法对未知参数进行估计:To calibrate the acceleration bias $\delta b_a$: after zeroing the columns of $H_{ij}$ corresponding to $p^I_C$, $\|g^w\|$ and $v^w$, the unknown parameters are estimated by least squares:

$$\delta\hat b_a=\big(H^{\mathsf T}H\big)^{-1}H^{\mathsf T}z$$

其中,$\delta\hat b_a$ 表示加速度偏差的估计值。Here $\delta\hat b_a$ denotes the estimate of the acceleration bias.

4、判断重力加速度是否超过预设加速度阈值。4. Determine whether the gravitational acceleration exceeds the preset acceleration threshold.

具体地,该预设加速度阈值预先设置。Specifically, the preset acceleration threshold is set in advance.

5、若未超过,则确认线元素、重力加速度、速度和加速度偏差标定完成。5. If it does not exceed the threshold, the calibration of the linear elements, gravitational acceleration, velocities and acceleration bias is confirmed complete.

6、若超过,则循环利用标定得到的加速度偏差对IMU数据再次进行预积分处理后进行标定,并将新的标定结果中的重力加速度再次与预设加速度阈值进行比较,直至重力加速度低于预设加速度阈值为止。6. If it exceeds the threshold, the IMU data are pre-integrated again with the calibrated acceleration bias and the calibration is repeated, the gravitational acceleration in the new result being compared with the preset acceleration threshold again, until it falls below the preset threshold.
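上述"估计—再预积分—再估计"的迭代结构(时间偏差与加速度偏差两处标定相同)可示意如下(Python;repreintegrate 与 estimate_offset 为上文各步骤的占位函数,收敛动态为玩具示例):The "estimate, re-preintegrate, re-estimate" iterative structure used above (identical for the time-offset and acceleration-bias calibrations) can be sketched as follows (Python; `repreintegrate` and `estimate_offset` are placeholders for the steps described above, and the convergence dynamics are a toy example):

```python
def calibrate_time_offset(repreintegrate, estimate_offset, thresh=1e-4, max_iter=20):
    """Loop: preintegrate with the accumulated offset, estimate the residual
    offset, accumulate it, and stop once the update drops below the threshold."""
    total_td = 0.0
    for _ in range(max_iter):
        preint = repreintegrate(total_td)
        td = estimate_offset(preint)
        total_td += td
        if abs(td) < thresh:
            break
    return total_td

# Toy convergence check: the true offset is 0.05 s and each estimate
# recovers 90% of the remaining offset (purely illustrative dynamics).
TRUE_TD = 0.05
repre = lambda total: TRUE_TD - total   # stands in for the re-preintegration
est = lambda residual: 0.9 * residual   # stands in for the least-squares solve
td_hat = calibrate_time_offset(repre, est)
```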

可以理解的是,在现有方案中,因采用高频相机进行数据采集,对不同时刻IMU的角速度偏差和加速度偏差均可认定为相同。而本发明采用低频相机采集数据,不同时刻IMU的角速度偏差和加速度偏差不能直接认定为相同,导致未知数增加。为了减少标定过程中的未知数,本发明在进行数据采集时对采集过程进行约束,使得SIFT特征控制场的尺度、重力方向已知,再通过分步骤对不同待估计值求解的方法,实现对所有值的标定。It is understandable that existing schemes, using high-frequency cameras for data collection, may treat the IMU angular-velocity and acceleration biases at different instants as identical. The present invention collects data with a low-frequency camera, so these biases cannot directly be taken as identical at different instants, which increases the number of unknowns. To reduce the unknowns in the calibration, the present invention constrains the acquisition procedure so that the scale and gravity direction of the SIFT feature control field are known, and then solves for the different quantities step by step, thereby calibrating all of them.

本实施例的机载面阵相机时空标定方法通过在预先设定尺度且重力加速度方向已知的控制场中进行覆盖成像,得到SIFT特征控制点,再结合控制场采集标定数据,标定数据包括多帧面阵影像和IMU数据;随后分别从每帧面阵影像中提取各帧面阵影像成像时相机的位姿信息,并对各帧面阵影像间的IMU数据进行预积分处理,得到面阵影像的旋转、速度和位移的状态描述;最后基于位姿信息和旋转、速度和位移的状态描述对相机和IMU间的角元素、时间偏差、线元素进行标定。预先构建的大尺度SIFT特征控制场具有重力加速度方向已知、尺度已知的特点,从而可保证航空影像清晰成像质量,标定过程中不再需要对重力加速度方向进行估计,从而减少标定过程中需要估计的参数量,提高标定的精度。The airborne area array camera spatiotemporal calibration method of this embodiment performs coverage imaging in a control field of preset scale and known gravity direction to obtain SIFT feature control points, then collects calibration data (multiple area array image frames and IMU data) in combination with the control field; it next extracts the camera pose at each frame's imaging instant and pre-integrates the IMU data between frames to obtain the rotation, velocity and displacement state descriptions; finally, it calibrates the angular elements, time offset and linear elements between the camera and the IMU from the pose information and these state descriptions. The pre-built large-scale SIFT feature control field, with its known gravity direction and known scale, guarantees sharp aerial imaging quality and removes the need to estimate the gravity direction during calibration, reducing the number of parameters to estimate and thereby improving calibration accuracy.

图4是本发明实施例的机载面阵相机时空标定装置的功能模块示意图。如图4所示,该机载面阵相机时空标定装置20包括三维重构模块21、采集模块22、位姿估计模块23、预积分处理模块24和标定模块25。Fig. 4 is a functional module diagram of an airborne area array camera spatiotemporal calibration device according to an embodiment of the present invention. As shown in Fig. 4, the airborne area array camera spatiotemporal calibration device 20 includes a 3D reconstruction module 21, an acquisition module 22, a pose estimation module 23, a pre-integration processing module 24 and a calibration module 25.

三维重构模块21,用于在预先构建的控制场中进行覆盖成像,并基于所成影像中提取到的第一SIFT特征进行三维重构,得到SIFT特征控制点,控制场的尺度预先设定且重力加速度方向预先获知;The three-dimensional reconstruction module 21 is used to perform coverage imaging in a pre-constructed control field and perform three-dimensional reconstruction based on the first SIFT feature extracted from the image to obtain SIFT feature control points. The scale of the control field is pre-set and the direction of gravity acceleration is pre-known.

采集模块22,用于基于控制场,结合预设方式进行运动成像,并采集标定数据,标定数据包括多帧面阵影像和IMU数据;The acquisition module 22 is used to perform motion imaging based on the control field in combination with a preset method, and to acquire calibration data, the calibration data including multi-frame array images and IMU data;

位姿估计模块23,用于分别从每帧面阵影像中提取第二SIFT特征并与SIFT特征控制点进行匹配、且将不同帧面阵影像间的第二SIFT特征进行匹配,并根据匹配结果得到各帧面阵影像成像时相机的位姿信息;The pose estimation module 23 is used to extract the second SIFT features from each frame of the array image and match them with the SIFT feature control points, and to match the second SIFT features between different frames of the array image, and to obtain the pose information of the camera when each frame of the array image is imaged according to the matching results;

预积分处理模块24,用于对各帧面阵影像间的IMU数据进行预积分处理,得到面阵影像的旋转、速度和位移的状态描述;The pre-integration processing module 24 is used to perform pre-integration processing on the IMU data between each frame of the array image to obtain a state description of the rotation, speed and displacement of the array image;

标定模块25,用于基于位姿信息和旋转的状态描述对相机和IMU间的角元素和时间偏差进行标定,且基于位姿信息、速度和位移的状态描述对相机和IMU间的线元素进行标定。The calibration module 25 is used to calibrate the angular elements and time deviations between the camera and the IMU based on the state description of the posture information and the rotation, and to calibrate the line elements between the camera and the IMU based on the state description of the posture information, the velocity and the displacement.

可选地,三维重构模块21执行在预先构建的控制场中进行覆盖成像,并基于所成影像中提取到的第一SIFT特征进行三维重构,得到SIFT特征控制点的操作,具体包括:Optionally, the three-dimensional reconstruction module 21 performs coverage imaging in a pre-constructed control field, and performs three-dimensional reconstruction based on the first SIFT feature extracted from the formed image to obtain an operation of SIFT feature control points, specifically including:

利用高分辨率定焦相机对控制场进行覆盖成像,得到控制场影像;Use a high-resolution fixed-focus camera to cover and image the control field to obtain an image of the control field;

将预先获取的控制场中的控制点坐标与控制场影像上控制点对应成像处的成像点坐标进行关联;Associating the coordinates of the control points in the control field acquired in advance with the coordinates of the imaging points at the imaging locations corresponding to the control points on the control field image;

基于控制点坐标和成像点坐标的关联关系,利用相机初始内参得到相机的初始外方位元素;Based on the correlation between the coordinates of the control points and the coordinates of the imaging points, the initial exterior orientation elements of the camera are obtained using the initial intrinsic parameters of the camera;

分别从每张控制场影像中提取第一SIFT特征;Extract the first SIFT feature from each control field image respectively;

将不同控制场影像间的第一SIFT特征进行匹配,且保留匹配后关联特征点数量满足预设数量阈值的控制场影像对;Matching the first SIFT features between different control field images, and retaining the control field image pairs whose number of associated feature points after matching meets a preset number threshold;

基于初始外方位元素、相机初始内参和控制场影像对中成功匹配的关联特征点进行三维重构,得到SIFT特征控制点。Based on the initial exterior orientation elements, the initial internal parameters of the camera and the successfully matched associated feature points in the control field image, three-dimensional reconstruction is performed to obtain the SIFT feature control points.

可选地,采集模块22执行基于控制场,结合预设方式进行运动成像,并采集标定数据的操作,具体包括:Optionally, the acquisition module 22 performs motion imaging based on the control field in combination with a preset method and acquires calibration data, specifically including:

在控制载荷设备航向朝向天空方向、成像方向朝向控制场的情况下,将载荷设备置于初始静置状态,采集至少第一预设数量的面阵影像且同时采集IMU数据;When the payload device is oriented toward the sky and the imaging direction is toward the control field, the payload device is placed in an initial static state, and at least a first preset number of array images and IMU data are collected simultaneously;

控制载荷设备分别绕自身X、Y、Z轴进行顺时针和逆时针旋转,且在绕每个轴旋转时采集第二预设数量的面阵影像且同时采集IMU数据;Control the payload device to rotate clockwise and counterclockwise around its own X, Y, and Z axes respectively, and collect a second preset number of array images and IMU data simultaneously when rotating around each axis;

将载荷设备置于结束静置状态,采集至少第一预设数量的面阵影像且同时采集IMU数据。The payload device is placed in an end-stationary state, and at least a first preset number of array images are collected and IMU data are collected simultaneously.

Optionally, the pose estimation module 23 performs the operations of respectively extracting second SIFT features from each frame of area array image and matching them with the SIFT feature control points, matching the second SIFT features between different frames of area array images, and obtaining the pose information of the camera at the moment each frame of area array image was captured according to the matching results, which specifically include:

Respectively extracting the second SIFT features from each frame of area array image;

Matching the second SIFT features with the SIFT feature control points to determine the exterior orientation elements of the images;

Matching the second SIFT features between different frames of area array images, and filtering the matched feature points by using the initial exterior orientation elements to obtain matched feature points;

Identifying the stationary area array images in the initial stationary state and the final stationary state according to the pixel displacements of the matched feature points between area array images;

Performing pose estimation on the non-stationary area array images based on bundle adjustment, finally obtaining the pose information of the camera at the moment each frame of area array image was captured.
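
The stationary-image identification described above can be sketched as a simple threshold test on inter-frame feature motion. The 0.5 px threshold and the synthetic tracks below are assumptions for illustration, not values from the patent.

```python
import numpy as np

def stationary_mask(tracks, thresh_px=0.5):
    """Classify consecutive frame pairs as stationary or moving.

    tracks : list of (pts_prev, pts_curr) pairs, each an (N, 2) array of
             matched feature coordinates between two consecutive frames.
    A pair counts as stationary when the median pixel displacement of its
    matched features stays below thresh_px (threshold chosen arbitrarily).
    """
    mask = []
    for prev, curr in tracks:
        disp = np.linalg.norm(curr - prev, axis=1)   # per-feature motion
        mask.append(bool(np.median(disp) < thresh_px))
    return mask

# Synthetic example: one jittering (static) pair, one 5 px translation.
rng = np.random.default_rng(0)
base = rng.uniform(0.0, 640.0, size=(50, 2))
static_pair = (base, base + rng.normal(0.0, 0.1, size=base.shape))
moving_pair = (base, base + np.array([5.0, 0.0]))
print(stationary_mask([static_pair, moving_pair]))  # → [True, False]
```

The median makes the test robust to a few outlier matches that survive the exterior-orientation filtering.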

Optionally, the pre-integration processing module 24 performs the operation of pre-integrating the IMU data between the frames of area array images to obtain state descriptions of the rotation, velocity and displacement of the area array images, which specifically includes:

Constructing a time deviation to establish the spatiotemporal synchronization state between the camera and the IMU;

Pre-integrating the IMU data between the frames of area array images based on the spatiotemporal synchronization state to obtain the state descriptions of the rotation, velocity and displacement of the area array images.
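
The pre-integration step can be sketched as follows: gyroscope and accelerometer samples between two image frames are accumulated into relative rotation, velocity and displacement increments. This is a bias-free, noise-free sketch of standard IMU pre-integration, not the patent's exact formulation (which also carries the time deviation through the accumulation).

```python
import numpy as np

def so3_exp(phi):
    """Rodrigues' formula: rotation vector -> rotation matrix."""
    theta = np.linalg.norm(phi)
    if theta < 1e-12:
        return np.eye(3)
    k = phi / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * K @ K

def preintegrate(gyro, accel, dt):
    """Accumulate IMU samples between two frames into increments
    dR (rotation), dv (velocity) and dp (displacement), all expressed
    in the body frame at the first image."""
    dR, dv, dp = np.eye(3), np.zeros(3), np.zeros(3)
    for w, a in zip(gyro, accel):
        dp = dp + dv * dt + 0.5 * (dR @ a) * dt ** 2
        dv = dv + (dR @ a) * dt
        dR = dR @ so3_exp(w * dt)
    return dR, dv, dp

# Sanity check: 1 s of constant 2 m/s^2 acceleration, no rotation.
N, dt = 200, 0.005
dR, dv, dp = preintegrate(np.zeros((N, 3)),
                          np.tile([0.0, 0.0, 2.0], (N, 1)), dt)
print(np.allclose(dv, [0, 0, 2.0]), np.allclose(dp, [0, 0, 1.0]))  # → True True
```

Because the increments depend only on the IMU samples, they can be reused when the calibration loop updates the time deviation or the bias terms.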

Optionally, the calibration module 25 performs the operation of calibrating the angle elements and the time deviation between the camera and the IMU based on the pose information and the rotation state description, which specifically includes:

Constructing a residual equation based on the rotation state description;

Initializing the angular velocity deviations at different moments;

Based on the angular velocity deviations at different moments, respectively performing angle element calibration, time deviation calibration and calibration of the angular velocity deviations at different moments by using the residual equation;

Judging whether the time deviation exceeds a preset time deviation threshold;

If not, confirming that calibration of the angle elements, the time deviation and the angular velocity deviations at different moments is completed;

If so, using the time deviation obtained by calibration to pre-integrate the IMU data again and then re-calibrating, and comparing the time deviation in the new calibration result with the preset time deviation threshold again, looping until the time deviation falls below the preset time deviation threshold.
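
The loop above — calibrate, correct the timestamps, re-pre-integrate, and repeat until the estimated time deviation drops below the threshold — has the shape of a fixed-point iteration. The sketch below shows only that control flow; `estimate_residual_offset` is a hypothetical stand-in for one round of pre-integration plus rotation-residual calibration, and the 25 ms offset and 90 % per-round convergence rate are made-up values.

```python
def calibrate_time_deviation(estimate_residual_offset, thresh=1e-4, max_iter=20):
    """Accumulate the per-round time deviation estimate until one round's
    estimate falls below the threshold (or the iteration cap is hit)."""
    total = 0.0
    for _ in range(max_iter):
        delta = estimate_residual_offset(total)  # one calibration round
        total += delta
        if abs(delta) < thresh:
            break
    return total

# Toy stand-in: each round recovers 90 % of the remaining offset.
true_offset = 0.025  # seconds, hypothetical camera-IMU time deviation
residual = lambda shift: 0.9 * (true_offset - shift)
est = calibrate_time_deviation(residual)
print(abs(est - true_offset) < 1e-3)  # → True
```

The iteration cap guards against a calibration round that fails to shrink the deviation, which would otherwise loop forever.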

Optionally, the calibration module 25 performs the operation of calibrating the line elements between the camera and the IMU based on the pose information and the velocity and displacement state descriptions, which specifically includes:

Re-pre-integrating the IMU data based on the calibrated angle elements, time deviation and angular velocity deviations at different moments to obtain new velocity and displacement state descriptions;

Initializing the gravitational acceleration deviations at different moments;

With the direction of gravitational acceleration of the control field known, respectively calibrating the line elements, the gravitational acceleration, the velocities and the acceleration deviations in combination with the gravitational acceleration direction, the gravitational acceleration deviations at different moments and the new velocity and displacement state descriptions;

Judging whether the gravitational acceleration exceeds a preset acceleration threshold;

If not, confirming that calibration of the line elements, the gravitational acceleration, the velocities and the acceleration deviations is completed;

If so, using the acceleration deviations obtained by calibration to pre-integrate the IMU data again and then re-calibrating, and comparing the gravitational acceleration in the new calibration result with the preset acceleration threshold again, looping until the gravitational acceleration falls below the preset acceleration threshold.
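
The relation this line-element and gravity calibration enforces can be illustrated with the standard world-frame propagation of pre-integrated increments: under a known gravity direction, a stationary device must propagate back onto itself. The sketch below assumes a level attitude and analytic increments — a consistency check under those assumptions, not the patent's solver.

```python
import numpy as np

g = np.array([0.0, 0.0, -9.81])  # gravity in the control field frame (direction known)

def propagate(p_i, v_i, R_i, dP, dV, dt):
    """World-frame state update from pre-integrated increments:
        p_j = p_i + v_i*dt + 0.5*g*dt^2 + R_i @ dP
        v_j = v_i + g*dt + R_i @ dV
    The calibration adjusts line elements, gravity, velocities and
    accelerometer deviations so the camera poses satisfy these equations."""
    p_j = p_i + v_i * dt + 0.5 * g * dt ** 2 + R_i @ dP
    v_j = v_i + g * dt + R_i @ dV
    return p_j, v_j

# Stationary, level device: the accelerometer senses the specific force -g,
# so the analytic increments over an interval T are dV = -g*T, dP = -0.5*g*T^2.
T = 0.5
dV, dP = -g * T, -0.5 * g * T ** 2
p_j, v_j = propagate(np.zeros(3), np.zeros(3), np.eye(3), dP, dV, T)
print(np.allclose(p_j, 0.0), np.allclose(v_j, 0.0))  # → True True
```

Any residual motion in `p_j` or `v_j` for the stationary frames would indicate an error in the line elements, the gravity vector or the accelerometer deviations, which is exactly what the iterative calibration drives down.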

For other details of the technical solutions implemented by each module of the airborne area array camera spatiotemporal calibration apparatus in the above embodiment, reference may be made to the description of the airborne area array camera spatiotemporal calibration method in the above embodiments, which will not be repeated here.

It should be noted that the embodiments in this specification are described in a progressive manner; each embodiment focuses on its differences from the other embodiments, and the same or similar parts of the embodiments may be referred to one another. Since the apparatus embodiments are substantially similar to the method embodiments, their description is relatively brief, and the relevant parts may be found in the corresponding description of the method embodiments.

Referring to Fig. 4, Fig. 4 is a schematic structural diagram of a computer device according to an embodiment of the present invention. As shown in Fig. 4, the computer device 30 includes a processor 31 and a memory 32 coupled to the processor 31. The memory 32 stores program instructions which, when executed by the processor 31, cause the processor 31 to perform the steps of the airborne area array camera spatiotemporal calibration method described in any of the above embodiments.

The processor 31 may also be referred to as a CPU (Central Processing Unit). The processor 31 may be an integrated circuit chip with signal processing capability. The processor 31 may also be a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or a discrete hardware component. The general-purpose processor may be a microprocessor, or any conventional processor.

Referring to Fig. 5, Fig. 5 is a schematic structural diagram of a storage medium according to an embodiment of the present invention. The storage medium of the embodiment of the present invention stores program instructions 41 capable of implementing the above airborne area array camera spatiotemporal calibration method. The program instructions 41 may be stored in the storage medium in the form of a software product, comprising a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) or a processor to execute all or some of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes various media capable of storing program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk or an optical disc, or computer devices such as computers, servers, mobile phones and tablets.

In the several embodiments provided in the present application, it should be understood that the disclosed computer device, apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative: the division into units is only a division by logical function, and other divisions are possible in actual implementation; for instance, multiple units or components may be combined or integrated into another system, or some features may be omitted or not executed. In addition, the mutual coupling, direct coupling or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, apparatuses or units, and may be electrical, mechanical or in other forms.

In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, may exist physically as separate units, or two or more units may be integrated into one unit. The integrated unit may be implemented in the form of hardware or in the form of a software functional unit. The above is only an embodiment of the present application and does not thereby limit the patent scope of the present application; any equivalent structural or process transformation made using the contents of the specification and drawings of the present application, applied directly or indirectly in other related technical fields, is likewise included within the scope of patent protection of the present application.

Claims (7)

1. A space-time calibration method for an airborne area array camera, characterized by comprising the following steps:
Performing coverage imaging in a pre-constructed control field, and performing three-dimensional reconstruction based on first SIFT features extracted from the formed images to obtain SIFT feature control points, wherein the scale of the control field is preset and the direction of gravitational acceleration is known in advance;
Based on the control field, performing motion imaging in combination with a preset mode, and acquiring calibration data, wherein the calibration data comprise multi-frame area array images and IMU data;
Respectively extracting second SIFT features from each frame of area array image, matching the second SIFT features with the SIFT feature control points, matching the second SIFT features between different frames of area array images, and obtaining pose information of a camera when each frame of area array image is imaged according to the matching results;
Performing pre-integration processing on the IMU data between the frames of area array images to obtain state descriptions of the rotation, velocity and displacement between adjacent area array images;
Calibrating angle elements and a time deviation between the camera and an IMU based on the pose information and the rotation state description, and calibrating line elements between the camera and the IMU based on the pose information and the velocity and displacement state descriptions;
wherein performing coverage imaging in the pre-constructed control field and performing three-dimensional reconstruction based on the first SIFT features extracted from the formed images to obtain the SIFT feature control points comprises:
performing coverage imaging on the control field by using a high-resolution fixed-focus camera to obtain control field images;
associating the control point coordinates in the control field obtained in advance with the imaging point coordinates of the imaging positions corresponding to the control points on the control field images;
obtaining initial exterior orientation elements of the camera by using initial camera intrinsic parameters based on the association between the control point coordinates and the imaging point coordinates;
extracting the first SIFT features from each control field image respectively;
matching the first SIFT features between different control field images, and retaining the control field image pairs in which the number of associated feature points after matching meets a preset quantity threshold;
performing three-dimensional reconstruction based on the initial exterior orientation elements, the initial camera intrinsic parameters and the successfully matched associated feature points in the control field image pairs to obtain the SIFT feature control points;
wherein, based on the control field, performing motion imaging in combination with the preset mode and acquiring the calibration data comprises:
with the heading of a payload device controlled to point toward the sky and the imaging direction toward the control field, placing the payload device in an initial stationary state, and collecting at least a first preset number of area array images while simultaneously collecting IMU data;
controlling the payload device to rotate clockwise and counterclockwise about its own X, Y and Z axes respectively, and collecting a second preset number of area array images while simultaneously collecting IMU data during the rotation about each axis;
placing the payload device in a final stationary state, and collecting at least a first preset number of area array images while simultaneously collecting IMU data;
wherein respectively extracting the second SIFT features from each frame of area array image, matching the second SIFT features with the SIFT feature control points, matching the second SIFT features between different frames of area array images, and obtaining the pose information of the camera when each frame of area array image is imaged according to the matching results comprises:
respectively extracting the second SIFT features from each frame of area array image;
matching the second SIFT features with the SIFT feature control points to determine exterior orientation elements of the images;
matching the second SIFT features between different frames of area array images, and filtering the matched feature points by using the initial exterior orientation elements to obtain matched feature points;
identifying the stationary area array images in the initial stationary state and the final stationary state according to the pixel displacements of the matched feature points between area array images;
and performing pose estimation on the non-stationary area array images based on bundle adjustment, finally obtaining the pose information of the camera when each frame of area array image is imaged.
2. The space-time calibration method for an airborne area array camera according to claim 1, wherein performing pre-integration processing on the IMU data between the frames of area array images to obtain the state descriptions of the rotation, velocity and displacement between adjacent area array images comprises:
constructing a time offset to establish a space-time synchronization state between the camera and the IMU;
and performing pre-integration processing on the IMU data between the frames of area array images based on the space-time synchronization state to obtain the state descriptions of the rotation, velocity and displacement between adjacent area array images.
3. The space-time calibration method for an airborne area array camera according to claim 2, wherein calibrating the angle elements and the time deviation between the camera and the IMU based on the pose information and the rotation state description comprises:
constructing a residual equation based on the rotation state description;
initializing angular velocity deviations at different moments;
based on the angular velocity deviations at different moments, respectively performing angle element calibration, time deviation calibration and calibration of the angular velocity deviations at different moments by using the residual equation;
judging whether the time deviation exceeds a preset time deviation threshold;
if not, confirming that calibration of the angle elements, the time deviation and the angular velocity deviations at different moments is completed;
if so, using the time deviation obtained by calibration to perform pre-integration processing on the IMU data again and then re-calibrating, and comparing the time deviation in the new calibration result with the preset time deviation threshold again, until the time deviation falls below the preset time deviation threshold.
4. The space-time calibration method for an airborne area array camera according to claim 3, wherein calibrating the line elements between the camera and the IMU based on the pose information and the velocity and displacement state descriptions comprises:
re-pre-integrating the IMU data based on the calibrated angle elements, time deviation and angular velocity deviations at different moments to obtain new velocity and displacement state descriptions;
initializing gravitational acceleration deviations at different moments;
with the direction of gravitational acceleration of the control field known, respectively calibrating the line elements, the gravitational acceleration, the velocities and the acceleration deviations in combination with the gravitational acceleration direction, the gravitational acceleration deviations at different moments and the new velocity and displacement state descriptions;
judging whether the gravitational acceleration exceeds a preset acceleration threshold;
if not, confirming that calibration of the line elements, the gravitational acceleration, the velocities and the acceleration deviations is completed;
if so, using the acceleration deviations obtained by calibration to perform pre-integration processing on the IMU data again and then re-calibrating, and comparing the gravitational acceleration in the new calibration result with the preset acceleration threshold again, until the gravitational acceleration falls below the preset acceleration threshold.
5. An airborne area array camera space-time calibration apparatus, characterized in that it adopts the space-time calibration method for an airborne area array camera according to any one of claims 1-4, and comprises:
a three-dimensional reconstruction module, configured to perform coverage imaging in a pre-constructed control field and perform three-dimensional reconstruction based on first SIFT features extracted from the formed images to obtain SIFT feature control points, wherein the scale of the control field is preset and the direction of gravitational acceleration is known in advance;
an acquisition module, configured to perform motion imaging in combination with a preset mode based on the control field and acquire calibration data, wherein the calibration data comprise multi-frame area array images and IMU data;
a pose estimation module, configured to respectively extract second SIFT features from each frame of area array image, match the second SIFT features with the SIFT feature control points, match the second SIFT features between different frames of area array images, and obtain pose information of a camera when each frame of area array image is imaged according to the matching results;
a pre-integration processing module, configured to perform pre-integration processing on the IMU data between the frames of area array images to obtain state descriptions of the rotation, velocity and displacement between adjacent area array images;
and a calibration module, configured to calibrate angle elements and a time deviation between the camera and an IMU based on the pose information and the rotation state description, and calibrate line elements between the camera and the IMU based on the pose information and the velocity and displacement state descriptions.
6. A computer device, comprising a processor and a memory coupled to the processor, wherein the memory stores program instructions which, when executed by the processor, cause the processor to perform the steps of the space-time calibration method for an airborne area array camera according to any one of claims 1-4.
7. A storage medium, storing program instructions for implementing the space-time calibration method for an airborne area array camera according to any one of claims 1-4.
CN202410341544.8A 2024-03-25 2024-03-25 Airborne area array camera temporal and spatial calibration method, device, equipment and storage medium Active CN117974809B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410341544.8A CN117974809B (en) 2024-03-25 2024-03-25 Airborne area array camera temporal and spatial calibration method, device, equipment and storage medium

Publications (2)

Publication Number Publication Date
CN117974809A (en) 2024-05-03
CN117974809B (en) 2024-06-18

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant