
CN119090732B - Wavefront distortion detection method based on continuous image sequence similarity in high-speed flow field - Google Patents

Wavefront distortion detection method based on continuous image sequence similarity in high-speed flow field

Info

Publication number
CN119090732B
CN119090732B (application CN202410905585.5A)
Authority
CN
China
Prior art keywords
centroid
image
wavefront
sub
vector
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410905585.5A
Other languages
Chinese (zh)
Other versions
CN119090732A (en)
Inventor
杨忠明
邓宇轩
何伟林
刘兆军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University filed Critical Shandong University
Priority to CN202410905585.5A priority Critical patent/CN119090732B/en
Publication of CN119090732A publication Critical patent/CN119090732A/en
Application granted granted Critical
Publication of CN119090732B publication Critical patent/CN119090732B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J9/00 Measuring optical phase difference; Determining degree of coherence; Measuring optical wavelength
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/60 Analysis of geometric attributes
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/74 Image or video pattern matching; Proximity measures in feature spaces
    • G06V10/761 Proximity, similarity or dissimilarity measures
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01J MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J9/00 Measuring optical phase difference; Determining degree of coherence; Measuring optical wavelength
    • G01J2009/002 Wavefront phase distribution
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20036 Morphological image processing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Computing Systems (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Multimedia (AREA)
  • Health & Medical Sciences (AREA)
  • Geometry (AREA)
  • Testing Of Optical Devices Or Fibers (AREA)

Abstract

The invention relates to a wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field, belonging to the field of data processing. The method comprises: decomposing the acquired video data into an image sequence; preprocessing each image and computing the centroid of every light spot to obtain the centroid coordinate sequence of each image; taking the centroid coordinates of the first image as the calibration reference centroids while obtaining an initial restoration matrix; performing centroid matching between every two adjacent images to obtain a slope vector and correction parameters, and updating the offset reference centroid coordinate sequence; substituting the slope vector and correction parameters into a restoration equation to obtain the Zernike coefficients and complete the wavefront restoration; and repeating these steps to restore the wavefront over the whole video. The invention copes well with complex situations such as large tilt distortion and missing spot data, realizes wavefront detection and restoration, and is of great significance for improving the effective detection of complex distorted wavefronts by the Hartmann wavefront sensor.

Description

Wavefront distortion detection method based on continuous image sequence similarity in high-speed flow field
Technical Field
The invention provides a wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field, and belongs to the technical field of data processing of beam wavefront measurement.
Background
The Hartmann wavefront sensor is a sophisticated optical measurement device that uses a microlens array to divide an incident wavefront into a plurality of sub-wavefronts, and determines the wavefront shape by measuring the position variations of the sub-spots these sub-wavefronts form at the focal plane. The technology can monitor and correct wavefront distortion in real time in dynamic optical systems, with high precision and strong real-time performance. Its non-contact measurement avoids physical interference with the optical element, which makes the Hartmann wavefront sensor particularly important in optical element testing. In addition, owing to its strong adaptability, the Hartmann wavefront sensor is suitable for various wavelengths and different types of optical systems.
The origin of the Hartmann wavefront sensor can be traced back to the beginning of the 20th century, when it was first proposed by the German physicist Johann Karl Friedrich Hartmann. Hartmann published a paper on wavefront measurement in 1910 and described a method for measuring stellar wavefronts using a microlens array. These original Hartmann wavefront sensors were designed for astronomy, to measure and correct starlight wavefront distortions caused by atmospheric disturbances. Over time, Hartmann wavefront sensor designs and applications have been continually improved and expanded. In the 1970s, along with the development of computer technology, the data processing capability of wavefront sensors improved remarkably, and the Hartmann wavefront sensor came into wide use in optical testing and adaptive optical systems. By the 1990s, with the development of micro-electro-mechanical systems (MEMS) technology, miniaturization and integration of Hartmann wavefront sensors became possible, further promoting their application in the field of precision optical measurement.
The field of application of the Hartmann wavefront sensor is very broad, including but not limited to atmospheric turbulence correction in adaptive optics, vision correction in ophthalmology, and starlight wavefront distortion correction in astronomical telescope systems. These applications demonstrate the importance of Hartmann wavefront sensors in improving imaging quality and optimizing optical system performance. With the optimization of algorithms and the progress of hardware technology, the measurement speed and precision of the Hartmann wavefront sensor have improved remarkably, allowing it to better adapt to high-speed dynamic environments and complex optical measurement requirements.
At present, the development directions of the Hartmann wavefront sensor mainly include enlarging the dynamic range, improving the handling of missing data, and improving the detection precision. The dynamic range and the ability to handle missing data are particularly important when facing complex distorted wavefronts, but existing detection methods hardly combine the two. Hence a wavefront distortion detection algorithm based on continuous image sequence similarity is needed, one capable of combining a large dynamic range with the ability to handle missing data.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field, which can better cope with complex situations such as large inclination distortion, light spot data loss and the like, realizes the detection and recovery of the wavefront, and has important significance for improving the effective detection of a Hartmann wavefront sensor on distorted complex wavefront.
The technical scheme of the invention is as follows:
a wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field comprises the following steps:
(1) Decomposing video data acquired by a CCD into a picture sequence to obtain original data;
(2) Preprocessing each picture, computing the centroid of each light spot in each picture to obtain the centroid coordinate sequence of each picture, taking the centroid coordinates of the first picture as the calibration reference centroids, and simultaneously obtaining the initial restoration matrix E;
(3) Performing centroid matching by adopting two pictures adjacent front and back to obtain a slope vector and correction parameters, and updating an offset reference centroid coordinate sequence;
(4) Carrying the slope vector and the correction parameters into a restoration equation to calculate the Zernike coefficient, and completing wave front restoration;
(5) Matching the next picture of the sequence with the updated offset reference centroid coordinate sequence, i.e. repeating steps (3) to (4), to realize the wavefront restoration of the entire video data.
Preferably, step (2) comprises the steps of:
(2.1) performing picture binarization and morphological operation to obtain the approximate position of the light spot;
Morphological operations here refer to opening and closing operations; the approximate spot position is one of their results, since each independent region in the morphological sense corresponds to one spot and directly yields an approximate centroid for it;
(2.2) calculating the accurate position of each light spot with the centroid formula (a picture contains multiple light spots, and the centroid of the spot corresponding to each sub-aperture must be calculated):

Xc = Σ (i = 1..M) Ii·xi / Σ (i = 1..M) Ii
Yc = Σ (i = 1..M) Ii·yi / Σ (i = 1..M) Ii

where (Xc, Yc) are the spot centroid coordinates, M is the number of pixels in the sub-aperture region, Ii is the pixel value of the i-th pixel, and (xi, yi) are the pixel coordinates within the sub-aperture;
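Steps (2.1)-(2.2) can be sketched in a few lines of Python. This is an illustrative sketch, not the patent's exact preprocessing: the function name, the threshold choice, and the use of scipy.ndimage for labelling are assumptions.

```python
import numpy as np
from scipy import ndimage

def spot_centroids(img, thresh_ratio=0.5):
    """Binarize a Hartmann spot image, label each connected region
    (one region per sub-aperture spot), and return the intensity-weighted
    centroid of each region per the centroid formula of step (2.2)."""
    mask = img > thresh_ratio * img.max()   # binarization (threshold is an assumption)
    mask = ndimage.binary_opening(mask)     # morphological opening removes isolated noise
    labels, n = ndimage.label(mask)         # each spot = one labelled region
    centroids = []
    for k in range(1, n + 1):
        ys, xs = np.nonzero(labels == k)    # pixel coordinates in this sub-aperture
        I = img[ys, xs].astype(float)       # pixel values I_i
        Xc = (I * xs).sum() / I.sum()       # Xc = sum(I_i * x_i) / sum(I_i)
        Yc = (I * ys).sum() / I.sum()       # Yc = sum(I_i * y_i) / sum(I_i)
        centroids.append((Xc, Yc))
    return centroids
```

Applied to every frame of the decomposed video, this yields the centroid coordinate sequence of each picture; the first frame's result serves as the calibration reference centroids.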
(2.3) comparing the centroid coordinates of the following picture with the calibration reference centroid coordinates to obtain the slope vector G, which contains the horizontal spot offset Gx and the vertical spot offset Gy of each sub-aperture:

Gx = (Xt − Xc)/f
Gy = (Yt − Yc)/f

where (Xt, Yt) are the centroid coordinates obtained from the following picture and f is the focal length of the sub-aperture;
(2.4) using Φ(x, y) to represent the incident wavefront phase, the wavefront phase can be expanded in Zernike polynomials:

Φ(x, y) = Σ (m = 1..M) am·Zm(x, y)

where Zm(x, y) is the m-th Zernike polynomial and am is its coefficient; the m = 1 term is the mean of the wavefront phase and is usually neglected, i.e. set to 0;
Differentiating both sides of the Zernike expansion, the wavefront restoration equation is expressed in matrix form as:
G=EA
where G denotes the slope vector of size 2k × 1, A = {a2 a3 … aM} denotes the coefficient vector, and E denotes the restoration matrix of size 2k × (M−1). Each term of the restoration matrix is the average wavefront slope over the corresponding sub-aperture when the corresponding Zernike coefficient equals 1, and k is the number of sub-apertures; (ZxM)k denotes the average horizontal wavefront slope at the k-th sub-aperture of the M-th Zernike polynomial, and (ZyM)k the average vertical wavefront slope at the k-th sub-aperture of the M-th Zernike polynomial;
(2.5) Computing the generalized inverse matrix E+ of E yields the coefficient vector A, which is substituted into the Zernike polynomials to calculate the wavefront phase Φ(x, y).
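As a numerical sketch of the modal restoration in steps (2.4)-(2.5), the snippet below solves G = EA through the generalized inverse. The random matrix E is a stand-in assumption: the real restoration matrix depends on the lenslet geometry and the chosen Zernike modes, neither of which is specified numerically here.

```python
import numpy as np

# k sub-apertures, Zernike terms 2..M (the m = 1 piston term is dropped).
rng = np.random.default_rng(0)
k, M = 16, 10
E = rng.normal(size=(2 * k, M - 1))   # stand-in for the 2k x (M-1) restoration matrix
A_true = rng.normal(size=M - 1)       # "true" Zernike coefficients a_2..a_M
G = E @ A_true                        # slope vector: stacked Gx, Gy measurements

A = np.linalg.pinv(E) @ G             # A = E+ G via the generalized inverse
```

Because the number of slope measurements (2k) exceeds the number of coefficients (M−1), the generalized inverse gives the least-squares solution, which here recovers A exactly since G was generated from A_true without noise.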
Preferably, step (3) comprises the steps of:
(3.1) adopting the centroids of the previous picture as the offset reference centroids and matching them against each centroid of the following picture (every centroid of the previous picture is compared pairwise with every centroid of the following picture), where the initial value of the offset reference centroids is the calibration reference centroids of the first picture; the coordinate differences are expressed as:
Dx=XH-XP
Dy=YH-YP
where (XH, YH) denotes the centroid coordinates of the later picture and (XP, YP) the offset reference centroid coordinates, i.e. the centroids used as references in the matching step;
the spot distance Dist is then:

Dist = √(Dx² + Dy²)
(3.2) if Dist < DSpots/3, the two light spots are considered to belong to the same sub-aperture, and the spot offset for the matched later-frame centroid is calculated against the calibration reference centroid corresponding to that offset reference centroid; here DSpots denotes the average spot distance of the calibration reference centroids and is a fixed value;
the slope vector is calculated, and the centroid coordinates of the later picture replace the offset reference centroid at that sub-aperture, i.e. the centroid of the later picture is adopted as the new offset reference centroid;
(3.3) if Dist ≥ DSpots/3, the offset reference centroid is considered to have no match among the centroids of the following picture; the data of that sub-aperture is regarded as missing, and its sub-aperture index is marked, i.e. the correction parameter.
For a frame undergoing restoration calculation, the offset reference centroids are the result propagated from the calibration of the first frame. Because the light spots flicker, not every reference centroid can find a corresponding centroid in the frame being restored; in that case the old reference centroid takes the place of the missing one and is carried forward as the offset reference centroid for the next restored frame, while reference centroids that did match are updated with the matched later-frame centroids.
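The matching rule of step (3) can be sketched as follows. This is a simplified greedy nearest-neighbour sketch under stated assumptions: the function name is hypothetical, inputs are arrays of (x, y) rows, and the returned boolean flags stand in for the patent's correction parameter (they mark which sub-apertures have data).

```python
import numpy as np

def match_centroids(ref, new, dspots):
    """Pair each offset reference centroid with the closest centroid of the
    next frame; pairs farther apart than DSpots/3 are flagged as missing
    sub-apertures and keep their old reference centroid."""
    ref = np.asarray(ref, dtype=float)
    new = np.asarray(new, dtype=float)
    updated = ref.copy()
    matched = np.zeros(len(ref), dtype=bool)
    for i, (xp, yp) in enumerate(ref):
        if len(new) == 0:
            break
        d = np.hypot(new[:, 0] - xp, new[:, 1] - yp)  # Dist to every candidate
        j = d.argmin()
        if d[j] < dspots / 3:        # same sub-aperture: update reference
            updated[i] = new[j]
            matched[i] = True
        # else: sub-aperture data missing; old reference centroid is kept
    return updated, matched
```

Note the greedy per-reference search does not forbid two references claiming the same new centroid; with the DSpots/3 threshold and well-separated sub-apertures this does not arise, but a production implementation might enforce one-to-one assignment.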
Preferably, step (4) comprises the steps of:
(4.1) the slope vector G calculated in step (3) contains only successfully matched centroids; for the unmatched ones, a vector I consisting of the indices of the successfully matched centroids is kept, and the corresponding data are deleted from the restoration matrix E according to I to obtain E';
(4.2) by solving the equation
G=E′A
Obtaining a coefficient vector A;
(4.3) substituting the obtained coefficient vector A into the wavefront phase expression

Φ(x, y) = Σ (m = 1..M) am·Zm(x, y)

yields the recovered wavefront phase.
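The row deletion of step (4.1) followed by the solve of step (4.2) can be sketched as below. The row layout (all Gx rows first, then all Gy rows, matching the 2k × 1 slope vector) is an assumption about the ordering; the function name is hypothetical.

```python
import numpy as np

def solve_with_missing(E, G, matched):
    """Remove the Gx and Gy rows of every unmatched sub-aperture from both
    the restoration matrix E (2k x (M-1)) and the slope vector G (length 2k),
    then solve G' = E' A by the generalized inverse. Entries of G at missing
    sub-apertures are simply ignored."""
    k = len(matched)
    keep = np.flatnonzero(matched)
    rows = np.concatenate([keep, keep + k])  # Gx rows, then Gy rows, of matched apertures
    E_p = E[rows]                            # E': 2(k - missing) x (M-1)
    G_p = G[rows]
    return np.linalg.pinv(E_p) @ G_p         # Zernike coefficient vector A
```

With one sub-aperture missing, E' has size 2(k−1) × (M−1) and G' has size 2(k−1) × 1, exactly as described in the embodiment, and the remaining valid data still determine the Zernike coefficients.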
Details that are not exhaustively described in the present invention can be found in the prior art.
The beneficial effects of the invention are as follows:
1. The invention realizes dynamic wavefront data processing under large-tilt complex distortion and dynamic wavefront detection under such conditions; the algorithm enlarges the detection dynamic range of the Hartmann sensor.
2. The invention solves the wavefront detection problem under large-tilt complex distortion with missing light spots, improving the processing performance of the Hartmann wavefront sensor.
3. The proposed method has good extensibility: since beam quality and stability indices participate in the calculation at run time, corresponding functions can be extended as required.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application.
FIG. 1 is a general flow chart of a wavefront aberration detection method based on continuous image sequence similarity in a high-speed flow field;
FIG. 2 is a centroid matching flow chart in step (3) of the present invention;
FIG. 3 is a schematic diagram of a centroid matching process, where (a) is an offset reference centroid and (b) is a centroid matching process;
FIG. 4 is a calibrated reference centroid distribution;
FIG. 5 is an offset centroid distribution;
FIG. 6 is a distribution of Zernike coefficients calculated according to FIGS. 4 and 5;
fig. 7 is a diagram of wavefront restoration calculated according to fig. 4 and 5.
Detailed Description
For a better understanding of the technical solutions in this specification, the following clearly and completely describes the technical solutions in the embodiments of the invention with reference to the drawings; the invention is not limited to these embodiments, and whatever is not fully described herein follows conventional technology in the art.
Example 1
A wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field, as shown in figure 1, comprises the following steps:
(1) Decomposing video data acquired by a CCD into a picture sequence to obtain original data;
(2) Preprocessing each picture, computing the centroid of each light spot in each picture to obtain the centroid coordinate sequence of each picture, taking the centroid coordinates of the first picture as the calibration reference centroids, and simultaneously obtaining the initial restoration matrix E;
(3) Performing centroid matching by adopting two pictures adjacent front and back to obtain a slope vector and correction parameters, and updating an offset reference centroid coordinate sequence;
(4) Carrying the slope vector and the correction parameters into a restoration equation to calculate the Zernike coefficient, and completing wave front restoration;
(5) Matching the next picture of the sequence with the updated offset reference centroid coordinate sequence, i.e. repeating steps (3) to (4), to realize the wavefront restoration of the entire video data.
Example 2
A wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field, as in embodiment 1, except that the step (2) includes the steps of:
(2.1) performing picture binarization and morphological operation to obtain the approximate position of the light spot;
Morphological operations here refer to opening and closing operations; the approximate spot position is one of their results, since each independent region in the morphological sense corresponds to one spot and directly yields an approximate centroid for it;
(2.2) calculating the accurate position of each light spot with the centroid formula (a picture contains multiple light spots, and the centroid of the spot corresponding to each sub-aperture must be calculated):

Xc = Σ (i = 1..M) Ii·xi / Σ (i = 1..M) Ii
Yc = Σ (i = 1..M) Ii·yi / Σ (i = 1..M) Ii

where (Xc, Yc) are the spot centroid coordinates, M is the number of pixels in the sub-aperture region, Ii is the pixel value of the i-th pixel, and (xi, yi) are the pixel coordinates within the sub-aperture;
(2.3) comparing the centroid coordinates of the following picture with the calibration reference centroid coordinates to obtain the slope vector G, which contains the horizontal spot offset Gx and the vertical spot offset Gy of each sub-aperture:

Gx = (Xt − Xc)/f
Gy = (Yt − Yc)/f

where (Xt, Yt) are the centroid coordinates obtained from the following picture and f is the focal length of the sub-aperture;
(2.4) using Φ(x, y) to represent the incident wavefront phase, the wavefront phase can be expanded in Zernike polynomials:

Φ(x, y) = Σ (m = 1..M) am·Zm(x, y)

where Zm(x, y) is the m-th Zernike polynomial and am is its coefficient; the m = 1 term is the mean of the wavefront phase and is usually neglected, i.e. set to 0;
Differentiating both sides of the Zernike expansion, the wavefront restoration equation is expressed in matrix form as:
G=EA
where G denotes the slope vector of size 2k × 1, A = {a2 a3 … aM} denotes the coefficient vector, and E denotes the restoration matrix of size 2k × (M−1). Each term of the restoration matrix is the average wavefront slope over the corresponding sub-aperture when the corresponding Zernike coefficient equals 1, and k is the number of sub-apertures; (ZxM)k denotes the average horizontal wavefront slope at the k-th sub-aperture of the M-th Zernike polynomial, and (ZyM)k the average vertical wavefront slope at the k-th sub-aperture of the M-th Zernike polynomial;
(2.5) Computing the generalized inverse matrix E+ of E yields the coefficient vector A, which is substituted into the Zernike polynomials to calculate the wavefront phase Φ(x, y).
Example 3
A wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field, as described in embodiment 2, except that the step (3) includes the steps of:
(3.1) adopting the centroids of the previous picture as the offset reference centroids and matching them against each centroid of the following picture (every centroid of the previous picture is compared pairwise with every centroid of the following picture), where the initial value of the offset reference centroids is the calibration reference centroids of the first picture; the coordinate differences are expressed as:
Dx=XH-XP
Dy=YH-YP
where (XH, YH) denotes the centroid coordinates of the later picture and (XP, YP) the offset reference centroid coordinates, i.e. the centroids used as references in the matching step;
the spot distance Dist is then:

Dist = √(Dx² + Dy²)
(3.2) if Dist < DSpots/3, the two light spots are considered to belong to the same sub-aperture, and the spot offset for the matched later-frame centroid is calculated against the calibration reference centroid corresponding to that offset reference centroid; here DSpots denotes the average spot distance of the calibration reference centroids and is a fixed value;
the slope vector is calculated, and the centroid coordinates of the later picture replace the offset reference centroid at that sub-aperture, i.e. the centroid of the later picture is adopted as the new offset reference centroid;
(3.3) if Dist ≥ DSpots/3, the offset reference centroid is considered to have no match among the centroids of the following picture; the data of that sub-aperture is regarded as missing, and its sub-aperture index is marked, i.e. the correction parameter.
For a frame undergoing restoration calculation, the offset reference centroids are the result propagated from the calibration of the first frame. Because the light spots flicker, not every reference centroid can find a corresponding centroid in the frame being restored; in that case the old reference centroid takes the place of the missing one and is carried forward as the offset reference centroid for the next restored frame, while reference centroids that did match are updated with the matched later-frame centroids.
Example 4
A wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field, as in embodiment 3, except that the step (4) includes the steps of:
(4.1) the slope vector G calculated in step (3) contains only successfully matched centroids; for the unmatched ones, a vector I consisting of the indices of the successfully matched centroids is kept, and the corresponding data are deleted from the restoration matrix E according to I to obtain E';
For example, for a frame subjected to restoration calculation, the first spot data of the frame is found to be missing through centroid matching, and then the corresponding restoration matrix removes the data of the first sub-aperture:
Its size becomes 2(k−1) × (M−1), and the size of G becomes 2(k−1) × 1.
(4.2) by solving the equation
G=E′A
Obtaining a coefficient vector A;
(4.3) substituting the obtained coefficient vector A into the wavefront phase expression

Φ(x, y) = Σ (m = 1..M) am·Zm(x, y)

yields the recovered wavefront phase.
Example 5
A wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field, following the flow chart of FIG. 1: the dynamic spot-distribution video is first decomposed into a picture sequence, each picture recording a different spot position distribution; preprocessing yields the spot centroid coordinate sequence of each picture, and the centroids of the first picture serve as the calibration reference centroids from which the initial restoration matrix is calculated.
Centroid matching is then performed; its flow chart is shown in FIG. 2 and the process is illustrated in FIG. 3, where green dots represent the offset reference centroid distribution (the actual case is shown in FIG. 4), red dots represent the offset centroid distribution (the actual case is shown in FIG. 5), the black areas mark the sub-aperture ranges, and sub-apertures 8 and 11 illustrate the missing-spot case. The matching finally yields the slope vector calculated from the matched spots and the correction parameters of the unmatched spots, and the offset reference centroids are updated according to the matching result for matching the offset centroids of the next frame.
The obtained slope vector and correction parameters are substituted into the restoration equation to obtain the Zernike coefficients, which are used to realize the wavefront restoration of the current offset-centroid frame (as shown in FIG. 6); repeating steps (3) to (4) realizes the wavefront restoration of the whole video data (as shown in FIG. 7).
1. Restoration matrix and slope vector:
Each term of the restoration matrix represents the average wavefront slope over the corresponding sub-aperture when the corresponding Zernike coefficient equals 1.
Here M is the selected number of Zernike terms, which can be changed as required; in this embodiment M is 21. As shown in FIG. 6, the abscissa is the Zernike term number and the ordinate the Zernike coefficient; k is the number of sub-apertures.
The slope vectors each represent the wavefront slope in a different direction over the corresponding sub-aperture.
When a spot is missing in a certain sub-aperture, elements in the restoration matrix and the slope vector are deleted, so that effective data of the rest sub-apertures can participate in calculating the Zernike coefficients in the correct matrix dimension.
2. Average spot distance DSpots
In the calibration reference spot distribution (green dots) shown in FIG. 3(a), the spots lie at the centers of the sub-apertures; if the sub-aperture size is a, the spot distance is also a. In practice, however, the reference beam that generates the calibration reference centroids is not a strict plane wave, and manufacturing cannot guarantee perfectly uniform sub-aperture sizes, so DSpots is obtained by calculation to reflect the effective sub-aperture size.
FIG. 3(b) shows the offset spot distribution (red dots); the distance between the red and green dots is exaggerated for display and is very small in practice. By judging whether the distance between a red dot and a green dot is less than one third of the sub-aperture size, i.e. DSpots/3, it can be determined whether the two spots lie in the same sub-aperture. Using the threshold DSpots/3 to find the best matching spot further prevents confusion from overlapping spots.
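The calculation of DSpots can be sketched as below. The patent only states that DSpots is the average spot distance of the calibration reference centroids; averaging each spot's nearest-neighbour distance, and the function name itself, are assumptions of this sketch.

```python
import numpy as np

def average_spot_distance(cal):
    """DSpots: mean nearest-neighbour distance among the calibration
    reference centroids, used as an effective sub-aperture size for the
    Dist < DSpots/3 matching test."""
    cal = np.asarray(cal, dtype=float)
    # all pairwise distances between calibration centroids
    d = np.hypot(cal[:, None, 0] - cal[None, :, 0],
                 cal[:, None, 1] - cal[None, :, 1])
    np.fill_diagonal(d, np.inf)       # ignore each spot's distance to itself
    return d.min(axis=1).mean()       # average over each spot's nearest neighbour
```

For a regular lenslet grid this reduces to the sub-aperture pitch a, while for a slightly irregular calibration pattern it gives a robust effective pitch.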
While the foregoing is directed to the preferred embodiments of the present invention, it will be appreciated by those skilled in the art that various modifications and adaptations can be made without departing from the principles of the present invention, and such modifications and adaptations are intended to be comprehended within the scope of the present invention.

Claims (4)

1. A wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field, characterized by comprising the following steps:
(1) decomposing the video data collected by the CCD into an image sequence to obtain the raw data;
(2) preprocessing each image, computing the centroid of every light spot in each image to obtain the centroid coordinate sequence of that image, taking the centroid coordinates of the first image as the calibration reference centroids, and obtaining the initial restoration matrix E;
(3) performing centroid matching between two adjacent images to obtain the slope vector and the correction parameters, and updating the offset reference centroid coordinate sequence;
(4) substituting the slope vector and the correction parameters into the restoration equation to obtain the Zernike coefficients and complete the wavefront restoration;
(5) matching the next image in the sequence against the updated offset reference centroid coordinate sequence, i.e. repeating steps (3)-(4), so that the wavefront is restored over the entire video data.

2. The wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field according to claim 1, characterized in that step (2) comprises the following steps:
(2.1) performing image binarization and morphological operations to obtain the approximate spot locations;
(2.2) computing the precise spot position with the centroid formula:
Xc = Σ(i=1..M) Ii·xi / Σ(i=1..M) Ii,  Yc = Σ(i=1..M) Ii·yi / Σ(i=1..M) Ii
where (Xc, Yc) are the spot centroid coordinates, M is the number of pixels in the sub-aperture region, Ii is the value of the i-th pixel, and (xi, yi) are the pixel coordinates within the sub-aperture; once the spot centroids of all sub-apertures in the first image have been computed, the calibration reference centroids are obtained;
(2.3) comparing the centroid coordinates of the subsequent image with the calibration reference centroid coordinates to obtain the slope vector G, which contains the horizontal spot offset Gx and the vertical spot offset Gy of each sub-aperture:
Gx = (Xt - Xc)/f,  Gy = (Yt - Yc)/f
where (Xt, Yt) are the centroid coordinates obtained from the subsequent image and f is the sub-aperture focal length;
(2.4) denoting the incident wavefront phase by Φ(x, y) and expanding it in Zernike polynomials:
Φ(x, y) = Σ(m=1..M) am·Zm(x, y)
where Zm(x, y) is the m-th Zernike polynomial and am is its coefficient; the m = 1 term is the mean wavefront phase and is ignored, i.e. recorded as 0;
differentiating both sides of the Zernike expansion, the wavefront restoration equation is written in matrix form as:
G = EA
where G is the slope vector, A = {a2 a3 … am} is the coefficient vector, and E is the restoration matrix composed of the average wavefront slopes of the Zernike polynomials over the sub-apertures, in which (ZxM)k denotes the horizontal average wavefront slope of the M-th Zernike polynomial at the k-th sub-aperture and (ZyM)k denotes the vertical average wavefront slope of the M-th Zernike polynomial at the k-th sub-aperture;
(2.5) computing the generalized inverse matrix E+ of E yields the coefficient vector A, which is substituted into the Zernike expansion to calculate the wavefront phase Φ(x, y).

3. The wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field according to claim 2, characterized in that step (3) comprises the following steps:
(3.1) taking the centroids of the previous image as the offset reference centroids, with the calibration reference centroids of the first image as their initial values, and matching them against every centroid of the subsequent image; the coordinate differences are:
Dx = XH - XP
Dy = YH - YP
where (XH, YH) are the centroid coordinates of the subsequent image and (XP, YP) are the offset reference centroid coordinates;
the spot distance Dist is:
Dist = sqrt(Dx^2 + Dy^2)
(3.2) if Dist < DSpots/3, the two light spots are regarded as belonging to the same sub-aperture, where DSpots is the average spot spacing of the calibration reference centroids; the slope vector is then computed, and the centroid of the subsequent image replaces the offset reference centroid at that sub-aperture, i.e. the centroid of the subsequent image becomes the new offset reference centroid;
(3.3) if Dist ≥ DSpots/3, the offset reference centroid is regarded as having failed to match any centroid of the subsequent image, the data of that sub-aperture is regarded as missing, and its sub-aperture index is marked, which constitutes the correction parameter.

4. The wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field according to claim 3, characterized in that step (4) comprises the following steps:
(4.1) the slope vector G computed in step (3) contains only the successfully matched centroids, while the indices of the unmatched centroids are retained in a vector I; the corresponding rows of the restoration matrix E are deleted according to I to obtain E′;
(4.2) solving the equation
G = E′A
yields the coefficient vector A;
(4.3) substituting the coefficient vector A into the wavefront phase expression:
Φ(x, y) = Σ(m=2..M) am·Zm(x, y)
gives the restored wavefront phase.
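The centroid formula of step (2.2) and the slope conversion of step (2.3) can be sketched in Python with NumPy. This is a minimal illustration, not the patent's implementation; the function names and the per-sub-aperture `patch` array are assumptions.

```python
import numpy as np

def spot_centroid(patch):
    """Intensity-weighted centroid of one sub-aperture patch (step 2.2):
    Xc = sum(Ii*xi)/sum(Ii), Yc = sum(Ii*yi)/sum(Ii)."""
    ys, xs = np.indices(patch.shape)   # pixel coordinate grids (yi, xi)
    total = patch.sum()
    return (patch * xs).sum() / total, (patch * ys).sum() / total

def slope_vector(ref_centroids, cur_centroids, f):
    """Per-sub-aperture slopes Gx = (Xt - Xref)/f, Gy = (Yt - Yref)/f,
    stacked as [Gx_1..Gx_K, Gy_1..Gy_K] (step 2.3)."""
    d = (np.asarray(cur_centroids, float) - np.asarray(ref_centroids, float)) / f
    return np.concatenate([d[:, 0], d[:, 1]])
```

Calling `spot_centroid` on each binarized sub-aperture region of the first frame yields the calibration reference centroids; `slope_vector` then converts centroid displacements of later frames into the slope vector G.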
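Steps (2.4)-(2.5) and (4.1)-(4.2) reduce to solving G = E′A with a generalized inverse after deleting the slope rows flagged by the vector I. The sketch below is illustrative only: the restoration matrix E is random stand-in data rather than real Zernike slope averages, and the missing sub-aperture indices are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
n_subaps, n_modes = 16, 6

# Stand-in restoration matrix: rows = x-slopes then y-slopes per sub-aperture,
# columns = Zernike modes (random numbers, NOT real Zernike derivatives).
E = rng.standard_normal((2 * n_subaps, n_modes))
a_true = rng.standard_normal(n_modes)   # simulated coefficient vector A
G = E @ a_true                          # simulated slope vector, G = E A

# Step (4.1): vector I of unmatched sub-apertures -> delete their slope rows.
I_missing = [3, 10]                     # hypothetical sub-aperture indices
keep = [r for r in range(2 * n_subaps) if r % n_subaps not in I_missing]
E_prime = E[keep]                       # E' with flagged rows removed
G_prime = G[keep]

# Step (4.2): A = E'^+ G' via the Moore-Penrose generalized inverse.
A = np.linalg.pinv(E_prime) @ G_prime
```

Because G here was generated from E, the recovered `A` matches `a_true` up to round-off; with real, noisy slope measurements the pseudoinverse instead gives the least-squares solution of G = E′A, which is then substituted into the Zernike expansion to obtain Φ(x, y).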
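The matching rule of claim 3 (accept a centroid pair when Dist < DSpots/3, otherwise flag the sub-aperture as missing) can be sketched as follows. The nearest-neighbour search and all names here are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def match_centroids(ref_centroids, cur_centroids, d_spots):
    """Match each offset-reference centroid against the centroids of the
    next frame (steps 3.1-3.3).  Returns the matched pairs, which become
    the new offset reference centroids, and the list of sub-aperture
    indices flagged as missing (the correction parameter)."""
    matched, missing = {}, []
    cur = np.asarray(cur_centroids, float)
    for k, (xp, yp) in enumerate(ref_centroids):
        dist = np.hypot(cur[:, 0] - xp, cur[:, 1] - yp)  # Dist per candidate
        j = int(dist.argmin())                            # closest spot
        if dist[j] < d_spots / 3:
            matched[k] = tuple(cur[j])   # same sub-aperture: update reference
        else:
            missing.append(k)            # data missing: mark sub-aperture index
    return matched, missing
```

A dropped spot simply ends up in `missing`, whose indices form the vector I used to delete rows of the restoration matrix in step (4.1).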
CN202410905585.5A 2024-07-08 2024-07-08 Wavefront distortion detection method based on continuous image sequence similarity in high-speed flow field Active CN119090732B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410905585.5A CN119090732B (en) 2024-07-08 2024-07-08 Wavefront distortion detection method based on continuous image sequence similarity in high-speed flow field


Publications (2)

Publication Number Publication Date
CN119090732A CN119090732A (en) 2024-12-06
CN119090732B true CN119090732B (en) 2025-07-29

Family

ID=93696443

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410905585.5A Active CN119090732B (en) 2024-07-08 2024-07-08 Wavefront distortion detection method based on continuous image sequence similarity in high-speed flow field

Country Status (1)

Country Link
CN (1) CN119090732B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3007610A1 (en) * 2013-06-10 2016-04-20 Essilor International (Compagnie Générale D'Optique) Method for determining wave-front aberration data of a to-be-tested optical system
CN114252163A (en) * 2021-12-21 2022-03-29 中国科学院光电技术研究所 Low signal-to-noise ratio sub-spot wavefront restoration method based on image noise removal

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9091614B2 (en) * 2011-05-20 2015-07-28 Canon Kabushiki Kaisha Wavefront optical measuring apparatus
CN115031856A (en) * 2022-06-08 2022-09-09 中国科学院光电技术研究所 Shack-Hartmann wavefront sensor wavefront restoration method based on sub-spot screening
CN118010173A (en) * 2024-02-27 2024-05-10 北京理工大学 Error calibration method of shack-Hartmann wavefront sensor



Similar Documents

Publication Publication Date Title
CN111351446B (en) Light field camera calibration method for three-dimensional topography measurement
US8400505B2 (en) Calibration method, calibration device, and calibration system including the device
JP6570991B2 (en) Diversification of lenslet, beam walk (BEAMWALK), and tilt for forming non-coplanar (ANISOLANATIC) images in large aperture telescopes
JP5489897B2 (en) Stereo distance measuring device and stereo distance measuring method
US10769814B2 (en) Camera parameter calculation apparatus based on the average pixel values
CN108305233B (en) A light field image correction method for microlens array errors
CN101246590A (en) Geometric Correction Method for Spatial Distortion Image of Spaceborne Camera
CN110099267A (en) Trapezoidal correcting system, method and projector
CN109813442B (en) Multi-frame processing-based internal stray radiation non-uniformity correction method
JP2007304525A (en) Image input apparatus, electronic apparatus, and image input method
CN113284196B (en) Camera distortion pixel-by-pixel calibration method
CN111694016B (en) Non-interference synthetic aperture super-resolution imaging reconstruction method
JP7489253B2 (en) Depth map generating device and program thereof, and depth map generating system
CN112700502B (en) Binocular camera system and binocular camera space calibration method
CN116109491B (en) Method for correcting image non-uniformity of imaging spectrometer by using bright-dark uniform region
CN105044906B (en) A kind of Quick Extended target imaging bearing calibration based on image information
CN119090732B (en) Wavefront distortion detection method based on continuous image sequence similarity in high-speed flow field
CN114240801A (en) A method for non-uniform correction of remote sensing images
CN105046674A (en) Nonuniformity correction method of multi-pixel parallel scanning infrared CCD images
CN107277327A (en) It is a kind of estimate full aperture place an order lens light-field camera point spread function method
CN111932478A (en) Self-adaptive non-uniform correction method for uncooled infrared focal plane
CN109345595B (en) Stereoscopic vision sensor calibration method based on spherical lens
CN117451190A (en) Deep learning defocusing scattering wavefront sensing method
KR102584209B1 (en) 3d reconstruction method of integrated image using concave lens array
CN113566984B (en) Extended object wavefront sensing device and method based on Fourier spectrum cancellation

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant