CN119090732B - Wavefront distortion detection method based on continuous image sequence similarity in high-speed flow field - Google Patents
Wavefront distortion detection method based on continuous image sequence similarity in high-speed flow field
- Publication number
- CN119090732B (application CN202410905585.5A)
- Authority
- CN
- China
- Prior art keywords
- centroid
- image
- wavefront
- sub
- vector
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/50—Image enhancement or restoration using two or more images, e.g. averaging or subtraction
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J9/00—Measuring optical phase difference; Determining degree of coherence; Measuring optical wavelength
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T5/00—Image enhancement or restoration
- G06T5/90—Dynamic range modification of images or parts thereof
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/60—Analysis of geometric attributes
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/74—Image or video pattern matching; Proximity measures in feature spaces
- G06V10/761—Proximity, similarity or dissimilarity measures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J9/00—Measuring optical phase difference; Determining degree of coherence; Measuring optical wavelength
- G01J2009/002—Wavefront phase distribution
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20036—Morphological image processing
Abstract
The invention relates to a wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field, belonging to the field of data processing. The method comprises: decomposing acquired video data into an image sequence; preprocessing each image and computing the centroid of every spot to obtain a centroid coordinate sequence for each image; taking the centroid coordinates of the first image as the calibration reference centroids while obtaining an initial restoration matrix; performing centroid matching between each pair of adjacent images to obtain a slope vector and correction parameters, and updating the offset reference centroid coordinate sequence; substituting the slope vector and correction parameters into a restoration equation to obtain the Zernike coefficients and complete wavefront restoration; and repeating these steps to restore the wavefront over the whole video. The invention copes well with complex situations such as large tilt distortion and missing spot data, realizes wavefront detection and restoration, and is of great significance for improving the effective detection of distorted complex wavefronts by Hartmann wavefront sensors.
Description
Technical Field
The invention provides a wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field, and belongs to the technical field of data processing of beam wavefront measurement.
Background
The Hartmann wavefront sensor is a sophisticated optical measurement device that uses a microlens array to divide an incident wavefront into a plurality of sub-wavefronts, and determines the shape of the wavefront by measuring the positions of the sub-spots these sub-wavefronts form at the focal plane. The technology can monitor and correct wavefront distortion in real time in dynamic optical systems, with high precision and strong real-time performance. Its non-contact measurement avoids physical interference with the optical element, which makes it particularly valuable in optical element testing. In addition, thanks to its strong adaptability, the Hartmann wavefront sensor is suitable for various wavelengths and different types of optical systems.
The origin of the Hartmann wavefront sensor can be traced back to the beginning of the 20th century, when it was first proposed by the German astronomer Johannes Hartmann. Hartmann published a paper on wavefront measurement in 1910, describing a method for measuring stellar wavefronts. These original Hartmann wavefront sensors were designed for astronomy, to measure and correct starlight wavefront distortions caused by atmospheric disturbances. Over time, Hartmann wavefront sensor designs and applications have been continually improved and expanded. In the 1970s, with the development of computer technology, the data processing capability of wavefront sensors improved remarkably, and Hartmann wavefront sensors became widely used in optical testing and adaptive optics systems. By the 1990s, with the development of micro-electro-mechanical systems (MEMS) technology, the miniaturization and integration of Hartmann wavefront sensors became possible, further promoting their application in the field of precision optical measurement.
The applications of the Hartmann wavefront sensor are very broad, including but not limited to atmospheric turbulence correction in adaptive optics, vision correction in ophthalmology, and starlight wavefront distortion correction in astronomical telescope systems. These applications demonstrate the importance of Hartmann wavefront sensors in improving imaging quality and optimizing optical system performance. With algorithmic optimization and advances in hardware, the measurement speed and precision of Hartmann wavefront sensors have improved remarkably, allowing them to better adapt to high-speed dynamic environments and complex optical measurement requirements.
At present, development directions for the Hartmann wavefront sensor mainly include increasing the dynamic range, improving the ability to handle missing data, and improving detection precision. Dynamic range and the ability to handle missing data are particularly important when facing complex distorted wavefronts, but existing detection methods rarely combine the two. A wavefront distortion detection algorithm based on continuous image sequence similarity is therefore needed that combines a large dynamic range with the ability to handle missing data.
Disclosure of Invention
Aiming at the defects of the prior art, the invention provides a wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field. The method copes well with complex situations such as large tilt distortion and missing spot data, realizes wavefront detection and restoration, and is of great significance for improving the effective detection of distorted complex wavefronts by Hartmann wavefront sensors.
The technical scheme of the invention is as follows:
a wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field comprises the following steps:
(1) Decompose the video data acquired by a CCD into a picture sequence to obtain the raw data;
(2) Preprocess each picture and compute the centroid of every spot to obtain a centroid coordinate sequence for each picture; take the centroid coordinates of the first picture as the calibration reference centroids and simultaneously obtain the initial restoration matrix E;
(3) Perform centroid matching between two adjacent pictures to obtain a slope vector and correction parameters, and update the offset reference centroid coordinate sequence;
(4) Substitute the slope vector and the correction parameters into the restoration equation to compute the Zernike coefficients, completing the wavefront restoration;
(5) Match the next picture in the sequence against the updated offset reference centroid coordinate sequence, i.e. repeat steps (3)-(4), to realize wavefront restoration for the whole video.
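Steps (1)-(5) can be sketched as a per-frame loop. The following is a minimal sketch, not the patent's actual implementation: `restore_sequence`, `restore_fn` and the centroid-list frame format are illustrative assumptions.

```python
def restore_sequence(frames, calib, dspots, restore_fn):
    """Toy driver for steps (1)-(5): walk the frame sequence, match each
    frame's centroids against the running offset reference, restore, and
    carry the updated reference forward.

    frames: list of frames, each a list of (x, y) spot centroids (assumption).
    calib: calibration reference centroids of the first picture.
    restore_fn: placeholder for the restoration of step (4).
    """
    ref = list(calib)                 # step (2): calibration reference centroids
    results = []
    for pts in frames:                # steps (3)-(5): per-frame loop
        new_ref, matched, missing = [], [], []
        for i, (xp, yp) in enumerate(ref):
            # nearest centroid of this frame to the i-th reference centroid
            cand = min(pts, key=lambda q: (q[0] - xp) ** 2 + (q[1] - yp) ** 2,
                       default=None)
            if cand and ((cand[0] - xp) ** 2 + (cand[1] - yp) ** 2) ** 0.5 < dspots / 3:
                matched.append((i, cand))
                new_ref.append(cand)          # matched: update offset reference
            else:
                missing.append(i)             # correction parameter
                new_ref.append((xp, yp))      # unmatched: keep old reference
        ref = new_ref
        results.append(restore_fn(matched, missing))
    return results
```

The key design point the sketch illustrates is that the offset reference centroids are carried forward frame to frame, so a large accumulated tilt never has to be matched against the original calibration positions directly.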
Preferably, step (2) comprises the steps of:
(2.1) Perform picture binarization and morphological operations to obtain the approximate spot positions;
Morphological operations here refer to opening and closing; they yield the connected region of each spot, from which the centroid of each spot (in morphological terms, each individual region) can be derived directly;
(2.2) Calculate the accurate spot positions using the centroid formula (a picture contains multiple spots, and the centroid of the spot corresponding to each sub-aperture must be calculated):

X_c = Σ_{i=1..M} I_i·x_i / Σ_{i=1..M} I_i
Y_c = Σ_{i=1..M} I_i·y_i / Σ_{i=1..M} I_i

where (X_c, Y_c) are the spot centroid coordinates, M is the number of pixels in the sub-aperture area, I_i is the value of the i-th pixel, and (x_i, y_i) are the pixel coordinates within the sub-aperture;
(2.3) Compare the centroid coordinates of the next picture with the calibration reference centroid coordinates to obtain the slope vector G, which contains the horizontal spot offset G_x and the vertical spot offset G_y of each sub-aperture:

G_x = (X_t − X_c) / f
G_y = (Y_t − Y_c) / f

where (X_t, Y_t) are the centroid coordinates obtained from the next picture and f is the focal length of the sub-aperture;
(2.4) Let Φ(x, y) denote the incident wavefront phase, which can be expanded in Zernike polynomials:

Φ(x, y) = Σ_{m=1..M} a_m·Z_m(x, y)

where Z_m(x, y) is the m-th Zernike polynomial and a_m is its coefficient; the m = 1 term is the mean value of the wavefront phase (piston) and is usually neglected, i.e. taken as 0;
Differentiating both sides of the Zernike expansion, the wavefront restoration equation can be written in matrix form:

G = E·A

where G is the slope vector of size 2k×1, A = {a_2, a_3, …, a_M} is the coefficient vector, and E is the restoration matrix of size 2k×(M−1):

E = [ (Z_x2)_1 … (Z_xM)_1
      …
      (Z_x2)_k … (Z_xM)_k
      (Z_y2)_1 … (Z_yM)_1
      …
      (Z_y2)_k … (Z_yM)_k ]

Each entry of E represents the average wavefront slope over the corresponding sub-aperture when the corresponding Zernike coefficient is 1; k is the number of sub-apertures; (Z_xM)_k denotes the average horizontal wavefront slope of the M-th Zernike polynomial at the k-th sub-aperture, and (Z_yM)_k the average vertical wavefront slope of the M-th Zernike polynomial at the k-th sub-aperture;
(2.5) Compute the generalized inverse (pseudo-inverse) E⁺ of E to obtain the coefficient vector A = E⁺G, and substitute A into the Zernike expansion to calculate the wavefront phase Φ(x, y).
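The centroid formula of step (2.2) can be sketched as follows. `spot_centroid` is a hypothetical helper name, and it assumes the sub-aperture window has already been cut out of the full image:

```python
import numpy as np

def spot_centroid(sub_img, x0=0.0, y0=0.0):
    """Intensity-weighted centroid (X_c, Y_c) of one sub-aperture window.

    Implements X_c = sum(I_i * x_i) / sum(I_i) and the analogous Y_c;
    (x0, y0) is the window's offset within the full image.
    """
    ys, xs = np.mgrid[0:sub_img.shape[0], 0:sub_img.shape[1]]
    total = sub_img.sum()
    if total == 0:                       # empty window: no spot detected
        return None
    xc = (sub_img * xs).sum() / total + x0
    yc = (sub_img * ys).sum() / total + y0
    return xc, yc
```

Returning `None` for an empty window matches the method's later treatment of missing spot data in steps (3.3) and (4.1).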
Preferably, step (3) comprises the steps of:
(3.1) Use the centroids of the previous picture as the offset reference centroids and match them against the centroids of the next picture (every centroid of the previous picture is compared pairwise with every centroid of the next picture); the initial offset reference centroids are the calibration reference centroids of the first picture. The coordinate differences are:

Dx = X_H − X_P
Dy = Y_H − Y_P

where (X_H, Y_H) are the centroid coordinates in the latter picture and (X_P, Y_P) are the offset reference centroid coordinates, i.e. the centroids used as the reference in the matching step;
the spot distance Dist is then:

Dist = √(Dx² + Dy²)
(3.2) If Dist < DSpots/3, the two spots are considered to belong to the same sub-aperture, and the spot offset of the matched latter-frame centroid is calculated against the calibration reference centroid corresponding to the offset reference centroid, where DSpots is the average spot distance of the calibration reference centroids and is a fixed value;
the slope vector is then calculated, and the offset reference centroid at that sub-aperture is replaced by the centroid coordinates of the latter picture, i.e. the centroid of the latter picture becomes the new offset reference centroid;
(3.3) If Dist ≥ DSpots/3, the offset reference centroid is considered unmatched by any centroid of the next picture, the data of that sub-aperture is considered missing, and the sub-aperture index is recorded; this index is the correction parameter.
For the first frame on which restoration is performed, the offset reference centroids are simply the calibration reference centroids. Because the spots flicker, not every reference centroid can find a corresponding centroid in the restored frame; in that case the reference centroid itself stands in for the missing latter-frame centroid and is carried forward as the offset reference centroid for the next restored frame, while each matched reference centroid is replaced by the latter-frame spot it matched.
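A minimal sketch of the matching rule of steps (3.1)-(3.3), assuming centroids are given as (x, y) tuples; the function name and return format are illustrative:

```python
import math

def match_centroids(ref_pts, new_pts, dspots):
    """Match each offset-reference centroid to the nearest centroid of the
    next frame; a pair is accepted only if Dist < dspots / 3, i.e. the two
    spots are taken to belong to the same sub-aperture.

    Returns (matches, missing): matches maps reference index -> matched
    centroid, and missing lists the sub-aperture indices whose spot data
    is absent (the correction parameters).
    """
    matches, missing = {}, []
    for i, (xp, yp) in enumerate(ref_pts):
        best, best_d = None, float("inf")
        for (xh, yh) in new_pts:
            d = math.hypot(xh - xp, yh - yp)   # Dist = sqrt(Dx^2 + Dy^2)
            if d < best_d:
                best, best_d = (xh, yh), d
        if best is not None and best_d < dspots / 3:
            matches[i] = best                   # becomes new offset reference
        else:
            missing.append(i)                   # sub-aperture data missing
    return matches, missing
```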
Preferably, step (4) comprises the steps of:
(4.1) The slope vector G calculated in step (3) contains only the successfully matched centroids; a vector I holding the indices of the successfully matched centroids is kept, and the rows of the restoration matrix E not corresponding to I are deleted to obtain E';
(4.2) Solve the equation

G = E′A

to obtain the coefficient vector A;
(4.3) Substitute the obtained coefficient vector A into the wavefront phase expression

Φ(x, y) = Σ_{m=2..M} a_m·Z_m(x, y)

to obtain the recovered wavefront phase.
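The row-deletion and pseudo-inverse solve of step (4) can be illustrated on toy data. The sizes, the random matrix, and the [G_x; G_y] row ordering are assumptions for the sketch, not values from the patent:

```python
import numpy as np

# Toy restoration G = E A, with the rows of one missing sub-aperture deleted.
rng = np.random.default_rng(0)
k, M = 12, 6                            # sub-apertures, Zernike terms (toy sizes)
E = rng.normal(size=(2 * k, M - 1))     # restoration matrix, 2k x (M-1)
A_true = rng.normal(size=M - 1)         # coefficient vector a_2..a_M
G = E @ A_true                          # slope vector, rows ordered [G_x; G_y]

missing = {0}                           # sub-aperture 1 lost its spot
keep = [r for r in range(2 * k) if r % k not in missing]
E_prime = E[keep]                       # E': 2(k-1) x (M-1)
G_prime = G[keep]                       # matching slope entries only

A_hat = np.linalg.pinv(E_prime) @ G_prime   # A = E'^+ G', least-squares solution
```

Deleting the x- and y-slope rows of the missing sub-aperture keeps the matrix dimensions consistent, so the remaining valid data still determine the Zernike coefficients.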
Matters not described in detail herein can be found in the prior art.
The beneficial effects of the invention are as follows:
1. The invention realizes dynamic wavefront data processing under large-tilt complex distortion and dynamic wavefront detection under such conditions; the algorithm improves the detection dynamic range of the Hartmann sensor.
2. The invention solves the wavefront detection problem under large-tilt complex distortion with missing spots, improving the processing performance of the Hartmann wavefront sensor.
3. The method has good extensibility: since the beam quality and stability indices participate in the calculation at run time, corresponding functions can be extended as required.
Drawings
The accompanying drawings, which are included to provide a further understanding of the application and are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the application.
FIG. 1 is the overall flow chart of the wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field;
FIG. 2 is a centroid matching flow chart in step (3) of the present invention;
FIG. 3 is a schematic diagram of a centroid matching process, where (a) is an offset reference centroid and (b) is a centroid matching process;
FIG. 4 is a calibrated reference centroid distribution;
FIG. 5 is an offset centroid distribution;
FIG. 6 is a distribution of Zernike coefficients calculated according to FIGS. 4 and 5;
fig. 7 is a diagram of wavefront restoration calculated according to fig. 4 and 5.
Detailed Description
In order to better understand the technical solutions in this specification, the following clearly and completely describes the technical solutions in the embodiments of the invention with reference to the drawings, but the invention is not limited thereto; anything not fully described here follows conventional technology in the art.
Example 1
A wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field, as shown in figure 1, comprises the following steps:
(1) Decompose the video data acquired by a CCD into a picture sequence to obtain the raw data;
(2) Preprocess each picture and compute the centroid of every spot to obtain a centroid coordinate sequence for each picture; take the centroid coordinates of the first picture as the calibration reference centroids and simultaneously obtain the initial restoration matrix E;
(3) Perform centroid matching between two adjacent pictures to obtain a slope vector and correction parameters, and update the offset reference centroid coordinate sequence;
(4) Substitute the slope vector and the correction parameters into the restoration equation to compute the Zernike coefficients, completing the wavefront restoration;
(5) Match the next picture in the sequence against the updated offset reference centroid coordinate sequence, i.e. repeat steps (3)-(4), to realize wavefront restoration for the whole video.
Example 2
A wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field, as in embodiment 1, except that the step (2) includes the steps of:
(2.1) Perform picture binarization and morphological operations to obtain the approximate spot positions;
Morphological operations here refer to opening and closing; they yield the connected region of each spot, from which the centroid of each spot (in morphological terms, each individual region) can be derived directly;
(2.2) Calculate the accurate spot positions using the centroid formula (a picture contains multiple spots, and the centroid of the spot corresponding to each sub-aperture must be calculated):

X_c = Σ_{i=1..M} I_i·x_i / Σ_{i=1..M} I_i
Y_c = Σ_{i=1..M} I_i·y_i / Σ_{i=1..M} I_i

where (X_c, Y_c) are the spot centroid coordinates, M is the number of pixels in the sub-aperture area, I_i is the value of the i-th pixel, and (x_i, y_i) are the pixel coordinates within the sub-aperture;
(2.3) Compare the centroid coordinates of the next picture with the calibration reference centroid coordinates to obtain the slope vector G, which contains the horizontal spot offset G_x and the vertical spot offset G_y of each sub-aperture:

G_x = (X_t − X_c) / f
G_y = (Y_t − Y_c) / f

where (X_t, Y_t) are the centroid coordinates obtained from the next picture and f is the focal length of the sub-aperture;
(2.4) Let Φ(x, y) denote the incident wavefront phase, which can be expanded in Zernike polynomials:

Φ(x, y) = Σ_{m=1..M} a_m·Z_m(x, y)

where Z_m(x, y) is the m-th Zernike polynomial and a_m is its coefficient; the m = 1 term is the mean value of the wavefront phase (piston) and is usually neglected, i.e. taken as 0;
Differentiating both sides of the Zernike expansion, the wavefront restoration equation can be written in matrix form:

G = E·A

where G is the slope vector of size 2k×1, A = {a_2, a_3, …, a_M} is the coefficient vector, and E is the restoration matrix of size 2k×(M−1):

E = [ (Z_x2)_1 … (Z_xM)_1
      …
      (Z_x2)_k … (Z_xM)_k
      (Z_y2)_1 … (Z_yM)_1
      …
      (Z_y2)_k … (Z_yM)_k ]

Each entry of E represents the average wavefront slope over the corresponding sub-aperture when the corresponding Zernike coefficient is 1; k is the number of sub-apertures; (Z_xM)_k denotes the average horizontal wavefront slope of the M-th Zernike polynomial at the k-th sub-aperture, and (Z_yM)_k the average vertical wavefront slope of the M-th Zernike polynomial at the k-th sub-aperture;
(2.5) Compute the generalized inverse (pseudo-inverse) E⁺ of E to obtain the coefficient vector A = E⁺G, and substitute A into the Zernike expansion to calculate the wavefront phase Φ(x, y).
Example 3
A wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field, as described in embodiment 2, except that the step (3) includes the steps of:
(3.1) Use the centroids of the previous picture as the offset reference centroids and match them against the centroids of the next picture (every centroid of the previous picture is compared pairwise with every centroid of the next picture); the initial offset reference centroids are the calibration reference centroids of the first picture. The coordinate differences are:

Dx = X_H − X_P
Dy = Y_H − Y_P

where (X_H, Y_H) are the centroid coordinates in the latter picture and (X_P, Y_P) are the offset reference centroid coordinates, i.e. the centroids used as the reference in the matching step;
the spot distance Dist is then:

Dist = √(Dx² + Dy²)
(3.2) If Dist < DSpots/3, the two spots are considered to belong to the same sub-aperture, and the spot offset of the matched latter-frame centroid is calculated against the calibration reference centroid corresponding to the offset reference centroid, where DSpots is the average spot distance of the calibration reference centroids and is a fixed value;
the slope vector is then calculated, and the offset reference centroid at that sub-aperture is replaced by the centroid coordinates of the latter picture, i.e. the centroid of the latter picture becomes the new offset reference centroid;
(3.3) If Dist ≥ DSpots/3, the offset reference centroid is considered unmatched by any centroid of the next picture, the data of that sub-aperture is considered missing, and the sub-aperture index is recorded; this index is the correction parameter.
For the first frame on which restoration is performed, the offset reference centroids are simply the calibration reference centroids. Because the spots flicker, not every reference centroid can find a corresponding centroid in the restored frame; in that case the reference centroid itself stands in for the missing latter-frame centroid and is carried forward as the offset reference centroid for the next restored frame, while each matched reference centroid is replaced by the latter-frame spot it matched.
Example 4
A wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field, as in embodiment 3, except that the step (4) includes the steps of:
(4.1) The slope vector G calculated in step (3) contains only the successfully matched centroids; a vector I holding the indices of the successfully matched centroids is kept, and the rows of the restoration matrix E not corresponding to I are deleted to obtain E';
For example, if for a frame undergoing restoration the first spot is found missing through centroid matching, the rows of the restoration matrix corresponding to the first sub-aperture are removed: the size of E' becomes 2(k−1)×(M−1), and the size of G becomes 2(k−1)×1.
(4.2) Solve the equation

G = E′A

to obtain the coefficient vector A;
(4.3) Substitute the obtained coefficient vector A into the wavefront phase expression

Φ(x, y) = Σ_{m=2..M} a_m·Z_m(x, y)

to obtain the recovered wavefront phase.
Example 5
A wavefront distortion detection method based on continuous image sequence similarity in a high-speed flow field: as shown in the flow chart of fig. 1, the dynamic spot distribution video is first decomposed into a picture sequence, each picture recording a different spot position distribution. Preprocessing yields the spot centroid coordinate sequence of each picture, and the centroids of the first picture serve as the calibration reference centroids, from which the initial restoration matrix is calculated.
Centroid matching is then performed; its flow chart is shown in fig. 2 and the process is illustrated in fig. 3, where the green dots represent the offset reference centroid distribution (actual case in fig. 4), the red dots represent the offset centroid distribution (actual case in fig. 5), the black areas represent the sub-aperture ranges, and sub-apertures 8 and 11 illustrate the case of missing spots. The matching finally yields the slope vector calculated from the matched spots and the correction parameters of the unmatched spots; the offset reference centroids are updated according to the matching result and used to match the offset centroids of the next frame.
The obtained slope vector and correction parameters are substituted into the restoration equation to obtain the Zernike coefficients, which are used to realize the wavefront restoration of the current offset centroid frame (as shown in fig. 6); repeating steps (3)-(4) realizes wavefront restoration of the whole video data (as shown in fig. 7).
1. Restoration matrix and slope vector:
Each entry of the restoration matrix represents the average wavefront slope over the corresponding sub-aperture when the corresponding Zernike coefficient is 1.
Here M is the selected number of Zernike terms and can be changed as required; in this embodiment M is 21. As shown in fig. 6, the abscissa is the Zernike term index and the ordinate is the Zernike coefficient; k is the number of sub-apertures.
The entries of the slope vector represent the wavefront slopes in different directions over the corresponding sub-apertures.
When a spot is missing in a sub-aperture, the corresponding elements of the restoration matrix and the slope vector are deleted, so that the valid data of the remaining sub-apertures participate in calculating the Zernike coefficients with the correct matrix dimensions.
2. Average spot distance DSpots
The spots of the calibration reference distribution (green dots) in fig. 3(a) lie at the centers of the sub-apertures; if the sub-aperture size is a, the spot spacing is also a. In practice, however, the reference beam used to generate the calibration reference centroids is not a strict plane wave, and the sub-aperture sizes cannot be guaranteed to be perfectly uniform in manufacturing, so DSpots is obtained by calculation as the average spot spacing, reflecting the sub-aperture size.
Fig. 3(b) shows the offset spot distribution (red dots); the distance between the red and green dots is exaggerated for illustration and is very small in practice. By judging whether the distance between a red dot and a green dot is less than one third of the sub-aperture size, i.e. DSpots/3, it can be determined whether the two spots lie in the same sub-aperture. The invention uses the threshold DSpots/3 to find the best-matching spot, which further prevents spots from being confused between sub-apertures.
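Under the assumption that DSpots is the mean nearest-neighbour spacing of the calibration reference centroids (the text only states that it is obtained by calculation), a sketch:

```python
import math

def average_spot_distance(calib_pts):
    """DSpots: mean nearest-neighbour distance over the calibration
    reference centroids, used as an estimate of the sub-aperture size."""
    dists = []
    for i, (xi, yi) in enumerate(calib_pts):
        nn = min(math.hypot(xj - xi, yj - yi)
                 for j, (xj, yj) in enumerate(calib_pts) if j != i)
        dists.append(nn)
    return sum(dists) / len(dists)
```

Averaging over all centroids absorbs both the non-planarity of the reference beam and small manufacturing differences between sub-apertures, which is why a computed DSpots is preferred over the nominal sub-aperture size a.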
While the foregoing is directed to the preferred embodiments of the present invention, it will be appreciated by those skilled in the art that various modifications and adaptations can be made without departing from the principles of the present invention, and such modifications and adaptations are intended to be comprehended within the scope of the present invention.
Claims (4)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202410905585.5A CN119090732B (en) | 2024-07-08 | 2024-07-08 | Wavefront distortion detection method based on continuous image sequence similarity in high-speed flow field |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202410905585.5A CN119090732B (en) | 2024-07-08 | 2024-07-08 | Wavefront distortion detection method based on continuous image sequence similarity in high-speed flow field |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN119090732A CN119090732A (en) | 2024-12-06 |
| CN119090732B true CN119090732B (en) | 2025-07-29 |
Family
ID=93696443
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202410905585.5A Active CN119090732B (en) | 2024-07-08 | 2024-07-08 | Wavefront distortion detection method based on continuous image sequence similarity in high-speed flow field |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN119090732B (en) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP3007610A1 (en) * | 2013-06-10 | 2016-04-20 | Essilor International (Compagnie Générale D'Optique) | Method for determining wave-front aberration data of a to-be-tested optical system |
| CN114252163A (en) * | 2021-12-21 | 2022-03-29 | 中国科学院光电技术研究所 | Low signal-to-noise ratio sub-spot wavefront restoration method based on image noise removal |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9091614B2 (en) * | 2011-05-20 | 2015-07-28 | Canon Kabushiki Kaisha | Wavefront optical measuring apparatus |
| CN115031856A (en) * | 2022-06-08 | 2022-09-09 | 中国科学院光电技术研究所 | Shack-Hartmann wavefront sensor wavefront restoration method based on sub-spot screening |
| CN118010173A (en) * | 2024-02-27 | 2024-05-10 | 北京理工大学 | Error calibration method of shack-Hartmann wavefront sensor |
Also Published As
| Publication number | Publication date |
|---|---|
| CN119090732A (en) | 2024-12-06 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||