CN119199886B - Non-visual field imaging method, device and system based on spectrum field
- Publication number
- CN119199886B (application CN202411203913.3A / CN202411203913A)
- Authority
- CN
- China
- Prior art keywords
- data
- spectrum
- image
- time
- photon
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/88—Lidar systems specially adapted for specific applications
- G01S17/89—Lidar systems specially adapted for specific applications for mapping or imaging
Abstract
The application provides a non-visual field imaging method, device and system based on the spectral domain. The method comprises: counting photon count-time distribution histograms at sampling points arranged on an intermediate wall surface according to reflected photons; preprocessing the photon count-time distribution histograms at the sampling points to obtain spectral-space data; converting the spectral-space data into image spectral data; filtering the image spectral data; and converting the filtered image spectral data into image data of the non-visual field scene. With this method, the resolution and quality of the reconstructed image can be improved.
Description
Technical Field
The present application relates to the field of optical non-visual field imaging technology, and in particular, to a non-visual field imaging method, device and system based on spectrum domain.
Background
Conventional vision techniques can only image areas within the line of sight, whereas non-visual field imaging techniques can bypass obstacles and image hidden scenes behind them. Non-visual field imaging has many potential applications. In automatic driving, for example, a non-visual field imaging radar enables a car to perceive in advance what is on the other side of a corner or intersection, providing an early warning and helping to avoid traffic accidents. In counter-terrorism applications, a non-visual field imaging radar can probe a hidden interior space through a window or an opening in a hideout, revealing where suspects are concealed in advance and providing powerful support for combating crime and terrorist attacks.
There are various implementations of non-visual field imaging, employing sensing means including, but not limited to, microwaves, thermal radiation, acoustic waves and light waves. Microwaves and acoustic waves have long wavelengths and therefore low resolution: they can only recognize the rough outline of a person and can hardly perceive detail, so they are mainly suited to locating and tracking a target. Imaging schemes that rely on thermal radiation from objects or on ambient light are highly susceptible to the environment and to the object temperature, and are difficult to apply to imaging scenes under practical conditions.
Among these detection means, non-visual field imaging based on optical time-resolved detection has the advantages of a long working range and low susceptibility to environmental influences. However, non-visual field imaging based on optical time-resolved detection still faces performance bottlenecks in practice. For example, in practical applications, an insufficient number of collected photons and an insufficient number of sampling points reduce the signal-to-noise ratio and the resolution of the final imaging result; in addition, the large volume of data to be transmitted can limit the imaging speed or refresh rate.
Disclosure of Invention
In order to overcome the problems in the related art at least to a certain extent, the present application provides a non-visual field imaging method, device and system based on a spectrum field.
A non-visual field imaging method based on the spectral domain comprises: counting photon count-time distribution histograms at sampling points arranged on an intermediate wall according to reflected photons, the reflected photons being photons received by a detector after pulsed light is scanned onto a non-visual field scene via the intermediate wall; preprocessing the photon count-time distribution histograms at the sampling points to obtain spectral-space data; converting the spectral-space data into image spectral data; filtering the image spectral data; and converting the filtered image spectral data into image data of the non-visual field scene.
Further, preprocessing the photon count-time distribution histogram at each sampling point to obtain spectral space data, including:
setting the position of a direct reflection peak in the photon counting-time distribution histogram obtained through statistics as a time zero point for each sampling point to obtain a photon counting-time distribution histogram after time shifting;
packing the data in the time-shifted photon count-time distribution histograms at all the sampling points into a three-dimensional matrix; and
performing a three-dimensional Fourier transform on the three-dimensional matrix to obtain the spectral-space data.
Further, when the coaxial measurement mode is adopted while scanning the pulsed light and receiving the reflected photons, performing the three-dimensional Fourier transform on the three-dimensional matrix to obtain the spectral-space data comprises:
performing the three-dimensional Fourier transform on the three-dimensional matrix according to the following formula:
Φ(k_x, k_y, ω) = ∭ ψ(x, y, t) · e^(−i(k_x·x + k_y·y + ω·t)) dx dy dt
where Φ(k_x, k_y, ω) represents the spectral-space data, ψ(x, y, t) represents the three-dimensional matrix, x represents the x-axis of the intermediate wall, y represents the y-axis of the intermediate wall, and t represents the photon time of flight.
Further, transforming the spectral-space data into the image spectral data comprises:
transforming the coordinate axis ω in the spectral-space data Φ(k_x, k_y, ω) into k_z according to the dispersion relation between ω and (k_x, k_y, k_z), resulting in the image spectral data Φ(k_x, k_y, k_z).
Further, when the non-coaxial measurement mode is adopted while scanning the pulsed light and receiving the reflected photons, performing the three-dimensional Fourier transform on the three-dimensional matrix to obtain the spectral-space data comprises:
performing the three-dimensional Fourier transform on the three-dimensional matrix according to the following formula:
Ψ_τ(k_x, k_y, ω) = ∭ ψ(x, y, t) · e^(−i(k_x·x + k_y·y + ω·t)) dx dy dt
where Ψ_τ(k_x, k_y, ω) represents the spectral-space data, ψ(x, y, t) represents the three-dimensional matrix, x represents the x-axis of the intermediate wall, y represents the y-axis of the intermediate wall, and t represents the photon time of flight.
Further, transforming the spectral-space data into the image spectral data comprises:
transforming the spectral-space data Ψ_τ(k_x, k_y, ω) into the image spectral data Ψ_f(k_x, k_y, ω, z=z_v) by multiplying it with the phase factors of the corresponding formula, where Ψ_f(k_x, k_y, ω, z=z_v) represents the image spectral data.
Further, filtering the image spectral data includes:
setting to 0 the spectral components in the image spectral data whose spatial frequency exceeds the maximum recoverable frequency k_rmax, wherein the value of k_rmax is set according to the number of sampling points;
and/or,
setting to 0 the spectral components in the image spectral data that fall outside the angular range determined by the parameter θ, wherein the value of θ is set according to the ratio of the scanning range to the object depth;
and/or,
setting to 0 the spectral components in the image spectral data satisfying the relation |k| > λ, wherein |k| represents the magnitude of the frequency vector and the value of λ is set according to the signal strength and the system time resolution;
and/or,
setting to 0 the spectral components in the image spectral data satisfying the relation |k| < β, wherein |k| represents the magnitude of the frequency vector and the value of β is set according to the background noise intensity.
Further, performing an inverse Fourier transform on the filtered image spectral data to obtain the image data of the non-visual field scene comprises:
when the coaxial measurement mode is adopted, performing an inverse Fourier transform on the filtered image spectral data Φ(k_x, k_y, k_z) to obtain the image data ψ(x, y, z, t) of the non-visual field scene;
when the non-coaxial measurement mode is adopted, performing an inverse Fourier transform on the filtered image spectral data Ψ_f(k_x, k_y, ω, z=z_v) to obtain the image data ψ_f(x, y, z, t) of the non-visual field scene.
In addition, the application also provides a non-visual field imaging device based on the spectrum field, which comprises a memory and a processor coupled to the memory, wherein the processor is configured to execute the steps in the non-visual field imaging method based on the spectrum field based on the instructions stored in the memory.
In addition, the application also provides a non-visual field imaging system based on the spectrum field, which comprises a laser, a detector and a processor, wherein the laser scans pulse light on a non-visual field scene through an intermediate wall, the detector receives reflected photons, and the processor is used for executing the steps in the non-visual field imaging method based on the spectrum field.
According to the non-visual field imaging method, device and system based on the spectral domain provided by the embodiments of the application, photon count-time distribution histograms at the sampling points arranged on the intermediate wall surface are counted according to the reflected photons, the histograms at the sampling points are preprocessed to obtain spectral-space data, the spectral-space data are converted into image spectral data, the image spectral data are filtered, and the filtered image spectral data are converted into image data of the non-visual field scene. In this way, higher non-visual field imaging quality can be achieved. In the embodiments of the application, non-visual field imaging is realized based on photon time-resolved measurement, which adapts well to the environment, is compatible with both the coaxial and the non-coaxial measurement modes, and offers good robustness; the processing is performed in the spectral domain of the measurement data, so the amount of data required is smaller than that of conventional methods. In addition, the spectral filtering removes high-frequency noise from the data, improving the signal-to-noise ratio of the reconstructed image; it corrects the temporal and spatial broadening of the measured data and suppresses low-frequency components, improving the resolution of the imaging result; and it eliminates spectral components that cannot be measured or are erroneous, so that image reconstruction remains possible when the number of scanning points is limited.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description, serve to explain the principles of the application.
FIG. 1 illustrates a scene diagram of non-visual field imaging based on the spectral domain according to an embodiment of the application.
Fig. 2 shows a schematic diagram of a model of non-visual field imaging based on the spectral domain according to an embodiment of the application.
Fig. 3 shows a schematic flow diagram of a non-visual field imaging method based on the spectral domain according to an embodiment of the application.
Fig. 4 shows a photon count-time distribution histogram according to an embodiment of the application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the embodiments of the present application more apparent, the spirit of the present disclosure will be clearly described in the following drawings and detailed description, and any person skilled in the art, after having appreciated the embodiments of the present disclosure, may make alterations and modifications by the techniques taught by the present disclosure without departing from the spirit and scope of the present disclosure.
The exemplary embodiments of the present application and the descriptions thereof are intended to illustrate the present application, but not to limit the present application. In addition, the same or similar reference numerals are used for the same or similar parts in the drawings and the embodiments.
The terms "first," "second," "third," and the like, as used herein, do not denote a particular order or sequence of parts, nor are they intended to limit the application, but rather are used to distinguish one element or operation from another in the same technical term.
As used herein, the terms "comprising," "including," "having," "containing," and the like are intended to be inclusive and mean an inclusion, but not limited to.
As used herein, "and/or" includes any or all combinations of such things.
The term "plurality" as used herein includes "two" and "more than two", and the term "plurality" as used herein includes "two" and "more than two".
Certain words used to describe the application will be discussed below or elsewhere in this specification to provide additional guidance to those skilled in the art in describing the application.
FIG. 1 illustrates a scene diagram of non-visual field imaging based on the spectral domain according to an embodiment of the application. Fig. 2 shows a schematic diagram of a model of non-visual field imaging based on the spectral domain according to an embodiment of the application, comprising a laser, a galvanometer, a detector and a processor, with a time synchronization module between the laser and the detector for time synchronization. As shown in fig. 1 and 2, the laser emits pulsed laser light, which is guided to the intermediate wall surface by the galvanometer; the pulsed light is diffusely reflected at the intermediate wall surface and propagates into the hidden scene, photons are then reflected by the scene surface back to the intermediate wall surface, and finally the photons are diffusely reflected on the wall surface again and received by the detector. The system measures the photon emission and reception times to obtain the photon time of flight and flight distance. By using the galvanometer, the emission direction of the photons and the receiving direction of the detector can be changed, so that measurements can be made at different positions on the wall surface and the image of the hidden scene can finally be restored.
The embodiment of the application provides a non-visual field imaging method based on the spectral domain, which improves the resolution and quality of the reconstructed image through an adaptive spectral filtering algorithm. Compared with conventional methods, the method obtains better results in terms of signal-to-noise ratio and resolution with fewer photons and fewer sampling points, and it greatly reduces the amount of data required for imaging, thereby relieving the data-transmission bottleneck on the hardware side.
Fig. 3 shows a spectral domain based non-view imaging method according to an embodiment of the application, comprising steps S1-S5.
S1, counting photon count-time distribution histograms at the sampling points arranged on the intermediate wall according to the reflected photons, wherein the reflected photons are photons returned after the laser scans pulsed light onto the non-visual field scene via the intermediate wall.
Firstly, the laser emits pulsed light, which the galvanometer directs onto the intermediate wall surface; the pulsed light is diffusely reflected at the intermediate wall surface and propagates into the hidden scene. The laser beam emitted from the laser irradiates the intermediate wall surface via, for example, reflection by the galvanometer; after the beam interacts with the hidden scene, part of the photons return to the intermediate wall surface and are received by, for example, a single-photon detector.
In one embodiment, the time synchronization module statistically obtains a photon count-time distribution histogram based on the time difference between the light emission and reception times, as shown in fig. 4, where the horizontal axis is photon flight time and the vertical axis is photon count.
In one embodiment, a 532 nm picosecond pulsed laser is used to emit the pulsed laser light, a single-photon detector is used to detect the optical signal, and a time-to-digital converter is used to perform the time measurement, so as to obtain a photon count-time distribution histogram for each sampling point.
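As an illustration of step S1, the following is a minimal numpy sketch of how the photon count-time distribution histogram at one sampling point might be accumulated from raw photon timestamps. The bin width, record length, function name and the synthetic example data are assumptions made for illustration only and are not values prescribed by the application.

```python
import numpy as np

def photon_histogram(arrival_times_ps, sync_times_ps, bin_width_ps=4, record_len_ps=40_000):
    """Accumulate a photon count-time distribution histogram for one sampling point.

    arrival_times_ps : detection timestamps of reflected photons (picoseconds)
    sync_times_ps    : emission timestamps of the corresponding laser pulses
    The horizontal axis is photon time of flight, the vertical axis is photon count.
    """
    time_of_flight = np.asarray(arrival_times_ps) - np.asarray(sync_times_ps)
    bins = np.arange(0, record_len_ps + bin_width_ps, bin_width_ps)
    counts, edges = np.histogram(time_of_flight, bins=bins)
    return counts, edges[:-1]

# Purely illustrative synthetic timestamps: 1000 photons arriving around ~12 ns.
rng = np.random.default_rng(0)
sync = np.zeros(1000)
arrivals = rng.normal(loc=12_000, scale=150, size=1000)
hist, t_axis = photon_histogram(arrivals, sync)
```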
S2, preprocessing photon counting-time distribution histograms at the sampling points to obtain spectrum space data.
Specifically, in one embodiment, preprocessing the photon count-time distribution histogram includes S21-S23.
S21, setting the position of the direct reflection peak in the photon counting-time distribution histogram obtained through statistics as a time zero point for each sampling point, and obtaining the photon counting-time distribution histogram after time shifting.
For each sampling point, the position of the direct reflection peak in the photon count-time distribution histogram is shifted to time zero t=0. Here, the direct reflection peak refers to the position of the maximum value in the histogram, which is taken as the time zero point; that is, this step redefines the coordinate origin of the time axis.
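The time-shifting operation of step S21 can be sketched as follows; treating the bins that wrap around after the shift as discardable is a simplifying assumption of this sketch.

```python
import numpy as np

def shift_to_direct_peak(hist):
    """Set the direct-reflection peak (histogram maximum) as the time zero point.

    Returns the histogram rolled so that its maximum lands in bin 0; counts that
    would wrap around to the end of the array are simply discarded here.
    """
    peak = int(np.argmax(hist))      # position of the direct reflection peak
    shifted = np.roll(hist, -peak)   # move the peak to t = 0
    shifted[len(hist) - peak:] = 0   # drop wrapped-around counts
    return shifted
```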
S22, integrally packaging the data in the photon counting-time distribution histogram after the time shift at all the sampling points into a three-dimensional matrix.
In one embodiment, the data in the time-shifted photon count-time distribution histograms at all sampling points are packed into a three-dimensional matrix (x, y, t), where x represents the x-axis of the intermediate wall, y represents the y-axis of the intermediate wall, and t represents the photon time of flight.
Here, the histogram data of all the sampling points are packed into a three-dimensional matrix (grid x-axis, grid y-axis, time of flight) according to the grid distribution of the sampling points on the intermediate wall surface.
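A possible packing routine for step S22 is sketched below; the dictionary-based input format and the variable names are illustrative assumptions rather than a prescribed interface.

```python
import numpy as np

def pack_histograms(histograms, nx, ny):
    """Pack the time-shifted histograms of all sampling points into a 3-D matrix.

    histograms : dict mapping (ix, iy) grid indices on the intermediate wall to
                 1-D time-shifted histograms of equal length.
    Returns an array psi of shape (nx, ny, nt) corresponding to (x, y, t).
    """
    nt = len(next(iter(histograms.values())))
    psi = np.zeros((nx, ny, nt))
    for (ix, iy), h in histograms.items():
        psi[ix, iy, :] = h
    return psi
```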
S23, performing three-dimensional Fourier transform on the three-dimensional matrix to obtain spectrum space data.
Specifically, a three-dimensional Fourier transform is performed on the three-dimensional matrix (x, y, t) to obtain the spectral-space data.
Non-visual field imaging may employ either a coaxial or a non-coaxial measurement mode. In the coaxial measurement mode, the detection point (i.e., the point on the intermediate wall surface from which the detector receives the reflected photons) and the illumination point (i.e., the point on the intermediate wall surface onto which the laser emits the pulsed light) are scanned together and remain coincident. In the non-coaxial measurement mode, either the detection point is scanned over different positions while the illumination point remains stationary, or the detection point remains stationary while the illumination point is scanned. These two sampling modes correspond to different data-transformation modes.
In one embodiment, when the coaxial measurement mode is adopted, the three-dimensional matrix (x, y, t) is subjected to the three-dimensional Fourier transform to obtain the spectral-space data according to the following formula:
Φ(k_x, k_y, ω) = ∭ ψ(x, y, t) · e^(−i(k_x·x + k_y·y + ω·t)) dx dy dt
where Φ(k_x, k_y, ω) represents the spectral-space data, ψ(x, y, t) represents the three-dimensional matrix, x represents the x-axis of the intermediate wall, y represents the y-axis of the intermediate wall, and t represents the photon time of flight.
Specifically, in the formula, Φ(k_x, k_y, ω) is the Fourier-transformed function describing the data in the frequency domain, where k_x and k_y are spatial frequency components and ω is the temporal frequency component.
ψ(x, y, t) is the original time-domain or spatio-temporal-domain function, i.e. a signal or image defined over space (x, y) and time t.
The term e^(−i(k_x·x + k_y·y + ω·t)) is a complex exponential representing the modulation over spatial and temporal frequencies, where k_x and k_y control the spatial frequencies and ω controls the temporal frequency.
The triple integral over the spatial and temporal variables x, y and t computes the Fourier transform in all of these dimensions.
This formula converts the signal ψ(x, y, t) in the time or space-time domain into Φ(k_x, k_y, ω) in the frequency domain, so that the characteristics of the signal can be analysed in spatial and temporal frequency.
In this way, the three-dimensional data (x, y, t) are transformed by the Fourier transform onto the spectral domain (k_x, k_y, ω), in which each spectral component represents a corresponding plane wave component.
In one embodiment, when the non-coaxial measurement mode is used, the spectral-space data are obtained by performing the three-dimensional Fourier transform on the three-dimensional matrix (x, y, t) according to the following formula:
Ψ_τ(k_x, k_y, ω) = ∭ ψ(x, y, t) · e^(−i(k_x·x + k_y·y + ω·t)) dx dy dt
where Ψ_τ(k_x, k_y, ω) represents the spectral-space data, ψ(x, y, t) represents the three-dimensional matrix, x represents the x-axis of the intermediate wall, y represents the y-axis of the intermediate wall, and t represents the photon time of flight. Thus, the three-dimensional data (x, y, t) are transformed onto the spectral domain (k_x, k_y, ω) by the Fourier transform.
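Step S23 is the same three-dimensional Fourier transform in both measurement modes, and a minimal numpy sketch is given below. Using the FFT with numpy's default sign and normalization conventions, and converting the axes to angular frequencies, are assumptions of this sketch.

```python
import numpy as np

def to_spectral_space(psi, dx, dy, dt):
    """Three-dimensional Fourier transform of the packed (x, y, t) matrix.

    psi        : array of shape (nx, ny, nt) holding photon counts over (x, y, t)
    dx, dy, dt : sample spacings along the wall axes and the time axis
    Returns the spectral-space data Phi(kx, ky, omega) and the angular-frequency axes.
    """
    nx, ny, nt = psi.shape
    phi = np.fft.fftn(psi)
    kx = 2 * np.pi * np.fft.fftfreq(nx, d=dx)
    ky = 2 * np.pi * np.fft.fftfreq(ny, d=dy)
    omega = 2 * np.pi * np.fft.fftfreq(nt, d=dt)
    return phi, kx, ky, omega
```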
S3, converting the spectrum space data into image spectrum data.
In one embodiment, when the coaxial measurement mode is employed, transforming the spectral-space data Φ(k_x, k_y, ω) into the image spectral data comprises:
transforming the coordinate axis ω in the spectral-space data Φ(k_x, k_y, ω) into k_z according to the dispersion relation between ω and (k_x, k_y, k_z), where c represents the speed of light, resulting in the image spectral data Φ(k_x, k_y, k_z).
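The exact relation used to map ω onto k_z is given by the formula of the application, which appears only as an image in the original publication. The sketch below therefore assumes, purely for illustration, the free-space dispersion relation ω = c_eff·sqrt(k_x² + k_y² + k_z²) with an effective speed c_eff accounting for the round-trip path, and performs a nearest-neighbour resampling of the ω axis onto a chosen k_z axis.

```python
import numpy as np

C_EFF = 1.5e8  # assumed effective propagation speed in m/s (round trip roughly halves c)

def omega_to_kz(phi, kx, ky, omega, kz):
    """Resample the omega axis of Phi(kx, ky, omega) onto a kz axis (Stolt-style remapping).

    Assumes omega = C_EFF * sqrt(kx^2 + ky^2 + kz^2); for each target kz bin the
    nearest measured omega bin is taken, and bins whose target frequency falls
    outside the measured band are set to 0.
    """
    phi_kz = np.zeros((len(kx), len(ky), len(kz)), dtype=complex)
    for iz, kzv in enumerate(kz):
        # angular frequency that maps onto this kz for every (kx, ky) pair
        w = C_EFF * np.sqrt(kx[:, None] ** 2 + ky[None, :] ** 2 + kzv ** 2)
        idx = np.argmin(np.abs(omega[None, None, :] - w[:, :, None]), axis=-1)
        vals = np.take_along_axis(phi, idx[:, :, None], axis=-1)[:, :, 0]
        vals[w > omega.max()] = 0.0  # outside the measured frequency band
        phi_kz[:, :, iz] = vals
    return phi_kz
```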
In one embodiment, when the non-coaxial measurement mode is employed, the spectral-space data Ψ_τ(k_x, k_y, ω) are transformed into the image spectral data Ψ_f(k_x, k_y, ω, z=z_v) by multiplying them with two phase factors, where Ψ_f(k_x, k_y, ω, z=z_v) represents the image spectral data.
In the formula, the first phase factor describes the propagation of the signal to the depth z_v, and the second phase factor represents the overall phase offset contained in the spectrum.
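Since the exact phase factors appear only as images in the original publication, the sketch below uses the standard angular-spectrum propagator as an assumed stand-in for the first phase factor (propagation to the depth z_v); the second, overall phase-offset factor is omitted here.

```python
import numpy as np

def propagate_to_depth(psi_tau, kx, ky, omega, z_v, c=3e8):
    """Illustrative propagation of Psi_tau(kx, ky, omega) to the virtual depth z = z_v.

    Uses the angular-spectrum propagator exp(i * z_v * sqrt((omega/c)^2 - kx^2 - ky^2))
    as an assumed stand-in for the first phase factor; evanescent components (where
    the argument under the square root is negative) are simply dropped.
    """
    KX, KY, W = np.meshgrid(kx, ky, omega, indexing="ij")
    arg = (W / c) ** 2 - KX ** 2 - KY ** 2
    kz = np.sqrt(np.clip(arg, 0.0, None))
    propagator = np.exp(1j * kz * z_v)
    propagator[arg < 0] = 0.0  # drop evanescent components
    return psi_tau * propagator
```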
And S4, filtering the image spectrum data.
In non-visual field measurements, the data spectrum deviates from the spectrum of the real image in many respects; these deviations are caused by detection noise, the system time resolution and imperfect sampling. In the image reconstruction process, the embodiment of the application therefore applies spectral filtering to repair the data spectrum and improve the quality of the result.
In one embodiment, when the measured sampling points are very sparse, only part of the spectrum of the image to be measured can be effectively measured. Therefore, the spectral components that cannot be measured, or are erroneous, i.e. those whose spatial frequency exceeds the maximum recoverable frequency k_rmax, are set to 0, where k_rmax is an adjustable parameter that can be chosen according to the number of sampling points. The number of sampling points here refers to the number of sampling points set on the intermediate wall surface; more sampling points allow higher frequency components to be recovered.
In one embodiment, part of the spectral region is difficult to measure well because of the limited scanning range. Therefore, the spectral components that fall outside the angular range determined by the adjustable parameter θ are set to 0, where the value of θ depends on the ratio of the scanning range to the object depth. The scanning range here refers to the range over which the laser scans the pulsed light or over which the detector receives reflected photons. The object depth here refers to the relationship between the depth of the object in the image and its depth in real space. The larger the scanning range, the larger the value of θ, allowing more spectral components to be preserved.
In one embodiment, owing to the limited time resolution of the non-visual field measurement system, the measured high-frequency signal components are suppressed and easily buried in noise. Therefore, considering the trade-off between image resolution and noise, the high-frequency components with low signal-to-noise ratio are removed or suppressed, i.e. the spectral components satisfying |k| > λ are set to 0, where |k| represents the magnitude of the frequency vector and λ is an adjustable parameter whose value depends on the signal strength and the system time resolution. The signal strength here refers to the signal strength of the received photons; the more photons are received, the greater the signal strength. The system time resolution may also be referred to as the digital resolution, i.e. the minimum time or time interval that the system is able to resolve and record. In a time-correlated single-photon counting device, the digital resolution is the smallest unit of time that the digitizing circuit can represent, and it determines the accuracy of the system's time measurement. For example, in an embodiment of the present application in which a 532 nm picosecond pulsed laser emits the pulsed light, the system can theoretically resolve correspondingly fine variations in the time interval.
In one embodiment, to reduce the influence of background noise and to correct the temporal and spatial broadening of the measured data, the low-frequency components are suppressed, i.e. the spectral components satisfying |k| < β are set to 0, thereby sharpening the edges of the image and increasing the resolution, where β is an adjustable parameter whose value depends on the background noise intensity. The background noise here refers to noise that cannot be completely eliminated, such as the noise of the optoelectronic devices themselves and ambient-light noise. The higher the background noise, the smaller the value of β.
By adjusting these parameters, the measurement data can be efficiently processed and repaired in the spectral domain, thereby improving quality in image reconstruction. These steps ensure that the image is recovered from the valid measurement data as accurately and with the highest possible quality, while taking into account the effects of noise and system resolution.
The above filtering modes can be executed separately or in any combination according to actual conditions so as to obtain different filtering effects.
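The four filtering modes described above can be combined into simple masks on the image spectrum, as in the sketch below. The exact form of the first two criteria (whose formulas appear as images in the original publication) and all parameter values are assumptions; any parameter left as None skips the corresponding filter.

```python
import numpy as np

def filter_spectrum(phi, kx, ky, kz, k_rmax=None, theta=None, lam=None, beta=None):
    """Apply the adaptive spectral filters; parameters left as None are skipped."""
    KX, KY, KZ = np.meshgrid(kx, ky, kz, indexing="ij")
    k_r = np.sqrt(KX ** 2 + KY ** 2)               # transverse (radial) frequency
    k_abs = np.sqrt(KX ** 2 + KY ** 2 + KZ ** 2)   # magnitude of the frequency vector
    out = phi.copy()
    if k_rmax is not None:   # frequencies beyond what the sampling grid can recover
        out[k_r > k_rmax] = 0
    if theta is not None:    # components outside the cone set by scan range / depth (assumed form)
        out[k_r > np.abs(KZ) * np.tan(theta)] = 0
    if lam is not None:      # high-frequency components buried in noise
        out[k_abs > lam] = 0
    if beta is not None:     # low-frequency background and broadening
        out[k_abs < beta] = 0
    return out
```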
S5, converting the filtered image spectrum data into image data of the non-visual field scene.
In one embodiment, the filtered image spectral data is subjected to an inverse fourier transform to obtain image data of the non-view scene.
In one embodiment, when the coaxial measurement mode is adopted, an inverse Fourier transform is applied to the filtered image spectral data to obtain the image data of the non-visual field scene,
wherein ψ(x, y, z, t) represents the image data of the non-visual field scene and Φ(k_x, k_y, k_z) represents the filtered image spectral data.
In one embodiment, when the non-coaxial measurement mode is adopted, an inverse Fourier transform is applied to the filtered image spectral data to obtain the image data of the non-visual field scene,
wherein ψ_f(x, y, z, t) represents the image data of the non-visual field scene and Ψ_f(k_x, k_y, ω, z=z_v) represents the filtered image spectral data.
Specifically, (x, y, z, t) are the coordinates in the space-time domain; that is, the inverse transform converts the frequency-domain data back into the space-time domain, thereby yielding the image of the non-visual field scene.
Thus, an image of the non-view scene can be obtained.
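For the coaxial mode, step S5 reduces to an inverse three-dimensional FFT of the filtered spectrum. The minimal sketch below takes the magnitude of the complex result and uses numpy's default normalization, both of which are assumptions of this sketch.

```python
import numpy as np

def to_image(phi_filtered):
    """Inverse three-dimensional Fourier transform of the filtered spectrum.

    phi_filtered : filtered image spectral data Phi(kx, ky, kz)
    Returns the reconstructed volume |psi(x, y, z)| of the hidden scene; the
    magnitude is taken because the reconstruction is generally complex-valued.
    """
    psi = np.fft.ifftn(phi_filtered)
    return np.abs(psi)
```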
As described above, the non-visual field imaging method according to the embodiments of the present application realizes non-visual field imaging based on photon time-resolved measurement; it adapts well to the environment, is compatible with both the coaxial and the non-coaxial measurement modes, and offers good robustness. The processing is performed in the spectral domain of the measurement data, so the amount of data required is smaller than that of conventional methods. In addition, the spectral filtering removes high-frequency noise from the data, improving the signal-to-noise ratio of the reconstructed image; it corrects the temporal and spatial broadening of the measured data and suppresses low-frequency components, improving the resolution of the imaging result; and it eliminates spectral components that cannot be measured or are erroneous, so that image reconstruction remains possible when the number of scanning points is limited.
The embodiment of the application also provides a non-visual field imaging device based on the spectrum field, which comprises a memory and a processor coupled to the memory, wherein the processor is configured to execute the steps in the non-visual field imaging method based on the spectrum field based on the instructions stored in the memory.
The embodiment of the application also provides a non-visual field imaging system based on the spectrum field, which comprises the following steps:
a laser, which scans pulsed light onto the non-visual field scene via the intermediate wall;
a detector, which receives the reflected photons; and
a processor, which is used to execute the steps in the above non-visual field imaging method based on the spectral domain.
The details of each module in this embodiment may refer to the above method embodiments, and are not described herein.
The foregoing is merely illustrative of the embodiments of this application, and any equivalent changes and modifications can be made by those skilled in the art without departing from the spirit and principles of this application.
Claims (4)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202411203913.3A CN119199886B (en) | 2024-08-30 | 2024-08-30 | Non-visual field imaging method, device and system based on spectrum field |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN119199886A (en) | 2024-12-27 |
| CN119199886B (en) | 2025-04-08 |
Family
ID=94053472
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202411203913.3A Active CN119199886B (en) | 2024-08-30 | 2024-08-30 | Non-visual field imaging method, device and system based on spectrum field |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN119199886B (en) |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112540381A (en) * | 2020-11-17 | 2021-03-23 | 中国科学院西安光学精密机械研究所 | Non-vision field single-in multi-out three-dimensional reconstruction method based on non-uniform fast Fourier transform |
Family Cites Families (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| ITFI20030254A1 (en) * | 2003-10-08 | 2005-04-09 | Actis Active Sensors S R L | PERFECT METHOD AND DEVICE FOR SPECTRUM ANALYSIS |
| CN105388486A (en) * | 2015-12-15 | 2016-03-09 | 西安电子科技大学 | Ghost imaging system and imaging method based on fiber array pseudo-thermal light |
| CN108020833B (en) * | 2017-10-25 | 2020-03-31 | 清华大学 | Terahertz ISAR imaging method and system |
| CN110703276B (en) * | 2019-08-30 | 2021-09-07 | 清华大学深圳研究生院 | Fourier imaging device and method under strong scattering condition |
| CN112904368B (en) * | 2021-01-25 | 2023-09-29 | 中国科学院西安光学精密机械研究所 | Non-visual field three-dimensional reconstruction method and system based on analytic signal and compensation reference function |
| CN115598661A (en) * | 2021-06-28 | 2023-01-13 | 北京理工大学(Cn) | Dual-frequency laser Doppler imaging detection method based on compressive sensing |
| CN115587953A (en) * | 2022-10-11 | 2023-01-10 | 中国科学院光电技术研究所 | A non-line-of-sight imaging method based on mid-frequency domain Wiener filtering |
| CN116009017B (en) * | 2022-12-15 | 2025-08-29 | 浙江大学杭州国际科创中心 | A non-line-of-sight imaging system and method based on spectral and spatial-temporal dual coding |
- 2024-08-30 CN CN202411203913.3A patent/CN119199886B/en active Active
Patent Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112540381A (en) * | 2020-11-17 | 2021-03-23 | 中国科学院西安光学精密机械研究所 | Non-vision field single-in multi-out three-dimensional reconstruction method based on non-uniform fast Fourier transform |
Also Published As
| Publication number | Publication date |
|---|---|
| CN119199886A (en) | 2024-12-27 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US10712432B2 (en) | Time-of-light-based systems using reduced illumination duty cycles | |
| JP7351040B2 (en) | Graph-based array signal denoising for perturbed synthetic aperture radar | |
| CN112731443A (en) | Three-dimensional imaging system and method for fusing single photon laser radar and short wave infrared image | |
| CN112904368B (en) | Non-visual field three-dimensional reconstruction method and system based on analytic signal and compensation reference function | |
| CN112415533B (en) | Depth sensing method and device based on chirped pulse and sensor | |
| O’Connor et al. | Underwater modulated pulse laser imaging system | |
| JP6772639B2 (en) | Parallax calculation system, mobiles and programs | |
| JP2020020612A (en) | Distance measuring device, method for measuring distance, program, and mobile body | |
| CN117473456B (en) | Intelligent fusion method and system for thunder | |
| US20230050937A1 (en) | Detection method and detection apparatus | |
| CN107092015B (en) | A method for filtering speckle noise of lidar echo signals | |
| Fang et al. | Streak tube imaging lidar with kilohertz laser pulses and few-photons detection capability | |
| CN119199886B (en) | Non-visual field imaging method, device and system based on spectrum field | |
| CN116930125B (en) | A method for measuring the attenuation coefficient of water in backscattering fully gated imaging | |
| Li et al. | An enhance ranging algorithm based on multi-waveform classification with hyperspectral LiDAR | |
| Popescu et al. | Point spread function estimation for a terahertz imaging system | |
| Mack et al. | Time-of-flight (ToF) cameras for underwater situational awareness | |
| CN112946602B (en) | Multipath error compensation method and multipath error compensation indirect flight time distance calculation device | |
| Kirmani et al. | Spatio-temporal regularization for range imaging with high photon efficiency | |
| Zhang et al. | High-resolution and real-time non-line-of-sight imaging based on spatial correlation | |
| CN113837969B (en) | Non-line-of-sight image reconstruction method, device, system and computer readable storage medium | |
| US20250110210A1 (en) | Method, device, terminal device, and storage medium for determining object reflectance | |
| CN222299788U (en) | An underwater laser detection device | |
| CN119758353B (en) | FMCW laser radar point cloud ghost removing method based on edge detection algorithm | |
| CN119846658B (en) | Imaging method and imaging device |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |