
CN104732494A - Tissue culturing monitoring method and system based on image mist elimination - Google Patents

Tissue culturing monitoring method and system based on image mist elimination

Info

Publication number
CN104732494A
Authority
CN
China
Prior art keywords
image
real-time
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201510127382.9A
Other languages
Chinese (zh)
Inventor
吴军锋
李淼
张健
高会议
董俊
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
WUXI CAS INTELLIGENT AGRICULTURAL DEVELOPMENT CO LTD
Original Assignee
WUXI CAS INTELLIGENT AGRICULTURAL DEVELOPMENT CO LTD
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by WUXI CAS INTELLIGENT AGRICULTURAL DEVELOPMENT CO LTD filed Critical WUXI CAS INTELLIGENT AGRICULTURAL DEVELOPMENT CO LTD
Priority to CN201510127382.9A priority Critical patent/CN104732494A/en
Publication of CN104732494A publication Critical patent/CN104732494A/en
Pending legal-status Critical Current

Landscapes

  • Image Processing (AREA)

Abstract

The invention relates to a tissue culture monitoring method based on image mist elimination. The method comprises the steps of: obtaining real-time monitoring image information and real-time environment information of the tissue culture monitoring points in a tissue culture room; preprocessing the real-time monitoring image information to obtain a preprocessed image, and then obtaining the dark channel map of the preprocessed image; carrying out bilateral filtering on the dark channel map, calculating an atmospheric light value A from the bilaterally filtered dark channel map, and calculating a transmittance estimate from the real-time monitoring image information; obtaining a mist-eliminated monitoring image from a mist image formation model; and sending the real-time monitoring image information, the real-time environment information, and the mist-eliminated monitoring image to the user. The invention further discloses a tissue culture monitoring system based on image mist elimination. With the method and system, the whole acquisition process is automatic and real-time, which greatly reduces labor cost; the acquired real-time tissue culture monitoring information is preprocessed and mist-eliminated at the server side, so that the influence of mist on the tissue culture monitoring picture is reduced as far as possible.

Description

Tissue culture monitoring method and system based on image defogging
Technical Field
The invention relates to the technical field of video monitoring image processing, in particular to a tissue culture monitoring method and system based on image defogging.
Background
At present, scattering by large numbers of suspended particles reduces atmospheric visibility, degrades the color and contrast of outdoor images, hinders the extraction of information from images, lowers outdoor visibility, and makes traffic accidents more frequent. Image defogging has therefore become an important subject of research in image processing and computer vision and one of the problems urgently awaiting solution; removing fog from video and improving image quality are key to increasing the value of video monitoring systems. With the continued development of the technology, defogging scene images captured in foggy weather has become practical, and the clarity and fidelity of defogged images have improved greatly.
Image defogging is an important topic in image processing and computer vision research, with main applications in video surveillance, terrain surveying, autonomous driving, and target tracking. Most defogging work thus targets outdoor foggy images, whereas tissue culture takes place in closed vessels: a layer of water mist often clings to the wall of the tissue culture bottle, so the true growth state of the plants inside cannot be seen clearly through remote monitoring. At present, inspection of tissue culture plants for diseases and pests relies largely on manual work, so problems are often not discovered in time, and inspection consumes a great deal of manpower. Moreover, tissue cultures grow in a sterile environment, and people entering it can carry in bacteria, increasing the risk that the culture seedlings become infected. Manual management is also inconvenient because the growth of the cultures cannot be viewed remotely in real time, which incurs a very high labor cost.
Disclosure of Invention
The invention aims to provide a tissue culture monitoring system that automatically acquires remote tissue culture monitoring information in real time, saving labor cost, and that applies image defogging to the acquired tissue culture monitoring information so that the influence of fog on the tissue culture monitoring picture is reduced as much as possible.
To achieve this aim, the invention adopts the following technical scheme: a tissue culture monitoring method based on image defogging, comprising the following steps:
(1) acquiring real-time monitoring image information and real-time environment information of tissue culture monitoring points in a tissue culture room;
(2) preprocessing the real-time monitoring image information to obtain a preprocessed image, and then acquiring a dark channel map of the preprocessed image;
(3) applying bilateral filtering to the dark channel map, calculating an atmospheric light value A from the bilaterally filtered dark channel map, and calculating a transmittance estimate from the real-time monitoring image information;
(4) after the atmospheric light value A and the transmittance estimate are obtained, recovering the defogged monitoring image from the fog image formation model;
(5) sending the real-time monitoring image information, the real-time environment information, and the defogged monitoring image to the user.
Preprocessing the real-time monitoring image information means denoising and smoothing each frame of the monitoring image acquired in real time, namely applying mean filtering to each frame and then histogram equalization.
Obtaining the dark channel map of the preprocessed image means decomposing the preprocessed image, in RGB space, into a number of square windows. The minimum over the RGB components of each pixel in each square window is computed first and used as the gray value of the dark channel map; these minima are stored in a gray map of the same size as the original real-time monitoring image, and this gray map is the dark channel map;
the dark channel expression is:

$$J^{\mathrm{dark}}(x) = \min_{y \in \Omega(x)} \left( \min_{c \in \{R,G,B\}} J^{c}(y) \right),$$

where $J^{\mathrm{dark}}(x)$ is the gray value of the dark channel map, $J^{c}$ is channel $c$ of the real-time monitoring image, $\Omega(x)$ is a square window centered on pixel $x$, and $y$ ranges over the pixels in $\Omega(x)$.
Bilateral filtering computes over the dark channel map with a bilateral filtering template, which is obtained by pointwise multiplication of a Gaussian (spatial) template and a Gaussian function (range) template.
The atmospheric light value A is calculated as follows: first, the brightest 0.1% of pixels are taken from the dark channel map by gray value; these pixels are then mapped back to the original real-time monitoring image, the pixel with the highest brightness among them is found there, and that highest brightness is taken as the atmospheric light value A.
The transmittance estimate is calculated as:

$$\tilde{t}(x) = 1 - \omega \min_{y \in \Omega(x)} \left( \min_{c} \frac{I^{c}(y)}{A^{c}} \right),$$

where $\tilde{t}(x)$ is the transmittance estimate, $\omega$ is a user-set constant in the range 0.6 to 0.7, $c$ ranges over the R, G, B channels, $I^{c}$ is channel $c$ of the original real-time monitoring image, $A^{c}$ is the atmospheric light value for channel $c$, $\Omega(x)$ is a square window centered on pixel $x$, and $y$ ranges over the pixels in $\Omega(x)$.
The fog image formation model is:

$$I(x) = J(x)\,\tilde{t}(x) + A\left(1 - \tilde{t}(x)\right),$$

where $I(x)$ is the original real-time monitoring image, $J(x)$ is the defogged monitoring image, $A$ is the atmospheric light value, and $\tilde{t}(x)$ is the transmittance estimate, which is constant within each window. $\tilde{t}(x)$ is obtained as follows:

First, $I(x) = J(x)t(x) + A(1 - t(x))$ is rewritten channel-wise as

$$\frac{I^{c}(x)}{A^{c}} = \frac{J^{c}(x)}{A^{c}}\,t(x) + 1 - t(x),$$

where the superscript $c$ denotes the R, G, B channels. Assuming the transmittance $t(x)$ is constant within each window, denoting it $\tilde{t}(x)$, and with the value of $A$ already given, taking the double minimum operation on both sides yields

$$\min_{y \in \Omega(x)} \left( \min_{c} \frac{I^{c}(y)}{A^{c}} \right) = \tilde{t}(x) \min_{y \in \Omega(x)} \left( \min_{c} \frac{J^{c}(y)}{A^{c}} \right) + 1 - \tilde{t}(x).$$

According to the dark channel prior theory,

$$J^{\mathrm{dark}}(x) = \min_{y \in \Omega(x)} \left( \min_{c} J^{c}(y) \right) = 0,$$

so

$$\min_{y \in \Omega(x)} \left( \min_{c} \frac{J^{c}(y)}{A^{c}} \right) = 0,$$

which gives the transmittance estimate $\tilde{t}(x) = 1 - \min_{y \in \Omega(x)} \left( \min_{c} I^{c}(y)/A^{c} \right)$. Introducing a correction factor $\omega$ in $[0,1]$, used to adjust the degree of defogging, gives the final transmittance estimate

$$\tilde{t}(x) = 1 - \omega \min_{y \in \Omega(x)} \left( \min_{c} \frac{I^{c}(y)}{A^{c}} \right),$$

where $c$ ranges over the R, G, B channels, $I^{c}$ is channel $c$ of the image to be defogged, $A^{c}$ is the atmospheric light value for channel $c$, $\Omega(x)$ is a window centered on pixel $x$, and $y$ ranges over the pixels in $\Omega(x)$.

The defogged monitoring image can then be obtained as:

$$J(x) = \frac{I(x) - A}{\max\left(\tilde{t}(x),\, t_{0}\right)} + A,$$

where $I(x)$ is the original real-time monitoring image, $J(x)$ is the defogged monitoring image, $A$ is the atmospheric light value, $\tilde{t}(x)$ is the transmittance estimate, constant within each window $\Omega(x)$, and $t_{0}$ is a lower bound on $\tilde{t}(x)$.
After the dark channel map is obtained, minimum value filtering is applied to it. The filtering radius is determined by the size of the square window according to:

WindowSize = 2*Radius + 1,

where WindowSize is the side length of the square window and Radius is the filtering radius.
The invention also provides a tissue culture monitoring system based on image defogging, which comprises:
an acquisition terminal, which acquires real-time monitoring image information of the tissue culture monitoring points through network cameras arranged in the tissue culture room, acquires real-time environment information through temperature and illumination sensors, and transmits the acquired information to the server through a wired or wireless network;
a server, which receives the acquired real-time monitoring image information and real-time environment information, preprocesses and defogs each frame of the image, and sends the real-time monitoring image information, the real-time environment information, and the defogged monitoring image to the client through a wired or wireless network; and
a client, which receives the real-time image and the defogged monitoring image sent by the server and displays them to the user through a mobile phone client, a portable PC, or a desktop PC.
According to the technical scheme above, the remote tissue culture image information and the real-time environment information are acquired in real time by network cameras, transmitted over the network to the server, and forwarded by the server to the client. The whole process is automatic and real-time, requires no manual monitoring, and greatly reduces labor cost. The server preprocesses and defogs the acquired real-time tissue culture monitoring information, so the influence of fog on the tissue culture monitoring picture is reduced as much as possible, and managers are given clear images of real-time growth and pathogen infection so that they can take corresponding measures. In short, the invention defogs and clarifies the collected video data so that managers can observe the tissue culture growth more clearly.
Drawings
FIG. 1 is a flow chart of the method of the invention;
FIG. 2 is a block diagram of the system architecture of the invention.
Detailed Description
As shown in FIG. 1, a tissue culture monitoring method based on image defogging comprises the following steps: (1) acquiring real-time monitoring image information and real-time environment information of the tissue culture monitoring points in a tissue culture room; (2) preprocessing the real-time monitoring image information to obtain a preprocessed image, and then obtaining the dark channel map of the preprocessed image; (3) applying bilateral filtering to the dark channel map, calculating an atmospheric light value A from the bilaterally filtered dark channel map, and calculating a transmittance estimate from the real-time monitoring image information; (4) after the atmospheric light value A and the transmittance estimate are obtained, recovering the defogged monitoring image from the fog image formation model; (5) sending the real-time monitoring image information, the real-time environment information, and the defogged monitoring image to the user.
Preprocessing the real-time monitoring image information means denoising and smoothing each frame of the monitoring image acquired in real time, namely applying mean filtering to each frame and then histogram equalization.
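As an illustration, the following is a minimal sketch of this preprocessing step in Python with OpenCV; the function name, the 3×3 kernel, and per-channel equalization are assumptions of the sketch, since the patent fixes none of them.

```python
import cv2
import numpy as np

def preprocess(frame: np.ndarray, ksize: int = 3) -> np.ndarray:
    """Denoise and smooth one 8-bit BGR frame: mean filter first,
    then histogram equalization (applied here to each channel)."""
    smoothed = cv2.blur(frame, (ksize, ksize))                # mean (box) filter
    channels = [cv2.equalizeHist(c) for c in cv2.split(smoothed)]
    return cv2.merge(channels)
```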
The dark channel map of the preprocessed image is obtained by decomposing the preprocessed image, in RGB space, into a number of square windows (generally 3×3). The minimum over the RGB components of each pixel in each square window is computed first and used as the gray value of the dark channel map; these minima are stored in a gray map of the same size as the original real-time monitoring image, and this gray map is the dark channel map;
the dark channel expression is:

$$J^{\mathrm{dark}}(x) = \min_{y \in \Omega(x)} \left( \min_{c \in \{R,G,B\}} J^{c}(y) \right),$$

where $J^{\mathrm{dark}}(x)$ is the gray value of the dark channel map, $J^{c}$ is channel $c$ of the real-time monitoring image, $\Omega(x)$ is a square window centered on pixel $x$, and $y$ ranges over the pixels in $\Omega(x)$.
After the dark channel map is obtained, minimum value filtering is applied to it. The filtering radius is determined by the size of the square window according to:

WindowSize = 2*Radius + 1,

where WindowSize is the side length of the square window and Radius is the filtering radius. Minimum value filtering mainly removes discrete points at the edges of the dark channel map, yielding a dark channel map with smooth edges.
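A minimal sketch of the dark channel computation under the same assumptions (numpy/OpenCV): grayscale erosion with a square kernel implements minimum value filtering over a window of side WindowSize = 2*Radius + 1, and radius = 1 gives the 3×3 window mentioned above.

```python
import cv2
import numpy as np

def dark_channel(image: np.ndarray, radius: int = 1) -> np.ndarray:
    """Per-pixel minimum over the three color channels, followed by a
    minimum filter over a square window of side 2*radius + 1."""
    min_rgb = image.min(axis=2)                               # min over R, G, B
    window_size = 2 * radius + 1                              # WindowSize = 2*Radius + 1
    kernel = np.ones((window_size, window_size), np.uint8)
    return cv2.erode(min_rgb, kernel)                         # grayscale erosion = min filter
```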
Bilateral filtering computes over the dark channel map with a bilateral filtering template, which is obtained by pointwise multiplication of a Gaussian (spatial) template and a Gaussian function (range) template. Bilateral filtering turns the jagged broken-line edges of the dark channel map into smooth curved edges. The Gaussian function template takes gray-level differences as its Gaussian coefficients; since the Gaussian (spatial) template is global, it is generated only once, whereas the Gaussian function template must be computed for each pixel.
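OpenCV's built-in bilateral filter combines a spatial Gaussian with a gray-level (range) Gaussian, which matches the Gaussian-template-times-Gaussian-function-template construction described here; the diameter and sigma values below are illustrative choices, not values from the patent.

```python
# Smooth the dark channel map while keeping its edges sharp.
dark_filtered = cv2.bilateralFilter(dark, d=9, sigmaColor=75, sigmaSpace=75)
```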
The atmospheric light value A is calculated as follows: first, the brightest 0.1% of pixels are taken from the dark channel map by gray value; these pixels are then mapped back to the original real-time monitoring image, the pixel with the highest brightness among them is found there, and that highest brightness is taken as the atmospheric light value A.
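A minimal sketch of this estimate, assuming the dark channel map and the original frame have the same size; taking per-pixel brightness as the mean over the three channels is an assumption of the sketch, since the patent does not define brightness precisely.

```python
import numpy as np

def atmospheric_light(image: np.ndarray, dark: np.ndarray) -> float:
    """Pick the brightest 0.1% of dark-channel pixels, then return the
    highest brightness among the corresponding original-image pixels."""
    n = max(1, int(dark.size * 0.001))                        # top 0.1% of pixels
    top_idx = np.argpartition(dark.ravel(), -n)[-n:]          # their flat indices
    brightness = image.reshape(-1, 3).mean(axis=1)            # per-pixel brightness (assumed: channel mean)
    return float(brightness[top_idx].max())                   # atmospheric light A
```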
The transmittance estimate is calculated as:

$$\tilde{t}(x) = 1 - \omega \min_{y \in \Omega(x)} \left( \min_{c} \frac{I^{c}(y)}{A^{c}} \right),$$

where $\tilde{t}(x)$ is the transmittance estimate, $\omega$ is a user-set constant in the range 0.6 to 0.7, $c$ ranges over the R, G, B channels, $I^{c}$ is channel $c$ of the original real-time monitoring image, $A^{c}$ is the atmospheric light value for channel $c$, $\Omega(x)$ is a square window centered on pixel $x$, and $y$ ranges over the pixels in $\Omega(x)$.
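A minimal sketch of the estimate, reusing the dark_channel helper sketched earlier on the image normalized by A; a single scalar A is used for all channels here, and omega = 0.65 is one choice inside the 0.6 to 0.7 range given above.

```python
def transmittance(image: np.ndarray, A: float, omega: float = 0.65,
                  radius: int = 1) -> np.ndarray:
    """t~(x) = 1 - omega * min over window and channels of I^c(y)/A^c."""
    normalized = image.astype(np.float64) / A                 # I^c(y) / A^c (scalar A)
    return 1.0 - omega * dark_channel(normalized, radius)
```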
The fog image formation model is:

$$I(x) = J(x)\,\tilde{t}(x) + A\left(1 - \tilde{t}(x)\right),$$

where $I(x)$ is the original real-time monitoring image, $J(x)$ is the defogged monitoring image, $A$ is the atmospheric light value, and $\tilde{t}(x)$ is the transmittance estimate, which is constant within each window. $\tilde{t}(x)$ is obtained as follows:

First, $I(x) = J(x)t(x) + A(1 - t(x))$ is rewritten channel-wise as

$$\frac{I^{c}(x)}{A^{c}} = \frac{J^{c}(x)}{A^{c}}\,t(x) + 1 - t(x),$$

where the superscript $c$ denotes the R, G, B channels. Assuming the transmittance $t(x)$ is constant within each window, denoting it $\tilde{t}(x)$, and with the value of $A$ already given, taking the double minimum operation on both sides yields

$$\min_{y \in \Omega(x)} \left( \min_{c} \frac{I^{c}(y)}{A^{c}} \right) = \tilde{t}(x) \min_{y \in \Omega(x)} \left( \min_{c} \frac{J^{c}(y)}{A^{c}} \right) + 1 - \tilde{t}(x).$$

According to the dark channel prior theory,

$$J^{\mathrm{dark}}(x) = \min_{y \in \Omega(x)} \left( \min_{c} J^{c}(y) \right) = 0,$$

so

$$\min_{y \in \Omega(x)} \left( \min_{c} \frac{J^{c}(y)}{A^{c}} \right) = 0,$$

which gives the transmittance estimate $\tilde{t}(x) = 1 - \min_{y \in \Omega(x)} \left( \min_{c} I^{c}(y)/A^{c} \right)$. Introducing a correction factor $\omega$ in $[0,1]$, used to adjust the degree of defogging, gives the final transmittance estimate

$$\tilde{t}(x) = 1 - \omega \min_{y \in \Omega(x)} \left( \min_{c} \frac{I^{c}(y)}{A^{c}} \right),$$

where $c$ ranges over the R, G, B channels, $I^{c}$ is channel $c$ of the image to be defogged, $A^{c}$ is the atmospheric light value for channel $c$, $\Omega(x)$ is a window centered on pixel $x$, and $y$ ranges over the pixels in $\Omega(x)$.

The defogged monitoring image can then be obtained as:

$$J(x) = \frac{I(x) - A}{\max\left(\tilde{t}(x),\, t_{0}\right)} + A,$$

where $I(x)$ is the original real-time monitoring image, $J(x)$ is the defogged monitoring image, $A$ is the atmospheric light value, $\tilde{t}(x)$ is the transmittance estimate, constant within each window $\Omega(x)$, and $t_{0}$ is a lower bound on $\tilde{t}(x)$.
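A minimal sketch of the recovery step; the lower bound t0 = 0.1 is a common choice in dark-channel defogging, not a value fixed by the patent.

```python
def defog(image: np.ndarray, t: np.ndarray, A: float,
          t0: float = 0.1) -> np.ndarray:
    """J(x) = (I(x) - A) / max(t~(x), t0) + A, clipped back to 8 bits."""
    t = np.maximum(t, t0)[:, :, np.newaxis]                   # lower-bound the transmittance
    J = (image.astype(np.float64) - A) / t + A
    return np.clip(J, 0, 255).astype(np.uint8)
```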
As shown in FIG. 2, the system includes: an acquisition terminal, which acquires real-time monitoring image information of the tissue culture monitoring points through network cameras arranged in the tissue culture room, acquires real-time environment information through temperature and illumination sensors, and transmits the acquired information to the server through a wired or wireless network; a server, which receives the acquired real-time monitoring image information and real-time environment information, preprocesses and defogs each frame of the image, and sends the real-time monitoring image information, the real-time environment information, and the defogged monitoring image to the client through a wired or wireless network; and a client, which receives the real-time image and the defogged monitoring image sent by the server and displays them to the user through a mobile phone client, a portable PC, or a desktop PC.
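Tying the sketches above together, one plausible server-side pass over a frame follows the method steps like this; the helper names are the assumptions introduced in the earlier sketches.

```python
def process_frame(frame: np.ndarray) -> np.ndarray:
    """One server-side pass over a frame, following steps (2)-(4)."""
    pre = preprocess(frame)                                   # step (2): denoise and smooth
    dark = dark_channel(pre, radius=1)                        # step (2): dark channel map
    dark = cv2.bilateralFilter(dark, 9, 75, 75)               # step (3): edge-preserving smoothing
    A = atmospheric_light(frame, dark)                        # step (3): atmospheric light
    t = transmittance(frame, A)                               # step (3): transmittance estimate
    return defog(frame, t, A)                                 # step (4): fog-model recovery
```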
In conclusion, the remote tissue culture image information and the real-time environment information are acquired in real time by network cameras, transmitted over the network to the server, and forwarded by the server to the client; the whole process is automatic and real-time, requires no manual monitoring, and greatly reduces labor cost. The server preprocesses and defogs the acquired real-time tissue culture monitoring information, so the influence of fog on the tissue culture monitoring picture is reduced as much as possible, and managers are given clear images of real-time growth and pathogen infection so that they can take corresponding measures.

Claims (9)

1. A tissue culture monitoring method based on image defogging, comprising the following steps:
(1) acquiring real-time monitoring image information and real-time environment information of tissue culture monitoring points in a tissue culture room;
(2) preprocessing the real-time monitoring image information to obtain a preprocessed image, and then acquiring a dark channel map of the preprocessed image;
(3) applying bilateral filtering to the dark channel map, calculating an atmospheric light value A from the bilaterally filtered dark channel map, and calculating a transmittance estimate from the real-time monitoring image information;
(4) after the atmospheric light value A and the transmittance estimate are obtained, recovering the defogged monitoring image from the fog image formation model;
(5) sending the real-time monitoring image information, the real-time environment information, and the defogged monitoring image to the user.
2. The image defogging based tissue culture monitoring method according to claim 1, wherein: preprocessing the real-time monitoring image information means denoising and smoothing each frame of the monitoring image acquired in real time, namely applying mean filtering to each frame and then histogram equalization.
3. The image defogging based tissue culture monitoring method according to claim 1, wherein: obtaining the dark channel map of the preprocessed image means decomposing the preprocessed image, in RGB space, into a number of square windows. The minimum over the RGB components of each pixel in each square window is computed first and used as the gray value of the dark channel map; these minima are stored in a gray map of the same size as the original real-time monitoring image, and this gray map is the dark channel map;
the dark channel expression is:

$$J^{\mathrm{dark}}(x) = \min_{y \in \Omega(x)} \left( \min_{c \in \{R,G,B\}} J^{c}(y) \right),$$

where $J^{\mathrm{dark}}(x)$ is the gray value of the dark channel map, $J^{c}$ is channel $c$ of the real-time monitoring image, $\Omega(x)$ is a square window centered on pixel $x$, and $y$ ranges over the pixels in $\Omega(x)$.
4. The image defogging based tissue culture monitoring method according to claim 1, wherein: bilateral filtering computes over the dark channel map with a bilateral filtering template, which is obtained by pointwise multiplication of a Gaussian (spatial) template and a Gaussian function (range) template.
5. The image defogging based tissue culture monitoring method according to claim 1, wherein: the atmospheric light value A is calculated as follows: first, the brightest 0.1% of pixels are taken from the dark channel map by gray value; these pixels are then mapped back to the original real-time monitoring image, the pixel with the highest brightness among them is found there, and that highest brightness is taken as the atmospheric light value A.
6. The image defogging based tissue culture monitoring method according to claim 1, wherein the transmittance estimate is calculated as:

$$\tilde{t}(x) = 1 - \omega \min_{y \in \Omega(x)} \left( \min_{c} \frac{I^{c}(y)}{A^{c}} \right),$$

where $\tilde{t}(x)$ is the transmittance estimate, $\omega$ is a user-set constant in the range 0.6 to 0.7, $c$ ranges over the R, G, B channels, $I^{c}$ is channel $c$ of the original real-time monitoring image, $A^{c}$ is the atmospheric light value for channel $c$, $\Omega(x)$ is a square window centered on pixel $x$, and $y$ ranges over the pixels in $\Omega(x)$.
7. The image defogging based tissue culture monitoring method according to claim 1, wherein the fog image formation model is:

$$I(x) = J(x)\,\tilde{t}(x) + A\left(1 - \tilde{t}(x)\right),$$

where $I(x)$ is the original real-time monitoring image, $J(x)$ is the defogged monitoring image, $A$ is the atmospheric light value, and $\tilde{t}(x)$ is the transmittance estimate, which is constant within each window. $\tilde{t}(x)$ is obtained as follows:

First, $I(x) = J(x)t(x) + A(1 - t(x))$ is rewritten channel-wise as

$$\frac{I^{c}(x)}{A^{c}} = \frac{J^{c}(x)}{A^{c}}\,t(x) + 1 - t(x),$$

where the superscript $c$ denotes the R, G, B channels. Assuming the transmittance $t(x)$ is constant within each window, denoting it $\tilde{t}(x)$, and with the value of $A$ already given, taking the double minimum operation on both sides yields

$$\min_{y \in \Omega(x)} \left( \min_{c} \frac{I^{c}(y)}{A^{c}} \right) = \tilde{t}(x) \min_{y \in \Omega(x)} \left( \min_{c} \frac{J^{c}(y)}{A^{c}} \right) + 1 - \tilde{t}(x).$$

According to the dark channel prior theory,

$$J^{\mathrm{dark}}(x) = \min_{y \in \Omega(x)} \left( \min_{c} J^{c}(y) \right) = 0,$$

so

$$\min_{y \in \Omega(x)} \left( \min_{c} \frac{J^{c}(y)}{A^{c}} \right) = 0,$$

which gives the transmittance estimate $\tilde{t}(x) = 1 - \min_{y \in \Omega(x)} \left( \min_{c} I^{c}(y)/A^{c} \right)$. Introducing a correction factor $\omega$ in $[0,1]$, used to adjust the degree of defogging, gives the final transmittance estimate

$$\tilde{t}(x) = 1 - \omega \min_{y \in \Omega(x)} \left( \min_{c} \frac{I^{c}(y)}{A^{c}} \right),$$

where $c$ ranges over the R, G, B channels, $I^{c}$ is channel $c$ of the image to be defogged, $A^{c}$ is the atmospheric light value for channel $c$, $\Omega(x)$ is a window centered on pixel $x$, and $y$ ranges over the pixels in $\Omega(x)$;

the defogged monitoring image can then be obtained as:

$$J(x) = \frac{I(x) - A}{\max\left(\tilde{t}(x),\, t_{0}\right)} + A,$$

where $I(x)$ is the original real-time monitoring image, $J(x)$ is the defogged monitoring image, $A$ is the atmospheric light value, $\tilde{t}(x)$ is the transmittance estimate, constant within each window $\Omega(x)$, and $t_{0}$ is a lower bound on $\tilde{t}(x)$.
8. The image defogging based tissue culture monitoring method according to claim 3, wherein: after the dark channel map is obtained, minimum value filtering is applied to it. The filtering radius is determined by the size of the square window according to:

WindowSize = 2*Radius + 1,

where WindowSize is the side length of the square window and Radius is the filtering radius.
9. A tissue culture monitoring system based on image defogging, characterized by comprising:
an acquisition terminal, which acquires real-time monitoring image information of the tissue culture monitoring points through network cameras arranged in the tissue culture room, acquires real-time environment information through temperature and illumination sensors, and transmits the acquired information to the server through a wired or wireless network;
a server, which receives the acquired real-time monitoring image information and real-time environment information, preprocesses and defogs each frame of the image, and sends the real-time monitoring image information, the real-time environment information, and the defogged monitoring image to the client through a wired or wireless network; and
a client, which receives the real-time image and the defogged monitoring image sent by the server and displays them to the user through a mobile phone client, a portable PC, or a desktop PC.
CN201510127382.9A 2015-03-23 2015-03-23 Tissue culturing monitoring method and system based on image mist elimination Pending CN104732494A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201510127382.9A CN104732494A (en) 2015-03-23 2015-03-23 Tissue culturing monitoring method and system based on image mist elimination

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201510127382.9A CN104732494A (en) 2015-03-23 2015-03-23 Tissue culturing monitoring method and system based on image mist elimination

Publications (1)

Publication Number Publication Date
CN104732494A true CN104732494A (en) 2015-06-24

Family

ID=53456365

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201510127382.9A Pending CN104732494A (en) 2015-03-23 2015-03-23 Tissue culturing monitoring method and system based on image mist elimination

Country Status (1)

Country Link
CN (1) CN104732494A (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104616269A (en) * 2015-02-27 2015-05-13 深圳市中兴移动通信有限公司 Image defogging method and shooting device
CN105488769A (en) * 2015-12-08 2016-04-13 中国航空工业集团公司西安航空计算技术研究所 Real time video defogging method
CN105574819A (en) * 2015-06-25 2016-05-11 宇龙计算机通信科技(深圳)有限公司 Real-time image defogging method and apparatus
CN105809647A (en) * 2016-03-31 2016-07-27 北京奇虎科技有限公司 Automatic defogging photographing method, device and equipment
CN105872373A (en) * 2016-03-31 2016-08-17 北京奇虎科技有限公司 Automatic defogging photographing method, device and equipment
CN106600541A (en) * 2016-11-04 2017-04-26 华南农业大学 Multi-mode Transmission Video Image Clear Processing System Based on Adaptive Atmospheric Light Curtain Image
CN107194894A (en) * 2017-05-25 2017-09-22 河南师范大学 A kind of video defogging method and its system
CN107317972A (en) * 2017-07-27 2017-11-03 广东欧珀移动通信有限公司 Image processing method, device, computer equipment and computer-readable recording medium
CN107454317A (en) * 2017-07-27 2017-12-08 广东欧珀移动通信有限公司 Image processing method, device, computer readable storage medium and computer equipment
CN107481198A (en) * 2017-07-27 2017-12-15 广东欧珀移动通信有限公司 Image processing method, device, computer-readable recording medium and computer equipment
CN108230275A (en) * 2018-02-05 2018-06-29 电子科技大学 The method of image defogging
CN108986049A (en) * 2018-07-20 2018-12-11 百度在线网络技术(北京)有限公司 Method and apparatus for handling image
CN113781329A (en) * 2021-08-17 2021-12-10 北京数慧时空信息技术有限公司 Fog removing method for remote sensing image
CN114022397A (en) * 2022-01-06 2022-02-08 广东欧谱曼迪科技有限公司 An endoscope image defogging method, device, electronic device and storage medium
CN116523801A (en) * 2023-07-03 2023-08-01 贵州医科大学附属医院 Intelligent monitoring method for nursing premature infants


Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101950416A (en) * 2010-09-15 2011-01-19 北京理工大学 Bidirectional filtration-based real-time image de-hazing and enhancing method
US20140140619A1 (en) * 2011-08-03 2014-05-22 Sudipta Mukhopadhyay Method and System for Removal of Fog, Mist, or Haze from Images and Videos

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
HAORAN XU et al.: "Fast Image Dehazing Using Improved Dark Channel Prior", Information Science and Technology (ICIST), 2012 International Conference on *
WU Xiaotian et al.: "A fast algorithm for foggy image restoration based on dark channel theory", Journal of Changchun University of Science and Technology (Natural Science Edition) *
ZHANG Xu: "Research and implementation of fast image defogging methods", China Masters' Theses Full-text Database *

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104616269A (en) * 2015-02-27 2015-05-13 深圳市中兴移动通信有限公司 Image defogging method and shooting device
CN105574819A (en) * 2015-06-25 2016-05-11 宇龙计算机通信科技(深圳)有限公司 Real-time image defogging method and apparatus
CN105488769A (en) * 2015-12-08 2016-04-13 中国航空工业集团公司西安航空计算技术研究所 Real time video defogging method
CN105809647A (en) * 2016-03-31 2016-07-27 北京奇虎科技有限公司 Automatic defogging photographing method, device and equipment
CN105872373A (en) * 2016-03-31 2016-08-17 北京奇虎科技有限公司 Automatic defogging photographing method, device and equipment
CN105809647B (en) * 2016-03-31 2020-06-05 北京奇虎科技有限公司 Automatic defogging photographing method, device and equipment
CN106600541A (en) * 2016-11-04 2017-04-26 华南农业大学 Multi-mode Transmission Video Image Clear Processing System Based on Adaptive Atmospheric Light Curtain Image
CN106600541B (en) * 2016-11-04 2019-09-06 华南农业大学 Multi-mode transmission video image clarity processing system based on adaptive atmospheric light curtain image
CN107194894A (en) * 2017-05-25 2017-09-22 河南师范大学 A kind of video defogging method and its system
CN107481198A (en) * 2017-07-27 2017-12-15 广东欧珀移动通信有限公司 Image processing method, device, computer-readable recording medium and computer equipment
CN107454317A (en) * 2017-07-27 2017-12-08 广东欧珀移动通信有限公司 Image processing method, device, computer readable storage medium and computer equipment
CN107317972A (en) * 2017-07-27 2017-11-03 广东欧珀移动通信有限公司 Image processing method, device, computer equipment and computer-readable recording medium
CN107317972B (en) * 2017-07-27 2019-09-06 Oppo广东移动通信有限公司 Image processing method, apparatus, computer device, and computer-readable storage medium
CN108230275A (en) * 2018-02-05 2018-06-29 电子科技大学 The method of image defogging
CN108986049A (en) * 2018-07-20 2018-12-11 百度在线网络技术(北京)有限公司 Method and apparatus for handling image
US11216924B2 (en) 2018-07-20 2022-01-04 Baidu Online Network Technology (Beijing) Co., Ltd. Method and apparatus for processing image
CN113781329A (en) * 2021-08-17 2021-12-10 北京数慧时空信息技术有限公司 Fog removing method for remote sensing image
CN113781329B (en) * 2021-08-17 2023-10-20 北京数慧时空信息技术有限公司 Fog removing method for remote sensing image
CN114022397A (en) * 2022-01-06 2022-02-08 广东欧谱曼迪科技有限公司 An endoscope image defogging method, device, electronic device and storage medium
CN114022397B (en) * 2022-01-06 2022-04-19 广东欧谱曼迪科技有限公司 An endoscope image defogging method, device, electronic device and storage medium
CN116523801A (en) * 2023-07-03 2023-08-01 贵州医科大学附属医院 Intelligent monitoring method for nursing premature infants
CN116523801B (en) * 2023-07-03 2023-08-25 贵州医科大学附属医院 An intelligent monitoring method for nursing care of premature infants

Similar Documents

Publication Publication Date Title
CN104732494A (en) Tissue culturing monitoring method and system based on image mist elimination
CN106971167B (en) Crop growth analysis method and system based on unmanned aerial vehicle platform
CN108574737B (en) Agricultural automatic monitoring system and method based on cloud technology and zynq platform
CN110378865A (en) A kind of greasy weather visibility intelligence hierarchical identification method and system under complex background
CN105812739B (en) A kind of system and method for automatic collection plant growth information
CN106940882A (en) A kind of transformer substation video image clarification method for meeting human-eye visual characteristic
CN112365467B (en) Foggy image visibility estimation method based on single image depth estimation
CN104050637A (en) Quick image defogging method based on two times of guide filtration
CN104881879A (en) Remote sensing image haze simulation method based on dark-channel priori knowledge
CN115439363A (en) Video defogging device and method based on comparison learning
CN115661650A (en) A farm management system based on IoT data monitoring
DE102022116434A1 (en) systems, methods, computer programs and devices for automatic meter reading for intelligent field patrols
CN106404720A (en) Visibility observation method
CN101321270A (en) A monitoring system and method for optimizing images in real time
CN112418112A (en) A kind of orchard disease and insect pest monitoring and early warning method and system
Zhang et al. Image dehazing based on dark channel prior and brightness enhancement for agricultural remote sensing images from consumer-grade cameras
CN118941988B (en) A method and system for agricultural machinery operation planning based on unmanned aerial vehicle
Bugarić et al. Adaptive estimation of visual smoke detection parameters based on spatial data and fire risk index
KR20190130393A (en) Method and apparatus for monitering marine environmental disturbance and harmful organisms
CN104091306A (en) Image defogging method based on mathematic morphology and multi-resolution fusion
CN119809122A (en) An intelligent crop monitoring system based on spatiotemporal data
CN116665077A (en) Power transmission line detection shooting method and system based on AI (advanced identification) recognition technology
KR102040562B1 (en) Method to estimate visibility distance using image information
CN103900498B (en) A kind of cotton field automatic detection method of the growth of cereal crop seedlings and detection device thereof
CN119229292A (en) Crop monitoring methods and related equipment

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20150624
