CN114359129B - DR image analysis method and electronic device - Google Patents

Publication number: CN114359129B
Authority: CN (China)
Prior art keywords: image, features, noise, feature, texture
Legal status: Active
Application number: CN202111193213.7A
Other languages: Chinese (zh)
Other versions: CN114359129A
Inventors: 许鹏, 周华, 张继晔
Current Assignee: Shenzhen Mindray Bio Medical Electronics Co Ltd
Original Assignee: Shenzhen Mindray Bio Medical Electronics Co Ltd
Application filed by Shenzhen Mindray Bio Medical Electronics Co Ltd
Publication of application CN114359129A and granted patent CN114359129B

Landscapes

  • Image Analysis (AREA)

Abstract


A DR image analysis method and electronic device, the method comprising: acquiring a digital X-ray photography DR image, wherein the DR image comprises at least one of a DR original image and an image after the DR original image is processed; extracting image features from the DR image, wherein the image features comprise at least one of grayscale entropy features, texture features, noise features, gradient features, and divergence features; inputting at least one image feature of the grayscale entropy features, texture features, noise features, gradient features, and divergence features into a target model, and outputting a DR image index reflecting the basic feature information content of the DR image. The present application obtains a DR image index that can objectively reflect the basic feature information content of the DR image from the image features extracted from the DR image, thereby providing an objective basis for image judgment for the operator.

Description

DR image analysis method and electronic device
Technical Field
The present application relates to the technical field of medical imaging, and in particular to a DR image analysis method and an electronic device.
Background
Digital radiography (Digital Radiography, DR) images are a common type of medical digital image, widely used in physical examination and conventional medical imaging diagnosis. With the development of digital X-ray detectors and digital image processing systems, DR images can exhibit diagnostically usable image effects over a wide exposure dose range. However, because the final image effect during acquisition often depends on the operator's experience, the accuracy of the assessment is easily affected by factors such as differences in experience, subjective differences, and post-processing.
To reflect the effect of a DR image, one approach is to use an exposure index (Exposure Index, EI) to indicate the magnitude of a single exposure. However, the exposure index generally has an energy dependency on a specific beam quality and typically uses only the gray information of the image; owing to the complexity and diversity of clinical images and human anatomy, the exposure index cannot adequately reflect the image information actually used for diagnosis in the DR image.
Disclosure of Invention
This summary introduces a series of concepts in simplified form that are described in further detail in the detailed description. It is not intended to identify key or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
In one aspect, the present application provides a method for analyzing a DR image, the method comprising:
Emitting X-rays toward a target tissue site and controlling the X-rays to pass through the target tissue site;
Receiving X-rays after passing through the target tissue site;
Processing the X-rays after passing through the target tissue site to obtain a digital radiography, DR, image, wherein the DR image comprises at least one of a DR raw image and an image after the DR raw image is processed;
Extracting image features from the DR image, wherein the image features comprise at least one of gray entropy features, texture features, noise features, gradient features and divergence features;
And inputting at least one image characteristic of the gray entropy characteristic, the texture characteristic, the noise characteristic, the gradient characteristic and the divergence characteristic into a target model, and outputting a DR image index reflecting the basic characteristic information content of the DR image through the target model.
In another aspect, the present application provides a method for analyzing a DR image, the method comprising:
Acquiring a Digital Radiography (DR) image, wherein the DR image comprises at least one of a DR original image and an image after the DR original image is processed;
Extracting image features from at least one of the DR original image and an image of which the DR original image is processed, wherein the image features comprise at least one of gray entropy features, texture features, noise features, gradient features and divergence features;
Determining a DR image index reflecting the basic feature information content of the DR image according to at least one image feature of the gray entropy feature, the texture feature, the noise feature, the gradient feature and the divergence feature;
And outputting a DR image index reflecting the basic characteristic information content of the DR image.
In another aspect, the present application provides a method for analyzing a DR image, the method comprising:
Acquiring a Digital Radiography (DR) image, wherein the DR image comprises at least one of a DR original image and an image after the DR original image is processed;
determining a DR image index reflecting the basic feature information content of the DR image according to the DR original image and at least one image of the images after the DR original image is processed;
And outputting a DR image index reflecting the basic characteristic information content of the DR image.
In another aspect, the present application provides a method for analyzing a DR image, the method comprising:
Acquiring a Digital Radiography (DR) image and image features corresponding to the DR image, wherein the DR image comprises at least one of a DR original image and an image processed by the DR original image, and the image features comprise at least one of gray entropy features, texture features, noise features, gradient features and divergence features;
determining DR image indexes reflecting the basic characteristic information content of the DR image according to the DR image and the image characteristics corresponding to the DR image;
And outputting a DR image index reflecting the basic characteristic information content of the DR image.
Another aspect of the present application provides a DR imaging apparatus comprising:
an X-ray generator, a detector, a processor and a display;
The X-ray generator is used for generating X-rays, emitting the X-rays to a target tissue site and controlling the X-rays to pass through the target tissue site;
The detector is used for receiving the X-rays passing through the target tissue site and processing the X-rays passing through the target tissue site to acquire a digital radiography DR image, wherein the DR image comprises at least one of a DR original image and an image after the DR original image is processed;
The processor is used for extracting image features from the DR image, the image features comprising at least one of gray entropy features, texture features, noise features, gradient features and divergence features, inputting at least one of these image features into a target model, and outputting, through the target model, a DR image index reflecting the basic feature information content of the DR image;
the display is used for displaying the DR image index reflecting the basic characteristic information content of the DR image.
In another aspect, the present application provides an electronic device comprising a memory and a processor, the memory having stored thereon a computer program for execution by the processor, the computer program when executed by the processor performing the steps of the method for analyzing DR images provided by the present application.
According to the application, DR image indexes capable of objectively reflecting the basic characteristic information content of the DR image are obtained according to the image characteristics extracted from the DR image, so that an objectified image judgment basis is provided for an operator.
Drawings
The above and other objects, features and advantages of the present application will become more apparent from the following more particular description of embodiments of the present application, as illustrated in the accompanying drawings. The accompanying drawings are included to provide a further understanding of embodiments of the application, are incorporated in and constitute a part of this specification, and serve to explain the application together with its embodiments; they do not constitute a limitation of the application. In the drawings, like reference numerals generally refer to like parts or steps.
FIG. 1 shows a schematic flow chart of a method of analysis of DR images according to one embodiment of the invention;
FIG. 2 shows a schematic diagram of model training and DR image analysis using a trained network model in accordance with one embodiment of the present invention;
FIG. 3 shows a schematic flow chart of a method of analyzing DR images according to another embodiment of the present invention;
FIG. 4 shows a block diagram of an electronic device according to one embodiment of the invention;
FIG. 5 shows a schematic flow chart of a method of analyzing DR images according to another embodiment of the present invention;
FIG. 6 shows a schematic flow chart of a method of analyzing DR images according to another embodiment of the present invention;
fig. 7 shows a block diagram of a DR imaging apparatus according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, exemplary embodiments of the present application will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some, not all, of the embodiments of the present application, and the present application is not limited by the example embodiments described herein. Based on the embodiments described in the present application, all other embodiments obtained by a person skilled in the art without inventive effort shall fall within the scope of the application.
In the following description, numerous specific details are set forth in order to provide a more thorough understanding of the present application. It will be apparent, however, to one skilled in the art that the application may be practiced without one or more of these details. In other instances, well-known features have not been described in detail in order to avoid obscuring the application.
It should be understood that the present application may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the application to those skilled in the art.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of the associated listed items.
In order to provide a thorough understanding of the present application, detailed structures will be presented in the following description in order to illustrate the technical solutions presented by the present application. Alternative embodiments of the application are described in detail below, however, the application may have other implementations in addition to these detailed descriptions.
Next, a method 100 of analyzing a DR image according to an embodiment of the present application is described with reference to fig. 1. The analysis method can be applied to DR imaging equipment and to other electronic devices, such as computers and intelligent terminals. As shown in fig. 1, the DR image analysis method 100 may include the following steps:
in step S110, a digital radiography DR image is acquired;
In the embodiment of the present application, the DR image may be acquired locally by the device or acquired from an external device; this is not specifically limited herein. Local acquisition may mean that the device acquires the DR image in real time, or that the device acquires the DR image in non-real time and stores it locally.
In one possible implementation, the specific process by which the device obtains the DR image locally is as follows: X-rays are emitted toward a target tissue site and controlled to pass through it; the X-rays that have passed through the target tissue site are then received and processed to acquire a digital radiography DR image. The target tissue site may be a tissue site of a human or other animal body to be examined, such as the head or abdomen.
In the present application, the DR image includes at least one of a DR original image and an image obtained by processing the DR original image. The DR original image is a gray-scale image that has not undergone post-processing such as contrast or brightness adjustment; for example, it may be the digital image information obtained by converting X-rays into visible light and the visible light into an electrical signal. The processed image includes an image obtained from the DR original image through a certain transformation relationship, or through adjustment of image contrast, image brightness, or the like; that is, it may be regarded as the DR original image after any processing, which is not specifically limited herein.
Extracting image features from the DR image, the image features including at least one of gray entropy features, texture features, noise features, gradient features, and divergence features at step S120;
At step S130, at least one image feature of the gray entropy feature, texture feature, noise feature, gradient feature, and divergence feature is input into a target model, and a DR image index reflecting the basic feature information content of the DR image is output.
In the DR image analysis method 100 according to the embodiment of the present application, at least one of the gray entropy feature, texture feature, noise feature, gradient feature and divergence feature is extracted from the DR image, and from the extracted features a DR image index, also called an image feature index (Image Feature Index, IFI), is obtained that objectively reflects the basic feature information content of the DR image. The DR image index can provide an objective judgment basis for the operator and can guide efforts such as in-hospital quality control and dose-reduced imaging.
Specifically, the DR image may be a raw image acquired by the DR imaging system. In the present application, the original image may be an image displayed on a display, or original image data or raw data; this is not specifically limited herein. A DR system mainly comprises an X-ray generator, a detector (such as a flat panel detector), a workstation, and mechanical devices. In operation, X-rays generated by the X-ray generator pass through the target site of the examined subject and are attenuated; the attenuated X-rays are projected onto the detector; the flat panel detector converts the X-rays into visible light and then into an electrical signal, thereby obtaining digital image information, which is synchronously transmitted to the workstation. In addition, the DR image may be an image obtained by processing the DR original image, for example an image post-processed by the workstation.
In step S120, image features comprising at least one of gray entropy features, texture features, noise features, gradient features and divergence features are extracted from the DR image. That is, the image features extracted in step S120 may be any one, any two, or several of these five feature types. The gray entropy, texture, noise, gradient and divergence features respectively reflect the gray information, texture information, noise information, gradient information and divergence information of the DR image; combining such multi-dimensional image features allows a more accurate DR image index to be obtained.
In one embodiment, the gray entropy feature is the information content obtained by statistically screening out redundant gray level sources in the DR image. According to the transfer characteristics by which X-ray quanta are converted into digital signals by the detector of the DR system, each individual gray level in the DR image may be regarded as an information source, and the DR imaging process may be regarded as a process of transferring information through these gray level sources. Among all gray levels, some gray levels contain no pixel values and are therefore regarded as redundant gray level sources. Because the extraction of the gray entropy feature screens out these redundant sources, more effective image information can be extracted.
Illustratively, when the gray entropy feature is the information content obtained by statistically screening redundant gray level sources in the DR image, the gray entropy feature may be extracted from the DR image in the following manner:
Firstly, a first probability statistical distribution pH(i) of each gray level source in the DR image is obtained, where i is an image gray value of the DR image. The first probability statistical distributions of the redundant gray level sources (those with pH(i) = 0) are screened out to obtain the first probability statistical distribution pNH(i) of the non-redundant gray level sources, namely:
pNH(i) = pH(i), for all i with pH(i) > 0   formula (1)
A second probability statistical distribution pH'(i) of each non-redundant gray level source is then obtained as the ratio of its first probability statistical distribution pNH(i) to the sum ΣpNH(i) of the first probability statistical distributions of all non-redundant gray level sources:
pH'(i) = pNH(i) / ΣpNH(i)   formula (2)
Finally, entropy is calculated from the second probability statistical distribution pH'(i) of the non-redundant gray level sources to obtain the gray entropy feature H(Image). The entropy calculation may take the natural logarithm ln (base e) or a logarithm with another base; with the natural logarithm it is expressed as:
H(Image) = −Σi pH'(i) · ln pH'(i)   formula (3)
In this way, the image gray entropy feature with the redundant gray level sources screened out is obtained.
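As a concrete illustration, the screening and renormalization of formulas (1)-(3) can be sketched as follows. The function name and the assumption of 256 gray levels are illustrative and not part of the application; in practice DR images use far more gray levels.

```python
import numpy as np

def gray_entropy(image, num_levels=256):
    """Gray entropy after screening out redundant (empty) gray-level sources.

    Sketch of formulas (1)-(3): gray levels whose probability is zero are
    dropped, the remaining probabilities are renormalized, and Shannon
    entropy with the natural logarithm is computed.
    """
    hist, _ = np.histogram(image, bins=num_levels, range=(0, num_levels))
    pH = hist / hist.sum()            # first probability distribution pH(i)
    pNH = pH[pH > 0]                  # screen redundant sources, formula (1)
    pH2 = pNH / pNH.sum()             # renormalize, formula (2)
    return float(-np.sum(pH2 * np.log(pH2)))  # entropy, formula (3)
```

A uniform image carries no information (entropy 0), while an image split evenly between two gray values yields ln 2, matching the usual behavior of Shannon entropy over the non-redundant sources.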
Texture features represent the spatial variation of different gray values across the DR image and can reflect abstract characteristics of the variation patterns of human tissue. Illustratively, a statistical method may be employed to extract the texture features of the DR image, yielding statistical characteristics of texture regions based on the gray properties of pixels and their neighborhoods. The statistical method is mainly described below, but texture features may also be extracted by geometric methods, model-based methods, or other suitable approaches.
When a statistical method is used to extract texture features from a DR image, a texture feature description matrix is first extracted from the gray values of pixels in the DR image, and then at least one two-dimensional component of the texture feature description matrix is extracted as at least one texture feature.
In one embodiment, the texture feature description matrix describes the variation of gray values over different distances and different directions in the DR image. Let the DR image be Image = f(x, y); the texture feature description matrix P(i, j) is:
P(i, j) = #{ ((x1, y1), (x2, y2)) ∈ M×N | f(x1, y1) = i, f(x2, y2) = j }   formula (4)
where #(X) denotes the number of elements in the set X. Assuming the distance between the two points (x1, y1) and (x2, y2) in the image is k, the texture feature description matrix can be expanded to P(i, j, k, l) over different directions l. Expanding the matrix by counting gray value changes over different distances and directions yields information in more dimensions. For example, to calculate actual feature values at different angles, super-resolution interpolation may be performed on the DR image in each direction to obtain the sub-pixel gray value at each new target position, and the texture feature description matrices in the different directions are obtained from these sub-pixel gray values.
After the texture feature description matrix is obtained, at least one two-dimensional component of the texture feature description matrix is extracted to obtain texture features, and the key characteristics of the texture feature description matrix are reflected by the texture features. Illustratively, the texture feature comprises at least one of:
The first texture feature P1 is used for counting the value distribution in the texture feature description matrix; the size of the P1 value can reflect the definition of the DR image and the depth of the texture grooves: the deeper the texture grooves, the larger the P1 value, and the shallower the grooves, the smaller the P1 value;
The second texture feature P2 is used for counting the similarity degree of the value distribution in the texture feature description matrix and the parallel and normal directions in the DR image, and the larger the P2 value is, the larger the similarity degree of the gray level of the image in different directions is;
The third texture feature P3 is used for counting the uniformity degree of the value distribution in the texture feature description matrix and the gray level change distribution in the DR image, the size of the P3 value can reflect the uniformity degree of the gray level distribution of the image and the thickness of the texture, and the larger the P3 value is, the more stable the texture change of the DR image is;
And the fourth texture feature P4 is used for counting the value distribution in the texture feature description matrix and the measurement of the local variation of the texture of the DR image, and the larger the P4 value is, the stronger the regularity of the texture is.
The texture features extracted from the texture feature description matrix are not limited to the above four, and in other embodiments, other two-dimensional components of the texture feature description matrix may be extracted as texture features.
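To make the co-occurrence construction concrete, the following sketch builds the texture feature description matrix of formula (4) for a single offset and computes one scalar statistic from it. The "energy" statistic shown is a common co-occurrence feature chosen for illustration; it is not necessarily identical to any of the patent's P1-P4, and the function names are assumptions.

```python
import numpy as np

def cooccurrence_matrix(img, dx=1, dy=0, levels=4):
    """Texture feature description matrix P(i, j) for offset (dx, dy).

    Minimal sketch of formula (4): P(i, j) counts pixel pairs at the given
    offset whose gray values are i and j, then normalizes to a joint
    probability. img must contain integer gray levels in [0, levels).
    """
    P = np.zeros((levels, levels))
    h, w = img.shape
    for y in range(h - dy):
        for x in range(w - dx):
            P[img[y, x], img[y + dy, x + dx]] += 1
    return P / P.sum()

def texture_energy(P):
    # Sum of squared entries; larger values indicate a more regular,
    # concentrated gray-value pairing (a common texture regularity measure).
    return float(np.sum(P ** 2))
```

Computing the matrix for several offsets (dx, dy) corresponds to the expansion P(i, j, k, l) over distances k and directions l described above.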
Image noise mainly consists of X-ray quantum noise. The distribution of X-ray quanta obeys a Poisson distribution, whose variance is proportional to the average number of detected quanta. Based on this characteristic of X-ray quantum noise, the degree of fluctuation of the X-ray quanta can be measured statistically from the DR image and used as the image noise feature; that is, the image noise feature reflects the degree of fluctuation of the X-ray quanta in the DR image.
In one embodiment, extracting the image noise feature from the DR image includes the following steps: obtaining a high-frequency image from the DR image, where the high-frequency image contains primarily noise information; extracting the effective information in the high-frequency image to obtain a noise distribution image; and counting the noise value distribution in the noise distribution image to obtain the image noise feature.
As one implementation, obtaining the high-frequency image from the DR image comprises low-pass filtering the DR image I to retain its low-frequency components, thereby obtaining a low-frequency image I1, and obtaining the high-frequency image I2 as the difference between the DR image and the low-frequency image. In other implementations, the DR image I may be directly high-pass filtered to obtain the high-frequency image I2.
Illustratively, the DR image I may be subjected to Gaussian low-pass filtering to obtain the low-frequency image I1. Since the image is a two-dimensional signal, a two-dimensional Gaussian function is used; the two-dimensional Gaussian filter kernel is:
G(x, y) = (1 / (2πσ²)) · exp(−(x² + y²) / (2σ²))
The high-frequency image I2 is obtained as the difference between the DR image I and the low-frequency image I1, i.e., I2 = I − I1. The noise distribution image I3 is then obtained by extracting the effective information in the high-frequency image I2. In one embodiment, the noise distribution image I3 is formed by the local root mean square of each pixel in the high-frequency image (an L1 norm or other approximate or equivalent measure may also be used), for example:
I3(i, j) = sqrt( (1/N) · Σ(l,k)∈Ω(i,j) I2(l, k)² )
where I3(i, j) is the value of the noise distribution image I3 at pixel (i, j), I2(l, k) is the value of the high-frequency image I2 at pixel (l, k), Ω(i, j) is a local neighborhood of (i, j), and N is the number of pixels in that neighborhood.
After the noise distribution image I3 is acquired, the noise value distribution is counted to obtain the image noise feature. For example, the noise value bin with the highest distribution probability in the noise distribution image may be determined, and the noise value of that bin taken as the image noise feature.
Specifically, the pixel value range of the noise distribution image I3 is first divided into M bins, each bin spanning a pixel value interval of width d. A histogram vector h of length M is initialized, where each component h(i) represents the number of pixel values falling in the i-th bin. For a pixel (i, j) in the noise distribution image I3, the bin corresponding to its pixel value is R[I3(i, j)]. Each pixel of I3 is traversed and h(R[I3(i, j)]) is accumulated to obtain h. The maximum of the main peak of h is max(h), with corresponding bin R0 = argmax(h), and the noise value R0 · d of that bin is taken as the image noise feature.
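The full pipeline (low-pass, high-frequency residual, local RMS, histogram mode) can be sketched as follows. To keep the sketch dependency-free, a mean filter stands in for the Gaussian low-pass filter described above; that substitution, the window size, and the bin count are all illustrative assumptions.

```python
import numpy as np

def noise_feature(img, win=3, bins=64):
    """Noise feature sketch: low-pass -> residual -> local RMS -> mode.

    Uses a simple mean filter as the low-pass stage (an assumption; the
    text describes a Gaussian filter). Returns the center of the most
    populated bin of the local-RMS (noise distribution) image.
    """
    h, w = img.shape
    pad = win // 2
    padded = np.pad(img.astype(float), pad, mode='edge')
    # mean (low-pass) filter: average over the win x win neighborhood
    low = np.zeros((h, w))
    for dy in range(win):
        for dx in range(win):
            low += padded[dy:dy + h, dx:dx + w]
    low /= win * win
    high = img - low                       # high-frequency image I2 = I - I1
    # noise distribution image I3: local RMS of the residual
    sq = np.pad(high ** 2, pad, mode='edge')
    rms = np.zeros((h, w))
    for dy in range(win):
        for dx in range(win):
            rms += sq[dy:dy + h, dx:dx + w]
    rms = np.sqrt(rms / (win * win))
    # image noise feature: value of the most probable noise bin
    hist, edges = np.histogram(rms, bins=bins)
    k = int(np.argmax(hist))
    return 0.5 * (edges[k] + edges[k + 1])
```

On a constant (noise-free) image the residual vanishes and the feature is close to zero, as expected.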
In one embodiment, gradient features are used to characterize the sharpness of different tissue boundaries in the DR image. Extracting the gradient feature from the DR image comprises determining the regional distribution of different tissues in the DR image and obtaining the definition of the boundaries between these regional distributions as the gradient feature.
Illustratively, according to the difference in absorption coefficients of the X-rays penetrating different tissues of the human body, regional distributions of the different tissues are formed in the DR image, with boundaries of varying sharpness between the different tissue regions; the definition of these boundaries can be quantified by extracting gradient features. Specifically, the gradient magnitude may be calculated according to the following formula:
Grad(x, y) = sqrt( (∂Image(x, y)/∂x)² + (∂Image(x, y)/∂y)² )
where Grad(x, y) characterizes the gradient magnitude at coordinates (x, y), and Image(x, y) characterizes the pixel value at coordinates (x, y).
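A minimal discrete realization of the gradient magnitude is shown below, approximating the partial derivatives with forward differences; the choice of difference scheme and the function name are assumptions for illustration.

```python
import numpy as np

def gradient_magnitude(img):
    """Per-pixel gradient magnitude sqrt((dI/dx)^2 + (dI/dy)^2),
    approximated with forward differences (zero at the far edges)."""
    img = img.astype(float)
    gx = np.zeros_like(img)
    gy = np.zeros_like(img)
    gx[:, :-1] = img[:, 1:] - img[:, :-1]   # horizontal difference
    gy[:-1, :] = img[1:, :] - img[:-1, :]   # vertical difference
    return np.sqrt(gx ** 2 + gy ** 2)
```

Sharper tissue boundaries produce larger local gradient magnitudes, which is the property the gradient feature quantifies.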
In one embodiment, a divergence feature is used to characterize transition intensity and/or trend consistency of different tissue boundaries in the DR image. The method for extracting the divergence characteristics from the DR image comprises the steps of determining regional distribution of different tissues in the DR image, and obtaining transition strength degree and/or trend consistency of boundaries of regional distribution of different tissues to serve as the divergence characteristics.
Illustratively, in DR images the boundaries between different tissues are not strictly sharp; rather, there is a degree of transition, whose intensity and/or trend consistency can be quantified by extracting divergence features. Specifically, the divergence can be calculated from the partial derivatives of the image, where Diver(x, y) characterizes the magnitude of the divergence at coordinates (x, y), Image(x, y) characterizes the pixel value at coordinates (x, y), and arctan denotes the arctangent function.
In some embodiments, in addition to at least one of the above image features, other image features may be extracted from the DR image for obtaining the DR image index, for example gray gradient, gray average, variance, pixel value information, signal-to-noise ratio, contrast-to-noise ratio, and so on, depending on the actual situation; this is not limited herein.
In step S130, as shown in fig. 2, at least one of the gray entropy feature, texture feature, noise feature, gradient feature and divergence feature is input into the target model, which outputs a DR image index reflecting the basic feature information content of the DR image. The target model may be a trained model or an untrained model. In one possible implementation, the target model comprises a pre-trained network model, which can be trained offline; the training method mainly comprises the following steps:
First, a DR image set including a plurality of DR images is acquired. An image feature set may be derived from a DR image set, the image feature set comprising a plurality of image features extracted from a plurality of the DR images. Specifically, at least one image feature of gray entropy features, texture features, noise features, gradient features, and divergence features is extracted for each DR image in the DR image set, and a plurality of image features extracted from a plurality of DR images in the DR image set collectively constitute an image feature set, each DR image in the DR image set corresponding to at least one image feature in the image feature set.
And obtaining a DR diagnostic image set according to the DR image set, wherein the DR diagnostic image set comprises a plurality of DR diagnostic images obtained by performing image processing on the plurality of DR images. Specifically, the DR image processing system may perform image processing on a plurality of DR images in the DR image set, respectively, to obtain a plurality of DR diagnostic images, that is, visual images for diagnosis by a doctor.
And obtaining a scoring set according to the DR diagnostic image set, wherein the scoring set comprises scores obtained by evaluating a plurality of DR diagnostic images. In some embodiments, a clinical expert may evaluate each DR diagnostic image in the DR diagnostic image set according to its quality and give an expert score: the higher the score of a DR diagnostic image, the more basic feature information is contained in the DR image corresponding to it; conversely, the lower the score, the less basic feature information is contained. The expert scores of the DR diagnostic images constitute an expert score set.
And then, training the network model by taking the acquired image feature set and the scoring set as training sample sets to obtain a trained network model. The network model can be a traditional machine learning model or a deep learning model, including a neural network, a support vector machine, linear discriminant analysis, and the like, and the training method includes model training methods such as linear regression and gradient descent. For example, an optimal mapping function from image features to scores may be learned such that the error between the DR image index obtained from the image feature mapping and the actually calibrated expert score is minimized. Applying the optimal mapping function to the image features acquired in step S120 yields a prediction of the DR image index that is closest to the expert score.
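The offline training steps above can be sketched minimally with ordinary least squares, one of the linear-regression options mentioned; the feature vectors and expert scores here are synthetic stand-ins, and the function names are illustrative assumptions:

```python
import numpy as np

def train_score_model(feature_set, score_set):
    """Fit a linear mapping from image features to expert scores by least
    squares, a minimal stand-in for the 'optimal mapping function' above."""
    X = np.asarray(feature_set, dtype=np.float64)
    y = np.asarray(score_set, dtype=np.float64)
    Xb = np.hstack([X, np.ones((X.shape[0], 1))])   # append a bias column
    w, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    return w

def predict_index(w, features):
    """Apply the learned mapping to new image features (step S120 output)."""
    f = np.append(np.asarray(features, dtype=np.float64), 1.0)
    return float(f @ w)

# Simulated training set: 20 feature vectors (e.g. entropy, texture, noise)
# with scores generated by a known linear rule, standing in for expert scores.
rng = np.random.default_rng(0)
X = rng.random((20, 3))
y = 2.0 * X[:, 0] - 1.0 * X[:, 2] + 0.5
w = train_score_model(X, y)
```

Because the synthetic scores are exactly linear in the features, the fitted mapping recovers the generating rule; a real expert score set would of course fit only approximately.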
In one embodiment, when the DR image feature set and the expert score set are utilized for classification regression training, the classification regression training formula is:
f(x) = Σ_{i=1}^{m} α_i · k(x, x_i) + b
wherein x is a feature vector, {x_i}_{i=1,...,m} are the support vectors, {α_i} are weighting coefficients, b is a bias, and k is a kernel function. Each value of the feature vector x is obtained through a linear transfer function:
x̃ = w · (x̂ − s)
wherein x̂ is the input feature, and w and s are scaling and translation parameter vectors, respectively. The kernel function k may employ, for example, a Gaussian kernel k(x, x_i) = exp(−‖x − x_i‖² / (2σ²)) or a polynomial kernel.
It should be noted that the present application does not limit the linear transfer function, kernel function or parameters used in the regression training method.
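A hedged sketch of the support-vector regression form described above, assuming a Gaussian kernel and a component-wise scaling/translation transfer function; since the application does not fix either choice, both the kernel and the transfer convention here are assumptions:

```python
import numpy as np

def linear_transfer(x, w, s):
    """Assumed component-wise transfer: scale by w after translating by s."""
    return w * (np.asarray(x, dtype=np.float64) - s)

def gaussian_kernel(a, b, sigma=1.0):
    """Gaussian (radial basis) kernel, one common choice for k."""
    return np.exp(-np.sum((a - b) ** 2) / (2.0 * sigma ** 2))

def svr_predict(x, support_vectors, alphas, bias, w, s, sigma=1.0):
    """f(x) = sum_i alpha_i * k(x_tilde, x_i) + b, with x_tilde the
    transferred input feature vector."""
    xt = linear_transfer(x, w, s)
    return sum(a * gaussian_kernel(xt, sv, sigma)
               for a, sv in zip(alphas, support_vectors)) + bias
```

The support vectors, weighting coefficients, and bias would come from the offline training stage; they are passed in here as plain arrays for illustration.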
After the trained network model is obtained in the model training stage, in the practical application process, at least one image feature of the gray entropy features, the texture features, the noise features, the gradient features and the divergence features extracted in the step S120 is input into the trained network model, so that a DR image index reflecting the basic feature information content of the DR image can be obtained, and the DR image index can be displayed by a display device or otherwise output.
Based on the above description, the analysis method of the DR image according to the embodiment of the application extracts at least one of the gray entropy feature, texture feature and noise feature from the DR image, and obtains, according to the extracted at least one image feature, a DR image index capable of objectively reflecting the basic feature information content of the DR image. The DR image index can provide an objective judgment basis for an operator, and can guide work such as in-hospital quality control and dose-reduction imaging.
Next, a method of analyzing a DR image according to another embodiment of the present application is described with reference to fig. 3. Fig. 3 is a schematic flow chart of a method 300 of analyzing DR images according to an embodiment of the application.
As shown in fig. 3, the analysis method 300 of the DR image includes the following steps:
In step S310, a digital radiography DR image is acquired, wherein the DR image includes at least one of a DR original image and an image after the DR original image is processed;
extracting image features from the DR image, the image features including at least one of gray entropy features, texture features, noise features, gradient features, and divergence features at step S320;
in step S330, a DR image index reflecting a basic feature information content of the DR image is determined according to at least one image feature of the gray entropy feature, texture feature, noise feature, gradient feature, and divergence feature;
In step S330, the DR image index reflecting the basic feature information content of the DR image may be determined according to at least one image feature selected from the group consisting of the gray entropy feature, the texture feature, the noise feature, the gradient feature, and the divergence feature. The index may be determined by inputting the feature into the target model as shown in fig. 1 and fig. 2, may be obtained by processing with another processor on the device, or may be obtained after processing by another processor outside the device, which is not specifically limited herein.
In step S340, a DR image index reflecting the basic feature information content of the DR image is output.
The analysis method 300 of the DR image is similar to the analysis method 100 of the DR image hereinabove, but the analysis method 300 is not limited to a specific manner of determining the DR image index according to the image features. After at least one image feature of the gray entropy feature, the texture feature, the noise feature, the gradient feature and the divergence feature is extracted, the DR image index may be obtained directly by processing and calculation, may be obtained by using the trained network model as described above, or may be obtained in any other suitable manner. In step S340, the DR image index may be output by a display device, or may be output by a speaker, a printer, or another output device, which is not particularly limited herein.
The steps of the DR image analysis method 300 are the same as or similar to those of the DR image analysis method 100, and detailed descriptions thereof are omitted herein.
Based on the above description, the analysis method 300 for a DR image according to an embodiment of the present application extracts at least one of a gray entropy feature, a texture feature, a noise feature, a gradient feature, and a divergence feature from the DR image, and obtains, according to the extracted at least one image feature, a DR image index capable of objectively reflecting the basic feature information content of the DR image. The DR image index can provide an objective judgment basis for an operator, and can guide work such as in-hospital quality control and dose-reduction imaging.
As shown in fig. 5, the analysis method 500 of the DR image includes the steps of:
in step S510, a digital radiography DR image is acquired, wherein the DR image includes at least one of a DR original image and an image after the DR original image is processed;
Determining a DR image index reflecting the basic feature information content of the DR image according to the DR original image and at least one image of the images after the DR original image is processed at step S520;
It should be noted that, in the analysis method 500 of the DR image, the DR image index reflecting the content of the basic feature information of the DR image may be directly determined according to the image without extracting the image feature in the DR image.
In some possible implementations, determining a DR image indicator reflecting the basic feature information content of the DR image from at least one of the DR original image and the processed image of the DR original image includes: inputting the at least one of the DR original image and the processed image of the DR original image into a target model, and outputting the DR image indicator reflecting the basic feature information content of the DR image through the target model. Of course, the DR image index may also be determined directly by the processor rather than by inputting into the target model, or may be obtained after processing by a processor outside the device, which is not particularly limited herein.
In step S530, a DR image index reflecting the basic feature information content of the DR image is output.
As shown in fig. 6, the analysis method 600 of the DR image includes the steps of:
in step S610, acquiring a digital radiography DR image and image features corresponding to the DR image, where the DR image includes at least one of a DR original image and an image of the DR original image after being processed, and the image features include at least one of a gray entropy feature, a texture feature, a noise feature, a gradient feature, and a divergence feature;
In step S620, a DR image index reflecting the content of basic feature information of the DR image is determined according to the DR image and the image features corresponding to the DR image;
In the DR image analysis method 600, the DR image index is determined from two inputs, namely the DR image and the image features corresponding to the DR image, rather than from the DR image alone or the image features alone; the DR image index obtained in this way is therefore relatively accurate.
In one possible implementation manner, determining a DR image index reflecting a content of basic feature information of the DR image according to the DR image and an image feature corresponding to the DR image includes:
inputting the DR image and the image features corresponding to the DR image into a target model, and outputting a DR image index reflecting the basic feature information content of the DR image. Of course, the DR image index may also be determined directly by the processor rather than by inputting into the target model, or may be obtained after processing by a processor outside the device, which is not particularly limited herein.
In step S630, a DR image index reflecting the basic feature information content of the DR image is output.
It should be noted that, many similar or identical steps of the DR image analyzing methods 500 and 600 and the DR image analyzing method 100 exist, and specific reference may be made to the above related description, which is not repeated herein.
Referring to fig. 4, an embodiment of the present application further provides an electronic device 400, which may be used to implement the above-described DR image analysis method 100, 300, 500, or 600. The electronic device 400 may be a DR device for DR imaging, such as a flat panel detector or an automatic exposure controller, a mobile DR device, or a fixed DR device, or may be another computer or terminal device, such as a mobile phone, a computer, or a palm computer, which is not specifically limited herein. The electronic device 400 comprises a memory 410 and a processor 420, the memory 410 having stored thereon a computer program to be executed by the processor 420, which, when executed by the processor, performs the steps of the DR image analysis method of any of the above embodiments. When used to implement the analysis method 100 of a DR image, the computer program stored on the memory 410, when executed by the processor 420, performs the steps of: acquiring a digital radiography DR image; extracting image features from the DR image, the image features including at least one of a gray entropy feature, a texture feature, a noise feature, a gradient feature, and a divergence feature; inputting the at least one image feature into a pre-trained network model; and outputting a DR image index reflecting the basic feature information content of the DR image. Of course, in practical application, the DR image index may also be displayed by a display device.
When used to implement the analysis method 300 of a DR image, the computer program stored on the memory 410, when executed by the processor 420, performs the steps of: acquiring a digital radiography DR image; extracting image features from the DR image, the image features including at least one of a gray entropy feature, a texture feature, a noise feature, a gradient feature, and a divergence feature; determining a DR image index reflecting the basic feature information content of the DR image from the at least one image feature; and outputting the DR image index. Other specific details of the DR image analysis method 100 and the DR image analysis method 300 may be referred to above, and are not described herein.
Wherein the processor 420 may be implemented in software, hardware, firmware, or any combination thereof, may use circuitry, single or multiple application specific integrated circuits, single or multiple general purpose integrated circuits, single or multiple microprocessors, single or multiple programmable logic devices, or any combination of the foregoing circuits and/or devices, or other suitable circuits or devices, and the processor 420 may control other components in the electronic device 400 to perform the desired functions.
The memory 410 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random access memory and/or cache memory, etc. The non-volatile memory may include, for example, read-only memory, hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer readable storage medium that can be executed by the processor 420 to implement the DR image analysis method and/or other various desired functions of the present application. Various applications and various data, such as various data used and/or generated by the applications, may also be stored in the computer readable storage medium.
Furthermore, according to an embodiment of the present application, there is also provided a computer storage medium on which program instructions are stored, which program instructions, when executed by a computer or a processor, are adapted to carry out the respective steps of the DR image analysis method of any embodiment of the present application. In some embodiments, the computer storage medium is a non-volatile computer readable storage medium that may include a storage program area that may store an operating system, application programs required for at least one function, data created during execution of program instructions, and the like. Further, the non-volatile computer-readable storage medium may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, the non-transitory computer readable storage medium optionally includes memory remotely located relative to the processor.
By way of example, the computer storage media may comprise, for example, a hard disk of a personal computer, a memory component of a tablet computer, a read-only memory (ROM), an erasable programmable read-only memory (EPROM), a portable compact disc read-only memory (CD-ROM), a USB memory, a memory card of a smart phone, or any combination of the foregoing. The computer-readable storage medium may be any combination of one or more computer-readable storage media.
In one possible implementation, referring to FIG. 7, the electronic device may be a DR imaging device 700, the DR imaging device 700 including an X-ray generator 710, a detector 720, and a processor 730 and a display 740, wherein the processor 730 is communicatively coupled to the X-ray generator 710, the detector 720, and the display 740;
The detector 720 is configured to receive X-rays after passing through the target tissue site and process the X-rays after passing through the target tissue site to obtain a digital radiography DR image, wherein the DR image includes at least one of a DR original image and an image after the DR original image is processed;
the processor 730 is configured to input at least one image feature of the gray entropy feature, texture feature, noise feature, gradient feature, and divergence feature into a target model, and output a DR image index reflecting a basic feature information content of the DR image;
the display 740 is used for displaying the DR image index reflecting the basic feature information content of the DR image.
Furthermore, according to an embodiment of the present application, there is also provided a computer program, which may be stored on a cloud or local storage medium and which, when executed by a computer or processor, is adapted to carry out the corresponding steps of the DR image analysis method of an embodiment of the application.
In summary, the analysis method and apparatus for DR image according to the present application obtain DR image index capable of objectively reflecting the basic feature information content of DR image according to the image features extracted from DR image, thereby providing an objective judgment basis for an operator.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the above illustrative embodiments are merely illustrative and are not intended to limit the scope of the present application thereto. Various changes and modifications may be made therein by one of ordinary skill in the art without departing from the scope and spirit of the application. All such changes and modifications are intended to be included within the scope of the present application as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, e.g., the division of the elements is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another device, or some features may be omitted or not performed.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in order to streamline the application and aid in understanding one or more of the various inventive aspects, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof in the description of exemplary embodiments of the application. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be combined in any combination, except combinations where the features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
Various component embodiments of the application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some of the modules in an item analysis device according to embodiments of the present application may be implemented in practice using a microprocessor or Digital Signal Processor (DSP). The present application can also be implemented as an apparatus program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present application may be stored on a computer readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. do not denote any order. These words may be interpreted as names.
The foregoing description is merely illustrative of specific embodiments of the present application and the scope of the present application is not limited thereto, and any person skilled in the art can easily think about variations or substitutions within the scope of the present application. The protection scope of the application is subject to the protection scope of the claims.

Claims (17)

1. A method of analyzing a DR image, the method comprising:
Emitting X-rays toward a target tissue site and controlling the X-rays to pass through the target tissue site;
Receiving X-rays after passing through the target tissue site;
Processing the X-rays after passing through the target tissue site to obtain a digital radiography, DR, image, wherein the DR image comprises at least one of a DR raw image and an image after the DR raw image is processed;
Extracting image features from the DR image, wherein the image features comprise at least two of gray entropy features, texture features, noise features, gradient features and divergence features, the gray entropy features are information contents obtained by statistically screening redundant gray level information sources in the DR image, the noise features comprise X-ray quantum noise, the noise features reflect fluctuation degrees of X-ray quanta in the DR image, the texture features represent the spatial change relation of different gray values on the DR image, the gradient features are used for representing the definition of different tissue boundaries in the DR image, and the divergence features are used for representing the transition intensity degree and/or trend consistency of different tissue boundaries in the DR image;
Inputting at least two image features of the gray entropy feature, the texture feature, the noise feature, the gradient feature and the divergence feature into a target model, and outputting a DR image index reflecting the basic feature information content of the DR image.
2. The method of claim 1, wherein extracting the grayscale entropy feature from the DR image comprises:
obtaining a first probability statistical distribution of each gray level information source in the DR image;
screening the first probability statistical distribution of the redundant gray level sources in the gray level sources to obtain the first probability statistical distribution of the non-redundant gray level sources;
obtaining a second probability statistical distribution of each non-redundant gray level information source according to the ratio of the first probability statistical distribution of each non-redundant gray level information source to the sum of the first probability statistical distributions of all non-redundant gray level information sources;
And calculating entropy according to the second probability statistical distribution of the non-redundant gray level information source so as to obtain the gray entropy characteristic.
3. The method of claim 1, wherein extracting the texture feature from the DR image comprises:
a texture feature description matrix is formed according to the gray values of all pixels in the DR image;
At least one two-dimensional component of the texture feature description matrix is extracted as at least one of the texture features.
4. A method as claimed in claim 3, wherein the texture features comprise at least one of:
the first texture feature is used for counting the value distribution of the texture feature description matrix and the overall distribution of texture change in the DR image;
the second texture feature is used for counting the similarity degree of the value distribution in the texture feature description matrix and the parallel and normal directions in the DR image;
the third texture feature is used for counting the uniformity degree of the value distribution in the texture feature description matrix and the gray level change distribution in the DR image;
and fourth texture features for counting the value distribution in the texture feature description matrix and the measurement of the local variation of the texture of the DR image.
5. A method as claimed in claim 3, wherein the texture feature description matrix describes the variation of grey values in different distances and different directions in the DR image.
6. The method of claim 5, further comprising super-resolution interpolating the DR image in different directions to obtain sub-pixel gray values;
and obtaining the texture feature description matrixes in different directions according to the sub-pixel gray values.
7. The method of claim 1, wherein extracting the image noise feature from the DR image comprises:
obtaining a high frequency image from the DR image;
Extracting effective information in the high-frequency image to obtain a noise distribution image, wherein the noise distribution image is an image formed by local root mean square of each pixel point in the high-frequency image;
And counting the noise value distribution in the noise distribution image to obtain the image noise characteristics.
8. The method of claim 7, wherein said counting noise value distributions in said noise value distribution image to obtain said image noise signature comprises:
determining a noise value interval with the highest noise value distribution probability in the noise distribution image;
and taking the noise value of the noise value interval with the highest noise value distribution probability as the image noise characteristic.
9. The method of claim 7, wherein said deriving a high frequency image from said DR image comprises:
Low-frequency filtering the DR image to obtain a low-frequency image, obtaining a high-frequency image based on the difference between the low-frequency image and the DR image, or
And performing high-frequency filtering on the DR image to obtain the high-frequency image.
10. The method of claim 1, wherein the target model comprises a pre-trained network model, and wherein the training process of the network model comprises:
acquiring a DR image set, wherein the DR image set comprises a plurality of DR images;
obtaining an image feature set according to the DR image set, wherein the image feature set comprises a plurality of image features extracted from a plurality of DR images;
obtaining a DR diagnostic image set according to the DR image set, wherein the DR diagnostic image set comprises a plurality of DR diagnostic images obtained by performing image processing on a plurality of DR images;
obtaining a scoring set according to the DR diagnostic image set, wherein the scoring set comprises scores obtained by evaluating a plurality of DR diagnostic images;
And training the network model by taking the image feature set and the evaluation set as training sample sets to obtain a trained network model.
11. The method of claim 1, wherein the extracting the gradient features from the DR image comprises:
Determining the regional distribution of different tissues in the DR image;
and acquiring the definition of the boundary of the distribution of the different tissue regions as the gradient characteristic.
12. The method of claim 1, wherein the extracting the divergence features from the DR image comprises:
Determining the regional distribution of different tissues in the DR image;
And obtaining the transition intensity degree and/or trend consistency of the boundaries of the distribution of different tissue areas as the divergence characteristic.
13. A method of analyzing a DR image, the method comprising:
acquiring a digital radiography (DR) image, wherein the DR image comprises at least one of a DR original image and an image obtained by processing the DR original image;
extracting image features from at least one of the DR original image and the image obtained by processing the DR original image, wherein the image features comprise at least two of a gray entropy feature, a texture feature, a noise feature, a gradient feature and a divergence feature, the gray entropy feature is the information content obtained by statistically screening out redundant gray-level information sources in the DR image, the noise feature comprises X-ray quantum noise and reflects the degree of fluctuation of X-ray quanta in the DR image, the texture feature represents the spatial variation relation of different gray values in the DR image, the gradient feature is used for representing the definition of different tissue boundaries in the DR image, and the divergence feature is used for representing the transition intensity degree and/or trend consistency of different tissue boundaries in the DR image;
determining a DR image index reflecting the basic feature information content of the DR image according to at least one of the gray entropy feature, the texture feature, the noise feature, the gradient feature and the divergence feature;
and outputting the DR image index reflecting the basic feature information content of the DR image.
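For illustration only, the gray entropy feature named in claim 13 can be approximated by the Shannon entropy of the gray-level histogram; the bin count is an assumed parameter, and the patent's own statistical screening of redundant gray-level sources may differ:

```python
import numpy as np

def gray_entropy(dr_image: np.ndarray, bins: int = 256) -> float:
    """Shannon entropy (in bits) of the gray-level histogram.

    A flat image carries no gray-level information (entropy 0); richer
    gray-level distributions raise the value.
    """
    hist, _ = np.histogram(dr_image, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # 0 * log(0) is taken as 0
    return float(-np.sum(p * np.log2(p)))
```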
14. A method of analyzing a DR image, the method comprising:
acquiring a digital radiography (DR) image and image features corresponding to the DR image, wherein the DR image comprises at least one of a DR original image and an image obtained by processing the DR original image, the image features comprise at least two of a gray entropy feature, a texture feature, a noise feature, a gradient feature and a divergence feature, the gray entropy feature is the information content obtained by statistically screening out redundant gray-level information sources in the DR image, the noise feature comprises X-ray quantum noise and reflects the degree of fluctuation of X-ray quanta in the DR image, the texture feature represents the spatial variation relation of different gray values in the DR image, the gradient feature is used for representing the definition of different tissue boundaries in the DR image, and the divergence feature is used for representing the transition intensity degree and/or trend consistency of different tissue boundaries in the DR image;
determining a DR image index reflecting the basic feature information content of the DR image according to the DR image and the image features corresponding to the DR image;
and outputting the DR image index reflecting the basic feature information content of the DR image.
15. The method of claim 14, wherein determining a DR image index reflecting a basic feature information content of the DR image from the DR image and image features corresponding to the DR image comprises:
inputting the DR image and the image characteristics corresponding to the DR image into a target model, and outputting DR image indexes reflecting the basic characteristic information content of the DR image.
16. A DR imaging apparatus, comprising an X-ray generator, a detector, a processor and a display;
the X-ray generator is used for generating X-rays, emitting the X-rays toward a target tissue site and controlling the X-rays to pass through the target tissue site;
the detector is used for receiving the X-rays passing through the target tissue site and processing them to acquire a digital radiography (DR) image, wherein the DR image comprises at least one of a DR original image and an image obtained by processing the DR original image;
the processor is used for extracting image features from the DR image, wherein the image features comprise at least two of a gray entropy feature, a texture feature, a noise feature, a gradient feature and a divergence feature; the processor is further used for inputting at least one of the gray entropy feature, the texture feature, the noise feature, the gradient feature and the divergence feature into a target model and outputting a DR image index reflecting the basic feature information content of the DR image, wherein the gray entropy feature is the information content obtained by statistically screening out redundant gray-level information sources in the DR image, the noise feature comprises X-ray quantum noise and reflects the degree of fluctuation of X-ray quanta in the DR image, the texture feature represents the spatial variation relation of different gray values in the DR image, the gradient feature is used for representing the definition of different tissue boundaries in the DR image, and the divergence feature is used for representing the transition intensity degree and/or trend consistency of different tissue boundaries in the DR image;
the display is used for displaying the DR image index reflecting the basic feature information content of the DR image.
17. An electronic device comprising a memory and a processor, the memory storing a computer program executable by the processor, wherein the computer program, when executed by the processor, performs the steps of the method of analyzing DR images according to any one of claims 1-16.
CN202111193213.7A 2020-10-13 2021-10-13 DR image analysis method and electronic device Active CN114359129B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN2020110916255 2020-10-13
CN202011091625 2020-10-13

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN202510772390.2A Division CN120807398A (en) 2020-10-13 2021-10-13 DR image analysis method and electronic device

Publications (2)

Publication Number Publication Date
CN114359129A (en) 2022-04-15
CN114359129B (en) 2025-07-01

Family

ID=81095431

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111193213.7A Active CN114359129B (en) 2020-10-13 2021-10-13 DR image analysis method and electronic device

Country Status (1)

Country Link
CN (1) CN114359129B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115089203A (en) * 2022-06-10 2022-09-23 武汉迈瑞医疗技术研究院有限公司 Analysis method of DR imaging and DR imaging equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108171272A (en) * 2018-01-12 2018-06-15 上海东软医疗科技有限公司 A kind of evaluation method and device of Medical Imaging Technology
CN110428375A (en) * 2019-07-24 2019-11-08 东软医疗系统股份有限公司 A kind of processing method and processing device of DR image

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8270695B2 (en) * 2008-10-07 2012-09-18 Carestream Health, Inc. Diagnostic image processing with automatic self image quality validation
CN109658400A (en) * 2018-12-14 2019-04-19 首都医科大学附属北京天坛医院 A kind of methods of marking and system based on head CT images
CN111353998A (en) * 2020-05-13 2020-06-30 温州医科大学附属第一医院 Tumor diagnosis and treatment prediction model and device based on artificial intelligence

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108171272A (en) * 2018-01-12 2018-06-15 上海东软医疗科技有限公司 A kind of evaluation method and device of Medical Imaging Technology
CN110428375A (en) * 2019-07-24 2019-11-08 东软医疗系统股份有限公司 A kind of processing method and processing device of DR image

Also Published As

Publication number Publication date
CN114359129A (en) 2022-04-15

Similar Documents

Publication Publication Date Title
JP6577607B2 (en) Automatic feature analysis, comparison, and anomaly detection
CN100387196C (en) Method and system for measuring tissue changes associated with disease
US11615508B2 (en) Systems and methods for consistent presentation of medical images using deep neural networks
CN104838422B (en) Image processing equipment and method
CN104000619A (en) Thyroid CT image computer-aided diagnosis system and method
US9811904B2 (en) Method and system for determining a phenotype of a neoplasm in a human or animal body
Zhang et al. A supervised texton based approach for automatic segmentation and measurement of the fetal head and femur in 2D ultrasound images
EP3471054B1 (en) Method for determining at least one object feature of an object
US7873196B2 (en) Medical imaging visibility index system and method for cancer lesions
CN105488781A (en) Dividing method based on CT image liver tumor focus
Bortsova et al. Automated segmentation and volume measurement of intracranial internal carotid artery calcification at noncontrast CT
Koprowski et al. Assessment of significance of features acquired from thyroid ultrasonograms in Hashimoto's disease
Petrov et al. Model and human observer reproducibility for detection of microcalcification clusters in digital breast tomosynthesis images of three-dimensionally structured test object
Rebouças Filho et al. 3D segmentation and visualization of lung and its structures using CT images of the thorax
CN114359129B (en) DR image analysis method and electronic device
CN115089203A (en) Analysis method of DR imaging and DR imaging equipment
Dovganich et al. Automatic quality control in lung X-ray imaging with deep learning
Zhang et al. Retinal vessel segmentation using Gabor filter and textons
CN120807398A (en) DR image analysis method and electronic device
CN113222985B (en) Image processing method, image processing device, computer equipment and medium
Hernandez et al. Image analysis tool with laws' masks to bone texture
CN119523500A (en) DR image analysis method and DR imaging system
CN119523516A (en) DR characteristic index calibration method and DR imaging system
Stoican et al. Spine Detection and Vertebral Body Segmentation in the PCdare Software with Machine Learning
CN119516218A (en) Statistical analysis method of DR images and DR imaging system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant