Detailed Description
In order to make the objects, technical solutions, and advantages of the present application more apparent, exemplary embodiments according to the present application will be described in detail below with reference to the accompanying drawings. It should be apparent that the described embodiments are only some, not all, of the embodiments of the present application, and it should be understood that the present application is not limited by the exemplary embodiments described herein. All other embodiments obtained without inventive effort by a person of ordinary skill in the art, based on the embodiments described in the present application, shall fall within the scope of protection of the present application.
In the following description, numerous specific details are set forth in order to provide a more thorough understanding of the present application. It will be apparent, however, to one skilled in the art that the application may be practiced without one or more of these details. In other instances, well-known features have not been described in detail in order to avoid obscuring the application.
It should be understood that the present application may be embodied in various forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the application to those skilled in the art.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It will be further understood that the terms "comprises" and/or "comprising," when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof. As used herein, the term "and/or" includes any and all combinations of the associated listed items.
In order to provide a thorough understanding of the present application, detailed structures will be presented in the following description in order to illustrate the technical solutions presented by the present application. Alternative embodiments of the application are described in detail below, however, the application may have other implementations in addition to these detailed descriptions.
Next, a method 100 of analyzing a DR image according to an embodiment of the present application is described with reference to fig. 1. The analysis method can be applied to DR imaging equipment and to other electronic equipment, such as computers and smart terminals. As shown in fig. 1, the analysis method 100 of a DR image may include the following steps:
in step S110, a digital radiography DR image is acquired;
In the embodiment of the present application, the DR image may be acquired locally by the device or acquired from another, external device, which is not specifically limited herein. Local acquisition may mean that the device acquires the DR image in real time, or that the device acquires the DR image in non-real time and stores it locally.
In one possible implementation manner, the specific process by which the device acquires the DR image locally is as follows:
the method includes the steps of emitting X-rays to a target tissue site, controlling the X-rays to pass through the target tissue site, receiving the X-rays after passing through the target tissue site, and processing the X-rays after passing through the target tissue site to acquire a digital radiography DR image. Wherein the target tissue site may be a tissue site of a human or other animal body to be examined. Such as the head, abdomen, etc.
In the present application, the DR image includes at least one of a DR original image and an image obtained by processing the DR original image. The DR original image is a gray-scale image that has not undergone post-processing such as contrast adjustment or brightness adjustment; for example, the DR original image may be digital image information obtained by converting X-rays into visible light and converting the visible light into an electrical signal. The image obtained by processing the DR original image includes an image processed by a certain transformation relationship, or by image contrast or brightness adjustment, and the like; that is, it may be regarded as the DR original image after any processing, which is not particularly limited herein.
In step S120, image features are extracted from the DR image, the image features including at least one of a gray entropy feature, a texture feature, a noise feature, a gradient feature, and a divergence feature;
At step S130, at least one image feature of the gray entropy feature, texture feature, noise feature, gradient feature, and divergence feature is input into a target model, and a DR image index reflecting the basic feature information content of the DR image is output.
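The data flow of steps S110–S130 can be sketched end to end as follows. Here `extract_features` and `target_model` are hypothetical stand-ins (a real system would use the feature extractors and trained model described later in this section); the sketch only illustrates how an acquired image becomes a DR image index.

```python
import numpy as np

def extract_features(dr_image):
    """Toy stand-in for step S120: reduce the image to summary statistics.

    The real extractors (gray entropy, texture, noise, gradient, divergence
    features) are sketched in later sections."""
    flat = dr_image.astype(np.float64).ravel()
    return {"mean": float(flat.mean()), "std": float(flat.std())}

def target_model(features):
    """Placeholder for the trained target model of step S130: maps the
    extracted features to a single scalar DR image index."""
    return features["std"] / (features["mean"] + 1e-9)

def analyze_dr_image(dr_image):
    features = extract_features(dr_image)  # step S120
    return target_model(features)          # step S130

# Step S110 stands in for acquisition: here, a synthetic 4x4 "image".
index = analyze_dr_image(np.arange(16, dtype=np.float64).reshape(4, 4))
```

The placeholder model and feature names are assumptions for illustration only; only the three-step structure mirrors the method.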
In the analysis method 100 of a DR image according to the embodiment of the present application, at least one of a gray entropy feature, a texture feature, a noise feature, a gradient feature, and a divergence feature is extracted from the DR image, and a DR image index capable of objectively reflecting the basic feature information content of the DR image, also referred to as an image feature index (Image Feature Index, IFI), is obtained according to the extracted at least one image feature. The DR image index can provide an objective judgment basis for an operator and can guide directions such as in-hospital quality control and dose-reduced imaging.
Specifically, the DR image may be a raw image acquired by the DR imaging system. In the present application, the original image may be an image displayed on a display, or may be original image data or raw data, which is not particularly limited herein. The DR system mainly comprises an X-ray generator, a detector (such as a flat panel detector), a workstation, a mechanical device, and the like. The working process is mainly as follows: X-rays generated by the X-ray generator pass through a target part of the examined object and are attenuated; the attenuated X-rays are projected onto the detector; the flat panel detector converts the X-rays into visible light and then converts the visible light into an electrical signal, thereby obtaining digital image information; and the digital image information is synchronously transmitted to the workstation. In addition, the DR image may be an image obtained by processing the DR original image, for example, one post-processed by the workstation to finally obtain the processed image.
In step S120, image features including at least one of gray entropy features, texture features, noise features, gradient features, and divergence features are extracted from the DR image. That is, the image features extracted in step S120 may be all of the gray entropy, texture, noise, gradient, and divergence features, or any one or more of them. The gray entropy feature, texture feature, noise feature, gradient feature, and divergence feature respectively reflect the gray-scale, texture, noise, gradient, and divergence information of the DR image, providing multi-dimensional image features, so that a more accurate DR image index can be obtained.
In one embodiment, the gray entropy feature is the information content obtained by statistically screening out redundant gray-level sources in the DR image. According to the transfer characteristics of X-ray quanta converted into digital signals by the detector of the DR system, each individual gray level in the DR image may be regarded as an information source, and the DR imaging process may be regarded as a process of transferring information through these gray-level sources. Among all gray levels, some gray levels have no pixels taking that gray value, and these are regarded as redundant gray-level sources. Because the extraction process of the gray entropy feature screens out the redundant gray-level sources, more effective image information can be extracted.
Illustratively, when the gray entropy feature is the information content obtained by statistically screening redundant gray level sources in the DR image, the gray entropy feature may be extracted from the DR image in the following manner:
Firstly, a first probability statistical distribution pH(i) of each gray-level source in the DR image is obtained, where i is an image gray value of the DR image, and the first probability statistical distributions of the redundant gray-level sources among the gray-level sources are screened out to obtain the first probability statistical distribution pNH(i) of the non-redundant gray-level sources, namely:
pNH(i) = {pH(i) | pH(i) > 0}    formula (1)
obtaining a second probability statistical distribution pH' (i) of each non-redundant gray scale source according to the ratio of the first probability statistical distribution pNH (i) of each non-redundant gray scale source to the sum of the first probability statistical distributions sigma pNH (i) of all the non-redundant gray scale sources:
pH'(i) = pNH(i) / ΣpNH(i)    formula (2)
Finally, entropy is calculated according to the second probability statistical distribution pH'(i) of the non-redundant gray-level sources to obtain the gray entropy feature H(Image), where the entropy calculation method includes, but is not limited to, taking the natural logarithm ln based on e or taking a logarithm based on another value, and is expressed as:
H(Image) = -Σ pH'(i) · ln pH'(i)    formula (3)
Therefore, the image gray entropy characteristics after redundant gray level information sources are screened out can be obtained.
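The gray entropy extraction described above — gray-level statistics, screening of redundant sources, renormalization, and entropy — can be sketched directly:

```python
import math

def gray_entropy(image, num_levels=256):
    """Gray entropy after screening redundant (empty) gray-level sources."""
    # First probability statistical distribution pH(i) over all gray levels.
    counts = [0] * num_levels
    total = 0
    for row in image:
        for v in row:
            counts[v] += 1
            total += 1
    pH = [c / total for c in counts]
    # Screen out redundant sources: keep only levels that actually occur.
    pNH = [p for p in pH if p > 0]
    # Renormalize by the sum of the non-redundant distribution (formula (2));
    # for a simple histogram this sum is already 1, so this is shown only
    # for fidelity to the described procedure.
    s = sum(pNH)
    pH2 = [p / s for p in pNH]
    # Entropy with the natural logarithm.
    return -sum(p * math.log(p) for p in pH2)

img = [[0, 0, 1, 1], [2, 2, 3, 3]]
h = gray_entropy(img)  # 4 equally likely levels -> ln(4)
```

The 256-level default and the toy image are illustrative; real DR images typically use higher bit depths.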
The texture feature represents the spatial variation relationship of different gray values in the DR image and can reflect abstract features of the variation law of human tissue. Illustratively, a statistical method may be employed to extract texture features of the DR image, which yields statistical characteristics of texture regions based on the gray-scale properties of pixels and their neighborhoods. The statistical method for extracting texture features is mainly described below, but texture features may also be extracted by geometric methods, model methods, or other suitable texture feature extraction methods.
When a statistical method is used to extract texture features from a DR image, a texture feature description matrix is first extracted from the gray values of pixels in the DR image, and then at least one two-dimensional component of the texture feature description matrix is extracted as at least one texture feature.
In one embodiment, the texture feature description matrix describes the variation of gray values over different distances and different directions in the DR image. Let the DR image be Image = f(x, y); the texture feature description matrix P(i, j) is then:
P(i, j) = #{((x1, y1), (x2, y2)) ∈ M × N | f(x1, y1) = i, f(x2, y2) = j}    formula (4)
Where #(X) represents the number of elements in the set X. Assuming that the distance between the two points (x1, y1) and (x2, y2) in the image is k, the texture feature description matrix in different directions l can be expanded to P(i, j, k, l). By counting the variation of gray values over different distances and different directions, the expanded texture feature description matrix captures information in more dimensions. For example, in order to calculate the actual feature values at different angles, super-resolution interpolation may be performed on the DR image in each direction to obtain the sub-pixel gray value corresponding to each new target position, and the texture feature description matrices in different directions may be obtained from these sub-pixel gray values.
After the texture feature description matrix is obtained, at least one two-dimensional component of the texture feature description matrix is extracted to obtain texture features, and the key characteristics of the texture feature description matrix are reflected by the texture features. Illustratively, the texture feature comprises at least one of:
A first texture feature P1, used for counting the value distribution in the texture feature description matrix; the size of the P1 value can reflect the sharpness of the DR image and the groove depth of the texture: the deeper the texture grooves, the larger the P1 value, and the shallower the grooves, the smaller the P1 value;
A second texture feature P2, used for counting the similarity of the value distribution in the texture feature description matrix between the parallel and normal directions in the DR image; the larger the P2 value, the greater the similarity of the image gray levels in different directions;
A third texture feature P3, used for counting the uniformity of the value distribution in the texture feature description matrix and the gray-level variation distribution in the DR image; the size of the P3 value can reflect the uniformity of the image gray-level distribution and the coarseness of the texture: the larger the P3 value, the more stable the texture variation of the DR image;
A fourth texture feature P4, used as a measure of the local variation of the texture of the DR image based on the value distribution in the texture feature description matrix; the larger the P4 value, the stronger the regularity of the texture.
The texture features extracted from the texture feature description matrix are not limited to the above four, and in other embodiments, other two-dimensional components of the texture feature description matrix may be extracted as texture features.
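The texture feature description matrix and two of its two-dimensional components can be sketched as follows. Since P1–P4 are described only abstractly, the energy-like and contrast-like statistics below are common gray-level co-occurrence components used here as illustrative stand-ins, not the exact components of the text:

```python
import numpy as np

def cooccurrence_matrix(image, dx=1, dy=0, levels=4):
    """Texture feature description matrix P(i, j) for one pixel offset
    (dx, dy): counts pixel pairs whose gray values are i and j."""
    img = np.asarray(image)
    h, w = img.shape
    P = np.zeros((levels, levels), dtype=np.float64)
    for y in range(h):
        for x in range(w):
            x2, y2 = x + dx, y + dy
            if 0 <= x2 < w and 0 <= y2 < h:
                P[img[y, x], img[y2, x2]] += 1
    return P / P.sum()  # normalize counts to a joint probability

def texture_components(P):
    """Two illustrative two-dimensional components of P."""
    i, j = np.indices(P.shape)
    energy = float((P ** 2).sum())                # uniformity of distribution
    contrast = float((P * (i - j) ** 2).sum())    # local gray-value variation
    return energy, contrast

img = [[0, 0, 1, 1],
       [0, 0, 1, 1],
       [2, 2, 3, 3],
       [2, 2, 3, 3]]
energy, contrast = texture_components(cooccurrence_matrix(img))
```

Different offsets (dx, dy) realize the different distances k and directions l described above.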
Image noise mainly comprises X-ray quantum noise. The distribution of X-ray quanta obeys a Poisson distribution, whose variance is proportional to the average detected quantum count. According to this characteristic of X-ray quantum noise, the degree of fluctuation of the X-ray quanta can be counted from the DR image and used as the image noise feature; that is, the image noise feature reflects the degree of fluctuation of the X-ray quanta in the DR image.
In one embodiment, extracting the image noise feature from the DR image includes the steps of: obtaining a high-frequency image from the DR image, the high-frequency image containing primarily noise information; extracting the valid information in the high-frequency image to obtain a noise distribution image; and counting the noise value distribution in the noise distribution image to obtain the image noise feature.
As one implementation, obtaining the high-frequency image from the DR image comprises performing low-pass filtering on the DR image I to remove its high-frequency components, so as to obtain a low-frequency image I1, and obtaining the high-frequency image I2 from the difference between the DR image and the low-frequency image. In other implementations, the DR image I may also be directly subjected to high-pass filtering to obtain the high-frequency image I2.
Illustratively, the DR image I may be subjected to Gaussian low-pass filtering to obtain the low-frequency image I1. Since the image is a two-dimensional signal, a two-dimensional Gaussian function is adopted for the Gaussian low-pass filtering, and the two-dimensional Gaussian filter kernel is:
G(x, y) = (1 / (2πσ²)) · exp(-(x² + y²) / (2σ²))
where σ is the standard deviation of the Gaussian function.
The high-frequency image I2 can be obtained by calculating the difference between the DR image I and the low-frequency image I1, i.e., I2 = I - I1. The noise distribution image I3 can then be obtained by extracting the valid information in the high-frequency image I2. In one embodiment, the noise distribution image I3 is an image formed by the local root mean square of each pixel in the high-frequency image, and the method for calculating the local root mean square includes using an L1 norm or another approximate or equivalent measure, for example:
I3(i, j) = sqrt( (1/N) · Σ I2(l, k)² ), the sum running over the N pixel points (l, k) in a local window centered at (i, j)
Where I3 (I, j) is the value of the noise distribution image I3 at the pixel point (I, j), and I2 (l, k) is the value of the high frequency I2 at the pixel point (l, k).
After the noise distribution image I3 is acquired, the noise value distribution is counted to obtain the image noise characteristics. For example, a noise value section having the highest noise value distribution probability in the noise distribution image may be determined, and a noise value of the noise value section having the highest noise value distribution probability may be taken as the image noise characteristic.
Specifically, the pixel value range of the noise distribution image I3 is first divided into M bins, where the pixel value interval of each bin is d. A histogram vector h of length M is initialized, each component h(i) of which represents the number of pixel values falling in the i-th bin. For a pixel point (i, j) in the noise distribution image I3, the bin corresponding to the pixel value at that point is R[I3(i, j)]. Each pixel point of the noise distribution image I3 is traversed, and h(R[I3(i, j)]) is counted to obtain h. After h is obtained, the main peak value is max(h); with the corresponding bin R0 = argmax(h), the noise value R0 × d is calculated as the image noise feature.
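The noise-feature pipeline described above (Gaussian low-pass filtering, high-frequency residual, local root mean square, histogram mode) can be sketched as follows. The 5×5 kernel, 3×3 window, and 32-bin histogram are illustrative choices not fixed by the text, and the plain-Python convolution stands in for an optimized library filter:

```python
import numpy as np

def gaussian_kernel(size=5, sigma=1.0):
    """Two-dimensional Gaussian low-pass kernel, normalized to sum 1."""
    ax = np.arange(size) - size // 2
    xx, yy = np.meshgrid(ax, ax)
    k = np.exp(-(xx ** 2 + yy ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def convolve2d_same(img, k):
    """Minimal 'same'-size convolution with edge padding (a stand-in for
    scipy.ndimage filtering, kept dependency-free)."""
    ph, pw = k.shape[0] // 2, k.shape[1] // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros(img.shape, dtype=np.float64)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            out[y, x] = (padded[y:y + k.shape[0], x:x + k.shape[1]] * k).sum()
    return out

def noise_feature(I, window=3, num_bins=32):
    I = np.asarray(I, dtype=np.float64)
    I1 = convolve2d_same(I, gaussian_kernel())  # low-frequency image I1
    I2 = I - I1                                 # high-frequency image I2
    # Noise distribution image I3: local root mean square of I2.
    box = np.ones((window, window)) / window ** 2
    I3 = np.sqrt(convolve2d_same(I2 ** 2, box))
    # Histogram the noise values; return the center of the most populated
    # bin as the image noise feature (the R0 * d of the text).
    h, edges = np.histogram(I3, bins=num_bins)
    r0 = int(np.argmax(h))
    return 0.5 * (edges[r0] + edges[r0 + 1])

rng = np.random.default_rng(0)
feat = noise_feature(rng.normal(100.0, 2.0, size=(32, 32)))
```

On the synthetic white-noise image, the feature lands near the residual noise level rather than the image mean, as intended.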
In one embodiment, the gradient feature is used to characterize the sharpness of different tissue boundaries in the DR image. Extracting the gradient feature from the DR image comprises determining the region distribution of different tissues in the DR image, and obtaining the sharpness of the boundaries of these region distributions as the gradient feature.
Illustratively, according to the difference in absorption coefficients of X-rays penetrating different tissues of a human body, region distributions of the different tissues are formed in the DR image, and boundaries of different degrees exist between the different tissue regions; the sharpness of these boundaries can be quantified by extracting the gradient feature. Specifically, it can be calculated according to the following formula:
Grad(x, y) = sqrt( (∂Image(x, y)/∂x)² + (∂Image(x, y)/∂y)² )
Where Grad(x, y) is used to characterize the gradient magnitude at coordinates (x, y), and Image(x, y) is used to characterize the pixel value at coordinates (x, y).
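A minimal sketch of the gradient feature, computing the gradient magnitude by finite differences; summarizing the map by its mean is an assumption, since the text does not fix how the per-pixel magnitudes are aggregated:

```python
import numpy as np

def gradient_map(image):
    """Gradient magnitude Grad(x, y) via central differences."""
    img = np.asarray(image, dtype=np.float64)
    gy, gx = np.gradient(img)  # derivatives along rows and columns
    return np.sqrt(gx ** 2 + gy ** 2)

def gradient_feature(image):
    """Mean gradient magnitude as a scalar boundary-sharpness feature
    (aggregation rule is an illustrative assumption)."""
    return float(gradient_map(image).mean())

# A vertical step edge: left half 0, right half 10.
img = np.zeros((4, 8))
img[:, 4:] = 10.0
g = gradient_feature(img)
```

A sharper tissue boundary concentrates larger magnitudes along the edge, raising the feature value.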
In one embodiment, the divergence feature is used to characterize the transition intensity and/or trend consistency of different tissue boundaries in the DR image. Extracting the divergence feature from the DR image comprises determining the region distribution of different tissues in the DR image, and obtaining the transition intensity and/or trend consistency of the boundaries of these region distributions as the divergence feature.
Illustratively, in DR images, the boundaries of different tissues are not strictly defined; rather, there is a degree of transition, whose intensity and/or trend consistency may be quantified by extracting the divergence feature. Specifically, it can be calculated according to the following formula:
Diver(x, y) = arctan( (∂Image(x, y)/∂y) / (∂Image(x, y)/∂x) )
Where Diver(x, y) is used to characterize the magnitude of the divergence at coordinates (x, y), Image(x, y) is used to characterize the pixel value at coordinates (x, y), and arctan represents the arctangent function.
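Correspondingly, a sketch of the divergence feature, reading the arctangent as the gradient direction and summarizing trend consistency by the resultant length of the direction field; both readings are assumptions, since the text leaves the exact definition open:

```python
import numpy as np

def divergence_map(image):
    """Per-pixel gradient direction via the arctangent (arctan2 is used
    to avoid division by zero where the x-derivative vanishes)."""
    img = np.asarray(image, dtype=np.float64)
    gy, gx = np.gradient(img)
    return np.arctan2(gy, gx)

def divergence_feature(image):
    """Resultant length of the unit direction vectors as a scalar
    trend-consistency feature: 1 means a perfectly consistent trend
    (aggregation rule is an illustrative assumption)."""
    theta = divergence_map(image)
    return float(np.hypot(np.cos(theta).mean(), np.sin(theta).mean()))

# A uniform ramp: the gradient direction is identical everywhere.
x = np.tile(np.arange(8, dtype=np.float64), (4, 1))
consistency = divergence_feature(x)
```

A smooth, consistent tissue-boundary transition drives the feature toward 1; an erratic boundary drives it toward 0.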
In some embodiments, in addition to at least one of the above image features, other image features may be extracted from the DR image for obtaining the DR image index, for example, gray gradient, gray average, variance, pixel value information, signal-to-noise ratio, contrast-to-noise ratio, and so on, which may be specific according to the actual situation, and is not limited herein.
In step S130, as shown in fig. 2, at least one image feature of the gray entropy feature, texture feature, noise feature, gradient feature, and divergence feature is input into the target model, and a DR image index reflecting the basic feature information content of the DR image is output. The target model may be a trained model or an untrained model. The target model can output DR image indexes reflecting the basic characteristic information content of the DR image. In one possible implementation manner, the target model comprises a pre-trained network model, wherein the network model can be trained in an offline mode, and the training method mainly comprises the following steps:
First, a DR image set including a plurality of DR images is acquired. An image feature set may be derived from a DR image set, the image feature set comprising a plurality of image features extracted from a plurality of the DR images. Specifically, at least one image feature of gray entropy features, texture features, noise features, gradient features, and divergence features is extracted for each DR image in the DR image set, and a plurality of image features extracted from a plurality of DR images in the DR image set collectively constitute an image feature set, each DR image in the DR image set corresponding to at least one image feature in the image feature set.
And obtaining a DR diagnostic image set according to the DR image set, wherein the DR diagnostic image set comprises a plurality of DR diagnostic images obtained by performing image processing on the plurality of DR images. Specifically, the DR image processing system may perform image processing on a plurality of DR images in the DR image set, respectively, to obtain a plurality of DR diagnostic images, that is, visual images for diagnosis by a doctor.
And obtaining a scoring set according to the DR diagnostic image set, wherein the scoring set comprises scores obtained by evaluating the plurality of DR diagnostic images. In some embodiments, clinical experts may evaluate the DR diagnostic images in the DR diagnostic image set according to their quality and give expert scores: the higher the score of a DR diagnostic image, the more basic feature information is contained in the corresponding DR image; conversely, the lower the score, the less basic feature information is contained. The expert scores of the DR diagnostic images constitute the expert scoring set.
Then, the network model is trained using the acquired image feature set and scoring set as the training sample set, so as to obtain a trained network model. The network model may be a traditional machine learning model or a deep learning model, including a neural network, a support vector machine, linear discriminant analysis, and the like, and the training method includes model training methods such as linear regression and gradient descent. For example, an optimal mapping function from image features to scores may be learned such that the DR image index produced by the feature mapping has minimal error relative to the actually calibrated expert scores. Executing this optimal mapping function on the image features acquired in step S120 yields a prediction of the DR image index closest to the expert score.
In one embodiment, when classification-regression training is performed using the DR image feature set and the expert scoring set, the classification-regression training formula is:
f(x) = Σ αi · k(x, xi) + b
Where x is the feature vector, {xi}, i = 1, ..., m are the support vectors, {αi} are weighting coefficients, b is a bias, and k is a kernel function. Each component of the feature vector x is obtained through a linear transfer function:
xj = wj · x̂j + sj
where x̂ is the input feature vector, and w and s are scaling and translation parameter vectors, respectively. The kernel function k may employ, for example, a radial basis function (Gaussian) kernel or a polynomial kernel.
It should be noted that the present application does not limit the linear transfer function, kernel function or parameters used in the regression training method.
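As a minimal, runnable sketch of the offline training described above, the feature-to-score mapping can be fitted by ordinary least squares — linear regression being one of the training methods the text names; the kernel support vector regression of the formulas would slot into the same pipeline. The toy feature vectors and expert scores below are purely illustrative:

```python
import numpy as np

def train_index_model(feature_set, scoring_set):
    """Learn a least-squares linear mapping from image features to expert
    scores, returning a predictor for the DR image index."""
    X = np.asarray(feature_set, dtype=np.float64)
    y = np.asarray(scoring_set, dtype=np.float64)
    A = np.hstack([X, np.ones((X.shape[0], 1))])  # append a bias column
    w, *_ = np.linalg.lstsq(A, y, rcond=None)     # minimize mapping error

    def predict(features):
        f = np.append(np.asarray(features, dtype=np.float64), 1.0)
        return float(f @ w)

    return predict

# Toy training set: 4 images, 2 features each, with expert scores that
# happen to be an exact linear function of the features (2*f1 + 3*f2).
features = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0], [2.0, 1.0]]
scores = [2.0, 3.0, 5.0, 7.0]
model = train_index_model(features, scores)
index = model([3.0, 2.0])  # expected near 2*3 + 3*2 = 12
```

In practice the features would be those of step S120 and the scores the calibrated expert scores, with the model choice left open as the text states.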
After the trained network model is obtained in the model training stage, in actual application, at least one of the gray entropy, texture, noise, gradient, and divergence features extracted in step S120 is input into the trained network model to obtain a DR image index reflecting the basic feature information content of the DR image, and the DR image index can be displayed by a display device or otherwise output.
Based on the above description, the analysis method of a DR image according to the embodiment of the present application extracts at least one of a gray entropy feature, a texture feature, a noise feature, a gradient feature, and a divergence feature from the DR image, and obtains, according to the extracted at least one image feature, a DR image index capable of objectively reflecting the basic feature information content of the DR image. The DR image index can provide an objective judgment basis for an operator and can guide directions such as in-hospital quality control and dose-reduced imaging.
Next, a method of analyzing a DR image according to another embodiment of the present application is described with reference to fig. 3. Fig. 3 is a schematic flow chart of a method 300 of analyzing DR images according to an embodiment of the application.
As shown in fig. 3, the analysis method 300 of the DR image includes the following steps:
In step S310, a digital radiography DR image is acquired, wherein the DR image includes at least one of a DR original image and an image after the DR original image is processed;
extracting image features from the DR image, the image features including at least one of gray entropy features, texture features, noise features, gradient features, and divergence features at step S320;
in step S330, a DR image index reflecting a basic feature information content of the DR image is determined according to at least one image feature of the gray entropy feature, texture feature, noise feature, gradient feature, and divergence feature;
In step S330, the DR image index reflecting the basic feature information content of the DR image may be determined according to at least one image feature selected from the gray entropy feature, the texture feature, the noise feature, the gradient feature, and the divergence feature. The DR image index may be determined by inputting the image features into a target model as shown in fig. 1 and fig. 2, may be computed by another processor on the device, or may be obtained after processing by another processor outside the device, which is not specifically limited herein.
In step S340, a DR image index reflecting the basic feature information content of the DR image is output.
The analysis method 300 of a DR image is similar to the analysis method 100 of a DR image described above, but the analysis method 300 is not limited to a specific manner of determining the DR image index from the image features. After at least one of the gray entropy, texture, noise, gradient, and divergence features is extracted, the DR image index may be obtained by directly processing and calculating on these features, may be obtained using the trained network model as described above, or may be obtained in any other suitable manner. In step S340, the DR image index may be output by a display device, or by a speaker, a printer, or another output device, which is not particularly limited herein.
The steps of the DR image analysis method 300 are the same as or similar to those of the DR image analysis method 100, and detailed descriptions thereof are omitted herein.
Based on the above description, the analysis method 300 for a DR image according to an embodiment of the present application extracts at least one of a gray entropy feature, a texture feature, a noise feature, a gradient feature, and a divergence feature from the DR image, and obtains a DR image index capable of objectively reflecting a basic feature information content of the DR image according to the extracted at least one image feature, where the DR image index can provide an objective judgment basis for an operator, and can direct directions such as in-hospital quality control and dose-reducing imaging.
As shown in fig. 5, the analysis method 500 of the DR image includes the steps of:
in step S510, a digital radiography DR image is acquired, wherein the DR image includes at least one of a DR original image and an image after the DR original image is processed;
In step S520, a DR image index reflecting the basic feature information content of the DR image is determined according to at least one of the DR original image and the image after the DR original image is processed;
It should be noted that, in the analysis method 500 of the DR image, the DR image index reflecting the content of the basic feature information of the DR image may be directly determined according to the image without extracting the image feature in the DR image.
In some possible implementations, determining the DR image index reflecting the basic feature information content of the DR image from at least one of the DR original image and the processed image includes inputting the at least one image into a target model and outputting, through the target model, the DR image index reflecting the basic feature information content of the DR image. Of course, the DR image index may also be determined without inputting into a target model, for example by another processor on the device or by a processor outside the device, which is not particularly limited herein.
In step S530, a DR image index reflecting the basic feature information content of the DR image is output.
As shown in fig. 6, the analysis method 600 of the DR image includes the steps of:
in step S610, acquiring a digital radiography DR image and image features corresponding to the DR image, where the DR image includes at least one of a DR original image and an image of the DR original image after being processed, and the image features include at least one of a gray entropy feature, a texture feature, a noise feature, a gradient feature, and a divergence feature;
In step S620, a DR image index reflecting the content of basic feature information of the DR image is determined according to the DR image and the image features corresponding to the DR image;
In the DR image analysis method 600, the DR image index is determined from two inputs, namely the DR image and the image features corresponding to the DR image, rather than from the DR image alone or from the image features alone; the DR image index obtained by this processing is therefore relatively accurate.
In one possible implementation manner, determining a DR image index reflecting a content of basic feature information of the DR image according to the DR image and an image feature corresponding to the DR image includes:
inputting the DR image and the image features corresponding to the DR image into a target model, and outputting a DR image index reflecting the basic feature information content of the DR image. Of course, the DR image index may also be determined without inputting into a target model, for example by another processor on the device or by a processor outside the device, which is not particularly limited herein.
In step S630, a DR image index reflecting the basic feature information content of the DR image is output.
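By way of a non-limiting illustration of the feature extraction named in step S610, the gray entropy, gradient, and noise features could be computed along the following lines. The application does not fix exact definitions, so the formulas below (histogram entropy, mean gradient magnitude, neighbour-residual noise estimate) and all names are assumptions for this sketch only:

```python
import numpy as np

def gray_entropy(image, bins=256):
    # Shannon entropy of the gray-level histogram -- one plausible
    # reading of the "gray entropy feature".
    hist, _ = np.histogram(image, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def gradient_feature(image):
    # Mean gradient magnitude as a simple gradient feature.
    gy, gx = np.gradient(image.astype(np.float64))
    return float(np.mean(np.hypot(gx, gy)))

def noise_feature(image):
    # Standard deviation of the residual against a 4-neighbour
    # average, a crude stand-in for a noise estimate.
    img = image.astype(np.float64)
    avg = (np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1)) / 4.0
    return float(np.std(img - avg))

# Synthetic 64x64 array standing in for a real DR image.
rng = np.random.default_rng(0)
dr_image = rng.integers(0, 256, size=(64, 64))
features = {
    "gray_entropy": gray_entropy(dr_image),
    "gradient": gradient_feature(dr_image),
    "noise": noise_feature(dr_image),
}
```

Each function maps the whole image to one scalar, so the resulting feature vector is compact enough to feed into a model alongside the image itself.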
It should be noted that the DR image analysis methods 500 and 600 share many similar or identical steps with the DR image analysis method 100; for details, reference may be made to the related description above, which is not repeated herein.
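As a sketch of the implementation in which the DR image and its image features are jointly input into a target model, the stand-in below replaces the trained target model with a simple weighted combination. The real model of the application would be a trained network; the function name, the weighting scheme, and the example feature values are all illustrative assumptions:

```python
import numpy as np

def target_model(image, feature_vector, weights=None):
    # Stand-in for the trained target model: it fuses a coarse
    # image statistic (mean gray level) with the extracted feature
    # vector into one scalar DR image index.
    inputs = np.concatenate(([float(np.mean(image))], feature_vector))
    if weights is None:
        weights = np.ones_like(inputs)
    return float(np.dot(weights, inputs) / inputs.size)

rng = np.random.default_rng(1)
dr_image = rng.integers(0, 256, size=(32, 32)).astype(np.float64)
image_features = np.array([7.5, 12.0, 3.2])  # e.g. entropy, gradient, noise
dr_image_index = target_model(dr_image, image_features)
```

The point of the sketch is the interface, not the arithmetic: the model receives both the image and the feature vector, consistent with step S620 determining the index from the two inputs together.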
Referring to FIG. 4, an embodiment of the present application further provides an electronic device 400, which may be used to implement the above-described DR image analysis method 100, 300, 500, or 600. The electronic device 400 may be a DR device for DR imaging, such as a flat panel detector or an automatic exposure controller, a mobile DR device, or a fixed DR device, or may be another computer or terminal device, such as a mobile phone, a computer, or a palmtop computer, which is not specifically limited herein. The electronic device 400 includes a memory 410 and a processor 420, the memory 410 having stored thereon a computer program to be executed by the processor 420, which, when executed by the processor 420, performs the steps of the DR image analysis method 100 or 300. When used to implement the DR image analysis method 100, the computer program stored on the memory 410, when executed by the processor 420, performs the steps of: acquiring a digital radiography DR image; extracting image features from the DR image, the image features including at least one of a gray entropy feature, a texture feature, a noise feature, a gradient feature, and a divergence feature; inputting the at least one of the gray entropy feature, the texture feature, the noise feature, the gradient feature, and the divergence feature into a pre-trained network model; and outputting a DR image index reflecting the basic feature information content of the DR image. Of course, in practical applications, the DR image index may also be displayed by a display device.
When used to implement the DR image analysis method 300, the computer program stored on the memory 410, when executed by the processor 420, performs the steps of: acquiring a digital radiography DR image; extracting image features from the DR image, the image features including at least one of a gray entropy feature, a texture feature, a noise feature, a gradient feature, and a divergence feature; determining a DR image index reflecting the basic feature information content of the DR image from the at least one of the gray entropy feature, the texture feature, the noise feature, the gradient feature, and the divergence feature; and outputting the DR image index reflecting the basic feature information content of the DR image. Other specific details of the DR image analysis methods 100 and 300 may be found above and are not repeated herein.
The processor 420 may be implemented in software, hardware, firmware, or any combination thereof, and may employ circuitry such as one or more application-specific integrated circuits, one or more general-purpose integrated circuits, one or more microprocessors, one or more programmable logic devices, any combination of the foregoing circuits and/or devices, or other suitable circuits or devices. The processor 420 may control other components in the electronic device 400 to perform the desired functions.
The memory 410 may include one or more computer program products that may include various forms of computer-readable storage media, such as volatile memory and/or non-volatile memory. The volatile memory may include, for example, random access memory and/or cache memory, etc. The non-volatile memory may include, for example, read-only memory, hard disk, flash memory, etc. One or more computer program instructions may be stored on the computer readable storage medium that can be executed by the processor 420 to implement the DR image analysis method and/or other various desired functions of the present application. Various applications and various data, such as various data used and/or generated by the applications, may also be stored in the computer readable storage medium.
Furthermore, according to an embodiment of the present application, there is also provided a computer storage medium on which program instructions are stored, which program instructions, when executed by a computer or a processor, are adapted to carry out the respective steps of the DR image analysis method of any embodiment of the present application. In some embodiments, the computer storage medium is a non-volatile computer readable storage medium that may include a storage program area that may store an operating system, application programs required for at least one function, data created during execution of program instructions, and the like. Further, the non-volatile computer-readable storage medium may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other non-volatile solid state storage device. In some embodiments, the non-transitory computer readable storage medium optionally includes memory remotely located relative to the processor.
By way of example, the computer storage media may comprise, for example, a hard disk of a personal computer, a memory component of a tablet computer, a read-only memory (ROM), an erasable programmable read-only memory (EPROM), a portable compact disc read-only memory (CD-ROM), a USB memory, a memory card of a smart phone, or any combination of the foregoing. The computer-readable storage medium may be any combination of one or more computer-readable storage media.
In one possible implementation, referring to FIG. 7, the electronic device may be a DR imaging device 700, the DR imaging device 700 including an X-ray generator 710, a detector 720, and a processor 730 and a display 740, wherein the processor 730 is communicatively coupled to the X-ray generator 710, the detector 720, and the display 740;
The detector 720 is configured to receive X-rays that have passed through the target tissue site and process them to obtain a digital radiography DR image, where the DR image includes at least one of a DR original image and a processed image of the DR original image;
the processor 730 is configured to input at least one image feature of the gray entropy feature, texture feature, noise feature, gradient feature, and divergence feature into a target model, and output a DR image index reflecting a basic feature information content of the DR image;
the display 740 is used for displaying the DR image index reflecting the basic feature information content of the DR image.
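The cooperation of the detector 720, the processor 730, and the display 740 described above can be sketched end to end as follows. The class, its method names, and the normalised-entropy "model" are hypothetical stand-ins for the application's trained target model and hardware interfaces, used only to show the data flow from detection to display:

```python
import numpy as np

class DRImagingDevice:
    """Hypothetical sketch of the DR imaging device 700 pipeline."""

    def detect(self, rng):
        # Stand-in for detector 720: X-rays passing through the
        # target tissue site become a digital radiography DR image.
        return rng.integers(0, 4096, size=(16, 16)).astype(np.float64)

    def compute_index(self, image):
        # Stand-in for processor 730: one image feature plus a
        # trivial "target model" (entropy normalised to [0, 1]).
        hist, _ = np.histogram(image, bins=64)
        p = hist / hist.sum()
        p = p[p > 0]
        entropy = float(-(p * np.log2(p)).sum())
        return entropy / np.log2(64)

    def display(self, index):
        # Stand-in for display 740 showing the DR image index.
        return f"DR image index: {index:.3f}"

device = DRImagingDevice()
image = device.detect(np.random.default_rng(2))
index = device.compute_index(image)
message = device.display(index)
```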
Furthermore, according to an embodiment of the present application, there is also provided a computer program, which may be stored on a cloud or local storage medium and which, when executed by a computer or processor, is adapted to carry out the corresponding steps of the DR image analysis method of an embodiment of the present application.
In summary, the DR image analysis method and apparatus according to the present application obtain, from the image features extracted from a DR image, a DR image index that objectively reflects the basic feature information content of the DR image, thereby providing an objective basis of judgment for the operator.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the above illustrative embodiments are merely illustrative and are not intended to limit the scope of the present application thereto. Various changes and modifications may be made therein by one of ordinary skill in the art without departing from the scope and spirit of the application. All such changes and modifications are intended to be included within the scope of the present application as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, e.g., the division of the elements is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another device, or some features may be omitted or not performed.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that in order to streamline the application and aid in understanding one or more of the various inventive aspects, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof in the description of exemplary embodiments of the application. This method of disclosure, however, is not to be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be combined in any combination, except combinations where the features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
Various component embodiments of the application may be implemented in hardware, or in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some of the modules in a DR image analysis apparatus according to embodiments of the present application may be implemented in practice using a microprocessor or digital signal processor (DSP). The present application can also be implemented as an apparatus program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present application may be stored on a computer-readable medium, or may have the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.
The foregoing description is merely illustrative of specific embodiments of the present application, and the scope of the present application is not limited thereto; any person skilled in the art could readily conceive of variations or substitutions within the technical scope disclosed in the present application, and such variations or substitutions should be covered by the scope of the present application. The protection scope of the application is subject to the protection scope of the claims.