CN112704513B - Four-dimensional ultrasonic imaging method, device, system and storage medium
- Publication number: CN112704513B
- Application number: CN201911018471.4A
- Authority: CN (China)
- Prior art keywords: dimensional, pixel, volume data, time, information
- Legal status: Active
Classifications
- A61B8/481 — Diagnostic techniques involving the use of contrast agents, e.g. microbubbles introduced into the bloodstream
- A61B8/483 — Diagnostic techniques involving the acquisition of a 3D volume of data
- A61B8/5215 — Devices using data or image processing specially adapted for diagnosis using ultrasonic, sonic or infrasonic waves, involving processing of medical diagnostic data
- A61B8/5238 — Processing of medical diagnostic data for combining image data of patient, e.g. merging several images from different acquisition modes into one image
- A61B8/5246 — Combining images from the same or different imaging techniques, e.g. color Doppler and B-mode
Abstract
Disclosed are a four-dimensional ultrasound imaging method, apparatus, system, and storage medium. The method comprises: acquiring four-dimensional contrast volume data, and performing the following operations on the three-dimensional contrast volume data at each time in the four-dimensional contrast volume data: rendering the three-dimensional contrast volume data at that time to obtain a rendered image, and acquiring time information and relative spatial depth information of each pixel in the rendered image; retrieving a corresponding color value from a pre-created color information table associated with time and depth, based on the time information and the relative spatial depth information of each pixel; fusing the original color value of each pixel with the retrieved corresponding color value to obtain a fused color value for each pixel; and mapping the fused color values onto the individual pixels of the rendered image to obtain a volume rendering map. The scheme of the application can help a user observe how fast the contrast agent flows through tissue at different times.
Description
Technical Field
The present application relates to the field of ultrasound imaging technology, and more particularly, to a four-dimensional ultrasound imaging method, apparatus, system, and storage medium.
Background
In modern medical imaging, ultrasound has become the most widely used and most frequently applied examination modality, and the one into which new techniques are adopted fastest, owing to its reliability, speed, convenience, real-time imaging, and repeatability. Ultrasound examinations sometimes require an ultrasound contrast agent, which alters the acoustic characteristics of the target tissue (such as the backscattering coefficient, sound velocity, and nonlinear effects) to produce an enhanced contrast effect; compared with conventional ultrasound imaging, this markedly improves the detection of pathological tissue at the level of microcirculatory perfusion. Compared with other examination methods such as computed tomography (CT) and magnetic resonance imaging (MRI), contrast-enhanced ultrasound is simple, quick, real-time, non-invasive, and radiation-free, and has become an important technology in ultrasound diagnosis.
Three-dimensional contrast imaging takes continuously acquired dynamic two-dimensional slice data, rearranges it into three-dimensional data in a defined order through a series of computer processing steps, and restores the three-dimensional structural information of tissues and organs using three-dimensional rendering techniques (surface rendering, volume rendering, and the like) to help physicians make more detailed clinical diagnoses. Medical three-dimensional contrast-enhanced ultrasound has been widely used for examinations of the thyroid (nodule detection), breast, liver (cirrhosis, nodules, tumors), fallopian tubes (blockage), and other organs. Four-dimensional imaging adds a time dimension on top of the three spatial dimensions, helping physicians observe how the contrast agent in a tissue/organ changes over time. Existing four-dimensional contrast images can show the flow of contrast agent, but it is difficult to distinguish the contrast-agent image information belonging to different times; for example, in salpingography it is difficult to distinguish tubal flow from venous reflux.
Disclosure of Invention
The application provides a four-dimensional ultrasound imaging scheme that combines the current time with contrast rendering data to obtain a three-dimensional rendered image with temporal pseudo-color, helping a user intuitively understand and observe how fast the contrast agent flows through tissue at different times and obtain more clinical information. The four-dimensional ultrasound imaging scheme proposed by the present application is briefly described below; more details are given in the following detailed description with reference to the drawings.
According to an aspect of the present application, there is provided a four-dimensional ultrasound imaging method, the method comprising: acquiring four-dimensional contrast volume data, and performing the following operations on the three-dimensional contrast volume data at each time in the four-dimensional contrast volume data: rendering the three-dimensional contrast volume data at that time to obtain a rendered image, and acquiring time information and relative spatial depth information of each pixel in the rendered image; retrieving a corresponding color value from a pre-created color information table associated with time and depth based on the time information and the relative spatial depth information of each pixel; fusing the original color value of each pixel with the retrieved corresponding color value to obtain a fused color value for each pixel; and mapping the fused color values for each pixel onto the respective pixels of the rendered image to obtain a volume rendering map.
According to yet another aspect of the present application, there is provided a four-dimensional ultrasound imaging method, the method comprising: acquiring four-dimensional contrast volume data, and performing the following operations on the three-dimensional contrast volume data at each time in the four-dimensional contrast volume data: updating the time labels of the corresponding voxels in first index table volume data based on the three-dimensional contrast volume data at that time, and integrating the time labels to obtain time integration information, wherein the first index table volume data is pre-established volume data recording the occurrence time of the contrast signal; updating the time labels of the corresponding voxels in second index table volume data based on the three-dimensional contrast volume data at that time, wherein the second index table volume data is pre-established volume data recording the duration for which the contrast signal has been present; rendering the three-dimensional contrast volume data at that time to obtain a rendered image, and acquiring accumulated opacity information of each pixel in the rendered image based on the second index table volume data; retrieving a corresponding color value from a pre-created color information table associated with time integration and accumulated opacity, based on the accumulated opacity information and the time integration information of each pixel; fusing the original color value of each pixel with the retrieved corresponding color value to obtain a fused color value for each pixel; and mapping the fused color values for each pixel onto the respective pixels of the rendered image to obtain a volume rendering map.
According to yet another aspect of the present application, there is provided a four-dimensional ultrasound imaging apparatus comprising a memory and a processor, the memory having stored thereon a computer program for execution by the processor, which when executed by the processor performs the four-dimensional ultrasound imaging method described above.
According to yet another aspect of the present application, there is provided an ultrasound system comprising the four-dimensional ultrasound imaging apparatus described above.
According to yet another aspect of the present application, there is provided a storage medium having stored thereon a computer program which, when run, performs the four-dimensional ultrasound imaging method described above.
According to a further aspect of the present application, there is provided a computer program for performing the four-dimensional ultrasound imaging method described above when the computer program is run by a computer or processor.
According to the four-dimensional ultrasonic imaging method, equipment and system provided by the embodiment of the application, the three-dimensional rendering image with the time pseudo color is obtained by combining the current time and the contrast rendering data, so that a user can be helped to intuitively understand and observe the flow speed of the contrast agent in tissues at different times, and obtain more clinical information.
Drawings
The above and other objects, features, and advantages of the present application will become more apparent from the following more particular description of embodiments of the present application, as illustrated in the accompanying drawings. The accompanying drawings are included to provide a further understanding of embodiments of the application and are incorporated in and constitute a part of this specification; they illustrate the application and, together with the embodiments, serve to explain it without limiting it. In the drawings, like reference numerals generally refer to like parts or steps.
Fig. 1 shows a schematic block diagram of an exemplary ultrasound system for implementing a four-dimensional ultrasound imaging method according to an embodiment of the application.
Fig. 2 shows a schematic flow chart of a four-dimensional ultrasound imaging method according to one embodiment of the application.
Fig. 3 shows a schematic diagram of a color information table employed in a four-dimensional ultrasound imaging method according to one embodiment of the application.
Fig. 4 shows a schematic flow chart of a four-dimensional ultrasound imaging method according to another embodiment of the application.
Fig. 5 shows a schematic diagram of a color information table employed in a four-dimensional ultrasound imaging method according to another embodiment of the present application.
Fig. 6 shows a schematic flow chart of a four-dimensional ultrasound imaging method according to yet another embodiment of the application.
Fig. 7 shows a schematic diagram of a three-dimensional contrast rendering map at different moments obtained using a four-dimensional ultrasound imaging method according to an embodiment of the present application.
Fig. 8 shows a schematic block diagram of a four-dimensional ultrasound imaging device according to an embodiment of the application.
Detailed Description
In order to make the objects, technical solutions, and advantages of the present application more apparent, exemplary embodiments according to the present application will be described in detail with reference to the accompanying drawings. It should be apparent that the described embodiments are only some, not all, of the embodiments of the present application, and it should be understood that the present application is not limited by the example embodiments described herein. All other embodiments obtained by a person skilled in the art, based on the embodiments described in the present application and without inventive effort, shall fall within the scope of the application.
First, an exemplary ultrasound system for implementing the four-dimensional ultrasound imaging method of an embodiment of the present application is described with reference to fig. 1.
Fig. 1 is a schematic block diagram of an exemplary ultrasound system 10 for implementing a four-dimensional ultrasound imaging method of an embodiment of the present application. As shown in fig. 1, the ultrasound system 10 may include an ultrasound probe 100, a transmit/receive selection switch 101, a transmit/receive sequence controller 102, a processor 103, a display 104, and a memory 105. The transmit/receive sequence controller 102 may excite the ultrasound probe 100 to transmit ultrasonic waves to a target object (the object under examination), and may also control the ultrasound probe 100 to receive ultrasonic echoes returned from the target object, thereby obtaining ultrasonic echo signals/data. The processor 103 processes the ultrasonic echo signals/data to obtain tissue-related parameters and ultrasound images of the target object. Ultrasound images obtained by the processor 103 may be stored in the memory 105, and these ultrasound images may be displayed on the display 104.
In the embodiment of the present application, the display 104 of the ultrasound system 10 may be a touch display screen, a liquid crystal display screen, or the like, or may be an independent display device such as a liquid crystal display, a television, or the like, which is independent of the ultrasound system 10, or may be a display screen on an electronic device such as a mobile phone, a tablet computer, or the like.
In the embodiment of the present application, the memory 105 of the ultrasound system 10 may be a flash memory card, a solid state memory, a hard disk, or the like.
Embodiments of the present application also provide a computer readable storage medium storing a plurality of program instructions that, when invoked by the processor 103 for execution, may perform part or all of the steps or any combination of the steps in the four-dimensional ultrasound imaging method of the various embodiments of the present application.
In one embodiment, the computer readable storage medium may be memory 105, which may be a non-volatile storage medium such as a flash memory card, solid state memory, hard disk, or the like.
In an embodiment of the present application, the processor 103 of the ultrasound system 10 described above may be implemented in software, hardware, firmware, or a combination thereof, and may use circuitry, single or multiple application-specific integrated circuits (ASICs), single or multiple general-purpose integrated circuits, single or multiple microprocessors, single or multiple programmable logic devices, or a combination of the foregoing, or other suitable circuitry or devices, such that the processor 103 may perform the respective steps of the four-dimensional ultrasound imaging method in various embodiments.
The four-dimensional ultrasound imaging method of the present application is described in detail below in conjunction with fig. 2-7, and is applicable to the ultrasound system 10 described previously.
Fig. 2 shows a schematic flow diagram of a four-dimensional ultrasound imaging method 200 according to one embodiment of the application. As shown in fig. 2, the four-dimensional ultrasound imaging method 200 may include the steps of:
In step S210, four-dimensional contrast volume data is acquired, and operations in the following steps (i.e., step S220 to step S250 to be described later) are performed for three-dimensional contrast volume data at each time in the four-dimensional contrast volume data.
In one example of the present application, the four-dimensional contrast volume data acquired at step S210 may be four-dimensional contrast volume data acquired with an ultrasound volume probe. In other examples, the four-dimensional contrast volume data acquired in step S210 may also be four-dimensional contrast volume data acquired by other means, such as acquiring the four-dimensional contrast volume data from a data source such as a storage medium. In the embodiment of the present application, steps S220 to S250, which will be described later, are processing to be performed for three-dimensional contrast volume data at each time (for example, each frame) in the four-dimensional contrast volume data acquired in step S210. That is, after the processing from step S220 to step S250, it may be determined whether the three-dimensional contrast volume data of all times (e.g., all frames) in the four-dimensional contrast volume data has been traversed, and if the three-dimensional contrast volume data of unprocessed time still exists, the processing from step S220 to step S250 is continued until the three-dimensional contrast volume data of all times has been processed.
In step S220, the three-dimensional contrast volume data at the time is rendered to obtain a rendered image, and time information and relative spatial depth information of each pixel in the rendered image are acquired.
In embodiments of the present application, the rendering of the three-dimensional contrast volume data at any one time may be surface rendering, volume rendering, or a combination of both. For example, surface rendering of the three-dimensional contrast volume data at any one time may proceed as follows: extract the isosurface (i.e., surface contour) information of the tissues/organs in the volume data as triangular patches, use the normal vectors and vertex coordinates of those patches to build a triangular mesh model, and then perform three-dimensional rendering in combination with an illumination model. The illumination model may include ambient light, diffuse (scattered) light, specular highlights, and so on, and different light source parameters (such as type, direction, position, and angle) affect its result to different extents; this yields the rendered (VR) image.
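To make the role of these lighting components concrete, the following minimal sketch shades a single surface sample with a Phong-style illumination model. It is an illustration of the general technique under assumed coefficient values and function names, not the patent's actual implementation:

```python
import numpy as np

def shade_point(normal, light_dir, view_dir,
                ambient=0.2, diffuse_k=0.6, specular_k=0.2, shininess=16.0):
    """Minimal Phong-style illumination for a single surface sample (illustrative).

    normal, light_dir, view_dir are assumed to be unit-length 3-vectors,
    with light_dir pointing from the surface toward the light source.
    Returns a scalar intensity combining ambient, diffuse and specular terms.
    """
    n_dot_l = max(float(np.dot(normal, light_dir)), 0.0)   # Lambertian diffuse term
    reflect = 2.0 * n_dot_l * normal - light_dir           # light direction mirrored about the normal
    r_dot_v = max(float(np.dot(reflect, view_dir)), 0.0)
    return ambient + diffuse_k * n_dot_l + specular_k * r_dot_v ** shininess
```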
For example, volume rendering of the three-dimensional contrast volume data at any one time may proceed as follows: cast a number of rays through the contrast volume data along the line-of-sight direction, advance each ray with a fixed step size, sample the contrast volume data along the ray path, compute the color and transparency of each sample point, accumulate color and transparency along each ray path, and finally map the accumulated values onto the pixels of a two-dimensional image to obtain the rendered image.
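A minimal sketch of this ray-marching accumulation is given below, using nearest-neighbor sampling and a toy transfer function for brevity; it illustrates the general front-to-back compositing technique rather than the patent's implementation:

```python
import numpy as np

def march_ray(volume, origin, direction, step=0.5, max_steps=512):
    """Front-to-back compositing of color and opacity along one ray (illustrative).

    volume: 3D numpy array of contrast intensities in [0, 1].
    origin, direction: ray start point and unit direction, in voxel coordinates.
    Returns (accumulated_color, accumulated_opacity) for the ray's pixel.
    """
    color_acc, alpha_acc = 0.0, 0.0
    pos = np.asarray(origin, dtype=float)
    for _ in range(max_steps):
        ijk = np.round(pos).astype(int)                  # nearest-neighbor sampling for brevity
        if np.any(ijk < 0) or np.any(ijk >= volume.shape):
            break                                        # ray has left the volume
        sample = float(volume[tuple(ijk)])
        alpha = sample * 0.1                             # toy transfer function: intensity -> opacity
        color = sample                                   # toy transfer function: intensity -> gray value
        color_acc += (1.0 - alpha_acc) * alpha * color   # front-to-back compositing
        alpha_acc += (1.0 - alpha_acc) * alpha
        if alpha_acc >= 0.99:                            # early ray termination
            break
        pos += step * np.asarray(direction, dtype=float)
    return color_acc, alpha_acc
```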
In the embodiment of the application, while rendering the three-dimensional contrast volume data at any time, the spatial depth information and time information corresponding to each pixel can be recorded at the same time. For surface rendering, spatial depth information may be obtained, for example, from the vertex coordinates of the triangular patch corresponding to a pixel. For volume rendering, the three-dimensional data along the ray path is sampled, and spatial depth information may be obtained from the first sample position at which tissue/organ is encountered as the ray advances. The time information of each pixel can be taken from the time point of the current moment.
In step S230, a corresponding color value is retrieved from a pre-created color information table associated with time and depth based on the time information and the relative spatial depth information of each pixel.
In an embodiment of the present application, a two-dimensional color information table associated with time and depth is created in advance, as shown in fig. 3, in which different spatial depths and different times correspond to different colors (it should be understood that fig. 3 is reproduced in grayscale for the purposes of the patent filing and therefore cannot show the actual colors), as shown in formula (1):
Color_{depth,time} = Table(depth, time)   formula (1)

In formula (1), Color_{depth,time} is the indexed color value, Table() is the two-dimensional index table (as shown in fig. 3), time is the time value, and depth is the depth value.
In the embodiment of the application, the spatial depth information of the current time can be compared with that of the previous time, and the time information and depth information newly generated at the current time are used to index the two-dimensional color index table to obtain the related color information (color value). Corresponding color values are retrieved from the pre-created color information table associated with time and depth, based on the time information and relative spatial depth information of each pixel; in the subsequent steps, the original color value of each pixel is fused with the retrieved color value to obtain a fused color value for each pixel, and the fused color values are mapped onto the corresponding pixels of the rendered image obtained in step S220, yielding a stereoscopic rendered image with time pseudo-color that helps users more intuitively understand and observe how fast the contrast agent flows through tissue at different times. The relative spatial depth information refers to the difference between a pixel's spatial depth information at the current time and at the previous time; when this difference is smaller than a predetermined range, the color value may be left unchanged, i.e., the color value corresponding to the previous time is reused. In addition, since the color of each pixel displayed in the stereoscopic rendered image is related to both the time information and the relative spatial depth information, the image is more robust to interference and of higher quality, and its display has more layering, which facilitates observation of the image and diagnosis of the corresponding tissues/organs by the user.
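The table lookup itself, including the relative-depth check, reduces to indexing a precomputed two-dimensional table. A minimal sketch, assuming a (depth × time × RGB) table layout and inputs already scaled to known ranges (all names illustrative):

```python
import numpy as np

def lookup_color(table, depth, time, prev_depth, depth_eps=1e-3,
                 max_depth=1.0, max_time=1.0):
    """Index a (D x T x 3) RGB color table by depth and time (illustrative).

    If the depth change since the previous time is below depth_eps, the
    caller should keep the previous color value (None is returned here).
    Inputs are assumed to lie in [0, max_depth] and [0, max_time].
    """
    if abs(depth - prev_depth) < depth_eps:
        return None                                    # reuse the previous color value
    d_bins, t_bins, _ = table.shape
    d_idx = min(int(depth / max_depth * (d_bins - 1)), d_bins - 1)
    t_idx = min(int(time / max_time * (t_bins - 1)), t_bins - 1)
    return table[d_idx, t_idx]                         # Color_{depth,time} = Table(depth, time)
```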
Further, in the embodiment of the present application, a non-hue color model such as RGB or YUV may be used; that is, the aforementioned indexed color value may be a color value in a non-hue color model such as RGB or YUV. A non-hue color model produces natural color transitions and better matches how screens encode and display color. Of course, this is merely exemplary, and a hue-based color model or any other suitable color model may be employed.
In step S240, the original color value of each pixel is fused with the retrieved corresponding color value to obtain a fused color value for each pixel.
In an embodiment of the present application, with the indexed color value associated with the time information and spatial depth information of each pixel retrieved in step S230, that color value may be fused with the original color value of each pixel in the rendered image obtained in step S220 to obtain a fused color value. Illustratively, fusing the original color value of each pixel with the retrieved color value may include: weighting the original color value based on the retrieved color value, where the retrieved color value may enter as an additive weight, a multiplicative weight, or a quotient weight on the original color value. That is, the fusion may take one of the following forms, or a combination of them, as shown in formula (2):
Color_combine = a · Color_render + b · Color_{depth,time}   formula (2)

(the multiplicative and quotient forms, e.g. Color_combine = Color_render · Color_{depth,time}, are analogous)

In formula (2), Color_combine is a two-dimensional table of fused color values whose size matches the rendered image; Color_render is the two-dimensional table of color values of the rendered image, of the same size; Color_{depth,time} is the two-dimensional table of indexed color values, of the same size; and a, b are the fusion coefficients.
Further, as previously described, in embodiments of the present application, non-hue color models such as RGB or YUV may be employed. Based on this, the original color value of each pixel may first be converted into the non-hue color space before the fusing, and the converted color value then fused with the indexed color value.
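A sketch of the additive form of formula (2), applied per pixel over whole images in an RGB (non-hue) color space; the coefficient values here are arbitrary illustrations, not values prescribed by the patent:

```python
import numpy as np

def fuse_colors(color_render, color_indexed, a=0.6, b=0.4):
    """Weighted-sum fusion per formula (2): Color_combine = a*render + b*indexed.

    color_render, color_indexed: (H x W x 3) float RGB arrays in [0, 1].
    """
    fused = a * color_render + b * color_indexed
    return np.clip(fused, 0.0, 1.0)   # keep the result a displayable color
```

The multiplicative or quotient forms would replace the weighted sum with an elementwise product or quotient of the two color tables.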
In step S250, the blended color values for each pixel are mapped onto the respective pixels of the rendered image to obtain a volume rendering map.
In an embodiment of the present application, based on the fused color value obtained in step S240, it may be mapped onto each pixel of the two-dimensional image, thereby obtaining a volume rendering map.
Further, as described above, the operations of the above steps S220 to S250 may be performed on the three-dimensional contrast volume data at each time until the three-dimensional contrast volume data at all times is processed. Further, volume rendering maps at various times may be displayed to the user, thereby enabling the user to observe how fast the contrast agent flows within the tissue at different times.
Based on the above description, the four-dimensional ultrasonic imaging method according to the embodiment of the application combines the current time and the contrast rendering data to obtain the three-dimensional rendering image with the time pseudo color, which can help a user to intuitively understand and observe the flowing speed of the contrast agent in the tissue at different times and acquire more clinical information.
The four-dimensional ultrasound imaging method according to one embodiment of the application is exemplarily described above. Four-dimensional ultrasound imaging methods according to other embodiments of the present application are described below in conjunction with fig. 4-5. Fig. 4 shows a schematic flow chart of a four-dimensional ultrasound imaging method 400 according to another embodiment of the application, as shown in fig. 4, the four-dimensional ultrasound imaging method 400 may comprise the steps of:
In step S410, four-dimensional contrast volume data is acquired, and operations in the following steps (i.e., steps S420 to S460 to be described later) are performed for three-dimensional contrast volume data at each time in the four-dimensional contrast volume data.
In one example of the present application, the four-dimensional contrast volume data acquired at step S410 may be four-dimensional contrast volume data acquired with an ultrasound volume probe. In other examples, the four-dimensional contrast volume data acquired at step S410 may also be four-dimensional contrast volume data acquired by other means. In the embodiment of the present application, steps S420 to S460, which will be described later, are processing to be performed for three-dimensional contrast volume data at each time (for example, each frame) in the four-dimensional contrast volume data acquired in step S410. That is, after the three-dimensional contrast volume data at any time (e.g., any frame) is processed in steps S420 to S460, it may be determined whether to traverse all the three-dimensional contrast volume data at all times (e.g., all frames) in the four-dimensional contrast volume data, and if there is still three-dimensional contrast volume data at the unprocessed time, the processing in steps S420 to S460 is continued until all the three-dimensional contrast volume data at all times are processed.
In step S420, the time labels of the corresponding voxels in the first index table volume data are updated based on the three-dimensional contrast volume data at that time, and the time labels are integrated to obtain time integration information; the first index table volume data is pre-established volume data recording the occurrence time of the contrast signal.
In the embodiment of the present application, the first index table volume data (denoted as T1), which records the occurrence time of the contrast signal, is established in advance; the first index table is a three-dimensional index table whose size may be the same as that of the three-dimensional contrast volume data at any time in the four-dimensional contrast volume data acquired in step S410. After the three-dimensional contrast volume data at any time is acquired, its image features can be extracted, and the time labels of the corresponding voxels in the first index table can be updated according to the current time point and the image features of the three-dimensional contrast image. Here a voxel is the smallest unit constituting the first index table volume data, corresponding to a pixel in two-dimensional space.
Initially, each voxel in the first index table is assigned the same initial value or is left unassigned. After the three-dimensional contrast volume data at the first time (e.g., the first frame) is acquired, the spatial voxel positions reached by the contrast agent can be obtained from that volume data, and the values at the corresponding voxel positions in the first index table are updated to the time label of the current time (the first time). For the three-dimensional contrast volume data at subsequent times, the time labels of voxels in the first index table are updated in a similar way: if a voxel already has time information recorded in the first index table, the current time is not recorded for it; if it does not, the time information of the current moment is recorded.
Further, an integral of the occurrence time of the contrast signal may be calculated based on the first index table volume data to obtain the time integration information. Specifically, similar to a ray tracing algorithm, a number of rays passing through the volume data T1 may be cast along the line-of-sight direction; each ray advances with a fixed step size, the volume data T1 is sampled along the ray path, and the sampled values on each ray path are accumulated, finally yielding the time integral of the occurrence times along each ray path. After all rays have been traversed, a two-dimensional integral image T_accum is obtained whose size matches the two-dimensional rendered image. In addition, the time labels in the first index table volume data may be normalized (that is, all time label data are converted into dimensionless values) before the integration, which simplifies the calculation.
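Both operations of this step can be sketched as follows, with the first-occurrence update rule applied per frame and the integration simplified to an axis-aligned viewing direction (the full ray-cast version would follow the marching pattern shown earlier). The presence threshold and the NaN convention for "not yet reached" voxels are assumptions for illustration:

```python
import numpy as np

def update_t1(t1, contrast_volume, current_time, threshold=0.1):
    """Record the current time for voxels where contrast first appears.

    t1: float volume of first-occurrence time labels, NaN where not yet reached.
    """
    newly_reached = (contrast_volume > threshold) & np.isnan(t1)
    t1[newly_reached] = current_time       # existing labels are never overwritten
    return t1

def time_integral(t1, total_time):
    """Normalize the time labels and integrate them along the viewing axis.

    Returns a 2D integral image T_accum (matching the rendered image when
    the viewing direction is axis-aligned).
    """
    normalized = np.nan_to_num(t1 / total_time)   # dimensionless labels, 0 where unset
    return normalized.sum(axis=0)                 # accumulate along the 'ray' axis
```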
In step S430, the three-dimensional contrast volume data at the time is rendered to obtain a rendered image, and accumulated opacity information of each pixel in the rendered image is acquired.
In embodiments of the application, rendering of the three-dimensional contrast volume data at any time may include volume rendering, which may proceed as follows: cast a number of rays through the contrast volume data along the line-of-sight direction, advance each ray with a fixed step size, sample the contrast volume data along the ray path, compute the opacity of each sample point, and accumulate the opacity along each ray path, finally obtaining the accumulated opacity of the three-dimensional contrast volume data at that time. After all rays have been traversed, a two-dimensional accumulated opacity image Opacity_render is obtained whose size matches the two-dimensional rendered image.
In step S440, the corresponding color value is retrieved from a pre-created color information table associated with time integration and accumulated opacity based on the accumulated opacity information of each pixel and the time integration information.
In the embodiment of the present application, a color information table varying with accumulated opacity and time integral is created in advance, as shown in fig. 5, in which different accumulated opacities and different time integral values correspond to different colors (it should be understood that fig. 5 is reproduced in grayscale for the purposes of the patent filing and therefore cannot show the actual colors), as shown in formula (3):
Color_{T1,render} = Table(T_accum, Opacity_render)   formula (3)

In formula (3), Color_{T1,render} is the indexed color value, Table() is the two-dimensional index table (as shown in fig. 5), T_accum is the time integral value of the first index table (the three-dimensional time index table) T1 (calculated in step S420), and Opacity_render is the accumulated opacity of the three-dimensional contrast volume data (calculated in step S430).
In the embodiment of the application, corresponding color values are retrieved from the pre-created color information table associated with accumulated opacity and time integral, based on the accumulated opacity and time integration information of the three-dimensional contrast volume data; the original color value of each pixel is then fused with the retrieved color value to obtain a fused color value for each pixel, and the fused color values are mapped onto the corresponding pixels of the rendered image obtained in step S430, thereby obtaining a three-dimensional rendered image with time pseudo-color and helping the user more intuitively understand and observe how fast the contrast agent flows through tissue at different times. In addition, since the color of each pixel displayed in the stereoscopic rendered image is related to both the time integral and the accumulated opacity, the display of the image has more layering, which facilitates observation of the image and diagnosis of the corresponding tissue/organ by the user.
Further, in the embodiment of the present application, a non-hue color model such as RGB or YUV may be used; that is, the aforementioned indexed color value may be a color value in a non-hue color model such as RGB or YUV. A non-hue color model produces natural color transitions and better matches how screens encode and display color. Of course, this is merely exemplary, and a hue-based color model or any other suitable color model may be employed.
In step S450, the original color value of each pixel is fused with the retrieved corresponding color value to obtain a fused color value for each pixel.
In an embodiment of the present application, with the indexed color value associated with the accumulated opacity information and time integration information of each pixel retrieved in step S440, that color value may be fused with the original color value of each pixel in the rendered image obtained in step S430 to obtain a fused color value. Illustratively, fusing the original color value of each pixel with the retrieved color value may include: weighting the original color value based on the retrieved color value, where the retrieved color value may enter as an additive weight, a multiplicative weight, or a quotient weight on the original color value.
Further, as previously described, in embodiments of the present application, non-hue color models such as RGB or YUV may be employed. Based on this, the original color value of each pixel may first be converted into the non-hue color space before the fusing, and the converted color value then fused with the indexed color value.
In step S460, the blended color values for each pixel are mapped onto the respective pixels of the rendered image to obtain a volume rendering map.
In an embodiment of the present application, based on the fused color values obtained in step S450, they may be mapped onto each pixel of the two-dimensional image, thereby obtaining a volume rendering map.
Further, as described above, the operations of steps S420 to S460 may be performed on the three-dimensional contrast volume data at each time until the three-dimensional contrast volume data at all times is processed. Further, volume rendering maps at various times may be displayed to the user, thereby enabling the user to observe how fast the contrast agent flows within the tissue at different times.
Based on the above description, the four-dimensional ultrasonic imaging method according to the embodiment of the application combines the current time and the contrast rendering data to obtain the three-dimensional rendering image with the time pseudo color, which can help a user to intuitively understand and observe the flowing speed of the contrast agent in the tissue at different times and acquire more clinical information.
Fig. 6 shows a schematic flow chart of a four-dimensional ultrasound imaging method 600 according to yet another embodiment of the application, as shown in fig. 6, the four-dimensional ultrasound imaging method 600 may include the steps of:
in step S610, four-dimensional contrast volume data is acquired, and operations in the following steps (i.e., step S620 to step S670 to be described later) are performed for three-dimensional contrast volume data at each time in the four-dimensional contrast volume data.
In one example of the present application, the four-dimensional contrast volume data acquired at step S610 may be four-dimensional contrast volume data acquired with an ultrasound volume probe. In other examples, the four-dimensional contrast volume data acquired in step S610 may also be four-dimensional contrast volume data acquired by other means. In the embodiment of the present application, steps S620 to S670, which will be described later, are processes to be performed for three-dimensional contrast volume data at each time (for example, each frame) in the four-dimensional contrast volume data acquired in step S610. That is, after the three-dimensional contrast volume data at any time (e.g., any frame) is processed in steps S620 to S670, it may be determined whether to traverse the three-dimensional contrast volume data at all times (e.g., all frames) in the four-dimensional contrast volume data, and if there is still three-dimensional contrast volume data at the unprocessed time, the processing in steps S620 to S670 is continued until the three-dimensional contrast volume data at all times is processed.
In step S620, the time labels of the corresponding voxels in the first index table volume data are updated based on the three-dimensional contrast volume data at that time, and the time labels are integrated to obtain time integration information; the first index table volume data is pre-established volume data recording the occurrence time of the contrast signal.
In the embodiment of the present application, the first index table volume data (denoted as T1), which records the occurrence time of the contrast signal, is established in advance; the first index table is a three-dimensional index table whose size may be the same as that of the three-dimensional contrast volume data at any time in the four-dimensional contrast volume data acquired in step S610. After the three-dimensional contrast volume data at any time is acquired, its image features can be extracted, and the time labels of the corresponding voxels in the first index table can be updated according to the current time point and the image features of the three-dimensional contrast image. Here a voxel is the smallest unit constituting the first index table volume data, corresponding to a pixel in two-dimensional space.
Initially, each voxel in the first index table is assigned the same initial value or is left unassigned. After the three-dimensional contrast volume data at the first time (e.g., the first frame) is acquired, the spatial voxel positions reached by the contrast agent can be obtained from that volume data, and the values at the corresponding voxel positions in the first index table are updated to the time label of the current time (the first time). For the three-dimensional contrast volume data at subsequent times, the time labels of voxels in the first index table are updated in a similar way: if a voxel already has time information recorded in the first index table, the current time is not recorded for it; if it does not, the time information of the current moment is recorded.
Further, an integral of the occurrence time of the contrast signal may be calculated based on the first index table volume data to obtain the time integration information. Specifically, similar to a ray tracing algorithm, a number of rays passing through the volume data T1 may be cast along the line-of-sight direction; each ray advances with a fixed step size, the volume data T1 is sampled along the ray path, and the sampled values on each ray path are accumulated, finally yielding the time integral of the occurrence times along each ray path. After all rays have been traversed, a two-dimensional integral image T_accum is obtained whose size matches the two-dimensional rendered image. In addition, the time labels in the first index table volume data may be normalized (that is, all time label data are converted into dimensionless values) before the integration, which simplifies the calculation.
In step S630, the time labels of the corresponding voxels in the second index table volume data are updated based on the three-dimensional contrast volume data at that time, where the second index table volume data is pre-established volume data recording the duration for which the contrast signal has been present.
In the embodiment of the present application, the second index table volume data (denoted as T2), which records the duration of the contrast signal, is established in advance; the second index table is a three-dimensional index table whose size may be the same as that of the three-dimensional contrast volume data at any time in the four-dimensional contrast volume data acquired in step S610. After the three-dimensional contrast volume data at any time is acquired, its image features can be extracted, and the time labels of the corresponding voxels in the second index table can be updated according to the current time point and the image features of the three-dimensional contrast image. Here a voxel is the smallest unit constituting the second index table volume data, corresponding to a pixel in two-dimensional space.
Initially, each voxel in the second index table is assigned the same initial value or is left unassigned. After the three-dimensional contrast volume data at the first time (e.g., the first frame) is acquired, the spatial voxel positions reached by the contrast agent can be obtained from that volume data, and the values at the corresponding voxel positions in the second index table are updated to the time label of the current time (the first time). For the three-dimensional contrast volume data at subsequent times, the time labels of voxels in the second index table are updated in a similar way: if a voxel already has time information recorded in the second index table, the time information of the current moment is accumulated onto the previously recorded time information.
Further, the time labels in the second index table volume data may be normalized (i.e., all time label data are converted into dimensionless values), which simplifies the calculations in the subsequent steps.
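Since the second index table differs from the first mainly in its update rule (durations accumulate rather than being written once), the per-frame update can be sketched as follows, under an assumed presence threshold (all names illustrative):

```python
import numpy as np

def update_t2(t2, contrast_volume, frame_interval, threshold=0.1):
    """Accumulate the duration of contrast presence per voxel (illustrative).

    t2: volume of accumulated durations (zeros initially).
    frame_interval: time elapsed since the previous volume, in the same unit as t2.
    """
    present = contrast_volume > threshold
    t2[present] += frame_interval        # add to any previously recorded duration
    return t2
```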
In step S640, the three-dimensional contrast volume data at the time is rendered to obtain a rendered image, and the cumulative opacity information of each pixel in the rendered image is acquired based on the second index table volume data.
For example, rendering of the three-dimensional contrast volume data at any time may include volume rendering, which may proceed as follows: cast a number of rays through the contrast volume data along the line-of-sight direction, advance each ray with a fixed step size, and sample the contrast volume data along the ray path. For each sample point, compute its opacity Opacity, obtain the value Time of the volume data T2 at the current sample point, and fuse the two (e.g., Opacity × Time) to obtain the fused opacity Opacity_fusion. The fused opacity Opacity_fusion is accumulated along each ray path, finally yielding the accumulated opacity of the three-dimensional contrast volume data at that time; after all rays have been traversed, a two-dimensional accumulated opacity image Opacity_render is obtained whose size matches the two-dimensional rendered image.
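A sketch of this modified accumulation, with the per-sample opacity weighted by the normalized T2 value before front-to-back compositing (toy transfer function and axis-aligned rays assumed for brevity; not the patent's implementation):

```python
import numpy as np

def accumulate_fused_opacity(volume, t2_normalized, opacity_scale=0.1):
    """Accumulate Opacity_fusion = Opacity * Time along axis-aligned rays.

    volume: contrast intensities in [0, 1]; t2_normalized: normalized duration
    labels in [0, 1], same shape as volume.
    Returns the 2D accumulated opacity image Opacity_render.
    """
    opacity = volume * opacity_scale          # toy transfer function: intensity -> opacity
    fused = opacity * t2_normalized           # per-sample Opacity x Time fusion
    opacity_render = np.zeros(volume.shape[1:])
    transmittance = np.ones(volume.shape[1:])
    for alpha in fused:                       # march front to back, slice by slice
        opacity_render += transmittance * alpha
        transmittance *= 1.0 - alpha
    return opacity_render
```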
In step S650, respective color values are retrieved from a color information table associated with time integration and accumulated opacity, which is created in advance, based on the accumulated opacity information of each pixel and the time integration information.
In the embodiment of the present application, a color information table varying with accumulated opacity and time integral is created in advance, as shown in fig. 5, in which different accumulated opacities and different time integral values correspond to different colors, as shown in the foregoing formula (3).
In the embodiment of the application, corresponding color values are retrieved from the pre-created color information table associated with accumulated opacity and time integral, based on the accumulated opacity and time integration information of the three-dimensional contrast volume data; the original color value of each pixel is then fused with the retrieved color value to obtain a fused color value for each pixel, and the fused color values are mapped onto the corresponding pixels of the rendered image obtained in step S640, thereby obtaining a three-dimensional rendered image with time pseudo-color and helping the user more intuitively understand and observe how fast the contrast agent flows through tissue at different times. In addition, since the color of each pixel displayed in the stereoscopic rendered image is related to both the time integral and the accumulated opacity, the display of the image has more layering, which facilitates observation of the image and diagnosis of the corresponding tissue/organ by the user.
Further, in the embodiment of the present application, a non-hue color model such as RGB or YUV may be used; that is, the aforementioned indexed color value may be a color value in a non-hue color model such as RGB or YUV. A non-hue color model produces natural color transitions, so that the resulting rendering map better matches how screens encode and display color. Of course, this is merely exemplary, and a hue-based color model or any other suitable color model may be employed.
In step S660, the original color value of each pixel is fused with the retrieved corresponding color value to obtain a fused color value for each pixel.
In an embodiment of the present application, with the indexed color value associated with the accumulated opacity information and time integration information of each pixel retrieved in step S650, that color value may be fused with the original color value of each pixel in the rendered image obtained in step S640 to obtain a fused color value. Illustratively, fusing the original color value of each pixel with the retrieved color value may include: weighting the original color value based on the retrieved color value, where the retrieved color value may enter as an additive weight, a multiplicative weight, or a quotient weight on the original color value.
Further, as previously described, in embodiments of the present application, non-hue color models such as RGB or YUV may be employed. Based on this, the original color value of each pixel may first be converted into the non-hue color space before the fusing, and the converted color value then fused with the indexed color value.
In step S670, the blended color values for each pixel are mapped onto the respective pixels of the rendered image to obtain a volume rendering map.
In an embodiment of the present application, based on the fused color values obtained in step S660, they may be mapped onto each pixel of the two-dimensional image, thereby obtaining a volume rendering map.
Further, as described above, the operations of the above steps S620 to S670 may be performed on the three-dimensional contrast volume data at each time until the three-dimensional contrast volume data at all times is processed. Further, volume rendering maps at various times may be displayed to the user, thereby enabling the user to observe how fast the contrast agent flows within the tissue at different times.
Based on the above description, the four-dimensional ultrasonic imaging method according to the embodiment of the application combines the current time and the contrast rendering data to obtain the three-dimensional rendering image with the time pseudo color, which can help a user to intuitively understand and observe the flowing speed of the contrast agent in the tissue at different times and acquire more clinical information.
The four-dimensional ultrasound imaging method according to the embodiment of the present application has been exemplarily shown above; fig. 7 exemplarily illustrates three-dimensional contrast rendering maps at different moments (time gradually increasing across the sequence of images) obtained using the method. (It should be understood that fig. 7 is reproduced in grayscale for the purposes of the patent filing and therefore cannot show the set colors.) As shown in fig. 7, when the contrast agent is injected into the target organ (tissue), only a small portion of the contrast agent has entered the tissue at first, so the three-dimensional image shows only a small portion of the tissue, as in the uppermost image of fig. 7; in one example this is set to yellow. After a certain time, the contrast agent gradually penetrates into the tissue, as shown in the middle image of fig. 7; in one example, yellow represents the tissue reached earliest, green the tissue reached in between, and blue the tissue reached at the current moment. The lowermost image of fig. 7 shows the last moment of contrast image acquisition; in one example, purple marks the tissue at the current moment. In short, the four-dimensional ultrasound imaging method provided by the embodiment of the application combines the current time with the contrast rendering data to obtain a stereoscopic rendered image with time pseudo-color, helping a user intuitively understand and observe how fast the contrast agent flows through tissue at different times and obtain more clinical information.
A four-dimensional ultrasound imaging apparatus provided according to another aspect of the present application is described below in conjunction with fig. 8. Fig. 8 shows a schematic block diagram of a four-dimensional ultrasound imaging device 800 in accordance with an embodiment of the application. The four-dimensional ultrasound imaging device 800 includes a memory 810 and a processor 820.
The memory 810 stores program code for implementing the respective steps of the four-dimensional ultrasound imaging method according to an embodiment of the present application. The processor 820 is configured to execute the program code stored in the memory 810 so as to perform the corresponding steps of the method.
Furthermore, according to an embodiment of the present application, there is provided an ultrasound system including the aforementioned four-dimensional ultrasound imaging apparatus.
Furthermore, according to an embodiment of the present application, there is also provided a storage medium on which program instructions are stored, which program instructions, when executed by a computer or processor, are adapted to carry out the respective steps of the four-dimensional ultrasound imaging method of an embodiment of the present application. The storage medium may include, for example, a memory card of a smart phone, a memory component of a tablet computer, a hard disk of a personal computer, read-only memory (ROM), erasable programmable read-only memory (EPROM), portable compact disc read-only memory (CD-ROM), USB memory, or any combination of the foregoing storage media.
Furthermore, according to an embodiment of the present application, there is also provided a computer program, which may be stored on a cloud or local storage medium. When executed by a computer or processor, the computer program carries out the respective steps of the four-dimensional ultrasound imaging method of an embodiment of the present application.
Based on the above description, the four-dimensional ultrasound imaging method, apparatus, and system according to the embodiments of the present application combine the current moment with the contrast rendering data to obtain a three-dimensional rendered image carrying a temporal pseudo-color, which helps the user intuitively observe how fast the contrast agent flows within tissue at different times and acquire more clinical information.
Although the illustrative embodiments have been described herein with reference to the accompanying drawings, it is to be understood that the above illustrative embodiments are merely illustrative and are not intended to limit the scope of the present application thereto. Various changes and modifications may be made therein by one of ordinary skill in the art without departing from the scope and spirit of the application. All such changes and modifications are intended to be included within the scope of the present application as set forth in the appended claims.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the several embodiments provided by the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the above-described device embodiments are merely illustrative, e.g., the division of the elements is merely a logical functional division, and there may be additional divisions when actually implemented, e.g., multiple elements or components may be combined or integrated into another device, or some features may be omitted or not performed.
In the description provided herein, numerous specific details are set forth. However, it is understood that embodiments of the application may be practiced without these specific details. In some instances, well-known methods, structures and techniques have not been shown in detail in order not to obscure an understanding of this description.
Similarly, it should be appreciated that, in order to streamline the application and aid in understanding one or more of the various inventive aspects, various features of the application are sometimes grouped together in a single embodiment, figure, or description thereof in the description of exemplary embodiments. However, this method of disclosure should not be interpreted as reflecting an intention that the claimed application requires more features than are expressly recited in each claim. Rather, as the following claims reflect, inventive aspects lie in less than all features of a single disclosed embodiment. Thus, the claims following the detailed description are hereby expressly incorporated into this detailed description, with each claim standing on its own as a separate embodiment of this application.
It will be understood by those skilled in the art that all of the features disclosed in this specification (including any accompanying claims, abstract and drawings), and all of the processes or units of any method or apparatus so disclosed, may be combined in any combination, except combinations where the features are mutually exclusive. Each feature disclosed in this specification (including any accompanying claims, abstract and drawings), may be replaced by alternative features serving the same, equivalent or similar purpose, unless expressly stated otherwise.
Furthermore, those skilled in the art will appreciate that while some embodiments described herein include some features but not others included in other embodiments, combinations of features of different embodiments are meant to be within the scope of the application and form different embodiments. For example, in the claims, any of the claimed embodiments may be used in any combination.
Various component embodiments of the application may be implemented in hardware, in software modules running on one or more processors, or in a combination thereof. Those skilled in the art will appreciate that some or all of the functions of some modules in a four-dimensional ultrasound imaging device according to embodiments of the present application may be implemented in practice using a microprocessor or digital signal processor (DSP). The present application can also be implemented as an apparatus program (e.g., a computer program and a computer program product) for performing a portion or all of the methods described herein. Such a program embodying the present application may be stored on a computer-readable medium, or may take the form of one or more signals. Such signals may be downloaded from an internet website, provided on a carrier signal, or provided in any other form.
It should be noted that the above-mentioned embodiments illustrate rather than limit the application, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The application may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed computer. In a unit claim enumerating several means, several of these means may be embodied by one and the same item of hardware. The use of the words first, second, third, etc. does not denote any order; these words may be interpreted as names.
The foregoing description is merely illustrative of specific embodiments of the present application, and the scope of the present application is not limited thereto; any person skilled in the art can readily conceive of variations or substitutions within the technical scope disclosed by the present application, which shall all fall within its protection scope. The protection scope of the application is subject to the protection scope of the claims.
Claims (18)
1. A method of four-dimensional ultrasound imaging, the method comprising:
acquiring four-dimensional contrast volume data, and executing the following operations on three-dimensional contrast volume data at each moment in the four-dimensional contrast volume data:
rendering the three-dimensional contrast volume data of the moment to obtain a rendered image, and acquiring time information and relative spatial depth information of each pixel in the rendered image, wherein the relative spatial depth information refers to the difference between the spatial depth information of a pixel at the current moment and its spatial depth information at the previous moment;
retrieving a corresponding color value from a pre-created color information table associated with both time and depth, based on the time information and the relative spatial depth information of each pixel;
fusing the original color value of each pixel with the retrieved corresponding color value to obtain a fused color value for each pixel; and
mapping the fused color value of each pixel onto the respective pixel of the rendered image to obtain a volume rendering map.
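By way of illustration only, the lookup of claim 1 might be sketched as follows, assuming the time information is a normalized scalar and the color information table is a two-dimensional RGB table; the quantization scheme shown is an assumption, not part of the claim.

```python
import numpy as np

def lookup_claim1_colors(time_info, depth_now, depth_prev, table):
    """Index a pre-created (time x relative-depth) color table.

    time_info: scalar in [0, 1); depth arrays: per-pixel values in [0, 1];
    table: array of shape (T, D, 3) holding RGB color values."""
    rel_depth = np.clip(depth_now - depth_prev, -1.0, 1.0)  # current minus previous
    T, D, _ = table.shape
    t_idx = int(time_info * (T - 1))
    # map signed relative depth from [-1, 1] onto table rows [0, D-1]
    d_idx = ((rel_depth + 1.0) * 0.5 * (D - 1)).astype(int)
    return table[t_idx, d_idx]  # per-pixel indexed color, shape (..., 3)
```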
2. The method of claim 1, wherein the color values in the color information table are color values in a non-hue-type color space, the method further comprising:
the original color values of each pixel are converted to the non-hue-type color space prior to the fusing.
3. The method of claim 1, wherein said fusing said original color value for each pixel with said retrieved corresponding color value comprises:
the original color values are weighted based on the corresponding color values, with the corresponding color values taken as accumulated weights, product weights, or quotient weights of the original color values.
4. The method of claim 1, wherein said rendering the three-dimensional contrast volume data of the moment comprises:
performing surface rendering and/or volume rendering on the three-dimensional contrast volume data of the moment.
5. A method of four-dimensional ultrasound imaging, the method comprising:
acquiring four-dimensional contrast volume data, and executing the following operations on three-dimensional contrast volume data at each moment in the four-dimensional contrast volume data:
updating a time tag of a corresponding voxel in first index table volume data based on the three-dimensional contrast volume data of the moment, and integrating the time tags to obtain time integration information, wherein the first index table volume data is pre-established volume data for recording the occurrence time of a contrast signal;
rendering the three-dimensional contrast volume data of the moment to obtain a rendered image, and acquiring accumulated opacity information of each pixel in the rendered image;
retrieving a corresponding color value from a pre-created color information table associated with time integration and accumulated opacity, based on the accumulated opacity information and the time integration information of each pixel;
fusing the original color value of each pixel with the retrieved corresponding color value to obtain a fused color value for each pixel; and
mapping the fused color value of each pixel onto the respective pixel of the rendered image to obtain a volume rendering map.
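By way of illustration only, the index-table bookkeeping of claim 5 might look as follows, assuming viewing rays run along axis 0 of the volume and that integration is a per-ray sum; the signal threshold, the -1 sentinel, and the normalization are assumptions.

```python
import numpy as np

def update_and_integrate(index_vol, contrast_vol, t, threshold=0.1):
    """Record first-occurrence time tags in the first index table volume data
    (initialized to -1) and integrate them along each ray to obtain one
    time-integration value per pixel."""
    newly_lit = (contrast_vol > threshold) & (index_vol < 0)
    index_vol[newly_lit] = t                       # record first-occurrence moment
    norm = np.where(index_vol >= 0, index_vol / max(t, 1), 0.0)  # normalize tags
    time_integral = norm.sum(axis=0)               # one value per pixel/ray
    return index_vol, time_integral
```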
6. The method of claim 5, wherein the method further comprises:
normalizing the time tags in the first index table volume data before integrating the time tags.
7. The method of claim 5, wherein the color values in the color information table are color values in a non-hue-type color space, and further comprising:
the original color values of each pixel are converted to the non-hue-type color space prior to the fusing.
8. The method of claim 5, wherein said fusing said original color value for each pixel with said retrieved corresponding color value comprises:
the original color values are weighted based on the corresponding color values, with the corresponding color values taken as accumulated weights, product weights, or quotient weights of the original color values.
9. The method of claim 5, wherein said rendering the three-dimensional contrast volume data of the moment comprises:
performing surface rendering and/or volume rendering on the three-dimensional contrast volume data of the moment.
10. A method of four-dimensional ultrasound imaging, the method comprising:
acquiring four-dimensional contrast volume data, and executing the following operations on three-dimensional contrast volume data at each moment in the four-dimensional contrast volume data:
updating a time tag of a corresponding voxel in first index table volume data based on the three-dimensional contrast volume data of the moment, and integrating the time tags to obtain time integration information, wherein the first index table volume data is pre-established volume data for recording the occurrence time of a contrast signal;
updating a time tag of a corresponding voxel in second index table volume data based on the three-dimensional contrast volume data of the moment, wherein the second index table volume data is pre-established volume data for recording the continuous occurrence time of a contrast signal;
rendering the three-dimensional contrast volume data of the moment to obtain a rendered image, and acquiring accumulated opacity information of each pixel in the rendered image based on the second index table volume data;
retrieving a corresponding color value from a pre-created color information table associated with time integration and accumulated opacity, based on the accumulated opacity information and the time integration information of each pixel;
fusing the original color value of each pixel with the retrieved corresponding color value to obtain a fused color value for each pixel; and
mapping the fused color value of each pixel onto the respective pixel of the rendered image to obtain a volume rendering map.
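By way of illustration only, the second index table of claim 10 might be maintained as follows, assuming a fixed signal threshold and a saturating per-ray sum for the accumulated opacity; these numeric choices are assumptions, not part of the claim.

```python
import numpy as np

def update_second_index(second_index_vol, contrast_vol, threshold=0.1, max_duration=32):
    """Track, per voxel, for how many consecutive moments the contrast signal
    has been present, and derive a per-pixel accumulated opacity from it."""
    lit = contrast_vol > threshold
    second_index_vol[lit] += 1     # extend the continuous-occurrence count
    second_index_vol[~lit] = 0     # a gap in the signal resets the count
    duration = np.clip(second_index_vol / max_duration, 0.0, 1.0)
    opacity = np.clip(duration.sum(axis=0), 0.0, 1.0)  # rays along axis 0
    return second_index_vol, opacity
```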
11. The method according to claim 10, wherein the method further comprises:
normalizing the time tags in the first index table volume data before integrating the time tags.
12. The method according to claim 10, wherein the method further comprises:
normalizing the time tags in the second index table volume data before acquiring the accumulated opacity information of each pixel in the rendered image based on the second index table volume data.
13. The method of claim 10, wherein the color values in the color information table are color values in a non-hue-type color space, and further comprising:
the original color values of each pixel are converted to the non-hue-type color space prior to the fusing.
14. The method of claim 10, wherein said fusing said original color value for each pixel with said retrieved corresponding color value comprises:
the original color values are weighted based on the corresponding color values, with the corresponding color values taken as accumulated weights, product weights, or quotient weights of the original color values.
15. The method of claim 10, wherein said rendering the three-dimensional contrast volume data of the moment comprises:
performing surface rendering and/or volume rendering on the three-dimensional contrast volume data of the moment.
16. A four-dimensional ultrasound imaging apparatus, characterized in that the apparatus comprises a memory and a processor, the memory having stored thereon a computer program which, when run by the processor, performs the four-dimensional ultrasound imaging method according to any one of claims 1-15.
17. An ultrasound system comprising the four-dimensional ultrasound imaging device of claim 16.
18. A storage medium having stored thereon a computer program which, when run, performs the four-dimensional ultrasound imaging method of any of claims 1-15.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201911018471.4A CN112704513B (en) | 2019-10-24 | 2019-10-24 | Four-dimensional ultrasonic imaging method, device, system and storage medium |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN112704513A CN112704513A (en) | 2021-04-27 |
| CN112704513B true CN112704513B (en) | 2024-10-11 |
Family
ID=75540283
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201911018471.4A Active CN112704513B (en) | 2019-10-24 | 2019-10-24 | Four-dimensional ultrasonic imaging method, device, system and storage medium |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN112704513B (en) |
Citations (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109741444A (en) * | 2019-01-10 | 2019-05-10 | 深圳开立生物医疗科技股份有限公司 | A kind of three-dimensional contrastographic picture display methods, device, equipment and storage medium |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP5762076B2 (en) * | 2010-03-30 | 2015-08-12 | 株式会社東芝 | Ultrasonic diagnostic apparatus, ultrasonic image processing apparatus, and medical image diagnostic apparatus |
| US9224240B2 (en) * | 2010-11-23 | 2015-12-29 | Siemens Medical Solutions Usa, Inc. | Depth-based information layering in medical diagnostic ultrasound |
| US20130150719A1 (en) * | 2011-12-08 | 2013-06-13 | General Electric Company | Ultrasound imaging system and method |
- 2019-10-24: application CN201911018471.4A (CN), granted as CN112704513B, status Active
Also Published As
| Publication number | Publication date |
|---|---|
| CN112704513A (en) | 2021-04-27 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |