
CN114170213A - Parathyroid gland identification method based on image fusion technology - Google Patents


Info

Publication number
CN114170213A
CN114170213A
Authority
CN
China
Prior art keywords
image
thyroid
parathyroid gland
live
model
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
CN202111529377.2A
Other languages
Chinese (zh)
Inventor
高永楷
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN202111529377.2A priority Critical patent/CN114170213A/en
Publication of CN114170213A publication Critical patent/CN114170213A/en
Withdrawn legal-status Critical Current

Classifications

    • G PHYSICS › G06 COMPUTING OR CALCULATING; COUNTING › G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/0014 Biomedical image inspection using an image reference approach
    • G06T 17/00 Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 19/20 Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
    • G06T 7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T 2207/10081 Computed x-ray tomography [CT]
    • G06T 2207/10088 Magnetic resonance imaging [MRI]
    • G06T 2207/10108 Single photon emission computed tomography [SPECT]
    • G06T 2207/10121 Fluoroscopy
    • G06T 2207/10132 Ultrasound image
    • G06T 2207/20221 Image fusion; Image merging
    • G06T 2210/41 Medical
    • G06T 2210/61 Scene description

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Medical Informatics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • General Health & Medical Sciences (AREA)
  • Geometry (AREA)
  • Health & Medical Sciences (AREA)
  • Architecture (AREA)
  • Computer Hardware Design (AREA)
  • General Engineering & Computer Science (AREA)
  • Apparatus For Radiation Diagnosis (AREA)

Abstract

The invention discloses a parathyroid gland identification method based on image fusion technology, comprising the following steps: acquiring a depth image for each fluorescence image and each live-action image, and performing three-dimensional reconstruction of the thyroid tissue from the depth images to obtain three-dimensional thyroid tissue images; correcting and stitching the three-dimensional thyroid tissue images according to the position information associated with each fluorescence image and live-action image, generating a thyroid three-dimensional live-action model and a thyroid fluorescence model; identifying the parathyroid gland regions contained in the two models with a Dssd_Inception_V3 model; overlaying and fusing the thyroid three-dimensional live-action model, carrying its parathyroid region markers, with the thyroid fluorescence model; and identifying the overlapping parathyroid regions as targets while treating the non-overlapping regions as secondary attention regions. The method greatly improves the accuracy of parathyroid gland identification.

Description

Parathyroid gland identification method based on image fusion technology
Technical Field
The invention relates to the technical field of medical assistance, in particular to a parathyroid gland identification method based on an image fusion technology.
Background
The parathyroid glands are important endocrine glands whose main function is to secrete parathyroid hormone (PTH). A living parathyroid gland is typically flat and oval, yellow or brownish-yellow in appearance, and roughly the size and shape of a soybean. Most people have four parathyroid glands, distributed symmetrically at the middle and lower parts of the left and right thyroid lobes, but their position and number vary considerably between individuals.
Because the parathyroid glands are small and variable in position and number, they are not easily distinguished from ectopic thyroid tissue, ectopic thymus, surrounding fat, and lymph nodes, which greatly increases the risk of accidental excision, injury, contusion, or damage to their blood supply. If a parathyroid gland is accidentally removed during neck surgery such as thyroidectomy, parathyroid hormone secretion becomes insufficient, blood calcium falls, and blood phosphorus rises, causing numbness of the patient's hands, feet, lips, and limbs and seriously affecting postoperative quality of life.
Accurate intraoperative identification of the parathyroid glands is a precondition for protecting them from accidental excision or injury. The common intraoperative identification methods are as follows:
1. visual identification: the operative area is observed visually or with the aid of an endoscope to look for suspected parathyroid tissue. Its advantages are short time, low cost and high requirement to personal experience of doctor.
2. Staining identification: the parathyroid glands are distinguished by staining them or their surrounding tissues with a specific dye. Marking dyes fall into positive and negative stains. A positive stain, such as methylene blue, directly stains and marks the parathyroid tissue; a negative stain, such as injected carbon nanoparticles, stains the surrounding tissues so that the parathyroid glands stand out by contrast. Staining identification is relatively non-invasive and accurate, but it is costly, the timing, site, and dosage of staining remain disputed, and some stains have side effects on the human body: methylene blue, for example, may cause nausea and vomiting, fever, hypoxia, and stinging at the urethral orifice.
3. Autofluorescence identification: studies have shown that when irradiated with light at a wavelength of 785 nm, the parathyroid gland emits near-infrared autofluorescence peaking at 820-830 nm. This autofluorescence originates from an intrinsic fluorophore, unlike the fluorescence produced by a labelling dye, and the property can be used to distinguish the parathyroid gland from surrounding tissues. Autofluorescence identification is accurate, simple, fast, and non-invasive, avoids the side effects of fluorescent dyes and contrast agents, and therefore has far broader research and application prospects than the other methods. In practice, however, some lymph nodes and electrocautery-burned adipose tissue also autofluoresce, producing interference that makes the accuracy of autofluorescence identification difficult to guarantee.
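As a rough illustration of how an autofluorescence signal in the 820-830 nm emission band might be isolated, the sketch below thresholds a near-infrared frame against its peak intensity. This is NumPy-only toy code, not the patent's method; the function name and the 0.6 relative cutoff are hypothetical, and a real system would calibrate against tissue-specific baselines.

```python
import numpy as np

def autofluorescence_mask(nir_frame, threshold=0.6):
    """Boolean mask of pixels whose near-infrared autofluorescence
    intensity exceeds a fraction of the frame maximum.

    nir_frame -- 2-D array of NIR intensities captured under 785 nm
                 excitation (the 820-830 nm emission band).
    threshold -- hypothetical relative cutoff.
    """
    peak = float(nir_frame.max())
    if peak == 0.0:
        return np.zeros_like(nir_frame, dtype=bool)
    return nir_frame >= threshold * peak

# Toy frame: one bright autofluorescent spot on a dark background.
frame = np.zeros((4, 4))
frame[1, 2] = 1.0   # candidate parathyroid pixel
frame[3, 3] = 0.3   # dim background (e.g. cauterized fat)
mask = autofluorescence_mask(frame, threshold=0.6)
```

Note that a plain threshold cannot separate the parathyroid from other autofluorescing tissue, which is exactly the interference problem the method below addresses by cross-checking against a live-action model.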
Disclosure of Invention
To solve these problems, the invention provides a parathyroid gland identification method based on image fusion technology. By overlaying, fusing, and jointly interpreting a thyroid three-dimensional live-action model and a thyroid fluorescence model, both constructed from fluorescence images and live-action images, the method greatly improves the accuracy of parathyroid gland identification.
To achieve this purpose, the invention adopts the following technical scheme:
A parathyroid gland identification method based on image fusion technology, characterized by comprising the following steps:
S1, acquiring fluorescence images and live-action images of the thyroid tissue to be identified within the same field of view, and comprehensively analyzing fused SPECT/CT images, ultrasound images, and head-and-neck MRI to delineate the parathyroid region and understand its relation to adjacent organs, so as to avoid an excessive surgical range and damage to important surrounding organs;
S2, acquiring a depth image for each fluorescence image and each live-action image with a Kinect depth sensor, and performing three-dimensional reconstruction of the thyroid tissue from the depth images to obtain three-dimensional thyroid tissue images;
S3, correcting and stitching the three-dimensional thyroid tissue images according to the position information associated with each fluorescence image and live-action image, generating a thyroid three-dimensional live-action model and a thyroid fluorescence model;
S4, identifying the parathyroid gland regions contained in the thyroid three-dimensional live-action model and the thyroid fluorescence model with the Dssd_Inception_V3 model;
S5, overlaying and fusing the thyroid three-dimensional live-action model, carrying its parathyroid region markers, with the thyroid fluorescence model;
S6, identifying the overlapping parathyroid regions as targets and treating the non-overlapping regions as secondary attention regions.
Further, the images acquired in step S1 comprise at least top-view, left-view, right-view, front-view, and back-view fluorescence images and top-view, left-view, right-view, front-view, and back-view live-action images of the thyroid tissue, and each fluorescence image and live-action image carries the attitude and position information of the image acquisition device that captured it.
Further, in step S3 the deflection angle and size of each three-dimensional thyroid tissue image are corrected using the device attitude and position information of the top-view fluorescence image/top-view live-action image as the reference together with the device attitude and position information of the remaining fluorescence and live-action images.
Further, the shape and color of the parathyroid gland serve as its detection features in the thyroid three-dimensional live-action model, while the fluorescing region serves as its detection feature in the thyroid fluorescence model.
Further, the fluorescence images are thyroid tissue images acquired by a fluorescence-imaging surgical navigation system, and the live-action images are thyroid tissue images acquired by an endoscope within the same field of view.
Further, in step S4, once the Dssd_Inception_V3 model completes detection, the parathyroid regions are circled in the thyroid three-dimensional live-action model and the thyroid fluorescence model.
Further, in step S5, the overlay fusion of the marker-carrying thyroid three-dimensional live-action model and the thyroid fluorescence model is performed by aligning the center-point coordinates and the fixed-point coordinates of the two models.
Further, in step S6, the overlapping parathyroid regions are taken as the target parathyroid glands; when the surgical operating region falls into a non-overlapping region, the endoscope automatically magnifies that region and the parathyroid gland is re-detected with the Dssd_Inception_V3 model.
The invention has the following beneficial effects:
1) Overlaying, fusing, and cross-checking the thyroid three-dimensional live-action model and the thyroid fluorescence model constructed from fluorescence and live-action images takes full account of the parathyroid gland's position, size, and characteristic parameters under different conditions, and the mutual verification greatly improves identification accuracy.
2) Machine vision replaces manual detection and identification, improving accuracy without relying on personal working experience.
3) Treating the non-overlapping regions as secondary attention regions, with corresponding measures against missed detection, further improves surgical safety.
Drawings
Fig. 1 is a flowchart of a parathyroid gland identification method based on an image fusion technology in an embodiment of the present invention.
Detailed Description
The present invention will be described in detail with reference to specific examples. The following examples will help those skilled in the art to further understand the invention, but are not intended to limit it in any way. It should be noted that persons skilled in the art can make variations and modifications without departing from the spirit of the invention, all of which fall within the scope of the present invention.
Examples
As shown in fig. 1, a parathyroid gland identification method based on image fusion technology includes the following steps:
S1, acquiring fluorescence images and live-action images of the thyroid tissue to be identified within the same field of view, and comprehensively analyzing fused SPECT/CT images, ultrasound images, and head-and-neck MRI to delineate the parathyroid region; the fluorescence images are thyroid tissue images acquired by a fluorescence-imaging surgical navigation system, and the live-action images are thyroid tissue images acquired by an endoscope within the same field of view;
S2, acquiring a depth image for each fluorescence image and each live-action image with a Kinect depth sensor, and performing three-dimensional reconstruction of the thyroid tissue from the depth images to obtain three-dimensional thyroid tissue images. Specifically, after the depth images are acquired by the Kinect sensor they are triangulated; all triangulated depth images are fused in scale space to construct a layered directional distance field; a global triangulation of all voxels in the distance field generates a convex hull covering them; and an isosurface is then extracted with the marching cubes algorithm, completing the three-dimensional reconstruction and yielding the three-dimensional thyroid tissue image.
S3, correcting and stitching the three-dimensional thyroid tissue images according to the position information associated with each fluorescence image and live-action image, generating a thyroid three-dimensional live-action model and a thyroid fluorescence model;
S4, identifying the parathyroid gland regions contained in the thyroid three-dimensional live-action model and the thyroid fluorescence model with the Dssd_Inception_V3 model;
S5, overlaying and fusing the thyroid three-dimensional live-action model, carrying its parathyroid region markers, with the thyroid fluorescence model;
S6, identifying the overlapping parathyroid regions as targets and treating the non-overlapping regions as secondary attention regions.
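A minimal sketch of the distance-field fusion underlying the reconstruction in step S2, assuming per-view signed distance fields have already been computed from the triangulated depth images. The weighting scheme, grid size, and sphere phantom are all illustrative assumptions; a full pipeline would extract the zero isosurface of the fused field with marching cubes.

```python
import numpy as np

def fuse_distance_fields(fields, weights):
    """Fuse per-view signed distance fields into one field by weighted
    averaging -- a simplified stand-in for the layered directional
    distance field of step S2. The reconstructed surface is the zero
    level set of the fused field."""
    stacked = np.stack(fields)                     # (n_views, X, Y, Z)
    w = np.asarray(weights, dtype=float).reshape(-1, 1, 1, 1)
    return (w * stacked).sum(axis=0) / w.sum()

# Two toy distance fields of the same sphere, one slightly noisier.
grid = np.indices((8, 8, 8)).astype(float)
dist = np.linalg.norm(grid - 3.5, axis=0)          # distance from centre
sdf_clean = dist - 2.0                             # sphere of radius 2
sdf_noisy = sdf_clean + np.random.default_rng(0).normal(0, 0.05, dist.shape)
fused = fuse_distance_fields([sdf_clean, sdf_noisy], weights=[2.0, 1.0])
surface = np.abs(fused) < 0.5                      # voxels near the surface
```

Weighting the cleaner field more heavily keeps the fused surface close to it while still averaging down per-view noise, which is the point of fusing multiple depth views before meshing.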
In this embodiment, the images acquired in step S1 comprise at least top-view, left-view, right-view, front-view, and back-view fluorescence images and top-view, left-view, right-view, front-view, and back-view live-action images of the thyroid tissue, and each fluorescence image and live-action image carries the attitude and position information of the image acquisition device that captured it. After the fluorescence-imaging surgical navigation system and the endoscope enter the patient's body, the acquisition angle is adjusted automatically to complete image capture.
in this embodiment, in step S3, the correction of the deflection angle and the size of the three-dimensional thyroid tissue image is implemented according to the posture information and the position information of the image capturing device corresponding to the rest of the fluoroscopic developed image and the live-action image, with the posture information and the position information of the image capturing device corresponding to the overlook fluoroscopic developed image/the overlook live-action image as the reference.
In this embodiment, the shape and color of the parathyroid gland serve as its detection features in the thyroid three-dimensional live-action model, and the fluorescing region serves as its detection feature in the thyroid fluorescence model. That is, the Dssd_Inception_V3 model that identifies parathyroid regions in the thyroid three-dimensional live-action model uses the DSSD object detection algorithm and is obtained by training an Inception-V3 backbone with the shape and color features of the parathyroid glands in historical parathyroid images as the training parameter set. The Dssd_Inception_V3 model that identifies parathyroid regions in the thyroid fluorescence model is likewise obtained with the DSSD object detection algorithm, using the color and pixel-channel features of the fluorescing regions in historical parathyroid fluorescence images as the training parameter set.
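As a toy stand-in for the colour features mentioned above, the sketch below computes a normalised per-channel histogram of an image patch. The described method actually trains a DSSD detector with an Inception-V3 backbone end to end on whole images; this hand-made feature, its name, and the sample colour are purely illustrative assumptions.

```python
import numpy as np

def color_histogram_feature(patch, bins=4):
    """Flattened per-channel colour histogram of an image patch,
    normalised by pixel count. A toy stand-in for the colour features
    in the text, not the patent's learned features."""
    patch = np.asarray(patch, dtype=float)
    feats = [np.histogram(patch[..., ch], bins=bins, range=(0, 256))[0]
             for ch in range(patch.shape[-1])]
    return np.concatenate(feats).astype(float) / patch[..., 0].size

# A uniform brownish-yellow RGB patch, roughly parathyroid-coloured.
patch = np.full((8, 8, 3), (200, 170, 60))
feat = color_histogram_feature(patch)
```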
In this embodiment, after the Dssd_Inception_V3 model completes detection in step S4, the parathyroid regions are circled in the thyroid three-dimensional live-action model and the thyroid fluorescence model, preferably marked in different colors in the two models, and the system records the position and size information of each region.
In this embodiment, the overlay fusion in step S5 of the marker-carrying thyroid three-dimensional live-action model and the thyroid fluorescence model is performed by aligning the center-point coordinates and the fixed-point coordinates of the two models, so that the two models share the same center point. After the fusion, the position information of the parathyroid regions in both models changes with the models' center point, and the overlapping and non-overlapping regions are then obtained by comparing the position and size information of the parathyroid regions of the two models. The system automatically records the position and size information of the overlapping and non-overlapping regions.
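The shared-centre overlay of step S5 and the subsequent split into overlapping and non-overlapping regions can be sketched with marked voxel sets. The coordinates are illustrative, and the patent compares stored position and size records rather than raw voxel sets; this only shows the centre-alignment idea.

```python
import numpy as np

def overlap_after_centering(voxels_a, voxels_b):
    """Shift two sets of marked voxel coordinates so their centroids
    coincide (the shared centre point of step S5), round back onto the
    grid, and return the overlapping and non-overlapping voxels."""
    a = np.asarray(voxels_a, dtype=float)
    b = np.asarray(voxels_b, dtype=float)
    set_a = {tuple(int(v) for v in p) for p in np.round(a - a.mean(axis=0))}
    set_b = {tuple(int(v) for v in p) for p in np.round(b - b.mean(axis=0))}
    return set_a & set_b, set_a ^ set_b

# The live-action model marks two candidate voxels; the fluorescence
# model marks three, two of which coincide once the centres are aligned.
live = [(0, 0, 0), (4, 0, 0)]
fluo = [(10, 0, 0), (12, 0, 0), (14, 0, 0)]
overlap, non_overlap = overlap_after_centering(live, fluo)
```

The overlapping voxels play the role of the mutually confirmed parathyroid regions; the symmetric difference corresponds to the secondary attention regions of step S6.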
In this embodiment, in step S6 the overlapping parathyroid regions are the target parathyroid glands. When the surgical operating region falls into a non-overlapping region, that is, when it enters the range defined by the recorded position and size information of a non-overlapping region, the endoscope automatically magnifies that region, and the parathyroid gland is re-detected there with the Dssd_Inception_V3 model used for the thyroid three-dimensional live-action model.
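The step-S6 trigger, where entering a recorded non-overlapping region causes magnification and re-detection, amounts to a point-in-box test. The box bookkeeping below is a hypothetical stand-in for the system's stored position and size records.

```python
def needs_redetection(instrument_pos, non_overlap_boxes):
    """Return the first recorded non-overlapping region, stored as an
    axis-aligned box (min_corner, max_corner), that contains the
    instrument position, or None. A hit would trigger the automatic
    magnification and re-detection described in step S6."""
    for box in non_overlap_boxes:
        lo, hi = box
        if all(l <= p <= h for p, l, h in zip(instrument_pos, lo, hi)):
            return box
    return None

boxes = [((5, 5, 0), (8, 9, 3))]            # one stored non-overlap region
hit = needs_redetection((6, 7, 1), boxes)   # instrument inside the box
miss = needs_redetection((0, 0, 0), boxes)  # instrument elsewhere
```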
The foregoing has described specific embodiments of the present invention. It is to be understood that the invention is not limited to these embodiments, and that a person skilled in the art may make various changes or modifications within the scope of the appended claims without departing from the spirit of the invention. In the absence of conflict, the embodiments and the features of the embodiments of the present application may be combined with one another arbitrarily.

Claims (8)

1. A parathyroid gland identification method based on image fusion technology, characterized by comprising the following steps:
S1, acquiring fluorescence images and live-action images of the thyroid tissue to be identified within the same field of view, and comprehensively analyzing fused SPECT/CT images, ultrasound images, and head-and-neck MRI to delineate the parathyroid region;
S2, acquiring a depth image for each fluorescence image and each live-action image with a Kinect depth sensor, and performing three-dimensional reconstruction of the thyroid tissue from the depth images to obtain three-dimensional thyroid tissue images;
S3, correcting and stitching the three-dimensional thyroid tissue images according to the position information associated with each fluorescence image and live-action image, generating a thyroid three-dimensional live-action model and a thyroid fluorescence model;
S4, identifying the parathyroid gland regions contained in the thyroid three-dimensional live-action model and the thyroid fluorescence model with the Dssd_Inception_V3 model;
S5, overlaying and fusing the thyroid three-dimensional live-action model, carrying its parathyroid region markers, with the thyroid fluorescence model;
S6, identifying the overlapping parathyroid regions as targets and treating the non-overlapping regions as secondary attention regions.
2. The parathyroid gland identification method based on image fusion technology of claim 1, characterized in that: the images acquired in step S1 comprise at least top-view, left-view, right-view, front-view, and back-view fluorescence images and top-view, left-view, right-view, front-view, and back-view live-action images of the thyroid tissue, and each fluorescence image and live-action image carries the attitude and position information of the image acquisition device that captured it.
3. The parathyroid gland identification method based on image fusion technology of claim 1, characterized in that: in step S3, the deflection angle and size of each three-dimensional thyroid tissue image are corrected using the device attitude and position information of the top-view fluorescence image/top-view live-action image as the reference together with the device attitude and position information of the remaining fluorescence and live-action images.
4. The parathyroid gland identification method based on image fusion technology of claim 1, characterized in that: the shape and color of the parathyroid gland serve as its detection features in the thyroid three-dimensional live-action model, and the fluorescing region serves as its detection feature in the thyroid fluorescence model.
5. The parathyroid gland identification method based on image fusion technology of claim 1, characterized in that: the fluorescence images are thyroid tissue images acquired by a fluorescence-imaging surgical navigation system, and the live-action images are thyroid tissue images acquired by an endoscope within the same field of view.
6. The parathyroid gland identification method based on image fusion technology of claim 1, characterized in that: in step S4, after the Dssd_Inception_V3 model completes detection, the parathyroid regions are circled in the thyroid three-dimensional live-action model and the thyroid fluorescence model.
7. The parathyroid gland identification method based on image fusion technology of claim 1, characterized in that: in step S5, the overlay fusion of the marker-carrying thyroid three-dimensional live-action model and the thyroid fluorescence model is performed by aligning the center-point coordinates and the fixed-point coordinates of the two models.
8. The parathyroid gland identification method based on image fusion technology of claim 1, characterized in that: in step S6, the overlapping parathyroid regions are the target parathyroid glands; when the surgical operating region falls into a non-overlapping region, the endoscope automatically magnifies that region and the parathyroid gland is re-detected with the Dssd_Inception_V3 model.
CN202111529377.2A 2021-12-15 2021-12-15 Parathyroid gland identification method based on image fusion technology Withdrawn CN114170213A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111529377.2A CN114170213A (en) 2021-12-15 2021-12-15 Parathyroid gland identification method based on image fusion technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202111529377.2A CN114170213A (en) 2021-12-15 2021-12-15 Parathyroid gland identification method based on image fusion technology

Publications (1)

Publication Number Publication Date
CN114170213A 2022-03-11

Family

ID=80486506

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111529377.2A Withdrawn CN114170213A (en) 2021-12-15 2021-12-15 Parathyroid gland identification method based on image fusion technology

Country Status (1)

Country Link
CN (1) CN114170213A (en)


Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115797617A (en) * 2022-12-05 2023-03-14 杭州显微智能科技有限公司 Parathyroid gland identification method and intelligent endoscope camera system device
CN116385337A (en) * 2022-12-15 2023-07-04 陕西中科创孚医疗科技有限责任公司 Parathyroid gland recognition device and method based on multi-light fusion
CN116385337B (en) * 2022-12-15 2023-10-17 西安长空医疗科技服务有限公司 Parathyroid gland recognition device and method based on multi-light fusion

Similar Documents

Publication Publication Date Title
US11275249B2 (en) Augmented visualization during surgery
US12053333B2 (en) Surgical enhanced visualization system and method of use
Bert et al. Clinical experience with a 3D surface patient setup system for alignment of partial-breast irradiation patients
US20220230334A1 (en) Pen-type medical fluorescent imaging device and system for aligning multiple fluorescent images using the same
CN107374729B (en) Operation navigation system and method based on AR technology
US9538907B2 (en) Endoscope system and actuation method for displaying an organ model image pasted with an endoscopic image
WO2016152042A1 (en) Endoscopic examination support device, method, and program
JP6430517B2 (en) How to calculate a surgical intervention plan
CN114170213A (en) Parathyroid gland identification method based on image fusion technology
EP3554383B1 (en) System for providing images for guiding surgery
EP2907453A1 (en) Diagnosis assistance device and diagnosis assistance method
CN107596578A (en) The identification and location determining method of alignment mark, imaging device and storage medium
US11690558B2 (en) Surgical navigation with stereovision and associated methods
US20210127957A1 (en) Apparatus for intraoperative identification and viability assessment of tissue and method using the same
CN110720985A (en) Multi-mode guided surgical navigation method and system
WO2015187620A1 (en) Surgical navigation with stereovision and associated methods
US20040152975A1 (en) Image registration
CN113205141B (en) Parathyroid gland identification method based on image fusion technology
CN119654105A (en) CT scanner and scanning method for performing brain scans
CN109620406B (en) Display and registration method for total knee arthroplasty
CN111728695A (en) A beam-assisted positioning method and positioning system for craniotomy
Rutten et al. Toward functional neuronavigation: implementation of functional magnetic resonance imaging data in a surgical guidance system for intraoperative identification of motor and language cortices: Technical note and illustrative case
WO2018109227A1 (en) System providing images guiding surgery
CN104463967A (en) Skin disease quantitative evaluation device
JP6795744B2 (en) Medical support method and medical support device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WW01 Invention patent application withdrawn after publication (application publication date: 20220311)