CN118285913A - Method for guiding endoscopic surgery under a navigation system, electronic device, navigation system and surgical robot system
- Publication number: CN118285913A
- Application number: CN202410385476.5A
- Authority: CN (China)
- Prior art keywords: image, endoscope, scale model, navigation system, navigation
- Legal status: Pending
Classifications
- A61B34/20: Surgical navigation systems; devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B1/313: Endoscopes for introduction through surgical openings, e.g. laparoscopes
- A61B1/317: Endoscopes for bones or joints, e.g. osteoscopes, arthroscopes
- A61B17/00234: Surgical instruments, devices or methods for minimally invasive surgery
- A61B34/30: Surgical robots
- G06T11/60: 2D image generation; editing figures and text; combining figures or text
- G06T19/003: Navigation within 3D models or images
- G06T19/20: Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
- A61B2034/2046: Tracking techniques
- A61B2034/2065: Tracking using image or pattern recognition
- A61B2034/2068: Navigation using pointers, e.g. pointers having reference marks for determining coordinates of body points
- A61B2034/301: Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
- G06T2219/012: Dimensioning, tolerancing
Abstract
The present invention relates to a method of guiding an endoscopic procedure under a navigation system, together with a corresponding computer-readable storage medium, control device, computer program product, electronic device for navigation of an endoscopic procedure, navigation system and surgical robot system. The method comprises the following steps: an endoscope pose acquisition step: acquiring the pose of the endoscope under the navigation system; a scale model construction step: constructing a scale model according to the pose of the endoscope under the navigation system; a guide image acquisition step; an enhanced image fusion step: fusing the scale model with the guide image to obtain an enhanced image; and a display step: displaying a view including at least a portion of the enhanced image. Because the scale model is fused into the guide image, an operator can use it as a reference to judge more intuitively the length of the tool extending out of the working channel and the distance/direction of each physiological structure/point of interest in the under-scope image relative to the endoscope and the tool, and can therefore operate the endoscope more accurately.
Description
Technical Field
The present invention relates to the technical field of medical devices, in particular to surgical navigation systems, and more particularly to a method for guiding endoscopic surgery, an electronic device, a navigation system and a surgical robot system.
Background
In conventional endoscopic surgery, the endoscopic view alone is used for surgical observation, tool manipulation and control. The doctor can obtain information such as the physiological structures at the front end of the endoscope or the position of the tool only through the two-dimensional endoscopic picture on the display device of the endoscopic imaging system. When judging the distance between an under-scope physiological structure and the endoscope, or the length of the tool extending out of the tool channel, the doctor can rely only on surgical experience, or on viewing the working area at the front end of the endoscope with a C-arm during surgery.
In particular, spinal endoscopic surgery has been widely accepted and popularized in clinical practice in recent years due to its minimally invasive nature. However, because of the special configuration of the spinal endoscope, physicians face great challenges in becoming familiar with and ultimately mastering spinal endoscopic surgical technique.
When relying on surgical experience to judge the distance of an under-scope physiological structure from the endoscope or the length of the tool extending out of the tool channel, the physician must account for the angled bevel at the distal end of the spinal endoscope. Different spinal endoscopes have different bevel angles (e.g., 15°, 30°, 45°, etc.), so even the same operating tool looks different under the scope depending on the angle, i.e., the viewing angle. The accuracy of the distance or length judgment therefore depends entirely on the physician's familiarity with spinal endoscopes of a given bevel angle. Meanwhile, the operating tools used with spinal endoscopes often lack standardization and have no unified length; once a tool is replaced, the physician often needs to re-adapt, with a corresponding possibility of misjudgment.
Relying on a C-arm to view the working area at the front end of the endoscope during surgery often causes additional radiation exposure injury to the patient and to the physician, who must hold the spinal endoscope and tools and cannot step away.
There is also still no solution for quickly determining, during surgery, the positional relationship (direction and/or distance) between a point under the scope and a particular physiological structure of interest.
This situation means that endoscopists need outstanding knowledge of physiological anatomy, excellent spatial imagination and the accumulation of many surgical cases to climb the so-called "learning curve". In general, a doctor needs 30 to 50 operations or more to master spinal endoscopic technique proficiently, so training a qualified spinal endoscopist often takes a long time and comes at a considerable cost, which also limits the adoption and popularization of spinal endoscopic techniques.
Disclosure of Invention
The present invention is directed to solving at least one of the above-mentioned problems and disadvantages of the prior art, as well as other technical problems.
In one aspect, the present invention provides a method of guiding an endoscopic procedure under a navigation system, the method comprising the following steps: an endoscope pose acquisition step: acquiring the pose of the endoscope under the navigation system; a scale model construction step: constructing a scale model according to the pose of the endoscope under the navigation system; a guide image acquisition step; an enhanced image fusion step: fusing the scale model with the guide image to obtain an enhanced image; and a display step: displaying a view including at least a portion of the enhanced image.
The guide image referred to herein generally means an image displayed on a display device of the navigation system that is capable of guiding or assisting an operator in performing the endoscopic procedure, including but not limited to a real-time image from the endoscope, or a navigation view that is generated based on a two-dimensional or three-dimensional image of the patient and in which an endoscope model is displayed.
In this example, because the scale model is fused into the guide image, the operator can more intuitively judge, with the scale model displayed on the image as a reference, the length of the tool extending out of the working channel and the distance of each physiological structure/point of interest in the under-scope image, and can therefore operate more accurately. Meanwhile, because the navigation system is used to acquire the pose of the endoscope and the scale model is constructed from that pose, i.e., the pose of the scale model is determined from it, the acquired pose of the scale model is more accurate and the data processing load is small (compared with approaches that determine the pose of the scale model by processing images), which facilitates accurate fusion of the scale model with the guide image.
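For readers who prefer pseudocode, the claimed loop can be sketched as follows. This sketch is purely illustrative and is not part of the patent disclosure; every function name is a hypothetical placeholder, and the tracking and video sources are stubbed out.

```python
"""Illustrative sketch of the claimed guidance loop; all names are
hypothetical placeholders, and the tracking/video sources are stubbed."""
import numpy as np

def get_endoscope_pose():
    # Endoscope pose acquisition step: 4x4 pose of the endoscope (via its
    # tracer) in the navigation coordinate system, from the tracking device.
    return np.eye(4)  # stub

def build_scale_model(T_nav_endo, length_mm):
    # Scale model construction step: a segment starting at the tool-channel
    # centre point P and extending along the channel axis away from the scope.
    P = T_nav_endo[:3, 3]
    axis = T_nav_endo[:3, :3] @ np.array([0.0, 0.0, 1.0])
    return P, P + length_mm * axis

def grab_guide_image():
    # Guide image acquisition step: e.g. one frame of endoscopic video.
    return np.zeros((480, 640, 3), dtype=np.uint8)  # stub

def fuse(image, scale_model):
    # Enhanced image fusion step: overlay the scale model on the guide image
    # (projection and drawing details are sketched later in the text).
    return image

T_nav_endo = get_endoscope_pose()
ruler = build_scale_model(T_nav_endo, length_mm=30.0)
enhanced = fuse(grab_guide_image(), ruler)
# Display step: show a view containing at least part of `enhanced`.
```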
According to a further example of the invention, the pose of the endoscope under the navigation system is acquired by obtaining, from an optical tracking device of the navigation system, the position of a tracer having a fixed positional relationship with respect to the endoscope. This example uses an optical tracking device to track, for example, an optical tracer arranged on an external portion of the endoscope, and obtains the pose of the endoscope with an optical navigation system that is accurate and structurally simple.
According to a further example of the present invention, in the scale model construction step, the position under the navigation system of the centre point of the endoscope's tool channel on the distal end face of the endoscope, and the orientation under the navigation system of the central axis of the tool channel, are obtained from the pose of the endoscope; the scale model extends from the centre point as a starting point along the direction of the central axis, away from the endoscope. A scale model constructed according to this example is essentially coaxial with the spinal endoscope tool, and can therefore guide the distance by which the tool extends out of the tool channel more accurately.
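A minimal numpy sketch of this construction (an illustration, not the patent's implementation), assuming the channel centre point P and the channel axis have been pre-calibrated in the tracer's coordinate frame; all numeric values are made up:

```python
import numpy as np

T_nav_tracer = np.eye(4)              # tracer pose from the optical tracker (stub)
p_chan = np.array([0.0, 0.0, 250.0])  # channel centre P in tracer frame (calibrated; made-up value)
a_chan = np.array([0.0, 0.0, 1.0])    # channel axis in tracer frame (calibrated)

R, t = T_nav_tracer[:3, :3], T_nav_tracer[:3, 3]
P_nav = R @ p_chan + t                # centre point P under the navigation system
axis_nav = R @ a_chan                 # unit channel axis under the navigation system
length_mm = 30.0                      # ruler length (operator input)
Q_nav = P_nav + length_mm * axis_nav  # ruler runs from P_nav to Q_nav, away from the scope
```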
According to a further example of the present invention, the guide image includes an endoscopic image, and the enhanced image includes an endoscope-enhanced image obtained by fusing the scale model with the endoscopic image.
In this exemplary scheme, the operator can view the under-scope image in real time and judge, with the scale as a reference, the distance and direction of the under-scope tool and of under-scope structures relative to the tool, and can therefore operate more accurately.
According to a further example of the present invention, the method further comprises a perspective scale model construction step, in which the pose of the imaging device of the endoscope under the navigation system is obtained from the pose of the endoscope under the navigation system, and the scale model constructed in the scale model construction step is converted, according to the pose of the imaging device, into a perspective scale model as seen from the viewing angle of the imaging device; in the enhanced image fusion step, the perspective scale model is fused with the endoscopic image to obtain the endoscope-enhanced image.
In this exemplary scheme, a perspective scale model as seen from the viewing angle of the imaging device of the endoscope is constructed from the pose of the imaging device under the navigation system and the pose of the scale model; the perspective relationship creates a sense of three-dimensional depth, providing a more intuitive scale indication.
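In standard pinhole-camera notation (ours, not the patent's), the conversion amounts to projecting each sample point X of the scale model through the imaging device's pose (R, t) and intrinsic matrix K:

```latex
% Pinhole projection of a ruler point X (navigation frame) into the image.
% (R, t) is the camera pose in the navigation frame, K the intrinsic matrix.
\begin{aligned}
\mathbf{x}_{\mathrm{cam}} &= R^{\top}\,(\mathbf{X}_{\mathrm{nav}} - \mathbf{t}),\\
\begin{pmatrix} u \\ v \\ 1 \end{pmatrix} &\sim K\,\mathbf{x}_{\mathrm{cam}},
\qquad
K = \begin{pmatrix} f_x & 0 & c_x \\ 0 & f_y & c_y \\ 0 & 0 & 1 \end{pmatrix}.
\end{aligned}
```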
According to a further example of the invention, the method of the invention further comprises the following steps:
A marker and marker pose acquisition step: acquiring a physiological structure marker on an image of the patient's physiological structure, and the position of the marker under the navigation system;
A marker indication construction step: acquiring the position of the end point of the perspective scale model, and constructing a marker indication of the marker relative to the end point according to the position of the marker and the position of the end point;
A stitching step: stitching a plurality of endoscopic images according to the pose of the imaging device corresponding to at least one endoscopic image to obtain a stitched image,
wherein the marker, the marker indication and the stitched image are also fused into the endoscope-enhanced image in the enhanced image fusion step.
In this example, a marker indication from the end point of the perspective scale model, such as the name of the marker point, the distance from the end point of the perspective scale model, or an arrow indicating the direction toward the marker, is further superimposed on the endoscopic image. The scale model then also plays the role of a virtual ranging probe, assisting the doctor in judging the distance between the physiological structure marker point/target point and the current endoscope and tool, thereby guiding the doctor in performing the endoscopic operation.
Further, in this example, because the stitched image around the endoscope tip is fused onto the endoscopic image as enhancement, the operator is provided with a larger field of view (e.g., a global field of view) around the endoscope tip, successfully addressing the problem of the operator's limited field of view. Moreover, since the endoscopic image is fused with a stitched image covering a larger surrounding area, the physiological structure marker described above is very likely to be observable in the stitched image even when it is not located in the current field of view of the endoscope (i.e., outside the current endoscopic image), and is pointed to by the marker indication, so the operator can more intuitively observe the distance and orientation of the physiological structure marker point relative to the end of the endoscope and conveniently adjust the endoscope pose to complete the operation.
Further, in this exemplary scheme, because the images are stitched according to the pose of the imaging device corresponding to at least one image, less processing is required compared with available stitching approaches (e.g., purely algorithmic, feature-based ones), greatly increasing the stitching speed, which is particularly important for real-time viewing during surgery. The stitching precision of this scheme is also higher, and the demand on the processor's computing capability is lower.
According to a further example of the invention, the patient's physiological structure is the spine, and the markers are marker points indicating one or more of herniated or free intervertebral discs, osteophytes and ossified ligamentum flavum, or physiological structure marker points that are not displaced during endoscopic surgery, such as one or more of the ventral aspect of the articular process, the pedicles of the anterior and posterior vertebral bodies, and the intervertebral discs.
The method of guiding endoscopic surgery of the present invention is particularly beneficial for spinal endoscopes (e.g., foraminal endoscopes). As described in the background, the imaging device of the spinal endoscope and the channel for the surgical tool are integrated into the same endoscope barrel, the bevel at the end of the endoscope comes in different angles, and the operating tools are not standardized, making it more difficult for a physician to judge empirically the length of the tool extending from the working channel and the distance of under-scope physiological structures from the endoscope and the tool. The method of the invention provides scale model enhancement information for the endoscopic image, so that tool length and structure distance can be judged against the scale; it provides stitched image enhancement information for the endoscopic image, giving the operator a wider field of view; and it provides physiological structure markers and their marker indication information for the endoscopic image, giving the operator a probe-ranging function toward the target physiological structure together with direction guidance. This can avoid misoperation by doctors and greatly increase the reliability of spinal surgery.
According to a further example of the present invention, in the marker and marker pose acquisition step, the position of the marker under the navigation system is acquired by registering the image of the patient's physiological structure under the navigation system. Illustratively, the image of the patient's physiological structure can be selected from a pre-operative three-dimensional image, an intra-operative three-dimensional image, and an intra-operative two-dimensional image. This gives the operator more options, so that an appropriate image type can be selected according to the particular surgical situation and the available equipment.
According to a further example of the present invention, the guide image includes a navigation view that is generated based on a two-dimensional or three-dimensional image of the patient and in which an endoscope model is displayed, and the enhanced image includes a navigation-view-enhanced image obtained by fusing the scale model with the navigation view, wherein in the navigation-view-enhanced image the scale model extends, starting from the centre point on the distal end face of the endoscope model, along the direction of the central axis of the tool channel in the endoscope model and away from the endoscope model.
In this example, the scale model is fused into the navigation view, superimposed on the end of the endoscope model, assisting the physician in judging the distance between physiological structures visible in the navigation view and the end of the tool. Preferably, the endoscope-enhanced image and the navigation-view-enhanced image may be displayed simultaneously on the display device. The scale model acts as a virtual probe in the navigation view; its end points at a point in the navigation view that corresponds to the physiological structure at the position of the scale model's end point in the current endoscope-enhanced image. The scale model/virtual probe can thereby assist the operator in linking the two-dimensional image displayed by the endoscope with the three-dimensional physiological structure, achieving a better guiding effect.
Preferably, the navigation view comprises one or more of: a section view along the tool axis, a section view perpendicular to the tool axis, a view on a real or fitted two-dimensional perspective section, a sagittal section view, a coronal section view, and an axial section view.
According to a further example of the present invention, the length of the scale model can be determined from an operator's input in the scale model construction step; preferably, the scale model includes graduations.
According to another aspect of the present invention, there is also provided a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the methods in the examples described above.
According to still another aspect of the present invention, there is also provided a control apparatus including a memory, a processor, and a program stored on the memory and capable of running on the processor, wherein the methods in the examples above are performed when the processor runs the program.
According to a further aspect of the present invention there is also provided a computer program product comprising a computer program which, when executed by a processor, implements the steps of the method in the examples described above.
According to yet another aspect of the present invention, there is also provided an electronic device for navigation of an endoscopic procedure, the electronic device comprising a display device and a processor, the processor having a data interface, wherein the data interface is connectable to a tracking device of a navigation system so that the processor can obtain the pose of the endoscope; and the processor is configured to be able to acquire a guide image; wherein the processor is configured to display, for at least a period of time while running, a view of at least a portion of an enhanced image on the display device, the enhanced image being obtained by fusing a scale model with the guide image.
Corresponding to the methods shown in the further method examples above, the electronic device of the invention is also given corresponding constructions and configurations. These are set forth in the claims, and the advantages and benefits of the solutions have been set forth in the description of the exemplary methods above, so they are not repeated here.
According to a further aspect of the present invention, there is also provided a navigation system comprising a tracking device adapted to track a tracer having a fixed positional relationship with respect to the endoscope; and a display device and a processor adapted to be connected to the tracking device and the endoscope, wherein the methods described in the above examples are performed when the processor runs, and the display in those methods is effected by the display device.
The navigation system guides the endoscopic procedure using a real-time endoscopic image enhanced by one or more of the scale model, the physiological structure marker, the marker indication and the stitched image, and also fuses the scale model onto the navigation view for enhancement. It thus provides the operator with a ruler for judging the length of the tool extending out of the working channel, and lets the operator intuitively observe the distance and orientation of a physiological structure marker of interest relative to the tool, the correspondence between under-scope physiological structures in the two-dimensional image and the three-dimensional image structure, and so on, providing better guidance for the operator. The navigation system therefore has higher navigation precision and a better navigation effect, and can help doctors master the surgical skills faster and complete the operation better.
According to yet another aspect of the present invention, there is also provided a surgical robot system including a robotic arm and the navigation system described above. In other words, the inventive concept may also be implemented under a navigation system in a surgical robotic system.
Drawings
The invention is described in detail below via exemplary embodiments with reference to the accompanying drawings.
Fig. 1 shows a flow chart of a method of guiding an endoscopic procedure under a navigation system according to an exemplary embodiment of the invention.
Fig. 2 shows a schematic diagram of an exemplary navigation system for spinal endoscopic surgery according to the present invention.
Fig. 3 schematically shows the scale model positioned at the end of an endoscope.
Fig. 4 schematically shows an endoscopically enhanced image.
Fig. 5A and 5B schematically show navigation-view-enhanced images obtained by fusing the scale model into a three-dimensional navigation view, at two different viewing angles.
Fig. 6A and 6B schematically show navigation-view-enhanced images obtained by fusing the scale model into a two-dimensional navigation view, at two different viewing angles.
Fig. 7 schematically illustrates the positional relationship of the endoscope, the scale model and the physiological structure marker points.
Fig. 8 schematically illustrates an example of an operator completing a physiological structure marking on a pre-operative three-dimensional image or an intra-operative three-dimensional image.
It should be noted that the drawings are merely schematic. They show only those parts or steps that are required for elucidation of the invention, while other parts or steps may be omitted or merely mentioned. The invention may include other components or steps than those shown in the figures.
Detailed Description
The technical scheme of the invention is further specifically described below through examples and with reference to the accompanying drawings. The following description of embodiments of the present invention with reference to the accompanying drawings is intended to illustrate the general inventive concept and should not be taken as limiting the invention.
The following describes, as a specific example, the steps of the method of the invention for guiding an endoscope (e.g., spinal endoscope, neuroendoscope, etc.) under a navigation system, as well as the electronic device, navigation system and surgical robot system (or positioning and navigation system) involved. In the following detailed description, numerous specific details and steps are set forth in order to provide a thorough understanding of the embodiments. However, one or more other embodiments may be practiced without these specific details or steps.
Although the terms "imaging device" and "image" of the endoscope are used herein, those skilled in the art will understand that "imaging device" is a broad concept that may include functions such as image capture and video capture, and "image" is a broad concept that may include video, dynamic continuous images and still images. In the present invention, the "imaging device" of the endoscope may be the endoscope module used for video imaging (obtaining the endoscopic image), and the image obtained in the "endoscopic image acquisition step" is a framed image from the endoscopic video. Also, while the spinal endoscope is used as the example in this embodiment and the accompanying drawings, the inventive concept is not so limited and may be applied to any endoscope, including neuroendoscopes, gastroscopes, enteroscopes, laparoscopes, arthroscopes, oral endoscopes, dental endoscopes, nasal endoscopes, laryngoscopes, and the like.
In a specific embodiment of the invention, a navigation system comprising a tracking device 1, a control device 2 and a display device 3 is used, as shown in fig. 2. The tracking device 1 may be an optical tracking device (e.g., an NDI optical tracker), and accordingly an optical tracer 4 may be provided on the endoscope 5. As a specific example, the control device 2 may be a general-purpose computer, a special-purpose computer, an embedded processor, or any other suitable programmable data processing apparatus such as a single-chip microcomputer, a chip, or the like. The control device 2 may comprise a processor and a memory for storing a program, or may comprise only a processor, in which case the processor may be attached to a memory in which the program is stored. In other words, the control device comprises at least a processor. The control device 2 (or processor) and the display device 3 may be integrated or provided separately. The control device or processor has a data interface, which may include a data interface connectable to the endoscope so that the control device/processor can acquire endoscopic images in real time. The data interface of the control device or processor can also be connected to the tracking device 1 of the navigation system, so that the position and orientation of a tracked object, e.g., the tracer 4 on the endoscope, can be obtained from the tracking device 1 in real time. As an example, the endoscope 5 and/or the tracer 4 may also be considered part of the navigation system of the invention.
Typically, the distal end of the spinal endoscope is passed into the patient's tissue and bony structures for observation and/or operation, while the proximal end (the end near the operator, i.e., the end opposite the barrel that extends into the body) remains outside the patient for manipulation by the operator. In the navigation system, by arranging the tracer 4, which is adapted to be tracked by the tracking device 1, at the proximal end of the endoscope outside the patient's body, the navigation system can know in real time the position and direction of the tracer 4 on the endoscope 5 in the navigation coordinate system, i.e., obtain the pose of the endoscope under the navigation system. As an example of the present invention, as shown in fig. 3, the imaging device 53 of the spinal endoscope, i.e., the distal lens of the optical rigid-lens module, is arranged at the end of the endoscope barrel that extends into the body. By calibrating the relative positional relationship of the distal lens of the optical rigid-lens module with respect to the tracer 4 on the endoscope 5, the relative positional relationship of the imaging device with respect to the tracer 4 can be obtained, so that the position and direction of the imaging device in the navigation coordinate system can be derived from the position of the tracer 4 known through the tracking device 1. Also visible in fig. 3 is the centre point P of the endoscope's tool channel 52 on the distal end face 51 of the endoscope. The relative positional relationship of the centre point P, and of the central axis of the tool channel 52 passing through it, with respect to the tracer 4 on the endoscope 5 can be calibrated in advance, so that during surgery the position and direction of the centre point P and of the central axis under the navigation coordinate system can be obtained once the pose of the tracer 4 is known through the tracking device 1. The above calibration, also referred to as calibration of the external parameters of the endoscope, is usually performed before the endoscopic procedure and its navigation. Existing external parameter calibration for endoscopes has various implementations, e.g., using a calibration tool, or acquiring images through the endoscope and performing image processing; the calibration process is not described herein.
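As an illustrative sketch of how such calibrated external parameters are commonly chained together (our formulation; the patent leaves the concrete calibration procedure open), the per-frame tracer pose is composed with the fixed, pre-calibrated transforms:

```python
import numpy as np

def compose(T_a_b, T_b_c):
    """Chain rigid transforms expressed as 4x4 matrices: returns T_a_c."""
    return T_a_b @ T_b_c

# From the tracking device, updated every frame:
T_nav_tracer = np.eye(4)   # pose of tracer 4 under the navigation system (stub)

# From external calibration, fixed per endoscope (identity placeholders):
T_tracer_cam = np.eye(4)   # distal lens (imaging device 53) relative to the tracer
T_tracer_chan = np.eye(4)  # tool-channel frame (origin at P, z along the axis) relative to the tracer

T_nav_cam = compose(T_nav_tracer, T_tracer_cam)    # imaging device pose under navigation
T_nav_chan = compose(T_nav_tracer, T_tracer_chan)  # P = T_nav_chan[:3, 3], axis = T_nav_chan[:3, 2]
```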
The pose (position and direction) of the tracer 4 on the endoscope is acquired in real time by the tracking device 1 during surgery; that is, the position under the navigation system of the centre point P of the tool channel 52 on the distal end face of the endoscope and of its central axis, as well as the pose of the imaging device 53 under the navigation system, can be acquired indirectly in real time. Both poses will be used in the scale model construction, perspective scale model construction and navigation-view-enhanced image construction described below.
In this particular embodiment of the invention, the imaging parameters of the endoscope, also known as the internal parameters, may also be calibrated before the endoscopic procedure and its navigation. For example, the internal calibration may use a calibration plate: the endoscope takes images of the calibration plate at different azimuth angles, the images are saved, and the calibration plate specification is recorded. The internal parameters of the endoscope, including the distortion matrix and/or rotation and displacement vectors, are then calibrated. The internal parameters calibrated here will be used in the distortion calibration of endoscopic images described hereinafter.
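One common way to realize such an internal calibration is a chessboard-style routine, e.g. with OpenCV; the sketch below is an assumption about one workable workflow, and the pattern size, square size and file names are hypothetical:

```python
import cv2
import numpy as np

pattern = (9, 6)   # inner corner count of the calibration plate (assumed)
square_mm = 2.0    # printed square size in mm (assumed)

# Planar object points of the plate, identical for every view.
obj = np.zeros((pattern[0] * pattern[1], 3), np.float32)
obj[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square_mm

obj_points, img_points, img_size = [], [], None
for path in ["calib_01.png", "calib_02.png", "calib_03.png"]:  # hypothetical files
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_points.append(obj)
        img_points.append(corners)
        img_size = gray.shape[::-1]  # (width, height)

# Returns RMS reprojection error, camera matrix K, distortion coefficients,
# and per-view rotation/translation vectors.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, img_size, None, None)
```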
The steps of a specific embodiment of the method of the invention for guiding an endoscopic procedure will now be described in detail with reference to fig. 1. Before the operation starts, the internal and external parameters of the endoscope may be calibrated as described above, and registration of the navigation system, i.e., determination of the navigation coordinate system, is performed before operation and navigation begin. The method of the invention can be used both in scenarios that navigate with two-dimensional images and in scenarios that navigate with three-dimensional images. Registration methods for navigation systems are known and are not described here.
During the endoscopic procedure, the control device automatically acquires real-time images of the endoscope. Meanwhile, as described above, the control device can acquire the pose of the tracer 4 on the endoscope in real time through the tracking device 1 (endoscope pose acquisition step); that is, it can indirectly acquire in real time, according to the calibrated external parameters, the position under the navigation system of the centre point P on the distal end face of the endoscope and of its central axis, and can construct the scale model R accordingly. The scale model extends from the centre point P along the direction of the central axis of the tool channel, away from the endoscope, as schematically shown in fig. 3. The length of the scale model may be set/adjusted based on operator input, which determines the location of the end point Q of the scale model under the navigation system. The input means by which the operator enters the length may, for example, comprise a selection box or dialog box displayed on the display device.
In the present invention, the constructed scale model may be fused with the endoscopic image (a framed image from the endoscope's live video) acquired in the endoscopic image acquisition step to form the endoscope-enhanced image displayed in the display step. In this case, the method of this embodiment may further include a perspective scale model construction step: the pose of the tracer 4 on the endoscope is acquired in real time through the tracking device 1, the pose of the endoscope's imaging device under the navigation system is indirectly acquired according to the calibrated external parameters as described above, and the scale model constructed in the scale model construction step is converted, according to the pose of the imaging device, into a perspective scale model as seen from the viewing angle of the imaging device. In other words, the constructed scale model is projected onto the endoscopic image in this step, and the perspective scale model is fused with the endoscopic image in the subsequent enhanced image fusion step to obtain the endoscope-enhanced image. Fig. 4 shows a schematic diagram in which a perspective scale model 8 is fused onto an endoscopic image 6 (the image within the circle). Preferably, the scale model may be graduated. By "projecting" the scale model, i.e., a virtual ruler, into the endoscopic image, perspective is used to create a visual sense of three-dimensional depth in the two-dimensional endoscopic image, more intuitively assisting the doctor in judging the length of the tool 7 extending out of the working channel.
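A sketch of this "projection" using OpenCV (illustrative only; K, dist and the poses are assumed to come from the calibrations described above):

```python
import cv2
import numpy as np

def draw_perspective_ruler(frame, P_nav, axis_nav, T_nav_cam, K, dist,
                           length_mm=30.0, tick_mm=5.0):
    """Project the scale model into the endoscopic frame and draw it with
    graduations every tick_mm (illustrative sketch, not the patent's code)."""
    # Sample the ruler in the navigation frame, one point per graduation.
    n_ticks = int(length_mm / tick_mm)
    pts_nav = np.array([P_nav + k * tick_mm * axis_nav
                        for k in range(n_ticks + 1)], dtype=np.float64)

    # Express the camera pose as rvec/tvec (nav -> camera), then project.
    T_cam_nav = np.linalg.inv(T_nav_cam)
    rvec, _ = cv2.Rodrigues(T_cam_nav[:3, :3])
    px, _ = cv2.projectPoints(pts_nav, rvec, T_cam_nav[:3, 3], K, dist)
    pts2d = [tuple(map(int, p)) for p in px.reshape(-1, 2)]

    for a, b in zip(pts2d[:-1], pts2d[1:]):  # ruler body
        cv2.line(frame, a, b, (0, 255, 0), 2)
    for k, p in enumerate(pts2d):            # graduation labels in mm
        cv2.putText(frame, f"{k * tick_mm:.0f}", p,
                    cv2.FONT_HERSHEY_SIMPLEX, 0.4, (0, 255, 0), 1)
    return frame
```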
With continued reference to fig. 1, the endoscope-enhanced image described above may be further enhanced through the "marker and marker pose acquisition step" and "marker indication construction step" shown there, fusing onto it marker indications 9 (e.g., the arrow and the 32.5 mm distance shown in fig. 4) directed from the end point of the perspective scale model 8 to physiological structure markers. A marker indication indicates, for example, the distance and/or direction from the end point of the perspective scale model to the physiological structure marker. This marker indication information, displayed in the endoscope's real-time image, can assist the physician in determining the distance and/or orientation of the scale model's end point relative to the physiological structure markers, thereby guiding the physician in moving and manipulating the endoscope and tools. The physiological structure markers are marked, for example, by an operator on a three-dimensional or two-dimensional image of the patient's physiological structure, as described in detail below.
Specifically, the image on which the operator marks may be a pre-operative three-dimensional image, such as a pre-operative CT image, or an intra-operatively acquired image, such as an intra-operative CBCT image or an intra-operative two-dimensional fluoroscopic image. When the navigation system navigates with the pre-operative three-dimensional image, the operator marks on the pre-operative three-dimensional image; when it navigates with intra-operative CBCT images or intra-operative two-dimensional fluoroscopic images, the operator marks on the intra-operatively acquired images.
In the present invention, the operator can choose to make various marks, such as physiological structure marker points indicating one or more of herniated or free intervertebral discs, osteophytes and ossified ligamentum flavum, or physiological structure marker points that are not displaced during endoscopic surgery. Illustratively, as shown in fig. 8, the operator may mark on the three-dimensional image (e.g., using pre-operative planning software) certain physiological structure marker points within or around the intervertebral foramen that undergo no displacement throughout the spinal endoscopic procedure, including but not limited to points such as the ventral aspect of the articular process, the pedicles of the anterior vertebral body, the pedicles of the posterior vertebral body, the intervertebral discs, etc.
As for marking on the pre-operative three-dimensional image: the processor obtains the pre-operative three-dimensional image together with the marks the operator has made on it, as described above. The pre-operative three-dimensional image is then registered under the navigation system, and the coordinates of each mark under the navigation system are determined according to the registration relationship established in the registration step, thereby completing the acquisition of the marks and of their positions.
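The coordinate transfer itself is a single rigid-transform application; a sketch under the assumption that registration yields a transform T_nav_image from the image coordinate frame to the navigation coordinate frame (values made up):

```python
import numpy as np

T_nav_image = np.eye(4)                      # from the registration step (stub)
marker_image = np.array([12.3, -4.5, 87.0])  # mark placed on the pre-operative image (made-up)

# Coordinates of the mark under the navigation system.
marker_nav = T_nav_image[:3, :3] @ marker_image + T_nav_image[:3, 3]
```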
For marking on an intra-operative three-dimensional or two-dimensional image, registration of the image under the navigation system is typically completed at the time the image is acquired. In this case, therefore, the operator marks on the intra-operative image after registration is complete. Because the coordinates of the registered image under the navigation system are known, the coordinates, i.e., positions, of the marks made in the image under the navigation system can be obtained correspondingly, completing the marker and marker pose acquisition step. In the case of intra-operative two-dimensional images, a quasi-three-dimensional fit may preferably be performed on two-dimensional images of at least two body positions to obtain a quasi-three-dimensional image for convenient marking by the operator, after which the marks obtained by the processor are those made by the operator on the fitted image. Those skilled in the art will appreciate that the images used for the quasi-three-dimensional fit may be an intra-operative anteroposterior image and an intra-operative lateral image, or images of other body positions, as selected according to the needs of the operator or the actual condition of the body part to be operated on.
The operator may enter or select the various forms of marks described above in various ways, for example through the interface of the display device, such as a touch interface using pre-operative or intra-operative planning software to select the individual physiological structure markers, or through a keyboard and/or mouse of the electronic device, as exemplarily shown in fig. 8.
It should be noted that although in the description herein the marks are formed by an operator marking an image of the patient's physiological structure, it will be appreciated that in other embodiments the marks may be made other than by an operator, for example generated automatically when an image is formed, e.g., by pre-positioning markers on a body part of the patient or on the operating table so that the marks are produced when the image is acquired.
In the marker indication construction step, the position of the end point of the perspective scale model constructed above is acquired, and a marker indication of the marker relative to the scale model's end point, such as an arrow pointing from the end point Q to a physiological structure marker (e.g., the ventral side of the superior articular process) together with the distance between them, is constructed based on the position of the marker acquired in the marker and marker pose acquisition step and the position of the end point. The marker indication 9 is then fused into the endoscope-enhanced image in the subsequent enhanced image fusion step. With the end point of the scale model serving as the start of the arrow pointing toward the physiological structure marker point, the scale model acts in the endoscope's real-time image as a virtual probe assisting distance measurement, helping the doctor judge the distance and relative direction between the virtual probe's end point and the physiological structure target, thereby guiding the endoscopic operation. Fig. 7 exemplarily shows the positional relationship of the endoscope 5, the scale model R and a physiological structure marker point, with the marker point on the ventral side of the superior articular process as the example mark, and an arrow with a distance of 32.5 mm as the example marker indication.
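The marker indication thus reduces to a Euclidean distance in the navigation frame plus a projected arrow in the image; a sketch (again illustrative, with the same assumed calibration inputs as above):

```python
import cv2
import numpy as np

def project_nav_point(p_nav, T_nav_cam, K, dist):
    """Project one navigation-frame point into the image (sketch)."""
    T_cam_nav = np.linalg.inv(T_nav_cam)
    rvec, _ = cv2.Rodrigues(T_cam_nav[:3, :3])
    px, _ = cv2.projectPoints(p_nav.reshape(1, 3).astype(np.float64),
                              rvec, T_cam_nav[:3, 3], K, dist)
    return tuple(map(int, px.reshape(2)))

def draw_marker_indication(frame, Q_nav, marker_nav, T_nav_cam, K, dist):
    # Distance from the ruler end point Q to the physiological structure mark.
    dist_mm = float(np.linalg.norm(marker_nav - Q_nav))
    q_px = project_nav_point(Q_nav, T_nav_cam, K, dist)
    m_px = project_nav_point(marker_nav, T_nav_cam, K, dist)
    # Arrow from Q toward the mark, labelled with the distance (cf. "32.5 mm").
    cv2.arrowedLine(frame, q_px, m_px, (0, 0, 255), 2)
    cv2.putText(frame, f"{dist_mm:.1f} mm", q_px,
                cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 1)
    return frame
```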
In this particular embodiment of the present invention, the scale model obtained in the scale model construction step may also be fused into a navigation view generated based on a two-dimensional or three-dimensional image of the patient and having an endoscope model displayed in it. The navigation view here is, for example, one or more of: a section view along the tool axis, a section view perpendicular to the tool axis, a view on a real or fitted two-dimensional perspective section, a sagittal section view, a coronal section view, or an axial section view. Three-dimensional navigation views on different sections along the tool axis are shown in figs. 5A and 5B, and two-dimensional navigation views in the anteroposterior and lateral directions are shown in figs. 6A and 6B, respectively; the endoscope model 54 is shown on each of these views, with the conical field of view 55 of the endoscope at its end. During surgery, these navigation views are updated in real time so that the position of the endoscope model 54 in the navigation views indicates the current position of the endoscope relative to the patient's physiological structure, providing a navigation function to the operator. According to the scheme of the invention, the scale model R is also fused into the navigation view to obtain the navigation-view-enhanced image. In the navigation-view-enhanced image, combining the position of the centre point of the endoscope working channel on the inclined distal face, the direction information of the central axis, and the scale model length defined by the operator, the scale model is added starting from the centre point along the direction of the central axis of the working channel, assisting the doctor in judging the distance between physiological structures visible in the navigation view and the end of the endoscope.
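For the navigation-view fusion, one simple realization (our assumption; the patent does not prescribe one) is to project the segment P-Q orthogonally onto the slice plane and convert the result to slice pixel coordinates:

```python
import numpy as np

def ruler_on_slice(P_nav, Q_nav, slice_origin, u_dir, v_dir, mm_per_px):
    """Orthogonally project the scale model segment onto a navigation slice
    plane spanned by unit vectors u_dir/v_dir (illustrative assumption)."""
    def to_px(p):
        d = p - slice_origin
        return (float(d @ u_dir) / mm_per_px, float(d @ v_dir) / mm_per_px)
    return to_px(P_nav), to_px(Q_nav)  # 2D endpoints to draw on the slice view

# Example: an axial slice whose x/y axes align with the navigation frame.
p0, p1 = ruler_on_slice(np.array([0.0, 0.0, 100.0]),   # centre point P
                        np.array([10.0, 20.0, 130.0]), # end point Q
                        slice_origin=np.zeros(3),
                        u_dir=np.array([1.0, 0.0, 0.0]),
                        v_dir=np.array([0.0, 1.0, 0.0]),
                        mm_per_px=0.5)
```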
Preferably, the endoscope-enhanced image and the navigation-view-enhanced image described above may be displayed simultaneously on the display device. The scale model acts as a virtual probe in the navigation view; its end points at a point in the navigation view corresponding to the physiological structure at the position of the scale model's end point in the current endoscope-enhanced image. The scale model/virtual probe can thereby help the operator link the two-dimensional image displayed by the endoscope with the three-dimensional physiological structure, achieving a better guiding effect.
During a surgical procedure, the operator changes the position and/or orientation of the endoscope guided by the guide image; the endoscope-enhanced image and navigation-view-enhanced image described above are updated in real time, and the scale model/virtual probe is likewise updated continuously with the pose of the endoscope, thereby guiding the endoscopic procedure. The operator can also adjust the length of the scale model/virtual probe as needed during surgery via the input means.
The image shown in fig. 4 also includes a stitched image 10 fused around the periphery of the real-time endoscopic image. The stitched image is obtained by stitching images captured at multiple poses, obtained by translating and/or rotating the endoscope over the larger area that needs to be viewed. During this translation and/or rotation, the control device automatically acquires multiple images around the tip of the endoscope, and at the same time automatically acquires and records the position and orientation of the imaging device corresponding to at least one of the images (e.g., corresponding to each image). At the same time or later, the images to be stitched may be distortion-calibrated using the internal parameters of the imaging device calibrated above, to remove the distortion effects of the imaging device. The individual images are then stitched according to the position and direction (i.e., pose) of the imaging device corresponding to the at least one image, thereby obtaining the stitched image.
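A sketch of such pose-driven placement (an assumption about one workable realization, not the patent's algorithm): each frame is undistorted with the calibrated intrinsics, and its offset on the stitching canvas is derived from the recorded camera pose rather than from feature matching, under a rough planar-scene assumption:

```python
import cv2
import numpy as np

def place_frames_by_pose(frames, cam_poses, K, dist, canvas_hw=(2000, 2000),
                         mm_per_px=0.05, depth_mm=20.0):
    """Pose-driven stitching sketch: undistort each frame, then paste it on a
    canvas at an offset computed from the recorded camera pose (no feature
    matching). Assumes a roughly planar scene depth_mm ahead of the lens."""
    canvas = np.zeros((*canvas_hw, 3), dtype=np.uint8)
    centre = np.array(canvas_hw[::-1]) / 2.0
    for frame, T_nav_cam in zip(frames, cam_poses):
        rect = cv2.undistort(frame, K, dist)       # remove lens distortion
        # Where the optical axis hits the assumed working plane, in nav frame.
        hit = T_nav_cam[:3, 3] + depth_mm * T_nav_cam[:3, 2]
        off = (hit[:2] / mm_per_px + centre).astype(int)  # nav x/y -> canvas px
        h, w = rect.shape[:2]
        y0, x0 = off[1] - h // 2, off[0] - w // 2
        canvas[y0:y0 + h, x0:x0 + w] = rect        # naive paste, no blending
    return canvas
```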
Because the images are stitched according to the pose of the imaging device corresponding to at least one image, the pose information of the stitched image under the navigation system is known. The stitched image can therefore be fused with the endoscopic image and the other enhancement information based on this pose information, to obtain the endoscope-enhanced image shown in fig. 4. Preferably, after the stitched image is obtained and before it is fused to the real-time endoscopic image 6, it may also be processed, i.e., a planar image may be generated from it, to eliminate or reduce the visual distortion of the stitched image caused by differences in viewing angle; the planar image is then used as the stitched image to be fused with the current endoscopic image (e.g., placed around the current image of the endoscope as shown in fig. 4). The stitched image around the endoscope tip lets the doctor observe a larger field of view (even a global field of view) and better displays the soft tissues around the endoscope tip, such as nerves, dura mater, blood vessels and intervertebral discs, guiding the doctor in quickly judging the spatial relationship between the endoscope pose and key physiological structures relative to the current endoscopic image, and thereby overcoming the endoscope's limited field of view.
Because the enhancement information to be fused (the scale model, the stitched image and the marks) has a determined pose relationship to the imaging device under the navigation system, and hence a determined pose relationship to the endoscopic image acquired by the imaging device, the fusion of this information with the current image of the endoscope can be performed according to those pose relationships and the acquired pose of the endoscope's imaging device.
As shown in fig. 4, when the endoscopic image is enhanced by the surrounding stitched image, the physiological structure marker described above (in fig. 4, the marker point on the ventral side of the superior articular process) is very likely to be observable in the surrounding stitched image even when it is not located in the current field of view of the endoscope (i.e., outside the current endoscopic image), and is pointed to by the marker indication, so that the operator can more intuitively observe the distance and orientation of the physiological structure marker point relative to the end point of the scale model and relative to the tool.
It should be noted that although fig. 4 shows the perspective scale model 8, the physiological structure marker, the marker indication 9 of the physiological structure marker, and the stitched image 10 all fused onto the endoscopic image 6, those skilled in the art will understand that the endoscope-enhanced image does not have to fuse every one of these pieces of enhancement information. That said, fusing all of the enhancement information onto the endoscopic image 6 can provide better navigation for the operator; moreover, when multiple pieces of enhancement information are fused, their enhancement effects assist one another, further strengthening the navigation. For example, fusing the scale model, the physiological structure marker and the marker indication together allows the operator to judge the distance and orientation of the physiological structure marker, i.e., the point of interest, more intuitively; fusing the stitched image and the physiological structure marker together lets the operator observe a larger field of view around the endoscope tip and thus observe the physiological structure marker more conveniently.
It should be understood that although the steps described above are listed in order in the flowchart of fig. 1, in the claims and in the description herein, this does not imply a particular precedence between them. For example, the marker and marker pose acquisition step may be performed before, after or simultaneously with the endoscopic image acquisition step; it may likewise be performed before, after or simultaneously with the endoscope pose acquisition step/scale model construction step; and the navigation view acquisition step/navigation-view-enhanced image fusion step may be performed before, after or simultaneously with the endoscope-enhanced image fusion step.
It should also be noted that the "display step" described herein does not mean the enhanced image is displayed constantly. For example, the endoscope-enhanced image and/or the navigation-view-enhanced image may be displayed only for a certain period of time, according to the operator's needs; e.g., when the operator wishes to view an enhanced image, display of the respective enhanced image may be triggered, for instance by a foot switch. The endoscope-enhanced image and the navigation-view-enhanced image may be displayed on the display device at the same time, or only one of them may be displayed.
The invention also provides an electronic device for navigation of an endoscopic procedure, comprising the display device 3 described above and a processor, which in the specific example shown in fig. 2 is included in the control device 2, i.e., the illustrated host computer. Those skilled in the art will appreciate that the processor and display device may be integrated or separate. The control device 2, or the processor, has a data interface, and is electrically connected via it to the tracking device 1 and the endoscope 5 of the navigation system to acquire the pose of the endoscope's imaging device and the endoscopic images. The processor's data interface also enables it to acquire a three-dimensional or two-dimensional image of the patient's physiological structure and the marks on that image. When the processor runs, a computer program is executed (which may be stored in a memory included in the control device, or in another memory) to perform the method of the invention and to display, at least for a period of time, a view of at least part of an enhanced image on the display device 3, the enhanced image being obtained by fusing a scale model with, for example, an endoscopic image or a navigation view.
The present invention also provides a computer-readable storage medium having stored thereon a computer program which, when executed by a processor, performs the steps of the above method of the invention. The present invention also provides a control device comprising a memory, a processor, and a program stored in the memory and executable on the processor, wherein the steps of the above method are performed when the processor runs the program. The present invention also provides a computer program product comprising a computer program which, when executed by a processor, implements the steps of the above method.
Those of skill in the art will appreciate that the steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. The software module may reside in Random Access Memory (RAM), memory, Read-Only Memory (ROM), electrically programmable ROM, electrically erasable programmable ROM, registers, a hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art.
In the above-described embodiments, the methods may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented in software, they may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions. When the computer program instructions are loaded and executed on a computer, the processes or functions according to the present invention are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable device. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, they may be transmitted from one website, computer, server, or data center to another by wire (e.g., coaxial cable, optical fiber, digital subscriber line) or wirelessly (e.g., infrared, radio, microwave). The computer-readable storage medium may be any available medium that a computer can access, or a data storage device, such as a server or data center, integrating one or more available media. The available media may be magnetic media (e.g., floppy disk, hard disk, magnetic tape), optical media (e.g., DVD), or semiconductor media (e.g., a Solid State Disk (SSD)), etc.
It will be appreciated by those skilled in the art that the memory of the control device of the present invention may include Random Access Memory (RAM) or Non-Volatile Memory (NVM), such as at least one disk memory. Alternatively, the memory may be at least one storage device located separately from the aforementioned processor.
The processor of the control device may be a general-purpose processor, including a Central Processing Unit (CPU), a Network Processor (NP), etc.; it may also be a Digital Signal Processor (DSP), an Application-Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components.
Although a few embodiments of the present invention have been shown and described, it would be appreciated by those skilled in the art that changes may be made in these embodiments without departing from the principles and spirit of the general inventive concept, the scope of which is defined in the claims and their equivalents.
Claims (30)
1. A method of guiding an endoscopic procedure under a navigation system, the method comprising the steps of:
an endoscope orientation acquisition step: acquiring the orientation of the endoscope under a navigation system;
a scale model construction step: constructing a scale model according to the orientation of the endoscope under the navigation system;
a guide image acquisition step: acquiring a guide image;
an enhanced image fusion step: fusing the scale model with the guide image to obtain an enhanced image; and
a display step: displaying a view including at least a portion of the enhanced image.
2. The method according to claim 1, characterized in that the orientation of the endoscope under the navigation system is acquired by acquiring the position of a tracer (4), which has a fixed positional relationship with respect to the endoscope, from an optical tracking device (1) of the navigation system.
3. The method according to claim 2, wherein, in the scale model construction step, the position under the navigation system of the center point of the tool channel of the endoscope on the distal end face of the endoscope, and the orientation under the navigation system of the central axis of the tool channel, are obtained from the orientation of the endoscope under the navigation system, and the scale model extends, with the center point as a starting point, in the direction of the central axis away from the endoscope.
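By way of illustration only, a minimal geometric sketch of this construction, assuming numpy, millimetre units, and a fixed sampling step; none of these choices is prescribed by the claim.

```python
# Illustrative sketch of claim 3: the scale model is a segment that starts at
# the tool-channel centre point on the distal end face and extends along the
# channel's central axis, away from the endoscope. The 1 mm sampling step is
# an assumption made for this sketch.
import numpy as np

def build_scale_model(center_point: np.ndarray, axis_direction: np.ndarray,
                      length_mm: float, step_mm: float = 1.0) -> np.ndarray:
    """Return (N, 3) sample points of the scale model in navigation coordinates."""
    axis = axis_direction / np.linalg.norm(axis_direction)   # unit central axis
    offsets = np.arange(0.0, length_mm + step_mm / 2, step_mm)
    return center_point + offsets[:, None] * axis            # points leaving the tip
```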
4. The method according to any one of claims 1-3, wherein the guide image comprises an endoscopic image and the enhanced image comprises an endoscopic enhanced image obtained by fusing the scale model with the endoscopic image.
5. The method according to claim 4, further comprising a perspective scale model construction step, wherein the orientation of the imaging device of the endoscope under the navigation system is obtained from the orientation of the endoscope under the navigation system, and the scale model constructed in the scale model construction step is converted, according to the orientation of the imaging device, into a perspective scale model as seen from the viewing angle of the imaging device;
wherein the perspective scale model is fused with the endoscopic image in the enhanced image fusion step to obtain the endoscopic enhanced image.
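A minimal sketch of the perspective conversion, assuming a pinhole camera with intrinsic matrix K and a homogeneous 4x4 pose T_cam_from_nav for the imaging device; the claim itself fixes no particular camera model.

```python
# Illustrative sketch of claim 5: scale model points given in navigation-system
# coordinates are moved into the imaging device's frame and projected with a
# pinhole model. K and T_cam_from_nav are assumptions of this sketch.
import numpy as np

def to_perspective_scale_model(points_nav: np.ndarray,
                               T_cam_from_nav: np.ndarray,
                               K: np.ndarray) -> np.ndarray:
    """Project (N, 3) navigation-frame points to (N, 2) pixel coordinates.
    Points are assumed to lie in front of the imaging device."""
    homogeneous = np.hstack([points_nav, np.ones((len(points_nav), 1))])
    cam = (T_cam_from_nav @ homogeneous.T).T[:, :3]  # camera-frame coordinates
    uv = (K @ cam.T).T                               # pinhole projection
    return uv[:, :2] / uv[:, 2:3]                    # normalise by depth
```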
6. The method according to claim 5, further comprising the steps of:
a marker and its position acquisition step: acquiring a physiological structure marker on a patient physiological structure image and the position of the physiological structure marker under the navigation system;
a marker indication construction step: acquiring the position of the endpoint of the perspective scale model, and constructing a marker indication of the marker relative to the endpoint according to the position of the marker and the position of the endpoint; and
a stitching step: stitching a plurality of endoscopic images according to the orientation of the imaging device corresponding to at least one of the endoscopic images to obtain a stitched image,
wherein the marker, the marker indication and the stitched image are also fused into the endoscopic enhanced image in the enhanced image fusion step.
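A minimal sketch of the stitching step, assuming OpenCV and per-frame homographies derived from the imaging device orientations (which in turn presumes a locally near-planar scene); the claim does not prescribe this particular warping scheme.

```python
# Illustrative sketch of the stitching step of claim 6: each endoscopic frame
# is warped into a common canvas by a homography derived from its
# imaging-device orientation. The planar-scene assumption is ours, not the claim's.
import cv2
import numpy as np

def stitch_frames(frames, homographies, canvas_size=(1024, 1024)):
    """frames: list of HxWx3 uint8 images; homographies: list of 3x3 maps
    into canvas coordinates; later frames overwrite earlier overlaps."""
    width, height = canvas_size
    canvas = np.zeros((height, width, 3), dtype=np.uint8)
    for image, H in zip(frames, homographies):
        warped = cv2.warpPerspective(image, H, (width, height))
        mask = warped.sum(axis=2) > 0        # pixels covered by this frame
        canvas[mask] = warped[mask]
    return canvas
```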
7. The method of claim 6, wherein the marker indication comprises at least one of: a line segment between the endpoint and the marker, a direction from the endpoint to the marker, a distance between the endpoint and the marker, and text about the marker.
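A minimal sketch of how the marker indication of claims 6 and 7 might be assembled from the endpoint and marker positions; the dictionary layout and the text format are assumptions of this sketch, and any subset of the returned items may be rendered.

```python
# Illustrative sketch of claims 6-7: the marker indication is derived from the
# perspective scale model's endpoint and the marker position.
import numpy as np

def build_marker_indication(endpoint: np.ndarray, marker: np.ndarray) -> dict:
    delta = marker - endpoint
    distance = float(np.linalg.norm(delta))
    direction = delta / distance if distance > 0 else np.zeros_like(delta)
    return {
        "segment": (endpoint, marker),             # line between endpoint and marker
        "direction": direction,                    # unit vector endpoint -> marker
        "distance_mm": distance,                   # scalar distance
        "text": f"{distance:.1f} mm to marker",    # textual cue about the marker
    }
```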
8. The method of claim 6 or 7, wherein the patient physiological structure is the spine and the marker is a marker point indicating one or more of a herniated or free intervertebral disc, an osteophyte, and an ossified ligamentum flavum, or a marker point on a physiological structure that is not displaced during endoscopic surgery, such as one or more of the ventral side of an articular process, the pedicle of an anterior vertebral body, the pedicle of a posterior vertebral body, and an intervertebral disc.
9. The method according to claim 6 or 7, characterized in that, in the marker and its position acquisition step, the position of the marker under the navigation system is acquired by means of the registration of the patient physiological structure image under the navigation system.
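A minimal sketch of claim 9, assuming the registration yields a homogeneous 4x4 transform T_nav_from_img from image coordinates to navigation-system coordinates; this transform representation is an assumption of the sketch.

```python
# Illustrative sketch of claim 9: a marker annotated in patient-image
# coordinates is carried into the navigation system by the registration
# transform obtained when the image is registered under the navigation system.
import numpy as np

def marker_to_navigation(marker_img: np.ndarray,
                         T_nav_from_img: np.ndarray) -> np.ndarray:
    homogeneous = np.append(marker_img, 1.0)   # homogeneous image-space point
    return (T_nav_from_img @ homogeneous)[:3]  # position under the navigation system
```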
10. The method of claim 6 or 7, wherein the patient physiological structure image is selected from a pre-operative three-dimensional image, an intra-operative three-dimensional image, and an intra-operative two-dimensional image.
11. The method according to any one of claims 1-3, wherein the guide image comprises a navigation view which is generated based on a two-dimensional or three-dimensional image of the patient and in which an endoscope model is displayed, and the enhanced image comprises a navigation view enhanced image obtained by fusing the scale model with the navigation view, wherein, in the navigation view enhanced image, the scale model extends, starting from the center point of the tool channel on the distal end face of the endoscope model, in the direction of the central axis of the tool channel away from the endoscope model.
12. The method of claim 11, wherein the navigation view comprises one or more of: a section view along the tool axis, a section view perpendicular to the tool axis, a view on a real or fitted two-dimensional fluoroscopic (perspective) image, a sagittal section, a coronal section, and an axial section.
13. The method of any one of claims 1-3, 5-7, and 12, wherein the endoscope is a spinal endoscope.
14. The method according to any one of claims 1-3, 5-7 and 12, wherein, in the scale model construction step, the length of the scale model can be determined from an operator input; preferably, the scale model comprises a size scale.
15. The method according to claim 11, wherein, in the scale model construction step, the length of the scale model can be determined from an operator input; preferably, the scale model comprises a size scale.
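A minimal sketch of claims 14 and 15, assuming a hypothetical 5 mm graduation spacing for the size scale; both the spacing and the label format are illustrative only.

```python
# Illustrative sketch of claims 14-15: the operator chooses the scale model's
# length, and graduations along it provide the size scale.
import numpy as np

def size_scale(length_mm: float, tick_mm: float = 5.0):
    """Return (offset, label) pairs of graduations along the scale model."""
    if length_mm <= 0:
        raise ValueError("operator-selected length must be positive")
    offsets = np.arange(0.0, length_mm + 1e-9, tick_mm)
    return [(float(t), f"{t:.0f} mm") for t in offsets]
```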
16. A computer-readable storage medium on which a computer program is stored, characterized in that the computer program, when executed by a processor, performs the steps of the method according to any one of claims 1-15.
17. A control device comprising a memory, a processor, and a program stored in the memory and executable on the processor, characterized in that the steps of the method according to any one of claims 1-15 are performed when the processor runs the program.
18. A computer program product comprising a computer program, characterized in that the computer program, when executed by a processor, implements the steps of the method according to any one of claims 1-15.
19. An electronic device for navigation of an endoscopic procedure, characterized in that it comprises a display device (3) and a processor having a data interface,
wherein the data interface is connectable to a tracking device (1) of a navigation system so that the processor is able to acquire the position of the endoscope (5), and the processor is configured to be able to acquire a guide image;
wherein the processor is configured to display, at least for a period of time while the processor is running, a view of at least a portion of an enhanced image on the display device (3), the enhanced image being obtained by fusing a scale model with the guide image.
20. The electronic device of claim 19, wherein the data interface is connectable to the endoscope to acquire endoscopic images, wherein the guide image comprises an endoscopic image and the enhanced image comprises an endoscopic enhanced image obtained by fusing the scale model with the endoscopic image.
21. The electronic device of claim 20, wherein the scale model in the endoscopic enhanced image is a perspective scale model as seen from the viewing angle of the imaging device of the endoscope.
22. The electronic device of claim 19, wherein the guide image comprises a navigation view which is generated based on a two-dimensional or three-dimensional image of the patient and in which an endoscope model is displayed, and wherein, in a navigation view enhanced image obtained by fusing the scale model with the navigation view, the scale model extends from the distal end of the endoscope model in a direction away from the endoscope model.
23. The electronic device of claim 22, wherein the navigation view comprises one or more of: a section view along the tool axis, a section view perpendicular to the tool axis, a view on a real or fitted two-dimensional fluoroscopic (perspective) image, a sagittal section, a coronal section, and an axial section.
24. The electronic device of claim 21, wherein a marker indication for a marker of a patient physiological structure is further displayed on the endoscopic enhanced image, the marker indication comprising at least one of: a line segment between an endpoint of the perspective scale model and the marker, a direction from the endpoint to the marker, a distance between the endpoint and the marker, and text about the marker; and wherein the enhanced image further comprises a stitched image located around the endoscopic image, the enhanced image being an endoscopic enhanced image obtained by fusing the patient physiological structure marker, the marker indication, and the stitched image onto the endoscopic image.
25. The electronic device of claim 24, wherein the patient physiological structure is the spine and the marker is a marker point indicating one or more of a herniated or free intervertebral disc, an osteophyte, and an ossified ligamentum flavum, or a marker point on a physiological structure that is not displaced during endoscopic surgery, such as one or more of the ventral side of an articular process, the pedicle of an anterior vertebral body, the pedicle of a posterior vertebral body, and an intervertebral disc.
26. The electronic device of any one of claims 19-25, further comprising an input member by which an operator inputs an instruction to determine the length of the scale model.
27. The electronic device of any one of claims 19-25, wherein an endoscopic enhanced image obtained by fusing the scale model with an endoscopic image and a navigation view enhanced image obtained by fusing the scale model with a navigation view are displayed simultaneously on a display interface of the display device.
28. A navigation system, the navigation system comprising:
- a tracking device (1), the tracking device (1) being adapted to track a tracer (4) having a fixed positional relationship with respect to the endoscope (5); and
- a display device (3) and a processor adapted to be connected to the tracking device (1) and the endoscope (5), wherein the method of any one of claims 1-15 is performed when the processor runs, and wherein the display in the method is effected by the display device (3).
29. A navigation system, the navigation system comprising:
- a tracking device (1), the tracking device (1) being adapted to track a tracer (4) having a fixed positional relationship with respect to the endoscope (5); and
- an electronic device according to any one of claims 19-27, wherein the processor in the electronic device is adapted to be connected to the tracking device (1) and the endoscope (5).
30. A surgical robotic system comprising a robotic arm and a navigation system according to any one of claims 28-29.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202410385476.5A (CN118285913A) | 2024-04-01 | 2024-04-01 | Method for guiding endoscopic surgery under navigation system, electronic equipment, navigation system and surgical robot system |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202410385476.5A (CN118285913A) | 2024-04-01 | 2024-04-01 | Method for guiding endoscopic surgery under navigation system, electronic equipment, navigation system and surgical robot system |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN118285913A | 2024-07-05 |
Family
ID=91677046
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202410385476.5A (CN118285913A, pending) | Method for guiding endoscopic surgery under navigation system, electronic equipment, navigation system and surgical robot system | 2024-04-01 | 2024-04-01 |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN118285913A (en) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| WO2025167189A1 (en) * | 2024-02-07 | 2025-08-14 | 常州市康辉医疗器械有限公司 | Navigation method for optical rigid endoscope surgery, electronic device, navigation system, and robot system |
| WO2025185175A1 (en) * | 2024-03-06 | 2025-09-12 | 常州市康辉医疗器械有限公司 | Method for guiding endoscopic surgery, computer-readable storage medium, control apparatus, computer program product, electronic device, navigation system, and robotic system |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11779408B2 (en) | Robotic navigation of robotic surgical systems | |
| EP3395282B1 (en) | Endoscopic view of invasive procedures in narrow passages | |
| JP5190510B2 (en) | Multifunctional robotized platform for neurosurgery and position adjustment method | |
| CN106163408B (en) | Image registration and guidance using simultaneous X-plane imaging | |
| US10506991B2 (en) | Displaying position and optical axis of an endoscope in an anatomical image | |
| US20070225553A1 (en) | Systems and Methods for Intraoperative Targeting | |
| US20050085718A1 (en) | Systems and methods for intraoperative targetting | |
| IL281915B1 | Computerized tomography (CT) image correction using position and direction (P&D) tracking assisted optical visualization | |
| CN118285913A (en) | Method for guiding endoscopic surgery under navigation system, electronic equipment, navigation system and surgical robot system | |
| CN108969099B (en) | Correction method, surgical navigation system, electronic device and storage medium | |
| CN116829091B (en) | Surgical assistance system and presentation method | |
| JP2009529951A (en) | Method and apparatus for recording and reviewing a surgical navigation process | |
| CN117898834A (en) | Method for guiding endoscopic surgery, computer-readable storage medium, control device and computer program product, electronic device, navigation system and robotic system | |
| Eom et al. | NeuroLens: Augmented reality-based contextual guidance through surgical tool tracking in neurosurgery | |
| JP2020058779A (en) | Method for supporting user, computer program product, data storage medium, and imaging system | |
| CN117462257A (en) | Navigation method, electronic equipment and navigation system for spinal endoscopic surgery | |
| WO2025167189A1 (en) | Navigation method for optical rigid endoscope surgery, electronic device, navigation system, and robot system | |
| CN117860379A (en) | Endoscope guiding method under navigation system, electronic equipment and navigation system | |
| US20080204000A1 (en) | Universal instrument calibration system and method of use | |
| Naik et al. | Image Guidance in Endoscopic Sinonasal Surgery | |
| US20210186362A1 (en) | Selecting cursor locations on medical image using directions from distal end of probe | |
| CN117898664A (en) | Method for displaying previous endoscope image, electronic device, navigation system and robot system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |