CN120392290A - A computer-based surgical simulation assistance and navigation system
- Publication number
- CN120392290A (application number CN202510905594.9A)
- Authority
- CN
- China
- Prior art keywords
- model
- simulation
- dimensional
- surgical
- subsystem
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Abstract
The invention relates to the technical field of medical treatment, and in particular to a computer-based surgical simulation assistance and navigation system comprising a medical image three-dimensional model reconstruction subsystem, a model calibration and guidance subsystem, a surgical space positioning subsystem, and a surgical planning and simulation subsystem. The medical image three-dimensional model reconstruction subsystem builds a three-dimensional simulation model from the patient's preoperative examination information. A three-dimensional model registration function is constructed in the model calibration and guidance subsystem, and the model is adjusted by this function to obtain a target three-dimensional simulation model. The surgical space positioning subsystem analyzes the target three-dimensional simulation model to obtain spatial distribution information and posture information of the tissue in the patient's surgical region. A horizontal parallax safety evaluation model is established in the surgical planning and simulation subsystem to obtain a safe horizontal parallax distance. The surgical planning and simulation subsystem then performs the surgical simulation using the target three-dimensional simulation model, the spatial distribution information, the posture information, and the safe horizontal parallax distance.
Description
Technical Field
The invention relates to the technical field of medical treatment, and in particular to a computer-based surgical simulation assistance and navigation system.
Background
Current surgical assistance and navigation systems rely mainly on techniques such as electromagnetic navigation tracking, intraoperative X-ray fluoroscopy, and intraoperative real-time CT imaging. The prior art generally builds a three-dimensional model from preoperative image information; however, performing a standardized adjustment of the patient's three-dimensional model to guarantee the accuracy of its posture, and effectively fusing the model with the target object during the actual operation, remain significant challenges.
In addition, because simulated operations are not standardized, correct use of the related equipment in the surgical simulation system cannot be guaranteed during simulation assistance and navigation. How to ensure accurate operation, effective monitoring, and normal functioning of the surgical simulation and navigation system, and to provide reliable visual guidance and auxiliary information for the simulation, is therefore a problem to be solved.
Disclosure of Invention
To optimize the visual assistance and navigation functions of the surgical simulation process, the invention acquires the patient's spatial distribution information and posture information from preoperative examination information and three-dimensional model information, provides visual assistance and guidance for the simulation, and thereby helps improve the accuracy and safety of the operation. The invention provides a computer-based surgical simulation assistance and navigation system comprising a medical image three-dimensional model reconstruction subsystem, a model calibration and guidance subsystem, a surgical space positioning subsystem, and a surgical planning and simulation subsystem. The medical image three-dimensional model reconstruction subsystem obtains a three-dimensional simulation model of the patient from the patient's preoperative examination information. A three-dimensional model registration function is constructed in the model calibration and guidance subsystem, and the subsystem adjusts the three-dimensional simulation model according to this function to obtain a target three-dimensional simulation model of the patient. The surgical space positioning subsystem analyzes the target three-dimensional simulation model to obtain spatial distribution information and posture information of the tissue in the patient's surgical region. A horizontal parallax safety evaluation model is established in the surgical planning and simulation subsystem to obtain a safe horizontal parallax distance. Finally, the surgical planning and simulation subsystem simulates the patient's surgical plan according to the target three-dimensional simulation model, the spatial distribution information, the posture information, and the safe horizontal parallax distance.
The system has the functions of three-dimensional simulation, model calibration, operation space positioning, operation planning simulation and the like, can obviously reduce the risk in the operation process, more accurately positions the focus area of a patient, provides a better operation auxiliary and guiding tool for doctors, and is beneficial to realizing more accurate, safe and efficient operation treatment.
Optionally, the three-dimensional model registration function is constructed based on the three-dimensional simulation model and the preoperative examination information, and the three-dimensional simulation model is adjusted according to the output of the registration function to obtain the target three-dimensional simulation model of the patient. Constructing and applying the registration function to adjust the three-dimensional simulation model yields a model that corresponds more closely to the patient's actual condition.
Optionally, the three-dimensional model registration function satisfies the following relationship:

$$F=\sum_{m=1}^{M}\Big[(R_m^L-\bar R_m)^2+(G_m^L-\bar G_m)^2+(B_m^L-\bar B_m)^2+(R_m^R-\bar R_m)^2+(G_m^R-\bar G_m)^2+(B_m^R-\bar B_m)^2\Big]$$

where $F$ represents the three-dimensional model registration result; $M$ represents the number of visible points of the three-dimensional simulation model; $R_m^L$, $G_m^L$, and $B_m^L$ represent the intensity values of the red, green, and blue channels of the m-th visible point in the left camera projection image; $R_m^R$, $G_m^R$, and $B_m^R$ represent the intensity values of the red, green, and blue channels of the m-th visible point in the right camera projection image; and $\bar R_m$, $\bar G_m$, and $\bar B_m$ represent the mean red, green, and blue intensities of the m-th visible point over the left and right camera projection images. The disclosed registration function integrates multichannel information, the left and right camera projection mechanisms, and quantitative analysis, which helps formulate a more accurate and personalized surgical treatment plan, reduces surgical risk, and improves the surgical success rate.
Optionally, analyzing the target three-dimensional simulation model with the surgical space positioning subsystem to obtain the spatial distribution information and posture information of the tissue in the patient's surgical region comprises: the surgical space positioning subsystem analyzes the target three-dimensional simulation model in combination with the preoperative examination information to obtain a spatial positioning analysis result, and the spatial distribution information of the tissue in the surgical region is obtained from that result, the spatial distribution information comprising spatial distances and spatial angle values of the tissue in the surgical region. Using the surgical space positioning subsystem to acquire this information helps optimize the surgical plan and enables real-time navigation of the simulated operation.
Optionally, the spatial distance satisfies the following relationship:

$$d_{AB}=\sqrt{(x_A-x_B)^2+(y_A-y_B)^2+(z_A-z_B)^2}$$

where $d_{AB}$ represents the spatial distance between points A and B in the three-dimensional simulation model, $(x_A, y_A, z_A)$ represents the spatial coordinates of point A in the three-dimensional simulation model, and $(x_B, y_B, z_B)$ represents the spatial coordinates of point B in the three-dimensional simulation model;
the spatial angle value satisfies the following relationship:

$$\theta_{AOB}=\arccos\frac{d_{AO}^2+d_{BO}^2-d_{AB}^2}{2\,d_{AO}\,d_{BO}}$$

where $\theta_{AOB}$ represents the angle value of A and B (about the vertex O) in the three-dimensional simulation model, $d_{AO}$ represents the spatial distance between points A and O, $d_{BO}$ represents the spatial distance between points B and O, and $d_{AB}$ represents the spatial distance between points A and B. These calculation models for the spatial distance and the spatial angle value help locate and navigate the patient's lesion area during surgical simulation, further improving the surgical success rate.
Optionally, establishing the horizontal parallax safety evaluation model in the surgical planning and simulation subsystem comprises establishing the model according to the parallax type of the surgical simulation, the parallax types comprising zero parallax, positive parallax, negative parallax, and divergent parallax. This allows the invention to simulate visual perception during a real operation, improving the realism, accuracy, and safety of the simulation system and reducing errors and uncertainty during surgery.
Optionally, the horizontal parallax safety evaluation model is defined in terms of the visual acuity of the human eye, the distance between the viewer's eyes and the display screen, the interpupillary distance of the viewer, the horizontal pixel resolution of the display, the pupil diameter of the human eye, and the binocular disparity value in the horizontal direction. The model ensures that the binocular disparity value remains within a safe range, reducing the surgical risk caused by visual error.
Optionally, establishing the horizontal parallax safety evaluation model in the surgical planning and simulation subsystem to obtain a safe horizontal parallax distance comprises: the surgical planning and simulation subsystem obtains a horizontal parallax evaluation result through the model and adjusts the equipment use distance during the simulation according to that result, so that the safe horizontal parallax distance is maintained throughout the simulation. Adjusting the use distance of the simulation equipment through the safety evaluation model helps improve the realism and flexibility of the simulation and provides a safer and more efficient surgical treatment plan for the patient.
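The distance-adjustment loop can be sketched as follows. This is a hedged illustration only: the evaluation model's exact formula is not reproduced in this text, so the sketch assumes a simple angular-disparity comfort limit (about one degree is a common rule of thumb) and computes the minimum viewing distance that keeps the on-screen parallax within it. The function name, the limit, and the angular criterion are all illustrative assumptions.

```python
import math

def safe_viewing_distance(parallax_m, current_dist_m, max_disparity_deg=1.0):
    """Adjust the viewer-to-display distance so the on-screen horizontal
    parallax stays within an assumed safe angular-disparity limit.

    parallax_m      -- horizontal on-screen parallax, in metres
    current_dist_m  -- current eye-to-screen distance, in metres
    max_disparity_deg -- assumed comfort limit (illustrative; the patent's
                         actual evaluation model is not specified here)

    Returns the current distance if it is already safe, otherwise the
    minimum distance at which the disparity angle drops to the limit.
    """
    limit = math.radians(max_disparity_deg)
    # Angle subtended by the parallax segment at the current distance.
    angle = 2.0 * math.atan(parallax_m / (2.0 * current_dist_m))
    if angle <= limit:
        return current_dist_m
    # Solve 2*atan(p / (2*L)) = limit for L.
    return parallax_m / (2.0 * math.tan(limit / 2.0))
```

For example, a 1 mm parallax viewed from 0.6 m is comfortably below a one-degree limit, so the distance is left unchanged, while a 5 cm parallax at 0.5 m forces the viewer further back.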
Optionally, the surgical planning and simulation subsystem simulates the patient's surgical plan according to the target three-dimensional simulation model, the spatial distribution information, the posture information, and the safe horizontal parallax distance. The subsystem is provided with visual monitoring equipment and, based on that equipment together with the target three-dimensional simulation model, the spatial distribution information, the posture information, and the safe horizontal parallax distance, realizes visual monitoring of the simulation workflow. The visual monitoring equipment captures and displays key information in real time during the simulation, and by combining it with the target three-dimensional simulation model, the spatial distribution information, and the posture information, the system can faithfully reproduce the surgical simulation environment.
Optionally, the computer-based surgical simulation assistance and navigation system further sets surgical plan evaluation indexes based on historical surgical information, the indexes comprising a surgical execution quality index, a postoperative rehabilitation index, and a patient satisfaction index; the patient's simulated surgical plan is comprehensively evaluated by combining these three indexes. This comprehensive evaluation enables the system to predict risks and problems that may occur during the operation, helping doctors prepare adequate countermeasures before surgery and reducing surgical risk.
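One way the three indexes could be combined is a weighted sum; a minimal sketch, assuming each index is normalised to [0, 1] and assuming illustrative weights (the text only states that the indexes are evaluated comprehensively, so the weights and the function name are hypothetical):

```python
def evaluate_plan(execution_quality, rehab_outcome, patient_satisfaction,
                  weights=(0.5, 0.3, 0.2)):
    """Combine the surgical execution quality, postoperative rehabilitation,
    and patient satisfaction indexes (each in [0, 1]) into a single score.
    The weights are illustrative assumptions, not values from the patent.
    """
    scores = (execution_quality, rehab_outcome, patient_satisfaction)
    return sum(w * s for w, s in zip(weights, scores))
```

A plan scoring near 1.0 on all three indexes would thus receive a composite score near 1.0, while poor execution quality is penalised most heavily under these assumed weights.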
Drawings
FIG. 1 is a flow chart of a computer-based surgical simulation assistance and navigation system of the present invention;
FIG. 2 is a schematic view of different parallax types in the computer-based surgical simulation assistance and navigation system of the present invention;
FIG. 3 is a block diagram of a computer-based surgical simulation assistance and navigation system of the present invention.
Detailed Description
Specific embodiments of the invention will be described in detail below, it being noted that the embodiments described herein are for illustration only and are not intended to limit the invention. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. It will be apparent, however, to one skilled in the art that these specific details need not be employed to practice the present invention. In other instances, well-known circuits, software, or methods have not been described in detail in order not to obscure the invention.
Reference throughout this specification to "one embodiment," "an embodiment," "one example," or "an example" means that a particular feature, structure, or characteristic described in connection with the embodiment or example is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment," "in an embodiment," "one example," or "an example" in various places throughout this specification are not necessarily all referring to the same embodiment or example. Furthermore, the particular features, structures, or characteristics may be combined in any suitable combination and/or sub-combination in one or more embodiments or examples. Moreover, those of ordinary skill in the art will appreciate that the illustrations provided herein are for illustrative purposes and that the illustrations are not necessarily drawn to scale.
Referring to FIG. 1, the three-dimensional simulation model is registered using the related algorithms and techniques so that it conforms to the surgical target object, providing a solid foundation for surgical simulation. At the same time, based on the preoperative examination data and three-dimensional model information, the spatial and posture information of the patient's surgical region is further analyzed to provide a visual assistance and navigation basis for the simulation, helping doctors accurately plan and perform operations in a simulated environment, reducing surgical risk, and improving the surgical success rate. The computer-based surgical simulation assistance and navigation system provided by the invention mainly comprises the following steps:
A computer-based surgical simulation assistance and navigation system is provided with a medical image three-dimensional model reconstruction subsystem, a model calibration and guidance subsystem, a surgical space positioning subsystem, and a surgical planning and simulation subsystem.
S1, the medical image three-dimensional model reconstruction subsystem obtains a three-dimensional simulation model of a patient according to preoperative detection information of the patient, and the specific implementation steps and related contents are as follows:
The medical image three-dimensional model reconstruction subsystem receives the patient's preoperative examination information, which includes medical image data such as CT, MRI, and ultrasound. Mapping the patient's preoperative examination data helps the subsystem accurately construct three-dimensional simulation models for different patients. Each model reproduces the patient's lesion area and its anatomy, so that doctors and other simulation participants can fully understand tissue details of the lesion area such as vascular layout, nerve paths, and bone structure, providing auxiliary and reference information for diagnosis, surgical strategy planning, and surgical simulation practice.
In actual operation, the medical image three-dimensional model reconstruction subsystem first receives and analyzes the image data and examination information of the patient's surgical target area using two-dimensional imaging techniques such as CT and MRI, and then converts the patient's two-dimensional image data into a three-dimensional model through medical image processing, completing the conversion from planar to three-dimensional information. This conversion fully retains the original diagnostic images and examination data while allowing doctors to observe the patient's lesion and adjacent tissue from different viewing angles, which helps design a more accurate and safer surgical plan for the patient.
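The two-dimensional-to-three-dimensional conversion described above can be sketched in miniature. This is an illustration, not the patent's actual pipeline: it merely stacks equally sized 2D slices into a voxel volume and applies a placeholder intensity threshold; the function name and the thresholding step are assumptions, and a real system would follow with surface extraction (e.g. marching cubes) and mesh generation.

```python
import numpy as np

def slices_to_volume(slices, threshold):
    """Stack 2D image slices (e.g. CT) into a 3D voxel volume and segment it.

    slices    -- iterable of equally sized 2D intensity arrays, in scan order
    threshold -- intensity above which a voxel is treated as target tissue
                 (a placeholder for the segmentation step)

    Returns (volume, mask): the stacked intensities and a boolean mask that
    a surface-extraction step could turn into a mesh.
    """
    volume = np.stack([np.asarray(s, dtype=float) for s in slices], axis=0)
    mask = volume > threshold
    return volume, mask
```

Stacking two 2x2 slices, for instance, yields a (2, 2, 2) volume whose mask marks only the voxels above the chosen threshold.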
In summary, the medical image three-dimensional model reconstruction subsystem converts the two-dimensional information of the patient into the three-dimensional simulation model, presents visual patient focus and surrounding tissue models thereof, and is beneficial to promoting the progress and development of medical diagnosis technology.
Furthermore, the method of establishing the patient's three-dimensional simulation model in this embodiment is only one option of the invention. In other embodiments the method can be replaced according to the patient's actual condition and the requirements of the surgical simulation, since different patients and surgical scenarios require different levels of model precision and detail.
S2, constructing a three-dimensional model registration function in the model calibration and guide subsystem, and adjusting the three-dimensional simulation model by the model calibration and guide subsystem according to the three-dimensional model registration function so as to obtain a target three-dimensional simulation model of the patient, wherein the specific implementation contents are as follows:
The three-dimensional simulation model is then adjusted according to the output of the three-dimensional model registration function to obtain the target three-dimensional simulation model of the patient.
A three-dimensional model registration function is constructed in the model calibration and guidance subsystem, and the three-dimensional simulation model of the patient is adjusted accordingly to obtain a target three-dimensional simulation model which is highly consistent with the actual surgical situation.
First, the patient's preoperative examination information is compared with the constructed three-dimensional simulation model, and the three-dimensional model registration function is constructed from the correspondence between the two. The registration function optimizes how the three-dimensional simulation model is aligned with the patient's real surgical area or structure, thereby providing more accurate simulation information.
The three-dimensional model registration function in this embodiment satisfies the following relationship:

$$F=\sum_{m=1}^{M}\Big[(R_m^L-\bar R_m)^2+(G_m^L-\bar G_m)^2+(B_m^L-\bar B_m)^2+(R_m^R-\bar R_m)^2+(G_m^R-\bar G_m)^2+(B_m^R-\bar B_m)^2\Big]$$

where $F$ represents the three-dimensional model registration result; $M$ represents the number of visible points of the three-dimensional simulation model; $R_m^L$, $G_m^L$, and $B_m^L$ represent the intensity values of the red, green, and blue channels of the m-th visible point in the left camera projection image; $R_m^R$, $G_m^R$, and $B_m^R$ represent the intensity values of the red, green, and blue channels of the m-th visible point in the right camera projection image; and $\bar R_m$, $\bar G_m$, and $\bar B_m$ represent the mean red, green, and blue intensities of the m-th visible point over the left and right camera projection images.
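The colour-consistency idea behind the registration function — comparing each visible point's left and right projections against their per-point mean intensity — can be sketched as follows. The exact functional form (a sum of squared channel deviations) and the name `registration_score` are assumptions; only the inputs, per-point RGB intensities in the left and right camera projection images, come from the description.

```python
import numpy as np

def registration_score(left_rgb, right_rgb):
    """Colour-consistency score over the M visible points of the model.

    left_rgb, right_rgb -- (M, 3) arrays of RGB intensities sampled at the
    projections of the same M visible model points in the left and right
    camera images. Smaller is better; 0 means the projections agree exactly.
    """
    left_rgb = np.asarray(left_rgb, dtype=float)
    right_rgb = np.asarray(right_rgb, dtype=float)
    mean_rgb = (left_rgb + right_rgb) / 2.0  # per-point channel means
    # Squared deviation of each camera's intensity from the per-point mean.
    dev = (left_rgb - mean_rgb) ** 2 + (right_rgb - mean_rgb) ** 2
    return float(dev.sum())
```

In a registration loop the model's pose parameters would be varied to minimise this score, matching the minimisation criterion described in the text.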
The output value of the three-dimensional model registration function is the result computed during registration; it describes the spatial transformation relationship between the two-dimensional information and the three-dimensional model so that the two can be aligned or matched as well as possible. If the output value falls within the preset registration threshold, or the registration function is minimized, the three-dimensional simulation model and the two-dimensional information fit well and the model can be used directly in the subsequent simulation workflow; if the output does not satisfy the preset condition, the model must be optimized and adjusted.
In three-dimensional space, when the three-dimensional simulation model is observed from a given viewing angle or camera position, the number of model surface points that can be seen is the number of visible points. In this embodiment the visible points may be vertices of the model surface or smaller subdivided points; together they form the visible portion of the model under different viewing angles. The greater the number of visible points, the more detailed the model and the more realistic the visual effect.
The intensity value of the red channel in the left camera projection image gives, for any visible point of the three-dimensional model projected onto the left camera's two-dimensional image plane, the brightness or colour intensity of that point's red channel; it is a scalar describing the depth of colour or brightness of the point in the red channel.
The intensity value of the green channel of the three-dimensional simulation model visual point in the left camera projection image is the color intensity of any visual point on the model on the green channel.
The blue channel is one of the components of a colour image and, together with the red and green channels, forms its colour space. The intensity value of the blue channel in the left camera projection image describes the colour intensity of any visible point of the three-dimensional model in the blue channel, and is likewise a scalar.
These intensity values play an important role in the three-dimensional model registration function: they are used to compute the colour intensity differences of the model between the left and right camera projection images, which in turn measure the degree of alignment between the three-dimensional model and the real surgical area or structure. In this embodiment, minimizing the colour intensity differences yields the optimal parameters for aligning the three-dimensional simulation model with the real surgical structure, giving the optimized target three-dimensional simulation model.
The intensity values of the red, green, and blue channels of a visible point in the right camera projection image are interpreted in the same way as the corresponding left camera parameters, the difference being that the three-dimensional simulation model is projected onto the right camera's two-dimensional image plane when obtaining the brightness or colour intensity of each visible point.
The average value of the intensity values of the different color channels reflects the overall brightness or the color shade of the three-dimensional model on the different color channels. Based on the method, brightness or color distribution information of the three-dimensional model on different color channels can be analyzed, so that the alignment degree between the three-dimensional model and a real operation area or structure can be estimated.
The three-dimensional model registration function achieves matching by comparing the colour intensity information of the camera projection images. It jointly considers the intensity distributions of the three primary colours (red, green, and blue) of the three-dimensional simulation model in the left and right camera projection images, computes for each visible point the deviation between its colour intensity and the average intensity in the projection images, and thereby measures the alignment between the three-dimensional simulation model and the real surgical area or anatomical structure, ensuring that the model matches the actual situation.
When the registration function value is minimized, or its output is kept within a preset reference threshold, a set of optimal parameters has been found. These parameters ensure that the three-dimensional simulation model is accurately aligned with the real surgical area or anatomical structure. The target three-dimensional simulation model obtained with the optimized parameters is an essential part of the surgical simulation assistance and navigation system: it provides doctors with detailed, real-time surgical guidance and helps the operation proceed smoothly.
The registration function plays a key role in adjusting the three-dimensional simulation model. According to its feedback, the shape, position, and other parameters of the model are adjusted to keep it highly consistent with the patient's real anatomy, ensuring the accuracy and reliability of surgical navigation. In addition, to enhance model precision, the optimized three-dimensional simulation model is compared with the two-dimensional images during the calibration stage: the vertex coordinates of the three-dimensional model are mapped into the image coordinate system through coordinate conversion so that they can be directly compared with and verified against the two-dimensional image information.
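The coordinate-conversion step that maps model vertex coordinates into the image coordinate system can be illustrated with a basic pinhole projection. This is a hypothetical sketch: the text does not specify the camera model, and the intrinsics `fx`, `fy`, `cx`, `cy` are assumed here for illustration.

```python
import numpy as np

def project_vertices(vertices, fx, fy, cx, cy):
    """Map 3D model vertex coordinates into 2D image (pixel) coordinates.

    A minimal pinhole-camera sketch of the coordinate-conversion step:
    vertices are assumed to already be expressed in the camera frame with
    z > 0; fx, fy are focal lengths in pixels and (cx, cy) is the
    principal point.
    """
    v = np.asarray(vertices, dtype=float)
    u = fx * v[:, 0] / v[:, 2] + cx  # horizontal pixel coordinate
    w = fy * v[:, 1] / v[:, 2] + cy  # vertical pixel coordinate
    return np.stack([u, w], axis=1)
```

A point on the optical axis projects to the principal point, and off-axis points shift in proportion to x/z and y/z, which is what allows the projected vertices to be compared directly with the two-dimensional image information.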
Furthermore, the method of acquiring the patient's target three-dimensional simulation model in this embodiment is only one option of the invention. In other embodiments the acquisition method can be adjusted according to the requirements of the target model and the actual calibration conditions; adjusting the method better accommodates individual differences between patients, such as body type, lesion position, and lesion severity, producing a three-dimensional simulation model that better matches the patient's actual condition.
S3, analyzing the target three-dimensional simulation model by utilizing the operation space positioning subsystem to acquire space distribution information and posture information of the tissue of the operation region of the patient, wherein the specific implementation contents are as follows:
The surgical space positioning subsystem analyzes the target three-dimensional simulation model in combination with the preoperative examination information and obtains a spatial positioning analysis result.
The operation space positioning subsystem is tightly matched with the preoperative detection information, and deep analysis is carried out on the target three-dimensional simulation model to obtain a space positioning analysis result. The subsystem integrates medical imaging, electromagnetic tracking and optical positioning technologies and is used for measuring three-dimensional space positions and postures of tissues and surgical instruments in a surgical area.
The subsystem can analyze the relative position of the surgical instruments and the patient's surgical region in real time, providing doctors with intuitive, clear visual navigation. During surgical simulation, a doctor can draw up the operative path and implementation plan in advance from the spatial positioning information and the posture data of the instruments, improving the accuracy and safety of the operation.
By accurately measuring the spatial position and posture of the three-dimensional simulation model, the subsystem plays a significant role in improving surgical accuracy and safety, effectively reduces surgical risk, and thereby optimizes the simulation effect.
From the results of the spatial positioning analysis, spatial distribution information about the tissue of the patient's surgical region can be extracted; in an alternative embodiment this information specifically includes spatial distance data and spatial angle data between the tissues of the surgical region.
The above spatial distance satisfies the following relationship:

$$d_{AB} = \sqrt{(x_A - x_B)^2 + (y_A - y_B)^2 + (z_A - z_B)^2}$$

where $d_{AB}$ represents the spatial distance between points A and B in the three-dimensional simulation model, $(x_A, y_A, z_A)$ represents the spatial coordinates of point A in the three-dimensional simulation model, and $(x_B, y_B, z_B)$ represents the spatial coordinates of point B in the three-dimensional simulation model.
In this embodiment the three-dimensional spatial distance formula is used to calculate the spatial distance; it accurately reflects the straight-line distance in three-dimensional space between any two points of the three-dimensional simulation model.
The above spatial angle value satisfies the following relationship:

$$\theta_{AB} = \arccos\left(\frac{d_{AO}^2 + d_{BO}^2 - d_{AB}^2}{2\, d_{AO}\, d_{BO}}\right)$$

where $\theta_{AB}$ represents the angle value of A and B in the three-dimensional simulation model, $d_{AO}$ represents the spatial distance between points A and O in the three-dimensional simulation model, $d_{BO}$ represents the spatial distance between points B and O in the three-dimensional simulation model, and $d_{AB}$ represents the spatial distance between points A and B in the three-dimensional simulation model.
The spatial angle value is introduced in this embodiment to determine the angular relationship between the tissue of the surgical region and a reference point. In practice, the law of cosines is applied to derive the included angle of any two points relative to the reference point O.
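The two relationships above, the Euclidean distance and the law-of-cosines angle about a reference point O, can be sketched as:

```python
import math

def spatial_distance(p, q):
    # Euclidean distance between two points in the 3-D simulation model
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

def spatial_angle(a, b, o):
    # Included angle of points A and B relative to reference point O,
    # obtained via the law of cosines; result in degrees.
    d_ao = spatial_distance(a, o)
    d_bo = spatial_distance(b, o)
    d_ab = spatial_distance(a, b)
    cos_t = (d_ao ** 2 + d_bo ** 2 - d_ab ** 2) / (2 * d_ao * d_bo)
    # Clamp against floating-point drift before taking the arccosine
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_t))))

# Perpendicular directions from the origin give a 90-degree angle
angle = spatial_angle((1, 0, 0), (0, 1, 0), (0, 0, 0))
```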
By calculating the spatial distances and angles between the tissues of the surgical region, the precise locations of the surgical instruments and the lesion can be determined more accurately. With this spatial distribution information, doctors gain a deep understanding of the layout of the tissues in the surgical region and their mutual positional relationships, which effectively prevents accidental damage to critical tissues and further reduces surgical risk. In addition, using the spatial information together with the posture data of the surgical instruments, doctors can tailor the operative plan to the patient, including the choice of operative path, the planning of operative steps and the selection of instruments, ensuring that the surgical simulation proceeds smoothly.
Furthermore, the method of acquiring spatial information and posture information in this embodiment is only one option of the present invention. In other embodiments, the information acquisition process may be modified and refined according to the requirements of the patient's operative plan, the specific structure and complexity of the three-dimensional simulation model, and the available technical conditions, so that the relevant information of the three-dimensional model can be acquired accurately and efficiently.
S4, establishing a horizontal parallax safety evaluation model in the surgical planning and simulation subsystem to obtain a safe horizontal parallax distance; the subsystem then simulates the patient's operative plan from the target three-dimensional simulation model, the spatial distribution information, the posture information and the safe horizontal parallax distance. The specific implementation is as follows:
A horizontal parallax safety evaluation model is established according to the parallax types involved in the surgical simulation; in this embodiment the parallax types mainly comprise zero parallax, positive parallax, negative parallax and divergent parallax.
When observing a three-dimensional model, the human visual system focuses not only on the main target object but simultaneously perceives surrounding objects, which are then imaged on the retina. When two corresponding imaging points on the retinas fall within the fusion zone, the brain integrates the visual information into a single stereoscopic percept. To guarantee a stereoscopic effect within the fusion zone, however, the parallax between the images received by the two eyes must stay within a reasonable range.
According to the principle of parallel binocular image acquisition and related research, most stereoscopic video perceived by human vision exhibits negative parallax, i.e. the intersection point of the two lines of sight lies in front of the display screen, creating the perception that an object is about to jump out of the screen. In this embodiment, to improve the immersion and stereoscopic depth of the naked-eye 3D display, the parallax types cover positive parallax, zero parallax, negative parallax and other situations. The parallax types are shown in fig. 2, where (a) represents zero parallax, (b) positive parallax, (c) negative parallax and (d) divergent parallax.
When the intersection point of the two lines of sight falls exactly on the display screen, the image is in a zero-parallax state: it appears flat and lacks depth, as shown in fig. 2 (a). If the intersection point lies behind the screen, positive parallax is formed and the viewer experiences the three-dimensional simulation model as receding into the screen, as shown in fig. 2 (b). Conversely, if the intersection point lies in front of the screen, negative parallax is produced and the model appears to jump out of the screen, creating a strong stereoscopic impact, as shown in fig. 2 (c). Finally, when the two lines of sight cannot meet at all, divergent parallax results and the viewer perceives no stereoscopic effect, as shown in fig. 2 (d).
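The four parallax categories above can be sketched as a simple classifier. This is an illustrative sketch, not the patent's logic: it assumes a signed on-screen disparity measured in millimetres, with positive values meaning uncrossed (behind-screen) parallax, and treats any disparity wider than the interpupillary distance (assumed 65 mm here) as divergent, since the eyes would then have to diverge.

```python
def classify_parallax(disparity_mm, ipd_mm=65.0):
    """Classify screen parallax from a signed horizontal disparity.

    Convention (an assumption for this sketch): positive disparity places
    the binocular intersection behind the screen, negative in front of it;
    disparity wider than the interpupillary distance cannot be fused at all.
    """
    if disparity_mm > ipd_mm:
        return "divergent"   # eyes forced to diverge: no stereo percept
    if disparity_mm > 0:
        return "positive"    # object perceived behind the screen
    if disparity_mm < 0:
        return "negative"    # object appears to jump out of the screen
    return "zero"            # image lies flat on the screen plane
```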
As fig. 2 shows, by keeping the position of the binocular intersection point and the parallax within a specific range, a better visual effect can be obtained for the operator.
During the surgical simulation procedure, the naked-eye display devices used by the surgical participants and their specific viewing distance from the screen must be taken into account. To ensure surgical accuracy and the visual comfort of the participants, the horizontal parallax must stay within a safe range: to achieve a vivid yet safe stereoscopic effect, the horizontal parallax range is adjusted according to the characteristics of each parallax type and the actual viewing distance. By carefully classifying and handling the various types of parallax, a more realistic and reliable surgical simulation environment can be constructed, providing doctors with comprehensive visual assistance, improving the success rate of the operation and safeguarding the patient.
The horizontal parallax safety evaluation model satisfies the following relationship:

where $V$ represents the visual acuity of the human eye, $L$ represents the distance between the viewer's eyes and the display screen, $e$ represents the interpupillary distance of the viewer, $R_h$ represents the horizontal pixel resolution of the display, $d_p$ represents the pupil diameter of the human eye, and $P$ represents the binocular disparity value in the horizontal direction.
Visual acuity is a visual indicator: it refers to the ability of the human eye to distinguish fine objects or the fine details of distant objects. In this embodiment the visual acuity of the human eye is set to a fixed value;
The pupil diameter of the human eye is set to be 4 mm;
The horizontal pixel resolution of the display (the display presenting the three-dimensional simulation model) refers to the number of pixels the display can show in the horizontal direction. It is an important measure of display definition and determines the level of detail with which an image is presented.
The horizontal pixel resolution of the display satisfies the following relationship:

where $R_h$ represents the horizontal pixel resolution of the display, $S$ represents the diagonal dimension of the display, and $R$ represents the display resolution of the display screen.
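The patent's exact formula is not reproduced above. As an illustration of how the listed quantities interact, the sketch below derives the display's pixel density from its diagonal size and resolution and converts a horizontal disparity given in pixels into a physical on-screen distance. The helper names and the width/height arguments are assumptions for this sketch, not the patent's definitions.

```python
import math

def horizontal_ppi(diagonal_in, res_w, res_h):
    # Pixels per inch: pixel-count diagonal divided by the physical diagonal
    return math.hypot(res_w, res_h) / diagonal_in

def disparity_mm(pixel_disparity, diagonal_in, res_w, res_h):
    # Convert a horizontal disparity in pixels to millimetres on screen
    return pixel_disparity / horizontal_ppi(diagonal_in, res_w, res_h) * 25.4

# A 27-inch 1920x1080 monitor has ~81.6 PPI, so a 20 px disparity
# spans roughly 6.2 mm of physical screen width.
d = disparity_mm(20, 27.0, 1920, 1080)
```

The physical disparity obtained this way is what a safety bound based on viewing distance and interpupillary distance would be compared against.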
Because of the interpupillary distance between the two eyes, the same object is imaged with a horizontal offset on the two retinas, forming horizontal parallax. By regulating the horizontal parallax value between the left-eye and right-eye images, a more realistic stereoscopic effect can be simulated. If the horizontal parallax is set improperly, whether too large or too small, it causes visual discomfort or weakens stereoscopic perception. It is therefore important to adjust the horizontal parallax value to the specific situation of the doctor or user; when the patient's three-dimensional simulation model is observed and analyzed at an appropriate horizontal parallax value, optimal stereoscopic effect and viewing comfort are obtained, ensuring a smooth simulation workflow.
In an alternative embodiment, the surgical planning and simulation subsystem obtains a horizontal parallax evaluation result by using a horizontal parallax safety evaluation model, and adjusts the equipment use distance in the surgical simulation process according to the horizontal parallax evaluation result so as to maintain the safety horizontal parallax distance in the surgical simulation process.
The operation planning and simulation subsystem utilizes a horizontal parallax safety evaluation model to accurately evaluate the horizontal parallax. Based on the evaluation result, the system can intelligently adjust the distance between the operation simulation equipment and the observer, and ensures that the horizontal parallax is always in a safe and proper range in the whole operation simulation flow.
Provided the horizontal parallax is within the optimal range, the system can also apply a horizontal displacement to the left and right images of the three-dimensional simulation model according to the required translation amount, further regulating the horizontal parallax. During this process, however, the translation may lose part of the image edge information. To address this, a bilinear interpolation algorithm is adopted in this embodiment to process the image edges, so that the edges of the final three-dimensional simulation model remain smooth and natural and the accuracy and fluency of the model are maintained.
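A minimal sketch of the horizontal-shift step, under stated assumptions: it resamples each row at fractionally shifted coordinates, which is the horizontal slice of bilinear resampling (a purely horizontal shift needs no vertical interpolation), and fills vacated edge columns by clamping to the nearest valid column so the edges stay smooth. The function name and the clamping policy are assumptions, not the patent's specification.

```python
import numpy as np

def shift_horizontal(img, dx):
    """Shift a grayscale image right by a fractional pixel amount dx,
    interpolating linearly along each row and clamping at the edges."""
    h, w = img.shape
    xs = np.clip(np.arange(w) - dx, 0, w - 1)   # source coordinate per column
    x0 = np.floor(xs).astype(int)               # left neighbour
    x1 = np.minimum(x0 + 1, w - 1)              # right neighbour (clamped)
    frac = xs - x0                              # blend weight
    return img[:, x0] * (1 - frac) + img[:, x1] * frac

img = np.tile(np.arange(8, dtype=float), (4, 1))   # horizontal-gradient test image
shifted = shift_horizontal(img, 0.5)               # half-pixel shift
```

Applying equal and opposite shifts to the left-eye and right-eye images changes their relative horizontal parallax by the sum of the two translation amounts.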
Visual monitoring equipment is configured in the surgical planning and simulation subsystem. Based on this equipment, the target three-dimensional simulation model, the spatial distribution information, the posture information and the safe horizontal parallax distance, the subsystem realizes visual monitoring of the surgical simulation workflow.
In this embodiment, a computer-based surgical simulation assistance and navigation system further includes:
Based on past surgical data, a surgical-plan evaluation system is established, covering operation execution quality, postoperative rehabilitation status and patient satisfaction as evaluation indexes.
For operation execution quality, two key indexes are considered: operative time and intraoperative blood loss. Both are inversely related to surgical effect; the lower the value, the more successful the operation and the more it benefits the patient.
Postoperative rehabilitation status is evaluated by the success rate and quality of the patient's postoperative recovery, graded as excellent, good or poor. The higher the proportion of patients recovering at the excellent and good grades, the better the surgical effect, reflecting a positive correlation.
Patient satisfaction is evaluated mainly from the patient's subjective experience, divided into three levels: dissatisfied, generally satisfied and very satisfied. The higher the patient satisfaction, the closer the surgical effect is to the desired outcome.
In this example, a traditional conventional surgery group and a computer-assisted surgery group were set up; the specific statistical analysis is as follows:
For the statistical analysis, SPSS statistical software was selected to process and analyze the data of the different groups. Measurement data are expressed as mean ± standard deviation ($\bar{x} \pm s$) and compared between groups with the t-test; count data are expressed as percentages (%) and evaluated with the $\chi^2$ test. When the P value is below the threshold of 0.05, the difference between groups is judged to be statistically significant.
$\bar{x}$ (x-bar) represents the mean, i.e. the average of a set of data, obtained by summing all values and dividing by their number; it describes the central position or average level of the data.
$s$ represents the standard deviation, a statistic measuring the dispersion of a set of data: the larger the standard deviation, the more scattered the data points; the smaller, the more concentrated. It describes the fluctuation or degree of dispersion of the data.
$\chi^2$ (chi-squared) represents the statistic of the chi-squared test, a non-parametric method used mainly to compare differences between observed and expected frequencies in order to test the association or independence of categorical variables; the larger the $\chi^2$ value, the greater the difference between observed and expected frequencies.
P represents the probability value, used to judge whether an observed difference arises from random error or reaches statistical significance. A significance level (e.g. 0.05) is set in advance; when the P value is smaller than this level, the observed difference is considered not to be due to random error but to be statistically significant.
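The statistical workflow described above (mean ± SD with a t-test for measurement data, a chi-squared test for count data, P < 0.05 as the significance threshold) can be sketched with SciPy. The numbers below are illustrative placeholders, not the patent's data.

```python
import numpy as np
from scipy import stats

# Illustrative measurement data: operative time in minutes for two groups
conventional = np.array([120.0, 135.0, 128.0, 140.0, 132.0, 125.0])
assisted = np.array([95.0, 102.0, 98.0, 105.0, 100.0, 97.0])
t_stat, p_t = stats.ttest_ind(conventional, assisted)

# Illustrative count data: patients rated "excellent" vs "not excellent"
table = np.array([[20, 5],    # computer-assisted group
                  [8, 17]])   # conventional group
chi2, p_chi, dof, expected = stats.chi2_contingency(table)

# Both differences are significant at the 0.05 level for these data
significant = (p_t < 0.05) and (p_chi < 0.05)
```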
1. Comparative analysis of the indexes of surgical simulation execution.
Comparing the traditional conventional surgery group with the computer-assisted surgery group in terms of operative time and intraoperative blood loss, the computer-assisted group showed clear advantages: shorter operative time and less bleeding. The data of the two groups were analyzed with the t-test, and the difference between them was significant (P < 0.05); the related data are shown in table 1.
Table 1 two sets of operation execution quality index comparison data tables
2. Comparative analysis of the degree of postoperative recovery.
The computer-assisted surgery group performed better in postoperative recovery. In particular, the proportion (in percent) of patients recovering to the excellent grade was significantly higher in the computer-assisted group than in the control group; this difference was statistically confirmed by the $\chi^2$ test, indicating a significant difference between the two groups (P < 0.05), see table 2 for details.
Table 2 two sets of postoperative rehabilitation condition index comparison data tables
3. Comparative analysis of the patient satisfaction index.
The postoperative satisfaction survey showed that patient satisfaction in the computer-assisted surgery group was significantly higher than in the control group. In particular, the percentage of satisfied patients was significantly higher in the computer-assisted group; this was statistically validated by the $\chi^2$ test, indicating a significant difference between the two groups (P < 0.05), see table 3 for details.
Table 3 comparison data table of the two groups' patient satisfaction indexes
In-depth analysis of the data in tables 1, 2 and 3 clearly shows that, in the field of surgical simulation assistance and navigation, the computer-based surgical simulation assistance and navigation system provided by the invention performs better: it greatly shortens operative time and markedly reduces intraoperative blood loss, improving the scientific rigor and safety of the operative process and effectively lowering the patient's surgical risk, which verifies to a certain extent the effectiveness and feasibility of the system. In addition, postoperative patient satisfaction reached 92.00%, demonstrating the system's clear advantages in improving surgical efficacy and its clinical value.
Referring to fig. 3, in an alternative embodiment, the present invention further provides a computer-based surgical simulation assistance and navigation system comprising a medical image three-dimensional model reconstruction subsystem, a model calibration and guide subsystem, a surgical space positioning subsystem and a surgical planning and simulation subsystem, which are connected to one another so as to implement the specific steps of the embodiments of the computer-based surgical simulation assistance and navigation system provided by the present invention. The system is complete, objective and stable in structure, and improves the overall applicability and practical capability of the invention.
It should be noted that the above embodiments are only used to illustrate the technical solution of the present invention, but not to limit the technical solution of the present invention, and although the detailed description of the present invention is given with reference to the above embodiments, it should be understood by those skilled in the art that the technical solution described in the above embodiments may be modified or some or all technical features may be equivalently replaced, and these modifications or substitutions do not make the essence of the corresponding technical solution deviate from the scope of the technical solution of the embodiments of the present invention, and all the modifications or substitutions are included in the scope of the claims and the specification of the present invention.
Claims (10)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202510905594.9A CN120392290B (en) | 2025-07-02 | 2025-07-02 | A computer-based surgical simulation assistance and navigation system |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN120392290A true CN120392290A (en) | 2025-08-01 |
| CN120392290B CN120392290B (en) | 2025-09-19 |
Family
ID=96528680
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202510905594.9A Active CN120392290B (en) | 2025-07-02 | 2025-07-02 | A computer-based surgical simulation assistance and navigation system |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN120392290B (en) |
Citations (9)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6301495B1 (en) * | 1999-04-27 | 2001-10-09 | International Business Machines Corporation | System and method for intra-operative, image-based, interactive verification of a pre-operative surgical plan |
| WO2021227548A1 (en) * | 2020-05-12 | 2021-11-18 | 北京航空航天大学 | Digital guide system for mandibular osteotomy |
| US20230105822A1 (en) * | 2020-03-04 | 2023-04-06 | 360 Knee Systems Pty Ltd | Intraoperative guidance systems and methods |
| WO2024140645A1 (en) * | 2022-12-29 | 2024-07-04 | 北京和华瑞博医疗科技有限公司 | Prosthesis mounting assistant device and surgical navigation system |
| CN118453115A (en) * | 2024-05-20 | 2024-08-09 | 南通市传染病防治院(南通市第三人民医院) | Real-time image-guided surgery system |
| CN119302743A (en) * | 2024-09-30 | 2025-01-14 | 贵州医科大学附属医院 | A detection system to assist in the precise positioning of chest lesions |
| CN119679516A (en) * | 2024-12-25 | 2025-03-25 | 北京埃斯顿医疗科技有限公司 | Shoulder replacement surgery navigation system and method |
| WO2025077722A1 (en) * | 2023-10-09 | 2025-04-17 | 杭州键嘉医疗科技股份有限公司 | Orthopedic surgical robot system and orthopedic surgical robot operating method |
| CN120053074A (en) * | 2025-02-12 | 2025-05-30 | 大连理工大学附属中心医院(大连市中心医院) | Virtual navigation system based on mixed reality technology and application method thereof |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||