CN120189241A - Surgical robot and its control method and image display method - Google Patents
- Publication number
- CN120189241A (application number CN202311792471.6A)
- Authority
- CN
- China
- Prior art keywords
- component
- cursor
- motor
- endoscope
- detection device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B34/30—Surgical robots
- A61B34/35—Surgical robots for telesurgery
- A61B34/25—User interfaces for surgical systems
- A61B34/70—Manipulators specially adapted for use in surgery
- A61B90/361—Image-producing devices, e.g. surgical cameras
- A61B90/37—Surgical systems with images on a monitor during operation
Abstract
The application discloses a surgical robot, a control method thereof, and an image display method. The surgical robot includes an endoscope assembly, an instrument driver, a signal processing device, and a display device. The endoscope assembly includes a manual operating member connected to a transmission member. The instrument driver includes an input member, a motor, and a detection device. The input member is connected to the transmission member; when the manual operating member actuates the transmission member, the input member is driven by the transmission member and acts on the rotor of the motor, and the detection device detects the rotation angle of the rotor and generates a detection signal. The signal processing device establishes a cursor movement instruction in response to the detection signal. The display device, communicatively connected to the signal processing device, receives the cursor movement instruction and displays the position of the cursor after movement. The surgical robot thus allows the patient-side doctor to indicate positions on the surgical image, solving the problem that the image cannot be indicated while the master surgeon is in a clutched state or occupied with the surgical operation.
Description
Technical Field
The application relates to the technical field of medical treatment, in particular to a surgical robot, a control method thereof and an image display method.
Background
A surgical robot is a robot that can be remotely manipulated to complete a surgery. It comprises three components: a doctor console, a patient-side mechanical arm system, and an imaging system. The doctor console is provided with a display unit for showing the environment of the surgical instruments, an operation control mechanism for the doctor, and an armrest. The display unit has an observation window for the doctor; the motion of the operation control mechanism corresponds to the motion of the surgical instruments; and the armrest supports the doctor's arms. The doctor console also carries other control switches, convenient to touch or press by hand or foot, for commanding the corresponding parts of the patient-side mechanical arm system to perform various functional operations and complete the human-machine interaction. The imaging system displays the field-of-view images (such as the surgical image) acquired by the endoscope.
While such a teleoperated robot is in use and the master surgeon and the patient-side doctor discuss the surgical image, the master surgeon cannot mark the image while in a clutched state, and the patient-side doctor has no way to mark the image directly while the master surgeon is operating.
Disclosure of Invention
The application aims to disclose a surgical robot, a control method thereof and an image display method.
In a first aspect, the present application discloses a surgical robot. The surgical robot includes an endoscope assembly, an instrument driver, a signal processing device, and a display device. The endoscope assembly includes a manual operating member connected to a transmission member. The instrument driver includes an input member, a motor, and a detection device. The input member is connected to the transmission member; when the manual operating member actuates the transmission member, the input member can be driven by the transmission member and act on the rotor of the motor, and the detection device can detect the rotation angle of the rotor and generate a detection signal. The signal processing device is configured to establish a cursor movement instruction in response to the detection signal. The display device is communicatively connected to the signal processing device and is configured to receive the cursor movement instruction and display the position of the cursor after movement.
In some embodiments, the manual operating member includes a first manual operating member and a second manual operating member; the transmission member includes a first transmission member and a second transmission member; the input member includes a first input member and a second input member; the motor includes a first motor and a second motor; and the detection device includes a first detection device and a second detection device. The first manual operating member can be actuated by an external force and acts on the first transmission member; the first input member can be driven by the first transmission member and acts on the rotor of the first motor; and the first detection device can detect the rotation angle of the rotor of the first motor and generate a first detection signal. Likewise, the second manual operating member can be actuated by an external force and acts on the second transmission member; the second input member can be driven by the second transmission member and acts on the rotor of the second motor; and the second detection device can detect the rotation angle of the rotor of the second motor and generate a second detection signal. In response to the first detection signal sent by the first detection device, the signal processing device establishes a coordinate-axis conversion instruction for controlling the cursor and sends it to the display device; in response to the second detection signal sent by the second detection device, it establishes a numerical change instruction for the cursor and sends it to the display device. The cursor movement instruction includes the coordinate-axis conversion instruction and the numerical change instruction.
In some embodiments, the endoscope assembly further includes a third manual operating member and a third transmission member, the third manual operating member being connected to the third transmission member. The instrument driver includes a third input member, a third motor, and a third detection device. When the third manual operating member actuates the third transmission member, the third input member can be driven by the third transmission member and acts on the rotor of the third motor, and the third detection device is configured to detect the rotation angle of the rotor of the third motor and generate a third detection signal. In response to the third detection signal sent by the third detection device, the signal processing device establishes a position-marking instruction for the cursor and sends it to the display device.
In some embodiments, in response to a fourth detection signal sent by the third detection device, the signal processing device establishes a coordinate conversion instruction and, according to this instruction, converts the coordinates of the cursor in the endoscope coordinate system of the endoscope into coordinates in an absolute coordinate system.
In some embodiments, the endoscope assembly includes a housing with a through hole communicating the interior of the housing with the exterior of the endoscope assembly; the manual operating member includes an operating portion, a part of which extends out through the through hole.
In a second aspect, the present application discloses a control method for a surgical robot. The surgical robot includes an endoscope assembly, an instrument driver, a signal processing device, and a display device. The endoscope assembly includes a first manual operating member, a first transmission member actuated by the first manual operating member, a second manual operating member, and a second transmission member actuated by the second manual operating member. The instrument driver includes a first input member, a first motor, a first detection device, a second input member, a second motor, and a second detection device. The first input member can be driven by the first transmission member and acts on the rotor of the first motor; the first detection device can detect the rotation angle of the rotor of the first motor and generate a first detection signal; the second input member can be driven by the second transmission member and acts on the rotor of the second motor; and the second detection device can detect the rotation angle of the rotor of the second motor and generate a second detection signal. The signal processing device can receive the first and second detection signals, and the display device is communicatively connected to the signal processing device. The control method comprises the following steps:
the signal processing device receives the first detection signal, establishes a coordinate-axis conversion instruction for controlling the cursor, and sends it to the display device, whereupon the cursor of the display device is switched to the selected coordinate axis;
the signal processing device receives the second detection signal, establishes a numerical change instruction for the cursor, and sends it to the display device, whereupon the cursor of the display device moves along the direction of the selected coordinate axis.
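The two-instruction control scheme above can be sketched in code. This is an illustrative sketch, not part of the patent: the class name, the axis cycling order, and the degrees-to-distance gain are all assumptions made for the example.

```python
# Illustrative sketch of the two-instruction cursor control: the first
# detection signal selects a coordinate axis, the second moves the cursor
# along it. Class name, cycling order and gain are assumptions.

AXES = ("X", "Y", "Z")

class CursorController:
    def __init__(self):
        self.axis_index = 0              # selected axis, starting at X
        self.position = [0.0, 0.0, 0.0]  # cursor position in the endoscope frame

    def on_first_signal(self, steps: int) -> str:
        """Coordinate-axis conversion instruction: cycle to the next axis."""
        self.axis_index = (self.axis_index + steps) % len(AXES)
        return AXES[self.axis_index]

    def on_second_signal(self, angle_deg: float, gain: float = 0.1) -> list:
        """Numerical change instruction: move along the selected axis.
        gain maps rotor degrees to display units (an assumed scaling)."""
        self.position[self.axis_index] += gain * angle_deg
        return list(self.position)
```

Moving the cursor to an arbitrary point then amounts to alternating the two operations, once per axis.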
In some embodiments, the surgical robot further includes a third manual operating member and a third transmission member arranged on the endoscope assembly, and a third input member, a third motor, and a third detection device arranged on the instrument driver. The third manual operating member can be actuated by an external force and acts on the third transmission member; the third input member can be driven by the third transmission member and acts on the rotor of the third motor; and the third detection device can detect the rotation angle of the rotor of the third motor and generate a third detection signal. The control method further comprises: applying a third operation to the third manual operating member to generate the third detection signal, whereupon the signal processing device generates a marking instruction according to the third detection signal and marks the current position of the cursor of the display device.
In some embodiments, a fourth operation is applied to the third manual operating member so that the third detection device generates a fourth detection signal. The signal processing device establishes a coordinate conversion instruction in response, converts the coordinates of the cursor in the endoscope coordinate system of the endoscope into coordinates in an absolute coordinate system, and stores the cursor's coordinates in the absolute coordinate system.
In a third aspect, the present application discloses an image display method. The image display method comprises the following steps:
Acquiring a first field-of-view image captured by the endoscope, and displaying the first field-of-view image;
Displaying a three-dimensional coordinate system and a cursor on the first field-of-view image based on a first operation from the endoscope assembly, and highlighting the coordinate axis selected by the first operation;
Moving the cursor in the direction of the selected coordinate axis based on a second operation from the endoscope assembly.
In some embodiments, the first field of view image is displayed based on an endoscope coordinate system, and the initial position of the cursor is located at an origin of the endoscope coordinate system.
In some embodiments, the image display method further comprises: when the endoscope moves from the position corresponding to the first field-of-view image, acquiring a second field-of-view image captured by the endoscope at the moved position, and, based on the first operation, displaying the initial position of the cursor and highlighting the selected coordinate axis on the second field-of-view image.
In some embodiments, the second field of view image is displayed based on an endoscope coordinate system, and the initial position of the cursor is located at an origin of the endoscope coordinate system to which the second field of view image corresponds.
In some embodiments, the image display method further comprises displaying a position marker at the current position of the cursor based on a third operation from the endoscope assembly.
In some embodiments, the image display method further comprises acquiring a third field-of-view image captured by the endoscope and, if the position marker lies outside the third field-of-view image, displaying a direction identifier on it, the direction identifier characterizing the azimuthal relationship between the third field-of-view image and the position marker.
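The direction identifier in this last embodiment could be computed as follows. This is a hypothetical sketch, and the frame conventions (x to the right, y upward, extents measured from the image center) are assumptions, not taken from the patent.

```python
# Hypothetical sketch of the direction identifier: when the position marker
# falls outside the current field-of-view image, report which way it lies
# relative to the image center. Frame conventions are assumptions.

def direction_identifier(marker_xy, half_width, half_height):
    """Return None if the marker is inside the view, else a compass-style
    label describing where the marker lies relative to the image."""
    x, y = marker_xy
    horiz = "right" if x > half_width else "left" if x < -half_width else ""
    vert = "up" if y > half_height else "down" if y < -half_height else ""
    if not horiz and not vert:
        return None  # marker is visible; no identifier needed
    return "-".join(filter(None, (vert, horiz)))
```

The display device would then render the returned label as an arrow or tag at the edge of the image.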
With the surgical robot, its control method, and the image display method described above, the patient-side doctor can indicate positions on the surgical image even when the master surgeon is in a clutched state and cannot do so. Likewise, when the master surgeon is in the middle of the operation and the endoscope is not being controlled, any required indication can be made by the patient-side doctor; this solves the problem that the surgical image cannot be indicated while the master surgeon is clutched or operating. Moreover, the patient-side doctor completes the indication simply by operating the manual operating member: the operation is simple and convenient, it preserves the continuity of the procedure and hence of the communication, and no third party needs to be involved.
Drawings
FIG. 1 is a schematic view of a surgical robot;
FIG. 2 is a schematic view of the patient-side mechanical system of the surgical robot;
FIG. 3 is a schematic illustration of an assembly of an endoscope assembly and an instrument driver;
FIG. 4 is a cross-sectional view of the fitting shown in FIG. 3;
FIG. 5 is a schematic view of the endoscope assembly and instrument driver in an exploded view;
FIG. 6 is a partial cross-sectional view of an endoscope assembly;
FIG. 7 is a schematic illustration of the cursor after activation and before movement;
FIG. 8 is a schematic diagram after cursor movement;
FIG. 9 is a schematic illustration of marking selected locations;
FIG. 10 is a schematic diagram of the positional relationship between a position marker and a current field of view image;
FIG. 11 is a flowchart of a first embodiment of an image display method;
FIG. 12 is a flowchart of a second embodiment of an image display method;
FIG. 13 is a flowchart of a third embodiment of an image display method.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The embodiments described in the following exemplary embodiments do not represent all embodiments consistent with the present application. Rather, they are merely examples of apparatus consistent with aspects of the application as detailed in the accompanying claims.
The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. Unless defined otherwise, technical or scientific terms used herein should be given the ordinary meaning as understood by one of ordinary skill in the art to which this application belongs. The terms "first," "second," and the like in the description and in the claims, are not used for any order, quantity, or importance, but are used for distinguishing between different elements. Likewise, the terms "a" or "an" and the like do not denote a limitation of quantity, but rather denote the presence of at least one. "plurality" or "plurality" means two or more. Unless otherwise indicated, the terms "front," "rear," "lower," and/or "upper" and the like are merely for convenience of description and are not limited to one location or one spatial orientation. The word "comprising" or "comprises", and the like, means that elements or items appearing before "comprising" or "comprising" are encompassed by the element or item recited after "comprising" or "comprising" and equivalents thereof, and that other elements or items are not excluded. The terms "connected" or "connected," and the like, are not limited to physical or mechanical connections, but may include electrical connections, whether direct or indirect. As used in this specification and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
Referring to fig. 1, 2 and 3, a surgical robot is disclosed. The surgical robot includes a doctor console 30, a patient-side mechanical system 50, an image processing device 40, and a signal processing device. The doctor console 30 and the image processing device 40 may each include a display device. The patient-side mechanical system 50 may include an endoscope assembly 1, an endoscope 4, and an instrument driver 2. The signal processing device of the surgical robot may be a hardware device containing a dedicated software program, or a software-and-hardware device integrated into the doctor console 30, the patient-side mechanical system 50, or the image processing device. The patient-side mechanical system 50 includes a number of mechanical arms 501; each mechanical arm 501 comprises several connecting arms, adjacent connecting arms being movable relative to each other with a certain degree of freedom. The connecting arm at the distal end of the mechanical arm is a holding arm, on which the instrument driver 2 is mounted. The endoscope assembly 1 (also referred to as an endoscope adapter) is assembled with the endoscope 4. Referring again to fig. 3 and 4, in some cases the endoscope assembly 1 is isolated from the instrument driver 2 by a sterile adapter 3.
Referring to fig. 6 and 4, the endoscope assembly 1 includes a manual operating member 11 and a transmission member 12, connected to each other. The manner of connection is not limited, as long as the transmission member 12 is actuated when the manual operating member 11 is operated. In the embodiment of the present application, the transmission member 12 and the manual operating member 11 are connected by a connecting shaft 110. The structure of the manual operating member 11 is likewise not limited, provided it can drive the transmission member 12; in the embodiment of the present application, the manual operating member 11 is a knob.
Referring to fig. 4 and 5, the instrument driver 2 comprises an input member 21, a motor 22 and a detection device (not shown). Saying that the instrument driver 2 comprises the detection device means mainly that the part of the detection device responsible for detecting signals is arranged on the instrument driver 2; the parts responsible for signal transmission and the like may be arranged on the instrument driver 2 or elsewhere. The input member 21 is connected to the transmission member 12; in this embodiment, because of the sterile adapter 3, the input member 21 is connected to the transmission member 12 via an adapter transmission disk 31 of the sterile adapter 3. When the manual operating member 11 actuates the transmission member 12, the input member 21 is driven by the transmission member 12 and acts on the rotor of the motor 22. For example, when the manual operating member 11 is a knob, rotating the knob rotates the transmission member 12, which drives the input member 21, which in turn rotates the rotor of the motor 22. The detection device detects the rotation angle of the rotor of the motor 22 and generates a detection signal.
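As a concrete illustration of the detection step (not taken from the patent), a rotary encoder on the rotor might report counts that are converted into a signed rotation angle; the resolution value below is an assumption.

```python
# Illustrative sketch: converting raw encoder counts from the motor rotor
# into a signed rotation-angle detection signal.

COUNTS_PER_REV = 4096  # hypothetical counts per full rotor revolution

def rotation_angle_deg(prev_count: int, curr_count: int) -> float:
    """Signed rotor rotation angle, in degrees, between two encoder samples."""
    delta = (curr_count - prev_count) % COUNTS_PER_REV
    # Map the wrapped delta into (-half, half] so direction is preserved.
    if delta > COUNTS_PER_REV // 2:
        delta -= COUNTS_PER_REV
    return 360.0 * delta / COUNTS_PER_REV
```

The signal processing device would receive a stream of such angle values and accumulate them into cursor movement instructions.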
The signal processing means is configured to establish a cursor movement instruction in response to the detection signal. The display device is in communication connection with the signal processing device and is configured to receive the cursor movement instruction and display the moved position state of the cursor.
The procedure by which the patient-side doctor moves the cursor to indicate the surgical image is described below in connection with fig. 7 and 8:
When the master surgeon is in a clutched state (for example, during operation preparation) and cannot control the surgical robot to indicate a required position, the patient-side doctor can do so instead. First the patient-side doctor activates the cursor: the cursor may be activated and displayed when a position needs to be indicated, or it may be displayed whenever the endoscope 4 acquires a field-of-view image. In either case, once displayed, the cursor 5 may correspond to the point O1 shown in fig. 7. The patient-side doctor then operates the manual operating member 11; the manual operating member 11 actuates the transmission member 12, which drives the input member 21, which in turn rotates the rotor of the motor 22. The detection device detects the rotation angle of the rotor of the motor 22 and generates a detection signal, from which the signal processing device establishes a cursor movement instruction. After receiving the cursor movement instruction, the display device displays the position of the cursor 5 after movement; as can be seen by comparing fig. 8 with fig. 7, the cursor 5 has moved to point E.
When the master surgeon is adjusting the posture of a mechanical arm, the endoscope is not being controlled, and an indication is needed, the patient-side doctor can likewise operate the manual operating member 11 to adjust the position of the cursor 5 and thereby indicate the image.
In summary, when the master surgeon is in a clutched state and cannot indicate the surgical image, the patient-side doctor can indicate it instead. Likewise, when the master surgeon is in the middle of the operation and the endoscope is not being controlled, any required indication can be made by the patient-side doctor, which solves the problem that the master surgeon cannot indicate the surgical image in a clutched state or that the patient-side doctor cannot directly mark the image during the operation. On this basis, the patient-side doctor completes the indication simply by operating the manual operating member 11: the operation is simple and convenient, the continuity of the procedure and hence of the communication is preserved, and no third party needs to be involved.
Given the manner of operation described above, the structure of the manual operating member 11 is not limited; for example, the manual operating member 11 may be a rotatable lever.
Referring to fig. 4, 5 and 6: as shown in fig. 6, the manual operating member 11 includes a first manual operating member 111 and a second manual operating member 112. As shown in fig. 5, the transmission member 12 includes a first transmission member 121 and a second transmission member 122. The input member 21 includes a first input member 211 and a second input member (not shown); the connection of the first input member 211 and the second input member to the first transmission member 121 and the second transmission member 122, respectively, can be inferred from the number of adapter transmission disks 31 of the sterile adapter 3 in fig. 5. The motor 22 includes a first motor 221 and a second motor (not shown). In addition to the motor 22 associated with the manual operating member 11, the instrument driver includes an endoscope driving motor 231 and an endoscope input member 241 for driving the endoscope 4. Correspondingly, the sterile adapter 3 includes an adapter transmission disk 31 connected to the endoscope input member 241, and the endoscope assembly 1 includes an endoscope drive disk 13 connected to that adapter transmission disk 31. The endoscope drive disk 13 is connected to the endoscope 4, so that the endoscope driving motor 231 drives the endoscope input member 241 and, through the adapter transmission disk 31 and the endoscope drive disk 13, drives the endoscope 4. The detection device comprises a first detection device and a second detection device.
The first manual operating member 111 can be actuated by an external force and acts on the first transmission member 121; the first input member 211 can be driven by the first transmission member 121 and acts on the rotor of the first motor 221. The external force is the force used to manipulate the first manual operating member 111; for example, when the first manual operating member 111 is a knob, the external force is the force that rotates the knob. The first detection device can detect the rotation angle of the rotor of the first motor 221 and generate a first detection signal. The second manual operating member 112 can be actuated by an external force (as described above) and acts on the second transmission member 122; the second input member can be driven by the second transmission member 122 and acts on the rotor of the second motor; and the second detection device can detect the rotation angle of the rotor of the second motor and generate a second detection signal. In response to the first detection signal, the signal processing device establishes a coordinate-axis conversion instruction for controlling the cursor and sends it to the display device; this instruction is used to select a coordinate axis, for example the X-axis in fig. 7 and 8. In response to the second detection signal, the signal processing device establishes a numerical change instruction for the cursor and sends it to the display device; the numerical change instruction corresponds to the distance the cursor moves along the selected coordinate axis.
The cursor movement instruction includes the coordinate-axis conversion instruction and the numerical change instruction, so that the cursor reaches a position by selecting a coordinate axis and then moving along it. If the cursor is to be moved along all of the X, Y and Z axes, one implementation is to select each axis in turn and move the cursor along the selected axis, so that the cursor reaches the indicated position after three such operations.
In the above arrangement, the manual operating member 11 includes the first and second manual operating members 111 and 112, the transmission member 12 includes the first and second transmission members 121 and 122, the input member 21 includes the first and second input members, the motor 22 includes the first and second motors, and the detection device includes the first and second detection devices. The operation of indicating the surgical image by moving the cursor is thus realized by selecting a coordinate axis and moving along it; the operation is simple and convenient, the continuity of the procedure and of the communication is preserved, and no third party is needed.
In some embodiments, the endoscope assembly further includes a third manual operating member and a third transmission member, connected to each other, and the instrument driver includes a third motor, a third detection device, and a third input member. When the third manual operating member actuates the third transmission member, the third input member can be driven by the third transmission member and acts on the rotor of the third motor, and the third detection device can detect the rotation angle of the rotor of the third motor and generate a third detection signal. In response to the third detection signal, the signal processing device establishes a position-marking instruction for the cursor and sends it to the display device.
Referring to fig. 9 and comparing it with fig. 8, the above embodiment is described as follows. If the current position indicated by the cursor is to be saved or marked for later viewing or renewed communication, the position of the cursor is marked. Fig. 8 illustrates the case where the current position of the cursor 5 is not marked, and fig. 9 illustrates that the position corresponding to the cursor 5 has been marked by a black triangle. The manner in which the patient-side doctor and/or the master doctor is made aware that the position has been marked is not limited to the aforementioned change in the form of the cursor; a color change, for example, may also be used. The coordinates corresponding to the marked cursor are stored.
With this arrangement, the patient-side doctor operates the third manual control operation part to mark the current position, which facilitates renewed communication and the like.
Referring to fig. 8 and 9, the signal processing device responds to a fourth detection signal sent by the third detection device, establishes a coordinate conversion instruction for controlling the position of the cursor, and converts the coordinate of the cursor in the endoscope coordinate system of the endoscope into a coordinate in the absolute coordinate system according to the coordinate conversion instruction. The absolute coordinate system is a three-dimensional Cartesian coordinate system established with the base of the patient-side robot as its origin; this coordinate system is fixed relative to the base of the patient-side robot and does not move during normal operation. The endoscope coordinate system is a three-dimensional Cartesian coordinate system established with the center of the endoscope image as its origin; it is fixed at the lens of the endoscope, is fixed relative to the view image of the endoscope, and moves along with the endoscope. In fig. 8 and 9, coordinates in the endoscope coordinate system are denoted by E and coordinates in the absolute coordinate system by P, and the coordinates of the cursor 5 in the absolute coordinate system can be determined from the mapping relationship between the absolute coordinate system and the endoscope coordinate system. As the endoscope 4 moves, it generates a plurality of view images: for example, the current scene is referred to as the first scene, and the scenes after movement are referred to as the second scene, the third scene, and so on. The endoscope coordinate system of the first scene is E1, that of the second scene is E2, and that of the third scene is E3. In each scene, the endoscope coordinate system has a mapping relationship with the absolute coordinate system.
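A minimal sketch, under illustrative assumptions, of the mapping described above between the endoscope coordinate system E and the absolute coordinate system P: a rotation plus a translation carries a cursor coordinate from the current scene's endoscope frame into the absolute frame anchored at the base of the patient-side robot. The function name and the example numbers are assumptions for illustration, not the patent's implementation.

```python
# Map a cursor coordinate from the endoscope frame (E1, E2, ...) of the
# current scene into the absolute frame anchored at the patient-side robot's
# base. The rotation/translation of each scene would come from the robot's
# kinematics; the values below are illustrative assumptions.

def endoscope_to_absolute(rotation, translation, p_endo):
    """Apply p_abs = R * p_endo + t.

    rotation:    3x3 row-major rotation matrix of the endoscope frame.
    translation: (x, y, z) position of the endoscope-frame origin in the
                 absolute frame.
    p_endo:      (x, y, z) cursor coordinate in the endoscope frame.
    """
    return tuple(
        sum(rotation[i][j] * p_endo[j] for j in range(3)) + translation[i]
        for i in range(3)
    )

# Example: scene E2's frame is translated by (100, 50, 20) mm from the base,
# with no rotation.
IDENTITY = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
print(endoscope_to_absolute(IDENTITY, (100.0, 50.0, 20.0), (4.0, -2.0, 1.5)))
# -> (104.0, 48.0, 21.5)
```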
With this arrangement, the patient-side doctor controls the third manual control operation part to realize the coordinate conversion. The operation is convenient and its continuity is ensured, which in turn ensures the continuity of the communication process.
Referring to fig. 5 and 3 in combination with fig. 7, the endoscope assembly 1 (also referred to as an endoscope adapter) includes a housing 14, the housing 14 including a through hole 141, the through hole 141 communicating between the interior of the housing 14 and the exterior of the endoscope assembly 1. The manual operation member 11 includes an operation portion 113, and a portion of the operation portion 113 extends out of the through hole 141. The structure of the operation portion 113 is not limited to the illustrated ring of saw teeth; any structure by which the manual operation member 11 can be operated may be used.
Since the manual operation member 11 includes the operation portion 113 and a portion of the operation portion 113 protrudes from the through hole 141, the endoscope assembly 1 is compact and the manual operation member 11 is easy to manipulate.
In a second aspect, the application also discloses a control method of the surgical robot. The surgical robot comprises an endoscope assembly 1, an instrument driver 2, signal processing means and display means. The endoscope assembly 1 comprises a first manual operating member 111, a first transmission member 121 actuated by said first manual operating member 111, a second manual operating member 112 and a second transmission member 122 actuated by said second manual operating member 112. The instrument driver 2 comprises a first input member 211, a first motor 221, a first detection means, a second input member, a second motor and a second detection means. The first input member 211 is actuatable by the first transmission member 121 and acts upon the rotor of the first motor 221. The first detection device can detect the rotation angle of the rotor of the first motor and generate a first detection signal. The second input member is actuatable by the second transmission member 122 and acts on the rotor of the second motor, and the second detection means is operable to detect the angle of rotation of the rotor of the second motor, generating a second detection signal. The signal processing means is capable of receiving a first detection signal of the first detection device and a second detection signal of the second detection device. The display device is in communication with the signal processing device.
The control method of the surgical robot comprises the following steps:
A first operation is applied to the first manual operation section 111 to generate the first detection signal. The primary function of the first operation is to rotate the first motor 221 and ultimately generate the first detection signal; the first operation is determined according to the structure of the first manual operation member 111. In the present embodiment, the first manual operation member 111 is a knob, and the first operation is to rotate the first manual operation member 111. The signal processing device receives the first detection signal, establishes an instruction for controlling the coordinate axis conversion of the cursor, and sends the instruction to the display device, whereupon the cursor of the display device is switched to the selected coordinate axis. The coordinate axis conversion instruction is used to select one of the X axis, the Y axis, and the Z axis as described above.
A second operation is applied to the second manual operation section 112 to generate the second detection signal. For a description of the second operation, refer to the description of the first operation. The signal processing device receives the second detection signal, establishes a numerical value change instruction for the cursor, and sends it to the display device, whereupon the cursor of the display device moves along the direction of the selected coordinate axis. The numerical value change instruction of the cursor is as described above.
With the above arrangement, the patient-side doctor need only operate the first manual operation part 111 and the second manual operation part 112 to move the cursor to the position to be discussed. This solves the problem that the master doctor cannot indicate the operation screen in the clutch state, or that the patient-side doctor cannot directly indicate the screen while the master doctor is operating. Moreover, the control of the surgical robot is simple and convenient and is completed by the doctor beside the patient, so the continuity of the operation, and in turn the continuity of the communication process, can be ensured, and no third party needs to participate.
In a further embodiment, the surgical robot further comprises a third manual operation member, a third transmission member, a third input member, a third motor, and a third detection device. The third manual operation part and the third transmission part are arranged on the endoscope assembly, and the third manual operation part can be actuated by an external force and acts on the third transmission part. The third motor, the third detection device, and the third input component are arranged on the instrument driver; the third input component can be actuated by the third transmission component and acts on the rotor of the third motor, and the third detection device can detect the rotation angle of the rotor of the third motor and generate a third detection signal. The control method of the surgical robot further includes applying a third operation to the third manual operation part to generate the third detection signal, whereupon the signal processing device generates a marking instruction according to the third detection signal to mark the current position of the cursor of the display device. Comparing fig. 9 with fig. 8, fig. 9 can be understood as the display device displaying a mark at the current position of the cursor. The current cursor is filled in black so that the patient-side doctor and/or the master doctor knows that the current position of the cursor has been marked.
With this arrangement, the current position of the cursor is marked by operating the third manual control operation part. The control of the surgical robot is simple and convenient and is completed by the doctor beside the patient, so the continuity of the operation, and in turn the continuity of the communication process, can be ensured, and no third party needs to participate.
In a further embodiment, the control method of the surgical robot further includes applying a fourth operation to the third manual operation part to cause the third detection device to generate a coordinate conversion instruction. Although both the third operation and the fourth operation are applied to the third manual operation member, as long as the two operations differ from each other they can be distinguished by software. The signal processing device receives the coordinate conversion instruction, converts the coordinate of the cursor in the endoscope coordinate system of the endoscope into a coordinate in the absolute coordinate system according to the conversion relationship between the endoscope coordinate system and the absolute coordinate system, and stores the coordinate of the cursor in the absolute coordinate system.
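One conceivable way for software to distinguish the third and fourth operations applied to the same member is sketched below. The patent does not specify the discrimination criterion, so the choice of criterion (the detected rotation angle of the third motor's rotor) and the threshold value are purely assumptions for illustration.

```python
# Hypothetical discrimination of two operations applied to the same manual
# operation member: a small rotation of the third motor's rotor is read as the
# third operation (mark the cursor position), a large rotation as the fourth
# operation (coordinate conversion). Threshold is an illustrative assumption.

MARK_THRESHOLD_DEG = 90.0

def classify_third_member_operation(rotation_angle_deg):
    """Return the instruction implied by a detected rotor rotation angle."""
    if abs(rotation_angle_deg) < MARK_THRESHOLD_DEG:
        return "mark_position"        # third operation -> marking instruction
    return "coordinate_conversion"    # fourth operation -> conversion instruction

print(classify_third_member_operation(30.0))   # mark_position
print(classify_third_member_operation(180.0))  # coordinate_conversion
```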
As set forth above, through the fourth operation applied to the third manual operation member, the coordinates of the cursor in the endoscope coordinate system are converted into coordinates in the absolute coordinate system and stored. The control of the surgical robot is simple and convenient and is completed by the doctor beside the patient, so the continuity of the operation, and in turn the continuity of the communication process, can be ensured, and no third party needs to participate.
In a third aspect, with reference to fig. 11, an embodiment of the present application further discloses an image display method. The image display method comprises the following steps:
First, a first view image acquired by the endoscope 4 is obtained. In this step, the manner in which the image acquired by the endoscope 4 is transmitted to the display device is not limited; the acquired first view image may be as shown in fig. 7.
S1, based on a first operation from the endoscope assembly, displaying a three-dimensional coordinate system and a cursor on the first view image, wherein the coordinate axis selected by the first operation is highlighted. In this image display method, the endoscope assembly may or may not be the endoscope assembly shown in fig. 3 to 6. The highlighting may be implemented in any manner, as long as the selected coordinate axis is distinguishable from the other coordinate axes; for example, if the selected coordinate axis is the X axis, its extension and/or thickness may be made to differ from those of the Y axis and the Z axis to realize the highlighting.
S3, based on a second operation from the endoscope assembly, the cursor moves along the direction of the selected coordinate axis.
Comparing fig. 7 and 8, the cursor 5 moves from the initial position O1 to the point E along the selected coordinate axis. Of course, if movement along all three coordinate axes is required, the cursor must be moved along each coordinate axis in turn; if movement along only one coordinate axis is required, the cursor need only be moved along that axis, as described above.
With this arrangement, the image display method requires only the operation of the doctor beside the patient. The display is convenient and its continuity can be ensured, which in turn ensures the continuity of communication between the master doctor and the patient-side doctor.
Referring to fig. 7, the first view image is displayed based on an endoscope coordinate system, and the initial position of the cursor 5 is located at an origin O1 of the endoscope coordinate system. Of course, in some embodiments, the initial position may not be the origin O1.
As set forth above, since the initial position of the cursor 5 is located at the origin O1 of the endoscope coordinate system, the distance the cursor must move is shorter, and the subsequent coordinate conversion is more convenient and its algorithm simpler.
The endoscope moves, and accordingly new view images are generated (the first scene, second scene, third scene, and so on, as described previously). Therefore, the image display method further includes: when the endoscope moves from the position corresponding to the first view image, acquiring a second view image acquired by the endoscope at the position after movement, and, based on the first operation, displaying the initial position of the cursor on the second view image and highlighting the selected coordinate axis. Then, based on the second operation, the cursor is moved in the direction of the selected coordinate axis. The second view image is the view image generated after the endoscope moves, relative to the image before the endoscope moved. Different positions of the endoscope correspond to different scenes and generate different view images: in the first scene the endoscope coordinate system is E1, in the second scene it is E2, and so on up to the nth scene, in which the endoscope coordinate system is En. As can be understood from the foregoing, the second view image is displayed based on an endoscope coordinate system, and the initial position of the cursor 5 is located at the origin of the endoscope coordinate system to which the second view image corresponds. For example, if the current second view image is the view image of the second scene, the initial position of the cursor 5 is the origin of the endoscope coordinate system E2; if it is the view image of the third scene, the cursor 5 is located at the origin of the endoscope coordinate system E3.
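The per-scene behaviour above — a new endoscope coordinate system per scene, the cursor reset to that system's origin, and a stored absolute coordinate mapped back into the new scene — can be sketched as follows. For simplicity the sketch assumes the frames differ only by translation; the names and values are illustrative assumptions, not the patent's implementation.

```python
# Hypothetical per-scene sketch: each scene (E1, E2, ..., En) has its own
# endoscope coordinate system; the cursor starts at the scene's origin, and a
# coordinate stored in the absolute frame can be re-expressed in the current
# scene's frame. Assumes translation-only frames for simplicity.

def absolute_to_scene(scene_origin_abs, p_abs):
    """Express an absolute coordinate in a scene's endoscope frame."""
    return tuple(p - o for p, o in zip(p_abs, scene_origin_abs))

cursor_initial = (0.0, 0.0, 0.0)       # cursor starts at the scene's origin
scene2_origin = (100.0, 50.0, 20.0)    # origin of E2 in the absolute frame
stored_mark_abs = (104.0, 48.0, 21.5)  # a marked coordinate stored earlier
print(absolute_to_scene(scene2_origin, stored_mark_abs))  # (4.0, -2.0, 1.5)
```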
As set forth above, since the initial position of the cursor 5 is located at the origin of the endoscope coordinate system corresponding to the second view image, the distance the cursor must move is shorter, and the subsequent coordinate conversion is more convenient and its algorithm simpler.
Referring to fig. 12 in combination with fig. 9 and 8, the position marker 6 is displayed at the current position of the cursor based on a third operation from the endoscope assembly. The current position of the cursor in fig. 8 is not marked; the cursor in fig. 9 is filled in black and has been marked. In fig. 9, the displayed position mark is a black triangle for the patient-side doctor and/or the master doctor; of course, the form of the position mark is not limited, as long as the mark can be distinguished.
As set forth above, in the image display method, the cursor 5 is marked based on the third operation, which facilitates subsequent communication and discussion.
Referring to fig. 10 and 13, the image display method further includes S5: acquiring a third view image acquired by the endoscope and, in the case that the position mark 6 is outside the third view image, displaying a direction mark 7 on the third view image, wherein the direction mark 7 characterizes the azimuth relationship between the third view image and the position mark 6.
In this step, the third view image is a new view image generated after the endoscope is moved, and may be understood in the same way as the second view image. In the image display method, after the third view image is acquired, it is determined whether the position mark 6 is within the field of view of the third view image, as described in step S6. If it is, the position marker 6 is displayed directly. If not, then when the marked position needs to be discussed again, the field of view of the endoscope must be adjusted, and at this time it can be adjusted according to the direction mark 7.
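Steps S5 and S6 above can be sketched as follows: determine whether the stored position mark lies within the current field of view and, if not, compute a direction mark indicating which way the mark lies relative to the view. The coordinate convention, the field-of-view bounds, and the returned labels are illustrative assumptions.

```python
# Hypothetical sketch of the direction-mark logic: if the stored position mark
# falls outside the current (third) field-of-view image, report which edge(s)
# of the image the mark lies beyond so a direction mark 7 can be drawn there.
# Coordinates are taken relative to the image center; bounds are assumptions.

def direction_mark(mark_xy, view_half_width, view_half_height):
    """Return None if the mark is inside the view, else a (horizontal,
    vertical) pair such as ('left', 'up') pointing toward the mark."""
    x, y = mark_xy
    inside_x = -view_half_width <= x <= view_half_width
    inside_y = -view_half_height <= y <= view_half_height
    if inside_x and inside_y:
        return None  # step S6: mark is within the field of view, show it directly
    horizontal = None if inside_x else ("right" if x > view_half_width else "left")
    vertical = None if inside_y else ("up" if y > view_half_height else "down")
    return (horizontal, vertical)

print(direction_mark((0.0, 0.0), 10.0, 8.0))     # None
print(direction_mark((15.0, 0.0), 10.0, 8.0))    # ('right', None)
print(direction_mark((-20.0, 12.0), 10.0, 8.0))  # ('left', 'up')
```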
Since the direction mark displayed on the third view image characterizes the azimuth relationship between the third view image and the position mark 6, the field of view of the endoscope can be conveniently adjusted.
The foregoing description of the preferred embodiments of the application is not intended to be limiting, but rather is intended to cover all modifications, equivalents, alternatives, and improvements that fall within the spirit and scope of the application.
Claims (14)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202311792471.6A CN120189241A (en) | 2023-12-22 | 2023-12-22 | Surgical robot and its control method and image display method |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN120189241A true CN120189241A (en) | 2025-06-24 |
Family
ID=96067664
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202311792471.6A Pending CN120189241A (en) | 2023-12-22 | 2023-12-22 | Surgical robot and its control method and image display method |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN120189241A (en) |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||