
WO2014020943A1 - Endoscope system - Google Patents

Endoscope system

Info

Publication number
WO2014020943A1
WO2014020943A1 (PCT/JP2013/060189)
Authority
WO
WIPO (PCT)
Prior art keywords
illumination light
light
endoscope
irradiation
unit
Prior art date
Application number
PCT/JP2013/060189
Other languages
French (fr)
Japanese (ja)
Inventor
Yuji Sakai (酒井 悠次)
Original Assignee
Olympus Medical Systems Corp. (オリンパスメディカルシステムズ株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Medical Systems Corp.
Publication of WO2014020943A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00002 Operational features of endoscopes
    • A61B 1/00057 Operational features of endoscopes provided with means for testing or calibration
    • A61B 1/00163 Optical arrangements
    • A61B 1/00172 Optical arrangements with means for scanning
    • A61B 1/06 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes, with illuminating arrangements
    • A61B 1/07 Instruments as above with illuminating arrangements using light-conductive means, e.g. optical fibres
    • A61B 1/0638 Instruments as above with illuminating arrangements providing two or more wavelengths

Definitions

  • the present invention relates to an endoscope system, and more particularly to an endoscope system that acquires an image by scanning a subject.
  • in a scanning endoscope, a subject is scanned along a preset scanning pattern by swinging the tip of an illumination fiber that guides illumination light emitted from a light source unit. The return light received by a light receiving fiber is separated into its color components, and an image of the subject is generated using the resulting signals.
  • Japanese National Publication of International Patent Application No. 2010-515947 discloses a calibration method in which a multicolor image of a multicolor calibration pattern is acquired using a scanning beam device, each color component of the acquired multicolor image is compared with the corresponding color component of a display of the multicolor calibration pattern, and the scanning beam device is calibrated based on the comparison result.
  • in such a scanning endoscope, the irradiation position of the illumination light actually irradiated onto the subject may deviate from the ideal irradiation position along the predetermined scanning pattern. This can cause distortion in the image generated in response to the illumination light irradiation.
  • the present invention has been made in view of the above-described circumstances, and an object thereof is to provide an endoscope system capable of accurately calibrating distortion of an image acquired using a scanning endoscope.
  • an endoscope system according to one aspect of the present invention includes: an endoscope provided with a light guide member that guides illumination light emitted from a light source, and a drive unit capable of swinging the light guide member so that the irradiation position of the illumination light irradiated onto a subject through the light guide member draws a trajectory corresponding to a predetermined scanning pattern; a coordinate information acquisition unit that acquires coordinate information capable of detecting the irradiation position of the illumination light emitted from the endoscope; a comparison unit that compares the irradiation position obtained when the illumination light is irradiated along the predetermined scanning pattern with the irradiation position of the illumination light detected based on the coordinate information; a determination unit that determines, based on the comparison result of the comparison unit, whether or not the irradiation range of the illumination light irradiated from the endoscope satisfies a predetermined angle of view; and a control unit that performs control for adjusting a drive signal supplied to the drive unit when a determination result that the irradiation range of the illumination light does not satisfy the predetermined angle of view is obtained by the determination unit, and that performs a process for detecting a deviation amount between the trajectory of the irradiation position along the predetermined scanning pattern and the trajectory drawn by the irradiation position detected based on the coordinate information when a determination result that the irradiation range of the illumination light satisfies the predetermined angle of view is obtained by the determination unit.
  • the drawings further include a diagram for explaining the temporal displacement of the illumination light irradiation coordinates from the point YMAX to the point SA when illumination light is irradiated onto a virtual XY plane as in FIG. 2, and a flowchart showing an example of the processing performed by the endoscope system according to the embodiment of the present invention.
  • FIG. 1 is a diagram illustrating a configuration of a main part of an endoscope system according to an embodiment of the present invention.
  • an endoscope system 1 includes a scanning endoscope 2 that is inserted into a body cavity of a subject, a main body device 3 to which the endoscope 2 is connected, and a monitor 4 connected to the main body device 3.
  • the endoscope 2 includes an insertion portion 11 formed with an elongated shape and flexibility that can be inserted into a body cavity of a subject.
  • a connector (not shown) or the like for detachably connecting the endoscope 2 to the main body device 3 is provided at the proximal end portion of the insertion portion 11.
  • an illumination fiber 12 having a function as a light guide member for guiding the illumination light supplied from the light source unit 21 of the main body device 3 to the objective optical system 14, and a light receiving fiber 13 that receives the return light from the subject and guides it to the detection unit 23 of the main body device 3, are respectively inserted through the portion from the proximal end portion to the distal end portion of the insertion portion 11.
  • the end including the light incident surface of the illumination fiber 12 is disposed in the multiplexer 32 provided inside the main body device 3. Further, the end portion including the light emission surface of the illumination fiber 12 is disposed, without being fixed by a fixing member or the like, in the vicinity of the light incident surface of the lens 14a provided at the distal end portion of the insertion portion 11.
  • the end including the light incident surface of the light receiving fiber 13 is fixedly disposed around the light emitting surface of the lens 14b at the distal end surface of the distal end portion of the insertion portion 11. Further, the end including the light emitting surface of the light receiving fiber 13 is disposed in the demultiplexer 36 provided inside the main body device 3.
  • the objective optical system 14 includes a lens 14a into which illumination light from the illumination fiber 12 is incident, and a lens 14b that emits illumination light that has passed through the lens 14a to a subject.
  • An actuator 15 that is driven based on a drive signal output from the driver unit 22 of the main body device 3 is attached to the middle portion of the illumination fiber 12 on the distal end side of the insertion portion 11.
  • FIG. 2 is a diagram illustrating an example of a virtual XY plane set on the surface of the subject.
  • the point SA on the XY plane in FIG. 2 indicates the intersection of the insertion axis of the insertion portion 11 with the page when the insertion axis is virtually set to extend in the direction from the front side toward the back side of the page.
  • the X-axis direction on the XY plane in FIG. 2 is set as a direction from the left side to the right side of the drawing.
  • the Y-axis direction in the XY plane of FIG. 2 is set as a direction from the lower side to the upper side of the drawing.
  • the X axis and the Y axis constituting the XY plane of FIG. 2 intersect at the point SA.
  • the actuator 15 includes an X-axis actuator (not shown) that swings the end including the light emitting surface of the illumination fiber 12 in the X-axis direction based on the first drive signal output from the driver unit 22 of the main body device 3, and a Y-axis actuator (not shown) that swings the end including the light emitting surface of the illumination fiber 12 in the Y-axis direction based on the second drive signal output from the driver unit 22 of the main body device 3. The end including the light emitting surface of the illumination fiber 12 is swung in a spiral shape around the point SA in accordance with the operations of the X-axis actuator and the Y-axis actuator.
  • the endoscope 2 further includes a memory 16 in which endoscope information including various information related to the endoscope 2 is stored in advance.
  • the endoscope information stored in the memory 16 is read by the controller 25 of the main body device 3 when the endoscope 2 and the main body device 3 are connected.
  • the main body device 3 includes a light source unit 21, a driver unit 22, a detection unit 23, a memory 24, and a controller 25.
  • the light source unit 21 includes a light source 31a, a light source 31b, a light source 31c, and a multiplexer 32.
  • the light source 31a includes, for example, a laser light source and the like, and is configured to emit light in a red wavelength band (hereinafter also referred to as R light) to the multiplexer 32 when turned on under the control of the controller 25.
  • the light source 31b includes, for example, a laser light source, and is configured to emit light in a green wavelength band (hereinafter also referred to as G light) to the multiplexer 32 when turned on under the control of the controller 25.
  • the light source 31c includes, for example, a laser light source, and is configured to emit light in a blue wavelength band (hereinafter also referred to as B light) to the multiplexer 32 when turned on under the control of the controller 25.
  • the multiplexer 32 is configured so that the R light emitted from the light source 31a, the G light emitted from the light source 31b, and the B light emitted from the light source 31c can be multiplexed and supplied to the light incident surface of the illumination fiber 12.
  • the driver unit 22 includes a signal generator 33, digital-to-analog (hereinafter referred to as D/A) converters 34a and 34b, and an amplifier 35.
  • the signal generator 33 generates, as a first drive signal for swinging the end including the light emitting surface of the illumination fiber 12 in the X-axis direction, a signal having a predetermined waveform as shown in FIG. 3, for example, and outputs it to the D/A converter 34a.
  • FIG. 3 is a diagram illustrating an example of a signal waveform of a first drive signal supplied to an actuator provided in the endoscope.
  • the signal generator 33 generates, based on the control of the controller 25, as a second drive signal for swinging the end including the light emitting surface of the illumination fiber 12 in the Y-axis direction, a signal having a waveform in which the phase of the first drive signal is shifted by 90° as shown in FIG. 4, for example, and outputs it to the D/A converter 34b.
  • FIG. 4 is a diagram illustrating an example of a signal waveform of a second drive signal supplied to an actuator provided in the endoscope.
  • the D/A converter 34a is configured to convert the digital first drive signal output from the signal generator 33 into an analog first drive signal and output it to the amplifier 35.
  • the D/A converter 34b is configured to convert the digital second drive signal output from the signal generator 33 into an analog second drive signal and output it to the amplifier 35.
  • the amplifier 35 is configured to amplify the first and second drive signals output from the D/A converters 34a and 34b and output the amplified signals to the actuator 15.
  • the amplitude value (signal level) of the first drive signal illustrated in FIG. 3 gradually increases from the time T1 at which the minimum value is reached, and gradually decreases after reaching the maximum value at time T2. At time T3, it becomes the minimum value again.
  • the amplitude value (signal level) of the second drive signal illustrated in FIG. 4 gradually increases from the time T1 at which the minimum value is reached, and gradually decreases after reaching the maximum value near the time T2. Then, it becomes the minimum value again at time T3.
  • FIG. 5A is a diagram for explaining the temporal displacement of the illumination light irradiation coordinates from the point SA to the point YMAX when illumination light is irradiated onto a virtual XY plane as shown in FIG. 2.
  • FIG. 5B is a diagram for explaining the temporal displacement of the illumination light irradiation coordinates from the point YMAX to the point SA when illumination light is irradiated onto a virtual XY plane as shown in FIG. 2.
  • at time T1, illumination light is irradiated onto a position corresponding to the point SA on the surface of the subject. As the amplitude values of the first and second drive signals increase from time T1 to time T2, the irradiation coordinates of the illumination light on the surface of the subject trace a first spiral trajectory outward from the point SA, and the illumination light reaches the point YMAX, the outermost point of the illumination light irradiation coordinates on the surface of the subject, at time T2.
  • as the amplitude values of the first and second drive signals decrease from time T2 to time T3, the illumination light irradiation coordinates on the surface of the subject trace a second spiral trajectory inward starting from the point YMAX, and the illumination light is again irradiated onto the point SA on the surface of the subject at time T3.
  • based on the first and second drive signals supplied from the driver unit 22, the actuator 15 can swing the end portion including the light emitting surface of the illumination fiber 12 so that the irradiation position of the illumination light applied to the subject through the objective optical system 14 draws a trajectory corresponding to the spiral scanning pattern illustrated in FIGS. 5A and 5B.
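As an illustrative sketch only, and not part of the disclosure, the relationship between the two drive signals and the spiral trajectory described above can be modeled in a few lines of Python; the function name, the linear amplitude envelope, and all numeric values are assumptions introduced here for illustration:

```python
import math

def drive_signals(t, t1, t2, t3, f, a_max):
    """Sample the two actuator drive signals at time t (hypothetical model).

    The amplitude envelope is assumed to rise linearly from ~0 at T1 to
    a_max at T2, then fall back toward 0 at T3; the Y signal is the X
    signal phase-shifted by 90 degrees, so the irradiation position
    traces a spiral that grows outward and then rewinds (FIGS. 5A/5B).
    """
    if t <= t2:
        env = a_max * (t - t1) / (t2 - t1)   # outward sweep, T1 -> T2
    else:
        env = a_max * (t3 - t) / (t3 - t2)   # inward sweep, T2 -> T3
    x = env * math.sin(2 * math.pi * f * t)
    y = env * math.sin(2 * math.pi * f * t + math.pi / 2)  # 90-degree shift
    return x, y
```

Because the Y signal is the X signal shifted by 90°, the instantaneous radius equals the amplitude envelope, so the locus grows outward from the point SA until time T2 and rewinds toward it by time T3.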
  • the detection unit 23 includes a demultiplexer 36, detectors 37a, 37b, and 37c, and analog-to-digital (hereinafter referred to as A/D) converters 38a, 38b, and 38c.
  • the demultiplexer 36 includes a dichroic mirror and the like, and is configured to separate the return light emitted from the light emitting surface of the light receiving fiber 13 into light of each of the R (red), G (green), and B (blue) color components and to emit the separated light components.
  • the detector 37a is configured to detect the intensity of the R light output from the demultiplexer 36, generate an analog R signal corresponding to the detected intensity of the R light, and output the analog R signal to the A/D converter 38a.
  • the detector 37b is configured to detect the intensity of the G light output from the demultiplexer 36, generate an analog G signal corresponding to the detected intensity of the G light, and output the analog G signal to the A/D converter 38b.
  • the detector 37c is configured to detect the intensity of the B light output from the demultiplexer 36, generate an analog B signal corresponding to the detected intensity of the B light, and output the analog B signal to the A/D converter 38c.
  • the A / D converter 38a is configured to convert the analog R signal output from the detector 37a into a digital R signal and output it to the controller 25.
  • the A / D converter 38b is configured to convert the analog G signal output from the detector 37b into a digital G signal and output it to the controller 25.
  • the A / D converter 38c is configured to convert the analog B signal output from the detector 37c into a digital B signal and output it to the controller 25.
  • the memory 24 stores a control program for controlling the main body device 3 in advance, and also stores image correction information obtained as a result of processing by the controller 25. Details of the image correction information will be described later.
  • in the memory 24, table data TBD is stored in advance that indicates the correspondence between the irradiation position (coordinate position) obtained when the illumination light supplied from the light source unit 21 is irradiated along an ideal (spiral) scanning pattern as shown in FIGS. 5A and 5B and the irradiation time (elapsed time) at that position. In other words, for an arbitrary coordinate position on the ideal scanning pattern, the table data TBD can specify at which time, within the period from the time at which the coordinate position corresponding to the point SA is irradiated up to the time T3, that coordinate position is irradiated with the illumination light.
  • the table data TBD is configured as data corresponding to each wavelength band light (R light, G light, and B light) supplied from the light source unit 21, for example.
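A minimal sketch of what the table data TBD could look like, assuming an outward spiral whose radius grows linearly with time; the function name, sampling scheme, and parameters are hypothetical and not taken from the disclosure:

```python
import math

def build_table_tbd(times, f, a_max, t1, t2):
    """Hypothetical sketch of the table data TBD: for each sampled
    irradiation time between T1 and T2, record the ideal (x, y)
    irradiation coordinate on the outward spiral, relative to the
    point SA at (0, 0)."""
    table = {}
    for t in times:
        r = a_max * (t - t1) / (t2 - t1)     # radius grows linearly with time
        theta = 2 * math.pi * f * (t - t1)   # angle advances with time
        table[t] = (r * math.cos(theta), r * math.sin(theta))
    return table
```

Such a table, built once per wavelength band, lets the controller look up the ideal coordinate position for any irradiation time, which is the direction of lookup used in the processing described below.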
  • the controller 25 is configured to read a control program stored in the memory 24 and to control the light source unit 21 and the driver unit 22 based on the read control program.
  • the controller 25 operates so as to store the endoscope information output from the memory 16 in the memory 24 when the insertion unit 11 is connected to the main body device 3.
  • the controller 25 is configured to generate an image for one frame based on the R signal, the G signal, and the B signal output from the detection unit 23 in a period corresponding to the time T1 to the time T2. Further, the controller 25 is configured to generate an image for one frame based on the R signal, the G signal, and the B signal output from the detection unit 23 during a period corresponding to the time T2 to the time T3.
  • the controller 25 performs image correction processing based on the image correction information on the image of each frame, and causes the monitor 4 to display the corrected images obtained by the image correction processing at a predetermined frame rate.
  • the controller 25 acquires the image correction information by performing processing described later based on the table data TBD stored in the memory 24 and information on the coordinate position output from the light irradiation coordinate detection module 101 (hereinafter also referred to as coordinate information), and stores the acquired image correction information in the memory 24.
  • the controller 25 is configured to be able to at least temporarily hold coordinate information output from the light irradiation coordinate detection module 101.
  • the light irradiation coordinate detection module 101, which has a function as a coordinate information acquisition unit, includes a position detection element (PSD) and the like, detects the position at which the illumination light emitted through the objective optical system 14 is received, and outputs the detected position as coordinate information.
  • the coordinate position of the point SA on the XY plane exemplified in FIGS. 2, 5A and 5B is set in advance to be (0, 0).
  • the coordinate information output from the light irradiation coordinate detection module 101 is a relative coordinate position based on the coordinate position (0, 0) of the point SA on the XY plane illustrated in FIGS. 2, 5A, and 5B. It is information which shows.
  • the controller 25 is configured to detect, as a coordinate position, the irradiation position of the illumination light irradiated in a spiral shape from the endoscope 2, based on the coordinate information output from the light irradiation coordinate detection module 101 configured as described above.
  • the surgeon or the like connects the endoscope 2 and the monitor 4 to the main body device 3, arranges the light irradiation coordinate detection module 101 at a position facing the distal end surface of the endoscope 2, and further sets the coordinate information output from the light irradiation coordinate detection module 101 to be input to the controller 25 of the main body device 3.
  • at a timing immediately after the endoscope information read from the memory 16 is stored in the memory 24, the controller 25 controls the light source unit 21 to switch the light source 31b from off to on while keeping the light sources 31a and 31c off, and controls the driver unit 22 to output the first and second drive signals to the actuator 15. Under such control of the controller 25, the G light is irradiated onto the surface of the light irradiation coordinate detection module 101, and coordinate information corresponding to the positions where the G light is received is sequentially output from the light irradiation coordinate detection module 101.
  • FIG. 6 is a flowchart illustrating an example of processing performed by the endoscope system according to the embodiment of the present invention.
  • the controller 25 performs a process of extracting, from the coordinate information output from the light irradiation coordinate detection module 101, the coordinate positions corresponding to one or more predetermined irradiation times included in the table data TBD (step S1 in FIG. 6).
  • specifically, before performing the process of step S1 in FIG. 6, the controller 25 monitors the signal waveforms of the first and second drive signals output from the driver unit 22, sets the time T1 at the timing when the amplitude values (signal levels) of the signal waveforms become minimum, and, by referring to the table data TBD based on the set time T1, specifies the time TXMAX at which the coordinate position corresponding to the point XMAX in FIG. 5A is irradiated, the time TYMIN at which the coordinate position corresponding to the point YMIN is irradiated, the time TXMIN at which the coordinate position corresponding to the point XMIN is irradiated, and the time TYMAX at which the coordinate position corresponding to the point YMAX is irradiated.
  • the controller 25 then extracts, from the coordinate information output from the light irradiation coordinate detection module 101, the coordinate position XA indicating the position where the G light is actually received at the time TXMAX, the coordinate position YB indicating the position where the G light is actually received at the time TYMIN, the coordinate position XB indicating the position where the G light is actually received at the time TXMIN, and the coordinate position YA indicating the position where the G light is actually received at the time TYMAX. That is, the coordinate positions XA, XB, YA and YB corresponding to the four irradiation positions located on the outermost periphery of the spiral scanning pattern can be extracted from the irradiation positions at the times when the light irradiation coordinate detection module 101 serving as the subject is actually irradiated with the G light.
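The extraction of step S1 might be sketched as follows; the nearest-timestamp matching, the sample data, and the function names are assumptions made for illustration, not the patented method itself:

```python
def nearest_sample(coord_info, t_query):
    """coord_info: list of (time, (x, y)) samples output by the PSD.
    Return the measured coordinate whose timestamp is closest to the
    queried ideal irradiation time taken from the table data TBD."""
    _, xy = min(coord_info, key=lambda s: abs(s[0] - t_query))
    return xy

def extract_outermost(coord_info, t_xmax, t_ymin, t_xmin, t_ymax):
    """Step S1 sketch: pick the measured positions XA, YB, XB and YA at
    the times when the ideal spiral passes through XMAX, YMIN, XMIN and
    YMAX on its outermost turn."""
    return (nearest_sample(coord_info, t_xmax),   # XA, expected near XMAX
            nearest_sample(coord_info, t_ymin),   # YB, expected near YMIN
            nearest_sample(coord_info, t_xmin),   # XB, expected near XMIN
            nearest_sample(coord_info, t_ymax))   # YA, expected near YMAX
```

The four returned points are the measured counterparts of the four ideal outermost positions and feed the comparison of step S2.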
  • the controller 25 performs a process of comparing each coordinate position extracted in step S1 of FIG. 6 with the coordinate position of the table data TBD corresponding to the predetermined one or more irradiation times (step S2 in FIG. 6), and then, based on the comparison result, performs a process of determining whether or not the irradiation range of the G light irradiated onto the light irradiation coordinate detection module 101 satisfies a predetermined angle of view intended when the endoscope 2 was designed (step S3 in FIG. 6).
  • the predetermined angle of view is assumed to be a value (for example, 90 degrees) that substantially coincides with the illumination light irradiation range obtained when the illumination light having passed through the objective optical system 14 is irradiated along an ideal scanning pattern as shown in FIG. 5A (and FIG. 5B).
  • due to waveform distortion or the like occurring during the swinging operation of the X-axis actuator and the Y-axis actuator of the actuator 15, the end including the light emitting surface of the illumination fiber 12 may not be swung accurately, and a situation may occur in which the irradiation position of the illumination light irradiated through the objective optical system 14 deviates from the ideal scanning pattern.
  • FIG. 7 is a diagram illustrating an example of a deviation that occurs between an ideal illumination light irradiation range and an actual illumination light irradiation range.
  • for example, as shown in FIG. 7, while the coordinate positions XA and XB extracted in step S1 of FIG. 6 coincide with the coordinate positions of the points XMAX and XMIN included in the ideal irradiation position trajectory drawn by the solid line in FIG. 7, the coordinate positions YA and YB extracted in step S1 of FIG. 6 may deviate such that they do not coincide with the coordinate positions of the points YMAX and YMIN included in the ideal irradiation position trajectory.
  • the controller 25 performs, in step S2 of FIG. 6, a process of comparing the coordinate position XA with the coordinate position of the point XMAX, the coordinate position YB with the coordinate position of the point YMIN, the coordinate position XB with the coordinate position of the point XMIN, and the coordinate position YA with the coordinate position of the point YMAX, based on each coordinate position extracted in step S1 of FIG. 6 and the coordinate positions of the table data TBD stored in the memory 24. In this way, it is possible to detect a deviation between the ideal G light irradiation position and the actual G light irradiation position at each of the four irradiation positions located on the outermost periphery of the spiral scanning pattern.
  • the controller 25 performs, in step S3 of FIG. 6, a process of detecting whether or not the coordinate position XA coincides with the coordinate position of the point XMAX, the coordinate position YB coincides with the coordinate position of the point YMIN, the coordinate position XB coincides with the coordinate position of the point XMIN, and the coordinate position YA coincides with the coordinate position of the point YMAX. In this way, it is possible to detect whether or not the ideal G light irradiation positions at the four irradiation positions located on the outermost periphery of the spiral scanning pattern coincide with the actual G light irradiation positions.
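A hedged sketch of the determination of step S3, assuming a simple per-point tolerance test; the tolerance value and the data layout are hypothetical, since the disclosure only speaks of the positions coinciding:

```python
import math

def view_angle_satisfied(measured, ideal, tol=0.01):
    """Step S3 sketch: the predetermined angle of view is judged to be
    satisfied when each of the four outermost measured positions (XA,
    XB, YA, YB) coincides with its ideal counterpart (XMAX, XMIN,
    YMAX, YMIN) within a small tolerance."""
    return all(
        math.hypot(m[0] - i[0], m[1] - i[1]) <= tol
        for m, i in zip(measured, ideal)
    )
```

When this test fails, the controller would proceed to the drive-signal adjustment of step S4; when it passes, processing moves on to step S5.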
  • the controller 25, upon obtaining a determination result that the irradiation range of the G light irradiated onto the light irradiation coordinate detection module 101 does not satisfy the predetermined angle of view intended when the endoscope 2 was designed, performs control on the driver unit 22 for adjusting the first and/or second drive signals supplied to the actuator 15 (step S4 in FIG. 6).
  • specifically, when the controller 25 detects, based on the processing results of steps S2 and S3 in FIG. 6, that the Y coordinate value of the coordinate position YA falls short of the Y coordinate value of the coordinate position corresponding to the point YMAX as shown in FIG. 7, for example, the controller 25 controls the driver unit 22 so as to increase the amplitude value (signal level) of the second drive signal supplied to the actuator 15 from the current amplitude value (signal level).
  • control that increases or decreases the amplitude value (signal level) of the drive signal supplied to the actuator 15 from the current amplitude value (signal level) is performed in step S4 in FIG.
  • control for changing the phase of at least one of the drive signals so that the phase difference between the first and second drive signals supplied to the actuator 15 becomes 90° may also be performed in step S4 of FIG. 6.
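One simple way the amplitude adjustment of step S4 could be realized is a proportional correction; this is an assumption for illustration, as the disclosure does not specify the adjustment rule:

```python
def adjust_amplitude(current_amp, measured_extent, ideal_extent):
    """Step S4 sketch: when the measured scan extent along one axis
    falls short of (or overshoots) the ideal extent, scale that axis's
    drive-signal amplitude proportionally before retrying steps S1-S3."""
    if measured_extent <= 0:
        raise ValueError("no measurable scan extent")
    return current_amp * (ideal_extent / measured_extent)
```

For instance, if the measured YA extent reaches only 80% of YMAX, the second drive signal's amplitude would be scaled up by a factor of 1.25 and the measurement repeated.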
  • in step S3 of FIG. 6, the processing from step S1 to step S4 in FIG. 6 is repeated until the controller 25 obtains a determination result that the coordinate position XA coincides with the coordinate position of the point XMAX, the coordinate position YB coincides with the coordinate position of the point YMIN, the coordinate position XB coincides with the coordinate position of the point XMIN, and the coordinate position YA coincides with the coordinate position of the point YMAX, that is, a determination result that the irradiation range of the G light irradiated onto the light irradiation coordinate detection module 101 satisfies the predetermined angle of view intended when the endoscope 2 was designed.
  • when the controller 25 obtains, in step S3 of FIG. 6, a determination result that the irradiation range of the G light irradiated onto the light irradiation coordinate detection module 101 satisfies the predetermined angle of view intended when the endoscope 2 was designed, the controller 25 performs the processing from step S5 onward in FIG. 6 while maintaining the control performed on the driver unit 22 at the timing when the determination result was obtained.
  • the processing from step S1 to step S4 in FIG. 6 is not limited to the case where the light irradiation coordinate detection module 101 is irradiated with G light, and can be performed in substantially the same manner even when the light irradiation coordinate detection module 101 is irradiated with R light or B light.
  • the controller 25 performs a process of extracting coordinate positions corresponding to each irradiation position for one frame from the coordinate information output from the light irradiation coordinate detection module 101 (step S5 in FIG. 6).
  • the controller 25 performs, based on each coordinate position extracted in step S5 of FIG. 6 and the table data TBD stored in the memory 24, a process for acquiring G image correction information used for correcting the G image generated according to the return light received as the subject is irradiated with the G light (step S6 in FIG. 6).
  • FIG. 8 is a diagram illustrating an example of a deviation that occurs between the locus of the ideal illumination light irradiation position and the locus of the actual illumination light irradiation position.
  • specifically, the controller 25 calculates, for example, the positional deviation amount GZL between each coordinate position extracted in step S5 of FIG. 6 and the corresponding coordinate position of the table data TBD, as illustrated in FIG. 8.
  • The controller 25 then performs, for example, interpolation processing based on each deviation amount GZL calculated as described above, and can thereby acquire G image correction information including correction amounts for correcting the positional deviation of all the pixels of the G image generated in response to the return light received when the subject is irradiated with the G light.
  • The process of detecting the amount of deviation between the trajectory of the ideal irradiation position and the trajectory of the actual irradiation position and acquiring image correction information based on the detected deviation amount is not limited to the processing of steps S5 and S6; other processing may be performed.
  • The controller 25 stores the G image correction information acquired in step S6 of FIG. 6 in the memory 24, and then performs image correction processing based on the G image correction information on the G image (step S7 in FIG. 6).
  • Similarly, when the light irradiation coordinate detection module 101 is irradiated with the R light, R image correction information including correction amounts for correcting the positional deviation of all the pixels of the R image generated in response to the return light received when the subject is irradiated with the R light is acquired, and image correction processing based on the acquired R image correction information is performed on the R image.
  • Likewise, when the light irradiation coordinate detection module 101 is irradiated with the B light, B image correction information including correction amounts for correcting the positional deviation of all the pixels of the B image generated in response to the return light received when the subject is irradiated with the B light is acquired, and image correction processing based on the acquired B image correction information is performed on the B image.
  • As described above, it is possible to generate a corrected image in which the distortion caused by the deviation between the ideal illumination light irradiation position and the actual illumination light irradiation position is sufficiently corrected, and to display it on the monitor 4. As a result, according to the present embodiment, the distortion of an image acquired using a scanning endoscope can be accurately calibrated.
  • The series of processes in FIG. 6 is not limited to being performed when the G light is irradiated along the scanning pattern shown in FIG. 5A; it may also be performed, for example, when the G light is irradiated along the scanning pattern shown in FIG. 5B.
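The deviation-detection and correction flow of steps S5 to S7 summarized above can be sketched as follows. This is a minimal illustration under stated assumptions, not the implementation described in the patent: the table data TBD is modeled as a list of ideal (x, y) positions per sample, the measured coordinates stand in for the output of the light irradiation coordinate detection module 101, and the per-sample subtraction stands in for whatever interpolation scheme the controller 25 actually uses.

```python
# Sketch of steps S5-S7: compare measured irradiation coordinates with the
# ideal scanning-pattern coordinates (table data TBD), derive per-sample
# displacement amounts (GZL), and apply them as a positional correction.
# Function names and the correction scheme are illustrative assumptions.

def displacement_amounts(tbd, measured):
    """Step S6 analogue: displacement of each measured position from ideal."""
    return [(mx - ix, my - iy)
            for (ix, iy), (mx, my) in zip(tbd, measured)]

def correct_positions(measured, gzl):
    """Step S7 analogue: shift each measured position back by its displacement."""
    return [(mx - dx, my - dy)
            for (mx, my), (dx, dy) in zip(measured, gzl)]

if __name__ == "__main__":
    # Ideal spiral sample positions (a stand-in for table data TBD) ...
    tbd = [(0.0, 0.0), (1.0, 0.0), (0.0, 2.0), (-3.0, 0.0)]
    # ... and the positions actually reported by the coordinate detector,
    # here offset by a constant (0.5, -0.25) drift for illustration.
    measured = [(x + 0.5, y - 0.25) for x, y in tbd]

    gzl = displacement_amounts(tbd, measured)
    corrected = correct_positions(measured, gzl)
    assert corrected == tbd
```

Real correction information would cover all pixels of the generated image; the per-sample displacements here would be interpolated across the frame, as the description notes.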

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Biophysics (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

An endoscope system comprises: an endoscope, further comprising a light-guiding member, and a drive unit which is capable of making the light-guiding member fluctuate such that a projection location of illumination light which is projected through the light-guiding member traces a trajectory corresponding to a prescribed scan pattern; a coordinate information acquisition unit which acquires coordinate information, whereby it is possible to detect the projection location of the illumination light which is projected from the endoscope; a determination unit which determines whether a projection range of the illumination light which is projected from the endoscope satisfies a prescribed angle of view; a comparison unit; and a control unit which, if the projection range of the illumination light does not satisfy the prescribed angle of view, adjusts drive signals which are supplied to the drive unit, and if the projection range of the illumination light does satisfy the prescribed angle of view, detects a degree of deviation between the trajectory of the projection location along the prescribed scan pattern, and the trajectory which is traced by the projection location which is detected on the basis of the coordinate information.

Description

Endoscope system
The present invention relates to an endoscope system, and more particularly to an endoscope system that acquires an image by scanning a subject.
In the field of medical endoscopes, various techniques have been proposed for reducing the diameter of the insertion portion inserted into a subject's body cavity, in order to reduce the burden on the subject. As one example of such a technique, an optical scanning endoscope that has no solid-state image sensor in the portion corresponding to the insertion portion, and a system comprising such an optical scanning endoscope, are known.
Specifically, a system comprising the above-described optical scanning endoscope scans a subject with a preset scanning pattern by, for example, swinging the distal end of an illumination fiber that guides illumination light emitted from a light source unit, receives the return light from the subject with light-receiving fibers arranged around the illumination fiber, and generates an image of the subject using signals obtained by separating the received return light into its color components.
Meanwhile, as a calibration method applicable to a system having the above configuration, the calibration method disclosed in, for example, Japanese Translation of PCT Application No. 2010-515947 is conventionally known.
Specifically, Japanese Translation of PCT Application No. 2010-515947 discloses a calibration method in which a multicolor image of a multicolor calibration pattern is acquired using a scanning beam device, each color component of the acquired multicolor image is compared with the corresponding display color component of the multicolor calibration pattern, and the scanning beam device is calibrated based on the result of the comparison.
In a system having the above configuration, however, the position actually irradiated with the illumination light may deviate from the (ideal) irradiation position along the predetermined scanning pattern, and this deviation can distort the image generated in response to the illumination.
Japanese Translation of PCT Application No. 2010-515947, however, does not mention any specific method for calibrating image distortion caused by such factors (for example, which drive parameters of the scanning beam device should be changed, and how), and consequently image distortion arising from these factors cannot be calibrated sufficiently.
The present invention has been made in view of the above circumstances, and an object thereof is to provide an endoscope system capable of accurately calibrating the distortion of an image acquired using a scanning endoscope.
An endoscope system according to one aspect of the present invention includes: an endoscope comprising a light guide member that guides illumination light emitted from a light source, and a drive unit capable of swinging the light guide member so that the irradiation position of the illumination light projected onto a subject through the light guide member traces a trajectory corresponding to a predetermined scanning pattern; a coordinate information acquisition unit that acquires coordinate information from which the irradiation position of the illumination light emitted from the endoscope can be detected; a comparison unit that compares the irradiation position obtained when the illumination light is irradiated along the predetermined scanning pattern with the irradiation position detected on the basis of the coordinate information; a determination unit that determines, based on the comparison result of the comparison unit, whether the irradiation range of the illumination light emitted from the endoscope satisfies a predetermined angle of view; and a control unit that, when the determination unit determines that the irradiation range of the illumination light does not satisfy the predetermined angle of view, performs control for adjusting the drive signals supplied to the drive unit, and, when the determination unit determines that the irradiation range of the illumination light satisfies the predetermined angle of view, performs processing for detecting the amount of deviation between the trajectory of the irradiation position along the predetermined scanning pattern and the trajectory traced by the irradiation position detected on the basis of the coordinate information.
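The two-phase behavior of the determination unit and control unit described above can be sketched roughly as follows; the proportional gain-update rule, the tolerance, and the `measure_range` callback are illustrative assumptions, not details given in the summary above.

```python
# Rough sketch of the determination/control loop: the drive-signal amplitude
# is adjusted until the measured irradiation range satisfies the intended
# angle of view, after which trajectory-deviation detection would begin.
# The proportional update rule and the tolerance are illustrative assumptions.

def calibrate_amplitude(target_range, measure_range, gain=1.0,
                        tol=0.01, max_iter=100):
    """Scale the drive amplitude until the irradiation range matches target."""
    for _ in range(max_iter):
        measured = measure_range(gain)
        if abs(measured - target_range) <= tol * target_range:
            return gain  # predetermined angle of view satisfied
        gain *= target_range / measured  # proportional amplitude correction
    raise RuntimeError("irradiation range did not converge")

if __name__ == "__main__":
    # Toy model: the measured range is proportional to the drive amplitude,
    # with the fiber responding at 80% of the nominal efficiency.
    gain = calibrate_amplitude(10.0, lambda g: 8.0 * g)
    assert abs(8.0 * gain - 10.0) <= 0.1
```

Once the loop returns, the control applied to the drive unit would be held fixed while the trajectory deviation is measured, mirroring the transition from the angle-of-view check to deviation detection described above.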
FIG. 1 is a diagram showing the configuration of the main part of an endoscope system according to an embodiment of the present invention.
FIG. 2 is a diagram showing an example of a virtual XY plane set on the surface of a subject.
FIG. 3 is a diagram showing an example of the signal waveform of a first drive signal supplied to an actuator provided in the endoscope.
FIG. 4 is a diagram showing an example of the signal waveform of a second drive signal supplied to the actuator provided in the endoscope.
FIG. 5A is a diagram for explaining the temporal displacement of the illumination light irradiation coordinates from the point SA to the point YMAX when the virtual XY plane of FIG. 2 is irradiated with illumination light.
FIG. 5B is a diagram for explaining the temporal displacement of the illumination light irradiation coordinates from the point YMAX to the point SA when the virtual XY plane of FIG. 2 is irradiated with illumination light.
FIG. 6 is a flowchart showing an example of processing performed by the endoscope system according to the embodiment of the present invention.
FIG. 7 is a diagram showing an example of a deviation occurring between the ideal illumination light irradiation range and the actual illumination light irradiation range.
FIG. 8 is a diagram showing an example of a deviation occurring between the trajectory of the ideal illumination light irradiation position and the trajectory of the actual illumination light irradiation position.
Hereinafter, embodiments of the present invention will be described with reference to the drawings.
FIGS. 1 to 8 relate to an embodiment of the present invention. FIG. 1 is a diagram showing the configuration of the main part of an endoscope system according to the embodiment.
As shown in FIG. 1, an endoscope system 1 includes a scanning endoscope 2 that is inserted into the body cavity of a subject, a main body device 3 connected to the endoscope 2, and a monitor 4 connected to the main body device 3.
The endoscope 2 includes an insertion portion 11 formed in an elongated, flexible shape so that it can be inserted into the body cavity of the subject. A connector (not shown) or the like for detachably connecting the endoscope 2 to the main body device 3 is provided at the proximal end of the insertion portion 11.
An illumination fiber 12, which serves as a light guide member for guiding the illumination light supplied from the light source unit 21 of the main body device 3 to the objective optical system 14, and a light-receiving fiber 13, which receives the return light from the subject and guides it to the detection unit 23 of the main body device 3, run through the insertion portion 11 from its proximal end to its distal end.
The end of the illumination fiber 12 that includes its light entrance surface is arranged in a multiplexer 32 provided inside the main body device 3. The end of the illumination fiber 12 that includes its light exit surface is arranged near the light entrance surface of a lens 14a provided at the distal end of the insertion portion 11, without being fixed by a fixing member or the like.
The end of the light-receiving fiber 13 that includes its light entrance surface is fixed around the light exit surface of a lens 14b on the distal end face of the insertion portion 11. The end of the light-receiving fiber 13 that includes its light exit surface is arranged in a demultiplexer 36 provided inside the main body device 3.
The objective optical system 14 includes the lens 14a, on which the illumination light from the illumination fiber 12 is incident, and the lens 14b, which emits the illumination light that has passed through the lens 14a toward the subject.
An actuator 15, driven by drive signals output from the driver unit 22 of the main body device 3, is attached to a midway portion of the illumination fiber 12 on the distal end side of the insertion portion 11.
In the following description, it is assumed that an XY plane as shown in FIG. 2 is set on the surface of the subject as a virtual plane perpendicular to the insertion axis corresponding to the longitudinal axis of the insertion portion 11 (or to the optical axis of the objective optical system 14). FIG. 2 is a diagram showing an example of the virtual XY plane set on the surface of the subject.
Specifically, the point SA on the XY plane in FIG. 2 indicates the intersection of the insertion axis with the plane of the page, assuming that the insertion axis of the insertion portion 11 extends in the direction from the front of the page toward the back. The X-axis direction of the XY plane in FIG. 2 is set as the direction from the left side of the page toward the right, and the Y-axis direction is set as the direction from the bottom of the page toward the top. The X axis and Y axis of the XY plane intersect at the point SA.
The actuator 15 includes an X-axis actuator (not shown) that swings the end of the illumination fiber 12 including its light exit surface in the X-axis direction based on a first drive signal output from the driver unit 22 of the main body device 3, and a Y-axis actuator (not shown) that swings that end in the Y-axis direction based on a second drive signal output from the driver unit 22. As the X-axis and Y-axis actuators operate, the end of the illumination fiber 12 including its light exit surface is swung in a spiral around the point SA.
Inside the insertion portion 11, a memory 16 is provided in which endoscope information, including various information related to the endoscope 2, is stored in advance. The endoscope information stored in the memory 16 is read by the controller 25 of the main body device 3 when the endoscope 2 is connected to the main body device 3.
The main body device 3 includes a light source unit 21, a driver unit 22, a detection unit 23, a memory 24, and a controller 25.
The light source unit 21 includes light sources 31a, 31b and 31c, and the multiplexer 32.
The light source 31a includes, for example, a laser light source, and emits light in the red wavelength band (hereinafter also referred to as R light) to the multiplexer 32 when turned on under the control of the controller 25.
The light source 31b includes, for example, a laser light source, and emits light in the green wavelength band (hereinafter also referred to as G light) to the multiplexer 32 when turned on under the control of the controller 25.
The light source 31c includes, for example, a laser light source, and emits light in the blue wavelength band (hereinafter also referred to as B light) to the multiplexer 32 when turned on under the control of the controller 25.
The multiplexer 32 combines the R light emitted from the light source 31a, the G light emitted from the light source 31b, and the B light emitted from the light source 31c, and supplies the combined light to the light entrance surface of the illumination fiber 12.
The driver unit 22 includes a signal generator 33, digital-to-analog (hereinafter D/A) converters 34a and 34b, and an amplifier 35.
Under the control of the controller 25, the signal generator 33 generates a signal having a predetermined waveform, for example as shown in FIG. 3, as the first drive signal for swinging the end of the illumination fiber 12 including its light exit surface in the X-axis direction, and outputs it to the D/A converter 34a. FIG. 3 is a diagram showing an example of the signal waveform of the first drive signal supplied to the actuator provided in the endoscope.
Also under the control of the controller 25, the signal generator 33 generates a signal whose waveform is the first drive signal shifted in phase by 90°, for example as shown in FIG. 4, as the second drive signal for swinging the end of the illumination fiber 12 including its light exit surface in the Y-axis direction, and outputs it to the D/A converter 34b. FIG. 4 is a diagram showing an example of the signal waveform of the second drive signal supplied to the actuator provided in the endoscope.
The D/A converter 34a converts the digital first drive signal output from the signal generator 33 into an analog first drive signal and outputs it to the amplifier 35.
The D/A converter 34b converts the digital second drive signal output from the signal generator 33 into an analog second drive signal and outputs it to the amplifier 35.
The amplifier 35 amplifies the first and second drive signals output from the D/A converters 34a and 34b and outputs them to the actuator 15.
The amplitude value (signal level) of the first drive signal illustrated in FIG. 3 gradually increases from its minimum value at time T1, reaches its maximum value at time T2, then gradually decreases and returns to the minimum value at time T3.
Similarly, the amplitude value (signal level) of the second drive signal illustrated in FIG. 4 gradually increases from its minimum value at time T1, reaches its maximum value near time T2, then gradually decreases and returns to the minimum value at time T3.
When the first drive signal shown in FIG. 3 is supplied to the X-axis actuator of the actuator 15 and the second drive signal shown in FIG. 4 is supplied to its Y-axis actuator, the end of the illumination fiber 12 including its light exit surface is swung in a spiral around the point SA, and the surface of the subject is accordingly scanned in a spiral as shown in FIGS. 5A and 5B. FIG. 5A is a diagram for explaining the temporal displacement of the illumination light irradiation coordinates from the point SA to the point YMAX when the virtual XY plane of FIG. 2 is irradiated with illumination light. FIG. 5B is a diagram for explaining the temporal displacement of the illumination light irradiation coordinates from the point YMAX to the point SA when the virtual XY plane of FIG. 2 is irradiated with illumination light.
Specifically, at time T1 the illumination light is irradiated onto the position corresponding to the point SA on the surface of the subject. As the amplitude values of the first and second drive signals increase from time T1 to time T2, the irradiation coordinates of the illumination light on the subject's surface trace a first spiral trajectory outward from the point SA; at time T2, the illumination light reaches the point YMAX, the outermost point of the irradiation coordinates on the subject's surface. Then, as the amplitude values of the first and second drive signals decrease from time T2 to time T3, the irradiation coordinates trace a second spiral trajectory inward from the point YMAX; at time T3, the illumination light again reaches the point SA on the subject's surface.
That is, based on the first and second drive signals supplied from the driver unit 22, the actuator 15 can swing the end of the illumination fiber 12 including its light exit surface so that the irradiation position of the illumination light projected onto the subject through the objective optical system 14 traces a trajectory corresponding to the spiral scanning pattern illustrated in FIGS. 5A and 5B.
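Under the simplifying assumption that the fiber tip follows the drive signals linearly, the spiral trajectory of FIGS. 5A and 5B can be modeled as two sinusoids 90° out of phase sharing an amplitude envelope that rises from time T1 to T2 and falls from T2 to T3. The waveform parameters below are illustrative, not values from the embodiment.

```python
import math

# Simplified model of the spiral scan: the X and Y drive signals are
# sinusoids 90 degrees out of phase whose common amplitude envelope rises
# from time T1 to T2 and falls back from T2 to T3 (cf. FIGS. 3 and 4),
# producing an outward then inward spiral around the point SA.
# All parameter values are illustrative assumptions.

def spiral_position(t, t1=0.0, t2=1.0, t3=2.0, r_max=1.0, cycles=8):
    """Ideal irradiation coordinates (x, y) at elapsed time t in [t1, t3]."""
    if t <= t2:
        envelope = r_max * (t - t1) / (t2 - t1)   # rising amplitude
    else:
        envelope = r_max * (t3 - t) / (t3 - t2)   # falling amplitude
    phase = 2.0 * math.pi * cycles * (t - t1)
    return (envelope * math.cos(phase),           # X drive
            envelope * math.sin(phase))           # Y drive, 90 deg shifted

if __name__ == "__main__":
    assert spiral_position(0.0) == (0.0, 0.0)             # time T1: point SA
    r_mid = math.hypot(*spiral_position(1.0))             # time T2: outermost
    assert abs(r_mid - 1.0) < 1e-6
    assert math.hypot(*spiral_position(2.0)) < 1e-6       # time T3: back at SA
```

Sampling this function at regular intervals reproduces the outward spiral of FIG. 5A for t in [T1, T2] and the inward spiral of FIG. 5B for t in [T2, T3].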
The detection unit 23 includes the demultiplexer 36, detectors 37a, 37b and 37c, and analog-to-digital (hereinafter A/D) converters 38a, 38b and 38c.
The demultiplexer 36 includes a dichroic mirror or the like, separates the return light emitted from the light exit surface of the light-receiving fiber 13 into its R (red), G (green) and B (blue) color components, and outputs them to the detectors 37a, 37b and 37c, respectively.
The detector 37a detects the intensity of the R light output from the demultiplexer 36, generates an analog R signal corresponding to the detected intensity, and outputs it to the A/D converter 38a.
The detector 37b detects the intensity of the G light output from the demultiplexer 36, generates an analog G signal corresponding to the detected intensity, and outputs it to the A/D converter 38b.
The detector 37c detects the intensity of the B light output from the demultiplexer 36, generates an analog B signal corresponding to the detected intensity, and outputs it to the A/D converter 38c.
The A/D converter 38a converts the analog R signal output from the detector 37a into a digital R signal and outputs it to the controller 25.
The A/D converter 38b converts the analog G signal output from the detector 37b into a digital G signal and outputs it to the controller 25.
The A/D converter 38c converts the analog B signal output from the detector 37c into a digital B signal and outputs it to the controller 25.
The memory 24 stores in advance a control program for controlling the main body device 3, and also stores image correction information obtained as a result of processing by the controller 25. Details of this image correction information will be described later.
The memory 24 also stores in advance table data TBD that makes it possible to identify the time at which the illumination light is irradiated onto an arbitrary coordinate position on the ideal (spiral) scanning pattern illustrated in FIGS. 5A and 5B, within the period from time T1, at which the illumination light is irradiated onto the coordinate position (0, 0) corresponding to the point SA, through time T2, at which it is irradiated onto the coordinate position corresponding to the point YMAX, to time T3, at which it is again irradiated onto the coordinate position (0, 0) corresponding to the point SA.
That is, the memory 24 stores in advance the table data TBD, which indicates the correspondence between the irradiation position (coordinate position) and the irradiation time (elapsed time) when the illumination light supplied from the light source unit 21 is irradiated along the ideal (spiral) scanning pattern shown in FIGS. 5A and 5B.
The table data TBD is configured, for example, as data corresponding to each of the wavelength bands of light (R light, G light and B light) supplied from the light source unit 21.
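Assuming a fixed sampling clock, the table data TBD can be pictured as a precomputed mapping from elapsed sample time to the ideal irradiation coordinates on the spiral pattern, held per color component. The sketch below uses a simplified spiral model; the sampling parameters, the spiral formula, and the table layout are all illustrative assumptions.

```python
import math

# Minimal sketch of table data TBD: a precomputed mapping from sample index
# (elapsed time) to the ideal (x, y) irradiation coordinates on the spiral
# scanning pattern, held per color component (R, G, B).
# Spiral model and sampling parameters are illustrative assumptions.

def build_tbd(n_samples=1000, t3=2.0, r_max=1.0, cycles=8):
    """Sample index -> ideal coordinates over the period T1 (=0) to T3."""
    table = {}
    t2 = t3 / 2.0
    for i in range(n_samples + 1):
        t = t3 * i / n_samples
        env = r_max * (t / t2 if t <= t2 else (t3 - t) / (t3 - t2))
        phase = 2.0 * math.pi * cycles * t
        table[i] = (env * math.cos(phase), env * math.sin(phase))
    return table

if __name__ == "__main__":
    # One table per wavelength band, as the description suggests.
    tbd = {color: build_tbd() for color in ("R", "G", "B")}
    assert tbd["G"][0] == (0.0, 0.0)             # time T1: point SA
    x, y = tbd["G"][500]                         # time T2: outermost point
    assert abs(math.hypot(x, y) - 1.0) < 1e-6
    assert math.hypot(*tbd["G"][1000]) < 1e-6    # time T3: back at SA
```

During calibration, each measured coordinate timestamped by the detection module could be compared against the entry for the same sample index to obtain the positional deviation.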
The controller 25 reads the control program stored in the memory 24 and controls the light source unit 21 and the driver unit 22 based on the read control program.
The controller 25 also stores in the memory 24 the endoscope information output from the memory 16 when the insertion portion 11 is connected to the main body device 3.
The controller 25 generates an image for one frame based on the R, G and B signals output from the detection unit 23 during the period from time T1 to time T2, and likewise generates an image for one frame based on the R, G and B signals output from the detection unit 23 during the period from time T2 to time T3.
 さらに、コントローラ25は、メモリ24に画像補正情報が格納されている場合において、当該画像補正情報に基づく画像補正処理を各フレームの画像に対して施し、当該画像補正処理を施した補正後の画像を所定のフレームレートでモニタ4に表示させるように動作する。 Further, when image correction information is stored in the memory 24, the controller 25 performs an image correction process based on the image correction information on the image of each frame, and a corrected image obtained by performing the image correction process. Is displayed on the monitor 4 at a predetermined frame rate.
 一方、コントローラ25は、メモリ24に格納されたテーブルデータTBDと、光照射座標検出モジュール101から出力される座標位置の情報(以降、座標情報とも称する)と、に基づいて後述の処理を行うことにより画像補正情報を取得し、当該取得した画像補正情報をメモリ24に格納させるように動作する。また、コントローラ25は、光照射座標検出モジュール101から出力される座標情報を少なくとも一時的に保持できるように構成されている。 On the other hand, the controller 25 performs processing described later based on the table data TBD stored in the memory 24 and information on the coordinate position output from the light irradiation coordinate detection module 101 (hereinafter also referred to as coordinate information). The image correction information is acquired by the operation, and the acquired image correction information is stored in the memory 24. The controller 25 is configured to be able to at least temporarily hold coordinate information output from the light irradiation coordinate detection module 101.
 Here, the light irradiation coordinate detection module 101, which functions as a coordinate information acquisition unit, includes a position sensitive detector (PSD) or the like, detects the position at which it receives the illumination light emitted through the objective optical system 14, and outputs the detected position as coordinate information.
 In the light irradiation coordinate detection module 101 of this embodiment, the coordinate position of point SA on the XY plane illustrated in FIGS. 2, 5A, and 5B is assumed to be preset to (0, 0). That is, the coordinate information output from the light irradiation coordinate detection module 101 indicates a coordinate position relative to the coordinate position (0, 0) of point SA on the XY plane illustrated in FIGS. 2, 5A, and 5B.
 Therefore, based on the coordinate information output from the light irradiation coordinate detection module 101 configured as described above, the controller 25 can detect, as a coordinate position, the irradiation position of the illumination light emitted in a spiral from the endoscope 2.
 Next, the operation of the endoscope system 1 configured as described above will be explained. In the following, for simplicity, the case in which the light irradiation coordinate detection module 101 is irradiated with G light along the scanning pattern shown in FIG. 5A is taken as an example.
 First, the operator connects the endoscope 2 and the monitor 4 to the main body device 3, places the light irradiation coordinate detection module 101 at a position facing the distal end face of the endoscope 2, and further sets up the system so that the coordinate information output from the light irradiation coordinate detection module 101 is input to the controller 25 of the main body device 3.
 Thereafter, when each part of the endoscope system 1 is powered on, the controller 25 reads the endoscope information stored in the memory 16 of the insertion portion 11, and the read endoscope information is stored in the memory 24.
 Then, almost immediately after storing the endoscope information read from the memory 16 into the memory 24, the controller 25 controls the light source unit 21 so as to switch the light source 31b from off to on while keeping the light sources 31a and 31c off, and controls the driver unit 22 so as to output the first and second drive signals to the actuator 15. Under this control by the controller 25, G light is irradiated onto the surface of the light irradiation coordinate detection module 101, and coordinate information corresponding to the positions at which the G light is received is sequentially output from the light irradiation coordinate detection module 101.
 Here, the processing involved in acquiring the image correction information will be explained mainly with reference to FIGS. 6 to 8. FIG. 6 is a flowchart showing an example of the processing performed by the endoscope system according to the embodiment of the present invention.
 First, by referring to the table data TBD, the controller 25 performs processing to extract, from the coordinate information output by the light irradiation coordinate detection module 101, the coordinate positions corresponding to one or more predetermined irradiation times included in the table data TBD (step S1 in FIG. 6).
 Specifically, before performing the processing of step S1 in FIG. 6, the controller 25 monitors the signal waveforms of the first and second drive signals output from the driver unit 22, sets time T1 at the timing at which the amplitude value (signal level) of the two drive-signal waveforms becomes minimal, and then, by referring to the table data TBD based on the set time T1, detects in advance time TXMAX, at which illumination light is irradiated at the position corresponding to point XMAX in FIG. 5A; time TYMIN, at which illumination light is irradiated at the position corresponding to point YMIN in FIG. 5A; time TXMIN, at which illumination light is irradiated at the position corresponding to point XMIN in FIG. 5A; and time T2, at which illumination light is irradiated at the position corresponding to point YMAX in FIG. 5A. The controller 25 then extracts, from the coordinate information output by the light irradiation coordinate detection module 101, coordinate position XA, indicating the position at which the G light was actually received at time TXMAX; coordinate position YB, indicating the position at which the G light was actually received at time TYMIN; coordinate position XB, indicating the position at which the G light was actually received at time TXMIN; and coordinate position YA, indicating the position at which the illumination light was actually received at time T2.
 That is, according to the processing described above as a specific example of step S1 in FIG. 6, the coordinate positions XA, XB, YA, and YB corresponding to the four irradiation positions located on the outermost circumference of the spiral scanning pattern can be extracted, based on the table data TBD stored in the memory 24, from the irradiation positions at which the G light was actually directed onto the light irradiation coordinate detection module 101 serving as the subject.
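The extraction in step S1 amounts to sampling the stream of measured PSD coordinates at the four scheduled irradiation times. A hedged sketch follows; the nearest-in-time matching and the sample values are assumptions made for illustration, since the patent does not specify how the measured stream is indexed:

```python
def extract_outermost_positions(measured, schedule):
    """Pick the measured coordinate nearest in time to each scheduled
    irradiation time (TXMAX, TYMIN, TXMIN, and T2 in the text).

    measured: list of (time, (x, y)) samples from the PSD module
    schedule: dict mapping a label (e.g. 'XMAX') to its irradiation time
    """
    extracted = {}
    for label, t_sched in schedule.items():
        _, pos = min(measured, key=lambda sample: abs(sample[0] - t_sched))
        extracted[label] = pos
    return extracted

# Hypothetical measurements: X extremes land on target, Y extremes fall short.
measured = [(0.25, (1.0, 0.02)), (0.5, (0.0, -0.8)),
            (0.75, (-1.0, 0.01)), (1.0, (0.0, 0.85))]
schedule = {"XMAX": 0.25, "YMIN": 0.5, "XMIN": 0.75, "YMAX": 1.0}
actual = extract_outermost_positions(measured, schedule)
```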
 Next, the controller 25 performs processing to compare each coordinate position extracted in step S1 of FIG. 6 with the coordinate positions in the table data TBD corresponding to the aforementioned one or more predetermined irradiation times (step S2 in FIG. 6), and then, based on the comparison result, determines whether or not the irradiation range of the G light on the light irradiation coordinate detection module 101 satisfies the predetermined angle of view intended in the design of the endoscope 2 (step S3 in FIG. 6). The predetermined angle of view is assumed to be a value (for example, 90°) substantially matching the irradiation range of the illumination light obtained when the light passing through the objective optical system 14 is irradiated along the ideal scanning pattern of FIG. 5A (and FIG. 5B).
 Even when the first and second drive signals having the ideal signal waveforms shown in FIGS. 3 and 4 are supplied from the driver unit 22, a situation can arise in which, due to, for example, waveform distortion occurring during the swinging operation of the X-axis and Y-axis actuators of the actuator 15, the end portion of the illumination fiber 12 including its light exit face is not swung accurately, so that the irradiation position of the illumination light passing through the objective optical system 14 deviates from the ideal scanning pattern.
 In particular, at the timing at which the processing of step S3 in FIG. 6 is performed, a deviation such as that shown in FIG. 7 can arise between the irradiation range corresponding to the ideal locus of G-light irradiation positions, obtained by arranging the coordinate positions of the table data TBD stored in advance in the memory 24 in chronological order, and the irradiation range corresponding to the actual locus of G-light irradiation positions, obtained by arranging the coordinate positions included in the coordinate information output from the light irradiation coordinate detection module 101 in chronological order. FIG. 7 is a diagram showing an example of the deviation that arises between the ideal illumination light irradiation range and the actual illumination light irradiation range.
 Specifically, according to the locus of actual irradiation positions drawn with a dotted line in FIG. 7, the coordinate positions XA and XB extracted in step S1 of FIG. 6 coincide with the coordinate positions of points XMAX and XMIN on the ideal irradiation-position locus drawn with a solid line in FIG. 7, whereas the coordinate positions YA and YB extracted in step S1 of FIG. 6 do not coincide with the coordinate positions of points YMAX and YMIN on that ideal locus.
 That is, by performing, in step S2 of FIG. 6, processing that compares coordinate position XA with the coordinate position of point XMAX, coordinate position YB with the coordinate position of point YMIN, coordinate position XB with the coordinate position of point XMIN, and coordinate position YA with the coordinate position of point YMAX, based on each coordinate position extracted in step S1 of FIG. 6 and the coordinate positions of the table data TBD stored in the memory 24, the controller 25 can detect the deviation between the ideal and actual G-light irradiation positions at the four irradiation positions located on the outermost circumference of the spiral scanning pattern.
 Also, by performing, in step S3 of FIG. 6, processing that detects whether coordinate position XA coincides with the coordinate position of point XMAX, coordinate position YB with that of point YMIN, coordinate position XB with that of point XMIN, and coordinate position YA with that of point YMAX, based on the comparison results obtained in step S2 of FIG. 6, the controller 25 can detect whether or not the ideal and actual G-light irradiation positions coincide at the four irradiation positions located on the outermost circumference of the spiral scanning pattern.
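The comparisons of steps S2 and S3 reduce to an equality test between each extracted position and its ideal counterpart. A short sketch follows; the numeric tolerance is an assumption, as the patent speaks only of the positions "coinciding" without quantifying a threshold:

```python
def satisfies_angle_of_view(actual, ideal, tol=0.05):
    """Return True only if every outermost measured position coincides
    with the ideal spiral position for the same label (step S3, FIG. 6)."""
    return all(
        abs(actual[k][0] - ideal[k][0]) <= tol and
        abs(actual[k][1] - ideal[k][1]) <= tol
        for k in ideal
    )

ideal = {"XMAX": (1.0, 0.0), "YMIN": (0.0, -1.0),
         "XMIN": (-1.0, 0.0), "YMAX": (0.0, 1.0)}
# As in FIG. 7: the X extremes match but the Y extremes fall short.
ok = satisfies_angle_of_view(
    {"XMAX": (1.0, 0.0), "YMIN": (0.0, -0.8),
     "XMIN": (-1.0, 0.0), "YMAX": (0.0, 0.85)}, ideal)
```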
 When the controller 25 detects, in step S3 of FIG. 6, any one of the following: that coordinate position XA does not coincide with the coordinate position of point XMAX, that coordinate position YB does not coincide with that of point YMIN, that coordinate position XB does not coincide with that of point XMIN, or that coordinate position YA does not coincide with that of point YMAX, it obtains the determination result that the irradiation range of the G light on the light irradiation coordinate detection module 101 does not satisfy the predetermined angle of view intended in the design of the endoscope 2, and then controls the driver unit 22 so as to adjust the first and/or second drive signals supplied to the actuator 15 (step S4 in FIG. 6).
 Specifically, when the controller 25 detects, based on the processing results of steps S2 and S3 in FIG. 6, that, as in FIG. 7 for example, the Y-coordinate value of coordinate position YA is smaller than that of the coordinate position corresponding to point YMAX and the Y-coordinate value of coordinate position YB is larger than that of the coordinate position corresponding to point YMIN, it controls the driver unit 22 so as to increase the amplitude value (signal level) of the second drive signal supplied to the actuator 15 from its current amplitude value (signal level).
 According to this embodiment, the control performed in step S4 of FIG. 6 is not limited to increasing or decreasing the amplitude value (signal level) of a drive signal supplied to the actuator 15 from its current value; for example, control that changes the phase of at least one of the first and second drive signals supplied to the actuator 15 so that their phase difference becomes 90° may be performed in step S4 of FIG. 6 instead.
 The controller 25 repeats the processing of steps S1 through S4 in FIG. 6 until, by detecting in step S3 of FIG. 6 that coordinate position XA coincides with the coordinate position of point XMAX, coordinate position YB with that of point YMIN, coordinate position XB with that of point XMIN, and coordinate position YA with that of point YMAX, it obtains the determination result that the irradiation range of the G light on the light irradiation coordinate detection module 101 satisfies the predetermined angle of view intended in the design of the endoscope 2.
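The repeat-until-satisfied behaviour of steps S1 through S4 is, in essence, a feedback loop on the drive-signal amplitude. The toy model below illustrates this; the linear amplitude-to-radius response, the step size, and the convergence tolerance are all assumptions made purely for illustration:

```python
def calibrate_amplitude(target_radius, measure_radius, amplitude=0.5,
                        step=0.05, tol=0.01, max_iters=100):
    """Raise or lower the drive amplitude until the measured outermost
    radius matches the designed one (steps S1-S4 repeated)."""
    for _ in range(max_iters):
        r = measure_radius(amplitude)  # re-scan and re-measure (steps S1-S3)
        if abs(r - target_radius) <= tol:
            return amplitude           # angle of view satisfied
        amplitude += step if r < target_radius else -step  # step S4
    raise RuntimeError("calibration did not converge")

# Toy actuator: measured radius responds linearly to the drive amplitude.
amp = calibrate_amplitude(1.0, lambda a: 1.25 * a)
```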
 Then, when the controller 25 obtains, in step S3 of FIG. 6, the determination result that the irradiation range of the G light on the light irradiation coordinate detection module 101 satisfies the predetermined angle of view intended in the design of the endoscope 2, it performs the processing from step S5 of FIG. 6 onward while maintaining the control it is applying to the driver unit 22 at the timing at which that determination result was obtained.
 The processing of steps S1 through S4 in FIG. 6 can be carried out in substantially the same manner not only when G light is irradiated onto the light irradiation coordinate detection module 101 but also when R light or B light is irradiated onto it.
 Meanwhile, the controller 25 performs processing to extract, from the coordinate information output by the light irradiation coordinate detection module 101, the coordinate positions corresponding to each irradiation position for one frame (step S5 in FIG. 6).
 Further, based on each coordinate position extracted in step S5 of FIG. 6 and the table data TBD stored in the memory 24, the controller 25 performs processing to acquire G-image correction information used for correcting the G image generated from the return light received as the subject is irradiated with G light (step S6 in FIG. 6).
 Here, at the timing at which the processing of step S6 in FIG. 6 is performed, a deviation such as that shown in FIG. 8 can arise between the ideal locus of G-light irradiation positions, obtained by arranging the coordinate positions of the table data TBD stored in advance in the memory 24 in chronological order, and the actual locus of G-light irradiation positions, obtained by arranging the coordinate positions included in the coordinate information output from the light irradiation coordinate detection module 101 in chronological order. FIG. 8 is a diagram showing an example of the deviation that arises between the locus of ideal illumination light irradiation positions and the locus of actual illumination light irradiation positions.
 Specifically, according to the locus of actual irradiation positions drawn with a dotted line in FIG. 8, at least part of the locus of irradiation positions within the illumination range satisfying the predetermined angle of view intended in the design of the endoscope 2 does not coincide with the locus of ideal irradiation positions drawn with a solid line in FIG. 8.
 That is, in step S6 of FIG. 6, the controller 25 can detect the amount of deviation between the locus of ideal G-light irradiation positions and the locus of actual G-light irradiation positions by calculating, for each irradiation time included in the table data TBD, the positional deviation amount GZL between the coordinate position extracted in step S5 of FIG. 6 and the coordinate position of the table data TBD.
 Further, by performing interpolation processing based on each positional deviation amount GZL calculated as described above, the controller 25 can acquire G-image correction information including correction amounts for correcting the positional deviation of all pixels of the G image generated from the return light received as the subject is irradiated with G light.
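The per-time deviation GZL and the subsequent interpolation can be sketched as follows. Linear interpolation between sampled times is an assumption for illustration; the patent leaves the interpolation method open:

```python
def displacement_amounts(ideal, actual):
    """GZL per irradiation time: (dx, dy) between ideal and actual."""
    return {t: (actual[t][0] - ideal[t][0], actual[t][1] - ideal[t][1])
            for t in ideal}

def interpolate_displacement(gzl, t):
    """Linearly interpolate the displacement at an unsampled time t."""
    times = sorted(gzl)
    if t <= times[0]:
        return gzl[times[0]]
    if t >= times[-1]:
        return gzl[times[-1]]
    for t0, t1 in zip(times, times[1:]):
        if t0 <= t <= t1:
            w = (t - t0) / (t1 - t0)
            dx0, dy0 = gzl[t0]
            dx1, dy1 = gzl[t1]
            return (dx0 + w * (dx1 - dx0), dy0 + w * (dy1 - dy0))

# Hypothetical two-sample example: the beam drifts outward over the sweep.
ideal = {0.0: (0.0, 0.0), 1.0: (1.0, 0.0)}
actual = {0.0: (0.0, 0.0), 1.0: (1.1, 0.2)}
gzl = displacement_amounts(ideal, actual)
mid = interpolate_displacement(gzl, 0.5)
```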
 According to this embodiment, the processing for detecting the amount of deviation between the locus of ideal irradiation positions and the locus of actual irradiation positions and acquiring image correction information based on the detected deviation amount is not limited to the processing of steps S5 and S6 in FIG. 6; other processing may be performed instead.
 Meanwhile, after storing the G-image correction information acquired in step S6 of FIG. 6 in the memory 24, the controller 25 applies image correction processing based on the G-image correction information to the G image (step S7 in FIG. 6).
 Then, by applying the same processing as steps S5 through S7 of FIG. 6 to the case in which the light irradiation coordinate detection module 101 is irradiated with R light, R-image correction information including correction amounts for correcting the positional deviation of all pixels of the R image generated from the return light received as the subject is irradiated with R light is acquired, and image correction processing based on the acquired R-image correction information is applied to the R image.
 Likewise, by applying the same processing as steps S5 through S7 of FIG. 6 to the case in which the light irradiation coordinate detection module 101 is irradiated with B light, B-image correction information including correction amounts for correcting the positional deviation of all pixels of the B image generated from the return light received as the subject is irradiated with B light is acquired, and image correction processing based on the acquired B-image correction information is applied to the B image.
 That is, by performing image correction processing based on the R-image, G-image, and B-image correction information as described above, a corrected image in which the image distortion caused by the deviation between the ideal and actual illumination light irradiation positions has been sufficiently corrected can be generated and displayed on the monitor 4. As a result, according to this embodiment, the distortion of an image acquired using a scanning endoscope can be calibrated with high accuracy.
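Applying the per-colour correction information amounts to shifting each sample back by its recorded displacement before placing it on the output grid. The nearest-pixel resampling below is an assumed scheme chosen for brevity; the patent does not specify how the correction amounts are applied per pixel:

```python
def correct_image(samples, width, height):
    """Place each (x, y, value, (dx, dy)) sample at its corrected pixel.

    Each sample carries the displacement recorded for its irradiation
    time; subtracting it undoes the distortion.
    """
    image = [[0] * width for _ in range(height)]
    for x, y, value, (dx, dy) in samples:
        cx, cy = round(x - dx), round(y - dy)
        if 0 <= cx < width and 0 <= cy < height:
            image[cy][cx] = value
    return image

# One sample landed at (3, 2) but, per its displacement, belongs at (2, 2).
img = correct_image([(3, 2, 255, (1.0, 0.0))], width=4, height=4)
```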
 The series of processing in FIG. 6 is not limited to being performed when G light is irradiated along the scanning pattern shown in FIG. 5A; it may also be performed, for example, when G light is irradiated along the scanning pattern shown in FIG. 5B.
 The present invention is not limited to the embodiments described above, and it goes without saying that various modifications and applications are possible without departing from the spirit of the invention.
 This application is filed claiming priority based on Japanese Patent Application No. 2012-168533, filed in Japan on July 30, 2012, and the above disclosure is incorporated into the specification, claims, and drawings of the present application.

Claims (5)

  1.  光源から発せられた照明光を導く導光部材と、前記導光部材を経て被写体へ照射される前記照明光の照射位置が所定の走査パターンに応じた軌跡を描くように前記導光部材を揺動させることが可能な駆動部と、を備えた内視鏡と、
     前記内視鏡から照射された前記照明光の照射位置を検出可能な座標情報を取得する座標情報取得部と、
     前記所定の走査パターンに沿って前記照明光が照射された場合の照射位置と、前記座標情報に基づいて検出される前記照明光の照射位置と、を比較する比較部と、
     前記比較部の比較結果に基づき、前記内視鏡から照射された前記照明光の照射範囲が所定の画角を満たすか否かに係る判定を行う判定部と、
     前記照明光の照射範囲が前記所定の画角を満たさないとの判定結果が前記判定部により得られた場合において、前記駆動部に供給される駆動信号を調整するための制御を行い、前記照明光の照射範囲が前記所定の画角を満たすとの判定結果が前記判定部により得られた場合において、前記所定の走査パターンに沿った照射位置の軌跡と、前記座標情報に基づいて検出される照射位置により描かれる軌跡と、の間のずれ量を検出するための処理を行う制御部と、
     を有することを特徴とする内視鏡システム。
    The light guide member that guides the illumination light emitted from the light source, and the light guide member is shaken so that the irradiation position of the illumination light applied to the subject through the light guide member draws a locus corresponding to a predetermined scanning pattern. An endoscope provided with a drive unit capable of moving;
    A coordinate information acquisition unit that acquires coordinate information capable of detecting the irradiation position of the illumination light emitted from the endoscope;
    A comparison unit that compares an irradiation position when the illumination light is irradiated along the predetermined scanning pattern and an irradiation position of the illumination light detected based on the coordinate information;
    A determination unit configured to determine whether an irradiation range of the illumination light irradiated from the endoscope satisfies a predetermined angle of view based on a comparison result of the comparison unit;
    When the determination unit obtains a determination result that the illumination light irradiation range does not satisfy the predetermined angle of view, the illumination unit performs control for adjusting a drive signal supplied to the drive unit, and When the determination result that the light irradiation range satisfies the predetermined angle of view is obtained by the determination unit, it is detected based on the locus of the irradiation position along the predetermined scanning pattern and the coordinate information. A control unit that performs processing for detecting a shift amount between the locus drawn by the irradiation position, and
    An endoscope system comprising:
  2.  前記所定の走査パターンに応じた軌跡は、前記内視鏡の挿入軸を中心とした渦巻状の軌跡であり、
     前記比較部は、前記渦巻状の軌跡の最外周に位置する所定の1つ以上の位置において、前記渦巻状の軌跡に沿って前記照明光が照射された場合の照射位置と、前記座標情報に基づいて検出される照射位置と、が一致しているか否かを比較する
     ことを特徴とする請求項1に記載の内視鏡システム。
    The trajectory according to the predetermined scanning pattern is a spiral trajectory centered on the insertion axis of the endoscope,
    The comparison unit includes an irradiation position when the illumination light is irradiated along the spiral locus at one or more predetermined positions located on the outermost periphery of the spiral locus, and the coordinate information. The endoscope system according to claim 1, wherein a comparison is made as to whether or not the irradiation position detected on the basis matches.
  3.  前記判定部は、前記所定の1つ以上の位置の全てにおいて、前記渦巻状の軌跡に沿って前記照明光が照射された場合の照射位置と、前記座標情報に基づいて検出される照射位置と、が一致しているとの比較結果が前記比較部により得られた場合に、前記内視鏡から照射された前記照明光の照射範囲が前記所定の画角を満たすとの判定結果を得る
     ことを特徴とする請求項2に記載の内視鏡システム。
    The determination unit includes an irradiation position when the illumination light is irradiated along the spiral trajectory at all of the one or more predetermined positions, and an irradiation position detected based on the coordinate information. When the comparison unit obtains a comparison result indicating that they match, the determination result that the irradiation range of the illumination light emitted from the endoscope satisfies the predetermined angle of view is obtained. The endoscope system according to claim 2.
  4.  前記判定部は、前記所定の1つ以上の位置のいずれかにおいて、前記渦巻状の軌跡に沿って前記照明光が照射された場合の照射位置と、前記座標情報に基づいて検出される照射位置と、が一致していないとの比較結果が前記比較部により得られた場合に、前記内視鏡から照射された前記照明光の照射範囲が前記所定の画角を満たさないとの判定結果を得、 前記制御部は、前記照明光の照射範囲が前記所定の画角を満たさないとの判定結果が前記判定部により得られた場合において、前記比較部の比較結果に基づき、前記駆動部に供給される前記駆動信号の振幅または位相を変化させる制御を行う
     ことを特徴とする請求項2に記載の内視鏡システム。
    The determination unit includes an irradiation position when the illumination light is irradiated along the spiral trajectory at any one of the predetermined one or more positions, and an irradiation position detected based on the coordinate information. And when the comparison unit obtains a comparison result that does not match, the determination result that the irradiation range of the illumination light emitted from the endoscope does not satisfy the predetermined angle of view. And when the determination result that the illumination light irradiation range does not satisfy the predetermined angle of view is obtained by the determination unit, the control unit applies the drive unit to the drive unit based on the comparison result of the comparison unit. The endoscope system according to claim 2, wherein control is performed to change the amplitude or phase of the supplied drive signal.
  5.  The endoscope system according to claim 1, wherein the control unit acquires, based on the shift amount, image correction information including a correction amount for correcting the positional shift of every pixel of an image generated in response to irradiation of the subject with the illumination light, and applies image correction processing based on the acquired image correction information to the image.
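The per-pixel correction of claim 5 amounts to remapping each output pixel by its measured shift. The sketch below uses a hypothetical nearest-neighbor remap over an integer shift map; the patent does not specify the interpolation method or the concrete form of the image correction information.

```python
import numpy as np

def build_correction_map(height, width, shift_map):
    """From a per-pixel shift amount (dy, dx), build source sampling
    coordinates that undo the positional shift (claim 5 sketch)."""
    yy, xx = np.meshgrid(np.arange(height), np.arange(width), indexing="ij")
    src_y = np.clip(yy - shift_map[..., 0], 0, height - 1)
    src_x = np.clip(xx - shift_map[..., 1], 0, width - 1)
    return src_y.astype(int), src_x.astype(int)

def apply_correction(image, correction):
    """Resample the image at the corrected coordinates."""
    src_y, src_x = correction
    return image[src_y, src_x]

# Example: every pixel is shifted one column to the right, so the
# correction samples each pixel from one column to the left.
img = np.arange(16, dtype=float).reshape(4, 4)
shift = np.zeros((4, 4, 2))
shift[..., 1] = 1.0
corrected = apply_correction(img, build_correction_map(4, 4, shift))
```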
PCT/JP2013/060189 2012-07-30 2013-04-03 Endoscope system WO2014020943A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2012168533 2012-07-30
JP2012-168533 2012-07-30

Publications (1)

Publication Number Publication Date
WO2014020943A1 true WO2014020943A1 (en) 2014-02-06

Family

ID=50027640

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2013/060189 WO2014020943A1 (en) 2012-07-30 2013-04-03 Endoscope system

Country Status (1)

Country Link
WO (1) WO2014020943A1 (en)

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007526014A * 2003-06-23 2007-09-13 Microvision, Inc. Scanning endoscope
JP2008514342A * 2004-10-01 2008-05-08 University of Washington Remapping method to reduce image distortion
JP2010515947A * 2007-01-10 2010-05-13 University of Washington Calibration of the scanning beam device
JP2010148764A (en) * 2008-12-26 2010-07-08 Hoya Corp Optical scanning endoscope apparatus, optical scanning endoscope, and optical scanning endoscope processor
JP2010148769A (en) * 2008-12-26 2010-07-08 Hoya Corp Optical scanning endoscope apparatus, optical scanning endoscope, and optical scanning endoscope processor
JP2010158414A (en) * 2009-01-08 2010-07-22 Hoya Corp Processor and apparatus of optical scanning type endoscope
JP2010268972A (en) * 2009-05-21 2010-12-02 Hoya Corp Medical observation system and processor
JP2011004920A (en) * 2009-06-25 2011-01-13 Hoya Corp Endoscope apparatus
JP2011004929A (en) * 2009-06-25 2011-01-13 Hoya Corp Endoscope apparatus


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2016106829A * 2014-12-05 2016-06-20 Olympus Corporation Optical-scanning observation system
JP2017000379A * 2015-06-09 2017-01-05 Olympus Corporation Scanning endoscope system and scanning endoscope calibration method
WO2017037781A1 * 2015-08-28 2017-03-09 Olympus Corporation Scanning-type observation device
JPWO2017037781A1 * 2015-08-28 2018-06-14 Olympus Corporation Scanning-type observation device
WO2018116464A1 * 2016-12-22 2018-06-28 Olympus Corporation Scanning image acquisition device and scanning image acquisition system
US10859816B2 2016-12-22 2020-12-08 Olympus Corporation Scanning-type image acquisition device and scanning-type image acquisition system
CN110613510A * 2018-06-19 2019-12-27 Tsinghua University Self-projection endoscope device
CN110613510B * 2018-06-19 2020-07-21 Tsinghua University Self-projection endoscope device

Similar Documents

Publication Publication Date Title
JP5530577B1 (en) Scanning endoscope system
JP5490331B1 (en) Endoscope system
JP5571268B1 (en) Scanning endoscope system
JP5702023B2 (en) Scanning endoscope system and method of operating scanning endoscope system
WO2014020943A1 (en) Endoscope system
JP5841513B2 (en) Scanning endoscope system
JP6265781B2 (en) Endoscope system and control method of endoscope system
JP5974208B1 (en) Optical scanning observation system
US9974432B2 (en) Scanning endoscope apparatus with drive signal correction
JP6381123B2 (en) Optical scanning observation system
US20180289247A1 (en) Endoscope system
WO2022014058A1 (en) Endoscope system, control device, lighting method, and program
JP5639289B2 (en) Scanning endoscope device
JP6437808B2 (en) Optical scanning observation system
JP2015033456A (en) Endoscope system
WO2016017199A1 (en) Optical scanning observation system
JP6599728B2 (en) Scanning endoscope device
WO2016181711A1 (en) Scanning endoscope
JP2018201812A (en) Scanning endoscope apparatus and image generation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13825400

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13825400

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP