GB2632324A - Autofocus during video-microscopy - Google Patents
Autofocus during video-microscopy
- Publication number
- GB2632324A (application GB2311944.9A / GB202311944A)
- Authority
- GB
- United Kingdom
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/24—Base structure
- G02B21/241—Devices for focusing
- G02B21/245—Devices for focusing using auxiliary sources, detectors
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B21/00—Microscopes
- G02B21/24—Base structure
- G02B21/241—Devices for focusing
- G02B21/244—Devices for focusing using image analysis techniques
Landscapes
- Physics & Mathematics (AREA)
- Chemical & Material Sciences (AREA)
- Analytical Chemistry (AREA)
- General Physics & Mathematics (AREA)
- Optics & Photonics (AREA)
- Engineering & Computer Science (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Automatic Focus Adjustment (AREA)
- Microscopes, Condenser (AREA)
Abstract
Autofocusing an object 101 at a microscope 100 involves receiving, using a beam splitter 108, a portion of light from the object at a camera 102 and receiving a remaining portion of the light at a sensor 104. One or more images of the object are captured at the sensor, which is inclined with respect to the orthogonal image plane; the images have a region of the highest level of focus, such that movements of the object result in a change of location of the region of the highest level of focus in the one or more images captured at the sensor. One or more images of the object are captured at the camera and a location of the region of the highest level of focus is determined in the images captured at the sensor. The image plane of the object to the camera is restored by adjusting a refocusing device 106, positioned between the object and the beam splitter, wherein the restoring is based on the determined location of the region of the highest level of focus on the sensor.
Description
Autofocus During Video-Microscopy
TECHNICAL FIELD
The present disclosure relates to a method and system for automatic focusing during continuous observation of moving objects for viewing or video-recording. Particularly, but not exclusively, it relates to a method for maintaining focus while undertaking optical video-microscopy in living subjects, for medical applications.
BACKGROUND
During video-microscopy of living animal or human subjects, movements vary the object plane and must be accommodated in real time to maintain focus. The surface of an object can usually be located by a range-finding sensor (active autofocus), with the application of an offset for objects at constant separation from the surface.
However, active autofocus is inappropriate for locating objects such as blood vessels, which have variable depth below the surface, particularly when there is simultaneous motion orthogonal to the optical axis. In these circumstances, the focal plane must be identified by adjusting optical power, while sampling the image, until sharpness is optimised (passive autofocus). Such sample images usually give no information about the required power and direction of refocusing. This has to be provided by randomly altering the focal plane, then re-sampling to establish whether sharpness has improved or deteriorated: a process known as "hunting". As improvement and deterioration are equally likely, this method is incompatible with continuous observation or video-recording.
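The hunting process can be sketched as a naive hill-climb. The toy Python model below (the function names and the parabolic sharpness profile are purely illustrative, not taken from the disclosure) shows why roughly half of all trial refocusings degrade the live image, which is what makes hunting incompatible with continuous recording:

```python
import random

def hunt_focus(sharpness_at, power, step=0.1, iters=20):
    """Naive passive-autofocus 'hunting': perturb the optical power in a
    random direction, re-sample sharpness, and keep the change only if it
    improved. Illustrative toy model only."""
    best = sharpness_at(power)
    for _ in range(iters):
        direction = random.choice((-1, 1))  # refocus direction is unknown a priori
        trial = power + direction * step
        score = sharpness_at(trial)
        if score > best:  # keep the improvements...
            power, best = trial, score
        # ...but roughly half of all trials have degraded the live image
    return power

# Toy sharpness profile peaking at power = 1.0 (hypothetical)
found = hunt_focus(lambda p: -(p - 1.0) ** 2, 0.0)
```

Because rejected trials are still shown to the viewer before being undone, every second frame on average is visibly defocused, in line with the passage above.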
Continuous, precise focusing and near-constant magnification are a particular requirement in microscopic clinical examinations, such as the imaging of microcirculations (e.g. haemoglobin video imaging).
Accordingly, the ability to maintain focus during video-microscopy is desirable.
An object of the present invention is to mitigate some of the deficiencies of the prior art mentioned above.
SUMMARY OF THE INVENTION
Aspects and embodiments of the invention provide an improved method and system which enable an object imaged through a microscope to remain in focus despite movements of the object being imaged. The invention provides a refocusing device (such as an adjustable lens) in which the optical power of the refocusing device is adjusted in near synchrony with the object's continuous motion to maintain focus and allow quantitative analysis. For example, the refocusing device may be formed using lenses with nesting concave and convex surfaces, where the power of the adjustable lens is altered by moving one or both of the concave and convex surfaces such that the air-gap between them is increased or decreased. Further, the system is discrete from the body of the microscope and can attach to any standard microscope.
According to an aspect of the invention there is provided a method for autofocusing an object at a microscope, the method comprising: receiving, using a beam splitter, a portion of light from the object at a camera and receiving a remaining portion of the light at a sensor; capturing one or more images of the object at the sensor wherein the sensor is inclined with respect to the orthogonal image plane, further wherein the one or more images have a region of the highest level of focus, such that movements of the object result in a change of location of the region of the highest level of focus in the one or more images captured at the sensor; capturing one or more images of the object at the camera; determining a location of the region of the highest level of focus in the one or more images captured at the sensor; and restoring the image plane of the object to the camera by adjusting a refocusing device, positioned between the object and the beam splitter, wherein the restoring is based on the determined location of the region of the highest level of focus on the sensor.
This method is able to correct focus at rates in excess of 60.0 Hz, and thus ensures an object being imaged is consistently in focus. A sharp primary image is maintained throughout refocusing with minimal changes in magnification due to the refocusing device. For example, the system is able to focus on a particular object in a transparent medium and maintain focus on that object without shifting to focus objects at different depths. Further, the object can be held in focus by adjustment of the refocusing device, despite small movements in the plane orthogonal to the optical axis.
Optionally, wherein the movements of the object are axial movements.
Optionally, wherein the sensor is a second camera.
Optionally, wherein the restoring is based on the separation between the determined location of the region of the highest level of focus on the sensor and the position on the sensor that is parfocal with the camera.
Optionally wherein the refocusing device is a variable focus air-lens comprising a plano-convex lens nested with a plano-concave lens such that the variable focus air lens provides increasing optical power as the air-gap between the plano-convex lens and the plano-concave lens is increased. Optionally wherein the plano-convex lens and the plano-concave lens are formed of identical glass type and have the same focal length. This setup minimises aberrations and changes in magnification.
Optionally wherein adjusting the variable focus air-lens comprises translating the plano-concave lens and/or plano-convex lens along the optical axis. Optionally wherein the plano-concave lens and/or plano-convex lens is mounted on a high-speed stage. This enables the system to maintain focus by moving in continuous motion with any movement from the object.
Optionally, wherein either or both elements of the variable focus air-lens incorporate other optical components.
Optionally wherein the camera is parfocal with a location on the sensor.
Optionally wherein the inclination of the sensor is adjustable to achieve the required accuracy and focal range.
Optionally wherein the one or more images comprises images from a video stream.
Optionally wherein the portion of light delivered from the object to the refocusing device represents the one or more images in a near-parallel beam.
Optionally wherein the location of the region of the highest level of focus is determined at a frequency sufficient for the object to remain in almost constant focus. This enables the object to remain in focus at the required video rate.
Optionally further comprising stabilising the one or more images in real-time in the plane orthogonal to the optical axis to provide 3-dimensional image stabilisation.
There is also provided a system for autofocusing an object at a microscope, the system comprising: a beam splitter such that a portion of light from the object is received at a camera and a remaining portion of the light is received at a sensor, wherein the sensor is inclined with respect to the orthogonal image plane and configured to capture one or more images of the object wherein each of the one or more images comprise a region of the highest level of focus, such that movements of the object result in a change of location of the region of the highest level of focus in the one or more images captured at the sensor, further wherein the sensor is further configured to determine a location of the region of the highest level of focus in the one or more images captured at the sensor, the camera configured to capture one or more images of the object; and a refocusing device positioned between the object and the beam splitter and configured to restore the image plane of the object to the camera based on the determined location of the region of the highest level of focus on the sensor.
There is also provided a computer readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the following steps in a system comprising a camera configured to receive a portion of light from an object using a beam splitter and a sensor configured to receive a remaining portion of the light using the beam splitter, wherein the sensor is inclined with respect to the orthogonal image plane such that movements of the object result in a change of location of a region of the highest level of focus in the one or more images captured at the sensor, further wherein the sensor and a refocusing device positioned between the object and the beam splitter are used to focus an image at the camera, the steps comprising: receiving a plurality of frames of image data from the sensor; determining a location of the region of the highest level of focus for one or more frames of the plurality of frames of image data; and restoring the image plane of the object to the camera by adjusting the refocusing device, wherein the restoring is based on the determined location of the region of the highest level of focus and a desired location of the region of the highest level of focus.
Optionally wherein the step of determining a location of the region of the highest level of focus for the one or more frames comprises: applying a sharpness operator to the image; creating an array of sub-images; calculating a sharpness score for each sub-image; and determining a maximum sharpness score.
Optionally the steps further comprising smoothing the sharpness scores by fitting a curve and determining the maximum sharpness score from the smoothed sharpness scores. Optionally wherein the sharpness score for each sub-image is calculated using a Roberts Cross operator. Using a Roberts Cross operator provides an optimal trade-off between speed and accuracy.
Optionally the steps further comprising resizing each frame and applying image processing steps to each frame before the step of creating an array of sub-images. Optionally, wherein the image processing steps include one or more of: Gaussian filtering, thresholding, and boundary truncation. Such image processing accounts for noise, specular reflection and boundaries of the object.
Optionally wherein the refocusing device is an adjustable focus air-lens and the optical power of the adjustable focus air-lens is adjusted by changing the relative separation of the convex and concave surfaces of the adjustable focus air-lens.
Other aspects of the invention will be apparent from the appended claim set.
BRIEF DESCRIPTION OF THE DRAWINGS
One or more embodiments of the invention will now be described, by way of example only, with reference to the accompanying drawings, in which: Figure 1a is a schematic representation of the apparatus according to an aspect of the invention; Figure 1b is a schematic representation of the apparatus according to an aspect of the invention; Figures 2a to 2c are representations of the sensor arrangement according to aspects of the invention; Figure 3 is a flowchart of the process of autofocussing a lens according to an aspect of the invention; Figure 4 is a flowchart of the process of determining a location of highest focus according to an aspect of the invention; Figure 5 is an image highlighting the location of highest focus according to an aspect of the invention.
Figure 6 is an image highlighting the determination of a location of highest focus according to an aspect of the invention; and Figures 7a to 7b are images showing the change in location of highest focus according to an aspect of the invention.
DETAILED DESCRIPTION
The present invention provides a device, method and software capable of continuously autofocusing an optical system in real time. Such a device is shown in Figure 1a.
In an infinity-corrected microscope, the focal plane of the objective lens coincides with the object plane, therefore light from the object plane emerges as a parallel beam and is refocused by a second lens onto the image plane. When an object moves towards or away from the focal plane of the objective lens, light from the object is no longer perfectly parallel as it emerges from that lens. Since the power of the second lens is constant, the image plane moves away from the camera sensor. In a manually-focused microscope, the user would move the microscope or object until the object was once again at the focal plane of the microscope objective. As will become apparent from the below description, the present system is able to automatically refocus thus keeping an image in focus.
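As a rough paraxial illustration of the passage above (a sketch, not part of the disclosure), the axial shift of the image plane produced by a small object movement scales with the square of the tube-to-objective focal-length ratio. The focal lengths used here are assumed values chosen only to show the magnitudes involved:

```python
def image_plane_shift(delta_obj_mm, f_obj_mm, f_tube_mm):
    """Paraxial sketch: axial shift of the image plane when the object moves
    a small distance from the front focal plane of the objective.
    Longitudinal magnification = (f_tube / f_obj)**2. Focal lengths are
    illustrative assumptions, not taken from the disclosure."""
    return delta_obj_mm * (f_tube_mm / f_obj_mm) ** 2

# A 2 micron object movement under a hypothetical 4 mm objective with a
# 200 mm tube lens defocuses the image plane by about 5 mm at the camera
shift = image_plane_shift(0.002, 4.0, 200.0)
```

This amplification is why even small involuntary movements of a living subject defocus the camera, and why the system must refocus continuously rather than once.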
Figure 1a shows a schematic of an optical system 100. Such an optical system 100 may be employed in a discrete arm attachable to a microscope. For example, optical system 100 can be attached to a port of an optical microscope or include microscope optics 103.
The optical system 100 can be interposed in the near-parallel beam of any infinity-corrected optical system. Reference to near-parallel throughout should be understood as also comprising parallel.
The optical system 100 comprises a camera 102, a sensor 104, an adjustable focus air lens 106, a beam-splitter 108, optional microscope optics 103, an optional lens 109 and optional convex lenses 110, 111 and 112. The lens 109 is a correcting lens which can reduce variations in magnification during refocusing and enhance the optical performance of the system. The lens 109 is preferably a low powered negative lens. Convex lenses 110, 111 and 112 may be repositioned within the system. In the simplest form of the system where the distance from the beam splitter 108 to the camera 102 and to the centre of the sensor 104 are equal, only lens 110 is included and lenses 111 and 112 may be omitted. The camera 102 is placed orthogonal to the imaging axis and is parfocal with the centre of the sensor 104. The sensor 104 is an imaging component which can detect and locate variations in image sharpness. The sensor 104 can be a second camera. The individual components of the optical system 100 are known commercially available components.
To maintain a focused image at the camera 102 (during video-microscopy for example), information is required about both the direction and the distance of travel of the object. This is achieved by identifying the region of best focus on the sensor 104, which is inclined.
Figure 1b shows the transverse relocation 113 of the point of best focus of an object 101 on the sensor 104 caused by movement 105 of the object 101.
When the object 101 moves away from the sensor 104 as represented by object 101a, the object 101a is brought to a focus in front of the centre of the sensor 104. When the object 101 moves towards the sensor 104 as represented by object 101b, the object 101b is brought to a focus beyond the centre of the sensor 104. In this way, the direction and distance of axial displacement of the object (movement of the object towards or away from the lens along the optical axis) represented by arrow 105 is converted to a measurable transverse parameter represented by arrow 113. Arrow 113 shows the transverse displacement of the band of image that has the clearest focus. It is this transverse displacement that is used to inform the refocusing described below.
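The axial-to-transverse conversion can be sketched with simple geometry. In this sketch the 25 degree inclination echoes the value given later for haemoglobin video imaging, while the 10 mm sensor length is a hypothetical assumption:

```python
import math

def band_shift_mm(delta_z_mm, tilt_deg):
    """Transverse distance the band of best focus travels across a sensor
    inclined tilt_deg from the orthogonal image plane, for an axial
    image-plane shift delta_z (simple geometric sketch)."""
    return delta_z_mm / math.sin(math.radians(tilt_deg))

def focal_range_mm(sensor_length_mm, tilt_deg):
    """Axial range of image planes straddled by a sensor of a given length."""
    return sensor_length_mm * math.sin(math.radians(tilt_deg))

# A shallow tilt amplifies small axial shifts into large, easily measured
# band movements; a steeper tilt trades that sensitivity for focal range
shift = band_shift_mm(0.1, 25.0)      # band movement for a 0.1 mm axial shift
covered = focal_range_mm(10.0, 25.0)  # axial range straddled by a 10 mm sensor
```

The same relations underlie the accuracy-versus-range trade-off discussed for Figure 2a: as the tilt angle shrinks, `band_shift_mm` grows (higher sensitivity) while `focal_range_mm` falls.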
As described in detail below, the adjustable focus air lens comprises a plano-convex lens 106a and plano-concave lens 106b nested together with an air gap between them, with an actuator to control the distance between the two lenses. In a system in which the plano-concave lens 106b is static and the plano-convex lens 106a moves, lens 110 could be incorporated into the plano-concave lens 106b, taking the place of its plano surface. Similarly, lens 109 may be incorporated into the plano surface of the plano-convex lens 106a. Any suitable motor or actuator may be used. A beam from an object being imaged is received at the adjustable focus air lens 106 and the distance, or relative separation, between the two lenses 106a, 106b is altered to restore the beam to parallel or to its previous state when the object is too close to or too far from the object plane of the system 100. For example, increasing the distance between the two lenses 106a, 106b increases the positive optical power. Such a setup provides an image coincident with the plane of the camera 102 such that the image of the object remains in focus even as the object distance changes. The adjustable focus air lens 106 can be adjusted by hand by a user or can be automatically controlled by software as discussed in relation to Figure 4. For example, the adjustable focus air lens 106 may be adjusted by hand in a first step to obtain coarse focus and then subsequently adjusted using software to maintain focus.
The above set up enables an object being imaged to remain in focus throughout the imaging procedure despite changes in the object distance caused by movement of the object.
A near-parallel beam carrying an image of an object 101 is divided into a primary beam and a secondary beam using beam splitter 108 and the respective beams are converged onto a camera 102, and a sensor 104. The camera 102 receives a focused, high-resolution image for analysis. The sensor 104 is arranged so as to encompass a range of image planes which straddles the plane of best focus. The sensor 104 is inclined with respect to the orthogonal image plane, so that any axial displacement of the object is expressed as relocation of best focus on the sensor 104. Because the sensor 104 is inclined, axial movement of the object 101 causes the location of best focus at the sensor 104 to be moved. For example, axial movement of the object 101 causes transverse relocation of the region of best focus on the sensor 104. The centre of the image plane sensor is usually arranged to be parfocal with the camera. The sensor 104 is thus used to determine the current focal plane in order to refocus the image on to the camera 102. The image is refocused by moving one or both elements of the adjustable focus air lens 106, thereby adjusting the distance between the convex surface of 106a and the concave surface of 106b.
The adjustable focus air lens 106 is a specific example of a refocussing means and any suitable refocusing device may be used. Alternative adjustable focus lenses such as liquid lenses may be used, alternatively the object, the microscope or its objective may be moved.
The plano-convex lens 106a and the plano-concave lens 106b are of the same glass type and have the same focal length. As such, this combination of lenses has no optical power when the lenses 106a, 106b are apposed but develops increasing positive optical power as the air-gap between them is enlarged. Such a lens system minimises aberrations and changes in magnification.
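Under a thin-lens approximation (a sketch under assumed values, not the disclosed design data), the combined power of the nested pair reduces to gap/f²: zero when the elements are apposed, growing linearly and positively with the air gap, exactly as described above:

```python
def air_lens_power(gap_mm, f_mm):
    """Thin-lens sketch of the nested air lens: a plano-convex element of
    power +1/f apposed to a plano-concave element of power -1/f.
    Combined power P = P1 + P2 - d*P1*P2 reduces to d / f**2."""
    p1, p2 = 1.0 / f_mm, -1.0 / f_mm
    return p1 + p2 - gap_mm * p1 * p2

# f = 50 mm is an illustrative focal length, not taken from the disclosure
apposed = air_lens_power(0.0, 50.0)  # no optical power when apposed
opened = air_lens_power(2.0, 50.0)   # power grows linearly with the gap
```

The linear gap-to-power relation is also what makes the lens easy to drive from a displacement measurement: the commanded stage translation is simply proportional to the required change in power.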
Either or both of the lenses 106a, 106b can be moved along the optical axis to adjust the distance between them. Such movement can be carried out by a stage operated by a piezo motor. During use, one or both of the lenses 106a or 106b is in almost continuous motion, being translated along the optical axis to a separation between lenses 106a, 106b that keeps the image plane coincident with the plane of the camera 102.
The image beam, which has been corrected using the refocusing device (for example, restored to parallel), then passes through the beam splitter 108. The beam splitter 108 may be a cube beam splitter in which there is asymmetrical partition of light. The beam splitter 108 preferably splits the image beam into a beam for high-resolution imaging at the camera 102 and a secondary beam for determination of the focal plane at the sensor 104, optionally at a ratio of 10:1, such that the majority of light from the beam is transmitted to the camera 102.
The emergent beam from the adjustable focus air lens 106 can be converged onto the camera 102 and/or sensor 104 by one or more convex lenses 110, 111, 112. For example, a common convex lens 110 may be placed before the beam splitter 108. Alternatively, one or more convex lenses may be placed after the beam splitter 108 such as convex lenses 111, 112. Common convex lens 110 can be combined with one or more separate lenses 111, 112 in optical system 100.
System 100 receives a near-parallel beam from an object 101 which is passed through the adjustable focus air lens 106 and split at beam splitter 108 such that a portion of the beam is directed to the camera 102 and another portion of the beam is directed to the sensor 104. The sensor 104 is positioned to intersect the image plane at an angle which causes the image plane to be represented by a band of best focus (i.e., an area with the highest level of focus) on the sensor 104. Changes in the object distance move this band across the sensor 104 at right angles to its axis of inclination. The location of the band, on one side or the other of the position on the sensor known to be parfocal with the camera 102, is identified and tracked.
Any image plane can be selected using the camera 102, and its parfocal location on the sensor becomes the location to which the band of best focus is returned to maintain focus.
In some examples, the system may comprise optics in an optical device to which the autofocus arm is attached, in which a beam-splitter divides the image beam, directing a portion of the light from the object 101 towards the refocusing device 106 and leaving another portion within the microscope optics 103, such that one portion of light from the object 101 is examined separately and the remaining portion is provided to the refocusing device 106.
The direction and distance the band of best focus has moved from its parfocal location informs the change in optical power required to restore a sharp primary image at the camera 102. The adjustable focus air lens 106 (or alternative refocusing device) is then adjusted, for example using a motor controllable manually, by a user or by software based on displacement of the band of highest focus. Changes in power of the refocusing device or adjustable focus air lens are repeatedly instructed, restoring the image plane to the location on the sensor 104 that is parfocal with the camera 102. The process of controlling the adjustable focus air lens 106 using software to control the motor is described in further detail below.
The sensor 104 can identify the direction and optical power required for refocusing, with no hunting, in any optical autofocus system: a still or video camera; a standard or video microscope; an optical telescope. It can be used for any frame rate and is, therefore, suitable for real-time, slow-motion and time-lapse video photography and microscopy.
The sensor 104 can be positioned in a number of different configurations as shown in Figures 2a to 2c.
In Figure 2a, the angle the sensor 104 makes with the orthogonal plane 210b is varied, determining a trade-off between accuracy versus focal range. Small angles provide increased sensitivity and accuracy; large angles provide increased focal range. An angle of inclination of 25° has been found to be appropriate for haemoglobin video imaging (also known as HVI), but this can be varied according to microscopic tasks being undertaken.
The sensor 104 may also be repositioned in the plane orthogonal to the optical axis. This is advantageous where the areas with steep intensity gradients are most numerous at a distance from the centre of the image. Alternatively, the beam can be steered to place the area of the image richest in information about the location of the focal plane at the centre of the sensor 104.
In Figure 2b, the inclined plane 220 of the sensor 104 is rotated around the optical axis in any meridian. In this configuration, the sensor 104 is ideally tilted to analyse images at right angles to the meridian in which the majority of edges are best aligned. This meridian can be discovered by rotating the sensor while estimating sharpness at its centre. Rotation of the sensor 104 enables alignment of bands of best focus and linear structures in the image e.g., vessels during haemoglobin video imaging.
In Figure 2c, the image of the object is made to be curved in space and transected by a flat sensor; alternatively, a flat image is focused onto a curved sensor. The chosen image plane is now defined by the diameter of the ring of best focus 240. Changes in the diameter of the ring 240 inform the power of refocussing that is required. This configuration caters for asymmetrical orientation of contrast in the image. It also has the advantage that the angle between the sensor and the orthogonal plane varies: shallow centrally (i.e., high sensitivity); steep peripherally (i.e., large focal range). Therefore, reference to 'band' of best focus throughout can be understood also as 'ring' of best focus when the configuration of Figure 2c is implemented.
Figure 3 is a flowchart representing a method of autofocusing an object using a computer program product which controls the adjustable focus air lens 106.
Whilst the optical system 100 described with reference to Figures 1 and 2 can function without the use of software (due to manual adjustment of the adjustable focus air lens 106), the motor of the adjustable focus air lens 106 can be controlled using software. Such software enables the system to maintain focus automatically and at the frame rate of the sensor 104 to process the frames in real time. Beneficially, the use of the software also allows for the focus to be maintained for extended periods of time. When the invention is used in medical applications, such as ophthalmic applications, the software can beneficially maintain the focus for longer periods than would otherwise be achieved manually.
At step S302, the computer receives image data from the sensor 104. Such image data is received in real-time and the image frames are preferably in full-resolution. The transfer of the image data to the software occurs in a known manner.
At step S304, the computer determines the location of best focus from the received image data for each frame of image data. This step is discussed in more detail in relation to Figure 4. The band of best focus is a band on the image with the highest contrast at the sensor 104. The location of best focus is the position of this band at the sensor 104 and identifies the current focal plane at the sensor 104. The location of best focus can be seen, for example, in Figure 5 represented by band of best focus 502. This step is preferably executed as close as possible to the frame rate of the sensor 104 to process the image frames in real-time so that the images are in focus throughout the imaging process. For example, the determination may be performed at a frequency of 60.0 Hz, the operating frequency of the sensor 104, such that the location of best focus is calculated for each frame and the resultant image remains in constant focus.
Once the computer has determined the location of best focus, it instructs the adjustable focus air lens 106 to move at step S306 based on the determined location of best focus and the desired location of best focus. This is achieved using a motor, for example a stepper motor or piezo motor, which the computer controls such that the air gap in the adjustable focus air lens 106 is altered to correct the beam. This adjustment refocuses the image at the camera 102 and is done for each frame of image data. An example of this is illustrated in Figures 7a and 7b discussed below.
Figure 7a shows the desired location of best focus of an image at the centre of the image as represented by band 710. The centre of the image is preferably chosen as the location of best focus as this is typically where the desired object data would be located when setting up the system. Alternative locations can be selected if a different focal plane is desired. During imaging, the focus of the camera may shift such that the highest level of focus is provided at band 720. The software determines the distance between the desired location of best focus at band 710 and the provided band of best focus at band 720 of Figure 7b and instructs the lens to move such that the band of best focus is restored to location of band 710.
The desired location of best focus can be automatically set to the centre of the sensor 104 or can be defined by the user at an alternative location. This allows the system to maintain focus in situations where there are overlapping planes of interest.
During the final step S306 of instructing the adjustable focus air lens 106 to move, the lens 106a or 106b moves in near synchrony with the object's continuous motion to maintain focus, by instructing the motor of the adjustable focus air lens 106 and operating at the frequency of the sensor 104. This is particularly important for optical microscopes used to image a patient's eye. To achieve this, a high-frequency control loop with minimal feedback delay is used. The steps in Figure 3 preferably execute concurrently at close to 60 Hz. In other examples the steps can be executed at alternative frequencies dependent on the device as long as the live feed is constantly in focus; the example of 60 Hz is thus non-limiting.
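The acquire-locate-correct cycle running at the sensor frequency can be sketched as follows. The callables `grab_frame`, `locate_best_focus`, and `move_lens` are placeholders for the sensor read-out, the Figure 4 analysis, and the motor command respectively; the pacing scheme is an assumption, as the patent only requires the loop to keep up with the frame rate:

```python
import time

FRAME_PERIOD_S = 1 / 60.0  # sensor operating at 60 Hz

def focus_loop(grab_frame, locate_best_focus, move_lens, desired_px, n_frames):
    """Run the acquire -> locate -> correct cycle once per sensor frame,
    sleeping out the remainder of each 60 Hz frame period."""
    for _ in range(n_frames):
        t0 = time.monotonic()
        frame = grab_frame()
        band_px = locate_best_focus(frame)
        move_lens(desired_px - band_px)  # signed pixel offset to correct
        elapsed = time.monotonic() - t0
        if elapsed < FRAME_PERIOD_S:
            time.sleep(FRAME_PERIOD_S - elapsed)
```

A real implementation would likely drive the loop from the sensor's frame-ready interrupt rather than a sleep timer, to minimise feedback delay.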
Figure 4, with reference to Figures 5 and 6, shows the preferred steps performed in determining the location of the band of highest focus.
At step S402, each frame of image data received by the computer at step S302 is resized to allow frame-by-frame processing to execute at speeds in excess of 60 frames per second. For example, each frame may be resized to 640x480 pixels, or other predetermined image sizes.
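A cheap nearest-neighbour downsample is one way to perform this resize without pulling in an imaging library; the patent does not specify the interpolation method, so this is an illustrative sketch:

```python
import numpy as np

def resize_nearest(frame: np.ndarray, out_h: int, out_w: int) -> np.ndarray:
    """Nearest-neighbour resize: sample a regular grid of source pixels
    so per-frame processing can stay above 60 fps."""
    h, w = frame.shape[:2]
    rows = np.arange(out_h) * h // out_h  # source row for each output row
    cols = np.arange(out_w) * w // out_w  # source column for each output column
    return frame[rows[:, None], cols]
```

Nearest-neighbour sampling is the fastest option; a box or area filter would suppress aliasing at some extra cost per frame.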
At step S404, image processing steps are applied to each image frame. Such image processing steps include, for example, gaussian filtering to handle noise in the data, thresholding to handle specular reflection, and boundary truncation to deal with boundaries of the object being imaged such as boundaries of the eye. The skilled person understands that any known image processing techniques can be applied to the image data as required by the situation to achieve the same conditions for each image of the image data and to ensure the band of best focus can be identified (e.g., by applying image processing techniques which increase the contrast in the image).
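Two of the preprocessing steps named above, Gaussian filtering for noise and thresholding for specular reflection, can be sketched in plain NumPy. The function names, sigma, and threshold values are illustrative assumptions rather than the patent's implementation:

```python
import numpy as np

def gaussian_blur(img: np.ndarray, sigma: float = 1.0) -> np.ndarray:
    """Separable Gaussian smoothing to suppress sensor noise."""
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    kernel = np.exp(-x**2 / (2 * sigma**2))
    kernel /= kernel.sum()
    # Convolve rows, then columns (the 2D Gaussian is separable).
    rows = np.apply_along_axis(
        lambda r: np.convolve(r, kernel, mode="same"), 1, img.astype(float))
    return np.apply_along_axis(
        lambda c: np.convolve(c, kernel, mode="same"), 0, rows)

def clip_specular(img: np.ndarray, threshold: int = 250) -> np.ndarray:
    """Clamp saturated specular highlights so they do not dominate
    the later sharpness analysis."""
    out = img.copy()
    out[out > threshold] = threshold
    return out
```

Production code would more likely use an optimised library routine for the blur; the point here is only the order and purpose of the operations.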
Another optional part of step S404 is to apply an operator to calculate a "sharpness image", where the sharpness of each pixel in the image is given. Alternatively, the operator can be applied at a later stage in the process of Figure 4. In some examples, a Roberts Cross operator is used to calculate the sharpness scores based on neighbouring pixel intensity gradients. Advantageously, this method provides an optimal trade-off between speed and accuracy. The skilled person understands that alternative known methods can be used to calculate the sharpness scores to determine the intensity of a boundary or an edge in an image. For example, other operators include a Canny, Sobel, or Deriche operator. The sharpness score therefore can be used as a measure of the focus, as a well-defined boundary in an image is a good indication that the image is in focus. Furthermore, when imaging body parts, such as the eye, the boundary of a blood vessel does not show significant variation over short periods of time and thus using the score of the boundary provides a measure of the focus. A consistent sharpness score across multiple images would also indicate that the focus across the same images is the same.
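The Roberts Cross operator mentioned above is small enough to sketch directly: it convolves the image with two 2x2 diagonal-difference kernels and takes the gradient magnitude as the per-pixel sharpness. A minimal NumPy version, assuming a single-channel image:

```python
import numpy as np

def roberts_sharpness(img: np.ndarray) -> np.ndarray:
    """Per-pixel 'sharpness image' from the Roberts Cross diagonal
    gradients; strong edges (well-focused detail) score highly."""
    f = img.astype(float)
    gx = f[:-1, :-1] - f[1:, 1:]   # kernel [[+1, 0], [0, -1]]
    gy = f[:-1, 1:] - f[1:, :-1]   # kernel [[0, +1], [-1, 0]]
    return np.hypot(gx, gy)        # gradient magnitude, shape (H-1, W-1)
```

The 2x2 support is why the operator is fast; larger operators such as Sobel trade speed for smoother gradient estimates, matching the trade-off discussed in the text.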
Preferably, at step S406, an array of sub-images is created from the sharpness image using a sliding window. The sub-images can be 16 pixels wide about each horizontal pixel. For example, for a 640x480 image, 624 sub-images of resolution 16x480 could be created. In further embodiments other sizes of sub-images can be used.
At step S408, a sharpness score is calculated for each sub-image. This can be achieved by taking the mean across each sub-image, or by applying the sharpness operator to each sub-image at this stage. This provides a curve of sharpness scores for each pixel along the horizontal 504 as shown in Figure 5. This curve approximates a bell curve. Known image sharpness scores, or metrics, which provide an objective measure between boundaries can be used.
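Because each sub-image spans the full image height, taking its mean reduces to a moving average over per-column means of the sharpness image, which can be sketched as follows. The window width of 16 follows the example in the text; one score is produced per valid window position:

```python
import numpy as np

def sharpness_curve(sharp_img: np.ndarray, window: int = 16) -> np.ndarray:
    """Mean sharpness of each window-wide, full-height sub-image,
    giving one score per horizontal window position."""
    col_means = sharp_img.mean(axis=0)          # mean sharpness per column
    kernel = np.ones(window) / window           # box filter = window mean
    return np.convolve(col_means, kernel, mode="valid")
```

Expressing the sub-image means as a 1-D convolution avoids materialising the array of sub-images, which helps keep the per-frame cost within the 60 fps budget.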
The maximum of this curve 504 corresponds to the band of best focus 502 as shown by image 500 in Figure 5. However, in some cases, the initial curve can have significant local maxima near particular dark areas of the image due to bias in the edge-based algorithm.
For example, such local maxima occur near particularly dark or vertical microvasculature when imaging an eye. This is shown in Figure 6, where curve 608 shows a number of local maxima. This can cause the algorithm to incorrectly favour certain locations in the image as the location of highest focus, such as band 602, which is incorrectly identified.
To overcome this, the sharpness score data is optionally smoothed at step S410. The data can be smoothed by fitting a gaussian curve using a least-squares calculation as shown by curve 606 in the image 600 in Figure 6. The maximum of curve 606 is determined at step S412 and taken as the location of best focus 604 in the image. This step ensures that small movements of the object plane always correspond proportionally to small movements of the location of best focus. Other smoothing operators or algorithms may be used. This determined location of best focus 604 is then used to control the motor: the computer calculates the difference between the determined and desired locations and controls the motor accordingly, such that the distance between the lenses in the adjustable focus air lens 106 is adjusted to move the location of best focus back to the desired location.
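One way to realise the least-squares Gaussian fit is the log-parabola trick: fitting a parabola to the logarithm of the scores is equivalent to fitting a Gaussian, and the parabola's vertex gives a sub-pixel peak location. This is an assumed method; the patent specifies only a least-squares Gaussian fit:

```python
import numpy as np

def gaussian_peak(scores: np.ndarray) -> float:
    """Locate the band of best focus by a least-squares Gaussian fit.
    A parabola fitted to log(scores) corresponds to a Gaussian fit of
    the scores themselves; its vertex is the sub-pixel peak location."""
    x = np.arange(len(scores), dtype=float)
    y = np.log(np.maximum(scores, 1e-12))  # guard against log(0)
    c2, c1, _ = np.polyfit(x, y, 2)        # y ~ c2*x^2 + c1*x + c0
    return -c1 / (2.0 * c2)                # vertex of the fitted parabola
```

Because the fit uses all window scores, isolated local maxima from dark vasculature pull the estimated peak only slightly, which is exactly the robustness the smoothing step is intended to provide. A weighted fit (emphasising high scores) would further reduce the influence of the curve's tails.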
The present invention therefore provides an apparatus which is able to maintain the focus of an image. The use of the sensor to monitor the image plane, and the refocussing device to restore the image plane to the camera, allows the system to compensate for any axial motion of the object without significant loss in image quality at the camera. Furthermore, when coupled with the focussing software the system can maintain focus for extended periods of time. When the invention is used to examine blood flow, for example, the ability to maintain focus for extended periods can aid image analysis and provides a powerful diagnostic tool. The present invention is also particularly advantageous when used in ophthalmology and other situations where small involuntary movements make maintaining focus over an extended period of time difficult.
Claims (24)
- 1. A method for autofocusing an object at a microscope, the method comprising: receiving, using a beam splitter, a portion of light from the object at a camera and receiving a remaining portion of the light at a sensor; capturing one or more images of the object at the sensor wherein the sensor is inclined with respect to the orthogonal image plane, further wherein the one or more images have a region of the highest level of focus, such that movements of the object result in a change of location of the region of the highest level of focus in the one or more images captured at the sensor; capturing one or more images of the object at the camera; determining a location of the region of the highest level of focus in the one or more images captured at the sensor; and restoring the image plane of the object to the camera by adjusting a refocusing device, positioned between the object and the beam splitter, wherein the restoring is based on the determined location of the region of the highest level of focus on the sensor.
- 2. The method of claim 1, wherein the movements of the object are axial movements.
- 3. The method of any preceding claim, wherein the sensor is a second camera.
- 4. The method of any preceding claim wherein the restoring is based on the separation between the determined location of the region of the highest level of focus on the sensor and the position on the sensor that is parfocal with the camera.
- 5. The method of any preceding claim wherein the refocusing device is a variable focus air-lens comprising a plano-convex lens nested with a plano-concave lens such that the variable focus air-lens provides increasing optical power as the air-gap between the plano-convex lens and the plano-concave lens is increased.
- 6. The method of claim 5 wherein the plano-convex lens and the plano-concave lens are formed of identical glass type and have the same focal length.
- 7. The method of claim 5 or 6 wherein adjusting the variable focus air-lens comprises translating the plano-concave lens and/or the plano-convex lens along the optical axis.
- 8. The method of any of claims 5 to 7 wherein the plano-concave lens and/or plano-convex lens is mounted on a high-speed stage.
- 9. The method of any of claims 5 to 8 wherein either or both elements of the variable focus air-lens incorporate other optical components.
- 10. The method of any preceding claim wherein the camera is parfocal with a location on the sensor.
- 11. The method of any preceding claim wherein the inclination of the sensor is adjustable to achieve the required accuracy and focal range.
- 12. The method of any preceding claim wherein the one or more images comprises images from a video stream.
- 13. The method of any preceding claim wherein the portion of light delivered from the object to the refocusing device represents the one or more images in a near-parallel beam.
- 14. The method of any preceding claim wherein the location of the region of the highest level of focus is determined at a frequency sufficient for the object to remain in almost constant focus.
- 15. The method of any preceding claim further comprising stabilising the one or more images in real-time in the plane orthogonal to the optical axis to provide 3-dimensional image stabilisation.
- 16. A system for autofocusing an object at a microscope, the system comprising: a beam splitter such that a portion of light from the object is received at a camera and a remaining portion of the light is received at a sensor, wherein the sensor is inclined with respect to the orthogonal image plane and configured to capture one or more images of the object wherein each of the one or more images comprise a region of the highest level of focus, such that movements of the object result in a change of location of the region of the highest level of focus in the one or more images captured at the sensor, further wherein the sensor is further configured to determine a location of the region of the highest level of focus in the one or more images captured at the sensor, the camera configured to capture one or more images of the object; and a refocusing device positioned between the object and the beam splitter and configured to restore the image plane of the object to the camera based on the determined location of the region of the highest level of focus on the sensor.
- 17. The system of claim 16 further configured to carry out the method of any of claims 2 to 15.
- 18. A computer readable medium comprising instructions which, when executed by a computer, cause the computer to carry out the following steps, in a system comprising: a camera configured to receive a portion of light from an object using a beam splitter and a sensor configured to receive a remaining portion of the light using the beam splitter, wherein the sensor is inclined with respect to the orthogonal image plane such that movements of the object result in a change of location of a region of the highest level of focus in the one or more images captured at the sensor, further wherein the sensor and a refocusing device positioned between the object and the beam splitter are used to focus an image at the camera, the steps comprising: receiving a plurality of frames of image data from the sensor; determining a location of the region of the highest level of focus for one or more frames of the plurality of frames of image data; and restoring the image plane of the object to the camera by adjusting the refocusing device, wherein the restoring is based on the determined location of the region of the highest level of focus and a desired location of the region of the highest level of focus.
- 19. The computer readable medium of claim 18, wherein the step of determining a location of the region of the highest level of focus for the one or more frames, comprises: applying a sharpness operator to the image; creating an array of sub-images; calculating a sharpness score for each sub-image; and determining a maximum sharpness score.
- 20. The computer readable medium of claim 19, further comprising smoothing the sharpness scores by fitting a curve and determining the maximum sharpness score from the smoothed sharpness scores.
- 21. The computer readable medium of claim 19 or 20, further comprising resizing each frame and applying image processing steps to each frame before the step of creating an array of sub-images.
- 22. The computer readable medium of claim 21, wherein the image processing steps include one or more of: gaussian filtering, thresholding, and boundary truncation.
- 23. The computer readable medium of any of claims 19 to 22, wherein the sharpness score for each sub-image is calculated using a Roberts Cross operator.
- 24. The computer readable medium of any of claims 18 to 23, wherein the refocusing device is an adjustable focus air-lens and the optical power of the adjustable focus air-lens is adjusted by changing the relative separation of the convex and concave surfaces of the adjustable focus air-lens.
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| GB2311944.9A GB2632324A (en) | 2023-08-03 | 2023-08-03 | Autofocus during video-microscopy |
| PCT/EP2024/072061 WO2025027199A1 (en) | 2023-08-03 | 2024-08-02 | Autofocus during video-microscopy |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| GB202311944D0 GB202311944D0 (en) | 2023-09-20 |
| GB2632324A true GB2632324A (en) | 2025-02-05 |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20180275388A1 (en) * | 2015-09-24 | 2018-09-27 | Leica Biosystems Imaging, Inc. | Real-time focusing in line scan imaging |
| US20190285835A1 (en) * | 2018-03-14 | 2019-09-19 | Nanotronics Imaging, Inc. | Systems, devices and methods for automatic microscopic focus |
Family Cites Families (6)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPS6239376Y2 (en) * | 1980-09-16 | 1987-10-07 | ||
| JP4370554B2 (en) * | 2002-06-14 | 2009-11-25 | 株式会社ニコン | Autofocus device and microscope with autofocus |
| WO2005010495A2 (en) * | 2003-07-22 | 2005-02-03 | Trestle Corporation | System and method for generating digital images of a microscope slide |
| DE102007038579A1 (en) * | 2007-08-16 | 2009-02-19 | Carl Zeiss Microimaging Gmbh | Microscope with internal focus |
| JP5938401B2 (en) * | 2010-06-24 | 2016-06-22 | コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. | Autofocus for scanning microscopy based on differential measurements |
| US20170363864A1 (en) * | 2014-12-02 | 2017-12-21 | H. Jay Margolis | Dynamic Microscope with Combined Optical Focus and Aberrational Correction |