WO2025188609A1 - Methods and systems for assessment of biological specimen adequacy
- Publication number
- WO2025188609A1 (PCT Application No. PCT/US2025/018115)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- sample
- adequacy
- biological sample
- tissue
- cell
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/693—Acquisition
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/774—Generating sets of training patterns; Bootstrap methods, e.g. bagging or boosting
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/77—Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
- G06V10/776—Validation; Performance evaluation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/60—Type of objects
- G06V20/69—Microscopic objects, e.g. biological cells or cellular parts
- G06V20/698—Matching; Classification
Definitions
- aspects of the present invention relate to systems and methods of analysis of imaging and assessment of biological samples to detect one or more features of adequacy, specifically the automated assessment of an unstained biological specimen utilizing Deep-Ultraviolet Microscopy to identify one or more features of adequacy.
- Biological samples, in particular cell and tissue samples from biopsies or aspirations, are vital for the diagnosis, staging, and monitoring of a number of disease conditions.
- bone marrow aspirations are used to identify hematologic conditions and cancers, including leukemia, aplastic anemia, sickle cell disease, and metastasis of solid tumors.
- approximately 1 mL of bone marrow aspirate is collected via an aspiration needle inserted into the marrow cavity. While this is a relatively simple process, the precise positioning of the needle in the bone marrow space is important to obtain diagnostically adequate samples, indicated by the presence of bony spicules in the aspirate.
- the technician performs a gross, visual inspection of the unstained slide for the presence of spicules, and if necessary, asks the physician to reposition the needle and collect another sample.
- This protocol has been shown to improve the success rate of bone marrow aspiration procedures as compared to those performed without any bedside technician.
- this naked-eye examination of bone marrow smears is still imprecise and, in some cases, may lead to excess aspirate being drawn from a patient.
- not all hospitals may have the resources to have an additional trained technician at the bedside during every aspiration.
- endobronchial ultrasound is a minimally invasive procedure that uses a bronchoscope with an ultrasound device to examine the lungs and nearby lymph nodes and can help practitioners diagnose lung cancer, infections, and other lung diseases.
- a pulmonologist inserts a flexible tube with an ultrasound probe through the mouth and into the lungs.
- the ultrasound creates images of the lungs and lymph nodes, allowing the doctor to identify areas of concern and take tissue samples.
- the ability of a practitioner to know if they have obtained a diagnostically adequate sample is limited by current technology.
- EUS-FNA/B: endoscopic ultrasound-guided fine needle aspiration/biopsy
- EUS- FNA/B is a diagnostic tool that enables safe and accurate tissue sampling critical to the diagnosis, staging, and treatment planning for many diseases or conditions, including gastrointestinal and oncologic conditions, such as pancreatic cancer among others.
- EUS-FNA/B is a minimally invasive technique that combines the high-resolution imaging capabilities of endoscopic ultrasound with the ability to obtain tissue samples for cytological and histological analysis.
- each sample, or pass, is collected by advancing the needle into the target lesion under endoscopic ultrasound guidance, moving it back and forth within the lesion to collect cells or tissue, and then withdrawing it.
- EUS-FNA/B has significantly improved the accuracy of diagnosing and staging cancers, and in particular pancreatic cancer, evaluating mediastinal and abdominal lymph nodes, and assessing submucosal lesions of the gastrointestinal tract. Typically, multiple passes are collected during a procedure to increase the odds of a diagnostic specimen. While advances in EUS technology and needle design have reduced the rate of non-diagnostic procedures, sample inadequacy and false negatives remain a significant burden.
- ROSE: rapid on-site evaluation
- Typical ROSE protocol involves a cytopathologist or cytotechnologist staining the prepared slides and visualizing the specimen using light microscopy. While ROSE can increase the success rate of these procedures, most procedures are performed without ROSE due to staff shortages and logistical challenges caused by the need for cytology personnel to be present during the procedure. To the best of our knowledge, ROSE is not typically implemented for bone marrow aspiration procedures. ROSE significantly reduces the sample inadequacy rate; however, resource and personnel limitations prevent the use of ROSE for all procedures.
- the present invention describes systems and methods for utilizing a Deep-Ultraviolet (UV) imaging device for the automated detection of one or more features of adequacy in a biological sample, such as a biopsy, aspirate or other cell or tissue sample.
- a biological sample, such as a biopsy, aspirate or other cell or tissue sample, can be collected via fine-needle aspiration (FNA) or EUS-FNA/B.
- the present invention can employ UV microscopy that enables label-free, high-resolution, and quantitative imaging by leveraging unique biomolecular absorption properties in the UV region of the spectrum (200-300 nm).
- the present invention includes systems and methods for the automated UV microscopy imaging of a biological sample, wherein the image is assessed by one or more hardware processors configured by machine-readable instructions that automatically detect one or more features of adequacy in a biological sample.
- a biological sample to be tested for one or more features of adequacy can be selected from: a tissue aspirate sample, a surgical tissue biopsy sample, a cell sample, a FNA tissue sample, an EUS-FNA tissue sample, an EUS-FNB tissue sample, or an endobronchial ultrasound (EBUS) sample.
- the present invention includes methods of applying machine learning to detect and analyze characteristics of cells, other cellular components, or non-cellular components from a biological sample that can indicate specimen adequacy.
- a reference sample set that is known to include a biological specimen adequate for diagnosis of a disease or condition is imaged utilizing a UV microscopy device.
- the reference images may be transmitted to one or more processors, or other similar data processing devices or systems, where a feature of adequacy may be extracted. This extraction may be accomplished in a preferred embodiment by a machine learning system, such as a Convolutional neural network (ConvNet) module.
- a test sample is imaged utilizing a UV microscopy device, and the image is transmitted to one or more processors, or other similar data processing device or system, where it is analyzed to determine if a feature of adequacy is present. If such a feature of adequacy is present, the biological sample can be used for later diagnosis or processing. If such a feature of adequacy is not present, a second biological sample can be collected and subjected to the analysis steps provided above, as illustrated in the sketch below.
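- The acquire-assess-recollect loop described above could be orchestrated as in the following minimal sketch. This is an assumed illustration only: the function names (acquire_uv_image, detect_adequacy_features) and the maximum number of passes are placeholders, not part of the disclosed system.

```python
# Hypothetical sketch of the acquire -> assess -> accept-or-recollect loop.
# Function names, return types, and max_passes are illustrative assumptions.
from typing import Callable, List
import numpy as np

def assess_until_adequate(
    acquire_uv_image: Callable[[], np.ndarray],                     # images one smear on the UV microscope
    detect_adequacy_features: Callable[[np.ndarray], List[dict]],   # e.g., a trained feature/object detector
    max_passes: int = 3,
) -> dict:
    """Repeat collection and imaging until a feature of adequacy is detected."""
    for attempt in range(1, max_passes + 1):
        image = acquire_uv_image()                  # UV image of the unstained biological sample
        features = detect_adequacy_features(image)  # e.g., spicules or nucleated marrow cells
        if features:                                # adequate: proceed to diagnosis or further processing
            return {"adequate": True, "attempt": attempt, "features": features}
    # no feature of adequacy found: a further sample/pass would be requested
    return {"adequate": False, "attempt": max_passes, "features": []}
```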
- FIG. 1A-D (A) Clinical trial workflow including optional bedside examination of unstained smears, proposed visual UV inspection, and gold standard pathologist inspection of stained slides. (B) Computer rendering of the portable, LED-based UV microscope used in the clinical trial. (C-D) Pseudo-colorized UV microscopy image of an unstained bone marrow aspirate smear (C) and corresponding brightfield microscopy image of the same sample after May-Grünwald/Giemsa staining (D).
- FIG. 2A-B (A) Grayscale and pseudo-colorized UV microscopy images of bone marrow aspirate smears at low (left) and high (right) magnification. Arrows indicate spicules (green), megakaryocytes (yellow), erythroid precursor cells (red), and myeloid precursor cells (blue). Scale bars: 200 µm (left), 25 µm (right). (B) Results from the clinical trial for bedside inspection (left) and visual UV inspection (right) of unstained bone marrow aspirate smears compared to ground truth pathologist examination of stained slides.
- FIG. 3A-C (A) Automated spicule detection algorithm including UV image preprocessing and object detection steps. (B) Sample network output images from images with (left) and without (right) spicules. (C) Results from 5-fold cross validation using images from the clinical trial separated by patient ID compared to ground truth pathologist examination of stained slides.
- FIG. 4A-B (A) Pseudo-colorized UV whole slide scan of an unstained, spiculated bone marrow aspirate smear (top) and a brightfield whole slide scan of the same stained smear (bottom) captured at low (left) and high (middle, right) magnifications. (B) Corresponding whole slide scans for an aspicular bone marrow aspirate smear. Low and high magnification whole slide scans are composed of 225 images in a 15x15 grid with approximately 8% overlap. Scale bars: 3 mm (left), 300 µm (middle), and 100 µm (right).
- FIG. 5 Unstained bone marrow aspirate smear images of a spiculated region at 5X and 40X magnifications. Paired grayscale (top) and pseudo-colorized (bottom) images mimicking May-Grünwald/Giemsa staining are provided.
- FIG. 6A-C (A) Pseudo-colorized UV whole slide scan of a liver fine-needle aspirate (FNA); (B-C) greyscale images of a liver FNA.
- FIG. 7A-B (A) Pseudo-colorized UV whole slide scan of a lymph FNA; (B) greyscale image of a lymph FNA.
- FIG. 8A-B (A) Pseudo-colorized UV whole slide scan of an adenocarcinoma FNA; (B) greyscale image of an adenocarcinoma FNA.
- FIG. 9 Overview of proposed approach to EUS-FNA adequacy assessment utilizing label-free, deep-UV microscopy and machine learning based algorithms.
- FIG. 10 Wide-field unfixed and unstained pseudocolorized UV image of a spiculated bone marrow aspirate (left) and the corresponding white-light bright-field microscopy image after fixing and staining (right). The red arrowheads point to spicules present in the smear. Scale bar: 200 µm. The table shows improved accuracy, sensitivity, and specificity with both manual and automated assessment of adequacy using deep-UV images.
- FIG. 11A-B (A) Bone marrow cell morphology and nuclei are clearly visible in bone marrow aspirate smears imaged with deep-UV microscopy. (B) Visualization and characterization of cellular and subcellular features, including nuclei and granules, enable classification of cells in peripheral blood smears.
- the present invention includes systems and methods of applying machine learning to classify and characterize the adequacy of biological samples, and preferably an untreated or unstained biological sample.
- the machine learning system can include a neural network comprising a convolutional neural network (ConvNet), while in an alternative embodiment the neural network of the invention can include a single-step object detection algorithm, as sketched below.
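- As one illustrative possibility only (the specific network architecture is not prescribed by the disclosure), a small PyTorch ConvNet for a binary adequate/inadequate decision on a UV image patch might look like the following sketch; the layer sizes and two-class output head are assumptions.

```python
# Illustrative ConvNet adequacy classifier; the architecture is an assumption, not the disclosed model.
import torch
import torch.nn as nn

class AdequacyConvNet(nn.Module):
    """Binary classifier (adequate vs. inadequate) for a single-channel UV image patch."""
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),                  # global average pooling -> (B, 64, 1, 1)
        )
        self.classifier = nn.Linear(64, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x).flatten(1)               # (B, 64)
        return self.classifier(x)                     # raw class logits

# Example: a batch of four 256x256 grayscale UV patches
logits = AdequacyConvNet()(torch.randn(4, 1, 256, 256))
print(logits.shape)                                   # torch.Size([4, 2])
```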
- the inventive method includes the step of obtaining a biological sample from a subject for use in the diagnosis of a disease or condition.
- biological sample refers to a sample to be analyzed with the invention as generally described herein.
- a “biological sample” or “sample” may include any sample that may be subjected to UV microscopy, such as Deep UV microscopy.
- a “biological sample” or “sample” may include a cell or tissue sample from a subject, such as a biopsy or aspirate.
- a “biological sample” or “sample” refers to a sample typically derived from a biological fluid, tissue, organ, etc., often taken from an organism suspected of having a condition, such as a disease or disorder, such as an infection.
- a biological sample to be tested for one or more features of adequacy can be selected from: a tissue aspirate sample, a surgical tissue biopsy sample, a cell sample, a FNA tissue sample, an EUS-FNA tissue sample, an EUS-FNB tissue sample, a thyroid FNA sample, a lung biopsy sample, including a transbronchial biopsy sample and a percutaneous lung biopsy sample, a pancreas tissue sample, including a pancreatic FNA sample and an endoscopic ultrasound-guided fine needle biopsy (EUS-FNB) of solid pancreatic lesions, an endobronchial ultrasound (EBUS) sample, sentinel lymph node biopsies, melanoma biopsies, bone marrow aspiration, a liver tissue sample, an intestine tissue sample, a bone tissue sample, a muscle tissue sample, sputum/oral fluid, amniotic fluid, blood, a blood fraction, bone marrow, urine, semen, stool, vaginal fluid
- Biosamples can be obtained from any subject or biological source. Although the sample is often taken from a human subject (e.g., a patient), samples can be taken from any organism, including, but not limited to, mammals (e.g., dogs, cats, horses, goats, sheep, cattle, pigs, etc.), non-mammal higher organisms (e.g., reptiles, amphibians), as well as vertebrates and invertebrates.
- the biological sample includes a cell sample, a biopsy, or fine needle aspiration, preferably from a human subject.
- fine needle aspiration can be used to obtain a biological sample from a variety of different tissues or organs such as: thyroid, thyroid nodules, lymph nodes, breast tissue, bone marrow, lungs, pancreas, kidneys, and abdominal fluid in the peritoneal cavity, or a combination of the same.
- the fine needle aspiration of the invention includes bone marrow fine needle aspiration.
- the biological sample can be obtained via endoscopic ultrasound-guided fine needle biopsy (EUS-FNB) or endoscopic ultrasound-guided fine needle aspiration (EUS-FNA) as generally described herein.
- the invention further includes the use of a deep-ultraviolet (UV) imaging device configured to illuminate the biological sample with UV wavelengths of light and further capture an image of the sample for later processing and analysis.
- a deep-ultraviolet (UV) imaging device can include separate devices for illuminating the sample with UV light, and another separate device to image the sample for later processing and analysis.
- the deep-ultraviolet (UV) imaging device of the invention specifically includes the systems, methods and apparatus for a deep-UV microscope described by Robles et al., U.S. Patent No. 12136284 (which is incorporated herein in its entirety by reference).
- the invention further includes the step of obtaining a test dataset by imaging the biological sample using a deep-ultraviolet (UV) imaging device described above.
- the UV imaging device illuminates the test sample with one or more wavelengths of UV light and further captures an image of the illuminated sample, which can be viewed directly through the UV imaging device or transmitted to a separate computer device having a graphical user interface configured to display the UV illuminated sample.
- test sample is a sample that may be used to generate a test dataset, for example of one or more features of adequacy, which may be qualitatively and/or quantitatively compared to a training dataset as generally described herein.
- the invention further includes the step of identifying in the image one or more features of adequacy.
- the features of adequacy can include features previously identified by a reference set generated by imaging a reference dataset of reference samples that are known to be adequate for diagnosis or further processing, using deep-ultraviolet (UV) microscopy that illuminates the samples forming the reference dataset, and extracting one or more features of adequacy from a plurality of images from the reference dataset.
- a “reference sample” as used herein is a sample that may be used to train a computer learning system, such as by generating a training dataset.
- a “feature,” “feature of adequacy,” “feature of sample adequacy,” or “sample feature” is a feature of a sample that represents a quantifiable and/or observable feature of a cell or object visualized under UV illumination.
- a “feature of interest” may potentially correlate to a clinically relevant condition, or in a preferred embodiment a “feature of interest” may potentially correlate with a sample that is assessed as adequate for downstream diagnosis of a disease or condition, or further processing.
- a feature of interest is a feature that appears in an image of a sample, such as a biological sample, and may be recognized, segmented, and/or classified by a machine learning model.
- Examples of features of interest include components of images of a biological sample; the aforementioned images can characterize objects such as cells of the host (including both normal and abnormal host cells, e.g., tumor and normal somatic cells), red blood cells (nucleated and anucleate), white blood cells, somatic non-blood cells, and the like, non-cell components, and generally any observable particle that can be identified and visualized by UV microscopy or an imaging device.
- a specimen feature presented above can be used as a separate classification for the machine learning systems described herein. Such systems can classify any of these alone or in combination with other examples.
- a feature of adequacy of the invention can include: cell type, cell morphology, non-cellular components, a cell phenotype, a pathogen, a cell genotype, chromatin morphology or content, dead cells, necrotic tissues, tissue fragments, extracellular matrix, subcellular features and/or structures, subcellular granules, an abnormal nucleus/cytoplasm ratio, one or more features described in Table 5, or a combination of the same.
- the feature of adequacy of the invention can include a bony spicule identified under UV illumination in a bone marrow aspiration.
- the biological sample can be further used to diagnose a disease or condition.
- a doctor or other qualified practitioner can evaluate and optionally further process the biological sample. For example, diagnostic indications from fine needle aspirations, apart from the visual identification of spicules in bone marrow aspirates, are not generally visible simply through UV illumination, which preferably serves to determine the adequacy of the sample.
- a cytologist, pathologist or other qualified practitioner can further process the sample, such as by staining the sample, and evaluate the adequacy on site.
- a qualified practitioner can use staining techniques known in the art to identify features, such as a mix of abnormal and normal cells; precancerous and cancerous cells (i.e., adenocarcinoma, metastatic renal cell carcinoma, ductal carcinoma, hepatocellular carcinoma, lymphoma); the presence of neoplastic cells; indications of bacterial infection; the ratio of normal and abnormal white blood cells; and the quantity of red blood cells.
- DSP: digital signal processor; ASIC: application specific integrated circuit; FPGA: field programmable gate array
- a general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine.
- a processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
- a software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of non-transitory, tangible computer-readable storage medium known in the art.
- An exemplary non-transitory, tangible computer-readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the non-transitory, tangible computer-readable storage medium.
- non-transitory, tangible computer-readable storage medium may be integral to the processor.
- the processor and the non-transitory, tangible computer-readable storage medium may reside in an ASIC.
- the ASIC may reside in a user terminal.
- the processor and the non-transitory, tangible computer-readable storage medium may reside as discrete components in a user terminal.
- a software module may be implemented as digital logic components such as those in an FPGA once programmed with the software module.
- one or more of the components or subcomponents described in relation to the computer system shown in the Figures may comprise a cloud computing system.
- front-end systems such as input devices may provide information to back-end platforms such as servers (e.g., computer systems) and storage (e.g., memory).
- software (i.e., middleware)
- SAAS: software-as-a-service
- users may operate software located on back-end servers through the use of a front-end software application such as, but not limited to, a web browser.
- any of the computing systems described herein can be implemented as software components executing on one or more general purpose processors or specially designed processors such as programmable logic devices (e.g., Field Programmable Gate Arrays (FPGAs)) and/or Application Specific Integrated Circuits (ASICs) designed to perform certain functions, or a combination thereof.
- code executed during operation of the systems of the invention can be embodied by a form of software elements which can be stored in a nonvolatile storage medium (such as an optical disk, flash storage device, mobile hard disk, cloud-based systems, etc.).
- Algorithms, machine learning models and/or other computational structures described herein may be implemented on a single device or distributed across multiple devices. The functions of the computational elements may be merged into one another or further split into multiple sub-modules.
- the hardware device of the invention can be any kind of device that can be programmed including, for example, any kind of computer including smart mobile devices (watches, phones, tablets, and the like), personal computers, powerful servers or supercomputers, or the like.
- the device includes one or more processors such as an ASIC or any combination processors, for example, one general purpose processor and two FPGAs.
- the device may be implemented as a combination of hardware and software, such as an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein.
- the system includes at least one hardware component and/or at least one software component. The embodiments described herein could be implemented in pure hardware or partly in hardware and partly in software.
- the disclosed embodiments may be implemented on different hardware devices, for example using a plurality of CPUs equipped with GPUs capable of accelerating and/or coordinating computation.
- Each computational element may be implemented as an organized collection of computer data and instructions.
- System software typically interfaces with computer hardware, typically implemented as one or more processors (e.g., CPUs or ASICs as mentioned) and associated memory.
- the system software includes operating system software and/or firmware, as well as any middleware and drivers installed in the system.
- the system software provides basic non-task-specific functions of the computer.
- the modules and other application software are used to accomplish specific tasks.
- Each native instruction for a module is stored in a memory device and is represented by a numeric value.
- a computational element is implemented as a set of commands prepared by the programmer/developer.
- the module software that can be executed by the computer hardware is executable code committed to memory using “machine codes” selected from the specific machine language instruction set, or “native instructions,” designed into the hardware processor.
- the machine language instruction set, or native instruction set is known to, and essentially built into, the hardware processor(s). This is the “language” by which the system and application software communicates with the hardware processors.
- Each native instruction is a discrete code that is recognized by the processing architecture and that can specify particular registers for arithmetic, addressing, or control functions; particular memory locations or offsets; and particular addressing modes used to interpret operands. More complex operations are built up by combining these simple native instructions, which are executed sequentially, or as otherwise directed by control flow instructions.
- the inter-relationship between the executable software instructions and the hardware processor may be structural.
- the instructions per se may include a series of symbols or numeric values. They do not intrinsically convey any information. It is the processor, which by design was preconfigured to interpret the symbols/numeric values, which imparts meaning to the instructions.
- All of the methods described herein may include storing results of one or more steps of the method embodiments in memory.
- the results may include any of the results described herein and may be stored in any manner known in the art.
- the memory may include any memory described herein or any other suitable storage medium known in the art.
- the results can be accessed in the memory and used by any of the method or system embodiments described herein, formatted for display to a user, used by another software module, method, or system, etc.
- the results may be stored “permanently,” “semi-permanently,” temporarily, or for some period of time.
- the memory may be random access memory (RAM), and the results may not necessarily persist indefinitely in the memory.
- logic and similar implementations may include software or other control structures.
- Electronic circuitry may have one or more paths of electrical current constructed and arranged to implement various functions as described herein.
- one or more media may be configured to bear a device-detectable implementation when such media hold or transmit device-detectable instructions operable to perform as described herein, and preferably transmitted to a mobile device as an audio signal, and even more preferably an inaudible audio signal.
- implementations may include an update or modification of existing software or firmware, or of gate arrays or programmable hardware, such as by performing a reception of or a transmission of one or more instructions in relation to one or more operations described herein.
- an implementation may include special-purpose hardware, software, firmware components, and/or general-purpose components executing or otherwise invoking special-purpose components. Specifications or other implementations may be transmitted by one or more instances of tangible transmission media as described herein, optionally by packet transmission or otherwise by passing through distributed media at various times.
- implementations may include executing a special-purpose instruction sequence or invoking circuitry for enabling, triggering, coordinating, requesting, or otherwise causing one or more occurrences of virtually any functional operations described herein.
- operational or other logical descriptions herein may be expressed as source code and compiled or otherwise invoked as an executable instruction sequence.
- implementations may be provided, in whole or in part, by source code, such as C++, or other code sequences.
- source or other code implementation may be compiled/implemented/translated/converted into a high-level descriptor language (e.g., initially implementing described technologies in C or C++ programming language and thereafter converting the programming language implementation into a logic-synthesizable language implementation, a hardware description language implementation, a hardware design simulation implementation, and/or other such similar mode(s) of expression).
- a logical expression (e.g., a computer programming language implementation)
- a Verilog-type hardware description (e.g., via Hardware Description Language (HDL) and/or Very High Speed Integrated Circuit Hardware Descriptor Language (VHDL))
- Those skilled in the art will recognize how to obtain, configure, and optimize suitable transmission devices or computational elements, material supplies, actuators, or other structures in light of these teachings.
- block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof.
- ASICs: Application Specific Integrated Circuits; FPGAs: Field Programmable Gate Arrays; DSPs: digital signal processors
- Examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a floppy disk, a hard disk drive, a Compact Disc (CD), a Digital Video Disk (DVD), a digital tape, a computer memory, etc.; and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link (e.g., transmitter, receiver, transmission logic, reception logic, etc.), etc.).
- electromechanical systems having a wide range of electrical components such as hardware, software, firmware, and/or virtually any combination thereof; and a wide range of components that may impart mechanical force or motion such as rigid bodies, spring or torsional bodies, hydraulics, electro-magnetically actuated devices, and/or virtually any combination thereof.
- electro-mechanical system includes, but is not limited to, electrical circuitry operably coupled with a transducer (e.g., an actuator, a motor, a piezoelectric crystal, a Micro Electro Mechanical System (MEMS), etc.), electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of memory (e.g., random access, flash, read only, etc.)), electrical circuitry forming a communications device (e.g., a modem, communications switch, optical-electrical equipment, etc.), and/or any
- electro-mechanical systems include but are not limited to a variety of consumer electronics systems, medical devices, as well as other systems such as motorized transport systems, factory automation systems, security systems, and/or communication/computing systems.
- electro-mechanical as used herein is not necessarily limited to a system that has both electrical and mechanical actuation except as context may dictate otherwise.
- electrical circuitry includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of memory (e.g., random access, flash, read only, etc.)), and/or electrical circuitry forming a communications device (e.g., a modem, communications switch, optical-electrical equipment, etc.).
- a data processing system generally includes one or more of a system unit housing, a video display device, memory such as volatile or non-volatile memory, processors such as microprocessors or digital signal processors, computational entities such as operating systems, drivers, graphical user interfaces, and applications programs, one or more interaction devices (e.g., a touch pad, a touch screen, an antenna, etc.), and/or control systems including feedback loops and control motors (e.g., feedback for sensing position and/or velocity; control motors for moving and/or adjusting components and/or quantities).
- a data processing system may be implemented utilizing suitable commercially available components, such as those typically found in data computing/communication and/or network computing/communication systems.
- any two components so associated can also be viewed as being “operably connected,” or “operably coupled,” to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable,” to each other to achieve the desired functionality.
- operably couplable includes but is not limited to physically mateable and/or physically interacting components, and/or wirelessly interactable and/or wirelessly interacting components, and/or logically interacting and/or logically interactable components.
- one or more components may be referred to herein as “configured to,” “configurable to,” “responsive to,” “adapted/adaptable,” “able to,” “conformable/conformed to,” etc.
- configured to can generally encompass active-state components and/or inactive-state components and/or standby-state components, unless context requires otherwise.
- methods described herein can be performed with the individual steps executed in any suitable order, including: the precise order disclosed, without any intermediate steps or with one or more further steps interposed between the disclosed steps; with the disclosed steps performed in an order other than the exact order disclosed; with one or more steps performed simultaneously; and with one or more disclosed steps omitted.
- Bone marrow aspiration is a procedure routinely performed on patients being evaluated for diseases of the blood and bone marrow, including cancers. Evaluating the adequacy of bone marrow aspirates, indicated by the presence of bony spicules in the sample, is critical to ensure that the procedure was performed correctly, and that appropriate diagnostic material was collected.
- aspirate samples are evaluated using Giemsa staining protocols, which require a trained laboratory technician and lengthy processing that can take several hours. If a sample is deemed inadequate, which occurs in approximately 50% of cases, the patient must return to undergo another aspiration procedure. This is particularly problematic for pediatric cases where patients must be fully anesthetized.
- UV microscopy has been demonstrated for hematological analysis by leveraging unique absorption properties of biomolecules in the UV region of the spectrum for high-resolution molecular imaging.
- results from a clinical trial with pediatric oncology patients showing excellent agreement between intraoperative evaluation with our UV microscope and hematopathologist inspection of stained samples.
- a UV-bandpass filter, aligned with the 255nm LED used for illumination, was placed after the objective.
- This filter effectively screened out glass autofluorescence, permitting only the transmission of light at the illumination wavelength.
- a condenser lens module was introduced between the LED and the sample.
- the system incorporated manual translation stages to facilitate user-controlled slide translation and inspection.
- a turret housing 5/40X objective lenses was integrated to enable variable magnification during the inspection of bone marrow aspirate smears. This system was connected to a nearby PC for real-time visualization of samples in grayscale and pseudocolorized formats (Fig. 5), mimicking conventional May-Grunwald/Giemsa staining prior to hematopathologist inspection.
- Sources of error during the clinical trial included incorrect classification of spicules by the technician operating the UV system.
- a pretrained object detection algorithm (YOLOv7) was used for neural network based automated spicule detection.
- This system classifies whether a sample contains spicules and, if present, precisely identifies their location through bounding boxes in real-time.
- the reference sample comprises a dataset of 4289 images collected during the clinical trial involving 51 patients, capturing approximately 400 spicules. Given the dataset imbalance, a subset of images was specifically obtained to represent both the background and spicules adequately.
- Example 2 Evaluation of exemplary bone marrow aspirates.
- bone marrow aspirate images were saved by the lab technician operating the UV microscope and are shown in Figure 4A below, at both low and high magnification.
- spicules can be easily identified within the microscope field-of-view (green arrows) due to inherent UV contrast.
- Real-time pseudo-colorization via a simple colormap adjustment enhances the visual contrast for spicules and mimics the appearance of aspirate smears during pathological examination.
- unique nucleated marrow cells can be distinguished in addition to the larger spicules via strong absorption of 255nm light by nucleic acids.
- Observed cells include megakaryocytes (yellow), erythroid precursor cells (red), and myeloid precursor cells (blue), characterized by nuclear morphology and size. These diagnostically relevant cells are found in bone marrow and can indicate proper aspiration, even in the absence of spicules.
- the bone marrow aspiration success rate (without any intervention from a bedside technician) was 76.5%.
- the effective sensitivity and specificity are 100% and 0% respectively, assuming no assessment on the quality of the acquired aspirate.
- Bedside naked eye assessment of the unstained slides (Fig. 2B) only raised this accuracy to 82.4% with a much higher specificity (66.7%) but lower sensitivity (87.2%). While this technique does offer a slight improvement in aspiration adequacy, it did negatively impact the aspirate procedures.
- the introduction of five false negative assessments corresponds with five patients who underwent an additional and unnecessary aspiration, as their first aspiration was adequate. Results from the visual UV inspection are shown in Fig. 2B.
- The developed workflow for automated spicule detection using the YOLOv7 object detection algorithm is shown in Figure 3A below.
- the field-of-view (FOV) of each input image and patch are 3.1x3.1mm and 0.96x0.96mm, respectively.
- model hyperparameters were optimized, resulting in a Mean Average Precision (MAP) of 66.9%.
- MAP is a common metric used for assessing object detection model performance and the resulting value is very close to the maximum (69.7%) offered by the YOLOv7 algorithm and on par with other YOLOv7 implementations.
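- For reference, mean average precision for a detector of this kind can be computed with off-the-shelf tooling; the snippet below illustrates the mechanics using the torchmetrics package, with made-up boxes and scores rather than trial data.

```python
# Illustration of computing mean average precision (mAP) with torchmetrics.
# The boxes, scores, and labels are made-up stand-ins, not values from the trial.
import torch
from torchmetrics.detection.mean_ap import MeanAveragePrecision

preds = [{
    "boxes": torch.tensor([[50.0, 50.0, 200.0, 180.0]]),   # predicted spicule box (x1, y1, x2, y2)
    "scores": torch.tensor([0.87]),
    "labels": torch.tensor([0]),                            # single "spicule" class
}]
targets = [{
    "boxes": torch.tensor([[55.0, 48.0, 205.0, 185.0]]),    # annotated ground-truth box
    "labels": torch.tensor([0]),
}]

metric = MeanAveragePrecision()
metric.update(preds, targets)
result = metric.compute()
print(result["map"], result["map_50"])                      # COCO-style mAP summaries
```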
- FIG. 4A includes both whole slide scans at low magnification (left), demonstrating excellent concordance between pseudo-colorized UV images and stained brightfield images. Spicules are distinguishable in these macro-scale scans (inside of the green inset border), highlighting the potential for adequacy assessment from automated, low-magnification scans of unstained slides. Higher magnification scans of smaller regions of interest show spicules (middle) and diagnostically relevant marrow cells including megakaryocytes (yellow arrows, right). Our simple pseudo-colorization algorithm mimics conventional staining even at smaller spatial scales. Corresponding images of an aspicular sample are also included (Fig. 4B) showing a lack of significant features at low and high magnification.
- Clinical trial A clinical trial was conducted at the Aflac Sedation Suite at Children’s Healthcare of Atlanta, Egleston Hospital, with pediatric patients undergoing bone marrow aspiration procedures. For each patient, a smear was prepared on a glass slide by a trained laboratory technician in the operating room. This smear was first visually inspected by the technician for the presence of spicules (naked eye examination). Then, the smear was immediately inspected on our UV microscope by a lab technician, who noted the presence or absence of spicules to determine if the aspiration was adequate.
- This portable UV microscope is similar to one presented in previous work and features a deep-UV LED (Boston Electronics) for narrowband 255nm illumination, 5/40X objective lenses (LMU-5X-UVB/LMU-40X-UVB, Thorlabs) for viewing samples at variable magnifications, and manual translation stages for slide inspection.
- This system was connected to a PC with a custom graphical user interface (GUI) for real-time visualization of samples in pseudo-colorized format.
- This colorization mimics conventional May-Grünwald/Giemsa staining via a simple colormap adjustment, without requiring any use of computationally expensive artificial intelligence (AI) algorithms, as illustrated in the sketch below.
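- A minimal sketch of such a colormap-based pseudo-colorization is shown below. The specific color anchors are illustrative assumptions; the actual lookup table used in the described GUI is not reproduced here.

```python
# Sketch of pseudo-colorization via a simple colormap adjustment.
# The color anchors below are assumptions chosen to loosely echo Giemsa tones.
import numpy as np
from matplotlib.colors import LinearSegmentedColormap

# Dark (strongly UV-absorbing, nucleic-acid-rich) regions -> deep purple/blue;
# bright (weakly absorbing) background -> pale pink.
giemsa_like = LinearSegmentedColormap.from_list(
    "giemsa_like",
    [(0.00, "#2b1a6b"),   # strongest absorbers (e.g., nuclei, spicules)
     (0.50, "#b06ab3"),   # intermediate absorbance (e.g., cytoplasm)
     (1.00, "#f7e8ef")],  # background
)

def pseudocolorize(uv_image: np.ndarray) -> np.ndarray:
    """Map a grayscale deep-UV transmission image to an RGB pseudo-color image."""
    img = uv_image.astype(np.float32)
    img = (img - img.min()) / (img.max() - img.min() + 1e-8)   # normalize to [0, 1]
    return giemsa_like(img)[..., :3]                           # drop the alpha channel

rgb = pseudocolorize(np.random.rand(512, 512))                 # stand-in for a UV frame
print(rgb.shape)                                               # (512, 512, 3)
```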
- the YOLOv7 (You Only Look Once) object detection algorithm was adapted to perform rapid object detection and classification in bone marrow aspirate images.
- the YOLOv7 network surpasses most real-time object detectors in both speed and accuracy.
- the acquired dataset comprised approximately 4300 images captured during the clinical trial (across all patients) including images of 356 distinct spicules. These spicules were manually annotated within all images and verified by hematopathologists.
- a randomized 5-fold cross-validation was conducted to generate an aspirate adequacy assessment per clinical trial patient.
- training data comprised 80% of the acquired dataset separated by patient ID, with 20% saved for validation (corresponding with around 10 patients).
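- One common way to implement this kind of patient-wise split (a sketch, not necessarily the exact tooling used in the trial, which describes a randomized split) is scikit-learn's GroupKFold, which guarantees that images from the same patient never appear in both the training and validation folds.

```python
# Patient-grouped 5-fold split sketch using scikit-learn's GroupKFold.
# The image count and per-image patient IDs below are random stand-ins.
import numpy as np
from sklearn.model_selection import GroupKFold

n_images = 4300
image_ids = np.arange(n_images)                         # stand-ins for image file references
patient_ids = np.random.randint(0, 51, size=n_images)   # stand-in patient ID per image

gkf = GroupKFold(n_splits=5)
for fold, (train_idx, val_idx) in enumerate(gkf.split(image_ids, groups=patient_ids)):
    # No patient contributes images to both sides of the split in any fold.
    assert set(patient_ids[train_idx]).isdisjoint(patient_ids[val_idx])
    print(f"fold {fold}: {len(train_idx)} train images, {len(val_idx)} validation images")
```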
- each full size (2048x2048 pixel) image was converted into 640x640 pixel patches with approximately 10% overlap to be compatible with the chosen object detection network.
- the images were then converted into a PyTorch tensor as a compatible input for the model.
- the model’s outputs on all patches were combined to generate a reconstructed image with the predicted bounding boxes. Then, non-maximum suppression was implemented to remove redundant bounding box predictions and improve the object detection accuracy.
- the network bounding box area thresholds were optimized via receiver operating characteristic (ROC) curve analysis and final detected spicule locations were extracted.
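- The patch-and-reassemble inference described above could be implemented along the lines of the following sketch; the tiling geometry is chosen to approximate the stated ~10% overlap, run_detector is a placeholder for the trained model, and the exact YOLOv7 integration code is not reproduced here.

```python
# Sketch: tile a large UV image into 640x640 patches (~10% overlap), map patch-level
# detections back to whole-image coordinates, and suppress redundant boxes with NMS.
# `run_detector` is a placeholder for the trained object detection model.
import torch
from torchvision.ops import nms

PATCH, STRIDE = 640, 576         # 640 - 576 = 64 px overlap (~10% of the patch size)

def _starts(length: int, patch: int = PATCH, stride: int = STRIDE):
    """Patch start positions that tile [0, length) and always reach the far edge."""
    last = max(length - patch, 0)
    starts = list(range(0, last + 1, stride))
    if starts[-1] != last:
        starts.append(last)
    return starts

def detect_whole_image(image: torch.Tensor, run_detector, iou_thresh: float = 0.5):
    """Run the detector per patch of a (1, H, W) image, merge boxes, and apply NMS."""
    _, H, W = image.shape
    all_boxes, all_scores = [], []
    for y0 in _starts(H):
        for x0 in _starts(W):
            patch = image[:, y0:y0 + PATCH, x0:x0 + PATCH]
            boxes, scores = run_detector(patch.unsqueeze(0))   # (N, 4) xyxy in patch coords, (N,)
            if boxes.numel():
                # shift patch-local boxes back to whole-image coordinates
                all_boxes.append(boxes + torch.tensor([x0, y0, x0, y0], dtype=boxes.dtype))
                all_scores.append(scores)
    if not all_boxes:
        return torch.empty(0, 4), torch.empty(0)
    boxes, scores = torch.cat(all_boxes), torch.cat(all_scores)
    keep = nms(boxes, scores, iou_thresh)                      # remove redundant overlapping predictions
    return boxes[keep], scores[keep]
```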
- This network was trained in Python 3.9 with PyTorch 1.11.0 on an NVIDIA GeForce RTX 2080Ti GPU for 100 epochs with a batch size of 8.
- An adequacy assessment was generated per patient ID and then compiled to compare with adequacy assessments from bedside assessment and visual UV inspection.
- each smear was stained and imaged with brightfield microscopy for comparison.
- bone marrow aspirate smears were fixed using methanol (ThermoFisher Scientific) for 15 minutes and stained in May Grunwald solution (MG500, Sigma Aldrich) for 10 minutes.
- PBS: phosphate buffered saline
- the smears were stained with a 1:10 diluted Giemsa solution (GS500, Sigma Aldrich) for 20 minutes.
- samples were washed in PBS again and air-dried prior to imaging with a commercial color brightfield scanner (Cytation 7, Biotek).
- Example 4 Application of endoscopic ultrasound-guided fine needle aspiration/biopsy (EUS-FNA/B).
- one method to improve the diagnostic accuracy and adequacy of EUS-FNA/B is the use of ROSE to provide real-time feedback on samples.
- In procedures performed with ROSE, cytopathology staff perform staining, imaging, and adequacy assessment of each pass in the procedure room. If the sample is deemed inadequate, another pass is collected.
- ROSE also enables triage of limited specimens for follow up analysis by immunohistochemistry, flow cytometry, microbial culture, cytogenetics, or molecular studies. Increasing utilization of cytogenetics to refine diagnostic results furthers the need for efficient and adequate sampling.
- ROSE is used in approximately 75-80% of EUS-FNA/B procedures, but rates can vary greatly amongst different centers. ROSE adoption is significantly lower in Europe and Asia, for example.
- One hurdle for universal implementation of ROSE for EUS-FNA/B is the limited reimbursement provided for the procedure. Reimbursement for performance of ROSE by a pathologist is about $133 per hour, compared to over $600 per hour reimbursement for routine surgical pathology. While low reimbursement has led to reduced implementation of ROSE, a retrospective study of patients who underwent EUS-FNA with or without ROSE demonstrated that ROSE reduced the number of additional EUS-FNA procedures by half, resulting in $252 savings per case (after accounting for the additional cost of performing ROSE).
- Fine Needle Biopsy has been heralded as a solution to improve adequacy rates in EUS procedures without ROSE; however, even with significant design improvements in various recent FNB platforms, the rate of inconclusive samples from solid pancreatic lesions remains as high as 13%, with particular difficulty associated with patients having concurrent pancreatitis or fibrosis.
- the imaging system, characterized by a straightforward optical path, allows cell analysis using images captured from a single wavelength LED light source.
- Pseudo-colorized UV images derived from deep-UV microscopy have demonstrated diagnostic efficacy equivalent to the gold standard Giemsa-stained images.
- Leveraging deep learning-based cell segmentation and classification algorithms, Applicants can analyze these images to categorize various cell types, enabling rapid automated sample assessment without pathology personnel.
- Example 5 Point-of-care device for EUS-FNA/B adequacy assessment.
- the present disclosures include a system for a point-of-care device for EUS-FNA/B adequacy assessment.
- the system incorporates a deep learning framework to develop an analysis algorithm for FNA/B samples.
- the system includes the creation of an annotated dataset of FNA/Bs imaged by deep-UV microscopy which can be applied to a single-step object detection algorithm.
- the algorithm outputs key parameters that can enable the clinician to determine adequacy, including counts of benign cell types (epithelial cells, pancreatic ductal cells, acinar cells, neutrophils, lymphocytes), count of suspected malignant cells (cells with atypia, high nucleus/cytoplasm ratio, dense cytoplasm), and identification of regions of interest (necrosis, fibrosis, stromal fragments). Based on these outputs, a clinician can determine sample adequacy.
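- As a purely hypothetical illustration of how per-object detections might be aggregated into the key parameters listed above, a simple summary step could look like the following; the label names follow the categories above, but the data structures are assumptions.

```python
# Hypothetical aggregation of per-object detections into an adequacy summary.
# Category labels follow the list above; the report structure is an assumption.
from collections import Counter
from typing import Dict, List

BENIGN = {"epithelial_cell", "pancreatic_ductal_cell", "acinar_cell", "neutrophil", "lymphocyte"}
SUSPECT_MALIGNANT = {"atypical_cell", "high_nc_ratio_cell", "dense_cytoplasm_cell"}
REGIONS_OF_INTEREST = {"necrosis", "fibrosis", "stromal_fragment"}

def summarize(detections: List[Dict]) -> Dict:
    """detections: list of {'label': str, 'score': float} entries from an object detector."""
    labels = Counter(d["label"] for d in detections)
    return {
        "benign_counts": {k: labels[k] for k in BENIGN if labels[k]},
        "suspected_malignant_count": sum(labels[k] for k in SUSPECT_MALIGNANT),
        "regions_of_interest": sorted(k for k in REGIONS_OF_INTEREST if labels[k]),
    }

print(summarize([{"label": "acinar_cell", "score": 0.91},
                 {"label": "atypical_cell", "score": 0.83},
                 {"label": "fibrosis", "score": 0.77}]))
```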
- This system incorporates deep-UV imaging and deep learning-based adequacy assessment of biological samples, such as aspirates and biopsies from a human or animal subject.
- traditional analysis of bone marrow aspirate smears involves fixing and then staining with Giemsa or similar dyes, which requires approximately 45 minutes of wait time before the slides are viewed under traditional light microscopy.
- This traditional approach of visualizing peripheral blood smears with stains is comparable to the presently described label-free deep-UV microscopy system and methods which rely on the inherent absorbance of specific UV wavelengths by cellular biomolecules, most notably nucleic acids and proteins.
- Deep-UV microscopy has the benefit of label/stain-free direct visualization with pseudo-colorization that recapitulates traditional Giemsa staining, removing the steps of fixing, staining, and waiting for the slide to dry.
- Figure 10 shows spicules present in a bone marrow aspirate (red arrows) with their characteristic deep blue hue in the unstained pseudocolorized UV image (left), which is nearly identical to the Giemsa-stained slide (right), which takes over 45 min to process.
- myelopoietic cells (e.g., promyelocytes, myelocytes, metamyelocytes, band neutrophils, and lymphocytes)
- erythropoietic cells (e.g., normoblasts)
- Example 6 Identification of lesional, non-lesional, and acellular components of FNA/B specimens using deep-UV microscopy.
- the present disclosures describe a device for rapid adequacy assessment of EUS-FNA/B samples (Figure 1B) which can be configured to use single-wavelength imaging for assessment of EUS-FNA/B quality.
- the prototype uses a 255nm LED followed by two 50mm lenses used as a condenser.
- the power at the sample is controlled electronically with a maximum of 60mW.
- the system further includes a transmission microscope with UV compatible optics, comprising a UV objective, a fused silica 150mm tube lens, a UV filter (255nm bandpass, more on this below), and a UV-sensitive camera.
- the field of view and resolution of the system are approximately 0.5mm and 0.6 µm, respectively, with the 20X objective.
- biological samples can be smeared on regular glass microscope slides, as is currently done in the standard of care, for analysis with the UV imaging system of the disclosure.
- conventionally, it is understood that imaging at wavelengths less than 300 nm must be performed on quartz rather than glass slides, as glass is highly absorbing in that range and also produces strong autofluorescence.
- Applicants have found that sufficient deep-UV light is transmitted through standard 1mm glass microscope slides to enable high contrast deep-UV microscopy using camera integration times of 50-500ms, depending on the magnification.
- glass autofluorescence is removed using a narrow (10nm) bandpass filter centered at 255nm.
- a sample may be adequate for cytological evaluation but insufficient for molecular testing, in which case the adequacy determination would be dependent on whether the clinician required molecular testing results.
- a sample collected from this patient containing significant numbers of pancreatic ductal cells, but no suspected malignant cells may prompt an additional pass to ensure the correct area was sampled, but multiple passes with only normal pancreatic ductal cells may constitute an adequate sample. This approach has the potential to not only reduce the percentage of inadequate samples, but to also reduce the rate of false negative diagnosis.
- Smears can further be performed immediately upon collection of EUS-FNA/B samples and imaged using deep-UV microscopy. After evaluation by UV microscopy, the smears can be stained with Papanicolaou stain following the routine protocol for EUS-FNA/B samples. Deep-UV microscopy is a non-destructive technique, enabling evaluation of the exact same slide using both methods. Images are reviewed by a cytopathologist, who will assess the adequacy of each sample. In this embodiment, sensitivity and specificity are calculated for each quality assessment method relative to the ground truth assessment. When multiple biological samples, such as aspirates or biopsy samples are collected for a single patient, each sample is assessed separately.
- a prevalence, P of approximately 50% can be established as an initial criterion.
- Z the normal distribution value, can be set to 1.96 to correspond with the 95% confidence interval, and W, the maximum acceptable width of the 95% confidence interval, can be set to 10%.
- Sensitivity and specificity are estimated at 90% based on previous data showing nearly identical appearance of the deep-UV and stained smears. These values determined the TP (true positives) + FN (false negatives) and TN (true negatives) + FP (false positives) used for the sample size calculations. Based on these assumptions, a sample size of 70 smears can be used.
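- The stated figure of 70 smears is consistent with a Buderer-style sample-size calculation for sensitivity/specificity studies in which W is used directly as the precision term; the arithmetic below is an interpretation offered for illustration, not a quotation of the protocol's exact formula.

```python
# Sketch of a Buderer-style sample-size calculation for a sensitivity/specificity study.
# Treating W as the precision term with the stated values reproduces ~70 smears;
# this interpretation of W is an assumption.
import math

P, Z, W = 0.50, 1.96, 0.10       # prevalence, normal deviate, confidence-interval precision
Se = Sp = 0.90                   # anticipated sensitivity and specificity

tp_plus_fn = Z**2 * Se * (1 - Se) / W**2   # positive (TP + FN) smears required
tn_plus_fp = Z**2 * Sp * (1 - Sp) / W**2   # negative (TN + FP) smears required

n_for_sensitivity = tp_plus_fn / P         # total smears needed to yield enough positives
n_for_specificity = tn_plus_fp / (1 - P)   # total smears needed to yield enough negatives

n = math.ceil(max(n_for_sensitivity, n_for_specificity))
print(round(tp_plus_fn, 1), n)             # ~34.6 positive smears, n = 70 total
```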
- Example 7 Deep learning-based algorithm to automate the identification and classification of key cell types and features to support adequacy assessment.
- EUS-FNA/B samples have varying requirements for adequacy depending on the lesion type and suspected diagnosis.
- the system of the disclosures can provide the clarifying information to address two critical questions that determine whether an additional pass is necessary: 1) Is identifiable tissue present in the sample, or is the sample acellular or comprised primarily of blood? and 2) Is the sample from the area of interest?
- the Applicant’s system evaluates the cellularity, including the number of suspected malignant cells, and sample features that could hinder cytological assessment. Examples noted above demonstrate the ability to implement deep learning-based object detection algorithms for bone marrow aspirate adequacy assessment and to differentiate cell types in peripheral blood samples.
- a similar deep learning framework can be implemented to develop a classification algorithm for EUS-FNA/B samples, which can include the creation of an annotated dataset of FNA/Bs imaged by deep-UV microscopy and implementation of a single-step object detection algorithm.
- additional data augmentation strategies will be implemented to mitigate the risk of decreased accuracy of the classification algorithm for cell types or features that have limited representation in our dataset.
- Three additional techniques can be applied as needed to achieve more precise sensitivity and specificity, and can include: 1) the synthetic minority oversampling technique (SMOTE); 2) traditional computer vision (CV) augmentation for the minority class; and 3) class weighting (an illustrative class-weighting sketch is shown after this list).
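The 70-smear figure above can be reproduced with a standard Buderer-type sample size calculation. The sketch below is illustrative only and assumes that W is applied directly as the precision term in the denominator and that positive and negative cases are split according to the assumed prevalence; with Se = Sp = 0.90, Z = 1.96, W = 0.10, and P = 0.50 it returns 70.

```python
# Illustrative sketch (not the Applicant's code) of the sample size calculation
# described above. TP + FN and TN + FP are the cases needed to estimate
# sensitivity and specificity to the stated precision, scaled by prevalence.
import math

def buderer_sample_size(se: float, sp: float, prevalence: float,
                        z: float = 1.96, w: float = 0.10) -> int:
    tp_plus_fn = math.ceil(z ** 2 * se * (1 - se) / w ** 2)   # cases needed for sensitivity
    tn_plus_fp = math.ceil(z ** 2 * sp * (1 - sp) / w ** 2)   # controls needed for specificity
    n_for_sensitivity = tp_plus_fn / prevalence
    n_for_specificity = tn_plus_fp / (1 - prevalence)
    return math.ceil(max(n_for_sensitivity, n_for_specificity))

print(buderer_sample_size(se=0.90, sp=0.90, prevalence=0.50))  # -> 70
```

The class-weighting strategy listed as item 3 can likewise be illustrated with a simple inverse-frequency rule, in which under-represented classes receive proportionally larger weights during training. The class names and counts below are hypothetical placeholders, not labels or counts from the Applicant's dataset.

```python
# Illustrative inverse-frequency class weighting for an imbalanced training set.
from collections import Counter

def inverse_frequency_weights(labels):
    counts = Counter(labels)
    n, k = len(labels), len(counts)
    # Each weight is n / (k * count), so rare classes are up-weighted and the
    # weights average to 1 across the dataset.
    return {cls: n / (k * c) for cls, c in counts.items()}

labels = ["ductal"] * 800 + ["suspected_malignant"] * 150 + ["blood_only"] * 50
print(inverse_frequency_weights(labels))
# approximately {'ductal': 0.42, 'suspected_malignant': 2.22, 'blood_only': 6.67}
```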
Abstract
Systems and methods of analysis of imaging and assessment of biological samples to detect one or more features of adequacy, specifically the automated assessment of an unstained biological specimen utilizing Deep-Ultraviolet Microscopy to identify one or more features of adequacy.
Description
METHODS AND SYSTEMS FOR ASSESSMENT OF BIOLOGICAL SPECIMEN ADEQUACY
CROSS-REFERENCE TO RELATED APPLICATION
This International PCT Application claims the benefit of and priority to U.S. Provisional Application No. 63/561,719, filed March 5, 2024, the specification, claims, and drawings of which are incorporated herein by reference in their entirety.
STATEMENT OF GOVERNMENT INTEREST
This invention was made with government support under grant numbers R35GM147437 and R41-EB035057, awarded by the National Institutes of Health, and grant number 1752011, awarded by the National Science Foundation. The government has certain rights in the invention.
TECHNICAL FIELD
Aspects of the present invention relate to systems and methods of analysis of imaging and assessment of biological samples to detect one or more features of adequacy, specifically the automated assessment of an unstained biological specimen utilizing Deep-Ultraviolet Microscopy to identify one or more features of adequacy.
BACKGROUND
Biological samples, in particular cell and tissue samples from biopsies or aspirations, are vital for the diagnosis, staging, and monitoring of a number of disease conditions. In one example, bone marrow aspirations are used to identify hematologic conditions and cancers, including leukemia, aplastic anemia, sickle cell disease, and metastasis of solid tumors. During these procedures, approximately 1 mL of bone marrow aspirate is collected via an aspiration needle inserted into the marrow cavity. While this is a relatively simple process, the precise positioning of the needle in the bone marrow space is important to obtain diagnostically adequate samples, indicated by the presence of bony spicules in the aspirate. Unfortunately, 8-50% of aspirations are unsuccessful due to operator error, hemodilution, or underlying pathologies. Aspirate samples are currently evaluated using May-Grunwald Giemsa staining procedures, which require a trained laboratory technician, biochemical stains, and lengthy processing times before pathologist inspection, which often occurs several days later. If a sample is deemed inadequate, patients must undergo an additional bone marrow aspiration procedure, which is particularly problematic for pediatric patients who require deep sedation, and results in a delay in the diagnosis and monitoring of hematologic conditions.
To improve the success rate of bone marrow aspirations, some clinics have implemented a bedside inspection protocol during which a bone marrow lab technician prepares a bone marrow smear immediately following initial collection of bone marrow. The technician performs a gross, visual inspection of the unstained slide for the presence of spicules, and if necessary, asks the physician to reposition the needle and collect another sample. This protocol has been shown to improve the success rate of bone marrow aspiration procedures as compared to those performed without any bedside technician. However, this naked-eye examination of bone marrow smears is still imprecise and, in some cases, may lead to excess aspirate being drawn from a patient. In addition, not all hospitals may have the resources to have an additional trained technician at the bedside during every aspiration.
In another example, endobronchial ultrasound (EBUS) is a minimally invasive procedure that uses a bronchoscope with an ultrasound device to examine the lungs and nearby lymph nodes and can help practitioners diagnose lung cancer, infections, and other lung diseases. In a typical EBUS procedure, a pulmonologist inserts a flexible tube with an ultrasound probe through the mouth and into the lungs. The ultrasound creates images of the lungs and lymph nodes, allowing the doctor to identify areas of concern and take tissue samples. However, as noted above, the ability of a practitioner to know if they have obtained a diagnostically adequate sample is limited by current technology.
In another example, endoscopic ultrasound-guided fine needle aspiration/biopsy (EUS-FNA/B) is a diagnostic tool that enables safe and accurate tissue sampling critical to the diagnosis, staging, and treatment planning for many diseases or conditions, including gastrointestinal and oncologic conditions, such as pancreatic cancer among others. EUS-FNA/B is a minimally invasive technique that combines the high-resolution imaging capabilities of endoscopic ultrasound with the ability to obtain tissue samples for cytological and histological analysis. When applying EUS-FNA/B to a subject, each sample, or pass, is collected by advancing the needle into the target lesion under endoscopic ultrasound guidance, moving it back and forth within the lesion to collect cells or tissue, and then withdrawing it. EUS-FNA/B has significantly improved the accuracy of diagnosing and staging cancers, and in particular pancreatic cancer, evaluating mediastinal and abdominal lymph nodes, and assessing submucosal lesions of the gastrointestinal tract. Typically, multiple passes are collected during a procedure to increase the odds of a
diagnostic specimen. While advances in EUS technology and needle design have reduced the rate of non-diagnostic procedures, sample inadequacy and false negatives remain a significant burden.
For some biopsy and aspiration procedures, rapid on-site evaluation (ROSE) is used to assess specimen adequacy to provide feedback during the procedure. Typical ROSE protocol involves a cytopathologist or cytotechnologist staining the prepared slides and visualizing the specimen using light microscopy. While ROSE can increase the success rate of these procedures, most procedures are performed without ROSE due to staff shortages and logistical challenges caused by the need for cytology personnel to be present during the procedure. To the best of our knowledge, ROSE is not typically implemented for bone marrow aspiration procedures. ROSE significantly reduces the sample inadequacy rate; however, resource and personnel limitations prevent the use of ROSE for all procedures. Multiple studies have shown that ROSE significantly improves the diagnostic accuracy and adequacy of EUS-FNA for solid cancerous tissues, such as pancreatic lesions while also reducing complications. As a result, there exists a need for improved quality and reliability of EUS-FNA/B procedures by developing a low-cost, point-of-care adequacy assessment instrument that can be used in any setting, including centers currently unable to offer ROSE.
Thus, there is a clinical need for real-time, automated determination of specimen adequacy of biopsy and aspirate samples, among other biological samples.
SUMMARY OF THE INVENTION
In one embodiment, the present invention describes systems and methods for utilizing a Deep-Ultraviolet (UV) imaging device for the automated detection of one or more features of adequacy in a biological sample, such as a biopsy, aspirate, or other cell or tissue sample. In a preferred aspect, the present invention describes systems and methods for utilizing a Deep-Ultraviolet (UV) imaging device for the automated detection of one or more features of adequacy in a biological sample, such as a biopsy, aspirate, or other cell or tissue sample collected via fine-needle aspirate (FNA) or EUS-FNA/B. The present invention can employ UV microscopy that enables label-free, high-resolution, and quantitative imaging by leveraging unique biomolecular absorption properties in the UV region of the spectrum (200-300 nm). In a preferred aspect, the present invention includes systems and methods for the automated UV microscopy imaging of a biological sample, wherein the image is assessed by one or more hardware processors configured by machine-readable instructions that automatically detect one or more features of adequacy in a biological sample. In a preferred aspect, a biological sample to be tested for one or more features of adequacy can be selected from a tissue aspirate sample, a surgical tissue biopsy sample, a cell sample, a FNA tissue sample, an EUS-FNA tissue sample, an EUS-FNB tissue sample, or an endobronchial ultrasound (EBUS) sample.
In another preferred embodiment, the present invention includes methods of applying machine learning to detect and analyze characteristics of cells, other cellular components, or non-cellular components from a biological sample that can indicate specimen adequacy. In this embodiment, a reference sample set that is known to include a biological specimen adequate for diagnosis of a disease or condition is imaged utilizing a UV microscopy device. The reference images may be transmitted to one or more processors, or other similar data processing devices or systems, where a feature of adequacy may be extracted. This extraction may be accomplished in a preferred embodiment by a machine learning system, such as a convolutional neural network (ConvNet) module.
In a preferred embodiment, a test sample is imaged utilizing a UV microscopy device, and the image is transmitted to one or more processors, or other similar data processing device or system, where the image is analyzed to determine if a feature of adequacy is present. If such a feature of adequacy is present, the biological sample can be used for later diagnosis or processing. If such a feature of adequacy is not present, a second biological sample can be collected and subjected to the analysis steps provided above.
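As an illustration of the workflow described in the preceding paragraphs, the following minimal sketch (not the Applicant's implementation) shows how a small ConvNet could score a deep-UV image patch for the presence of a feature of adequacy and how that score could drive the decision to collect an additional sample. The class and function names (TinyAdequacyNet, assess_adequacy) and the architecture are hypothetical; in practice, the model would be trained on the reference dataset described above and applied to whole-slide or multi-patch data.

```python
# Minimal sketch of a ConvNet-based adequacy check, assuming PyTorch is available.
import torch
import torch.nn as nn

class TinyAdequacyNet(nn.Module):
    """Toy ConvNet: grayscale deep-UV patch -> probability that a feature of adequacy is present."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(), nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(16, 1)

    def forward(self, x):
        return torch.sigmoid(self.classifier(self.features(x).flatten(1)))

def assess_adequacy(model: nn.Module, uv_patch: torch.Tensor, threshold: float = 0.5) -> bool:
    """Return True if the model detects a feature of adequacy in a (1, H, W) patch."""
    model.eval()
    with torch.no_grad():
        return model(uv_patch.unsqueeze(0)).item() >= threshold

if __name__ == "__main__":
    model = TinyAdequacyNet()            # in practice, load trained weights here
    uv_patch = torch.rand(1, 256, 256)   # stand-in for an acquired deep-UV image patch
    if assess_adequacy(model, uv_patch):
        print("Feature of adequacy detected: proceed to diagnosis or further processing.")
    else:
        print("No feature of adequacy detected: collect a second biological sample.")
```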
Additional aspects of the invention will be evident from the specification, claims and figures presented herein.
BRIEF DESCRIPTION OF THE FIGURES
Fig. 1A-D. (A) Clinical trial workflow including optional bedside examination of unstained smears, proposed visual UV inspection, and gold standard pathologist inspection of stained slides. (B) Computer rendering of the portable, LED-based UV microscope used in the clinical trial. (C-D) Pseudo-colorized UV microscopy image of an unstained bone marrow aspirate smear (C) and corresponding brightfield microscopy image of the same sample after May-Grunwald/Giemsa staining.
Fig. 2A-B. (A) Grayscale and pseudo-colorized UV microscopy images of bone marrow aspirate smears at low (left) and high (right) magnification. Arrows indicate spicules (green), megakaryocytes (yellow), erythroid precursor cells (red), and myeloid precursor cells. Scale bars:
200 µm (left), 25 µm (right). (B) Results from clinical trial for bedside inspection (left) and visual UV inspection (right) of unstained bone marrow aspirate smears compared to ground truth pathologist examination of stained slides.
Fig. 3A-C. (A) Automated spicule detection algorithm including UV image preprocessing and object detection steps. (B) Sample network output images from images with (left) and without (right) spicules. (C) Results from 5-fold cross validation using images from the clinical trial separated by patient ID compared to ground truth pathologist examination of stained slides.
Fig. 4A-B. (A) Pseudo-colorized UV whole slide scan of an unstained, spiculated bone marrow aspirate smear (top) and a brightfield whole slide scan of the same stained smear (bottom) captured at low (left) and high (middle, right) magnifications. (B) Corresponding whole slide scans for an aspicular bone marrow aspirate smear. Low and high magnification whole slide scans are composed of 225 images in a 15x15 grid with approximately 8% overlap. Scale bars: 3 mm (left), 300 µm (middle), and 100 µm (right).
Fig. 5. Unstained bone marrow aspirate smear images of a spiculated region at 5X and 40X magnifications. Paired grayscale (top) and pseudo-colorized (bottom) images mimicking May-Grunwald/Giemsa staining are provided.
Fig. 6A-C. (A) Pseudo-colorized UV whole slide scan of a liver fine-needle aspirate (FNA); (B-C) greyscale images of a liver FNA.
Fig. 7A-B. (A) Pseudo-colorized UV whole slide scan of a lymph FNA; (B) greyscale image of a lymph FNA.
Fig. 8A-B. (A) Pseudo-colorized UV whole slide scan of an adenocarcinoma FNA; (B) greyscale image of an adenocarcinoma FNA.
Fig. 9. Overview of proposed approach to EUS-FNA adequacy assessment utilizing label-free, deep-UV microscopy and machine learning based algorithms.
Fig. 10. Wide-field unfixed and unstained pseudocolorized UV image of a spiculated bone marrow aspirate (left) and the corresponding white-light bright-field microscopy image after fixing and staining (right). The red arrowheads point to spicules present in the smear. Scale bar = 200 µm. Table shows improved accuracy, sensitivity, and specificity with both manual and automated assessment of adequacy using deep-UV images.
Fig. 11A-B. (A) Bone marrow cell morphology and nuclei are clearly visible in bone marrow aspirate smears imaged with deep-UV microscopy. (B) Visualization and characterization of cellular and subcellular features, including nuclei and granules, enable classification of cells in peripheral blood smears.
DETAILED DESCRIPTION OF THE INVENTION
The embodiments herein and the various features and details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted to avoid unnecessarily obscuring the embodiments herein. Also, the various embodiments described herein are not necessarily mutually exclusive, as some embodiments can be combined with one or more other embodiments to form new embodiments. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein can be practiced and to further enable those skilled in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
The present invention includes systems and methods of applying machine learning to classify and characterize the adequacy of biological samples, and preferably an untreated or unstained biological sample. In certain embodiments, the machine learning system can include a neural network comprising a convolutional neural network (ConvNet), while in alternative embodiments the neural network of the invention can include a single-step object detection algorithm.
The inventive method includes the step of obtaining a biological sample from a subject for use in the diagnosis of a disease or condition. The term “biological sample,” or “sample” refers to a sample to be analyzed with the invention as generally described herein. In addition, as generally used herein a “biological sample” or “sample” may include any sample that may be subjected to UV microscopy, such as deep-UV microscopy. In one preferred embodiment, a “biological sample” or “sample” may include a cell or tissue sample from a subject, such as a biopsy or aspirate. In preferred embodiments, a “biological sample” or “sample” refers to a sample typically derived from a biological fluid, tissue, organ, etc., often taken from an organism suspected of having a condition, such as a disease or disorder, for example an infection.
In a preferred aspect, a biological sample to be tested for one or more features of adequacy can be selected from a tissue aspirate sample, a surgical tissue biopsy sample, a cell sample, a FNA tissue sample, an EUS-FNA tissue sample, an EUS-FNB tissue sample, a thyroid FNA sample, a lung biopsy sample, including a transbronchial biopsy sample and a percutaneous lung biopsy sample, a pancreas tissue sample, including a pancreatic FNA sample and an endoscopic ultrasound-guided fine needle biopsy (EUS-FNB) of solid pancreatic lesions, an endobronchial ultrasound (EBUS) sample, sentinel lymph node biopsies, melanoma biopsies, bone marrow aspiration, a liver tissue sample, an intestine tissue sample, a bone tissue sample, a muscle tissue sample, sputum/oral fluid, amniotic fluid, blood, a blood fraction, bone marrow, urine, semen, stool, vaginal fluid, peritoneal fluid, pleural fluid, tissue explant, organ culture, cell culture, and any other tissue or cell preparation, or fraction or derivative thereof or isolated therefrom.
Biological samples can be obtained from any subject or biological source. Although the sample is often taken from a human subject (e.g., a patient), samples can be taken from any organism, including, but not limited to, mammals (e.g., dogs, cats, horses, goats, sheep, cattle, pigs, etc.), non-mammalian higher organisms (e.g., reptiles, amphibians), as well as vertebrates and invertebrates.
In a preferred embodiment, the biological sample includes a cell sample, a biopsy, or a fine needle aspiration, preferably from a human subject. As noted below, fine needle aspiration can be used to obtain a biological sample from a variety of different tissues or organs, such as: thyroid, thyroid nodules, lymph nodes, breast tissue, bone marrow, lungs, pancreas, kidneys, and abdominal fluid in the peritoneal cavity, or a combination of the same. In one preferred embodiment, the fine needle aspiration of the invention includes bone marrow fine needle aspiration. In alternative embodiments, the biological sample can be obtained via endoscopic ultrasound-guided fine needle biopsy (EUS-FNB) or endoscopic ultrasound-guided fine needle aspiration (EUS-FNA) as generally described herein.
The invention further includes the use of a deep-ultraviolet (UV) imaging device configured to illuminate the biological sample with UV wavelengths of light and further capture an image of the sample for later processing and analysis. As used herein, a deep-ultraviolet (UV) imaging device can include separate devices for illuminating the sample with UV light, and another separate device to image the sample for later processing and analysis. In a preferred embodiment, the deep-ultraviolet (UV) imaging device of the invention specifically includes the systems, methods and apparatus for a deep-UV microscope described by Robles et al., U.S. Patent No. 12136284 (which is incorporated herein in its entirety by reference).
The invention further includes the step of obtaining a test dataset by imaging the biological sample using a deep-ultraviolet (UV) imaging device described above. During this process, the
UV imaging device illuminates the test sample with one or more wavelengths of UV light and further captures an image of the illuminated sample, which can be viewed directly through the UV imaging device or transmitted to a separate computer device having a graphical user interface configured to display the UV illuminated sample. As used herein, a “test sample” is a sample that may be used to generate a test dataset, for example of one or more features of adequacy, which may be qualitatively and/or quantitatively compared to a training dataset as generally described herein.
The invention further includes the step of identifying in the image one or more features of adequacy. As noted above, the features of adequacy can include features previously identified by a reference set generated by imaging a reference dataset of reference samples that are known to be adequate for diagnosis or further processing using deep-ultraviolet (UV) microscopy that illuminates the samples forming the reference dataset and extracting one or more features of adequacy from a plurality of images from the reference dataset. A “reference sample” as used herein is a sample that may be used to train a computer learning system, such as by generating a training dataset.
As used herein, a “feature,” “feature of adequacy,” “feature of sample adequacy,” or “sample feature” is a feature of a sample that represents a quantifiable and/or observable feature of a cell or object visualized under UV illumination. In certain embodiments, a “feature of interest” may potentially correlate to a clinically relevant condition, or in a preferred embodiment a “feature of interest” may potentially correlate with a sample that is assessed as adequate for downstream diagnosis of a disease or condition, or further processing. In certain embodiments, a feature of interest is a feature that appears in an image of a sample, such as a biological sample, and may be recognized, segmented, and/or classified by a machine learning model. Examples of features of interest include components of images of a biological sample; the aforementioned images can characterize objects such as cells of the host (including both normal and abnormal host cells; e.g., tumor and normal somatic cells), red blood cells (nucleated and anucleate), white blood cells, somatic non-blood cells, and the like, non-cell components, and generally any observable particle that can be identified and visualized by UV microscopy or an imaging device. Each of these examples of a specimen feature presented above can be used as a separate classification for the machine learning systems described herein. Such systems can classify any of these alone or in combination with other examples.
As described below, a feature of adequacy of the invention can include: cell type, cell morphology, non-cellular components, a cell phenotype, a pathogen, a cell genotype, chromatin morphology or content, dead cells, necrotic tissues, tissue fragments, extracellular matrix, sub-cellular features and/or structures, sub-cellular granules, sub-cellular abnormal nucleus/cytoplasm ratio, one or more features described in Table 5, or a combination of the same. In one preferred embodiment, the feature of adequacy of the invention can include a bony spicule identified under UV illumination in a bone marrow aspiration.
As noted above, where a feature of adequacy is identified, the biological sample can be further used to diagnose a disease or condition. In this embodiment, a doctor or other qualified practitioner can evaluate and optionally further process the biological sample. For example, diagnostic indications from fine needle aspirations, apart from the visual identification of spicules in bone marrow aspirates, are not generally visible simply through UV illumination, which preferably can determine the adequacy of the sample.
To make or confirm a determination of the adequacy of a sample, or to make a diagnosis based on the sample, a cytologist, pathologist or other qualified practitioner can further process the sample, such as by staining the sample, and evaluate the adequacy on site. In this embodiment, a qualified practitioner can use staining techniques known in the art to identify features, such as a mix of abnormal and normal cells; precancerous and cancerous cells (i.e., adenocarcinoma, metastatic renal cell carcinoma, ductal carcinoma, hepatocellular carcinoma, lymphoma); the presence of neoplastic cells; indications of bacterial infection; ratio of normal and abnormal white blood cells; and the quantity of red blood cells. Those skilled in the art will appreciate that the disclosed process for adequacy assessment is non-destructive, so the assessed sample can be used for other types of testing.
With respect to the computer aided aspects of the invention, those of skill in the art will further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software, or combinations of both. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described
functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure. The various illustrative logical blocks, modules, and circuits described in connection with the embodiments disclosed herein may be implemented or performed with a general-purpose processor, a digital signal processor (DSP), an application specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any combination thereof designed to perform the functions described herein. A general-purpose processor may be a microprocessor, but in the alternative, the processor may be any conventional processor, controller, microcontroller, or state machine. A processor may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP core, or any other such configuration.
The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, a software module implemented as digital logic devices, or in a combination of these. A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of non-transitory, tangible computer-readable storage medium known in the art. An exemplary non-transitory, tangible computer-readable storage medium is coupled to the processor such that the processor can read information from, and write information to, the non-transitory, tangible computer-readable storage medium. In the alternative, the non-transitory, tangible computer-readable storage medium may be integral to the processor. The processor and the non-transitory, tangible computer-readable storage medium may reside in an ASIC. The ASIC may reside in a user terminal. In the alternative, the processor and the non-transitory, tangible computer-readable storage medium may reside as discrete components in a user terminal. In some embodiments, a software module may be implemented as digital logic components such as those in an FPGA once programmed with the software module.
It is further contemplated that one or more of the components or subcomponents described in relation to the computer system shown in the Figures such as, but not limited to, the network, processor, memory, etc., may comprise a cloud computing system. In one such system, front-end systems such as input devices may provide information to back-end platforms
such as servers (e.g., computer systems) and storage (e.g., memory). Software (i.e., middleware) may enable interaction between the front-end and back-end systems, with the back-end system providing services and online network storage to multiple front-end clients. For example, a software-as-a-service (SAAS) model may implement such a cloud-computing system. In such a system, users may operate software located on back-end servers through the use of a front-end software application such as, but not limited to, a web browser.
As described above, any of the computing systems described in herein, whether controlled by end users directly or by a remote entity controlling one or more components of said system of the invention, can be implemented as software components executing on one or more general purpose processors or specially designed processors such as programmable logic devices (e.g., Field Programmable Gate Arrays (FPGAs)) and/or Application Specific Integrated Circuits (ASICs) designed to perform certain functions or a combination thereof. In some embodiments, code executed during operation of the systems of the invention (computational elements) can be embodied by a form of software elements which can be stored in a nonvolatile storage medium (such as optical disk, flash storage device, mobile hard disk, cloud-based systems etc. ), including a number of instructions for making a computer device (such as personal computers, servers, network equipment, etc.). Algorithms, machine learning models and/or other computational structures described herein may be implemented on a single device or distributed across multiple devices. The functions of the computational elements may be merged into one another or further split into multiple sub-modules.
The hardware device of the invention can be any kind of device that can be programmed including, for example, any kind of computer including smart mobile devices (watches, phones, tablets, and the like), personal computers, powerful servers or supercomputers, or the like. The device includes one or more processors such as an ASIC or any combination processors, for example, one general purpose processor and two FPGAs. The device may be implemented as a combination of hardware and software, such as an ASIC and an FPGA, or at least one microprocessor and at least one memory with software modules located therein. In various embodiments, the system includes at least one hardware component and/or at least one software component. The embodiments described herein could be implemented in pure hardware or partly in hardware and partly in software. In some cases, the disclosed embodiments may be implemented on different hardware devices, for example using a plurality of CPUs equipped with GPUs capable
of accelerating and/or coordinating computation. Each computational element may be implemented as an organized collection of computer data and instructions. System software typically interfaces with computer hardware, typically implemented as one or more processors (e.g., CPUs or ASICs as mentioned) and associated memory. In certain embodiments, the system software includes operating system software and/or firmware, as well as any middleware and drivers installed in the system. The system software provides basic non-task-specific functions of the computer. In contrast, the modules and other application software are used to accomplish specific tasks. Each native instruction for a module is stored in a memory device and is represented by a numeric value.
At one level a computational element is implemented as a set of commands prepared by the programmer/developer. However, the module software that can be executed by the computer hardware is executable code committed to memory using “machine codes” selected from the specific machine language instruction set, or “native instructions,” designed into the hardware processor. The machine language instruction set, or native instruction set, is known to, and essentially built into, the hardware processor(s). This is the “language” by which the system and application software communicates with the hardware processors. Each native instruction is a discrete code that is recognized by the processing architecture and that can specify particular registers for arithmetic, addressing, or control functions; particular memory locations or offsets; and particular addressing modes used to interpret operands. More complex operations are built up by combining these simple native instructions, which are executed sequentially, or as otherwise directed by control flow instructions.
The inter-relationship between the executable software instructions and the hardware processor may be structural. In other words, the instructions per se may include a series of symbols or numeric values. They do not intrinsically convey any information. It is the processor, which by design was preconfigured to interpret the symbols/numeric values, which imparts meaning to the instructions.
All of the methods described herein may include storing results of one or more steps of the method embodiments in memory. The results may include any of the results described herein and may be stored in any manner known in the art. The memory may include any memory described herein or any other suitable storage medium known in the art. After the results have been stored, the results can be accessed in the memory and used by any of the method or system embodiments
described herein, formatted for display to a user, used by another software module, method, or system, etc. Furthermore, the results may be stored “permanently,” “semi-permanently,” temporarily, or for some period of time. For example, the memory may be random access memory (RAM), and the results may not necessarily persist indefinitely in the memory.
Notably, there are various vehicles by which processes and/or systems and/or other technologies described herein can be effected (e.g., hardware, software, and/or firmware), and the preferred vehicle will vary with the context in which the processes and/or systems and/or other technologies are deployed. For example, if an implementer determines that speed and accuracy are paramount, the implementer may opt for a mainly hardware and/or firmware vehicle; alternatively, if flexibility is paramount, the implementer may opt for a mainly software implementation; or, yet again alternatively, the implementer may opt for some combination of hardware, software, and/or firmware. Hence, there are several possible vehicles by which the processes and/or devices and/or other technologies described herein may be effected, none of which is inherently superior to the other in that any vehicle to be utilized is a choice dependent upon the context in which the vehicle will be deployed and the specific concerns (e.g., speed, flexibility, or predictability) of the implementer, any of which may vary. Those skilled in the art will recognize that optical aspects of implementations will typically employ optically oriented hardware, software, and/or firmware.
In some embodiments described herein, logic and similar implementations may include software or other control structures. Electronic circuitry, for example, may have one or more paths of electrical current constructed and arranged to implement various functions as described herein. In some implementations, one or more media may be configured to bear a device-detectable implementation when such media hold or transmit device-detectable instructions operable to perform as described herein, and preferably transmitted to a mobile device as an audio signal, and even more preferably an inaudible audio signal. In some variants, for example, implementations may include an update or modification of existing software or firmware, or of gate arrays or programmable hardware, such as by performing a reception of or a transmission of one or more instructions in relation to one or more operations described herein. Alternatively, or additionally, in some variants, an implementation may include special-purpose hardware, software, firmware components, and/or general-purpose components executing or otherwise invoking special-purpose components. Specifications or other implementations may be transmitted by one or more instances
of tangible transmission media as described herein, optionally by packet transmission or otherwise by passing through distributed media at various times.
Alternatively, or additionally, implementations may include executing a special-purpose instruction sequence or invoking circuitry for enabling, triggering, coordinating, requesting, or otherwise causing one or more occurrences of virtually any functional operations described herein. In some variants, operational or other logical descriptions herein may be expressed as source code and compiled or otherwise invoked as an executable instruction sequence. In some contexts, for example, implementations may be provided, in whole or in part, by source code, such as C++, or other code sequences.
In other implementations, source or other code implementation, using commercially available and/or techniques in the art, may be compiled/implemented/translated/converted into a high-level descriptor language (e.g., initially implementing described technologies in C or C++ programming language and thereafter converting the programming language implementation into a logic-synthesizable language implementation, a hardware description language implementation, a hardware design simulation implementation, and/or other such similar mode(s) of expression). For example, some or all of a logical expression (e.g., computer programming language implementation) may be manifested as a Verilog-type hardware description (e.g., via Hardware Description Language (HDL) and/or Very High Speed Integrated Circuit Hardware Descriptor Language (VHDL)) or other circuitry model which may then be used to create a physical implementation having hardware (e.g., an Application Specific Integrated Circuit). Those skilled in the art will recognize how to obtain, configure, and optimize suitable transmission devices or computational elements, material supplies, actuators, or other structures in light of these teachings.
The foregoing detailed description has set forth various embodiments of the devices and/or processes via the use of block diagrams, flowcharts, and/or examples. In so far as block diagrams, flowcharts, and/or examples contain one or more functions and/or operations, it will be understood by those within the art that each function and/or operation within such block diagrams, flowcharts, or examples can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, or virtually any combination thereof. In one embodiment, several portions of the subject matter described herein may be implemented via Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs), digital signal processors (DSPs), or other integrated formats. However, those skilled in the art will recognize that some aspects of the
embodiments disclosed herein, in whole or in part, can be equivalently implemented in integrated circuits, as one or more computer programs running on one or more computers (e.g., as one or more programs running on one or more computer systems), as one or more programs running on one or more processors (e.g., as one or more programs running on one or more microprocessors), as firmware, or as virtually any combination thereof, and that designing the circuitry and/or writing the code for the software and or firmware would be well within the skill of one of skill in the art in light of this disclosure. In addition, those skilled in the art will appreciate that the mechanisms of the subject matter described herein are capable of being distributed as a program product in a variety of forms, and that an illustrative embodiment of the subject matter described herein applies regardless of the particular type of signal bearing medium used to actually carry out the distribution. Examples of a signal bearing medium include, but are not limited to, the following: a recordable type medium such as a floppy disk, a hard disk drive, a Compact Disc (CD), a Digital Video Disk (DVD), a digital tape, a computer memory, etc.; and a transmission type medium such as a digital and/or an analog communication medium (e.g., a fiber optic cable, a waveguide, a wired communications link, a wireless communication link (e.g., transmitter, receiver, transmission logic, reception logic, etc.), etc.).
The various embodiments described herein can be implemented by various types of electromechanical systems having a wide range of electrical components such as hardware, software, firmware, and/or virtually any combination thereof; and a wide range of components that may impart mechanical force or motion such as rigid bodies, spring or torsional bodies, hydraulics, electro-magnetically actuated devices, and/or virtually any combination thereof. Consequently, as used herein “electro-mechanical system” includes, but is not limited to, electrical circuitry operably coupled with a transducer (e.g., an actuator, a motor, a piezoelectric crystal, a Micro Electro Mechanical System (MEMS), etc.), electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of memory (e.g., random access, flash, read only, etc.)), electrical circuitry
forming a communications device (e.g., a modem, communications switch, optical-electrical equipment, etc.), and/or any non-electrical analog thereto, such as optical or other analogs.
Those skilled in the art will also appreciate that examples of electro-mechanical systems include but are not limited to a variety of consumer electronics systems, medical devices, as well as other systems such as motorized transport systems, factory automation systems, security systems, and/or communication/computing systems. Those skilled in the art will recognize that electro-mechanical as used herein is not necessarily limited to a system that has both electrical and mechanical actuation except as context may dictate otherwise. The various aspects described herein which can be implemented, individually and/or collectively, by a wide range of hardware, software, firmware, and/or any combination thereof can be viewed as being composed of various types of “electrical circuitry.” Consequently, as used herein “electrical circuitry” includes, but is not limited to, electrical circuitry having at least one discrete electrical circuit, electrical circuitry having at least one integrated circuit, electrical circuitry having at least one application specific integrated circuit, electrical circuitry forming a general purpose computing device configured by a computer program (e.g., a general purpose computer configured by a computer program which at least partially carries out processes and/or devices described herein, or a microprocessor configured by a computer program which at least partially carries out processes and/or devices described herein), electrical circuitry forming a memory device (e.g., forms of memory (e.g., random access, flash, read only, etc.)), and/or electrical circuitry forming a communications device (e.g., a modem, communications switch, optical-electrical equipment, etc.). Those having skill in the art will recognize that the subject matter described herein may be implemented in an analog or digital fashion or some combination thereof.
It should be noted that at least a portion of the devices and/or processes described herein can be integrated into a data processing system. Those having skill in the art will recognize that a data processing system generally includes one or more of a system unit housing, a video display device, memory such as volatile or non-volatile memory, processors such as microprocessors or digital signal processors, computational entities such as operating systems, drivers, graphical user interfaces, and applications programs, one or more interaction devices (e.g., a touch pad, a touch screen, an antenna, etc.), and/or control systems including feedback loops and control motors (e.g., feedback for sensing position and/or velocity; control motors for moving and/or adjusting components and/or quantities). A data processing system may be implemented utilizing suitable
commercially available components, such as those typically found in data computing/communication and/or network computing/communication systems.
Moreover, the herein described components (e.g., operations), devices, objects, and the discussion accompanying them are used as examples for the sake of conceptual clarity, and various configuration modifications are contemplated. Consequently, as used herein, the specific exemplars set forth and the accompanying discussion are intended to be representative of their more general classes. In general, use of any specific exemplar is intended to be representative of its class, and the non-inclusion of specific components (e.g., operations), devices, and objects should not be taken as limiting.
With respect to the use of substantially any plural and/or singular terms herein, those having skill in the art can translate from the plural to the singular and/or from the singular to the plural as is appropriate to the context and/or application. The various singular/plural permutations are not expressly set forth herein for sake of clarity.
The herein described subject matter sometimes illustrates different components contained within, or connected with, different other components. It is to be understood that such depicted architectures are merely exemplary, and that in fact many other architectures may be implemented which achieve the same functionality. In a conceptual sense, any arrangement of components to achieve the same functionality is effectively “responsive” such that the desired functionality is achieved. Hence, any two components herein combined to achieve a particular functionality can be seen as “responsive with” each other such that the desired functionality is achieved, irrespective of architectures or intermedial components. Likewise, any two components so associated can also be viewed as being “operably connected,” or “operably coupled,” to each other to achieve the desired functionality, and any two components capable of being so associated can also be viewed as being “operably couplable,” to each other to achieve the desired functionality. Specific examples of operably couplable include but are not limited to physically mateable and/or physically interacting components, and/or wirelessly interactable, and/or wirelessly interacting components, and/or logically interacting, and/or logically interactable components.
In some instances, one or more components may be referred to herein as “configured to,” “configurable to,” “responsive to,” “adapted/adaptable,” “able to,” “conformable/conformed to,” etc. Those skilled in the art will recognize that such terms (e.g., “configured to”) can generally
encompass active-state components and/or inactive-state components and/or standby-state components, unless context requires otherwise.
While particular aspects of the present subject matter described herein have been shown and described, it will be apparent to those skilled in the art that, based upon the teachings herein, changes and modifications may be made without departing from the subject matter described herein. It will be understood by those within the art that, in general, terms used herein are generally intended as “open” terms (e g., the term “including” should be interpreted as “including but not limited to,” the term “having” should be interpreted as “having at least,” the term “includes” should be interpreted as “includes but is not limited to,” etc.). It will be further understood by those within the art that if a specific number of an introduced claim recitation is intended, such an intent will be explicitly recited in the claim, and in the absence of such recitation no such intent is present. For example, as an aid to understanding, the following appended claims may contain usage of the introductory phrases “at least one” and “one or more” to introduce claim recitations. However, the use of such phrases should not be construed to imply that the introduction of a claim recitation by the indefinite articles “a” or “an” limits any particular claim containing such introduced claim recitation to claims containing only one such recitation, even when the same claim includes the introductory phrases “one or more” or “at least one” and indefinite articles such as “a” or “an” (e.g., “a” and/or “an” should typically be interpreted to mean “at least one” or “one or more”); the same holds true for the use of definite articles used to introduce claim recitations. In addition, even if a specific number of an introduced claim recitation is explicitly recited, those skilled in the art will recognize that such recitation should typically be interpreted to mean at least the recited number (e.g., the bare recitation of “two recitations,” without other modifiers, typically means at least two recitations, or two or more recitations). Furthermore, in those instances where a convention analogous to “at least one of A, B, and C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, and C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together, B and C together, and/or A, B, and C together, etc.). In those instances where a convention analogous to “at least one of A, B, or C, etc.” is used, in general such a construction is intended in the sense one having skill in the art would understand the convention (e.g., “a system having at least one of A, B, or C” would include but not be limited to systems that have A alone, B alone, C alone, A and B together, A and C together,
B and C together, and/or A, B, and C together, etc. ). It will be further understood by those within the art that typically a disjunctive word and/or phrase presenting two or more alternative terms, whether in the description, claims, or drawings, should be understood to contemplate the possibilities of including one of the terms, either of the terms, or both terms unless context dictates otherwise. For example, the phrase “A or B” will be typically understood to include the possibilities of “A” or “B” or “A and B.”
The word “substantially,” as used herein with respect to any property or circumstance, refers to a degree of deviation that is sufficiently small so as to not appreciably detract from the identified property or circumstance. The exact degree of deviation allowable in a given circumstance will depend on the specific context, as would be understood by one having ordinary skill in the art.
Use of the terms “about” or “approximately” is intended to describe values above and/or below a stated value or range, as would be understood by one having ordinary skill in the art in the respective context. In some instances, this may encompass values in a range of approx. +/-10%; in other instances there may be encompassed values in a range of approx. +/-5%; in yet other instances values in a range of approx. +/-2% may be encompassed; and in yet further instances, this may encompass values in a range of approx. +/-1%.
It will be understood that the terms “comprises” and/or “comprising,” when used in this specification, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof, unless indicated herein or otherwise clearly contradicted by context.
Recitations of a value range herein, unless indicated otherwise, serve as a shorthand for referring individually to each separate value falling within the stated range, including the endpoints of the range, each separate value within the range, and all intermediate ranges subsumed by the overall range, with each incorporated into the specification as if individually recited herein.
Unless indicated otherwise, or clearly contradicted by context, methods described herein can be performed with the individual steps executed in any suitable order, including: the precise order disclosed, without any intermediate steps or with one or more further steps interposed between the disclosed steps; with the disclosed steps performed in an order other than the exact
order disclosed; with one or more steps performed simultaneously; and with one or more disclosed steps omitted.
With respect to the appended claims, those skilled in the art will appreciate that recited operations therein may generally be performed in any order. Also, although various operational flows are presented in a sequence(s), it should be understood that the various operations may be performed in other orders than those which are illustrated or may be performed concurrently. Examples of such alternate orderings may include overlapping, interleaved, interrupted, reordered, incremental, preparatory, supplemental, simultaneous, reverse, or other variant orderings, unless context dictates otherwise. Furthermore, terms like “responsive to,” “related to,” or other past tense adjectives are generally not intended to exclude such variants, unless context dictates otherwise.
Although the present technology has been described in detail for the purpose of illustration based on what is currently considered to be the most practical and preferred implementations, it is to be understood that such detail is solely for that purpose and that the technology is not limited to the disclosed implementations, but, on the contrary, is intended to cover modifications and equivalent arrangements that are within the spirit and scope of the appended claims. For example, it is to be understood that the present technology contemplates that, to the extent possible, one or more features of any implementation can be combined with one or more features of any other implementation.
The invention now being generally described will be more readily understood by reference to the following examples, which are included merely for the purposes of illustration of certain aspects of the embodiments of the present invention. The examples are not intended to limit the invention, as one of skill in the art would recognize from the above teachings and the following examples that other techniques and methods can satisfy the claims and can be employed without departing from the scope of the claimed invention.
EXAMPLES
Example 1 : Experimental rationale and result.
Bone marrow aspiration is a procedure routinely performed on patients being evaluated for diseases of the blood and bone marrow, including cancers. Evaluating the adequacy of bone marrow aspirates, indicated by the presence of bony spicules in the sample, is critical to ensure that the procedure was performed correctly, and that appropriate diagnostic material was collected. Currently, aspirate samples are evaluated using Giemsa staining protocols, which require a trained
laboratory technician and lengthy processing that can take several hours. If a sample is deemed inadequate, which occurs in approximately 50% of cases, the patient must return to undergo another aspiration procedure. This is particularly problematic for pediatric cases where patients must be fully anesthetized. Recently, label-free deep ultraviolet (UV) microscopy has been demonstrated for hematological analysis by leveraging unique absorption properties of biomolecules in the UV region of the spectrum for high-resolution molecular imaging. In this work, we present a portable, LED-based UV microscope for rapid imaging and real time inspection of bone marrow aspirates. We discuss results from a clinical trial with pediatric oncology patients showing excellent agreement between intraoperative evaluation with our UV microscope and hematopathologist inspection of stained samples. We also adapt neural networks for fast spicule detection and perform automated whole slide scanning of aspirate smears. Ultimately, we demonstrate the potential of UV microscopy to allow for real-time evaluation of bone marrow aspirate samples, reducing the time, cost, and number of bone marrow aspiration procedures performed in the clinic.
To accommodate UV-opaque glass microscope slides commonly used in the clinic for the preparation of bone marrow aspirate smears, a UV-bandpass filter, aligned with the 255nm LED used for illumination, was placed after the objective. This filter effectively screened out glass autofluorescence, permitting only the transmission of light at the illumination wavelength. To improve illumination power within the microscope’s field of view (FOV) and mitigate glass absorption, a condenser lens module was introduced between the LED and the sample. Furthermore, the system incorporated manual translation stages to facilitate user-controlled slide translation and inspection. A turret housing 5/40X objective lenses was integrated to enable variable magnification during the inspection of bone marrow aspirate smears. This system was connected to a nearby PC for real-time visualization of samples in grayscale and pseudocolorized formats (Fig. 5), mimicking conventional May-Grünwald/Giemsa staining prior to hematopathologist inspection.
To validate the efficacy of the UV microscope system for real-time analysis of bone marrow aspirate adequacy, a clinical trial was conducted at the Aflac Sedation Suite at Children’s Healthcare of Atlanta, Egleston, involving 51 pediatric patients undergoing bone marrow aspiration. For each patient, a smear was prepared following the bone marrow aspiration procedure. This smear was first visually inspected by a bedside technician for the presence of
spicules (naked eye examination). Then, the smear was immediately inspected on our UV microscope by a lab technician, who noted the presence or absence of spicules to determine if the aspiration was adequate. On average, this examination process was completed in less than two minutes. Subsequently, the sample was stained and inspected by a hematopathologist to check for the presence of spicules, establishing a reliable ground truth for comparison. The results from the clinical trial are summarized in Tables 1 and 2 below.
Table 1. Visual UV inspection results
Table 2. Bedside inspection results
Results from the clinical trial comparing UV inspection of unstained slides with ground truth hematopathologist inspection of stained slides demonstrate high accuracy with our portable UV microscope system (94.1%). In comparison, the accuracy of the bedside inspection practice, which is becoming more widely adopted to improve clinical outcomes, was lower (82.3%). These results suggest that real-time feedback from our UV microscope offers an improvement in bone marrow aspirate adequacy assessment as compared to the current clinical gold standard and thus could significantly reduce the number of bone marrow procedures performed.
Automated Spicule Detection
Sources of error during the clinical trial included incorrect classification of spicules by the technician operating the UV system. To mitigate these errors and extend the usability of this device to settings where a trained user is not present, we adapted a pretrained object detection algorithm (YOLOv7) for neural network-based automated spicule detection. This system classifies whether a sample contains spicules and, if present, precisely identifies their location through bounding boxes in real time.
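As a purely illustrative sketch of how such per-field detections could be rolled up into a sample-level adequacy call, the Python snippet below flags a sample as containing spicules when any field of view holds a sufficiently confident, sufficiently large detection. The Detection structure, thresholds, and function names are assumptions for illustration; in this work the operating thresholds were tuned via ROC analysis as described in Example 3.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Detection:
    box: Tuple[float, float, float, float]  # (x1, y1, x2, y2) in pixels
    confidence: float                        # detector confidence score

def sample_contains_spicules(detections_per_fov: List[List[Detection]],
                             conf_threshold: float = 0.5,
                             min_box_area: float = 1000.0) -> bool:
    """Declare the sample adequate if any field of view contains at least one
    spicule detection above the confidence and bounding-box-area thresholds."""
    for fov in detections_per_fov:
        for det in fov:
            x1, y1, x2, y2 = det.box
            area = max(0.0, x2 - x1) * max(0.0, y2 - y1)
            if det.confidence >= conf_threshold and area >= min_box_area:
                return True
    return False
```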
The reference dataset comprised 4289 images collected during the clinical trial involving 51 patients, capturing approximately 400 spicules. Given the dataset imbalance, a subset of images was specifically obtained to represent both the background and spicules adequately. To ensure robust evaluation, a randomized 5-fold cross-validation was conducted, producing an aspirate adequacy assessment for each unique patient ID. The optimization of bounding box and detection thresholds was achieved through an analysis of the Receiver Operating Characteristic (ROC) curve. This approach enhances the reliability and effectiveness of the automated spicule detection system. The results of the adequacy assessment obtained from automated UV analysis are presented in Table 3 below.
Table 3. Automated UV Inspection Results
These results demonstrate that our network can detect spicules from grayscale UV microscopy images with high accuracy, in less than 15ms per image FOV. The 5-fold cross-validation yields superior performance to both manual UV inspection and bedside inspection of unstained slides, indicating the capability for rapid, automated spicule detection and classification from unstained bone marrow aspirate smears.
Example 2: Evaluation of exemplary bone marrow aspirates.
Clinical Trial
During the clinical trial, bone marrow aspirate images were saved by the lab technician operating the UV microscope and are shown in Figure 4A below, at both low and high magnification. At low magnification, spicules can be easily identified within the microscope field-of-view (green arrows) due to inherent UV contrast. Real-time pseudo-colorization via a simple colormap adjustment enhances the visual contrast for spicules and mimics the appearance of aspirate smears during pathological examination. At high magnification, unique nucleated marrow cells can be distinguished in addition to the larger spicules via strong absorption of 255nm light by nucleic acids. Observed cells include megakaryocytes (yellow), erythroid precursor cells (red), and myeloid precursor cells (blue), characterized by nuclear morphology and size. These
diagnostically relevant cells are found in bone marrow and can indicate proper aspiration, even in the absence of spicules.
During the 51-patient clinical trial, the bone marrow aspiration success rate (without any intervention from a bedside technician) was 76.5%. The effective sensitivity and specificity are 100% and 0%, respectively, assuming no assessment of the quality of the acquired aspirate. Bedside naked eye assessment of the unstained slides (Fig. 2B) only raised this accuracy to 82.4% with a much higher specificity (66.7%) but lower sensitivity (87.2%). While this technique offers a slight improvement in aspiration adequacy assessment, it also negatively impacted some aspirate procedures: the introduction of five false negative assessments corresponds with five patients who underwent an additional and unnecessary aspiration, as their first aspiration was adequate. Results from the visual UV inspection (Fig. 2C) demonstrate an improvement in aspiration adequacy assessment compared to current clinical practices with an overall accuracy of 94.1% and sensitivity and specificity of 97.4% and 83.3%, respectively. The proposed procedure minimally impacts current clinical protocols, only requiring around three minutes for UV inspection before the smear is sent to a laboratory for staining and pathologist evaluation, highlighting the potential to significantly reduce the number of bone marrow aspiration procedures performed. The three misclassifications from visual UV inspection were a result of incorrect identification of large RBC clusters within the aspirate smears by the operating lab technician. Indeed, these clusters appear highly absorbing under UV illumination and can be mistaken for bony spicules by a user, thus prompting the integration of real-time object detection and classification networks.
Deep learning algorithm for automated spicule detection
The developed workflow for automated spicule detection using the YOLOv7 object detection algorithm is shown in Figure 3A below. The field-of-view (FOV) of each input image and patch are 3.1x3.1mm and 0.96x0.96mm, respectively. During the algorithm implementation, model hyperparameters were optimized, resulting in a Mean Average Precision (MAP) of 66.9%. MAP is a common metric used for assessing object detection model performance and the resulting value is very close to the maximum (69.7%) offered by the YOLOv7 algorithm and on par with other YOLOv7 implementations.
Output images from the trial (Fig. 3B) clearly demonstrate accurate object detection in images with (left) and without (right) spicules. The resulting bounding boxes are generated at under 100ms per FOV and tightly surround detected spicule regions, distinctly highlighting their
locations. Results from the automated UV inspection (Fig. 3C) demonstrate that the five-fold cross-validation shows an improvement in aspiration adequacy assessment compared to both the bedside inspection and visual UV inspection with an overall accuracy of 95.7% and sensitivity and specificity of 100% and 80.0%, respectively. This analysis only included 47 patients from the trial, however, as images were not collected for four patients by the technicians operating the UV microscope. The primary source of error during the automated adequacy assessment remained dense RBC clusters, which would be mitigated with larger datasets featuring more of these absorbing regions in the training data.
Whole slide scans of unstained and stained samples: Paired UV whole slide scans of unstained smears and brightfield whole slide scans of stained smears were generated for several slides with varying levels of spicularity. Figure 4A includes both whole slide scans at low magnification (left), demonstrating excellent concordance between pseudo-colorized UV images and stained brightfield images. Spicules are distinguishable in these macro-scale scans (inside of green inset border), highlighting the potential for adequacy assessment from automated, low-magnification scans of unstained slides. Higher magnification scans of smaller regions of interest show spicules (middle) and diagnostically relevant marrow cells including megakaryocytes (yellow arrows, right). Our simple pseudo-colorization algorithm mimics conventional staining even at smaller spatial scales. Corresponding images of an aspicular sample are also included (Fig. 4B) showing a lack of significant features at low and high magnification.
Example 3: Materials and Methods.
Clinical trial: A clinical trial was conducted at the Aflac Sedation Suite at Children’s Healthcare of Atlanta, Egleston Hospital, with pediatric patients undergoing bone marrow aspiration procedures. For each patient, a smear was prepared on a glass slide by a trained laboratory technician in the operating room. This smear was first visually inspected by the technician for the presence of spicules (naked eye examination). Then, the smear was immediately inspected on our UV microscope by a lab technician, who noted the presence or absence of spicules to determine if the aspiration was adequate. This portable UV microscope is similar to one presented in previous work and features a deep-UV LED (Boston Electronics) for narrowband 255nm illumination, 5/40X objective lenses (LMU-5X-UVB/LMU-40X-UVB, Thorlabs) for viewing samples at variable magnifications, and manual translation stages for slide inspection. This system was connected to a PC with a custom graphical user interface (GUI) for real-time visualization of
samples in pseudo-colorized format. This colorization mimics conventional May-Grünwald/Giemsa staining via a simple colormap adjustment without requiring any use of computationally expensive artificial intelligence (AI) algorithms. On average, the lab technician’s adequacy assessment (blinded to the bedside naked eye examination results) was completed in under three minutes. Each sample was then stained and inspected by a hematopathologist for the presence of spicules to be used as ground truth as per conventional clinical protocols.
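For illustration only, the sketch below shows one way a single-channel deep-UV image could be pseudo-colorized by a simple colormap adjustment in matplotlib. The specific color stops, normalization, and function names are assumptions and do not reproduce the exact colormap used in this work.

```python
import numpy as np
from matplotlib.colors import LinearSegmentedColormap

# Hypothetical Giemsa-like colormap: light background -> pink cytoplasm ->
# purple/blue nuclei. The actual color stops used in the study are not specified.
giemsa_like = LinearSegmentedColormap.from_list(
    "giemsa_like", ["#f5f2ee", "#e8b6c8", "#9b6bb3", "#3b3b8f", "#1a1a4d"]
)

def pseudocolorize(uv_image: np.ndarray) -> np.ndarray:
    """Map a grayscale deep-UV transmission image to RGB; strongly absorbing
    (dark) regions such as nuclei and spicules map to deep blue/purple hues."""
    img = uv_image.astype(np.float32)
    img = (img - img.min()) / (np.ptp(img) + 1e-8)  # normalize to [0, 1]
    absorbance = 1.0 - img                          # dark pixels = high absorbance
    rgba = giemsa_like(absorbance)                  # (H, W, 4) floats in [0, 1]
    return (rgba[..., :3] * 255).astype(np.uint8)
```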
Statistical analysis of clinical trial results: Three aspirate adequacy assessments were performed for each patient: a bedside naked eye inspection of an unstained slide, UV inspection of an unstained slide, and pathologist inspection of a stained slide. Accuracy, sensitivity, and specificity were calculated for the bedside and UV inspection assessment methods relative to the ground truth pathologist assessment across all patients.
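A minimal sketch of this comparison is given below, assuming each assessment method is reduced to a boolean adequacy call per patient; it simply tabulates the confusion matrix against the pathologist ground truth and reports the three metrics used throughout the Examples.

```python
from typing import Dict, Sequence

def adequacy_metrics(predicted: Sequence[bool],
                     ground_truth: Sequence[bool]) -> Dict[str, float]:
    """Accuracy, sensitivity, and specificity of an adequacy assessment method
    (e.g., bedside or UV inspection) versus pathologist ground truth, where
    True means adequate (spicules present) and False means inadequate."""
    tp = sum(p and g for p, g in zip(predicted, ground_truth))
    tn = sum((not p) and (not g) for p, g in zip(predicted, ground_truth))
    fp = sum(p and (not g) for p, g in zip(predicted, ground_truth))
    fn = sum((not p) and g for p, g in zip(predicted, ground_truth))
    return {
        "accuracy": (tp + tn) / max(tp + tn + fp + fn, 1),
        "sensitivity": tp / max(tp + fn, 1),
        "specificity": tn / max(tn + fp, 1),
    }
```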
Deep learning algorithm for automated spicule detection
To generate real-time aspirate adequacy assessment and improve assessment accuracy, deep-learning algorithms were leveraged for automated spicule detection. The YOLOv7 (You Only Look Once) object detection algorithm was adapted to perform rapid object detection and classification in bone marrow aspirate images. The YOLOv7 network surpasses most real-time object detectors in both speed and accuracy. The acquired dataset comprised approximately 4300 images captured during the clinical trial (across all patients) including images of 356 distinct spicules. These spicules were manually annotated within all images and verified by hematopathologists.
A randomized 5-fold cross-validation was conducted to generate an aspirate adequacy assessment per clinical trial patient. For each model, training data comprised 80% of the acquired dataset separated by patient ID, with 20% saved for validation (corresponding with around 10 patients). During training, each full-size (2048×2048 pixel) image was converted into 640×640 pixel patches with approximately 10% overlap to be compatible with the chosen object detection network. The images were then converted into a PyTorch tensor as a compatible input for the model. The model’s outputs on all patches were combined to generate a reconstructed image with the predicted bounding boxes. Then, non-maximum suppression was implemented to remove redundant bounding box predictions and improve the object detection accuracy. The network bounding box area thresholds were optimized via receiver operating characteristic (ROC) curve analysis and final detected spicule locations were extracted. This network was trained in Python
3.9 with PyTorch 1.11.0 on an NVIDIA GeForce RTX 2080Ti GPU for 100 epochs with a batch size of 8. An adequacy assessment was generated per patient ID and then compiled to compare with adequacy assessments from bedside assessment and visual UV inspection.
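The sketch below illustrates, under stated assumptions, the tiling and non-maximum suppression steps described above: splitting a full-frame image into 640×640 patches with roughly 10% overlap, shifting patch-level boxes back to full-image coordinates, and suppressing redundant predictions with torchvision's NMS. It is not the YOLOv7 training or inference code itself, and edge-tile handling is deliberately simplified.

```python
import torch
from torchvision.ops import nms

def tile_image(image: torch.Tensor, patch: int = 640, overlap: float = 0.10):
    """Split a (C, H, W) tensor into overlapping square patches, returning each
    patch with its (x0, y0) origin so detections can be mapped back later.
    A production pipeline would add final tiles flush with the image edge."""
    h, w = image.shape[-2:]
    stride = int(patch * (1.0 - overlap))
    tiles = []
    for y0 in range(0, max(h - patch, 0) + 1, stride):
        for x0 in range(0, max(w - patch, 0) + 1, stride):
            tiles.append((image[..., y0:y0 + patch, x0:x0 + patch], x0, y0))
    return tiles

def merge_detections(per_patch_boxes, per_patch_scores, origins, iou_thresh=0.5):
    """Shift patch-level (x1, y1, x2, y2) boxes to full-image coordinates and
    remove redundant overlapping predictions with non-maximum suppression."""
    boxes, scores = [], []
    for b, s, (x0, y0) in zip(per_patch_boxes, per_patch_scores, origins):
        if b.numel() == 0:
            continue
        boxes.append(b + torch.tensor([x0, y0, x0, y0], dtype=b.dtype))
        scores.append(s)
    if not boxes:
        return torch.empty((0, 4)), torch.empty((0,))
    boxes, scores = torch.cat(boxes), torch.cat(scores)
    keep = nms(boxes, scores, iou_thresh)
    return boxes[keep], scores[keep]
```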
Whole slide scanning via automated UV microscopy: During the clinical trial, additional smears were prepared from each patient using leftover aspirate samples. These smears were imaged using a previously developed low-cost UV microscope system for automated UV imaging of bone marrow aspirate slides and facile visualization of spicule density. Imaging was performed with 5X and 20X objective lenses corresponding with magnifications required to visualize spicules and individual cells, respectively. Each whole slide scan comprised 225 images captured in a 15x15 grid with approximately 20% overlap. The resulting images were then stitched using a simple, open-source algorithm (Grid collection/stitching plug-in, Fiji) which linearly blends overlapping image regions before display for analysis. For validation of image quality and accuracy of pseudo-colorization algorithms, each smear was stained and imaged with brightfield microscopy for comparison. First, bone marrow aspirate smears were fixed using methanol (ThermoFisher Scientific) for 15 minutes and stained in May-Grünwald solution (MG500, Sigma Aldrich) for 10 minutes. After a brief rinse in phosphate buffered saline (PBS), the smears were stained with a 1:10 diluted Giemsa solution (GS500, Sigma Aldrich) for 20 minutes. Then, samples were washed in PBS again and air-dried prior to imaging with a commercial color brightfield scanner (Cytation 7, Biotek).
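As a simplified stand-in for the Fiji Grid/Collection stitching used here, the sketch below lays out a row-major 15x15 grid of equally sized tiles with approximately 20% overlap and linearly blends overlapping regions using triangular weight masks. Tile registration refinement and intensity matching are omitted; the grid geometry and blending weights are assumptions for illustration.

```python
import numpy as np

def stitch_grid(tiles, grid=(15, 15), overlap=0.20):
    """Blend a row-major list of equally sized grayscale tiles into one mosaic
    using nominal grid positions and linear (feathered) blending in overlaps."""
    rows, cols = grid
    th, tw = tiles[0].shape
    step_y, step_x = int(th * (1 - overlap)), int(tw * (1 - overlap))
    mosaic = np.zeros((step_y * (rows - 1) + th, step_x * (cols - 1) + tw))
    weight = np.zeros_like(mosaic)
    # Triangular weights fall off toward tile edges so overlaps blend smoothly.
    wy = 1.0 - np.abs(np.linspace(-1.0, 1.0, th))
    wx = 1.0 - np.abs(np.linspace(-1.0, 1.0, tw))
    mask = np.outer(wy, wx) + 1e-6
    for idx, tile in enumerate(tiles):
        r, c = divmod(idx, cols)
        y0, x0 = r * step_y, c * step_x
        mosaic[y0:y0 + th, x0:x0 + tw] += tile * mask
        weight[y0:y0 + th, x0:x0 + tw] += mask
    return mosaic / weight
```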
Example 4: Application of Endoscopic ultrasound-guided fine needle aspiration/biopsy (EUS-FNA/B).
As noted above, one method to improve the diagnostic accuracy and adequacy of EUS-FNA/B is the use of ROSE to provide real-time feedback on samples. In procedures performed with ROSE, cytopathology staff perform staining, imaging, and adequacy assessment of each pass in the procedure room. If the sample is deemed inadequate, another pass is collected. ROSE also enables triage of limited specimens for follow up analysis by immunohistochemistry, flow cytometry, microbial culture, cytogenetics, or molecular studies. Increasing utilization of cytogenetics to refine diagnostic results furthers the need for efficient and adequate sampling. Based on a range of studies, it is estimated that ROSE is used in approximately 75-80% of EUS-FNA/B procedures, but rates can vary greatly amongst different centers. ROSE adoption is significantly lower in Europe and Asia, for example. One hurdle for universal implementation of
ROSE for EUS-FNA/B is the limited reimbursement provided for the procedure. Reimbursement for performance of ROSE by a pathologist is about $133 per hour, compared to over $600 per hour reimbursement for routine surgical pathology. While low reimbursement has led to reduced implementation of ROSE, a retrospective study of patients who underwent EUS-FNA with or without ROSE demonstrated that ROSE reduced the number of additional EUS-FNA procedures by half, resulting in $252 savings per case (after accounting for the additional cost of performing ROSE).
Efforts have been made to incorporate the benefits of ROSE without requiring a pathologist throughout the sample collection procedure. Telecytology by a remote pathologist can reduce the time required per procedure; however, billing and reimbursement codes for telecytology are not yet available. Some settings reported success in training alternative evaluators, such as cytotechnologists or endosonographers, to perform ROSE, but this approach has had mixed success, and there is no reimbursement available when ROSE is not performed by a pathologist.
Fine Needle Biopsy (FNB) has been heralded as a solution to improve adequacy rates in EUS procedures without ROSE. However, as highlighted by one example, even with significant design improvements in various recent FNB platforms, the rate of inconclusive samples from solid pancreatic lesions remains as high as 13%, with particular difficulty associated with patients having concurrent pancreatitis or fibrosis.
As described below, the present disclosure provides for the use of label-free deep ultraviolet (UV) assay technology coupled with EUS-FNA/B methodology to improve diagnostic assessments and clinical outcomes. The imaging system, characterized by a straightforward optical path, allows cell analysis using images captured from a single-wavelength LED light source. Pseudo-colorized UV images, derived from deep-UV microscopy, have demonstrated diagnostic efficacy equivalent to gold-standard Giemsa-stained images. Leveraging deep learning-based cell segmentation and classification algorithms, Applicants can analyze these images to categorize various cell types, enabling rapid automated sample assessment without pathology personnel.
Example 5: Point-of-care device for EUS-FNA/B adequacy assessment.
In one embodiment, the present disclosure includes a system for a point-of-care device for EUS-FNA/B adequacy assessment. In one embodiment, the system incorporates a deep learning framework to develop an analysis algorithm for FNA/B samples. In this embodiment, the system
includes the creation of an annotated dataset of FNA/Bs imaged by deep-UV microscopy to which a single-step object detection algorithm can be applied. The algorithm outputs key parameters that can enable the clinician to determine adequacy, including counts of benign cell types (epithelial cells, pancreatic ductal cells, acinar cells, neutrophils, lymphocytes), counts of suspected malignant cells (cells with atypia, high nucleus/cytoplasm ratio, dense cytoplasm), and identification of regions of interest (necrosis, fibrosis, stromal fragments). Based on these outputs, a clinician can determine sample adequacy.
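Purely as an illustration of how such outputs might be organized for the clinician, the sketch below groups per-object detector labels into the benign, suspected-malignant, and region-of-interest counts listed above. The class names and report fields are hypothetical placeholders, not a fixed vocabulary of the disclosed algorithm.

```python
from collections import Counter
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical label groupings mirroring the parameters listed above.
BENIGN = {"epithelial", "pancreatic_ductal", "acinar", "neutrophil", "lymphocyte"}
SUSPECT = {"atypical_cell", "high_nc_ratio", "dense_cytoplasm"}
REGIONS = {"necrosis", "fibrosis", "stromal_fragment"}

@dataclass
class AdequacyReport:
    benign_counts: Dict[str, int] = field(default_factory=dict)
    suspect_counts: Dict[str, int] = field(default_factory=dict)
    regions_of_interest: Dict[str, int] = field(default_factory=dict)

def summarize(detected_labels: List[str]) -> AdequacyReport:
    """Aggregate per-object class labels from the detector into the counts a
    clinician would review when judging whether another pass is needed."""
    counts = Counter(detected_labels)
    return AdequacyReport(
        benign_counts={k: counts[k] for k in BENIGN if counts[k]},
        suspect_counts={k: counts[k] for k in SUSPECT if counts[k]},
        regions_of_interest={k: counts[k] for k in REGIONS if counts[k]},
    )
```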
This system incorporates deep-UV imaging and deep learning-based adequacy assessment of biological samples, such as aspirates and biopsies from a human or animal subject. As noted above, traditional analysis of bone marrow aspirate smears involves fixing and then staining with Giemsa or similar dyes, which requires approximately 45 minutes of wait time before the slides are viewed under traditional light microscopy. This traditional approach of visualizing peripheral blood smears with stains is comparable to the presently described label-free deep-UV microscopy system and methods, which rely on the inherent absorbance of specific UV wavelengths by cellular biomolecules, most notably nucleic acids and proteins. Deep-UV microscopy has the benefit of label/stain-free direct visualization with pseudo-colorization that recapitulates traditional Giemsa staining, removing the steps of fixing, staining, and waiting for the slide to dry. Figure 10 shows spicules present in a bone marrow aspirate (red arrows) with their characteristic deep blue hue in the unstained pseudocolorized UV image (left), which is nearly identical to the Giemsa-stained slide (right), which takes over 45 min to process. Further, at higher magnification (Figure 11), myelopoietic cells (e.g., promyelocytes, myelocytes, metamyelocytes, band neutrophils, and lymphocytes) and erythropoietic cells (e.g., normoblasts), which are important cells for disease diagnosis from bone marrow aspirations, show the same morphological and colorimetric characteristics in the label-free UV images and in the stained images.
While adequacy assessment of EUS-FNA/B samples requires more complex analysis than bone marrow aspirate adequacy assessment, prior work has demonstrated similar image analysis complexity in the classification of blood cells, including a 5-part white blood cell differential, using the same deep learning approach. A high concordance between traditional Giemsa staining and deep-UV imaging in blood samples from healthy individuals, patients with thrombocytopenia, and patients with Sickle Cell Disease has previously been demonstrated, indicating comparative efficacy between the two analytical methods.
The present inventors have further demonstrated the ability to use deep learning techniques for automated analysis of deep-UV cell images, achieving 94% accuracy for a 5-part white blood cell differential. After transitioning to a single-step object detection algorithm based on YOLOv7, Applicants were able to add red blood cell and platelet identification, in addition to white blood cell classification. The modified algorithm increased the classification accuracy to over 98% for each cell type (Table 4).
Table 4. Classification accuracy of cells in peripheral blood smears.
Based on the demonstrated ability to use deep-UV imaging for bone marrow aspirate adequacy assessment and the success of performing blood cell classification, including a 5-part white blood cell differential, Applicants have demonstrated the viability of their EUS-FNA/B adequacy assessment.
Example 6: Identification of lesional, non-lesional, and acellular components of FNA/B specimens using deep-UV microscopy.
In one embodiment, the present disclosure describes a device for rapid adequacy assessment of EUS-FNA/B samples (Figure 1B) which can be configured to use single-wavelength imaging for assessment of EUS-FNA/B quality. Specifically, the prototype uses a 255nm LED followed by two 50mm lenses used as a condenser. In this preferred embodiment, the power at the sample is controlled electronically with a maximum of 60mW. The system further includes a transmission microscope with UV-compatible optics, comprising a UV objective, a fused silica 150mm tube lens, a UV filter (255nm bandpass, discussed further below), and a UV-sensitive camera.
The field of view and resolution of the system are approximately 0.5mm and 0.6µm, respectively, with the 20X objective.
In this embodiment, biological samples can be smeared on regular glass microscope slides, as is currently done in the standard of care, for analysis with the UV imaging system of the disclosure. Notably, it is commonly believed by those of ordinary skill in the art that imaging at wavelengths less than 300 nm must be performed on quartz rather than glass slides, as glass is highly absorbing in that range and also produces strong autofluorescence. However, Applicants have found that sufficient deep-UV light is transmitted through standard 1mm glass microscope slides to enable high-contrast deep-UV microscopy using camera integration times of 50-500ms, depending on the magnification. In this embodiment, glass autofluorescence is removed using a narrow (10nm) bandpass filter centered at 255nm. Note that cells are only exposed to UV light for a fraction of a second, and no cell damage occurs with such short exposures. The use of standard glass microscope slides provides a significant advantage for incorporating this pipeline into the clinical workflow, as quartz slides are far more expensive and clinical pathology labs have not validated them for diagnostic imaging.
After image collection, key parameters, examples of which are provided in Table 5, are quantified to enable the clinician to determine whether an adequate sample has been collected or whether an additional pass is necessary, rather than explicitly stating whether a sample is adequate or inadequate, as knowledge about the suspected diagnosis and evaluation of previous passes frequently influences adequacy assessment. For example, a sample may be adequate for cytological evaluation but insufficient for molecular testing, in which case the adequacy determination would depend on whether the clinician required molecular testing results. In another example, a sample containing significant numbers of pancreatic ductal cells but no suspected malignant cells may prompt an additional pass to ensure the correct area was sampled, whereas multiple passes with only normal pancreatic ductal cells may constitute an adequate sample. This approach has the potential not only to reduce the percentage of inadequate samples, but also to reduce the rate of false negative diagnoses.
Smears can further be prepared immediately upon collection of EUS-FNA/B samples and imaged using deep-UV microscopy. After evaluation by UV microscopy, the smears can be stained with Papanicolaou stain following the routine protocol for EUS-FNA/B samples. Deep-UV microscopy is a non-destructive technique, enabling evaluation of the exact same slide using both
methods. Images are reviewed by a cytopathologist, who will assess the adequacy of each sample. In this embodiment, sensitivity and specificity are calculated for each quality assessment method relative to the ground truth assessment. When multiple biological samples, such as aspirates or biopsy samples are collected for a single patient, each sample is assessed separately.
The following formulas (1-4) can be used to determine sample size (N) for this study based on required sensitivity and specificity:
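The formulas themselves are not reproduced in the text here; the following is a reconstruction consistent with the cited sample-size guidance for diagnostic accuracy studies (reference 65) and with the parameter values and the 70-smear result stated in the next paragraph, where SN and SP denote the anticipated sensitivity and specificity.

```latex
\begin{align}
TP + FN &= \frac{Z^{2}\, SN \,(1 - SN)}{W^{2}} \tag{1} \\
N_{\text{sensitivity}} &= \frac{TP + FN}{P} \tag{2} \\
TN + FP &= \frac{Z^{2}\, SP \,(1 - SP)}{W^{2}} \tag{3} \\
N_{\text{specificity}} &= \frac{TN + FP}{1 - P} \tag{4}
\end{align}
```

With SN = SP = 0.90, Z = 1.96, W = 0.10, and P = 0.50, formulas (1) and (3) each evaluate to approximately 35, and formulas (2) and (4) each give N of approximately 70, matching the sample size stated below.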
In a preferred embodiment, a prevalence, P, of approximately 50% can be established as an initial criterion. Z, the normal distribution value, can be set to 1.96 to correspond with the 95% confidence interval, and W, the maximum acceptable width of the 95% confidence interval, can be set to 10%. Sensitivity and specificity are estimated at 90% based on previous data showing nearly identical appearance of the deep-UV and stained smears. These values determined the TP (true positives) + FN (false negatives) and TN (true negatives) + FP (false positives) used for the sample size calculations. Based on these assumptions, a sample size of 70 smears can be used.
Example 7: Deep learning-based algorithm to automate the identification and classification of key cell types and features to support adequacy assessment.
EUS-FNA/B samples have varying requirements for adequacy depending on the lesion type and suspected diagnosis. To assist clinicians in obtaining adequate samples across a diverse range of lesions, the system of the disclosure can provide clarifying information to address two critical questions that determine whether an additional pass is necessary: 1) Is identifiable tissue present in the sample, or is the sample acellular or comprised primarily of blood? and 2) Is the sample from the area of interest? To answer these questions, the Applicants’ system evaluates the cellularity, including the number of suspected malignant cells, and sample features that could
hinder cytological assessment. Examples noted above demonstrate the ability to implement deep learning-based object detection algorithms for bone marrow aspirate adequacy assessment and to differentiate cell types in peripheral blood samples. As such, a similar deep learning framework can be implemented to develop a classification algorithm for EUS-FNA/B samples, which can include the creation of an annotated dataset of FNA/Bs imaged by deep-UV microscopy and implementation of a single-step object detection algorithm.
In alternative embodiments, additional data augmentation strategies can be implemented to mitigate the risk of decreased accuracy of the classification algorithm for cell types or features that have limited representation in the dataset. Three additional techniques can be applied as needed to achieve more precise sensitivity and specificity: 1) the synthetic minority oversampling technique (SMOTE); 2) traditional computer vision (CV) augmentation for the minority class; and 3) class weighting.
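Of the three strategies listed, class weighting is the simplest to sketch; the example below uses scikit-learn's balanced (inverse-frequency) heuristic to derive per-class weights that could be passed to the detector's classification loss. The label vector and class names are hypothetical and shown only to illustrate the technique.

```python
import numpy as np
from sklearn.utils.class_weight import compute_class_weight

def balanced_class_weights(labels: np.ndarray) -> dict:
    """Return inverse-frequency ("balanced") weights so under-represented cell
    types or features contribute more strongly to the training loss."""
    classes = np.unique(labels)
    weights = compute_class_weight(class_weight="balanced", classes=classes, y=labels)
    return dict(zip(classes.tolist(), weights.tolist()))

# Example with a deliberately imbalanced, hypothetical label set: the rare
# "necrosis" class receives a proportionally larger weight.
labels = np.array(["ductal"] * 80 + ["acinar"] * 15 + ["necrosis"] * 5)
print(balanced_class_weights(labels))
```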
REFERENCES
1. “Bone marrow aspiration,” Journal of Clinical Pathology 54, 657-663 (2001).
2. S. Malempati, S. Joshi, S. Lai, D. Braner, and K. Tegtmeyer, “Bone Marrow Aspiration and Biopsy,” The New England journal of medicine 361, e28 (2009).
3. R. S. Riley, D. Williams, M. Ross, S. Zhao, A. Chesney, B. D. Clark, and J. M. Ben-Ezra, “Bone marrow aspirate and biopsy: a pathologist’s perspective. II. interpretation of the bone marrow aspirate and biopsy,” J Clin Lab Anal 23(5), 259-307 (2009).
4. O. O. Odejide, A. M. Cronin, D. J. DeAngelo, Z. A. Bernazzoli, J. O. Jacobson, S. J. Rodig, A. S. LaCasce, T. J. Mazeika, K. D. Earles, and G. A. Abel, “Improving the quality of bone marrow assessment: Impact of operator techniques and use of a specimen preparation checklist,” Cancer 119(19), 3472-3478 (2013).
5. G. Loureiro, E. G. Rizzatti, A. F. Sandes, and M. de Lourdes Chauffaille, “Quality control of bone marrow aspirates: additional steps toward a safer and more efficient procedure,” Cancer 120(9), 1441-1442 (2014).
6. R. S. Riley, P. Gandhi, S. E. Harley, P. Garcia, J. B. Dalton, and A. Chesney, “A Synoptic Reporting System to Monitor Bone Marrow Aspirate and Biopsy Quality,” J Pathol Inform 12, 23 (2021).
7. J. A. Hawing, O. G. Cantu Rodriguez, A. Gómez-De León, C. Mancias, L. del C. Tarin Arzaga, and D. Gomez-Almaguer, “Aspicular Bone Marrow Aspiration: A Common, but Not a Minor Problem,” Blood 134, 5841 (2019).
8. S. Bunting, M. Atuan, and S. Castellino, “Improving the Quality of Bone Marrow Biopsy in a Pediatric Hospital,” American Journal of Clinical Pathology 149(suppl 1), S51 (2018).
9. B. J. Zeskind, C. D. Jordan, W. Timp, L. Trapani, G. Waller, V. Horodincu, D. J. Ehrlich, and P. Matsudaira, “Nucleic acid and protein mass mapping by live-cell deep-ultraviolet microscopy,” Nat Methods 4(7), 567-569 (2007).
10. M. C. Cheung, J. G. Evans, B. McKenna, and D. J. Ehrlich, “Deep ultraviolet mapping of intracellular protein and nucleic acid in femtograms per pixel,” Cytometry A 79(11), 920-932 (2011).
11. M. C. Cheung, R. LaCroix, B. K. McKenna, L. Liu, J. Winkelman, and D. J. Ehrlich, “Intracellular protein and nucleic acid measured in eight cell types using deep-ultraviolet mass mapping,” Cytometry A 83(6), 540-551 (2013).
12. S. Soltani, A. Ojaghi, and F. E. Robles, “Deep UV dispersion and absorption spectroscopy of biomolecules,” Biomed Opt Express 10(2), 487-499 (2019).
13. A. Ojaghi, G. Carrazana, C. Caruso, A. Abbas, D. R. Myers, W. A. Lam, and F. E. Robles, “Label-free hematology analysis using deep-ultraviolet microscopy,” PNAS 117(26), 14779-14789 (2020).
14. N. Kaza, A. Ojaghi, and F. E. Robles, “Hemoglobin quantification in red blood cells via dry mass mapping based on UV absorption,” J Biomed Opt 26(8), 086501 (2021).
15. A. Ojaghi, P. Casteleiro Costa, C. Caruso, W. A. Lam, and F. E. Robles, “Label- free automated neutropenia detection and grading using deep-ultraviolet microscopy,” Biomed Opt Express 12(10), 6115-6128 (2021).
16. A. Ojaghi, E. Williams, N. Kaza, V. Gorti, H. Choi, J. Torey, T. Wiley, B. Turner, S. Jackson, S. Park, W. A. Lam, and F. E. Robles, “Label-free deep-UV microscopy detection and grading of neutropenia using a passive microfluidic device,” Opt. Lett. (2022).
17. N. Kaza, A. Ojaghi, and F. E. Robles, “Virtual Staining, Segmentation, and Classification of Blood Smears for Label-Free Hematology Analysis,” BME Frontiers 2022, (2022).
18. V. Gorti, N. Kaza, E. K. Williams, W. A. Lam, and F. E. Robles, “Compact and low-cost deep-ultraviolet microscope system for label-free molecular imaging and point-of-care hematological analysis,” Biomed. Opt. Express, BOE 14(3), 1245-1255 (2023).
19. C.-Y. Wang, A. Bochkovskiy, and H.-Y. M. Liao, “YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors,” (2022).
20. S. Bhumbla, D. K. Gupta, and Nisha, “A Review: Object Detection Algorithms,” in 2023 Third International Conference on Secure Cyber Computing and Communication (ICSCCC) (2023), pp. 827-832.
21. S. Preibisch, S. Saalfeld, and P. Tomancak, “Globally optimal stitching of tiled 3D microscopic image acquisitions,” Bioinformatics 25(11), 1463-1465 (2009).
22. J. Schindelin, I. Arganda-Carreras, E. Frise, V. Kaynig, M. Longair, T. Pietzsch, S. Preibisch, C. Rueden, S. Saalfeld, B. Schmid, J.-Y. Tinevez, D. J. White, V. Hartenstein, K.
Eliceiri, P. Tomancak, and A. Cardona, “Fiji: an open-source platform for biological-image analysis,” Nat Methods 9(7), 676-682 (2012).
23. L. Zhang, H. Shao, and S. Alkan, Diagnostic Pathology of Hematopoietic Disorders of Spleen and Liver (Springer Nature, 2020).
24. A. Auerbach and N. Aguilera, Diagnostic Pathology: Spleen: Diagnostic Pathology: Spleen - E-Book (Elsevier Health Sciences, 2022).
25. E. S. Jaffe, N. L. Harris, J. Vardiman, D. A. Arber, and E. Campo, Hematopathology E-Book (Elsevier Health Sciences, 2010).
26. G. Cai and A. J. Adeniran, Rapid On-Site Evaluation (ROSE): A Practical Guide (Springer Nature, 2019).
27. B. L. Witt, “Rapid On Site Evaluation (ROSE): A Pathologists’ Perspective,” Techniques in Vascular and Interventional Radiology 24(3), 100767 (2021).
28. Erickson, R. A. EUS-guided FNA. Gastrointestinal Endosc 60, 267-279 (2004).
29. Costache, M.-I., Iordache, S., Gasdal Karstensen, J., Saftoiu, A. & Vilmann, P. Endoscopic Ultrasound-Guided Fine Needle Aspiration: From the Past to the Future. Endosc Ultrasound 2, 77-85 (2013).
30. Bluen, B. E. et al. Accuracy and quality assessment of EUS-FNA: A single-center large cohort of biopsies. Diagn Ther Endosc (2012) doi: 10.1155/2012/139563.
31. Schmidt, R. L., Witt, B. L., Lopez-Calderon, L. E. & Layfield, L. J. The influence of rapid onsite evaluation on the adequacy rate of fine-needle aspiration cytology: A systematic review and meta-analysis. Am J Clin Pathol 139, 300-308 (2013).
32. Yang, F., Liu, E. & Sun, S. Rapid on-site evaluation (ROSE) with EUS-FNA: The ROSE looks beautiful. Endoscopic Ultrasound vol. 8 283-287 (2019).
33. Hebert-Magee, S. et al. The presence of a cytopathologist increases the diagnostic accuracy of endoscopic ultrasound-guided fine needle aspiration cytology for pancreatic adenocarcinoma: A meta-analysis. Cytopathology 24, 159-171 (2013).
34. Iglesias-Garcia, J. et al. Influence of On-Site Cytopathology Evaluation on the Diagnostic Accuracy of Endoscopic Ultrasound-Guided Fine Needle Aspiration (EUS-FNA) of Solid Pancreatic Masses. American Journal of Gastroenterology 106, 1705-1710 (2011).
35. Spier, B. J. et al. Predictors of malignancy and recommended follow-up for patients with negative endoscopic ultrasound-guided fine-needle aspiration of suspected pancreatic lesions. Canadian Journal of Gastroenterology vol. 23 279-286 (2009).
36. Blackford, A. L. et al. Pancreatic Cancer Surveillance and Survival of High-Risk Individuals. JAMA Oncol 10, 1087-1096 (2024).
37. Blackford, A. L., Canto, M. I., Klein, A. P., Hruban, R. H. & Goggins, M. Recent Trends in the Incidence and Survival of Stage 1A Pancreatic Cancer: A Surveillance, Epidemiology, and End Results Analysis. J Natl Cancer Inst 112, 1162-1169 (2020).
38. Ushio, J. et al. Pancreatic ductal adenocarcinoma: Epidemiology and risk factors. Diagnostics 11, (2021).
39. McGuigan, A. et al. Pancreatic cancer: A review of clinical diagnosis, epidemiology, treatment and outcomes. World Journal of Gastroenterology vol. 24 4846-4861 (2018).
40. Kaur, S., Baine, M. J., Jain, M., Sasson, A. R. & Batra, S. K. Early Diagnosis of Pancreatic Cancer: Challenges and New Developments. Biomark Med 6, 597-612 (2012).
41. Poruk, K. E., Firpo, M. A., Adler, D. G. & Mulvihill, S. J. Screening for Pancreatic Cancer. Ann Surg 257, 17-26 (2013).
42. Soreide, K. Early Diagnosis of Sporadic Pancreatic Cancer, in Textbook of Pancreatic Cancer (eds. Soreide, K. & Stattner, S.) 339-356 (Springer, Cham, 2021). doi: 10.1007/978-3-030-53786-9_23.
43. Ojaghi, A. et al. Label-free hematology analysis using deep-ultraviolet microscopy. PNAS 117, 14779-14789 (2020).
44. Gorti, V. et al. Rapid, point-of-care bone marrow aspirate adequacy assessment via ultraviolet microscopy (in review).
45. Kaza, N., Ojaghi, A. & Robles, F. E. Virtual Staining, Segmentation and Classification of Blood Smears for Label-free Hematology Analysis. BME Front (2022).
46. Volmar, K. E. et al. Pancreatic FNA in 1000 cases: a comparison of imaging modalities. Gastrointestinal Endosc 61, 854-861 (2005).
47. Gan, Q. et al. Adequacy evaluation and use of pancreatic adenocarcinoma specimens for next-generation sequencing acquired by endoscopic ultrasound-guided FNA and FNB. Cancer Cytopathol 130, 275-283 (2022).
48. van Riet, P., Cahen, D., Poley, J.-W. & Bruno, M. Mapping international practice patterns in EUS-guided tissue sampling: outcome of a global survey. Endosc Int Open 04, E360-E370 (2016).
49. American Cancer Society. Cancer Facts & Figures 2024. (2024).
50. Reddy, S. R. et al. Cost of cancer management by stage at diagnosis among Medicare beneficiaries. Curr Med Res Opin 38, 1285-1294 (2022).
51. Santos, G. D. C., Ko, H. M., Saieg, M. A. & Geddie, W. R. ‘The petals and thorns’ of ROSE (rapid on-site evaluation). Cancer Cytopathology vol. 121 4-8 (2013).
52. Hebert-Magee, S. et al. The presence of a cytopathologist increases the diagnostic accuracy of endoscopic ultrasound-guided fine needle aspiration cytology for pancreatic adenocarcinoma: A meta-analysis. Cytopathology 24, 159-171 (2013).
53. Guvendir, I., Zemheri, I. E. & Ozdil, K. Impact of rapid on-site evaluation on diagnostic accuracy of EUS-guided fine-needle aspiration of solid pancreatic lesions: experience from a single center. BMC Gastroenterol 22, (2022).
54. Pearson, L. N., Layfield, L. J. & Schmidt, R. L. Cost-effectiveness of rapid on-site evaluation of the adequacy of FNA cytology samples performed by nonpathologists. Cancer Cytopathol 126, 839-845 (2018).
55. Layfield, L. J., Bentz, J. S. & Gopez, E. V. Immediate on-site interpretation of fine-needle aspiration smears: A cost and compensation analysis. Cancer 93, 319-322 (2001).
56. Khoury, T. & Sbeit, W. Cost-effectiveness of rapid on-site evaluation of endoscopic ultrasound fine needle aspiration in gastrointestinal lesions. Cytopathology 32, 326-330 (2021).
57. Lin, O. et al. American Society of Cytopathology Telecytology validation recommendations for rapid on-site evaluation (ROSE). J Am Soc Cytopathol 13, 111-121 (2024).
58. Tambouret, R. H., Barkan, G. A., Kurtycz, D. F. I. & Padmanabhan, V. Cytopathology and More | FNA Cytology: Rapid On-Site Evaluation - How Practice Varies. (2014).
59. Collins, J. A., Novak, A., Ali, S. Z. & Olson, M. T. Cytotechnologists and on-site evaluation of adequacy. Korean Journal of Pathology vol. 47 405-410 (2013).
60. Renelus, B. D., Jamorabo, D. S., Boston, I., Briggs, W. M. & Poneros, J. M. Endoscopic Ultrasound-Guided Fine Needle Biopsy Needles Provide Higher Diagnostic Yield Compared to Endoscopic Ultrasound-Guided Fine Needle Aspiration Needles When Sampling Solid Pancreatic Lesions: A Meta-Analysis. Clin Endosc 54, 261-268 (2021).
61. Gorti, V., Subramanian, A. R., Aljudi, A. A., Aumann, W. K. & Robles, F. E. Deep-ultraviolet microscopy for analysis of bone marrow aspirate adequacy, in Optical Diagnostics and Sensing XXIV: Toward Point-of-Care Diagnostics (eds. Baba, J. S. & Cote, G. L.) 37 (SPIE, 2024). doi: 10.1117/12.3002646.
62. Ojaghi, A., Gorti, V. & Robles, F. E. Label-free hematological assessment of neutropenia using a microfluidic device and deep-UV microscopy, in Optical Diagnostics and Sensing XXII: Toward Point-of-Care Diagnostics (ed. Cote, G. L.) vol. 119686 - 11 (SPIE, 2022).
63. Kaza, N., Ojaghi, A., Casteleiro Costa, P. & Robles, F. E. Deep learning based virtual staining of label-free ultraviolet (UV) microscopy images for hematological analysis. Proceedings of the SPIE 11655, (2021).
64. Kaza, N., Ojaghi, A. & Robles, F. E. Hemoglobin quantification in red blood cells via dry mass mapping based on UV absorption. J Biomed Opt 26, (2021).
65. Negida, A., Fahim, N. K. & Negida, Y. Sample Size Calculation Guide - Part 4: How to Calculate the Sample Size for a Diagnostic Test Accuracy Study based on Sensitivity, Specificity, and the Area Under the ROC Curve. Adv J Emerg Med 3, e33 (2019).
66. Wang, C.-Y., Bochkovskiy, A. & Liao, H.-Y. M. YOLOv7: Trainable bag-of- freebies sets new state-of-the-art for real-time object detectors. (2022).
67. Redmon, J., Divvala, S., Girshick, R. & Farhadi, A. You Only Look Once: Unified, Real-Time Object Detection. (2015).
68. Lin, T.-Y. et al. Microsoft COCO: Common Objects in Context, in Computer Vision - ECCV 2014. Lecture Notes in Computer Science vol. 8693 740-755 (2014).
69. Chawla, N. V., Bowyer, K. W., Hall, L. O. & Kegelmeyer, W. P. SMOTE: Synthetic Minority Over-sampling Technique. Journal of Artificial Intelligence Research 16, 321-357 (2002).
70. King, G. et al. Logistic Regression in Rare Events Data. Political Analysis 9, 137-163 (2001).
71. Becker, A. E., Hernandez, Y. G., Frucht, H. & Lucas, A. L. Pancreatic ductal adenocarcinoma: Risk factors, screening, and early detection. World Journal of Gastroenterology vol. 20 11182-11198
Claims
1. A method of applying machine learning to detect and analyze biological samples exposed to UV light comprising:
- training a neural network having multiple layers using a training dataset comprising:
- at least one reference dataset generated by imaging reference samples using a deep-ultraviolet (UV) imaging device and extracting one or more features of adequacy from a plurality of images from the reference dataset;
- obtaining a biological sample from a subject for use in the diagnosis of a disease or condition; and
- obtaining a test dataset by imaging the biological sample using deep-UV microscopy and identifying in the image one or more features of adequacy, or the lack of any features of adequacy in the sample.
2. The method of claim 1, wherein said neural network comprises a convolutional neural network (ConvNet).
3. The method of claim 1, wherein said neural network comprises a single-step object detection algorithm.
4. The method of claim 1, wherein said biological sample is selected from: a tissue sample, a surgical tissue biopsy sample, a cell sample, a biological sample obtained via fine needle aspiration (FNA), a biological sample obtained via Endoscopic Ultrasound-Guided Fine Needle Aspiration (EUS-FNA), a biological sample obtained via Endoscopic Ultrasound-Guided Fine Needle Biopsy (EUS-FNB), a biological sample obtained via endobronchial ultrasound (EBUS), a transbronchial biopsy sample, a percutaneous lung biopsy sample, a pancreas tissue sample, a pancreatic FNA sample, an EUS-FNB of solid pancreatic lesions, sentinel lymph node biopsies, melanoma biopsies, bone marrow aspiration, a liver tissue sample, an intestine tissue sample, a bone tissue sample, a muscle tissue sample, sputum/oral fluid, amniotic fluid, blood, a blood fraction, bone marrow, urine, semen, stool, vaginal fluid, peritoneal fluid, pleural fluid, tissue
explant, organ culture, cell culture, and any other tissue or cell preparation, or fraction or derivative thereof or isolated therefrom.
5. The method of claim 4, wherein said FNA and/or EUS-FNB/A is selected from: a thyroid aspiration, a thyroid nodule, a lymph node, breast tissue, bone marrow, lung, pancreas, kidney, and abdominal fluid in the peritoneal cavity, or a combination of the same.
6. The method of claim 1, wherein said biological sample comprises a bone marrow fine needle aspiration.
7. The method of claim 6, wherein said feature of adequacy comprises bony spicules.
8. The method of claims 1-7, wherein said biological sample is not stained or otherwise treated.
9. The method of claim 1, wherein said subject comprises a human.
10. The method of claim 1, wherein said feature of adequacy is selected from: cell type, cell morphology, non-cellular components, a cell phenotype, a pathogenic phenotype, a cell genotype, quantity, position or morphology of cell components, chromatin morphology or content, presence of dead cells, presence of necrotic tissues, tissue fragments, extracellular matrix, sub-cellular features and/or structures, sub-cellular granules, sub-cellular abnormal nucleus/cytoplasm ratio, one or more features described in Table 5, presence of a pathogen, or a combination of the same.
11. The method of claim 1, further comprising obtaining a second test dataset from a second biological sample when no feature of adequacy was identified in the first sample.
12. The method of claim 1, wherein the biological sample containing a feature of adequacy is further used to diagnose a disease or condition.
13. The method of claim 1, further comprising a computer device having a graphical user interface responsive to the deep-UV microscope and configured to display the biological sample.
14. A method of applying machine learning to detect and analyze biological samples exposed to UV light comprising:
- obtaining a biological sample from a subject for use in the diagnosis of a disease or condition; and
- obtaining a test dataset by imaging the biological sample using a deep-ultraviolet (UV) imaging device that illuminates the test biological sample with one or more wavelengths of UV light, and identifying in the image one or more features of adequacy, wherein the feature was previously identified by a reference set generated by imaging reference samples using a deep-ultraviolet (UV) imaging device that illuminates the samples forming the reference dataset and extracting one or more features of adequacy from a plurality of images from the reference dataset; or
- identifying the lack of any features of adequacy in the test dataset.
15. The method of claim 14, wherein said feature of adequacy is identified via a single-step object detection algorithm from the reference dataset.
16. The method of claim 14, wherein said biological sample is selected from: a tissue sample, a surgical tissue biopsy sample, a cell sample, a biological sample obtained via fine needle aspiration (FNA), a biological sample obtained via Endoscopic Ultrasound-Guided Fine Needle Aspiration (EUS-FNA), a biological sample obtained via Endoscopic Ultrasound-Guided Fine Needle Biopsy (EUS-FNB), a biological sample obtained via endobronchial ultrasound (EBUS), a transbronchial biopsy sample, a percutaneous lung biopsy sample, a pancreas tissue sample, a pancreatic FNA sample, an EUS-FNB of solid pancreatic lesions, sentinel lymph node biopsies, melanoma biopsies, bone marrow aspiration, a liver tissue sample, an intestine tissue sample, a bone tissue sample, a muscle tissue sample, sputum/oral fluid, amniotic fluid, blood, a blood fraction, bone marrow, urine, semen, stool, vaginal fluid, peritoneal fluid, pleural fluid, tissue explant, organ culture, cell culture, and any other tissue or cell preparation, or fraction or derivative thereof or isolated therefrom.
17. The method of claim 16, wherein said FNA and/or EUS-FNB/A is selected from: a thyroid aspiration, a thyroid nodule, a lymph node, breast tissue, bone marrow, lung, pancreas, kidney, and abdominal fluid in the peritoneal cavity, or a combination of the same.
18. The method of claim 14, wherein said biological sample comprises a bone marrow fine needle aspiration.
19. The method of claim 18, wherein said feature of adequacy comprises bony spicules.
20. The method of claims 14-19, wherein said biological sample is not stained or otherwise treated.
21. The method of claim 14, wherein said subject comprises a human.
22. The method of claim 14, wherein said feature of adequacy is selected from: cell type, cell morphology, non-cellular components, a cell phenotype, a pathogenic phenotype, a cell genotype, quantity, position or morphology of cell components, chromatin morphology or content, presence of dead cells, presence of necrotic tissues, tissue fragments, extracellular matrix, sub-cellular features and/or structures, sub-cellular granules, sub-cellular abnormal nucleus/cytoplasm ratio, one or more features described in Table 5, presence of a pathogen, or a combination of the same.
23. The method of claim 14, wherein the biological sample containing a feature of adequacy is further used to diagnose a disease or condition.
Applications Claiming Priority (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US202463561719P | 2024-03-05 | 2024-03-05 | |
| US63/561,719 | 2024-03-05 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| WO2025188609A1 true WO2025188609A1 (en) | 2025-09-12 |
| WO2025188609A8 WO2025188609A8 (en) | 2025-10-02 |
Family
ID=96991436
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/US2025/018115 Pending WO2025188609A1 (en) | 2024-03-05 | 2025-03-03 | Methods and systems for assessment of biological specimen adequacy |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2025188609A1 (en) |
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20170234874A1 (en) * | 2015-10-07 | 2017-08-17 | Clearbridge Biophotonics Pte Ltd. | Integrated visual morphology and cell protein expression using resonance-light scattering |
| CN113724223B (en) * | 2021-08-27 | 2022-05-24 | 江南大学 | Method and system for making YOLOv3 dataset based on optical microscope |
| US20230316595A1 (en) * | 2021-12-13 | 2023-10-05 | Georgia Tech Research Corporation | Microscopy Virtual Staining Systems and Methods |
Also Published As
| Publication number | Publication date |
|---|---|
| WO2025188609A8 (en) | 2025-10-02 |