
WO2025036631A1 - Process model based scanning electron microscopy (SEM) image distortion correction - Google Patents

Process model based scanning electron microscopy (SEM) image distortion correction

Info

Publication number
WO2025036631A1
WO2025036631A1 (PCT/EP2024/069759)
Authority
WO
WIPO (PCT)
Prior art keywords
image
locations
predicted
distortion
measured
Prior art date
Legal status
Pending
Application number
PCT/EP2024/069759
Other languages
English (en)
Inventor
Yongfa Fan
Jiaxing REN
Yi-Yin Chen
Tianyu YAN
Qian Zhao
Mu FENG
Current Assignee
ASML Netherlands BV
Original Assignee
ASML Netherlands BV
Priority date
Filing date
Publication date
Application filed by ASML Netherlands BV filed Critical ASML Netherlands BV
Publication of WO2025036631A1

Classifications

    • G — PHYSICS
    • G03 — PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03F — PHOTOMECHANICAL PRODUCTION OF TEXTURED OR PATTERNED SURFACES, e.g. FOR PRINTING, FOR PROCESSING OF SEMICONDUCTOR DEVICES; MATERIALS THEREFOR; ORIGINALS THEREFOR; APPARATUS SPECIALLY ADAPTED THEREFOR
    • G03F7/70625 — Dimensions, e.g. line width, critical dimension [CD], profile, sidewall angle or edge roughness (workpiece metrology; monitoring the printed patterns)
    • G03F7/70655 — Non-optical metrology techniques, e.g. atomic force microscope [AFM] or critical dimension scanning electron microscope [CD-SEM]
    • G03F7/706837 — Data analysis, e.g. filtering, weighting, flyer removal, fingerprints or root cause analysis (metrology information management or control)
    • G03F7/706839 — Modelling, e.g. modelling scattering or solving inverse problems
    • G06T5/80 — Geometric correction (image enhancement or restoration)
    • G06T2207/30148 — Semiconductor; IC; Wafer (industrial image inspection)

Definitions

  • PROCESS MODEL BASED SCANNING ELECTRON MICROSCOPY (SEM) IMAGE DISTORTION CORRECTION
  • BACKGROUND In manufacturing processes of integrated circuits (ICs), unfinished or finished circuit components are inspected to ensure that they are manufactured according to design and are free of defects.
  • Inspection systems utilizing optical microscopes or charged particle (e.g., electron) beam microscopes, such as a scanning electron microscope (SEM) can be employed.
  • As IC components continue to shrink and their structures continue to become more complex, accuracy and throughput in defect detection and inspection become more important.
  • SEM images themselves may introduce measurement distortion, including perspective shifts. Identifying and correcting measurement-induced distortions in metrology images becomes important as device sizes become smaller and image distortions therefore represent a larger percentage of measurement error.
  • The ability to monitor and detect IC non-idealities may be limited by the image quality of the inspection system, including by the alignment or calibration of an SEM system. In the context of semiconductor manufacture, metrology image distortion needs to be identified, classified, and corrected.
  • Embodiments of the present disclosure provide a method for determining an image distortion in an image of a patterning process based on a process model for the patterning process.
  • Embodiments provide methods of determining an image distortion based on differences between a predicted image of a patterning process and a measured image of a pattern fabricated by said patterning process.
  • Embodiments provide methods of image correction.
  • Embodiments provide methods of imaging process, modeling process, and patterning process characterization based on a determined image distortion.
  • a method for determining distortion in an image based on a process model comprising: determining an image transformation operation based on a relationship between a plurality of measurement locations in an image and locations corresponding to the plurality of measurement locations in a predicted image generated by a process model; and characterizing image distortion in the image based on the image transformation operation.
  • the image comprises an image of a fabricated pattern.
  • the process model is a lithography process model.
  • the process model is a resist development model.
  • the process model is an etch model.
  • the relationship between the plurality of measurement locations and the predicted locations comprises a relationship between a set of contours.
  • the relationship between the plurality of measurement locations and the predicted locations comprises a set of edge placement (EP) gauges.
  • the plurality of measurement locations comprise a set of EP gauges in the image.
  • the predicted locations comprise a set of EP gauges in the predicted image.
  • the plurality of measurement locations are obtained from a scanning electron microscope (SEM) image.
  • the SEM image comprises an image of a fabricated pattern, further comprising applying the image transformation operation to the image of the fabricated pattern to obtain a corrected SEM image.
  • determining the relationship comprises aligning the image and the predicted image.
  • the image transformation operation is based on an affine transformation.
  • the image transformation operation is based on a perspective transformation.
  • the image transformation operation comprises a transformation matrix.
  • the image transformation operation comprises a geometric transformation operation.
  • the plurality of measurement locations comprise at least some measurement locations that are locations in non-periodic patterns.
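The transformation-fitting step described above can be sketched numerically. The following is an illustrative sketch (not part of the disclosure), assuming NumPy and assuming the measurement locations and the model-predicted locations are already matched pairs of (x, y) coordinates; `fit_affine` and `apply_affine` are hypothetical names.

```python
import numpy as np

def fit_affine(measured, predicted):
    """Least-squares affine transform mapping measured -> predicted points.

    measured, predicted: (N, 2) arrays of matched (x, y) locations, e.g.
    EP-gauge positions in an SEM image and in a model-predicted image.
    Returns a 2x3 matrix A such that predicted ~= A @ [x, y, 1].
    """
    measured = np.asarray(measured, float)
    predicted = np.asarray(predicted, float)
    # Homogeneous design matrix: each row is [x, y, 1].
    X = np.hstack([measured, np.ones((len(measured), 1))])
    # Solve X @ A.T ~= predicted in the least-squares sense.
    A_T, *_ = np.linalg.lstsq(X, predicted, rcond=None)
    return A_T.T

def apply_affine(A, points):
    """Apply a 2x3 affine transform to (N, 2) points."""
    pts = np.asarray(points, float)
    return pts @ A[:, :2].T + A[:, 2]
```

The fitted matrix can then serve both as the image transformation operation (its off-identity terms characterize the distortion) and as the correction to apply to the image or the measurement locations.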
  • a method for image correction in a process model for a patterning process comprising: determining an image transformation operation based on differences between measured locations of multiple points on an image of an output of a patterning process and predicted locations corresponding to the multiple points, the predicted locations comprising locations predicted by a process model for the patterning process; and correcting the measured locations of the multiple points on the image based on the image transformation operation.
  • correcting the measured locations further comprises: determining whether the image transformation operation corresponds to a non-ideality in an imaging process based on the image transformation operation; and correcting the measured locations based on a determination that the image transformation operation corresponds to the non-ideality.
  • determining whether the image transformation operation corresponds to the non-ideality in the imaging process comprises determining one or more modes in the differences between measured locations of the multiple points and predicted locations of the multiple points.
  • the output of the patterning process contains non-periodic structures.
  • the image is a scanning electron microscope (SEM) image.
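A minimal sketch of the correction step above, assuming an affine image transformation operation has already been fitted (e.g., by least squares) and assuming NumPy; the `tol` threshold used to decide whether the transformation corresponds to a non-ideality is purely illustrative.

```python
import numpy as np

def correct_locations(A, measured, tol=1e-3):
    """Correct measured points with affine A only if A departs from identity.

    A: 2x3 affine transform (measured -> predicted). The departure of A
    from the identity is used here as a crude proxy for whether the
    transformation corresponds to an imaging non-ideality; the threshold
    `tol` is an illustrative assumption, not from the source.
    """
    identity = np.array([[1.0, 0.0, 0.0],
                         [0.0, 1.0, 0.0]])
    pts = np.asarray(measured, float)
    if np.max(np.abs(A - identity)) < tol:
        return pts  # no significant distortion detected; leave as measured
    return pts @ A[:, :2].T + A[:, 2]
```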
  • a method for metrology correction comprising: determining a distortion correction operation based on a relationship between a plurality of measured metrology locations in an image and locations corresponding to the plurality of metrology locations predicted by a process model; and correcting the measured metrology locations based on the distortion correction operation.
  • the plurality of measured metrology locations comprises a plurality of locations in a scanning electron microscope (SEM) image.
  • the relationship between the plurality of measured metrology locations and predicted locations comprises a set of edge placement (EP) gauges.
  • one or more non-transitory, machine-readable media are provided having instructions thereon, the instructions, when executed by a processor, being configured to perform the method of any other embodiment.
  • one or more systems are provided, each comprising a processor and one or more non-transitory, machine-readable media having instructions thereon, the instructions, when executed by the processor, being configured to perform the method of any other embodiment.
  • Figure 1 is a schematic diagram illustrating an exemplary electron beam inspection (EBI) system, according to an embodiment.
  • Figure 2 illustrates a block diagram of various subsystems of a lithographic projection apparatus, according to an embodiment.
  • Figure 3 illustrates an exemplary flow chart for simulating lithography in a lithographic projection apparatus, according to an embodiment.
  • Figure 4 depicts a schematic representation of process-model-based image correction, according to an embodiment.
  • Figure 5 depicts a schematic representation of an example image correction, according to an embodiment.
  • Figures 6A-6B illustrate example image distortions identifiable by process-model-based image distortion characterization, according to an embodiment.
  • Figure 7 is a flowchart illustrating a method for process-model-based image distortion identification, according to an embodiment.
  • Figure 8 is a block diagram of an example computer system, according to an embodiment of the present disclosure.
  • Embodiments described as being implemented in software should not be limited thereto, but can include embodiments implemented in hardware, or combinations of software and hardware, and vice-versa, as will be apparent to those skilled in the art, unless otherwise specified herein.
  • an embodiment showing a singular component should not be considered limiting; rather, the disclosure is intended to encompass other embodiments including a plurality of the same component, and vice-versa, unless explicitly stated otherwise herein.
  • the present disclosure encompasses present and future known equivalents to the known components referred to herein by way of illustration.
  • a patterning device can comprise, or can form, one or more design layouts.
  • the design layout can be generated utilizing CAD (computer-aided design) programs.
  • CAD programs follow a set of predetermined design rules in order to create functional design layouts/patterning devices. These rules are set based on processing and design limitations. For example, design rules define the space tolerance between devices (such as gates, capacitors, etc.) or interconnect lines, to ensure that the devices or lines do not interact with one another in an undesirable way.
  • One or more of the design rule limitations may be referred to as a “critical dimension” (CD).
  • a critical dimension of a device can be defined as the smallest width of a line or hole, or the smallest space between two lines or two holes. Thus, the CD regulates the overall size and density of the designed device.
  • The term “mask” or “patterning device” as employed in this text may be broadly interpreted as referring to a generic patterning device that can be used to endow an incoming radiation beam with a patterned cross-section, corresponding to a pattern that is to be created in a target portion of the substrate.
  • classic mask transmissive or reflective; binary, phase-shifting, hybrid, etc.
  • examples of other such patterning devices include a programmable mirror array.
  • An example of such a device is a matrix-addressable surface having a viscoelastic control layer and a reflective surface.
  • EBI system 100 includes a main chamber 110, a load-lock chamber 120, an electron beam tool 140, and an equipment front end module (EFEM) 130. Electron beam tool 140 is located within main chamber 110.
  • The exemplary EBI system 100 may be a single or multi-beam system. While the description and drawings are directed to an electron beam, it is appreciated that the embodiments are not intended to limit the present disclosure to specific charged particles.
  • EFEM 130 includes a first loading port 130a and a second loading port 130b. EFEM 130 may include additional loading port(s).
  • First loading port 130a and second loading port 130b receive wafer front opening unified pods (FOUPs) that contain wafers (e.g., semiconductor wafers or wafers made of other material(s)) or samples to be inspected (wafers and samples are collectively referred to as “wafers” hereafter).
  • One or more robot arms (not shown) in EFEM 130 transport the wafers to load-lock chamber 120.
  • Load-lock chamber 120 is connected to a load/lock vacuum pump system (not shown), which removes gas molecules in load-lock chamber 120 to reach a first pressure below the atmospheric pressure. After reaching the first pressure, one or more robot arms (not shown) transport the wafer from load-lock chamber 120 to main chamber 110.
  • Main chamber 110 is connected to a main chamber vacuum pump system (not shown), which removes gas molecules in main chamber 110 to reach a second pressure below the first pressure. After reaching the second pressure, the wafer is subject to inspection by electron beam tool 140.
  • electron beam tool 140 may comprise a single-beam inspection tool.
  • Controller 150 may be electronically connected to electron beam tool 140 and may be electronically connected to other components as well. Controller 150 may be a computer configured to execute various controls of EBI system 100. Controller 150 may also include one or more processors, memory, and processing circuitry configured to execute various signal and image processing functions. While controller 150 is shown in Figure 1 as being outside of the structure that includes main chamber 110, load-lock chamber 120, and EFEM 130, it is appreciated that controller 150 can be part of the structure.
  • Figure 2 illustrates a block diagram of various subsystems of a lithographic projection apparatus 10A, according to an embodiment of the present disclosure.
  • Major components are: a radiation source 12A, which may be a deep-ultraviolet excimer laser source or another type of source, including an extreme ultraviolet (EUV) source (the lithographic projection apparatus itself need not have the radiation source); illumination optics, which, e.g., define the partial coherence (denoted as sigma) and which may include optics 14A, 16Aa and 16Ab that shape radiation from source 12A; a patterning device (or mask) 18A; and transmission optics 16Ac that project an image of the patterning device pattern onto a substrate plane 22A.
  • a pupil 20A can be included with transmission optics 16Ac. In some embodiments, there can be one or more pupils before and/or after mask 18A. As described in further detail herein, pupil 20A can provide patterning of the light that ultimately reaches substrate plane 22A.
  • a source provides illumination (i.e., radiation) to a patterning device and projection optics direct and shape the illumination, via the patterning device, onto a substrate.
  • the projection optics may include at least some of the components 14A, 16Aa, 16Ab and 16Ac.
  • An aerial image (AI) is the radiation intensity distribution at substrate level.
  • a resist model can be used to calculate the resist image from the aerial image, an example of which can be found in U.S. Patent Application Publication No.
  • the resist model is related to properties of the resist layer (e.g., effects of chemical processes which occur during exposure, post-exposure bake (PEB) and development).
  • Optical properties of the lithographic projection apparatus e.g., properties of the illumination, the patterning device and the projection optics dictate the aerial image and can be defined in an optical model. Since the patterning device used in the lithographic projection apparatus can be changed, it is desirable to separate the optical properties of the patterning device from the optical properties of the rest of the lithographic projection apparatus including at least the source and the projection optics.
  • the electromagnetic field of the radiation after the radiation passes the patterning device may be determined from the electromagnetic field of the radiation before the radiation reaches the patterning device and a function that characterizes the interaction.
  • This function may be referred to as the mask transmission function (which can be used to describe the interaction by a transmissive patterning device and/or a reflective patterning device).
  • the mask transmission function may have a variety of different forms. One form is binary. A binary mask transmission function has either of two values (e.g., zero and a positive constant) at any given location on the patterning device. A mask transmission function in the binary form may be referred to as a binary mask. Another form is continuous.
  • the modulus of the transmittance (or reflectance) of the patterning device is a continuous function of the location on the patterning device.
  • the phase of the transmittance (or reflectance) may also be a continuous function of the location on the patterning device.
  • a mask transmission function in the continuous form may be referred to as a continuous tone mask or a continuous transmission mask (CTM).
  • the CTM may be represented as a pixelated image, where each pixel may be assigned a value between 0 and 1 (e.g., 0.1, 0.2, 0.3, etc.) instead of binary value of either 0 or 1.
  • The CTM may be a pixelated grayscale image, where each pixel has a value (e.g., within a range [-255, 255], normalized to a range [0, 1] or [-1, 1], or another appropriate range).
  • The thin-mask approximation, also called the Kirchhoff boundary condition, is widely used to simplify the determination of the interaction of the radiation and the patterning device.
  • the thin-mask approximation assumes that the thickness of the structures on the patterning device is very small compared with the wavelength and that the widths of the structures on the mask are very large compared with the wavelength. Therefore, the thin-mask approximation assumes the electromagnetic field after the patterning device is the multiplication of the incident electromagnetic field with the mask transmission function.
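The thin-mask (Kirchhoff) assumption described above amounts to a pointwise multiplication of the incident field by the mask transmission function, which can be illustrated as follows (a toy NumPy sketch; the grid size and the one-line binary mask geometry are arbitrary choices):

```python
import numpy as np

# Thin-mask (Kirchhoff) approximation: the field just after the mask is
# the incident field multiplied pointwise by the mask transmission function.
nx = ny = 64
incident = np.ones((ny, nx), dtype=complex)  # unit-amplitude plane wave
mask_t = np.zeros((ny, nx))                  # binary mask transmission: opaque...
mask_t[:, 24:40] = 1.0                       # ...except one clear vertical line
field_after = incident * mask_t              # Kirchhoff boundary condition
```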
  • a mask transmission function under the thin-mask approximation may be referred to as a thin-mask transmission function.
  • a mask transmission function encompassing M3D may be referred to as a M3D mask transmission function.
  • Figure 3 illustrates an exemplary flow chart for simulating lithography in a lithographic projection apparatus, according to an embodiment of the present disclosure.
  • Source model 31 represents optical characteristics (including radiation intensity distribution and/or phase distribution) of the source.
  • Projection optics model 32 represents optical characteristics (including changes to the radiation intensity distribution and/or the phase distribution caused by the projection optics) of the projection optics.
  • Design layout model 35 represents optical characteristics of a design layout (including changes to the radiation intensity distribution and/or the phase distribution caused by design layout 33), which is the representation of an arrangement of features on or formed by a patterning device.
  • Aerial image 36 can be simulated from source model 31, projection optics model 32, and design layout model 35.
  • Resist image 38 can be simulated from aerial image 36 using resist model 37.
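The simulation chain above (mask to aerial image to resist image) can be caricatured in a few lines. In this sketch a Gaussian low-pass filter stands in for the projection optics model and a constant threshold stands in for the resist model; both are gross simplifications chosen only to show the data flow, not the models referenced in the text.

```python
import numpy as np

def simulate(mask, sigma=2.0, threshold=0.5):
    """Toy stand-in for the Figure-3 chain: mask -> aerial image -> resist image.

    sigma: spatial blur scale (pixels) mimicking a finite-resolution imaging
    system; threshold: constant-threshold resist model. Both illustrative.
    """
    ny, nx = mask.shape
    FX, FY = np.meshgrid(np.fft.fftfreq(nx), np.fft.fftfreq(ny))
    # Gaussian low-pass in frequency space plays the role of projection optics.
    H = np.exp(-2 * (np.pi * sigma) ** 2 * (FX ** 2 + FY ** 2))
    aerial = np.real(np.fft.ifft2(np.fft.fft2(mask) * H))
    resist = (aerial > threshold).astype(float)  # threshold resist model
    return aerial, resist
```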
  • Source model 31 can represent the optical characteristics of the source that include, but are not limited to, numerical aperture settings, illumination sigma (σ) settings as well as any particular illumination shape (e.g., off-axis radiation sources such as annular, quadrupole, dipole, etc.).
  • Projection optics model 32 can represent the optical characteristics of the projection optics, including aberration, distortion, one or more refractive indexes, one or more physical sizes, one or more physical dimensions, etc.
  • Design layout model 35 can represent one or more physical properties of a physical patterning device, as described, for example, in U.S. Patent No. 7,587,704, which is incorporated by reference in its entirety.
  • the objective of the simulation is to accurately predict, for example, edge placement, aerial image intensity slope and/or CD, which can then be compared against an intended design.
  • the intended design is generally defined as a pre-OPC design layout which can be provided in a standardized digital file format such as GDSII or OASIS or another file format.
  • One or more portions of the design layout may be identified, which are referred to as “clips”.
  • a set of clips is extracted, which represents the complicated patterns in the design layout (typically about 50 to 1000 clips, although any number of clips may be used).
  • These patterns or clips represent small portions (i.e., circuits, cells or patterns) of the design and more specifically, the clips typically represent small portions for which particular attention and/or verification is needed.
  • clips may be the portions of the design layout, or may be similar or have a similar behavior of portions of the design layout, where one or more critical features are identified either by experience (including clips provided by a customer), by trial and error, or by running a full-chip simulation.
  • Clips may contain one or more test patterns or gauge patterns.
  • An initial larger set of clips may be provided a priori by a customer based on one or more known critical feature areas in a design layout which require particular image optimization.
  • an initial larger set of clips may be extracted from the entire design layout by using some kind of automated (such as machine vision) or manual algorithm that identifies the one or more critical feature areas.
  • In Eq. 1, CF(z_1, z_2, …, z_N) = Σ_p w_p f_p^2(z_1, z_2, …, z_N), where w_p is a weight constant associated with f_p(z_1, z_2, …, z_N).
  • the characteristic may be a position of an edge of a pattern, measured at a given point on the edge.
  • Different f_p(z_1, z_2, …, z_N) may have different weights w_p.
  • The weight w_p for the f_p(z_1, z_2, …, z_N) representing the difference between the actual position and the intended position of the edge may be given a higher value.
  • f_p(z_1, z_2, …, z_N) can also be a function of an interlayer characteristic, which is in turn a function of the design variables (z_1, z_2, …, z_N).
  • CF(z_1, z_2, …, z_N) is not limited to the form in Eq. 1.
  • CF(z_1, z_2, …, z_N) can be in any other suitable form.
  • the cost function may represent any one or more suitable characteristics of the lithographic projection apparatus, lithographic process or the substrate, for instance, focus, CD, image shift, pattern placement error, image distortion, image rotation, stochastic variation, throughput, local CD variation, process window, an interlayer characteristic, or a combination thereof.
  • the design variables (z_1, z_2, …, z_N) comprise one or more selected from dose, global bias of the patterning device, and/or shape of illumination. Since it is the resist image that often dictates the pattern on a substrate, the cost function may include a function that represents one or more characteristics of the resist image.
  • f_p(z_1, z_2, …, z_N) can simply be a distance between a point in the resist image and an intended position of that point (i.e., an edge placement error EPE_p(z_1, z_2, …, z_N)).
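Under the edge-placement-error reading above, an Eq. 1-style cost function reduces to a weighted sum of squared EPEs over the evaluation points; a minimal sketch, assuming NumPy:

```python
import numpy as np

def cost_function(weights, epes):
    """CF(z_1, ..., z_N) = sum_p w_p * f_p(z_1, ..., z_N)**2.

    Here each f_p is taken to be an edge placement error (EPE) evaluated at
    point p, as the text suggests for resist-image characteristics; `weights`
    and `epes` are matched 1-D sequences.
    """
    w = np.asarray(weights, float)
    f = np.asarray(epes, float)
    return float(np.sum(w * f ** 2))
```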
  • the design variables can include any adjustable parameter such as an adjustable parameter of the source, the patterning device, the projection optics, dose, focus, etc.
  • the lithographic apparatus may include components collectively called a “wavefront manipulator” that can be used to adjust the shape of a wavefront and intensity distribution and/or phase shift of a radiation beam.
  • the lithographic apparatus can adjust a wavefront and intensity distribution at any location along an optical path of the lithographic projection apparatus, such as before the patterning device, near a pupil plane, near an image plane, and/or near a focal plane.
  • the wavefront manipulator can be used to correct or compensate for certain distortions of the wavefront and intensity distribution and/or phase shift caused by, for example, the source, the patterning device, temperature variation in the lithographic projection apparatus, thermal expansion of components of the lithographic projection apparatus, etc. Adjusting the wavefront and intensity distribution and/or phase shift can change values of the characteristics represented by the cost function. Such changes can be simulated from a model or actually measured.
  • the design variables can include parameters of the wavefront manipulator.
  • the design variables may have constraints, which can be expressed as (z_1, z_2, …, z_N) ∈ Z, where Z is a set of possible values of the design variables.
  • One possible constraint on the design variables may be imposed by a desired throughput of the lithographic projection apparatus. Without such a constraint imposed by the desired throughput, the optimization may yield a set of values of the design variables that are unrealistic. For example, if the dose is a design variable, without such a constraint, the optimization may yield a dose value that makes the throughput economically impossible.
  • the usefulness of constraints should not be interpreted as a necessity. For example, the throughput may be affected by the pupil fill ratio.
  • a low pupil fill ratio may discard radiation, leading to lower throughput.
  • Throughput may also be affected by the resist chemistry. Slower resist (e.g., a resist that requires higher amount of radiation to be properly exposed) leads to lower throughput.
  • the term “process model” means a model that includes one or more models that simulate a patterning process.
  • a process model can include any combination of: an optical model (e.g., that models a lens system/projection system used to deliver light in a lithography process and may include modelling the final optical image of light that goes onto a photoresist), a resist model (e.g., that models physical effects of the resist, such as chemical effects due to the light), and an optical proximity correction (OPC) model (e.g., that can be used to make masks or reticles and may include sub-resolution assist features (SRAFs)), etc.
  • varying a pupil design concurrently with a mask pattern can mean making a small modification to a pupil design, then making a small adjustment to a mask pattern, and then another modification to the pupil design, and so on.
  • concurrency can refer to operations occurring at the same time, or having some overlapping in time.
  • the present disclosure provides apparatuses, methods and computer program products which, among other things, relate to modifying or optimizing features of a lithography apparatus in order to increase performance and manufacturing efficiency.
  • the features that can be modified can include an optical spectrum of light used in the lithography process, a mask, a pupil, etc.
  • Figure 4 depicts a schematic representation of process-model-based image correction according to an embodiment of the present disclosure.
  • a process model may be used to determine an image distortion correction for an image of output of a patterning process.
  • a process model may be a lithography model, such as a resist model, an etch model, an optical model, or any appropriate model of a process or sub-process involved in patterning of ICs. Evaluation of a patterning process and a fabricated pattern may depend on the accuracy of metrology measurements for the fabricated pattern (e.g., the output of the patterning process). Also, model accuracy may be dependent on metrology data used for calibration. Metrology measurements may experience distortions, such as due to non-idealities (e.g., defects) in metrology apparatus, metrology processes, etc.
  • the incident angle of the electron beam on the sample face may cause image distortions, such as perspective distortion (e.g., perspective warping), affine transformation, and various other geometric distortions. Correcting the metrology measurements for the image acquisition induced distortions may improve the overall accuracy of metrology and enable better process control and output.
  • multiple periodic measurement locations may be used for determining an image distortion—for example, based on the expected positions of the periodic elements and the measured positions of the periodic elements.
  • the expected positions of the periodic elements may only be determined by a process model.
  • the process model may be used to predict measurement locations and the predicted measurement locations may be compared to measured locations to determine a distortion present in the image, if any.
  • the image may be corrected based on the determined distortion.
  • the corrected image may be used to calibrate the process model.
  • the process may be a lithography process, such as exposure, resist development, etch, fill, etc., or a part thereof.
  • the process parameters 402 may include a pattern 404 for the patterning process.
  • the pattern 404 may be a design layout (e.g., an intended pattern), a mask pattern (e.g., a pattern of a patterning device), a near field mask image (e.g., a model of the electric field caused by an interaction between illumination and a patterning device), or any other appropriate pattern.
  • the pattern 404 may be a pattern from any part of the process and may be created by one or more modeling or approximation steps or operations.
  • the pattern 404 may include sub-resolution assist features (SRAFs) or other features which may not be present in a design layout.
  • the process parameters may include process conditions 406.
  • the process conditions 406 may include a process recipe, such as an illumination type (e.g., wavelength, source conditions, etc.), an exposure time, a dose, etc.
  • the process conditions 406 may be intended conditions (e.g., conditions specified by a recipe) or alternatively or additionally measured conditions (e.g., conditions applied during a specific instance of the patterning process).
  • the process conditions 406 may include a process window.
  • the process conditions 406 may include ranges for one or more process conditions.
  • the process conditions 406 may include absolute values, relative values, etc.
  • a process model 410 may operate, such as based on the process parameters 402, to model a predicted pattern 412 of the patterning process.
  • the process model 410 may be any appropriate process model, such as an optical model, a resist model, an etch model, a lithography model, etc.
  • the process model 410 may be or include a metrology apparatus model, such as an SEM model.
  • the process model 410 may be a physical model, semi-physical model, etc. of the patterning process.
  • the process model 410 may be a machine learning model.
  • the process model 410 may be a set of models, including models which operate upon outputs of other models sequentially or serially.
  • the process model 410 may include a resist model and a SEM imaging model, where the SEM imaging model may operate on an output of the resist model to produce a predicted image of a resist patterning process.
  • the process model 410 may be calibrated, including iteratively, based on metrology results (e.g., based on output of metrology tools).
  • the process model 410 may operate before, after, or during a patterning process.
  • the process model 410 may output results which may be stored and accessed during or after a patterning process, which may increase the speed at which the results of the process model 410 may be obtained.
  • the process model 410 may output a predicted pattern 412.
  • the predicted pattern 412 may be a metrology image, such as a predicted SEM image.
  • the predicted pattern 412 may be a three-dimensional pattern, such as a resist development pattern, which may be used to generate a two-dimensional pattern or image such as by taking planar slices of the three-dimensional pattern.
  • the predicted pattern 412 may be an image of an etched feature, where different heights (e.g., depths) of features may be visible in the image.
  • the predicted pattern 412 may be an image of multiple materials, where different properties of the materials (e.g., secondary electron emission, angle of incidence, etc.) may allow resolution of areas, shapes, etc. of the various materials.
  • the predicted pattern 412 may be of any appropriate resolution, field of view (FOV), etc.
  • the predicted pattern 412 may correspond to the pattern 404 for a full wafer, or for part of a wafer.
  • the predicted pattern 412 may be operated on to obtain one or more predicted locations 416 on the predicted pattern 412.
  • the predicted locations 416 may lie on feature contours in the predicted pattern 412.
  • the predicted locations 416 may be predicted contours.
  • the predicted locations 416 may be output by the process model 410 directly, or may be determined (e.g., identified) based on a predicted pattern 412 output by the process model.
  • the predicted locations 416 may be identified along features which are expected to be visible in an image of the patterning process. For example, the predicted locations 416 may lie along a resist development edge, an etch trench edge, a contact hole fill edge, etc.
  • the predicted locations 416 may be identified based on predicted image transition points in the predicted pattern 412.
  • the predicted locations 416 may identify feature shape.
  • the predicted locations 416 may identify feature location, which may be location relative to another feature, absolute location, etc.
  • the predicted locations 416 may be one-dimensional locations (e.g., point locations).
  • the predicted locations 416 may be identified as vectors or other multi-dimensional objects.
  • the predicted locations 416 may have a point location and a vector component, where the vector component may point perpendicular (or parallel) to a feature contour.
  • the predicted locations 416 may have a thickness, such as of an image transition from one pixel value to another, which may represent a thickness of a feature transition (e.g., etch feature sidewall angle).
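As a sketch of the point-plus-vector representation described above, the snippet below attaches a unit normal to each sampled contour point of a hypothetical circular feature. The contour shape, sample count, and array names are illustrative assumptions, not values from the disclosure.

```python
import numpy as np

# Hypothetical circular feature contour sampled at 16 points (counter-clockwise).
theta = np.linspace(0.0, 2.0 * np.pi, 16, endpoint=False)
center, radius = np.array([50.0, 50.0]), 10.0
points = center + radius * np.column_stack([np.cos(theta), np.sin(theta)])

# Tangent estimated by central differences along the closed contour; the
# normal is the tangent rotated 90 degrees and normalized to unit length.
tangents = np.roll(points, -1, axis=0) - np.roll(points, 1, axis=0)
normals = np.column_stack([tangents[:, 1], -tangents[:, 0]])
normals /= np.linalg.norm(normals, axis=1, keepdims=True)

# Each predicted location is then a (point, outward unit normal) pair.
gauges = np.hstack([points, normals])  # shape (16, 4): x, y, nx, ny
```

For a circle this recovers the outward radial direction exactly; for an arbitrary contour the same central-difference estimate yields a per-point direction along which an edge-placement difference could be measured.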
  • One or more instance 420 of the patterning process may occur, such as based on the process parameters 402, to produce a fabricated pattern 422.
  • One or more instance 420 of the patterning process may be selected from instances of the patterning process which occur, have occurred, will occur, etc.
  • the one or more instance 420 of the patterning process may be an instance of the patterning process on a single wafer, on a batch of wafers, etc.
  • the one or more instance 420 of the patterning process may be performed on production wafers (e.g., wafers to be fully fabricated to produce ICs), on test or calibration wafers (e.g., wafers which may only be partially fabricated or wafers which may only have the specific patterning process performed and not others in a fabrication process), etc.
  • the one or more instance 420 of the patterning process may be selected from a production line.
  • the one or more instance 420 of the patterning process may be performed specifically in order to perform one or more embodiment described herein.
  • the one or more instance 420 of the patterning process may be selected based on control parameters.
  • the one or more instance 420 of the patterning process may be selected for metrology (e.g., imaging) based on an image or other metrology measure being out of range, trending out of range, or any other appropriate control measure.
  • each instance 420 of the patterning process may be selected for performance of one or more of the embodiments described herein, such as in-line metrology during one or more fabrication step.
  • the number of instances 420 of the patterning process selected may be determined based on the time required for imaging, the criticality of the patterning process in the fabrication procedure, etc.
  • the fabricated pattern 422 may correspond to the predicted pattern 412.
  • the predicted pattern 412 may be selected (e.g., selected to be generated by the process model 410) based on the fabricated pattern 422 (e.g., generated by the instance 420 of the patterning process), and vice versa.
  • the fabricated pattern 422 may contain defects, such as may be introduced by a fabrication process, and may vary from the predicted pattern 412.
  • the fabricated pattern 422 may be a physical pattern.
  • the fabricated pattern 422 may be a three-dimensional pattern (e.g., with varying height or depth along an axis perpendicular to the planar axis of the wafer) or may be substantially two-dimensional (for example, planarized).
  • the fabricated pattern 422 may contain multiple materials, which may have different imaging properties.
  • the fabricated pattern 422 of the patterning process may be subjected to metrology, such as through a measurement process 424.
  • the measurement process 424 may be scanning electron microscopy (SEM).
  • the measurement process 424 may be any appropriate imaging process.
  • the measurement process 424 may be a pre-calibrated imaging process.
  • the measurement process 424, even when pre-calibrated, may introduce imaging distortions.
  • the measurement process 424 may introduce perspective distortion—which may be caused by a non-perpendicular angle of incidence of illumination on the fabricated pattern 422.
  • the measurement process 424 may output an image of the fabricated pattern 422, which may contain distortions.
  • the image may correspond to an image of the predicted pattern 412.
  • the image may be a two-dimensional representation of the fabricated pattern 422.
  • the image may be of any appropriate resolution, field of view (FOV), etc.
  • the image may correspond to the fabricated pattern 422 for a full wafer, for part of a wafer, etc.
  • the image of the fabricated pattern 422 may have the same or different resolution, size, etc. as the image of the predicted pattern 412.
  • the image of the fabricated pattern 422 may be adjusted in resolution, magnification, etc. to correspond to the image of the predicted pattern 412, and vice versa.
  • the image of the fabricated pattern 422, such as output by the measurement process 424, may be operated on to obtain one or more measured locations 426 on the fabricated pattern 422.
  • the measured locations 426 may lie on feature contours in the image of the fabricated pattern 422.
  • the measured locations 426 may be measured contours, which may be identified by contour matching, template matching, etc.
  • the measured locations 426 may be output by the measurement process 424 directly, or may be determined (e.g., identified) based on an image output by the measurement process 424.
  • the measured locations 426 may be identified along features which are visible in the image of the fabricated pattern 422. For example, the measured locations 426 may lie along a resist development edge, an etch trench edge, a contact hole fill edge, etc.
  • the measured locations 426 may be identified (e.g., as absolute or relative locations) based on image transition points (e.g., transitions in pixel values) in the image of the fabricated pattern 422.
  • the measured locations 426 may qualify feature shape (e.g., lie along an outline of a feature showing its shape).
  • the measured locations 426 may determine feature location, which may be location relative to another feature, absolute location, etc.
  • the measured locations 426 may be one-dimensional locations (e.g., point locations).
  • the measured locations 426 may be determined as vectors or other multi-dimensional objects.
  • the measured locations 426 may have a point location and a vector component, where the vector component may point perpendicular (or parallel) to a feature contour.
  • the measured locations 426 may have a thickness, such as of an image transition from one pixel value to another, which may represent a thickness of a feature transition (e.g., etch feature sidewall angle).
  • the measured locations 426 may correspond to the predicted locations 416, such as in a one-to-one relationship. Predicted locations 416 may be generated based on determined measured locations 426, or vice versa. Predicted locations 416 may be generated on the predicted pattern 412 based on measured locations 426 determined on the fabricated pattern 422, or vice versa.
  • An image distortion determination 430 may be performed based on the measured locations 426 and the predicted locations 416. The image distortion determination may be performed based on differences between the measured locations 426 and the predicted locations 416. The differences between the measured locations 426 and the predicted locations 416 may be differences between one of the measured locations 426 and a corresponding one of the predicted locations 416.
  • the differences between the measured locations 426 and the predicted locations 416 may be vector quantities, such as edge placement (EP) gauges.
  • the differences between the measured locations 426 and the predicted locations 416 may include information about location (e.g., absolute or relative location) and information about direction of the differences.
  • the differences between the measured locations 426 and the predicted locations 416 may be two-dimensional differences (e.g., vectors), three-dimensional vectors (for example, if height or depth information is known), etc.
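The per-gauge differences described above can be sketched as two-dimensional vectors between corresponding measured and predicted locations. The coordinates below are invented for illustration.

```python
import numpy as np

# Hypothetical one-to-one corresponding locations, rows of (x, y) in pixels.
predicted = np.array([[10.0, 5.0], [20.0, 5.0], [10.0, 15.0], [20.0, 15.0]])
measured = np.array([[10.4, 5.1], [20.9, 5.2], [10.3, 15.6], [20.8, 15.9]])

# Each difference carries both direction and magnitude, like an EP gauge.
diffs = measured - predicted                        # shape (N, 2): (dx, dy)
magnitudes = np.linalg.norm(diffs, axis=1)          # per-gauge distance
directions = np.arctan2(diffs[:, 1], diffs[:, 0])   # per-gauge angle (radians)
```

The resulting vector field (locations plus `diffs`) is the kind of input from which a distortion type can later be characterized.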
  • the image distortion determination 430 may determine multiple image distortions. For example, the image distortion determination may determine that the image of the fabricated pattern 422 contains a magnification distortion and a perspective distortion.
  • the image distortion determination may determine an offset 440 due to the process model 410, which may correspond to distortion in the predicted pattern 412 relative to the fabricated pattern 422.
  • the offset 440 due to the process model 410 may be used to calibrate (e.g., further calibrate) the process model 410, update the process parameters 402, etc. at a model calibration operation 442.
  • the calibrated process model 410 may then be used to generate additional predicted patterns 412 and predicted locations 416, for the same instance 420 of the patterning process or subsequent instances of the patterning process.
  • the image distortion determination may determine an offset due to image distortion 450, which may correspond to image distortion introduced by the measurement process 424.
  • the offset due to the image distortion 450 may be used to generate an image correction 452.
  • the image correction 452 may be applied to the image of the fabricated pattern 422, such as output by the measurement process 424.
  • the image correction 452 may be applied to the measured locations 426.
  • the corrected image of the fabricated pattern 422 may be used to generate corrected measurement locations.
  • the corrected measured locations may then be used to determine a subsequent image distortion.
  • the corrected measured locations 426 may be used to determine image distortions, which may be further ameliorated by the model calibration operation 442 and the image correction 452 until a termination criterion is reached.
  • the termination criterion may be a number of iterations, a total difference between the measured locations 426 and the predicted locations 416, etc.
  • the image distortion determination 430 may include determination of one or more distortion that may not be alleviated by either model calibration operation 442 or image correction 452.
  • a missing feature may be a defect in the instance 420 of the patterning process which is not caused by the process model 410 or the measurement process 424.
  • Identification of differences between the predicted pattern 412 and the fabricated pattern 422 which are not caused by imaging errors or modeling offsets may be vitally important to process control, as these differences may indicate defects in fabrication.
  • the image correction 452 may be determined based on the differences between the measured locations 426 and the predicted locations 416.
  • the image correction 452 may be determined for each instance 420 of the patterning process, a percentage of instances of the pattering process, etc.
  • the image correction 452 may be determined independently for each image of the fabricated pattern 422.
  • the image correction 452 does not assume that each image of the fabricated pattern 422 experiences the same image distortion (e.g., that the distortion is constant for the measurement process 424). As the measurement process 424 does not experience a constant level of distortion, even for multiple images of the same fabricated pattern 422, this produces a more accurate image correction 452, which in turn leads to more accurate control of the patterning process.
  • the image correction 452 may be determined for images of the fabricated pattern 422 of multiple sizes (e.g., FOV size). An image distortion may be minimized by reducing a FOV for an SEM image; however, a smaller FOV size may reduce the number of features which may be imaged and increase the imaging time.
  • Figure 5 depicts a schematic representation of an example image correction according to an embodiment of the present disclosure.
  • patterns predicted by a process model may be used to determine an image distortion correction for an image of output of a patterning process.
  • An example feature is depicted for an exemplary patterning process—the rounded bar 510.
  • the example feature may be any appropriate feature of a patterning process.
  • the example feature may be output by a process model, such as operating on a set of process parameters for a patterning process.
  • the rounded bar 510 represents a predicted location and shape of the example feature of the patterning process.
  • the predicted location and shape may be output by the process model or by an imaging model operating on or within the process model.
  • the predicted location and shape may be an ideal location and shape, such as from a design layout.
  • a measured feature may be compared to an ideal feature (e.g., in location and shape) and a predicted feature (e.g., output by a process model).
  • a measured feature may be compared to an ideal feature (e.g., from a design layout) and a predicted feature may also be compared to the ideal feature (e.g., the same ideal feature to which the measured feature is compared). The differences between the measured feature and the ideal feature and the predicted feature and the ideal feature may then be compared.
  • the differences between the measured feature and the ideal feature may be in the form of EP gauges.
  • the differences between the predicted feature and the ideal feature may also be in the form of EP gauges.
  • the intervening feature (such as an ideal feature) may be any appropriate feature to which both the measured feature and the predicted feature are compared.
  • the predicted location and shape of the rounded bar 510 may be determined by contour matching, template matching, an imaging model, etc.
  • the predicted location and shape of the example feature may be defined by a set of predicted locations 512.
  • the predicted locations 512 may be generated by the process model, may be located on predicted contours, etc.
  • the predicted locations 512 may have any appropriate density—e.g., may be substantially continuous in the form of a contour.
  • the predicted locations 512 may be points, vectors, etc.
  • the predicted locations 512 may be selected based on image quantities, such as pixel value derivatives, or predicted pattern quantities, such as three-dimensional depth predictions.
  • a rounded bar 520 represents a measured location and shape of the example feature of the patterning process. The measured location and shape may be determined based on an image of a fabricated pattern fabricated according to the patterning process. The measured location and shape may be determined based on a metrology image. The measured location and shape may be determined based on an SEM image. The measured location and shape of the rounded bar 520 may be determined by contour matching, template matching, image processing, etc. The measured location and shape of the example feature may be defined by a set of measured locations 522.
  • the measured locations 522 may be determined by an imaging process, based on identified contours, etc.
  • the measured locations 522 may have any appropriate density—e.g., may be substantially continuous in the form of a contour.
  • the density of the measured locations 522 and the predicted locations 512 may be substantially commensurate.
  • the measured locations 522 may be points, vectors, etc.
  • the type of locations of the measured locations 522 and the predicted locations 512 (e.g., points, vectors, etc.) may be the same.
  • the measured locations 522 may be selected based on image quantities, such as pixel value derivatives.
  • the difference between the predicted locations 512 and the measured locations 522 may be described by a vector 524; such vectors are further described below.
  • a portion 530 of the example feature is enlarged in image 540.
  • the image 540 depicts predicted locations 512A, as previously described in reference to predicted locations 512, on a predicted contour 510A and measured locations 522A, as previously described in reference to measured locations 522, on a measured contour 520A.
  • a difference is determined for each of the predicted locations 512A and a corresponding measured location 522A.
  • the difference between each predicted location 512A and measured location 522A is depicted as one of vectors 541-554.
  • the vectors 541-554 contain information about the direction and magnitude of the difference between the predicted location 512A and the measured location 522A.
  • an image distortion may be determined.
  • the image distortion operation may be any appropriate geometric transform, such as an affine transform, a perspective transform, etc.
  • the image distortion may be determined based on a difference between measured locations, described by values x and y (e.g., points (x, y)), and predicted locations, described by values X and Y (e.g., points (X, Y)).
  • the values x and y may be adjusted (e.g., corrected) to more closely conform with the values X and Y based on an image transformation operation.
  • the same image transformation operation may be applied to all points of the measured locations (e.g., to the image of the fabricated pattern).
  • the values of the image transformation operation may be determined based on solving one or more matrices, such as by a least squares linear solver, to generate values of the image transformation which maximize image correction over all measured and predicted locations.
  • the image transformation operation may correct multiple distortions, and may or may not determine where image distortions arise (e.g., from the patterning process, from modeling of the patterning process, from imaging of the fabrication pattern).
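One way to realize the least-squares fit described above is to stack one [x, y, 1] row per measured location and solve jointly for the affine coefficients mapping measured points toward predicted points. The synthetic distortion values below are assumptions for illustration, not values from the disclosure.

```python
import numpy as np

# Hypothetical measured (x, y) locations on a fabricated-pattern image.
measured = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])

# Simulate predicted (X, Y) locations related by a known affine distortion:
# slight magnification and shear plus a translation.
true_A = np.array([[1.02, 0.00], [0.01, 0.98]])
true_t = np.array([0.5, -0.3])
predicted = measured @ true_A.T + true_t

# Design matrix: one row [x, y, 1] per gauge; lstsq fits the 3x2 coefficient
# matrix (affine part plus translation) for X and Y simultaneously.
design = np.hstack([measured, np.ones((len(measured), 1))])
coeffs, *_ = np.linalg.lstsq(design, predicted, rcond=None)

corrected = design @ coeffs
rms = np.sqrt(np.mean(np.sum((corrected - predicted) ** 2, axis=1)))
```

Because the synthetic data are exactly affine, the solver recovers the distortion and the residual RMS is essentially zero; with real gauges the same fit would minimize, rather than eliminate, the residual.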
  • these matrices may be expanded to have the form given by Equation 7, below, in which each pair of a measured location (x, y) and a predicted location (X, Y) contributes two rows of a linear system in the perspective-transform coefficients h₁ through h₈:

        [ x  y  1  0  0  0  -xX  -yX ] [h₁ h₂ h₃ h₄ h₅ h₆ h₇ h₈]ᵀ = [ X ]
        [ 0  0  0  x  y  1  -xY  -yY ]                               [ Y ]    (7)
  • both the affine and perspective transform operations may be solved for, given differences between measured locations 522A and predicted locations 512A.
  • the transform operation which best corrects the obtained image of the measured locations 522A may be determined, such as based on root mean squared (RMS) errors between measured locations 522A and predicted locations 512A.
  • Other methods may be used to evaluate and select transform operations for the image distortion determination.
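A perspective (homography) transform can be solved from an Equation-7-style system, two rows per gauge, in the standard direct-linear-transform layout with the ninth coefficient fixed to 1. The point set and true transform below are invented for illustration.

```python
import numpy as np

def fit_perspective(measured, predicted):
    # Two rows per (x, y) -> (X, Y) pair, solved by least squares for the
    # eight free homography coefficients (ninth fixed to 1).
    rows, rhs = [], []
    for (x, y), (X, Y) in zip(measured, predicted):
        rows.append([x, y, 1.0, 0.0, 0.0, 0.0, -x * X, -y * X]); rhs.append(X)
        rows.append([0.0, 0.0, 0.0, x, y, 1.0, -x * Y, -y * Y]); rhs.append(Y)
    h, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return np.append(h, 1.0).reshape(3, 3)

def apply_homography(H, pts):
    homo = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return homo[:, :2] / homo[:, 2:3]   # divide by the projective term

# Hypothetical gauges under a mild perspective warp.
measured = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0], [5.0, 5.0]])
H_true = np.array([[1.0, 0.02, 0.3], [0.01, 0.99, -0.2], [0.001, 0.0005, 1.0]])
predicted = apply_homography(H_true, measured)

H = fit_perspective(measured, predicted)
residual = apply_homography(H, measured) - predicted
rms = np.sqrt(np.mean(np.sum(residual ** 2, axis=1)))
```

Fitting both an affine and a perspective transform this way, and comparing their RMS errors over all gauges, is one way to select the transform that best corrects a given image, as described above.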
  • an image distortion may be determined based on the vectors 541-554 of image 540. Based on the determined image distortion (e.g., the affine or perspective transformation of a matrix A), an image correction operation may be determined. The image correction may be applied to the image 540 by an operation 560 to generate a corrected image 570.
  • the corrected image 570 depicts predicted locations 512B, as previously described in reference to predicted locations 512, on a predicted contour 510B and measured locations 522B, as previously described in reference to measured locations 522, on a measured contour 520B.
  • the predicted locations 512B may be the predicted locations 512A, such as when the process model remains unchanged.
  • the predicted locations 512B may vary from the predicted locations 512A if the process model has changed, such as by calibration, changed in process parameters, etc.
  • some of the measured locations 522B may be substantially the same as some of the measured locations 522A, such as if the image correction is small in one or more region of the image 570.
  • a difference may again be determined.
  • the difference between each predicted location 512B and measured location 522B is depicted as one of vectors 571-584.
  • the vectors 571-584 contain information about the direction and magnitude of the difference between the predicted location 512B and the measured location 522B.
  • the vectors 571-584 may be smaller than the vectors 541-554. In some instances, some of the vectors 571-584 may be the same size as, larger than, different in direction from, etc., the vectors 541-554.
  • the image correction operation 560 may reduce a total error (e.g., difference) between the measured locations 522B and the predicted locations 512B (relative to the total error (e.g., difference) between the measured locations 522A and the predicted locations 512A) but may increase a distance between one or more of the measured locations 522B and a corresponding predicted location 512B (versus the previous distance between the measured locations 522A and the corresponding predicted location 512A).
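The effect described above, a correction that lowers the total error while worsening an individual gauge, can be seen in a small least-squares example. The gauge coordinates are invented for illustration: three gauges share a systematic shift and one is already exact.

```python
import numpy as np

predicted = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
# Three gauges shifted by +1 in x; the fourth already matches its prediction.
measured = predicted + np.array([[1.0, 0.0], [1.0, 0.0], [1.0, 0.0], [0.0, 0.0]])

# Fit one affine correction to all gauges jointly and apply it.
design = np.hstack([measured, np.ones((len(measured), 1))])
coeffs, *_ = np.linalg.lstsq(design, predicted, rcond=None)
corrected = design @ coeffs

before = np.linalg.norm(measured - predicted, axis=1)    # per-gauge error
after = np.linalg.norm(corrected - predicted, axis=1)
rms_before = np.sqrt(np.mean(before ** 2))
rms_after = np.sqrt(np.mean(after ** 2))
```

Here the fitted correction reduces the overall RMS error, yet the fourth gauge, which started with zero error, is moved away from its prediction, since the single transform must compromise across all gauges.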
  • the image transformation operation may correct for magnification differences, translation offsets, and other linear transforms between the measured locations 522A and the predicted locations 512A.
  • the process model or another model or operation may include one or more operation to match the measured locations 522A and the predicted locations 512A by means of magnification, translation, and other linear adjustments.
  • Figures 6A-6B illustrate example image distortions identifiable by process-model-based image distortion characterization according to an embodiment of the present disclosure. According to an embodiment of the present disclosure, various distortions may be identified and corrected by an image distortion determination.
  • Figures 6A-6B illustrate an example representation 600 of a pattern for a patterning process. The example representation 600 depicts the pattern as ideally modelled, fabricated, imaged, etc. The example representation 600 includes vertical and horizontal major and minor axes, in order to allow for description of image distortions.
  • the example representation 600 is used as the predicted pattern for a patterning process. This may be true if the patterning process is well-characterized, but in other cases the predicted pattern may be different from the ideal pattern (e.g., intended pattern, design layout, etc.) due to processing limitations.
  • In Figure 6A, images of patterns fabricated by the patterning process with a perspective distortion 610, a skew distortion 620, and a one-dimensional stretching distortion 630 are depicted.
  • In Figure 6B, images of patterns fabricated by the patterning process with a magnification distortion 640, a feature enlargement distortion 650, and without a distortion 660 are depicted.
  • Each of these distortions may be present in an image of a fabricated pattern or in the fabricated pattern itself, together with other distortions, alone, in combination, etc.
  • the images of the fabricated patterns may be compared to the predicted pattern, such as by using contours, measured and predicted locations, etc.
  • the measured locations and the predicted locations for each of the images of the fabricated patterns may be compared in overlay images 612, 622, 632, 642, 652, and 662, respectively.
  • the overlay images show the major and minor horizontal and vertical axes of the predicted pattern in solid lines and as measured from the fabricated patterns in dashed lines. For features of the fabricated pattern, differences between measured locations and predicted locations may be determined, such as from an overlay.
  • the differences may be determined as vectors (e.g., having values in both x and y directions).
  • the pattern of the differences may be used to determine a type of distortion, and subsequently to determine if the identified distortion is introduced by an imaging operation, a modeling operation, a patterning process, etc.
  • the differences between the predicted locations and the measured locations for the perspective distortion 610 are shown in vector map 612; the differences between the predicted locations and the measured locations for the skew distortion 620 are shown in vector map 622; and the differences between the predicted locations and the measured locations for the one-dimensional stretching distortion 630 are shown in vector map 632.
  • each of the vector maps may be characteristic of a type of distortion. Based on the vector maps, one or more types of distortion may be identified in an image of a fabricated pattern. For some of the types of distortion, the distortion may not be an image distortion, but may rather be a processing induced distortion (or a modeling offset). For example, the feature enlargement distortion 650 may be caused by over development.
  • Feature enlargement may cause a bimodal distribution of differences between measured locations and predicted locations, where measured locations are shifted further in an etch direction.
  • This type of distortion may not be corrected by an affine or perspective image transformation, but may rather cause adjustment of a process model.
  • an image may not contain an identifiable image distortion, such as if the image is an ideal image of the fabricated pattern.
  • the differences between the measured locations and the predicted locations may be due to random variations in processing and may not display any identifiable distribution in the vector map.
  • the vectors may follow a Gaussian distribution about a zero point or even a flat distribution.
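A minimal check of the no-identifiable-distortion case can be sketched by testing whether the residual difference vectors have a near-zero mean. The noise level, sample count, and random seed are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical post-correction residual vectors: zero-mean Gaussian noise,
# as might remain when only random process variation is present.
residuals = rng.normal(0.0, 0.1, size=(500, 2))

mean_vec = residuals.mean(axis=0)
# A systematic distortion (skew, magnification, perspective) would leave a
# structured vector map; a mean vector within a few standard errors of zero
# is consistent with purely random variation.
standard_error = 0.1 / np.sqrt(len(residuals))
is_random_like = bool(np.linalg.norm(mean_vec) < 3.0 * standard_error)
```

A fuller classifier would also inspect the spatial structure of the vector map (e.g., fit the characteristic affine or perspective patterns and test their significance), since a zero mean alone does not rule out symmetric distortions such as pure magnification.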
  • FIG. 7 is a flowchart illustrating a method for process-model-based image distortion identification according to an embodiment of the present disclosure. Each of these operations is described in detail below. The operations of method 700 presented below are intended to be illustrative. In some embodiments, method 700 may be accomplished with one or more additional operations not described, and/or without one or more of the operations discussed. Additionally, the order in which the operations of method 700 are illustrated in Figure 7 and described below is not intended to be limiting.
  • one or more portions of method 700 may be implemented (e.g., by simulation, modeling, etc.) in one or more processing devices (e.g., one or more processors).
  • the one or more processing devices may include one or more devices executing some or all of the operations of method 700 in response to instructions stored electronically on an electronic storage medium.
  • the one or more processing devices may include one or more devices configured through hardware, firmware, and/or software to be specifically designed for execution of one or more of the operations of method 700, for example.
  • characteristics of a patterning process are acquired.
  • the patterning process may be a lithography process, or part thereof.
  • the characteristics may include a pattern, such as a design layout, a mask layout, a mask image, etc.
  • the characteristics may include a recipe or patterning process description, which may include any appropriate characteristics.
  • the characteristics may be settings which are applied to the patterning process.
  • the characteristics may include measurements of various parameters during a patterning process.
  • a process model is obtained.
  • the process model may be a physical model, semi-physical model, machine learning model, etc.
  • the process model may be an optical model, a resist model, an etch model, etc.
  • the process model may operate on the characteristics of the patterning process to determine a predicted output of the patterning process.
  • the process model may operate on ideal characteristics, measured characteristics, etc.
  • the process model may operate in time with a patterning process or asynchronously, such as before a patterning process with results stored in memory.
  • the process model may generate a predicted image based on the process characteristics.
  • the process model may generate an image directly or generate a predicted output which may be fed into an imaging model to generate a predicted image.
  • the predicted image may contain predicted contours or predicted locations of features.
  • the predicted image may be processed to identify predicted locations of features, which may define or outline features in the predicted image.
  • an image of a fabricated pattern is obtained. The fabricated pattern is fabricated by the patterning process using the characteristics of the patterning process.
  • the fabricated pattern may be fabricated by any appropriate means, such as by photolithography, development of resist, etching, etc.
  • the image of the fabricated pattern may be obtained by any appropriate method, such as by SEM.
  • the image of the fabricated pattern may be obtained from storage (e.g., from memory), such as for training of a process model.
  • the image of the fabricated pattern may or may not contain an image distortion, such as one induced by the imaging method, imaging apparatus, etc.
  • the image of the fabricated pattern may be processed, such as by an image processing operation, to identify features in the fabricated pattern.
  • the features may be identified by measured contours, measured locations, etc.
  • the image may be processed to generate measured contours, measured locations, etc. which correspond to the features.
  • a difference is determined between the predicted image and the measured image.
  • the difference may be determined based on differences between predicted locations in the predicted image and measured locations in the measured image.
  • the difference may be determined by any appropriate method, such as those previously described.
  • the difference may be expressed as a set of vectors.
  • the difference may be determined based on EP gauges.
  • an image distortion is determined based on the determined differences between the predicted image and the measured image.
  • the image distortion may be an affine distortion.
  • the image distortion may be a perspective distortion.
  • the image distortion may be determined based on a characteristic distribution of differences between the predicted image and the measured image.
  • the image distortion may be classified, including by machine learning.
  • the image distortion may be quantified, e.g., its magnitude may be determined, its angle of rotation may be determined.
  • the image distortion may be identified as containing two or more separate image distortions, which may be separately correctable.
  • the image distortion may be communicated to a measurement operation for correction. For example, a perspective distortion may be caused by an offset in the angle of incidence of illumination of a SEM which may be corrected once identified.
  • the image distortion may be used to calibrate a process model.
  • the image distortion may be classified as corresponding to an imaging process, a patterning process, or a modeling process. Once the image distortion is classified, it may be communicated to a controller of a given process for correction.
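One possible way to determine a candidate image transformation from the point correspondences (a sketch under the assumption of an affine distortion, not the disclosure's exact formulation) is a least-squares affine fit; the fitted matrix can then be inspected to quantify the distortion, e.g. its implied rotation angle. The helper names are illustrative.

```python
import numpy as np

def fit_affine(predicted, measured):
    """Least-squares affine transform mapping predicted -> measured.

    Solves measured_i ~= A @ predicted_i + t for a 2x2 matrix A and a
    translation t, using homogeneous coordinates.
    """
    predicted = np.asarray(predicted, dtype=float)
    measured = np.asarray(measured, dtype=float)
    n = len(predicted)
    # Homogeneous design matrix with rows [x, y, 1]
    X = np.hstack([predicted, np.ones((n, 1))])
    # Solve X @ P = measured in the least-squares sense; P is 3x2
    P, *_ = np.linalg.lstsq(X, measured, rcond=None)
    A, t = P[:2].T, P[2]
    return A, t

def rotation_angle(A):
    """Rotation angle (radians) implied by the 2x2 part of the fit."""
    return float(np.arctan2(A[1, 0], A[0, 0]))
```

A perspective distortion would need an additional projective (homography) term and cannot be captured by this affine fit alone.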
  • the measured image is corrected based on the determined image distortion.
  • the measured image is corrected if and only if the determined image distortion corresponds to an imaging-process-induced distortion.
  • the measured image may be corrected by application of an image transformation, such as an affine transformation, a perspective transformation, an enlargement transformation, etc.
  • the corrected image may be returned to the operation 708 for further image distortion determination.
  • the process model may be updated based on the corrected image (e.g., the corrected measured image).
  • the process model may be corrected based on differences between the corrected image and the predicted image.
  • the process model may be corrected based on image distortions determined to correspond to the process model (such as at the operation 712).
  • the process model may be calibrated.
  • the process model may be additionally trained based on the corrected image.
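If a determined distortion is attributed to the imaging process, correcting the measured locations can be sketched as inverting the fitted transformation (illustrative only; a production flow would also handle perspective terms and could re-run the distortion determination of operation 708 on the corrected result).

```python
import numpy as np

def correct_locations(measured, A, t):
    """Undo an affine image distortion on measured (x, y) locations.

    Inverts x_meas = A @ x_true + t to recover estimates of the
    undistorted locations: x_true = A^{-1} @ (x_meas - t).
    """
    measured = np.asarray(measured, dtype=float)
    A_inv = np.linalg.inv(A)
    return (measured - t) @ A_inv.T
```

The corrected locations (or a corrected image produced by the analogous pixel-wise warp) could then be used to calibrate or further train the process model.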
  • FIG. 8 is a diagram of an example computer system CS that may be used for one or more of the operations described herein, according to an embodiment of the present disclosure.
  • Computer system CS includes a bus BS or other communication mechanism for communicating information, and a processor PRO (or multiple processors) coupled with bus BS for processing information.
  • Computer system CS also includes a main memory MM, such as a random-access memory (RAM) or other dynamic storage device, coupled to bus BS for storing information and instructions to be executed by processor PRO.
  • Main memory MM also may be used for storing temporary variables or other intermediate information during execution of instructions by processor PRO.
  • Computer system CS further includes a read only memory (ROM) ROM or other static storage device coupled to bus BS for storing static information and instructions for processor PRO.
  • a storage device SD such as a magnetic disk or optical disk, is provided and coupled to bus BS for storing information and instructions.
  • Computer system CS may be coupled via bus BS to a display DS, such as a cathode ray tube (CRT) or flat panel or touch panel display for displaying information to a computer user.
  • An input device ID is coupled to bus BS for communicating information and command selections to processor PRO.
  • Another type of user input device is cursor control CC, such as a mouse, a trackball, or cursor direction keys, for communicating direction information and command selections to processor PRO and for controlling cursor movement on display DS.
  • This input device typically has two degrees of freedom in two axes, a first axis (e.g., x) and a second axis (e.g., y), that allows the device to specify positions in a plane.
  • a touch panel (screen) display may also be used as an input device.
  • portions of one or more methods described herein may be performed by computer system CS in response to processor PRO executing one or more sequences of one or more instructions contained in main memory MM. Such instructions may be read into main memory MM from another computer-readable medium, such as storage device SD. Execution of the sequences of instructions included in main memory MM causes processor PRO to perform the process steps (operations) described herein.
  • processors in a multi-processing arrangement may also be employed to execute the sequences of instructions contained in main memory MM.
  • hard-wired circuitry may be used in place of or in combination with software instructions.
  • the description herein is not limited to any specific combination of hardware circuitry and software.
  • the term “computer-readable medium” and/or “machine readable medium” as used herein refers to any medium that participates in providing instructions to processor PRO for execution. Such a medium may take many forms, including but not limited to, non-volatile media, volatile media, and transmission media.
  • Non-volatile media include, for example, optical or magnetic disks, such as storage device SD.
  • Volatile media include dynamic memory, such as main memory MM.
  • Transitory computer-readable media can include a carrier wave or other propagating electromagnetic signal, for example.
  • Various forms of computer readable media may be involved in carrying one or more sequences of one or more instructions to processor PRO for execution.
  • the instructions may initially be borne on a magnetic disk of a remote computer.
  • the remote computer can load the instructions into its dynamic memory and send the instructions over a telephone line using a modem.
  • a modem local to computer system CS can receive the data on the telephone line and use an infrared transmitter to convert the data to an infrared signal.
  • An infrared detector coupled to bus BS can receive the data carried in the infrared signal and place the data on bus BS.
  • Network link NDL typically provides data communication through one or more networks to other data devices.
  • network link NDL may provide a connection through local network LAN to a host computer HC. This can include data communication services provided through the worldwide packet data communication network, now commonly referred to as the “Internet” INT.
  • the signals through the various networks and the signals on network data link NDL and through communication interface CI, which carry the digital data to and from computer system CS, are exemplary forms of carrier waves transporting the information.
  • Computer system CS can send messages and receive data, including program code, through the network(s), network data link NDL, and communication interface CI.
  • host computer HC might transmit a requested code for an application program through Internet INT, network data link NDL, local network LAN, and communication interface CI.
  • One such downloaded application may provide all or part of a method described herein, for example.
  • the received code may be executed by processor PRO as it is received, and/or stored in storage device SD, or other non-volatile storage for later execution. In this manner, computer system CS may obtain application code in the form of a carrier wave.
  • a method for determining distortion in an image based on a process model comprising: determining an image transformation operation based on a relationship between a plurality of measurement locations in an image and locations corresponding to the plurality of measurement locations in a predicted image generated by a process model; and characterizing image distortion in the image based on the image transformation operation.
  • the image comprises an image of a fabricated pattern.
  • the process model is a lithography process model.
  • the process model is a resist development model.
  • the process model is an etch model.
  • the relationship between the plurality of measurement locations and the predicted locations comprises a relationship between a set of contours.
  • the method of clause 1, wherein the relationship between the plurality of measurement locations and the predicted locations comprises a set of edge placement (EP) gauges.
  • the plurality of measurement locations comprise a set of EP gauges in the image.
  • the predicted locations comprise a set of EP gauges in the predicted image.
  • the plurality of measurement locations are obtained from a scanning electron microscope (SEM) image.
  • the SEM image comprises an image of a fabricated pattern, further comprising applying the image transformation operation to the image of the fabricated pattern to obtain a corrected SEM image.
  • determining the relationship comprises aligning the image and the predicted image.
  • the image transformation operation is based on an affine transformation.
  • the image transformation operation is based on a perspective transformation.
  • the image transformation operation comprises a transformation matrix.
  • the image transformation operation comprises a geometric transformation operation.
  • the plurality of measurement locations comprise at least some measurement locations that are locations in non-periodic patterns.
  • a method for image correction in a process model for a patterning process comprising: determining an image transformation operation based on differences between measured locations of multiple points on an image of an output of a patterning process and predicted locations corresponding to the multiple points, the predicted locations comprising locations predicted by a process model for the patterning process; and correcting the measured locations of the multiple points on the image based on the image transformation operation.
  • correcting the measured locations further comprises: determining whether the image transformation operation corresponds to a non-ideality in an imaging process based on the image transformation operation; and correcting the measured locations based on a determination that the image transformation operation corresponds to the non-ideality.
  • the determining whether the image transformation operation corresponds to the non-ideality in the imaging process comprises determining one or more modes in the differences between measured locations of the multiple points and predicted locations of the multiple points.
  • further comprising: correcting the image based on the image transformation operation.
  • the output of the patterning process contains non-periodic structures.

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Exposure And Positioning Against Photoresist Photosensitive Materials (AREA)
  • Image Processing (AREA)
  • Length-Measuring Devices Using Wave Or Particle Radiation (AREA)

Abstract

A method for determining distortion in an image based on a process model is disclosed, comprising: determining an image transformation operation based on a relationship between a plurality of measurement locations in an image and locations corresponding to the plurality of measurement locations in a predicted image generated by a process model; and characterizing image distortion in the image based on the image transformation operation.
PCT/EP2024/069759 2023-08-11 2024-07-11 Correction de distorsion d'image par microscopie électronique à balayage (meb) basée sur un modèle de processus Pending WO2025036631A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202363532337P 2023-08-11 2023-08-11
US63/532,337 2023-08-11

Publications (1)

Publication Number Publication Date
WO2025036631A1 true WO2025036631A1 (fr) 2025-02-20

Family

ID=91958850

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2024/069759 Pending WO2025036631A1 (fr) 2023-08-11 2024-07-11 Correction de distorsion d'image par microscopie électronique à balayage (meb) basée sur un modèle de processus

Country Status (2)

Country Link
TW (1) TW202516287A (fr)
WO (1) WO2025036631A1 (fr)

Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5229872A (en) 1992-01-21 1993-07-20 Hughes Aircraft Company Exposure device including an electrically aligned electronic mask for micropatterning
US20070031745A1 (en) 2005-08-08 2007-02-08 Brion Technologies, Inc. System and method for creating a focus-exposure model of a lithography process
US20070050749A1 (en) 2005-08-31 2007-03-01 Brion Technologies, Inc. Method for identifying and using process window signature patterns for lithography process control
US20080028361A1 (en) * 2006-07-25 2008-01-31 Eiji Yamanaka Pattern evaluation method and evaluation apparatus and pattern evaluation program
US20080301620A1 (en) 2007-06-04 2008-12-04 Brion Technologies, Inc. System and method for model-based sub-resolution assist feature generation
US20080309897A1 (en) 2007-06-15 2008-12-18 Brion Technologies, Inc. Multivariable solver for optical proximity correction
US20090157630A1 (en) 2007-10-26 2009-06-18 Max Yuan Method of extracting data and recommending and generating visual displays
US7587704B2 (en) 2005-09-09 2009-09-08 Brion Technologies, Inc. System and method for mask verification using an individual mask error model
US20100162197A1 (en) 2008-12-18 2010-06-24 Brion Technologies Inc. Method and system for lithography process-window-maximixing optical proximity correction
US20100180251A1 (en) 2006-02-03 2010-07-15 Brion Technology, Inc. Method for process window optimized optical proximity correction
US20110202898A1 (en) * 2010-02-16 2011-08-18 Mentor Graphics Corporation Contour Alignment For Model Calibration
WO2018121965A1 (fr) * 2016-12-28 2018-07-05 Asml Netherlands B.V. Alignement assisté par simulation entre une image de métrologie et une conception
WO2020035285A1 (fr) * 2018-08-15 2020-02-20 Asml Netherlands B.V. Utilisation d'apprentissage automatique pour sélectionner automatiquement à partir d'images brutes des images sem moyennées de haute qualité
EP3893057A1 (fr) * 2020-04-10 2021-10-13 ASML Netherlands B.V. Alignement d'une image déformée
US20220179321A1 (en) * 2019-03-25 2022-06-09 Asml Netherlands B.V. Method for determining pattern in a patterning process

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5229872A (en) 1992-01-21 1993-07-20 Hughes Aircraft Company Exposure device including an electrically aligned electronic mask for micropatterning
US20070031745A1 (en) 2005-08-08 2007-02-08 Brion Technologies, Inc. System and method for creating a focus-exposure model of a lithography process
US20070050749A1 (en) 2005-08-31 2007-03-01 Brion Technologies, Inc. Method for identifying and using process window signature patterns for lithography process control
US7587704B2 (en) 2005-09-09 2009-09-08 Brion Technologies, Inc. System and method for mask verification using an individual mask error model
US20100180251A1 (en) 2006-02-03 2010-07-15 Brion Technology, Inc. Method for process window optimized optical proximity correction
US20080028361A1 (en) * 2006-07-25 2008-01-31 Eiji Yamanaka Pattern evaluation method and evaluation apparatus and pattern evaluation program
US20080301620A1 (en) 2007-06-04 2008-12-04 Brion Technologies, Inc. System and method for model-based sub-resolution assist feature generation
US20080309897A1 (en) 2007-06-15 2008-12-18 Brion Technologies, Inc. Multivariable solver for optical proximity correction
US20090157630A1 (en) 2007-10-26 2009-06-18 Max Yuan Method of extracting data and recommending and generating visual displays
US20100162197A1 (en) 2008-12-18 2010-06-24 Brion Technologies Inc. Method and system for lithography process-window-maximixing optical proximity correction
US20110202898A1 (en) * 2010-02-16 2011-08-18 Mentor Graphics Corporation Contour Alignment For Model Calibration
WO2018121965A1 (fr) * 2016-12-28 2018-07-05 Asml Netherlands B.V. Alignement assisté par simulation entre une image de métrologie et une conception
WO2020035285A1 (fr) * 2018-08-15 2020-02-20 Asml Netherlands B.V. Utilisation d'apprentissage automatique pour sélectionner automatiquement à partir d'images brutes des images sem moyennées de haute qualité
US20220179321A1 (en) * 2019-03-25 2022-06-09 Asml Netherlands B.V. Method for determining pattern in a patterning process
EP3893057A1 (fr) * 2020-04-10 2021-10-13 ASML Netherlands B.V. Alignement d'une image déformée

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
"IMAGE DISTORTION CORRECTION IN CHARGED PARTICLE INSPECTION", vol. 695, no. 31, 31 January 2022 (2022-01-31), XP007150023, ISSN: 0374-4353, Retrieved from the Internet <URL:https://www.researchdisclosure.com/database/RD695031> [retrieved on 20220131] *
"PROCESS MODEL BASED SCANNING ELECTRON MICROSCOPY (SEM) IMAGE DISTORTION CORRECTION", vol. 714, no. 71, 15 September 2023 (2023-09-15), XP007151772, ISSN: 0374-4353, Retrieved from the Internet <URL:https://www.researchdisclosure.com/database/RD714071> [retrieved on 20230915] *
"REGION-DENSITY BASED MISALIGNMENT INDEX FOR IMAGE ALIGNMENT", vol. 701, no. 70, 19 August 2022 (2022-08-19), XP007150566, ISSN: 0374-4353, Retrieved from the Internet <URL:https://www.researchdisclosure.com/database/RD701070> [retrieved on 20220819] *

Also Published As

Publication number Publication date
TW202516287A (zh) 2025-04-16

Similar Documents

Publication Publication Date Title
TWI694316B (zh) 基於缺陷機率的製程窗
US10846442B2 (en) Methods and systems for parameter-sensitive and orthogonal gauge design for lithography calibration
US9009647B2 (en) Methods and systems for lithography calibration using a mathematical model for a lithographic process
JP5016585B2 (ja) リソグラフィプロセスウィンドウをシミュレートするための方法及びシステム
US8056028B2 (en) Method of performing mask-writer tuning and optimization
KR102828484B1 (ko) 결함 기반의 프로세스 윈도우에 기초하여 시뮬레이션 프로세스를 캘리브레이팅하기 위한 방법
US8751979B1 (en) Determining the gradient and Hessian of the image log slope for design rule optimization for accelerating source mask optimization (SMO)
US9588439B1 (en) Information matrix creation and calibration test pattern selection based on computational lithography model parameters
TW202240320A (zh) 基於表現匹配之調節掃描器之波前最佳化
CN102057330A (zh) 基于模型的扫描器调节方法
TW201341970A (zh) 用於進階微影術之可察知透鏡升溫的源光罩最佳化
US20200103764A1 (en) Lithography simulation method
TWI654476B (zh) 使用圖案化裝置形貌誘導相位之方法及設備
US7207030B2 (en) Method for improving a simulation model of photolithographic projection
WO2016096361A1 (fr) Procédé et appareil permettant d&#39;utiliser la phase induite par la topographie d&#39;un dispositif de formation de motif
TWI822310B (zh) 度量衡方法及裝置
US20240184219A1 (en) System and method to ensure parameter measurement matching across metrology tools
WO2016096346A1 (fr) Procédé et appareil pour l&#39;utilisation d&#39;une phase induite par la topograhie d&#39;un dispositif de formation de motifs
KR20210037696A (ko) 매칭 퓨필 결정
WO2024141256A1 (fr) Optimisation de masque source basée sur des effets systématiques subits par un appareil lithographique
WO2025036631A1 (fr) Correction de distorsion d&#39;image par microscopie électronique à balayage (meb) basée sur un modèle de processus
TW202236025A (zh) 對基板區域上的量測資料進行模型化的方法及相關設備
WO2024012800A1 (fr) Systèmes et procédés de prédiction de variation stochastique post-gravure
WO2025093227A1 (fr) Procédé d&#39;étalonnage d&#39;erreur stochastique avec expositions de microchamp
WO2025087650A1 (fr) Surveillance d&#39;un effet d&#39;évanouissement dans une simulation de lithographie informatisée

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 24743765

Country of ref document: EP

Kind code of ref document: A1