
CN119094877B - Image sensor, exposure method, exposure device and electronic equipment - Google Patents

Image sensor, exposure method, exposure device and electronic equipment

Info

Publication number: CN119094877B
Application number: CN202411094073.1A
Authority: CN (China)
Prior art keywords: pixel, exposure, image, pixel array, array
Legal status: Active (an assumption, not a legal conclusion)
Other versions: CN119094877A
Other languages: Chinese (zh)
Inventor: 李沛德
Current and original assignee: Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd; application granted; publication of application CN119094877A and of granted patent CN119094877B

Classifications

    • H: Electricity
    • H04: Electric communication technique
    • H04N: Pictorial communication, e.g. television
    • H04N23/00: Cameras or camera modules comprising electronic image sensors; control thereof
    • H04N23/50: Constructional details
    • H04N23/55: Optical parts specially adapted for electronic image sensors; mounting thereof
    • H04N23/54: Mounting of pick-up tubes, electronic image sensors, deviation or focusing coils
    • H04N23/70: Circuitry for compensating brightness variation in the scene
    • H04N23/741: Circuitry for compensating brightness variation in the scene by increasing the dynamic range of the image compared to the dynamic range of the electronic image sensors

Landscapes

  • Engineering & Computer Science
  • Multimedia
  • Signal Processing
  • Transforming Light Signals Into Electric Signals

Abstract

The application discloses an image sensor, an exposure method, an exposure device and electronic equipment, belonging to the technical field of photography. The image sensor comprises a pixel array, a microlens array arranged in correspondence with the pixel array, and a form switching structure connected with the microlens array. The pixel array comprises a plurality of pixel groups, each comprising N pixel units. The microlens array comprises a plurality of microlens groups, each comprising either one first microlens or at least two second microlenses, so that each pixel group corresponds to either one first microlens or at least two second microlenses. The form switching structure drives the microlens array to move, switching each pixel group from its corresponding first microlens to its corresponding at least two second microlenses, or vice versa. When a pixel group corresponds to a first microlens, it acquires phase information; when it corresponds to at least two second microlenses, it acquires image information.

Description

Image sensor, exposure method, exposure device and electronic equipment
Technical Field
The application belongs to the technical field of image pickup, and particularly relates to an image sensor, an exposure method, an exposure device and electronic equipment.
Background
With the development of shooting technology, the requirement of users on the focusing capability of the image sensor is increasing.
In the related art, phase detection auto focus (Phase Detection Auto Focus, PDAF) technology is generally used for auto-focusing during shooting. Specifically, phase focusing may be performed by an image sensor supporting PDAF, in which at least some pixels can output phase information. In general, the higher the proportion of pixels that output phase information, the higher the accuracy of auto-focusing.
However, the higher the proportion of pixels supporting PDAF in an image sensor, the lower the sharpness of the image captured by that sensor may be, due to factors such as reduced pixel coverage for sensing color and constraints imposed by the focusing-pixel arrangement.
Disclosure of Invention
The embodiment of the application aims to provide an image sensor, an exposure method, an exposure device and electronic equipment, which can improve the image definition on the basis of improving the phase focusing capability.
In a first aspect, an embodiment of the present application provides an image sensor comprising a pixel array, a microlens array arranged in correspondence with the pixel array, and a form switching structure connected with the microlens array. The pixel array comprises a plurality of pixel groups, each comprising N pixel units, where N is an integer greater than 1. The microlens array comprises a plurality of microlens groups, each comprising either one first microlens or at least two second microlenses, so that each pixel group corresponds to either one first microlens or at least two second microlenses. The form switching structure is configured to drive the microlens array to move, so as to switch each pixel group from its corresponding first microlens to its corresponding at least two second microlenses, or vice versa. When a pixel group corresponds to the first microlens, the pixel group acquires phase information; when it corresponds to the at least two second microlenses, it acquires image information.
In a second aspect, an embodiment of the present application provides an exposure method applied to the image sensor of the first aspect. The exposure method comprises: controlling the pixel array of the image sensor to perform progressive (row-by-row) exposure to obtain a first image frame; controlling the pixel array to suspend exposure at a first time and switching the microlens group corresponding to each pixel group through the form switching structure of the image sensor, wherein the first time is a time at which the pixel array has not finished exposing the first image frame and has begun exposing a second image frame; controlling the pixel array to continue the progressive exposure; controlling the pixel array to suspend exposure at a second time and switching the microlens group corresponding to each pixel group through the form switching structure again, wherein the second time is a time at which the pixel array has finished exposing the first image frame but has not finished exposing the second image frame; controlling the pixel array to continue the progressive exposure until the second image frame is completely exposed; and obtaining target phase information or target image information based on first data corresponding to the first image frame and second data corresponding to the second image frame.
In a third aspect, an embodiment of the present application provides an exposure apparatus, which may include the image sensor of the first aspect and a control module. The control module is configured to: control the pixel array of the image sensor to perform progressive exposure to obtain a first image frame; control the pixel array to suspend exposure at a first time and switch the microlens group corresponding to each pixel group through the form switching structure of the image sensor, wherein the first time is a time at which the pixel array has not finished exposing the first image frame and has begun exposing a second image frame; control the pixel array to continue the progressive exposure; control the pixel array to suspend exposure at a second time and switch the microlens group corresponding to each pixel group through the form switching structure again, wherein the second time is a time at which the pixel array has finished exposing the first image frame but has not finished exposing the second image frame; control the pixel array to continue the progressive exposure until the second image frame is completely exposed; and obtain target phase information or target image information based on first data corresponding to the first image frame and second data corresponding to the second image frame.
In a fourth aspect, embodiments of the present application provide an electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which, when executed by the processor, implement the steps of the method of the second aspect.
In a fifth aspect, embodiments of the present application provide a readable storage medium storing a program or instructions which, when executed by a processor, implement the steps of the method of the second aspect.
In a sixth aspect, embodiments of the present application provide a chip comprising a processor and a communication interface, the communication interface being coupled to the processor, the processor being configured to execute programs or instructions to implement a method as in the second aspect.
In a seventh aspect, embodiments of the present application provide a computer program product stored in a storage medium, the program product being executable by at least one processor to implement a method as in the second aspect.
In the image sensor provided by the embodiment of the application, the form switching structure drives the microlens array to move back and forth, so that each pixel group can be switched from its corresponding first microlens to its corresponding at least two second microlenses, or vice versa. Each pixel group can therefore output image information and phase information respectively before and after the microlens array moves. The image sensor provided by the embodiment of the application thus allows all pixels to support PDAF, improving the definition of captured images on the basis of improved phase-focusing capability.
In the exposure method provided by the embodiment of the application, while the pixel array performs progressive exposure, it is controlled to suspend exposure at a first time, at which the first image frame is not yet fully exposed and the second image frame has begun exposing, and the form switching structure switches the microlens group corresponding to each pixel group; the pixel array then continues exposing. The pixel array is controlled to suspend exposure again at a second time, at which the first image frame is fully exposed and the second image frame is not yet fully exposed; the form switching structure switches the microlens groups once more, and the pixel array continues exposing. In this way, one type of information, such as image information, is output in the exposure periods before the first time and after the second time, and another type, such as phase information, is output in the period between the first time and the second time. This ensures that the first data and the second data together contain the image information or phase information output by every pixel unit in the pixel array. Thus, target phase information for auto-focusing can be generated from the phase information output by each pixel group, or image information with higher definition can be obtained from the image information output by each pixel group, so that the image sensor has both the ability to capture clear images and the ability to focus accurately.
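The timing relationship described above can be sketched with a toy model (Python is used purely for illustration; the row count, the mode names, and the fixed row-to-row lag between the two frames are assumptions, not details from the patent):

```python
# Toy model of the exposure scheme: a rolling shutter exposes rows top to
# bottom; the microlens form is switched at time t1 (frame 1 partly exposed,
# frame 2 started) and at t2 (frame 1 done, frame 2 not yet done), so every
# row is captured once in "image" form and once in "phase" form.

def simulate(rows: int, t1: int, t2: int, lag: int):
    """Return, per frame, the info type each row was exposed with.

    Frame 1's row r is exposed at t = r; frame 2's row r is exposed a
    fixed `lag` later (staggered/rolling exposure). The microlens form is
    "image" before t1 and from t2 onward, "phase" in between.
    """
    def mode_at(t):
        return "image" if (t < t1 or t >= t2) else "phase"

    frame1 = [mode_at(r) for r in range(rows)]
    frame2 = [mode_at(r + lag) for r in range(rows)]
    return frame1, frame2

# With t2 = t1 + lag, the two frames complement each other row by row.
f1, f2 = simulate(rows=8, t1=4, t2=8, lag=4)
for a, b in zip(f1, f2):
    # each row yields both an image sample and a phase sample
    assert {a, b} == {"image", "phase"}
```

The complementarity checked by the final loop is exactly the property the method relies on: between the two frames, every pixel unit contributes both kinds of information.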
Drawings
FIG. 1 is a first schematic structural diagram of an image sensor according to an embodiment of the present application;
FIG. 2A is a first schematic structural diagram of an image sensor in the related art;
FIG. 2B is a second schematic structural diagram of an image sensor in the related art;
FIG. 3A is a second schematic structural diagram of an image sensor according to an embodiment of the present application;
FIG. 3B is a third schematic structural diagram of an image sensor according to an embodiment of the present application;
FIG. 4A is a fourth schematic structural diagram of an image sensor according to an embodiment of the present application;
FIG. 4B is a fifth schematic structural diagram of an image sensor according to an embodiment of the present application;
FIG. 5 is a sixth schematic structural diagram of an image sensor according to an embodiment of the present application;
FIG. 6 is a seventh schematic structural diagram of an image sensor according to an embodiment of the present application;
FIG. 7 is a schematic structural diagram of a camera module according to an embodiment of the present application;
FIG. 8 is a flow chart of an exposure method according to an embodiment of the present application;
FIG. 9 is a schematic diagram of an exposure method according to an embodiment of the present application;
FIG. 10 is a schematic structural diagram of an exposure apparatus according to an embodiment of the present application;
FIG. 11 is a first schematic diagram of an electronic device according to an embodiment of the present application;
FIG. 12 is a second schematic diagram of an electronic device according to an embodiment of the present application;
Wherein reference numerals in fig. 1 to 7 are:
100, image sensor; 10, pixel array; 11, pixel group; 12, pixel unit; 20, microlens array; 21, microlens group; 22, first microlens; 23, second microlens; 30, form switching structure; 40, exposure readout control module; 50, exposure trigger control module; 60, conversion circuit; 200, camera module; 210, lens group; 220, focusing motor; 230, optical filter; 240, base.
Detailed Description
The technical solutions of the embodiments of the present application will be clearly described below with reference to the drawings in the embodiments of the present application, and it is apparent that the described embodiments are some embodiments of the present application, but not all embodiments. All other embodiments, which are obtained by a person skilled in the art based on the embodiments of the present application, fall within the scope of protection of the present application.
Terms related to the embodiments of the present application will be described below.
An image sensor is a device that converts an optical image into an electronic signal and is widely used in digital cameras and other electronic optical devices. Today, image sensors fall broadly into two types: the Charge-Coupled Device (CCD) sensor and the Complementary Metal Oxide Semiconductor (CMOS) active pixel sensor. The CMOS active pixel sensor is hereinafter referred to as a CMOS sensor.
A pixel of a CMOS sensor can only sense brightness. For a pixel to sense color, a color filter array (Color Filter Array, CFA) needs to cover the pixel; the CFA filters out light of other wavebands and passes light of the required waveband, which the pixel then photoelectrically converts, giving the pixel color-sensing capability. It should be noted that some light energy is wasted by the CFA in this arrangement, so the color-sensing capability of such an image sensor is relatively limited.
PDAF is a technique that achieves fast and accurate focusing by detecting the phase difference at the imaging sensor between light arriving from an object along different paths. Taking a mobile phone as an example, the auto-focus sensing function can be integrated directly into the pixel sensor: phase detection units, such as 2 x 2 phase detection units, are formed on the pixel sensor, yielding a PDAF sensor. Left and right opposing pairs of pixels can then be taken from each phase detection unit to detect phase information of objects in the scene, and by comparing the phase information of the paired pixels, the accurate focusing point can be determined rapidly. The focusing motor can then push the lens group to the corresponding position in one step, quickly completing auto-focusing.
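The pairwise comparison of left and right phase pixels can be illustrated with a minimal disparity search (a hedged Python sketch; real PDAF uses calibrated optics and far more robust matching, and the function name and signals here are invented for illustration):

```python
# Estimate the shift between left-phase and right-phase signals by a
# brute-force search minimizing the mean absolute difference. In a real
# system this shift (disparity) maps to a lens-motor target position.

def phase_disparity(left, right, max_shift=4):
    """Return the integer shift that best aligns `right` onto `left`."""
    best_shift, best_cost = 0, float("inf")
    n = len(left)
    for s in range(-max_shift, max_shift + 1):
        cost, count = 0.0, 0
        for i in range(n):
            j = i + s
            if 0 <= j < n:           # compare only the overlapping region
                cost += abs(left[i] - right[j])
                count += 1
        cost /= count
        if cost < best_cost:
            best_cost, best_shift = cost, s
    return best_shift

# A step edge seen 2 samples apart by the left and right phase pixels:
left  = [0, 0, 0, 10, 10, 10, 10, 10]
right = [0, 0, 0, 0, 0, 10, 10, 10]
assert phase_disparity(left, right) == 2   # in-focus would give 0
```

A zero disparity indicates the scene is in focus; a nonzero value tells the focusing motor how far, and in which direction, to move the lens group.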
Currently, PDAF sensors fall into two types. One is the sparse (density-type) PDAF image sensor, in which only some of the pixels support PDAF. For example, if a conventional image sensor has 12 million pixels and only 6% support PDAF, then only 0.72 million pixels can output PDAF information; having only partial phase-difference information may make the final focusing calculation inaccurate, resulting in poor focusing capability. The other is the image sensor supporting all-pixel PDAF. Currently, the mainstream all-pixel PDAF sensor is the quad phase detection (Quad Phase Detection, QPD) type image sensor, in which every pixel can output PDAF information; that is, an image sensor with 12 million pixels can output 12 million items of PDAF phase-difference information. QPD type image sensors, however, sacrifice image sharpness.
Staggered high dynamic range (High Dynamic Range, HDR) is also called line-interleaved HDR. Its principle is that once each row of pixel units of the CMOS sensor has been exposed and read out, the next exposure starts immediately; that is, reading out a row's image signal for the next original image does not wait for the complete readout of the previous original image, but can begin directly after that row has been exposed, so readouts of different original images proceed simultaneously.
For example, once the n-th row of pixel units completes its first exposure (e.g., a long exposure) and is read out, a second exposure (e.g., a short exposure) begins immediately, while the (n+1)-th row performs the same operation. It resembles the interleaving of yarn in textiles, hence the name line-interleaved (staggered) exposure technique.
It can be appreciated that, since the interval between the readout moments of the two original image frames is greatly shortened, artifacts and smear are greatly reduced.
It is understood that staggered HDR can be achieved through a staggered exposure mode. The readout of the staggered exposure mode compares with the normal exposure mode as follows:
The normal exposure mode reads the next frame of image data only after the previous frame has been fully read out, i.e., frame-based readout. The staggered exposure mode achieves line-based (line-by-line) readout: reading a row of the next frame does not wait for the complete readout of the previous frame but can begin directly after that row has been exposed, so readouts of different frames proceed in parallel. The line-based readout mode further shortens the inter-frame interval and further reduces ghosting. Meanwhile, the multi-frame image data (such as long, middle and short frames) is not output as three separate frames but is overlapped and interleaved into one frame of data, which is then parsed to separate out the multiple frames.
In 2-Stagger mode, the image sensor completes one staggered exposure and generates 2 frames of images. The two frames can then be fused into a single fused image.
For example, the 2 frames may be fused by an image signal processor (Image Signal Processor, ISP) to obtain the final fused image.
In 3-Stagger mode, one staggered exposure generates 3 frames of images, which the ISP may then merge into one frame.
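As a rough illustration of what such a fusion might look like, the following Python sketch merges a long and a short frame per pixel (the saturation-based rule, the names, and the 8-bit full-well value are assumptions for illustration; an actual ISP pipeline is far more elaborate):

```python
# Sketch of 2-Stagger HDR fusion: a long and a short exposure of the same
# scene are merged per pixel into a higher-dynamic-range result.

FULL_WELL = 255  # saturation level of the sensor output (assumed 8-bit)

def fuse_2stagger(long_frame, short_frame, ratio):
    """Per-pixel fusion: use the long exposure where it is not clipped
    (better SNR), otherwise fall back to the short exposure scaled by the
    exposure ratio (long_time / short_time)."""
    fused = []
    for lo, sh in zip(long_frame, short_frame):
        if lo < FULL_WELL:      # long exposure valid
            fused.append(float(lo))
        else:                   # long exposure clipped: rescale the short one
            fused.append(sh * ratio)
    return fused

# Scene radiances 10 and 40 with exposure ratio 8: the bright pixel clips
# in the long frame (40 * 8 = 320 -> 255) but survives in the short frame.
longf  = [80, 255]
shortf = [10, 40]
assert fuse_2stagger(longf, shortf, ratio=8) == [80.0, 320.0]
```

The fused values exceed the sensor's native 8-bit range, which is precisely the dynamic-range extension staggered HDR provides.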
Other terms
The terms "first," "second," and the like in the description of the present application are used to distinguish similar objects and do not necessarily describe a particular sequence or chronological order. It is to be understood that terms so used are interchangeable under appropriate circumstances, so that the embodiments of the application can operate in sequences other than those illustrated or described herein. Objects identified by "first," "second," etc. are generally of one type, and the number of such objects is not limited; for example, the first object may be one or more. In addition, "and/or" in the specification denotes at least one of the connected objects, and the character "/" generally indicates an "or" relationship between the associated objects.
The term "at least one" in the description of the present application means any one, any two, or any combination of two or more of the listed objects. For example, "at least one of a, b and c" may represent: "a", "b", "c", "a and b", "a and c", "b and c", or "a, b and c", where each of a, b and c may be singular or plural. Similarly, "at least two" means two or more, with a meaning analogous to that of "at least one".
The image sensor, the exposure method, the exposure device, the electronic equipment and the medium provided by the embodiment of the application are described in detail below through specific embodiments and application scenes thereof with reference to the accompanying drawings.
With the development of shooting technology, the requirement of users on the focusing capability of the image sensor is increasing.
In the related art, phase detection auto focus (Phase Detection Auto Focus, PDAF) technology is generally used for auto-focusing during shooting. Specifically, phase focusing may be performed by an image sensor supporting PDAF, in which at least some pixels can output phase information. In general, the higher the proportion of pixels that output phase information, the higher the accuracy of auto-focusing.
However, the higher the proportion of pixels supporting PDAF in an image sensor, the lower the sharpness of the image captured by that sensor may be, due to factors such as reduced pixel coverage for sensing color and constraints imposed by the focusing-pixel arrangement.
To solve the above technical problems, an embodiment of the present application provides an image sensor comprising a pixel array, a microlens array arranged in correspondence with the pixel array, and a form switching structure connected with the microlens array. The pixel array comprises a plurality of pixel groups, each comprising N pixel units, where N is an integer greater than 1. The microlens array comprises a plurality of microlens groups, each comprising either one first microlens or at least two second microlenses, so that each pixel group corresponds to either one first microlens or at least two second microlenses. The form switching structure is configured to drive the microlens array to move, switching each pixel group from its corresponding first microlens to its corresponding at least two second microlenses, or vice versa. When a pixel group corresponds to the first microlens, it acquires phase information; when it corresponds to the at least two second microlenses, it acquires image information. In this way, each pixel group outputs image information and phase information respectively before and after the microlens array moves, so the image sensor provided by the embodiment of the application improves the definition of captured images while supporting PDAF on all pixels.
An embodiment of the present application provides an image sensor; fig. 1 shows a schematic structural diagram of this image sensor. As shown in fig. 1, the image sensor 100 may include a pixel array 10 and a microlens array 20 arranged in correspondence, and a form switching structure 30 connected to the microlens array 20. The pixel array 10 includes a plurality of pixel groups 11, each including N pixel units 12, where N is an integer greater than 1. The microlens array 20 includes a plurality of microlens groups 21, each including either one first microlens 22 or at least two second microlenses 23, so that each pixel group 11 corresponds to either one first microlens 22 or at least two second microlenses 23. The form switching structure 30 is configured to drive the microlens array 20 to move, so as to switch each pixel group 11 from its corresponding first microlens 22 to its corresponding at least two second microlenses 23, or vice versa. When a pixel group 11 corresponds to the first microlens 22, the pixel group 11 acquires phase information; when it corresponds to the at least two second microlenses 23, it acquires image information.
In some embodiments of the present application, the area of the light incident surface of the first microlens is larger than the area of the light incident surface of the second microlens. This ensures that the first microlens can correspond to all pixel cells in a pixel group. For example, the area of the light incident surface of the first microlens is N times that of the light incident surface of the second microlens.
In some embodiments of the present application, the number of rows of microlens groups in the microlens array is greater than the number of rows of pixel groups in the pixel array, to ensure that each pixel group still corresponds to one first microlens or at least two second microlenses both before and after the microlens array moves. Of course, in some embodiments, the number of rows of microlens groups may also be equal to the number of rows of pixel groups. For example, if the pixel array includes L rows of pixel groups, the microlens array may include L+T rows of microlens groups, where L and T are positive integers.
In some embodiments of the present application, the number of pixel units in each pixel group of the pixel array is 2, 4, 6, 9 or 16, and the pixel units within a pixel group all have the same color. That is, N may be 2, 4, 6, 9, 16, etc.
When N=4, the pixel array may be referred to as a four-in-one pixel array; when N=9, a nine-in-one pixel array; and when N=16, a sixteen-in-one pixel array. Here "four-in-one," "nine-in-one," and "sixteen-in-one" refer to a pixel binning technique: merging several small pixels into one large pixel to improve the sensitivity and definition of the image.
The four-in-one technique arranges four pixel units of the same color together to form one large pixel unit for higher sensitivity; the nine-in-one technique merges nine pixel units into one large pixel unit to improve the definition and detail rendering of the image; and the sixteen-in-one technique merges sixteen pixel units into one large pixel unit to improve definition and detail rendering further, at a corresponding reduction in image resolution.
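The binning idea behind these techniques can be sketched as follows (an illustrative Python example, not the sensor's actual on-chip implementation; summing is shown, though averaging is also common):

```python
# 2x2 (four-in-one) pixel binning: four same-color pixel values are
# combined into one large pixel, trading spatial resolution for
# sensitivity/SNR.

def bin_2x2(frame):
    """Sum each non-overlapping 2x2 block of `frame` (a list of equal-
    length rows with even dimensions) into one output pixel."""
    h, w = len(frame), len(frame[0])
    assert h % 2 == 0 and w % 2 == 0, "dimensions must be even"
    return [
        [frame[r][c] + frame[r][c+1] + frame[r+1][c] + frame[r+1][c+1]
         for c in range(0, w, 2)]
        for r in range(0, h, 2)
    ]

frame = [
    [1, 2, 3, 4],
    [5, 6, 7, 8],
    [9, 10, 11, 12],
    [13, 14, 15, 16],
]
assert bin_2x2(frame) == [[14, 22], [46, 54]]
```

Nine-in-one and sixteen-in-one binning follow the same pattern with 3x3 and 4x4 blocks respectively, halving or quartering the output resolution accordingly.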
In some embodiments of the present application, the color of the pixel units in each pixel group is the same, which is understood to mean that the color channels of the pixel units in each pixel group are the same.
In some embodiments of the application, adjacent groups of pixels in the pixel array differ in color or color channel.
In some embodiments of the present application, the color channels may be a red (R) channel, a green (G) channel and a blue (B) channel, where the G channel includes a Gr channel and a Gb channel.
In some embodiments of the application, a color filter array (Color Filter Array, CFA) may be provided over the pixel array, where the CFA may include multiple filter groups, each corresponding to one color channel.
It can be appreciated that in the embodiment of the present application, each pixel group corresponds to one filter group, and the color channel of each pixel group is determined by the filter group corresponding to each pixel group.
In some embodiments of the present application, when a pixel group is provided with a first microlens, all N pixel units in the pixel group correspond to that first microlens. For example, taking N=4, fig. 2A is a schematic structural diagram of pixel groups each corresponding to one first microlens. As shown in fig. 2A, each of the 4 pixel groups in the pixel array is provided with one first microlens; that is, the 4 pixel units in each pixel group share one first microlens. The 4 pixel groups can then output phase information, enabling accurate auto-focusing.
It will be appreciated that if each pixel group in an image sensor is provided with one first microlens, the image sensor may be referred to as a QPD sensor, which has the best focusing capability.
In some embodiments of the present application, when a pixel group corresponds to one first microlens, the phase information output by the pixel group can be either the up-down phase difference or the left-right phase difference of the pixel group. For example, taking N=4, i.e., the pixel group includes 4 pixel units, one first microlens is disposed over the pixel group as shown in fig. 2A. Taking the first pixel group in fig. 2A as an example: if the pixel group outputs a left-right phase difference, the R1+R3 pixel units serve as the left phase and the R2+R4 pixel units as the right phase; if it outputs an up-down phase difference, the R1+R2 pixel units serve as the upper phase and the R3+R4 pixel units as the lower phase.
It will be appreciated that both the left-right phase difference and the up-down phase difference of the pixel group may be used to perform PDAF. The difference is that the left-right phase difference performs better in scenes with vertical stripes, while the up-down phase difference performs better in scenes with horizontal stripes.
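The phase grouping described above can be sketched as follows. This is a purely illustrative example, not the patent's circuitry: the function name, the 2x2 layout `[[r1, r2], [r3, r4]]`, and the use of simple sums are all assumptions for illustration.

```python
# Hypothetical sketch: deriving left/right and up/down phase signals
# from a 2x2 QPD pixel group laid out as [[r1, r2], [r3, r4]].
def phase_signals(group):
    """group: 2x2 nested list [[r1, r2], [r3, r4]] of pixel readings."""
    (r1, r2), (r3, r4) = group
    left, right = r1 + r3, r2 + r4   # column sums -> left/right phases
    up, down = r1 + r2, r3 + r4      # row sums    -> up/down phases
    return {"lr_diff": left - right, "ud_diff": up - down}

# A vertical-edge scene yields a nonzero left-right difference and a
# zero up-down difference, matching the stripe-scene remark above.
print(phase_signals([[10, 2], [10, 2]]))
```

As the example shows, a vertical stripe produces a strong left-right signal while leaving the up-down signal flat, which is why the two phase differences suit different scene orientations.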
In some embodiments of the present application, when one pixel group is provided with at least two second microlenses, the correspondence between N pixel units in the pixel group and the at least two second microlenses may be any one of the following:
1) Each of the at least two second microlenses corresponds to one pixel unit in the pixel group, and different second microlenses correspond to different pixel units in the pixel group. It should be noted that, when each second microlens corresponds to one pixel unit in the pixel group, the sharpness of the image information acquired by the pixel group is highest.
For example, taking N=4 as an example, fig. 2B is a schematic structural diagram of pixel groups each corresponding to second microlenses. As shown in fig. 2B, at least two second microlenses are disposed corresponding to each of the 4 pixel groups in the pixel array; in this case, each pixel unit in each pixel group corresponds to one second microlens, so that each pixel group can output image information. The image data output by the image sensor thus has optimal resolution, and the image sensor can capture high-definition images.
It can be understood that if each pixel group in an image sensor is provided with N second microlenses, the image sensor may be referred to as a four-in-one sensor, which has the best definition.
2) Each of the at least two second microlenses corresponds to at least two pixel units in the pixel group, and different second microlenses correspond to different pixel units in the pixel group. The number of the pixel units corresponding to each second micro lens is smaller than N.
In some embodiments of the present application, for any one microlens group including at least two second microlenses, the arrangement of at least two second microlenses in one microlens group is adapted to the arrangement of the pixel units in the pixel group.
For example, taking the number of second microlenses in one microlens set as N as an example, the N second microlenses in the microlens set may be distributed in an array of m×m, where N is equal to the square of M, and M is a positive integer.
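The M×M constraint above can be expressed as a small check. This is an illustrative sketch; the function name is an assumption, and it simply encodes the stated relation that N equals the square of M.

```python
import math

def microlens_grid(n):
    """Return the M of the M x M grid for a group of n second microlenses,
    assuming n is a perfect square as described above (illustrative)."""
    m = math.isqrt(n)
    if m * m != n:
        raise ValueError("n must be a perfect square")
    return m

print(microlens_grid(4))  # the 2x2 layout used in the N=4 examples
```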
In some embodiments of the present application, the plurality of microlens groups includes at least two first-type microlens groups and at least two second-type microlens groups, where each first-type microlens group includes one first microlens, and each second-type microlens group includes at least two second microlenses. In this way, both before and after the microlens array moves, the pixel array includes pixel groups corresponding to a first microlens and pixel groups corresponding to at least two second microlenses.
In some embodiments of the present application, the microlens groups including one first microlens and the microlens groups including at least two second microlenses in the microlens array may be arranged regularly according to a certain rule, or may be arranged irregularly.
For convenience of description, the following embodiments are illustrated, unless otherwise specified, with the case in which the microlens groups including one first microlens and the microlens groups including at least two second microlenses are regularly arranged in the microlens array.
The correspondence between the microlens array and the pixel array is described below with reference to the drawings before and after the movement of the microlens array.
For example, fig. 3A and 3B are schematic diagrams of the microlens array and the pixel array before and after the microlens array moves. As shown in fig. 3A, before the microlens array moves, the R pixel group 11-1 corresponds to one first microlens 22, and the Gr pixel group 11-2 corresponds to 4 second microlenses 23, that is, each pixel unit in the Gr pixel group 11-2 corresponds to one second microlens 23. If the form switching structure drives the microlens array to move by two pixel-group rows along the column scanning direction of the pixel array, i.e., downward by two pixel-group rows, then, as shown in fig. 3B, the R pixel group 11-1 corresponds to 4 second microlenses 23, i.e., each pixel unit in the R pixel group 11-1 corresponds to one second microlens 23, and the Gr pixel group 11-2 corresponds to one first microlens 22.
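The fig. 3A to 3B switch can be modeled as shifting a repeating column pattern of lens-group types. The sketch below is an illustrative assumption: it encodes the pattern of arrangement mode a ("Q" for one first microlens, "4" for four second microlenses) with a period of four pixel-group rows, and shows that a shift of two rows swaps the type over each group.

```python
# Lens-group type per pixel-group row, repeating with period 4
# (arrangement mode a: first, second, second, first). Illustrative only.
PATTERN = ["Q", "4", "4", "Q"]

def lens_type(row, shift=0):
    """Type of lens group over pixel-group `row` after the microlens
    array has been shifted down by `shift` pixel-group rows."""
    return PATTERN[(row - shift) % len(PATTERN)]

# Row 0 (the R group) switches from one first microlens to four second
# microlenses after a two-row shift, as in fig. 3A -> fig. 3B.
print(lens_type(0), lens_type(0, shift=2))
```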
In some embodiments of the present application, as shown in fig. 1 to 4B, each of the pixel groups in the same row corresponds to a first microlens, or each of the pixel groups in the same row corresponds to at least two second microlenses.
It should be noted that "each pixel group in the same row corresponds to a first microlens" is understood to mean that each pixel group in the same row corresponds to one first microlens, and different pixel groups correspond to different first microlenses.
For example, the pixel group of the i-th row includes K pixel groups, and the K pixel groups may be in one-to-one correspondence with the K first microlenses.
It should be noted that "each pixel group in the same row corresponds to at least two second microlenses" is understood to mean that each pixel group in the same row corresponds to at least two second microlenses, and different pixel groups correspond to different second microlenses.
Thus, each pixel group in the same row corresponds to the first micro lens, or each pixel group in the same row corresponds to at least two second micro lenses. The pixel groups in the same row correspond to the same type of micro lenses, so that the form switching structure can be controlled to drive the micro lens array to move along the column direction, and each pixel group is controlled to be switched from the corresponding first micro lens to the corresponding at least two second micro lenses or from the corresponding at least two second micro lenses to the corresponding first micro lenses. Therefore, different information such as image information or phase information can be output by each pixel unit before and after the micro lens array moves.
The arrangement of the lens groups in the lens array will be described below with reference to the correspondence between the pixel groups and the lens groups.
In some embodiments of the present application, when the arrangement mode is mode a, as shown in fig. 3A, in the case that the ith row of pixel groups corresponds to the first microlenses 22, the (i+1) th row of pixel groups corresponds to at least two second microlenses 23, the (i+2) th row of pixel groups corresponds to at least two second microlenses 23, and the (i+3) th row of pixel groups corresponds to the first microlenses 22, where i may be a positive integer.
Or when the arrangement mode is mode b, as shown in fig. 4A, in the case that the ith row of pixel groups corresponds to the first microlenses 22, the (i+1) th row of pixel groups corresponds to at least two second microlenses, the (i+2) th row of pixel groups corresponds to the first microlenses, and the (i+3) th row of pixel groups corresponds to at least two second microlenses, where i may be a positive integer.
Or when the arrangement mode is mode c, as shown in fig. 4B, in the case that the ith row of pixel groups corresponds to the first microlenses 22, the (i+1) th row of pixel groups corresponds to the first microlenses 22, the (i+2) th row of pixel groups corresponds to the at least two second microlenses 23, and the (i+3) th row of pixel groups corresponds to the at least two second microlenses 23, where i may be a positive integer.
Or when the arrangement mode is mode d, as shown in fig. 3B, in the case that the i-th row of pixel groups corresponds to at least two second microlenses 23, the i+1-th row of pixel groups corresponds to the first microlenses 22, the i+2-th row of pixel groups corresponds to the first microlenses 22, and the i+3-th row of pixel groups corresponds to at least two second microlenses 23, where i may be a positive integer.
Or, when the arrangement mode is mode e, in the case that the ith row of pixel groups corresponds to at least two second microlenses, the (i+1) th row of pixel groups corresponds to the first microlenses, the (i+2) th row of pixel groups corresponds to at least two second microlenses, and the (i+3) th row of pixel groups corresponds to the first microlenses, where i may be a positive integer.
Or when the arrangement mode is mode f, in the case that the ith row of pixel groups corresponds to at least two second microlenses, the (i+1) th row of pixel groups corresponds to at least two second microlenses, the (i+2) th row of pixel groups corresponds to the first microlenses, and the (i+3) th row of pixel groups corresponds to the first microlenses, wherein i can be a positive integer.
In some embodiments of the present application, when the arrangement mode between the pixel groups and the microlens groups is a, c, d, or f, the form switching structure may drive the microlens array to move by 2 pixel-group rows along the column direction of the image sensor, so as to control each pixel group to be switched from corresponding to one first microlens to corresponding to at least two second microlenses, or from corresponding to at least two second microlenses to corresponding to one first microlens.
In some embodiments of the present application, when the arrangement mode between the pixel groups and the microlens groups is b or e, the form switching structure may drive the microlens array to move by 1 pixel-group row along the column direction of the image sensor, so as to control each pixel group to be switched from corresponding to one first microlens to corresponding to at least two second microlenses, or from corresponding to at least two second microlenses to corresponding to one first microlens.
Thus, for any 4 adjacent rows of pixel groups, the 4 rows may correspond to the first microlenses and the at least two second microlenses in any of the orders described in modes a to f above, which improves the flexibility of correspondingly arranging the lens groups and the pixel groups.
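The relationship between arrangement mode and required shift distance can be summarized in a lookup. The mapping below is read directly from the two paragraphs that follow (modes a, c, d, f need a 2-row move; modes b, e need a 1-row move); the table and function names are illustrative assumptions.

```python
# Shift distance, in pixel-group rows, needed for one mode switch,
# per arrangement mode a-f (derived from the description; illustrative).
SHIFT_BY_MODE = {"a": 2, "b": 1, "c": 2, "d": 2, "e": 1, "f": 2}

def shift_rows(mode):
    """Rows the form switching structure moves the microlens array for `mode`."""
    return SHIFT_BY_MODE[mode]

print(shift_rows("b"))  # alternating modes need only a single-row move
```

The alternating patterns (b and e) halve the travel distance of the form switching structure, at the cost of interleaving the two lens-group types on every other row.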
In some embodiments of the present application, the form switching structure may drive the microlens array to move as a whole. Each time the form switching structure drives the microlens array to complete one movement, the lens group corresponding to each pixel group in the pixel array is switched once.
For example, if the ith pixel group of the pixel array corresponds to one first microlens before the microlens array moves, and the (i+1) th pixel group corresponds to at least two second microlenses before the microlens array moves, then after the microlens array completes one movement, the ith pixel group corresponds to at least two second microlenses, and the (i+1) th pixel group corresponds to one first microlens, where i is a positive integer.
In some embodiments of the present application, the form switching structure may include any one of a screw assembly, a slide rail assembly, or any other structure capable of driving the microlens array to move.
For example, when the form switching structure includes a screw assembly, the screw assembly may include a screw, a slider sleeved on the screw, and a motor connected to the screw, where the microlens array is connected to the slider. In this way, when the motor drives the screw to rotate, the slider drives the microlens array to move as a whole along the screw.
In some embodiments of the present application, the form switching structure may also drive one or more microlens groups in the microlens array to move independently each time, and may be specifically determined according to actual use requirements.
In some embodiments of the application, the form switching structure may be arranged to drive the microlens array to move in at least one of the row direction and the column direction of the pixel array. For specific arrangement structures, reference may be made to the related art, which is not limited in the present application.
In the image sensor provided by the embodiment of the application, by the form switching structure driving the microlens array to move, each pixel group can be switched from corresponding to one first microlens to corresponding to at least two second microlenses, or from corresponding to at least two second microlenses to corresponding to one first microlens, so that each pixel group can output image information and phase information respectively before and after the microlens array moves. Therefore, the image sensor provided by the embodiment of the application improves the definition of acquired images while realizing full-pixel PDAF support.
In some embodiments of the present application, as shown in fig. 5 in conjunction with fig. 1, the image sensor 100 may further include an exposure read control module 40 and an exposure trigger control module 50. The exposure read control module 40 is connected to the pixel array 10, and the exposure trigger control module 50 is connected to the exposure read control module 40, the pixel array 10, and the form switching structure 30, respectively. The exposure read control module 40 may be used to control the pixel array 10 to perform row-by-row exposure to obtain a first image frame, and the exposure trigger control module 50 may be used to:
control the pixel array 10 to stop exposure at a first moment, control the form switching structure 30 to drive the microlens array 20 to move, and control the pixel array 10 to continue row-by-row exposure after the microlens array 20 has moved, where the first moment is a moment at which the pixel array 10 has not completed exposure for the first image frame and has started exposure for a second image frame;
and control the pixel array 10 to stop exposure at a second moment, control the form switching structure 30 to drive the microlens array 20 to move, and control the pixel array 10 to continue row-by-row exposure after the microlens array 20 has moved, where the second moment is a moment at which the pixel array 10 has completed exposure for the first image frame but has not completed exposure for the second image frame.
In some embodiments of the present application, fig. 9 schematically shows the exposure timing of the first image frame and the second image frame, where the first moment may be moment A in fig. 9, and the second moment may be moment B in fig. 9.
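The two trigger conditions can be stated as simple predicates over per-frame exposure state. This is an illustrative sketch only; the function and parameter names are assumptions, not the patent's control logic.

```python
# Illustrative predicates for the two trigger moments described above,
# given boolean exposure state for the first and second image frames.
def is_first_moment(f1_done, f2_started, f2_done):
    """First moment: frame 1 has not finished exposure, frame 2 has started."""
    return (not f1_done) and f2_started

def is_second_moment(f1_done, f2_started, f2_done):
    """Second moment: frame 1 has finished exposure, frame 2 has not."""
    return f1_done and not f2_done

# Mid-overlap of the two frames satisfies the first-moment condition;
# after frame 1 completes (frame 2 still exposing), the second holds.
print(is_first_moment(False, True, False), is_second_moment(True, True, False))
```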
It should be noted that each time the exposure trigger control module controls the form switching structure to drive the microlens array to move, the microlens corresponding to each pixel group in the pixel array is changed once. Specifically, the first microlens is switched to the at least two second microlenses, or the at least two second microlenses are switched to the first microlens.
For the description of controlling the form switching structure by the exposure triggering control module to drive the microlens array to move, refer to the related description in the above embodiment, and in order to avoid repetition, the description is omitted here.
In some embodiments of the present application, the exposure trigger control module may be electrically connected to the form switching structure.
In some embodiments of the present application, "connected to the pixel array" may be understood as being connected to each pixel unit in the pixel array separately, so as to achieve exposure control for each pixel unit. That is, both the exposure read control module and the exposure trigger control module are connected to each pixel unit in the pixel array.
In some embodiments of the present application, "the exposure read control module may be configured to control the pixel array to perform row-by-row exposure to obtain the first image frame" may be understood to mean that, under the control of the exposure read control module, the first image frame is obtained after each row of the pixel array has been exposed and read.
It should be noted that "controlling the pixel array to continue exposure" is understood to mean that, after the pixel array stops exposure, exposure resumes from the pixel unit following the last pixel unit that completed exposure. For example, assuming that the pixel array stops exposure after the ith pixel unit has been exposed, when the exposure trigger control module controls the pixel array to continue exposure, the pixel array may start exposure from the (i+1) th pixel unit, where i is a positive integer.
In some embodiments of the application, a pixel unit completing exposure may include at least one of the pixel unit having finished light sensing and the photosensitive data of the pixel unit having been read out.
In some embodiments of the present application, the exposure trigger control module may determine whether the microlens array has completed moving in the following manner.
Mode 1. Determining whether the microlens array has completed moving by means of a displacement detection component in the form switching structure, for example, based on displacement information detected by a displacement sensor.
For example, the displacement detection component may be connected to the microlens array and the exposure trigger control module, so that displacement information of the microlens array may be detected, and the detected displacement information may be fed back to the exposure trigger control module, so that the exposure trigger control module determines whether the microlens array has completed moving according to the displacement information.
Mode 2. Determining whether the microlens array has completed moving based on the time difference between the moment at which the form switching structure was controlled to start driving the microlens array and the current moment. For example, if the exposure trigger control module controls the form switching structure to start driving the microlens array at moment t0, then the exposure trigger control module may determine that the microlens array has completed moving after a preset duration following moment t0. The preset duration is determined by the speed at which the form switching structure drives the microlens array.
Mode 3. Determining based on the travel of the form switching structure. For example, if the form switching structure is a motor-and-screw structure, the exposure trigger control module may acquire the number of turns of the motor to determine whether the movement has been completed.
In the process in which the exposure read control module controls the pixel array to perform progressive exposure, the exposure trigger control module can control the pixel array to stop exposure at the first moment, when the first image frame has not completed exposure, control the form switching structure to drive the microlens array to move, and control the pixel array to continue exposure after the microlens array has moved. It can likewise control the pixel array to stop exposure at the second moment, when the first image frame has completed exposure but the second image frame has not, control the form switching structure to drive the microlens array to move, and control the pixel array to continue exposure after the microlens array has moved. As a result, each pixel group in the pixel array can output one type of information, such as image information, during the exposure periods before the first moment and after the second moment, and output another type of information, such as phase information, during the exposure period after the first moment and before the second moment. This ensures that the data corresponding to the first image frame and the data corresponding to the second image frame include the image information or the phase information output by each pixel unit in the pixel array. In this way, an all-pixel phase image for auto-focusing can be generated from the phase information output by each pixel group, or an image with higher definition can be generated from the image information output by each pixel group. Therefore, the image sensor provided by the embodiment of the application has both the capability of capturing clear images and the capability of focusing accurately.
A circuit diagram between the exposure read control module, the exposure trigger control module, the form switching structure, and the pixel array is illustrated below.
In some embodiments of the present application, as shown in fig. 6 in conjunction with fig. 5, the pixel unit 12 may include: a photodiode PD1, a reset transistor RST1, a floating switch (transfer gate) TG1, a selection transistor RSeL1, a source follower SF1, and a capacitor FD1. The first end of the photodiode PD1 is grounded, and the second end of the photodiode PD1 is connected to the first end of the floating switch TG1. The second end of the floating switch TG1 is connected to the first control line TG, and the third end of the floating switch TG1 is connected to the first end of the capacitor FD1, whose second end is grounded. The first end of the reset transistor RST1 is connected to the second control line RST, the second end of the reset transistor RST1 is connected to the power line VDD, and the third end of the reset transistor RST1 is connected to the first end of the capacitor FD1. The first end of the source follower SF1 is connected to the power line VDD, the second end of the source follower SF1 is connected to the first end of the capacitor FD1, and the third end of the source follower SF1 is connected to the first end of the selection transistor RSeL1. The second end of the selection transistor RSeL1 is connected to the conversion circuit 60, and the third end of the selection transistor RSeL1 is connected to the third control line ROW-Sel. In this way, when the pixel unit 12 receives light, it can output analog image data to the conversion circuit through the photodiode PD1, the reset transistor RST1, the floating switch TG1, the selection transistor RSeL1, the source follower SF1, and the capacitor FD1.
Here, PD1 is controlled by TG1, that is, TG1 is responsible for switching PD1 on and off to control the light-sensing timing of PD1. RST1 is responsible for clearing residual photogenerated electrons in PD1 and FD1. FD1 is equivalent to a capacitor and is responsible for carrying the charge transferred from PD1. SF1 is responsible for transferring the charge in FD1 to the selection transistor RSeL1. RSeL1 is responsible for controlling the output of the pixel unit: when RSeL1 is turned on, the charge in FD1 is transferred to the output circuit through SF1.
In some embodiments of the present application, as shown in fig. 6, the conversion circuit 60 may include an analog-to-digital converter (Analog to Digital Converter, ADC), an image signal processor (ISP), and a mobile industry processor interface (MIPI) connected in sequence. The ADC is connected to the second end of the selection transistor RSeL1 and is configured to convert the analog image signal obtained by PD1 through light sensing into a digital image signal and output it to the ISP; the ISP is configured to convert the digital image signal output by the ADC into image information; and the MIPI is configured to output the image information.
In some embodiments of the present application, the pixel unit 12 shown in fig. 6 may be referred to as a pinned photodiode pixel (Pinned Photodiode Pixel, PPD) unit, abbreviated as PPD pixel unit. The PPD pixel includes the photosensitive region, i.e., the photodiode PD, and 4 transistors, namely the reset transistor RST, the floating switch TG, the row selector (selection transistor) RSeL, and the source follower SF, and is therefore also called a 4T pixel unit. The PPD pixel unit allows the introduction of a correlated double sampling circuit, eliminating the kTC noise introduced by reset as well as the 1/f noise and offset noise introduced by the MOS transistors. The PPD pixel unit operates as follows:
1. Reset. RST and TG are activated first, and the residual electrons in the PD and FD are emptied.
2. PD light sensing. When RST and TG are turned off, the pixel starts light sensing; the electron-hole pairs generated by light irradiation are separated by the electric field of the PD, with electrons moving to the n-region and holes moving to the p-region.
3. Charge transfer. After the PD has been exposed for the specified time, TG is activated, and the charge is completely transferred from the PD to the FD for readout; the mechanism is similar to charge transfer in a CCD.
4. Signal level readout. Next, the voltage signal of the FD is output (Vout) to the ADC through SF (the source follower) and the row selector, and analog-to-digital conversion is performed, i.e., a digitized signal is output.
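The four phases above can be modeled as a toy state machine. This is an illustrative sketch only: the class name, charge units, and unity gain are assumptions, and real pixels involve analog effects (noise, full-well capacity) ignored here.

```python
# Toy simulation of the four PPD (4T) pixel phases: reset, PD light
# sensing, charge transfer to FD, and readout through SF/row select.
class PpdPixel:
    def __init__(self):
        self.pd = 0.0   # charge in the pinned photodiode
        self.fd = 0.0   # charge on the floating diffusion

    def reset(self):                 # 1. RST + TG on: empty PD and FD
        self.pd = self.fd = 0.0

    def integrate(self, photons):    # 2. RST/TG off: PD accumulates charge
        self.pd += photons

    def transfer(self):              # 3. TG on: charge moves fully PD -> FD
        self.fd, self.pd = self.pd, 0.0

    def read(self, gain=1.0):        # 4. SF/row select: FD level to the ADC
        return self.fd * gain

px = PpdPixel()
px.reset(); px.integrate(120); px.transfer()
print(px.read())
```

The complete transfer in step 3 (PD emptied into FD) is what enables correlated double sampling: the FD reset level can be sampled before the signal level, and the difference cancels the reset noise.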
In some embodiments of the present application, at least one pixel unit corresponds to one conversion circuit.
In some embodiments of the present application, as shown in fig. 6, the exposure trigger control module 50 may be connected to the pixel unit by being connected to the first control line TG, the second control line RST, the third control line ROW-Sel, the reset line RST, and the power line VDD, thereby achieving connection to the pixel array.
In some embodiments of the present application, as shown in fig. 6, the exposure read control module 40 may be connected to the pixel unit by being connected to the first control line TG, the second control line RST, the third control line ROW-Sel, the reset line RST, and the power line VDD, thereby achieving connection to the pixel array.
In some embodiments of the present application, the exposure read control module may specifically be configured to control the pixel array to perform row-by-row exposure according to an interleaved (Staggered) exposure mode.
It can be understood that the readout in the staggered exposure mode and the normal exposure mode differs as follows:
In the normal exposure mode, the next frame of image data is read only after the previous frame of image data has been completely read, i.e., reading is frame-based (frame-to-frame). In the staggered exposure mode, reading is line-based (line-to-line): the current line of the next frame of image data can be read directly after that line has been exposed, without waiting for the previous frame of image data to be completely read, so the reading of different frames can proceed in parallel. This line-based reading mode further shortens the inter-frame time interval and further reduces ghosting. Meanwhile, the multi-frame image data (such as long, middle, and short frames) are not output as three separate frames but are superimposed and interleaved into one frame of image data, which is then parsed to separate out the multi-frame image data.
It should be noted that the foregoing embodiment is illustrated with the staggered exposure mode. In practical implementation, the exposure read control module may also control the pixel array to perform exposure in the normal exposure mode; the exposure read control module serves as the overall exposure control structure of the image sensor.
In this way, since the exposure read control module can control the pixel array to perform exposure in the staggered exposure mode, the readout time interval between the multiple frames of image data exposed by the pixel array can be shortened, ghosting between the images corresponding to the multiple frames of image data can be reduced, and image quality can be improved.
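The ordering difference between frame-based and line-based readout can be shown with two tiny generators. This is an illustrative sketch; frame and row indices are abstract labels, and real staggered readout also interleaves exposure timing, which is omitted here.

```python
# Readout order as (frame, row) pairs for `frames` frames of `rows` rows.
def frame_based(frames, rows):
    """Normal mode: finish reading one frame before starting the next."""
    return [(f, r) for f in range(frames) for r in range(rows)]

def line_based(frames, rows):
    """Staggered mode: each row is read for every frame before moving on,
    so reads of different frames interleave line by line."""
    return [(f, r) for r in range(rows) for f in range(frames)]

print(line_based(2, 3)[:4])  # frames alternate within each row
```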
In some embodiments of the present application, the exposure read control module may be further configured to send first information to the exposure trigger control module, where the first information may be used to indicate a first time and a second time, and the exposure trigger control module may be further configured to determine the first time and the second time according to the first information.
In some embodiments of the present application, the exposure read control module may send the first information to the exposure trigger control module after controlling the pixel array to start exposure, so that the exposure trigger control module may determine the first time and the second time.
In this way, since the exposure read control module can send the first information indicating the first time and the second time to the exposure trigger control module, the exposure trigger control module can accurately control the pixel array to stop exposure and switch the microlens group corresponding to the pixel group.
In some embodiments of the application, the first information includes either the time information of the first time and the time information of the second time, or the exposure parameters of the first image frame and the second image frame. In this way, the flexibility of the exposure trigger control module in determining the first time and the second time can be improved.
An embodiment of the present application provides a camera module, as shown in fig. 7, which includes the image sensor 100 in the above embodiment.
In some embodiments of the present application, as shown in fig. 7, the camera module 200 may further include a lens group 210, a focusing motor 220, a filter 230, a conversion circuit, and a memory. The lens group 210 is embedded in the focusing motor 220, the lens group 210 is disposed opposite to the image sensor 100, the filter 230 is disposed between the lens group 210 and the image sensor 100, and the conversion circuit is connected to an output end (e.g. a second end of the selection transistor RSeL) of the image sensor, for converting an analog image signal obtained by sensing the image sensor into a digital image signal and outputting the digital image signal. The memory is connected with the output end of the conversion circuit and used for storing the digital image signals obtained by conversion of the conversion circuit.
In some embodiments of the present application, as shown in fig. 7, the camera module 200 may further include a base 240 and a housing to which the base is connected, and the image sensor 100 is disposed on the base 240.
In some embodiments of the present application, the focusing motor may be connected to a first spring and a second spring disposed on the housing, where the housing is connected to the base. The housing, the first spring, and the second spring are not shown in fig. 7; for their specific structure, reference may be made to the related art. When focusing, the focusing motor is energized to generate a magnetic force so as to compress the first spring or the second spring and push the lens group to the focusing position. It will be appreciated that the final position of the focusing motor can be controlled by the magnitude of the current or voltage applied to it.
In some embodiments of the application, the filter may be a CFA filter.
It can be understood that the light converged by the lens group is projected onto the filter, which filters out unwanted light transmitted through the lens group to prevent the image sensor from producing false color/moiré, thereby improving the effective resolution and color reproduction of the image sensor. The light that has passed through the filter can then be sensed by the image sensor.
In some embodiments of the present application, the lens group may be composed of a plurality of glass or plastic lenses. When the camera module takes a picture, light is imaged on the image sensor through a series of refractions by the lenses; the more refractions there are, the more pronounced the light correction and convergence effects, and the better the imaging effect.
It can be understood that the capture principle of the camera module may be as follows: when the camera module is aimed at a scene, light passes through the lens and strikes the image sensor; the sensor converts the optical signal into an analog image signal; the analog image signal is converted into a digital signal by an analog-to-digital converter in the conversion circuit; and finally the digital signal is processed into visual image data by an ISP in the conversion circuit and output to the memory.
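The signal chain just described (optics, sensor, ADC, ISP, memory) can be sketched as a toy pipeline. Everything below — the function name, the linear sensor model, the 10-bit ADC, and the gamma step standing in for the ISP — is an illustrative assumption, not part of the embodiment:

```python
def capture(scene_luminance, exposure_time, adc_bits=10):
    """Illustrative signal chain: optics -> sensor -> ADC -> ISP."""
    # Sensor: optical signal -> analog image signal (linear response, clipped)
    analog = min(scene_luminance * exposure_time, 1.0)
    # Conversion circuit: analog signal -> digital code via the ADC
    digital = round(analog * (2 ** adc_bits - 1))
    # ISP: a minimal processing placeholder (gamma) before output to memory
    return int((digital / (2 ** adc_bits - 1)) ** (1 / 2.2) * 255)
```

The clipping step models sensor saturation: any scene bright enough to saturate the photodiode yields the same maximum code.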
In the camera module provided by the embodiment of the application, the form switching structure drives the microlens array to move back and forth, so that each pixel group can be switched from the corresponding first microlens to the corresponding at least two second microlenses, or from the corresponding at least two second microlenses to the corresponding first microlens. Each pixel group can thus output image information and phase information before and after the microlens array moves, respectively. Therefore, the camera module provided by the embodiment of the application improves the definition of acquired images while realizing full-pixel PDAF support.
In the exposure method provided by the embodiment of the application, the executing body may be an exposure control device, an electronic device, or a functional module or entity in an electronic device. In the embodiment of the application, the exposure method is described by taking an electronic device executing the exposure method as an example.
An embodiment of the present application provides an exposure method, which is applied to the image sensor in the above embodiment, as shown in fig. 8, and the exposure method provided in the embodiment of the present application may include steps 801 to 806.
Step 801, the electronic device controls a pixel array of the image sensor to perform line-by-line exposure, so as to obtain a first image frame.
It should be noted that "the electronic device controls the pixel array of the image sensor to perform line-by-line exposure to obtain the first image frame" may be understood as the electronic device controlling the pixel array to perform line-by-line exposure starting from the first row of pixel units. After the last row of pixel units of the pixel array completes its exposure, the first image frame can be obtained.
In some embodiments of the present application, the electronic device may control the pixel array to perform the row-by-row exposure according to the first exposure parameter. The first exposure parameters may include exposure parameters of a first image frame and exposure parameters of a second image frame.
In some embodiments of the present application, the exposure time period corresponding to the exposure parameter of the first image frame is longer than the exposure time period corresponding to the exposure parameter of the second image frame. That is, the first image frame is a long image frame and the second image frame is a short image frame. In other embodiments, the exposure time period of the first image frame may be the same as that of the second image frame.
In some embodiments of the present application, the electronic device may determine the first exposure parameter according to the photographed scene, that is, the first exposure parameter is an automatic exposure (Auto Exposure, AE) parameter.
In some embodiments of the present application, the electronic device may further determine a preset auto focus (Auto Focus, AF) parameter and an auto white balance (Auto White Balance, AWB) parameter according to the captured scene, so that the preview image of the image sensor matches human visual perception.
In some embodiments of the application, the electronic device can control the pixel array to perform line-by-line exposure in a staggered exposure mode. In this way, the exposure and readout time interval between the data corresponding to the at least two image frames exposed by the pixel array is shortened, so that ghosting between the at least two image frames can be reduced and the image capture quality can be improved.
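The staggered exposure mode can be illustrated with a toy rolling-shutter schedule. The timing model below (each row's short exposure beginning right after its long exposure ends, rows offset by one line time) and all names are assumptions for illustration only:

```python
def stagger_schedule(num_rows, t_line, t_long, t_short):
    """Per-row (start, end) exposure windows for a 2-stagger long/short pair.

    Assumes a rolling shutter: row r starts t_line later than row r - 1,
    and a row's short exposure begins right after its long exposure ends.
    """
    schedule = []
    for r in range(num_rows):
        long_start = r * t_line
        long_end = long_start + t_long
        short_start = long_end            # short frame follows the long readout
        short_end = short_start + t_short
        schedule.append({"row": r,
                         "long": (long_start, long_end),
                         "short": (short_start, short_end)})
    return schedule
```

Because the short exposure of the first row starts before the long exposure of the last row ends, the capture windows of the two frames interleave, which is what shortens the interval between them and reduces ghosting.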
Step 802, the electronic device controls the pixel array to stop exposure at a first moment, and switches the microlens group corresponding to each pixel group through the form switching structure of the image sensor. The first moment is a moment at which the pixel array corresponding to the first image frame has not completed exposure and the pixel array corresponding to the second image frame has started exposure.
It should be noted that the electronic device may control the pixel array to stop exposure, and then switch the microlens set corresponding to each pixel set through the form switching structure of the image sensor.
In some embodiments of the present application, the electronic device may switch the microlens set corresponding to each pixel set by controlling the mode that the form switching structure drives the microlens set to move.
It should be noted that "switching the microlens group corresponding to each pixel group" may include switching the pixel group from the corresponding first microlens to the corresponding at least two second microlenses, so that the pixel group can output phase information before the switch and image information after it; or switching the pixel group from the corresponding at least two second microlenses to the corresponding first microlens, so that the pixel group can output image information before the switch and phase information after it.
Step 803, the electronic device controls the pixel array of the image sensor to continue the line-by-line exposure.
It will be appreciated that the electronic device can control the pixel array to continue the line-by-line exposure from the position where the exposure was interrupted.
Step 804, the electronic device controls the pixel array to stop exposure at the second moment, and switches the microlens group corresponding to each pixel group through the form switching structure.
The second time is a time when the pixel array corresponding to the first image frame has completed exposure and the pixel array corresponding to the second image frame has not completed exposure.
Step 805, the electronic device controls the pixel array of the image sensor to continue to perform the line-by-line exposure until the exposure of the pixel array corresponding to the second image frame is completed.
For additional description of steps 804 and 805, see the description of steps 802 and 803 in the above embodiments.
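Steps 801 to 805 amount to a small control sequence: expose, pause and switch at the first moment, resume, pause and switch at the second moment, resume until done. A minimal sketch follows, with a stub standing in for the sensor hardware (all class and method names are hypothetical, not from the embodiment):

```python
class SensorStub:
    """Records the control sequence; stands in for the real sensor hardware."""
    def __init__(self):
        self.log = []
    def start_progressive_exposure(self):
        self.log.append("start")
    def pause_at(self, t):
        self.log.append(f"pause@{t}")
    def switch_microlens_groups(self):
        self.log.append("switch")
    def resume_progressive_exposure(self):
        self.log.append("resume")

def expose_with_lens_switch(sensor, t_first, t_second):
    # Step 801: line-by-line exposure toward the first (long) image frame
    sensor.start_progressive_exposure()
    # Step 802: first moment -- long frame unfinished, short frame just started
    sensor.pause_at(t_first)
    sensor.switch_microlens_groups()
    # Step 803: continue from where exposure was interrupted
    sensor.resume_progressive_exposure()
    # Step 804: second moment -- long frame done, short frame unfinished
    sensor.pause_at(t_second)
    sensor.switch_microlens_groups()
    # Step 805: continue until the second image frame completes exposure
    sensor.resume_progressive_exposure()
```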
Step 806, the electronic device obtains the target image information or the target phase information based on the first data corresponding to the first image frame and the second data corresponding to the second image frame.
In some embodiments of the present application, the first data includes image information output from a part of the pixel groups of the pixel array and phase information output from another part of the pixel groups of the pixel array, and the second data includes phase information output from a part of the pixel groups of the pixel array and image information output from another part of the pixel groups of the pixel array.
In some embodiments of the present application, the target phase information may be used for auto-focusing, and the target image information may be used to generate a higher-definition image.
In some embodiments of the present application, the first data and the second data include image information or phase information corresponding to each pixel unit in the pixel array, and the step 806 may include a step 806A and a step 806B described below.
Step 806A, the electronic device synthesizes the image information in the first data and the image information in the second data to obtain the target image information.
Step 806B, the electronic device synthesizes the phase information in the first data with the phase information in the second data to obtain the target phase information.
In some embodiments of the present application, the exposure time period of the first image frame and the exposure time period of the second image frame may be the same or different.
When the exposure time length of the first image frame is the same as the exposure time length of the second image frame, the phase information in the first data and the phase information in the second data can be directly fused to obtain target phase information, and the image information in the first data and the image information in the second data can be directly fused to obtain target image information.
If the exposure time length of the first image frame is different from that of the second image frame, the electronic device can adjust the exposure gains corresponding to the first data and the second data so that the total exposure amounts corresponding to the first data and the second data are the same, and then the electronic device synthesizes the first data and the second data according to requirements.
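A minimal sketch of this gain adjustment, assuming a linear sensor response so that scaling pixel values by the ratio of exposure times equalizes the total exposure of the two frames (the function name and flat-list data layout are illustrative):

```python
def equalize_exposure(data, t_exp, t_ref):
    """Apply a digital gain so a frame exposed for t_exp matches one
    exposed for t_ref, assuming value is proportional to exposure time."""
    gain = t_ref / t_exp
    return [v * gain for v in data]
```

For example, a short frame exposed for 2 ms must be gained up by 10x before being fused with a long frame exposed for 20 ms.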
In some embodiments of the present application, the electronic device may perform step 806B in a focusing scene, and may perform step 806A in a capture scene (i.e., when an image needs to be acquired).
The electronic device can synthesize the image information in the first data and the image information in the second data to obtain target image information, or synthesize the phase information in the first data and the phase information in the second data to obtain target phase information. Thus, the electronic device can synthesize the target phase information in the focusing stage to realize fast and accurate auto-focusing, and synthesize the target image information in the capture stage to obtain a high-definition image. Therefore, the image sensor provided by the embodiment of the application improves the definition of acquired images while realizing full-pixel PDAF support.
In the exposure method provided by the embodiment of the application, in the process of controlling the pixel array to perform progressive exposure, the pixel array can be controlled to stop exposure at the first moment, at which the first image frame has not completed exposure and the second image frame has started exposure, and the form switching structure can be controlled to switch the microlens group corresponding to each pixel group in the pixel array before the pixel array continues to expose. The pixel array can then be controlled to stop exposure at the second moment, at which the first image frame has completed exposure and the second image frame has not, and the form switching structure can switch the microlens group corresponding to each pixel group again before exposure continues. In this way, each pixel group can output one type of information, such as image information, in the exposure periods before the first moment and after the second moment, and another type of information, such as phase information, in the exposure period after the first moment and before the second moment. This ensures that the first data and the second data include the image information or phase information output by each pixel unit in the pixel array. Thus, target phase information for auto-focusing can be generated from the phase information output by each pixel group, or higher-definition image information can be obtained from the image information output by each pixel group, so that the image sensor can have both the capability of collecting clear images and the capability of focusing accurately.
In some embodiments provided by the present application, the exposure method provided by the embodiment of the present application may further include:
step 807, the electronic device determines a first time according to the exposure parameter of the first image frame, and determines a second time according to the exposure parameter of the second image frame.
In this way, since the exposure parameters of the image include the exposure time of the image, the accuracy of the first time and the second time can be improved by determining the first time according to the exposure parameters of the first image frame and determining the second time according to the exposure parameters of the second image frame.
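Under one simplified rolling-shutter timing model (an assumption for illustration, not the patent's actual timing), the two moments follow directly from the exposure parameters: the first moment is when the first row's long exposure ends and its short exposure can begin, and the second moment is when the last row's long exposure ends:

```python
def switch_moments(num_rows, t_line, t_long):
    """Derive the two microlens-switch moments from exposure parameters.

    Assumes rows are offset by t_line and each row's short exposure
    begins immediately after its long exposure of duration t_long.
    """
    t_first = t_long                               # row 0 long done -> short begins
    t_second = (num_rows - 1) * t_line + t_long    # last row's long exposure done
    return t_first, t_second
```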
The exposure method provided by the embodiment of the application is described below with reference to examples.
For example, assume that each pixel group in the image sensor includes 4 pixel units, the exposure mode is 2-stagger (that is, long and short image frames are output in a staggered manner), and the electronic device is a mobile phone. The detailed procedure for realizing normal image output while remaining compatible with both PDAF and image definition is as follows:
Step 1, the mobile phone presets the AE parameter, AF parameter, and AWB parameter according to the user's photographing scene, so that the preview image of the camera matches human visual perception.
And 2, when a user presses a photographing key, the mobile phone outputs preset AE parameters to an exposure reading control module, so that the exposure reading control module controls the pixel array to start exposure according to the stagger exposure mode.
Step 3, before the time indicated by point A in fig. 9, that is, before the start of the short-image exposure in the stagger exposure mode, the mobile phone controls the pixel array to sense light row by row and reads out the image data of the long image. At this time, the correspondence between the microlens array and the pixel array is as shown in fig. 3A: the first R pixel group at the upper left of the pixel array corresponds to one first microlens, that is, the sensing mode of the R pixel group is the phase detection mode, so the R pixel group can output image data of the QPD structure.
Step 4, when point A in fig. 9 is reached, the pixel array restarts sensing and readout from the first row of pixel units to output the data of the short image. At this time, the mobile phone can suspend the exposure or readout of the pixel array through the exposure mode trigger control module, and control the form switching module through the exposure mode trigger control module to push the microlens array upwards or downwards so as to adjust the microlens or first lens unit corresponding to each pixel group. After the microlens array completes its movement, the correspondence between the microlens array and the pixel array is as shown in fig. 3B, and the mobile phone can then control the pixel array through the exposure mode trigger control module to continue the exposure and data readout actions.
Step 5, at the time indicated by point B in fig. 9, the long image in the stagger exposure mode has been fully sensed and read out. At this time, the mobile phone can again suspend the exposure or readout of the pixel array through the exposure mode trigger control module, and control the form switching module to push the microlens array upwards or downwards so as to adjust the microlens or first lens unit corresponding to each pixel group. After the microlens array completes its movement, the correspondence between the microlens array and the pixel array is as shown in fig. 3A.
Step 6, after the microlens array has moved, the mobile phone can control the pixel array through the exposure mode trigger control module to continue the exposure and data readout actions until the time indicated by point C in fig. 9, at which the two image frames in the stagger exposure mode have been output.
It will be appreciated that since the sensing mode of each pixel group in the pixel array is switched twice during the exposure of the pixel array in the stagger exposure mode, the long image is divided into two parts X1 and Y1, and the short image is divided into two parts X2 and Y2, as shown in fig. 9.
And 7, in the focusing stage, the mobile phone can synthesize the phase information, also called QPD data, in the short image and the long image to obtain a full-phase image, namely the image with the best focusing capability.
In the shooting stage, the mobile phone can synthesize the image information in the short image and the long image, which is also called tetra data, so as to obtain the image with the best definition.
In this way, in the process of controlling the pixel array to expose according to the stagger exposure mode, the microlens group corresponding to each pixel group in the pixel array is switched multiple times by controlling the movement of the microlens array, so that each pixel group outputs both image information and phase information during the exposure. Therefore, the data output by each pixel group can not only support accurate PDAF but also generate a high-definition image, allowing the images acquired by the image sensor to combine sharpness and focusing capability.
The above embodiment takes the electronic device controlling the pixel array to expose two image frames as an example; in practical implementation, the electronic device may control the pixel array to expose 3 or more image frames according to the staggered exposure mode.
Taking exposing 3 image frames as an example, the electronic device may obtain a high-definition image or perform accurate auto-focusing according to the data corresponding to the 3 image frames by adopting the processing manner of step 806.
For example, assume that the electronic device obtains the data of 3 image frames, namely data 1, data 2, and data 3. The electronic device may first fuse the image information in data 1 with the image information in data 2 to obtain first image information, then fuse the image information in data 2 with the image information in data 3 to obtain second image information, and finally fuse the first image information and the second image information to obtain the target image information. The method for obtaining the target phase information is similar.
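A minimal sketch of this pairwise three-frame fusion; the patent does not specify the fusion operator, so a simple element-wise mean stands in for it here (an assumption, along with the flat-list frame layout):

```python
def fuse_three(frames):
    """Pairwise fusion for 3 staggered frames: fuse (1,2) and (2,3),
    then fuse the two intermediate results. Element-wise mean is used
    as a stand-in for the unspecified fusion operation."""
    def fuse(a, b):
        return [(x + y) / 2 for x, y in zip(a, b)]
    first = fuse(frames[0], frames[1])    # first image information
    second = fuse(frames[1], frames[2])   # second image information
    return fuse(first, second)            # target image information
```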
In the exposure method provided by the embodiment of the application, the executing body may be an exposure apparatus. In the embodiment of the application, the exposure apparatus is described by taking an exposure apparatus executing the exposure method as an example.
An embodiment of the present application provides an exposure apparatus, which may include the image sensor in the above embodiments. As shown in fig. 10, the exposure apparatus 1000 further includes a control module 1001. The control module 1001 may be configured to:
Controlling a pixel array of an image sensor to perform progressive exposure so as to obtain a first image frame;
control the pixel array to stop exposure at a first moment, and switch the microlens group corresponding to each pixel group through the form switching structure of the image sensor, the first moment being a moment at which the pixel array corresponding to the first image frame has not completed exposure and the pixel array corresponding to the second image frame has started exposure; control the pixel array of the image sensor to continue the row-by-row exposure; control the pixel array to stop exposure at a second moment, and switch the microlens group corresponding to each pixel group through the form switching structure, the second moment being a moment at which the pixel array corresponding to the first image frame has completed exposure and the pixel array corresponding to the second image frame has not completed exposure; control the pixel array of the image sensor to continue the row-by-row exposure until the pixel array corresponding to the second image frame completes exposure; and obtain target image information or target phase information based on first data corresponding to the first image frame and second data corresponding to the second image frame.
In some embodiments of the present application, the first data and the second data include image information or phase information corresponding to each pixel group in the pixel array;
the control module 1001 is further configured to synthesize the image information in the first data and the image information in the second data to obtain target image information, or
And synthesizing the phase information in the first data and the phase information in the second data to obtain target phase information.
In the exposure device provided by the embodiment of the application, in the process of controlling the pixel array to perform progressive exposure, the pixel array can be controlled to stop exposure at the first moment, at which the first image frame has not completed exposure and the second image frame has started exposure, and the form switching structure can be controlled to switch the microlens group corresponding to each pixel group in the pixel array before the pixel array continues to expose. The pixel array can then be controlled to stop exposure at the second moment, at which the first image frame has completed exposure and the second image frame has not, and the form switching structure can switch the microlens group corresponding to each pixel group again before exposure continues. In this way, each pixel group can output one type of information, such as image information, in the exposure periods before the first moment and after the second moment, and another type of information, such as phase information, in the exposure period after the first moment and before the second moment. This ensures that the first data and the second data include the image information or phase information output by each pixel unit in the pixel array. Thus, target phase information for auto-focusing can be generated from the phase information output by each pixel group, or higher-definition image information can be obtained from the image information output by each pixel group, so that the image sensor can have both the capability of collecting clear images and the capability of focusing accurately.
The exposure device in the embodiment of the application may be an electronic device, or a component in the electronic device, such as an integrated circuit or a chip. The electronic device may be a terminal, or may be a device other than a terminal. The electronic device may be a mobile phone, a tablet computer, a notebook computer, a palmtop computer, a vehicle-mounted electronic device, a mobile internet device (Mobile Internet Device, MID), an augmented reality (AR)/virtual reality (VR) device, a robot, a wearable device, an ultra-mobile personal computer (UMPC), a netbook, or a personal digital assistant (PDA), and may also be a server, a network attached storage (Network Attached Storage, NAS), a personal computer (PC), a television (TV), a teller machine, a self-service machine, etc., which is not particularly limited in the embodiments of the present application.
The exposure apparatus in the embodiment of the present application may be an apparatus having an operating system. The operating system may be an Android operating system, an iOS operating system, or another possible operating system, which is not specifically limited in the embodiment of the present application.
The exposure device provided by the embodiment of the application can realize each process realized by the method embodiment shown in fig. 8 and achieve the same technical effects; to avoid repetition, details are not described here again.
Optionally, as shown in fig. 11, the embodiment of the present application further provides an electronic device 1100, including a processor 1101 and a memory 1102, where the memory 1102 stores a program or instruction executable on the processor 1101. When executed by the processor 1101, the program or instruction implements each step of the above exposure method embodiment and achieves the same technical effects; to avoid repetition, details are not described here again.
It should be noted that, the electronic device in the embodiment of the present application includes a mobile electronic device and a non-mobile electronic device.
Fig. 12 is a schematic hardware structure of an electronic device implementing an embodiment of the present application.
The electronic device 1500 includes, but is not limited to, radio frequency units 1501, network modules 1502, audio output units 1503, input units 1504, sensors 1505, display units 1506, user input units 1507, interface units 1508, memory 1509, and a processor 1510.
The sensor 1505 includes an image sensor, a pixel array and a microlens array correspondingly disposed on the image sensor, and a shape switching structure connected to the microlens array;
the pixel array comprises a plurality of pixel groups, each pixel group comprises N pixel units, and N is an integer greater than 1;
The micro-lens array comprises a plurality of micro-lens groups, wherein each micro-lens group comprises a first micro-lens or at least two second micro-lenses, and each pixel group is correspondingly provided with one first micro-lens or at least two second micro-lenses;
The form switching structure is used for driving the micro lens array to move so as to control each pixel group to be switched from the corresponding first micro lens to the corresponding at least two second micro lenses or from the corresponding at least two second micro lenses to the corresponding first micro lenses, wherein the pixel group is used for acquiring phase information when the pixel group corresponds to the first micro lens, and the pixel group is used for acquiring image information when the pixel group corresponds to the at least two second micro lenses.
In some embodiments of the present application, each of the pixel groups in the same row corresponds to the first microlens;
Or each of the pixel groups in the same row corresponds to at least two second microlenses.
In some embodiments of the present application, in the case where the i-th row of pixel groups corresponds to the first microlenses, the i+1-th row of pixel groups corresponds to at least two second microlenses, the i+2-th row of pixel groups corresponds to at least two second microlenses, and the i+3-th row of pixel groups corresponds to the first microlenses;
or in the case that the i-th row of pixel groups corresponds to the first microlenses, the i+1-th row of pixel groups corresponds to at least two second microlenses, the i+2-th row of pixel groups corresponds to the first microlenses, and the i+3-th row of pixel groups corresponds to at least two second microlenses;
or in the case that the i-th row of pixel groups corresponds to the first microlenses, the i+1-th row of pixel groups corresponds to the first microlenses, the i+2-th row of pixel groups corresponds to the at least two second microlenses, and the i+3-th row of pixel groups corresponds to the at least two second microlenses;
or in the case that the i-th row of the pixel groups corresponds to at least two second microlenses, the i+1-th row of the pixel groups corresponds to the first microlenses, the i+2-th row of the pixel groups corresponds to the first microlenses, and the i+3-th row of the pixel groups corresponds to at least two second microlenses;
wherein i is a positive integer.
In some embodiments of the present application, the number of pixel units in each pixel group is 2, 4, 6, 9 or 16, and the color of the pixel units in each pixel group is the same.
In some embodiments of the application, the image sensor further comprises an exposure reading control module and an exposure triggering control module, wherein the exposure reading control module is connected with the pixel array, and the exposure triggering control module is respectively connected with the exposure reading control module, the pixel array and the form switching structure;
The exposure reading control module is used for controlling the pixel array to perform progressive exposure so as to obtain a first image frame;
the exposure trigger control module is used for:
Controlling the pixel array to stop exposure at a first moment, controlling the form switching structure to drive the micro lens array to move, and controlling the pixel array to continue to expose row by row after the micro lens array is moved, wherein the first moment is the moment when the pixel array corresponding to a first image frame does not complete exposure and the pixel array corresponding to a second image frame starts exposure;
And controlling the pixel array to stop exposure at the second moment, controlling the form switching structure to drive the micro lens array to move, and controlling the pixel array to continue to expose row by row after the micro lens array is moved, wherein the second moment is the moment when the pixel array corresponding to the first image frame is exposed and the pixel array corresponding to the second image frame is not exposed.
In some embodiments of the present application, the exposure read control module is specifically configured to control the pixel array to perform row-by-row exposure according to an interlaced exposure mode.
In some embodiments of the present application, the exposure read control module is further configured to send first information to the exposure trigger control module, where the first information is used to indicate a first time and a second time;
the exposure trigger control module is further used for determining a first moment and a second moment according to the first information.
In some embodiments of the application, the first information comprises any one of:
time information of the first time and time information of the second time;
an exposure parameter of the first image frame and an exposure parameter of the second image frame.
Those skilled in the art will appreciate that the electronic device 1500 may also include a power source (e.g., a battery) for powering the various components, which may be logically connected to the processor 1510 via a power management system so as to perform functions such as managing charging, discharging, and power consumption through the power management system. The electronic device structure shown in fig. 12 does not constitute a limitation on the electronic device; the electronic device may include more or fewer components than illustrated, combine certain components, or have a different arrangement of components, which is not described in detail here.
Wherein the processor 1510 is configured to:
Controlling a pixel array of an image sensor to perform progressive exposure so as to obtain a first image frame;
Controlling the pixel array to stop exposure at a first moment, and switching the micro lens group corresponding to each pixel group through a form switching structure of the image sensor, wherein the first moment is the moment when the pixel array corresponding to the first image frame does not finish exposure and the pixel array corresponding to the second image frame starts exposure;
Controlling the pixel array of the image sensor to continue to perform progressive exposure;
Controlling the pixel array to stop exposure at a second moment, and switching the micro lens group corresponding to each pixel group through a form switching structure, wherein the second moment is the moment when the pixel array corresponding to the first image frame has completed exposure and the pixel array corresponding to the second image frame has not completed exposure;
controlling the pixel array of the image sensor to continue to perform progressive exposure until the pixel array corresponding to the second image frame is exposed;
And obtaining target image information or target phase information based on the first data corresponding to the first image frame and the second data corresponding to the second image frame.
In some embodiments of the present application, the first data and the second data include image information or phase information corresponding to each pixel group in the pixel array;
the processor 1510 is further configured to synthesize the image information in the first data and the image information in the second data to obtain target image information, or to synthesize the phase information in the first data and the phase information in the second data to obtain target phase information.
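The synthesis step above can be sketched as a simple merge: because the microlens form is switched mid-frame, each of the two frames carries image information for some pixel groups and phase information for others, and combining the two frames yields a complete set of one kind for every group. The representation below (a dict of pixel-group id to a (kind, value) pair) is an illustrative assumption, not the patent's data format.

```python
def synthesize(first_data, second_data, want="image"):
    """first_data / second_data: {pixel_group_id: (kind, value)}, where kind
    is 'image' or 'phase'.  Per the patent's scheme, a group under a single
    first microlens yields phase information and a group under at least two
    second microlenses yields image information; here we simply collect, from
    both frames, the entries of the requested kind."""
    merged = {}
    for frame_data in (first_data, second_data):
        for group_id, (kind, value) in frame_data.items():
            if kind == want:
                merged[group_id] = value
    return merged
```

Because the microlens form for each group differs between the portions of the two frames, every group contributes the requested kind in exactly one of the two frames, so the merged result covers the whole array.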
In the electronic device provided by the embodiment of the application, while the pixel array is controlled to perform progressive exposure, the pixel array can be stopped at the first moment, when the first image frame is not yet fully exposed and the second image frame has started to expose, and the form switching structure can be controlled to switch the microlens group corresponding to each pixel group in the pixel array before exposure resumes. The pixel array can then be stopped again at the second moment, when the first image frame is fully exposed and the second image frame is not, the form switching structure can switch the microlens group corresponding to each pixel group once more, and exposure can resume. As a result, one type of information, such as image information, can be output during the exposure periods before the first moment and after the second moment, and another type of information, such as phase information, can be output during the exposure period between the first moment and the second moment. This ensures that the first data and the second data together include the image information or phase information output by each pixel unit in the pixel array. Target phase information for automatic focusing can therefore be generated from the phase information output by each pixel group, or image information with higher definition can be obtained from the image information output by each pixel group, so that the exposure device has both the capability of capturing clear images and the capability of focusing accurately.
It should be appreciated that, in embodiments of the present application, the input unit 1504 may include a graphics processing unit (Graphics Processing Unit, GPU) 15041 and a microphone 15042; the graphics processor 15041 processes image data of still pictures or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The sensor 1505 may include an image sensor as shown in any of fig. 3-8. The display unit 1506 may include a display panel 15061, which may be configured in the form of a liquid crystal display, an organic light-emitting diode display, or the like. The user input unit 1507 includes at least one of a touch panel 15071 and other input devices 15072. The touch panel 15071, also referred to as a touch screen, may include two parts: a touch detection device and a touch controller. Other input devices 15072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
The memory 1509 may be used to store software programs as well as various data. The memory 1509 may mainly include a first storage area storing programs or instructions and a second storage area storing data, wherein the first storage area may store an operating system and the application programs or instructions required for at least one function (such as a sound playing function, an image playing function, etc.). Further, the memory 1509 may include volatile memory or non-volatile memory, or both. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), a static RAM (SRAM), a dynamic RAM (DRAM), a synchronous DRAM (SDRAM), a double data rate SDRAM (DDR SDRAM), an enhanced SDRAM (ESDRAM), a synch-link DRAM (SLDRAM), or a direct Rambus RAM (DRRAM). The memory 1509 in embodiments of the application includes, but is not limited to, these and any other suitable types of memory.
The processor 1510 may include one or more processing units. Optionally, the processor 1510 integrates an application processor, which primarily handles the operating system, user interfaces, application programs, and the like, and a modem processor, such as a baseband processor, which primarily handles wireless communication signals. It will be appreciated that the modem processor may alternatively not be integrated into the processor 1510.
The embodiment of the application also provides a readable storage medium on which a program or instructions are stored. When the program or instructions are executed by a processor, the processes of the above exposure method embodiments are implemented and the same technical effects can be achieved; to avoid repetition, the details are not described here.
The processor is the processor in the electronic device described in the above embodiment. The readable storage medium includes a computer-readable storage medium such as a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disk.
The embodiment of the application further provides a chip. The chip includes a processor and a communication interface, the communication interface is coupled with the processor, and the processor is configured to run programs or instructions to implement the processes of the above exposure method embodiments and achieve the same technical effects; to avoid repetition, the details are not described here.
It should be understood that the chip referred to in the embodiments of the present application may also be called a system-level chip, a system chip, a chip system, or a system-on-chip.
Embodiments of the present application provide a computer program product stored in a storage medium. The program product is executed by at least one processor to implement the processes of the above exposure method embodiments and achieve the same technical effects, which are not repeated here.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element preceded by the phrase "comprising a ..." does not exclude the presence of other like elements in the process, method, article, or apparatus that comprises the element. Furthermore, it should be noted that the scope of the methods and apparatus in the embodiments of the present application is not limited to performing the functions in the order shown or discussed; depending on the functions involved, the functions may also be performed substantially simultaneously or in the reverse order. For example, the described methods may be performed in an order different from that described, and various steps may be added, omitted, or combined. Additionally, features described with reference to certain examples may be combined in other examples.
From the above description of the embodiments, it will be clear to those skilled in the art that the methods of the above embodiments may be implemented by software plus a necessary general-purpose hardware platform, or by hardware, though in many cases the former is the preferred implementation. Based on such understanding, the technical solution of the present application, or the part of it contributing to the prior art, may be embodied in the form of a computer software product stored in a storage medium (e.g., ROM/RAM, magnetic disk, or optical disk) and including instructions for causing a terminal (which may be a mobile phone, a computer, a server, a network device, etc.) to perform the methods according to the embodiments of the present application.
The embodiments of the present application have been described above with reference to the accompanying drawings, but the present application is not limited to the above-described embodiments, which are merely illustrative and not restrictive. In light of the present application, those of ordinary skill in the art may derive many other forms without departing from the spirit of the present application and the scope of the claims, all of which fall within the protection of the present application.

Claims (12)

1. The image sensor is characterized by comprising a pixel array, a micro lens array and a form switching structure, wherein the pixel array and the micro lens array are correspondingly arranged, and the form switching structure is connected with the micro lens array;
The pixel array comprises a plurality of pixel groups, each pixel group comprises N pixel units, and N is an integer greater than 1;
The micro-lens array comprises a plurality of micro-lens groups, wherein each micro-lens group comprises a first micro-lens or at least two second micro-lenses, and each pixel group is correspondingly provided with one first micro-lens or at least two second micro-lenses;
The form switching structure is used for driving the micro lens array to move so as to control each pixel group to be switched from the corresponding first micro lens to the corresponding at least two second micro lenses or from the corresponding at least two second micro lenses to the corresponding first micro lenses, wherein when the pixel group corresponds to the first micro lenses, the pixel group is used for acquiring phase information, and when the pixel group corresponds to the at least two second micro lenses, the pixel group is used for acquiring image information.
2. The image sensor of claim 1, wherein each of the pixel groups of the same row corresponds to the first microlens;
Or each of the pixel groups in the same row corresponds to at least two second microlenses.
3. The image sensor according to claim 2, wherein in the case where the i-th row of the pixel group corresponds to the first microlens, the i+1-th row of the pixel group corresponds to at least two of the second microlenses, the i+2-th row of the pixel group corresponds to at least two of the second microlenses, and the i+3-th row of the pixel group corresponds to the first microlens;
Or in the case that the i-th row of the pixel groups corresponds to the first microlenses, the i+1-th row of the pixel groups corresponds to at least two second microlenses, the i+2-th row of the pixel groups corresponds to the first microlenses, and the i+3-th row of the pixel groups corresponds to at least two second microlenses;
Or in the case that the i-th row of the pixel groups corresponds to the first microlenses, the i+1-th row of the pixel groups corresponds to the first microlenses, the i+2-th row of the pixel groups corresponds to at least two second microlenses, and the i+3-th row of the pixel groups corresponds to at least two second microlenses;
or in the case that the i-th row of the pixel groups corresponds to at least two second microlenses, the i+1-th row of the pixel groups corresponds to the first microlenses, the i+2-th row of the pixel groups corresponds to the first microlenses, and the i+3-th row of the pixel groups corresponds to at least two second microlenses;
wherein i is a positive integer.
4. The image sensor of claim 1, wherein the number of pixel cells in the pixel group is 2, 4, 6, 9 or 16, and the color of the pixel cells in each pixel group is the same.
5. The image sensor of any one of claims 1 to 4, further comprising an exposure read control module and an exposure trigger control module, wherein the exposure read control module is connected to the pixel array, and the exposure trigger control module is connected to the exposure read control module, the pixel array, and the form switching structure, respectively;
The exposure reading control module is used for controlling the pixel array to perform progressive exposure so as to obtain a first image frame;
the exposure trigger control module is used for:
Controlling the pixel array to stop exposure at a first moment, controlling the form switching structure to drive the micro lens array to move, and controlling the pixel array to continue to expose row by row after the micro lens array is moved, wherein the first moment is the moment when the pixel array corresponding to the first image frame does not finish exposure and the pixel array corresponding to the second image frame starts exposure;
And controlling the pixel array to stop exposure at a second moment, controlling the form switching structure to drive the micro lens array to move, and controlling the pixel array to continue to expose row by row after the micro lens array is moved, wherein the second moment is the moment when the pixel array corresponding to the first image frame is exposed and the pixel array corresponding to the second image frame is not exposed.
6. The image sensor of claim 5, wherein the exposure read control module is specifically configured to control the pixel array to perform a row-by-row exposure according to an interlaced exposure mode.
7. The image sensor of claim 5, wherein the exposure read control module is further configured to send first information to the exposure trigger control module, the first information being configured to indicate the first time and the second time;
the exposure triggering control module is further configured to determine the first time and the second time according to the first information.
8. The image sensor of claim 7, wherein the first information comprises any one of:
the time information of the first time and the time information of the second time;
an exposure parameter of the first image frame and an exposure parameter of the second image frame.
9. An exposure method applied to the image sensor according to any one of claims 1 to 8, characterized by comprising:
controlling a pixel array of the image sensor to perform progressive exposure so as to obtain a first image frame;
controlling the pixel array to stop exposure at a first moment, and switching a micro lens group corresponding to each pixel group through a form switching structure of the image sensor, wherein the first moment is the moment when the pixel array corresponding to the first image frame does not complete exposure and the pixel array corresponding to the second image frame starts exposure;
Controlling the pixel array of the image sensor to continue to perform progressive exposure;
Controlling the pixel array to stop exposure at a second moment, and switching the micro lens group corresponding to each pixel group through the form switching structure, wherein the second moment is the moment when the pixel array corresponding to the first image frame is completely exposed and the pixel array corresponding to the second image frame is not completely exposed;
controlling the pixel array of the image sensor to continue to perform progressive exposure until the pixel array corresponding to the second image frame is exposed;
And obtaining target image information or target phase information based on the first data corresponding to the first image frame and the second data corresponding to the second image frame.
10. The method according to claim 9, wherein
The first data and the second data comprise image information or phase information corresponding to each pixel group in the pixel array;
The obtaining target image information or target phase information based on the first data corresponding to the first image frame and the second data corresponding to the second image frame includes:
synthesizing the image information in the first data and the image information in the second data to obtain the target image information, or
And synthesizing the phase information in the first data and the phase information in the second data to obtain the target phase information.
11. An exposure apparatus, characterized in that the exposure apparatus comprises the image sensor according to any one of claims 1 to 8, the apparatus further comprising a control module;
the control module is used for:
controlling a pixel array of the image sensor to perform progressive exposure so as to obtain a first image frame;
controlling the pixel array to stop exposure at a first moment, and switching a micro lens group corresponding to each pixel group through a form switching structure of the image sensor, wherein the first moment is the moment when the pixel array corresponding to the first image frame does not complete exposure and the pixel array corresponding to the second image frame starts exposure;
Controlling the pixel array of the image sensor to continue to perform progressive exposure;
Controlling the pixel array to stop exposure at a second moment, and switching the micro lens group corresponding to each pixel group through the form switching structure, wherein the second moment is the moment when the pixel array corresponding to the first image frame is completely exposed and the pixel array corresponding to the second image frame is not completely exposed;
controlling the pixel array of the image sensor to continue to perform progressive exposure until the pixel array corresponding to the second image frame is exposed;
And obtaining target image information or target phase information based on the first data corresponding to the first image frame and the second data corresponding to the second image frame.
12. An electronic device comprising a processor and a memory storing a program or instructions executable on the processor, which when executed by the processor, implement the steps of the exposure method of claim 9 or 10.
CN202411094073.1A 2024-08-09 2024-08-09 Image sensor, exposure method, exposure device and electronic equipment Active CN119094877B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202411094073.1A CN119094877B (en) 2024-08-09 2024-08-09 Image sensor, exposure method, exposure device and electronic equipment


Publications (2)

Publication Number Publication Date
CN119094877A CN119094877A (en) 2024-12-06
CN119094877B true CN119094877B (en) 2025-09-16

Family

ID=93665833

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202411094073.1A Active CN119094877B (en) 2024-08-09 2024-08-09 Image sensor, exposure method, exposure device and electronic equipment

Country Status (1)

Country Link
CN (1) CN119094877B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107135340A (en) * 2017-04-28 2017-09-05 广东欧珀移动通信有限公司 Imaging sensor, focusing control method, imaging device and mobile terminal
CN115224060A (en) * 2021-04-21 2022-10-21 三星电子株式会社 Image sensor with a plurality of pixels

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2017207695A (en) * 2016-05-20 2017-11-24 株式会社ニコン Optical device
US20230395626A1 (en) * 2022-06-03 2023-12-07 Omnivision Technologies, Inc. Hybrid image pixels for phase detection auto focus
JP7404447B1 (en) * 2022-06-22 2023-12-25 ゼタテクノロジーズ株式会社 Solid-state imaging device, solid-state imaging device manufacturing method, and electronic equipment



Similar Documents

Publication Publication Date Title
JP5718069B2 (en) Solid-state imaging device and imaging device
KR101428596B1 (en) Image sensor
CN105049685B (en) The control method of picture pick-up device and picture pick-up device
JP5850680B2 (en) Imaging apparatus and control method thereof
KR20180052700A (en) Image pickup device and image pickup device
JP7473041B2 (en) Image pickup element and image pickup device
JP2021182763A (en) Image pick-up device and imaging apparatus
US11272130B2 (en) Image capturing apparatus
US20050128324A1 (en) Image sensing apparatus and method of controlling same
US11412168B2 (en) Imaging element and method of controlling the same, and imaging device
JP6265962B2 (en) Imaging device and imaging apparatus
US20180220058A1 (en) Image capture apparatus, control method therefor, and computer-readable medium
JP6700751B2 (en) IMAGING DEVICE, IMAGING DEVICE CONTROL METHOD, AND PROGRAM
CN119094877B (en) Image sensor, exposure method, exposure device and electronic equipment
JP2018078609A (en) Imaging device and imaging apparatus
US10203206B2 (en) Image capture apparatus having signal readouts using distance measurement region
JP6444254B2 (en) FOCUS DETECTION DEVICE, IMAGING DEVICE, FOCUS DETECTION METHOD, PROGRAM, AND STORAGE MEDIUM
US10880477B2 (en) Image capturing apparatus and multi-readout mode control method for carrying out a live view display
JP2018050267A (en) Image pickup apparatus and image pickup element control method
JP7091044B2 (en) Image sensor and image sensor
JP7566065B2 (en) Image pickup element and image pickup device
JP7614681B1 (en) Solid-state imaging device having AI function, driving method thereof, and electronic device
JP2020057892A (en) Imaging device
WO2024084930A1 (en) Solid-state imaging device, method for driving same, and electronic equipment
JP2003153090A (en) X-y scanning image pickup element and imaging device equipped with the same

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant