
HK1204406B - Methods of reading pixel data from a pixel array and an imaging system - Google Patents

Methods of reading pixel data from a pixel array and an imaging system

Info

Publication number
HK1204406B
Authority
HK
Hong Kong
Prior art keywords
pixel
pixel regions
exposure
exposure times
pixel data
Prior art date
Application number
HK15104765.8A
Other languages
Chinese (zh)
Other versions
HK1204406A1 (en)
Inventor
邝江涛
吴东晖
王超
船津英一
Original Assignee
豪威科技股份有限公司
Priority date
Filing date
Publication date
Priority claimed from US14/243,480 (US9413992B2)
Application filed by 豪威科技股份有限公司
Publication of HK1204406A1
Publication of HK1204406B

Abstract

The subject application relates to methods of reading pixel data from a pixel array and an imaging system. A method of reading pixel data from a pixel array includes exposing each of a plurality of pixel regions for a respective exposure time. Pixel data is read from the plurality of pixel regions. The pixel data from a first one of the pixel regions is interpolated to determine the pixel data of the pixel regions other than the first pixel region, generating a first image having the first exposure time. The pixel data from a second one of the pixel regions is interpolated to determine the pixel data of the pixel regions other than the second pixel region, generating a second image having the second exposure time. The images are combined to produce a high dynamic range image.

Description

Method for reading pixel data from pixel array and imaging system
Cross reference to related applications
This application claims the benefit of U.S. provisional application No. 61/825,419, filed May 20, 2013.
Technical Field
The present invention relates generally to image sensors. More specifically, examples of the disclosure relate to image processing and signal processing techniques utilized in High Dynamic Range (HDR) image sensors.
Background
Standard image sensors have a limited dynamic range of approximately 60 dB to 70 dB. However, the real world luminance dynamic range is much larger. Natural scenes typically span a range of 90 dB or more. To capture highlights and shadows simultaneously, HDR techniques have been used in image sensors to increase the captured dynamic range. The most common technique to increase dynamic range is to combine multiple exposures captured with a standard (low dynamic range) image sensor into a single linear HDR image, which has a much larger dynamic range than a single exposure image.
One of the most common HDR sensor solutions combines multiple exposures into one single image sensor. With different exposure integration times or different sensitivities (e.g., by inserting neutral density filters), a single image sensor may provide 2, 3, 4, or even more different exposures. Multiple exposure images may thus be obtained in a single shot using such an HDR image sensor. However, using such an HDR sensor reduces the overall image resolution compared to a normal full resolution image sensor. For example, for an HDR sensor that combines 4 different exposures in one image sensor, each HDR image has only one-quarter the resolution of the full resolution image.
Disclosure of Invention
In one aspect, the present application provides a method of reading pixel data from a pixel array including a plurality of pixels, wherein the plurality of pixels are organized into a plurality of pixel regions arranged in a pattern in the pixel array. The method may comprise: exposing a first one of the plurality of pixel regions for a first exposure time; exposing a second one of the plurality of pixel regions for a second exposure time; reading pixel data from the plurality of pixel regions; interpolating the pixel data from the first one of the plurality of pixel regions to determine the pixel data of the plurality of pixel regions other than the first one of the plurality of pixel regions for the first exposure time to generate a first one of a plurality of images from the pixel array, wherein the first one of the plurality of images has the first exposure time; interpolating the pixel data from the second one of the plurality of pixel regions to determine the pixel data of the plurality of pixel regions other than the second one of the plurality of pixel regions for the second exposure time to generate a second one of the plurality of images from the pixel array, wherein the second one of the plurality of images has the second exposure time; and combining the plurality of images to generate a high dynamic range image.
In another aspect, the present application provides a method of reading pixel data from a pixel array comprising a plurality of pixels, wherein the plurality of pixels are organized into a plurality of pixel regions arranged in a pattern in the pixel array. The method may comprise: exposing each of a plurality of pixel regions for a respective one of a plurality of exposure times; reading pixel data from the plurality of pixel regions; determining, for each respective one of the plurality of exposure times, an exposure ratio for each of the plurality of pixel regions, wherein the exposure ratio for each respective one of the plurality of exposure times for each of the plurality of pixel regions is equal to the respective one of the plurality of exposure times divided by the exposure time for that one of the plurality of pixel regions; for each respective one of the plurality of exposure times, replacing the pixel data of each of the plurality of pixel regions whose exposure ratio is less than a first threshold with the pixel data from that one of the plurality of pixel regions multiplied by the exposure ratio of that one of the plurality of pixel regions to generate an image for the respective one of the plurality of exposure times; and combining the plurality of images to generate a high dynamic range image.
In yet another aspect, the present application provides an imaging system. The imaging system may include: a pixel array comprising a plurality of pixels, wherein the plurality of pixels are organized into a plurality of pixel regions arranged in a pattern in the pixel array; control circuitry coupled to the pixel array to control operation of the pixel array, wherein the control circuitry is coupled to expose each of a plurality of pixel regions for a respective one of a plurality of exposure times; and readout circuitry coupled to the pixel array to readout pixel data from the plurality of pixel regions, the imaging system coupled to: determining, for each respective one of the plurality of exposure times, an exposure ratio for each of the plurality of pixel regions, wherein the exposure ratio for each respective one of the plurality of exposure times for each of the plurality of pixel regions is equal to a respective one of the plurality of exposure times divided by the exposure time for the one of the plurality of pixel regions; for each respective one of the plurality of exposure times, replacing the pixel data of each of the plurality of pixel regions whose exposure ratio is less than a first threshold with the pixel data from the one of the plurality of pixel regions multiplied by the exposure ratio of the one of the plurality of pixel regions to generate an image for the respective one of the plurality of exposure times; and combining the plurality of images to generate a high dynamic range image.
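For illustration only, the exposure-ratio replacement step described in the above aspects can be sketched in Python as follows. This is a minimal sketch under assumed conventions: the array and parameter names (pixel_data, region_exposure, first_threshold) are invented for this example, the pixel response is assumed linear, and the sketch is not the claimed method itself.

    import numpy as np

    def replace_by_exposure_ratio(pixel_data, region_exposure, target_exposure, first_threshold):
        # Exposure ratio per pixel: the target exposure time divided by the
        # exposure time actually used for that pixel's region.
        ratio = target_exposure / region_exposure
        out = pixel_data.astype(np.float64)
        # Pixels whose exposure ratio is below the first threshold are replaced by
        # their measured value multiplied by the exposure ratio, putting them on
        # the target-exposure scale; larger ratios would instead be filled by
        # interpolation (not shown here).
        low = ratio < first_threshold
        out[low] = pixel_data[low] * ratio[low]
        return out

One such image would be generated for each of the plurality of exposure times, and the resulting images combined into the high dynamic range image as described above.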
Drawings
Non-limiting and non-exhaustive embodiments of the present invention are described with reference to the following figures, wherein like reference numerals refer to like parts throughout the various views unless otherwise specified.
FIG. 1A is a diagram illustrating one example of a four exposure HDR image sensor according to the teachings of this disclosure.
FIG. 1B is a diagram illustrating one example of pixel regions in which there is a first exposure in an example four exposure HDR image sensor according to the teachings of this disclosure.
FIG. 2 is a diagram illustrating one example of a four exposure HDR image sensor, where the full resolution of the image sensor is captured with an example upscaling technique, according to the teachings of this disclosure.
FIG. 3 is a diagram illustrating one example of a four exposure HDR image sensor, where the full resolution of the image sensor is captured with an example pixel data replacement technique, according to the teachings of this disclosure.
FIG. 4 is a diagram illustrating one example of an imaging system including an HDR image sensor pixel array, according to the teachings of this disclosure.
Corresponding reference characters indicate corresponding parts throughout the several views of the drawings. Skilled artisans will appreciate that elements in the figures are illustrated for simplicity and clarity and have not necessarily been drawn to scale. For example, the dimensions of some of the elements in the figures may be exaggerated relative to other elements to help improve understanding of various embodiments of the present invention. Additionally, common but well-understood elements that are useful or necessary in a commercially feasible embodiment are often not depicted in order to facilitate a less obstructed view of these various embodiments of the present invention.
Detailed Description
As will be shown, the present disclosure is directed to methods and apparatus for HDR image sensors with full resolution. In the following description, numerous specific details are set forth in order to provide a thorough understanding of the present invention. One skilled in the relevant art will recognize, however, that the techniques described herein may be practiced without one or more of the specific details, or with other methods, components, materials, and so forth. In other instances, well-known structures, materials, or operations are not shown or described in detail to avoid obscuring certain aspects.
Reference throughout this specification to "one embodiment," "an embodiment," "one example," or "an example" means that a particular feature, structure, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the present invention. Thus, the appearances of the phrases such as "in one embodiment" or "in one example" in various places throughout this specification are not necessarily all referring to the same embodiment or example. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments or examples. The following is a detailed description of terms and elements used in the description of examples of the invention, with reference to the accompanying drawings.
As will be shown, examples in accordance with the teachings of this disclosure provide image processing techniques that restore the full resolution of HDR image sensors that include two or more exposures in a single image sensor chip. It should be noted that the example techniques described below use a four exposure HDR image sensor as an example. Of course, it should be understood that HDR image sensors with any number of exposures may also be utilized in accordance with the teachings of this disclosure.
In the example depicted in FIG. 1A, an example four exposure HDR image sensor 100 is illustrated. In the illustrated example, a Bayer pattern is shown for purposes of explanation. However, it should be understood that any other colored or non-colored pattern may also be utilized in accordance with the teachings of the present invention. As shown in the depicted example, image sensor 100 includes a plurality of pixel regions T0, T1, T2, and T3 arranged in a two-dimensional array as shown. In the example, it is assumed that the images obtained with the plurality of pixel regions T0, T1, T2, and T3 are acquired with exposures E0, E1, E2, and E3, respectively. Referring to the example illustrated in FIG. 1B, it can be appreciated that if only the T0 pixel regions of image sensor 100 (where there is only exposure E0) are used to capture an image, only one-fourth of the total number of pixels of image sensor 100 is available. In other words, the three quarters of the pixels outside the T0 pixel regions are not used to capture an image with exposure E0, and thus exposure E0 alone does not directly capture the full resolution of image sensor 100.
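As a concrete, purely illustrative way to picture the layout of FIG. 1A, the four pixel regions can be treated as a repeating 2×2 tile across the array. The offsets below, with T0 assumed at the top-left of each tile, are assumptions made for this sketch rather than a layout required by the disclosure.

    import numpy as np

    def extract_region(raw, row_offset, col_offset):
        # Pick out the quarter-resolution sub-image for one exposure region of an
        # array tiled with a repeating 2x2 pattern of regions T0..T3.
        return raw[row_offset::2, col_offset::2]

    # Example with a stand-in 4x4 readout; with T0 assumed at the top-left of
    # each tile, the E0 data covers only one quarter of the pixels.
    raw = np.arange(16).reshape(4, 4)
    t0 = extract_region(raw, 0, 0)   # shape (2, 2): one quarter of the array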
There are a variety of techniques that may be used to capture the full resolution of the image sensor 100 at HDR in accordance with the teachings of this disclosure. For example, in one example, the full resolution of image sensor 100 may be captured with HDR using upscaling. Upscaling is a common process involving a trade-off between smoothness and sharpness. Various scaling methods may be utilized, such as nearest neighbor interpolation, bilinear interpolation, spline interpolation, or other vector-based interpolation. In one example, a bi-cubic interpolation method may be used to obtain an upscaled image for each of exposures E0, E1, E2, and E3. Suppose that:
T0' = upscale(T0),
T1' = upscale(T1),
T2' = upscale(T2), and
T3' = upscale(T3).
According to the teachings of this disclosure, the image data of the other pixel regions having different exposures may then be recovered. To illustrate, FIG. 2 shows an example image sensor 200 in which image data is recovered using an example upscaling technique for the missing pixel locations (e.g., the non-T0 pixel regions), with the recovered T0' data used for the E0 exposure as shown, in accordance with the teachings of this disclosure. It should be appreciated that the recovered T1' data may be similarly used for the non-T1 pixel regions for the E1 exposure, the recovered T2' data may be similarly used for the non-T2 pixel regions for the E2 exposure, and the recovered T3' data may be similarly used for the non-T3 pixel regions for the E3 exposure in accordance with the teachings of this disclosure. Thus, according to the teachings of this disclosure, the full resolution of image sensor 100 may be obtained for all of exposures E0, E1, E2, and E3, providing full resolution HDR information using the example upscaling technique.
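A minimal sketch of the upscaling step follows, assuming a 2× zoom per axis for a 2×2 region tiling and using SciPy's cubic spline interpolation as a stand-in for the bi-cubic interpolation mentioned above; the library choice and parameters are illustrative assumptions, and per-color-plane handling for the Bayer pattern is omitted.

    import numpy as np
    from scipy.ndimage import zoom

    def upscale_region(region_data, factor=2):
        # Upscale a quarter-resolution exposure sub-image (e.g., T0) back to the
        # full array resolution; order=3 selects cubic spline interpolation.
        return zoom(region_data.astype(np.float64), factor, order=3)

    # T0_prime = upscale_region(T0)   # recovered full-resolution data T0' for E0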
In another example, pixel replacement may be utilized in an HDR image sensor, where the pixel response is assumed to be linear with respect to exposure time. In the example, the pixel values for the missing pixel locations can be estimated from the other exposures. For example, assume that:
the exposure ratios are r1 = E0/E1, r2 = E0/E2, and r3 = E0/E3. It can then be assumed that
T0_1″ = r1 × T1,
T0_2″ = r2 × T2, and
T0_3″ = r3 × T3.
By replacing the missing pixel locations with the values estimated from the other exposures, a full resolution image for T0 can thus be recovered, as shown in the example illustrated in FIG. 3, which shows an example image sensor 300 in which image data for the missing pixel locations is recovered using an example pixel data replacement technique in accordance with the teachings of this disclosure.
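A minimal sketch of this pixel data replacement for the E0 image, under the linear-response assumption, using the exposure ratios r1 = E0/E1, r2 = E0/E2, and r3 = E0/E3 and a 2×2 tiling in which T0, T1, T2, and T3 are assumed to occupy the top-left, top-right, bottom-left, and bottom-right positions; the offsets are illustrative assumptions.

    import numpy as np

    def recover_t0_by_replacement(raw, exposures):
        # raw: full sensor readout tiled with regions T0..T3 (2x2 pattern).
        # exposures: (E0, E1, E2, E3) exposure times; pixel response assumed linear.
        E0, E1, E2, E3 = exposures
        out = raw.astype(np.float64)
        # T0 pixels keep their measured values; the other regions are rescaled by
        # the exposure ratio, i.e. T0_i'' = r_i * T_i.
        out[0::2, 1::2] *= E0 / E1   # T0_1'' = r1 * T1
        out[1::2, 0::2] *= E0 / E2   # T0_2'' = r2 * T2
        out[1::2, 1::2] *= E0 / E3   # T0_3'' = r3 * T3
        return out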
According to the teachings of this disclosure, for the remaining exposures E1, E2, and E3, similar pixel data replacement techniques may be utilized to recover the T1, T2, and T3 pixel data. It should be noted, however, that if the longer exposure is saturated, the estimated pixel values are unreliable because the useful pixel information is clipped at the full well capacity. In this case, the upscaled image pixel values may instead be utilized using the techniques previously discussed, in accordance with the teachings of this disclosure.
It should be appreciated that the pixel replacement technique suffers from poor signal-to-noise ratio (SNR) as the exposure ratio increases. For example, if E0/E3 is 64, then E3 must apply 64x digital gain to reach the same digital level as E0, which may have an 18 dB lower SNR compared to E0 (in the shot-noise-limited case, SNR scales with the square root of the collected signal, so a 64x shorter exposure gives roughly 8x, or about 18 dB, lower SNR). Furthermore, if E3 falls into the non-linear response region of the image sensor, the estimated value T0_3″ will be inaccurate. On the other hand, the upscaling technique suffers from reduced resolution and/or inefficient high frequency estimation, which results in zigzag artifacts due to interpolation.
In another example, both an upscaling technique and a pixel data replacement technique may be utilized, based on the exposure ratio, to recover the missing pixel information in accordance with the teachings of this disclosure. For example, in one example:
T0_recovered = T0″, if the exposure ratio is less than a first threshold TH1,
T0_recovered = (1 − w) × T0″ + w × T0', if the exposure ratio is between the first threshold TH1 and a second threshold TH2, wherein w is a weight between 0 and 1 that increases with the exposure ratio, and
T0_recovered = T0', if the exposure ratio is greater than the second threshold TH2.
Thus, according to the teachings of this disclosure, for high dynamic range scenes the upscaling technique may be utilized, while for low dynamic range scenes the pixel data replacement technique may be utilized, to restore the full resolution of the HDR image sensor.
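A minimal sketch of this blended recovery for one exposure, assuming the two thresholds TH1 and TH2 on the exposure ratio and a simple linear ramp for the weight w between them; the linear ramp is an assumption made for this sketch, the disclosure only requiring that w move from pure pixel replacement toward pure upscaling as the exposure ratio grows.

    import numpy as np

    def blend_recovery(t0_replaced, t0_upscaled, ratio, th1, th2):
        # t0_replaced: full-resolution estimate from pixel data replacement (T0'')
        # t0_upscaled: full-resolution estimate from upscaling (T0')
        # ratio: per-pixel exposure ratio; th1, th2: first and second thresholds.
        w = np.clip((ratio - th1) / (th2 - th1), 0.0, 1.0)  # illustrative linear ramp
        # ratio < th1 -> w = 0 (pure replacement); ratio > th2 -> w = 1 (pure upscaling)
        return (1.0 - w) * t0_replaced + w * t0_upscaled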
In one example, after the full resolution images for T0, T1, T2, and T3 have been restored, an HDR combining process may be used to combine them into a single HDR image in accordance with the teachings of this disclosure. In one example, a tone mapping process may then be performed to compress the dynamic range and produce a normal image suitable for display.
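As one possible illustration of the combining and tone mapping steps, the sketch below merges the recovered full-resolution exposure images with a saturation-aware, exposure-weighted average and then applies a simple normalize-and-gamma curve; both the weighting scheme and the tone curve are assumptions made for this sketch, not the specific HDR combining or tone mapping process of the disclosure.

    import numpy as np

    def combine_hdr(images, exposures, saturation=0.95):
        # images: recovered full-resolution images for T0..T3, scaled to [0, 1].
        # exposures: matching exposure times E0..E3.
        num = np.zeros_like(images[0], dtype=np.float64)
        den = np.zeros_like(images[0], dtype=np.float64)
        for img, exp_time in zip(images, exposures):
            # Down-weight clipped pixels so highlights come from shorter exposures.
            w = np.where(img < saturation, 1.0, 1e-3)
            num += w * img / exp_time   # back to a common radiance scale
            den += w
        return num / np.maximum(den, 1e-12)

    def tone_map(hdr, gamma=2.2):
        # Compress the dynamic range with a simple normalize-and-gamma curve.
        norm = hdr / np.maximum(hdr.max(), 1e-12)
        return norm ** (1.0 / gamma)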
FIG. 4 illustrates an example imaging system 491 that includes an example HDR image sensor 492 having a plurality of image sensor pixel cells, in accordance with the teachings of this disclosure. As shown in the depicted example, imaging system 491 includes HDR image sensor 492 coupled to control circuitry 498 and to readout circuitry 494, which is coupled to function logic 496.
In one example, HDR image sensor 492 is a two-dimensional (2D) array of image sensor pixel cells (e.g., pixels P1, P2, P3, …, Pn). It should be noted that HDR image sensor 492 may be an example of image sensor 100 of FIGS. 1A-1B, of image sensor 200 of FIG. 2, or of image sensor 300 of FIG. 3, and that similarly named and numbered elements referenced below are coupled and function similarly as described above. As illustrated, each pixel cell is arranged into a row (e.g., rows R1 through Ry) and a column (e.g., columns C1 through Cx) to acquire image data of a person, place, object, etc., which can then be used to render a 2D image of the person, place, object, etc.
In one example, after each pixel cell P1, P2, P3, …, Pn has acquired its image data or image charge, the image data is read out by readout circuitry 494 and then transferred to function logic 496. In various examples, readout circuitry 494 may include HDR processing circuitry, amplification circuitry, analog-to-digital conversion (ADC) circuitry, or otherwise, as discussed above. Function logic 496 may simply store the image data or even manipulate the image data by applying post-image effects (e.g., HDR processing, tone mapping, cropping, rotating, removing red-eye, adjusting brightness, adjusting contrast, or otherwise). In one example, readout circuitry 494 may read out a row of image data at a time along readout column lines (as illustrated) or may read out the image data using a variety of other techniques (not illustrated), such as serial readout or a full parallel readout of all pixels simultaneously.
In one example, control circuitry 498 is coupled to HDR image sensor 492 to control the operating characteristics of image sensor 492. For example, control circuitry 498 may generate a shutter signal for controlling image acquisition. In one example, the shutter signal is a global shutter signal used to simultaneously enable all pixels within image sensor 492 to capture their respective image data during a single acquisition window. In another example, the shutter signal is a rolling shutter signal such that each row of pixels, each column of pixels, or each group of pixels is sequentially enabled during successive acquisition windows.
The above description of illustrated examples of the invention, including what is described in the summary, is not intended to be exhaustive or to limit the invention to the precise forms disclosed. While specific embodiments of, and examples for, the invention are described herein for illustrative purposes, various equivalent modifications can be made without departing from the broader spirit and scope of the invention. Indeed, it should be understood that the specific example voltages, currents, frequencies, power range values, times, etc., are provided for purposes of explanation, and that other values may also be employed in other embodiments and examples in accordance with the teachings of this disclosure.
These modifications can be made to the examples of the invention in light of the above detailed description. The terms used in the following claims should not be construed to limit the invention to the specific embodiments disclosed in the specification and the claims. Rather, the scope is to be determined entirely by the following claims, which are to be construed in accordance with established doctrines of claim interpretation. The specification and drawings are, accordingly, to be regarded in an illustrative rather than a restrictive sense.

Claims (9)

1. A method of reading pixel data from a pixel array comprising a plurality of pixels organized into a plurality of pixel regions arranged in a pattern in the pixel array, the method comprising:
exposing each of a plurality of pixel regions for a respective one of a plurality of exposure times;
reading pixel data from the plurality of pixel regions;
determining, for each respective one of the plurality of exposure times, an exposure ratio for each of the plurality of pixel regions, wherein the exposure ratio for each respective one of the plurality of exposure times for each of the plurality of pixel regions is equal to a respective one of the plurality of exposure times divided by the exposure time for the one of the plurality of pixel regions;
for each respective one of the plurality of exposure times, replacing the pixel data of each of the plurality of pixel regions whose exposure ratio is less than a first threshold with the pixel data from the one of the plurality of pixel regions multiplied by the exposure ratio of the one of the plurality of pixel regions to produce an image for each respective one of the plurality of exposure times in a plurality of images;
for each respective one of the plurality of exposure times, replacing the pixel data of each of the plurality of pixel regions whose exposure ratio is greater than a second threshold with an interpolation of the pixel data from the one of the plurality of pixel regions having the respective one of the plurality of exposure times to generate the image for each respective one of the plurality of exposure times in the plurality of images; and
combining the plurality of images to produce a high dynamic range image.
2. The method of claim 1, further comprising, for each respective one of the plurality of exposure times, replacing the pixel data in the plurality of pixel regions for which the exposure ratio is greater than the first threshold and less than the second threshold with a combination of the pixel data from the one of the plurality of pixel regions multiplied by the exposure ratio of the one of the plurality of pixel regions and the interpolation of the pixel data from the one of the plurality of pixel regions having the respective one of the plurality of exposure times.
3. The method of claim 1, wherein a pixel response of each of the plurality of pixels in the pixel array is linear with respect to exposure time.
4. The method of claim 1, wherein the combining the plurality of images comprises performing a tone mapping process on the plurality of images to compress a dynamic range of the high dynamic range image.
5. An imaging system, comprising:
a pixel array comprising a plurality of pixels, wherein the plurality of pixels are organized into a plurality of pixel regions arranged in a pattern in the pixel array;
control circuitry coupled to the pixel array to control operation of the pixel array, wherein the control circuitry is coupled to expose each of a plurality of pixel regions for a respective one of a plurality of exposure times; and
readout circuitry coupled to the pixel array to readout pixel data from the plurality of pixel regions, the imaging system coupled to:
determining, for each respective one of the plurality of exposure times, an exposure ratio for each of the plurality of pixel regions, wherein the exposure ratio for each respective one of the plurality of exposure times for each of the plurality of pixel regions is equal to a respective one of the plurality of exposure times divided by the exposure time for the one of the plurality of pixel regions;
for each respective one of the plurality of exposure times, replacing the pixel data of each of the plurality of pixel regions whose exposure ratio is less than a first threshold with the pixel data from the one of the plurality of pixel regions multiplied by the exposure ratio of the one of the plurality of pixel regions to produce an image for each respective one of the plurality of exposure times in a plurality of images;
for each respective one of the plurality of exposure times, replacing the pixel data of each of the plurality of pixel regions whose exposure ratio is greater than a second threshold with an interpolation of the pixel data from the one of the plurality of pixel regions having the respective one of the plurality of exposure times to generate the image for each respective one of the plurality of exposure times in the plurality of images; and
combining the plurality of images to produce a high dynamic range image.
6. The imaging system of claim 5, further comprising functional logic coupled to the readout circuitry to store high dynamic range image data read out from the plurality of pixels.
7. The imaging system of claim 5, wherein the imaging system is further coupled to replace, for each respective one of the plurality of exposure times, the pixel data in each of the plurality of pixel regions whose exposure ratio is greater than the first threshold and less than the second threshold with a combination of the pixel data from the one of the plurality of pixel regions multiplied by the exposure ratio of the one of the plurality of pixel regions and the interpolation of the pixel data from the one of the plurality of pixel regions having the respective one of the plurality of exposure times.
8. The imaging system of claim 5, wherein a pixel response of each of the plurality of pixels in the pixel array is linear with respect to exposure time.
9. The imaging system of claim 5, wherein the imaging system is further coupled to perform a tone mapping process on the plurality of images to compress a dynamic range of the high dynamic range image to combine the plurality of images.
HK15104765.8A 2013-05-20 2015-05-19 Methods of reading pixel data from a pixel array and an imaging system HK1204406B (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361825419P 2013-05-20 2013-05-20
US61/825,419 2013-05-20
US14/243,480 US9413992B2 (en) 2013-05-20 2014-04-02 High dynamic range image sensor with full resolution recovery
US14/243,480 2014-04-02

Publications (2)

Publication Number Publication Date
HK1204406A1 (en) 2015-11-13
HK1204406B (en) 2018-05-04


Similar Documents

Publication Publication Date Title
US9413992B2 (en) High dynamic range image sensor with full resolution recovery
US9325918B2 (en) Image processing apparatus, imaging apparatus, solid-state imaging device, image processing method and program
CN101816171B (en) Multi-exposure pattern for enhancing dynamic range of images
Jinno et al. Multiple exposure fusion for high dynamic range image acquisition
CN101911671A (en) Imaging device and optical axis control method
TW201044856A (en) Image restoration method and apparatus
JPH0364908B2 (en)
GB2575137A (en) An image sensor and an image dynamic information processing method
TWI589156B (en) System and method for hdr imaging
US10715723B2 (en) Image processing apparatus, image acquisition system, image processing method, and image processing program
JP2014045352A (en) Image processing apparatus, method, and program, and image pickup apparatus having image processing apparatus
US20150002689A1 (en) Method, apparatus, and manufacture for enhanced resolution for images from high dynamic range (hdr) interlaced sensors
US20130033622A1 (en) Method and apparatus for motion artifact correction in hdr video
Wang et al. Neural global shutter: Learn to restore video from a rolling shutter camera with global reset feature
US9826174B2 (en) Image processing apparatus and method
Bätz et al. Multi-image super-resolution using a locally adaptive denoising-based refinement
CN101305400A (en) Image processing method and system
Choi et al. Super‐resolution approach to overcome physical limitations of imaging sensors: An overview
US9979908B2 (en) Image processing devices and image processing methods with interpolation for improving image resolution
CN103634529A (en) Raw data processing apparatus, raw data processing method and imaging device
HK1204406B (en) Methods of reading pixel data from a pixel array and an imaging system
US9800796B1 (en) Apparatus and method for low dynamic range and high dynamic range image alignment
CN117859149A (en) High dynamic range imaging apparatus and method of generating high dynamic range image
US20250267375A1 (en) Imaging device and operating method thereof
Li et al. Cross image cubic interpolator for spatially varying exposures