
WO2018124054A1 - Imaging device and method for controlling same - Google Patents


Info

Publication number
WO2018124054A1
Authority
WO
WIPO (PCT)
Prior art keywords
imaging apparatus
target frame
exposure
image
control unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/046598
Other languages
French (fr)
Japanese (ja)
Inventor
弘明 関東
健夫 南
國末 勝次
亨治 石井
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Intellectual Property Management Co Ltd
Original Assignee
Panasonic Intellectual Property Management Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Intellectual Property Management Co Ltd filed Critical Panasonic Intellectual Property Management Co Ltd
Publication of WO2018124054A1

Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B7/00Control of exposure by setting shutters, diaphragms or filters, separately or conjointly
    • G03B7/08Control effected solely on the basis of the response, to the intensity of the light received by the camera, of a built-in light-sensitive device
    • G03B7/091Digital circuits
    • G03B7/093Digital circuits for control of exposure time
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N25/00Circuitry of solid-state image sensors [SSIS]; Control thereof
    • H04N25/50Control of the SSIS exposure
    • H04N25/53Control of the integration time
    • H04N25/532Control of the integration time by controlling global shutters in CMOS SSIS

Definitions

  • the present disclosure relates to an imaging apparatus and a control method thereof.
  • A technique described in Patent Document 1 is known as an image sensor using an organic photoelectric conversion element.
  • Further improvement of such an imaging apparatus is desired.
  • an object of the present disclosure is to provide an imaging apparatus or a control method thereof that can realize further improvement.
  • An imaging apparatus includes an imaging element capable of nondestructive readout, and a control unit that controls imaging of the target frame using an auxiliary image obtained by nondestructive readout in the target frame.
  • the present disclosure can provide an imaging apparatus or a control method thereof that can realize further improvement.
  • FIG. 1 is a block diagram of the imaging apparatus according to the first embodiment.
  • FIG. 2A is a diagram illustrating an appearance example of the imaging apparatus according to Embodiment 1.
  • FIG. 2B is a diagram illustrating an appearance example of the imaging apparatus according to Embodiment 1.
  • FIG. 3 is a diagram illustrating a configuration of the image sensor according to the first embodiment.
  • FIG. 4 is a circuit diagram illustrating a configuration of a pixel according to Embodiment 1.
  • FIG. 5 is a flowchart illustrating the operation of the imaging apparatus according to the first embodiment.
  • FIG. 6 is a diagram illustrating the operation of the imaging apparatus according to the first embodiment.
  • FIG. 7 is a flowchart illustrating the operation of the imaging apparatus according to the second embodiment.
  • FIG. 8 is a diagram illustrating the operation of the imaging apparatus according to the second embodiment.
  • FIG. 9 is a diagram illustrating an example of an auxiliary image according to the second embodiment.
  • FIG. 10 is a diagram illustrating an example of a threshold setting screen according to the second embodiment.
  • FIG. 11 is a flowchart illustrating the operation of the imaging apparatus according to the third embodiment.
  • FIG. 12 is a diagram for explaining subject change determination processing according to the third embodiment.
  • FIG. 13 is a flowchart illustrating a modified example of the operation of the imaging apparatus according to the third embodiment.
  • FIG. 14 is a flowchart illustrating a modified example of the operation of the imaging apparatus according to the third embodiment.
  • FIG. 15 is a flowchart illustrating a modified example of the operation of the imaging apparatus according to the third embodiment.
  • FIG. 16 is a flowchart illustrating the operation of the imaging apparatus according to the fourth embodiment.
  • FIG. 17 is a diagram illustrating the operation of the imaging apparatus according to the fourth embodiment.
  • FIG. 18 is a diagram illustrating an example of an image selection screen according to the fourth embodiment.
  • FIG. 19 is a diagram illustrating the operation of the imaging apparatus according to the modification of the fourth embodiment.
  • FIG. 1 is a block diagram illustrating a configuration of an imaging apparatus 100 according to the present embodiment.
  • FIGS. 2A and 2B are diagrams illustrating an example of the appearance of the imaging apparatus 100.
  • the imaging apparatus 100 is a camera such as a digital still camera or a digital video camera.
  • the imaging device 101 is a solid-state imaging device (image sensor) that converts incident light into an electrical signal (image) and outputs the obtained electrical signal.
  • the imaging device 101 is an organic sensor using an organic photoelectric conversion device.
  • the control unit 102 controls the image sensor 101.
  • the control unit 102 performs various signal processing on the image obtained by the imaging element 101, and displays the obtained image on the display unit 103 or stores it in the storage unit 104.
  • the image output from the control unit 102 may be output to the outside of the imaging apparatus 100 via an input / output interface (not shown).
  • the control unit 102 is a circuit that performs information processing and is a circuit that can access the storage unit 104.
  • the control unit 102 is realized by a processor such as a DSP (Digital Signal Processor) or a GPU (Graphics Processing Unit).
  • the control unit 102 may be a dedicated or general-purpose electronic circuit.
  • the control unit 102 may be an aggregate of a plurality of electronic circuits.
  • the display unit 103 displays an image obtained by the image sensor 101, a user interface, and the like.
  • the display unit 103 is a display panel provided on the back of the camera, and is a liquid crystal panel, an organic EL (electroluminescence) panel, or the like.
  • the display unit 103 may be an electronic viewfinder (EVF) provided on the upper part of the camera. In this case, the image of the liquid crystal panel or the organic EL panel is projected through the lens.
  • the display unit 103 may be an external monitor connected to the imaging apparatus 100 via an interface such as HDMI (registered trademark), USB (registered trademark), or Wi-Fi (registered trademark).
  • the storage unit 104 is a general purpose or dedicated memory in which information is stored.
  • the storage unit 104 may be a magnetic disk or an optical disk, or may be expressed as a storage or a recording medium.
  • the storage unit 104 may be a non-volatile memory or a volatile memory.
  • the storage unit 104 is not limited to a memory built in the imaging apparatus 100, and may be a memory attached to the imaging apparatus 100.
  • the storage unit 104 may be an SD card or the like.
  • the storage unit 104 may be a combination of these plural types of memories.
  • the imaging system 106 includes, for example, one or a plurality of lenses, and condenses light from the outside of the imaging apparatus 100 on the imaging element 101.
  • the light shielding unit 107 is, for example, a mechanical shutter, and shields light from the imaging system 106.
  • FIG. 3 is a block diagram illustrating a configuration of the image sensor 101.
  • the image sensor 101 shown in FIG. 3 includes a plurality of pixels (unit pixel cells) 201 arranged in a matrix, a vertical scanning unit 202, a column signal processing unit 203, and a horizontal readout unit 204, together with control lines arranged per row and signal lines arranged per column.
  • Each of the plurality of pixels 201 outputs a signal corresponding to the incident light to the vertical signal line 207 provided in the corresponding column.
  • the vertical scanning unit 202 resets the plurality of pixels 201 via the plurality of reset control lines 205.
  • the vertical scanning unit 202 sequentially selects the plurality of pixels 201 in units of rows via the plurality of address control lines 206.
  • the column signal processing unit 203 performs signal processing on the signals output to the plurality of vertical signal lines 207, and outputs the plurality of signals obtained by the signal processing to the horizontal reading unit 204.
  • the column signal processing unit 203 performs noise suppression signal processing represented by correlated double sampling, analog / digital conversion processing, and the like.
  • the horizontal readout unit 204 sequentially outputs a plurality of signals after the signal processing by the plurality of column signal processing units 203 to the horizontal output terminal 208.
  • FIG. 4 is a circuit diagram illustrating a configuration of the pixel 201.
  • the pixel 201 includes a photoelectric conversion unit 211, a charge storage unit 212, a reset transistor 213, an amplification transistor 214 (source follower transistor), and a selection transistor 215.
  • the photoelectric conversion unit 211 generates signal charges by photoelectrically converting incident light. A voltage Voe is applied to one end of the photoelectric conversion unit 211.
  • the photoelectric conversion unit 211 includes a photoelectric conversion layer made of an organic material.
  • the photoelectric conversion layer may include a layer made of an organic material and a layer made of an inorganic material.
  • the charge storage unit 212 is connected to the photoelectric conversion unit 211 and stores the signal charge generated by the photoelectric conversion unit 211. Note that the charge storage unit 212 may be configured with a parasitic capacitance such as a wiring capacitance instead of a dedicated capacitance element.
  • the reset transistor 213 is used to reset the potential of the signal charge.
  • the gate of the reset transistor 213 is connected to the reset control line 205, the source is connected to the charge storage unit 212, and the reset voltage Vreset is applied to the drain.
  • which terminal serves as the drain and which as the source generally depends on circuit operation, and often cannot be determined from the element structure alone.
  • here, one of the two terminals is referred to as the source and the other as the drain.
  • the designations of drain and source may therefore be interchanged.
  • the amplification transistor 214 amplifies the voltage of the charge storage unit 212 and outputs a signal corresponding to the voltage to the vertical signal line 207.
  • the gate of the amplification transistor 214 is connected to the charge storage unit 212, and the power supply voltage Vdd or the ground voltage Vss is applied to the drain.
  • the selection transistor 215 is connected in series with the amplification transistor 214, and switches whether to output the signal amplified by the amplification transistor 214 to the vertical signal line 207.
  • the selection transistor 215 has a gate connected to the address control line 206, a drain connected to the source of the amplification transistor 214, and a source connected to the vertical signal line 207.
  • the voltage Voe, the reset voltage Vreset, and the power supply voltage Vdd are voltages commonly used in all the pixels 201.
  • Non-destructive reading is a process of reading image data during an exposure period and continuing exposure.
  • conventional readout, in which the stored charge is reset upon reading, is hereinafter referred to as destructive readout.
  • with nondestructive readout, the image data exposed up to that point can be read during the exposure period while the exposure continues. Thus, a plurality of images having different exposure times can be obtained from a single exposure.
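The behavior described above can be illustrated with a minimal pixel model (a hypothetical sketch, not the patent's circuit): a nondestructive read returns the accumulated charge without clearing it, while a destructive read also resets it.

```python
# Hypothetical pixel model illustrating nondestructive vs. destructive readout.
class Pixel:
    def __init__(self):
        self.charge = 0.0

    def expose(self, photons):
        self.charge += photons                  # accumulate signal charge

    def read_nondestructive(self):
        return self.charge                      # charge is preserved

    def read_destructive(self):
        value, self.charge = self.charge, 0.0   # read, then reset
        return value

px = Pixel()
snapshots = []
for _ in range(3):                              # three sub-exposures in one frame
    px.expose(10.0)
    snapshots.append(px.read_nondestructive())

final = px.read_destructive()                   # end of the frame
print(snapshots, final)                         # [10.0, 20.0, 30.0] 30.0
```

As in the text, a single exposure yields several images with different effective exposure times (the snapshots) in addition to the final image.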
  • the electronic ND control is a process for electrically controlling the transmittance of the image sensor.
  • the transmittance here means the proportion of the incident light that is converted into an electrical signal. That is, by setting the transmittance to 0%, the light can be shielded electrically.
  • the transmittance is controlled by controlling the voltage Voe shown in FIG. Thereby, exposure can be electrically terminated without using a mechanical shutter.
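Electronic ND control can be modeled as a time-varying gain on the incident light. The sketch below assumes, purely for illustration, that the transmittance is directly settable per time segment; the actual Voe-to-transmittance characteristic is device-specific and not given in the text.

```python
# Hedged sketch: signal accumulated under a transmittance schedule.
def accumulated_signal(light_level, schedule):
    """schedule: list of (duration_s, transmittance in [0, 1]) segments."""
    return sum(light_level * tr * duration for duration, tr in schedule)

# Expose at full transmittance for 1/60 s, then set transmittance to 0,
# which ends the exposure electrically without a mechanical shutter:
signal = accumulated_signal(light_level=100.0,
                            schedule=[(1 / 60, 1.0), (1 / 60, 0.0)])
```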
  • the imaging apparatus 100 includes a mechanical shutter (light shielding unit 107); electronic ND control and light shielding by the mechanical shutter may be used together, or light shielding by the mechanical shutter may be used alone without electronic ND control.
  • the image sensor 101 is an organic sensor.
  • the image sensor 101 only needs to realize nondestructive reading or electronic ND control, and may be other than an organic sensor.
  • the photoelectric conversion layer included in the photoelectric conversion unit 211 may be made of an inorganic material.
  • the photoelectric conversion layer may be made of amorphous silicon or chalcopyrite semiconductor.
  • the imaging apparatus 100 controls photographing of the target frame using an auxiliary image obtained by nondestructive readout in the target frame. Specifically, the imaging apparatus 100 performs automatic exposure (AE: Auto Exposure) of the target frame using the auxiliary image.
  • FIG. 5 is a flowchart showing an operation flow of the imaging apparatus 100.
  • FIG. 6 is a diagram for explaining the operation of the imaging apparatus 100.
  • in the period before time t1, at which still image capturing starts, the imaging apparatus 100 captures a live image, which is a real-time moving image of the subject, and displays it on the display unit 103.
  • the imaging apparatus 100 starts exposure for capturing a still image (S101).
  • the imaging apparatus 100 acquires one or a plurality of auxiliary images by performing non-destructive readout a predetermined number of times during the exposure period (S102). In FIG. 6, two nondestructive readings are performed, but the number of nondestructive readings may be arbitrary.
  • the imaging apparatus 100 performs AE control using the obtained auxiliary image (S103).
  • the imaging apparatus 100 can perform the AE control using the auxiliary image by the same method as the AE control using the live image.
  • the imaging apparatus 100 controls the exposure time T1 of the target frame using the auxiliary image.
  • the imaging apparatus 100 detects the movement of the subject based on the difference between the plurality of auxiliary images, and controls the exposure time T1 based on the obtained movement.
  • the imaging apparatus 100 sets the exposure time T1 to be shorter as the movement is larger.
  • the imaging apparatus 100 controls the exposure time T1 based on the luminance level of one or more auxiliary images. For example, the imaging apparatus 100 sets the exposure time T1 longer as the luminance level is lower.
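The two AE rules above can be sketched as follows. The function names and the specific scaling are assumptions for illustration only, not the patent's actual algorithm.

```python
# Illustrative AE sketch: larger motion -> shorter exposure,
# lower luminance -> longer exposure.
def adjust_exposure(base_time, motion, luminance, target_luminance):
    t = base_time / (1.0 + motion)          # shorten for moving subjects
    if luminance > 0:
        t *= target_luminance / luminance   # lengthen for dark scenes
    return t

# Motion can be estimated from the difference between two auxiliary images
# (here, flat lists of pixel luminances):
def mean_abs_diff(img_a, img_b):
    return sum(abs(a - b) for a, b in zip(img_a, img_b)) / len(img_a)
```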
  • the imaging apparatus 100 ends the exposure (S104). For example, the imaging apparatus 100 ends the exposure by shielding light using the electronic ND control described above or the mechanical shutter.
  • the imaging apparatus 100 acquires an image by performing destructive reading (S105).
  • in FIG. 6, destructive readout is performed immediately after the end of exposure, but it need not be performed immediately after.
  • the imaging apparatus 100 controls shooting of the target frame using the auxiliary image obtained by nondestructive reading in the target frame. Specifically, the imaging apparatus 100 performs automatic exposure of the target frame using the auxiliary image.
  • the accuracy of automatic exposure can be improved.
  • the method of the present embodiment is different from the conventional method in that the image used for AE control is changed from a live image to an auxiliary image obtained by nondestructive reading. Therefore, more accurate automatic exposure can be realized while diverting part of the conventional automatic exposure algorithm.
  • FIG. 7 is a flowchart showing an operation flow of the imaging apparatus 100 according to the present embodiment.
  • FIG. 8 is a diagram for explaining the operation of the imaging apparatus 100.
  • the imaging apparatus 100 determines a threshold value TH used for determination processing described later (S111). Details of this process will be described later.
  • the imaging apparatus 100 starts exposure at time t1 (S112).
  • the imaging device 100 acquires an auxiliary image by nondestructive reading (S113).
  • the imaging apparatus 100 determines whether the luminance of the obtained auxiliary image exceeds the threshold value TH set in step S111 (S114).
  • when the luminance of the auxiliary image does not exceed the threshold value TH (No in S114), the imaging apparatus 100 performs nondestructive readout again at time t3, after time T2 has elapsed (S113), and determines whether the luminance of the newly obtained auxiliary image exceeds the threshold value TH (S114). The imaging apparatus 100 repeats these processes until the luminance of the auxiliary image exceeds the threshold value TH.
  • when the luminance of the auxiliary image exceeds the threshold value TH (Yes in S114), the imaging apparatus 100 ends the exposure by electronic ND control or the mechanical shutter (S115), and acquires an image by destructive readout (S116: time t6). Note that nondestructive readout may be performed instead of destructive readout.
  • FIG. 9 is a diagram illustrating an example of an auxiliary image and a luminance histogram obtained at each time.
  • the horizontal axis of the luminance histogram shown in FIG. 9 indicates luminance, and the vertical axis indicates the frequency of each luminance.
  • the auxiliary images obtained at the respective times have different luminance levels because their exposure times differ: the later an auxiliary image is obtained, the higher its luminance level.
  • the imaging apparatus 100 determines, for example, whether the maximum luminance value of the auxiliary image exceeds the threshold value TH. Note that the imaging apparatus 100 may compare the average value or median value of the luminance with the threshold value TH without being limited to the maximum luminance value.
  • the imaging apparatus 100 ends the exposure immediately after it is determined that the luminance of the auxiliary image exceeds the threshold value TH.
  • the exposure may end after a predetermined time.
  • the threshold value TH is set lower by a difference corresponding to the time.
  • the interval T2 at which nondestructive reading is performed is constant, but it may not be constant.
  • alternatively, the imaging apparatus 100 may set an exposure time in advance by AE control or the like, and set a plurality of nondestructive readout timings such that the readout interval becomes shorter as the set exposure time approaches its end.
  • the imaging apparatus 100 may shorten the nondestructive reading interval as the luminance of the obtained auxiliary image approaches the threshold value TH.
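One way to realize this shortening is sketched below, under the assumption of a simple linear rule; the text does not specify a particular rule.

```python
# Assumed rule: the next readout interval shrinks in proportion to the
# remaining gap between the current luminance and the threshold TH.
def next_read_interval(luminance, th, base_interval, min_interval):
    remaining = max(th - luminance, 0.0) / th   # fraction of TH still to go
    return max(base_interval * remaining, min_interval)

print(next_read_interval(10.0, 100.0, 1.0, 0.05))   # far from TH: long interval
print(next_read_interval(95.0, 100.0, 1.0, 0.05))   # near TH: read frequently
```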
  • the time from the end of exposure to the time when destructive readout is performed may be arbitrary.
  • the imaging apparatus 100 sequentially performs non-destructive reading a plurality of times in the target frame, and stops exposure of the target frame when the luminance of the auxiliary image exceeds a predetermined threshold value TH.
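This stop condition can be simulated as a polling loop. The following is an idealized sketch with assumed numbers; real luminance growth depends on the scene.

```python
# Poll nondestructive reads every read_interval until luminance exceeds th,
# then end the exposure (corresponding to S113/S114/S115 above).
def expose_until_threshold(read_luminance, th, read_interval, max_time):
    t = 0.0
    while t < max_time:
        t += read_interval
        if read_luminance(t) > th:   # check the auxiliary image's luminance
            return t                 # end exposure at this point
    return max_time                  # fall back to the full exposure period

# Assume luminance grows linearly at 20 units/s for this illustration:
stop = expose_until_threshold(lambda t: 20.0 * t, th=100.0,
                              read_interval=0.5, max_time=10.0)
print(stop)   # 5.5 -> first read whose luminance (110) exceeds TH = 100
```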
  • in conventional exposure control, the exposure period is determined before the image is captured, so an image having the desired luminance level is not always obtained.
  • the imaging apparatus 100 determines the threshold value TH according to the currently set shooting mode.
  • the imaging apparatus 100 may display a user interface for setting the threshold value TH on the display unit 103 and determine the threshold value TH based on a user instruction given via the user interface.
  • FIG. 10 is a diagram showing an example of this user interface.
  • a threshold TH corresponding to the sample image is set by the user selecting one of the sample images.
  • the sample image may be an image stored in advance or an auxiliary image obtained by nondestructive reading at the time of past shooting.
  • FIG. 10 shows an example in which the user selects the sample image, but the brightness level may be selected using an operation bar or the like for changing the brightness level. Alternatively, an operation of increasing or decreasing the luminance level with respect to the current luminance level may be used.
  • the user can intuitively and easily perform a brightness level selection operation.
  • threshold value TH need not be variable, and a fixed value may be used as the threshold value TH.
  • the imaging apparatus 100 sequentially performs a plurality of nondestructive readouts in a target frame, and detects a change in the subject using the plurality of auxiliary images obtained by those readouts. For example, the imaging apparatus 100 detects an undesired change in the subject, such as one caused by a disturbance to the camera or by the tripod tipping over, during a long exposure of a night scene such as a starry sky.
  • FIG. 11 is a flowchart showing an operation flow of the imaging apparatus 100 according to the present embodiment.
  • FIG. 12 is a diagram illustrating an example of a change in luminance of the auxiliary image when a change in the subject occurs.
  • the imaging apparatus 100 starts exposure (S121). At time t1 after the predetermined time T3 has elapsed, the imaging device 100 acquires an auxiliary image by nondestructive reading (S122). Then, the imaging apparatus 100 determines whether the subject has changed using a plurality of auxiliary images obtained after the start of exposure (S123). If it is determined that there is no change in the subject (No in S123) and the exposure period has not ended (No in S125), the imaging apparatus 100 again performs nondestructive reading at time t2 after the time T3 has elapsed. (S122), it is determined again whether the subject has changed using the plurality of auxiliary images obtained so far (S123).
  • if it is determined that the subject has changed (Yes in S123), the imaging apparatus 100 issues a warning (S124). For example, the imaging apparatus 100 displays a message indicating that the subject has changed on the display unit 103. Note that the imaging apparatus 100 may instead notify the user by a warning sound or a voice message.
  • the imaging apparatus 100 repeats these series of processes until the exposure period ends (S125).
  • the imaging apparatus 100 ends the exposure and acquires an image by destructive reading (S126).
  • while the subject is unchanged, the luminance of the auxiliary image increases linearly with the exposure time. That is, the luminance increases with a constant slope.
  • when a change occurs in the subject, the luminance gradient changes.
  • FIG. 12 shows a case where an instantaneous disturbance has occurred. It should be noted that the inclination of the luminance changes (increases or decreases) when the shooting scene changes due to the tripod falling or when an undesired subject enters the screen.
  • the imaging apparatus 100 detects the luminance gradient using a plurality of auxiliary images obtained at a plurality of times. Specifically, the imaging apparatus 100 calculates the gradient from the luminance of the newly obtained auxiliary image and that of the auxiliary image obtained immediately before, and determines whether the difference between this gradient and the past gradient exceeds a predetermined threshold value. The imaging apparatus 100 determines that the subject has changed when the difference exceeds the threshold value, and that it has not changed when the difference does not exceed the threshold value.
  • the past inclination is calculated based on the luminance at time t3 and time t4.
  • the difference between the slope calculated based on the luminance at time t4 and time t5 and the past slope exceeds the threshold. As a result, a warning is issued.
  • the imaging apparatus 100 performs the above determination using, for example, the average luminance of the auxiliary image. Note that the imaging apparatus 100 may use the maximum luminance, or may use the average luminance or the maximum luminance of a specific area of the auxiliary image. The imaging apparatus 100 may perform the above determination for each region including one or more pixels, and may determine that the subject has changed when it is determined that any region has a change.
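The slope comparison described above can be sketched as follows; the helper name and the example threshold are illustrative assumptions.

```python
# Flag a subject change when the newest luminance slope differs from the
# previous slope by more than a threshold.
def subject_changed(samples, th):
    """samples: list of (time, mean_luminance) from nondestructive reads."""
    if len(samples) < 3:
        return False
    (t0, y0), (t1, y1), (t2, y2) = samples[-3:]
    past_slope = (y1 - y0) / (t1 - t0)
    new_slope = (y2 - y1) / (t2 - t1)
    return abs(new_slope - past_slope) > th

steady = [(0, 0.0), (1, 10.0), (2, 20.0)]      # constant slope: no change
disturbed = [(0, 0.0), (1, 10.0), (2, 45.0)]   # slope jumps: change detected
print(subject_changed(steady, th=5.0), subject_changed(disturbed, th=5.0))
```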
  • the imaging apparatus 100 sequentially performs a plurality of nondestructive readings in a target frame, and uses a plurality of auxiliary images obtained by the plurality of nondestructive readings in the target frame. Detect changes in the subject. Thereby, for example, since a change in the subject during the exposure of the long exposure can be detected, it is possible to notify the user of the shooting failure during the long exposure at an early stage. Therefore, it is possible to prevent the exposure from being performed to the end even though the shooting has failed.
  • the imaging apparatus 100 may stop shooting (S124A). As a result, the imaging apparatus 100 can automatically stop shooting when an abnormality occurs during, for example, a long exposure, thereby preventing the exposure from being carried out to the end even though the shooting has failed.
  • the imaging apparatus 100 may perform re-imaging (S124B). Thereby, the imaging apparatus 100 can automatically perform re-imaging when an abnormality occurs during, for example, long exposure.
  • when it is determined that there is a change in the subject, the imaging apparatus 100 may correct the image of the target frame obtained by shooting so as to reduce the influence of the change. Specifically, when it is determined that there is a change in the subject (Yes in S123), the imaging apparatus 100 stores change information regarding the change (S124C). Then, after acquiring the image of the target frame by destructive readout (S126), the imaging apparatus 100 corrects the image using the change information (S127).
  • the imaging apparatus 100 determines a change for each area including one or a plurality of pixels, and stores information indicating the changed area and the amount of change as change information. Then, the imaging apparatus 100 performs correction by subtracting or adding a luminance value corresponding to the amount of change with respect to the changed area.
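A region-wise correction of this kind might look like the following sketch; the region representation and the values are assumptions for illustration.

```python
# Subtract the stored change amount from each changed region of the final
# image (regions represented as a dict of region id -> mean luminance).
def apply_change_correction(image, change_info):
    corrected = dict(image)
    for region, delta in change_info.items():
        corrected[region] -= delta   # undo the disturbance's contribution
    return corrected

final_image = {"r0": 120.0, "r1": 95.0}
change_info = {"r1": 15.0}           # region r1 brightened by a disturbance
corrected = apply_change_correction(final_image, change_info)
print(corrected)                     # {'r0': 120.0, 'r1': 80.0}
```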
  • in the above description, the imaging apparatus 100 determines the change and stores the change information at each nondestructive readout. Alternatively, the determination and the storing of change information may be performed together after all the nondestructive readouts, or after the destructive readout, have been performed.
  • the imaging apparatus 100 can generate an image with reduced influence even when a change occurs in the subject.
  • Embodiment 4: In the present embodiment, another method for controlling the exposure time of a target frame using an auxiliary image will be described. Embodiments 1 and 2 described methods for automatically controlling the exposure time; this embodiment describes an example in which the user is presented with information for determining the exposure time and the exposure time is controlled based on the user's operation. For example, the method of the present embodiment can be applied during long exposure of a still image.
  • FIG. 16 is a flowchart showing an operation flow of the imaging apparatus 100 according to the present embodiment.
  • FIG. 17 is a diagram for explaining the operation of the imaging apparatus 100.
  • the imaging apparatus 100 starts exposure (S131). Next, the imaging apparatus 100 acquires an auxiliary image by nondestructive reading (S132). Next, the imaging apparatus 100 displays the obtained auxiliary image on the display unit 103 (S133).
  • after a predetermined time has elapsed, the imaging apparatus 100 performs nondestructive readout again (S132) and displays the newly obtained auxiliary image (S133).
  • nondestructive reading is performed at a predetermined cycle T4, and the read auxiliary images are sequentially displayed. Note that the interval at which nondestructive reading is performed may not be constant.
  • when an instruction to end the exposure is received, the imaging apparatus 100 stops the exposure (S135) and acquires an image by destructive readout (S137).
  • the display unit 103 displays the auxiliary image during the exposure period of the target frame. Further, the control unit 102 receives an instruction to end the exposure of the target frame during the exposure period of the target frame. As a result, the user can check the luminance level at any time with reference to the auxiliary image displayed during the exposure period. For this reason, the user can stop the exposure at a desired luminance level. Therefore, the user can easily take an image with a desired luminance level.
  • the method for instructing the end of exposure by the user is not particularly limited.
  • an instruction to end the exposure is accepted when the user presses the shutter button or operates the touch panel during the exposure period.
  • the user controls the exposure time T1 during the exposure period.
  • alternatively, the user may select an arbitrary image from the plurality of auxiliary images already obtained, either during or after the exposure period.
  • the imaging apparatus 100 displays an auxiliary image and stores the auxiliary image.
  • the imaging apparatus 100 displays a user interface for selecting one of a plurality of auxiliary images on the display unit 103 as illustrated in FIG.
  • auxiliary images obtained during exposure are sequentially stored in the storage unit 104. If exposure is in progress, a list of a plurality of auxiliary images obtained up to that timing is displayed. If it is after the exposure, a list of all auxiliary images obtained by the exposure is displayed.
  • the user can select an image having an arbitrary luminance level by selecting one auxiliary image via the interface.
  • the displayed image may include an image obtained by destructive readout.
  • the imaging apparatus 100 may delete auxiliary image data that has not been selected after the above selection. Further, the imaging apparatus 100 may end the exposure when a selection is made during the exposure.
  • the imaging apparatus 100 may sequentially perform a plurality of nondestructive readings in the target frame, and store the plurality of images obtained by those readings in the storage unit 104 as a moving image. Further, the imaging apparatus 100 may reproduce this moving image and display it on the display unit 103. That is, the imaging apparatus 100 stores the plurality of images obtained by the nondestructive readings in a moving image file format; for example, a publicly known format including a moving image encoding method using inter-frame prediction is used.
  • the stored moving image is played back automatically after the exposure is completed or based on a user operation.
  • Storing a plurality of images obtained by non-destructive readout in the moving image format in this way allows the shooting process itself during the exposure period to be stored as a moving image work.
  • with this method, it is possible to generate such a moving image work without performing still-image composition processing.
  • the amount of data can be reduced as compared to storing a plurality of still images.
  • the imaging apparatus 100 includes an imaging element 101 capable of nondestructive readout, and a control unit 102 that controls shooting of the target frame using an auxiliary image obtained by nondestructive readout in that frame.
  • since the imaging apparatus 100 can control shooting of the frame using the auxiliary image obtained by nondestructive readout, appropriate control can be performed according to the situation during the exposure period. In this way, the imaging apparatus 100 can realize further improvement.
  • control unit 102 may control the exposure time of the target frame using the auxiliary image.
  • since the imaging apparatus 100 can control the exposure time based on information from an image obtained during the exposure period, it can control the exposure time more appropriately than when the exposure time is controlled based on information obtained before the start of exposure.
  • control unit 102 may sequentially perform non-destructive reading a plurality of times in the target frame, and stop exposure of the target frame when the luminance of the auxiliary image exceeds a predetermined threshold.
  • the imaging apparatus 100 can appropriately control the exposure time based on the image information obtained during the exposure period.
  • the imaging apparatus 100 may further include a display unit 103, and the control unit 102 may further display a user interface for setting a threshold on the display unit 103.
  • control unit 102 may perform automatic exposure of the target frame using the auxiliary image.
  • since the imaging apparatus 100 can perform automatic exposure based on image information obtained during the exposure period, it can perform automatic exposure more appropriately than when automatic exposure is performed based on information obtained before the start of exposure.
  • control unit 102 may sequentially perform a plurality of nondestructive readings in the target frame, and detect a change in the subject in the target frame using the plurality of auxiliary images obtained by those readings.
  • the imaging apparatus 100 can detect, for example, a change in the subject during the long exposure using the auxiliary image obtained by nondestructive reading.
  • control unit 102 may issue a warning when a change in the subject is detected.
  • the imaging apparatus 100 can notify the user of the occurrence of an abnormality during long exposure, for example.
  • control unit 102 may stop shooting when a change in the subject is detected.
  • the imaging apparatus 100 can automatically stop photographing when, for example, an abnormality occurs during long exposure.
  • control unit 102 may perform re-photographing when a change in the subject is detected.
  • the imaging apparatus 100 can automatically perform re-imaging when, for example, an abnormality occurs during long exposure.
  • control unit 102 may perform correction for reducing the influence of the change in the subject on the image of the target frame obtained by shooting.
  • the imaging apparatus 100 can generate an image with reduced influence.
  • the imaging apparatus 100 may further include a display unit 103 that displays the auxiliary image during the exposure period of the target frame, and the control unit 102 may further accept an instruction to end the exposure of the target frame during that exposure period.
  • the user can take an image of a desired luminance level with reference to the auxiliary image displayed during the exposure period.
  • control unit 102 may sequentially perform a plurality of nondestructive readings in the target frame and store a plurality of images obtained by the nondestructive readings as a moving image.
  • the image sensor 101 may be an organic sensor.
  • the control method according to an aspect of the present disclosure is a control method for the imaging apparatus 100 including the imaging element 101 capable of nondestructive readout, and includes a control step of controlling shooting of the target frame using an auxiliary image obtained by nondestructive readout in that frame.
  • control method can control photographing of the frame by using the auxiliary image obtained by nondestructive reading, appropriate control can be performed according to the situation during the exposure period.
  • control method can realize further improvement.
  • the imaging device according to the embodiment of the present disclosure has been described above, but the present disclosure is not limited to this embodiment.
  • each processing unit included in the imaging apparatus is typically realized as an LSI, which is an integrated circuit. These may be individually formed into single chips, or a single chip may include some or all of the processing units.
  • circuits are not limited to LSI, and may be realized by a dedicated circuit or a general-purpose processor.
  • an FPGA (Field Programmable Gate Array) or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured may be used.
  • each component may be configured by dedicated hardware or may be realized by executing a software program suitable for each component.
  • Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.
  • the present disclosure may be realized as a control method executed by the imaging apparatus.
  • the circuit configuration shown in the circuit diagram is an example, and the present disclosure is not limited to it. That is, any circuit that can realize the characteristic functions of the present disclosure, like the circuit configuration described above, is also included in the present disclosure. Moreover, all the numbers used above are examples given for specifically explaining the present disclosure, and the present disclosure is not limited to those numbers.
  • the division of functional blocks in the block diagram is an example; a plurality of functional blocks may be realized as one functional block, a single functional block may be divided into a plurality of blocks, or some functions may be transferred to other functional blocks.
  • functions of a plurality of functional blocks having similar functions may be processed in parallel or in a time-division manner by single hardware or software.
  • the imaging device has been described based on the embodiment, but the present disclosure is not limited to this embodiment. Forms obtained by applying various modifications conceived by those skilled in the art to this embodiment, and forms constructed by combining components of different embodiments, may also be included within the scope of one or more aspects, as long as they do not deviate from the gist of the present disclosure.
  • the present disclosure can be applied to an imaging apparatus such as a digital still camera or a digital video camera.
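The threshold-based exposure control summarized in the list above (sequential nondestructive readouts that end the exposure once the auxiliary image's luminance reaches a user-set threshold) can be sketched as follows. The sensor interface is hypothetical, and the simulated sensor simply accumulates charge linearly with elapsed time; this is an illustration of the control flow, not the actual device API.

```python
import time

class SimulatedSensor:
    """Toy stand-in for a nondestructive-readout sensor (hypothetical
    interface): charge accumulates linearly with exposure time."""
    def __init__(self, rate):
        self.rate = rate            # luminance units per second
        self.t0 = None
        self.value = 0.0

    def start_exposure(self):
        self.t0 = time.monotonic()

    def read_nondestructive(self):
        # Read the accumulated signal; exposure continues afterwards.
        return [[self.rate * (time.monotonic() - self.t0)]]

    def stop_exposure(self):
        # Electronic ND control (transmittance 0%) or a mechanical shutter.
        self.value = self.rate * (time.monotonic() - self.t0)

    def read_destructive(self):
        # Destructive readout resets the accumulated charge.
        image, self.value = [[self.value]], 0.0
        return image

def mean_luminance(image):
    flat = [v for row in image for v in row]
    return sum(flat) / len(flat)

def expose_until_threshold(sensor, threshold, interval_s=0.01, max_reads=1000):
    """Keep exposing, nondestructively reading an auxiliary image each
    cycle, and end the exposure once the luminance reaches `threshold`."""
    sensor.start_exposure()
    for _ in range(max_reads):
        time.sleep(interval_s)
        if mean_luminance(sensor.read_nondestructive()) >= threshold:
            break
    sensor.stop_exposure()
    return sensor.read_destructive()

final = expose_until_threshold(SimulatedSensor(rate=100.0), threshold=1.0)
print(mean_luminance(final) >= 1.0)   # → True
```

In the actual apparatus the threshold would come from the threshold-setting user interface on the display unit 103, and `max_reads` bounds the exposure if the threshold is never reached.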

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Studio Devices (AREA)
  • Exposure Control For Cameras (AREA)
  • Transforming Light Signals Into Electric Signals (AREA)

Abstract

An imaging device (100) is provided with an imaging element (101) capable of nondestructive readout and a control unit (102) for controlling the imaging of a subject frame using an auxiliary image obtained by nondestructive readout in the subject frame. The control unit (102) may, for example, control the exposure time of the subject frame using the auxiliary image. The control unit (102) may, for example, successively perform nondestructive readout in the subject frame multiple times and stop exposure for the subject frame when the luminance of the auxiliary image exceeds a predetermined threshold.

Description

Imaging apparatus and control method thereof

The present disclosure relates to an imaging apparatus and a control method thereof.

As an image sensor using an organic photoelectric conversion element, the technique described in Patent Document 1 is known.

Patent Document 1: JP 2008-042180 A

Further improvement is desired in such an imaging apparatus.

Therefore, an object of the present disclosure is to provide an imaging apparatus, or a control method thereof, capable of realizing further improvement.

An imaging apparatus according to an aspect of the present disclosure includes an imaging element capable of nondestructive readout, and a control unit that controls shooting of a target frame using an auxiliary image obtained by nondestructive readout in the target frame.

The present disclosure can provide an imaging apparatus, or a control method thereof, capable of realizing further improvement.

FIG. 1 is a block diagram of the imaging apparatus according to Embodiment 1.
FIG. 2A is a diagram illustrating an appearance example of the imaging apparatus according to Embodiment 1.
FIG. 2B is a diagram illustrating an appearance example of the imaging apparatus according to Embodiment 1.
FIG. 3 is a diagram illustrating the configuration of the image sensor according to Embodiment 1.
FIG. 4 is a circuit diagram illustrating the configuration of a pixel according to Embodiment 1.
FIG. 5 is a flowchart illustrating the operation of the imaging apparatus according to Embodiment 1.
FIG. 6 is a diagram illustrating the operation of the imaging apparatus according to Embodiment 1.
FIG. 7 is a flowchart illustrating the operation of the imaging apparatus according to Embodiment 2.
FIG. 8 is a diagram illustrating the operation of the imaging apparatus according to Embodiment 2.
FIG. 9 is a diagram illustrating an example of an auxiliary image according to Embodiment 2.
FIG. 10 is a diagram illustrating an example of a threshold setting screen according to Embodiment 2.
FIG. 11 is a flowchart illustrating the operation of the imaging apparatus according to Embodiment 3.
FIG. 12 is a diagram for explaining subject change determination processing according to Embodiment 3.
FIG. 13 is a flowchart illustrating a modified example of the operation of the imaging apparatus according to Embodiment 3.
FIG. 14 is a flowchart illustrating a modified example of the operation of the imaging apparatus according to Embodiment 3.
FIG. 15 is a flowchart illustrating a modified example of the operation of the imaging apparatus according to Embodiment 3.
FIG. 16 is a flowchart illustrating the operation of the imaging apparatus according to Embodiment 4.
FIG. 17 is a diagram illustrating the operation of the imaging apparatus according to Embodiment 4.
FIG. 18 is a diagram illustrating an example of an image selection screen according to Embodiment 4.
FIG. 19 is a diagram illustrating the operation of the imaging apparatus according to a modification of Embodiment 4.

Hereinafter, embodiments of the present disclosure will be described with reference to the drawings. Each of the embodiments described below shows a preferred specific example of the present disclosure. Therefore, the numerical values, shapes, materials, components, component arrangement positions, connection forms, and the like shown in the following embodiments are merely examples and are not intended to limit the present disclosure. Accordingly, among the constituent elements in the following embodiments, those that are not described in the independent claims indicating the highest-level concept of the present disclosure are described as optional constituent elements.

Each figure is a schematic diagram and is not necessarily drawn strictly. In each figure, substantially the same components are denoted by the same reference signs, and overlapping description is omitted or simplified.

(Embodiment 1)
[Configuration of imaging device]
First, the configuration of the imaging apparatus according to the embodiment of the present disclosure will be described. FIG. 1 is a block diagram illustrating the configuration of an imaging apparatus 100 according to the present embodiment. FIGS. 2A and 2B are diagrams illustrating examples of the appearance of the imaging apparatus 100. For example, as illustrated in FIGS. 2A and 2B, the imaging apparatus 100 is a camera such as a digital still camera or a digital video camera.

The imaging apparatus 100 shown in FIG. 1 includes an imaging element 101, a control unit 102, a display unit 103, a storage unit 104, an imaging system 106, and a light shielding unit 107. The imaging element 101 is a solid-state imaging element (solid-state imaging device) that converts incident light into an electrical signal (image) and outputs the obtained electrical signal; for example, it is an organic sensor using an organic photoelectric conversion element.

The control unit 102 controls the imaging element 101. In addition, the control unit 102 performs various kinds of signal processing on the image obtained by the imaging element 101, and displays the resulting image on the display unit 103 or stores it in the storage unit 104. Note that the image output from the control unit 102 may be output to the outside of the imaging apparatus 100 via an input/output interface (not shown).

The control unit 102 is a circuit that performs information processing and that can access the storage unit 104. For example, the control unit 102 is realized by a processor such as a DSP (Digital Signal Processor) or a GPU (Graphics Processing Unit). The control unit 102 may be a dedicated or general-purpose electronic circuit, and may be an aggregate of a plurality of electronic circuits.

The display unit 103 displays images obtained by the imaging element 101, a user interface, and the like. For example, the display unit 103 is a display panel provided on the back of the camera, such as a liquid crystal panel or an organic EL (electroluminescence) panel. The display unit 103 may instead be an electronic viewfinder (EVF) provided on the upper part of the camera; in this case, the image on the liquid crystal panel or organic EL panel is projected through a lens. The display unit 103 may also be an external monitor connected to the imaging apparatus 100 via an interface such as HDMI (registered trademark), USB (registered trademark), or Wi-Fi (registered trademark).

The storage unit 104 is a general-purpose or dedicated memory in which information is stored. For example, the storage unit 104 may be a magnetic disk, an optical disk, or the like, and may also be expressed as storage or a recording medium. The storage unit 104 may be a nonvolatile memory or a volatile memory. It is not limited to a memory built into the imaging apparatus 100 and may be a memory attached to the imaging apparatus 100, such as an SD card. The storage unit 104 may also be a combination of these plural types of memories.

The imaging system 106 includes, for example, one or more lenses, and condenses light from outside the imaging apparatus 100 onto the imaging element 101.

The light shielding unit 107 is, for example, a mechanical shutter, and blocks the light from the imaging system 106.

[Configuration of image sensor]
Next, the configuration of the imaging element 101 will be described. FIG. 3 is a block diagram illustrating the configuration of the imaging element 101.

The imaging element 101 shown in FIG. 3 includes a plurality of pixels (unit pixel cells) 201 arranged in a matrix, a vertical scanning unit 202, a column signal processing unit 203, a horizontal readout unit 204, a plurality of reset control lines 205 provided one per row, a plurality of address control lines 206 provided one per row, a plurality of vertical signal lines 207 provided one per column, and a horizontal output terminal 208.

Each of the plurality of pixels 201 outputs a signal corresponding to the incident light to the vertical signal line 207 provided in the corresponding column.

The vertical scanning unit 202 resets the plurality of pixels 201 via the plurality of reset control lines 205, and sequentially selects the pixels 201 in units of rows via the plurality of address control lines 206.

The column signal processing unit 203 performs signal processing on the signals output to the plurality of vertical signal lines 207, and outputs the processed signals to the horizontal readout unit 204. For example, the column signal processing unit 203 performs noise suppression signal processing represented by correlated double sampling, analog/digital conversion processing, and the like.

The horizontal readout unit 204 sequentially outputs to the horizontal output terminal 208 the plurality of signals processed by the column signal processing unit 203.
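The readout chain described above (row selection by the vertical scanning unit, column-parallel signal processing, then serialization by the horizontal readout unit) can be modeled in miniature as follows. The `digitize` helper and its gain value are hypothetical stand-ins for the column signal processing, not the device's actual transfer function.

```python
def digitize(analog, gain=4):
    # Stand-in for column signal processing: noise suppression
    # (e.g. correlated double sampling) followed by A/D conversion.
    return round(analog * gain)

def read_frame(pixel_array):
    """Toy model of the readout chain in FIG. 3: select one row at a
    time, process every column of the selected row in parallel, then
    serialize the processed values onto the output terminal."""
    serialized = []
    for row in pixel_array:                     # row selection via address control line
        processed = [digitize(v) for v in row]  # column-parallel processing
        serialized.extend(processed)            # horizontal readout
    return serialized

print(read_frame([[0.5, 1.0], [1.5, 2.0]]))     # → [2, 4, 6, 8]
```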

Hereinafter, the configuration of the pixel 201 will be described. FIG. 4 is a circuit diagram illustrating the configuration of the pixel 201.

As shown in FIG. 4, the pixel 201 includes a photoelectric conversion unit 211, a charge storage unit 212, a reset transistor 213, an amplification transistor 214 (source follower transistor), and a selection transistor 215.

The photoelectric conversion unit 211 generates signal charges by photoelectrically converting incident light. A voltage Voe is applied to one end of the photoelectric conversion unit 211. Specifically, the photoelectric conversion unit 211 includes a photoelectric conversion layer made of an organic material. This photoelectric conversion layer may include both a layer made of an organic material and a layer made of an inorganic material.

The charge storage unit 212 is connected to the photoelectric conversion unit 211 and stores the signal charge generated by the photoelectric conversion unit 211. Note that the charge storage unit 212 may be constituted by a parasitic capacitance such as a wiring capacitance rather than a dedicated capacitive element.

The reset transistor 213 is used to reset the potential of the signal charge. The gate of the reset transistor 213 is connected to the reset control line 205, its source is connected to the charge storage unit 212, and a reset voltage Vreset is applied to its drain.

Note that the definitions of drain and source generally depend on circuit operation and often cannot be determined from the element structure. In this embodiment, for convenience, one of the source and the drain is called the source and the other is called the drain, but the drain and the source in this embodiment may be interchanged.

The amplification transistor 214 amplifies the voltage of the charge storage unit 212 and thereby outputs a signal corresponding to that voltage to the vertical signal line 207. The gate of the amplification transistor 214 is connected to the charge storage unit 212, and the power supply voltage Vdd or the ground voltage Vss is applied to its drain.

The selection transistor 215 is connected in series with the amplification transistor 214, and switches whether the signal amplified by the amplification transistor 214 is output to the vertical signal line 207. The gate of the selection transistor 215 is connected to the address control line 206, its drain is connected to the source of the amplification transistor 214, and its source is connected to the vertical signal line 207.

Further, for example, the voltage Voe, the reset voltage Vreset, and the power supply voltage Vdd are voltages used in common by all the pixels 201.

Characteristic features of the organic sensor include nondestructive readout and electronic ND (Neutral Density) control. Nondestructive readout is a process of reading out image data during the exposure period while continuing the exposure. With conventional readout (hereinafter referred to as destructive readout), exposure had to be ended in order to perform readout; that is, only one image could be obtained from one exposure. In contrast, by using nondestructive readout, the image data exposed up to a given time can be read out during the exposure period while the exposure continues. As a result, a plurality of images with different exposure times can be obtained from a single exposure.
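The difference between destructive and nondestructive readout can be illustrated with a toy charge-accumulation model. The class and its interface are invented for illustration and do not represent the actual sensor behavior.

```python
class AccumulatingPixel:
    """Toy model of one pixel's signal charge: nondestructive readout
    returns the charge without resetting it, while destructive readout
    resets it. Purely illustrative, not the device's actual behavior."""
    def __init__(self):
        self.charge = 0.0

    def expose(self, light, seconds):
        self.charge += light * seconds

    def read_nondestructive(self):
        return self.charge                  # charge left intact

    def read_destructive(self):
        value, self.charge = self.charge, 0.0
        return value

px = AccumulatingPixel()
snapshots = []
for _ in range(3):                          # one continuous exposure ...
    px.expose(light=2.0, seconds=1.0)
    snapshots.append(px.read_nondestructive())   # ... read mid-exposure

print(snapshots)                 # → [2.0, 4.0, 6.0]: three exposure times from one exposure
print(px.read_destructive())     # → 6.0; the charge is reset afterwards
```

The three snapshots correspond to images with exposure times of one, two, and three time units, all obtained from a single uninterrupted exposure.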

Electronic ND control is a process of electrically controlling the transmittance of the imaging element. Here, transmittance means the proportion of incident light that is converted into an electrical signal; by setting the transmittance to 0%, light can be blocked electrically. Specifically, the transmittance is controlled by controlling the voltage Voe shown in FIG. 4. This makes it possible to end the exposure electrically without using a mechanical shutter.
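The effect of electronic ND control on charge accumulation can be illustrated with a small model. In the actual device the transmittance is set via the voltage Voe; here it is simply a number between 0.0 and 1.0, an assumption made for illustration.

```python
class NDPixel:
    """Toy model of electronic ND control: the fraction of incident
    light converted to signal charge is scaled by a transmittance
    ratio (in the device this would be set via the voltage Voe)."""
    def __init__(self):
        self.charge = 0.0
        self.transmittance = 1.0    # 100 %

    def set_transmittance(self, ratio):
        self.transmittance = ratio  # 0.0 blocks the light electrically

    def expose(self, light, seconds):
        self.charge += light * seconds * self.transmittance

px = NDPixel()
px.expose(light=3.0, seconds=2.0)   # normal exposure: charge becomes 6.0
px.set_transmittance(0.0)           # end the exposure without a mechanical shutter
px.expose(light=3.0, seconds=5.0)   # further incident light adds no charge
print(px.charge)                    # → 6.0
```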

Note that the imaging apparatus 100 includes a mechanical shutter (the light shielding unit 107); electronic ND control and light shielding by the mechanical shutter may be used together, or light shielding by the mechanical shutter may be used without electronic ND control.

Although an example in which the imaging element 101 is an organic sensor is described here, the imaging element 101 need only realize nondestructive readout or electronic ND control, and may be something other than an organic sensor. That is, the photoelectric conversion layer included in the photoelectric conversion unit 211 may be made of an inorganic material; for example, it may be made of amorphous silicon, a chalcopyrite-based semiconductor, or the like.

[Operation of imaging device]
Next, the operation of the imaging apparatus 100 according to the present embodiment will be described. The imaging apparatus 100 according to the present embodiment controls shooting of a target frame using an auxiliary image obtained by nondestructive readout in that frame. Specifically, the imaging apparatus 100 performs automatic exposure (AE: Auto Exposure) of the target frame using the auxiliary image.

FIG. 5 is a flowchart showing the flow of the operation of the imaging apparatus 100. FIG. 6 is a diagram for explaining the operation of the imaging apparatus 100.

As illustrated in FIG. 6, in the period before time t1 at which still-image shooting starts, the imaging apparatus 100 shoots a live image, which is a real-time moving image of the subject, and displays the live image on the display unit 103.

At time t1, for example based on a user operation, the imaging apparatus 100 starts exposure for shooting a still image (S101). Next, the imaging apparatus 100 acquires one or more auxiliary images by performing nondestructive readout a predetermined number of times during the exposure period (S102). Although two nondestructive readouts are performed in FIG. 6, the number of nondestructive readouts may be arbitrary. Next, the imaging apparatus 100 performs AE control using the obtained auxiliary images (S103).

Conventionally, AE control has been performed using a live image; therefore, the imaging apparatus 100 can perform AE control using the auxiliary image by the same technique as AE control using a live image. Specifically, the imaging apparatus 100 controls the exposure time T1 of the target frame using the auxiliary image. For example, the imaging apparatus 100 detects the movement of the subject based on the difference between a plurality of auxiliary images and controls the exposure time T1 based on the detected movement, setting the exposure time T1 shorter as the movement becomes larger. Alternatively, the imaging apparatus 100 controls the exposure time T1 based on the luminance level of one or more auxiliary images, setting the exposure time T1 longer as the luminance level becomes lower.
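One possible form of such AE control is sketched below: T1 is shortened when the motion between auxiliary images is large and lengthened when the latest auxiliary image is dark. The constants, the normalization to 0.0-1.0 pixel values, and the formulas are invented examples, not values taken from the disclosure.

```python
def mean(img):
    flat = [v for row in img for v in row]
    return sum(flat) / len(flat)

def mean_abs_diff(a, b):
    diffs = [abs(x - y) for ra, rb in zip(a, b) for x, y in zip(ra, rb)]
    return sum(diffs) / len(diffs)

def choose_exposure_time(aux_images, base_time=1.0,
                         motion_gain=0.5, target_luma=0.5):
    """Pick the exposure time T1 from auxiliary images (pixel values
    assumed normalized to 0.0-1.0): darker scenes get a longer T1,
    larger inter-image motion gets a shorter T1."""
    motion = mean_abs_diff(aux_images[-2], aux_images[-1])
    luma = mean(aux_images[-1])
    t1 = base_time * (target_luma / max(luma, 1e-6))  # darker -> longer
    return t1 / (1.0 + motion_gain * motion)          # more motion -> shorter

dark_static   = choose_exposure_time([[[0.1]], [[0.1]]])  # dark, static subject
bright_moving = choose_exposure_time([[[0.5]], [[0.9]]])  # bright, moving subject
print(dark_static > bright_moving)   # → True
```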

Next, at time t4 when the exposure time T1 set in step S103 has elapsed, the imaging apparatus 100 ends the exposure (S104). For example, the exposure is ended by the electronic ND control described above or by light shielding with the mechanical shutter.

Next, the imaging apparatus 100 acquires an image by performing destructive readout (S105). Although this readout is performed immediately after the end of the exposure in FIG. 6, it does not have to be immediately after.

 As described above, the imaging apparatus 100 according to the present embodiment controls the shooting of a target frame using an auxiliary image obtained by nondestructive readout in that target frame. Specifically, the imaging apparatus 100 performs automatic exposure of the target frame using the auxiliary image. When automatic exposure is performed using a live image, there is a time lag between the automatic exposure and the actual shooting of the still image, which can reduce the accuracy of the automatic exposure. In contrast, the method of the present embodiment reduces this time lag and can therefore improve the accuracy of the automatic exposure.

 The method of the present embodiment differs from the conventional method in that the image used for AE control is changed from a live image to an auxiliary image obtained by nondestructive readout. More accurate automatic exposure can therefore be realized while reusing part of the conventional automatic exposure algorithm.

 Although the shooting of a still image has been described here as an example, a similar method may be used when shooting a moving image.

 (Embodiment 2)
 In the present embodiment, another method for controlling the exposure time of a target frame using auxiliary images will be described.

 FIG. 7 is a flowchart showing the operation flow of the imaging apparatus 100 according to the present embodiment. FIG. 8 is a diagram for explaining the operation of the imaging apparatus 100.

 First, the imaging apparatus 100 determines a threshold value TH to be used in the determination process described below (S111). Details of this process will be described later.

 Next, the imaging apparatus 100 starts exposure at time t1 (S112). At time t2, after a predetermined time has elapsed, the imaging apparatus 100 acquires an auxiliary image by nondestructive readout (S113). The imaging apparatus 100 then determines whether the luminance of the obtained auxiliary image exceeds the threshold value TH set in step S111 (S114).

 When the luminance of the auxiliary image does not exceed the threshold value TH (No in S114), the imaging apparatus 100 performs nondestructive readout again at time t3, after the time T2 has elapsed (S113), and determines whether the luminance of the newly obtained auxiliary image exceeds the threshold value TH (S114). The imaging apparatus 100 repeats these processes until the luminance of the auxiliary image exceeds the threshold value TH.

 When the luminance of the auxiliary image exceeds the threshold value TH (Yes in S114: time t5), the imaging apparatus 100 ends the exposure by electronic ND control or the mechanical shutter (S115), and acquires an image by destructive readout at time t6 (S116). Note that nondestructive readout may be performed instead of destructive readout.

 FIG. 9 is a diagram showing an example of the auxiliary images and luminance histograms obtained at the respective times. The horizontal axis of each luminance histogram in FIG. 9 indicates luminance, and the vertical axis indicates the frequency of each luminance value. As shown in FIG. 9, the auxiliary images obtained at the respective times have different luminance levels because their exposure times differ: the later an image is obtained, the higher its luminance level.

 In the determination in step S114, the imaging apparatus 100 determines, for example, whether the maximum luminance value of the auxiliary image exceeds the threshold value TH. The imaging apparatus 100 is not limited to the maximum luminance; it may instead compare the average or median luminance with the threshold value TH.
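The loop of steps S113 to S115 can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: the callback name and the use of the maximum luminance as the statistic are assumptions (the text notes the average or median may be used instead).

```python
def expose_until_threshold(read_nondestructive, threshold, max_reads=100):
    """Repeat nondestructive readout during exposure until the image
    brightness exceeds TH, then stop.

    read_nondestructive: callable that returns the currently accumulating
    image as a 2-D list of luminance values each time it is called,
    without resetting the pixels (i.e. a nondestructive readout).
    Returns (final_image, number_of_readouts).
    """
    for n in range(1, max_reads + 1):
        img = read_nondestructive()                 # S113
        peak = max(max(row) for row in img)         # statistic for S114
        if peak > threshold:
            return img, n                           # end exposure (S115)
    raise RuntimeError("threshold never reached within max_reads readouts")
```

In a camera, ending the exposure would additionally trigger the electronic ND control or mechanical shutter; here the loop simply returns.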

 In FIG. 8, the imaging apparatus 100 ends the exposure immediately after determining that the luminance of the auxiliary image exceeds the threshold value TH; however, the exposure may instead be ended after a predetermined time. In that case, the threshold value TH is set lower by an amount corresponding to that time.

 In FIG. 8, the interval T2 at which nondestructive readout is performed is constant, but it need not be. For example, the imaging apparatus 100 may set an exposure time in advance by AE control or the like, and set the timings of the nondestructive readouts so that the readout interval becomes shorter as the time approaches the end of that exposure time. Alternatively, the imaging apparatus 100 may shorten the nondestructive readout interval as the luminance of the obtained auxiliary images approaches the threshold value TH. The time from the end of the exposure until the destructive readout is performed may be arbitrary.

 Although the shooting of a still image has been described here as an example, a similar method may be used when shooting a moving image.

 As described above, the imaging apparatus 100 sequentially performs nondestructive readout a plurality of times in the target frame and stops the exposure of the target frame when the luminance of an auxiliary image exceeds the predetermined threshold value TH. In the conventional general method, the exposure period is determined before the image is captured, so an image with the desired luminance level is not always obtained. In contrast, the method of the present embodiment makes it possible to dynamically determine, during the exposure period, whether the luminance level of the image has reached the desired value, so an image with the desired luminance level can be obtained with high accuracy.

 As a method for determining the threshold value TH in step S111, the imaging apparatus 100 determines the threshold value TH according to, for example, the currently set shooting mode. Alternatively, the imaging apparatus 100 may display a user interface for setting the threshold value TH on the display unit 103 and determine the threshold value TH based on a user instruction given via that user interface. FIG. 10 is a diagram showing an example of this user interface.

 For example, as shown in FIG. 10, a plurality of sample images with different luminance levels are displayed, and when the user selects one of the sample images, the threshold value TH corresponding to that sample image is set. The sample images may be images stored in advance, or auxiliary images obtained by nondestructive readout during past shooting.

 Although FIG. 10 shows an example in which the user selects a sample image, the luminance level may instead be selected using an operation bar or the like for changing the luminance level. Alternatively, an operation of raising or lowering the luminance level relative to the current level may be used.

 By providing such a user interface, the user can select the luminance level intuitively and easily.

 Note that the threshold value TH need not be variable; a fixed value may be used as the threshold value TH.

 (Embodiment 3)
 In the present embodiment, another example of a process of controlling the shooting of a target frame using auxiliary images obtained by nondestructive readout in that target frame will be described. Specifically, the imaging apparatus 100 according to the present embodiment sequentially performs nondestructive readout a plurality of times in the target frame and detects a change of the subject within the target frame using the plurality of auxiliary images obtained by those readouts. For example, the imaging apparatus 100 detects an unwanted change of the subject caused by a disturbance to the camera or by the tripod falling over during a long exposure of, for example, a starry night sky.

 FIG. 11 is a flowchart showing the operation flow of the imaging apparatus 100 according to the present embodiment. FIG. 12 is a diagram showing an example of the change in the luminance of the auxiliary images when a change of the subject occurs.

 First, the imaging apparatus 100 starts exposure (S121). At time t1, after a predetermined time T3 has elapsed, the imaging apparatus 100 acquires an auxiliary image by nondestructive readout (S122). The imaging apparatus 100 then determines whether the subject has changed, using the plurality of auxiliary images obtained since the start of the exposure (S123). When it is determined that the subject has not changed (No in S123) and the exposure period has not ended (No in S125), the imaging apparatus 100 performs nondestructive readout again at time t2, after the time T3 has elapsed (S122), and again determines whether the subject has changed, using the plurality of auxiliary images obtained up to that point (S123).

 On the other hand, when it is determined that the subject has changed (Yes in S123), the imaging apparatus 100 issues a warning (S124). For example, the imaging apparatus 100 displays a message on the display unit 103 indicating that the subject has changed. The imaging apparatus 100 may instead notify the user that the subject has changed by a warning sound or a voice message.

 The imaging apparatus 100 repeats this series of processes until the exposure period ends (S125).

 When the exposure period has ended (Yes in S125), the imaging apparatus 100 ends the exposure and acquires an image by destructive readout (S126).

 A method for determining whether the subject has changed using a plurality of auxiliary images will now be described. As shown in FIG. 12, the luminance of the auxiliary images increases linearly as the exposure time increases; that is, the luminance increases with a constant slope. On the other hand, as shown at time t0, when a change of the subject occurs, the slope of the luminance changes. FIG. 12 shows a case where an instantaneous disturbance has occurred. The slope of the luminance also changes (increases or decreases) when, for example, the scene being shot changes because the tripod falls over, or when an unwanted subject enters the frame.

 Specifically, for example, the imaging apparatus 100 detects the slope of the luminance using the auxiliary images obtained at a plurality of times. The imaging apparatus 100 calculates a slope from the luminance of the newly obtained auxiliary image and the luminance of the auxiliary image obtained immediately before it, and determines whether the difference between this slope and a past slope exceeds a predetermined threshold. When the difference exceeds the threshold, the imaging apparatus 100 determines that the subject has changed; when it does not, the imaging apparatus 100 determines that the subject has not changed.

 For example, in the example shown in FIG. 12, before time t5 the past slope is calculated from, for example, the luminances at times t3 and t4. The difference between the slope calculated from the luminances at times t4 and t5 and that past slope then exceeds the threshold, and a warning is issued.

 The imaging apparatus 100 performs the above determination using, for example, the average luminance of each auxiliary image. The imaging apparatus 100 may instead use the maximum luminance, or the average or maximum luminance of a specific region of the auxiliary image. The imaging apparatus 100 may also perform the above determination for each region including one or more pixels, and determine that the subject has changed when any region is determined to have changed.

 As described above, the imaging apparatus 100 according to the present embodiment sequentially performs nondestructive readout a plurality of times in the target frame and detects a change of the subject within the target frame using the plurality of auxiliary images obtained by those readouts. This makes it possible to detect, for example, a change of the subject during a long exposure, and thus to notify the user of a shooting failure at an early stage. It is accordingly possible to prevent the exposure from being carried out to the end even though the shooting has failed.

 As shown in FIG. 13, when it is determined that the subject has changed (Yes in S123), the imaging apparatus 100 may stop the shooting (S124A). In this way, the imaging apparatus 100 can automatically stop the shooting when, for example, an abnormality occurs during a long exposure, preventing the exposure from being carried out to the end even though the shooting has failed.

 Alternatively, as shown in FIG. 14, when it is determined that the subject has changed (Yes in S123), the imaging apparatus 100 may perform reshooting (S124B). In this way, the imaging apparatus 100 can automatically reshoot when, for example, an abnormality occurs during a long exposure.

 Alternatively, as shown in FIG. 15, when it is determined that the subject has changed, the imaging apparatus 100 may correct the image of the target frame obtained by the shooting so as to reduce the influence of the change of the subject. Specifically, when it is determined that the subject has changed (Yes in S123), the imaging apparatus 100 stores change information regarding the change of the subject (S124C). Then, after acquiring the image of the target frame by destructive readout (S126), the imaging apparatus 100 corrects the image using the change information (S127).

 For example, the imaging apparatus 100 determines the change for each region including one or more pixels and stores, as the change information, information indicating the changed regions and their amounts of change. The imaging apparatus 100 then corrects each changed region by subtracting or adding the luminance value corresponding to its amount of change.
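The correction step S127 can be sketched as follows. This is an illustrative Python sketch, not the patent's implementation: it assumes, for simplicity, regions of a single pixel and change information recorded as (pixel position, added luminance) pairs; a real implementation would operate on larger regions and clamp to the sensor's value range.

```python
def correct_image(image, change_info):
    """Reduce the influence of a detected subject change on the final image.

    image: 2-D list of pixel values from the destructive readout (S126).
    change_info: list of ((row, col), delta) entries stored in S124C,
    where delta is the luminance added to that pixel by the disturbance.
    Returns a corrected copy; the input image is left unmodified.
    """
    corrected = [row[:] for row in image]
    for (r, c), delta in change_info:
        # Subtract the recorded change, clamping at zero luminance.
        corrected[r][c] = max(0, corrected[r][c] - delta)
    return corrected
```

A disturbance that darkened a region would be recorded with a negative delta, which the same subtraction turns into an addition, matching the "subtracting or adding" wording above.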

 In the process shown in FIG. 15, the imaging apparatus 100 determines the change and stores the change information at each nondestructive readout; however, the determination and storage of the change information may instead be performed collectively after all the nondestructive readouts, or after the destructive readout, have been performed.

 As described above, the imaging apparatus 100 can generate an image with reduced influence of a change of the subject even when such a change occurs.

 A plurality of operation examples for the case where a change of the subject is detected have been described with reference to FIGS. 11 to 15; two or more of these may also be combined.

 (Embodiment 4)
 In the present embodiment, another method for controlling the exposure time of a target frame using auxiliary images will be described. Embodiments 1 and 2 described methods for automatically controlling the exposure time; in the present embodiment, an example is described in which information for judging the exposure time is provided to the user and the exposure time is controlled based on the user's operation. The method of the present embodiment can be applied, for example, to a long exposure of a still image.

 FIG. 16 is a flowchart showing the operation flow of the imaging apparatus 100 according to the present embodiment. FIG. 17 is a diagram for explaining the operation of the imaging apparatus 100.

 First, the imaging apparatus 100 starts exposure (S131). Next, the imaging apparatus 100 acquires an auxiliary image by nondestructive readout (S132). The imaging apparatus 100 then displays the obtained auxiliary image on the display unit 103 (S133).

 When there is no instruction from the user to end the exposure (No in S134) and the predetermined exposure period has not ended (No in S136), the imaging apparatus 100 performs nondestructive readout again after a predetermined time has elapsed (S132) and displays the newly obtained auxiliary image (S133).

 For example, as shown in FIG. 17, nondestructive readout is performed at a predetermined cycle T4, and the read auxiliary images are displayed sequentially. The interval at which nondestructive readout is performed need not be constant.

 On the other hand, when the user instructs the apparatus to end the exposure (Yes in S134), the imaging apparatus 100 stops the exposure (S135) and acquires an image by destructive readout (S137). Also, when the predetermined exposure period has ended (Yes in S136), the imaging apparatus 100 acquires an image by destructive readout (S137).

 As described above, the display unit 103 displays the auxiliary images during the exposure period of the target frame, and the control unit 102 accepts an instruction to end the exposure of the target frame during that exposure period. The user can thus check the luminance level at any time by referring to the auxiliary images displayed during the exposure period, and can stop the exposure at the desired luminance level. The user can therefore easily shoot an image with a desired luminance level.

 The method by which the user instructs the end of the exposure is not particularly limited; for example, the instruction is accepted when the user presses the shutter button or operates the touch panel during the exposure period.

 An example in which the user controls the exposure time T1 during the exposure period has been described here, but the user may instead select an arbitrary image from the plurality of auxiliary images already obtained, either during or after the exposure period. In that case, the imaging apparatus 100, for example, both displays and stores each auxiliary image in step S133.

 For example, in accordance with a user operation, the imaging apparatus 100 displays on the display unit 103 a user interface for selecting one of a plurality of auxiliary images, as shown in FIG. 18. Specifically, the auxiliary images obtained during the exposure are sequentially stored in the storage unit 104. During the exposure, a list of the auxiliary images obtained up to that point is displayed; after the exposure, a list of all the auxiliary images obtained in that exposure is displayed. By selecting one auxiliary image via this interface, the user can select an image with an arbitrary luminance level. When the selection is performed after the exposure, the displayed images may include the image obtained by destructive readout.

 The number of images selected is not limited to one; a plurality of images may be selected. The imaging apparatus 100 may also delete the data of the unselected auxiliary images after the selection. Furthermore, the imaging apparatus 100 may end the exposure when a selection is made during the exposure.

 As shown in FIG. 19, the imaging apparatus 100 may sequentially perform nondestructive readout a plurality of times in the target frame and store the plurality of images obtained by those readouts in the storage unit 104 as a moving image. The imaging apparatus 100 may also play back this moving image and display it on the display unit 103. That is, the imaging apparatus 100 stores the plurality of images obtained by the nondestructive readouts in a moving image file format; for example, a publicly known format is used, including moving image coding methods that use inter-picture prediction or the like.

 The stored moving image is played back automatically after the end of the exposure, or based on a user operation.

 By storing the plurality of images obtained by nondestructive readout in a moving image format in this way, the course of the shooting during the exposure period can itself be saved as a moving image work. Furthermore, when a plurality of images obtained by destructive readout are used instead of images obtained by nondestructive readout, a moving image must be generated by compositing the still images accumulated over time; with the above method, a moving image can be generated without such still image compositing. Storing the plurality of images in a moving image format also reduces the amount of data compared with storing a plurality of still images.

 The imaging apparatus 100 according to one aspect of the present disclosure includes an imaging element 101 capable of nondestructive readout, and a control unit 102 that controls the shooting of a target frame using an auxiliary image obtained by nondestructive readout in that target frame.

 According to this, the imaging apparatus 100 can control the shooting of a frame using an auxiliary image obtained by nondestructive readout, and can therefore perform appropriate control according to the conditions during the exposure period. In this way, the imaging apparatus 100 can realize a further improvement.

 For example, the control unit 102 may control the exposure time of the target frame using the auxiliary image.

 According to this, the imaging apparatus 100 can control the exposure time based on information from an image obtained during the exposure period, and can therefore control the exposure time more appropriately than when it is controlled based on information obtained before the start of the exposure.

 For example, the control unit 102 may sequentially perform nondestructive readout a plurality of times in the target frame and stop the exposure of the target frame when the luminance of an auxiliary image exceeds a predetermined threshold.

 According to this, the imaging apparatus 100 can appropriately control the exposure time based on information from the images obtained during the exposure period.

 For example, the imaging apparatus 100 may further include a display unit 103, and the control unit 102 may further display a user interface for setting the threshold on the display unit 103.

 According to this, the exposure can easily be set according to the user's preference.

 For example, the control unit 102 may perform automatic exposure of the target frame using the auxiliary image.

 According to this, the imaging apparatus 100 can perform automatic exposure based on information from an image obtained during the exposure period, and can therefore perform automatic exposure more appropriately than when it is performed based on information obtained before the start of the exposure.

 For example, the control unit 102 may sequentially perform nondestructive readout a plurality of times in the target frame and detect a change of the subject within the target frame using the plurality of auxiliary images obtained by those readouts.

 According to this, the imaging apparatus 100 can detect, for example, a change of the subject during a long exposure using the auxiliary images obtained by nondestructive readout.

 For example, the control unit 102 may issue a warning when a change of the subject is detected.

 According to this, the imaging apparatus 100 can notify the user of the occurrence of an abnormality during, for example, a long exposure.

 For example, the control unit 102 may stop the shooting when a change of the subject is detected.

 According to this, the imaging apparatus 100 can automatically stop the shooting when, for example, an abnormality occurs during a long exposure.

 For example, the control unit 102 may perform reshooting when a change of the subject is detected.

 According to this, the imaging apparatus 100 can automatically reshoot when, for example, an abnormality occurs during a long exposure.

 For example, when a change in the subject is detected, the control unit 102 may apply a correction to the image of the target frame obtained by shooting so as to reduce the influence of the change in the subject.

 According to this configuration, even when the subject changes, the imaging apparatus 100 can generate an image in which the influence of the change is reduced.

 For example, the imaging apparatus 100 may further include a display unit 103 that displays the auxiliary image during the exposure period of the target frame, and the control unit 102 may further accept, during the exposure period of the target frame, an instruction to end the exposure of the target frame.

 According to this configuration, the user can capture an image at a desired luminance level while referring to the auxiliary image displayed during the exposure period.

 For example, the control unit 102 may sequentially perform a plurality of nondestructive readouts in the target frame and store the plurality of images obtained by those readouts as a moving image.
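Since each nondestructive readout is cumulative, the stored sequence can either be kept as-is (a progressively brightening movie) or differenced into per-interval frames. The following sketch shows the differencing variant; it is an illustrative choice, not the method prescribed by the disclosure:

```python
def cumulative_reads_to_video_frames(reads):
    """reads: successive cumulative nondestructive readouts (2-D lists of
    equal shape). Returns per-interval frames: the first readout as-is,
    then the pixel-wise difference between each pair of readouts."""
    frames = [reads[0]]
    for prev, cur in zip(reads, reads[1:]):
        frames.append([[c - p for p, c in zip(row_p, row_c)]
                       for row_p, row_c in zip(prev, cur)])
    return frames

# Three cumulative 1x1 readouts → the first frame plus two interval frames.
print(cumulative_reads_to_video_frames([[[10]], [[25]], [[40]]]))
# → [[[10]], [[15]], [[15]]]
```

Each interval frame then carries only the light collected between two readouts, giving an evenly exposed movie of the scene as it evolved during the single long exposure.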

 For example, the imaging element 101 may be an organic sensor.

 A control method according to an aspect of the present disclosure is a method of controlling an imaging apparatus 100 including an imaging element 101 capable of nondestructive readout, and includes a control step of controlling shooting of a target frame using an auxiliary image obtained by nondestructive readout in the target frame.

 According to this, since the control method can control the shooting of the frame using the auxiliary image obtained by nondestructive readout, appropriate control can be performed according to conditions during the exposure period. In this way, the control method achieves a further improvement.

 Note that these general or specific aspects may be implemented as a system, a method, an integrated circuit, a computer program, or a recording medium such as a computer-readable CD-ROM, or as any combination of a system, a method, an integrated circuit, a computer program, and a recording medium.

 The imaging apparatus according to the embodiment of the present disclosure has been described above, but the present disclosure is not limited to this embodiment.

 For example, each processing unit included in the imaging apparatus according to the above embodiment is typically implemented as an LSI, which is an integrated circuit. These units may be individually integrated into single chips, or a single chip may be formed so as to include some or all of them.

 Circuit integration is not limited to LSI; it may be realized by a dedicated circuit or a general-purpose processor. An FPGA (Field Programmable Gate Array) that can be programmed after LSI manufacture, or a reconfigurable processor in which the connections and settings of circuit cells inside the LSI can be reconfigured, may also be used.

 In each of the above embodiments, each structural element may be configured by dedicated hardware, or may be realized by executing a software program suitable for that element. Each element may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded on a recording medium such as a hard disk or a semiconductor memory.

 The present disclosure may also be realized as a control method executed by the imaging apparatus.

 The circuit configuration shown in the circuit diagram is an example, and the present disclosure is not limited to this circuit configuration. That is, a circuit that can realize the characteristic functions of the present disclosure, like the above circuit configuration, is also included in the present disclosure. In addition, all the numbers used above are examples given to explain the present disclosure specifically, and the present disclosure is not limited to the illustrated numbers.

 The division of functional blocks in the block diagram is an example; a plurality of functional blocks may be realized as one functional block, one functional block may be divided into a plurality of blocks, and some functions may be transferred to other functional blocks. In addition, the functions of a plurality of functional blocks having similar functions may be processed by single hardware or software in parallel or in a time-shared manner.

 The order in which the steps in the flowchart are executed is an example given to explain the present disclosure specifically, and the steps may be executed in an order other than the above. In addition, some of the steps may be executed simultaneously (in parallel) with other steps.

 The imaging apparatus according to one or more aspects has been described above based on the embodiment, but the present disclosure is not limited to this embodiment. Unless they depart from the gist of the present disclosure, forms obtained by applying various modifications conceivable to those skilled in the art to this embodiment, and forms constructed by combining structural elements of different embodiments, may also be included within the scope of the one or more aspects.

 The present disclosure is applicable to an imaging apparatus such as a digital still camera or a digital video camera.

DESCRIPTION OF SYMBOLS
100 imaging apparatus
101 imaging element
102 control unit
103 display unit
104 storage unit
106 imaging system
107 light-shielding unit
201 pixel
202 vertical scanning unit
203 column signal processing unit
204 horizontal readout unit
205 reset control line
206 address control line
207 vertical signal line
208 horizontal output terminal
211 photoelectric conversion unit
212 charge storage unit
213 reset transistor
214 amplification transistor
215 selection transistor

Claims (14)

An imaging apparatus comprising:
an imaging element capable of nondestructive readout; and
a control unit configured to control shooting of a target frame using an auxiliary image obtained by nondestructive readout in the target frame.

The imaging apparatus according to claim 1, wherein the control unit controls an exposure time of the target frame using the auxiliary image.

The imaging apparatus according to claim 2, wherein the control unit:
sequentially performs a plurality of nondestructive readouts in the target frame; and
stops the exposure of the target frame when the luminance of the auxiliary image exceeds a predetermined threshold.

The imaging apparatus according to claim 3, further comprising a display unit,
wherein the control unit further displays, on the display unit, a user interface for setting the threshold.

The imaging apparatus according to claim 1, wherein the control unit performs automatic exposure of the target frame using the auxiliary image.

The imaging apparatus according to claim 1, wherein the control unit:
sequentially performs a plurality of nondestructive readouts in the target frame; and
detects a change in a subject within the target frame using a plurality of auxiliary images obtained by the plurality of nondestructive readouts.

The imaging apparatus according to claim 6, wherein the control unit issues a warning when the change in the subject is detected.

The imaging apparatus according to claim 6, wherein the control unit stops shooting when the change in the subject is detected.

The imaging apparatus according to claim 6, wherein the control unit performs re-shooting when the change in the subject is detected.

The imaging apparatus according to claim 6, wherein, when the change in the subject is detected, the control unit applies a correction to an image of the target frame obtained by shooting so as to reduce an influence of the change in the subject.

The imaging apparatus according to claim 1, further comprising:
a display unit that displays the auxiliary image during an exposure period of the target frame,
wherein the control unit further accepts an instruction to end the exposure of the target frame during the exposure period of the target frame.

The imaging apparatus according to claim 1, wherein the control unit:
sequentially performs a plurality of nondestructive readouts in the target frame; and
stores a plurality of images obtained by the plurality of nondestructive readouts as a moving image.

The imaging apparatus according to any one of claims 1 to 12, wherein the imaging element is an organic sensor.

A method for controlling an imaging apparatus including an imaging element capable of nondestructive readout, the method comprising
a control step of controlling shooting of a target frame using an auxiliary image obtained by nondestructive readout in the target frame.

PCT/JP2017/046598 2016-12-27 2017-12-26 Imaging device and method for controlling same Ceased WO2018124054A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2016-254540 2016-12-27
JP2016254540A JP2020031259A (en) 2016-12-27 2016-12-27 Imaging apparatus and control method thereof

Publications (1)

Publication Number Publication Date
WO2018124054A1 true WO2018124054A1 (en) 2018-07-05

Family

ID=62709350

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/046598 Ceased WO2018124054A1 (en) 2016-12-27 2017-12-26 Imaging device and method for controlling same

Country Status (2)

Country Link
JP (1) JP2020031259A (en)
WO (1) WO2018124054A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11678081B2 (en) 2020-02-05 2023-06-13 Panasonic Intellectual Property Managevent Co., Ltd. Imaging device and image processing method

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11956560B2 (en) * 2020-10-09 2024-04-09 Meta Platforms Technologies, Llc Digital pixel sensor having reduced quantization operation

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005204298A (en) * 2003-12-17 2005-07-28 Hewlett-Packard Development Co Lp System and method for showing exposure information during taking in image
JP2007259428A (en) * 2006-02-27 2007-10-04 Seiko Epson Corp Imaging device, imaging apparatus, imaging system, imaging method, motion data generation system, motion data generation program, and motion data generation method
JP2010074584A (en) * 2008-09-19 2010-04-02 Sony Corp Imaging apparatus and imaging method
JP2014168209A (en) * 2013-02-28 2014-09-11 Canon Marketing Japan Inc Imaging apparatus, imaging method and program
WO2015045828A1 (en) * 2013-09-27 2015-04-02 富士フイルム株式会社 Imaging device and imaging method
JP2015159353A (en) * 2014-02-21 2015-09-03 オリンパス株式会社 Imaging apparatus and imaging method
JP2016161653A (en) * 2015-02-27 2016-09-05 富士フイルム株式会社 Imaging apparatus and method



Also Published As

Publication number Publication date
JP2020031259A (en) 2020-02-27

Similar Documents

Publication Publication Date Title
US10368025B2 (en) Imaging element, imaging apparatus, its control method, and control program
JP5614993B2 (en) Imaging apparatus and solid-state imaging device driving method
US10785423B2 (en) Image sensor, image capturing apparatus, and image capturing method
JP6731626B2 (en) Imaging device, camera, and imaging method
JP5222068B2 (en) Imaging device
CN107306326A (en) Camera device and image capture method
WO2018124054A1 (en) Imaging device and method for controlling same
JP2022044285A (en) Detection device and detection method
US9282245B2 (en) Image capturing apparatus and method of controlling the same
JP2013192058A (en) Imaging apparatus
JP5043400B2 (en) Imaging apparatus and control method thereof
US10368020B2 (en) Image capturing apparatus and control method therefor
JP6362099B2 (en) Imaging apparatus, control method therefor, program, and storage medium
JP6664066B2 (en) Imaging device and control method thereof
WO2018124056A1 (en) Imaging device and control method therefor
US9794502B2 (en) Image capturing apparatus
US10382740B2 (en) Image processing apparatus, method for controlling the same, and image capture apparatus
JP2007013362A (en) Image pickup device and image pickup method
JP7571173B2 (en) Image capture device, image capture device control method, and program
JP5737924B2 (en) Imaging device
JP2012120076A (en) Imaging apparatus
JP6706850B2 (en) Imaging device, camera, and imaging method
WO2018124057A1 (en) Imaging device and control method therefor
JP5518025B2 (en) Photoelectric conversion device and imaging device
JP2010011047A (en) Imaging apparatus

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17889262

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17889262

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP