
CN112598594A - Color consistency correction method and related device - Google Patents

Color consistency correction method and related device

Info

Publication number
CN112598594A
CN112598594A
Authority
CN
China
Prior art keywords
camera
white balance
value
color
balance value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202011558507.0A
Other languages
Chinese (zh)
Inventor
吴晨
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Oppo Chongqing Intelligent Technology Co Ltd
Original Assignee
Oppo Chongqing Intelligent Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo Chongqing Intelligent Technology Co Ltd filed Critical Oppo Chongqing Intelligent Technology Co Ltd
Priority claimed from CN202011558507.0A
Publication of CN112598594A

Classifications

    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00 Image enhancement or restoration
    • G06T5/90 Dynamic range modification of images or parts thereof
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10024 Color image

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Studio Devices (AREA)

Abstract


Figure 202011558507

The embodiment of the present application discloses a color consistency correction method and a related device, applied to an electronic device. The method includes: when an operation instruction for switching from the first camera to the second camera is detected, calculating a first white balance value that maintains color consistency between the second camera and the first camera, and calculating a first color correction matrix value that maintains color consistency between the second camera and the first camera; initializing a first preset filter and a second preset filter with the first white balance value and the first color correction matrix value, respectively, to obtain a first target filter and a second target filter; when the first camera switches to the second camera, obtaining a second white balance value and a second color correction matrix value corresponding to the second camera; and performing color consistency correction on the second camera according to the second white balance value, the second color correction matrix value, the first target filter, and the second target filter. Adopting the embodiments of the present application helps improve the user experience.


Description

Color consistency correction method and related device
Technical Field
The application relates to the technical field of electronics, in particular to a color consistency correction method and a related device.
Background
With the widespread use of electronic devices (mobile phones, tablet computers, etc.), electronic devices are developing toward diversification and personalization, with ever more applications and increasingly powerful functions, making them indispensable electronic products in users' lives.
In order to meet users' demand for shooting clear and vivid photos, most electronic devices employ multiple cameras and rely on a variety of complex computations. The camera-switch white balance algorithm synchronizes and initializes Automatic White Balance (AWB) information between two cameras so that color remains consistent and smooth at the instant the cameras are switched. In practice, however, color jumps often occur during switching, which degrades the user experience.
Disclosure of Invention
The embodiment of the application provides a color consistency correction method and a related device, which can be beneficial to ensuring the smoothness of color transition of a shot picture and improving user experience in the switching process of two cameras.
In a first aspect, an embodiment of the present application provides a color consistency correction method, which is applied to an electronic device, where the electronic device includes a first camera and a second camera, and the method includes:
when an operation instruction for switching the first camera to the second camera is detected, calculating a first white balance value for keeping the color consistency of the second camera and the first camera, and calculating a first color correction matrix value for keeping the color consistency of the second camera and the first camera;
respectively initializing a first preset filter and a second preset filter through the first white balance value and the first color correction matrix value to obtain a first target filter and a second target filter;
when the first camera is switched to the second camera, acquiring a second white balance value and a second color correction matrix value corresponding to the second camera;
and performing color consistency correction on the second camera according to the second white balance value, the second color correction matrix value, the first target filter and the second target filter.
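The four steps above can be sketched in Python. The exponential-smoothing filter, the averaging rule in `consistency_wb`, and all names are illustrative assumptions; the patent does not specify the filter type or how the consistency values are computed. Only the white balance path is shown; the color correction matrix path would be analogous.

```python
import numpy as np

class ExpSmoothingFilter:
    """Illustrative temporal filter: a simple exponential moving average."""

    def __init__(self, alpha=0.2):
        self.alpha = alpha    # smoothing weight given to each new observation
        self.state = None

    def initialize(self, value):
        # Step 2: seed the preset filter with the consistency value,
        # turning it into the "target" filter.
        self.state = np.asarray(value, dtype=float)

    def update(self, value):
        # Step 4: smooth the post-switch reading toward the seeded state.
        value = np.asarray(value, dtype=float)
        self.state = (1 - self.alpha) * self.state + self.alpha * value
        return self.state

def consistency_wb(wb_cam1, wb_cam2):
    # Step 1 (assumed rule): blend camera 2's AWB gains toward camera 1's.
    return 0.5 * (np.asarray(wb_cam1, dtype=float) + np.asarray(wb_cam2, dtype=float))

# Step 1: a first white balance value keeping camera 2 consistent with camera 1
wb1 = consistency_wb([2.0, 1.0, 1.5], [1.8, 1.0, 1.7])
# Step 2: initialize the preset filter to obtain the target filter
wb_filter = ExpSmoothingFilter(alpha=0.2)
wb_filter.initialize(wb1)
# Steps 3 and 4: after the switch, camera 2's own AWB value is filtered
corrected = wb_filter.update([1.8, 1.0, 1.7])
```

Because the filter is seeded with a value tied to camera 1 and then fed camera 2's own readings, its output drifts gradually from the consistent value toward camera 2's native value, which is the smooth transition the method aims for.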
In a second aspect, an embodiment of the present application provides a color consistency correction apparatus, which is applied to an electronic device, where the electronic device includes a first camera and a second camera, and the apparatus includes: a calculation unit, an initialization unit, an acquisition unit and a correction unit, wherein,
the computing unit is used for computing a first white balance value for keeping the color consistency of the second camera and the first camera when an operation instruction for switching the first camera to the second camera is detected, and computing a first color correction matrix value for keeping the color consistency of the second camera and the first camera;
the initialization unit is configured to initialize a first preset filter and a second preset filter respectively according to the first white balance value and the first color correction matrix value to obtain a first target filter and a second target filter;
the acquiring unit is configured to acquire a second white balance value and a second color correction matrix value corresponding to the second camera when the first camera is switched to the second camera;
and the correcting unit is used for carrying out color consistency correction on the second camera according to the second white balance value, the second color correction matrix value, the first target filter and the second target filter.
In a third aspect, an embodiment of the present application provides an electronic device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the program includes instructions for executing steps in any method of the first aspect of the embodiment of the present application.
In a fourth aspect, the present application provides a computer-readable storage medium, where the computer-readable storage medium stores a computer program for electronic data exchange, where the computer program makes a computer perform part or all of the steps described in any one of the methods of the first aspect of the present application.
In a fifth aspect, the present application provides a computer program product, wherein the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to perform some or all of the steps as described in any one of the methods of the first aspect of the embodiments of the present application. The computer program product may be a software installation package.
It can be seen that, in the embodiment of the present application, when an operation instruction to switch the first camera to the second camera is detected, a first white balance value for maintaining color consistency between the second camera and the first camera is calculated, and a first color correction matrix value for maintaining color consistency between the second camera and the first camera is calculated; a first preset filter and a second preset filter are initialized with the first white balance value and the first color correction matrix value, respectively, to obtain a first target filter and a second target filter; when the first camera is switched to the second camera, a second white balance value and a second color correction matrix value corresponding to the second camera are acquired; and color consistency correction is performed on the second camera according to the second white balance value, the second color correction matrix value, the first target filter, and the second target filter. In this way, the first target filter and the second target filter can be continuously updated during the switch from the first camera to the second camera, which helps ensure smooth color transitions in the captured picture, keeps the colors of the captured picture synchronized between the two different cameras, and improves the user experience.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application; for those of ordinary skill in the art, other drawings can be obtained from these drawings without creative effort.
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present disclosure;
fig. 2 is a schematic diagram of a software structure of an electronic device according to an embodiment of the present application;
fig. 3 is a schematic structural diagram of a camera module according to an embodiment of the present disclosure;
fig. 4A is a schematic flowchart of a color consistency correction method according to an embodiment of the present application;
fig. 4B is a sequence diagram of a color consistency correction method according to an embodiment of the present application;
fig. 5 is a block diagram illustrating functional units of an electronic device according to an embodiment of the present disclosure;
fig. 6 is a block diagram illustrating functional units of a color consistency correction apparatus according to an embodiment of the present application.
Detailed Description
In order to make the technical solutions of the present application better understood, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference herein to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the application. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
1) The electronic device may be a portable electronic device that also provides other functions, such as personal digital assistant and/or music player functions, for example a cell phone, a tablet computer, or a wearable electronic device with wireless communication capability (e.g., a smart watch). Exemplary embodiments of the portable electronic device include, but are not limited to, portable electronic devices running an iOS, Android, Microsoft, or other operating system. The portable electronic device may also be another kind of portable electronic device, such as a laptop computer. It should also be understood that in other embodiments the electronic device may not be a portable electronic device at all, but a desktop computer.
2) The camera-switch white balance algorithm synchronizes and initializes Automatic White Balance (AWB) information between two cameras so that color remains consistent and smooth at the instant the cameras are switched.
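A minimal sketch of such synchronization, assuming a simple linear ramp of per-channel AWB gains over a few frames (the actual synchronization scheme is not specified here):

```python
def blend_awb_gains(gains_from, gains_to, n_frames):
    """Linearly interpolate per-channel AWB gains over n_frames so that
    color does not jump at the instant the cameras are switched.
    Illustrative only; real pipelines may use non-linear convergence."""
    steps = []
    for i in range(1, n_frames + 1):
        t = i / n_frames  # 0 -> 1 across the ramp
        steps.append([(1 - t) * a + t * b
                      for a, b in zip(gains_from, gains_to)])
    return steps

# Ramp from the outgoing camera's gains to the incoming camera's gains
ramp = blend_awb_gains([2.0, 1.0, 1.4], [1.6, 1.0, 1.8], n_frames=4)
```

The final step of the ramp equals the incoming camera's own gains, so the handoff completes without a visible color jump.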
In a first section, the software and hardware operating environment of the technical solution disclosed in the present application is described as follows.
Fig. 1 shows a schematic structural diagram of an electronic device 100. The electronic device 100 may include a processor 110, an external memory interface 120, an internal memory 121, a Universal Serial Bus (USB) interface 130, a charge management module 140, a power management module 141, a battery 142, an antenna 1, an antenna 2, a mobile communication module 150, a wireless communication module 160, an audio module 170, a speaker 170A, a receiver 170B, a microphone 170C, an earphone interface 170D, a sensor module 180, a compass 190, a motor 191, an indicator 192, a camera 193, a display screen 194, a Subscriber Identification Module (SIM) card interface 195, and the like.
It is to be understood that the illustrated structure of the embodiment of the present application does not specifically limit the electronic device 100. In other embodiments of the present application, electronic device 100 may include more or fewer components than shown, or some components may be combined, some components may be split, or a different arrangement of components. The illustrated components may be implemented in hardware, software, or a combination of software and hardware.
Processor 110 may include one or more processing units, such as: the processor 110 may include an Application Processor (AP), a modem processor, a Graphics Processing Unit (GPU), an Image Signal Processor (ISP), a controller, a video codec, a Digital Signal Processor (DSP), a baseband processor, and/or a neural-Network Processing Unit (NPU), etc. Wherein the different processing units may be separate components or may be integrated in one or more processors. In some embodiments, the electronic device 100 may also include one or more processors 110. The controller can generate an operation control signal according to the instruction operation code and the time sequence signal to complete the control of instruction fetching and instruction execution. In other embodiments, a memory may also be provided in processor 110 for storing instructions and data. Illustratively, the memory in the processor 110 may be a cache memory. The memory may hold instructions or data that have just been used or recycled by the processor 110. If the processor 110 needs to reuse the instruction or data, it can be called directly from the memory. This avoids repeated accesses and reduces the latency of the processor 110, thereby increasing the efficiency with which the electronic device 100 processes data or executes instructions.
In some embodiments, processor 110 may include one or more interfaces. The interface may include an inter-integrated circuit (I2C) interface, an inter-integrated circuit audio (I2S) interface, a Pulse Code Modulation (PCM) interface, a universal asynchronous receiver/transmitter (UART) interface, a Mobile Industry Processor Interface (MIPI), a general-purpose input/output (GPIO) interface, a SIM card interface, a USB interface, and/or the like. The USB interface 130 is an interface conforming to the USB standard specification, and may specifically be a Mini USB interface, a Micro USB interface, a USB Type C interface, or the like. The USB interface 130 may be used to connect a charger to charge the electronic device 100, and may also be used to transmit data between the electronic device 100 and a peripheral device. The USB interface 130 may also be used to connect to a headset to play audio through the headset.
It should be understood that the interface connection relationship between the modules illustrated in the embodiments of the present application is only an illustration, and does not limit the structure of the electronic device 100. In other embodiments of the present application, the electronic device 100 may also adopt different interface connection manners or a combination of multiple interface connection manners in the above embodiments.
The charging management module 140 is configured to receive charging input from a charger. The charger may be a wireless charger or a wired charger. In some wired charging embodiments, the charging management module 140 may receive charging input from a wired charger via the USB interface 130. In some wireless charging embodiments, the charging management module 140 may receive a wireless charging input through a wireless charging coil of the electronic device 100. The charging management module 140 may also supply power to the electronic device through the power management module 141 while charging the battery 142.
The power management module 141 is used to connect the battery 142, the charging management module 140 and the processor 110. The power management module 141 receives input from the battery 142 and/or the charge management module 140, and supplies power to the processor 110, the internal memory 121, the external memory, the display 194, the camera 193, the wireless communication module 160, and the like. The power management module 141 may also be used to monitor parameters such as battery capacity, battery cycle count, battery state of health (leakage, impedance), etc. In some other embodiments, the power management module 141 may also be disposed in the processor 110. In other embodiments, the power management module 141 and the charging management module 140 may be disposed in the same device.
The wireless communication function of the electronic device 100 may be implemented by the antenna 1, the antenna 2, the mobile communication module 150, the wireless communication module 160, a modem processor, a baseband processor, and the like.
The antennas 1 and 2 are used for transmitting and receiving electromagnetic wave signals. Each antenna in the electronic device 100 may be used to cover a single or multiple communication bands. Different antennas can also be multiplexed to improve the utilization of the antennas. For example: the antenna 1 may be multiplexed as a diversity antenna of a wireless local area network. In other embodiments, the antenna may be used in conjunction with a tuning switch.
The mobile communication module 150 may provide a solution including 2G/3G/4G/5G wireless communication applied to the electronic device 100. The mobile communication module 150 may include at least one filter, a switch, a power amplifier, a Low Noise Amplifier (LNA), and the like. The mobile communication module 150 may receive the electromagnetic wave from the antenna 1, filter, amplify, etc. the received electromagnetic wave, and transmit the electromagnetic wave to the modem processor for demodulation. The mobile communication module 150 may also amplify the signal modulated by the modem processor, and convert the signal into electromagnetic wave through the antenna 1 to radiate the electromagnetic wave. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the processor 110. In some embodiments, at least some of the functional modules of the mobile communication module 150 may be disposed in the same device as at least some of the modules of the processor 110.
The wireless communication module 160 may provide solutions for wireless communication applied to the electronic device 100, including Wireless Local Area Networks (WLANs) (e.g., wireless fidelity (Wi-Fi) networks), Bluetooth (BT), Global Navigation Satellite System (GNSS), Frequency Modulation (FM), Near Field Communication (NFC), Infrared (IR), UWB, and the like. The wireless communication module 160 may be one or more devices integrating at least one communication processing module. The wireless communication module 160 receives electromagnetic waves via the antenna 2, performs frequency modulation and filtering processing on the electromagnetic wave signals, and transmits the processed signals to the processor 110. The wireless communication module 160 may also receive a signal to be transmitted from the processor 110, perform frequency modulation and amplification on it, and convert it into electromagnetic waves radiated through the antenna 2.
The electronic device 100 implements display functions via the GPU, the display screen 194, and the application processor. The GPU is a microprocessor for image processing, connected to the display screen 194 and the application processor. The GPU is used to perform mathematical and geometric calculations for graphics rendering. The processor 110 may include one or more GPUs that execute program instructions to generate or alter display information.
The display screen 194 is used to display images, videos, and the like. The display screen 194 includes a display panel. The display panel may be a liquid crystal display (LCD), an organic light-emitting diode (OLED), an active-matrix organic light-emitting diode (AMOLED), a flexible light-emitting diode (FLED), a mini light-emitting diode (Mini LED), a micro light-emitting diode (Micro LED), a quantum dot light-emitting diode (QLED), or the like. In some embodiments, the electronic device 100 may include one or more display screens 194.
The electronic device 100 may implement a photographing function through the ISP, the camera 193, the video codec, the GPU, the display screen 194, the application processor, and the like.
The ISP is used to process the data fed back by the camera 193. For example, when a photo is taken, the shutter is opened, light is transmitted to the camera photosensitive element through the lens, the optical signal is converted into an electrical signal, and the camera photosensitive element transmits the electrical signal to the ISP for processing and converting into an image visible to naked eyes. The ISP can also carry out algorithm optimization on the noise, brightness and skin color of the image. The ISP can also optimize parameters such as exposure, color temperature and the like of a shooting scene. In some embodiments, the ISP may be provided in camera 193.
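The white-balance and color-correction stages mentioned here can be illustrated on an RGB image. The per-channel gains and the identity color correction matrix below are placeholder values, not the ISP's actual parameters:

```python
import numpy as np

def apply_wb_and_ccm(rgb, wb_gains, ccm):
    """Apply per-channel white balance gains, then a 3x3 color correction
    matrix, to an H x W x 3 float image (two illustrative ISP stages)."""
    balanced = rgb * np.asarray(wb_gains, dtype=float)     # per-channel gain
    corrected = balanced @ np.asarray(ccm, dtype=float).T  # 3x3 CCM per pixel
    return np.clip(corrected, 0.0, 1.0)                    # stay in [0, 1]

# Mid-gray test image; warming gains, identity CCM (placeholder values)
img = np.full((2, 2, 3), 0.5)
out = apply_wb_and_ccm(img, [1.2, 1.0, 0.8], np.eye(3))
```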
The camera 193 is used to capture still images or video. The object generates an optical image through the lens and projects the optical image to the photosensitive element. The photosensitive element may be a Charge Coupled Device (CCD) or a complementary metal-oxide-semiconductor (CMOS) phototransistor. The light sensing element converts the optical signal into an electrical signal, which is then passed to the ISP where it is converted into a digital image signal. And the ISP outputs the digital image signal to the DSP for processing. The DSP converts the digital image signal into image signal in standard RGB, YUV and other formats. In some embodiments, the electronic device 100 may include 1 or more cameras 193.
The digital signal processor is used for processing digital signals, and can process digital image signals and other digital signals. For example, when the electronic device 100 selects a frequency bin, the digital signal processor is used to perform fourier transform or the like on the frequency bin energy.
Video codecs are used to compress or decompress digital video. The electronic device 100 may support one or more video codecs. In this way, the electronic device 100 may play or record video in a variety of encoding formats, such as Moving Picture Experts Group (MPEG)-1, MPEG-2, MPEG-3, MPEG-4, and the like.
The NPU is a neural-network (NN) computing processor that rapidly processes input information by drawing on the structure of biological neural networks, for example the transfer pattern between neurons of the human brain, and can also continuously learn on its own. Applications such as intelligent recognition on the electronic device 100 can be realized through the NPU, for example: image recognition, face recognition, speech recognition, text understanding, and the like.
The external memory interface 120 may be used to connect an external memory card, such as a Micro SD card, to extend the memory capability of the electronic device 100. The external memory card communicates with the processor 110 through the external memory interface 120 to implement a data storage function. For example, files such as music, video, etc. are saved in an external memory card.
Internal memory 121 may be used to store one or more computer programs, including instructions. The processor 110 may execute the above-mentioned instructions stored in the internal memory 121, so as to enable the electronic device 100 to execute the methods provided in some embodiments of the present application, as well as various applications and data processing. The internal memory 121 may include a program storage area and a data storage area. The program storage area can store an operating system; it may also store one or more applications (e.g., gallery, contacts, etc.), and the like. The data storage area may store data (e.g., photos, contacts, etc.) created during use of the electronic device 100, and the like. Further, the internal memory 121 may include a high-speed random access memory, and may also include a non-volatile memory, such as one or more magnetic disk storage components, flash memory components, Universal Flash Storage (UFS), and the like. In some embodiments, the processor 110 may cause the electronic device 100 to execute the methods provided in the embodiments of the present application, as well as other applications and data processing, by executing instructions stored in the internal memory 121 and/or instructions stored in a memory provided in the processor 110. The electronic device 100 may implement audio functions, such as music playing and recording, through the audio module 170, the speaker 170A, the receiver 170B, the microphone 170C, the earphone interface 170D, the application processor, and the like.
The sensor module 180 may include a pressure sensor 180A, a gyro sensor 180B, an air pressure sensor 180C, a magnetic sensor 180D, an acceleration sensor 180E, a distance sensor 180F, a proximity light sensor 180G, a fingerprint sensor 180H, a temperature sensor 180J, a touch sensor 180K, an ambient light sensor 180L, a bone conduction sensor 180M, and the like.
The pressure sensor 180A is used for sensing a pressure signal and converting the pressure signal into an electrical signal. In some embodiments, the pressure sensor 180A may be disposed on the display screen 194. Pressure sensors 180A come in many kinds, such as resistive pressure sensors, inductive pressure sensors, capacitive pressure sensors, and the like. A capacitive pressure sensor may comprise at least two parallel plates of electrically conductive material. When a force acts on the pressure sensor 180A, the capacitance between the electrodes changes, and the electronic device 100 determines the strength of the pressure from the change in capacitance. When a touch operation is applied to the display screen 194, the electronic device 100 detects the intensity of the touch operation via the pressure sensor 180A. The electronic device 100 may also calculate the touched position from the detection signal of the pressure sensor 180A. In some embodiments, touch operations applied to the same touch position but with different intensities may correspond to different operation instructions. For example, when a touch operation whose intensity is less than a first pressure threshold acts on the short message application icon, an instruction to view the short message is executed; when a touch operation whose intensity is greater than or equal to the first pressure threshold acts on the short message application icon, an instruction to create a new short message is executed.
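The threshold-based dispatch in the short-message example can be sketched as follows; the function name and the normalized threshold value are assumptions for illustration:

```python
FIRST_PRESSURE_THRESHOLD = 0.5  # assumed normalized intensity threshold

def sms_icon_action(pressure):
    """Map the touch intensity on the short message icon to an instruction,
    as described above (names and threshold are illustrative)."""
    if pressure < FIRST_PRESSURE_THRESHOLD:
        return "view_message"   # light press: view the short message
    return "new_message"        # firm press: create a new short message
```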
The gyro sensor 180B may be used to determine the motion attitude of the electronic device 100. In some embodiments, the angular velocity of the electronic device 100 about three axes (i.e., the X, Y, and Z axes) may be determined by the gyroscope sensor 180B. The gyro sensor 180B may be used for photographing anti-shake. For example, when the shutter is pressed, the gyro sensor 180B detects the shake angle of the electronic device 100, calculates the distance the lens module needs to compensate according to the shake angle, and lets the lens counteract the shake of the electronic device 100 through a reverse movement, thereby achieving anti-shake. The gyroscope sensor 180B may also be used in navigation and somatosensory gaming scenarios.
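Under a small-angle thin-lens assumption (not the patent's actual optical image stabilization model), the compensation distance for a detected shake angle can be sketched as:

```python
import math

def lens_compensation_mm(shake_angle_deg, focal_length_mm):
    """Distance the lens module must shift to counteract a detected shake
    angle. Simplified optics sketch: displacement = f * tan(theta)."""
    return focal_length_mm * math.tan(math.radians(shake_angle_deg))

# Example: a 1-degree shake on an assumed 4 mm focal length lens
shift = lens_compensation_mm(1.0, 4.0)
```

Moving the lens by this distance in the opposite direction of the shake keeps the projected image approximately stationary on the sensor.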
The acceleration sensor 180E may detect the magnitude of the acceleration of the electronic device 100 in various directions (typically along three axes). The magnitude and direction of gravity can be detected when the electronic device 100 is stationary. It can also be used to recognize the orientation of the device, for applications such as landscape/portrait switching and pedometers.
The ambient light sensor 180L is used to sense the ambient light level. Electronic device 100 may adaptively adjust the brightness of display screen 194 based on the perceived ambient light level. The ambient light sensor 180L may also be used to automatically adjust the white balance when taking a picture. The ambient light sensor 180L may also cooperate with the proximity light sensor 180G to detect whether the electronic device 100 is in a pocket to prevent accidental touches.
The fingerprint sensor 180H is used to collect a fingerprint. The electronic device 100 can use the collected fingerprint characteristics to unlock with the fingerprint, access the application lock, take a photo with the fingerprint, answer an incoming call with the fingerprint, and so on.
The temperature sensor 180J is used to detect temperature. In some embodiments, the electronic device 100 implements a temperature processing strategy using the temperature detected by the temperature sensor 180J. For example, when the temperature reported by the temperature sensor 180J exceeds a threshold, the electronic device 100 reduces the performance of a processor located near the temperature sensor 180J, so as to reduce power consumption and implement thermal protection. In other embodiments, the electronic device 100 heats the battery 142 when the temperature is below another threshold, to avoid an abnormal shutdown of the electronic device 100 caused by low temperature. In still other embodiments, when the temperature is lower than a further threshold, the electronic device 100 boosts the output voltage of the battery 142 to avoid an abnormal shutdown caused by low temperature.
The touch sensor 180K is also referred to as a "touch panel". The touch sensor 180K may be disposed on the display screen 194, and the touch sensor 180K and the display screen 194 form a touch screen, which is also called a "touch screen". The touch sensor 180K is used to detect a touch operation applied thereto or nearby. The touch sensor can communicate the detected touch operation to the application processor to determine the touch event type. Visual output associated with the touch operation may be provided through the display screen 194. In other embodiments, the touch sensor 180K may be disposed on a surface of the electronic device 100, different from the position of the display screen 194.
Fig. 2 shows a block diagram of a software structure of the electronic device 100. The layered architecture divides the software into several layers, each layer having a clear role and division of labor. The layers communicate with each other through a software interface. In some embodiments, the Android system is divided into four layers, an application layer, an application framework layer, an Android runtime (Android runtime) and system library, and a kernel layer from top to bottom. The application layer may include a series of application packages.
As shown in fig. 2, the application package may include applications such as camera, gallery, calendar, phone call, map, navigation, WLAN, bluetooth, music, video, short message, etc.
The application framework layer provides an Application Programming Interface (API) and a programming framework for the application program of the application layer. The application framework layer includes a number of predefined functions.
As shown in FIG. 2, the application framework layers may include a window manager, content provider, view system, phone manager, resource manager, notification manager, and the like.
The window manager is used to manage window programs. The window manager can obtain the size of the display screen, determine whether a status bar exists, lock the screen, take screenshots, and so on.
The content provider is used to store and retrieve data and make it accessible to applications. The data may include video, images, audio, calls made and received, browsing history and bookmarks, phone books, etc.
The view system includes visual controls such as controls to display text, controls to display pictures, and the like. The view system may be used to build applications. The display interface may be composed of one or more views. For example, the display interface including the short message notification icon may include a view for displaying text and a view for displaying pictures.
The phone manager is used to provide the communication functions of the electronic device 100, such as management of call status (including answering, hanging up, etc.).
The resource manager provides various resources for the application, such as localized strings, icons, pictures, layout files, video files, and the like.
The notification manager enables an application to display notification information in the status bar. It can be used to convey notification-type messages that disappear automatically after a short stay without requiring user interaction, for example a notification informing of download completion, or a message alert. The notification manager may also present a notification in the form of a chart or scroll-bar text in the status bar at the top of the system, such as a notification of an application running in the background, or a notification that appears on the screen in the form of a dialog window. Other forms include prompting text information in the status bar, sounding a prompt tone, vibrating the electronic device, flashing an indicator light, and so on.
The Android Runtime comprises a core library and a virtual machine. The Android runtime is responsible for scheduling and managing an Android system.
The core library comprises two parts: one part consists of the functional interfaces that the Java language needs to call, and the other part is the core library of Android.
The application layer and the application framework layer run in the virtual machine. The virtual machine executes the Java files of the application layer and the application framework layer as binary files. The virtual machine is used to perform functions such as object life cycle management, stack management, thread management, security and exception management, and garbage collection.
The system library may include a plurality of functional modules. For example: surface managers (surface managers), media libraries (media libraries), three-dimensional graphics processing libraries (e.g., OpenGL ES), 2D graphics engines (e.g., SGL), and the like.
The surface manager is used to manage the display subsystem and provide fusion of 2D and 3D layers for multiple applications.
The media library supports a variety of commonly used audio, video format playback and recording, and still image files, among others. The media library may support a variety of audio-video encoding formats, such as: MPEG4, H.264, MP3, AAC, AMR, JPG, PNG, etc.
The three-dimensional graphic processing library is used for realizing three-dimensional graphic drawing, image rendering, synthesis, layer processing and the like.
The 2D graphics engine is a drawing engine for 2D drawing.
The kernel layer is a layer between hardware and software. The kernel layer comprises at least a display driver, a camera driver, an audio driver, and a sensor driver.
In a second section, example application scenarios disclosed in embodiments of the present application are described below.
Fig. 3 shows a schematic structural diagram of a camera module to which the present application is applicable. As shown in fig. 3, the electronic device may include a camera module comprising a first camera and a second camera. The first camera and/or the second camera may be a wide-angle camera, a telephoto camera, or a main camera; for example, when the first camera is the main camera, the second camera may be the telephoto camera.
In the embodiment of the application, when an operation instruction for switching from the first camera to the second camera is detected, a first white balance value that keeps the color of the second camera consistent with that of the first camera is calculated, and a first color correction matrix value that keeps the color of the second camera consistent with that of the first camera is calculated; a first preset filter and a second preset filter are respectively initialized with the first white balance value and the first color correction matrix value to obtain a first target filter and a second target filter; when the first camera is switched to the second camera, a second white balance value and a second color correction matrix value corresponding to the second camera are acquired; and color consistency correction is performed on the second camera according to the second white balance value, the second color correction matrix value, the first target filter, and the second target filter. In this way, the first target filter and the second target filter can be continuously updated. During the switch from the first camera to the second camera, this helps ensure a smooth color transition in the shot picture, keeps the colors of the shot picture synchronized between the two different cameras, and helps improve the user experience.
In the third section, the scope of protection of the claims disclosed in the embodiments of the present application is described below.
Referring to fig. 4A, fig. 4A is a schematic flowchart of a color consistency correction method applied to an electronic device according to an embodiment of the present application, where the color consistency correction method includes the following operations.
S401, when an operation instruction that the first camera is switched to the second camera is detected, calculating a first white balance value for keeping the color consistency of the second camera and the first camera, and calculating a first color correction matrix value for keeping the color consistency of the second camera and the first camera.
The first camera and the second camera may be a camera module as shown in fig. 3.
For example, if the first camera is a main camera and the second camera is a telephoto camera, a switching instruction between the two cameras can be determined according to shooting environment parameters while the user is shooting. For instance, when the zoom multiple is greater than 2x, a zoom operation can be performed and the operation instruction for switching from the first camera to the second camera is triggered; the specific conditions for switching cameras are not limited herein.
It can be seen that, in the embodiment of the present application, because the spectral responses of different cameras are different, the Automatic White Balance (AWB) values calculated for the same color on different cameras will differ. Therefore, when switching between any two cameras, the corresponding AWB values will jump, which causes a color jump between images and a poor user experience. To improve the user experience and let the user complete the camera switch without perceiving it, when the first camera is switched to the second camera, fine color consistency correction may be performed on the picture shot by the camera through the first white balance value and the first color correction matrix value during the switching process, so that the picture slowly and gradually transitions to the picture shot by the second camera. The color consistency between the two cameras is thus maintained, which helps improve the user experience.
In one possible example, the calculating a first white balance value that maintains color consistency of the second camera with the first camera may include: determining a color mapping relationship between the first camera and the second camera; and calculating a first white balance value corresponding to the second camera when the color consistency of the second camera and the first camera is maintained based on the color mapping relation.
The color mapping relationship between the two cameras can be preset in advance. The color mapping relationship can be determined based on a first spectral response function corresponding to the first camera and a second spectral response function corresponding to the second camera, and the first white balance value that keeps the color of the image from the second camera consistent with that of the image from the first camera is calculated based on the color mapping relationship.
Further, the spectral response function is a function of wavelength, which can be understood as the ratio of the radiance received by the sensor in the camera at each wavelength to the radiance incident at that wavelength. The wavelengths differ in different light source environments, and the sensor parameters of each camera differ, so the spectral response functions corresponding to different cameras are different. Accordingly, the first spectral response function and the second spectral response function corresponding to the first camera and the second camera, respectively, can be obtained, and the color mapping relationship between the color spaces corresponding to the two different cameras can be derived from the first spectral response function and the second spectral response function. The first spectral response function and the second spectral response function can be obtained through measurement with a photoelectric detector.
Optionally, the color mapping relationship may be determined based on the first spectral response function corresponding to the first camera and the second spectral response function corresponding to the second camera, which specifically includes the following steps: acquiring an ambient light parameter of the ambient light; determining the reflectivity of the surface of the object being shot in the target scene based on the ambient light parameter; and determining the color mapping relationship between the first camera and the second camera based on the reflectivity, the first spectral response function, and the second spectral response function.
The ambient light parameter may comprise at least one of: wavelength, incident angle, refractive index, spectral range, spectral distribution, and the like, which are not limited herein. Because of the characteristics of different objects, and because different light sources strike an object's surface at different incident angles and wavelengths, the reflectivity of different object surfaces differs. Therefore, the reflectivity corresponding to any point on the surface of the object being shot in the target scene can be determined based on parameters such as the refractive index and the incident angle among the ambient light parameters. Since different light sources emit different wavelengths, the reflectivity of the object surface differs, and the function values of the spectral response functions differ at the wavelengths corresponding to different light sources; as a result, the R value, G value, and B value calculated by different cameras differ when the same object is shot in the same target scene. Accordingly, the color mapping relationship between the two cameras may be determined based on the reflectivity, the first spectral response function corresponding to the first camera, and the second spectral response function corresponding to the second camera.
Wherein the color mapping relationship between the first camera and the second camera can be fitted by the following model:
[R1,G1,B1]*f=[R2,G2,B2],
the following matrix form can be obtained:

[R2, G2, B2] = [R1, G1, B1] * | f11  f12  f13 |
                              | f21  f22  f23 |
                              | f31  f32  f33 |
The function f is the color mapping relationship between the first camera and the second camera. The RGB color mode is an industry color standard in which various colors are obtained by varying the three color channels Red (R), Green (G), and Blue (B) and superimposing them on each other. RGB represents the colors of the red, green, and blue channels; this standard covers almost all colors perceivable by human vision and is one of the most widely used color systems at present. Therefore, in the embodiment of the present application, the colors rendered in an image by the first camera and/or the second camera can be represented as vectors, matrices, or specific values related to the three parameters R, G, and B.
Further, R1, G1, and B1 correspond to the first camera, and R2, G2, and B2 correspond to the second camera. So that fitting the model converges, the loss function of the model may be set to:

Loss = arccos( ([R1, G1, B1] * f) · [R2, G2, B2] / ( |[R1, G1, B1] * f| * |[R2, G2, B2]| ) ),
where arccos denotes the inverse cosine among the inverse trigonometric functions. The loss function has different expressions for different color gamut scenarios and is not limited to the one given in this embodiment; for example, the loss function may also be set as:

Loss = Mse( [R1, G1, B1] * f, [R2, G2, B2] ),
where Mse is a function for calculating the mean square error. Still alternatively, the loss function may be set as:

Loss = Mse( log([R1, G1, B1] * f), log([R2, G2, B2]) ),

where Mse is a function for calculating the mean square error, and log denotes the logarithm.
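As an illustrative sketch of the fitting step described above (not code from the patent itself), the 3×3 matrix f can be estimated from paired RGB measurements by least squares, and the arccos-style angular loss can then be evaluated to check convergence. The function names and the use of NumPy's least-squares solver are assumptions for illustration:

```python
import numpy as np

def fit_color_mapping(rgb1, rgb2):
    """Least-squares fit of a 3x3 matrix f such that rgb1 @ f ~= rgb2.

    rgb1, rgb2: (N, 3) arrays of corresponding RGB triples measured by the
    first and second camera for the same scene points under the same light.
    """
    f, _, _, _ = np.linalg.lstsq(rgb1, rgb2, rcond=None)
    return f

def angular_loss(rgb1, rgb2, f):
    """Mean arccos angle (radians) between mapped and observed RGB vectors."""
    pred = rgb1 @ f
    cos = np.sum(pred * rgb2, axis=1) / (
        np.linalg.norm(pred, axis=1) * np.linalg.norm(rgb2, axis=1))
    return float(np.mean(np.arccos(np.clip(cos, -1.0, 1.0))))
```

A mean-square-error loss on raw or log RGB values, as in the alternative loss functions above, could be substituted in place of the angular loss without changing the fitting step.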
Therefore, with the embodiment of the present application, the model can be trained in advance to determine the color mapping relationship between the two cameras, so that the color mapping relationship does not need to be recalculated, which helps reduce the power consumption of the electronic device. Moreover, the processing logic for subsequently calculating the first white balance value and the first color correction matrix value is simple. In addition, in practical applications, since the ambient light may change, the corresponding spectral response functions also change over time; likewise, the color mapping relationships corresponding to different spectral response functions can be obtained based on the above method, so changes across different scenes can also be covered.
Further, after the color mapping relationship (i.e., the function f) is obtained, since [R2, G2, B2] = [R1, G1, B1] * f, requiring the white-balanced output of the second camera to match that of the first camera gives the first white balance value corresponding to the second camera that maintains color consistency between the second camera and the first camera as:

first white balance value = f⁻¹ * AWB1, where AWB1 = diag(RGain1, GGain1, BGain1),

and RGain1, GGain1, and BGain1 are the white balance gain values corresponding to the third white balance value of the first camera.
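A minimal sketch of this gain mapping follows, under the assumption (ours, not stated in the patent) that a white balance value acts as a per-channel gain triple: a raw value that the first camera's gains normalize to neutral is taken as the white point, mapped through f, and the second camera's consistency-preserving gains are read off as per-channel reciprocals. The function name is illustrative:

```python
import numpy as np

def consistent_awb_gains(f, gains1):
    """Map the first camera's white balance gains through f.

    Assuming rgb2 = rgb1 @ f, a raw value that the first camera's gains
    (RGain1, GGain1, BGain1) normalize to neutral is w1 = 1 / gains1; the
    second camera sees that same white as w1 @ f, so the consistency-
    preserving gains for the second camera are its per-channel reciprocals.
    """
    w1 = 1.0 / np.asarray(gains1, dtype=float)  # raw white point, camera 1
    w2 = w1 @ f                                 # same white as seen by camera 2
    return 1.0 / w2
```

With an identity f the gains pass through unchanged, which is the expected sanity check for two identical sensors.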
In one possible example, the calculating a first color correction matrix value that maintains color consistency of the second camera with the first camera may include: determining a first mapping relation of the first camera for converting from a first color gamut to a second color gamut; determining a second mapping relationship for the second camera to convert from the first color gamut to the second color gamut; acquiring a third white balance value and a third color correction matrix value corresponding to the first camera; and determining a first color correction matrix value corresponding to the second camera when the first camera and the second camera output the same RGB value at the same pixel point according to the third white balance value, the third color correction matrix value, the first mapping relation and the second mapping relation.
The first color gamut and/or the second color gamut may be set by a user or default, and is not limited herein.
The first color gamut can be understood as the color language protocol corresponding to the different sensors in the cameras (the sensor RGB gamut). The second color gamut may be understood as the sRGB (standard Red Green Blue) gamut, which refers to a color language protocol defined under unified standard conditions. For example, sRGB is a color language protocol jointly developed by Microsoft with imaging companies such as Epson and Hewlett-Packard; it provides a standard method for defining colors, so that display, printing, scanning, and other computer peripherals and application software have a common language for color. sRGB represents the standard red, green, and blue primaries. An sRGB gamut coverage of 100% indicates a very professional display, while 96% to 98% is the usual, medium level. If the sRGB gamut coverage cannot reach 100%, the display cannot fully reproduce all colors, and the smaller the value, the poorer the display capability.
Wherein, the first mapping relationship for converting the first camera from the first color gamut to the second color gamut is:
[R1,G1,B1]*AWB1*CCM1*gamma1=[R,G,B];
the second mapping relationship for converting the second camera from the first color gamut to the second color gamut is as follows:
[R2,G2,B2]*AWB2*CCM2*gamma2=[R,G,B];
Here R1, G1, and B1 are the RGB values of a certain point in the image captured by the first camera, and R2, G2, and B2 are the RGB values of the same point in the image captured by the second camera. AWB1 is the third white balance value currently corresponding to the first camera, AWB2 is the second white balance value currently corresponding to the second camera, CCM1 is the third color correction matrix currently corresponding to the first camera, CCM2 is the second color correction matrix currently corresponding to the second camera, gamma1 is the gamma value of the first camera, and gamma2 is the gamma value of the second camera. R, G, and B are the values of the point mapped from the pixel domain (first color gamut) to the sRGB domain (second color gamut); ideally, the R, G, B output values for the same point are consistent across different cameras.
Further, when the operation instruction for switching from the first camera to the second camera is detected, the third white balance value (AWB1) and the third color correction matrix (CCM1) corresponding to the first camera are known and can be obtained from the statistical information of the first camera. Therefore, when color consistency between the second camera and the first camera is maintained (that is, when the second camera outputs the same RGB values as the first camera), the following can be derived from the first mapping relationship and the second mapping relationship:

[R1,G1,B1]*AWB1*CCM1*gamma1 = [R2,G2,B2]*AWB2*CCM2*gamma2 = [R1,G1,B1]*f*AWB2*CCM2*gamma2;
thus, the first color correction matrix value corresponding to the second camera is determined as:

[f*AWB2]⁻¹*AWB1*CCM1*gamma1*gamma2⁻¹.
In addition, since the cameras perform synchronous mapping between gammas when AE (Auto Exposure) light changes are synchronized, the above expression can be simplified as:

first color correction matrix value = [f*AWB2]⁻¹*AWB1*CCM1.
The f function is a color mapping relationship between the first camera and the second camera, and at this time, the first color correction matrix value can maintain color consistency between the second camera and the first camera.
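The simplified expression above can be sketched directly in code. This is an illustrative implementation under our own modeling assumptions (white balance values represented as diagonal gain matrices, gamma already synchronized); the function name is not from the patent:

```python
import numpy as np

def consistent_ccm(f, awb2_gains, awb1_gains, ccm1):
    """First color correction matrix value = [f*AWB2]^-1 * AWB1 * CCM1.

    AWB1 and AWB2 are modeled as diagonal gain matrices. The gamma terms
    are assumed already synchronized between the cameras, so the
    gamma1*gamma2^-1 factor cancels as in the simplified expression.
    """
    awb1 = np.diag(np.asarray(awb1_gains, dtype=float))
    awb2 = np.diag(np.asarray(awb2_gains, dtype=float))
    return np.linalg.inv(f @ awb2) @ awb1 @ ccm1
```

As a sanity check, with an identity color mapping and equal gains on both cameras, the result reduces to CCM1 itself.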
It can be seen that, in the embodiment of the present application, considering the different hardware conditions of different camera sensors, the two cameras can be unified to the same color standard according to the first color gamut and the second color gamut. Through the first mapping relationship and the second mapping relationship of the two cameras under the same color standard, when the picture shot by the second camera and the picture shot by the first camera output the same RGB values at the same pixel point, the second camera holds a first color correction matrix value that keeps its color consistent with that of the first camera. This facilitates the subsequent color transition during the switch from the first camera to the second camera, enables the user to complete the camera switch without perceiving it, and helps improve the user experience.
S402, respectively initializing a first preset filter and a second preset filter through the first white balance value and the first color correction matrix value to obtain a first target filter and a second target filter.
The first preset filter and/or the second preset filter may be a default of a user's own device or system, and is not limited herein.
The first preset filter and/or the second preset filter may be a time-domain filter. Since the ambient light at the moment the user switches cameras may change in real time, the white balance corresponding to the first camera and the second camera also changes. To improve the user experience and make the color transition natural and smooth while the light changes synchronously, a filter may be added to soften the whole white balance adjustment process. Accordingly, the first preset filter may be initialized based on the first white balance value, fusing the first white balance value with the initial value in the first preset filter to obtain the first target filter. Similarly, the second preset filter may be initialized based on the first color correction matrix value, fusing the first color correction matrix value with the initial value in the second preset filter to obtain the second target filter. Two filters suitable for use by the second camera are thus obtained.
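One simple realization of such a time-domain softening filter is a first-order exponential smoother; this sketch is an assumption about the filter form (the patent does not fix one), with illustrative names:

```python
import numpy as np

class TemporalFilter:
    """First-order temporal smoother: state <- (1-alpha)*state + alpha*x.

    Initializing the state with the consistency-preserving value (the
    first white balance value or the first color correction matrix value)
    means later inputs blend in gradually, softening the adjustment.
    """
    def __init__(self, init_value, alpha=0.2):
        self.state = np.asarray(init_value, dtype=float)
        self.alpha = float(alpha)

    def update(self, x):
        # Blend the new input into the running state, one step per frame.
        self.state = (1.0 - self.alpha) * self.state + self.alpha * np.asarray(x, dtype=float)
        return self.state
```

The smaller alpha is, the slower the filter drifts from its initialized value toward new inputs, i.e., the softer the color transition.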
And S403, when the first camera is switched to the second camera, acquiring a second white balance value and a second color correction matrix value corresponding to the second camera.
The second white balance value and the second color correction matrix value are two values obtained by normal calculation of the second camera at the current moment and under the current environment.
Optionally, after the step S403, the method may further include the steps of: determining first statistical information of the first camera; determining whether the first white balance value needs to be adjusted according to the third white balance value and the first statistical information; and if the first white balance value needs to be adjusted, adjusting the first white balance value.
The statistical information includes at least any one of the following: automatic exposure value, automatic white balance value, automatic focus value, flicker detection, black level compensation, lens shading correction, and the like, which are not limited herein.
Whether the first white balance value that maintains color consistency between the second camera and the first camera needs to be adjusted can be determined according to the third white balance value (i.e., the automatic white balance value) included in the current statistical information of the first camera. The first white balance value is thereby changed adaptively, its accuracy is maintained, and the difference between the first white balance value and the third white balance value currently corresponding to the first camera, that is, the color difference during the camera transition, is reduced.
In one possible example, determining whether the first white balance value needs to be adjusted according to the third white balance value and the first statistical information may further include: determining a white point coordinate in the first statistical information having a minimum distance from the third white balance value; acquiring second statistical information of the second camera; calculating a homography matrix between the first camera and the second camera according to the first statistical information and the second statistical information; calculating a target coordinate corresponding to the white point coordinate according to the homography matrix; determining a white balance value with the minimum distance from the second statistical information to the target coordinate to obtain a fourth white balance value; and determining whether the first white balance value needs to be adjusted according to the fourth white balance value.
Since the color of the shot picture is influenced by the light source, the white point coordinates of different light sources (such as the A light source, the C light source, the D50 light source, the D65 light source, and so on) are different, and the camera does not know what the current light source is while shooting. Therefore, a pixel coordinate system can be constructed, and the white point coordinate (x1, y1) with the minimum distance from the third white balance value in the first statistical information corresponding to the first camera can be determined, thereby determining the light source.
Further, a homography matrix (M) between the first camera and the second camera may be determined according to the first statistical information and the second statistical information corresponding to the first camera and the second camera. Specifically, the homography matrix between the cameras (describing the position mapping relationship of the current scene between the pixel coordinate system of the first camera and that of the second camera) may be solved by computing ORB feature points of the first camera and the second camera (for example, feature points may be detected using the FAST (features from accelerated segment test) algorithm). The target coordinate of the first camera's white point coordinate in the coordinate system corresponding to the second camera can then be determined according to the homography matrix:

(x1, y1, 1) * M = (x2, y2, 1)
The target coordinates (x2, y2) can be determined by the mapping.
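The mapping above can be sketched as follows, keeping the row-vector convention of the equation; the homogeneous result is normalized, which matters for general (perspective) homographies even though the equation shows the last component as 1. The function name is illustrative:

```python
import numpy as np

def map_white_point(M, x1, y1):
    """Map a white point pixel coordinate from camera 1 to camera 2.

    M is the 3x3 homography between the two pixel coordinate systems,
    applied in row-vector form as in (x1, y1, 1) * M = (x2, y2, 1).
    """
    v = np.array([x1, y1, 1.0]) @ M
    return v[0] / v[2], v[1] / v[2]  # normalize the homogeneous result
```

With an identity homography the point is unchanged; a translation-style M shifts it by the corresponding offsets.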
In this way, the target coordinate corresponding to the light source in the picture shot by the second camera can be determined, and the influence of camera hardware on the color differences of pictures shot under different or identical light sources can be reduced.
Further, since the picture actually changes in real time during the camera switching process, the white balance value in the second statistical information with the minimum distance from the target coordinate can be determined as the fourth white balance value, and the calculated first white balance value can be adjusted according to the fourth white balance value, so that the first white balance value is adjusted in real time as the environment changes (light source changes, etc.).
Therefore, the embodiment of the present application takes into account the influence of real-time changes in environmental factors on camera shooting, and the first white balance value can be adjusted adaptively, which ensures color consistency between the second camera and the first camera and helps improve the user experience. Meanwhile, since the first white balance value can be adjusted adaptively based on the above method, different scenes can be covered, which helps improve the practicability of the color consistency correction scheme.
In one possible example, the determining whether the first white balance value needs to be adjusted according to the fourth white balance value may include: calculating a difference value between the first white balance value and the fourth white balance value; if the difference value is smaller than or equal to a preset threshold value, adjusting the first white balance value to a fourth white balance value, and initializing the first preset filter according to the fourth white balance value to obtain a first target filter; and if the difference value is larger than the preset threshold value, determining that the first white balance value does not need to be adjusted.
The preset threshold may be set by the user or default, and is not limited herein.
When the difference value is smaller than or equal to the preset threshold, it is determined that the white point coordinate corresponding to the second camera has been found, that is, the light source is determined; otherwise, it is determined that the white point coordinate corresponding to the second camera has not been found, that is, the light source is uncertain. Therefore, after the white point coordinate is found, i.e., the light source is determined, the first white balance value can be adjusted to the fourth white balance value, so as to ensure the color consistency of the picture shot by the user.
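The threshold decision above can be sketched as a small helper; this is an illustration (our function name, Euclidean distance as one plausible difference measure) rather than the patent's exact computation:

```python
import numpy as np

def select_white_balance(first_wb, fourth_wb, threshold):
    """Decide whether to replace the first white balance value.

    A gap at or below the threshold means the mapped white point was
    found (the light source is identified), so the fourth value is used
    and the first preset filter would be re-initialized from it;
    otherwise the first value is kept unchanged.
    """
    first_wb = np.asarray(first_wb, dtype=float)
    fourth_wb = np.asarray(fourth_wb, dtype=float)
    diff = np.linalg.norm(first_wb - fourth_wb)
    if diff <= threshold:
        return fourth_wb, True
    return first_wb, False
```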
S404, performing color consistency correction on the second camera according to the second white balance value, the second color correction matrix value, the first target filter and the second target filter.
The electronic device can input the second white balance value and the second color correction matrix value into the first target filter and the second target filter, respectively, so that they are fused with the values held by those filters, thereby realizing color consistency correction of the picture shot by the second camera.
In one possible example, the performing color consistency correction on the second camera according to the second white balance value and the second color correction matrix value, the first target filter, and the second target filter may include: inputting the second white balance value into the first target filter to obtain a first result value; inputting the second color correction matrix value into the second target filter to obtain a second result value; and updating the first target filter according to the first result value and updating the second target filter according to the second result value in the process of switching the first camera to the second camera so as to realize color consistency correction on the second camera.
The initialized first target filter contains the first white balance value, which can be understood as a gain white balance value; after passing through the target filter, the second white balance value is gradually fused with the first white balance value and slowly adjusted to the first result value, based on which white balance adjustment can be performed on the picture obtained by the second camera.
Meanwhile, since the initialized second target filter contains the first color correction matrix value, the second color correction matrix value is gradually fused with the first color correction matrix value after passing through the target filter and slowly adjusted to the second result value.
Therefore, in the embodiment of the present application, the first target filter and the second target filter may be updated based on the first result value and the second result value, and a new white balance value and a new color correction matrix value corresponding to the second camera are obtained in real time according to a change of an environment, and the first target filter and the second target filter are continuously updated according to the new white balance value and the new color correction matrix value, so as to implement gradual transition from a picture shot by the first camera to an output image (i.e., a picture shot by the second camera) that maintains color consistency with the picture under a condition that a user is not sensible, which is beneficial to ensuring that a color is finally and smoothly transitioned to a color corresponding to the second camera.
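The gradual fusion performed by the target filters can be modeled, for example, as a simple exponential temporal filter. The smoothing factor, the per-frame loop, and the function name below are assumptions made for illustration and are not specified in the embodiment.

```python
# Minimal temporal-filter sketch: the filter state starts at the first
# (initialization) value and is blended a little toward the incoming second
# value on every frame, so the displayed color transitions smoothly instead
# of jumping. `alpha` is an assumed smoothing factor.
def temporal_filter(initial_value, incoming_value, alpha=0.2, frames=30):
    state = initial_value
    trace = []
    for _ in range(frames):
        state = (1.0 - alpha) * state + alpha * incoming_value
        trace.append(state)
    return trace
```

After enough frames the state converges to the incoming value, which corresponds to the color finally settling on the second camera's own white balance and color correction.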
Alternatively, fig. 4B shows a timing flowchart of a color consistency correction method provided in this embodiment of the present application. In this embodiment, the electronic device may include 3 cameras: a main camera, a telephoto camera, and a wide-angle camera. When the electronic device detects an instruction to turn on the cameras, a program may be initialized, where the program may refer to the program corresponding to the entire color consistency correction method, and the above 3 cameras are initialized; a first image corresponding to the main camera is then displayed to the user through the display screen of the electronic device, while the other 2 cameras may also run together in the background. When the electronic device detects an operation instruction to switch the main camera to the telephoto camera, a first image taken by the main camera for the current scene and a second image taken by the telephoto camera for the current scene may be acquired. When the user starts to slide from the main camera to the telephoto camera through the display screen, the color consistency correction algorithm (which may correspond to the method described in the above step S401 to step S404) is triggered: at the moment when the first image is switched to the second image, a first white balance value and a first color correction matrix value for maintaining color consistency between the second image and the first image are obtained based on the algorithm; the first time domain filter is initialized with the first white balance value to obtain a first target filter for maintaining color smoothness during frame switching, and the second time domain filter is initialized with the first color correction matrix value to obtain a second target filter; finally, color consistency correction is performed on a second white balance value and a second color correction matrix value corresponding to the second image based on the first target filter and the second target filter,
and an output image corresponding to the second camera is obtained, after which the whole program ends.
It can be seen that, in the color consistency correction method described in the embodiment of the present application, when an operation instruction to switch the first camera to the second camera is detected, a first white balance value for maintaining the color consistency between the second camera and the first camera is calculated, and a first color correction matrix value for maintaining the color consistency between the second camera and the first camera is calculated; a first preset filter and a second preset filter are respectively initialized through the first white balance value and the first color correction matrix value to obtain a first target filter and a second target filter; when the first camera is switched to the second camera, a second white balance value and a second color correction matrix value corresponding to the second camera are acquired; and color consistency correction is performed on the second camera according to the second white balance value, the second color correction matrix value, the first target filter and the second target filter. In this way, the first target filter and the second target filter can be continuously updated, which, in the process of switching the first camera to the second camera, helps ensure the smoothness of the color transition of the shot picture, helps keep the colors of the shot picture synchronized between the two different cameras, and helps improve the user experience.
Consistent with the embodiment shown in fig. 4A, referring to fig. 5, fig. 5 is a schematic structural diagram of an electronic device provided in an embodiment of the present application. As shown in the figure, the electronic device includes a first camera and a second camera, and further includes a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the programs include instructions for performing the following steps:
when an operation instruction for switching the first camera to the second camera is detected, calculating a first white balance value for keeping the color consistency of the second camera and the first camera, and calculating a first color correction matrix value for keeping the color consistency of the second camera and the first camera;
respectively initializing a first preset filter and a second preset filter through the first white balance value and the first color correction matrix value to obtain a first target filter and a second target filter;
when the first camera is switched to the second camera, acquiring a second white balance value and a second color correction matrix value corresponding to the second camera;
and performing color consistency correction on the second camera according to the second white balance value, the second color correction matrix value, the first target filter and the second target filter.
It can be seen that, in the electronic device described in this embodiment of the present application, when an operation instruction to switch the first camera to the second camera is detected, a first white balance value for maintaining color consistency between the second camera and the first camera is calculated, and a first color correction matrix value for maintaining color consistency between the second camera and the first camera is calculated; a first preset filter and a second preset filter are respectively initialized through the first white balance value and the first color correction matrix value to obtain a first target filter and a second target filter; when the first camera is switched to the second camera, a second white balance value and a second color correction matrix value corresponding to the second camera are acquired; and color consistency correction is performed on the second camera according to the second white balance value, the second color correction matrix value, the first target filter and the second target filter. In this way, the first target filter and the second target filter can be continuously updated, which, in the process of switching the first camera to the second camera, helps ensure the smoothness of the color transition of the shot picture, helps keep the colors of the shot picture synchronized between the two different cameras, and helps improve the user experience.
In one possible example, in said calculating a first white balance value that maintains color consistency of said second camera with said first camera, the above program includes instructions for:
determining a color mapping relationship between the first camera and the second camera;
and calculating a first white balance value corresponding to the second camera when the color consistency of the second camera and the first camera is maintained based on the color mapping relation.
In one possible example, in said computing a first color correction matrix value that maintains color consistency of said second camera with said first camera, the program comprises instructions for:
determining a first mapping relation of the first camera for converting from a first color gamut to a second color gamut;
determining a second mapping relationship for the second camera to convert from the first color gamut to the second color gamut;
acquiring a third white balance value and a third color correction matrix value corresponding to the first camera;
and determining a first color correction matrix value corresponding to the second camera when the first camera and the second camera output the same RGB value at the same pixel point according to the third white balance value, the third color correction matrix value, the first mapping relation and the second mapping relation.
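As one illustrative way to see how such a first color correction matrix value could be determined, model each camera's pipeline linearly as output = CCM · diag(wb) · M · scene, where M is that camera's mapping from the first color gamut to the second. This linear pipeline model and all matrix names below are assumptions for the sketch, not the patent's exact formulation.

```python
import numpy as np

# Sketch: solve for the second camera's color correction matrix (ccm2) so
# that both cameras output the same RGB value at the same pixel point:
#     ccm2 @ diag(wb2) @ map2 == ccm1 @ diag(wb1) @ map1
# Here ccm1 and wb1 play the role of the first camera's third color
# correction matrix value and third white balance value; map1 and map2 are
# the first and second mapping relationships. All names are illustrative.
def solve_second_ccm(ccm1, wb1, map1, map2, wb2):
    target = ccm1 @ np.diag(wb1) @ map1
    return target @ np.linalg.inv(np.diag(wb2) @ map2)
```

With this matrix, any scene color pushed through the second camera's pipeline reproduces the first camera's output, which is the stated goal of equal RGB values at the same pixel point.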
In one possible example, the program further includes instructions for performing the steps of:
determining first statistical information of the first camera;
determining whether the first white balance value needs to be adjusted according to the third white balance value and the first statistical information;
and if the first white balance value needs to be adjusted, adjusting the first white balance value.
In one possible example, in the determining whether the first white balance value needs to be adjusted according to the third white balance value and the first statistical information, the program includes instructions for:
determining a white point coordinate in the first statistical information having a minimum distance from the third white balance value;
acquiring second statistical information of the second camera;
calculating a homography matrix between the first camera and the second camera according to the first statistical information and the second statistical information;
calculating a target coordinate corresponding to the white point coordinate according to the homography matrix;
determining a white balance value with the minimum distance from the second statistical information to the target coordinate to obtain a fourth white balance value;
and determining whether the first white balance value needs to be adjusted according to the fourth white balance value.
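For illustration, the mapping of the white point coordinate through the homography and the nearest-neighbor search in the second camera's statistics might be sketched as follows; the homography H and the candidate white point list are hypothetical placeholders standing in for the second statistical information.

```python
import numpy as np

# Sketch: project the first camera's white point coordinate into the second
# camera's statistics plane using a 3x3 homography H, then pick the
# candidate with the minimum distance to the projected target coordinate;
# that candidate plays the role of the fourth white balance value.
def nearest_wb_via_homography(white_point, H, candidates):
    p = H @ np.array([white_point[0], white_point[1], 1.0])
    target = p[:2] / p[2]  # dehomogenize to obtain the target coordinate
    dists = [np.linalg.norm(np.asarray(c) - target) for c in candidates]
    return candidates[int(np.argmin(dists))]
```

The homography itself would be estimated from corresponding points in the two cameras' statistical information, as the embodiment describes.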
In one possible example, in said determining whether the first white balance value needs to be adjusted based on the fourth white balance value, the above program includes instructions for:
calculating a difference value between the first white balance value and the fourth white balance value;
if the difference value is smaller than or equal to a preset threshold value, adjusting the first white balance value to a fourth white balance value, and initializing the first preset filter according to the fourth white balance value to obtain a first target filter;
and if the difference value is larger than the preset threshold value, determining that the first white balance value does not need to be adjusted.
In one possible example, in the color consistency correction of the second camera according to the second white balance value and the second color correction matrix value, the first target filter, and the second target filter, the program includes instructions for:
inputting the second white balance value into the first target filter to obtain a first result value;
inputting the second color correction matrix value into the second target filter to obtain a second result value;
and updating the first target filter according to the first result value and updating the second target filter according to the second result value in the process of switching the first camera to the second camera so as to realize color consistency correction on the second camera.
The above description has introduced the solution of the embodiment of the present application mainly from the perspective of the method-side implementation process. It is understood that, in order to realize the above functions, the electronic device comprises corresponding hardware structures and/or software modules for performing the respective functions. Those skilled in the art will readily appreciate that the various illustrative units and algorithm steps described in connection with the embodiments provided herein can be implemented by hardware or by a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the electronic device may be divided into the functional units according to the method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
In the case of dividing each functional module with respect to each function, fig. 6 shows a schematic diagram of a color consistency correction apparatus, as shown in fig. 6, the color consistency correction apparatus 600 is applied to an electronic device, and the color consistency correction apparatus 600 may include: a calculation unit 601, an initialization unit 602, an acquisition unit 603 and a correction unit 604, wherein,
among other things, the computing unit 601 may be used to support the electronic device in performing step S401 described above, and/or other processes for the techniques described herein.
The initialization unit 602 may be used to support the electronic device to perform step S402 described above, and/or other processes for the techniques described herein.
The acquisition unit 603 may be used to support the electronic device to perform step S403 described above, and/or other processes for the techniques described herein.
The correction unit 604 may be used to support the electronic device in performing step S404 described above, and/or other processes for the techniques described herein.
As can be seen, in the color consistency correction apparatus provided in the embodiment of the present application, when an operation instruction to switch the first camera to the second camera is detected, a first white balance value for maintaining the color consistency between the second camera and the first camera is calculated, and a first color correction matrix value for maintaining the color consistency between the second camera and the first camera is calculated; a first preset filter and a second preset filter are respectively initialized through the first white balance value and the first color correction matrix value to obtain a first target filter and a second target filter; when the first camera is switched to the second camera, a second white balance value and a second color correction matrix value corresponding to the second camera are acquired; and color consistency correction is performed on the second camera according to the second white balance value, the second color correction matrix value, the first target filter and the second target filter. In this way, the first target filter and the second target filter can be continuously updated, which, in the process of switching the first camera to the second camera, helps ensure the smoothness of the color transition of the shot picture, helps keep the colors of the shot picture synchronized between the two different cameras, and helps improve the user experience.
In one possible example, in terms of the calculating the first white balance value that keeps the color consistency of the second camera and the first camera, the calculating unit 601 is specifically configured to:
determining a color mapping relationship between the first camera and the second camera;
and calculating a first white balance value corresponding to the second camera when the color consistency of the second camera and the first camera is maintained based on the color mapping relation.
In one possible example, in terms of the calculating a first color correction matrix value that maintains color consistency between the second camera and the first camera, the calculating unit 601 is specifically configured to:
determining a first mapping relation of the first camera for converting from a first color gamut to a second color gamut;
determining a second mapping relationship for the second camera to convert from the first color gamut to the second color gamut;
acquiring a third white balance value and a third color correction matrix value corresponding to the first camera;
and determining a first color correction matrix value corresponding to the second camera when the first camera and the second camera output the same RGB value at the same pixel point according to the third white balance value, the third color correction matrix value, the first mapping relation and the second mapping relation.
In a possible example, in the aspect of performing color consistency correction on the second camera according to the second white balance value and the second color correction matrix value, the first target filter, and the second target filter, the correcting unit 604 is specifically configured to:
inputting the second white balance value into the first target filter to obtain a first result value;
inputting the second color correction matrix value into the second target filter to obtain a second result value;
and updating the first target filter according to the first result value and updating the second target filter according to the second result value in the process of switching the first camera to the second camera so as to realize color consistency correction on the second camera.
It should be noted that all relevant contents of each step related to the above method embodiment may be referred to the functional description of the corresponding functional module, and are not described herein again.
The electronic device provided by the embodiment is used for executing the color consistency correction method, so that the same effect as the implementation method can be achieved.
In case an integrated unit is employed, the electronic device may comprise a processing module, a storage module and a communication module. The processing module may be configured to control and manage the actions of the electronic device, and for example, may be configured to support the electronic device to perform the steps performed by the calculating unit 601, the initializing unit 602, the obtaining unit 603, and the correcting unit 604. The memory module may be used to support the electronic device in executing stored program codes and data, etc. The communication module can be used for supporting the communication between the electronic equipment and other equipment.
The processing module may be a processor or a controller, which may implement or perform the various illustrative logical blocks, modules, and circuits described in connection with the present disclosure. A processor may also be a combination of devices implementing computing functions, e.g., a combination of one or more microprocessors, or a combination of a digital signal processor (DSP) and a microprocessor, or the like. The storage module may be a memory. The communication module may specifically be a radio frequency circuit, a Bluetooth chip, a Wi-Fi chip, or another device that interacts with other electronic devices.
In an embodiment, when the processing module is a processor and the storage module is a memory, the electronic device according to this embodiment may be a device having the structure shown in fig. 1.
Embodiments of the present application also provide a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, the computer program enabling a computer to execute part or all of the steps of any one of the methods described in the above method embodiments, and the computer includes an electronic device.
Embodiments of the present application also provide a computer program product comprising a non-transitory computer readable storage medium storing a computer program operable to cause a computer to perform some or all of the steps of any of the methods as described in the above method embodiments. The computer program product may be a software installation package, the computer comprising an electronic device.
It should be noted that, for simplicity of description, the above-mentioned method embodiments are described as a series of acts or combination of acts, but those skilled in the art will recognize that the present application is not limited by the order of acts described, as some steps may occur in other orders or concurrently depending on the application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described embodiments of the apparatus are merely illustrative, and for example, the above-described division of the units is only one type of division of logical functions, and other divisions may be realized in practice, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection of some interfaces, devices or units, and may be an electric or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part thereof contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the above-mentioned methods of the embodiments of the present application. The aforementioned memory includes various media capable of storing program codes, such as a USB flash disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a removable hard disk, a magnetic disk, or an optical disk.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash Memory disks, Read-Only memories (ROMs), Random Access Memories (RAMs), magnetic or optical disks, and the like.
The foregoing detailed description of the embodiments of the present application has been presented to illustrate the principles and implementations of the present application, and the above description of the embodiments is only provided to help understand the method and the core concept of the present application; meanwhile, for a person skilled in the art, according to the idea of the present application, there may be variations in the specific embodiments and the application scope, and in summary, the content of the present specification should not be construed as a limitation to the present application.

Claims (10)

1. A color consistency correction method is applied to electronic equipment, and is characterized in that the electronic equipment comprises a first camera and a second camera, and the method comprises the following steps:
when an operation instruction for switching the first camera to the second camera is detected, calculating a first white balance value for keeping the color consistency of the second camera and the first camera, and calculating a first color correction matrix value for keeping the color consistency of the second camera and the first camera;
respectively initializing a first preset filter and a second preset filter through the first white balance value and the first color correction matrix value to obtain a first target filter and a second target filter;
when the first camera is switched to the second camera, acquiring a second white balance value and a second color correction matrix value corresponding to the second camera;
and performing color consistency correction on the second camera according to the second white balance value, the second color correction matrix value, the first target filter and the second target filter.
2. The method of claim 1, wherein the calculating a first white balance value that maintains color consistency of the second camera with the first camera comprises:
determining a color mapping relationship between the first camera and the second camera;
and calculating a first white balance value corresponding to the second camera when the color consistency of the second camera and the first camera is maintained based on the color mapping relation.
3. The method of claim 1, wherein the calculating a first color correction matrix value that maintains color consistency of the second camera with the first camera comprises:
determining a first mapping relation of the first camera for converting from a first color gamut to a second color gamut;
determining a second mapping relationship for the second camera to convert from the first color gamut to the second color gamut;
acquiring a third white balance value and a third color correction matrix value corresponding to the first camera;
and determining a first color correction matrix value corresponding to the second camera when the first camera and the second camera output the same RGB value at the same pixel point according to the third white balance value, the third color correction matrix value, the first mapping relation and the second mapping relation.
4. The method of claim 3, further comprising:
determining first statistical information of the first camera;
determining whether the first white balance value needs to be adjusted according to the third white balance value and the first statistical information;
and if the first white balance value needs to be adjusted, adjusting the first white balance value.
5. The method according to claim 4, wherein the determining whether the first white balance value needs to be adjusted according to the third white balance value and the first statistical information comprises:
determining a white point coordinate in the first statistical information having a minimum distance from the third white balance value;
acquiring second statistical information of the second camera;
calculating a homography matrix between the first camera and the second camera according to the first statistical information and the second statistical information;
calculating a target coordinate corresponding to the white point coordinate according to the homography matrix;
determining a white balance value with the minimum distance from the second statistical information to the target coordinate to obtain a fourth white balance value;
and determining whether the first white balance value needs to be adjusted according to the fourth white balance value.
6. The method of claim 5, wherein determining whether the first white balance value needs to be adjusted according to the fourth white balance value comprises:
calculating a difference value between the first white balance value and the fourth white balance value;
if the difference value is smaller than or equal to a preset threshold value, adjusting the first white balance value to a fourth white balance value, and initializing the first preset filter according to the fourth white balance value to obtain a first target filter;
and if the difference value is larger than the preset threshold value, determining that the first white balance value does not need to be adjusted.
7. The method of claim 1, wherein the color consistency rectification of the second camera according to the second white balance value and the second color rectification matrix value, the first target filter and the second target filter comprises:
inputting the second white balance value into the first target filter to obtain a first result value;
inputting the second color correction matrix value into the second target filter to obtain a second result value;
and updating the first target filter according to the first result value and updating the second target filter according to the second result value in the process of switching the first camera to the second camera so as to realize color consistency correction on the second camera.
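The claims leave the filter type unspecified; one common choice for smoothing white balance and color-correction-matrix values across a camera switch is a first-order exponential (IIR) filter. A hedged sketch, with the alpha coefficient and all values invented for illustration:

```python
class SmoothingFilter:
    """Hypothetical 'target filter': first-order exponential smoothing.
    The patent does not name a filter type; this IIR form is an assumption."""

    def __init__(self, initial, alpha=0.2):
        self.state = float(initial)  # seeded by the initialization step (claim 1)
        self.alpha = alpha           # higher alpha converges faster

    def update(self, value):
        # Blend the incoming per-frame value into the running state.
        self.state = (1.0 - self.alpha) * self.state + self.alpha * value
        return self.state

# During the switch, the second camera's values are fed in frame by frame:
wb_filter = SmoothingFilter(initial=1.00)  # from the first white balance value
for _ in range(3):
    wb = wb_filter.update(1.20)            # second camera's white balance value
# wb has moved part of the way from 1.00 toward 1.20, avoiding a visible color jump
```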
8. A color consistency correction apparatus applied to an electronic device including a first camera and a second camera, the apparatus comprising: a calculation unit, an initialization unit, an acquisition unit and a correction unit, wherein,
the calculation unit is configured to, when an operation instruction for switching from the first camera to the second camera is detected, compute a first white balance value for keeping the color consistency of the second camera with the first camera, and compute a first color correction matrix value for keeping the color consistency of the second camera with the first camera;
the initialization unit is configured to initialize a first preset filter and a second preset filter respectively according to the first white balance value and the first color correction matrix value to obtain a first target filter and a second target filter;
the acquiring unit is configured to acquire a second white balance value and a second color correction matrix value corresponding to the second camera when the first camera is switched to the second camera;
and the correction unit is configured to perform color consistency correction on the second camera according to the second white balance value, the second color correction matrix value, the first target filter and the second target filter.
9. An electronic device comprising a processor, a memory, a communication interface, and one or more programs stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps in the method of any of claims 1-7.
10. A computer-readable storage medium storing a computer program for electronic data exchange, wherein the computer program causes a computer to perform the method according to any one of claims 1-7.
CN202011558507.0A Color consistency correction method and related device (filed 2020-12-24, status: Pending)

Priority Applications (1)

Application Number: CN202011558507.0A | Priority/Filing Date: 2020-12-24 | Title: Color consistency correction method and related device


Publications (1)

Publication Number: CN112598594A | Publication Date: 2021-04-02

Family ID: 75202356


Country Status (1)

CN: CN112598594A (en)


Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20130342662A1 (en) * 2012-06-21 2013-12-26 Canon Kabushiki Kaisha Image processing device, image processing method, and program
CN106231193A (en) * 2016-08-05 2016-12-14 深圳市金立通信设备有限公司 A kind of image processing method and terminal
CN109410152A (en) * 2018-11-26 2019-03-01 Oppo广东移动通信有限公司 Imaging method and apparatus, electronic device, computer-readable storage medium
CN110009567A (en) * 2019-04-09 2019-07-12 三星电子(中国)研发中心 Image stitching method and device for fisheye lens
CN110880306A (en) * 2019-11-01 2020-03-13 南京图格医疗科技有限公司 Color restoration correction method for medical display
CN111246093A (en) * 2020-01-16 2020-06-05 Oppo广东移动通信有限公司 Image processing method, image processing device, storage medium and electronic equipment
CN111314683A (en) * 2020-03-17 2020-06-19 Oppo广东移动通信有限公司 White balance adjustment method and related equipment
CN111866483A (en) * 2020-07-06 2020-10-30 Oppo广东移动通信有限公司 Color reproduction method and apparatus, computer readable medium and electronic device


Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
PARK J et al.: "Efficient and robust color consistency for community photo collections", Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 31 December 2016 (2016-12-31), pages 430 - 438 *
XU Yuan et al.: "A real-time CMOS video image preprocessing system based on FPGA", Journal of Shenzhen University (Science and Engineering), vol. 30, no. 4, 17 October 2013 (2013-10-17), pages 416 - 422 *
WANG Zhe: "An inpainting algorithm for larger image regions based on structure and texture decomposition", China Master's Theses Full-text Database (Information Science and Technology), 15 March 2016 (2016-03-15), pages 138 - 7242 *

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112950635A (en) * 2021-04-26 2021-06-11 Oppo广东移动通信有限公司 Gray dot detection method, gray dot detection device, electronic device, and storage medium
WO2023016320A1 (en) * 2021-08-11 2023-02-16 维沃移动通信(杭州)有限公司 Image processing method and apparatus, and device and medium
CN113676713A (en) * 2021-08-11 2021-11-19 维沃移动通信(杭州)有限公司 Image processing method, apparatus, device and medium
CN113766141A (en) * 2021-09-29 2021-12-07 维沃移动通信有限公司 Image information processing method and device
CN116132784A (en) * 2021-11-11 2023-05-16 北京小米移动软件有限公司 Image processing method, device and storage medium
CN113891004A (en) * 2021-11-18 2022-01-04 展讯通信(上海)有限公司 Image processing method, device, device and storage medium
CN114612571A (en) * 2022-03-07 2022-06-10 重庆紫光华山智安科技有限公司 White balance calibration parameter generation method, image correction method, white balance calibration parameter generation system, image correction device and medium
CN115529448A (en) * 2022-03-10 2022-12-27 荣耀终端有限公司 Image processing method and related device
CN115955611B (en) * 2022-03-28 2023-09-29 荣耀终端有限公司 Image processing method and electronic equipment
CN115955611A (en) * 2022-03-28 2023-04-11 荣耀终端有限公司 Image processing method and electronic equipment
CN117425091A (en) * 2022-03-28 2024-01-19 荣耀终端有限公司 Image processing method and electronic equipment
CN114757856B (en) * 2022-06-16 2022-09-20 深圳深知未来智能有限公司 Automatic white balance algorithm and system based on unsupervised deep learning
CN114757856A (en) * 2022-06-16 2022-07-15 深圳深知未来智能有限公司 Automatic white balance algorithm and system based on unsupervised deep learning
CN115550544A (en) * 2022-08-19 2022-12-30 荣耀终端有限公司 Image processing method and device
CN115550544B (en) * 2022-08-19 2024-04-12 荣耀终端有限公司 Image processing method and device
US12363269B2 (en) * 2022-11-09 2025-07-15 Nvidia Corporation Deferred color correction in image processing pipelines for autonomous systems and applications
CN116908093A (en) * 2023-06-06 2023-10-20 迈宝嘉成(苏州)网络科技有限公司 Flaw identification system for goods
CN116908093B (en) * 2023-06-06 2024-03-19 迈宝嘉成(苏州)网络科技有限公司 Flaw identification system for goods
CN119094711A (en) * 2024-07-18 2024-12-06 昆山丘钛光电科技有限公司 A control method, device, medium and equipment for white balance synchronization

Similar Documents

Publication Title
CN112598594A (en) Color consistency correction method and related device
CN109917956B (en) A method and electronic device for controlling screen display
CN109814766B (en) Application display method and electronic device
CN115460318B (en) Display method and related device
CN111768416B (en) Photo cutting method and device
CN111182614B (en) Method and apparatus for establishing network connection and electronic device
CN114579016A (en) A method, electronic device and system for sharing an input device
CN112506386A (en) Display method of folding screen and electronic equipment
CN110830645B (en) Operation method, electronic equipment and computer storage medium
CN111553846B (en) Super-resolution processing method and device
CN110401768B (en) Method and device for adjusting working state of electronic equipment
CN112860428A (en) High-energy-efficiency display processing method and equipment
CN114115619A (en) Method and electronic device for displaying application program interface
CN114115769A (en) A display method and electronic device
WO2021057277A1 (en) Photographing method in dark light and electronic device
CN113115439A (en) Positioning method and related equipment
CN113676238A (en) Arrival angle determination method and related products
EP4254938A1 (en) Electronic device and operation method therefor
CN111768352A (en) Image processing method and device
US20240045557A1 (en) Method for Sharing Input Device, Electronic Device, and System
CN110780929A (en) Method and electronic device for calling hardware interface
CN111612723B (en) Image restoration method and device
CN113452969A (en) Image processing method and device
CN116048323B (en) Image processing method and electronic equipment
CN113781959B (en) Interface processing method and device

Legal Events

Code Title
PB01 Publication
SE01 Entry into force of request for substantive examination
AD01 Patent right deemed abandoned (effective date of abandoning: 2024-12-27)