
CN110288625B - Method and apparatus for processing image - Google Patents


Info

Publication number
CN110288625B
CN110288625B (application CN201910598165.6A)
Authority
CN
China
Prior art keywords: pixel, pixel point, image, value, confidence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910598165.6A
Other languages
Chinese (zh)
Other versions
CN110288625A (en)
Inventor
邓涵 (Deng Han)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Douyin Vision Co Ltd
Douyin Vision Beijing Co Ltd
Original Assignee
Beijing ByteDance Network Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing ByteDance Network Technology Co Ltd filed Critical Beijing ByteDance Network Technology Co Ltd
Priority to CN201910598165.6A priority Critical patent/CN110288625B/en
Publication of CN110288625A publication Critical patent/CN110288625A/en
Application granted granted Critical
Publication of CN110288625B publication Critical patent/CN110288625B/en
Legal status: Active

Classifications

    • G PHYSICS > G06 COMPUTING OR CALCULATING; COUNTING > G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL > G06T7/00 Image analysis > G06T7/10 Segmentation; Edge detection
    • G06T7/11 Region-based segmentation
    • G06T7/143 Segmentation; Edge detection involving probabilistic approaches, e.g. Markov random field [MRF] modelling
    • G06T7/174 Segmentation; Edge detection involving the use of two or more images
    • G06T7/194 Segmentation; Edge detection involving foreground-background segmentation
    • G06T2207/00 Indexing scheme for image analysis or image enhancement > G06T2207/20 Special algorithmic details
    • G06T2207/20076 Probabilistic image processing
    • G06T2207/20084 Artificial neural networks [ANN]

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Probability & Statistics with Applications (AREA)
  • Software Systems (AREA)
  • Image Analysis (AREA)

Abstract

Embodiments of the present disclosure disclose methods and apparatus for processing images. One embodiment of the method comprises: acquiring a first segmented image of a first image and a second segmented image of the frame preceding the first image, wherein the pixel values of the pixel points of both segmented images are proportional to the probability that the pixel points belong to the foreground; for each pixel point of the first segmented image whose pixel value belongs to a second preset interval, determining the confidence of the pixel value of the pixel point as a first confidence, wherein the first confidence is positively correlated with the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval; determining, according to the first confidence, the weight of the pixel point and the weight of the second pixel point of the second segmented image located at the corresponding position; and determining the weighted sum of the pixel value of the pixel point and the pixel value of the second pixel point as the new pixel value of the pixel point. This embodiment reduces fluctuation between the segmented images of two adjacent frames.

Description

Method and apparatus for processing image
Technical Field
Embodiments of the present disclosure relate to the field of computer technologies, and in particular, to a method and an apparatus for processing an image.
Background
Image segmentation may refer to the techniques and processes that divide an image into several distinct regions with unique properties and extract the objects of interest. Image segmentation is generally a key step from image processing to image analysis.
Many image segmentation methods already exist, for example threshold-based, region-based, edge-based, and neural-network-based segmentation methods.
In some application scenarios, due to problems such as insufficient or unevenly distributed data, the segmentation results of two adjacent frames of a video may differ greatly, making the segmentation unstable.
Disclosure of Invention
Embodiments of the present disclosure propose methods and apparatuses for processing an image.
In a first aspect, an embodiment of the present disclosure provides a method for processing an image, the method including: acquiring a first segmented image obtained by image segmentation of a first image and a second segmented image obtained by image segmentation of the frame preceding the first image, wherein the pixel values of the pixel points of the first and second segmented images are proportional to the probability that the pixel points belong to the foreground, and the pixel values of both segmented images take values in a first preset interval; and, for each pixel point of the first segmented image whose pixel value belongs to a second preset interval, the second preset interval being a sub-interval of the first preset interval, executing the following steps: determining the confidence of the pixel value of the pixel point as a first confidence, wherein the first confidence is positively correlated with the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval; determining, according to the first confidence, the weight of the pixel point and the weight of the second pixel point of the second segmented image located at the corresponding position, wherein the weight of the pixel point is positively correlated with the first confidence and the sum of the two weights is 1; and determining the weighted sum of the pixel value of the pixel point and the pixel value of the second pixel point as the new pixel value of the pixel point.
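As an illustration only (not the claimed implementation), the following minimal sketch shows one possible reading of this fusion step, assuming 8-bit segmentation maps (a first preset interval of [0, 255]) and a hypothetical second preset interval of [50, 200]; all function and variable names are invented for the example.

```python
import numpy as np

def fuse_segmentations(first_seg: np.ndarray, second_seg: np.ndarray,
                       low: int = 50, high: int = 200) -> np.ndarray:
    """Blend the current frame's segmented image with the previous frame's.

    first_seg, second_seg: uint8 arrays of the same shape whose values are
    proportional to the foreground probability of each pixel point.
    """
    first = first_seg.astype(np.float32)
    second = second_seg.astype(np.float32)
    midpoint = 255.0 / 2.0

    # First confidence: normalized distance from the interval midpoint.
    confidence = np.abs(first - midpoint) / midpoint  # in [0, 1]

    # The current pixel's weight grows with its confidence; the previous
    # frame's pixel receives the complementary weight, so the two sum to 1.
    fused = confidence * first + (1.0 - confidence) * second

    # Only pixel points whose values fall in the second preset interval
    # are updated; confident pixels keep their original values.
    out = first.copy()
    mask = (first_seg >= low) & (first_seg <= high)
    out[mask] = fused[mask]
    return np.clip(out, 0.0, 255.0).astype(np.uint8)
```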
In some embodiments, determining the confidence of the pixel value of the pixel point as the first confidence includes: determining the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval; and normalizing the determined absolute value to obtain the first confidence.
In some embodiments, determining, according to the first confidence, the weight of the pixel point and the weight of the second pixel point of the second segmented image located at the corresponding position includes: acquiring the gradient value of the third pixel point of the first image corresponding to the pixel point; determining, according to the gradient value, a confidence for characterizing that the third pixel point belongs to an edge pixel point as a second confidence, wherein the second confidence is positively correlated with the gradient value; and determining the weight of the pixel point and the weight of the second pixel point according to the first confidence and the second confidence, wherein the weight of the pixel point is negatively correlated with the second confidence.
In some embodiments, determining, according to the gradient value, the confidence for characterizing that the third pixel point belongs to an edge pixel point as the second confidence includes: normalizing the gradient value, and determining the normalized gradient value as the second confidence.
In some embodiments, determining the weight of the pixel point and the weight of the second pixel point according to the first confidence and the second confidence includes: determining the difference between one and the second confidence; and determining the weight of the pixel point and the weight of the second pixel point according to the product of the first confidence and the determined difference.
In some embodiments, determining the weight of the pixel point and the weight of the second pixel point based on the product of the first confidence and the determined difference comprises: determining the function value of a preset activation function corresponding to the product of the first confidence and the determined difference as the weight of the pixel point.
In some embodiments, the activation function is a Sigmoid function.
In some embodiments, the second preset interval is a proper sub-interval of the first preset interval, the difference between the upper limit of the first preset interval and the upper limit of the second preset interval is smaller than a first preset threshold, and the difference between the lower limit of the second preset interval and the lower limit of the first preset interval is smaller than a second preset threshold.
In a second aspect, an embodiment of the present disclosure provides an apparatus for processing an image, the apparatus including an acquisition unit and a determining unit. The acquisition unit is configured to acquire a first segmented image obtained by image segmentation of a first image and a second segmented image obtained by image segmentation of the frame preceding the first image, wherein the pixel values of the pixel points of the first and second segmented images are proportional to the probability that the pixel points belong to the foreground, and the pixel values of both segmented images take values in a first preset interval. The determining unit is configured to execute the following steps for each pixel point of the first segmented image whose pixel value belongs to a second preset interval, the second preset interval being a sub-interval of the first preset interval: determining the confidence of the pixel value of the pixel point as a first confidence, wherein the first confidence is positively correlated with the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval; determining, according to the first confidence, the weight of the pixel point and the weight of the second pixel point of the second segmented image located at the corresponding position, wherein the weight of the pixel point is positively correlated with the first confidence and the sum of the two weights is 1; and determining the weighted sum of the pixel value of the pixel point and the pixel value of the second pixel point as the new pixel value of the pixel point.
In some embodiments, the determining unit is further configured to determine the confidence of the pixel value of the pixel point as the first confidence by: determining the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval; and normalizing the determined absolute value to obtain the first confidence.
In some embodiments, the determining unit is further configured to determine, according to the first confidence, the weight of the pixel point and the weight of the second pixel point of the second segmented image located at the corresponding position by: acquiring the gradient value of the third pixel point of the first image corresponding to the pixel point; determining, according to the gradient value, a confidence for characterizing that the third pixel point belongs to an edge pixel point as a second confidence, wherein the second confidence is positively correlated with the gradient value; and determining the weight of the pixel point and the weight of the second pixel point according to the first confidence and the second confidence, wherein the weight of the pixel point is negatively correlated with the second confidence.
In some embodiments, the determining unit is further configured to determine, according to the gradient value, the confidence for characterizing that the third pixel point belongs to an edge pixel point as the second confidence by: normalizing the gradient value, and determining the normalized gradient value as the second confidence.
In some embodiments, the determining unit is further configured to: determine the difference between one and the second confidence; and determine the weight of the pixel point and the weight of the second pixel point according to the product of the first confidence and the determined difference.
In some embodiments, the determining unit is further configured to: determine the function value of a preset activation function corresponding to the product of the first confidence and the determined difference as the weight of the pixel point.
In some embodiments, the activation function is a Sigmoid function.
In some embodiments, the second preset interval is a proper sub-interval of the first preset interval, the difference between the upper limit of the first preset interval and the upper limit of the second preset interval is smaller than a first preset threshold, and the difference between the lower limit of the second preset interval and the lower limit of the first preset interval is smaller than a second preset threshold.
In a third aspect, an embodiment of the present disclosure provides an electronic device, including: one or more graphics processors; storage means for storing one or more programs; when the one or more programs are executed by the one or more graphics processors, the one or more graphics processors are caused to implement a method as described in any implementation of the first aspect.
In a fourth aspect, embodiments of the present disclosure provide a computer-readable medium on which a computer program is stored, which computer program, when executed by a graphics processor, implements the method as described in any of the implementations of the first aspect.
According to the method and apparatus for processing an image provided by embodiments of the present disclosure, the confidence of the pixel values of the segmented image of the current frame is determined; from this confidence, the weights of the corresponding pixel points in the segmented images of the current frame and the previous frame are determined; and the pixel values of the current frame's segmented image are then set to the weighted sum of the pixel values of the corresponding pixel points of the two segmented images. This reduces the fluctuation between the segmented images of adjacent frames.
Drawings
Other features, objects and advantages of the disclosure will become more apparent upon reading of the following detailed description of non-limiting embodiments thereof, made with reference to the accompanying drawings in which:
FIG. 1 is an exemplary system architecture diagram in which one embodiment of the present disclosure may be applied;
FIG. 2 is a flow diagram of one embodiment of a method for processing an image according to the present disclosure;
FIG. 3 is a schematic illustration of one application scenario of a method for processing an image according to an embodiment of the present disclosure;
FIG. 4 is a flow diagram of yet another embodiment of a method for processing an image according to the present disclosure;
FIG. 5 is a schematic block diagram of one embodiment of an apparatus for processing images according to the present disclosure;
FIG. 6 is a schematic structural diagram of an electronic device suitable for use in implementing embodiments of the present disclosure.
Detailed Description
The present disclosure is described in further detail below with reference to the accompanying drawings and examples. It is to be understood that the specific embodiments described herein are merely illustrative of the relevant invention and not restrictive of the invention. It should be noted that, for convenience of description, only the portions related to the related invention are shown in the drawings.
It should be noted that, in the present disclosure, the embodiments and features of the embodiments may be combined with each other without conflict. The present disclosure will be described in detail below with reference to the accompanying drawings in conjunction with embodiments.
Fig. 1 illustrates an exemplary architecture 100 to which embodiments of the method for processing images or the apparatus for processing images of the present disclosure may be applied.
As shown in fig. 1, the system architecture 100 may include terminal devices 101, 102, 103, a network 104, and a server 105. The network 104 serves as a medium for providing communication links between the terminal devices 101, 102, 103 and the server 105. Network 104 may include various connection types, such as wired, wireless communication links, or fiber optic cables, to name a few.
The terminal devices 101, 102, 103 interact with a server 105 via a network 104 to receive or send messages or the like. Various client applications may be installed on the terminal devices 101, 102, 103. Such as browser-like applications, search-like applications, social platform software, image processing-like applications, video processing-like applications, and so forth.
The terminal apparatuses 101, 102, and 103 may be hardware or software. When the terminal devices 101, 102, 103 are hardware, they may be various electronic devices including, but not limited to, smart phones, tablet computers, e-book readers, laptop portable computers, desktop computers, and the like. When the terminal apparatuses 101, 102, 103 are software, they can be installed in the electronic apparatuses listed above. It may be implemented as multiple pieces of software or software modules (e.g., multiple pieces of software or software modules to provide distributed services) or as a single piece of software or software module. And is not particularly limited herein.
The server 105 may be a server that provides various services, such as a backend server that provides support for client applications installed on the terminal devices 101, 102, 103. The server 105 may perform image analysis and other processing on the images uploaded by the clients, and generate processing results to feed back to the terminal devices 101, 102, and 103.
It should be noted that the image to be processed may also be stored locally on the server 105, in which case the server 105 may directly extract and process the locally stored image, and the terminal devices 101, 102, 103 and the network 104 may be absent.
It should be noted that the method for processing an image provided by the embodiment of the present disclosure is generally performed by the server 105, and accordingly, the apparatus for processing an image is generally disposed in the server 105.
It should be noted that the terminal devices 101, 102, and 103 may also be installed with image processing applications, and the terminal devices 101, 102, and 103 may also process images based on the image processing applications, in this case, the method for processing images may also be executed by the terminal devices 101, 102, and 103, and accordingly, the apparatus for processing images may also be installed in the terminal devices 101, 102, and 103. At this point, the exemplary system architecture 100 may not have the server 105 and the network 104.
The server 105 may be hardware or software. When the server 105 is hardware, it may be implemented as a distributed server cluster composed of a plurality of servers, or may be implemented as a single server. When the server 105 is software, it may be implemented as multiple pieces of software or software modules (e.g., multiple pieces of software or software modules used to provide distributed services), or as a single piece of software or software module. And is not particularly limited herein.
It should be understood that the number of terminal devices, networks, and servers in fig. 1 is merely illustrative. There may be any number of terminal devices, networks, and servers, as desired for implementation.
With continued reference to FIG. 2, a flow 200 of one embodiment of a method for processing an image in accordance with the present disclosure is shown. The method for processing an image comprises the steps of:
step 201, obtaining a first segmentation image obtained by image segmentation of the first image, and obtaining a second segmentation image obtained by image segmentation of a frame of image before the first image.
In this embodiment, the pixel values of the pixels of the first segmented image and the second segmented image are proportional to the probability that the pixel belongs to the foreground, and the value ranges of the pixel values of the pixels of the first segmented image and the second segmented image are within a first preset interval.
Here, the foreground may refer to the image region of interest to the technician, and the background to the remaining image regions. The pixel values of the pixel points of the first and second segmented images may be positively or negatively correlated with the probability that the pixel points belong to the foreground.
It should be noted that, if the pixel value of the pixel point of the first segmented image is positively correlated with the probability that the pixel point belongs to the foreground, the pixel value of the pixel point of the second segmented image should also be positively correlated with the probability that the pixel point belongs to the foreground. If the pixel value of the pixel point of the first segmented image is negatively correlated with the probability that the pixel point belongs to the foreground, the pixel value of the pixel point of the second segmented image should be negatively correlated with the probability that the pixel point belongs to the foreground.
The first preset interval may be preset by a technician according to application requirements. For example, the first preset interval may be [0, 255].
Taking as an example the case where the pixel value of a pixel point of the segmented image is positively correlated with the probability that the pixel point belongs to the foreground, and the first preset interval is [0, 255]: the closer the pixel value is to 0, the more likely the pixel point belongs to the background; correspondingly, the closer the pixel value is to 255, the more likely the pixel point belongs to the foreground.
In this embodiment, the first segmented image may be obtained by image segmentation of the first image, and the second segmented image by image segmentation of the frame preceding the first image. The frame preceding the first image and the first image are two adjacent frames of the same video.
It should be understood that the image segmentation method used to obtain the first and second segmented images determines, for each pixel point of an image, the probability that it belongs to the foreground, and then sets the pixel value of each pixel point according to its probability, thereby generating the segmented image.
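Purely as an illustration (the disclosure does not prescribe any particular model), such a probability-to-pixel-value mapping could look like the following sketch, assuming positive correlation and a first preset interval of [0, 255]:

```python
import numpy as np

# Sketch: build a segmented image from per-pixel foreground probabilities,
# assuming positive correlation and a first preset interval of [0, 255].
def probabilities_to_segmented_image(foreground_probs: np.ndarray) -> np.ndarray:
    """foreground_probs: float array in [0, 1], one probability per pixel point."""
    return np.clip(foreground_probs * 255.0, 0.0, 255.0).astype(np.uint8)
```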
In this embodiment, the executing entity (e.g., server 105 shown in fig. 1) of the method for processing an image may acquire the first and second segmented images from a local or other storage device (e.g., terminal devices 101, 102, 103 shown in fig. 1). Of course, the executing entity may also directly acquire the first segmented image and the second segmented image from some open-source data sets.
Alternatively, the executing entity may obtain a video to be processed in advance, acquire the images of two adjacent frames from it, and use the later frame as the first image. The executing entity may then perform image segmentation on the two acquired images using any of various existing image segmentation algorithms to obtain the first segmented image and the second segmented image.
Step 202, for the pixel points of the first segmented image, whose corresponding pixel values belong to the second preset interval, the following steps are executed:
in this step, the second preset interval may be a sub-interval of the first preset interval. The second preset interval may be preset by a technician.
Step 2021, determining the confidence of the pixel value of the pixel point as the first confidence.
In this step, the confidence of the pixel value of the pixel point may be used to characterize how trustworthy that pixel value is. The first confidence may be positively correlated with the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval.
Since the pixel value of a pixel point is proportional to the probability that the pixel point belongs to the foreground, the closer the pixel value is to the midpoint value of the first preset interval, the closer the probabilities of the pixel point belonging to the foreground and to the background.
Again taking positive correlation between pixel value and foreground probability, with a first preset interval of [0, 255], as an example: if the pixel value of a pixel point is 128, the probabilities of the pixel point belonging to the foreground and to the background are nearly equal; that is, the confidence is relatively low, and it cannot be reliably judged whether the pixel point belongs to the foreground or the background. The closer the pixel value is to 0, the higher the confidence that the pixel point belongs to the background; the closer the pixel value is to 255, the higher the confidence that it belongs to the foreground.
Therefore, the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval can characterize the confidence of the pixel value: the greater this absolute value, the higher the confidence of the pixel value.
Optionally, the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval may be determined and then normalized to obtain the first confidence.
Of course, depending on the specific application scenario, some transformation (for example, multiplication by an adjustment coefficient) may instead be applied to this absolute value, with the transformed result used as the first confidence. The specific calculation method can be set flexibly.
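For illustration, a sketch of this first-confidence computation under the assumptions above; the optional `scale` factor stands in for the transformation just mentioned, and all names are hypothetical:

```python
# Sketch of the first confidence: normalized distance of a pixel value from
# the midpoint of the first preset interval (assumed [0, 255] by default).
def first_confidence(pixel_value: float, lower: float = 0.0,
                     upper: float = 255.0, scale: float = 1.0) -> float:
    midpoint = (lower + upper) / 2.0
    half_range = (upper - lower) / 2.0
    confidence = abs(pixel_value - midpoint) / half_range  # normalized to [0, 1]
    return min(1.0, scale * confidence)  # optional adjustment, then clamp
```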
Step 2022, according to the first confidence, determining the weight of the pixel point and the weight of the second pixel point corresponding to the position of the pixel point of the second segmentation image.
In this step, the first and second segmented images are generally the same size. Therefore, each pixel point of the first segmentation image corresponds to each pixel point of the second segmentation image one to one according to the position on the image. The weight of the pixel point may be positively correlated with the first confidence, and the sum of the weight of the second pixel point and the weight of the pixel point may be 1.
Step 2023, determine the weighted sum of the pixel value of the pixel point and the pixel value of the second pixel point as the new pixel value of the pixel point.
In this step, since the weight of the pixel point is positively correlated with the first confidence, a larger first confidence yields a larger weight for the pixel point and a correspondingly smaller weight for the second pixel point. Updating the pixel value of the pixel point with the weighted sum of its pixel value and that of the second pixel point fuses the segmentation result of the pixel point with the segmentation result of the corresponding pixel point in the previous frame, thereby reducing the difference between the two segmentation results.
Optionally, the second preset interval may be a proper sub-interval of the first preset interval, with the difference between the upper limit of the first preset interval and the upper limit of the second preset interval smaller than a first preset threshold, and the difference between the lower limit of the second preset interval and the lower limit of the first preset interval smaller than a second preset threshold.
The first preset threshold and the second preset threshold may be preset by a technician. Pixel points of the first segmented image whose values are close to the upper or lower limit of the first preset interval belong to the foreground or the background with very high probability; that is, the segmentation results of these pixel points are already highly accurate. Therefore, for these pixel points, the segmentation results of the corresponding pixel points in the previous frame's segmented image are not fused in; only the pixel values of the pixel points of the first segmented image that cannot be confidently classified as foreground or background are updated. In this way, the accuracy of the first segmented image is preserved, the difference between the first and second segmented images is kept from becoming too large, and the processing speed is improved.
With continued reference to fig. 3, fig. 3 is a schematic diagram 300 of an application scenario of the method for processing an image according to the present embodiment. In the application scenario of fig. 3, the executing entity may acquire a video 301 in advance, and then select images corresponding to two adjacent frames from the video. The image corresponding to the previous frame is the first image 302, and the image corresponding to the next frame is the second image 303. Then, the first image 302 and the second image 303 can be respectively input to an image segmentation network 304 for image segmentation, so as to respectively obtain a first segmented image 305 corresponding to the first image 302 and a second segmented image 306 corresponding to the second image 303.
Taking as an example the case where the pixel values of the pixel points of the first and second segmented images both lie between 0 and 255 and are positively correlated with the probability of belonging to the foreground, the pixel points whose values lie between 50 and 200 may be selected from the first segmented image, and the pixel values of the selected pixel points updated.
Taking the selected pixel point 307 with pixel value "A1" as an example, the absolute value "C1" of the difference between "A1" and the midpoint 128 of [0, 255] may be determined as the first confidence of pixel point 307, and this first confidence normalized to obtain the first weight "W1" of pixel point 307. If pixel point 308 of the second segmented image corresponds to the position of pixel point 307, and the pixel value of pixel point 308 is "A2", the second weight of pixel point 308 is "1 - W1". Then "W1 × A1 + (1 - W1) × A2" may be calculated as the updated pixel value "A3" of pixel point 307.
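Under the same assumptions, this worked example can be reproduced in a few lines; the values of A1 and A2 below are arbitrary placeholders, not values from the disclosure:

```python
# Reproducing the worked example above with arbitrary placeholder values.
A1, A2 = 180, 150                 # pixel values of pixel points 307 and 308
C1 = abs(A1 - 128)                # absolute distance from the midpoint 128
W1 = C1 / 128                     # normalized first confidence, used as weight
A3 = W1 * A1 + (1 - W1) * A2      # fused pixel value of pixel point 307
print(round(A3))                  # -> 162 for these placeholder values
```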
The method provided by this embodiment of the disclosure determines the weight of a pixel point from the confidence of its pixel value in the first segmented image, ensuring that higher confidence yields a larger weight, and then fuses the pixel values of the corresponding pixel points of the first and second segmented images according to those weights. This better preserves the segmentation result represented by the first segmented image while reducing the difference between the first and second segmented images.
With further reference to FIG. 4, a flow 400 of yet another embodiment of a method for processing an image is shown. The flow 400 of the method for processing an image comprises the steps of:
step 401, obtaining a first segmented image obtained by image segmentation of the first image, and obtaining a second segmented image obtained by image segmentation of an image of a frame previous to the first image.
The specific implementation process of this step can refer to the related description of step 201 in the corresponding embodiment of fig. 2, and is not repeated here.
Step 402, for the pixel points of the first segmented image, whose corresponding pixel values belong to the second preset interval, executing the following steps:
step 4021, determining a confidence of the pixel value of the pixel point as a first confidence.
The specific implementation process of this step can refer to the related description of step 2021 in the corresponding embodiment of fig. 2, and is not repeated here.
Step 4022, obtaining a gradient value of a third pixel point of the first image corresponding to the pixel point.
In this step, the first image and the first segmented image generally have the same size, and their pixel points correspond one-to-one by position on the image. The gradient value of the third pixel point can be determined in advance using any of various existing gradient calculation methods.
Step 4023, determining a confidence for characterizing that the third pixel point belongs to an edge pixel point as a second confidence according to the gradient value.
In this step, the confidence that the third pixel point belongs to an edge pixel point can be used to characterize how trustworthy that classification is. The second confidence may be positively correlated with the gradient value. Generally, the gradient is larger at edge pixel points; therefore, the larger the gradient value, the more likely the pixel point is an edge pixel point, and the larger the corresponding second confidence.
Optionally, after the gradient value is obtained, it may be normalized, and the normalized gradient value determined as the second confidence.
It should be understood that, after the gradient value is obtained, some transformation (e.g., multiplication by an adjustment coefficient) may instead be applied, with the transformed gradient value used as the second confidence. The specific method for calculating the second confidence can be set flexibly.
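As a sketch only, assuming gradients taken on the grayscale first image with finite differences and normalized by the maximum gradient magnitude (one of many admissible choices):

```python
import numpy as np

# Sketch of the second confidence: normalized gradient magnitude of the
# grayscale first image, larger near edges.
def second_confidence(gray_image: np.ndarray) -> np.ndarray:
    gy, gx = np.gradient(gray_image.astype(np.float32))
    magnitude = np.hypot(gx, gy)
    max_mag = magnitude.max()
    if max_mag == 0:
        return np.zeros_like(magnitude)  # flat image: nothing is an edge
    return magnitude / max_mag           # normalized to [0, 1]
```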
Step 4024, determining the weight of the pixel point and the weight of the second pixel point according to the first confidence degree and the second confidence degree.
In this step, when the first image is segmented, it is comparatively difficult to judge accurately whether edge pixel points belong to the foreground or the background. Therefore, the segmentation result for an edge pixel point, i.e., the pixel value of the corresponding pixel point of the first segmented image, is usually close to the midpoint value of the first preset interval.
In other words, if the pixel value of a pixel point is close to the midpoint value of the first preset interval, the pixel point may genuinely be ambiguous between foreground and background, or it may correspond to an edge pixel point.
Differences between the segmentation results of two adjacent frames may be caused both by pixel points whose foreground/background attribution cannot be judged accurately and by differing segmentation results at edge pixel points. Therefore, when the segmentation results of the corresponding pixel points in the previous frame are fused in, a lower weight can be assigned to edge pixel points. On this basis, the weight of the pixel point may be negatively correlated with the second confidence; that is, the more likely a pixel point is an edge pixel point, the smaller its weight should be during fusion.
In this way, when the segmentation result of a pixel point is updated, excessive adjustment of edge pixel points is avoided to a certain extent, so that the accuracy of the segmentation result of the first segmented image is preserved while the difference between the first and second segmented images is reduced.
On this basis, various weighting schemes can be set flexibly so that the weight of the pixel point is positively correlated with the first confidence and negatively correlated with the second confidence. For example, the difference between the first confidence and the second confidence may be used as an exponent, with e as the base, to determine the weight of the pixel point.
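A sketch of this e-based variant is given below; the text does not specify how the result is rescaled so that the two weights sum to 1, so the clamp here is purely an assumption:

```python
import math

# Sketch of the e-based weighting: e raised to the difference of the two
# confidences. The clamp to [0, 1] is an assumption, not from the text.
def exp_weight(first_conf: float, second_conf: float) -> float:
    raw = math.exp(first_conf - second_conf)
    return min(raw, 1.0)  # keep the complementary weight non-negative
```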
Alternatively, the difference between one and the second confidence (the normalized gradient value) may be determined, and the weight of the pixel point and the weight of the second pixel point may then be determined from the product of the first confidence and this difference.
For example, the obtained product may be used directly as the weight of the pixel point, with the difference between one and the product used as the weight of the second pixel point.
Optionally, the function value of a preset activation function, applied to the product of the first confidence and the determined difference, may be determined as the weight of the pixel point. The preset activation function can be chosen in advance by a technician according to application requirements. For example, the activation function may be a Sigmoid function, a tanh function, a ReLU function, or the like.
Using an activation function to determine the weight of the pixel point introduces nonlinearity, which helps the first segmented image to be updated more accurately.
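A sketch of this weighting, assuming the Sigmoid variant and the product form described above; the function name is invented for the example:

```python
import numpy as np

# Sketch: weight of the current pixel point as a Sigmoid of the product of
# the first confidence and (1 - second confidence).
def fusion_weight(first_conf: np.ndarray, second_conf: np.ndarray) -> np.ndarray:
    x = first_conf * (1.0 - second_conf)  # large when confident and non-edge
    w = 1.0 / (1.0 + np.exp(-x))          # Sigmoid activation
    return w                              # second pixel point's weight is 1 - w
```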
Step 4025, determining the weighted sum of the pixel value of the pixel point and the pixel value of the second pixel point as the new pixel value of the pixel point.
The specific implementation process of this step can refer to the related description of step 2023 in the corresponding embodiment of fig. 2, and is not repeated here.
In the method for processing an image of this embodiment, when the segmentation result of a pixel point is updated, its weight is determined jointly from its pixel value and its gradient value. This avoids excessive adjustment of edge pixel points, reducing the difference between the first and second segmented images while preserving the accuracy of the segmentation result of the first segmented image.
With further reference to fig. 5, as an implementation of the methods shown in the above figures, the present disclosure provides an embodiment of an apparatus for processing an image, which corresponds to the method embodiment shown in fig. 2, and which is particularly applicable in various electronic devices.
As shown in fig. 5, the apparatus 500 for processing an image provided by the present embodiment includes an acquisition unit 501 and a determination unit 502. The acquisition unit 501 is configured to acquire a first segmented image obtained by image segmentation of a first image and a second segmented image obtained by image segmentation of the frame preceding the first image, wherein the pixel values of the pixel points of the first and second segmented images are proportional to the probability that the pixel points belong to the foreground, and the pixel values of both segmented images take values in a first preset interval. The determining unit 502 is configured to execute the following steps for each pixel point of the first segmented image whose pixel value belongs to a second preset interval, the second preset interval being a sub-interval of the first preset interval: determining the confidence of the pixel value of the pixel point as a first confidence, wherein the first confidence is positively correlated with the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval; determining, according to the first confidence, the weight of the pixel point and the weight of the second pixel point of the second segmented image located at the corresponding position, wherein the weight of the pixel point is positively correlated with the first confidence and the sum of the two weights is 1; and determining the weighted sum of the pixel value of the pixel point and the pixel value of the second pixel point as the new pixel value of the pixel point.
In the present embodiment, in the apparatus 500 for processing an image: the specific processing of the obtaining unit 501 and the determining unit 502 and the technical effects thereof can refer to the related descriptions of step 201 and step 202 in the corresponding embodiment of fig. 2, which are not repeated herein.
In some optional implementations of the present embodiment, the determining unit 502 is further configured to determine the confidence of the pixel value of the pixel point as the first confidence by: determining the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval; and normalizing the determined absolute value to obtain the first confidence.
In some optional implementations of this embodiment, the determining unit 502 is further configured to determine, according to the first confidence, the weight of the pixel point and the weight of the second pixel point of the second segmented image located at the corresponding position by: acquiring the gradient value of the third pixel point of the first image corresponding to the pixel point; determining, according to the gradient value, a confidence for characterizing that the third pixel point belongs to an edge pixel point as a second confidence, wherein the second confidence is positively correlated with the gradient value; and determining the weight of the pixel point and the weight of the second pixel point according to the first confidence and the second confidence, wherein the weight of the pixel point is negatively correlated with the second confidence.
In some optional implementations of this embodiment, the determining unit 502 is further configured to determine, according to the gradient value, the confidence that the pixel point belongs to an edge pixel point as the second confidence by: normalizing the gradient value, and determining the normalized gradient value as the second confidence.
In some optional implementations of this embodiment, the determining unit 502 is further configured to determine the weight of the pixel point and the weight of the second pixel point according to the first confidence and the second confidence by: determining the difference between one and the second confidence; and determining the weight of the pixel point and the weight of the second pixel point according to the product of the first confidence and the determined difference.
In some optional implementations of the present embodiment, the determining unit 502 is further configured to: determine the function value of a preset activation function corresponding to the product of the first confidence and the determined difference as the weight of the pixel point.
In some optional implementations of this embodiment, the activation function is a Sigmoid function.
In some optional implementations of this embodiment, the second preset interval is a proper sub-interval of the first preset interval, the difference between the upper limit of the first preset interval and the upper limit of the second preset interval is smaller than a first preset threshold, and the difference between the lower limit of the second preset interval and the lower limit of the first preset interval is smaller than a second preset threshold.
In the apparatus provided by the above embodiment of the present disclosure, the acquiring unit acquires a first segmented image obtained by image segmentation of a first image and a second segmented image obtained by image segmentation of the frame preceding the first image, wherein the pixel values of the pixel points of the first and second segmented images are proportional to the probability that the pixel points belong to the foreground, and the pixel values of both segmented images take values in a first preset interval. The determining unit executes the following steps for each pixel point of the first segmented image whose pixel value belongs to a second preset interval, the second preset interval being a sub-interval of the first preset interval: determining the confidence of the pixel value of the pixel point as a first confidence, wherein the first confidence is positively correlated with the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval; determining, according to the first confidence, the weight of the pixel point and the weight of the second pixel point of the second segmented image located at the corresponding position, wherein the weight of the pixel point is positively correlated with the first confidence and the sum of the two weights is 1; and determining the weighted sum of the pixel value of the pixel point and the pixel value of the second pixel point as the new pixel value of the pixel point. The fluctuation between the segmented images of the two adjacent frames is thereby reduced.
Referring now to FIG. 6, a schematic diagram of an electronic device (e.g., the server of FIG. 1) 600 suitable for use in implementing embodiments of the present disclosure is shown. The server shown in fig. 6 is only an example, and should not bring any limitation to the functions and the scope of use of the embodiments of the present disclosure.
As shown in fig. 6, electronic device 600 may include a processing means (e.g., central processing unit, graphics processor, etc.) 601 that may perform various appropriate actions and processes in accordance with a program stored in a Read Only Memory (ROM) 602 or a program loaded from a storage means 608 into a Random Access Memory (RAM) 603. In the RAM 603, various programs and data necessary for the operation of the electronic device 600 are also stored. The processing device 601, the ROM 602, and the RAM 603 are connected to each other via a bus 604. An input/output (I/O) interface 605 is also connected to bus 604.
Generally, the following devices may be connected to the I/O interface 605: input devices 606 including, for example, a touch screen, touch pad, keyboard, mouse, camera, microphone, accelerometer, gyroscope, etc.; output devices 607 including, for example, a Liquid Crystal Display (LCD), a speaker, a vibrator, and the like; storage 608 including, for example, tape, hard disk, etc.; and a communication device 609. The communication means 609 may allow the electronic device 600 to communicate with other devices wirelessly or by wire to exchange data. While fig. 6 illustrates an electronic device 600 having various means, it is to be understood that not all illustrated means are required to be implemented or provided. More or fewer devices may alternatively be implemented or provided. Each block shown in fig. 6 may represent one device or may represent multiple devices as desired.
In particular, according to an embodiment of the present disclosure, the processes described above with reference to the flowcharts may be implemented as computer software programs. For example, embodiments of the present disclosure include a computer program product comprising a computer program embodied on a computer readable medium, the computer program comprising program code for performing the method illustrated in the flow chart. In such an embodiment, the computer program may be downloaded and installed from a network via the communication means 609, or may be installed from the storage means 608, or may be installed from the ROM 602. The computer program, when executed by the processing device 601, performs the above-described functions defined in the methods of embodiments of the present disclosure.
It should be noted that the computer readable medium described in the embodiments of the present disclosure may be a computer readable signal medium or a computer readable storage medium or any combination of the two. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the foregoing. More specific examples of the computer readable storage medium may include, but are not limited to: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a Random Access Memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In embodiments of the disclosure, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. In embodiments of the present disclosure, however, a computer readable signal medium may comprise a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated data signal may take many forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may also be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device. Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to: electrical wires, optical cables, RF (radio frequency), etc., or any suitable combination of the foregoing.
The computer readable medium may be included in the electronic device, or it may exist separately without being assembled into the electronic device. The computer readable medium carries one or more programs which, when executed by the electronic device, cause the electronic device to: acquire a first segmented image obtained by image segmentation of a first image and a second segmented image obtained by image segmentation of the frame preceding the first image, wherein the pixel values of the pixel points of the first and second segmented images are proportional to the probability that the pixel points belong to the foreground, and the pixel values of both segmented images take values in a first preset interval; and, for each pixel point of the first segmented image whose pixel value belongs to a second preset interval, the second preset interval being a sub-interval of the first preset interval, execute the following steps: determining the confidence of the pixel value of the pixel point as a first confidence, wherein the first confidence is positively correlated with the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval; determining, according to the first confidence, the weight of the pixel point and the weight of the second pixel point of the second segmented image located at the corresponding position, wherein the weight of the pixel point is positively correlated with the first confidence and the sum of the two weights is 1; and determining the weighted sum of the pixel value of the pixel point and the pixel value of the second pixel point as the new pixel value of the pixel point.
Computer program code for carrying out operations for embodiments of the present disclosure may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++, and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on the remote computer or server. In the case of a remote computer, the remote computer may be connected to the user's computer through any type of network, including a Local Area Network (LAN) or a Wide Area Network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet service provider).
The flowchart and block diagrams in the figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present disclosure. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The units described in the embodiments of the present disclosure may be implemented by software or by hardware. The described units may also be provided in a processor, which may be described as: a processor comprising an acquisition unit and a determination unit. In some cases, the names of these units do not limit the units themselves; for example, the acquisition unit may also be described as a unit that acquires a first segmentation image obtained by image segmentation of a first image and a second segmentation image obtained by image segmentation of a previous frame image of the first image.
The foregoing description is merely a description of the preferred embodiments of the present disclosure and of the technical principles employed. It will be appreciated by those skilled in the art that the scope of the invention in the embodiments of the present disclosure is not limited to technical solutions formed by the specific combination of the above features, and also encompasses other technical solutions formed by any combination of the above features or their equivalents without departing from the above inventive concept, for example, technical solutions formed by replacing the above features with (but not limited to) technical features having similar functions disclosed in the embodiments of the present disclosure.

Claims (18)

1. A method for processing an image, comprising:
acquiring a first segmentation image obtained by image segmentation of a first image and a second segmentation image obtained by image segmentation of a previous frame image of the first image, wherein pixel values of pixel points of the first segmentation image and the second segmentation image are proportional to the probability that the pixel points belong to the foreground, and the value range of the pixel values of the pixel points of the first segmentation image and the second segmentation image is a first preset interval;
for each pixel point of the first segmentation image whose pixel value belongs to a second preset interval, performing the following steps, wherein the second preset interval is a sub-interval of the first preset interval: determining the confidence of the pixel value of the pixel point as a first confidence, wherein the first confidence is positively correlated with the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval; determining the weight of the pixel point and the weight of a second pixel point of the second segmentation image corresponding to the position of the pixel point according to the first confidence, wherein the weight of the pixel point is positively correlated with the first confidence, and the sum of the weight of the second pixel point and the weight of the pixel point is 1; and determining the weighted sum of the pixel value of the pixel point and the pixel value of the second pixel point as a new pixel value of the pixel point.
2. The method of claim 1, wherein the determining the confidence of the pixel value of the pixel point as the first confidence comprises:
determining the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval;
and normalizing the determined absolute value to obtain the first confidence.
3. The method of claim 1, wherein the determining, according to the first confidence, the weight of the pixel point and the weight of the second pixel point of the second segmentation image corresponding to the position of the pixel point comprises:
acquiring a gradient value of a third pixel point of the first image corresponding to the pixel point;
determining, according to the gradient value, a confidence for characterizing that the third pixel point belongs to an edge pixel point as a second confidence, wherein the second confidence is positively correlated with the gradient value;
and determining the weight of the pixel point and the weight of the second pixel point according to the first confidence and the second confidence, wherein the weight of the pixel point is negatively correlated with the second confidence.
4. The method according to claim 3, wherein the determining, according to the gradient value, the confidence for characterizing that the third pixel point belongs to an edge pixel point as the second confidence comprises:
normalizing the gradient value, and determining the normalized gradient value as the second confidence.
5. The method of claim 4, wherein the determining the weight of the pixel point and the weight of the second pixel point according to the first confidence and the second confidence comprises:
determining a difference between 1 and the second confidence;
and determining the weight of the pixel point and the weight of the second pixel point according to the product of the first confidence and the determined difference.
6. The method of claim 5, wherein the determining the weight of the pixel point and the weight of the second pixel point according to the product of the first confidence and the determined difference comprises:
determining a function value of a preset activation function corresponding to the product of the first confidence and the determined difference as the weight of the pixel point.
7. The method of claim 6, wherein the activation function is a Sigmoid function.
8. The method of claim 1, wherein the second preset interval is a proper sub-interval of the first preset interval, a difference between an upper limit of the first preset interval and an upper limit of the second preset interval is smaller than a first preset threshold, and a difference between a lower limit of the second preset interval and a lower limit of the first preset interval is smaller than a second preset threshold.
9. An apparatus for processing an image, comprising:
the image segmentation device comprises an acquisition unit and a processing unit, wherein the acquisition unit is configured to acquire a first segmentation image obtained by image segmentation of a first image and acquire a second segmentation image obtained by image segmentation of a previous frame image of the first image, pixel values of pixel points of the first segmentation image and the second segmentation image are proportional to the probability that the pixel points belong to a foreground, and the value ranges of the pixel values of the pixel points of the first segmentation image and the second segmentation image are a first preset interval;
a determination unit configured to perform the following steps for each pixel point of the first segmentation image whose pixel value belongs to a second preset interval, wherein the second preset interval is a sub-interval of the first preset interval: determining the confidence of the pixel value of the pixel point as a first confidence, wherein the first confidence is positively correlated with the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval; determining the weight of the pixel point and the weight of a second pixel point of the second segmentation image corresponding to the position of the pixel point according to the first confidence, wherein the weight of the pixel point is positively correlated with the first confidence, and the sum of the weight of the second pixel point and the weight of the pixel point is 1; and determining the weighted sum of the pixel value of the pixel point and the pixel value of the second pixel point as a new pixel value of the pixel point.
10. The apparatus of claim 9, wherein the determining unit is further configured to:
determining the absolute value of the difference between the pixel value of the pixel point and the midpoint value of the first preset interval;
and normalizing the determined absolute value to obtain the first confidence.
11. The apparatus of claim 9, wherein the determining unit is further configured to:
acquiring a gradient value of a third pixel point of the first image corresponding to the pixel point;
determining, according to the gradient value, a confidence for characterizing that the third pixel point belongs to an edge pixel point as a second confidence, wherein the second confidence is positively correlated with the gradient value;
and determining the weight of the pixel point and the weight of the second pixel point according to the first confidence and the second confidence, wherein the weight of the pixel point is negatively correlated with the second confidence.
12. The apparatus of claim 11, wherein the determining unit is further configured to:
normalizing the gradient value, and determining the normalized gradient value as the second confidence.
13. The apparatus of claim 11, wherein the determining unit is further configured to:
determining a difference between 1 and the second confidence;
and determining the weight of the pixel point and the weight of the second pixel point according to the product of the first confidence and the determined difference.
14. The apparatus of claim 13, wherein the determining unit is further configured to:
determining a function value of a preset activation function corresponding to the product of the first confidence and the determined difference as the weight of the pixel point.
15. The apparatus of claim 14, wherein the activation function is a Sigmoid function.
16. The apparatus of claim 9, wherein the second preset interval is a proper sub-interval of the first preset interval, a difference between an upper limit of the first preset interval and an upper limit of the second preset interval is smaller than a first preset threshold, and a difference between a lower limit of the second preset interval and a lower limit of the first preset interval is smaller than a second preset threshold.
17. An electronic device, comprising:
one or more graphics processors;
a storage device having one or more programs stored thereon;
wherein the one or more programs, when executed by the one or more graphics processors, cause the one or more graphics processors to implement the method of any one of claims 1-8.
18. A computer-readable medium, on which a computer program is stored, wherein the program, when executed by a graphics processor, implements the method according to any one of claims 1-8.
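To illustrate the edge-aware weighting recited in claims 3 to 7 above, the following is a hedged Python sketch that extends the blend of claim 1. It assumes the gradient value of the third pixel point is a Sobel gradient magnitude of a grayscale first image and that normalization divides by the maximum gradient in the image; the Sobel operator, the normalization scheme, and all names are illustrative assumptions rather than requirements of the claims.

```python
import numpy as np
from scipy import ndimage

def edge_aware_weight(first_seg: np.ndarray, first_image: np.ndarray) -> np.ndarray:
    """Per-pixel weight of the first segmentation image per claims 3-7 (sketch)."""
    # First confidence (claim 2): normalized absolute difference between the
    # pixel value and the midpoint of the first preset interval, here [0, 1].
    first_conf = np.abs(first_seg - 0.5) / 0.5
    # Gradient value of the corresponding third pixel point of the first
    # image (assumed: Sobel gradient magnitude of a grayscale image).
    gray = first_image.astype(float)
    gx = ndimage.sobel(gray, axis=1)
    gy = ndimage.sobel(gray, axis=0)
    grad = np.hypot(gx, gy)
    # Second confidence (claim 4): the normalized gradient value, positively
    # correlated with the gradient value.
    second_conf = grad / (grad.max() + 1e-8)
    # Claims 5-6: the weight derives from the product of the first confidence
    # and (1 - second confidence), so it falls as the second confidence rises.
    product = first_conf * (1.0 - second_conf)
    # Claim 7: the preset activation function is a Sigmoid.
    return 1.0 / (1.0 + np.exp(-product))
```

The weight of the second pixel point is then 1 minus this value, as in claim 1, so at strong image edges, where the segmentation boundary is most prone to jitter, the blend leans more heavily on the previous frame's pixel value.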
CN201910598165.6A 2019-07-04 2019-07-04 Method and apparatus for processing image Active CN110288625B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910598165.6A CN110288625B (en) 2019-07-04 2019-07-04 Method and apparatus for processing image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910598165.6A CN110288625B (en) 2019-07-04 2019-07-04 Method and apparatus for processing image

Publications (2)

Publication Number Publication Date
CN110288625A (en) 2019-09-27
CN110288625B (en) 2021-09-03

Family

ID=68020518

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910598165.6A Active CN110288625B (en) 2019-07-04 2019-07-04 Method and apparatus for processing image

Country Status (1)

Country Link
CN (1) CN110288625B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110838132B (en) * 2019-11-15 2022-08-05 北京字节跳动网络技术有限公司 Object segmentation method, device and equipment based on video stream and storage medium
CN111259680B (en) * 2020-02-13 2022-04-12 支付宝(杭州)信息技术有限公司 Two-dimensional code image binarization processing method and device
CN113763306B (en) * 2020-06-01 2024-06-04 杭州海康威视数字技术股份有限公司 Landmark detection method and device and electronic equipment
CN113066048B (en) * 2021-02-27 2024-11-05 华为技术有限公司 A method and device for determining confidence of segmentation image

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100505334B1 (en) * 2003-03-28 2005-08-04 (주)플렛디스 Real-time stereoscopic image conversion apparatus using motion parallax
CN102542593A (en) * 2011-09-30 2012-07-04 中山大学 Interactive video stylized rendering method based on video interpretation
US8953882B2 (en) * 2012-05-31 2015-02-10 Apple Inc. Systems and methods for determining noise statistics of image data
CN103810664B (en) * 2012-11-13 2017-06-27 中兴通讯股份有限公司 A kind of information concealing method and device
CN103942535B (en) * 2014-03-28 2017-04-12 广东威创视讯科技股份有限公司 Multi-target tracking method and device
CN109345580B (en) * 2018-10-23 2020-03-24 北京字节跳动网络技术有限公司 Method and apparatus for processing image

Also Published As

Publication number Publication date
CN110288625A (en) 2019-09-27

Similar Documents

Publication Publication Date Title
CN109858445B (en) Method and apparatus for generating a model
CN109829432B (en) Method and apparatus for generating information
CN110288625B (en) Method and apparatus for processing image
CN109255337B (en) Face key point detection method and device
CN109389072B (en) Data processing method and device
CN109981787B (en) Method and device for displaying information
CN109118456B (en) Image processing method and device
CN110213614B (en) Method and device for extracting key frame from video file
CN109377508B (en) Image processing method and device
CN110059623B (en) Method and apparatus for generating information
US11514263B2 (en) Method and apparatus for processing image
CN109977905B (en) Method and apparatus for processing fundus images
CN110211030B (en) Image generation method and device
CN112348910B (en) Method, device, apparatus and computer readable medium for acquiring images
CN111459364B (en) Icon updating method and device and electronic equipment
CN111783777B (en) Image processing method, apparatus, electronic device, and computer readable medium
CN110189252B (en) Method and device for generating average face image
CN111784712A (en) Image processing method, device, equipment and computer readable medium
CN109934142B (en) Method and apparatus for generating feature vectors of video
CN111757100B (en) Method and device for determining camera motion variation, electronic equipment and medium
CN111815654A (en) Method, apparatus, device and computer readable medium for processing image
CN109919220B (en) Method and apparatus for generating feature vectors of video
CN111784726A (en) Image matting method and device
CN111815656B (en) Video processing method, apparatus, electronic device and computer readable medium
CN111915532B (en) Image tracking method and device, electronic equipment and computer readable medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
CP01 Change in the name or title of a patent holder

Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee after: Douyin Vision Co.,Ltd.

Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee before: Tiktok vision (Beijing) Co.,Ltd.

Address after: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee after: Tiktok vision (Beijing) Co.,Ltd.

Address before: 100041 B-0035, 2 floor, 3 building, 30 Shixing street, Shijingshan District, Beijing.

Patentee before: BEIJING BYTEDANCE NETWORK TECHNOLOGY Co.,Ltd.
