
CN109389657B - Color filling method - Google Patents


Info

Publication number
CN109389657B
Authority
CN
China
Prior art keywords
image
edge detection
color filling
filling
color
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811050403.1A
Other languages
Chinese (zh)
Other versions
CN109389657A (en)
Inventor
邓立邦
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Zhimeiyuntu Tech Corp ltd
Original Assignee
Guangdong Zhimeiyuntu Tech Corp ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Zhimeiyuntu Tech Corp ltd filed Critical Guangdong Zhimeiyuntu Tech Corp ltd
Priority to CN201811050403.1A priority Critical patent/CN109389657B/en
Publication of CN109389657A publication Critical patent/CN109389657A/en
Application granted granted Critical
Publication of CN109389657B publication Critical patent/CN109389657B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/40: Filling a planar surface by adding surface attributes, e.g. colour or texture
    • G: PHYSICS
    • G06: COMPUTING OR CALCULATING; COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/13: Edge detection
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Image Analysis (AREA)
  • Processing Or Creating Images (AREA)
  • Image Processing (AREA)

Abstract

The invention discloses a color filling method comprising the following steps: extracting an image from an uploaded image file and performing grayscale processing on it; performing edge detection and tracing on the grayscale-processed image; comparing the images before and after tracing, finding the intersecting part and removing it, thereby obtaining a corresponding image and a color filling path; repeating these steps, continuously performing edge detection, tracing and comparison on the image to obtain all color filling paths of the uploaded image; and transmitting all filling paths of the image to a drawing device, which fills the image with color according to the filling paths. By combining an edge detection algorithm with a comparison of the images before and after tracing, the method requires no grid subdivision or tracking of the image, so it is not affected by scale or grid-subdivision precision, reduces the amount of computation, and improves filling efficiency.

Description

Color filling method
Technical Field
The invention relates to the field of painting, in particular to a color filling method.
Background
With the changing times and the continuous development of modern painting techniques, painting has become rich and diverse: it no longer relies on the simple display of lines or geometric figures, but focuses on the matching and application of colors. Colors pervade human society and fill the world, making life more colorful, and cannot be replaced by any other painting element. With the rapid development of computer technology and artificial intelligence, intelligent painting is therefore becoming increasingly popular.
In the prior art, commercial painting products generally fill an image with color by interpolating gridded data over a plane, so that every pixel of the painting area at the current device's resolution receives a data value, and then filling the corresponding color at each pixel position according to that value. This method has two disadvantages: when large images are drawn, the amount of computation is too large and operation efficiency is low; and the result is strongly affected by scale and grid-subdivision accuracy, so the filling effect is not ideal.
Disclosure of Invention
The invention provides a color filling method that aims to solve the technical problems of excessive computation and strong dependence on scale and grid-subdivision accuracy in the color filling process. The method is unaffected by image resolution and size, reduces the amount of computation, and improves filling efficiency when painting.
In order to solve the above technical problems, an embodiment of the present invention provides a color filling method, including:
s1, extracting an image from an uploaded image file, and carrying out gray processing on the image;
s2, carrying out edge detection and tracing on the image subjected to gray scale processing;
s3, comparing the image obtained in the step S1 with the image obtained in the step S2, finding out an intersecting part and removing the intersecting part, so as to obtain a corresponding image and a color filling path;
s4, repeating the steps, and continuously performing edge detection, line drawing and comparison on the image to obtain all color filling paths of the uploaded image;
s5, transmitting all filling paths of the image to drawing equipment, and enabling the drawing equipment to carry out color filling on the image according to the filling paths.
As a preferred solution, the step S2 specifically includes:
s21, carrying out edge detection on the image subjected to gray scale processing through an algorithm;
s22, setting the pixel size, and tracing the image subjected to edge detection to obtain a corresponding image.
As a preferred solution, the step S3 specifically includes:
s31, comparing the image obtained in the step S1 with the image obtained in the step S2, and finding out and removing the intersecting part to obtain a corresponding image;
s32, comparing the image obtained in the step S1 with the image obtained in the step S2, finding out and extracting an intersecting part, and thus obtaining a color filling path.
Preferably, the drawing device is a mechanical arm.
Preferably, the pixel size is 2 pixel points.
A color filling method, comprising:
s1, extracting an image from an uploaded image file, and carrying out gray processing on the image;
s2, carrying out edge detection and tracing on the image subjected to gray scale processing;
s3, comparing the image obtained in the step S1 with the image obtained in the step S2, finding out an intersecting part and removing the intersecting part, so as to obtain a corresponding image and a color filling path;
s4, repeating the steps, and continuously performing edge detection, line drawing and comparison on the image to obtain all color filling paths of the uploaded image;
s5, transmitting all filling paths of the image to drawing equipment, and enabling the drawing equipment to carry out color filling on the image according to the filling paths;
the step S2 specifically includes:
s21, carrying out edge detection on the image subjected to gray scale processing through an algorithm;
s22, tracing the image subjected to edge detection by setting the pixel size to obtain a corresponding image;
the step S3 specifically includes:
s31, comparing the image obtained in the step S1 with the image obtained in the step S2, and finding out and removing the intersecting part to obtain a corresponding image;
s32, comparing the image obtained in the step S1 with the image obtained in the step S2, finding out an intersecting part and extracting the intersecting part, so as to obtain a color filling path;
the drawing equipment is a mechanical arm;
the pixel size is 2 pixel points.
Compared with the prior art, the embodiment of the invention has the following beneficial effects:
1. An edge detection algorithm is used to detect and trace the edges of the image, so no grid subdivision or tracking of the image is needed and the method is not affected by scale or grid-subdivision precision;
2. The color filling paths are obtained from the intersecting part of the images compared before and after tracing, so the amount of computation is small and the filling efficiency is improved.
Drawings
Fig. 1: a specific flow chart of the color filling method of the invention;
Fig. 2: a specific flow chart of step S2 in the color filling method of the invention;
Fig. 3: a specific flow chart of step S3 in the color filling method of the invention.
Detailed Description
The following describes the embodiments of the present invention clearly and completely with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the invention without inventive effort fall within the scope of the invention.
Referring to fig. 1, a preferred embodiment of the present invention provides a color filling method. The system obtains the currently uploaded image file, extracts the image from it, and performs grayscale processing on the extracted image to obtain a first image. The system then performs edge detection on the first image with an edge detection algorithm and traces the detected edges with lines of a pixel size set according to actual requirements, obtaining a second image. By comparing the second image with the first image, the system finds the intersecting part of the two and removes it to obtain a third image; at the same time, the intersecting part is extracted as the first color filling path.
For the third image, the system repeats these processing steps, performing edge detection and tracing on it to obtain a fourth image; it then finds the intersecting part of the two images, obtaining a fifth image and the second color filling path. Proceeding in this way, all color filling paths of the uploaded image are obtained. The system transmits the paths to the drawing device to realize color filling of the image.
Referring to fig. 2, in the present embodiment, performing edge detection and tracing on the grayscale-processed image includes: performing edge detection on the grayscale-processed image with an edge detection algorithm; and setting a pixel size and tracing the edge-detected image to obtain a corresponding image.
Referring to fig. 3, in this embodiment, comparing the images before and after tracing, finding the intersecting part and removing it to obtain a corresponding image and a color filling path includes: comparing the images before and after tracing, finding the intersecting part and removing it to obtain a corresponding image; and comparing the images before and after tracing, finding the intersecting part and extracting it to obtain a color filling path.
In this embodiment, the drawing device is a hardware device such as a mechanical arm; filling the image with a mechanical arm gives higher filling accuracy.
In this embodiment, the pixel size is 2 pixels; tracing the edge-detected image with 2-pixel lines makes the tracing more accurate and gives a better tracing effect.
The specific implementation flow of the method is as follows:
S1: Uploading the image file.
The image file can be uploaded automatically by the system or by a user through a terminal. A terminal here refers to hardware commonly used in daily life, such as a mobile phone, computer, touch screen or notebook.
S2: the system extracts an image and carries out gray processing on the image to obtain a first image.
After the system or the user finishes uploading the image file, the system extracts the image and performs grayscale processing on it. A color image is composed of many pixels, each represented by three RGB values; grayscale processing does not affect the texture feature information of the image, yet lets each pixel be represented by a single gray value, which greatly improves image processing efficiency. The weighted-average grayscale formula is as follows:
f(i,j)=0.30R(i,j)+0.59G(i,j)+0.11B(i,j)
where i and j give the position of a pixel in the two-dimensional image, namely the i-th row and j-th column.
The gray value of each pixel of the image is calculated according to this formula; gray values range from 0 to 255, so the image becomes a black-and-white grayscale image.
After grayscale processing is completed, the image becomes a grayscale map, which is stored on the local server as the first image.
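A minimal NumPy sketch of this weighting step is given below; the function name, the assumed H x W x 3 uint8 array layout, and the use of NumPy are illustrative choices, not part of the patented method.

```python
import numpy as np

def to_grayscale(rgb):
    """Weighted-average grayscale conversion: f = 0.30R + 0.59G + 0.11B.

    rgb: H x W x 3 uint8 array (assumed layout). Returns an H x W uint8
    array whose values lie in the range 0-255, i.e. the "first image".
    """
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    gray = 0.30 * r + 0.59 * g + 0.11 * b
    return np.clip(gray, 0, 255).astype(np.uint8)
```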
S3: and the system performs edge detection and tracing on the first image to obtain a second image.
S31: edge detection is performed on the first image.
In a digital image, an edge is the part where the local variation of the image is most significant. Edges exist mainly between an object and the background and are discontinuities in the local characteristics of the image, such as abrupt changes in gray level, texture structure or color.
The system performs edge detection on the first image with the Sobel edge detection algorithm. The algorithm uses two 3x3 matrices, one for the horizontal (transverse) direction and one for the vertical (longitudinal) direction, which are convolved with the first image in the plane to obtain horizontal and vertical brightness-difference approximations. From the approximate horizontal and vertical gradients of each pixel, the gradient magnitude and gradient direction of that pixel are obtained; if a pixel's gradient is greater than a certain threshold, it is considered an edge point. The related formulas of the algorithm are as follows:
Gx = [[-1, 0, +1], [-2, 0, +2], [-1, 0, +1]] * A
Gy = [[+1, +2, +1], [0, 0, 0], [-1, -2, -1]] * A
The horizontal and vertical brightness-difference approximations are calculated by the above formulas, where A represents the first image, * denotes two-dimensional plane convolution, and Gx and Gy are the images resulting from horizontal and vertical edge detection, respectively.
G = sqrt(Gx^2 + Gy^2)
The gradient magnitude of each pixel of the image is calculated by the above formula.
Θ = arctan(Gy / Gx)
The gradient direction of each pixel of the image is calculated by the above formula. If the angle Θ is equal to zero, the image has a longitudinal edge at that point, with the left side darker than the right.
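The sketch below illustrates the Sobel step with plain NumPy: the kernel values are the standard Sobel operators given above, while the helper names, the naive convolution loop and the fixed threshold of 100 are assumptions made only for illustration.

```python
import numpy as np

SOBEL_X = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=np.float64)
SOBEL_Y = np.array([[1, 2, 1], [0, 0, 0], [-1, -2, -1]], dtype=np.float64)

def convolve2d(image, kernel):
    """Naive same-size plane convolution of a 2-D image with a 3x3 kernel."""
    h, w = image.shape
    padded = np.pad(image.astype(np.float64), 1, mode="edge")
    flipped = kernel[::-1, ::-1]          # convolution flips the kernel
    out = np.zeros((h, w))
    for i in range(h):
        for j in range(w):
            out[i, j] = np.sum(padded[i:i + 3, j:j + 3] * flipped)
    return out

def sobel_edges(gray, threshold=100.0):
    """Return gradient magnitude, direction and a boolean edge map."""
    gx = convolve2d(gray, SOBEL_X)        # horizontal brightness differences
    gy = convolve2d(gray, SOBEL_Y)        # vertical brightness differences
    magnitude = np.hypot(gx, gy)          # G = sqrt(Gx^2 + Gy^2)
    direction = np.arctan2(gy, gx)        # Θ = arctan(Gy / Gx)
    edges = magnitude > threshold         # pixels above the threshold are edge points
    return magnitude, direction, edges
```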
S32: and tracing the image with the edge detection to obtain a second image.
And obtaining edge points of the first image and distribution directions thereof through an edge detection algorithm. According to the actual requirement of drawing, the system adopts lines with set pixel size (such as 2 pixel points) to trace the edge points, and the lines are used as a second image and stored on a local server.
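One possible reading of "tracing with lines of a set pixel size" is to thicken the boolean edge map by morphological dilation and draw the result as black lines on a white canvas; the sketch below assumes SciPy's binary_dilation and a square structuring element, neither of which is specified by the patent.

```python
import numpy as np
from scipy.ndimage import binary_dilation

def trace_edges(edges, line_width=2):
    """Draw detected edge points as lines of roughly `line_width` pixels.

    edges: boolean H x W edge map from the Sobel step. Returns a uint8
    image (the "second image") with black traced lines on a white canvas.
    """
    # Dilate the one-pixel edge skeleton until the traced line reaches the
    # requested width (the square structuring element is an assumption).
    structure = np.ones((line_width, line_width), dtype=bool)
    thick = binary_dilation(edges, structure=structure)
    traced = np.full(edges.shape, 255, dtype=np.uint8)
    traced[thick] = 0
    return traced
```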
S4: and finding out an intersection part of the second image and the first image to obtain a third image and a color filling first path.
S41: and removing the intersection part of the images to obtain a third image.
The method comprises the steps of obtaining a first image by carrying out gray processing on an uploaded image; and obtaining a second image by carrying out edge detection and tracing on the first image. Based on the comparison of the second image with the first image, the system finds the intersection of the two and removes it as a third image and stores it on the local server.
S42: and extracting the intersection part of the images to obtain a color filling first path.
Based on the comparison of the second image with the first image, the system finds the intersection of the two and extracts it, populating the first path as a color and storing it on the local server.
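The patent does not spell out the pixel-level comparison, so the sketch below assumes the "intersection" is simply the set of pixels covered by the traced lines: these pixels are blanked out of the first image to give the next working image, and their coordinates are collected as one color filling path.

```python
import numpy as np

def split_intersection(first_image, traced_image, background=255):
    """Compare the image before and after tracing.

    Pixels covered by the traced lines (the assumed "intersection") are
    removed from the first image to give the next working image, and their
    coordinates are returned as one color filling path.
    """
    intersection = traced_image != background        # pixels the tracing touched
    next_image = first_image.copy()
    next_image[intersection] = background            # remove the intersection
    # The fill path here is simply the list of (row, col) coordinates of the
    # intersection; ordering them into a drawing sequence is device specific.
    path = np.argwhere(intersection)
    return next_image, path
```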
S5: repeating the steps, and continuously performing edge detection, line tracing and comparison on the image to obtain all color filling paths of the uploaded image.
For the third image, the system repeats the processing steps, performing edge detection and tracing on it to obtain a fourth image; the intersecting part of the fourth image and the third image is then found, removed to obtain a fifth image, and extracted to obtain the second color filling path. This continues until all color filling paths of the uploaded image have been found.
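Composing the hypothetical helpers sketched above, the repetition described in this step can be written as a loop that keeps detecting, tracing and comparing until no edge points remain; the stopping rule and the iteration cap are assumptions added so the sketch terminates.

```python
def all_fill_paths(gray_image, max_rounds=50):
    """Iteratively collect color filling paths from a grayscale image.

    Uses the illustrative helpers sketched above (sobel_edges, trace_edges,
    split_intersection); the loop ends once no edge pixels remain or the
    assumed round limit is reached.
    """
    paths = []
    current = gray_image
    for _ in range(max_rounds):
        _, _, edges = sobel_edges(current)
        if not edges.any():                 # nothing left to trace
            break
        traced = trace_edges(edges, line_width=2)
        current, path = split_intersection(current, traced)
        paths.append(path)
    return paths
```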
S6: the painting device color fills the uploaded image according to the path.
Through the continuous edge detection, tracing and comparison operations described above, the system obtains all color filling paths of the uploaded image. The system transmits these paths to the drawing device, such as a mechanical arm or similar hardware, which then fills the uploaded image with color.
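The patent does not define a transport format between the system and the drawing device; purely as an illustration, the collected paths could be serialised to JSON and pushed to the device over a TCP socket. The host, port and message layout below are assumptions.

```python
import json
import socket

def send_paths(paths, host="192.168.1.50", port=9000):
    """Send all fill paths to a drawing device (illustrative protocol only)."""
    payload = json.dumps({
        "paths": [p.tolist() for p in paths]   # lists of (row, col) coordinates
    }).encode("utf-8")
    with socket.create_connection((host, port)) as conn:
        conn.sendall(payload)
```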
The method performs edge detection and tracing on the image with an edge detection algorithm, without grid subdivision or tracking of the image, so it is not affected by scale or grid-subdivision precision; the color filling paths are obtained from the intersecting part of the images compared before and after tracing, so the amount of computation is small and the filling efficiency is improved.
The foregoing embodiments have been provided for the purpose of illustrating the general principles of the present invention, and are not to be construed as limiting the scope of the invention. It should be noted that any modifications, equivalent substitutions, improvements, etc. made by those skilled in the art without departing from the spirit and principles of the present invention are intended to be included in the scope of the present invention.

Claims (6)

1. A color filling method, comprising:
s1, extracting an image from an uploaded image file, and carrying out gray processing on the image;
s2, carrying out edge detection and tracing on the image subjected to gray scale processing through an edge detection algorithm;
s3, comparing the image obtained in the step S1 with the image obtained in the step S2, finding out an intersecting part and removing the intersecting part, so as to obtain a corresponding image and a color filling path;
s4, repeating the steps, and continuously performing edge detection, line drawing and comparison on the image to obtain all color filling paths of the uploaded image;
s5, transmitting all filling paths of the image to drawing equipment, and enabling the drawing equipment to carry out color filling on the image according to the filling paths.
2. The color filling method according to claim 1, wherein the step S2 specifically includes:
s21, performing edge detection on the image subjected to gray scale processing through an edge detection algorithm;
s22, setting the pixel size, and tracing the image subjected to edge detection to obtain a corresponding image.
3. The color filling method according to claim 1, wherein the step S3 specifically includes:
s31, comparing the image obtained in the step S1 with the image obtained in the step S2, and finding out and removing the intersecting part to obtain a corresponding image;
s32, comparing the image obtained in the step S1 with the image obtained in the step S2, finding out and extracting an intersecting part, and thus obtaining a color filling path.
4. The color filling method according to claim 1, wherein the painting device is a robot arm.
5. The color filling method of claim 2, wherein the pixel size is 2 pixels.
6. A color filling method, comprising:
s1, extracting an image from an uploaded image file, and carrying out gray processing on the image;
s2, carrying out edge detection and tracing on the image subjected to gray scale processing;
s3, comparing the image obtained in the step S1 with the image obtained in the step S2, finding out an intersecting part and removing the intersecting part, so as to obtain a corresponding image and a color filling path;
s4, repeating the steps, and continuously performing edge detection, line drawing and comparison on the image to obtain all color filling paths of the uploaded image;
s5, transmitting all filling paths of the image to drawing equipment, and enabling the drawing equipment to carry out color filling on the image according to the filling paths;
the step S2 specifically includes:
s21, carrying out edge detection on the image subjected to gray scale processing through an algorithm;
s22, tracing the image subjected to edge detection by setting the pixel size to obtain a corresponding image;
the step S3 specifically includes:
s31, comparing the image obtained in the step S1 with the image obtained in the step S2, and finding out and removing the intersecting part to obtain a corresponding image;
s32, comparing the image obtained in the step S1 with the image obtained in the step S2, finding out an intersecting part and extracting the intersecting part, so as to obtain a color filling path;
the drawing equipment is a mechanical arm;
the pixel size is 2 pixel points.
CN201811050403.1A 2018-09-10 2018-09-10 Color filling method Active CN109389657B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201811050403.1A CN109389657B (en) 2018-09-10 2018-09-10 Color filling method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811050403.1A CN109389657B (en) 2018-09-10 2018-09-10 Color filling method

Publications (2)

Publication Number Publication Date
CN109389657A CN109389657A (en) 2019-02-26
CN109389657B true CN109389657B (en) 2023-06-27

Family

ID=65418676

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811050403.1A Active CN109389657B (en) 2018-09-10 2018-09-10 Color filling method

Country Status (1)

Country Link
CN (1) CN109389657B (en)

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009031978A (en) * 2007-07-26 2009-02-12 Seiko Epson Corp Coloring, coloring production apparatus, coloring production method and program thereof
JP6004260B2 (en) * 2012-08-27 2016-10-05 健二 東海林 Line drawing coloring system
US9569857B2 (en) * 2013-09-05 2017-02-14 ReallyColor, LLC Conversion of digital images into digital line drawings
CN104077773A (en) * 2014-06-23 2014-10-01 北京京东方视讯科技有限公司 Image edge detection method, and image target identification method and device
US10008011B1 (en) * 2014-11-26 2018-06-26 John Balestrieri Methods for creating a simulated watercolor-painted image from a source image
CN105437768A (en) * 2015-09-13 2016-03-30 常州大学 Machine-vision-based intelligent artistic paint robot

Also Published As

Publication number Publication date
CN109389657A (en) 2019-02-26


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant