
CN116844159B - A microscopic image acquisition and processing system and textile fiber classification method thereof - Google Patents


Info

Publication number
CN116844159B
CN116844159B (application CN202310927676.4A)
Authority
CN
China
Prior art keywords
image
fiber
point
curve
value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202310927676.4A
Other languages
Chinese (zh)
Other versions
CN116844159A (en)
Inventor
刘艳
邱星伟
袁裕禄
钱蕾
刘佳
马国军
仲重光
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Chengxin Inspection Testing And Certification Co ltd
Original Assignee
Jiangsu Chengxin Inspection Testing And Certification Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Chengxin Inspection Testing And Certification Co ltd filed Critical Jiangsu Chengxin Inspection Testing And Certification Co ltd
Priority to CN202310927676.4A priority Critical patent/CN116844159B/en
Publication of CN116844159A publication Critical patent/CN116844159A/en
Application granted granted Critical
Publication of CN116844159B publication Critical patent/CN116844159B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/60Type of objects
    • G06V20/69Microscopic objects, e.g. biological cells or cellular parts
    • G06V20/698Matching; Classification
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/20Image enhancement or restoration using local operators
    • G06T5/30Erosion or dilatation, e.g. thinning
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/136Segmentation; Edge detection involving thresholding
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/26Segmentation of patterns in the image field; Cutting or merging of image elements to establish the pattern region, e.g. clustering-based techniques; Detection of occlusion
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/30Noise filtering
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/20Image preprocessing
    • G06V10/36Applying a local operator, i.e. means to operate on image points situated in the vicinity of a given point; Non-linear local filtering operations, e.g. median filtering
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/764Arrangements for image or video recognition or understanding using pattern recognition or machine learning using classification, e.g. of video objects
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Medical Informatics (AREA)
  • Evolutionary Computation (AREA)
  • Databases & Information Systems (AREA)
  • Artificial Intelligence (AREA)
  • Nonlinear Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Molecular Biology (AREA)
  • Image Processing (AREA)

Abstract


The present invention relates to a microscopic image acquisition and processing system and a textile fiber classification method thereof, which comprises an image acquisition module, a FIFO buffer, a DDR5 control module, an image data processing module, a display driver module, a read-only memory, an I2C interface, a CMOS image sensor, a high-definition multimedia interface and a memory, wherein the CMOS image sensor is connected to the image acquisition module, the image acquisition module is connected to the FIFO buffer, the FIFO buffer is connected to the DDR5 control module, the DDR5 control module is connected to the memory, the FIFO buffer is connected to the image data processing module, the image data processing module is connected to the display driver module, the display driver module is connected to the high-definition multimedia interface, and the read-only memory is connected to the I2C interface. The present invention enhances the effectiveness of extracting feature data with complex relationships, thereby improving the classification accuracy of the SVM classification model for fibers.

Description

Microscopic image acquisition and processing system and textile fiber classification method thereof
Technical Field
The invention relates to the technical field of image processing, in particular to a microscopic image acquisition and processing system and a textile fiber classification method thereof.
Background
In the textile industry, accurate classification of textile products governs important subsequent steps such as packaging, finishing and dyeing. For example, warp-knitted and weft-knitted products (fabric type: knitted fabric) follow completely different dyeing and ironing steps in the dyeing and finishing process, and blended and interwoven products (fabric type: woven fabric) likewise differ in the settings of several important production parameters. Therefore, to guarantee the quality of a textile product, it must be qualitatively classified when it is made.
As the Chinese economy shifts from high-speed growth to high-quality development, fiber detection and identification face higher requirements and demand faster detection speeds. Conventional detection relies on manual inspection, which consumes considerable manpower and time. The traditional classification approach performs qualitative classification with the gray-level co-occurrence matrix from image recognition technology, i.e., classifying by various statistical characteristics of the gray-level co-occurrence matrix; it is essentially an empirical summary of statistical indexes over a large amount of image data, and its accuracy is low.
With the development of computer technology, computer image processing has been applied to fiber detection with some success, but sample collection is affected by external factors, and the resolution of the sample images, the preprocessing effect, the selection of characteristic values and the classification model ultimately determine the classification and recognition effect. Current research on fiber classification and identification extracts and analyzes as many indexes as possible in order to reflect the cross-section characteristics of the fibers as comprehensively as possible, so that the fibers can be accurately classified.
However, whether extracting numerous fiber characteristics actually plays a critical role in fiber identification requires further analysis. Many of the extracted characteristics are not independent, and their redundancy degrades the accuracy and speed of fiber classification; to address this, an effective feature-selection method is needed. Through feature selection, an optimal feature subset is chosen from the many candidate features and the classification and recognition process is optimized, which is essential when recognizing samples carrying vast amounts of information. There is therefore a need for a microscopic image acquisition and processing system and a method for classifying textile fibers.
Disclosure of Invention
The invention aims to overcome the above defects and provides a microscopic image acquisition and processing system and a textile fiber classification method thereof. For the characteristics of fiber images, it proposes average 2D Gabor bidirectional line features, curvature, fitting curvature, segmented chord-angle change features and minimum-circumscribed-rectangle density features; for feature selection, it fuses the maximum information number with m-PCA (modified principal component analysis), enhancing the effectiveness of extracting feature data with complex relationships and thereby improving the classification precision of the SVM classification model for fibers.
The purpose of the invention is realized in the following way:
The microscopic image acquisition and processing system comprises an image acquisition module, a FIFO buffer, a DDR5 control module, an image data processing module, a display driving module, a read-only memory, an I2C interface, a CMOS image sensor, a high-definition multimedia interface and a memory, wherein the CMOS image sensor is connected with the image acquisition module, the image acquisition module is connected with the FIFO buffer, the FIFO buffer is connected with the DDR5 control module, the DDR5 control module is connected with the memory, the FIFO buffer is connected with the image data processing module, the image data processing module is connected with the display driving module, the display driving module is connected with the high-definition multimedia interface, and the read-only memory is connected with the I2C interface.
Further, the image acquisition module can adopt a 1/2.8-inch Sony CMOS IMX291 ultra-wide-dynamic, ultra-low-illuminance sensor camera with a rolling-shutter exposure mode.
A method for classifying textile fibers of a microscopic image acquisition and processing system, comprising the following:
s1, acquiring a textile fiber sample image by using an image acquisition system based on an FPGA;
s2, inputting a fiber color image to be identified, preprocessing the image to output a denoised gray scale image, and dividing the preprocessed image through a threshold value to obtain a binary image;
S3, since the fiber image has rich line characteristics, a group of 2D Gabor filters in different directions is first defined; after the filters are convolved with the fiber image, the directions with the maximum and second-maximum responses are selected as the line characteristics of the fiber; the lines of the fiber are traversed, and the maximum and second-maximum values obtained over all lines are averaged separately to obtain the average 2D Gabor bidirectional line characteristics;
S4, detecting and analyzing the fiber contour as a curve object, and constructing a segmented fitting curve by using a variable-length included angle, dividing the curve into n segments, calculating segmented chord lengths and total chord lengths, and constructing 3 characteristics of curvature, fitting curvature and segmented chord length angle change on the basis;
s5, calculating the minimum circumscribed rectangle of the fiber and the fiber area, and extracting the density characteristic of the minimum circumscribed rectangle;
s6, extracting the feature vectors in the steps S3-S5, and performing dimension reduction on the feature data by fusing the maximum information number and the m-PCA method;
S7, performing weight distribution on the feature vector after the dimension reduction by using an attention mechanism, calculating a correlation coefficient between the fiber class and the feature value variable to obtain a Pearson correlation coefficient, inputting the Pearson correlation coefficient into a mapping function, outputting an optimized weight of the original feature value mapping, and finally outputting a new feature vector by combining the weight with the original feature vector;
S8, inputting the feature vector into a decision tree SVM for training.
Further, the step S1 further comprises the following: first, the acquisition system initializes and configures the camera through the I2C bus protocol and the camera focuses to shoot the acquired image; the acquired image data are then continuously written into the DDR5 SDRAM through the FIFO buffer for storage; the image data are read out through the FIFO buffer and processed; finally, the acquired image is displayed over HDMI, and the stored image can be uploaded to the host for subsequent processing.
Further, the step S2 further includes the following:
SS21, inputting a fiber color image to be identified, acquired from the camera at a size of 72 × 72 pixels; graying the RGB image; denoising the fiber image with a mean filter: taking the noisy image as g(x, y) and setting the center point Z as (x, y), a square neighborhood window of size c × c yields the denoised image f(x, y); constructing the Laplace operator ∇²f(x, y), whose filtering template is the 3 × 3 mask [0 1 0; 1 −4 1; 0 1 0], and obtaining the sharpened image through f(x, y) − ∇²f(x, y); the Laplacian sharpening strengthens the details of the fiber gray-level image, and these operations constitute the preprocessing of the image;
SS22, processing the preprocessed image with the Otsu single-threshold segmentation method: an optimal gray-level threshold is selected, the pixels of the gray-level image are classified by this threshold into a fiber target area and a background area, the gray value of pixels in the fiber target area is set to 255 and that of pixels in the background area to 0, thereby obtaining a binary image.
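The preprocessing chain of step S2 (mean filtering, Laplacian sharpening, Otsu thresholding) can be sketched in plain NumPy. This is an illustrative sketch under assumed parameters (c = 3, the 4-neighbour Laplacian template, edge padding), not the claimed implementation:

```python
import numpy as np

def mean_filter(img, c=3):
    """Denoise with a c-by-c mean filter (edge-padded borders)."""
    pad = c // 2
    padded = np.pad(img.astype(float), pad, mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for dy in range(c):
        for dx in range(c):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (c * c)

def laplacian_sharpen(img):
    """Sharpen with the 4-neighbour Laplacian template [0 1 0; 1 -4 1; 0 1 0]."""
    p = np.pad(img, 1, mode="edge")
    lap = (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
           - 4 * p[1:-1, 1:-1])
    return img - lap  # subtracting the Laplacian boosts edges

def otsu_threshold(img):
    """Return the gray threshold maximising the between-class variance."""
    hist = np.bincount(img.astype(np.uint8).ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    omega = np.cumsum(p)                  # class-0 probability
    mu = np.cumsum(p * np.arange(256))    # class-0 cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b[np.isnan(sigma_b)] = 0
    return int(np.argmax(sigma_b))

# Toy 72x72 image: dark background with a bright "fiber" stripe.
img = np.zeros((72, 72))
img[30:40, :] = 200.0
smooth = mean_filter(img, c=3)
sharp = np.clip(laplacian_sharpen(smooth), 0, 255)
t = otsu_threshold(sharp)
binary = np.where(sharp > t, 255, 0)
```

On this toy image the Otsu threshold falls between the background and fiber gray levels, so the stripe survives binarization with value 255 and the background is set to 0.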
Further, the step S3 further includes the following:
SS31, constructing the 2D Gabor filter, defined (in its real-valued form) as:
G(x, y) = exp(−(x′² + r²·y′²) / (2σ²)) · cos(2πf·x′ + q), where x′ = x cos θ + y sin θ and y′ = −x sin θ + y cos θ;
f is the frequency of the sine function, θ controls the direction of the filter, q represents the phase shift, r represents the spatial aspect ratio of the Gabor filter's shape (different r values give different scales), and σ is the standard deviation of the Gaussian envelope;
SS32, setting the parameters of the 2D Gabor filter: 12 directions are taken, including 0°, 30°, 60°, 90°, 120°, 180°, 240°, 300° and 360°; the central frequencies are set to 1/16, 1/16.5, 1/17, 1/17.5 and 1/18; and 8 scales are set to 0.05, 0.15, 0.25, 0.35, 0.45, 0.55, 0.65 and 0.75; the response value m_k of a central pixel in direction k is defined as:
m_k = Σ_{(x,y)∈L_k} (G_k ∗ f)(x, y),
where f(x, y) is the gray value of the current pixel point, L_k is a line formed by multiple points in the same direction, and G_k is the 2D Gabor filter with direction k; in a fiber image the background gray value is low and the fiber target gray value is high, so the directions with higher response values are taken as the line characteristics of the pixel point; moreover, since the lines inside a fiber image have intersection points and the fiber lines extend from an intersection in different directions, the line characteristic of a pixel is represented bidirectionally: the directions of the maximum and the second-maximum response values are selected as the line directions of the pixel point;
SS33, extracting the 2D Gabor bidirectional line characteristic, defined as the pair v = (v_1, v_2),
where v_1 and v_2 are the directions k_1 and k_2 of the maximum and the second-maximum response values, respectively;
SS34, the intersection points of the lines in the fiber need to be identified in order to decide whether to delete a direction. Specifically, the image is thinned and the skeleton of the target fiber is extracted; skeleton pixels of a line target are white and background pixels are black. A 3 × 3 template is used to traverse all lines in the thinned fiber image, and for each pixel it is judged whether more than 2 of its 8-neighborhood pixels [P_1, P_2, P_3, P_4, P_5, P_6, P_7, P_8] are white; if so, the pixel is an intersection point and its position is recorded;
SS35, averaging the maximum values and the second-maximum values obtained over all lines separately gives the average 2D Gabor bidirectional line characteristic, with a corresponding dimension of 2 × 8 × 5 = 80.
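A minimal sketch of steps SS31–SS33 follows, assuming a real-valued Gabor kernel and using the mean absolute filter response per direction as the response statistic (the patent's exact accumulation along the lines L_k is not reproduced; kernel size, σ and r below are illustrative values):

```python
import numpy as np

def gabor_kernel(size, f, theta, q=0.0, sigma=2.0, r=0.5):
    """Real-valued 2D Gabor kernel; parameter names follow the text:
    f = sine frequency, theta = direction, q = phase shift, r = aspect ratio."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return (np.exp(-(xr**2 + (r * yr)**2) / (2 * sigma**2))
            * np.cos(2 * np.pi * f * xr + q))

def bidirectional_feature(img, thetas, f=1 / 16):
    """Filter the image with one kernel per direction and return the directions
    (v1, v2) of the maximum and second-maximum mean absolute response."""
    responses = []
    for th in thetas:
        k = gabor_kernel(9, f, th)
        s = k.shape[0]
        h, w = img.shape
        acc = np.empty((h - s + 1, w - s + 1))
        for i in range(acc.shape[0]):          # valid-mode correlation
            for j in range(acc.shape[1]):
                acc[i, j] = np.sum(img[i:i + s, j:j + s] * k)
        responses.append(np.abs(acc).mean())
    order = np.argsort(responses)[::-1]
    return thetas[order[0]], thetas[order[1]]

thetas = np.deg2rad(np.arange(0, 180, 30.0))   # 6 directions, 30-degree spacing
img = np.zeros((24, 24))
img[10:13, :] = 1.0                            # a horizontal bright line
v1, v2 = bidirectional_feature(img, thetas)
```

Per step SS35, the v1 and v2 responses gathered over all fiber lines would then be averaged separately to form the final feature.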
Further, the step S4 further includes the following:
SS41, eroding and dilating the image: taking the image as a pixel matrix A, eroding A with the erosion structuring element B yields matrix C; dilating C with the dilation structuring element yields matrix E; matrix E is converted back into an image, which is scanned by progressive scanning until a continuous area is found; starting from the area's starting point, the boundary pixels are marked along the contour of the area, and after the contour is processed, scanning continues from the previous position until a new area is found, thereby obtaining the fiber contour;
SS42, fitting a segmented curve with a variable-length included angle: first traverse the curve, record the starting point C and the end point D, and denote the curve CD; set an initial angle θ and a fixed length p; traverse the curve CD from point C and record the point E_1 at which CE_1 equals p;
SS43, search from point E_1 toward point D and record the point F_1 at which the angle between CE_1 and CF_1 exceeds θ;
SS44, select the point on the curve CE_1F_1 farthest from the straight line CF_1 and denote it S_1;
SS45, traverse the curve S_1D with S_1 as the starting point and repeat steps SS42–SS44 to obtain points S_2, …, S_n;
SS46, the line segments on the curve are then CS_1, S_1S_2, …, S_nD respectively;
SS47, sampling the contour line at unit-pixel spacing gives marker points D(x_i, y_i), i = 0, 1, 2, …, n, dividing the contour line into n segments; the segmented chord length l_i and the total chord length L_n are then:
l_i = √((x_i − x_{i−1})² + (y_i − y_{i−1})²), L_n = Σ_{i=1}^{n} l_i;
SS48, establishing the curvature characteristic w from the piecewise-fitted curve: the curvature here is the ratio of the arc length of the fitted curve to the total chord length; with marker points D(x_i, y_i), i = 0, 1, 2, …, n, arc length S along the fitted curve and total chord length L_n, the curvature is w = S / L_n;
SS49, establishing the fitting-curvature characteristic nq from the piecewise-fitted curve: with k_i denoting the curvature at marker point D(x_i, y_i), the fitting curvature nq is the sum of the absolute values of the curvatures over the fitted segments:
nq = Σ_i |k_i|;
SS410, establishing the segmented chord-angle change characteristic τ from the piecewise-fitted curve: with D(x_i, y_i) a pixel point on the curve and D_{i+n}, D_{i−n} the points at a distance of n pixels from D_i along the contour line, the vectors T_{i−n} = D_i − D_{i−n} and T_{i+n} = D_{i+n} − D_i are formed and the included angle between these two adjacent segment vectors is calculated; for N pixel points on one curve, N − 2n included angles are obtained in total, and accumulating and averaging these N − 2n angles yields the segmented chord-angle change τ over the whole curve:
τ = (1 / (N − 2n)) · Σ_i ∠(T_{i−n}, T_{i+n});
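The chord-length, curvature-ratio and chord-angle computations of SS47–SS410 reduce to a few lines of NumPy. The sketch below assumes the marker points are already sampled and takes the arc length as a given input for w:

```python
import numpy as np

def chord_lengths(pts):
    """Segment chord lengths l_i and total chord length L_n
    for marker points D(x_i, y_i)."""
    d = np.diff(pts, axis=0)
    l = np.hypot(d[:, 0], d[:, 1])
    return l, l.sum()

def curvature_ratio(pts, arc_len):
    """Curvature characteristic w: ratio of arc length to total chord length."""
    _, Ln = chord_lengths(pts)
    return arc_len / Ln

def chord_angle_change(pts, n=1):
    """Mean included angle between adjacent segment vectors
    T_{i-n} = D_i - D_{i-n} and T_{i+n} = D_{i+n} - D_i."""
    angles = []
    for i in range(n, len(pts) - n):
        t1 = pts[i] - pts[i - n]
        t2 = pts[i + n] - pts[i]
        c = np.dot(t1, t2) / (np.linalg.norm(t1) * np.linalg.norm(t2))
        angles.append(np.arccos(np.clip(c, -1.0, 1.0)))
    return float(np.mean(angles))

# Straight polyline: tau is 0 and w is 1 when arc length equals chord length.
pts = np.array([[0.0, 0.0], [1.0, 0.0], [2.0, 0.0], [3.0, 0.0]])
l, Ln = chord_lengths(pts)            # l = [1, 1, 1], Ln = 3
w = curvature_ratio(pts, arc_len=3.0)
tau = chord_angle_change(pts)
```

For a curved contour, w grows above 1 and tau above 0, which is what makes the two values discriminative fiber-shape features.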
further, the step S5 further includes the following:
SS51, eroding and dilating the binary fiber image and then filling it yields the boundary of the target; the target is rotated over a 180° range in increments of 5°, and the maximum and minimum boundary points of the circumscribed rectangle are recorded at each step;
SS52, the perimeter of the minimum circumscribed rectangle of the target fiber is calculated and denoted T; counting the pixels of the fiber target gives the fiber area a; the minimum-circumscribed-rectangle density characteristic U is then computed from a and T;
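Steps SS51–SS52 can be sketched by rotating the boundary points in 5° steps and tracking the axis-aligned bounding box; the density form U = a / (rectangle area) is an assumption of this sketch, since the text names only the perimeter T and the area a:

```python
import numpy as np

def min_circumscribed_rect(points, step_deg=5):
    """Rotate the boundary points over 180 degrees in `step_deg` increments,
    record the axis-aligned bounding box at each angle, and return the
    (area, perimeter) of the smallest-area circumscribed rectangle."""
    pts = np.asarray(points, dtype=float)
    best = None
    for deg in range(0, 180, step_deg):
        t = np.deg2rad(deg)
        R = np.array([[np.cos(t), -np.sin(t)],
                      [np.sin(t),  np.cos(t)]])
        q = pts @ R.T
        w = q[:, 0].max() - q[:, 0].min()
        h = q[:, 1].max() - q[:, 1].min()
        if best is None or w * h < best[0]:
            best = (w * h, 2 * (w + h))
    return best  # (area of min rectangle, perimeter T)

# Boundary of an axis-aligned 10 x 2 rectangle (toy fiber silhouette).
boundary = [(x, y) for x in range(11) for y in (0, 2)]
area_rect, T = min_circumscribed_rect(boundary)
a = 20.0                  # fiber pixel count (toy value)
U = a / area_rect         # assumed density of the minimum circumscribed rectangle
```

Elongated, space-filling fibers give U close to 1, while curled fibers leave much of their rectangle empty and give a small U.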
further, the step S6 further includes the following:
SS61, for the extracted variables x_i′ (i = 1, 2, …, m) with m samples, a normalization operation is performed on x_i′; the maximum information number is then calculated as:
MIC(x_1, x_2) = max_{n_1·n_2 < B} I(x_1, x_2) / log_2 min(n_1, n_2),
where the maximum is taken over grid partitions of the (x_1, x_2) plane into n_1 × n_2 cells, B grows with the sample number as B = m^0.6, and I(x_1, x_2) denotes the mutual information between x_1 and x_2;
calculating the maximum information number between each pair of characteristic variables yields the matrix T, whose (i, j) entry is the maximum information number of the i-th and j-th characteristic variables;
SS62, letting Ty = λy gives the eigenvalues of T and the corresponding eigenvectors {(λ_1, y_1), (λ_2, y_2), …, (λ_t, y_t)}, where λ_1 > λ_2 > … > λ_t > 0, and the t new eigenvectors Y = [y_1, y_2, …, y_t], i.e., the principal components (y_1, y_2, …, y_t);
SS63, the cumulative contribution g_j is calculated from the principal components y_1, y_2, …, y_j (j = 1, 2, …, t) and the eigenvalues λ as:
g_j = (Σ_{i=1}^{j} λ_i) / (Σ_{i=1}^{t} λ_i);
if g_j is greater than 85%, the principal components [y_1, y_2, …, y_j] are selected as the new principal-component matrix p;
SS64, to strengthen the degree of correlation between the selected principal components and the original features, complex (multiple) correlation coefficients are introduced after the cumulative contribution rate is calculated: the complex correlation coefficient F between each original feature and the principal-component groups from j to t − j is computed, with b the parameter vector of the corresponding dimension; the components whose complex correlation coefficient exceeds 0.9 are retained, giving k additional principal components;
SS65, the final number of principal components is j + k and the principal-component matrix is denoted p_{j+k}; with X = x_i × p_{j+k}, the dimension of the original sample matrix is reduced by the principal-component matrix p_{j+k}, and the reduced matrix X has size 72 × m, i.e., dimension 72.
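A simplified sketch of the fused dimension reduction of SS61–SS63 follows. As an assumption of this sketch, |Pearson r| stands in for the maximum information number when building the dependency matrix T, and the complex-correlation refinement of SS64 is omitted:

```python
import numpy as np

def dependency_matrix(X):
    """Pairwise dependency matrix T between feature variables; the patent uses
    the maximum information number, here |Pearson r| is a simple stand-in."""
    return np.abs(np.corrcoef(X, rowvar=False))

def reduce_features(X, contrib=0.85):
    """Eigendecompose T (T y = lambda y), keep the leading components whose
    cumulative contribution g_j exceeds `contrib`, and project onto them."""
    Xs = (X - X.mean(0)) / X.std(0)       # normalization step of SS61
    T = dependency_matrix(Xs)
    lam, Y = np.linalg.eigh(T)            # eigh returns ascending order
    order = np.argsort(lam)[::-1]
    lam, Y = lam[order], Y[:, order]
    g = np.cumsum(lam) / lam.sum()        # cumulative contribution g_j
    j = int(np.searchsorted(g, contrib)) + 1
    P = Y[:, :j]                          # principal-component matrix p
    return Xs @ P, g, j

rng = np.random.default_rng(0)
base = rng.normal(size=(100, 2))
# Three features, two of them nearly collinear -> redundancy to remove.
X = np.column_stack([base[:, 0],
                     base[:, 0] * 2 + 0.01 * rng.normal(size=100),
                     base[:, 1]])
Z, g, j = reduce_features(X)
```

Because two of the three features are almost perfectly dependent, the 85% cumulative-contribution rule keeps only two components, illustrating the redundancy removal the patent aims at.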
Further, the step S7 further includes the following:
SS71, for the feature vector X_i of the input image and the corresponding class Y_j, the Pearson correlation coefficient is calculated as:
p_ij = Cov(X_i, Y_j) / (σ_{X_i} · σ_{Y_j}),
where Cov denotes covariance and σ is the standard deviation; p_ij represents the importance of the feature vector with respect to the corresponding class.
SS72, the characteristic-value weight q_ij corresponding to the feature vector X_i is obtained by mapping p_ij, where the value range of q_ij is (0, 1);
SS73, the original feature vector and the corresponding feature value weight are fused to obtain the feature vector under the attention mechanism, and the fusion formula is as follows:
nX_ij = (1 + q_ij) · X_ij
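Steps SS71–SS73 can be sketched as follows. The mapping from p_ij to q_ij is not spelled out in the text, so a sigmoid squashing of |p_ij| is assumed here (it lands in (0, 1) as required):

```python
import numpy as np

def pearson(a, b):
    """Pearson correlation coefficient p = Cov(a, b) / (sigma_a * sigma_b)."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    return float(np.cov(a, b)[0, 1] / (a.std(ddof=1) * b.std(ddof=1)))

def attention_fuse(X, y):
    """Per-feature weight q in (0, 1) from |p|, then fuse: nX = (1 + q) * X.
    The sigmoid mapping below is an assumption of this sketch."""
    q = np.empty(X.shape[1])
    for i in range(X.shape[1]):
        p = pearson(X[:, i], y)
        q[i] = 1.0 / (1.0 + np.exp(-abs(p)))   # assumed mapping into (0, 1)
    return X * (1.0 + q), q

rng = np.random.default_rng(1)
y = rng.normal(size=50)                        # stand-in class variable
X = np.column_stack([y + 0.1 * rng.normal(size=50),   # class-correlated feature
                     rng.normal(size=50)])            # irrelevant feature
nX, q = attention_fuse(X, y)
```

The class-correlated feature receives the larger weight, so the fusion amplifies informative features before they reach the decision-tree SVM of step S8.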
compared with the prior art, the invention has the beneficial effects that:
(1) For the rich lines in the fiber, a 2D Gabor filter is introduced to extract the average 2D Gabor bidirectional line characteristics of the fiber; compared with the traditional Gabor filter, the extracted characteristics are more efficient and more discriminative.
(2) For the problem of fitting fiber contours, a variable-length included-angle piecewise-fitted curve is introduced, which avoids the defects of excessive polylines and loss of angle information where curvature is large and approximates the curve better; on this basis, 3 characteristic values are introduced: curvature, piecewise fitting curvature and piecewise chord-angle change.
(3) In terms of the morphology of the fibers, new fiber features, i.e., minimum circumscribed rectangular density features, are introduced.
(4) For the possibly nonlinear and non-functional relationships among the extracted fiber characteristic data, the fused maximum-information-number and m-PCA method reduces the dimension of the characteristic data, enhancing the effectiveness of extracting characteristic data with complex relationships; in m-PCA, to strengthen the correlation between the selected principal components and the original features, the complex correlation coefficient is introduced after the cumulative contribution rate is calculated, improving the accuracy of principal component analysis and thus the fiber classification precision.
(5) The attention mechanism is introduced to obtain the characteristic weight, so that the characteristic is fused with the attention mechanism to achieve the characteristic enhancement effect.
Drawings
Fig. 1 is a schematic structural diagram of an acquisition system according to the present invention.
Fig. 2 is a schematic structural diagram of the FPGA-based image acquisition system of the present invention.
FIG. 3 is a flow chart of the classification method according to the present invention.
Fig. 4 is a schematic diagram of a method for fitting a piecewise curve by using a variable-length included angle according to the present invention.
Figure 5 is a graph of the minimum circumscribed rectangular density profile of the present invention.
Wherein:
The device comprises an image acquisition module 1, a FIFO buffer 2, a DDR5 control module 3, an image data processing module 4, a display driving module 5, a read-only memory 6, an I2C interface 7, a CMOS image sensor 8, a high-definition multimedia interface 9 and a memory 10.
Detailed Description
In order to better understand the technical solution of the present invention, the following detailed description will be made with reference to the accompanying drawings. It should be understood that the following embodiments are not intended to limit the embodiments of the present invention, but are merely examples of embodiments that may be employed by the present invention. It should be noted that, the description herein of the positional relationship of the components, such as the component a being located above the component B, is based on the description of the relative positions of the components in the drawings, and is not intended to limit the actual positional relationship of the components.
Example 1:
Referring to fig. 1-2, fig. 1 depicts a schematic diagram of the acquisition system of the present invention. As shown in the figure, the microscopic image acquisition and processing system comprises an image acquisition module 1, a FIFO buffer 2, a DDR5 control module 3, an image data processing module 4, a display driving module 5, a read-only memory 6, an I2C interface 7, a CMOS image sensor 8, a high-definition multimedia interface 9 and a memory 10, wherein the CMOS image sensor 8 is connected with the image acquisition module 1, the image acquisition module 1 is connected with the FIFO buffer 2, the FIFO buffer 2 is connected with the DDR5 control module 3, the DDR5 control module 3 is connected with the memory 10, the FIFO buffer 2 is connected with the image data processing module 4, the image data processing module 4 is connected with the display driving module 5, the display driving module 5 is connected with the high-definition multimedia interface 9, namely HDMI, and the read-only memory 6 is connected with the I2C interface 7.
The image acquisition module 1 can adopt a 1/2.8-inch Sony CMOS IMX291 ultra-wide-dynamic, ultra-low-illuminance sensor camera with a rolling-shutter exposure mode and a resolution of 1920 × 1080. The camera is initialized and configured through the I2C bus protocol and focused to shoot and acquire images; the acquired image data are continuously written into the DDR5 SDRAM memory through the FIFO buffer for storage, read out through the FIFO buffer and processed, and the acquired images are finally displayed over HDMI.
The stored image may be uploaded to a host for subsequent processing.
Referring to fig. 3-5, fig. 3 depicts a method flow diagram of the classification method of the present invention. As shown in the figure, the textile fiber classification method of the microscopic image acquisition and processing system comprises the following steps:
s1, acquiring a textile fiber sample image by using an image acquisition system based on an FPGA (field programmable gate array);
SS11, the fiber to be detected is illuminated with a light source to increase the brightness of the shooting environment; the acquisition system adopts a 1/2.8-inch Sony CMOS IMX291 ultra-wide-dynamic, ultra-low-illuminance sensor camera with a rolling-shutter exposure mode, and the camera resolution is 1920 × 1080;
SS12, the acquisition system first initializes and configures the camera through the I2C bus protocol and the camera focuses to shoot and acquire images; the acquired image data are then continuously written into the DDR5 SDRAM through the FIFO buffer for storage; the image data are read out through the FIFO buffer and processed; finally, the acquired images are displayed over HDMI, and the stored images can be uploaded to the host for subsequent processing.
S2, inputting a fiber color image to be identified, preprocessing the image to output a denoised gray scale image, and then dividing the preprocessed image through a threshold value to obtain a binary image;
SS21, inputting a fiber color image to be identified, acquired from the camera with a size of 72 × 72 pixels; graying the RGB image; denoising the fiber image with a mean filter: taking the noisy image as g(x, y), setting the central point Z at (x, y), and taking a square neighborhood window of size c × c to obtain the denoised image f(x, y); constructing the Laplace operator ∇²f(x, y) with its filtering template (given as a matrix in the source), and passing the image through the Laplacian sharpening step to obtain a sharpened image, the Laplacian sharpening the details of the fiber gray-level image; these operations constitute the preprocessing of the image;
And SS22, processing the preprocessed image by an Otsu single-threshold segmentation method, selecting an optimal gray level threshold value, classifying pixels in the gray level image into a fiber target area and a background area by using the threshold value, setting the gray level value of the pixels in the fiber target area to 255, and setting the gray level value of the pixels in the background area to 0, so as to obtain a binary image.
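A minimal NumPy sketch of the SS21-SS22 preprocessing chain: mean filtering, Laplacian sharpening, and Otsu threshold selection. The filter templates in the source are images, so the edge padding, the 4-neighbour Laplacian kernel, and the window size default are assumptions.

```python
import numpy as np

def mean_filter(img, c=3):
    """Denoise with a c x c mean filter (edge-replicated borders, an assumption)."""
    pad = c // 2
    padded = np.pad(img.astype(float), pad, mode='edge')
    out = np.zeros(img.shape, dtype=float)
    for dy in range(c):
        for dx in range(c):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (c * c)

def laplacian_sharpen(img):
    """Sharpen: g = f - lap(f), with the common 4-neighbour Laplacian."""
    img = img.astype(float)
    padded = np.pad(img, 1, mode='edge')
    lap = (padded[:-2, 1:-1] + padded[2:, 1:-1]
           + padded[1:-1, :-2] + padded[1:-1, 2:] - 4.0 * img)
    return np.clip(img - lap, 0, 255)

def otsu_threshold(img):
    """Return the gray level maximizing the between-class variance (Otsu)."""
    hist = np.bincount(img.astype(np.uint8).ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                   # class-0 probability
    mu = np.cumsum(prob * np.arange(256))     # class-0 cumulative mean
    mu_t = mu[-1]
    with np.errstate(divide='ignore', invalid='ignore'):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1 - omega))
    sigma_b[~np.isfinite(sigma_b)] = 0
    return int(np.argmax(sigma_b))
```

Pixels above the returned threshold would be set to 255 (fiber target) and the rest to 0 to form the binary image of SS22.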
S3, as the fiber image has rich line characteristics, a group of 2D Gabor filters in different directions is defined first; after the filters are convolved with the fiber image, the directions with the maximum and second-maximum responses are selected as the line characteristics of the fiber; the lines of the fiber are traversed, and the maximum and second-maximum values obtained for all lines are averaged respectively to obtain the average 2D Gabor bi-directional line feature;
SS31, construction of a 2D Gabor filter, defined as the following equation (1):
f is the frequency of the sine function, θ controls the direction of the filter, q represents the phase shift, and r represents the spatial aspect ratio of the Gabor filter shape; different r values correspond to different scales;
SS32, setting the parameters of the 2D Gabor filter: taking 12 directions of 0, 30, 60, 90, 120, 180, 240, 300 and 360 degrees, central frequencies of 1/16, 1/16.5, 1/17, 1/17.5 and 1/18, and 8 scales of 0.05, 0.15, 0.25, 0.35, 0.45, 0.55, 0.65 and 0.75; the response values m_k of a central pixel in different directions are defined in the following formula (2):
f(x, y) represents the gray value of the current pixel point, L_k represents a line formed by multiple points in the same direction, and G_k is the 2D Gabor filter with direction k. In a fiber image the background gray value is low and the fiber target gray value is high, so the direction with the higher response value is taken as the line characteristic of the pixel point. Further, since the lines inside the fiber image have intersection points and the fiber lines extend in different directions from an intersection, the line characteristic of a pixel is represented bi-directionally on this basis: the directions of the maximum and second-maximum response values are selected as the line directions of the pixel point;
SS33, extracting the 2D Gabor bi-directional line feature, defined as the following formula (3):
v_1 and v_2 are the directions k_1 and k_2 of the maximum and second-maximum response values, respectively;
SS34, the crossing points of the lines in the fiber need to be identified to determine whether a direction should be deleted. Specifically, the image is thinned and the skeleton of the target fiber is extracted: skeleton pixels belonging to a line target are white, and background pixels are black. A 3 × 3 template is constructed to traverse all lines in the thinned fiber image, and it is judged whether the number of white pixels among the 8-neighborhood points [P_1, P_2, P_3, P_4, P_5, P_6, P_7, P_8] is greater than 2; if so, the point is a crossing point, and its position is recorded;
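The 3 × 3 crossing-point test of SS34 can be sketched as follows on a thinned 0/1 skeleton; skipping the one-pixel image border is an assumption for simplicity.

```python
import numpy as np

def crossing_points(skeleton):
    """Find line intersections on a thinned (0/1) fiber skeleton.

    A skeleton pixel whose 8-neighbourhood [P1..P8] contains more than
    two white pixels is recorded as a crossing point, as in step SS34.
    """
    h, w = skeleton.shape
    points = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if skeleton[y, x] != 1:
                continue
            # Sum the 3x3 window and exclude the centre pixel itself.
            neighbours = skeleton[y - 1:y + 2, x - 1:x + 2].sum() - 1
            if neighbours > 2:
                points.append((y, x))
    return points
```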
SS35, the maximum and second-maximum values obtained for all lines are averaged respectively to obtain the average 2D Gabor bi-directional line feature, with a corresponding dimension of 2 × 8 × 5 = 80.
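Since equation (1) is an image in the source, the sketch below uses the standard real 2D Gabor form (Gaussian envelope times a cosine carrier) with the parameters named in the text (f, θ, q, r); the envelope width sigma and the kernel size are assumptions. It extracts the two strongest directional responses (v1, v2) of SS33 at the image centre.

```python
import numpy as np

def gabor_kernel(f, theta, r=0.35, q=0.0, size=11):
    """Real 2D Gabor kernel; sigma = 0.56/f is a common bandwidth-derived
    choice and an assumption here (the source's formula (1) is an image)."""
    sigma = 0.56 / f
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr ** 2 + (r * yr) ** 2) / (2 * sigma ** 2))
    return envelope * np.cos(2 * np.pi * f * xr + q)

def bidirectional_features(img, thetas, f=1 / 16):
    """Correlate one kernel per direction with the centre patch and keep
    the directions of the maximum and second-maximum responses (SS33)."""
    h, w = img.shape
    responses = []
    for t in thetas:
        k = gabor_kernel(f, t)
        kh = k.shape[0] // 2
        patch = img[h // 2 - kh:h // 2 + kh + 1, w // 2 - kh:w // 2 + kh + 1]
        responses.append(float((patch * k).sum()))
    order = np.argsort(responses)[::-1]
    return thetas[order[0]], thetas[order[1]]
```

With this orientation convention, a kernel at θ = π/2 responds most strongly to a horizontal line, matching the idea of taking the dominant response direction as the line feature.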
S4, detecting and analyzing the fiber contour as a curve object, and constructing a segmented fitting curve by using a variable-length included angle, dividing the curve into n segments, calculating segmented chord lengths and total chord lengths, and constructing 3 characteristics of curvature, fitting curvature and segmented chord length angle change on the basis;
SS41, eroding and dilating the image: with the image taken as a pixel matrix A and an erosion structuring element B (given as a matrix in the source), A is eroded by B to obtain matrix C; a dilation structuring element is then set and applied to obtain matrix E, which is converted back into an image. The image is scanned by a progressive scanning method until a continuous region is found; starting from the region's starting point, the boundary pixels are marked along the region contour, and after the contour is processed, scanning continues from the previous position until a new region is found, thereby obtaining the fiber contour;
SS42, fitting a segmented curve by using a variable-length included angle, referring to FIG. 4: first traverse the curve, recording the starting point C and the end point D, and denote the curve CD; set an initial angle θ and a fixed length p; starting from point C, traverse curve CD, and when CE_1 equals p, record point E_1;
SS43, searching from point E_1 toward point D, recording the first point F_1 at which the angle between CE_1 and CF_1 exceeds θ;
SS44, selecting the point on the curve CE_1F_1 farthest from the straight line CF_1, denoted S_1;
SS45, traversing the curve S_1D with S_1 as the starting point, and repeating steps SS42-SS44 to obtain points S_2, …, S_n;
SS46, the line segments on the curve are then CS_1, S_1S_2, …, S_nD respectively;
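The SS42-SS46 split procedure can be sketched as below on an ordered list of contour points. Using the chord distance (rather than arc length) for the fixed length p is an assumption where the text is terse.

```python
import numpy as np

def split_points(curve, theta=np.radians(15), p=5.0):
    """Variable-length-angle segmentation (SS42-SS46).

    curve: (N, 2) array of ordered contour points from C to D.
    Returns the indices of the split points S1..Sn.
    """
    def angle(u, v):
        c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v) + 1e-12)
        return np.arccos(np.clip(c, -1.0, 1.0))

    splits, start, n = [], 0, len(curve)
    while True:
        C = curve[start]
        # E1: first point at chord distance >= p from C (SS42)
        e = next((i for i in range(start + 1, n)
                  if np.linalg.norm(curve[i] - C) >= p), None)
        if e is None:
            break
        # F1: first point past E1 where angle(CE1, CF1) exceeds theta (SS43)
        f = next((i for i in range(e + 1, n)
                  if angle(curve[e] - C, curve[i] - C) > theta), None)
        if f is None:
            break
        # S: point on C..F farthest from the straight line C-F (SS44)
        d = curve[f] - C
        d = d / np.linalg.norm(d)
        seg = curve[start:f + 1] - C
        dist = np.abs(seg[:, 0] * d[1] - seg[:, 1] * d[0])
        s = start + int(np.argmax(dist))
        if s == start:      # guard against a degenerate split
            break
        splits.append(s)
        start = s           # SS45: continue from S as the new start
    return splits
```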
SS47, sampling on the unit-pixel contour line yields mark points D(x_i, y_i), i = 0, 1, 2, …, n; the contour line is divided into n segments, and the total chord length L_n and the segmented chord length k are given by the following formula (4):
SS48, establishing the curvature feature w from the piecewise fitted curve: curvature here refers to the ratio of the arc length to the overall chord length of the piecewise fitted curve; with mark points D(x_i, y_i), i = 0, 1, 2, …, n, chord length F_x and arc length taken as the total chord length L_n, the curvature w is given by the following formula (5):
SS49, establishing the fitting curvature nq feature from the piecewise fitted curve; the curvature k_i at mark point D(x_i, y_i) is given by the following formula (6):
The fitting curvature nq is the sum of absolute values of the curvatures on the fitting line segment, and the formula is as follows:
SS410, establishing the segmented chord length angle change τ feature from the piecewise fitted curve: mark point D(x_i, y_i) is a pixel point on the curve, and D_{i+N} and D_{i-N} are points at a distance of N pixels from point D_i on the contour line; vectors T_{i+N} and T_{i-N} are obtained from formula (8), and the included angle between two adjacent segment vectors is calculated; the n pixel points on one curve yield n - 2N included angles in total, which are accumulated and averaged to obtain the segmented chord length angle change τ over the whole curve, as in the following formula (8):
s5, calculating the minimum circumscribed rectangle of the fiber and the fiber area, and extracting the density characteristic of the minimum circumscribed rectangle;
SS51, performing a filling operation after eroding and dilating the binary fiber image to obtain the boundary of the target; the target is rotated through a range of 180 degrees in increments of 5 degrees, and the maximum and minimum boundary points of the circumscribed rectangle are recorded at each step;
SS52, calculating the perimeter of the minimum circumscribed rectangle of the target fiber, setting as T, calculating the pixel number of the fiber target, wherein the pixel number is the area a of the fiber, and the characteristic of the density U of the minimum circumscribed rectangle is shown in the following formula (9):
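The SS51-SS52 rotation search can be sketched as below. Formula (9) is an image in the source, so the density U = a / T² (fiber area over squared rectangle perimeter) is an assumed dimensionless reading.

```python
import numpy as np

def min_rect_density(mask, step_deg=5):
    """Brute-force minimum bounding rectangle (SS51) and density (SS52).

    Rotates the foreground point set through 180 degrees in step_deg
    increments, keeps the tightest axis-aligned box, then forms
    U = a / T**2 from the fibre pixel count a and rectangle perimeter T
    (the exact formula (9) is an image in the source; this is an assumption).
    """
    ys, xs = np.nonzero(mask)
    pts = np.stack([xs, ys], axis=1).astype(float)
    best = None
    for deg in range(0, 180, step_deg):
        t = np.radians(deg)
        R = np.array([[np.cos(t), -np.sin(t)],
                      [np.sin(t),  np.cos(t)]])
        rot = pts @ R.T
        w = rot[:, 0].max() - rot[:, 0].min()
        h = rot[:, 1].max() - rot[:, 1].min()
        if best is None or w * h < best[0] * best[1]:
            best = (w, h)
    T = 2 * (best[0] + best[1])   # rectangle perimeter
    a = float(len(pts))           # fibre area in pixels
    return a / (T ** 2) if T > 0 else 0.0
```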
s6, extracting feature vectors in S3-S5, and performing dimension reduction treatment on the feature data by a method of fusing the maximum information number with m-PCA (multi-linear principal component analysis);
SS61, for the extracted variables x_i′ (i = 1, 2, …, m), where the number of samples is m, a normalization operation is performed on x_i′; the maximum information coefficient is then calculated using the following formula (10):
where B is the grid-size parameter, taken as B = m^0.6, and I(x_1, x_2) represents the mutual information between x_1 and x_2;
The maximum information coefficients of all the feature variables are calculated to obtain a matrix T, as shown in the following formula (11):
SS62, letting Ty = λy, the eigenvalues of T and the corresponding eigenvectors {(λ_1, y_1), (λ_2, y_2), …, (λ_t, y_t)} are obtained, where λ_1 > λ_2 > … > λ_t > 0, giving t new feature vectors Y = [y_1, y_2, …, y_t], i.e. the principal components (y_1, y_2, …, y_t);
SS63, the cumulative contribution rate g_j is calculated from the principal components y_1, y_2, …, y_j (j = 1, 2, …, t) and the eigenvalues λ, as shown in the following formula (12):
If g_j is greater than 85%, the principal components [y_1, y_2, …, y_j] are selected as the new principal component matrix p;
SS64, to enhance the degree of correlation between the selected principal components and the original features, multiple correlation coefficients are introduced after the cumulative contribution rate is calculated: the multiple correlation coefficients between the original features and the principal component groups from j to t-j are computed respectively (the defining formula, in which b is a parameter vector of the corresponding dimension, is given as an image in the source), yielding multiple correlation coefficients F from j to t-j; those greater than 0.9 are retained, giving k additional principal components;
SS65, the final number of principal components is j + k; the principal component matrix is denoted p_{j+k}, and X = x_i × p_{j+k} reduces the dimension of the original sample matrix through the principal component matrix p_{j+k}; the reduced matrix X has size 72 × m, i.e. dimension 72.
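The S6 pipeline (association matrix, eigendecomposition, cumulative-contribution cut) can be sketched as below. The true method builds T from maximal information coefficients (formulas (10)-(11)); a plain absolute Pearson correlation matrix stands in here as an assumption, and the 0.9 multiple-correlation refinement of SS64 is omitted.

```python
import numpy as np

def reduce_features(X, threshold=0.85):
    """Eigendecompose a feature-association matrix T and keep the leading
    components whose cumulative contribution g_j first exceeds threshold.

    X: (samples, features). Returns the dimension-reduced sample matrix.
    """
    Z = (X - X.mean(axis=0)) / (X.std(axis=0) + 1e-12)  # normalization (SS61)
    T = np.abs(np.corrcoef(Z, rowvar=False))            # stand-in for the MIC matrix
    lam, Y = np.linalg.eigh(T)                          # Ty = lambda*y (SS62)
    order = np.argsort(lam)[::-1]                       # sort descending
    lam, Y = lam[order], Y[:, order]
    g = np.cumsum(lam) / lam.sum()                      # cumulative contribution (SS63)
    j = int(np.searchsorted(g, threshold)) + 1
    return Z @ Y[:, :j]                                 # project onto j components
```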
S7, performing weight distribution on the feature vector subjected to dimension reduction by using an attention mechanism, and calculating a correlation coefficient between the fiber class and the feature value variable to obtain a Pearson correlation coefficient;
SS71, for the feature vector X_i of the input image and the corresponding class Y_j, the Pearson correlation coefficient is calculated with the following formula (13):
Cov denotes covariance and σ is the standard deviation; p_ij represents the importance of the feature vector with respect to the corresponding class.
SS72, the feature value weight corresponding to feature vector X_i is obtained through the following formula (14), where the value range of q_ij is (0, 1):
SS73, fusing the original feature vector with the corresponding feature value weight yields the feature vector under the attention mechanism; the fusion formula is the following formula (15):
nX_ij = (1 + q_ij)X_ij (15)。
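The S7 weighting can be sketched as below. The Pearson coefficient and the fusion nX = (1 + q)X follow formulas (13) and (15); the mapping from p to q in (0, 1), formula (14), is an image in the source, so squashing the absolute correlation with a sigmoid is an assumption.

```python
import numpy as np

def attention_weights(X, y):
    """Pearson-based feature weighting (S7, SS71-SS73).

    X: (samples, features), y: numeric class labels per sample.
    Returns the reweighted feature matrix nX = (1 + q) * X.
    """
    Xc = X - X.mean(axis=0)
    yc = y - y.mean()
    # Pearson correlation of each feature column with the class variable (13)
    p = (Xc * yc[:, None]).mean(axis=0) / (X.std(axis=0) * y.std() + 1e-12)
    # Assumed mapping of formula (14): sigmoid of |p|, landing in (0, 1)
    q = 1.0 / (1.0 + np.exp(-np.abs(p)))
    return (1.0 + q) * X  # fusion per formula (15)
```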
S8, inputting the feature vector into a decision tree SVM for training;
SS81, making a fiber data set: the fibers are divided into Modal cotton, viscose cotton and flax cotton, marked with digital labels 1, 2 and 3 respectively; 70% of the 200 fiber images are selected as the training set and 30% as the test set, and the feature vectors are input into the decision tree SVM classification model for training. The training process first treats the fiber class with label 1 as the positive class and the classes with labels 2 and 3 as the negative class; on that basis, label 2 is then separated from label 3, thereby realizing the multi-class classification of the fibers. The penalty parameter C of the SVM is taken as 1, and, considering that the number of samples is larger than the feature dimension, a relatively stable Gaussian kernel function is adopted with its parameter taken as 0.05.
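The decision-tree arrangement of SS81 (node 1 separates class 1 from {2, 3}; node 2 separates class 2 from class 3) can be sketched as below. The patent uses RBF-kernel SVMs with C = 1 and kernel parameter 0.05 at each node; a nearest-centroid rule stands in for the binary SVM here to keep the sketch dependency-free.

```python
import numpy as np

class CentroidBinary:
    """Stand-in binary classifier (an assumption; the source uses an
    RBF-kernel SVM with C = 1 and kernel parameter 0.05 at each node)."""
    def fit(self, X, y):
        self.c_pos = X[y == 1].mean(axis=0)
        self.c_neg = X[y == 0].mean(axis=0)
        return self
    def predict(self, X):
        d_pos = np.linalg.norm(X - self.c_pos, axis=1)
        d_neg = np.linalg.norm(X - self.c_neg, axis=1)
        return (d_pos < d_neg).astype(int)

class DecisionTreeSVM:
    """Decision-tree multi-class scheme from SS81 for labels {1, 2, 3}."""
    def fit(self, X, y):
        # Node 1: label 1 is the positive class, labels 2 and 3 negative.
        self.node1 = CentroidBinary().fit(X, (y == 1).astype(int))
        # Node 2: among the rest, label 2 positive, label 3 negative.
        rest = y != 1
        self.node2 = CentroidBinary().fit(X[rest], (y[rest] == 2).astype(int))
        return self
    def predict(self, X):
        out = np.where(self.node1.predict(X) == 1, 1, 0)
        rest = out == 0
        if rest.any():
            out[rest] = np.where(self.node2.predict(X[rest]) == 1, 2, 3)
        return out
```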
The foregoing is merely a specific application example of the present invention, and the protection scope of the present invention is not limited in any way. All technical schemes formed by equivalent transformation or equivalent substitution fall within the protection scope of the invention.

Claims (8)

1. A method for classifying textile fibers of a microscopic image acquisition and processing system, comprising the steps of:
s1, acquiring a textile fiber sample image by using an image acquisition system based on an FPGA;
s2, inputting a fiber color image to be identified, preprocessing the image to output a denoised gray scale image, and dividing the preprocessed image through a threshold value to obtain a binary image;
S3, as the fiber image has rich line characteristics, a group of 2D Gabor filters in different directions is defined first; after the filters are convolved with the fiber image, the directions with the maximum and second-maximum responses are selected as the line characteristics of the fiber; the lines of the fiber are traversed, and the maximum and second-maximum values obtained for all lines are averaged respectively to obtain the average 2D Gabor bi-directional line feature;
S4, detecting and analyzing the fiber contour as a curve object, and constructing a segmented fitting curve by using a variable-length included angle, dividing the curve into n segments, calculating segmented chord lengths and total chord lengths, and constructing 3 characteristics of curvature, fitting curvature and segmented chord length angle change on the basis;
s5, calculating the minimum circumscribed rectangle of the fiber and the fiber area, and extracting the density characteristic of the minimum circumscribed rectangle;
s6, extracting the feature vectors in the steps S3-S5, and performing dimension reduction on the feature data by fusing the maximum information number and the m-PCA method;
S7, performing weight distribution on the feature vector after the dimension reduction by using an attention mechanism, calculating a correlation coefficient between the fiber class and the feature value variable to obtain a Pearson correlation coefficient, inputting the Pearson correlation coefficient into a mapping function, outputting an optimized weight of the original feature value mapping, and finally outputting a new feature vector by combining the weight with the original feature vector;
S8, inputting the feature vector into a decision tree SVM for training.
2. The method for classifying textile fibers of a microscopic image acquisition and processing system according to claim 1, wherein in step S1 the acquisition system first performs initialization configuration on the camera through the I²C bus protocol, the camera focuses, shoots and acquires images, the acquired image data are then continuously written into the DDR5 SDRAM through the FIFO buffer for storage, the image data are read out through the FIFO buffer and processed, the acquired images are finally displayed via HDMI, and the stored images are uploaded to the host for subsequent processing.
3. The method for classifying textile fibers in a microscopic image acquisition and processing system according to claim 1, wherein said step S2 further comprises the steps of:
SS21, inputting a fiber color image to be identified, acquired from the camera with a size of 72 × 72 pixels; graying the RGB image; denoising the fiber image with a mean filter: taking the noisy image as g(x, y), setting the central point Z at (x, y), and taking a square neighborhood window of size c × c to obtain the denoised image f(x, y); constructing the Laplace operator with its filtering template (given as a matrix in the source), and passing the image through the Laplacian sharpening step to obtain a sharpened image, sharpening the details of the fiber gray-level image; these operations constitute the preprocessing of the image;
And SS22, processing the preprocessed image by an Otsu single-threshold segmentation method, selecting an optimal gray level threshold value, classifying pixels in the gray level image into a fiber target area and a background area by using the threshold value, setting the gray level value of the pixels in the fiber target area to 255, and setting the gray level value of the pixels in the background area to 0, so as to obtain a binary image.
4. The method for classifying textile fibers in a microscopic image acquisition and processing system according to claim 1, wherein said step S3 further comprises the steps of:
SS31, constructing a 2D Gabor filter, defined as the following formula:
f is the frequency of the sine function, θ controls the direction of the filter, q represents the phase shift, and r represents the spatial aspect ratio of the Gabor filter shape; different r values correspond to different scales;
SS32, setting the parameters of the 2D Gabor filter: taking 12 directions of 0, 30, 60, 90, 120, 180, 240, 300 and 360 degrees, central frequencies of 1/16, 1/16.5, 1/17, 1/17.5 and 1/18, and 8 scales of 0.05, 0.15, 0.25, 0.35, 0.45, 0.55, 0.65 and 0.75; the response values m_k of a central pixel in different directions are defined in the following formula:
f(x, y) represents the gray value of the current pixel point, L_k represents a line formed by multiple points in the same direction, and G_k is the 2D Gabor filter with direction k. In a fiber image the background gray value is low and the fiber target gray value is high, so the direction with the higher response value is taken as the line characteristic of the pixel point. Further, since the lines inside the fiber image have intersection points and the fiber lines extend in different directions from an intersection, the line characteristic of a pixel is represented bi-directionally on this basis: the directions of the maximum and second-maximum response values are selected as the line directions of the pixel point;
SS33, extracting the 2D Gabor bi-directional line feature, defined as the following formula:
v_1 and v_2 are the directions k_1 and k_2 of the maximum and second-maximum response values, respectively;
SS34, the crossing points of the lines in the fiber need to be identified to determine whether a direction should be deleted. Specifically, the image is thinned and the skeleton of the target fiber is extracted: skeleton pixels belonging to a line target are white, and background pixels are black. A 3 × 3 template is constructed to traverse all lines in the thinned fiber image, and it is judged whether the number of white pixels among the 8-neighborhood points [P_1, P_2, P_3, P_4, P_5, P_6, P_7, P_8] is greater than 2; if so, the point is a crossing point, and its position is recorded;
SS35, the maximum and second-maximum values obtained for all lines are averaged respectively to obtain the average 2D Gabor bi-directional line feature, with a corresponding dimension of 2 × 8 × 5 = 80.
5. The method for classifying textile fibers in a microscopic image acquisition and processing system according to claim 1, wherein said step S4 further comprises the steps of:
SS41, eroding and dilating the image: with the image taken as a pixel matrix A and an erosion structuring element B (given as a matrix in the source), A is eroded by B to obtain matrix C; a dilation structuring element is then set and applied to obtain matrix E, which is converted back into an image. The image is scanned by a progressive scanning method until a continuous region is found; starting from the region's starting point, the boundary pixels are marked along the region contour, and after the contour is processed, scanning continues from the previous position until a new region is found, thereby obtaining the fiber contour;
SS42, fitting a segmented curve by using a variable-length included angle: first traverse the curve, recording the starting point C and the end point D, and denote the curve CD; set an initial angle θ and a fixed length p; starting from point C, traverse curve CD, and when CE_1 equals p, record point E_1;
SS43, searching from point E_1 toward point D, recording the first point F_1 at which the angle between CE_1 and CF_1 exceeds θ;
SS44, selecting the point on the curve CE_1F_1 farthest from the straight line CF_1, denoted S_1;
SS45, traversing the curve S_1D with S_1 as the starting point, and repeating steps SS42-SS44 to obtain points S_2, …, S_n;
SS46, the line segments on the curve are then CS_1, S_1S_2, …, S_nD respectively;
SS47, sampling on the unit-pixel contour line yields mark points D(x_i, y_i), i = 0, 1, 2, …, n; the contour line is divided into n segments, and the total chord length L_n and the segmented chord length k are respectively:
SS48, establishing the curvature feature w from the piecewise fitted curve: curvature here refers to the ratio of the arc length to the overall chord length of the piecewise fitted curve; with mark points D(x_i, y_i), i = 0, 1, 2, …, n, chord length F_x and arc length taken as the total chord length L_n, the curvature w is given by the following formula:
SS49, establishing the fitting curvature nq feature from the piecewise fitted curve; the curvature k_i at mark point D(x_i, y_i) is given by the following formula:
The fitting curvature nq is the sum of absolute values of the curvatures on the fitting line segment, and the formula is as follows:
SS410, establishing the segmented chord length angle change τ feature from the piecewise fitted curve: mark point D(x_i, y_i) is a pixel point on the curve, and D_{i+N} and D_{i-N} are points at a distance of N pixels from point D_i on the contour line; vectors T_{i+N} and T_{i-N} are obtained from the following formula, and the included angle between two adjacent segment vectors is calculated; the n pixel points on one curve yield n - 2N included angles in total, which are accumulated and averaged to obtain the segmented chord length angle change τ over the whole curve, the formula being as follows:
6. the method for classifying textile fibers in a microscopic image acquisition and processing system according to claim 1, wherein said step S5 further comprises the steps of:
SS51, performing a filling operation after eroding and dilating the binary fiber image to obtain the boundary of the target; the target is rotated through a range of 180 degrees in increments of 5 degrees, and the maximum and minimum boundary points of the circumscribed rectangle are recorded at each step;
SS52, calculating the perimeter of the minimum circumscribed rectangle of the target fiber, setting as T, calculating the pixel number of the fiber target, wherein the pixel number is the area a of the fiber, and the characteristic of the density U of the minimum circumscribed rectangle is shown in the following formula:
7. The method for classifying textile fibers in a microscopic image acquisition and processing system according to claim 1, wherein said step S6 further comprises the steps of:
SS61, for the extracted variables x_i′ (i = 1, 2, …, m), where the number of samples is m, a normalization operation is performed on x_i′; the maximum information coefficient is then calculated using the following formula:
where B is the grid-size parameter, taken as B = m^0.6, and I(x_1, x_2) represents the mutual information between x_1 and x_2;
The maximum information coefficients of the feature variables are calculated to obtain a matrix T, as shown in the following formula:
SS62, letting Ty = λy, the eigenvalues of T and the corresponding eigenvectors {(λ_1, y_1), (λ_2, y_2), …, (λ_t, y_t)} are obtained, where λ_1 > λ_2 > … > λ_t > 0, giving t new feature vectors Y = [y_1, y_2, …, y_t], i.e. the principal components (y_1, y_2, …, y_t);
SS63, the cumulative contribution rate g_j is calculated from the principal components y_1, y_2, …, y_j (j = 1, 2, …, t) and the eigenvalues λ, the calculation formula being as follows:
If g_j is greater than 85%, the principal components [y_1, y_2, …, y_j] are selected as the new principal component matrix p;
SS64, to enhance the degree of correlation between the selected principal components and the original features, multiple correlation coefficients are introduced after the cumulative contribution rate is calculated: the multiple correlation coefficients between the original features and the principal component groups from j to t-j are computed respectively (the defining formula, in which b is a parameter vector of the corresponding dimension, is given as an image in the source), yielding multiple correlation coefficients F from j to t-j; those greater than 0.9 are retained, giving k additional principal components;
SS65, the final number of principal components is j + k; the principal component matrix is denoted p_{j+k}, and X = x_i × p_{j+k} reduces the dimension of the original sample matrix through the principal component matrix p_{j+k}; the reduced matrix X has size 72 × m, i.e. dimension 72.
8. The method for classifying textile fibers in a microscopic image acquisition and processing system according to claim 1, wherein said step S7 further comprises the steps of:
SS71, for the feature vector X_i of the input image and the corresponding class Y_j, the Pearson correlation coefficient is calculated with the following formula:
Cov denotes covariance, σ is the standard deviation, and p_ij represents the importance of the feature vector with respect to the corresponding class;
SS72, obtaining the characteristic value weight corresponding to the characteristic vector X i through the following formula, wherein the value range of q ij is (0, 1):
SS73, the original feature vector and the corresponding feature value weight are fused to obtain the feature vector under the attention mechanism, the fusion formula being as follows:
nXij=(1+qij)Xij
CN202310927676.4A 2023-07-26 2023-07-26 A microscopic image acquisition and processing system and textile fiber classification method thereof Active CN116844159B (en)

Publications (2)

Publication Number Publication Date
CN116844159A CN116844159A (en) 2023-10-03
CN116844159B (en) 2024-12-13




