[go: up one dir, main page]

CN118505707B - Turbine guide vane quality rapid detection method based on machine learning - Google Patents

Turbine guide vane quality rapid detection method based on machine learning Download PDF

Info

Publication number
CN118505707B
CN118505707B CN202410968941.8A CN202410968941A CN118505707B CN 118505707 B CN118505707 B CN 118505707B CN 202410968941 A CN202410968941 A CN 202410968941A CN 118505707 B CN118505707 B CN 118505707B
Authority
CN
China
Prior art keywords
low
bright
turbine guide
point
combination
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202410968941.8A
Other languages
Chinese (zh)
Other versions
CN118505707A (en
Inventor
剧亚东
厉福海
贾婷
王艳平
王佳伟
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Suzhou Hanwei Material Technology Co ltd
Original Assignee
Suzhou Hanwei Material Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Suzhou Hanwei Material Technology Co ltd filed Critical Suzhou Hanwei Material Technology Co ltd
Priority to CN202410968941.8A priority Critical patent/CN118505707B/en
Publication of CN118505707A publication Critical patent/CN118505707A/en
Application granted granted Critical
Publication of CN118505707B publication Critical patent/CN118505707B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/13Edge detection
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/168Segmentation; Edge detection involving transform domain methods
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/20Special algorithmic details
    • G06T2207/20048Transform domain processing
    • G06T2207/20061Hough transform
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30108Industrial image inspection

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Quality & Reliability (AREA)
  • Image Processing (AREA)
  • Image Analysis (AREA)

Abstract

The invention relates to the technical field of image processing, in particular to a quick detection method for the quality of a turbine guide vane based on machine learning, which comprises the following steps: acquiring a turbine guide blade image to obtain data points in a Hough space; dividing the data points into high bright spots and low bright spots; according to the difference of the horizontal coordinates, the vertical coordinates and the brightness of the low-bright spots and the high-bright spots, the possibility that each low-bright spot belongs to each high-bright spot is obtained, and a plurality of combinations are obtained; and detecting the quality of the turbine guide blade according to the number of the low-light spots contained in each combination, the possibility that each low-light spot in each combination belongs to the high-light spot in the combination, and the angle difference and the brightness difference between each low-light spot in each combination and the high-light spot in the combination. The method aims at solving the problem that the quality detection of the turbine guide blade cannot be directly carried out because the whole linear expression degree of the edge where the edge line is located cannot be directly obtained through Hough transformation.

Description

Turbine guide vane quality rapid detection method based on machine learning
Technical Field
The invention relates to the technical field of image processing, in particular to a quick detection method for the quality of a turbine guide vane based on machine learning.
Background
Turbine guide vanes are widely used in the fields of aviation, energy, and the like. Since the high quality turbine guide blades greatly reduce the probability of failure of mechanical equipment such as aircraft, quality detection of turbine guide blades in mechanical equipment is performed. Since the turbine guide vane is slightly deformed during operation due to fatigue, resonance, abrasion and the like, it cannot be detected by the existing method. Thus, the turbine guide vane is subjected to a minute deformation detection.
When the turbine guide vane is not deformed, the turbine guide vane can be regarded as being formed by a plurality of edges, and each edge is formed by an edge straight line; when the turbine guide vane is slightly deformed, the turbine guide vane is also composed of a plurality of edges, but the deformed edge is composed of a plurality of edge lines instead of one edge line. When deformation detection is performed on the turbine guide blade through Hough transformation, only each edge straight line is analyzed, and the whole analysis is not performed on the edge where each edge straight line is located, so that the deformation detection cannot be directly performed on the turbine guide blade through Hough transformation.
Disclosure of Invention
The invention provides a quick detection method for the quality of a turbine guide vane based on machine learning, which aims to solve the existing problems.
The quick detection method for the quality of the turbine guide vane based on machine learning adopts the following technical scheme:
The invention provides a quick detection method for the quality of a turbine guide vane based on machine learning, which comprises the following steps:
Acquiring a turbine guide vane image;
performing edge detection on the turbine guide blade image to obtain an edge detection image of the turbine guide blade image; mapping an edge detection image of the turbine guide vane image into a Hough space to obtain a plurality of data points in the Hough space; each data point has a brightness value brightness;
dividing the data points into high-bright points and low-bright points according to the brightness value of each data point in the Hough space; obtaining the latest Euclidean distance between each low-bright point and each high-bright point according to the difference of the transverse coordinates and the longitudinal coordinates of each low-bright point and each high-bright point in the Hough space; obtaining the possibility that each low-bright point belongs to each high-bright point according to the latest Euclidean distance between each low-bright point and each high-bright point and the difference between the brightness values of the low-bright point and the high-bright point, and dividing all the high-bright points and the low-bright points into a plurality of combinations;
Obtaining the deformation degree of the corresponding edge of each combination according to the number of the low-light spots contained in each combination, the possibility that the low-light spots in each combination belong to the high-light spots in each combination and the angle difference and the brightness difference of the low-light spots and the high-light spots in each combination; and obtaining the deformation degree of the turbine guide blade according to the deformation degree of the corresponding edge of each combination, and detecting the quality of the turbine guide blade.
Further, the specific method for acquiring the turbine guide vane image comprises the following steps:
Acquiring a blade image containing a background;
Graying the blade image containing the background to obtain a blade gray map containing the background, performing semantic segmentation on the blade gray map containing the background to obtain a blade gray map without the background, and marking the blade gray map without the background as a turbine guide blade image.
Further, the mapping the edge detection image of the turbine guiding vane image into the hough space to obtain a plurality of data points in the hough space comprises the following specific methods:
Taking an edge detection image of the turbine guide blade image as the input of Hough transformation, and outputting a coordinate point corresponding to each edge straight line in the edge detection image of the turbine guide blade image in Hough space;
And (5) marking coordinate points corresponding to the edge straight lines of the turbine guide vane images in the Hough space as data points.
Further, the method for dividing the data points into high-bright points and low-bright points according to the brightness value of each data point in the hough space comprises the following specific steps:
Presetting a brightness threshold The data points with the brightness value larger than the brightness threshold value in the Hough space are marked as high-bright points, and the data points with the brightness value smaller than or equal to the brightness threshold value in the Hough space are marked as low-bright points.
Further, the method for obtaining the latest euclidean distance between each low bright point and each high bright point according to the difference of the transverse coordinates of each low bright point and the longitudinal coordinates of each high bright point in the Hough space comprises the following specific steps:
In the method, in the process of the invention, Represent the firstLow bright point and the firstThe latest euclidean distance of the individual highlights,Indicating a preset weight increase value,Representing the first in Hough spaceThe abscissa of the individual highlight points,Representing the first in Hough spaceThe abscissa of the individual low-bright spots,Representing the first in Hough spaceThe ordinate of the individual highlight points,Representing the first in Hough spaceThe ordinate of the individual low spots.
Further, according to the latest euclidean distance between each low-bright point and each high-bright point and the difference between the brightness values of the low-bright point and the high-bright point, the possibility that each low-bright point belongs to each high-bright point is obtained, and the specific method comprises the following steps:
In the method, in the process of the invention, Represent the firstThe low bright point belongs toThe possibility of a high-point spot,Representing the first in Hough spaceThe brightness value of the individual low-bright spots,Representing the first in Hough spaceThe luminance value of the individual highlight points,Represent the firstLow bright point and the firstThe latest euclidean distance of the individual highlights,As an exponential function with a base of natural constant,Representing the normalization function.
Further, the method for dividing all the high-bright spots and the low-bright spots into a plurality of combinations comprises the following specific steps:
Will be the first A high bright point corresponding to the maximum value of the possibility that the low bright point belongs to each high bright point, andThe low bright spots are corresponding bright spots to each other, so that all corresponding bright spots of each high bright spot are obtained;
Will be the first A highlight point and a firstAll the corresponding bright spots of the high bright spots form a combination, the first isThe combination of the high-brightness points is marked as the firstCombinations of two.
Further, the deformation degree of the corresponding edge of each combination is obtained according to the number of the low-light-point contained in each combination, the possibility that the low-light-point belongs to the high-light-point in each combination, and the angle difference and the brightness difference of the low-light-point and the high-light-point in each combination, which comprises the following specific methods:
In the method, in the process of the invention, Represent the firstThe respective combinations correspond to the degree of deformation of the edges,Represent the firstThe number of low-light spots contained in each combination,Represent the firstIn the first combinationThe low bright point belongs toThe possibility of a high-point spot,Represent the firstIn the first combinationLow bright point and the firstThe absolute value of the difference in luminance values of the individual highlights,Represent the firstIn the first combinationLow bright point and the firstThe difference in the angle of the individual highlights,Representing a sigmoid function.
Further, the method for obtaining the deformation degree of the turbine guide vane according to the deformation degree of the corresponding edge of each combination comprises the following specific steps:
The maximum value of the deformation degree of the corresponding edges of all the combinations is recorded as the deformation degree of the turbine guide vane.
Further, the method for detecting the quality of the turbine guide vane comprises the following specific steps:
Presetting a deformation threshold If the deformation degree of the turbine guide blade is greater than the deformation threshold value, the turbine guide blade is marked as a blade with unqualified quality.
The technical scheme of the invention has the beneficial effects that: when the quality detection is carried out on the turbine guide blade by analyzing the result of Hough transformation of the turbine guide blade, according to the characteristic that the difference of the transverse coordinates of the coordinate points corresponding to two edge straight lines in one edge of the turbine guide blade in Hough space is far smaller than the difference of the transverse coordinates of the coordinate points corresponding to two edge straight lines in different edges, when the latest Euclidean distance between each low bright point and a high bright point in Hough space is calculated, the weight of the transverse coordinates of each low bright point and the high bright point is increased, so that the latest Euclidean distance between a longer edge straight line in one edge and a low bright point corresponding to a shorter edge straight line is far smaller than the latest Euclidean distance between a high bright point corresponding to a longer edge straight line in different edges and the latest Euclidean distance between a plurality of edge straight lines in the same edge and the low bright point in a combination; according to the characteristic that a longer edge straight line in an edge with micro deformation and an edge straight line of a deformation area in the edge have larger angle difference, the possibility that the edge straight line corresponding to the low bright point in each combination is the edge straight line of the deformation area is larger, and the deformation degree of the edge corresponding to each combination is more accurate after the edge straight line corresponding to the low bright point in each combination is the edge straight line of the deformation area according to the horizontal coordinate difference of the low bright point and the high bright point in each combination; when the quality of the turbine guide vane is detected, the contingency is removed according to the deformation degree of each edge, so that the detection result is more accurate.
Drawings
In order to more clearly illustrate the embodiments of the invention or the technical solutions in the prior art, the drawings that are required in the embodiments or the description of the prior art will be briefly described, it being obvious that the drawings in the following description are only some embodiments of the invention, and that other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a flow chart of the steps of a machine learning based turbine stator blade quality rapid detection method of the present invention.
Detailed Description
In order to further describe the technical means and effects adopted by the invention to achieve the preset aim, the following description refers to the specific implementation, structure, characteristics and effects of the turbine guide vane quality rapid detection method based on machine learning according to the invention, which are provided by the invention, with reference to the accompanying drawings and the preferred embodiment. In the following description, different "one embodiment" or "another embodiment" means that the embodiments are not necessarily the same. Furthermore, the particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following specifically describes a specific scheme of the machine learning-based turbine guide vane quality rapid detection method provided by the invention with reference to the accompanying drawings.
Referring to FIG. 1, a flowchart of the steps of a machine learning based turbine guide vane quality rapid detection method according to one embodiment of the present invention is shown, the method comprising the steps of:
step S001: turbine guide vane images are acquired.
The object of this embodiment is to perform quality inspection of the turbine stator blade, thus obtaining an image of the turbine stator blade to be inspected.
Specifically, photographing a turbine guide vane image to be detected to obtain a vane image containing a background; graying the blade image containing the background to obtain a blade gray map containing the background, performing semantic segmentation on the blade gray map containing the background to obtain a blade gray map without the background, and marking the blade gray map without the background as a turbine guide blade image. The semantic segmentation is known in the art, and this embodiment is not described in detail.
Step S002: performing edge detection on the turbine guide blade image to obtain an edge detection image of the turbine guide blade image; mapping the edge detection image of the turbine guide vane image into the Hough space to obtain a plurality of data points in the Hough space.
Since the output result of the hough-transformed turbine blade image is analyzed, the hough-transformed turbine blade image is subjected to the hough-transformation.
Specifically, edge detection is carried out on the turbine guide blade image through a sobel operator, and an edge detection image of the turbine guide blade image is obtained. The sobel operator is a known technique, and this embodiment is not described in detail. And taking the edge detection image of the turbine guide blade image as the input of Hough transformation, and outputting to obtain a coordinate point corresponding to each edge straight line in the edge detection image of the turbine guide blade image in Hough space. And (5) recording coordinate points corresponding to the edge straight lines of the turbine guide blade images in the Hough space as data points. Wherein the brightness value of the data point in the Hough space reflects the linear expression degree of the edge straight line corresponding to the data point, and the dimension of the coordinate axis of the Hough space is an angleDistance from
Step S003: dividing the data points into high-bright points and low-bright points according to the brightness value of each data point in the Hough space; obtaining the latest Euclidean distance between each low-bright point and each high-bright point according to the difference of the transverse coordinates and the longitudinal coordinates of each low-bright point and each high-bright point in the Hough space; according to the latest Euclidean distance between each low-bright point and each high-bright point and the difference between the brightness values of the low-bright point and the high-bright point, the possibility that each low-bright point belongs to each high-bright point is obtained, and all the high-bright points and the low-bright points are further divided into a plurality of combinations.
It should be noted that, when the turbine guiding vane is not deformed, the turbine guiding vane is formed by a plurality of edges, and each edge is formed by an edge straight line; when the turbine guide vane is slightly deformed, the turbine guide vane is also formed by a plurality of edges, but the deformed edges are formed by a plurality of edge lines. Since each edge line in the turbine guide blade corresponds to a coordinate point in the hough space, and when deformation detection is performed on the turbine guide blade, each edge is analyzed as a whole. Therefore, the data points in the Hough space are combined, so that each combination corresponds to one edge, and the deformation degree of the edge corresponding to each combination is calculated.
It should be further noted that when combining data points in the hough space. Since, when an edge of the turbine guide vane is deformed, the edge is formed by a longer edge line and a plurality of shorter edge lines. I.e. a longer edge line within the deformed edge. And the brightness value of each coordinate point in the Hough space reflects the linear expression degree of the edge straight line corresponding to the coordinate point, so that the brightness value of the longer edge straight line corresponding to the coordinate point in the Hough space is larger. Therefore, according to the brightness value of each coordinate point in the Hough space, the coordinate point corresponding to the longer edge straight line in each combination is obtained. Namely, according to the brightness value of each coordinate point in the Hough space, the coordinate points in the Hough space are divided into high-bright points and low-bright points.
It should be further noted that, when the low-bright point in a combination is obtained according to the high-bright point in the combination, the difference of the euclidean distances of the coordinate points corresponding to the two edge lines in one edge in the hough space is smaller than the euclidean distance of the coordinate points corresponding to the two edge lines in the different edges. Therefore, the Euclidean distance between each low bright point and each high bright point in the Hough space is used as a parameter for calculating the possibility that each low bright point belongs to each high bright point.
It should be further noted that, when the euclidean distance between each low bright point and each high bright point is calculated in the hough space, the difference of the coordinates points corresponding to the two edge lines in the same edge in the hough space on the ordinate is obviously smaller than the difference of the coordinates points corresponding to the two edge lines in different edges on the ordinate; and the difference of the punctuation corresponding to the two edge straight lines in the same edge in the Hough space on the abscissa is similar to the difference of the punctuation corresponding to the two edge straight lines in different edges on the abscissa. Therefore, when the Euclidean distance between each low-bright point and each high-bright point is calculated, the difference between each low-bright point and each high-bright point on the ordinate is increased, the difference between each low-bright point and each high-bright point on the abscissa is reduced, and the latest Euclidean distance between each low-bright point and each high-bright point is obtained.
It should be further noted that, the difference in brightness between the coordinate points corresponding to the two edge lines in one edge in the hough space is smaller than the difference in brightness between the coordinate points corresponding to the two edge lines in the different edges. Therefore, the brightness difference between each low-bright point and each high-bright point in the Hough space is used as a parameter for calculating the possibility that each low-bright point belongs to each high-bright point.
Specifically, the brightness threshold is presetThe data points with the brightness value larger than the brightness threshold value in the Hough space are marked as high-bright points, and the data points with the brightness value smaller than or equal to the brightness threshold value in the Hough space are marked as low-bright points. The preset brightness threshold value of the embodimentThis is described as an example, and other values may be set in other embodiments.
Further, the first step isThe low bright point belongs toThe probability of a highlight point is recorded asThe specific calculation formula is as follows:
In the method, in the process of the invention, Represent the firstLow bright point and the firstThe latest euclidean distance of the individual highlights,Representing a preset weight increment value, which is preset in this embodimentAs an example, other embodiments may be set to other values; Representing the first in Hough space The abscissa of the individual highlight points,Representing the first in Hough spaceThe abscissa of the individual low-bright spots,Representing the first in Hough spaceThe ordinate of the individual highlight points,Representing the first in Hough spaceThe ordinate of the individual low-light spots,Representing the function of the absolute value,Represent the firstThe low bright point belongs toThe possibility of a high-point spot,Representing the first in Hough spaceThe brightness value of the individual low-bright spots,Representing the first in Hough spaceThe luminance value of the individual highlight points,The present embodiment uses an exponential function based on natural constantsThe model is used to present the inverse proportional relationship,For the input of the model, an implementer can set an inverse proportion function according to actual conditions; representing a linear normalization function with normalization objects of all low-bright spots and all high-bright spots
It is to be noted that,The smaller the value of (c) is, description of the Hoff spaceLow bright point and the firstThe closer the highlight points are, i.eThe low bright point belongs toThe greater the likelihood of a highlight point; The smaller the value of (a) is, the description of the (b) Edge straight line corresponding to low-bright pointThe greater the likelihood that the corresponding edge lines of the highlighting are within the same edge, i.e. the firstThe low bright point belongs toThe greater the likelihood of a highlight point.
Further, obtain the firstThe possibility that a low bright spot belongs to all high bright spots. Will be the firstA high bright point corresponding to the maximum value of the possibility that the low bright point belongs to each high bright point and the first bright pointThe low bright spots are corresponding bright spots. Obtain all corresponding bright spots of all high bright spots, will beA highlight point and a firstAll the corresponding bright spots of the high bright spots form a combination, the first isThe combination of the high-brightness points is marked as the firstCombinations of (1)Each combination corresponds to an edge.
Step S004: obtaining the deformation degree of the corresponding edge of each combination according to the number of the low-light spots contained in each combination, the possibility that the low-light spots in each combination belong to the high-light spots in each combination and the angle difference and the brightness difference of the low-light spots and the high-light spots in each combination; and obtaining the deformation degree of the turbine guide blade according to the deformation degree of the corresponding edge of each combination, and detecting the quality of the turbine guide blade.
It should be noted that, when the deformation degree of the corresponding edge of each combination is calculated. The number of the low-light spots contained in each combination in the Hough space indirectly reflects the number of micro-deformation generated on the corresponding edge of each combination in the Hough space. Therefore, the number of low-light spots contained in each combination is used as a parameter for calculating the deformation degree of the corresponding edge of each combination.
It is further noted that, when an edge of the turbine guide vane is slightly deformed, the edge is formed by a longer edge line and a plurality of shorter edge lines. But not all shorter edge lines are edge lines of the deformation region. Therefore, the probability that the edge line corresponding to the low bright point in each combination is the edge line of the deformed region is calculated. Because there is a larger angle difference between the longer edge line in an edge with micro deformation and the edge line in the deformation area in the edge, that is, in a combination, if the difference between the abscissa of a low bright point and a high bright point is larger, the probability that the edge line corresponding to the low bright point is the edge line of the deformation area is larger. Therefore, the difference of the horizontal coordinates of each low-bright point and high-bright point in each combination is quantified, and the probability that the edge straight line corresponding to the low-bright point in the combination is the edge straight line of the deformation area is quantified, namely, the angle difference of each low-bright point and high-bright point in each combination is used as a parameter for calculating the deformation degree of the edge corresponding to each combination.
It should be further noted that, after the possibility that the edge line corresponding to each low-bright point in each combination is the edge line of the deformed area is obtained, the lengths of the different edge lines in the deformed area may be different, and the length of the edge line in the deformed area indirectly reflects the area of the deformed area, that is, the length of the edge line in the deformed area reflects the deformation degree of the edge. Therefore, the brightness value of each low-brightness point in each combination is used as a parameter for each calculation of the deformation degree of the corresponding edge of each combination.
Specifically, will beThe deformation degree of the corresponding edges of each combination is recorded asThe specific calculation formula is as follows:
In the method, in the process of the invention, Represent the firstThe respective combinations correspond to the degree of deformation of the edges,Represent the firstThe number of low-light spots contained in each combination,Represent the firstIn the first combinationThe low bright point belongs toThe possibility of a high-point spot,Represent the firstIn the first combinationLow bright point and the firstThe absolute value of the difference in luminance values of the individual highlights,Represent the firstIn the first combinationLow bright point and the firstThe difference in the angle of the individual highlights,Representing the sigmoid function, the embodiment uses the sigmoid function for normalization.
It is to be noted that,The larger the value of (a) is, the description of the (b)The more regions in the corresponding edges of the combination which generate micro deformations, i.e. the firstThe greater the degree of deformation of the corresponding edges of the combinations; the larger the value of (a) is, the description of the (b) In the first combinationThe greater the probability that the edge line corresponding to the low-bright point is the edge line in the deformation area, namelyThe greater the degree of deformation of the corresponding edges of the combinations; at the position ofIn the case of a larger value of (a),The larger the value of (a) is, the description of the (b)In the first combinationThe larger the area of the deformation area where the straight line of the edge corresponding to the low-bright point is located, namely the firstThe greater the degree of deformation of the corresponding edges of the respective combinations.
Further, the maximum value of the deformation degree of the corresponding edges of all the combinations is recorded as the deformation degree of the turbine guide vane.
Presetting a deformation thresholdIf the degree of deformation of the turbine guide vane is greater than the deformation thresholdThe turbine stator blade to be inspected is then marked as a blade of unacceptable quality. Deformation threshold preset in this embodimentAs an example, other embodiments may be set to other values; in this embodiment, the turbine guiding blade to be detected is taken as an example to process, and a quality detection result is obtained, and other turbine guiding blades perform quality detection according to the above method.
This embodiment is completed.
In summary, the above embodiments are only preferred embodiments of the present invention, and are not intended to limit the present invention, but any modifications, equivalent substitutions, improvements, etc. within the principles of the present invention should be included in the scope of the present invention.

Claims (9)

1. The quick detection method for the quality of the turbine guide vane based on machine learning is characterized by comprising the following steps of:
Acquiring a turbine guide vane image;
performing edge detection on the turbine guide blade image to obtain an edge detection image of the turbine guide blade image; mapping an edge detection image of the turbine guide vane image into a Hough space to obtain a plurality of data points in the Hough space; each data point has a luminance value luminance;
dividing the data points into high-bright points and low-bright points according to the brightness value of each data point in the Hough space; obtaining the latest Euclidean distance between each low-bright point and each high-bright point according to the difference of the transverse coordinates and the longitudinal coordinates of each low-bright point and each high-bright point in the Hough space; obtaining the possibility that each low-bright point belongs to each high-bright point according to the latest Euclidean distance between each low-bright point and each high-bright point and the difference between the brightness values of the low-bright point and the high-bright point, and dividing all the high-bright points and the low-bright points into a plurality of combinations;
Obtaining the deformation degree of the corresponding edge of each combination according to the number of the low-light spots contained in each combination, the possibility that the low-light spots in each combination belong to the high-light spots in each combination and the angle difference and the brightness difference of the low-light spots and the high-light spots in each combination; obtaining the deformation degree of the turbine guide blade according to the deformation degree of the corresponding edge of each combination, and detecting the quality of the turbine guide blade;
According to the latest Euclidean distance between each low-bright point and each high-bright point and the difference between the brightness values of the low-bright point and the high-bright point, the possibility that each low-bright point belongs to each high-bright point is obtained, and the specific method comprises the following steps:
In the method, in the process of the invention, Represent the firstThe low bright point belongs toThe possibility of a high-point spot,Representing the first in Hough spaceThe brightness value of the individual low-bright spots,Representing the first in Hough spaceThe luminance value of the individual highlight points,Represent the firstLow bright point and the firstThe latest euclidean distance of the individual highlights,As an exponential function with a base of natural constant,Representing the normalization function.
2. The method for quickly detecting the quality of the turbine guide vane based on machine learning according to claim 1, wherein the step of acquiring the image of the turbine guide vane comprises the following specific steps:
Acquiring a blade image containing a background;
Graying the blade image containing the background to obtain a blade gray map containing the background, performing semantic segmentation on the blade gray map containing the background to obtain a blade gray map without the background, and marking the blade gray map without the background as a turbine guide blade image.
3. The method for quickly detecting the quality of the turbine guide vane based on machine learning according to claim 1, wherein the mapping the edge detection image of the turbine guide vane image into the hough space to obtain a plurality of data points in the hough space comprises the following specific steps:
Taking an edge detection image of the turbine guide blade image as the input of Hough transformation, and outputting a coordinate point corresponding to each edge straight line in the edge detection image of the turbine guide blade image in Hough space;
And (5) marking coordinate points corresponding to the edge straight lines of the turbine guide vane images in the Hough space as data points.
4. The method for quickly detecting the quality of the turbine guide vane based on the machine learning according to claim 1, wherein the method for dividing the data points into high-brightness points and low-brightness points according to the brightness value of each data point in the hough space comprises the following specific steps:
Presetting a brightness threshold The data points with the brightness value larger than the brightness threshold value in the Hough space are marked as high-bright points, and the data points with the brightness value smaller than or equal to the brightness threshold value in the Hough space are marked as low-bright points.
5. The method for quickly detecting the quality of the turbine guide vane based on machine learning according to claim 1, wherein the obtaining the latest euclidean distance between each low-point and each high-point according to the difference of the horizontal and vertical coordinates of each low-point and each high-point in the hough space comprises the following specific steps:
In the method, in the process of the invention, Indicating a preset weight increase value,Representing the first in Hough spaceThe abscissa of the individual highlight points,Representing the first in Hough spaceThe abscissa of the individual low-bright spots,Representing the first in Hough spaceThe ordinate of the individual highlight points,Representing the first in Hough spaceThe ordinate of the individual low spots.
6. The quick detection method for quality of turbine guide vane based on machine learning according to claim 1, wherein the dividing all high-bright spots and low-bright spots into a plurality of combinations comprises the following specific steps:
Will be the first A high bright point corresponding to the maximum value of the possibility that the low bright point belongs to each high bright point, andThe low bright spots are corresponding bright spots to each other, so that all corresponding bright spots of each high bright spot are obtained;
Will be the first A highlight point and a firstAll the corresponding bright spots of the high bright spots form a combination, the first isThe combination of the high-brightness points is marked as the firstCombinations of two.
7. The method for quickly detecting the quality of a turbine guide vane based on machine learning according to claim 1, wherein the obtaining the deformation degree of the corresponding edge of each combination according to the number of the low-light spots contained in each combination, the probability that the low-light spots in each combination belong to the high-light spots in each combination, and the angle difference and the brightness difference of the low-light spots and the high-light spots in each combination comprises the following specific steps:
In the method, in the process of the invention, Represent the firstThe respective combinations correspond to the degree of deformation of the edges,Represent the firstThe number of low-light spots contained in each combination,Represent the firstIn the first combinationThe low bright point belongs toThe possibility of a high-point spot,Represent the firstIn the first combinationLow bright point and the firstThe absolute value of the difference in luminance values of the individual highlights,Represent the firstIn the first combinationLow bright point and the firstThe difference in the angle of the individual highlights,Representing a sigmoid function.
8. The method for quickly detecting the quality of the turbine guide vane based on the machine learning according to claim 1, wherein the method for obtaining the deformation degree of the turbine guide vane according to the deformation degree of the corresponding edge of each combination comprises the following specific steps:
The maximum value of the deformation degree of the corresponding edges of all the combinations is recorded as the deformation degree of the turbine guide vane.
9. The quick quality detection method for turbine guide vane based on machine learning according to claim 1, wherein the quality detection for turbine guide vane comprises the following specific steps:
Presetting a deformation threshold If the deformation degree of the turbine guide blade is greater than the deformation threshold value, the turbine guide blade is marked as a blade with unqualified quality.
CN202410968941.8A 2024-07-19 2024-07-19 Turbine guide vane quality rapid detection method based on machine learning Active CN118505707B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202410968941.8A CN118505707B (en) 2024-07-19 2024-07-19 Turbine guide vane quality rapid detection method based on machine learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202410968941.8A CN118505707B (en) 2024-07-19 2024-07-19 Turbine guide vane quality rapid detection method based on machine learning

Publications (2)

Publication Number Publication Date
CN118505707A CN118505707A (en) 2024-08-16
CN118505707B true CN118505707B (en) 2024-10-15

Family

ID=92231543

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202410968941.8A Active CN118505707B (en) 2024-07-19 2024-07-19 Turbine guide vane quality rapid detection method based on machine learning

Country Status (1)

Country Link
CN (1) CN118505707B (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115100214A (en) * 2022-08-29 2022-09-23 南通市昊逸阁纺织品有限公司 Textile quality detection method based on image processing
CN118279301A (en) * 2024-05-31 2024-07-02 江苏瀚阳新材料科技有限公司 Light guide plate quality monitoring method and system based on image processing

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102006059663B4 (en) * 2006-12-18 2008-07-24 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Apparatus, method and computer program for identifying a traffic sign in an image
JP6459940B2 (en) * 2015-12-08 2019-01-30 株式会社Sumco Specific defect detection method, specific defect detection system and program

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115100214A (en) * 2022-08-29 2022-09-23 南通市昊逸阁纺织品有限公司 Textile quality detection method based on image processing
CN118279301A (en) * 2024-05-31 2024-07-02 江苏瀚阳新材料科技有限公司 Light guide plate quality monitoring method and system based on image processing

Also Published As

Publication number Publication date
CN118505707A (en) 2024-08-16

Similar Documents

Publication Publication Date Title
CN113689428B (en) Mechanical part stress corrosion detection method and system based on image processing
CN114372955A (en) Casting defect X-ray diagram automatic identification method based on improved neural network
CN118279304B (en) Abnormal recognition method, device and medium for special-shaped metal piece based on image processing
CN111461101A (en) Method, device and equipment for identifying work clothes mark and storage medium
CN117372433B (en) Thickness parameter control method, device, equipment and storage medium
CN110033458A (en) It is a kind of based on pixel gradient distribution image threshold determine method
CN118097310B (en) Method for digitally detecting concrete surface defects
CN117392107A (en) Intelligent cutting processing system and method for die steel
CN118314312B (en) Quick identifying method and system for tool nose of machine tool
CN116843677A (en) Appearance quality detection system and method for sheet metal part
CN117745708A (en) Deep learning algorithm-based wood floor surface flaw detection method
CN118505707B (en) Turbine guide vane quality rapid detection method based on machine learning
CN115082428A (en) Metal spot detection method and system based on neural network
CN108765365A (en) A kind of rotor winding image qualification detection method
CN118587496A (en) Automatic identification system and method of parts processing accuracy based on computer vision
CN116258838B (en) Intelligent visual guiding method for duct piece mold clamping system
CN116228776B (en) Method and system for identifying welding defects of electromechanical equipment
CN115311287A (en) Method for detecting production abnormity of common rail oil injector
CN119048397B (en) Ring main unit group instrument reading method based on image processing
CN119495108B (en) A mobile terminal text recognition method and system based on neural network
CN110334664A (en) A statistical method, device, electronic equipment and medium for alloy precipitated phase fraction
CN120182257B (en) Method and system for detecting section of cable core of power supply and distribution cable of building
CN119693361B (en) Quality visual detection method and system for vehicle window gear injection mold
CN118397014B (en) Nonferrous metal rolling quality detection method and system based on machine vision
CN118279300B (en) Engine part identification method based on three-dimensional point cloud

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant