
CN102645219B - Welding and locating method of welding seam and method of obtaining welding seam offset for visual navigation system of wall climbing robot for weld inspection - Google Patents


Info

Publication number
CN102645219B
CN102645219B (application CN201210150845.XA / CN201210150845A)
Authority
CN
China
Prior art keywords
image
pixel
value
point
cross
Prior art date
Legal status
Expired - Fee Related
Application number
CN201210150845.XA
Other languages
Chinese (zh)
Other versions
CN102645219A
Inventor
张立国
肖波
焦建彬
高学山
刘璐
张雷
Current Assignee
Harbin Fenghua Co ltd China Aerospace Science & Industry Corp
Original Assignee
Harbin Fenghua Co ltd China Aerospace Science & Industry Corp
Priority date
Filing date
Publication date
Application filed by Harbin Fenghua Co ltd China Aerospace Science & Industry Corp filed Critical Harbin Fenghua Co ltd China Aerospace Science & Industry Corp
Priority to CN201210150845.XA priority Critical patent/CN102645219B/en
Publication of CN102645219A publication Critical patent/CN102645219A/en
Application granted granted Critical
Publication of CN102645219B publication Critical patent/CN102645219B/en


Landscapes

  • Image Analysis (AREA)

Abstract

Disclosed are a visual navigation system for a wall-climbing robot used in weld seam inspection, a welding positioning method for the weld seam, and a weld seam offset acquisition method. The invention addresses the problem that, in sites with harsh natural environments, the prior art cannot perform tower inspection and maintenance by the teach-and-playback approach. The charge-coupled-device (CCD) camera and the cross laser emitter of the visual navigation system are fixed at the very front of the robot head, with the cross laser emitter directly above the CCD camera. The laser emitted by the cross laser emitter strikes the surface of the welded workpiece and forms a cross-shaped spot, which the CCD camera photographs, and the data output of the CCD camera is connected to the data input of the computer. The invention is applicable to the field of tower inspection and maintenance.

Description

Welding Positioning Method and Weld Seam Offset Acquisition Method for the Visual Navigation System of a Wall-Climbing Robot Used in Weld Seam Inspection

Technical Field

The invention is applicable to the field of tower inspection and maintenance.

Background Art

In recent years the wind power industry has grown rapidly and the number of wind turbine towers has surged, with a corresponding sharp increase in demand for tower inspection and maintenance. A typical wind turbine tower is 50 to 100 meters tall and is divided into several sections, each rolled from a single steel plate and welded. Owing to limitations of the welding process and welding precision, defects such as cold welds, pores, and slag inclusions inevitably occur in the weld seams, and these readily develop into cracks during service at the wind farm. Because most wind farms are built in places with harsh natural environments such as offshore sites, canyons, and mountain passes, these defects become a serious safety hazard during turbine operation. Tower inspection and maintenance has traditionally been carried out by suspended rope-access work, which is risky, so machines that can operate under such extreme conditions are urgently needed to replace manual labor. Under these conditions it is difficult to train the robot's working procedure by teach-and-playback; the robot needs a more intelligent control system that can adapt to the working environment and carry out the required tasks.

Summary of the Invention

To solve the problem that the prior art cannot perform tower inspection and maintenance by the teach-and-playback approach in sites with harsh natural environments, the invention provides a welding positioning method and a weld seam offset acquisition method for the visual navigation system of a wall-climbing robot used in weld seam inspection.

The visual navigation system of the wall-climbing robot for weld seam inspection according to the invention comprises a charge-coupled-device (CCD) camera, a cross laser emitter, and a computer.

The CCD camera and the cross laser emitter are fixed at the very front of the robot head, with the cross laser emitter directly above the CCD camera. The laser emitted by the cross laser emitter strikes the surface of the welded workpiece and forms a cross-shaped spot, which the CCD camera photographs. The angle between the optical axis of the laser beam emitted by the cross laser emitter and the optical axis of the CCD camera lens is 45°, and the data output of the CCD camera is connected to the data input of the computer.

The welding positioning method comprises the following steps:

Step 1: The cross laser emitter emits a red cross-line laser beam that strikes the surface of the welded workpiece and forms a cross-shaped spot; the beam scans the workpiece surface while the CCD camera captures video of the surface; proceed to Step 2.

Step 2: The computer receives the video captured by the CCD camera.

Step 3: The computer processes each frame of the video as follows:

Step 3.1: Divide the frame into pixels, each pixel value holding the red, yellow, and blue color values of the image at that point.

Step 3.2: Store the red, yellow, and blue color-component values of the frame in three separate arrays.

Step 3.3: Using two-dimensional maximum-entropy segmentation on the red-component array of the frame, the computer determines the optimal threshold for the gray values corresponding to the red-component values.

Step 3.4: Compare the gray value of every pixel of the frame against the optimal threshold from Step 3.3; set pixels above the threshold to 255 and pixels below it to 0, converting the frame into a binary image.

Step 3.5: Apply the Canny operator to the binary image from Step 3.4 for edge detection, obtaining the edge information of the binary image, which appears in the frame as the edges of the cross spot.

Step 3.6: Extract the edge information from Step 3.5 and take the centerline of the cross-spot edges as the skeleton, yielding a smooth, single-pixel-wide skeleton curve.

Step 3.7: Apply the straight-line Hough transform to the skeleton curve from Step 3.6 to obtain the two straight lines flanking the arc, mark the arc's two endpoints, locate the arc segment, and thereby detect the position of the weld seam in the frame.
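Steps 3.1 through 3.4 can be sketched as follows. This is a minimal illustration, assuming a frame stored as nested lists of (R, G, B) tuples and an already-computed threshold; the function name `binarize_red_channel` and the sample values are hypothetical, and the patent derives the actual threshold with the two-dimensional maximum-entropy method.

```python
# Sketch of steps 3.1-3.4: split out the red component and binarize it.
# Assumes a frame as a list of rows of (R, G, B) tuples; the threshold
# is taken as given here (the patent computes it by 2-D maximum entropy).
def binarize_red_channel(frame, threshold):
    red = [[px[0] for px in row] for row in frame]       # step 3.2: red-component array
    return [[255 if v > threshold else 0 for v in row]   # step 3.4: binary image
            for row in red]

frame = [[(200, 10, 10), (30, 10, 10)],
         [(180, 10, 10), (20, 10, 10)]]
print(binarize_red_channel(frame, 100))   # -> [[255, 0], [255, 0]]
```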

In the weld seam offset acquisition method, after the weld seam has been located, three feature points are extracted: the two endpoints of the weld seam arc and the center point of the cross spot. The distance between the two endpoints is a, and the distance from the center point to the endpoint adjacent to the weld seam is b. Letting the measured values of a and b be a' and b', the offset angle is:

θ = arccos((a + b)/(a' + b')) = arccos(b'/b),   (14)

The position offset is:

w = b' − b,   (15)

When a' = a and b' = b, the situation is normal and the robot is traveling on course; when a' > a and b' ≤ b, the robot has drifted left; when a' > a and b' ≥ b, the robot has drifted right.
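As a numerical illustration of equations (14) and (15), the sketch below computes θ and w from the nominal and measured feature distances; the function name and the sample values are hypothetical, not taken from the patent.

```python
import math

def seam_offset(a, b, a_meas, b_meas):
    """Offset angle (eq. 14) and position offset (eq. 15) from the nominal
    feature distances (a, b) and their measured values (a', b')."""
    theta = math.acos((a + b) / (a_meas + b_meas))
    w = b_meas - b
    return theta, w

# a' = a and b' = b: the normal case, so no angular or positional offset.
theta, w = seam_offset(a=10.0, b=5.0, a_meas=10.0, b_meas=5.0)
print(theta, w)   # -> 0.0 0.0
```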

With the visual navigation system of the weld-inspection wall-climbing robot, the robot can search for weld seams on its own; the welding positioning method determines the weld seam position, and the weld seam offset acquisition method tracks the seam direction, ensuring that the flaw-detection equipment always follows the weld seam without deviating from it.

Description of the Drawings

Fig. 1 is a schematic structural diagram of the visual navigation system of the wall-climbing robot for weld seam inspection; Fig. 2 is the two-dimensional gray value / neighborhood gray-mean histogram; Fig. 3 is a schematic of the weld seam profile; Fig. 4 is a projection of the offset at the intersection of the laser projection and the weld seam; Fig. 5 compares that offset case with the standard case; Fig. 6 is the equivalent diagram of Fig. 5.

Detailed Description of the Embodiments

Embodiment 1: This embodiment is described with reference to Fig. 1. The visual navigation system of the wall-climbing robot for weld seam inspection according to this embodiment comprises a CCD camera 1, a cross laser emitter 2, and a computer.

The CCD camera 1 and the cross laser emitter 2 are fixed at the very front of the robot head, with the cross laser emitter 2 directly above the CCD camera 1. The laser emitted by the cross laser emitter 2 strikes the surface of the welded workpiece and forms a cross-shaped spot, which the CCD camera 1 photographs. The angle between the optical axis of the laser beam emitted by the cross laser emitter 2 and the optical axis of the lens of the CCD camera 1 is 45°, and the data output of the CCD camera 1 is connected to the data input of the computer.

Embodiment 2: A welding positioning method for the weld seam based on the visual navigation system of Embodiment 1, comprising the following steps:

Step 1: The cross laser emitter 2 emits a red cross-line laser beam that strikes the surface of the welded workpiece and forms a cross-shaped spot; the beam scans the workpiece surface while the CCD camera 1 captures video of the surface; proceed to Step 2.

Step 2: The computer receives the video captured by the CCD camera.

Step 3: The computer processes each frame of the video as follows:

Step 3.1: Divide the frame into pixels, each pixel value holding the red, yellow, and blue color values of the image at that point.

Step 3.2: Store the red, yellow, and blue color-component values of the frame in three separate arrays.

Step 3.3: Using two-dimensional maximum-entropy segmentation on the red-component array of the frame, the computer determines the optimal threshold for the gray values corresponding to the red-component values.

Step 3.4: Compare the gray value of every pixel of the frame against the optimal threshold from Step 3.3; set pixels above the threshold to 255 and pixels below it to 0, converting the frame into a binary image.

Step 3.5: Apply the Canny operator to the binary image from Step 3.4 for edge detection, obtaining the edge information of the binary image, which appears in the frame as the edges of the cross spot.

Step 3.6: Extract the edge information from Step 3.5 and take the centerline of the cross-spot edges as the skeleton, yielding a smooth, single-pixel-wide skeleton curve.

Step 3.7: Apply the straight-line Hough transform to the skeleton curve from Step 3.6 to obtain the two straight lines flanking the arc, mark the arc's two endpoints, locate the arc segment, and thereby detect the position of the weld seam in the frame.

The specific steps of the Hough transform in Step 3.7, which yields the two straight lines flanking the arc, are:

Step 3.7.1: Quantize the (ρ, θ) space: from the skeleton curve ρ = x cos θ + y sin θ, build a two-dimensional accumulator matrix A(ρ, θ), initialized to all zeros.

Step 3.7.2: For every pixel (x, y) with nonzero gray value, compute the corresponding ρ for each quantized value of θ and increment the accumulator: a(i, j) + 1 → a(i, j), where a(i, j) is the element in row i, column j of A(ρ, θ).

Step 3.7.3: After all nonzero-gray-value pixels (x, y) have been processed, analyze A(ρ, θ). If A(ρ, θ) exceeds a threshold T, a line segment exists, with (ρ, θ) as its fitting parameters; T is a non-negative integer determined from prior knowledge of the scene in the image.

Step 3.7.4: Determine the line segments in the image jointly from (ρ, θ) and (x, y), and connect any broken parts to obtain the straight segments.
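The accumulator voting of Steps 3.7.1–3.7.3 can be sketched compactly as below, assuming a small binary image stored as nested lists. The quantization choices (180 θ bins, integer-rounded ρ) and the helper name are illustrative, not the patent's.

```python
import math

def hough_lines(binary, n_theta=180, threshold=2):
    """Vote in (rho, theta) space for every nonzero pixel (steps 3.7.1-3.7.2)
    and return the (rho, theta) pairs whose vote count exceeds T (step 3.7.3)."""
    h, w = len(binary), len(binary[0])
    acc = {}  # sparse accumulator A(rho, theta), implicitly zero-initialized
    for y in range(h):
        for x in range(w):
            if binary[y][x]:
                for t in range(n_theta):
                    theta = math.pi * t / n_theta
                    rho = round(x * math.cos(theta) + y * math.sin(theta))
                    acc[(rho, t)] = acc.get((rho, t), 0) + 1
    return [(rho, math.pi * t / n_theta) for (rho, t), votes in acc.items()
            if votes > threshold]

# A vertical line x = 2 in a 5x5 image votes strongly at theta = 0, rho = 2.
img = [[1 if x == 2 else 0 for x in range(5)] for y in range(5)]
lines = hough_lines(img, threshold=4)
print((2, 0.0) in lines)   # -> True
```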

Embodiment 3: This embodiment is described with reference to Fig. 2. It differs from Embodiment 2 in the method used in Step 3.3 to determine the optimal threshold of the gray values corresponding to the red-component values:

The four pixels immediately above, below, left, and right of a pixel, together with its four diagonal neighbors, constitute the pixel's 8-pixel neighborhood. The two-dimensional maximum-entropy segmentation method builds the image's two-dimensional histogram of pixel gray value versus neighborhood gray mean, and the optimal threshold f(x, y) of the gray values corresponding to the red-component values is found by maximizing the two-dimensional entropy;

where (x, y) are the coordinates of the pixel.

In the two-dimensional histogram, regions A and B along the diagonal represent the target and the background respectively, while regions C and D away from the diagonal represent edges and noise. Determining the optimal threshold over regions A and B with the point-gray / region-gray-mean two-dimensional maximum-entropy method maximizes the information that truly represents the target and the background. The two-dimensional entropy is:

H = −Σ_i Σ_j p_{i,j} log2 p_{i,j},   (1)

The discriminant function of the entropy is:

f(x0, y0) = log2[P_A(1 − P_A)] + H_A/P_A + (H_L − H_A)/(1 − P_A),   (2)

最佳阈值为:The optimal threshold is:

f(x, y) = max{f(x0, y0)},   (3)

where

P_A = Σ_i^s Σ_j^t p_{i,j},   (4)

H_A = −Σ_i^s Σ_j^t p_{i,j} log2 p_{i,j},   (5)

H_L = −Σ_i^L Σ_j^L p_{i,j} log2 p_{i,j},   (6)

The image has L gray levels and N = m × n total pixels; g_{i,j} is the number of pixels whose gray value is i and whose region gray mean is j; p_{i,j} is the probability of the point-gray / region-gray-mean pair (i, j), i.e. p_{i,j} = g_{i,j}/N; and p_{i,j}, {i, j = 1, 2, …, L} is the image's two-dimensional histogram of point gray value versus region gray mean.
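For illustration, the sketch below applies the entropy criterion of equations (2)–(3) to a plain one-dimensional gray histogram. The patent's actual method maximizes the same criterion over the two-dimensional gray value / neighborhood-mean histogram; the function name and sample values here are hypothetical.

```python
import math

def max_entropy_threshold(gray):
    """1-D simplification of the maximum-entropy criterion (eq. 2):
    f(s) = log2[P_A(1-P_A)] + H_A/P_A + (H_L-H_A)/(1-P_A), maximized over s.
    The patent replaces this 1-D histogram with the 2-D gray value /
    neighbourhood-mean histogram restricted to regions A and B."""
    flat = [v for row in gray for v in row]
    n = len(flat)
    hist = [0] * 256
    for v in flat:
        hist[v] += 1
    p = [h / n for h in hist]
    h_term = lambda q: -q * math.log2(q) if q > 0 else 0.0
    H_L = sum(h_term(q) for q in p)          # total entropy (eq. 6 analogue)
    best_s, best_f = 0, float('-inf')
    P_A = H_A = 0.0
    for s in range(255):
        P_A += p[s]                          # cumulative probability (eq. 4)
        H_A += h_term(p[s])                  # cumulative entropy (eq. 5)
        if 0 < P_A < 1:
            f = math.log2(P_A * (1 - P_A)) + H_A / P_A + (H_L - H_A) / (1 - P_A)
            if f > best_f:
                best_f, best_s = f, s
    return best_s

gray = [[10, 10, 200, 200],
        [10, 12, 198, 200]]
t = max_entropy_threshold(gray)
print(10 <= t < 198)   # the threshold separates the dark and bright clusters
```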

Embodiment 4: This embodiment differs from Embodiment 2 in the specific steps of the edge detection on the binary image in Step 3.5:

Step 3.5.1: Convolve the optimally thresholded image f(x, y) with a Gaussian filter:

g(x, y) = h(x, y, σ) * f(x, y),   (7)

h(x, y, σ) = (1/(2πσ²)) e^{−(x² + y²)/(2σ²)},   (8)

where g(x, y) is the convolved image, h(x, y, σ) is the Gaussian filter function, σ is the standard deviation, and * denotes convolution.

Step 3.5.2: Use first-order partial-derivative finite differences to compute the local gradient M(x, y) and edge direction θ(x, y) of the convolved image g(x, y):

g'_x(x, y) ≈ G_x(x, y) = [g(x+1, y) − g(x, y) + g(x+1, y+1) − g(x, y+1)]/2,   (9)

g'_y(x, y) ≈ G_y(x, y) = [g(x, y+1) − g(x, y) + g(x+1, y+1) − g(x+1, y)]/2,   (10)

M(x, y) = sqrt(G_x(x, y)² + G_y(x, y)²),   (11)

θ(x, y) = arctan(G_x(x, y)/G_y(x, y)),   (12)

where G_x(x, y) is the horizontal gradient of the image g(x, y); G_y(x, y) is its vertical gradient; M(x, y) is its local gradient; and θ(x, y) is its edge direction.

Step 3.5.3: At each pixel, compare the local gradient M(x, y) with the values of the two neighboring pixels along the gradient line. If M(x, y) is smaller than either neighbor, set M(x, y) to zero, since the pixel is not an edge point; if M(x, y) is greater than or equal to both, leave it unchanged as a candidate edge point (non-maximum suppression).

Step 3.5.4: Select two thresholds from the two-dimensional histogram and apply them to the image obtained by non-maximum suppression. Setting to 0 the gray value of pixels whose gradient is below the smaller threshold yields image 1; setting to 0 the gray value of pixels whose gradient is below the larger threshold yields image 2. Using image 2 as the basis and image 1 as the supplement, link the edges to obtain the cross-spot edges in the image.

Embodiment 5: This embodiment differs from Embodiment 4 in the specific linking steps of Step 3.5.4 for obtaining the cross-spot edges in the image:

Step 3.5.4.1: Scan image 2; upon encountering a pixel with nonzero gray value, trace the contour that starts at that pixel until the contour's end point.

Step 3.5.4.2: Examine the 8-pixel neighborhood in image 1 of the pixel corresponding to the end point of the contour in image 2. If a pixel with nonzero gray value exists in that neighborhood, include it in image 2 as a new starting point, then repeat Step 3.5.4.1 until no continuation is possible in either image 1 or image 2.

Step 3.5.4.3: Once the linking of the contour containing a given pixel is complete, mark that contour as visited.

Repeat Steps 3.5.4.1, 3.5.4.2, and 3.5.4.3 until no new contour can be found in image 2.
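The dual-threshold linking of Steps 3.5.4.1–3.5.4.3 amounts to hysteresis tracking. In the minimal sketch below the contour tracing is simplified to a stack-based fill from the strong pixels, which yields the same linked edge set; the gradient map is assumed to be nested lists, and the function name and sample values are hypothetical.

```python
def hysteresis_link(grad, low, high):
    """Dual-threshold linking (step 3.5.4): pixels at or above `high` seed the
    contours (image 2); neighbours at or above `low` (image 1) are appended
    through 8-neighbourhood search until no contour can be extended."""
    h, w = len(grad), len(grad[0])
    strong = [(y, x) for y in range(h) for x in range(w) if grad[y][x] >= high]
    edge = set(strong)
    stack = list(strong)
    while stack:
        y, x = stack.pop()
        for dy in (-1, 0, 1):        # scan the 8-pixel neighbourhood
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (0 <= ny < h and 0 <= nx < w and (ny, nx) not in edge
                        and grad[ny][nx] >= low):
                    edge.add((ny, nx))
                    stack.append((ny, nx))
    return edge

grad = [[0,  0,  0, 0],
        [0, 60, 40, 0],
        [0,  0, 30, 0]]
edges = hysteresis_link(grad, low=25, high=50)
print(sorted(edges))   # -> [(1, 1), (1, 2), (2, 2)]
```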

Embodiment 6: This embodiment differs from Embodiment 2 in the skeleton curve of Step 3.6:

ρ = x cos θ + y sin θ,   (13)

where ρ = x cos θ + y sin θ means that a point (x, y) in image space corresponds to a sinusoid in (ρ, θ) space; x is the pixel's abscissa and y its ordinate; ρ is the distance from the image coordinate origin to the line; and θ is the angle between the line's normal and the x-axis.

Embodiment 7: This embodiment is described with reference to Figs. 3, 4, 5, and 6. In the weld seam offset acquisition method based on the welding positioning method of Embodiment 2, after the weld seam has been located, three feature points are extracted: the two endpoints of the weld seam arc and the center point of the cross spot. The distance between the two endpoints is a, and the distance from the center point to the adjacent endpoint is b. Letting the measured values of a and b be a' and b', the offset angle is:

θ = arccos((a + b)/(a' + b')) = arccos(b'/b),   (14)

The position offset is:

w = b' − b,   (15)

When a' = a and b' = b, the situation is normal and the robot is traveling on course; when a' > a and b' ≤ b, the robot has drifted left; when a' > a and b' ≥ b, the robot has drifted right.

Fig. 5 shows the robot drifting to the right; Fig. 6 compares the offset state with the standard state. The included angle θ is the offset angle.

Whether the wall-climbing robot is tracked or wheeled, a sudden lateral displacement is impossible while it crawls vertically; that is, the case a' = a and b' ≥ b cannot occur instantaneously. Fig. 5 can therefore be treated as equivalent to the result in Fig. 6,

with offset angle θ = arccos((a + b)/(a' + b')) = arccos(b'/b).

When the robot crawls along a horizontal weld seam it may slip downward slightly under gravity; in that case a' = a and b' ≥ b, and the position offset is w = b' − b.

Feeding the computed w and θ back to the robot's PID control stage enables real-time correction of the travel trajectory.
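A minimal sketch of consuming the offset feedback in a PID loop follows; the class name, gains, and time step are hypothetical illustrations, not values from the patent.

```python
class PIDCorrector:
    """Minimal PID loop that takes the offset feedback described above
    as the error signal and emits a steering correction."""
    def __init__(self, kp=0.8, ki=0.05, kd=0.2):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev = 0.0

    def step(self, error, dt=0.1):
        self.integral += error * dt                 # accumulate the I term
        deriv = (error - self.prev) / dt            # finite-difference D term
        self.prev = error
        return self.kp * error + self.ki * self.integral + self.kd * deriv

pid = PIDCorrector()
for w in (0.4, 0.3, 0.1, 0.0):   # position offset shrinking toward zero
    correction = pid.step(w)
print(round(correction, 3))
```

With the offset driven to zero, the remaining correction comes only from the integral and derivative terms, which is the expected PID behavior.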

Claims (6)

1. A welding positioning method for a weld seam using the visual navigation system of a wall-climbing robot for weld seam inspection, the method being carried out by an apparatus comprising a charge-coupled-device (CCD) camera (1), a cross laser emitter (2), and a computer,
the CCD camera (1) and the cross laser emitter (2) being fixed at the very front of the robot head, with the cross laser emitter (2) directly above the CCD camera (1); the laser emitted by the cross laser emitter (2) strikes the surface of the welded workpiece and forms a cross-shaped spot, which the CCD camera (1) photographs; the angle between the optical axis of the laser beam emitted by the cross laser emitter (2) and the optical axis of the lens of the CCD camera (1) is 45°; and the data output of the CCD camera (1) is connected to the data input of the computer;
characterized in that the welding positioning method comprises the following steps:
Step 1: the cross laser emitter (2) emits a red cross-line laser beam that strikes the surface of the welded workpiece and forms a cross-shaped spot; the beam scans the workpiece surface while the CCD camera (1) captures video of the surface; proceed to Step 2;
Step 2: the computer receives the video captured by the CCD camera;
Step 3: the computer processes each frame of the video as follows:
Step 3.1: divide the frame into pixels, each pixel value holding the red, yellow, and blue color values of the image at that point;
Step 3.2: store the red, yellow, and blue color-component values of the frame in three separate arrays;
Step 3.3: using two-dimensional maximum-entropy segmentation on the red-component array of the frame, determine the optimal threshold for the gray values corresponding to the red-component values;
Step 3.4: compare the gray value of every pixel of the frame against the optimal threshold from Step 3.3, set pixels above the threshold to 255 and pixels below it to 0, converting the frame into a binary image;
Step 3.5: apply the Canny operator to the binary image from Step 3.4 to obtain its edge information, which appears in the frame as the edges of the cross spot;
Step 3.6: extract the edge information from Step 3.5 and take the centerline of the cross-spot edges as the skeleton, yielding a smooth, single-pixel-wide skeleton curve;
Step 3.7: apply the straight-line Hough transform to the skeleton curve from Step 3.6 to obtain the two straight lines flanking the arc, mark the arc's two endpoints, locate the arc segment, and detect the position of the weld seam in the frame.
2. The welding positioning method for the weld seam of the visual navigation system according to claim 1, characterized in that the optimal-threshold method of Step 3.3 is:
the four pixels immediately above, below, left, and right of a pixel, together with its four diagonal neighbors, constitute the pixel's 8-pixel neighborhood; the two-dimensional maximum-entropy segmentation method builds the image's two-dimensional histogram of pixel gray value versus neighborhood gray mean, and the optimal threshold f(x, y) of the gray values corresponding to the red-component values is found by maximizing the two-dimensional entropy;
where (x, y) are the coordinates of the pixel,
regions A and B along the histogram diagonal represent the target and the background respectively, regions C and D away from the diagonal represent edges and noise, and determining the optimal threshold over regions A and B with the point-gray / region-gray-mean two-dimensional maximum-entropy method maximizes the information truly representing the target and background; the two-dimensional entropy is:
H = −Σ_i Σ_j p_{i,j} log2 p_{i,j},   (1)
the discriminant function of the entropy is:
f(x0, y0) = log2[P_A(1 − P_A)] + H_A/P_A + (H_L − H_A)/(1 − P_A),   (2)
and the optimal threshold is:
f(x, y) = max{f(x0, y0)},   (3)
where
P_A = Σ_i^s Σ_j^t p_{i,j},   (4)
H_A = −Σ_i^s Σ_j^t p_{i,j} log2 p_{i,j},   (5)
H_L = −Σ_i^L Σ_j^L p_{i,j} log2 p_{i,j},   (6)
the image has L gray levels and N = m × n total pixels; g_{i,j} is the number of pixels whose gray value is i and whose region gray mean is j; p_{i,j} is the probability of the point-gray / region-gray-mean pair (i, j), i.e. p_{i,j} = g_{i,j}/N; and p_{i,j}, {i, j = 1, 2, …, L} is the image's two-dimensional histogram of point gray value versus region gray mean.
3. The welding positioning method for the weld seam of the visual navigation system according to claim 1, characterized in that the specific steps of the edge detection on the binary image in Step 3.5 are:
the welding positioning method of the weld seam of visual navigation system according to claim 1, is characterized in that, the concrete steps that carry out edge detection according to binary image in step 35 are: 步骤三五一、用Gaussian滤波器对图像的最佳阈值f(x,y)进行卷积:Step 351, use the Gaussian filter to convolve the optimal threshold f(x, y) of the image: g(x,y)=h(x,y,σ)*f(x,y)   (7)g(x,y)=h(x,y,σ)*f(x,y) (7) hh (( xx ,, ythe y ,, σσ )) == 11 22 πσπσ 22 ee -- xx 22 ++ ythe y 22 22 σσ 22 -- -- -- (( 88 )) 其中,g(x,y)为卷积后的图像,h(x,y,σ)为Gaussian滤波器函数,σ为标准差,*代表卷积;Among them, g(x,y) is the convolved image, h(x,y,σ) is the Gaussian filter function, σ is the standard deviation, and * represents convolution; 步骤三五二、用一阶偏导有限差分法求出卷积后的图像g(x,y)的局部梯度M(x,y)和边缘方向θ(x,y),Step 352, use the first-order partial derivative finite difference method to obtain the local gradient M(x,y) and edge direction θ(x,y) of the convolved image g(x,y), g'x(x,y)≈Gx(x,y)=[g(x+1,y)-g(x,y)+g(x+1,y+1)-g(x,y+1)]/2   (9)g' x (x,y)≈G x (x,y)=[g(x+1,y)-g(x,y)+g(x+1,y+1)-g(x,y +1)]/2 (9) g'y(x,y)≈Gy(x,y)=[g(x,y+1)-g(x,y)+g(x+1,y+1)-g(x+1,y)]/2   (10)g' y (x,y)≈G y (x,y)=[g(x,y+1)-g(x,y)+g(x+1,y+1)-g(x+1 ,y)]/2 (10) Mm (( xx ,, ythe y )) == GG xx (( xx ,, ythe y )) 22 ++ GG ythe y (( xx ,, ythe y )) 22 -- -- -- (( 1111 )) θ(x,y)=arctan(Gx(x,y)/Gy(x,y))   (12)θ(x,y)=arctan(G x (x,y)/G y (x,y)) (12) 其中,Gx(x,y)表示图像g(x,y)的水平梯度;Gy(x,y)表示图像g(x,y)的垂直方向梯度;M(x,y)表示图像g(x,y)的局部梯度;θ(x,y)表示图像g(x,y)的边缘方向,Among them, G x (x, y) represents the horizontal gradient of the image g(x, y); G y (x, y) represents the vertical gradient of the image g(x, y); M(x, y) represents the image g The local gradient of (x,y); θ(x,y) represents the edge direction of the image g(x,y), 步骤三五三、将图像每一个像素点上将邻域信息局部梯度M(x,y)与沿着梯度线的两个像素相比,若邻域信息局部梯度M(x,y)小于沿着梯度线的像素,则将M(x,y)置为零,该梯度值为边缘点;若邻域信息局部梯度M(x,y)大于等于沿着梯度线的像素,该梯度值不是边缘点,则将不予处理;Step 353: Comparing the local gradient M(x,y) of the neighborhood information on each pixel of the image with the two pixels along the gradient line, 
if the local gradient M(x,y) of the neighborhood information is smaller than If the pixel along the gradient line, set M(x,y) to zero, the gradient value is an edge point; if the local gradient M(x,y) of the neighborhood information is greater than or equal to the pixel along the gradient line, the gradient value is not Edge points will not be processed; 步骤三五四、选取二维直方图的两个阈值,并作用于由非极大值抑制得到的图像,把梯度值小于较小阈值的像素的灰度值设为0,得到图像1;然后把梯度值小于较大阈值的像素的灰度值设为0,得到图像2;以图像2为基础,以图像1为补充来链接获得图像中十字光斑的边缘。Step 354, select two thresholds of the two-dimensional histogram, and act on the image obtained by non-maximum suppression, set the gray value of the pixel whose gradient value is smaller than the smaller threshold to 0, and obtain image 1; then Set the gray value of the pixel whose gradient value is smaller than the larger threshold to 0 to obtain image 2; based on image 2, use image 1 as a supplement to link to obtain the edge of the cross spot in the image. 4.根据权利要求3所述的视觉导航系统的焊缝的焊接定位方法,其特征在于,步骤三五四中链接获得图像中十字光斑的边缘的具体步骤为:4. 
the welding positioning method of the weld seam of visual navigation system according to claim 3, is characterized in that, the specific steps of linking the edge of the cross spot in the image obtained in step 354 are: 步骤三五四一,对图像2进行扫描,当遇到一个非零灰度值的像素点时,跟踪以其为开始点的轮廓线,直到轮廓线的终点;Step 3541, image 2 is scanned, and when a pixel point with a non-zero gray value is encountered, the contour line starting from it is traced until the end point of the contour line; 步骤三五四二,考察图像1中与图像2中轮廓线的终点位置对应的像素点的8个像素点的邻域信息,如果在该点的8个像素点的邻域中有非零灰度值的像素点存在,则将其包括到图像2中,作为新的开始点,然后重复步骤三五四一,直到在图像1和图像2中都无法继续为止;Step 3542, investigate the neighborhood information of the 8 pixel points of the pixel corresponding to the end position of the contour line in image 1 in image 1, if there is a non-zero gray in the neighborhood of 8 pixel points of this point If there is a pixel with a degree value, include it in Image 2 as a new starting point, and then repeat steps 3, 5, 4, and 1 until neither Image 1 nor Image 2 can continue; 步骤三五四三,当完成对包含某一像素点的轮廓线的链接之后,将这条轮廓线标记为已经访问;Step 3543, after completing the link to the contour line containing a certain pixel point, mark this contour line as visited; 重复步骤三五四一、步骤三五四二和步骤三五四三,直到图像2中找不到新轮廓线为止。Repeat step 3541, step 3542 and step 3543 until no new contour line can be found in image 2. 5.根据权利要求1所述的视觉导航系统的焊缝的焊接定位方法,其特征在于,步骤三六所述的骨架曲线为:5. 
the welding positioning method of the weld seam of visual navigation system according to claim 1, is characterized in that, the skeleton curve described in step three or six is: ρ=xcosθ+ysinθ;   (13)ρ=xcosθ+ysinθ; (13) 其中,ρ=xcosθ+ysinθ表示图像空间的一点(x,y)对应于(ρ,θ)空间的一条正弦曲线,x表示像素的横坐标,y表示像素的纵坐标,图像坐标原点到该直线的距离为ρ,该直线的法线与x轴的夹角为θ。Among them, ρ=xcosθ+ysinθ indicates that a point (x, y) in the image space corresponds to a sinusoidal curve in (ρ, θ) space, x indicates the abscissa of the pixel, y indicates the ordinate of the pixel, and the origin of the image coordinates reaches the line The distance is ρ, and the angle between the normal of the line and the x-axis is θ. 6.基于权利要求1所述的焊缝的焊接定位方法的焊缝偏移量获取方法,其特征在于,经过焊缝定位后,提取焊缝弧线的两个端点和十字光的中心点三个特征点,所述两个端点之间距离为a,中心点到与临近的端点之间的距离为b,设a,b的测量值为a',b',可求得偏移角度为:6. The seam offset acquisition method based on the welding positioning method of the seam according to claim 1 is characterized in that, after the seam is positioned, two endpoints of the arc of the seam and the central point three of the crosslight are extracted feature point, the distance between the two end points is a, the distance between the center point and the adjacent end point is b, and the measured values of a and b are a', b', and the offset angle can be obtained as : θθ == arccosarccos aa ++ bb aa '' ++ bb '' == arccosarccos bb '' bb ,, -- -- -- (( 1414 )) 位置偏移量为:The position offset is: w=b'-b,   (15)w=b'-b, (15) 当a'=a,b'=b,为正常情况,机器人正常行进中;当a'>a,b'≤b,机器人向左偏移;当a'>a,b'≥b,机器人向右偏移。When a'=a, b'=b, it is normal, and the robot is moving normally; when a'>a, b'≤b, the robot shifts to the left; when a'>a, b'≥b, the robot moves toward right offset.
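Steps 3.2 to 3.4 of claim 1 (red-component extraction and binarization against a threshold) can be sketched as follows. This is a minimal illustration, not the patented implementation: the function name `binarize_red_channel` is invented for the example, and the threshold is passed in directly, whereas in the patent it comes from the two-dimensional maximum-entropy method of claim 2.

```python
import numpy as np

def binarize_red_channel(frame, threshold):
    """Take the red component of an H x W x 3 RGB frame and map pixels above
    the threshold to 255 and the rest to 0, producing a binary image."""
    red = frame[:, :, 0]                                 # red-component array
    return np.where(red > threshold, 255, 0).astype(np.uint8)

# tiny synthetic frame with a bright red patch (stand-in for the laser spot)
frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[1:3, 1:3, 0] = 200
binary = binarize_red_channel(frame, 128)
```

After this step `binary` contains only the values 0 and 255, which is the form the Canny stage of step 3.5 expects.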
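Claim 2's two-dimensional maximum-entropy threshold, equations (1) through (6), can be sketched as follows. The quantization to 64 grey levels, the brute-force search over the threshold pair (s, t), and the `eps` guard against log 0 are choices made for this sketch, not part of the claim; the function name `max_entropy_2d_threshold` is likewise an assumption.

```python
import numpy as np

def max_entropy_2d_threshold(gray, levels=64):
    """Build the 2-D histogram p(i, j) of (point grey value, 3x3 neighbourhood
    mean) and search for the (s, t) maximizing the discriminant of eq. (2):
    f = log2[P_A(1 - P_A)] + H_A/P_A + (H_L - H_A)/(1 - P_A)."""
    g = (gray.astype(np.float64) / 256.0 * levels).astype(int)
    # neighbourhood mean: the 8 neighbours of claim 2 plus the pixel itself
    pad = np.pad(g, 1, mode='edge')
    h, w = g.shape
    nbr = sum(pad[dy:dy + h, dx:dx + w] for dy in range(3) for dx in range(3)) // 9
    hist, _, _ = np.histogram2d(g.ravel(), nbr.ravel(),
                                bins=levels, range=[[0, levels], [0, levels]])
    p = hist / hist.sum()                     # p(i, j) = g(i, j) / N
    eps = 1e-12
    plogp = p * np.log2(p + eps)              # 0 where p = 0
    P = p.cumsum(0).cumsum(1)                 # P_A over region A = [0..s] x [0..t], eq. (4)
    H = -plogp.cumsum(0).cumsum(1)            # H_A, eq. (5)
    H_L = H[-1, -1]                           # eq. (6)
    best, best_st = -np.inf, (0, 0)
    for s in range(levels - 1):
        for t in range(levels - 1):
            P_A, H_A = P[s, t], H[s, t]
            if P_A < eps or P_A > 1.0 - eps:
                continue                      # degenerate split, skip
            f = np.log2(P_A * (1.0 - P_A)) + H_A / P_A + (H_L - H_A) / (1.0 - P_A)
            if f > best:
                best, best_st = f, (s, t)
    return best_st

# bimodal test image: dark upper half, bright lower half
img = np.zeros((16, 16), dtype=np.uint8)
img[:8, :] = 30
img[8:, :] = 220
s, t = max_entropy_2d_threshold(img)
```

The prefix-sum arrays make each candidate (s, t) an O(1) evaluation, so even the exhaustive search stays cheap at this quantization.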
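The finite-difference gradient of claim 3, equations (9) to (12), can be sketched as follows. Using `arctan2` instead of the claim's plain arctan is an assumption of the sketch so that Gy = 0 is handled; the function name `gradients` is also invented here.

```python
import numpy as np

def gradients(g):
    """First-order finite differences on a 2x2 window, eqs. (9)-(12):
    returns the local gradient magnitude M(x, y) and the edge direction."""
    g = g.astype(np.float64)
    gx = (g[1:, :-1] - g[:-1, :-1] + g[1:, 1:] - g[:-1, 1:]) / 2.0   # eq. (9)
    gy = (g[:-1, 1:] - g[:-1, :-1] + g[1:, 1:] - g[1:, :-1]) / 2.0   # eq. (10)
    M = np.hypot(gx, gy)                                             # eq. (11)
    theta = np.arctan2(gx, gy)                                       # eq. (12), arctan(Gx/Gy)
    return M, theta

# a ramp increasing along the first axis: Gx = 1 and Gy = 0 everywhere
ramp = np.tile(np.arange(5.0), (5, 1)).T
M, theta = gradients(ramp)
```

On the ramp every interior window sees a unit horizontal gradient, so M is constant and theta points along the ramp's normal.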
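The double-threshold linking of step 3.5.4 and claim 4 can be sketched as follows. Pixels at or above the larger threshold play the role of "image 2" (contour seeds), pixels at or above the smaller threshold the role of "image 1" (the supplement); a breadth-first traversal is this sketch's realization of the claim's scan-and-track loop, and the name `link_edges` is an assumption.

```python
import numpy as np
from collections import deque

def link_edges(mag, low, high):
    """Hysteresis linking: start contours at strong pixels (>= high) and
    extend them through 8-neighbours that are at least weak (>= low)."""
    strong = mag >= high                      # "image 2"
    weak = mag >= low                         # "image 1"
    out = strong.copy()
    q = deque(zip(*np.nonzero(strong)))
    rows, cols = mag.shape
    while q:
        x, y = q.popleft()
        for dx in (-1, 0, 1):                 # examine the 8-neighbourhood
            for dy in (-1, 0, 1):
                nx, ny = x + dx, y + dy
                if 0 <= nx < rows and 0 <= ny < cols \
                        and weak[nx, ny] and not out[nx, ny]:
                    out[nx, ny] = True        # include into image 2 as new start
                    q.append((nx, ny))
    return out.astype(np.uint8) * 255

mag = np.array([[0,  0,   0,  0, 0],
                [0, 60, 100, 60, 0],
                [0,  0,   0,  0, 0],
                [0, 60,   0,  0, 0]], dtype=float)
edges = link_edges(mag, low=50, high=90)
```

Note how the isolated weak pixel in the last row is discarded: it is above the low threshold but is never reached from a strong seed.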
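The Hough voting of claim 5 and step 3.7 can be sketched as follows: every skeleton point (x, y) votes for the sinusoid ρ = x cosθ + y sinθ of equation (13), and straight-line segments of the skeleton appear as accumulator peaks. The bin counts, the ρ range, and the function name `hough_accumulate` are choices of this sketch.

```python
import numpy as np

def hough_accumulate(points, n_theta=180, n_rho=100, rho_max=200.0):
    """Accumulate votes in (rho, theta) space for a list of (x, y) points."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    acc = np.zeros((n_rho, n_theta), dtype=int)
    for x, y in points:
        rho = x * np.cos(thetas) + y * np.sin(thetas)        # eq. (13)
        idx = np.round((rho + rho_max) / (2 * rho_max) * (n_rho - 1)).astype(int)
        ok = (idx >= 0) & (idx < n_rho)                      # keep in-range bins
        acc[idx[ok], np.arange(n_theta)[ok]] += 1
    return acc, thetas

# 20 collinear points on the horizontal line y = 5
pts = [(x, 5) for x in range(20)]
acc, thetas = hough_accumulate(pts)
```

All 20 points vote into the cell for θ = π/2, ρ = 5, so that cell reaches the maximum possible count; in the patent's setting the two dominant peaks recover the two straight lines flanking the weld arc.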
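The offset computation of claim 6, equations (14) and (15), can be sketched as follows. The code uses the arccos[(a+b)/(a'+b')] form of eq. (14), which assumes a' + b' ≥ a + b as implied by the claim's drift conditions; the function name `weld_offset` and the direction labels are assumptions of the sketch.

```python
import math

def weld_offset(a, b, a_meas, b_meas):
    """a: distance between the two arc endpoints; b: distance from the cross
    centre to the nearer endpoint; a_meas, b_meas: their measured values
    (a', b' in the claim). Returns offset angle, position offset, direction."""
    theta = math.acos((a + b) / (a_meas + b_meas))   # eq. (14)
    w = b_meas - b                                   # eq. (15)
    if a_meas == a and b_meas == b:
        direction = 'on track'
    elif b_meas <= b:
        direction = 'drifting left'                  # a' > a, b' <= b
    else:
        direction = 'drifting right'                 # a' > a, b' >= b
    return theta, w, direction

theta0, w0, d0 = weld_offset(3.0, 4.0, 3.0, 4.0)     # normal travel
theta1, w1, d1 = weld_offset(3.0, 4.0, 4.0, 4.0)     # a' > a, b' = b
```

With unchanged measurements the angle and offset are both zero; once the measured endpoint distance a' grows, the sign of b' - b distinguishes left drift from right drift.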
CN201210150845.XA 2012-05-16 2012-05-16 Welding and locating method of welding seam and method of obtaining welding seam offset for visual navigation system of wall climbing robot for weld inspection Expired - Fee Related CN102645219B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201210150845.XA CN102645219B (en) 2012-05-16 2012-05-16 Welding and locating method of welding seam and method of obtaining welding seam offset for visual navigation system of wall climbing robot for weld inspection

Publications (2)

Publication Number Publication Date
CN102645219A CN102645219A (en) 2012-08-22
CN102645219B true CN102645219B (en) 2014-12-03

Family

ID=46658192

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201210150845.XA Expired - Fee Related CN102645219B (en) 2012-05-16 2012-05-16 Welding and locating method of welding seam and method of obtaining welding seam offset for visual navigation system of wall climbing robot for weld inspection

Country Status (1)

Country Link
CN (1) CN102645219B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105988142A (en) * 2014-12-31 2016-10-05 新代科技股份有限公司 Pipe welding bead detection system and method

Families Citing this family (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103567607A (en) * 2013-11-06 2014-02-12 广东德科机器人技术与装备有限公司 Welding-seam tracking method
CN103853141A (en) * 2014-03-18 2014-06-11 航天科工哈尔滨风华有限公司 Control system of automatic detecting wall-climbing robot for fan tower cylinder welding seams
CN104121882B (en) * 2014-07-24 2017-06-13 中国石油集团渤海石油装备制造有限公司 Be in a pout detection method and the device of steel pipe seam
CN104634372B (en) * 2015-02-15 2017-03-22 易测智能科技(天津)有限公司 Terminal positioning device and positioning method for testing of mobile terminals
CN104776799B (en) * 2015-04-10 2017-06-13 清华大学 Detection device and method before the cosmetic welding of shadow feature are built using lateral light
CN106530269A (en) * 2015-09-15 2017-03-22 苏州中启维盛机器人科技有限公司 Weld detection method
CN105195888B (en) * 2015-10-09 2018-04-13 航天工程装备(苏州)有限公司 Agitating friction welds planar laser tracking compensation technique
CN105499865A (en) * 2016-01-22 2016-04-20 广西大学 Planar welding manipulator with function of automatic track seeking
CN106091936A (en) * 2016-06-01 2016-11-09 中国电子科技集团公司第四十研究所 A kind of cellophane offset detecting device based on machine vision technique and method
CN106382884A (en) * 2016-08-18 2017-02-08 广东工业大学 Point light source welding seam scanning detection method
CN106370113A (en) * 2016-11-01 2017-02-01 合肥超科电子有限公司 Water wheel offset detection device, automatic alignment device and water wheel support
CN106767401A (en) * 2016-11-26 2017-05-31 江苏瑞伯特视觉科技股份有限公司 A kind of shaft hole series part based on cross laser and machine vision determines appearance localization method
CN106695192B (en) * 2016-12-22 2018-04-10 江苏工程职业技术学院 A kind of climbing robot automatic welding control method
CN107414253B (en) * 2017-08-21 2022-09-06 河北工业大学 Welding seam tracking control device and method based on cross laser
CN108788550B (en) * 2018-06-27 2019-07-12 清华大学 Detection device, control method and device for detecting fine gap weld bead using detection device
CN109058053B (en) * 2018-07-04 2020-11-03 苏州智能制造研究院有限公司 Method for measuring horizontal displacement of top end of wind driven generator tower
WO2020042030A1 (en) * 2018-08-29 2020-03-05 深圳配天智能技术研究院有限公司 Gap detection method and system for visual welding system
CN109060262A (en) * 2018-09-27 2018-12-21 芜湖飞驰汽车零部件技术有限公司 A kind of wheel rim weld joint air-tight detection device and air-tightness detection method
CN109483018A (en) * 2018-11-06 2019-03-19 湖北书豪智能科技有限公司 The active vision bootstrap technique of weld seam in automatic welding of pipelines
CN109365998B (en) * 2018-12-24 2019-09-17 中南大学 Laser soldering device and vision positioning method based on machine vision positioning
CN109949245B (en) * 2019-03-25 2021-04-16 长沙智能驾驶研究院有限公司 Cross laser detection and positioning method, device, storage medium and computer equipment
CN110310295B (en) * 2019-03-27 2021-09-14 广东技术师范学院天河学院 Weld contour extraction method and system
CN109993741B (en) * 2019-04-03 2022-10-04 南昌航空大学 Steel rail welding seam contour automatic positioning method based on K-means clustering
CN112388626B (en) * 2019-08-15 2022-04-22 广东博智林机器人有限公司 Robot-assisted navigation method
CN111044701A (en) * 2019-12-30 2020-04-21 中核武汉核电运行技术股份有限公司 Device and method for calibrating position of wall-climbing robot for spent pool inspection of nuclear power plant
CN111551565A (en) * 2020-06-19 2020-08-18 湖南恒岳重钢钢结构工程有限公司 Wind power tower cylinder weld defect detection device and method based on machine vision
CN112797895B (en) * 2020-12-24 2022-07-22 上海智殷自动化科技有限公司 Frame body positioning device based on vision and laser
CN112902876B (en) * 2021-01-14 2022-08-26 西北工业大学 Method for measuring weld deflection of spin forming curved surface member of tailor-welded blank
CN113048882A (en) * 2021-03-09 2021-06-29 徐州徐工挖掘机械有限公司 Intelligent detection system for weld surface quality and implementation method
CN115018833B (en) * 2022-08-05 2022-11-04 山东鲁芯之光半导体制造有限公司 A kind of processing defect detection method of semiconductor device
CN115648223B (en) * 2022-12-09 2025-02-07 中建八局第二建设有限公司 Control method and equipment of flat curtain wall glue injection robot
CN116812033A (en) * 2023-05-18 2023-09-29 国能九江发电有限公司 Membrane water wall climbing robot and path deviation correcting method thereof
CN116558438B (en) * 2023-07-11 2023-09-15 湖南视觉伟业智能科技有限公司 Bottle blowing quality detection device and method

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1900701A (en) * 2006-07-19 2007-01-24 北京科技大学 Online detecting method and device for hot rolling strip surface fault based on laser line light source
CN101033953A (en) * 2007-02-02 2007-09-12 西安交通大学 Measurement method of planeness based on image processing and pattern recognizing
CN101718532A (en) * 2009-06-15 2010-06-02 三星重工业株式会社 Laser image module and non-contact type measurement device using same
CN101782552A (en) * 2010-02-05 2010-07-21 航天科工哈尔滨风华有限公司 Automatic on-line detecting device for welding lines of mast of wind driven generator

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3556589B2 (en) * 2000-09-20 2004-08-18 ファナック株式会社 Position and orientation recognition device

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Zhu Zhihong, Li Jize, Peng Jinmin, Li Jun, Gao Xueshan. Research on the mobile platform of a miniature wall-climbing robot for wall surface inspection. Journal of Mechanical Engineering, 2011, Vol. 47, No. 3 (full text). *

Also Published As

Publication number Publication date
CN102645219A (en) 2012-08-22

Similar Documents

Publication Publication Date Title
CN102645219B (en) Welding and locating method of welding seam and method of obtaining welding seam offset for visual navigation system of wall climbing robot for weld inspection
CN103846606B (en) Welding track based on machine vision corrects Special testing device and method
CN103759648B (en) A kind of complicated angle welding method for detecting position based on Binocular stereo vision with laser
Zhang et al. Identification of the deviation of seam tracking and weld cross type for the derusting of ship hulls using a wall-climbing robot based on three-line laser structural light
CN106952281B (en) A method for weld profile feature recognition and real-time planning of weld bead
CN112101137B (en) Welding seam identification and path extraction method for wall-climbing robot navigation
CN102699534A (en) Scanning type laser vision sensing-based narrow-gap deep-groove automatic laser multilayer welding method for thick plate
CN104668738B (en) Cross type double-line laser vision sensing welding gun height real-time identification system and method
CN114140439A (en) Method and device for feature point recognition of laser welding seam based on deep learning
CN106181162A (en) A kind of real-time weld joint tracking detecting system based on machine vision and method
CN102607467A (en) Device and method for detecting elevator guide rail perpendicularity based on visual measurement
CN114043045B (en) Round hole automatic plug welding method and device based on laser vision
CN106485749A (en) A kind of rectangular pins element rough localization method based on angle point
CN105563481A (en) Robot vision guide method used for shaft hole assembling
CN110490873A (en) A kind of the mine rigid cage guide deformation diagnostic device and its localization method of view-based access control model and laser fusion
CN116734082B (en) Pipeline robot and multi-sensor fusion pipeline inner diameter defect and sludge detection method
CN117830817A (en) A three-dimensional recognition method for underwater V-groove welds
CN104021546B (en) The workpiece online method for rapidly positioning of label based on image procossing
CN213196231U (en) A seam tracking robot
CN107843602A (en) A kind of detection method for quality of welding line based on image
CN108788544A (en) A kind of weld seam starting point detecting method based on structured light vision sensor
CN104850869A (en) Automatic identification system and method for high-speed railway CPIII marker
CN117781963A (en) Welding bead flatness measurement system and method based on structured light triangulation
CN116740141A (en) Machine vision-based weld joint positioning system and method for small preceding assembly
CN117409039A (en) A wheel-rail relative displacement calculation method based on virtual point tracking network

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C53 Correction of patent of invention or patent application
CB03 Change of inventor or designer information

Inventor after: Zhang Liguo

Inventor after: Xiao Bo

Inventor after: Jiao Jianbin

Inventor after: Gao Xueshan

Inventor after: Liu Lu

Inventor after: Zhang Lei

Inventor before: Zhang Liguo

Inventor before: Xiao Bo

Inventor before: Jiao Jianbin

Inventor before: Gao Xueshan

COR Change of bibliographic data

Free format text: CORRECT: INVENTOR; FROM: ZHANG LIGUO XIAO BO JIAO JIANBIN GAO XUESHAN TO: ZHANG LIGUO XIAO BO JIAO JIANBIN GAO XUESHAN LIU LU ZHANG LEI

C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20141203
