CN110412564A - A train wagon recognition and ranging method based on multi-sensor fusion - Google Patents
A train wagon recognition and ranging method based on multi-sensor fusion
- Publication number
- CN110412564A (application number CN201910689859.0A)
- Authority
- CN
- China
- Prior art keywords
- sensor
- railway carriage
- data
- radar
- target
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/06—Systems determining position data of a target
- G01S13/08—Systems for measuring distance only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/02—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
- G01S7/41—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
- G01S7/415—Identification of targets based on measurements of movement associated with the target
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Multimedia (AREA)
- Train Traffic Observation, Control, And Security (AREA)
Abstract
The invention discloses a train wagon recognition and ranging method based on multi-sensor fusion, comprising the following steps: S1, two sensors collect and process data from the target area to obtain their respective standard data; S2, a data fusion platform is built from the respective standard data and an effective target region is obtained; S3, the target is recognized and detected within the effective target region; S4, the distance to the target wagon is extracted from the effective target region, and the result of whether it is the target wagon, together with the corresponding distance, is transmitted to the front end in real time. The invention automatically measures the distance between the rear of the train and the wagon to be coupled and provides this distance to the locomotive driver in real time, so that the train speed can be adjusted accurately and in real time, ultimately achieving reliable and safe coupling of the waiting wagon.
Description
Technical Field
The invention belongs to the technical field of traffic control systems, and in particular relates to a train wagon recognition and ranging method based on multi-sensor fusion.
Background
At present, freight-train wagons are coupled in the freight yard manually: the distance between the train and the wagon to be coupled is estimated by eye, which introduces a large error. The locomotive driver adjusts the train speed from this imprecise distance estimate, which degrades the quality of the coupling operation. In addition, seasonal conditions, weather, occlusion and lighting further impair visual distance estimation and reduce operating efficiency.
After receiving an order, the train driver and the guide observe and judge by eye and complete the coupling cooperatively according to the relevant procedures. The wagons to be coupled are distributed over different branch tracks; the driver steers the train onto the branch holding the target wagon and judges the remaining distance visually in order to control the speed and complete the coupling. This procedure is widely used on most freight railways in the country, but manual operation has the following drawbacks. First, the dispatching strategy is complex and traffic efficiency is low. Second, manual operation is poorly controllable and prone to misjudgment and negligence, which causes operational accidents: if the speed is too high (above 5 km/h), the wagons collide and the wagon body deforms; if the speed is too low (below 5 km/h), the pushing force is insufficient and the coupling fails to engage. Third, the process is strongly affected by weather; at midnight, for example, poor lighting greatly reduces the capability of manual dispatching.
Among mainstream ranging technologies, infrared ranging is inexpensive and safe to operate, but it has poor interference immunity, short range (effective distance typically within ten meters) and poor directionality. Lidar ranging is accurate, but eye safety must be considered during operation, the devices are difficult and costly to manufacture, and the optics must be kept clean or the measurement is degraded. Millimeter-wave radar offers long range (effective range typically within 100 meters) and good real-time performance but a high false-alarm rate, while vision sensors have strong recognition capability but complex ranging strategies and low ranging accuracy. Environmental factors further complicate the task: cold conditions caused by seasonal change (around minus ten degrees Celsius), low-visibility conditions such as fog, rain and night, and cluttered work sites (freight yards) where non-target objects cause occlusion or interference. Although researchers have greatly improved single-sensor performance by upgrading hardware, optimizing algorithms and proposing new system designs, the perception accuracy of a single sensor under such complex traffic conditions still needs improvement. Researchers have therefore turned to recognition methods based on multi-sensor information fusion, and recent advances in hardware, software and mathematical algorithms provide the technical and theoretical basis for such fusion: the distance to the target ahead can be detected automatically through suitable algorithms and mathematical models. This highly controllable, highly automated control mode has become a research hotspot at home and abroad. To meet the requirements of the working process, the present method therefore fuses a millimeter-wave radar with a monocular RGB vision sensor; their complementary strengths reduce the dependence on any single sensor and improve performance.
Summary of the Invention
The invention discloses a train wagon recognition and ranging method based on multi-sensor fusion. The method fuses a millimeter-wave radar with a vision sensor to automatically measure the distance between the rear of the train and the wagon to be coupled, and provides this distance to the locomotive driver in real time, so that the train speed can be adjusted accurately in real time and the waiting wagon can ultimately be coupled reliably and safely.
The invention is realized through the following technical solution: a train wagon recognition and ranging method based on multi-sensor fusion, the method comprising the following steps:
S1: the two sensors collect and process data from the target area to obtain their respective standard data;
S2: a data fusion platform is built from the respective standard data and an effective target region is obtained;
S3: the target is recognized and detected within the effective target region;
S4: the distance to the target wagon is extracted from the effective target region, and the result of whether it is the target wagon, together with the corresponding distance, is transmitted to the remote terminal controller in real time.
Further, step S1 comprises:
S11: the first sensor collects image data of the area ahead in real time and outputs raw image data; at the same time, the second sensor collects radar data of the area ahead in real time and outputs raw radar data;
S12: the raw image data and the raw radar data are standardized to obtain image standard data and radar standard data respectively, and then step S2 is executed.
Further, step S12 comprises:
S121: distortion correction is applied to the raw image data;
S122: the distortion-corrected image data is encoded;
S123: pixel-level processing is applied to the encoded image data to generate the image standard data, and then step S2 is executed.
Further, step S12 also comprises:
S124: threshold filtering is applied to the raw radar data;
S125: Kalman state estimation is applied to the threshold-filtered radar data to generate the radar standard data, and then step S2 is executed.
Further, the method uses a first-sensor installation and calibration module, a second-sensor installation and calibration module, a joint calibration module and a time synchronization module, wherein
the first-sensor installation and calibration module is used to install and calibrate the first sensor;
the second-sensor installation and calibration module is used to install and calibrate the second sensor;
the joint calibration module is used to jointly calibrate the first sensor and the second sensor; and
the time synchronization module is used to time-synchronize the first sensor and the second sensor.
Step S2 comprises:
S21: installing and calibrating the first sensor;
S22: installing and calibrating the second sensor;
S23: jointly calibrating and time-synchronizing the first sensor and the second sensor to obtain the effective target region.
Further, the first sensor is a camera sensor.
Further, the second sensor is a radar sensor.
Further, the radar sensor is a millimeter-wave radar.
Further, step S3 comprises:
S31: training a deep-neural-network object detector;
S32: passing the effective target region through the deep-neural-network object detector to detect the rails in the area ahead, and cropping out the effective measurement region containing the target wagon.
Further, in step S4, specifically, steps S1-S3 are packaged and integrated, the distance to the target wagon is extracted from the effective measurement region obtained by image recognition, and the result of whether it is the target wagon, together with the corresponding distance, is transmitted to the remote terminal controller in real time.
The beneficial effects of the invention are:
First, fusing the millimeter-wave radar sensor with the image sensor compensates for the limited amount of road-environment information that a single sensor can acquire.
Second, in severe weather, manual judgment is seriously disturbed by factors such as poor light and low temperature, whereas millimeter-wave radar adapts well to the environment, greatly improving the potential of on-board environmental perception for all-weather operation, including winter (temperatures below 30 °C), summer, day, night, and sunny, foggy, rainy and hazy conditions. Compared with purely manual operation, this greatly improves the reliability, accuracy and adaptability of environmental perception during the working process.
Third, with the sensors the distance to the target wagon is computed scientifically and accurately, instead of relying on the operator's rough judgment from experience and visual observation, which largely eliminates the inefficiency and instability of manual control.
Brief Description of the Drawings
Fig. 1 is a flow chart of the train wagon recognition and ranging method based on multi-sensor fusion of the present invention;
Fig. 2 is a module execution diagram of the train wagon recognition and ranging method based on multi-sensor fusion of the present invention.
Detailed Description of the Embodiments
The technical solutions in the embodiments of the present invention are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the invention, not all of them. All other embodiments obtained by a person of ordinary skill in the art based on these embodiments without creative effort fall within the protection scope of the invention.
Referring to Fig. 1, the invention is realized through the following technical solution. The invention provides an embodiment of a train wagon recognition and ranging method based on multi-sensor fusion, the method comprising the following steps:
S1: the two sensors collect and process data from the target area to obtain their respective standard data;
S2: a data fusion platform is built from the respective standard data and an effective target region is obtained;
S3: the target is recognized and detected within the effective target region;
S4: the distance to the target wagon is extracted from the effective target region, and the result of whether it is the target wagon, together with the corresponding distance, is transmitted to the remote terminal controller in real time.
Referring to Fig. 1, in a preferred embodiment of this part, step S1 comprises:
S11: the first sensor collects image data of the area ahead in real time and outputs raw image data; at the same time, the second sensor collects radar data of the area ahead in real time and outputs raw radar data;
S12: the raw image data and the raw radar data are standardized to obtain image standard data and radar standard data respectively, and then step S2 is executed.
Specifically, the radar is a millimeter-wave radar using short-wavelength electromagnetic waves, here a frequency-modulated continuous-wave (FMCW) millimeter-wave radar. By capturing the reflected FMCW signal, the radar system acquires the distance, angle and relative velocity of the target ahead in real time. The camera (camera sensor) is an RGB color camera with a resolution of 1024×728 that captures color images in real time. Step S12 preprocesses the captured images and filters and corrects the radar data.
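The patent does not give the FMCW range equation, but for an FMCW radar the range follows directly from the beat frequency between the transmitted and reflected chirp. A minimal Python sketch is shown below; the chirp bandwidth, chirp duration and beat frequency are assumed values for illustration only and are not taken from the patent.

```python
# Illustrative FMCW range calculation; chirp parameters are assumptions, not patent values.
C = 3.0e8  # speed of light, m/s

def fmcw_range(beat_freq_hz, bandwidth_hz=250e6, chirp_time_s=50e-6):
    """Range from the beat frequency of a linear FMCW chirp: R = c * f_b * T / (2 * B)."""
    return C * beat_freq_hz * chirp_time_s / (2.0 * bandwidth_hz)

if __name__ == "__main__":
    # A hypothetical 1 MHz beat frequency corresponds to 30 m with these chirp settings.
    print(f"range = {fmcw_range(1e6):.1f} m")
```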
Referring to Fig. 1, in a preferred embodiment of this part, step S12 comprises:
S121: distortion correction is applied to the raw image data;
S122: the distortion-corrected image data is encoded;
S123: pixel-level processing is applied to the encoded image data to generate the image standard data, and then step S2 is executed.
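The patent leaves the image-side standardization of S121-S123 at this level of detail; a minimal sketch using OpenCV is given below under that limitation. The camera matrix and distortion coefficients are placeholder values standing in for the results of the camera calibration in step S21, and the pixel-level processing is reduced to a resize plus normalization because the patent does not specify it.

```python
import cv2
import numpy as np

# Placeholder intrinsics; in practice these come from the camera calibration of step S21.
K = np.array([[1000.0, 0.0, 512.0],
              [0.0, 1000.0, 364.0],
              [0.0, 0.0, 1.0]])
DIST = np.array([-0.30, 0.10, 0.0, 0.0, 0.0])  # assumed radial/tangential coefficients

def standardize_image(raw_bgr, out_size=(1024, 728)):
    """S121-S123 sketch: undistort, re-encode at a fixed size, normalize pixels to [0, 1]."""
    undistorted = cv2.undistort(raw_bgr, K, DIST)   # S121 distortion correction
    resized = cv2.resize(undistorted, out_size)     # S122/S123 fixed encoding and size
    return resized.astype(np.float32) / 255.0       # pixel-level normalization

if __name__ == "__main__":
    frame = np.zeros((728, 1024, 3), dtype=np.uint8)  # stand-in for a captured frame
    std = standardize_image(frame)
    print(std.shape, std.dtype)
```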
Referring to Fig. 1, in a preferred embodiment of this part, step S12 also comprises:
S124: threshold filtering is applied to the raw radar data;
S125: Kalman state estimation is applied to the threshold-filtered radar data to generate the radar standard data, and then step S2 is executed.
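Steps S124-S125 only name threshold filtering and Kalman state estimation; a minimal sketch is given below that assumes a constant-velocity model over the measured range, with the gating limits and noise parameters chosen arbitrarily for illustration.

```python
import numpy as np

def threshold_filter(detections, min_rcs=-10.0, max_range=100.0):
    """S124 sketch: drop detections that are too weak or outside the working range (assumed limits)."""
    return [d for d in detections if d["rcs"] >= min_rcs and 0.0 < d["range"] <= max_range]

class RangeKalman:
    """S125 sketch: constant-velocity Kalman filter over the measured range (state = [range, range_rate])."""
    def __init__(self, dt=0.05, q=0.5, r=0.25):
        self.x = np.zeros(2)                          # state estimate
        self.P = np.eye(2) * 10.0                     # state covariance
        self.F = np.array([[1.0, dt], [0.0, 1.0]])    # constant-velocity motion model
        self.H = np.array([[1.0, 0.0]])               # only range is measured
        self.Q = np.eye(2) * q                        # assumed process noise
        self.R = np.array([[r]])                      # assumed measurement noise

    def update(self, measured_range):
        # predict
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        # correct
        y = measured_range - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P
        return self.x[0]

if __name__ == "__main__":
    kf = RangeKalman()
    for z in [30.2, 29.9, 29.5, 29.1]:                # hypothetical noisy range readings
        print(round(kf.update(z), 2))
```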
Referring to Figs. 1 and 2, a preferred embodiment of this part comprises a first-sensor installation and calibration module, a second-sensor installation and calibration module, a joint calibration module and a time synchronization module, wherein
the first-sensor installation and calibration module is used to install and calibrate the first sensor;
the second-sensor installation and calibration module is used to install and calibrate the second sensor;
the joint calibration module is used to jointly calibrate the first sensor and the second sensor; and
the time synchronization module is used to time-synchronize the first sensor and the second sensor.
Step S2 comprises:
S21: installing and calibrating the first sensor;
S22: installing and calibrating the second sensor;
S23: jointly calibrating and time-synchronizing the first sensor and the second sensor to obtain the effective target region.
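The patent does not disclose the mathematics of the joint calibration in S23. One common choice, sketched below purely as an assumption, is to estimate a rigid transform from the radar frame to the camera frame offline and then project each radar target into the image with the camera intrinsics, so that radar detections can be associated with image regions; all numerical values below are placeholders.

```python
import numpy as np

# Assumed extrinsics from an offline joint calibration (radar frame -> camera frame).
R_RC = np.eye(3)                      # rotation
T_RC = np.array([0.0, 0.2, 0.1])      # translation in meters
K = np.array([[1000.0, 0.0, 512.0],   # camera intrinsics from step S21 (placeholder values)
              [0.0, 1000.0, 364.0],
              [0.0, 0.0, 1.0]])

def radar_to_pixel(range_m, azimuth_rad):
    """Project a radar detection (range, azimuth, assumed zero elevation) into the image."""
    # Assumed conventions: radar frame x forward / y left; camera frame z forward / x right / y down.
    p_radar = np.array([range_m * np.cos(azimuth_rad), range_m * np.sin(azimuth_rad), 0.0])
    p_cam = R_RC @ np.array([-p_radar[1], 0.0, p_radar[0]]) + T_RC
    uvw = K @ p_cam
    return uvw[:2] / uvw[2]           # pixel coordinates (u, v)

if __name__ == "__main__":
    print(radar_to_pixel(30.0, np.deg2rad(2.0)))
```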
Referring to Fig. 2, in a preferred embodiment of this part, the first sensor is a camera sensor.
Referring to Fig. 2, in a preferred embodiment of this part, the second sensor is a radar sensor.
Referring to Fig. 2, in a preferred embodiment of this part, the radar sensor is a millimeter-wave radar.
Referring to Fig. 1, in a preferred embodiment of this part, step S3 comprises:
S31: training a deep-neural-network object detector;
S32: passing the effective target region through the deep-neural-network object detector to detect the rails in the area ahead, and cropping out the effective measurement region containing the target wagon.
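The patent does not name the network architecture of S31; the sketch below therefore uses a stub in place of the trained detector and only illustrates how its rail bounding box would be used to crop the effective measurement region in S32.

```python
import numpy as np

def detect_rail_bbox(image):
    """Stub for the trained deep-neural-network detector of S31.
    A real implementation would return the bounding box predicted by the network; here a
    fixed box covering the lower central part of the frame is returned for illustration."""
    h, w = image.shape[:2]
    return (int(0.3 * w), int(0.4 * h), int(0.7 * w), int(h))  # (x1, y1, x2, y2)

def crop_effective_region(image):
    """S32 sketch: crop the image to the rail region containing the target wagon."""
    x1, y1, x2, y2 = detect_rail_bbox(image)
    return image[y1:y2, x1:x2], (x1, y1, x2, y2)

if __name__ == "__main__":
    frame = np.zeros((728, 1024, 3), dtype=np.uint8)
    region, bbox = crop_effective_region(frame)
    print(region.shape, bbox)
```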
Referring to Fig. 1, in a preferred embodiment of this part, in step S4, specifically, steps S1-S3 are packaged and integrated, the distance to the target wagon is extracted from the effective measurement region obtained by image recognition, and the result of whether it is the target wagon, together with the corresponding distance, is transmitted to the front end in real time.
Specifically, the above steps are packaged and integrated, and the radar information is used to extract the distance to the target wagon within the effective region obtained by image recognition. The result of whether it is the target wagon and the corresponding distance are transmitted to the front end in real time to assist the staff in completing the wagon coupling with the obtained data. Preferably, the millimeter-wave radar is an ARS408-217-77GHz millimeter-wave radar.
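As a rough illustration of S4, the sketch below keeps only the radar detections whose image projection falls inside the cropped rail region and reports the nearest one. It reuses radar_to_pixel and the bounding box from the earlier sketches and assumes a simple dictionary as the message sent to the front end; none of this format is specified in the patent.

```python
def wagon_distance(detections, bbox):
    """S4 sketch: among radar detections projecting into the effective region, report the nearest.
    Each detection is a dict with 'range' (m) and 'azimuth' (rad); bbox is (x1, y1, x2, y2) pixels."""
    x1, y1, x2, y2 = bbox
    ranges_in_region = []
    for det in detections:
        u, v = radar_to_pixel(det["range"], det["azimuth"])  # from the joint-calibration sketch
        if x1 <= u <= x2 and y1 <= v <= y2:
            ranges_in_region.append(det["range"])
    if not ranges_in_region:
        return {"target_wagon": False, "distance_m": None}   # nothing detected on the rails ahead
    return {"target_wagon": True, "distance_m": min(ranges_in_region)}

if __name__ == "__main__":
    demo = [{"range": 32.5, "azimuth": 0.01, "rcs": 5.0}]    # hypothetical radar detection
    print(wagon_distance(demo, (300, 290, 720, 728)))
```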
Specific working process of this embodiment:
When the train is to couple a wagon, the driver steers the train from the straight track onto the branch holding the wagon to be coupled and starts the equipment involved in the method of the invention. The color image sensor (camera sensor) captures an RGB image of the area ahead and sends it to the processor; after image preprocessing, a deep-neural-network framework is used to recognize the rails and narrow the radar monitoring range. The system then judges whether the area ahead contains the target wagon or an obstacle blocking the view. If the target wagon is present, the millimeter-wave radar sensor acquires the distance, velocity and angle of the rear of the target wagon and sends them to the processor; the information fusion platform then performs radar calibration, camera installation calibration, joint calibration and time synchronization, and after final processing the real-time distance to the target wagon ahead is output to the front end. If an obstacle is present, that information is returned and the real-time distance to the obstacle ahead is computed. This assists the operator in controlling the train speed to complete the wagon coupling in rail transport.
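Tying the earlier sketches together, one working cycle of this process could look roughly like the loop below; process_frame and the sensor inputs are hypothetical names, the detections are assumed to carry 'rcs', 'range' and 'azimuth' fields, and the Kalman-filtered radar track is omitted for brevity.

```python
def process_frame(camera_frame, radar_detections):
    """One hypothetical cycle: preprocess, locate the rail region, fuse, report to the front end."""
    std_image = standardize_image(camera_frame)       # image standardization (S12 sketch)
    filtered = threshold_filter(radar_detections)     # radar gating (S124 sketch)
    _, bbox = crop_effective_region(std_image)        # rail detection and crop (S3 sketch)
    return wagon_distance(filtered, bbox)             # fused distance message (S4 sketch)
```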
Claims (10)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN201910689859.0A CN110412564A (en) | 2019-07-29 | 2019-07-29 | A train wagon recognition and ranging method based on multi-sensor fusion |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN110412564A true CN110412564A (en) | 2019-11-05 |
Family
ID=68363779
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN201910689859.0A Pending CN110412564A (en) | 2019-07-29 | 2019-07-29 | A train wagon recognition and ranging method based on multi-sensor fusion |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN110412564A (en) |
Cited By (1)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN112526997A (en) * | 2020-12-07 | 2021-03-19 | 中国第一汽车股份有限公司 | Automatic driving perception system and method and vehicle |
Citations (21)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| GB0509313D0 (en) * | 2004-06-17 | 2005-06-15 | Daimler Chrysler Ag | Method for hitching a trailer to a motor vehicle |
| US20080150786A1 (en) * | 1997-10-22 | 2008-06-26 | Intelligent Technologies International, Inc. | Combined Imaging and Distance Monitoring for Vehicular Applications |
| EP2484542A1 (en) * | 2011-02-08 | 2012-08-08 | Jost-Werke GmbH | System for automatic adjusting of a gap between a motorcar and a trailer coupled to it |
| US20130033597A1 (en) * | 2011-08-03 | 2013-02-07 | Samsung Electro-Mechanics Co., Ltd. | Camera system and method for recognizing distance using the same |
| CN202807928U (en) * | 2012-10-12 | 2013-03-20 | 中国神华能源股份有限公司 | Auxiliary arm of locating vehicle and locating vehicle |
| EP2581892A1 (en) * | 2011-10-13 | 2013-04-17 | Robert Bosch Gmbh | Distance measuring system and method for measuring distance, in particular between a vehicle and its surroundings |
| GB201307525D0 (en) * | 2013-04-26 | 2013-06-12 | Jaguar Land Rover Ltd | Vehicle hitch assistance system |
| US20140012465A1 (en) * | 2012-07-05 | 2014-01-09 | Uusi, Llc | Vehicle trailer connect system |
| CN205500347U (en) * | 2016-04-07 | 2016-08-24 | 莱芜钢铁集团有限公司 | Empty wagon machine of shunting of tipper and tipper |
| CN206203491U (en) * | 2016-08-31 | 2017-05-31 | 大唐环境产业集团股份有限公司 | A kind of novel car dumper transfer platform is to rail detection system |
| CN107609522A (en) * | 2017-09-19 | 2018-01-19 | 东华大学 | A kind of information fusion vehicle detecting system based on laser radar and machine vision |
| CN108536154A (en) * | 2018-05-14 | 2018-09-14 | 重庆师范大学 | Low speed automatic Pilot intelligent wheel chair construction method based on bioelectrical signals control |
| CN108596081A (en) * | 2018-04-23 | 2018-09-28 | 吉林大学 | A kind of traffic detection method merged based on radar and video camera |
| WO2018183546A1 (en) * | 2017-03-28 | 2018-10-04 | Sri International | Identification system for subject or activity identification using range and velocity data |
| CN108828606A (en) * | 2018-03-22 | 2018-11-16 | 中国科学院西安光学精密机械研究所 | Laser radar and binocular visible light camera-based combined measurement method |
| WO2018218486A1 (en) * | 2017-05-31 | 2018-12-06 | Beijing Didi Infinity Technology And Development Co., Ltd. | Devices and methods for recognizing driving behavior based on movement data |
| CN108960183A (en) * | 2018-07-19 | 2018-12-07 | 北京航空航天大学 | A kind of bend target identification system and method based on Multi-sensor Fusion |
| CN109143241A (en) * | 2018-07-26 | 2019-01-04 | 清华大学苏州汽车研究院(吴江) | The fusion method and system of radar data and image data |
| CN109471128A (en) * | 2018-08-30 | 2019-03-15 | 福瑞泰克智能系统有限公司 | A kind of positive sample production method and device |
| CN109472831A (en) * | 2018-11-19 | 2019-03-15 | 东南大学 | Obstacle identification and ranging system and method for road roller construction process |
| CA3028659A1 (en) * | 2017-12-11 | 2019-06-11 | Beijing Didi Infinity Technology And Development Co., Ltd. | Systems and methods for identifying and positioning objects around a vehicle |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN109298415A (en) | A kind of track and road barricade object detecting method | |
| CN102837658B (en) | Intelligent vehicle multi-laser-radar data integration system and method thereof | |
| CN104290753B (en) | A kind of vehicle motion state tracking prediction device in front of the vehicle and its Forecasting Methodology | |
| CN105799740B (en) | A kind of track foreign body intrusion automatic detection and method for early warning based on technology of Internet of things | |
| CN110471085B (en) | Track detecting system | |
| CN109552366B (en) | Intelligent detection and alarm system for locomotive-mounted railway obstacles and early warning method thereof | |
| CN209765730U (en) | vehicle type recognition system | |
| WO2021031469A1 (en) | Vehicle obstacle detection method and system | |
| CN105046968B (en) | A kind of motor bus identification and grasp shoot method, apparatus and system | |
| CN114228491A (en) | Head-up display system and method with night vision enhanced virtual reality | |
| CN107380163A (en) | Automobile intelligent alarm forecasting system and its method based on magnetic navigation | |
| CN113799852B (en) | Intelligent active obstacle identification protection method supporting dynamic mode switching | |
| CN101870293A (en) | Vehicle driving state evaluation method based on lane-cutting behavior detection | |
| CN202563344U (en) | Magnetic navigation and routing inspection robot | |
| CN114701543B (en) | High-precision equipment limit detection early warning system and method based on big data | |
| CN105083324A (en) | Method and device for locomotive anticollision using on-board optical detection mechanism combining with auxiliary image pickup | |
| CN118254779A (en) | A vehicle-road cooperative anti-collision system for underground mine tunnel environment | |
| CN110412564A (en) | A train wagon recognition and ranging method based on multi-sensor fusion | |
| CN204124125U (en) | A kind of front vehicles state of kinematic motion follows the trail of prediction unit | |
| CN115903779A (en) | Intelligent early warning system and method for shield tunnel electric locomotive | |
| CN119881864A (en) | Vehicle speed measuring system based on radar and machine vision | |
| CN113674211A (en) | A track quality monitoring and analysis system | |
| CN113232586A (en) | Infrared pedestrian projection display method and system for driving at night | |
| CN204978706U (en) | Utilize optical detection mechanism to track locomotive buffer stop of rail orbit | |
| WO2021046962A1 (en) | Detection system and detection method for obstacle between shielding door and vehicle body |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | RJ01 | Rejection of invention patent application after publication | Application publication date: 20191105 |