
WO2018109865A1 - Roadside machine and vehicle-to-road communication system - Google Patents

Roadside machine and vehicle-to-road communication system

Info

Publication number
WO2018109865A1
WO2018109865A1 PCT/JP2016/087222
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
detection result
unit
detection
road
Prior art date
Application number
PCT/JP2016/087222
Other languages
English (en)
Japanese (ja)
Inventor
栗田 明
元吉 克幸
平 明徳
Original Assignee
Mitsubishi Electric Corporation (三菱電機株式会社)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mitsubishi Electric Corporation (三菱電機株式会社)
Priority to PCT/JP2016/087222 priority Critical patent/WO2018109865A1/fr
Publication of WO2018109865A1 publication Critical patent/WO2018109865A1/fr

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/09 Arrangements for giving variable traffic instructions

Definitions

  • the present invention relates to a roadside device capable of communicating with an in-vehicle device mounted on a vehicle existing in a communication area, and a road-to-vehicle communication system including the in-vehicle device and the roadside device.
  • in vehicles, detection means such as cameras are used to detect surrounding objects such as nearby vehicles, pedestrians, and fallen objects.
  • Patent Document 1 describes a vehicle equipped with a camera that recognizes the environment ahead of the vehicle. This camera is attached to the ceiling in front of the vehicle interior so that an object outside the vehicle can be imaged.
  • the present invention has been made in view of the above, and an object thereof is to obtain a roadside machine capable of improving the detection accuracy of surrounding objects.
  • the present invention includes a peripheral object detection unit that detects an object existing in the vicinity, a first reception unit that receives a road-to-vehicle radio signal transmitted from the in-vehicle device, and a detection result integration unit that integrates the detection result of the peripheral object detection unit with the detection result, received from the in-vehicle device using the road-to-vehicle radio signal, of an object present in the vicinity of the in-vehicle device.
  • the roadside machine according to the present invention has an effect of improving the accuracy of detection results of surrounding objects.
  • FIG. 1 is a schematic configuration diagram of a road-to-vehicle communication system according to the first embodiment of the present invention.
  • FIG. 2 is a configuration diagram of the roadside machine according to the first embodiment, and FIG. 3 is a diagram showing an arrangement example.
  • FIG. 5 is a flowchart showing the detection operation for peripheral objects of the roadside machine according to the first embodiment.
  • FIG. 9 is a flowchart showing a first example of the state parameter integration processing of the first embodiment, and FIG. 10 is a flowchart showing a second example of that processing.
  • FIG. 12 is a configuration diagram of the roadside machine according to the second embodiment, and FIG. 13 is a flowchart showing the detection operation for surrounding objects of the roadside machine according to the second embodiment.
  • A configuration diagram of the roadside machine according to the third embodiment and a flowchart showing its detection operation for surrounding objects are also provided.
  • the road-to-vehicle communication system 100 includes a roadside machine 1A, a roadside machine 1B, and an in-vehicle device 2.
  • a plurality of constituent elements having similar functions are given reference numerals in which different letters follow a common numeral. When the components need not be distinguished, only the common numeral is used; when each of the components is to be distinguished, the numeral followed by a different letter is used.
  • the roadside machine 1A and the roadside machine 1B can be referred to as a plurality of roadside machines 1.
  • one in-vehicle device 2 is shown for simplicity, but the road-to-vehicle communication system 100 may include a plurality of in-vehicle devices 2.
  • the number of roadside machines 1 is not limited to two, and the road-to-vehicle communication system 100 can include a plurality of roadside machines 1.
  • the roadside machine 1 is a communication device that is fixedly installed at a predetermined location.
  • the roadside machine 1 is installed in the vicinity of a place where vehicles such as a roadway or a parking lot can pass, for example.
  • the roadside device 1 can wirelessly communicate with the in-vehicle device 2 existing in the communication area.
  • the communication area is an area predetermined for each roadside device 1 and is a range in which each roadside device 1 covers communication or an area where a radio signal transmitted by the roadside device 1 can be received. Further, the roadside machine 1 can communicate with other roadside machines 1.
  • communication between the roadside machine 1 and the in-vehicle device 2 is referred to as road-to-vehicle communication, and a signal transmitted and received using road-to-vehicle communication is referred to as a road-to-vehicle signal.
  • communication between one roadside machine 1 and another roadside machine 1 is referred to as road-to-road communication, and a signal transmitted and received using road-to-road communication is referred to as a road-to-road signal.
  • the roadside machine 1 has a function of detecting a peripheral object 4 existing around the roadside machine 1.
  • the peripheral object 4 is any object including a vehicle, a fallen object, a pedestrian, and the like.
  • the in-vehicle device 2 is a communication device mounted on the vehicle 3.
  • the vehicle-mounted device 2 can communicate with the roadside device 1 when the vehicle-mounted device 2 exists in the communication area of the roadside device 1.
  • the in-vehicle device 2 has a function of detecting a peripheral object 4 existing around the vehicle 3 in which the in-vehicle device 2 is mounted.
  • the roadside device 1 and the in-vehicle device 2 each detect the surrounding object 4. The in-vehicle device 2 then notifies the roadside machine 1 of the detection result of the surrounding object 4 using road-to-vehicle communication.
  • the roadside device 1 integrates the detection result of the peripheral object 4 detected by itself and the detection result notified from the in-vehicle device 2.
  • the roadside device 1 may acquire detection results from other roadside devices 1 using road-to-road communication, and may further integrate the acquired detection results.
  • in the following, the roadside device 1, which integrates the detection results collected from the in-vehicle device 2 and the other roadside devices 1 with the detection result of the peripheral object 4 detected by itself, will be described.
  • FIG. 2 is a configuration diagram of the roadside machine according to the first embodiment.
  • the roadside machine 1 includes a peripheral object detection unit 11 and a first acquisition unit 12.
  • the first acquisition unit 12 includes a first detection accuracy output unit 13 and a first state parameter output unit 14.
  • the roadside machine 1 further includes a circulator 21, a first reception unit 22, and a second acquisition unit 23.
  • the second acquisition unit 23 includes a second detection accuracy output unit 24 and a second state parameter output unit 25.
  • the roadside machine 1 further includes a second reception unit 31 and a third acquisition unit 32.
  • the third acquisition unit 32 includes a third detection accuracy output unit 33 and a third state parameter output unit 34.
  • the roadside machine 1 further includes a detection result integration unit 41.
  • the detection result integration unit 41 includes a detection accuracy integration unit 42 and a state parameter integration unit 43.
  • the roadside device 1 further includes a peripheral object list generation unit 51, a transmission frame generation unit 52, and a first transmission unit 61.
  • the peripheral object detection unit 11 is connected to the peripheral object detection sensor 10.
  • the peripheral object detection unit 11 detects an object existing around the roadside machine 1 using the peripheral object detection sensor 10.
  • the peripheral object detection unit 11 outputs detection information related to the detected peripheral object to the first acquisition unit 12.
  • the peripheral object detection sensor 10 is, for example, an image pickup device or a radar.
  • when the peripheral object detection sensor 10 is an imaging device, the peripheral object detection unit 11 detects the peripheral object 4 using a technique such as pattern recognition on an image acquired by the imaging device. In this case, the peripheral object detection unit 11 outputs an image including the detected peripheral object 4 to the first acquisition unit 12 as the detection result.
  • when the peripheral object detection sensor 10 is a radar, the peripheral object detection unit 11 detects the peripheral object 4 based on the radar signal reflected by the peripheral object 4 after emission, and outputs the detection to the first acquisition unit 12 as the detection result.
  • the first acquisition unit 12 processes the detection result output by the peripheral object detection unit 11, acquires the processed detection result, and outputs the acquired detection result to the detection result integration unit 41.
  • the first detection accuracy output unit 13 outputs the detection accuracy of the peripheral object 4 to the detection result integration unit 41 as the detection result after processing based on the detection result output by the peripheral object detection unit 11.
  • the detection accuracy varies depending on the performance of the sensor that detects the peripheral object 4, the state of the sensor, the positional relationship between the target peripheral object 4 and the sensor, and the like. For example, when the sensor is an imaging device, the detection accuracy is expected to drop significantly when direct sunlight strikes the lens. Therefore, the first acquisition unit 12 may calculate the detection accuracy value depending on the time of day of the detection and the weather at that time.
  • the first acquisition unit 12 can detect the distance between the target peripheral object 4 and the sensor and calculate the detection accuracy according to this distance.
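As one illustration of distance-dependent accuracy, a minimal sketch follows. The linear fall-off model, the 100 m maximum range, and the 95% base accuracy are assumptions for illustration, not values from the patent:

```python
def detection_accuracy(distance_m, max_range_m=100.0, base_accuracy=95.0):
    """Hypothetical accuracy model: accuracy falls off linearly with the
    distance between the sensor and the target peripheral object,
    reaching zero at the sensor's maximum range."""
    if distance_m >= max_range_m:
        return 0.0
    return base_accuracy * (1.0 - distance_m / max_range_m)
```

An object at half the sensor's range would then be reported with half the base accuracy, e.g. `detection_accuracy(50.0)` yields 47.5.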
  • the first state parameter output unit 14 obtains a state parameter, which is a parameter indicating the state of the peripheral object 4, based on the detection result output by the peripheral object detection unit 11, and outputs the obtained state parameter to the detection result integration unit 41 as the processed detection result.
  • the state parameter is a parameter indicating the state of the peripheral object 4 and is, for example, the position, size, moving speed, and attribute of the peripheral object 4.
  • the first state parameter output unit 14 can hold a history of detection results output by the surrounding object detection unit 11 and can obtain a state parameter based on the history of detection results.
  • for example, the first state parameter output unit 14 can obtain the moving speed of the peripheral object 4 from a plurality of images including the peripheral object 4, based on the distance moved and the time taken for the movement.
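A minimal sketch of that speed estimate, assuming a detection history of `(timestamp_s, x_m, y_m)` samples (this tuple layout is an illustrative assumption):

```python
import math

def moving_speed(history):
    """Estimate the moving speed (m/s) of a peripheral object from a
    detection history, as the first state parameter output unit might:
    distance moved divided by the time taken for the movement."""
    (t0, x0, y0), (t1, x1, y1) = history[0], history[-1]
    return math.hypot(x1 - x0, y1 - y0) / (t1 - t0)
```

For instance, an object moving 10 m in 2 s between the first and last detections gives 5 m/s.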
  • the attribute is information indicating the type of the peripheral object 4 such as a normal vehicle, a large vehicle, a two-wheeled vehicle, and other objects.
  • the circulator 21 is connected to the road-vehicle communication antenna 20 and demultiplexes the transmission signal and the reception signal.
  • the first receiving unit 22 receives the road-to-vehicle signal transmitted by the in-vehicle device 2 via the road-to-vehicle communication antenna 20 and the circulator 21.
  • the first reception unit 22 receives the detection result of the peripheral object 4 existing around the vehicle-mounted device 2 from the vehicle-mounted device 2 using the road-to-vehicle signal, and outputs the received detection result to the second acquisition unit 23.
  • the second acquisition unit 23 processes the detection result received from the in-vehicle device 2 and outputs the detection result after processing to the detection result integration unit 41.
  • the second detection accuracy output unit 24 extracts the detection accuracy of each peripheral object 4 included in the detection result received from the in-vehicle device 2, and outputs the extracted detection accuracy to the detection result integration unit 41 as the processed detection result.
  • the second state parameter output unit 25 extracts the state parameters of each peripheral object 4 included in the detection result received from the in-vehicle device 2 and outputs the extracted state parameters to the detection result integration unit 41 as detection results after processing. Output.
  • the second reception unit 31 is connected to the roadside communication antenna 30 and receives a road-to-road signal transmitted by another roadside machine 1.
  • the second reception unit 31 receives the detection result of the peripheral object 4 existing in the vicinity of the other roadside machine 1 from the other roadside machine 1 using the road-to-road signal, and outputs the received detection result to the third acquisition unit 32.
  • the third acquisition unit 32 processes the detection result received from the other roadside machine 1 and outputs the processed detection result to the detection result integration unit 41.
  • specifically, the third detection accuracy output unit 33 extracts the detection accuracy of each peripheral object 4 included in the detection result received from the other roadside machine 1, and outputs the extracted detection accuracy to the detection result integration unit 41.
  • the third state parameter output unit 34 extracts the state parameters of each peripheral object 4 included in the detection results received from the other roadside devices 1, and outputs the extracted state parameters to the detection result integration unit 41 as detection results after processing. Output.
  • the detection result integration unit 41 integrates the detection result output from the first acquisition unit 12, the detection result output from the second acquisition unit 23, and the detection result output from the third acquisition unit 32 to generate an integrated detection result. Specifically, the detection result integration unit 41 extracts and groups the detection results of the same peripheral object 4 from the plurality of detection results output from the first acquisition unit 12, the second acquisition unit 23, and the third acquisition unit 32.
  • the detection accuracy integration unit 42 integrates, for each group, the detection accuracies output from the first detection accuracy output unit 13, the second detection accuracy output unit 24, and the third detection accuracy output unit 33, and generates the post-integration detection accuracy of the peripheral object 4.
  • similarly, the state parameter integration unit 43 integrates, for each group, the state parameters output from the first state parameter output unit 14, the second state parameter output unit 25, and the third state parameter output unit 34, and generates the post-integration state parameter of the peripheral object 4. Specific processing of the detection result integration unit 41 will be described later.
  • the peripheral object list generation unit 51 generates a peripheral object list based on the integrated detection result generated by the detection result integration unit 41.
  • the peripheral object list includes detection accuracy and state parameters after integration for each peripheral object.
  • the transmission frame generation unit 52 generates a transmission frame to be transmitted to the in-vehicle device 2 based on the peripheral object list generated by the peripheral object list generation unit 51.
  • the transmission frame generation unit 52 outputs the generated transmission frame to the first transmission unit 61.
  • the first transmission unit 61 is connected to the road-vehicle communication antenna 20 via the circulator 21, and transmits the transmission frame output from the transmission frame generation unit 52 to the in-vehicle device 2.
  • FIG. 3 is a diagram showing an arrangement example of roadside units and vehicles.
  • the detection process of the peripheral object 4 performed by the roadside machine 1 will be described below using a specific example on the assumption of the state of FIG.
  • a roadside machine 1A and a roadside machine 1B are installed beside the roadway, and a fallen object 5A exists on the roadway.
  • the vehicle 3A travels in one lane
  • the vehicle 3B, the vehicle 3C, and the vehicle 3D travel in the other lane that is the opposite lane.
  • the roadway is left-hand traffic.
  • the peripheral objects 4 are a vehicle 3A, a vehicle 3B, a vehicle 3C, a vehicle 3D, and a fallen object 5A.
  • FIG. 4 is a diagram showing the states of the peripheral objects shown in FIG. 3.
  • the vehicle 3A exists at a position of 35.50000 degrees north latitude and 139.40000 degrees east longitude, and is a large vehicle with a body size of about 10 m × 6 m × 4 m.
  • the vehicle 3B exists at a position of 35.50000 degrees north latitude and 139.50000 degrees east longitude, and is a normal car with a body size of about 5 m × 3 m × 2 m.
  • the vehicle 3C exists at a position of 35.50000 degrees north latitude and 139.60000 degrees east longitude, and is a normal car with a body size of about 5 m × 3 m × 2 m.
  • the vehicle 3D exists at a position of 35.50000 degrees north latitude and 139.70000 degrees east longitude, and is a normal car with a body size of about 5 m × 3 m × 2 m.
  • the fallen object 5A exists at a position of 35.50000 degrees north latitude and 139.30000 degrees east longitude, and is an object with a size of about 2 m × 2 m × 1 m.
  • FIG. 5 is a flowchart of the detection operation of the surrounding objects of the roadside machine according to the first embodiment.
  • FIG. 5 shows a detection operation performed by the roadside machine 1A shown in FIG.
  • the roadside machine 1A acquires the detection result of the peripheral object 4 acquired by the peripheral object detection unit 11 (step S101).
  • the first acquisition unit 12 acquires the detection accuracy and state parameters of each peripheral object 4 from the detection result acquired by the peripheral object detection unit 11.
  • the first reception unit 22 receives, from the in-vehicle device 2, the detection result of the peripheral object 4 existing around the in-vehicle device 2 (step S102).
  • the second acquisition unit 23 receives the detection results acquired by each of the plurality of in-vehicle devices 2.
  • the second receiving unit 31 receives the detection result of the peripheral object 4 from the other roadside device 1B (step S103).
  • the roadside machine 1B is shown as the other roadside machine 1, but the roadside machine 1A may acquire the detection result of the peripheral object 4 from a plurality of other roadside machines 1.
  • FIG. 6 is a diagram showing an example of the detection results detected in the example shown in FIG. 3. FIG. 6 shows the detection results acquired by the roadside machine 1A through the processing shown in steps S101 to S103.
  • This detection result includes the detection result of the peripheral object 4 detected by each of the roadside machine 1A, the roadside machine 1B, the vehicle 3A, the vehicle 3B, and the vehicle 3C.
  • the detection result integration unit 41 of the roadside machine 1A integrates a plurality of received detection results (step S104). Specifically, the detection result integration unit 41 integrates the detection accuracy and state parameters included in the received plurality of detection results for each peripheral object 4.
  • examples of the detection accuracy integration method include a method in which the largest detection accuracy in the group is used as the post-integration detection accuracy, and a method in which arithmetic processing is performed using a plurality of detection accuracy and state parameter values.
  • the detection accuracy integration method may be selected in accordance with a state parameter integration method described later.
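A minimal sketch of the two integration rules named above. The "max" rule is as stated in the text; the arithmetic rule is shown here as a plain mean, which is only one possible computation since the exact arithmetic is left open:

```python
def integrate_accuracy(accuracies, method="max"):
    """Integrate the detection accuracies of one group of detection
    results: either take the largest value, or combine them
    arithmetically (a plain mean, as an illustrative assumption)."""
    if method == "max":
        return max(accuracies)
    return sum(accuracies) / len(accuracies)
```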
  • the peripheral object list generation unit 51 generates a peripheral object list including the detection results after the detection result integration unit 41 integrates (step S105).
  • FIG. 7 is a diagram illustrating a detection result after integrating the detection results illustrated in FIG. 6.
  • the “object” column indicates which object in FIG. 3 corresponds to each detection result after integration.
  • the detection result after integration shows a value of detection accuracy and a state parameter for each target peripheral object 4.
  • FIG. 8 is a diagram showing a transmission frame transmitted in the first embodiment.
  • a transmission frame 200 illustrated in FIG. 8 includes detection results of N peripheral objects.
  • the detection result includes a post-integration detection accuracy 201 and a post-integration state parameter 202.
  • the transmission frame 200 includes other data 203.
  • the transmission frame generation unit 52 outputs the generated transmission frame 200 to the first transmission unit 61.
  • the first transmission unit 61 transmits the transmission frame 200 generated by the transmission frame generation unit 52 to the in-vehicle device 2.
  • the vehicle 3 equipped with the vehicle-mounted device 2 that has received the transmission frame 200 can control automatic driving or driving assistance using the detection result included in the transmission frame 200.
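The frame layout described above might be modeled as follows. The class and field names are assumptions for illustration; the patent specifies only the logical contents (N per-object detection results, each with post-integration accuracy 201 and state parameters 202, plus other data 203):

```python
from dataclasses import dataclass


@dataclass
class DetectionResult:
    accuracy: float     # post-integration detection accuracy 201
    state_params: dict  # post-integration state parameters 202


@dataclass
class TransmissionFrame:
    results: list            # one DetectionResult per peripheral object (N total)
    other_data: bytes = b""  # other data 203
```

An in-vehicle device receiving such a frame would iterate over `results` to feed automatic driving or driving assistance.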
  • the state parameter integration method can be selected from a plurality of integration methods shown below depending on the type of the state parameter or the content of the detection result.
  • FIG. 9 is a flowchart showing a first example of state parameter integration processing.
  • the detection result integration unit 41 of the roadside machine 1A extracts the detection results of the same peripheral object 4 from the detection results acquired by a plurality of detection subjects as shown in FIG. 6, and groups them (step S108).
  • one method by which the detection result integration unit 41 extracts the detection results of the same peripheral object 4 is image analysis: using images including the peripheral object acquired by the peripheral object detection sensor 10, the detection result integration unit 41 can recognize the same peripheral object across a plurality of images.
  • the detection result integration unit 41 can extract the detection result of the same peripheral object 4 from the plurality of detection results based on the position information included in each detection result.
  • for example, taking errors into account, the detection result integration unit 41 can determine that peripheral objects 4 are the same peripheral object 4 when the difference in their position information is within a predetermined range. In the example of FIG. 6, the detection result integration unit 41 can determine from the positions included in the detection results that the 01st, 05th, and 07th detection results are detection results of the same peripheral object 4, and that the 02nd and 08th detection results are detection results of the same peripheral object 4. Furthermore, the detection result integration unit 41 can determine that the 03rd, 06th, and 10th detection results are detection results of the same peripheral object 4, and that the 04th, 09th, and 12th detection results are detection results of the same peripheral object 4. The detection result integration unit 41 can further determine that the 11th and 13th detection results are detection results of the same peripheral object 4.
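The position-based grouping of step S108 can be sketched greedily as follows. The tolerance value and the `lat`/`lon` dictionary keys are assumptions for illustration:

```python
def group_by_position(detections, tol_deg=0.0005):
    """Group detection results that refer to the same peripheral object:
    a result joins an existing group when its latitude and longitude
    each differ from the group's first member by at most tol_deg."""
    groups = []
    for det in detections:
        for group in groups:
            ref = group[0]
            if (abs(det["lat"] - ref["lat"]) <= tol_deg
                    and abs(det["lon"] - ref["lon"]) <= tol_deg):
                group.append(det)
                break
        else:  # no existing group matched: start a new one
            groups.append([det])
    return groups
```

Two detections a few metres apart fall into one group, while a detection 0.1 degrees away starts a new group.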
  • next, the detection result integration unit 41 weights and combines the state parameters using a weighting factor corresponding to the detection accuracy of the detection results within each group formed in step S108 (step S109). For example, assuming that the detection accuracy of detection means i is α_i % and the numerical value of the state parameter of detection means i is X_i, the numerical value X_total of the post-integration state parameter is expressed by the following formula, in which the weighting factor is the detection accuracy α_i:

    X_total = Σ_i (α_i × X_i) / Σ_i α_i
  • the state parameter integration process shown in FIG. 9 can be used when the state parameters are represented by numerical values.
  • FIG. 7 shows values obtained by applying the weighting factors to the sizes of the peripheral objects 4 in the group with detection result IDs (IDentification) 03, 06, and 10 in FIG. 6, rounded off. The size state parameter is indicated by numerical values of the length, width, and height of the object. The post-integration length is calculated as 4.12, the post-integration width as 2, and the post-integration height as 1.76.
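The weighting of step S109 can be sketched directly from the formula. The input numbers below are hypothetical and are not the values behind FIG. 7:

```python
def integrate_numeric(values, accuracies):
    """Weighted combination of a numeric state parameter:
    X_total = sum(a_i * X_i) / sum(a_i), using the detection accuracy
    a_i (in %) of each detection result as the weighting factor."""
    weighted_sum = sum(a * x for a, x in zip(accuracies, values))
    return weighted_sum / sum(accuracies)
```

For example, lengths 4.0 m (accuracy 90%) and 4.2 m (accuracy 60%) combine to (90·4.0 + 60·4.2) / 150 = 4.08 m.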
  • FIG. 10 is a flowchart showing a second example of state parameter integration processing.
  • first, the detection result integration unit 41 extracts the detection results of the same peripheral object 4 and groups them (step S108).
  • next, the detection result integration unit 41 determines whether the state parameters of two or more detection results match (step S110). When they match (step S110: Yes), the detection result integration unit 41 adopts the most frequent state parameter value as the post-integration state parameter (step S111). When no two state parameters match (step S110: No), the detection result integration unit 41 performs integration processing using other criteria (step S112).
  • for example, suppose the attribute values in a group are two of "other object" and one of "two-wheeled vehicle". In this case, the post-integration attribute value is "other object". If no two state parameters match, this integration method cannot be used, so integration processing using other criteria is applied. Also, because the accuracy of the post-integration state parameter may decrease when the number of detection results in the same group is small, this integration method may be used only when the number of detection results is equal to or greater than a predetermined number.
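A minimal sketch of this majority-vote rule (steps S110 to S112); returning `None` stands in for falling back to "integration using other criteria":

```python
from collections import Counter


def integrate_by_majority(attributes):
    """When two or more detection results agree, the most frequent
    value becomes the post-integration value; otherwise this method
    is not applicable and None signals the fallback path."""
    value, count = Counter(attributes).most_common(1)[0]
    return value if count >= 2 else None
```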
  • FIG. 11 is a flowchart showing a third example of state parameter integration processing.
  • first, the detection result integration unit 41 extracts the detection results of the same peripheral object 4 and groups them (step S108).
  • the detection result integration unit 41 selects the state parameter with the highest detection result detection accuracy within the group as the state parameter after integration (step S113).
  • for example, suppose the moving speed values in a group are 0 km/h, 1 km/h, and 20 km/h, and the detection accuracies corresponding to these state parameters are 95%, 80%, and 15%, respectively. In this case, 0 km/h, the value with the highest detection accuracy, is selected as the post-integration moving speed. This integration method can be used even when the state parameter is a non-numerical value.
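This third rule (step S113) reduces to selecting by accuracy, which also works for non-numeric parameters:

```python
def integrate_by_best_accuracy(params, accuracies):
    """Select the state parameter of the detection result with the
    highest detection accuracy within the group."""
    best_index = max(range(len(params)), key=lambda i: accuracies[i])
    return params[best_index]
```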
  • one integration process selected for each state parameter may be used, or a plurality of integration processes may be used in combination. An integration process may be selected as appropriate depending on the number and contents of the acquired state parameters.
  • as described above, the detection result of the peripheral object 4 detected by the roadside device 1A and the detection result of the peripheral object 4 detected by the in-vehicle device 2 can be integrated.
  • as a result, the amount of information used to obtain the detection result increases, so the accuracy of the detection result can be improved.
  • in addition, since the roadside machine 1 may be able to detect a peripheral object 4 located in a blind spot of the vehicle 3, an improvement in the accuracy of the detection result can be expected. In particular, at locations with poor visibility such as intersections and curves, a peripheral object 4 that cannot be detected by the sensors mounted on the vehicle 3 is likely to be detected by the roadside machine 1.
  • the roadside machine 1A can further integrate the detection results acquired by the other roadside machine 1B. Since a plurality of roadside machines 1 are often installed at intervals, the other roadside machine 1B can detect peripheral objects 4 existing in a range different from that of the roadside machine 1A. For this reason, the integrated detection result includes the detection results of peripheral objects 4 existing over a wide range, and each vehicle 3 that has acquired the detection result can grasp vehicles 3, obstacles, traffic jams, the occurrence of accidents, and the like over a wide range.
  • FIG. 12 is a configuration diagram of a roadside machine according to the second embodiment.
  • the roadside machine 1 shown in FIG. 12 further includes a threshold setting unit 71, a congestion degree estimation unit 72, and a threshold generation unit 73 in addition to the configuration of the roadside machine 1 according to the first embodiment shown in FIG. 2.
  • differences from the roadside device 1 according to the first embodiment will be mainly described.
  • the threshold setting unit 71 sets, in the transmission frame generation unit 52, a transmission permission determination threshold, which is a threshold used by the in-vehicle device 2 to determine whether a detection result may be transmitted.
  • the in-vehicle device 2 determines whether or not the detection result can be transmitted based on the detection accuracy of the detection result.
  • the in-vehicle device 2 can suppress the amount of data transmitted and received by road-to-vehicle communication by extracting, from the plurality of detected detection results, only those whose detection accuracy is higher than the transmission permission determination threshold, and transmitting them.
  • The congestion degree estimation unit 72 estimates the congestion degree of road-to-vehicle communication and outputs the estimation result to the threshold generation unit 73. For example, on the assumption that the traffic amount of road-to-vehicle communication increases as the number of surrounding vehicles 3 increases, the congestion degree estimation unit 72 can estimate the congestion degree of road-to-vehicle communication based on the number of surrounding vehicles 3. In this case, the congestion degree estimation unit 72 can estimate the congestion degree based on the number of peripheral objects 4, among those detected by the peripheral object detection unit 11, whose state parameter attribute output by the first state parameter output unit 14 indicates a vehicle 3 such as a normal vehicle or a large vehicle.
  • Alternatively, the congestion degree estimation unit 72 may estimate the traffic from the in-vehicle devices 2 to the roadside device 1 by analyzing the road-to-vehicle signals received by the first receiving unit 22.
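The vehicle-count based estimation can be sketched as below. The attribute strings and the normalization constant are assumptions for illustration only; the embodiment does not specify them.

```python
# Sketch of congestion estimation: count peripheral objects whose state
# parameter attribute indicates a vehicle (normal or large vehicle), and
# normalize the count to a congestion degree in [0, 1]. The attribute
# strings and the constant full_load are illustrative assumptions.

VEHICLE_ATTRIBUTES = {"normal_vehicle", "large_vehicle"}

def estimate_congestion_degree(peripheral_objects, full_load=50):
    """Congestion degree in [0, 1] from the number of detected vehicles."""
    vehicles = [o for o in peripheral_objects
                if o["attribute"] in VEHICLE_ATTRIBUTES]
    return min(len(vehicles) / full_load, 1.0)

objects = ([{"attribute": "normal_vehicle"}] * 10
           + [{"attribute": "pedestrian"}] * 5)
print(estimate_congestion_degree(objects))  # 0.2
```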
  • The threshold generation unit 73 receives the congestion degree estimation result output from the congestion degree estimation unit 72, generates a transmission permission determination threshold based on the estimation result, and outputs the generated threshold to the threshold setting unit 71.
  • Specifically, the threshold generation unit 73 increases the value of the transmission permission determination threshold as the congestion degree becomes higher, and decreases it as the congestion degree becomes lower.
  • The threshold setting unit 71 thus sets the transmission permission determination threshold according to the congestion degree estimation result, so that the communication amount can be adjusted according to the congestion degree.
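The monotone rule above (higher congestion, higher threshold) can be sketched as a simple linear mapping. The linear form and the bounds (0.5 to 1.0) are assumptions; the embodiment only requires that the threshold increase with the congestion degree.

```python
# Sketch of threshold generation: map a congestion degree in [0, 1] to a
# transmission permission determination threshold, rising with congestion
# so that fewer detection results qualify for transmission when the
# channel is busy. The linear mapping and bounds are assumptions.

def generate_threshold(congestion_degree, low=0.5, high=1.0):
    """Map a congestion degree in [0, 1] to a threshold in [low, high]."""
    c = min(max(congestion_degree, 0.0), 1.0)  # clamp the input
    return low + (high - low) * c

print(generate_threshold(0.0))  # 0.5
print(generate_threshold(1.0))  # 1.0
```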
  • FIG. 13 is a flowchart showing the peripheral object detection operation of the roadside machine according to the second embodiment.
  • the roadside device 1 performs detection result integration processing (step S100).
  • the detection result integration process shown in step S100 corresponds to steps S101 to S105 shown in FIG.
  • Next, the congestion degree estimation unit 72 estimates the congestion degree of road-to-vehicle communication (step S201). Then, the threshold generation unit 73 generates a transmission permission determination threshold based on the estimated congestion degree (step S202).
  • the transmission frame generation unit 52 generates a transmission frame including the detection result after the integration by the detection result integration process shown in step S100 and a transmission permission determination threshold (step S203).
  • the transmission frame generation unit 52 outputs the generated transmission frame to the first transmission unit 61.
  • FIG. 14 is a diagram illustrating a transmission frame transmitted in the second embodiment.
  • the transmission frame 210 includes a transmission permission determination threshold value 204 in addition to the information included in the transmission frame 200 shown in FIG.
  • the first transmission unit 61 transmits the generated transmission frame 210 to the in-vehicle device 2 (step S204).
  • As described above, the in-vehicle device 2 is notified of the transmission permission determination threshold generated according to the congestion degree of road-to-vehicle communication.
  • Thereby, the roadside device 1 can control, on its own side, the transmission of detection results from the in-vehicle device 2 to the roadside device 1.
  • Further, since the transmission permission determination threshold is a threshold for the detection accuracy, the roadside device 1 can selectively acquire detection results with high detection accuracy and high importance even in a state where the traffic is suppressed.
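One roadside cycle of the second embodiment (steps S100 and S201 to S204 in FIG. 13) can be sketched end to end as follows. All field names and numeric constants are illustrative assumptions; only the step structure follows the flowchart.

```python
# Self-contained sketch of one roadside cycle of Embodiment 2: estimate
# the congestion degree (S201), generate the threshold (S202), build a
# transmission frame holding the integrated detection results and the
# threshold (S203, cf. FIG. 14), and return it for transmission (S204).

def roadside_cycle(integrated_results, peripheral_objects):
    vehicles = [o for o in peripheral_objects
                if o.get("attribute") in ("normal_vehicle", "large_vehicle")]
    congestion = min(len(vehicles) / 50, 1.0)   # step S201 (assumed scaling)
    threshold = 0.5 + 0.5 * congestion          # step S202 (assumed mapping)
    frame = {                                   # step S203
        "detection_results": integrated_results,
        "transmission_threshold": threshold,
    }
    return frame                                # handed to the transmitter (S204)

frame = roadside_cycle([{"object": "fallen_object"}],
                       [{"attribute": "normal_vehicle"}] * 25)
print(frame["transmission_threshold"])  # 0.75
```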
  • FIG. 15 is a configuration diagram of a roadside machine according to the third embodiment.
  • the roadside machine 1 shown in FIG. 15 includes a transmission unnecessary list generation unit 53 and a threshold setting unit 71 in addition to the configuration of the roadside machine 1 according to Embodiment 1 shown in FIG.
  • The transmission unnecessary list generation unit 53 uses the peripheral object list generated by the peripheral object list generation unit 51 based on the integrated detection results to generate a road-to-vehicle transmission unnecessary list, which is a list of peripheral objects 4 whose detection results no longer need to be transmitted between the in-vehicle device 2 and the roadside device 1.
  • The transmission unnecessary list generation unit 53 can identify, based on the detection accuracy, a peripheral object 4 whose detection result no longer needs to be transmitted between the in-vehicle device 2 and the roadside device 1. For example, the transmission unnecessary list generation unit 53 can generate the road-to-vehicle transmission unnecessary list by extracting, from the peripheral object list, peripheral objects whose detection accuracy is equal to or higher than a predetermined value and whose moving speed is 0.
  • the transmission unnecessary list generating unit 53 outputs the generated road-to-vehicle transmission unnecessary list to the transmission frame generating unit 52.
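The example rule above (accuracy at or above a predetermined value, moving speed 0) can be sketched as a filter over the peripheral object list. The field names and the accuracy bound are assumptions introduced here.

```python
# Sketch of road-to-vehicle transmission unnecessary list generation: an
# accurately detected stationary object (accuracy >= bound and speed 0)
# no longer needs its detection result transmitted. The field names and
# the value of ACCURACY_BOUND are illustrative assumptions.

ACCURACY_BOUND = 0.8

def build_unnecessary_list(peripheral_object_list):
    """Return the ids of objects that need no further road-to-vehicle transmission."""
    return [o["id"] for o in peripheral_object_list
            if o["accuracy"] >= ACCURACY_BOUND and o["speed"] == 0]

peripheral_object_list = [
    {"id": 1, "accuracy": 0.9, "speed": 0},    # stationary and accurate -> listed
    {"id": 2, "accuracy": 0.9, "speed": 40},   # moving -> still transmitted
    {"id": 3, "accuracy": 0.5, "speed": 0},    # not accurate enough -> still transmitted
]
print(build_unnecessary_list(peripheral_object_list))  # [1]
```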
  • The threshold setting unit 71 sets a predetermined transmission permission determination threshold in the transmission frame generation unit 52.
  • FIG. 16 is a flowchart showing the peripheral object detection operation of the roadside machine according to the third embodiment.
  • the detection result integration unit 41 of the roadside machine 1 performs detection result integration processing (step S100).
  • Next, the threshold setting unit 71 sets a predetermined transmission permission determination threshold in the transmission frame generation unit 52 (step S301).
  • Then, the transmission frame generation unit 52 generates a transmission frame including the integrated detection results of the peripheral objects 4 and the transmission permission determination threshold (step S302).
  • the transmission unnecessary list generating unit 53 generates a road-to-vehicle transmission unnecessary list from the peripheral object list (step S303).
  • the generated road-to-vehicle transmission unnecessary list is output to the transmission frame generation unit 52.
  • the transmission frame generation unit 52 turns on the road-to-vehicle transmission unnecessary flag corresponding to the detection result of the peripheral object 4 included in the road-to-vehicle transmission unnecessary list (step S304).
  • the transmission frame generation unit 52 outputs the generated transmission frame to the first transmission unit 61.
  • The first transmission unit 61 transmits the transmission frame output from the transmission frame generation unit 52 to the in-vehicle device 2.
  • FIG. 17 is a diagram showing a transmission frame transmitted in the third embodiment.
  • a transmission frame 220 illustrated in FIG. 17 includes a road-to-vehicle transmission unnecessary flag 205 for each peripheral object 4 in addition to the information included in the transmission frame 210 illustrated in FIG.
  • In the transmission frame 220, the road-to-vehicle transmission unnecessary flag corresponding to a peripheral object 4 determined not to require road-to-vehicle transmission is turned ON.
  • Upon receiving the transmission frame 220, the vehicle-mounted device 2 can exclude the detection results of the peripheral objects 4 whose road-to-vehicle transmission unnecessary flag is ON from the transmission target.
  • stationary object detection information that has already been detected with sufficient accuracy is not transmitted from the in-vehicle device 2 to the roadside device 1.
  • As a result, the detection performance for the peripheral objects 4 is maintained while the amount of communication from the in-vehicle device 2 to the roadside machine 1 is suppressed.
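The in-vehicle side of this mechanism can be sketched as follows. The frame layout and field names are assumptions; the embodiment only specifies that a per-object road-to-vehicle transmission unnecessary flag is carried in the frame (cf. FIG. 17).

```python
# Sketch of the in-vehicle filtering of Embodiment 3: detection results
# whose road-to-vehicle transmission unnecessary flag is ON in the
# received frame are excluded from the transmission target. The frame
# layout and field names are illustrative assumptions.

def filter_by_unnecessary_flag(own_detections, received_frame):
    """Drop detections the roadside device marked as needing no transmission."""
    flagged = {e["object_id"] for e in received_frame["entries"]
               if e["transmission_unnecessary"]}
    return [d for d in own_detections if d["object_id"] not in flagged]

received_frame = {"entries": [
    {"object_id": 1, "transmission_unnecessary": True},   # e.g. known stationary object
    {"object_id": 2, "transmission_unnecessary": False},
]}
own = [{"object_id": 1}, {"object_id": 2}, {"object_id": 3}]
print([d["object_id"] for d in filter_by_unnecessary_flag(own, received_frame)])  # [2, 3]
```

Object 3 is kept because the roadside device has no entry for it, so its detection result is still worth transmitting.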
  • FIG. 18 is a diagram illustrating a hardware configuration of the roadside machine according to the first to third embodiments.
  • Each function of the roadside machine 1 can be realized by using the processor 81 and the memory 82.
  • the processor 81 and the memory 82 are connected by a system bus 83.
  • the processor 81 can realize each function of the roadside machine 1 by reading and executing the computer program stored in the memory 82.
  • the memory 82 stores a computer program executed by the processor 81 and information used in accordance with the execution of the computer program.
  • The third acquisition unit 32, the detection result integration unit 41, the peripheral object list generation unit 51, the transmission frame generation unit 52, the transmission unnecessary list generation unit 53, the congestion degree estimation unit 72, and the threshold generation unit 73 are realized by the processor 81 reading out and executing the respective operation programs stored in the memory 82.
  • the threshold setting unit 71 is realized by the memory 82.
  • In FIG. 18, one processor 81 and one memory 82 are shown; however, the present invention is not limited to such an example, and the functions of the roadside machine 1 may be realized by a plurality of processors 81 and a plurality of memories 82 operating in cooperation.
  • The configuration described in the above embodiments shows an example of the contents of the present invention; it can be combined with other known techniques, and a part of the configuration can be omitted or changed without departing from the gist of the present invention.
  • FIG. 2 shows one peripheral object detection sensor 10 for simplicity, but the present invention is not limited to such an example.
  • the roadside machine 1 may include a plurality of surrounding object detection sensors 10.
  • the plurality of peripheral object detection sensors 10 may be the same type of sensors, or may be a plurality of types of sensors.
  • Although the road-to-vehicle communication antenna 20 and the road-to-road communication antenna 30 are described above as different antennas, when the two communications are performed in close frequency bands, a common antenna can be used for both road-to-vehicle communication and road-to-road communication.
  • Although the road-to-road communication described above is wireless communication, the present invention is not limited to such an example.
  • the road-to-road communication may be performed via a wired network such as an optical fiber.
  • In the above description, the peripheral object 4 is an object that is highly likely to move over time, such as the vehicle 3, the fallen object 5, or a pedestrian; however, the peripheral object 4 may be a fixed object. For example, in order to realize an automatic driving system, detailed static map information of roads is required. Since this map information is required to be detailed and accurate, it is desirable to use information on the peripheral objects 4 collected from vehicles 3 that are actually running.
  • the present invention may be used to detect information on surrounding objects for use as such map information.


Abstract

The present invention comprises: a peripheral object detection unit (11) that detects an object present in the surroundings; a first reception unit (22) that receives a vehicle-to-road signal transmitted from an on-board device; and a detection result integration unit (41) that integrates a detection result of an object detected by the peripheral object detection unit (11) with a detection result, received from the on-board device via the vehicle-to-road signal, of an object present in the surroundings of the on-board device.
PCT/JP2016/087222 2016-12-14 2016-12-14 Roadside machine and road-to-vehicle communication system WO2018109865A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2016/087222 WO2018109865A1 (fr) 2016-12-14 2016-12-14 Roadside machine and road-to-vehicle communication system


Publications (1)

Publication Number Publication Date
WO2018109865A1 true WO2018109865A1 (fr) 2018-06-21

Family

ID=62559577




Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10105880A (ja) * 1996-09-30 1998-04-24 Hitachi Ltd Mobile body control system
JP2012256162A (ja) * 2011-06-08 2012-12-27 Sumitomo Electric Industries, Ltd. Roadside communication device, wireless communication system, wireless signal reception method, and computer program
JP2016110608A (ja) * 2014-12-01 2016-06-20 Sumitomo Electric Industries, Ltd. Roadside communication device, communication system, and data relay method
JP2016167202A (ja) * 2015-03-10 2016-09-15 Sumitomo Electric Industries, Ltd. Roadside communication device, data relay method, central device, computer program, and data processing method
JP2016167199A (ja) * 2015-03-10 2016-09-15 Sumitomo Electric Industries, Ltd. Roadside communication device and data relay method


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2019176299A (ja) * 2018-03-28 2019-10-10 Sumitomo Electric Industries, Ltd. Environment detection device, environment detection system, environment detection method, and computer program
JP7069944B2 (ja) 2018-03-28 2022-05-18 Sumitomo Electric Industries, Ltd. Environment detection device, environment detection system, environment detection method, and computer program
JP2021149163A (ja) * 2020-03-16 2021-09-27 Soken, Inc. Traffic system
JP2021149162A (ja) * 2020-03-16 2021-09-27 Soken, Inc. Traffic system
JP7484258B2 (ja) 2020-03-16 2024-05-16 Denso Corporation Traffic system
JP2021174064A (ja) * 2020-04-20 2021-11-01 Soken, Inc. Traffic system
WO2022208570A1 (fr) * 2021-03-29 2022-10-06 NEC Corporation In-vehicle device, control server, measurement data collection method, and program recording medium
JPWO2022208570A1 (fr) * 2021-03-29 2022-10-06
JP7658422B2 (ja) 2021-03-29 2025-04-08 NEC Corporation In-vehicle device, control server, measurement data collection method, and program
CN113682307A (zh) * 2021-08-06 2021-11-23 Nanjing Desay SV Automotive Electronics Co., Ltd. Visual lane-change assistance method and system
CN113682307B (zh) * 2021-08-06 2023-09-12 Nanjing Desay SV Automotive Electronics Co., Ltd. Visual lane-change assistance method and system


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16923900

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16923900

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP