WO2018180247A1 - Output device, control method, program, and storage medium - Google Patents
Output device, control method, program, and storage medium
- Publication number
- WO2018180247A1 (PCT/JP2018/008346)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- accuracy
- information
- road paint
- vehicle
- road
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0212—Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/28—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network with correlation of data from several navigational instruments
- G01C21/30—Map- or contour-matching
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
- B60W30/18—Propelling the vehicle
- B60W30/18009—Propelling the vehicle related to particular drive situations
- B60W30/18163—Lane change; Overtaking manoeuvres
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W60/00—Drive control systems specially adapted for autonomous road vehicles
- B60W60/001—Planning or execution of driving tasks
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/3453—Special cost functions, i.e. other than distance or default speed limit of road segments
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/26—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 specially adapted for navigation in a road network
- G01C21/34—Route searching; Route guidance
- G01C21/36—Input/output arrangements for on-board computers
- G01C21/3602—Input other than that of destination using image analysis, e.g. detection of road signs, lanes, buildings, real preceding vehicles using a camera
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/20—Control system inputs
- G05D1/24—Arrangements for determining position or orientation
- G05D1/247—Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons
- G05D1/249—Arrangements for determining position or orientation using signals provided by artificial sources external to the vehicle, e.g. navigation beacons from positioning sensors located off-board the vehicle, e.g. from cameras
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/60—Intended control result
- G05D1/646—Following a predefined trajectory, e.g. a line marked on the floor or a flight path
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/408—Radar; Laser, e.g. lidar
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2552/00—Input parameters relating to infrastructure
- B60W2552/53—Road markings, e.g. lane marker or crosswalk
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/09—Arrangements for giving variable traffic instructions
- G08G1/0962—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages
- G08G1/09626—Arrangements for giving variable traffic instructions having an indicator mounted inside the vehicle, e.g. giving voice messages where the origin of the information is within the own vehicle, e.g. a local storage device, digital map
Definitions
- the present invention relates to a technique for controlling a vehicle.
- In Patent Document 1, the degree of fading of a white line on the road being traveled is determined based on the output of an external sensor, and a technique is disclosed for controlling the vehicle to change lanes when there is a lane in which a white line can be detected with higher accuracy than in the current lane.
- However, the state of the white line in another lane may not be accurately determined when other vehicles are nearby. In addition, the state of a white line outside the measurement range of the external sensor cannot be grasped.
- The present invention has been made to solve the above problems, and its main object is to provide an output device capable of suitably controlling a vehicle based on information on road paint.
- The invention according to claim 1 is an output device comprising a collation unit that collates a road paint detection result by a detection device with map information, and a unit that acquires accuracy information indicating the accuracy of the collation for each road paint.
- The invention according to claim 12 is a control method executed by the output device, including a collation step of collating a road paint detection result by a detection device with map information, and a step of acquiring accuracy information indicating the accuracy of the collation for each road paint.
- The invention according to claim 13 is a program executed by a computer, the program causing the computer to function as: a collation unit that collates a road paint detection result by a detection device with map information; a first acquisition unit that acquires accuracy information indicating the accuracy of the collation for each road paint; and an output unit that outputs control information for controlling the moving body so that the collation can be performed with an accuracy of a predetermined value or more based on the accuracy information.
- The output device includes a collation unit that collates a road paint detection result by the detection device with map information, a first acquisition unit that acquires accuracy information indicating the accuracy of the collation for each road paint, and an output unit that outputs control information for controlling the moving body so that the collation can be performed with an accuracy of a predetermined value or more based on the accuracy information.
- Because the output device can obtain the accuracy information for each road paint used when the road paint detection result of the detection device is collated with the map information, it can suitably control the moving body so that the collation is performed with an accuracy of a predetermined value or more.
- In one aspect, the output unit identifies, from the road paint provided on the path of the moving body, low-accuracy road paint for which the accuracy of the collation is less than the predetermined value, and outputs the control information for controlling the moving body so as to keep the moving body away from the low-accuracy road paint.
- the output device can suitably move the vehicle so as not to detect road paint having a collation accuracy less than a predetermined value.
- the output unit searches for a route to a destination based on the accuracy information, and outputs information on the searched route as the control information.
- the output device can determine the route to travel in consideration of the accuracy of the comparison between the road paint detection result and the map information.
- the output unit outputs the control information for causing the display unit to display a path that can be collated with an accuracy equal to or higher than the predetermined value as a recommended path.
- the output device can suitably present to the user, as a recommended route, a route that can execute the comparison between the road paint detection result and the map information with an accuracy of a predetermined value or more.
- In one aspect, the output unit identifies, from the road paint provided on the path of the moving body, road paint for which the accuracy of the collation is equal to or higher than the predetermined value, and outputs the control information for controlling the moving body so as to bring the moving body closer to that road paint.
- According to this aspect, the output device can suitably move the vehicle closer to road paint whose collation accuracy is equal to or higher than the predetermined value, thereby enabling collation with higher accuracy.
- In one aspect, the output device includes a detection unit that compares the position estimation error in each of a first direction and a second direction relative to the traveling direction of the moving body with a predetermined threshold value and detects a direction in which the position estimation error is larger than the threshold value, and a second acquisition unit that acquires, for each road paint, suitability information indicating the degree of suitability of the road paint as a reference for estimating the position in the direction detected by the detection unit. The output unit specifies the high-accuracy road paint based on the accuracy information and the suitability information.
- the output device can suitably move the vehicle so that the vehicle approaches a road paint having a high degree of suitability as a reference for position estimation in the direction in which the position estimation error is determined to be larger than the threshold value.
- the output device further includes a position estimation unit that estimates the position of the moving body based on the result of the collation. According to this aspect, the output device can control the vehicle so as to maintain the position estimation accuracy at a predetermined level using the accuracy information of the collation.
- In one aspect, the output unit determines, based on the accuracy of the position estimation by the position estimation unit, whether or not it is necessary to control the moving body away from road paint whose collation accuracy is less than the predetermined value. According to this aspect, the output device can accurately determine whether or not the vehicle needs to be moved so as to avoid road paint whose collation accuracy is less than the predetermined value.
- In one aspect, when road paint whose collation accuracy is less than the predetermined value is included in the detection range of the detection device, the position estimation unit reduces the weight with which the collation result is reflected in the position estimation. According to this aspect, the output device can suitably mitigate the decrease in position estimation accuracy caused by collation results for road paint whose collation accuracy is less than the predetermined value.
- In one aspect, the road paint whose collation accuracy is indicated as less than the predetermined value in the accuracy information is a lane marking represented by a composite line. Lane markings represented by composite lines are likely to cause an error between the detection result of the detection device and the map information; therefore, such lane markings are preferably recorded in the accuracy information as road paint whose collation accuracy is less than the predetermined value.
- In one aspect, the road paint whose collation accuracy is indicated as less than the predetermined value in the accuracy information is road paint in which fading has occurred. Since the detection device detects faded road paint with reduced accuracy, an error is likely to occur between the detection result and the map information. Therefore, faded road paint is preferably recorded in the accuracy information as road paint whose collation accuracy is less than the predetermined value.
- In one aspect, a control method executed by an output device includes a collation step of collating a road paint detection result by a detection device with map information, and a step of acquiring accuracy information indicating the accuracy of the collation for each road paint. By executing this control method, the output device can suitably control the moving body so that the collation can be performed with an accuracy of a predetermined value or more.
- In one aspect, a computer-executable program causes a computer to function as a collation unit that collates a road paint detection result by a detection device with map information, a first acquisition unit that acquires accuracy information indicating the accuracy of the collation for each road paint, and an output unit that outputs control information for controlling the moving body so that the collation can be performed with an accuracy equal to or higher than a predetermined value based on the accuracy information.
- the output device can suitably control the moving body so that the above-described collation can be performed with an accuracy of a predetermined value or more.
- the program is stored in a storage medium.
- FIG. 1 is a schematic configuration diagram of a driving support system according to the present embodiment.
- The driving support system shown in FIG. 1 is mounted on a vehicle and includes an in-vehicle device 1 that performs control related to driving support of the vehicle, a lidar (Light Detection and Ranging, or Laser Illuminated Detection and Ranging) 2, a gyro sensor 3, a vehicle speed sensor 4, and a GPS receiver 5.
- The in-vehicle device 1 is electrically connected to the lidar 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5, and based on their outputs estimates the position of the vehicle on which the in-vehicle device 1 is mounted (also called the "own vehicle position"). Based on the estimated position, the in-vehicle device 1 performs automatic driving control of the vehicle.
- The in-vehicle device 1 stores a map database (DB) 10 that holds road data and information on features serving as landmarks provided along the road.
- the above-mentioned landmarks include, for example, road paints such as lane markings and road markings as well as solid objects such as kiloposts and signs periodically arranged along the road.
- The in-vehicle device 1 collates the output of the lidar 2 and other sensors with the information registered in the map DB 10 to estimate the own vehicle position.
- The lidar 2 emits pulsed laser light over a predetermined angular range in the horizontal and vertical directions, thereby discretely measuring the distance to objects in the outside world and generating three-dimensional point cloud information indicating the positions of those objects.
- The lidar 2 includes an irradiation unit that emits laser light while changing the irradiation direction, a light receiving unit that receives the reflected light (scattered light) of the irradiated laser light, and an output unit that outputs scan data (point cloud data) based on the light reception signal output by the light receiving unit. The scan data is generated based on the irradiation direction corresponding to the laser light received by the light receiving unit and the response delay time of the laser light specified from the light reception signal.
- The accuracy of the lidar distance measurement is higher the shorter the distance to the object, and lower the longer the distance. Since road paint has a reflectance different from that of other road surface areas, road paint point cloud data can be identified based on the level of the received light signal, which corresponds to the amount of reflected light. In the present embodiment, it is assumed that the lidar 2 is installed so as to scan at least the road surface of the road being traveled.
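The reflectance-based identification of road paint point cloud data described above can be sketched as follows. This is a minimal illustrative sketch, not part of the disclosed embodiment: the point format and the intensity threshold are assumptions.

```python
# Hypothetical sketch: identifying road-paint points in lidar scan data by the
# level of the received light signal (reflectance). The point format and the
# threshold value are assumptions for illustration only.

def extract_road_paint_points(points, intensity_threshold=0.6):
    """Return the scan points whose received-signal level exceeds the
    threshold; road paint reflects more strongly than bare road surface."""
    return [p for p in points if p["intensity"] > intensity_threshold]

scan = [
    {"x": 1.0, "y": 0.2, "intensity": 0.9},   # painted lane marking
    {"x": 1.5, "y": 0.3, "intensity": 0.2},   # bare asphalt
    {"x": 2.0, "y": -0.1, "intensity": 0.8},  # painted lane marking
]
paint = extract_road_paint_points(scan)  # keeps the two high-reflectance points
```

In practice the threshold would depend on the sensor and surface conditions; this only illustrates the separation principle stated above.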
- The lidar 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5 each supply output data to the in-vehicle device 1.
- the in-vehicle device 1 is an example of the “output device” in the present invention
- the lidar 2 is an example of the “detection device” in the present invention.
- FIG. 2 is a block diagram showing a functional configuration of the in-vehicle device 1.
- The in-vehicle device 1 mainly includes an interface 11, a storage unit 12, an input unit 14, a control unit 15, and an information output unit 16. These elements are connected to one another via a bus line.
- the interface 11 acquires output data from sensors such as the lidar 2, the gyro sensor 3, the vehicle speed sensor 4, and the GPS receiver 5, and supplies the output data to the control unit 15.
- the interface 11 supplies a signal related to the traveling control of the vehicle generated by the control unit 15 to an electronic control unit (ECU: Electronic Control Unit) of the vehicle.
- the signal transmitted from the control unit 15 to the electronic control device of the vehicle via the interface 11 is an example of “control information” in the present invention.
- the storage unit 12 stores a program executed by the control unit 15 and information necessary for the control unit 15 to execute a predetermined process.
- the storage unit 12 stores a map DB 10 including road paint information.
- FIG. 3 shows an example of the data structure of road paint information.
- the road paint information is information associated with road data of a road on which road paint is provided, for example.
- The road paint information includes identification information for identifying individual road paint, position information indicating the position of the road paint, information on the detection accuracy when the road paint is detected by an external sensor such as the lidar 2 (also referred to as "detection accuracy information Idet"), and information on the suitability of the road paint as a reference in the traveling direction of the vehicle and in the direction perpendicular to it (also referred to as the "lateral direction").
- The detection accuracy indicated by the detection accuracy information Idet represents the accuracy of matching between the position of the road paint specified based on the output of the lidar 2 and the position of the road paint registered in the map DB 10.
- Examples of road paint with low detection accuracy indicated by the detection accuracy information Idet include faded road paint that is difficult to detect, as well as road paint having an exceptional shape such as a composite line. As will be described later, the latter is road paint for which a shift is likely to occur between the position specified based on the output of the lidar 2 and the position specified based on the map DB 10.
- the detection accuracy information Idet may be flag information indicating whether or not the road paint has a low detection accuracy, or may be numerical information indicating the degree of detection accuracy stepwise. In the former case, the detection accuracy information Idet may be included only in the road paint information of road paint with low detection accuracy.
- the detection accuracy information Idet is not limited to information that directly indicates the detection accuracy, but may be information that indirectly indicates the detection accuracy. In the latter case, the detection accuracy information may be road paint type information indicating whether or not the road paint has a complex shape.
- The aptitude direction information Sdi is used when the in-vehicle device 1 compares the position of the road paint detected by the lidar 2 with the position information of the road paint registered in the map DB 10 to estimate the position of the vehicle.
- Because a solid lane marking extends continuously in the traveling direction of the vehicle, it is a good reference for estimating the vehicle position in the lateral direction, but it is not suitable as a reference for estimating the vehicle position in the traveling direction.
- Because a broken lane marking extends discontinuously in the traveling direction of the vehicle, it is inferior to a solid lane marking but still suitable as a reference for estimating the vehicle position in the lateral direction. In addition, since its end portions can serve as references for estimating the vehicle position in the traveling direction, it is also suitable as a reference in the traveling direction.
- the aptitude direction information Sdi stores information indicating the aptitude degree as a reference for estimating the own vehicle position in the traveling direction and the side direction of the vehicle for each road paint.
- the aptitude direction information Sdi may be flag information indicating whether or not the road paint has a low aptitude level, or may be numerical information indicating the aptitude level step by step.
- the suitability direction information Sdi may be included only in the road paint information of a road paint having a low suitability level.
- the aptitude direction information Sdi is not limited to information that directly indicates the aptitude level, but may be information that indirectly indicates the aptitude level.
- the appropriate direction information Sdi may be information indicating the type of road paint.
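One possible layout of a road paint information record combining the detection accuracy information Idet and the aptitude direction information Sdi is sketched below. The field names and the numeric scales are hypothetical assumptions, not taken from the embodiment (which also allows flag or type-based encodings, as noted above).

```python
# Hypothetical sketch of one road-paint record in the map DB 10. Field names
# and the 0.0-1.0 scales are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class RoadPaintInfo:
    paint_id: str        # identification information of the road paint
    position: tuple      # position information, e.g. (latitude, longitude)
    idet: float          # detection accuracy information Idet, 0.0 (low) to 1.0 (high)
    sdi_travel: float    # aptitude as a reference in the traveling direction
    sdi_lateral: float   # aptitude as a reference in the lateral direction

# A solid lane marking: strong lateral reference, weak traveling-direction
# reference, consistent with the description above.
solid_line = RoadPaintInfo("P001", (35.6, 139.7), idet=0.9,
                           sdi_travel=0.1, sdi_lateral=0.9)
```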
- the map DB 10 may be updated periodically.
- the control unit 15 receives partial map information related to the area to which the vehicle position belongs from a server device that manages the map information via a communication unit (not shown), and reflects it in the map DB 10.
- the input unit 14 is a button operated by the user, a touch panel, a remote controller, a voice input device, and the like, and receives an input for specifying a destination for route search, an input for specifying on / off of automatic driving, and the like.
- the information output unit 16 is, for example, a display or a speaker that outputs based on the control of the control unit 15.
- the information output unit 16 is an example of the “display unit” in the present invention.
- the control unit 15 includes a CPU that executes a program and controls the entire vehicle-mounted device 1.
- the control unit 15 includes a host vehicle position estimation unit 17 and an automatic driving control unit 18.
- The control unit 15 is an example of the "collation unit", "first acquisition unit", "second acquisition unit", "position estimation unit", "output unit", and "detection unit" according to the present invention, and of the "computer" that executes the program.
- The own vehicle position estimation unit 17 corrects the vehicle position estimated from the output data of the gyro sensor 3, the vehicle speed sensor 4, and/or the GPS receiver 5, based on the distance and angle measurements by the lidar 2 for features such as road paint and on the feature position information extracted from the map DB 10.
- Specifically, based on a state estimation method using Bayesian estimation, the vehicle position estimation unit 17 alternately executes a prediction step that estimates the vehicle position from the output data of the gyro sensor 3 and the vehicle speed sensor 4, and a measurement update step that corrects the estimated value of the vehicle position calculated in the prediction step.
- The automatic driving control unit 18 refers to the map DB 10 and transmits signals necessary for automatic driving control to the vehicle based on the set route and the own vehicle position estimated by the own vehicle position estimation unit 17. Based on the set route, the automatic driving control unit 18 sets a target track and transmits a guide signal that controls the position of the vehicle so that the vehicle position estimated by the own vehicle position estimation unit 17 stays within a predetermined width of the target track.
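The guide-signal logic of keeping the vehicle within a predetermined width of the target track could be sketched as follows. The width value, the correction gain, and the function names are hypothetical; the patent does not specify the control law.

```python
# Illustrative sketch (assumed, not from the embodiment): keep the estimated
# vehicle position within a predetermined width of the target track, issuing
# a lateral correction only when the deviation exceeds the allowed width.

def lateral_correction(track_y, vehicle_y, width=0.5, gain=0.8):
    """Return a lateral correction command (sign opposes the deviation) when
    the vehicle deviates from the target track by more than `width` meters;
    return 0.0 while the vehicle stays within the allowed band."""
    deviation = vehicle_y - track_y
    if abs(deviation) <= width:
        return 0.0
    return -gain * deviation

# Inside the band: no correction. Outside: steer back toward the track.
inside = lateral_correction(0.0, 0.3)   # 0.0
outside = lateral_correction(0.0, 1.0)  # negative, steering back
```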
- the own vehicle position estimation unit 17 sequentially repeats the prediction step and the measurement update step to estimate the own vehicle position.
- Various filters developed to perform Bayesian estimation can be used as the state estimation filter used in these steps, and examples thereof include an extended Kalman filter, an unscented Kalman filter, and a particle filter.
- various methods have been proposed for position estimation based on Bayesian estimation. In the following, vehicle position estimation using an extended Kalman filter will be briefly described as an example.
- FIG. 4 is a diagram showing the state variable vector x in two-dimensional orthogonal coordinates.
- The vehicle position on a plane defined in the two-dimensional x-y orthogonal coordinates is represented by the coordinates "(x, y)" and the azimuth "θ" of the vehicle.
- The azimuth θ is defined as the angle between the traveling direction of the vehicle and the x axis.
- the coordinates (x, y) indicate an absolute position corresponding to a combination of latitude and longitude, for example.
- FIG. 5 is a diagram illustrating a schematic relationship between the prediction step and the measurement update step.
- calculation and update of the estimated value of the state variable vector X are sequentially performed by repeating the prediction step and the measurement update step.
- The state variable vector at the reference time (i.e., the current time) "t" to be calculated is expressed as "X⁻ₜ" or "X̂ₜ" (state variable vector Xₜ = (xₜ, yₜ, θₜ)ᵀ).
- The provisional estimate calculated in the prediction step is marked with "⁻" (prior estimate), and the more accurate estimate updated in the measurement update step is marked with "^" (posterior estimate).
- In the prediction step, the vehicle position estimation unit 17 calculates the covariance matrix "Σ⁻ₜ" corresponding to the error distribution of the prior estimate X⁻ₜ from the covariance matrix "Σ̂ₜ₋₁" calculated at time t−1 in the immediately preceding measurement update step.
- In the measurement update step, the vehicle position estimation unit 17 associates the position vector of a feature registered in the map DB 10 with the scan data of the lidar 2. When this association succeeds, the unit obtains the measurement value "Zₜ" of the associated feature measured by the lidar 2, and the measurement estimate "Ẑₜ" of the feature obtained by modeling the measurement process of the lidar 2 using the prior estimate X⁻ₜ and the position vector of the feature registered in the map DB 10.
- The measurement value Zₜ is a two-dimensional vector representing the distance and scan angle of the feature measured by the lidar 2 at time t.
- The vehicle position estimation unit 17 multiplies the difference between the measurement value Zₜ and the measurement estimate Ẑₜ by the Kalman gain "Kₜ", adds the result to the prior estimate X⁻ₜ, and thereby calculates the updated state variable vector (also referred to as the "posterior estimate") X̂ₜ, as in equation (1).
- X̂ₜ = X⁻ₜ + Kₜ (Zₜ − Ẑₜ)   (1)
- Similarly to the prediction step, the vehicle position estimation unit 17 obtains the covariance matrix Σ̂ₜ corresponding to the error distribution of the posterior estimate X̂ₜ from the prior covariance matrix Σ⁻ₜ.
- Parameters such as the Kalman gain Kₜ can be calculated, for example, in the same manner as in known self-position estimation techniques using an extended Kalman filter.
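The measurement update of equation (1) can be illustrated numerically as follows. The gain matrix and the measurement values are arbitrary illustrative numbers, not derived from the embodiment; the state is (x, y, θ) as defined above and the measurement is the (distance, scan angle) pair Zₜ.

```python
# Minimal numeric sketch of the measurement update of equation (1):
# posterior = prior + K * (measured - predicted measurement).
# All numeric values below are illustrative assumptions.
import numpy as np

def measurement_update(x_prior, K, z, z_pred):
    """Correct the prior state estimate X-_t with the Kalman gain K_t applied
    to the innovation (Z_t - Zhat_t), per equation (1)."""
    return x_prior + K @ (z - z_pred)

x_prior = np.array([10.0, 5.0, 0.1])     # prior (x, y, theta)
K = np.array([[0.5, 0.0],
              [0.0, 0.5],
              [0.0, 0.0]])               # illustrative 3x2 Kalman gain
z = np.array([4.0, 0.2])                 # lidar distance/angle measurement Z_t
z_pred = np.array([4.4, 0.2])            # measurement estimate Zhat_t
x_post = measurement_update(x_prior, K, z, z_pred)  # -> [9.8, 5.0, 0.1]
```

Here the 0.4 m range innovation shifts the x estimate by 0.2 m; a real gain would come from Σ⁻ₜ, the measurement Jacobian, and the sensor noise covariance.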
- When the posterior estimate X̂ₜ is calculated by equation (1) based on road paint whose detection accuracy indicated by the detection accuracy information Idet is low, the difference between the measurement value Zₜ based on the output of the lidar 2 and the measurement estimate Ẑₜ obtained using the position vector of the feature registered in the map DB 10 becomes large. That is, in this case, the difference "Zₜ − Ẑₜ" multiplied by the Kalman gain Kₜ in equation (1) becomes large, and the estimation accuracy of the posterior estimate X̂ₜ obtained by equation (1) decreases.
- In view of this, the automatic driving control unit 18 controls the vehicle so as to avoid position estimation based on road paint with low detection accuracy indicated by the detection accuracy information Idet, thereby suitably suppressing a decrease in position estimation accuracy.
- a specific control method will be described in detail in the following sections [First Vehicle Control Based on Road Paint Information] and [Second Vehicle Control Based on Road Paint Information].
- FIG. 6 is a flowchart showing first vehicle control based on road paint information executed by the automatic driving control unit 18.
- In the first vehicle control, when road paint determined to have low detection accuracy based on the detection accuracy information Idet of the road paint information exists in the vicinity of the target track, the automatic driving control unit 18 corrects the target track that was set in advance. Note that when the flowchart of FIG. 6 is executed, it is assumed that the automatic driving control unit 18 has already set the target track of the vehicle along the route to the set destination.
- First, the automatic driving control unit 18 determines whether the current position estimation accuracy is worse than a predetermined value (step S100). For example, the automatic driving control unit 18 determines that the current position estimation accuracy is worse than the predetermined value when the major axis of the error ellipse, specified based on the error covariance matrix obtained in the calculation process of position estimation based on the extended Kalman filter, is longer than a predetermined length.
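The error-ellipse check of step S100 might be sketched as follows, assuming the planar error covariance is a 2×2 symmetric matrix whose largest eigenvalue has a closed form; the function name and the threshold value are illustrative assumptions:

```python
import math

def error_ellipse_major_axis(sxx, sxy, syy):
    """Length of the major semi-axis (1-sigma) of the error ellipse for a
    2x2 symmetric covariance matrix [[sxx, sxy], [sxy, syy]].
    The largest eigenvalue of a 2x2 symmetric matrix has a closed form:
    lambda_max = (sxx + syy)/2 + sqrt(((sxx - syy)/2)^2 + sxy^2)."""
    mean = (sxx + syy) / 2.0
    dev = math.sqrt(((sxx - syy) / 2.0) ** 2 + sxy ** 2)
    return math.sqrt(mean + dev)

# Step S100 sketch: accuracy is judged worse than the predetermined value
# when the major axis exceeds a predetermined length (0.5 m here is made up).
MAJOR_AXIS_LIMIT_M = 0.5
accuracy_is_poor = error_ellipse_major_axis(0.4, 0.1, 0.2) > MAJOR_AXIS_LIMIT_M
print(accuracy_is_poor)
```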
- Next, the automatic driving control unit 18 acquires, from the map DB 10, road paint information associated with the road data of the roads constituting the route to the destination (step S101). In this case, the automatic driving control unit 18 acquires, for example, road paint information corresponding to roads on the route within a predetermined distance from the current position from the map DB 10.
- Next, the automatic driving control unit 18 determines whether road paint whose detection accuracy indicated by the detection accuracy information Idet is lower than a predetermined threshold (also simply referred to as "low detection accuracy") exists in the vicinity of the target trajectory (for example, in the same lane as the target trajectory) (step S102).
- The above-described threshold is determined in advance, for example by experiment, in consideration of whether the position estimation accuracy of the host vehicle position estimation unit 17 would be reduced, and is stored in advance in the storage unit 12 or the like.
- When the automatic driving control unit 18 determines that road paint with low detection accuracy indicated by the detection accuracy information Idet exists in the vicinity of the target trajectory (step S102; Yes), it lowers the weight given to the result of collation with the map DB 10 when that result is reflected in the vehicle position estimation (step S103).
- For example, in a travel section in which road paint with low detection accuracy may fall within the measurement range of the lidar 2, the automatic driving control unit 18 multiplies the term "Kt(Zt − Z^t)" in Equation (1) by a predetermined coefficient less than 1.
- In this way, the automatic driving control unit 18 can suitably reduce the decrease in position estimation accuracy even when road paint with low detection accuracy falls within the measurement range of the lidar 2.
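The down-weighting of step S103 can be illustrated by scaling the correction term "Kt(Zt − Z^t)" in Equation (1) with a coefficient less than 1 (a scalar sketch; names and values are made up):

```python
def weighted_posterior_estimate(x_prior, kalman_gain, innovation, weight=1.0):
    """Equation (1) with the correction term Kt * (Zt - Z^t) scaled by a
    coefficient `weight`; weight < 1 reduces how strongly a potentially
    unreliable map collation result is reflected in the position estimate."""
    return x_prior + weight * kalman_gain * innovation

x_prior, gain, innovation = 10.0, 0.5, 2.0
full = weighted_posterior_estimate(x_prior, gain, innovation)                # 11.0
damped = weighted_posterior_estimate(x_prior, gain, innovation, weight=0.2)  # 10.2
# The damped update stays closer to the prior, so a large innovation caused
# by low-detection-accuracy road paint perturbs the estimate less.
print(full, damped)
```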
- Next, the automatic driving control unit 18 corrects the target trajectory of the vehicle so as to avoid road paint with low detection accuracy (step S104). Specifically, the automatic driving control unit 18 corrects the target trajectory so as to change lanes to a lane different from the lane in which the road paint with low detection accuracy is provided. In another example, when a lane marking with low detection accuracy is provided on one side of the one-lane road being traveled, the automatic driving control unit 18 corrects the target trajectory within the lane so that the traveling position is biased toward the lane marking on the opposite side. In this way, by controlling the vehicle away from road paint with low detection accuracy, the automatic driving control unit 18 can suitably avoid position estimation based on that road paint.
- FIG. 7 is an overhead view of a vehicle traveling on a two-lane road 50 on whose right side composite road paint with low detection accuracy exists. In the example of FIG. 7, a single lane marking 61 and a composite lane marking 62 exist, dividing the road 50 from the road 51 running in the opposite direction. Here, the single lane marking 61 is a lane marking composed only of a white line, and the composite lane marking 62 is a composite lane marking including an orange line and a white line; in the composite lane marking 62, comb-shaped white lines for emphasizing the orange line are provided on both sides of the orange line.
- a broken line “Lt” indicates a target track of the vehicle set by the automatic driving control unit 18.
- In this case, the host vehicle position estimation unit 17 calculates the barycentric coordinates, referenced to the vehicle, of the two-dimensional coordinates indicated by the point cloud data of the target feature.
- In the example of FIG. 7, the detection accuracy indicated by the detection accuracy information Idet of the road paint information corresponding to the composite lane marking 62 is set lower than the detection accuracy indicated by the detection accuracy information Idet of the road paint information corresponding to the single lane marking 61 and the other lane markings 63 and 64.
- The automatic driving control unit 18 refers to the road paint information associated with the road 50 on which the vehicle is traveling from the map DB 10, and determines that the detection accuracy indicated by the detection accuracy information Idet of the road paint information corresponding to the composite lane marking 62 is lower than the predetermined threshold. Therefore, in this case, in order to avoid performing position estimation based on the composite lane marking 62, the automatic driving control unit 18 sets a target trajectory (see broken line Lt) that changes lanes to the left lane of the road 50, which is not adjacent to the composite lane marking 62. When the vehicle is driven according to the target trajectory indicated by the broken line Lt, the lane markings 63 and 64 are the closest lane markings when passing alongside the composite lane marking 62. Therefore, in this case, the host vehicle position estimation unit 17 can perform position estimation with reference to the lane marking 63 and/or the lane marking 64, and can maintain the lateral position accuracy of the vehicle at a high level.
- FIG. 8 is a bird's-eye view of a vehicle when road paint having low detection accuracy is provided on a one-lane road.
- a single lane line 66 and a composite lane line 67 are provided between a single-lane road 53 and a road 54 in the opposite direction.
- the single lane marking 66 is composed of only a white line
- the composite lane marking 67 is composed of a white line and an orange line.
- the point cloud data of the orange line is obtained in addition to the point cloud data of the white line.
- Therefore, point cloud data with a large variation in the width direction of the road is obtained from the lidar 2 as the point cloud data of the composite lane marking 67, and the measurement value z for the composite lane marking 67 has relatively low accuracy. Accordingly, in the example of FIG. 8, the detection accuracy indicated by the detection accuracy information Idet of the road paint information corresponding to the composite lane marking 67 is set to a low detection accuracy.
- The automatic driving control unit 18 refers to the road paint information associated with the road 53 on which the vehicle is traveling from the map DB 10, and determines that the detection accuracy indicated by the detection accuracy information Idet of the road paint information corresponding to the composite lane marking 67 is lower than the predetermined threshold. Therefore, in this case, in order to avoid performing position estimation with the composite lane marking 67 as a reference, the automatic driving control unit 18 sets a target trajectory (see broken line Lt) that travels in a position within the road 53 biased toward the lane marking 68 rather than the composite lane marking 67.
- Thereby, the host vehicle position estimation unit 17 performs position estimation with reference to the lane marking 68 and can maintain the lateral position accuracy of the vehicle at a high level.
- FIG. 9A shows another example of road paint in which the detection accuracy of the detection accuracy information Idet is low.
- a composite lane line 69 is provided between the road 55 and the road 56 in the opposite lane.
- the composite lane line 69 gradually increases in width along the road 55.
- the detection accuracy indicated by the detection accuracy information Idet of the road paint information is set to a low detection accuracy for such a composite lane marking 69 as well.
- FIG. 9 (B) shows a two-lane road 57 provided with road markings 70-73.
- In the example of FIG. 9(B), no blurring has occurred in the road markings 70 and 71 provided in the left lane of the road 57, whereas blurring has occurred in the road markings 72 and 73 provided in the right lane of the road 57. Accordingly, for the road paint information of the road markings 72 and 73, detection accuracy information Idet indicating low detection accuracy is registered. Therefore, in the example of FIG. 9(B), when the vehicle travels on the road 57, the automatic driving control unit 18 determines, based on the detection accuracy information Idet of the road paint information of the road markings 70 to 73, that the lane in which the road markings 72 and 73 are provided should be avoided.
- In the route search to the destination, the automatic driving control unit 18 may refer to the detection accuracy information Idet of the road paint information and perform the route search so as to avoid road paint with low detection accuracy.
- For example, the automatic driving control unit 18 searches for a route that minimizes the sum of the link costs calculated for each road according to required time or distance. At this time, the automatic driving control unit 18 adds, to each link cost, a cost based on the detection accuracy indicated by the detection accuracy information Idet, in addition to the costs for required time and distance. The cost added based on the detection accuracy information Idet is set higher, for example, as the detection accuracy indicated by the detection accuracy information Idet is lower. In this way, the automatic driving control unit 18 can preferentially search for a route that avoids road paint with low detection accuracy.
- Note that the automatic driving control unit 18 may also add, to the link cost of a road containing road paint with low detection accuracy, a cost significantly higher than the costs for required time and distance, thereby searching for a route that substantially avoids roads having such road paint.
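Penalizing link costs by detection accuracy can be sketched with an ordinary shortest-path search (a toy under invented assumptions: a tiny graph, a detection accuracy normalized to [0, 1], and a made-up penalty scale):

```python
import heapq

def route_cost(base_cost, detection_accuracy, penalty_scale=10.0):
    """Link cost = time/distance cost plus a penalty that grows as the
    detection accuracy of the road paint on the link gets lower.
    `detection_accuracy` is assumed normalized to [0, 1]."""
    return base_cost + penalty_scale * (1.0 - detection_accuracy)

def search_route(graph, start, goal):
    """Plain Dijkstra over links whose costs already include the penalty."""
    queue, seen = [(0.0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return cost, path
        if node in seen:
            continue
        seen.add(node)
        for nxt, link_cost in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (cost + link_cost, nxt, path + [nxt]))
    return float("inf"), []

# Invented two-route example: the direct link A->C is shorter but contains
# road paint with low detection accuracy, so its penalized cost is higher.
graph = {
    "A": [("C", route_cost(5.0, 0.2)),   # short but low detection accuracy
          ("B", route_cost(4.0, 0.9))],
    "B": [("C", route_cost(4.0, 0.9))],
}
cost, path = search_route(graph, "A", "C")
print(path)  # the detour A -> B -> C wins once the penalty is included
```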
- FIG. 10 is a diagram schematically showing a route selection screen displayed by the information output unit 16.
- In the example of FIG. 10, based on the destination specified by user input, the automatic driving control unit 18 searches for both a recommended route 83 based on a normal route search that does not take the detection accuracy information Idet into account (also referred to as the "non-detection-accuracy-considered route") and a recommended route 84 based on a route search that takes the detection accuracy information Idet into account (also referred to as the "detection-accuracy-considered route"), and displays both on the route selection screen.
- In addition, the automatic driving control unit 18 indicates, by a broken line, the low-detection-accuracy section, which is a road section containing road paint with low detection accuracy, on the non-detection-accuracy-considered route 83.
- a mark 81 indicates a destination and a mark 82 indicates a current position.
- As shown, the non-detection-accuracy-considered route 83 includes a low-detection-accuracy section, that is, a road section containing road paint with low detection accuracy, and the position estimation accuracy may decrease in that section.
- On the other hand, since no low-detection-accuracy section exists on the detection-accuracy-considered route 84, position estimation based on road paint can be suitably executed when the detection-accuracy-considered route 84 is selected as the travel route.
- In this way, the automatic driving control unit 18 uses the road paint information in the route search and allows the user to suitably select, as the route to travel, a route that avoids road sections containing road paint with low detection accuracy.
- In the second vehicle control based on the road paint information, when the position estimation error in the traveling direction or the lateral direction is larger than a predetermined threshold, road paint suitable as a reference for vehicle position estimation in the direction with the large error is searched for on a predetermined basis, based on the detection accuracy information Idet and the suitable direction information Sdi included in the road paint information, and the target trajectory is corrected so as to approach that road paint.
- FIG. 11 is a flowchart showing second vehicle control based on road paint information executed by the automatic driving control unit 18.
- In the flowchart of FIG. 11, the automatic driving control unit 18 searches, based on the detection accuracy information Idet and the suitable direction information Sdi of the road paint information, for road paint suitable as a reference for vehicle position estimation in the direction in which the vehicle position estimation error is large, and, when such road paint exists in the vicinity of the target trajectory, corrects the preset target trajectory of the vehicle. Note that when the flowchart of FIG. 11 is executed, it is assumed that the automatic driving control unit 18 has already set the target trajectory of the vehicle along the route to the set destination.
- First, the automatic driving control unit 18 specifies the position estimation errors in the traveling direction and the lateral direction of the vehicle (step S201). For example, the automatic driving control unit 18 transforms the error covariance matrix obtained in the calculation process of position estimation based on the extended Kalman filter by a rotation matrix using the heading (azimuth) of the host vehicle, thereby specifying the position estimation errors in the traveling direction and the lateral direction of the vehicle.
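The transformation of the error covariance by a rotation matrix can be sketched as follows (assuming a 2×2 planar covariance and a heading angle that rotates the vehicle frame into the map frame; the sign convention and function name are assumptions):

```python
import math

def body_frame_errors(sxx, sxy, syy, heading_rad):
    """Rotate a 2x2 position-error covariance [[sxx, sxy], [sxy, syy]] from
    the map frame into the vehicle frame (Sigma_v = R^T Sigma R) and return
    the standard deviations along the traveling and lateral directions."""
    c, s = math.cos(heading_rad), math.sin(heading_rad)
    # Diagonal elements of R^T * Sigma * R for R = [[c, -s], [s, c]].
    long_var = c * c * sxx + 2 * c * s * sxy + s * s * syy
    lat_var = s * s * sxx - 2 * c * s * sxy + c * c * syy
    return math.sqrt(long_var), math.sqrt(lat_var)

# With a heading of 90 degrees, the map-frame y-variance becomes the
# traveling-direction variance and the x-variance becomes the lateral one.
long_err, lat_err = body_frame_errors(0.04, 0.0, 0.09, math.pi / 2)
print(long_err, lat_err)  # ~0.3, ~0.2 (up to floating-point rounding)
```

Each returned standard deviation can then be compared with its threshold in step S202 to decide whether that direction is a low-position-accuracy direction.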
- The automatic driving control unit 18 monitors the position estimation accuracy of the host vehicle position estimation unit 17 in the traveling direction and in the lateral direction, and determines whether there is a direction with low position estimation accuracy (also referred to as the "low-position-accuracy direction Dtag") (step S202). For example, the automatic driving control unit 18 compares the position estimation error in each of the traveling direction and the lateral direction specified in step S201 with a predetermined threshold, detects any direction in which the position estimation error is larger than the threshold as the low-position-accuracy direction Dtag, and determines whether such a direction has been detected.
- Note that the traveling direction and the lateral direction of the vehicle are examples of the "first direction" and the "second direction" in the present invention.
- When it is determined in step S202 that the low-position-accuracy direction Dtag does not exist (step S202; No), the automatic driving control unit 18 determines that the target trajectory of the vehicle need not be corrected, and ends the processing of the flowchart.
- On the other hand, when the automatic driving control unit 18 determines in step S202 that the low-position-accuracy direction Dtag exists (step S202; Yes), it acquires, from the map DB 10, the road paint information associated with the road data of the roads constituting the route to the destination (step S203). In this case, for example, the automatic driving control unit 18 acquires from the map DB 10 the road paint information corresponding to roads on the route within a predetermined distance from the current position.
- Next, the automatic driving control unit 18 determines whether road paint suitable as a reference for vehicle position estimation in the low-position-accuracy direction Dtag exists in the vicinity of the target trajectory (step S204). In this case, based on the detection accuracy information Idet and the suitable direction information Sdi included in the road paint information, the automatic driving control unit 18 excludes, from the road paint in the vicinity of the target trajectory, road paint whose detection accuracy is determined to be lower than a predetermined reference and road paint determined to be unsuitable for vehicle position estimation in the low-position-accuracy direction Dtag, and regards the remaining road paint as suitable as a reference for vehicle position estimation in the low-position-accuracy direction Dtag.
- When the automatic driving control unit 18 determines that road paint suitable as a reference for vehicle position estimation in the low-position-accuracy direction Dtag exists in the vicinity of the target trajectory (step S204; Yes), it corrects the target trajectory of the vehicle so as to approach that road paint (step S205).
- For example, the automatic driving control unit 18 corrects the target trajectory so as to change lanes to a lane in which road paint suitable as a reference is provided. When such road paint exists in more than one lane, the automatic driving control unit 18 may correct the target trajectory so as to change lanes to the lane containing the road paint with the higher suitability, or to the lane containing the road paint closer to the original target trajectory.
- Alternatively, when the road paint suitable as a reference is a lane marking adjacent to the current lane, the automatic driving control unit 18 may correct the target trajectory within the lane so that the traveling position is biased toward that lane marking.
- On the other hand, when the automatic driving control unit 18 determines in step S204 that no road paint suitable as a reference for vehicle position estimation in the low-position-accuracy direction Dtag exists in the vicinity of the target trajectory (step S204; No), it determines that the target trajectory of the vehicle need not be corrected, and ends the processing of the flowchart.
- In this way, in the second vehicle control, the automatic driving control unit 18 controls the vehicle so as to approach road paint suitable as a position estimation reference, so that position estimation can be suitably performed on the basis of that road paint.
- Note that, as in the first vehicle control based on the road paint information, the automatic driving control unit 18 may lower the weight given to the result of collation with the map DB 10 when road paint whose detection accuracy indicated by the detection accuracy information Idet is low is reflected in the vehicle position estimation, so that the decrease in position estimation accuracy is suitably reduced even if road paint with low detection accuracy temporarily falls within the measurement range of the lidar 2.
- FIG. 12 is an overhead view of a vehicle traveling on a two-lane road 90 where road paint having low detection accuracy exists on the left side.
- In the example of FIG. 12, there are a solid line 93, which is a lane marking dividing the road 90 from the road 91 running in the opposite direction, a broken line 92, which is a lane marking dividing the road 90 into lanes, and a solid line 94, which is the leftmost lane marking of the road 90. Here, it is assumed that blurring has occurred in the solid line 94.
- a broken line “Lt” indicates a target track of the vehicle set by the automatic driving control unit 18.
- In the example of FIG. 12, the detection accuracy indicated by the detection accuracy information Idet of the road paint information corresponding to the solid line 94 is set lower than the detection accuracy indicated by the detection accuracy information Idet of the road paint information corresponding to the broken line 92 and the solid line 93.
- Further, the suitable direction information Sdi of the road paint information corresponding to the solid line 93 and the solid line 94 indicates that the suitability as a reference for vehicle position estimation in the lateral direction of the vehicle is optimal and that the suitability as a reference for vehicle position estimation in the traveling direction of the vehicle is unsuitable. On the other hand, the suitable direction information Sdi of the road paint information corresponding to the broken line 92 indicates that the suitability as a reference for vehicle position estimation in the lateral direction of the vehicle is suitable and that the suitability as a reference for vehicle position estimation in the traveling direction of the vehicle is also suitable.
- Here, it is assumed that the automatic driving control unit 18 determines that the error of vehicle position estimation in the lateral direction of the vehicle is larger than the predetermined threshold, that is, that the lateral direction of the vehicle is determined to be the low-position-accuracy direction Dtag.
- In this case, the automatic driving control unit 18 refers in the map DB 10 to the road paint information corresponding to the broken line 92, the solid line 93, and the solid line 94, and, because the detection accuracy indicated by the detection accuracy information Idet of the solid line 94 is low, excludes the solid line 94 from the candidates for the position estimation reference. The host vehicle position estimation unit 17 can then maintain the lateral position accuracy of the vehicle at a high level by performing position estimation based on the solid line 93.
- Furthermore, by slightly correcting the target trajectory so as to approach the broken line 92 within the current lane, the position accuracy in the traveling direction of the vehicle can also be maintained at a high level.
- Note that the automatic driving control unit 18 may score, on a predetermined basis, the detection accuracy indicated by the detection accuracy information Idet and the suitability indicated by the suitable direction information Sdi for each of the broken line 92, the solid line 93, and the solid line 94, and comprehensively determine the road paint suitable as the reference for vehicle position estimation in the low-position-accuracy direction Dtag based on the detection accuracy score and the suitability score.
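Such a comprehensive determination from a detection-accuracy score and a suitability score might look like this sketch (the scoring scale, weights, and candidate values are all invented for illustration):

```python
def combined_score(accuracy_score, suitability_score, w_accuracy=0.5):
    """Weighted combination of the two scores; both assumed in [0, 1]."""
    return w_accuracy * accuracy_score + (1.0 - w_accuracy) * suitability_score

def pick_reference(candidates):
    """Pick the road paint with the best combined score.
    `candidates` maps a name to (accuracy_score, lateral_suitability_score)."""
    return max(candidates, key=lambda name: combined_score(*candidates[name]))

# Invented scores mirroring FIG. 12: the solid line 94 is blurred (low
# detection accuracy), while the solid line 93 is accurate and laterally
# optimal, so it wins the comprehensive determination.
candidates = {
    "broken_line_92": (0.9, 0.7),
    "solid_line_93": (0.9, 1.0),
    "solid_line_94": (0.2, 1.0),
}
print(pick_reference(candidates))  # solid_line_93
```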
- In FIG. 12, the case where the vehicle position estimation error in the lateral direction of the vehicle is larger than the predetermined threshold has been described. Similarly, when the vehicle position estimation error in the traveling direction of the vehicle is larger than the predetermined threshold, road paint for which the suitable direction information Sdi indicates a high suitability for vehicle position estimation in the traveling direction may be determined as the position estimation reference. When the position estimation errors in both the traveling direction and the lateral direction are larger than the predetermined threshold, road paint for which the suitable direction information Sdi indicates suitabilities for vehicle position estimation in both the traveling direction and the lateral direction equal to or higher than a predetermined reference may be determined as the position estimation reference.
- As described above, the in-vehicle device 1 in the present embodiment includes the host vehicle position estimation unit 17 and the automatic driving control unit 18. The host vehicle position estimation unit 17 estimates the host vehicle position by collating the detection result of road paint by the lidar 2 with the map DB 10. The automatic driving control unit 18 acquires, from the map DB 10, road paint information including detection accuracy information Idet indicating the accuracy of the collation for each road paint. Then, based at least on the detection accuracy information Idet, the automatic driving control unit 18 outputs information for controlling the vehicle to the electronic control device of the vehicle or to the information output unit 16 so that the collation can be performed with an accuracy equal to or higher than a predetermined value. Thereby, the in-vehicle device 1 can suitably improve the accuracy of vehicle position estimation.
- As a modification, instead of the configuration in which the map DB 10 is stored in the storage unit 12, a server device (not shown) may hold the map DB 10.
- the in-vehicle device 1 acquires necessary road surface paint information and the like by communicating with the server device through a communication unit (not shown).
- the configuration of the driving support system shown in FIG. 1 is an example, and the configuration of the driving support system to which the present invention is applicable is not limited to the configuration shown in FIG.
- For example, instead of the driving support system including the in-vehicle device 1, the electronic control device of the vehicle may execute the processing of the host vehicle position estimation unit 17, the automatic driving control unit 18, and the like of the in-vehicle device 1.
- In this case, the map DB 10 is stored, for example, in a storage unit in the vehicle, and the electronic control device of the vehicle may receive update information for the map DB 10 from a server device (not shown).
2 Lidar
3 Gyro sensor
4 Vehicle speed sensor
5 GPS receiver
10 Map DB
Claims (14)
1. An output device comprising: a collation unit that collates a detection result of road paint by a detection device with map information; a first acquisition unit that acquires accuracy information indicating the accuracy of the collation for each road paint; and an output unit that outputs, based on the accuracy information, control information for controlling a moving body so that the collation can be performed with an accuracy equal to or higher than a predetermined value.
2. The output device according to claim 1, wherein the output unit identifies, from road paint provided on the route of the moving body, low-accuracy road paint for which the collation has an accuracy lower than the predetermined value, and outputs the control information for controlling the moving body so as to move the moving body away from the low-accuracy road paint.
3. The output device according to claim 1 or 2, wherein the output unit searches for a route to a destination based on the accuracy information and outputs information on the searched route as the control information.
4. The output device according to claim 3, wherein the output unit outputs the control information for causing a display unit to display, as a recommended route, information on a route on which the collation can be performed with an accuracy equal to or higher than the predetermined value.
5. The output device according to claim 1, wherein the output unit identifies, from road paint provided on the route of the moving body, high-accuracy road paint for which the collation has an accuracy equal to or higher than the predetermined value, and outputs the control information for controlling the moving body so as to bring the moving body closer to the high-accuracy road paint.
6. The output device according to claim 5, further comprising: a detection unit that compares position estimation errors in each of a first direction and a second direction relative to the traveling direction of the moving body with a predetermined threshold and detects a direction in which the position estimation error is larger than the threshold; and a second acquisition unit that acquires, for each road paint, suitability information indicating the suitability of that road paint when used as a reference for position estimation in the direction detected by the detection unit, wherein the output unit identifies the high-accuracy road paint based on the accuracy information and the suitability information.
7. The output device according to any one of claims 1 to 6, further comprising a position estimation unit that estimates the position of the moving body based on the result of the collation.
8. The output device according to claim 7, wherein the output unit determines, based on the accuracy of the position estimation by the position estimation unit, whether control for moving the moving body away from road paint for which the collation has an accuracy lower than the predetermined value is necessary.
9. The output device according to claim 7 or 8, wherein, when road paint for which the collation has an accuracy lower than the predetermined value is included in the detection range of the detection device, the position estimation unit reduces the weight with which the result of the collation is reflected in the position estimation.
10. The output device according to any one of claims 1 to 9, wherein the road paint for which the accuracy of the collation indicated by the accuracy information is lower than the predetermined value is a lane marking represented by composite lines.
11. The output device according to any one of claims 1 to 9, wherein the road paint for which the accuracy of the collation indicated by the accuracy information is lower than the predetermined value is road paint in which blurring has occurred.
12. A control method executed by an output device, comprising: a collation step of collating a detection result of road paint by a detection device with map information; a first acquisition step of acquiring accuracy information indicating the accuracy of the collation for each road paint; and an output step of outputting, based on the accuracy information, control information for controlling a moving body so that the collation can be performed with an accuracy equal to or higher than a predetermined value.
13. A program executed by a computer, causing the computer to function as: a collation unit that collates a detection result of road paint by a detection device with map information; a first acquisition unit that acquires accuracy information indicating the accuracy of the collation for each road paint; and an output unit that outputs, based on the accuracy information, control information for controlling a moving body so that the collation can be performed with an accuracy equal to or higher than a predetermined value.
14. A storage medium storing the program according to claim 13.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP18774558.3A EP3605498B1 (en) | 2017-03-28 | 2018-03-05 | Output device, control method, program, and storage medium |
US16/499,579 US12099361B2 (en) | 2017-03-28 | 2018-03-05 | Output device, control method, program and storage medium for control of a moving body based on road marking detection accuracy |
JP2019509087A JPWO2018180247A1 (ja) | 2017-03-28 | 2018-03-05 | 出力装置、制御方法、プログラム及び記憶媒体 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017-062441 | 2017-03-28 | ||
JP2017062441 | 2017-03-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018180247A1 true WO2018180247A1 (ja) | 2018-10-04 |
Family
ID=63675329
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2018/008346 WO2018180247A1 (ja) | 2017-03-28 | 2018-03-05 | 出力装置、制御方法、プログラム及び記憶媒体 |
Country Status (4)
Country | Link |
---|---|
US (1) | US12099361B2 (ja) |
EP (1) | EP3605498B1 (ja) |
JP (4) | JPWO2018180247A1 (ja) |
WO (1) | WO2018180247A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN114089739A (zh) * | 2020-05-21 | 2022-02-25 | 深圳市海柔创新科技有限公司 | 导航方法及导航装置 |
JP2023516353A (ja) * | 2020-03-04 | 2023-04-19 | ズークス インコーポレイテッド | ローカライゼーションエラー監視 |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110619666B (zh) * | 2019-09-20 | 2022-05-27 | 阿波罗智能技术(北京)有限公司 | 用于标定相机的方法及装置 |
JP7256216B2 (ja) * | 2021-02-25 | 2023-04-11 | 本田技研工業株式会社 | 車両制御装置、車両制御方法、およびプログラム |
JP2022182094A (ja) * | 2021-05-27 | 2022-12-08 | 本田技研工業株式会社 | 移動体制御装置、移動体制御方法、およびプログラム |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2012117944A (ja) * | 2010-12-01 | 2012-06-21 | Aisin Aw Co Ltd | ナビゲーション装置 |
JP2013083576A (ja) * | 2011-10-11 | 2013-05-09 | Aisin Aw Co Ltd | 自車位置認識システム、自車位置認識プログラム、及び自車位置認識方法 |
JP2015141611A (ja) | 2014-01-29 | 2015-08-03 | アイシン・エィ・ダブリュ株式会社 | 自動運転支援装置、自動運転支援方法及びプログラム |
WO2016035199A1 (ja) * | 2014-09-05 | 2016-03-10 | 三菱電機株式会社 | 自動走行管理システム、サーバおよび自動走行管理方法 |
JP2017041070A (ja) * | 2015-08-19 | 2017-02-23 | ソニー株式会社 | 車両制御装置と車両制御方法と情報処理装置および交通情報提供システム |
Family Cites Families (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3419648B2 (ja) * | 1997-05-27 | 2003-06-23 | Hitachi, Ltd. | Navigation device |
JP3575352B2 (ja) * | 1999-10-25 | 2004-10-13 | Denso Corporation | Vehicle position locating device and recording medium |
JP2009180631A (ja) | 2008-01-31 | 2009-08-13 | Denso It Laboratory Inc | Navigation device, navigation method, and program |
US8618922B2 (en) * | 2010-03-30 | 2013-12-31 | GM Global Technology Operations LLC | Method and system for ensuring operation of limited-ability autonomous driving vehicles |
JP5206752B2 (ja) | 2010-08-30 | 2013-06-12 | Denso Corporation | Travel environment recognition device |
DE102012206903B4 (de) * | 2012-04-26 | 2025-01-23 | Robert Bosch GmbH | Method for an assistance system of a vehicle |
US9719801B1 (en) * | 2013-07-23 | 2017-08-01 | Waymo Llc | Methods and systems for calibrating sensors using road map data |
JP6467773B2 (ja) * | 2014-02-25 | 2019-02-13 | Aisin AW Co., Ltd. | Route search system, route search method, and computer program |
EP2918974B1 (en) * | 2014-03-11 | 2019-01-16 | Volvo Car Corporation | Method and system for determining a position of a vehicle |
US9834207B2 (en) * | 2014-04-15 | 2017-12-05 | GM Global Technology Operations LLC | Method and system for detecting, tracking and estimating stationary roadside objects |
JP6075351B2 (ja) | 2014-10-10 | 2017-02-08 | Toyota Motor Corporation | Steering assist control device |
US9530313B2 (en) | 2014-10-27 | 2016-12-27 | Here Global B.V. | Negative image for sign placement detection |
US9483059B2 (en) * | 2014-11-26 | 2016-11-01 | Toyota Motor Engineering & Manufacturing North America, Inc. | Method to gain driver's attention for autonomous vehicle |
WO2016139748A1 (ja) | 2015-03-03 | 2016-09-09 | Pioneer Corporation | Route search device, control method, program, and storage medium |
JP6491929B2 (ja) | 2015-03-31 | 2019-03-27 | Aisin AW Co., Ltd. | Automatic driving support system, automatic driving support method, and computer program |
RU2690727C1 (ru) | 2015-07-27 | 2019-06-05 | Nissan Motor Co., Ltd. | Route travel control device and route travel control method |
KR20170015115A (ko) * | 2015-07-30 | 2017-02-08 | Samsung Electronics Co., Ltd. | Autonomous driving vehicle and method of controlling an autonomous driving vehicle |
US10384679B2 (en) * | 2015-09-30 | 2019-08-20 | Nissan Motor Co., Ltd. | Travel control method and travel control apparatus |
JP2017151703A (ja) * | 2016-02-24 | 2017-08-31 | Toyota Motor Corporation | Automatic driving device |
US9851212B2 (en) * | 2016-05-06 | 2017-12-26 | Ford Global Technologies, Llc | Route generation using road lane line quality |
US20170356748A1 (en) * | 2016-06-14 | 2017-12-14 | nuTonomy Inc. | Route Planning for an Autonomous Vehicle |
US10369994B2 (en) * | 2016-07-20 | 2019-08-06 | Ford Global Technologies, Llc | Rear camera stub detection |
DE102016213782A1 (de) * | 2016-07-27 | 2018-02-01 | Volkswagen Aktiengesellschaft | Method, device, and computer-readable storage medium with instructions for determining the lateral position of a vehicle relative to the lanes of a roadway |
US20180067494A1 (en) * | 2016-09-02 | 2018-03-08 | Delphi Technologies, Inc. | Automated-vehicle 3d road-model and lane-marking definition system |
JP6778063B2 (ja) * | 2016-09-07 | 2020-10-28 | Soken, Inc. | Driving support device and driving support method |
KR102529903B1 (ko) * | 2016-12-14 | 2023-05-08 | Hyundai Motor Company | Apparatus and method for estimating the position of a vehicle |
EP3569462B1 (en) * | 2017-01-10 | 2022-06-01 | Mitsubishi Electric Corporation | Travel path recognition device and travel path recognition method |
KR20180106417A (ko) * | 2017-03-20 | 2018-10-01 | Hyundai Motor Company | System and method for recognizing the position of a vehicle |
2018
- 2018-03-05 JP JP2019509087A patent/JPWO2018180247A1/ja active Pending
- 2018-03-05 WO PCT/JP2018/008346 patent/WO2018180247A1/ja active IP Right Grant
- 2018-03-05 EP EP18774558.3A patent/EP3605498B1/en active Active
- 2018-03-05 US US16/499,579 patent/US12099361B2/en active Active

2021
- 2021-05-14 JP JP2021082518A patent/JP2021120683A/ja active Pending

2023
- 2023-02-21 JP JP2023025209A patent/JP2023078138A/ja active Pending

2024
- 2024-05-13 JP JP2024077740A patent/JP2024105508A/ja active Pending
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2023516353A (ja) * | 2020-03-04 | 2023-04-19 | Zoox, Inc. | Localization error monitoring |
JP7637692B2 (ja) | 2020-03-04 | 2025-02-28 | Zoox, Inc. | Localization error monitoring |
CN114089739A (zh) * | 2020-05-21 | 2022-02-25 | Shenzhen Hairou Innovation Technology Co., Ltd. | Navigation method and navigation device |
Also Published As
Publication number | Publication date |
---|---|
EP3605498B1 (en) | 2025-05-07 |
JP2021120683A (ja) | 2021-08-19 |
EP3605498A4 (en) | 2021-01-13 |
US20200026297A1 (en) | 2020-01-23 |
EP3605498A1 (en) | 2020-02-05 |
US12099361B2 (en) | 2024-09-24 |
JPWO2018180247A1 (ja) | 2020-02-06 |
JP2023078138A (ja) | 2023-06-06 |
JP2024105508A (ja) | 2024-08-06 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2018181974A1 (ja) | Determination device, determination method, and program | |
WO2018221453A1 (ja) | Output device, control method, program, and storage medium | |
US12174021B2 | Measurement accuracy calculation device, self-position estimation device, control method, program and storage medium | |
JP6806891B2 (ja) | Information processing device, control method, program, and storage medium | |
WO2018180247A1 (ja) | Output device, control method, program, and storage medium | |
US11420632B2 | Output device, control method, program and storage medium | |
JP2023054314A (ja) | Information processing device, control method, program, and storage medium | |
JP6923750B2 (ja) | Self-position estimation device, self-position estimation method, program, and storage medium | |
JP2024161105A (ja) | Self-position estimation device, control method, program, and storage medium | |
WO2020209144A1 (ja) | Position estimation device, estimation device, control method, program, and storage medium | |
JP2022176322A (ja) | Self-position estimation device, control method, program, and storage medium | |
WO2018212302A1 (ja) | Self-position estimation device, control method, program, and storage medium | |
US12181284B2 (en) | Measurement device, measurement method and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 18774558; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2019509087; Country of ref document: JP; Kind code of ref document: A |
| NENP | Non-entry into the national phase | Ref country code: DE |
| ENP | Entry into the national phase | Ref document number: 2018774558; Country of ref document: EP; Effective date: 20191028 |
| WWG | WIPO information: grant in national office | Ref document number: 2018774558; Country of ref document: EP |