CN108694363A - The method and apparatus that the pedestrian of vehicle periphery is detected - Google Patents

Info

Publication number
CN108694363A
CN108694363A (application CN201710234918.6A)
Authority
CN
China
Prior art keywords
frame
rider
orientation
vehicle
candidate
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201710234918.6A
Other languages
Chinese (zh)
Inventor
戴依若
张杨
照井孝
照井孝一
志磨健
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Astemo Ltd
Original Assignee
Hitachi Automotive Systems Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Automotive Systems Ltd filed Critical Hitachi Automotive Systems Ltd
Priority to CN201710234918.6A priority Critical patent/CN108694363A/en
Priority to PCT/JP2018/015194 priority patent/WO2018190362A1/en
Priority to JP2019512544A priority patent/JP6756908B2/en
Publication of CN108694363A publication Critical patent/CN108694363A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/103Static body considered as a whole, e.g. static pedestrian or occupant recognition
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/166Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes


Abstract

The present invention provides an apparatus and a method for detecting pedestrians around a vehicle. The apparatus includes: an acquisition unit, which acquires image information around the vehicle from a camera in the vehicle and detects the current vehicle speed; an initial candidate frame parameter calculation unit, which calculates initial candidate frame parameters of the pedestrians around the vehicle from the image information, the parameters including an orientation, i.e., the angle of the pedestrian relative to the camera; an adjustment unit, which adjusts the initial candidate frame parameters according to the orientation to obtain adjusted candidate frame parameters; a classification unit, which classifies the pedestrians into riders and pedestrians according to the adjusted candidate frame parameters and obtains rider detection frame parameters for each rider; and a risk coefficient setting unit, which sets a risk coefficient for each rider according to the rider detection frame parameters, the state information of each rider, and the change frequency of the orientation.

Description

Method and device for detecting pedestrian around vehicle
Technical Field
The invention relates to a method and a device for detecting pedestrians around a vehicle.
Background
The development of the automobile has brought great convenience to humanity, but the traffic accidents it causes have long been a difficult problem. Besides accidents caused by illegal behavior and bad habits of motor-vehicle drivers, traffic violations by pedestrians are very serious on most roads in China. In particular, violations by riders (including riders of non-motor vehicles such as motorcycles, mopeds, electric bicycles, bicycles, and tricycles), such as occupying motor-vehicle lanes, ignoring traffic lights, cutting across roads at will, and riding against traffic, are common on streets and in alleys. Although Article 58 of China's Road Traffic Safety Law stipulates that motorized wheelchairs for the disabled and electric bicycles traveling in non-motor-vehicle lanes must not exceed fifteen kilometers per hour, the actual speed of electric bicycles is around 40-50 km/h, far beyond the legal maximum. In 2016 China had about 390 million non-motor vehicles, of which about 220 million were electric bicycles. With such a large base, the hidden danger that non-motor vehicles pose to road traffic cannot be ignored.
Meanwhile, thanks to the rapid development of sensor and control technologies, active safety technologies that allow automobiles to prevent accidents are receiving more and more attention. Automakers around the world are also accelerating safety-related anti-collision standardization, and the European New Car Assessment Programme (Euro NCAP) is expected to include bicycle collision avoidance as an evaluation item after 2018. Continental uses radar and a high-precision global positioning sensor to synchronize the motion of the car and the bicycle, so that the bicycle can be monitored and automatic braking realized; Volvo, a Swedish sporting-goods company, and Ericsson have jointly developed a system for two-way communication between driver and rider that can prevent the vehicle from colliding with the rider; Jaguar Land Rover is also developing bicycle collision warning technology.
For rider collision warning technology, sensor-based rider detection and tracking algorithms play a crucial role. The patent "Vision Based Pedestrian and Cyclist Detection Method" (U.S. Patent No. US 9,087,263 B2, Industrial Technology Research Institute, Taiwan, published July 21, 2015) uses a vehicle-mounted camera as the sensor and discloses a rider detection algorithm that detects the person and the bicycle separately and determines whether they form a rider from their spatial positional relationship. However, this patent requires detecting the circular wheel of the bicycle and fails in scenes where the bicycle wheel is not visible, making it impossible to determine whether the pedestrian is a rider.
Meanwhile, the operating environment and conditions of vehicle-mounted sensors are very harsh, and the time available for a fast-moving vehicle to avoid a collision is very short, so an on-board collision avoidance system must offer strong real-time performance and high processing speed. On actual roads there are many non-motor vehicles and motor vehicles, as well as many pedestrians and stationary obstacles, so the problem of detecting multiple riders is unavoidable. However, detecting multiple riders involves complex computation, and computationally heavy algorithms often cannot meet real-time requirements. It is therefore difficult in the prior art to detect multiple riders in real time.
Disclosure of Invention
The present invention provides an apparatus for detecting a pedestrian around a vehicle, the apparatus including:
an acquisition unit that acquires image information around the vehicle from a camera in the vehicle and detects a current vehicle speed of the vehicle;
an initial frame candidate parameter calculation unit which calculates an initial frame candidate parameter of each of a plurality of pedestrians around the vehicle at each time point according to the image information, the initial frame candidate parameter including an orientation, wherein the orientation is an angle of the pedestrian with respect to the photographing device;
the adjusting unit is used for adjusting the initial candidate frame parameters according to the orientation to obtain the adjusted candidate frame parameters at each moment;
a classification unit configured to classify the plurality of pedestrians into one or more riders and one or more pedestrians according to the adjusted frame candidate parameters, and obtain rider detection frame parameters of each of the one or more riders at each time;
and a risk coefficient setting unit configured to set a risk coefficient for each of the one or more riders based on the rider detection frame parameter, the state information of each of the one or more riders, and the change frequency of the orientation.
The initial candidate frame parameters further include frame barycentric coordinates, a frame width w, and a frame height h,
wherein the adjusted candidate frame parameters include the frame barycentric coordinates, an adjusted frame width w' computed from the frame height h and the orientation α, the frame height h, and the orientation α.
Wherein the classification unit takes the adjusted candidate frame parameter of each of the one or more riders at the each time as the rider detection frame parameter.
In this way, the rider in the pedestrian can be classified through the adjusted candidate frame parameters.
The classification unit takes the initial candidate frame parameter of each of the one or more pedestrians as the pedestrian detection frame parameter of each of the one or more pedestrians at the each time instant,
wherein the risk coefficient setting unit sets a risk coefficient for each of the one or more pedestrians, according to the pedestrian detection frame parameter, the respective state information of the one or more pedestrians, and the change frequency of the orientation.
The risk coefficient setting unit obtains, for each of the one or more riders, a relative speed and a relative movement direction of the each rider with respect to the vehicle, based on the frame barycentric coordinates in the rider detection frame parameters at the previous time and the frame barycentric coordinates in the rider detection frame parameters at the present time.
The risk coefficient setting unit determines whether the current rider is a new rider, if so, the risk coefficient is set to 1, otherwise, the risk coefficient is set according to the frame barycentric coordinates and the orientation, the relative speed, the relative movement direction, the state information, and the change frequency of the orientation in the rider detection frame parameters at the current time.
The state information is a normal state or an abnormal state, and is determined to be an abnormal state when the risk factor setting unit determines that the rider is a child based on the frame height in the rider detection frame parameter, determines that a distance between the rider and another rider is smaller than a predetermined safe distance from the image information, or determines that the rider is disturbed by an external object from the image information.
The present invention also provides a method of detecting a pedestrian around a vehicle, the method comprising:
a) acquiring image information around the vehicle from a shooting device in the vehicle, and detecting the current speed of the vehicle;
b) calculating initial frame candidate parameters of a plurality of pedestrians around the vehicle at each moment according to the image information, wherein the initial frame candidate parameters comprise orientation, and the orientation is the angle of the pedestrians relative to the shooting device;
c) adjusting the initial candidate frame parameters according to the orientation to obtain the adjusted candidate frame parameters at each moment;
d) classifying the pedestrians into one or more riders and one or more pedestrians according to the adjusted candidate frame parameters, and obtaining rider detection frame parameters of the one or more riders at each moment;
e) setting a risk factor for each of the one or more riders according to the rider detection frame parameter, the respective state information of the one or more riders, and the change frequency of the orientation.
By the above apparatus and method, multiple riders and multiple pedestrians among the detected pedestrians can be rapidly distinguished, so that the multiple riders can be detected in real time.
Drawings
Fig. 1 is a structural diagram of an apparatus for detecting pedestrians around a vehicle according to an embodiment of the present invention;
FIG. 2 is a flow chart of a method of detecting pedestrians around a vehicle, according to an embodiment of the present invention;
FIG. 3(a) is a schematic diagram of an initial candidate box for a pedestrian of the plurality of pedestrians at time t1, according to an embodiment of the present invention;
FIG. 3(b) is a schematic diagram of an adjusted candidate frame for a pedestrian shown in FIG. 3 (a);
FIG. 4(a) is a schematic diagram of an initial candidate box for another pedestrian of the plurality of pedestrians at time t1, according to an embodiment of the present invention;
FIG. 4(b) is a schematic diagram of another pedestrian's adjusted candidate frame shown in FIG. 4 (a);
fig. 5 is a schematic diagram showing the definition of orientation.
Detailed Description
Embodiments of the present invention will be described in detail below with reference to the accompanying drawings.
Fig. 1 is a block diagram of an apparatus 10 for detecting pedestrians around a vehicle according to an embodiment of the present invention, the apparatus 10 including an acquisition unit 11, an initial frame candidate parameter calculation unit 12, an adjustment unit 13, a classification unit 14, and a risk factor setting unit 15.
Fig. 2 is a flowchart of a method of detecting a pedestrian around a vehicle according to an embodiment of the present invention. As shown in Fig. 2, in step S21 the acquisition unit 11 acquires image information of the surroundings of the vehicle from a camera in the vehicle and detects the current vehicle speed. The camera (not shown) is one or more camera systems installed at suitable positions on the vehicle (for example, at the upper end of the front windshield, at the rear of the body, and on both sides of the body), which collect and store image information from the front, rear, and both sides of the vehicle. A camera system generally consists of an optical system, which may have zoom and autofocus functions, and a camera head, which may be a color CCD (charge-coupled device) camera.
In step S22, the initial candidate frame parameter calculation unit 12 calculates, from the above image information, the initial candidate frame parameters of the multiple pedestrians around the vehicle at each time. The initial candidate frame parameters include an orientation α, where α is the angle of the pedestrian, i.e., of the pedestrian's torso, relative to the camera.
Fig. 5 is a schematic diagram showing the definition of the orientation α. α is 0 when the pedestrian faces directly to the right relative to the camera, π/2 when facing directly forward, π or -π when facing directly to the left, and -π/2 when facing directly backward; other orientations are output at their actual values and normalized to [-π, π]. Similarly, orientations falling within a given region may be quantized to the same angle, as shown in Fig. 5: front right is class I (α = π/4), front is class II (α = π/2), front left is class III (α = 3π/4), left is class IV (α = ±π), rear left is class V (α = -3π/4), rear is class VI (α = -π/2), rear right is class VII (α = -π/4), and right is class VIII (α = 0).
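As a sketch of the quantization just described (the function and table names are ours, not the patent's), the eight classes of Fig. 5 can be recovered by rounding α to the nearest multiple of π/4:

```python
import math

# Class labels for the eight 45-degree sectors of Fig. 5, keyed by
# round(alpha / (pi/4)); +4 and -4 (i.e. +pi and -pi) both mean "left".
SECTOR_CLASSES = {
    0: "VIII (right)",
    1: "I (front right)",
    2: "II (front)",
    3: "III (front left)",
    4: "IV (left)",
    -3: "V (rear left)",
    -2: "VI (rear)",
    -1: "VII (rear right)",
}

def orientation_class(alpha: float) -> str:
    """Quantize an orientation alpha in [-pi, pi] to its class in Fig. 5."""
    k = round(alpha / (math.pi / 4))
    if k == -4:          # -pi and +pi denote the same direction (left)
        k = 4
    return SECTOR_CLASSES[k]
```

For example, orientation_class(math.pi / 4) yields class I (front right).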
The initial candidate frame parameters further include frame barycentric coordinates, a frame width w, and a frame height h. Fig. 3(a) is a schematic diagram of an initial candidate frame F of one pedestrian of the multiple pedestrians at time t1 according to the embodiment of the present invention, whose initial candidate frame parameters include the frame barycentric coordinates (x_t1, y_t1, z_t1), the frame width w_t1, the frame height h_t1, and the orientation α_t1 (not shown). Here, the frame barycentric coordinates are given in the world coordinate system of the vehicle; for example, the coordinate origin is the position directly below the center of the vehicle head.
Here, for example, the initial candidate frames, including the frame barycentric coordinates, frame width, and frame height, may first be computed by a feature-extraction method (dense features such as the DPM deformable part model, ACF aggregate channel features, HOG histograms of oriented gradients, or dense edges; sparse features such as size, shape, sparse edges, body parts, gait, texture, or gray/edge symmetry), by contour template matching, or by a deep-learning algorithm in machine learning. Then the orientation of the pedestrian is analyzed with a deep-learning algorithm or an ordinary machine-learning algorithm; for example, a dedicated orientation-detection network, or different models based on different histograms of oriented gradients, may be used to assign the input initial candidate frame of the pedestrian to one of the orientations. This type of approach is called a cascade model.
In addition, for example, the box barycentric coordinates, the box width, the box height, and the orientation may be calculated together using the same deep learning network or the same general machine learning, i.e., a single model algorithm. Wherein, in the deep learning network, the pedestrian orientation is trained and detected as part of the regression loss function in the last fully connected layer of the deep learning network.
It can be seen that, in step S22, the initial frame candidate parameter calculation unit 12 can identify all pedestrians from the image information by the above calculation. In the present invention, the pedestrian includes a rider and/or a walker.
In step S23, the adjusting unit 13 adjusts the initial frame candidate parameter according to the orientation, and obtains an adjusted frame candidate parameter at each time.
Fig. 3(b) is a schematic diagram of the adjusted candidate frame for the pedestrian shown in Fig. 3(a). As shown in Fig. 3(b), the adjusting unit 13 adjusts the initial candidate frame parameters of the initial candidate frame F of the pedestrian at time t1 to obtain the adjusted candidate frame F' of the pedestrian at time t1. For example, the frame parameters of the adjusted candidate frame F' include the frame barycentric coordinates (x_t1, y_t1, z_t1), the adjusted frame width w'_t1, the frame height h_t1, and the orientation α_t1 (not shown). That is, the adjusting unit 13 changes the frame width w_t1 of the initial candidate frame F according to the frame height h_t1 and the orientation α_t1 to obtain the frame width w'_t1 of the adjusted candidate frame F', while the other parameters remain unchanged.
In this manner, the frame candidate parameter of the adjusted frame candidate of each pedestrian of the plurality of pedestrians at the time t1 can be obtained.
When the orientation of the pedestrian relative to the camera is front or rear, the size of the initial candidate frame is not changed; that is, the adjusted candidate frame parameters are the same as the initial candidate frame parameters. When the orientation is left or right, the frame height of the initial candidate frame is kept unchanged and the frame width is adjusted to equal the frame height. When the orientation is rear left, rear right, front left, or front right, the frame height is kept unchanged and the frame width is adjusted to half the frame height. In all cases the adjustment keeps the frame barycentric coordinates unchanged. In addition, once the frame width changes, the coordinates of the four vertices of the adjusted candidate frame change accordingly, as shown in Fig. 3(b).
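The width rules above can be sketched as follows (a simplified illustration; the function name and the 45-degree sector quantization are ours, not the patent's):

```python
import math

def adjusted_width(w: float, h: float, alpha: float) -> float:
    """Return the adjusted frame width w' for step S23.

    The barycentre and the frame height h are never changed; only the
    width is rewritten according to the orientation alpha in [-pi, pi].
    """
    k = round(alpha / (math.pi / 4)) % 8   # nearest 45-degree sector
    if k in (2, 6):        # front (pi/2) or rear (-pi/2): frame unchanged
        return w
    if k in (0, 4):        # right (0) or left (+/-pi): width becomes h
        return h
    return h / 2.0         # four diagonal orientations: half the height
```

For a walking-height frame (h = 1.8), a side-on rider's frame widens to 1.8 while a diagonal rider's frame becomes 0.9.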
In step S24, the classification unit 14 classifies the plurality of pedestrians into one or more riders and one or more pedestrians according to the adjusted frame candidate parameters, and obtains the rider detection frame parameters of each of the one or more riders at each time. Here, the classification unit 14 calculates the adjusted frame candidate parameters for each pedestrian using, for example, a deep learning classification algorithm, and classifies according to the calculation result. The specific calculation and classification method is the same as that of the prior art, and is not described in detail here.
Fig. 4(a) is a schematic diagram of an initial candidate frame G at time t1 for another pedestrian of the multiple pedestrians, and Fig. 4(b) is a schematic diagram of the adjusted candidate frame G' for that pedestrian, according to the embodiment of the present invention. In step S24, the classification unit 14 classifies, for example, the pedestrian in Figs. 3(a) and 3(b) as a rider (hereinafter "rider A") and the pedestrian in Figs. 4(a) and 4(b) as a pedestrian (hereinafter "pedestrian B").
The classification unit 14 takes the adjusted candidate frame parameters of each of the one or more riders at each time as the rider detection frame parameters. In this example, the classification unit 14 takes the frame parameters of the adjusted candidate frame F' at time t1 shown in Fig. 3(b) as the rider detection frame parameters of rider A; that is, the rider detection frame parameters include the frame barycentric coordinates (x_t1, y_t1, z_t1), the frame width w'_t1, the frame height h_t1, and the orientation α_t1 (not shown).
Further, the classification unit 14 uses the initial candidate frame parameter of each of the one or more pedestrians as the pedestrian detection frame parameter of each of the one or more pedestrians at each time. In this example, for example, the classification section 14 sets the initial frame candidate parameter of the initial frame candidate G at time t1 shown in fig. 4(a) as the pedestrian detection frame parameter of the pedestrian.
Here, as shown in Figs. 4(a) and 4(b), for pedestrian B the pedestrian detection frame parameters are the same as the initial candidate frame parameters; that is, the frame width of the pedestrian detection frame is smaller than that of the adjusted candidate frame, which narrows the detection range for the pedestrian.
Next, in step S25, the risk factor setting unit 15 sets a risk factor for each of the one or more riders based on the rider detection frame parameter, the state information of each of the one or more riders, and the change frequency of the orientation.
Here, for each of one or more riders, the risk factor setting unit 15 obtains the relative speed and the relative moving direction of each rider with respect to the vehicle from the frame barycentric coordinates in the rider detection frame parameters at the previous time and the frame barycentric coordinates in the rider detection frame parameters at the present time.
For example, take rider A in Fig. 3(b), with the previous time t1 and the current time t2. For rider A, the frame barycentric coordinates in the rider detection frame parameters at t1 are, for example, (x_t1, y_t1, z_t1), and at t2 they are, for example, (x_t2, y_t2, z_t2). From the change from (x_t1, y_t1, z_t1) to (x_t2, y_t2, z_t2), the relative speed v and the relative movement direction r of rider A with respect to the vehicle are obtained. The relative speeds and relative movement directions of the other riders are obtained in the same way.
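A minimal sketch of this computation (the function name and the sampling interval dt are ours; the patent gives no code): the relative speed is the displacement magnitude of the barycentre over dt, and the relative movement direction can be expressed as the angle of the displacement in the road plane:

```python
import math

def relative_motion(p_t1, p_t2, dt):
    """Relative speed v and movement direction r from two barycentres.

    p_t1, p_t2 are (x, y, z) frame barycentric coordinates in the vehicle's
    world coordinate system; dt is the time between the two samples.
    """
    dx, dy, dz = (b - a for a, b in zip(p_t1, p_t2))
    v = math.sqrt(dx * dx + dy * dy + dz * dz) / dt  # speed of the barycentre
    r = math.atan2(dy, dx)                           # direction in the x-y plane
    return v, r
```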
The risk coefficient setting unit 15 determines whether each rider is a new rider; if so, it sets the risk coefficient to 1, and otherwise it sets the risk coefficient based on the frame barycentric coordinates and the orientation in the rider detection frame parameters at the current time, the relative speed, the relative movement direction, the state information, and the change frequency of the orientation.
For example, taking rider A in Fig. 3(b), the risk coefficient setting unit 15 first determines whether rider A is a new rider. Specifically, it checks whether rider detection frame parameters exist for rider A at the previous time t1. If not, rider A is judged to be a new rider, i.e., a rider newly appearing at the current time t2; the risk coefficient W is set to 1, and the next rider is processed. If such parameters exist, rider A is judged not to be a new rider, the risk coefficient W is initialized to 0, and the risk coefficient is set based on the frame barycentric coordinates and the orientation in the rider detection frame parameters at the current time t2, the relative speed, the relative movement direction, the state information, and the change frequency of the orientation.
In this example, for example, if it is determined that the rider a is not a new rider, the risk coefficient W is initialized to 0, and the risk coefficient is set for the rider a according to the following conditions 1 to 5. The following describes conditions 1 to 5 in detail.
Condition 1: judging whether the frame barycentric coordinate in the rider detection frame parameter at the current time t2 is less than the alarm safety distance dwThe alarm safety distance dwCalculated by the following mathematical formula 1.
Where v is the relative speed of the rider A, tfBraking time required for the vehicle, tdThe reaction time of the driver of the vehicle, mu is the road on which the vehicle is locatedG is the acceleration of gravity, vhIs the current vehicle speed of the vehicle detected by the acquisition unit in step S21.
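The image for Mathematical Formula 1 does not survive in this text. A standard stopping-distance form consistent with the variables just defined (our assumption, not necessarily the patent's exact expression) would be:

```latex
d_w = v\,(t_f + t_d) + \frac{v_h^2}{2\mu g}
```

that is, the distance closed at relative speed v during the driver's reaction and braking lag, plus the vehicle's braking distance from speed v_h on a road with friction coefficient μ.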
Here, the value of the friction coefficient μ of the road may be determined from the captured image information, for example, as shown in the following mathematical formula 2.
As described above, the coordinate origin is the position directly below the center of the vehicle head, so the distance between the frame barycentric coordinates and the coordinate origin is the distance between the rider and the center of the vehicle head. From the frame barycentric coordinates (x_t2, y_t2, z_t2) of rider A in the rider detection frame parameters at the current time t2, the distance D between rider A and the center of the vehicle head is obtained. The distance D is then compared with the warning safety distance d_w calculated above to judge whether D is less than d_w.
Condition 2: it is judged from the relative moving direction of the rider a whether the rider a is approaching the vehicle. Here, it is possible to determine whether the rider a is approaching the vehicle, for example, from the relative movement direction r calculated as described above.
Condition 3: whether the state information of the rider a is an abnormal state.
The state information is a normal state or an abnormal state, and is determined to be an abnormal state when the risk factor setting unit 15 determines that the rider is a child based on the frame height in the rider detection frame parameter, determines that the distance between the rider and another rider is less than a predetermined safe distance from the image information, or determines that the rider is disturbed by an external object from the image information.
For example, the frame height of rider A in the rider detection frame parameters at the current time t2 is h_t2. From the frame height h_t2 it can be determined whether rider A is a child, and if so, the state is judged to be abnormal.
For example, taking the current time t2, the distance between rider A and another rider, say rider C, is the distance D_AC between the frame barycentric coordinates of rider A and those of rider C. The predetermined safe distance d_s is obtained, for example, according to Mathematical Formula 3 below.
Here k is the relative speed between rider A and rider C, t'_f is the braking time required by the rider, t'_d is the reaction time of the rider, μ is the friction coefficient of the road on which rider A is located (obtained, for example, from Mathematical Formula 2 above), and g is the acceleration of gravity. k may be obtained, for example, from the amount of change between the frame barycentric coordinates of rider A and those of rider C.
The distance D_AC is then compared with the predetermined safe distance d_s to judge whether D_AC is less than d_s. If so, the state is judged to be abnormal; otherwise it is judged to be normal.
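The image for Mathematical Formula 3 likewise does not survive in this text. By analogy with the warning-distance computation of Condition 1, a braking-distance form over the variables just defined (our assumption, not necessarily the patent's exact expression) would be:

```latex
d_s = k\,(t'_f + t'_d) + \frac{k^2}{2\mu g}
```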
In addition, it is determined from the image information whether rider A is disturbed by an external object, for example, whether an unknown object appears within a range of a predetermined angle starting from the orientation α of rider A.
When all three judgments are negative, the rider is judged to be in a normal state.
Condition 4: judge whether rider A can see the vehicle, based on the orientation α of rider A at the current time t2.
Here, for example, if the rider's orientation α is any one of rear, rear left, rear right, left, and right, that is, if α belongs to any of classes VI, V, VII, IV, and VIII shown in Fig. 5, it is judged that the rider cannot see the vehicle; otherwise it is judged that the rider can see the vehicle.
Condition 5: whether the frequency of change of orientation is greater than a predetermined threshold.
Here, for example, for rider A, the 10 orientations α at the 10 consecutive times before the current time t2 are taken out, and the number of changes among these orientations, that is, the change frequency of the orientation, is calculated.
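A minimal sketch of this computation (the window of 10 samples follows the text; counting a change whenever two consecutive orientations differ is an assumption, as the patent does not detail the counting rule):

```python
def orientation_change_frequency(orientations):
    """Count how many times the orientation changes between consecutive samples."""
    return sum(1 for prev, cur in zip(orientations, orientations[1:]) if prev != cur)

# Example: 10 orientation classes sampled at the 10 consecutive times before t2.
history = ["I", "I", "II", "II", "II", "I", "I", "I", "VIII", "VIII"]
freq = orientation_change_frequency(history)  # 3 changes: I->II, II->I, I->VIII
```

The resulting count would then be compared with the predetermined threshold of condition 5.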
The setting of the risk coefficient according to conditions 1 to 5 is explained in detail below, taking rider A as an example.
First, if the determination result of condition 1 or condition 2 is yes, that is, the distance obtained from the frame barycentric coordinates of rider A at the current time t2 is smaller than the warning safety distance, or rider A is approaching the vehicle, 1 is added to the initialized risk coefficient of 0, that is, W = 0 + 1 = 1; conversely, if the judgment results of both condition 1 and condition 2 are negative, the risk coefficient remains the initialized value, that is, W = 0. In this example, the determination result of condition 1 is yes, so W = 1.
Next, if the judgment result of condition 3 is yes, that is, the current state is an abnormal state, 1 is added to the risk coefficient obtained through the judgment of conditions 1 and 2; otherwise, that risk coefficient is maintained. In this example, the determination result of condition 3 is yes, so W = 1 + 1 = 2.
Finally, if the judgment result of condition 4 or condition 5 is yes, that is, it is judged that the rider cannot see the vehicle or that the change frequency of the orientation is greater than the predetermined threshold, 1 is further added to the risk coefficient obtained through the judgment of condition 3; conversely, if the judgment results of conditions 4 and 5 are both negative, that risk coefficient is not changed. For example, if the judgment results of conditions 4 and 5 are both negative, the risk coefficient obtained through condition 3 is not changed, that is, W = 2.
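The three accumulation steps above can be sketched as follows. This is a minimal illustration under stated assumptions, not the patent's implementation: the function name and boolean inputs are invented for clarity, and "cannot see the vehicle" is taken as the risk-increasing outcome of condition 4, consistent with the orientation classes of Fig. 5.

```python
def set_risk_coefficient(cond1_within_warning_distance: bool,
                         cond2_approaching_vehicle: bool,
                         cond3_abnormal_state: bool,
                         cond4_cannot_see_vehicle: bool,
                         cond5_orientation_changes_often: bool) -> int:
    """Accumulate the risk coefficient W from the five condition results."""
    w = 0  # initialized risk coefficient
    if cond1_within_warning_distance or cond2_approaching_vehicle:
        w += 1  # step 1: rider is close to, or approaching, the vehicle
    if cond3_abnormal_state:
        w += 1  # step 2: rider's state information is abnormal
    if cond4_cannot_see_vehicle or cond5_orientation_changes_often:
        w += 1  # step 3: rider cannot see the vehicle, or orientation is erratic
    return w

# Worked example from the text: condition 1 yes, condition 3 yes,
# conditions 4 and 5 no, giving W = 2.
w = set_risk_coefficient(True, False, True, False, False)
```

Each rider's coefficient is thus an integer in the range 0 to 3, updated at every time step.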
In this way, the risk coefficient of each rider at each time can be obtained, and each rider can be tracked according to the risk coefficient so as to alert the vehicle when necessary.
The risk coefficient setting unit 15 also sets a risk coefficient for each of the one or more pedestrians, based on the pedestrian detection frame parameters, the state information of each of the one or more pedestrians, and the change frequency of the orientation. For example, the risk coefficient may be set for pedestrian B in the same manner as the risk coefficient is set for the riders described above.
With the above device and method, the riders and the pedestrians among a plurality of pedestrians around the vehicle can be rapidly distinguished, so that a plurality of riders can be detected in real time.
While the present invention has been described in conjunction with specific embodiments, it is evident that many alternatives, modifications, and variations will be apparent to those skilled in the art in light of the foregoing description. Accordingly, it is intended that such alternatives, modifications, and variations be included within the spirit and scope of the appended claims.

Claims (12)

1. An apparatus for detecting a pedestrian around a vehicle, characterized by comprising:
an acquisition unit that acquires image information around the vehicle from a camera in the vehicle and detects a current vehicle speed of the vehicle;
an initial candidate frame parameter calculation unit that calculates initial candidate frame parameters of each of a plurality of pedestrians around the vehicle at each time according to the image information, the initial candidate frame parameters including an orientation, wherein the orientation is the angle of the pedestrian with respect to the camera;
an adjusting unit that adjusts the initial candidate frame parameters according to the orientation to obtain adjusted candidate frame parameters at each time;
a classification unit that classifies the plurality of pedestrians into one or more riders and one or more pedestrians according to the adjusted candidate frame parameters, and obtains rider detection frame parameters of each of the one or more riders at each time;
and a risk coefficient setting unit that sets a risk coefficient for each of the one or more riders based on the rider detection frame parameters, the state information of each of the one or more riders, and the change frequency of the orientation.
2. The apparatus of claim 1, wherein the initial candidate frame parameters further include frame barycentric coordinates, a frame width w, and a frame height h,
wherein the adjusted candidate frame parameters include the frame barycentric coordinates, an adjusted frame width w', the frame height h, and the orientation α, wherein
wherein the classification unit takes the adjusted candidate frame parameters of each of the one or more riders at each time as the rider detection frame parameters.
3. The apparatus of claim 1, wherein the classification unit takes the initial candidate frame parameters of each of the one or more pedestrians as the pedestrian detection frame parameters of each of the one or more pedestrians at each time,
wherein the risk coefficient setting unit sets a risk coefficient for each of the one or more pedestrians based on the pedestrian detection frame parameters, the state information of each of the one or more pedestrians, and the change frequency of the orientation.
4. The apparatus according to claim 2, wherein, for each of the one or more riders, the risk coefficient setting unit obtains a relative speed and a relative moving direction of the rider with respect to the vehicle, based on the frame barycentric coordinates in the rider detection frame parameters at the previous time and the frame barycentric coordinates in the rider detection frame parameters at the present time.
5. The apparatus according to claim 4, wherein the risk coefficient setting unit judges, for each of the riders, whether the rider is a new rider; if so, the risk coefficient is set to 1, and otherwise the risk coefficient is set based on the frame barycentric coordinates and the orientation in the rider detection frame parameters at the present time, the relative speed, the relative moving direction, the state information, and the change frequency of the orientation.
6. The apparatus according to claim 5, wherein the state information is a normal state or an abnormal state, and the state information is determined to be the abnormal state when the risk coefficient setting unit determines that the rider is a child based on the frame height in the rider detection frame parameters, determines from the image information that a distance between the rider and another rider is less than a predetermined safety distance, or determines from the image information that the rider is disturbed by an external object.
7. A method of detecting a pedestrian around a vehicle, the method comprising:
a) acquiring image information around the vehicle from a camera in the vehicle, and detecting the current speed of the vehicle;
b) calculating initial candidate frame parameters of a plurality of pedestrians around the vehicle at each time according to the image information, wherein the initial candidate frame parameters include an orientation, and the orientation is the angle of the pedestrian with respect to the camera;
c) adjusting the initial candidate frame parameters according to the orientation to obtain adjusted candidate frame parameters at each time;
d) classifying the plurality of pedestrians into one or more riders and one or more pedestrians according to the adjusted candidate frame parameters, and obtaining rider detection frame parameters of each of the one or more riders at each time;
e) setting a risk coefficient for each of the one or more riders according to the rider detection frame parameters, the state information of each of the one or more riders, and the change frequency of the orientation.
8. The method of claim 7, wherein the initial candidate frame parameters further include frame barycentric coordinates, a frame width w, and a frame height h,
wherein, in step c), the adjusted candidate frame parameters include the frame barycentric coordinates, the adjusted frame width w', the frame height h, and the orientation α, wherein
wherein, in step d), the adjusted candidate frame parameters of each of the one or more riders at each time are taken as the rider detection frame parameters.
9. The method of claim 7, wherein, in step d), the initial candidate frame parameters of each of the one or more pedestrians are taken as the pedestrian detection frame parameters of each of the one or more pedestrians at each time,
wherein, in step e), a risk coefficient is set for each of the one or more pedestrians based on the pedestrian detection frame parameters, the state information of each of the one or more pedestrians, and the change frequency of the orientation.
10. The method of claim 8, wherein, for each of the one or more riders, a relative speed and a relative moving direction of the rider with respect to the vehicle are obtained from the frame barycentric coordinates in the rider detection frame parameters at the previous time and the frame barycentric coordinates in the rider detection frame parameters at the current time.
11. The method according to claim 10, wherein, in step e), it is judged for each rider whether the rider is a new rider; if so, the risk coefficient is set to 1, and otherwise the risk coefficient is set based on the frame barycentric coordinates and the orientation in the rider detection frame parameters at the present time, the relative speed, the relative moving direction, the state information, and the change frequency of the orientation.
12. The method according to claim 11, wherein the state information is a normal state or an abnormal state, and is determined to be the abnormal state when it is determined that the rider is a child based on the frame height in the rider detection frame parameters, when it is determined from the image information that a distance between the rider and another rider is less than a predetermined safety distance, or when it is determined from the image information that the rider is disturbed by an external object.
CN201710234918.6A 2017-04-12 2017-04-12 The method and apparatus that the pedestrian of vehicle periphery is detected Pending CN108694363A (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
CN201710234918.6A CN108694363A (en) 2017-04-12 2017-04-12 The method and apparatus that the pedestrian of vehicle periphery is detected
PCT/JP2018/015194 WO2018190362A1 (en) 2017-04-12 2018-04-11 Method and device for detecting pedestrian around vehicle
JP2019512544A JP6756908B2 (en) 2017-04-12 2018-04-11 Methods and devices for detecting pedestrians around the vehicle

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710234918.6A CN108694363A (en) 2017-04-12 2017-04-12 The method and apparatus that the pedestrian of vehicle periphery is detected

Publications (1)

Publication Number Publication Date
CN108694363A true CN108694363A (en) 2018-10-23

Family

ID=63792644

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710234918.6A Pending CN108694363A (en) 2017-04-12 2017-04-12 The method and apparatus that the pedestrian of vehicle periphery is detected

Country Status (3)

Country Link
JP (1) JP6756908B2 (en)
CN (1) CN108694363A (en)
WO (1) WO2018190362A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111856493A (en) * 2019-04-25 2020-10-30 北醒(北京)光子科技有限公司 Camera triggering device and method based on laser radar
JP7655233B2 (en) * 2022-01-24 2025-04-02 株式会社豊田自動織機 Image processing device for human detection system

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009053925A (en) * 2007-08-27 2009-03-12 Toyota Motor Corp Behavior prediction device
CN105216792A (en) * 2014-06-12 2016-01-06 株式会社日立制作所 Obstacle target in surrounding environment is carried out to the method and apparatus of recognition and tracking
WO2017056382A1 (en) * 2015-09-29 2017-04-06 ソニー株式会社 Information processing device, information processing method, and program

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3136288A1 (en) * 2015-08-28 2017-03-01 Autoliv Development AB Vision system and method for a motor vehicle
JP6635188B2 (en) * 2016-03-18 2020-01-22 株式会社Jvcケンウッド Object recognition device, object recognition method, and object recognition program
JP6512164B2 (en) * 2016-04-22 2019-05-15 株式会社デンソー Object detection apparatus, object detection method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
ZHU Xichan et al.: "Estimation of time-to-collision between vehicles based on corner detection", Journal of Automotive Safety and Energy *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109409309A (en) * 2018-11-05 2019-03-01 电子科技大学 A kind of intelligent alarm system and method based on human testing
CN111433082A (en) * 2018-11-09 2020-07-17 北京嘀嘀无限科技发展有限公司 System and method for detecting in-vehicle conflicts
US11615545B2 (en) 2018-11-09 2023-03-28 Bejing Didi Infinity Technology And Development Co., Ltd. System and method for detecting in-vehicle conflicts
CN111429754A (en) * 2020-03-13 2020-07-17 南京航空航天大学 A risk assessment method for vehicle collision avoidance trajectory under pedestrian crossing conditions
CN115527074A (en) * 2022-11-29 2022-12-27 深圳依时货拉拉科技有限公司 Vehicle detection frame generation method and device and computer equipment
CN115527074B (en) * 2022-11-29 2023-03-07 深圳依时货拉拉科技有限公司 Vehicle detection frame generation method and device and computer equipment

Also Published As

Publication number Publication date
WO2018190362A1 (en) 2018-10-18
JP6756908B2 (en) 2020-09-16
JPWO2018190362A1 (en) 2020-01-16

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: Ibaraki

Applicant after: Hitachi astemo Co.,Ltd.

Address before: Ibaraki

Applicant before: HITACHI AUTOMOTIVE SYSTEMS, Ltd.

WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20181023
