
CN112651263A - Method and device for filtering background object - Google Patents

Method and device for filtering background object

Info

Publication number
CN112651263A
Authority
CN
China
Prior art keywords
frame
current frame
distance
previous
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910953716.6A
Other languages
Chinese (zh)
Inventor
王乐菲
底欣
张兆宇
田军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujitsu Ltd
Original Assignee
Fujitsu Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujitsu Ltd filed Critical Fujitsu Ltd
Priority to CN201910953716.6A
Priority to JP2020150819A
Publication of CN112651263A


Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00Scenes; Scene-specific elements
    • G06V20/50Context or environment of the image
    • G06V20/52Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G06V20/54Surveillance or monitoring of activities, e.g. for recognising suspicious objects of traffic, e.g. cars on the road, trains or boats
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06Systems determining position data of a target
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/50Systems of measurement based on relative movement of target
    • G01S13/58Velocity or trajectory determination systems; Sense-of-movement determination systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/02Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00
    • G01S7/41Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S13/00 using analysis of echo signal for target characterisation; Target signature; Target cross-section
    • G01S7/415Identification of targets based on measurements of movement associated with the target
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00Pattern recognition
    • G06F18/20Analysing
    • G06F18/25Fusion techniques
    • G06F18/254Fusion techniques of classification results, e.g. of results related to same input data
    • G06F18/256Fusion techniques of classification results, e.g. of results related to same input data of results relating to different input data, e.g. multimodal recognition
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00Geometric image transformations in the plane of the image
    • G06T3/04Context-preserving transformations, e.g. by using an importance map
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/10Segmentation; Edge detection
    • G06T7/194Segmentation; Edge detection involving foreground-background segmentation
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10004Still image; Photographic image
    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/10Image acquisition modality
    • G06T2207/10032Satellite or aerial image; Remote sensing
    • G06T2207/10044Radar image

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Remote Sensing (AREA)
  • Radar, Positioning & Navigation (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Data Mining & Analysis (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • General Engineering & Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Evolutionary Biology (AREA)
  • Multimedia (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Image Analysis (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

Embodiments of the present application provide a method and a device for filtering background objects. The method includes: associating the detection results of objects in a current frame and a previous frame from a radar, where the detection results include the position and/or velocity of the objects, and the association refers to making the detection results in the current frame and the previous frame that belong to the same object correspond to that object; and filtering the background objects in the current frame.

Figure 201910953716

Description

Method and device for filtering background object
Technical Field
The present application relates to the field of communications technologies, and in particular, to a method and an apparatus for filtering background objects.
Background
Generally, a camera is used for traffic monitoring. However, a camera may be affected by environmental factors such as light, rain, and fog, so its detection results may be incorrect. Roadside radars are typically installed at intersections or on one side of a road. A roadside radar detects driving conditions, such as the position and speed of a car, and is not affected by such environmental factors. Therefore, combining the detection results from a camera and a roadside radar is a trend in traffic monitoring.
It should be noted that the above background description is only for the convenience of clear and complete description of the technical solutions of the present application and for the understanding of those skilled in the art. Such solutions are not considered to be known to the person skilled in the art merely because they have been set forth in the background section of the present application.
Disclosure of Invention
The inventors have found that, for roadside radars, in addition to the objects to be detected, such as vehicles, some strong reflectors, such as metallic roadside signs, may also be detected by the radar. The information of these background objects needs to be filtered out in order to retain the information about the objects to be measured (e.g., vehicles). Furthermore, in order to fuse the detection results of the camera with the detection results of the radar, the background objects should also be filtered, because the fusion result does not require them.
In order to solve at least one of the above problems or other similar problems, embodiments of the present application provide a method and an apparatus for filtering background objects in radar detection results.
According to a first aspect of embodiments of the present application, there is provided a method of filtering background objects, wherein the method includes:
correlating the detection results of the objects in the current frame and the previous frame from the radar, wherein the detection results comprise the position and/or the speed of the object, and the correlation refers to corresponding the detection results of the same object in the current frame and the previous frame to the object;
and filtering the background object in the current frame.
According to a second aspect of embodiments of the present application, there is provided an apparatus for filtering background objects, wherein the apparatus comprises:
the device comprises a correlation unit, a processing unit and a processing unit, wherein the correlation unit is used for correlating detection results of objects in a current frame and a previous frame from a radar, the detection results comprise the position and/or the speed of the object, and the correlation refers to that the detection results of the same object in the current frame and the previous frame are corresponding to the object;
and the filtering unit is used for filtering the background object in the current frame.
According to a third aspect of embodiments of the present application, a terminal device is provided, where the terminal device includes the apparatus of the foregoing second aspect.
One of the beneficial effects of the embodiment of the application lies in: by filtering the background object in the radar detection result through the method of the embodiment of the application, the information of the object to be detected can be obtained, the background information is removed, and the fusion of the detection result of the camera and the detection result of the radar is facilitated.
Specific embodiments of the present application are disclosed in detail with reference to the following description and drawings, indicating the manner in which the principles of the application may be employed. It should be understood that the embodiments of the present application are not so limited in scope. The embodiments of the application include many variations, modifications and equivalents within the spirit and scope of the appended claims.
Features that are described and/or illustrated with respect to one embodiment may be used in the same way or in a similar way in one or more other embodiments, in combination with or instead of the features of the other embodiments.
It should be emphasized that the term "comprises/comprising" when used herein, is taken to specify the presence of stated features, integers, steps or components but does not preclude the presence or addition of one or more other features, integers, steps or components.
Drawings
Elements and features described in one drawing or one implementation of an embodiment of the application may be combined with elements and features shown in one or more other drawings or implementations. Moreover, in the drawings, like reference numerals designate corresponding parts throughout the several views, and may be used to designate corresponding parts for use in more than one embodiment.
The accompanying drawings, which are included to provide a further understanding of the embodiments of the application, are incorporated in and constitute a part of this specification, illustrate embodiments of the application and together with the description serve to explain the principles of the application. It is obvious that the drawings in the following description are only some embodiments of the application, and that for a person skilled in the art, other drawings can be derived from them without inventive effort. In the drawings:
FIG. 1 is a schematic diagram of a method of the first aspect of an embodiment of the present application;
FIG. 2 is a schematic diagram of one embodiment of operation 101 of the method of FIG. 1;
FIG. 3 is a schematic diagram of another embodiment of operation 101 of the method of FIG. 1;
FIG. 4 is a schematic diagram of one embodiment of operation 102 of the method of FIG. 1;
FIG. 5 is a schematic diagram of another embodiment of operation 102 of the method of FIG. 1;
FIG. 6 is a schematic diagram of one example of a result of filtering background objects;
FIG. 7 is a schematic view of an apparatus of the second aspect of the embodiments;
FIG. 8 is a schematic diagram of one embodiment of an association unit of the apparatus of FIG. 7;
FIG. 9 is a schematic diagram of another embodiment of an association unit of the apparatus of FIG. 7;
FIG. 10 is a schematic view of one embodiment of a filter unit of the apparatus of FIG. 7;
FIG. 11 is a schematic view of another embodiment of a filter unit of the apparatus of FIG. 7;
fig. 12 is a schematic diagram of an image processing apparatus according to an embodiment of the present application.
Detailed Description
The foregoing and other features of the present application will become apparent from the following description, taken in conjunction with the accompanying drawings. In the description and drawings, particular embodiments of the application are disclosed in detail as being indicative of some of the embodiments in which the principles of the application may be employed, it being understood that the application is not limited to the described embodiments, but, on the contrary, is intended to cover all modifications, variations, and equivalents falling within the scope of the appended claims.
In the embodiments of the present application, the terms "first", "second", and the like are used for distinguishing different elements by reference, but do not denote a spatial arrangement, a temporal order, or the like of the elements, and the elements should not be limited by the terms. The term "and/or" includes any and all combinations of one or more of the associated listed terms. The terms "comprising," "including," "having," and the like, refer to the presence of stated features, elements, components, and do not preclude the presence or addition of one or more other features, elements, components, and elements.
In the embodiments of the present application, the singular forms "a", "an", and the like include the plural forms and are to be construed broadly as "a" or "an" and not limited to the meaning of "a" or "an"; furthermore, the term "comprising" should be understood to include both the singular and the plural, unless the context clearly dictates otherwise. Further, the term "according to" should be understood as "at least partially according to … …," and the term "based on" should be understood as "based at least partially on … …," unless the context clearly dictates otherwise.
Various embodiments of the present application will be described below with reference to the drawings. These embodiments are merely exemplary and are not intended to limit the embodiments of the present application.
First aspect of the embodiments
A first aspect of an embodiment of the present application provides a method for filtering background objects, fig. 1 is a schematic diagram of an example of the method, and please refer to fig. 1, the method includes:
operation 101: correlating the detection results of the objects in the current frame and the previous frame from the radar, wherein the detection results comprise the position and/or the speed of the object, and the correlation refers to corresponding the detection results of the same object in the current frame and the previous frame to the object;
operation 102: and filtering the background object in the current frame.
In the embodiment of the application, the detection results of the roadside radar are periodically output as frames, and each frame contains information on N (N ≥ 0) objects. Here, it is assumed that the detection result includes the position (x, y) and the velocity v of the object. The time interval between two consecutive frames is constant and is denoted Δt. For convenience of explanation, the present embodiment is described taking the case N ≥ 1 as an example.
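The application presents no code; as an aid to reading the operations below, here is a minimal Python sketch of one possible representation of such frames. The names Detection and Frame are illustrative assumptions, not part of this application:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Detection:
    """One radar detection result: position (x, y) and velocity v."""
    x: float
    y: float
    v: float

@dataclass
class Frame:
    """One periodically output radar frame holding N (N >= 0) detections."""
    t_ms: float                 # timestamp in milliseconds
    detections: List[Detection]

# Two consecutive frames are separated by the constant period delta_t (ms).
```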
In the embodiment of the present application, in operation 101, the detection results of the object in the current frame and the previous frame from the radar are correlated, that is, whether the two frames contain the same object or not is determined, and the detection result belonging to the same object (the detection result of the object in the current frame and the detection result of the object in the previous frame) is associated with the object.
For example, assume that the current frame and the previous frame are two consecutive frames. At time t - Δt (Δt is the frame period), the previous frame, frame_{i-1}, contains N1 objects, denoted O_{i-1,1}, O_{i-1,2}, ..., O_{i-1,N1}. The detection result of each object includes an x-coordinate value, a y-coordinate value, and a velocity; the N1 objects in frame_{i-1} are represented by the vectors X_{i-1}, Y_{i-1} and V_{i-1}:

X_{i-1} = [x_{i-1,1}, x_{i-1,2}, ..., x_{i-1,N1}]
Y_{i-1} = [y_{i-1,1}, y_{i-1,2}, ..., y_{i-1,N1}]
V_{i-1} = [v_{i-1,1}, v_{i-1,2}, ..., v_{i-1,N1}]

At time t, the current frame, frame_i, contains N2 objects (N1 and N2 may or may not be equal), denoted O_{i,1}, O_{i,2}, ..., O_{i,N2}. The N2 objects in frame_i are represented by the vectors X_i, Y_i and V_i:

X_i = [x_{i,1}, x_{i,2}, ..., x_{i,N2}]
Y_i = [y_{i,1}, y_{i,2}, ..., y_{i,N2}]
V_i = [v_{i,1}, v_{i,2}, ..., v_{i,N2}]
In the embodiment of the present application, based on the above assumption, it is determined whether object O_{i-1,j} in frame_{i-1} and object O_{i,k} in frame_i (1 ≤ j ≤ N1, 1 ≤ k ≤ N2) are the same object; in other words, it is determined whether (x_{i-1,j}, y_{i-1,j}, v_{i-1,j}) and (x_{i,k}, y_{i,k}, v_{i,k}) are the detections of the same object in two consecutive frames.
In the embodiment of the present application, the criterion for determining whether the detection result belongs to the same object in two consecutive frames is: in two consecutive frames, the displacement of the object is not greater than a certain threshold. That is, assuming that a certain detection result in the previous frame and a certain detection result in the current frame belong to the same object, if the displacement of the object is not greater than a certain threshold value according to the detection result of the previous frame and the detection result of the current frame, the two detection results are considered to belong to the same object.
Fig. 2 is a diagram illustrating an example of operation 101. As shown in fig. 2, correlating the detection results of the objects in the current frame and the previous frame from the radar includes:
operation 201: for each object in the current frame, calculating the distance between the object and the radar, and the distances between all objects in the previous frame that have not yet been associated and the radar;
operation 202: comparing the distance between the object and the radar with the distances between all the unassociated objects in the previous frame and the radar, to obtain the minimum distance difference and the object in the previous frame corresponding to the minimum distance difference;
operation 203: comparing the minimum distance difference with a preset distance threshold, and determining, according to the comparison result, whether the object is a newly detected object or the same object as the object in the previous frame corresponding to the minimum distance difference.
In operation 203, if the minimum distance difference is smaller than a preset distance threshold, it is determined that the object in the current frame is the same object as the object in the previous frame corresponding to the minimum distance difference, and the detection result of the object in the current frame is associated with the detection result of the object in the previous frame; and if the minimum distance difference is larger than or equal to the distance threshold, the object in the current frame is considered as a newly detected object in the current frame.
In operation 201, the distance between an object and the radar may be calculated from the abscissa and ordinate values (i.e., the x-coordinate and y-coordinate values) of the object's detection result. Still taking the foregoing as an example, in the current frame_i, the distance between object O_{i,k} and the radar is denoted d_{i,k}, where

d_{i,k} = sqrt(x_{i,k}^2 + y_{i,k}^2)

and in the previous frame_{i-1}, the distance between object O_{i-1,j} and the radar is denoted d_{i-1,j}, where

d_{i-1,j} = sqrt(x_{i-1,j}^2 + y_{i-1,j}^2)
in operation 202, a previous frame is calculatedi-1Distance between middle object and radar and current frameiThe difference in distance between the medium object and the radar, called range difference, is denoted dfi,kAnd is and
dfi,k=di,k-di-1,j
wherein j is more than or equal to 1 and less than or equal to N1And k is more than or equal to 1 and less than or equal to N2。N1And N2Respectively the previous framei-1And a current frameiThe number of objects in (a). For example, for the current frameiFirst object O in (1)i,1
Figure BDA0002226568780000053
Figure BDA0002226568780000054
In the current frameiEach object has a minimum distance difference, denoted df _ mini,k(1≤k≤N2) The amount of the solvent to be used is, for example,
Figure BDA0002226568780000055
Figure BDA0002226568780000061
in operation 203, for a current frameiFor each object, the minimum distance difference df _ min is selectedi,kAnd comparing the absolute value of the minimum distance difference | df _ mini,kL and a distance threshold THd. If | df _ mini,k|<THdCorresponding to the minimum distance difference df _ mini,kOf the previous framei-1Object O in (1)i-1,jAnd a current frameiObject O in (1)i,kIs the same object in two consecutive frames, thenAnd correlating the two objects and the detection results thereof. If | df _ mini,k|≥THdWill correspond to the minimum distance difference df _ mini,kOf the current frameiObject O in (1)i,kAn object newly detected by the radar is determined.
Then, the minimum distance differences df_min_{i,k} corresponding to the other (remaining) objects in the current frame_i are calculated again in the manner described above; the newly selected minimum distance difference is denoted df_min'_{i,k}, and |df_min'_{i,k}| is compared with TH_d. The calculation of the minimum distance difference for the remaining objects of frame_i and the comparison of its absolute value with TH_d are repeated until every object in the current frame_i has been determined (that is, associated with an object in the previous frame_{i-1}, or identified as a newly detected object). In this way, different objects can be distinguished.
In the embodiment of the application, if object O_{i-1,j} in the previous frame_{i-1} has already been associated with object O_{i,k} in the current frame_i, then O_{i-1,j} is not included in the recalculation. Likewise, since object O_{i,k} in the current frame_i has either already been associated with an object of the previous frame_{i-1} (e.g., O_{i-1,j}) or been determined to be a newly detected object, O_{i,k} is not included in the recalculation either.
In the embodiment of the application, through the association, the radar detection results of the same object can be arranged in time order, and the detection results of the same object can be aligned. For example, when object O_{i,j} in the current frame_i and object O_{i-1,j} in the previous frame_{i-1} are the same object, their two detection results are associated with each other. The manner of association is not limited in the present application; it may take the form of a table or another form, which is not described further here.
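A minimal Python sketch of operations 201 to 203 under the assumptions above: the radar sits at the origin, and detections are given as (x, y) pairs. Processing the current-frame objects in index order is an illustrative choice; the application does not fix the order of the repeated calculations:

```python
import math
from typing import Dict, List, Optional, Tuple

def radar_distance(x: float, y: float) -> float:
    # Distance from an object to the radar, taking the radar as the origin.
    return math.hypot(x, y)

def associate_frames(prev_xy: List[Tuple[float, float]],
                     curr_xy: List[Tuple[float, float]],
                     th_d: float) -> Dict[int, Optional[int]]:
    """Operations 201-203: map each current-frame object index k to the
    previous-frame index j it is associated with, or to None if it is a
    newly detected object. Once associated, a previous-frame object is
    excluded from later comparisons, as the text requires."""
    d_prev = [radar_distance(x, y) for x, y in prev_xy]
    d_curr = [radar_distance(x, y) for x, y in curr_xy]
    unassociated = set(range(len(prev_xy)))
    result: Dict[int, Optional[int]] = {}
    for k, dk in enumerate(d_curr):
        if not unassociated:
            result[k] = None                      # nothing left to match
            continue
        # Operation 202: minimum distance difference df_min_{i,k}.
        j = min(unassociated, key=lambda jj: abs(dk - d_prev[jj]))
        # Operation 203: compare |df_min_{i,k}| with the threshold TH_d.
        if abs(dk - d_prev[j]) < th_d:
            result[k] = j                         # same object: associate
            unassociated.discard(j)
        else:
            result[k] = None                      # newly detected object
    return result
```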
In this embodiment, the distance threshold may be determined according to the speed of the object to be detected, where the object to be detected may be a vehicle such as an automobile, an electric tricycle, or an electric bicycle, or may be another object to be detected.
In one embodiment, if the current frame and the previous frame are two consecutive frames, the distance threshold may be set as TH_d = v·Δt/3600, where TH_d is the distance threshold in meters, v is a preset speed in km/h, and Δt is the frame period in ms.
In another embodiment, if the current frame and the previous frame are two non-consecutive frames, the distance threshold may be set as TH_d = M·v·Δt/3600, where TH_d is the distance threshold in meters, M is the number of frames between the current frame and the previous frame, v is a preset speed in km/h, and Δt is the frame period in ms.
In the above two embodiments, v may be the velocity of the object to be detected; it may be the maximum velocity or the average velocity of the object to be detected, or another velocity determined according to the requirements of the detection, to which the present application is not limited.
For example, assume that the object to be detected is an automobile, the maximum speed at which an automobile drives through the intersection is v km/h, and the frame period is Δt. Then, for two consecutive frames (that is, the current frame and the previous frame are consecutive), the distance threshold may be set as TH_d = v·Δt/3600. If the maximum speed of the automobile is 60 km/h and the frame period is 60 ms, then TH_d = 1 m, which indicates that the automobile's displacement within 60 ms does not exceed 1 m; if the displacement is more than 1 m, the detections are confirmed to be two different objects.
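The threshold formulas and this worked example fit in a few lines; the function name distance_threshold is an illustrative assumption:

```python
def distance_threshold(v_kmh: float, dt_ms: float, m_frames: int = 1) -> float:
    """TH_d = M * v * dt / 3600 in meters, with v in km/h and dt in ms.
    M is the number of frames between the current and the previous frame
    (M = 1 when the two frames are consecutive)."""
    return m_frames * v_kmh * dt_ms / 3600.0

# Worked example from the text: 60 km/h at a 60 ms frame period gives 1 m.
assert distance_threshold(60.0, 60.0) == 1.0
```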
Fig. 3 is a diagram illustrating another example of operation 101. As shown in fig. 3, correlating the detection results of the objects in the current frame and the previous frame from the radar includes:
operation 301: for each object in the current frame, calculating the distances between the object and all unassociated objects in the previous frame, to obtain the minimum distance;
operation 302: comparing the minimum distance with a preset distance threshold, and determining, according to the comparison result, whether the object is a newly detected object or the same object as the object in the previous frame corresponding to the minimum distance.
In operation 301, an unassociated object is an object in the previous frame that, through previous calculation or comparison, has not been found to belong to the same object as any object in the current frame; that is, it has not yet been made to correspond to some object in the current frame.
In operation 301, the distance between objects may be calculated from their abscissa and ordinate values, which is not described further here. Assume that there are three unassociated objects A, B and C in the previous frame. For object A' of the current frame, the distance d_{A'→A} between A' and A, the distance d_{A'→B} between A' and B, and the distance d_{A'→C} between A' and C are calculated, and the minimum distance is obtained; assume the minimum distance is d_{A'→A}.
In operation 302, if the minimum distance d_{A'→A} is less than the preset distance threshold, object A' and object A in the previous frame corresponding to the minimum distance are considered to be the same object, and the detection result of object A' in the current frame is associated with the detection result of object A in the previous frame (namely, the detection result of A); if the minimum distance d_{A'→A} is greater than or equal to the distance threshold, object A' is considered to be a newly detected object in the current frame.
In the example of fig. 3, the distance threshold is set in the same manner as in the example of fig. 2, and is not described here again.
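A sketch of this second association variant (operations 301 and 302), under the same radar-independent (x, y) assumptions as the earlier sketch; the function name is illustrative:

```python
import math
from typing import Dict, List, Optional, Tuple

def associate_by_nearest_object(prev_xy: List[Tuple[float, float]],
                                curr_xy: List[Tuple[float, float]],
                                th_d: float) -> Dict[int, Optional[int]]:
    """Operations 301-302: for each current-frame object, take the nearest
    unassociated previous-frame object and accept the match only if that
    minimum distance is below the threshold TH_d."""
    unassociated = set(range(len(prev_xy)))
    result: Dict[int, Optional[int]] = {}
    for k, (xk, yk) in enumerate(curr_xy):
        if not unassociated:
            result[k] = None
            continue
        def dist(j: int) -> float:
            # Object-to-object distance from coordinate values.
            return math.hypot(xk - prev_xy[j][0], yk - prev_xy[j][1])
        j = min(unassociated, key=dist)
        if dist(j) < th_d:
            result[k] = j
            unassociated.discard(j)
        else:
            result[k] = None    # newly detected object in the current frame
    return result
```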
In the embodiment of the present application, in operation 102, the background objects in the current frame are filtered. A background object is an object in the current frame that is not an object to be detected; for example, if the object to be detected is a running vehicle, the background objects may be walking pedestrians, moving bicycles, electric bicycles, and the like. Since these background objects are not needed, the present application filters them from the current frame according to the association result of operation 101 (that is, which objects in the current frame are associated with objects in the previous frame, i.e., belong to the same objects, and which are newly detected objects).
Fig. 4 is a diagram illustrating an example of operation 102, where filtering the background object in the current frame, as shown in fig. 4, may include:
operation 401: judging whether each object in the current frame meets the following first condition and second condition or not;
operation 402: if yes, filtering the object; if the first condition or the second condition is not met, retaining the object or making a further determination by other means.
The first condition is: the absolute value of the velocity of the object is less than a velocity threshold;
the second condition is any one of the following: (2-1) the object was filtered in the previous n frames; (2-2) the distance or azimuth angle between the object and the radar does not increase or increases monotonically; (2-3) the distance or azimuth angle between the object and the radar does not decrease or decreases monotonically; (2-4) the displacement of the object does not increase or increases monotonically; and (2-5) the displacement of the object does not decrease or decreases monotonically.
The first condition limits which objects can be filtered, namely objects whose speed is less than the speed threshold. Condition (2-1) indicates that if the object was filtered previously, it should also be filtered in the next frame; conditions (2-2) and (2-3) indicate that if the object continuously moves toward the radar or continuously moves away from it, it can be regarded as a car (an object to be measured) and retained, and is otherwise filtered; conditions (2-4) and (2-5) indicate that the object should be filtered if its position relative to other objects is unchanged.
In the embodiment of the present application, the speed threshold may be given empirically; for example, it may be related to the speed of the objects to be filtered (background objects). If the objects to be filtered are walking pedestrians, the speed threshold may be set to a typical walking speed. For example, the walking speed of an adult is about 1 m/s; if the speed threshold is set to 1 m/s, the speed of a background object is less than 1 m/s. For convenience of description, the speed threshold is denoted TH_v; it may be set to other values as needed and, depending on the situation, may be a negative number.
In the embodiment of the present application, if the absolute value of the speed of the object is lower than the speed threshold and, in addition, the object was filtered in the previous n frames, or the distance or azimuth angle between the object and the radar does not increase or increases monotonically, or the distance or azimuth angle between the object and the radar does not decrease or decreases monotonically, or the displacement of the object does not increase or increases monotonically, or the displacement of the object does not decrease or decreases monotonically, the object is considered not to be the object to be measured and should be filtered.
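As an illustration, the following is a minimal Python sketch of one reading of operations 401 and 402, covering condition (2-1) and the distance-based conditions (2-2)/(2-3). Interpreting "continuously moves toward or away from the radar" as strict monotonicity of the object's distances to the radar over the tracked frames, and the function names, are assumptions of this sketch, not definitions from the application:

```python
from typing import List

def strictly_monotone(values: List[float], increasing: bool) -> bool:
    # True if the sequence moves strictly in one direction throughout.
    return all((b > a) if increasing else (b < a)
               for a, b in zip(values, values[1:]))

def should_filter(speed: float, th_v: float,
                  filtered_in_previous_frames: bool,
                  radar_distances: List[float]) -> bool:
    """One reading of operations 401-402: filter the object only if the
    first condition (|speed| < TH_v) holds together with a variant of the
    second condition; here (2-1) and the distance-based (2-2)/(2-3)."""
    if abs(speed) >= th_v:
        return False                  # first condition not met: retain
    if filtered_in_previous_frames:   # (2-1): filtered in the previous n frames
        return True
    # Per the text, an object moving steadily toward or steadily away from
    # the radar is regarded as an object to be measured (a car) and retained.
    steady = (strictly_monotone(radar_distances, increasing=True)
              or strictly_monotone(radar_distances, increasing=False))
    return not steady
```

A caller would accumulate radar_distances per object across frames using the association of operation 101; azimuth-based and displacement-based variants of conditions (2-2) to (2-5) follow the same pattern.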
Fig. 5 is a diagram illustrating another example of operation 102, where filtering the background object in the current frame, as shown in fig. 5, may include:
operation 501: for each object in the current frame, marking the label of the object according to the detection result and/or the correlation result of the object;
operation 502: and filtering background objects in the current frame by using a sliding window W, wherein the sliding window W is the number of frames which are used as references when the background objects are filtered.
In the embodiment of the present application, each object in each frame is assigned an initial label; for example, the initial label of each object is set to 0.
In operation 501, marking the label of the object may include:
if the object is a newly detected object in the current frame_i, its label is set to a certain value (called the first value), e.g., a certain negative number;
if the absolute value of the velocity of the object is less than the velocity threshold TH_v, the label of the object in the previous frame_{i-1} is the first value (e.g., a negative number), and the absolute value of the velocity of the object in the previous frame_{i-1} is also less than TH_v (meaning that the speed of the object is too low for it to be the object to be measured, e.g., a car), then the label of the object is set to another value (called the second value), e.g., a certain positive number;
if the absolute value of the velocity of the object is less than the velocity threshold TH_v and the label of the object in the previous frame_{i-1} is the second value (e.g., a positive number), which indicates that the speed of the object in the previous frame was too low for it to be the object to be measured (e.g., a car), then the label of the object is set to the second value (e.g., a positive number).
In operation 501, if the label of an object is the first value, the object is a newly detected object in this frame; if the label of an object is the second value, the object is not the object to be measured and may need to be filtered.
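A minimal sketch of the labeling rules of operation 501, using -1 for the first value, +1 for the second value, and 0 for the initial label, following the text's examples of a negative first value and a positive second value. Falling back to the initial label when no rule applies is an assumption of the sketch:

```python
NEW_OBJECT = -1   # "first value": newly detected in this frame
LOW_SPEED = 1     # "second value": speed too low for the measured object
INITIAL = 0       # initial label assigned to every object

def mark_label(is_new: bool, speed: float, th_v: float,
               prev_label: int, prev_speed: float) -> int:
    """Operation 501: label an object from its detection and association
    results in the current and previous frames."""
    if is_new:
        return NEW_OBJECT
    if abs(speed) < th_v:
        if prev_label == NEW_OBJECT and abs(prev_speed) < th_v:
            return LOW_SPEED
        if prev_label == LOW_SPEED:
            return LOW_SPEED
    return INITIAL   # no rule applies: the label keeps its initial value
```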
In operation 502, the background objects in the current frame may be filtered using a sliding window. Assuming the size of the sliding window is W, filtering the background objects in the current frame may include:
if the current frame is one of the first W frames covered by the sliding window W, and the absolute value of the speed of the object in the current frame is less than the speed threshold TH_v, the object is considered to be a background object and is filtered;
if the current frame is not one of the first W frames, the absolute value of the speed of the object in the current frame is less than the speed threshold TH_v, and the label of the object is not the initial value (0), e.g., it is the aforementioned first value (a negative number) or the aforementioned second value (a positive number), the object is considered to be a background object and is filtered;
if the current frame is not one of the first W frames, the absolute value of the speed of the object in the current frame is less than the speed threshold TH_v, and the label of the object is the initial value (0), the object is filtered according to a predetermined number of frames before the current frame.
For example, a frame in which the label of the object is not the initial value (0) is searched for among the frames preceding the current frame and is taken as a reference frame, and the number of frames between the reference frame and the current frame is defined as FN. If FN < W, and within these FN frames the distance or azimuth angle between the object and the radar does not increase or increases monotonically, or does not decrease or decreases monotonically, or the displacement of the object does not increase or increases monotonically, or does not decrease or decreases monotonically, the object is considered to be a background object and is filtered. If FN ≥ W, and within the W frames the distance or azimuth angle between the object and the radar does not increase or increases monotonically, or does not decrease or decreases monotonically, or the displacement of the object does not increase or increases monotonically, or does not decrease or decreases monotonically, the object is likewise considered to be a background object and is filtered.
That is, in the other frames, if the absolute value of the speed of the object is below the speed threshold TH_v and the label of the object equals 0, the previous frames are searched until a frame is found in which the label of the object is not 0; the number of frames between the two frames is defined as FN, and according to the size relation between FN and W, whether to filter the object is determined from its change over the frames in this small interval. If the change of the object is small (for example, within the frames of this small interval the distance or azimuth angle between the object and the radar does not increase or increases monotonically or does not decrease or decreases monotonically, or the displacement of the object does not increase or increases monotonically or does not decrease or decreases monotonically), the displacement change of the object is considered small, the probability that it is a background object is large, and it needs to be filtered.
In the embodiment of the present application, the labels of filtered objects may be set to the second value (e.g., a positive number) as a reference for filtering background objects in subsequent frames.
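Putting operation 502 together, the following sketch assumes the caller keeps, for each object, a per-frame history of its labels and of its distances to the radar (built via the association of operation 101), and reuses the strict-monotonicity reading of steady motion from the earlier sketch; the FN bookkeeping shown here is one plausible implementation of the description above:

```python
from typing import List

def filter_in_current_frame(frame_idx: int, w: int,
                            speed: float, th_v: float,
                            labels: List[int],
                            radar_distances: List[float]) -> bool:
    """Operation 502: decide whether the object in the current frame is a
    background object. labels and radar_distances are the object's
    per-frame history up to and including the current frame; frame_idx is
    0-based, and label 0 is the initial value."""
    if abs(speed) >= th_v:
        return False               # only low-speed objects can be filtered
    if frame_idx < w:              # current frame is within the first W frames
        return True
    if labels[-1] != 0:            # label is not the initial value
        return True
    # Label is the initial value: find FN, the number of frames back to the
    # most recent frame whose label is not 0 (the reference frame).
    fn = len(labels)
    for n, lab in enumerate(reversed(labels[:-1]), start=1):
        if lab != 0:
            fn = n
            break
    span = radar_distances[-min(fn, w):]
    toward = all(b < a for a, b in zip(span, span[1:]))
    away = all(b > a for a, b in zip(span, span[1:]))
    return not (toward or away)    # no steady motion toward/away: background
```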
FIG. 6 is a schematic diagram of filtering background objects according to the method of the embodiment of the present application. Fig. 6(a) shows an image of a frame taken by a camera, in which a car is heading toward a roadside radar; fig. 6(b) shows the detection result of the roadside radar, from which it can be seen that, in addition to the car 601, the roadside radar detects three background objects; fig. 6(c) shows the result of filtering the background objects according to the method of the embodiment of the present application, from which it can be seen that only the car 601 remains after filtering.
Note that fig. 1 to 5 only schematically illustrate the present embodiment, but the present invention is not limited thereto. For example, the execution sequence of the steps may be adjusted as appropriate, and other steps may be added or some of the steps may be reduced. Those skilled in the art can make appropriate modifications in light of the above description, and are not limited to the description of fig. 1 to 5.
It should be noted that, the above description only describes each step or process related to the present application, but the present application is not limited thereto. The method may also comprise other steps or processes, reference being made to the prior art with regard to the details of these steps or processes.
According to the embodiment, the method filters the background object in the radar detection result, can obtain the information of the object to be detected, removes the background information, and is beneficial to the fusion of the detection result of the camera and the detection result of the radar.
Second aspect of the embodiments
A second aspect of embodiments of the present application provides an apparatus for filtering background objects. Since the principle of the apparatus for solving the problem is similar to the method of the first aspect of the embodiment, the specific implementation thereof may refer to the embodiment of the method of the first aspect of the embodiment, and the description thereof is not repeated where the contents are the same.
Fig. 7 is a schematic view of an apparatus for filtering background objects according to an embodiment of the present application, and as shown in fig. 7, the apparatus 700 includes: the device comprises a correlation unit 701 and a filtering unit 702, wherein the correlation unit 701 is used for correlating detection results of objects in a current frame and a previous frame from a radar, the detection results comprise positions and/or speeds of the objects, and the correlation refers to that the detection results of the same object in the current frame and the previous frame correspond to the object; the filtering unit 702 is used to filter the background object in the current frame.
Fig. 8 is a schematic diagram of an embodiment of the association unit 701 of the apparatus shown in fig. 7. As shown in fig. 8, in one embodiment, the association unit 701 includes: a first calculating unit 801, a first comparing unit 802, and a first associating unit 803. The first calculating unit 801 is configured to calculate, for each object in the current frame, the distance between the object and the radar and the distances between all unassociated objects in the previous frame and the radar; the first comparing unit 802 is configured to compare the distance between the object and the radar with the distances between all the unassociated objects in the previous frame and the radar, to obtain the minimum distance difference and the object in the previous frame corresponding to the minimum distance difference; the first associating unit 803 is configured to compare the minimum distance difference with a preset distance threshold: if the minimum distance difference is smaller than the preset distance threshold, the object and the object in the previous frame corresponding to the minimum distance difference are considered to be the same object, and the detection result of the object in the current frame is associated with the detection result of the object in the previous frame; if the minimum distance difference is greater than or equal to the distance threshold, the object is considered to be a newly detected object in the current frame.
Fig. 9 is a schematic diagram of another embodiment of the association unit 701 of the apparatus shown in fig. 7. As shown in fig. 9, in another embodiment, the association unit 701 includes: a second calculating unit 901 and a second associating unit 902. The second calculating unit 901 is configured to calculate, for each object in the current frame, the distances between the object and all unassociated objects in the previous frame, to obtain the minimum distance; the second associating unit 902 is configured to compare the minimum distance with a preset distance threshold: if the minimum distance is smaller than the preset distance threshold, the object and the object in the previous frame corresponding to the minimum distance are considered to be the same object, and the detection result of the object in the current frame is associated with the detection result of the object in the previous frame; if the minimum distance is greater than or equal to the distance threshold, the object is considered to be a newly detected object in the current frame.
In at least one embodiment, if the current frame and the previous frame are two consecutive frames, the distance threshold is TH_d = v·Δt/3600, where TH_d is the distance threshold in meters, v is a preset speed in km/h, and Δt is the frame period in ms.
In at least one embodiment, if the current frame and the previous frame are two non-consecutive frames, the distance threshold is TH_d = M·v·Δt/3600, where TH_d is the distance threshold in meters, M is the number of frames between the current frame and the previous frame, v is a preset speed in km/h, and Δt is the frame period in ms.
Fig. 10 is a schematic diagram of an embodiment of the filtering unit 702 of the apparatus shown in fig. 7. As shown in fig. 10, in an embodiment, the filtering unit 702 includes: a judging unit 1001 and a first filtering unit 1002. The judging unit 1001 judges, for each object in the current frame, whether the object satisfies a first condition and a second condition; the first filtering unit 1002 is configured to filter the object when the judging unit 1001 judges that the object satisfies the first condition and the second condition. The first condition is: the absolute value of the speed of the object is less than a speed threshold. The second condition is any one of the following: the object was filtered in the previous n frames; the distance or azimuth angle between the object and the radar does not increase or increases monotonically; the distance or azimuth angle between the object and the radar does not decrease or decreases monotonically; the displacement of the object does not increase or increases monotonically; and the displacement of the object does not decrease or decreases monotonically.
Fig. 11 is a schematic diagram of another embodiment of the filtering unit 702 of the apparatus shown in fig. 7. As shown in fig. 11, in another embodiment, the filtering unit 702 includes: a marking unit 1101 and a second filtering unit 1102. The marking unit 1101 marks the label of each object in the current frame according to the detection result and/or association result of the object; the second filtering unit 1102 filters the background objects in the current frame using a sliding window W, where W is the number of frames used as a reference when filtering background objects.
In the present embodiment, if the object is a newly detected object in the current frame, the labeling unit 1101 sets its label to a first value; if the absolute value of the speed of the object is less than the speed threshold, and the tag of the object in the previous frame is the above-mentioned first value, and the absolute value of the speed of the object in the previous frame is less than the speed threshold, the marking unit 1101 sets the tag of the object to a second value; if the absolute value of the velocity of the object is less than the velocity threshold and the tag of the object in the previous frame is the second value, the marking unit 1101 sets the tag of the object to the second value.
In this embodiment of the present application, if a current frame is one of previous W frames where the sliding window W is located, and an absolute value of a speed of an object in the current frame is smaller than a speed threshold, the second filtering unit 1102 regards the object as a background object and filters the background object; if the current frame is not one of the previous frames W where the sliding window W is located, and the absolute value of the speed of the object in the current frame is smaller than the speed threshold, and the tag of the object is not the initial value, the second filtering unit 1102 considers that the object is a background object, and filters it; if the current frame is not one of the previous frames W where the sliding window W is located, and the absolute value of the speed of the object in the current frame is smaller than the speed threshold, and the label of the object is an initial value, the second filtering unit 1102 filters the object according to a predetermined number of frames before the current frame.
In the embodiment of the present application, the second filtering unit 1102 filters the object according to a predetermined number of frames before the current frame, for example, the second filtering unit 1102 may search frames before the current frame for a frame in which a tag of the object is not an initial value as a reference frame, define the number of frames between the reference frame and the current frame as FN, and if FN < W and a distance or an azimuth angle between the object and the radar are not increased or monotonically increased or decreased or monotonically decreased in the FN frame, or a displacement of the object is not increased or monotonically increased or decreased or monotonically decreased in the FN frame, the second filtering unit 1102 regards the object as a background object and filters it; if FN ≧ W and the distance or azimuth between the object and the radar does not increase or monotonically increase or decrease or monotonically decrease in the W frame, or the displacement of the object does not increase or monotonically increase or decrease or monotonically decrease in the W frame, the second filtering unit 1102 regards the object as a background object, and filters it.
It should be noted that the above description only describes the components or modules related to the present application, but the present application is not limited thereto. The apparatus 700 for filtering background objects may further include other components or modules, and reference may be made to the related art regarding the specific contents of the components or modules.
According to the embodiment, the device filters the background object in the radar detection result, can obtain the information of the object to be detected, removes the background information, and is favorable for the fusion of the detection result of the camera and the detection result of the radar.
Third aspect of the embodiments
A third aspect of embodiments of the present application provides an image processing apparatus, which may be, for example, a computer, a server, a workstation, a laptop, a smartphone, or the like; the embodiments of the present application are not limited thereto.
Fig. 12 is a schematic diagram of an image processing device according to an embodiment of the present application. As shown in fig. 12, the image processing device 1200 may include: at least one interface (not shown), a processor (e.g., a central processing unit (CPU)) 1201, and a memory 1202, with the memory 1202 coupled to the processor 1201. The memory 1202 may store various data; furthermore, it stores a program 1203 for filtering background objects, which is executed under the control of the processor 1201, as well as various preset values, predetermined conditions, and the like.
In one embodiment, the functions of the apparatus 700 for filtering background objects according to the second aspect of the embodiment may be integrated into the processor 1201 to implement the method for filtering background objects according to the first aspect of the embodiment. For example, the processor 1201 may be configured to: correlate the detection results of objects in a current frame and a previous frame from the radar, and filter the background objects in the current frame.
in another embodiment, the apparatus 700 for filtering background objects according to the second aspect of the embodiment may be configured separately from the processor 1201, for example, the apparatus 700 for filtering background objects may be configured as a chip connected to the processor 1201, and the function of the apparatus 700 for determining filtering background objects may be implemented by the control of the processor 1201.
It should be noted that the image processing device 1200 may also include a display 1205 and an I/O device 1204, and it does not necessarily include all of the components shown in fig. 12. It may further include a camera (not shown) for acquiring input image frames, as well as other components not shown in fig. 12, for which reference may be made to the related art.
In the present embodiment, the processor 1201, sometimes also referred to as a controller or operation control, may include a microprocessor or another processor device and/or logic device; the processor 1201 receives input and controls the operation of the components of the image processing device 1200.
In the present embodiment, the memory 1202 may be, for example, one or more of a buffer, a flash memory, a hard drive, a removable medium, a volatile memory, a non-volatile memory, or another suitable device. It may store various kinds of information as well as programs for processing that information, and the processor 1201 may execute the programs stored in the memory 1202 to realize information storage, processing, and the like. The functions of the other parts are similar to those of the prior art and are not described in detail here. The components of the image processing device 1200 may be implemented by dedicated hardware, firmware, software, or combinations thereof without departing from the scope of the present application.
The image processing equipment of the embodiment of the application can filter background objects in the radar detection result, can obtain the information of the object to be detected, removes the background information, and is favorable for the fusion of the detection result of the camera and the detection result of the radar.
Embodiments of the present application also provide a computer-readable program, where the program, when executed in an image processing apparatus, causes the image processing apparatus to perform the method of the first aspect of the embodiments.
Embodiments of the present application further provide a storage medium storing a computer-readable program, where the computer-readable program enables an image processing apparatus to execute the method according to the first aspect of the embodiments.
The above apparatus and method of the present application may be implemented by hardware, or may be implemented by hardware in combination with software. The present application relates to a computer-readable program which, when executed by a logic component, enables the logic component to implement the above-described apparatus or constituent components, or to implement various methods or steps described above. The present application also relates to a storage medium such as a hard disk, a magnetic disk, an optical disk, a DVD, a flash memory, or the like, for storing the above program.
The methods/apparatus described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two. One or more of the functional block diagrams and/or one or more combinations of the functional block diagrams illustrated in the figures may correspond to individual software modules of the computer program flow or may correspond to individual hardware modules. These software modules may correspond to various steps shown in the figures, respectively. These hardware modules may be implemented, for example, by solidifying these software modules using a Field Programmable Gate Array (FPGA).
A software module may reside in RAM memory, flash memory, ROM memory, EPROM memory, EEPROM memory, registers, hard disk, a removable disk, a CD-ROM, or any other form of storage medium known in the art. A storage medium may be coupled to the processor such that the processor can read information from, and write information to, the storage medium; or the storage medium may be integral to the processor. The processor and the storage medium may reside in an ASIC. The software module may be stored in the memory of the mobile terminal or in a memory card that is insertable into the mobile terminal. For example, if the device (e.g., mobile terminal) employs a relatively large capacity MEGA-SIM card or a large capacity flash memory device, the software module may be stored in the MEGA-SIM card or the large capacity flash memory device.
One or more of the functional blocks and/or one or more combinations of the functional blocks described in the figures can be implemented as a general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, or any suitable combination thereof designed to perform the functions described herein. One or more of the functional blocks and/or one or more combinations of the functional blocks described in connection with the figures may also be implemented as a combination of computing devices, e.g., a combination of a DSP and a microprocessor, a plurality of microprocessors, one or more microprocessors in conjunction with a DSP communication, or any other such configuration.
The present application has been described in conjunction with specific embodiments, but it should be understood by those skilled in the art that these descriptions are intended to be illustrative, and not limiting. Various modifications and adaptations of the present application may occur to those skilled in the art based on the spirit and principles of the application and are within the scope of the application.
With regard to the embodiments disclosed above, the following supplementary notes are further disclosed:
1. a method of filtering background objects, wherein the method comprises:
associating the detection results of objects in a current frame and a previous frame from a radar, wherein the detection results comprise the position and/or the speed of the object, and associating means mapping the detection results of the same object in the current frame and the previous frame to that object;
and filtering the background object in the current frame.
2. The method according to supplementary note 1, wherein associating the detection results of the objects in the current frame and the previous frame from the radar comprises:
for each object in the current frame,
calculating the distance between the object and the radar, and the distances between the radar and all objects in the previous frame that have not yet been associated;
comparing the distance between the object and the radar with the distances between the radar and all unassociated objects in the previous frame, to obtain the minimum distance difference and the object in the previous frame corresponding to that minimum distance difference;
comparing the minimum distance difference with a preset distance threshold; if the minimum distance difference is smaller than the preset distance threshold, the object and the object in the previous frame corresponding to the minimum distance difference are regarded as the same object, and the detection result of the object in the current frame is associated with the detection result of the object in the previous frame;
and if the minimum distance difference is greater than or equal to the distance threshold, the object is regarded as a newly detected object in the current frame.
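By way of non-limiting illustration, the following sketch shows one possible reading of supplementary note 2 in Python. The record layout ('x', 'y' coordinates in meters relative to the radar), the function name, and the greedy one-pass matching order are assumptions made for the example, not details taken from the disclosure.

```python
import math

def associate_by_radial_distance(current_objs, prev_objs, th_d):
    """Greedy association of note 2: match each current object to the
    unassociated previous object whose radar distance is closest.

    current_objs / prev_objs: lists of dicts with 'x' and 'y' positions
    in meters relative to the radar (an assumed record layout).
    th_d: distance threshold TH_d in meters.
    Returns (current_index, previous_index or None) pairs; None marks a
    newly detected object.
    """
    unmatched = set(range(len(prev_objs)))
    pairs = []
    for i, obj in enumerate(current_objs):
        r_cur = math.hypot(obj['x'], obj['y'])          # object-to-radar distance
        best_j, best_diff = None, float('inf')
        for j in unmatched:
            r_prev = math.hypot(prev_objs[j]['x'], prev_objs[j]['y'])
            diff = abs(r_cur - r_prev)                  # distance difference
            if diff < best_diff:
                best_j, best_diff = j, diff
        if best_j is not None and best_diff < th_d:     # same object: associate
            pairs.append((i, best_j))
            unmatched.discard(best_j)
        else:                                           # newly detected object
            pairs.append((i, None))
    return pairs
```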
3. The method according to supplementary note 1, wherein associating the detection results of the objects in the current frame and the previous frame from the radar comprises:
for each object in the current frame,
calculating the distances between the object and all objects in the previous frame that have not yet been associated, to obtain the minimum distance;
comparing the minimum distance with a preset distance threshold; if the minimum distance is smaller than the preset distance threshold, the object and the object in the previous frame corresponding to the minimum distance are regarded as the same object, and the detection result of the object in the current frame is associated with the detection result of the object in the previous frame;
and if the minimum distance is greater than or equal to the distance threshold, the object is regarded as a newly detected object in the current frame.
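A compact variant of the same loop implements the position-based association of supplementary note 3; here the comparison uses the Euclidean distance between object positions rather than the difference of their radial distances to the radar. Field names are again illustrative.

```python
import math

def associate_by_position(current_objs, prev_objs, th_d):
    """Nearest-neighbour association of note 3: compare the Euclidean
    distance between object positions with the threshold TH_d."""
    unmatched = set(range(len(prev_objs)))
    pairs = []
    for i, obj in enumerate(current_objs):
        best_j, best_dist = None, float('inf')
        for j in unmatched:
            d = math.hypot(obj['x'] - prev_objs[j]['x'],
                           obj['y'] - prev_objs[j]['y'])
            if d < best_dist:
                best_j, best_dist = j, d
        if best_j is not None and best_dist < th_d:
            pairs.append((i, best_j))                   # same object
            unmatched.discard(best_j)
        else:
            pairs.append((i, None))                     # new in current frame
    return pairs
```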
4. The method according to supplementary note 2 or 3, wherein the current frame and the previous frame are two consecutive frames, and the distance threshold is TH_d = v·Δt/3600, where TH_d is the distance threshold in meters, v is a preset speed in km/h, and Δt is the frame period in ms.
5. The method according to supplementary note 2 or 3, wherein the current frame and the previous frame are two non-consecutive frames, and the distance threshold is TH_d = M·v·Δt/3600, where TH_d is the distance threshold in meters, M is the number of frames between the current frame and the previous frame, v is a preset speed in km/h, and Δt is the frame period in ms.
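The thresholds of supplementary notes 4 and 5 can be computed with one helper; M = 1 recovers the consecutive-frame case of note 4. This is a direct transcription of the formula, with illustrative parameter names.

```python
def distance_threshold(v_kmh, dt_ms, m=1):
    """TH_d = M * v * dt / 3600 (note 5); m=1 gives the consecutive-frame
    threshold of note 4. v_kmh: preset speed in km/h; dt_ms: frame period
    in ms; m: number of frames M between the two frames. Result in meters."""
    return m * v_kmh * dt_ms / 3600.0

# For example, with v = 5 km/h and a 50 ms frame period:
# distance_threshold(5, 50) == 0.0694... m, i.e. about 7 cm per frame.
```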
6. The method according to supplementary note 1, wherein filtering the background object in the current frame includes:
for each object in the current frame, determining whether the object satisfies a first condition and a second condition,
if so, filtering the object,
the first condition is: the absolute value of the speed of the object is less than a speed threshold;
the second condition is any one of the following: the object was filtered in the previous n frames; the distance or azimuth angle between the object and the radar is non-increasing or monotonically increasing; the distance or azimuth angle between the object and the radar is non-decreasing or monotonically decreasing; the displacement of the object is non-increasing or monotonically increasing; and the displacement of the object is non-decreasing or monotonically decreasing.
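As a hedged sketch, the check of supplementary note 6 might look as follows for a single object track. Only the radial-distance variant of the second condition is shown, and the record layout ('v', 'r', 'filtered') as well as the reading of "filtered in the previous n frames" as "in any of the previous n frames" are assumptions of the example.

```python
def is_background(obj_track, v_th, n):
    """Check of note 6 for one object. obj_track: the object's per-frame
    records, newest last; each record has 'v' (speed), 'r' (distance to
    the radar) and 'filtered' (bool). Field names are illustrative."""
    cur = obj_track[-1]
    if abs(cur['v']) >= v_th:                    # first condition fails
        return False
    if any(rec['filtered'] for rec in obj_track[-1 - n:-1]):
        return True                              # filtered in previous n frames
    radii = [rec['r'] for rec in obj_track]
    non_increasing = all(a >= b for a, b in zip(radii, radii[1:]))
    non_decreasing = all(a <= b for a, b in zip(radii, radii[1:]))
    return non_increasing or non_decreasing      # one variant of condition 2
```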
7. The method according to supplementary note 6, wherein the speed threshold is related to the speed of the background object.
8. The method according to supplementary note 1, wherein filtering the background object in the current frame includes:
for each object in the current frame, marking the label of the object according to the detection result and/or the association result of the object;
and filtering the background objects in the current frame by using a sliding window W, where the sliding window W is the number of frames used as reference when filtering the background objects.
9. The method of supplementary note 8, wherein marking the label of the object comprises:
if the object is a newly detected object in the current frame, setting the label of the object to a first value;
if the absolute value of the speed of the object is less than the speed threshold, the label of the object in the previous frame is the first value, and the absolute value of the speed of the object in the previous frame is less than the speed threshold, setting the label of the object to a second value;
if the absolute value of the speed of the object is less than the speed threshold and the label of the object in the previous frame is the second value, setting the label of the object to the second value.
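A minimal sketch of the labelling rules of supplementary note 9 follows. The concrete label values and the behaviour when no rule applies (the label is simply left unchanged) are assumptions, since the text does not specify them.

```python
FIRST, SECOND, INITIAL = 1, 2, 0   # illustrative label values

def update_label(cur, prev, v_th):
    """Labelling rules of note 9. cur / prev: the object's records in the
    current and previous frame, with 'is_new', 'v' and 'label' fields
    (assumed names). Records start out carrying the INITIAL label."""
    if cur['is_new']:                            # newly detected object
        cur['label'] = FIRST
    elif abs(cur['v']) < v_th and prev is not None and (
            prev['label'] == SECOND
            or (prev['label'] == FIRST and abs(prev['v']) < v_th)):
        cur['label'] = SECOND
    # otherwise: keep the label the object already carries (an assumption)
```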
10. The method according to supplementary note 8, wherein filtering the background object in the current frame comprises:
if the current frame is one of the first W frames covered by the sliding window W, and the absolute value of the speed of the object in the current frame is less than the speed threshold, the object is regarded as a background object and is filtered;
if the current frame is not one of the first W frames, the absolute value of the speed of the object in the current frame is less than the speed threshold, and the label of the object is not the initial value, the object is regarded as a background object and is filtered;
and if the current frame is not one of the first W frames, the absolute value of the speed of the object in the current frame is less than the speed threshold, and the label of the object is the initial value, the object is filtered according to a predetermined number of frames before the current frame.
11. The method according to supplementary note 10, wherein filtering the object according to a predetermined number of frames before a current frame, comprises:
searching the frames before the current frame for a frame in which the label of the object is not the initial value, taking that frame as a reference frame, and defining the number of frames between the reference frame and the current frame as FN;
if FN < W, and within those FN frames the distance or azimuth angle between the object and the radar is non-increasing or monotonically increasing, or non-decreasing or monotonically decreasing, or the displacement of the object is non-increasing or monotonically increasing, or non-decreasing or monotonically decreasing, the object is regarded as a background object and is filtered;
if FN ≥ W, and within those W frames the distance or azimuth angle between the object and the radar is non-increasing or monotonically increasing, or non-decreasing or monotonically decreasing, or the displacement of the object is non-increasing or monotonically increasing, or non-decreasing or monotonically decreasing, the object is regarded as a background object and is filtered.
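Supplementary notes 10 and 11 combine into a per-object decision such as the sketch below; the field names, the treatment of a missing reference frame, and the use of the radial distance for the monotonicity test are assumptions of the example.

```python
INITIAL = 0   # illustrative "initial" label value, as in the earlier sketch

def filter_with_window(track, W, v_th):
    """Decision of notes 10-11 for one object. track: the object's
    per-frame records, newest last; each record has 'frame' (global frame
    index, 0-based), 'v', 'label' and 'r' (distance to the radar).
    Returns True when the object is regarded as background."""
    cur = track[-1]
    if abs(cur['v']) >= v_th:
        return False                      # moving object: keep it
    if cur['frame'] < W:
        return True                       # within the first W frames
    if cur['label'] != INITIAL:
        return True
    # label is the initial value: search for a reference frame whose
    # label is not the initial value and count the frames in between
    ref = next((rec for rec in reversed(track[:-1])
                if rec['label'] != INITIAL), None)
    if ref is None:
        return False                      # no reference frame: keep (assumed)
    fn = cur['frame'] - ref['frame']
    span = fn if fn < W else W            # examine FN or W frames
    radii = [rec['r'] for rec in track[-(span + 1):]]
    non_inc = all(a >= b for a, b in zip(radii, radii[1:]))
    non_dec = all(a <= b for a, b in zip(radii, radii[1:]))
    return non_inc or non_dec             # stationary-looking track: background
```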
12. The method according to supplementary note 10 or 11, wherein the method further comprises: the label of the filtered out object is set to a second value.

Claims (10)

1. An apparatus for filtering background objects, wherein the apparatus comprises:
an associating unit, configured to associate the detection results of objects in a current frame and a previous frame from a radar, wherein the detection results comprise the position and/or the speed of the object, and associating means mapping the detection results of the same object in the current frame and the previous frame to that object;
and a filtering unit, configured to filter the background objects in the current frame.
2. The apparatus of claim 1, wherein the associating unit comprises:
a first calculating unit that calculates, for each object in the current frame, the distance between the object and the radar, and the distances between the radar and all objects in the previous frame that have not yet been associated;
a first comparing unit that compares the distance between the object and the radar with the distances between the radar and all unassociated objects in the previous frame, to obtain the minimum distance difference and the object in the previous frame corresponding to that minimum distance difference;
a first associating unit that compares the minimum distance difference with a preset distance threshold; if the minimum distance difference is smaller than the preset distance threshold, the object and the object in the previous frame corresponding to the minimum distance difference are regarded as the same object, and the detection result of the object in the current frame is associated with the detection result of the object in the previous frame; and if the minimum distance difference is greater than or equal to the distance threshold, the object is regarded as a newly detected object in the current frame.
3. The apparatus of claim 1, wherein the associating unit comprises:
a second calculating unit that calculates, for each object in the current frame, the distances between the object and all objects in the previous frame that have not yet been associated, to obtain the minimum distance;
a second associating unit that compares the minimum distance with a preset distance threshold; if the minimum distance is smaller than the preset distance threshold, the object and the object in the previous frame corresponding to the minimum distance are regarded as the same object, and the detection result of the object in the current frame is associated with the detection result of the object in the previous frame; and if the minimum distance is greater than or equal to the distance threshold, the object is regarded as a newly detected object in the current frame.
4. The apparatus according to claim 2 or 3, wherein the current frame and the previous frame are two consecutive frames, and the distance threshold is TH_d = v·Δt/3600, where TH_d is the distance threshold in meters, v is a preset speed in km/h, and Δt is the frame period in ms.
5. The apparatus according to claim 2 or 3, wherein the current frame and the previous frame are two non-consecutive frames, and the distance threshold is TH_d = M·v·Δt/3600, where TH_d is the distance threshold in meters, M is the number of frames between the current frame and the previous frame, v is a preset speed in km/h, and Δt is the frame period in ms.
6. The apparatus of claim 1, wherein the filtering unit comprises:
a determination unit that determines, for each object in the current frame, whether the object satisfies a first condition and a second condition,
a first filtering unit that filters the object when the determination unit determines that the object satisfies both conditions,
the first condition being: the absolute value of the speed of the object is less than a speed threshold,
the second condition being any one of the following: the object was filtered in the previous n frames; the distance or azimuth angle between the object and the radar is non-increasing or monotonically increasing; the distance or azimuth angle between the object and the radar is non-decreasing or monotonically decreasing; the displacement of the object is non-increasing or monotonically increasing; and the displacement of the object is non-decreasing or monotonically decreasing.
7. The apparatus of claim 1, wherein the filtering unit comprises:
a marking unit that marks, for each object in the current frame, the label of the object according to the detection result and/or the association result of the object;
and a second filtering unit for filtering the background object in the current frame by using a sliding window W, wherein the sliding window W is the number of frames used as a reference when filtering the background object.
8. The apparatus of claim 7, wherein,
the marking unit sets a label of the object to a first value if the object is a newly detected object in a current frame;
if the absolute value of the speed of the object is less than the speed threshold value, the label of the object in the previous frame is the first value, and the absolute value of the speed of the object in the previous frame is less than the speed threshold value, the marking unit sets the label of the object to a second value;
the marking unit sets the label of the object to a second value if the absolute value of the speed of the object is less than the speed threshold and the label of the object in the previous frame is the second value.
9. The apparatus of claim 7, wherein,
if the current frame is one of the first W frames covered by the sliding window W, and the absolute value of the speed of the object in the current frame is less than the speed threshold, the second filtering unit regards the object as a background object and filters it;
if the current frame is not one of the first W frames, the absolute value of the speed of the object in the current frame is less than the speed threshold, and the label of the object is not the initial value, the second filtering unit regards the object as a background object and filters it;
if the current frame is not one of the first W frames, the absolute value of the speed of the object in the current frame is less than the speed threshold, and the label of the object is the initial value, the second filtering unit filters the object according to a predetermined number of frames before the current frame.
10. The apparatus of claim 9, wherein the second filtering unit filters the object according to a predetermined number of frames before a current frame, comprising:
the second filtering unit searches the frames before the current frame for a frame in which the label of the object is not the initial value, takes that frame as a reference frame, and defines the number of frames between the reference frame and the current frame as FN; if FN < W, and within those FN frames the distance or azimuth angle between the object and the radar is non-increasing or monotonically increasing, or non-decreasing or monotonically decreasing, or the displacement of the object is non-increasing or monotonically increasing, or non-decreasing or monotonically decreasing, the second filtering unit regards the object as a background object and filters it; if FN ≥ W, and within those W frames the distance or azimuth angle between the object and the radar is non-increasing or monotonically increasing, or non-decreasing or monotonically decreasing, or the displacement of the object is non-increasing or monotonically increasing, or non-decreasing or monotonically decreasing, the second filtering unit regards the object as a background object and filters it.
CN201910953716.6A, filed 2019-10-09: Method and device for filtering background object (status: Pending; published as CN112651263A)

Priority Applications (2)

- CN201910953716.6A (published as CN112651263A), priority date 2019-10-09, filing date 2019-10-09: Method and device for filtering background object
- JP2020150819A (published as JP2021060977A), filing date 2020-09-08: Background object filtering method and device

Applications Claiming Priority (1)

- CN201910953716.6A, priority date 2019-10-09, filing date 2019-10-09: Method and device for filtering background object

Publications (1)

- CN112651263A, published 2021-04-13

Family ID: 75342526

Family Applications (1)

- CN201910953716.6A (CN112651263A), filed 2019-10-09: Method and device for filtering background object

Country Status (2)

- JP: JP2021060977A
- CN: CN112651263A

Patent Citations (15)

* Cited by examiner, † Cited by third party

- US20060082493A1 * (Furuno Electric Company Limited): Radar and similar apparatus; priority 2004-10-15, published 2006-04-20
- US20070080850A1 * (Kyoichi Abe): Object detection system and object detection method; priority 2003-09-11, published 2007-04-12
- US20110081043A1 * (Sabol Bruce M): Using video-based imagery for automated detection, tracking, and counting of moving objects, in particular those objects having image characteristics similar to background; priority 2009-10-07, published 2011-04-07
- US20130002865A1 * (Canon Kabushiki Kaisha): Mode removal for improved multi-modal background subtraction; priority 2011-06-30, published 2013-01-03
- US20130148853A1 * (Samsung Electronics Co., Ltd.): Image processing apparatus and image processing method; priority 2011-12-12, published 2013-06-13
- US20130266188A1 * (Xerox Corporation): Video-based method for detecting parking boundary violations; priority 2012-04-06, published 2013-10-10
- CN105184814A * (成都天奥信息科技有限公司): Moving target detecting and tracking method based on multi-frame radar image; priority 2015-07-27, published 2015-12-23
- US20160110613A1 * (King Abdullah University Of Science And Technology): System and method for crowd counting and tracking; priority 2014-10-20, published 2016-04-21
- WO2017099935A1 * (Microsoft Technology Licensing, Llc): Motion detection of object; priority 2015-12-10, published 2017-06-15
- CN108446622A * (海信集团有限公司): Detecting and tracking method, device and terminal for a target object; priority 2018-03-14, published 2018-08-24
- CN109478333A * (富士通株式会社): Target detection method, device and image processing device; priority 2016-09-30, published 2019-03-15
- CN109493367A * (浙江大华技术股份有限公司): Method and apparatus for tracking a target object; priority 2018-10-29, published 2019-03-19
- CN109658437A * (深圳神目信息技术有限公司): Method and device for quickly detecting a moving object; priority 2018-11-01, published 2019-04-19
- CN110033476A * (富士通株式会社): Target speed estimation method, device and image processing equipment; priority 2018-01-11, published 2019-07-19
- US20190266420A1 * (TuSimple): System and method for online real-time multi-object tracking; priority 2018-02-27, published 2019-08-29


Also Published As

- JP2021060977A, published 2021-04-15


Legal Events

- PB01: Publication
- SE01: Entry into force of request for substantive examination
- WD01: Invention patent application deemed withdrawn after publication (application publication date: 2021-04-13)