
CN107330915B - Target tracking method for AER image sensor - Google Patents

Target tracking method for AER image sensor

Info

Publication number
CN107330915B
CN107330915B CN201710442848.3A
Authority
CN
China
Prior art keywords
tracker
event
time
size
aer
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201710442848.3A
Other languages
Chinese (zh)
Other versions
CN107330915A (en)
Inventor
徐江涛
周义豪
高志远
聂凯明
高静
史再峰
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Tianjin Haixin Optoelectronic Technology Co ltd
Original Assignee
Tianjin University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tianjin University filed Critical Tianjin University
Priority to CN201710442848.3A priority Critical patent/CN107330915B/en
Publication of CN107330915A publication Critical patent/CN107330915A/en
Application granted granted Critical
Publication of CN107330915B publication Critical patent/CN107330915B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • GPHYSICS
    • G06COMPUTING OR CALCULATING; COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/20Analysis of motion
    • G06T7/246Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G06T7/248Analysis of motion using feature-based methods, e.g. the tracking of corners or segments involving reference images or patches

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)
  • Studio Devices (AREA)

Abstract

The present invention relates to the field of image recognition with AER image sensors. To meet the requirements that an AER-image-sensor-based recognition system places on a tracking algorithm, namely event-driven operation, low computational complexity, and provision of target states, the present invention proposes a target tracking method for AER image sensors. The technical solution adopted is an event-driven target tracking method for AER image sensors that generates different trackers to distinguish different targets, each tracker containing the following six parameters: the tracker center coordinate xc, size Rc, search range Rk, tracker creation time Tc, time of the last accepted event Tl, and activity value A. The invention is mainly applied to image recognition with image sensors.

Description

Target tracking method for AER image sensor

Technical Field

The present invention relates to the field of image recognition with AER image sensors, and in particular to a target tracking method for AER image sensors.

Background Art

In the field of real-time target recognition, the explosive demand for high resolution and high frame rate increases the pressure on information transmission and storage and places enormous demands on recognition processing speed. Optimizing back-end algorithms and architectures cannot fundamentally resolve these bottlenecks, which greatly limits the development of recognition systems. The AER (Address-Event Representation) image sensor is a bionic vision sensor that mimics the asynchronous, sparse, event-driven signal processing of the biological visual cortex. Its pixels encode light-intensity changes as events, and only pixels that detect a change in light intensity output events. This significantly reduces redundant data and offers a new way to break through the bottlenecks of real-time recognition systems.

Unlike frame-driven conventional image sensors, an AER image sensor outputs an asynchronous event stream in which each event carries the address of the pixel where the light intensity changed, the generation time, and the event polarity (intensity increase or decrease). Most existing image-processing algorithms, however, are frame-driven and cannot be applied directly to AER data. Slicing the event stream into fixed-duration time slices is a simple way to adapt such algorithms, but it forfeits the advantages of the AER sensor and increases the processing time of every stage of the algorithm, which harms the performance of a real-time recognition system.

In practical applications a target tracking algorithm is indispensable: it must perform target detection, report the real-time position of each target, and resist interference from noise. When multiple targets are moving, the key to a successful tracking algorithm is to keep locating the different targets accurately in many situations, for example when targets overlap. Furthermore, the event-driven concept is particularly important for tracking algorithms based on AER image sensors: a time-slicing algorithm may split an instantaneous action or a target across two time slices, degrading tracking accuracy.

Many event-driven cluster tracking algorithms already exist, but each performs best only in its specific application, and the commonly used tracking algorithms mainly aim at target detection. In target recognition, however, the state of each target determines whether the recognition requirements are met, which is particularly important for the continuous monitoring of moving targets. In addition, for the specific application of target recognition it is essential to simplify the algorithm as much as possible and minimize the consumption of computing resources. Designing an event-driven target tracking algorithm for AER image sensors that provides target-state information for back-end recognition is therefore critical.

Summary of the Invention

To overcome the deficiencies of the prior art and meet the requirements that an AER-image-sensor-based recognition system places on a tracking algorithm, namely event-driven operation, low computational complexity, and provision of target states, the present invention proposes a target tracking method for AER image sensors. The technical solution adopted is an event-driven target tracking method for AER image sensors that generates different trackers to distinguish different targets, each tracker containing the following six parameters: the tracker center coordinate xc, size Rc, search range Rk, tracker creation time Tc, time of the last accepted event Tl, and activity value A;

the events in the stream output by the AER sensor are processed in order of their generation time; for an event with coordinate xe, the Euclidean distance R between the event and each existing tracker is computed first; if the event lies within the search range Rk of a tracker, that tracker's parameters are updated, and if several trackers qualify, the tracker with the earliest creation time is updated, with (1) being the Euclidean-distance computation and tracker-selection criterion:

R=|xc-xe|<Rk (1)

if no tracker satisfies the criterion, a new tracker is created with center coordinate xe and all other parameters at their defaults; for a qualifying tracker, every parameter is updated except the creation time, which never changes, and the tracker center at time t moves from xc(t) to xc(t+Δt) according to (2):

xc(t+Δt)=xc(t)·αx+xe·(1-αx) (2)

where αx determines how fast the tracker center moves and Δt is the difference between the time of the current event and the time of the last event accepted by this tracker; the tracker size Rc depends on the computed Euclidean distance R, as given by (3):

Rc(t+Δt)=max(Rmin,Rc(t)·αR+R·(1-αR)) (3)

where αR determines how fast the tracker size changes and Rmin is the minimum tracker size; the search range Rk is proportional to the tracker size Rc, with proportionality factor αk. The activity value A of the tracker is computed according to (4):

A(t+Δt)=A(t)·e^(-Δt/τ)+p (4)

where p is a constant. Depending on its activity value, a tracker resides in either the hidden layer or the visible layer. Trackers in the hidden layer have low activity values; they are caused by noise or represent incomplete targets. Trackers in the visible layer meet the conditions for recognition, and the events belonging to their targets are forwarded to the subsequent modules of the recognition system. Every newly created tracker starts in the hidden layer; when its activity value rises above Aup, the tracker enters the visible layer, and if its activity value falls below Aup it drops back to the hidden layer. If the activity value falls below Adown, the tracker is deleted outright. If two trackers overlap, subsequent events preferentially select the one created earlier, and a tracker with an earlier creation time is also deleted if it receives no new events for a long time.

A lookup table is used: N exponential values are stored at intervals of the minimum time step tmin, and the required value is retrieved according to the computed Δt.

Features and beneficial effects of the invention:

The invention proposes a target tracking algorithm based on an AER image sensor. The algorithm is event-driven, solving the problem that traditional frame-driven tracking algorithms are ill-suited to AER data. For the specific application of target recognition, computational complexity is reduced as far as possible: activity values distinguish the states of different targets to determine whether the recognition requirements are met, and a lookup table reduces the cost of computing activity values. This tracking algorithm facilitates the use of AER image sensors for multi-target recognition.

Brief Description of the Drawings:

Figure 1: Schematic diagram of tracker-center movement.

Figure 2: Schematic diagram of tracker transitions between layers.

Detailed Description

The event-driven target tracking algorithm for AER image sensors proposed by the invention is as follows. The algorithm is event-driven: every event output by the sensor triggers a series of operations. The algorithm generates different trackers to distinguish different targets; each tracker contains the following six parameters: the tracker center coordinate xc, size Rc, search range Rk, tracker creation time Tc, time of the last accepted event Tl, and activity value A.
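As a minimal sketch, the six parameters described above can be held in a small record. The field names and the initial values below are illustrative assumptions, not prescribed by the patent text:

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class Tracker:
    """One tracker; fields mirror the six parameters named in the text."""
    xc: Tuple[float, float]  # center coordinate x_c
    Rc: float                # size R_c
    Rk: float                # search range R_k
    Tc: int                  # creation time T_c
    Tl: int                  # time of the last accepted event T_l
    A: float                 # activity value A

# A freshly created tracker centered on an event coordinate (values illustrative).
t = Tracker(xc=(64.0, 64.0), Rc=15.0, Rk=18.0, Tc=0, Tl=-1, A=1.0)
```

A per-event update would then read and modify these fields in place as each event arrives.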

The algorithm processes events in order of their generation time in the stream output by the AER sensor. For an event with coordinate xe, the Euclidean distance R to each existing tracker is computed first. If the event lies within the search range Rk of a tracker, that tracker's parameters are updated; if several trackers qualify, the tracker with the earliest creation time is updated. Equation (1) gives the Euclidean-distance computation and the tracker-selection criterion.

R=|xc-xe|<Rk (1)
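The matching step can be sketched as follows, assuming 2-D pixel coordinates and the earliest-creation-time tie-break described above; the dict keys are hypothetical names for the tracker fields:

```python
import math

def match_tracker(trackers, xe):
    """Find the tracker whose search range R_k contains event xe (formula (1));
    ties go to the earliest creation time Tc. Returns (tracker, R) or (None, None)."""
    candidates = []
    for trk in trackers:
        R = math.hypot(trk["xc"][0] - xe[0], trk["xc"][1] - xe[1])
        if R < trk["Rk"]:              # criterion (1): R = |x_c - x_e| < R_k
            candidates.append((trk, R))
    if not candidates:
        return None, None              # caller creates a default tracker centered at xe
    return min(candidates, key=lambda c: c[0]["Tc"])
```

When no tracker matches, the caller creates a new tracker at xe with default parameters, as described in the next paragraph.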

If no tracker satisfies the criterion, a new tracker is created with center coordinate xe and all other parameters at their defaults. For a qualifying tracker, every parameter is updated except the creation time, which never changes. The tracker center at time t moves from xc(t) to xc(t+Δt) according to equation (2); the movement is illustrated in Figure 1.

xc(t+Δt)=xc(t)·αx+xe·(1-αx) (2)

where αx determines how fast the tracker center moves and Δt is the difference between the time of the current event and the time of the last event accepted by this tracker. The tracker size Rc depends on the computed Euclidean distance R, as given by equation (3).

Rc(t+Δt)=max(Rmin,Rc(t)·αR+R·(1-αR)) (3)

where αR determines how fast the tracker size changes and Rmin is the minimum tracker size. The search range Rk is proportional to the tracker size Rc, with proportionality factor αk. The activity value A of the tracker is computed according to equation (4).

A(t+Δt)=A(t)·e^(-Δt/τ)+p (4)

where p is a constant related to the amount by which the activity value increases on each event. Depending on its activity value, a tracker resides in either the hidden layer or the visible layer. Trackers in the hidden layer have low activity values; they are generally caused by noise or represent incomplete targets. Trackers in the visible layer meet the conditions for recognition, and the events belonging to their targets are forwarded to the subsequent modules of the recognition system. The transitions between layers are shown in Figure 2. Every newly created tracker starts in the hidden layer; when its activity value rises above Aup, the tracker enters the visible layer, and if its activity value falls below Aup it drops back to the hidden layer. If the activity value falls below Adown, the tracker is deleted outright. A tracker created erroneously because of noise struggles to receive new events and is therefore deleted quickly. If two trackers overlap, subsequent events preferentially select the one created earlier, and a tracker with an earlier creation time is likewise deleted quickly if it receives no new events for a long time.
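The layer transitions above reduce to a simple rule on the activity value. A sketch, using the threshold names from the text with the numeric values assumed from the embodiment below:

```python
def tracker_layer(A, A_up=50.0, A_down=0.1):
    """Return the fate of a tracker given its activity value A:
    below A_down it is deleted, above A_up it is visible, otherwise hidden."""
    if A < A_down:
        return "deleted"
    return "visible" if A > A_up else "hidden"
```

A new tracker starts with a low activity value and therefore begins in the hidden layer; only sustained event support pushes it over A_up into the visible layer.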

Computing the activity value involves an exponential, which is computationally expensive. The invention therefore uses a lookup table that stores N exponential values at intervals of the minimum time step tmin; the required value is retrieved according to the computed Δt, greatly reducing the amount of computation.
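A sketch of the lookup-table scheme, assuming the activity update has the exponential-decay form A(t+Δt)=A(t)·e^(-Δt/τ)+p implied by the text; the constants are taken from the embodiment below:

```python
import math

T_MIN = 20      # ns, minimum time step t_min
TAU = 40_000    # ns, time constant τ
N = 10_000      # table entries, covering 0 ns .. 200000 ns at steps of t_min

# Precompute e^(-Δt/τ) at multiples of t_min instead of calling exp per event.
LUT = [math.exp(-(i * T_MIN) / TAU) for i in range(N)]

def update_activity(A, dt, p=1.0):
    """Decay A over dt nanoseconds via the table, then add the per-event increment p."""
    idx = min(int(dt) // T_MIN, N - 1)  # Δt beyond the table clamps to the last entry
    return A * LUT[idx] + p
```

Quantizing Δt to steps of t_min trades a small amount of precision for removing the exponential from the per-event path.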

According to the target tracking algorithm based on the AER image sensor proposed by the invention, for an image sensor with a resolution of 128×128 the default tracker parameters are as follows: the tracker size Rc is 15 pixels, the search range Rk is 18 pixels, the last-accepted-event time Tl is -1 ns, and the initial activity value is 1. To smooth the movement and the size changes of the tracker, the coordinate smoothing coefficient αx is 0.95, the size smoothing coefficient αR is 0.95, and the ratio αk of the search range Rk to the size Rc is 1.2. The two activity thresholds depend on the event rate; Aup may be set to 50 and Adown to 0.1, and the constant p may be set to 1. For the lookup table that simplifies the activity computation, the minimum time step tmin is 20 ns and τ is 40000 ns; the table stores values for the range 0 ns to 200000 ns at steps of tmin, 10000 values in total. The proposed target tracking algorithm is event-driven, has low computational complexity, and meets the needs of target recognition systems based on AER image sensors.
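Putting the per-event update of formulas (2) and (3) together with the embodiment's coefficients gives the following sketch; the value of R_min is not specified numerically in the text and is assumed here, and the dict keys are hypothetical field names:

```python
ALPHA_X = 0.95   # coordinate smoothing coefficient α_x
ALPHA_R = 0.95   # size smoothing coefficient α_R
ALPHA_K = 1.2    # ratio α_k of search range R_k to size R_c
R_MIN = 5.0      # minimum tracker size R_min (assumed; not given in the text)

def accept_event(trk, xe, R, t_event):
    """Update a matched tracker (dict with keys xc, Rc, Rk, Tl) for an event at xe
    lying at Euclidean distance R from the tracker center."""
    # (2): the center moves a fraction (1 - α_x) of the way toward the event
    trk["xc"] = tuple(c * ALPHA_X + e * (1 - ALPHA_X) for c, e in zip(trk["xc"], xe))
    # (3): the size tracks the event distance R, floored at R_min
    trk["Rc"] = max(R_MIN, trk["Rc"] * ALPHA_R + R * (1 - ALPHA_R))
    trk["Rk"] = ALPHA_K * trk["Rc"]     # search range proportional to size
    trk["Tl"] = t_event                  # record the time of the last accepted event
    return trk
```

With αx = αR = 0.95, each event nudges the center and size by only 5% toward the new evidence, which is what keeps the tracker trajectory smooth.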

Claims (2)

1. A target tracking method for an AER image sensor, characterized in that the method is event-driven and generates different trackers to distinguish different targets, each tracker containing the following six parameters: the tracker center coordinate xc, size Rc, search range Rk, tracker creation time Tc, time of the last accepted event Tl, and activity value A;

the events in the stream output by the AER sensor are processed in order of their generation time; for an event with coordinate xe, the Euclidean distance R between the event and each existing tracker is computed first; if the event lies within the search range Rk of a tracker, that tracker's parameters are updated, and if several trackers qualify, the tracker with the earliest creation time is updated, with (1) being the Euclidean-distance computation and tracker-selection criterion:

R=|xc-xe|<Rk (1)

if no tracker satisfies the criterion, a tracker with center coordinate xe and all other parameters at their defaults is created; for a qualifying tracker, every parameter is updated except the creation time, which never changes, and the tracker center at time t moves from xc(t) to xc(t+Δt) according to (2):

xc(t+Δt)=xc(t)·αx+xe·(1-αx) (2)

where αx determines how fast the tracker center moves and Δt is the difference between the time of the current event and the time of the last event accepted by the tracker; the tracker size Rc depends on the computed Euclidean distance R, as given by (3):

Rc(t+Δt)=max(Rmin,Rc(t)·αR+R·(1-αR)) (3)

where αR determines how fast the tracker size changes, Rmin is the minimum tracker size, and the search range Rk is proportional to the tracker size Rc with proportionality factor αk; the activity value A of the tracker is computed according to (4):

A(t+Δt)=A(t)·e^(-Δt/τ)+p (4)

where p is a constant; depending on its activity value, a tracker resides in either the hidden layer or the visible layer; trackers in the hidden layer have low activity values and are caused by noise or represent incomplete targets; trackers in the visible layer meet the conditions for recognition, and the events belonging to their targets are forwarded to the subsequent modules of the recognition system; every newly created tracker starts in the hidden layer; when its activity value rises above Aup, the tracker enters the visible layer, and if its activity value falls below Aup it drops back to the hidden layer; if the activity value falls below Adown, the tracker is deleted outright; if two trackers overlap, subsequent events preferentially select the one created earlier, and a tracker with an earlier creation time is also deleted if it receives no new events for a long time.
2. The target tracking method for an AER image sensor according to claim 1, characterized in that a lookup table stores N exponential values at intervals of the minimum time step tmin, and the required value is retrieved according to the computed Δt.
CN201710442848.3A 2017-06-13 2017-06-13 Target tracking method for AER image sensor Active CN107330915B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710442848.3A CN107330915B (en) 2017-06-13 2017-06-13 Target tracking method for AER image sensor

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710442848.3A CN107330915B (en) 2017-06-13 2017-06-13 Target tracking method for AER image sensor

Publications (2)

Publication Number Publication Date
CN107330915A CN107330915A (en) 2017-11-07
CN107330915B true CN107330915B (en) 2020-08-25

Family

ID=60195349

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710442848.3A Active CN107330915B (en) 2017-06-13 2017-06-13 Target tracking method for AER image sensor

Country Status (1)

Country Link
CN (1) CN107330915B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109785365B (en) * 2019-01-17 2021-05-04 西安电子科技大学 A real-time object tracking method for addressing event-driven unstructured signals

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1720719A (en) * 2002-12-03 2006-01-11 传感电子公司 Event driven video tracking system
CN104091349A (en) * 2014-06-17 2014-10-08 南京邮电大学 Robust target tracking method based on support vector machine
CN105469039A (en) * 2015-11-19 2016-04-06 天津大学 Target identification system based on AER image sensor
CN106127800A (en) * 2016-06-14 2016-11-16 天津大学 Real-time many object tracking methods based on AER imageing sensor and device
CN106407990A (en) * 2016-09-10 2017-02-15 天津大学 Bionic target identification system based on event driving

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8457392B2 (en) * 2007-07-27 2013-06-04 Sportvision, Inc. Identifying an object in an image using color profiles

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1720719A (en) * 2002-12-03 2006-01-11 传感电子公司 Event driven video tracking system
CN104091349A (en) * 2014-06-17 2014-10-08 南京邮电大学 Robust target tracking method based on support vector machine
CN105469039A (en) * 2015-11-19 2016-04-06 天津大学 Target identification system based on AER image sensor
CN106127800A (en) * 2016-06-14 2016-11-16 天津大学 Real-time many object tracking methods based on AER imageing sensor and device
CN106407990A (en) * 2016-09-10 2017-02-15 天津大学 Bionic target identification system based on event driving

Also Published As

Publication number Publication date
CN107330915A (en) 2017-11-07

Similar Documents

Publication Publication Date Title
CN109741318B (en) Real-time detection method of single-stage multi-scale specific target based on effective receptive field
CN104616318B (en) A kind of motion target tracking method in video sequence image
CN108921873B (en) Markov Decision Online Multi-target Tracking Method Based on Kernel Correlation Filtering Optimization
CN111611901A (en) Vehicle retrograde detection method, device, device and storage medium
CN116878501A (en) A high-precision positioning and mapping system and method based on multi-sensor fusion
CN116363163A (en) Space object detection and tracking method, system and storage medium based on event camera
CN108986130A (en) A Method of Infrared Weak and Small Target Detection in Air Background
JP2020109644A (en) Fall detection method, fall detection apparatus, and electronic device
CN107330915B (en) Target tracking method for AER image sensor
Azimi et al. PKS: A photogrammetric key-frame selection method for visual-inertial systems built on ORB-SLAM3
CN114954532A (en) Lane line determination method, device, equipment and storage medium
US20240377539A1 (en) Control of the brightness of a display
NGENI et al. Multiple object tracking (Mot) of vehicles to solve vehicle occlusion problems using deepsort and quantum computing
CN111160190B (en) Vehicle-mounted pedestrian detection-oriented classification auxiliary kernel correlation filtering tracking method
CN113012193B (en) A Multi-Pedestrian Tracking Method Based on Deep Learning
CN118570298A (en) A method for highly dynamic pose estimation of intelligent agents based on motion-encoded event plane representation
CN117665805A (en) A fine-grained multi-scale human posture estimation method based on radio frequency signals
CN117214857A (en) Tracking method of Gaussian multi-hypothesis multi-expansion target in three-dimensional scene
Su et al. An efficient human-following method by fusing kernelized correlation filter and depth information for mobile robot
Kim et al. Spiking cooperative network implemented on fpga for real-time event-based stereo system
CN116128959A (en) Human body joint point thermodynamic diagram triangularization method and system based on time domain prediction
CN115220056A (en) A device and method for estimating motion state of aerial target based on laser ranging
Fei et al. Improved KCF algorithm and its application to target lost prediction
CN114399730A (en) Traffic target detection model training and target detection method and edge computing device
CN116958142B (en) Target detection and tracking method based on compound eye event imaging and high-speed turntable

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20230322

Address after: 300392 Industrial Incubation 5-1559, North 2-204, No. 18, Haitai West Road, Huayuan Industrial Zone, Binhai New Area, Tianjin

Patentee after: Tianjin Haixin Optoelectronic Technology Co.,Ltd.

Address before: 300072 Tianjin City, Nankai District Wei Jin Road No. 92

Patentee before: Tianjin University