[go: up one dir, main page]

CN102645646B - Uncertain fusion location method of multiple information sources (Google Patents)

Info

Publication number
CN102645646B
CN102645646B, CN201210134367A, CN201210134367
Authority
CN
China
Prior art keywords
fusion
uncertain
error
matrix
constantly
Prior art date
Legal status
Expired - Fee Related
Application number
CN 201210134367
Other languages
Chinese (zh)
Other versions
CN102645646A (en)
Inventor
史忠科
王慧丽
Current Assignee
Xian Feisida Automation Engineering Co Ltd
Original Assignee
Xian Feisida Automation Engineering Co Ltd
Priority date
Filing date: 2012-05-03
Publication date
Application filed by Xian Feisida Automation Engineering Co Ltd
Priority to CN 201210134367
Publication of CN102645646A
Application granted
Publication of CN102645646B

Landscapes

  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses an uncertain fusion location method for multiple information sources, belonging to the technical field of intelligent information processing. It addresses the problem of fusing and locating multi-source monitoring information under the influence of the uncertain factors present in a real system. The method treats the uncertain factors as correlated noise carrying uncertainty terms and derives an uncertain fusion algorithm for the multi-source monitoring information. The algorithm requires only a single global approximation of the uncertain factors in the system, which reduces the computational load, improves fusion accuracy, and thereby achieves accurate location of targets.

Description

An uncertain fusion and positioning method for multiple information sources
Technical field
The present invention relates to a fusion and positioning method, in particular a fusion and positioning method for multiple information sources under uncertain noise, and belongs to the field of intelligent information processing technology.
Background technology
With the deepening of the low-altitude airspace administrative reform, general aviation in China has entered a period of rapid growth. The steadily increasing number of general-purpose aircraft makes low-altitude flight activity ever more intricate, bringing new challenges to low-altitude detection and security monitoring.
Owing to its intrinsic characteristics, radar is essential equipment for low-altitude air traffic control. However, because radar waves propagate in straight lines and are blocked by terrain, large radar shadow regions arise when airborne targets are monitored, which limits improvements in tracking accuracy and in the ability to detect and warn of short-term collisions. Image-assisted low-altitude surveillance can track low-altitude targets with good coverage, but it is more strongly affected by weather. Combining image-assisted surveillance with a radar tracking system enables collaborative surveillance of aerial targets: the exact position of a target is obtained by data fusion, the low-altitude target is then tracked, and the reliability of the surveillance system is improved.
The demand for higher target tracking accuracy and system reliability can no longer be met by a single measurement, so fusing the multi-source information of multiple sensors to locate targets has become a focus of current research. The multi-sensor joint probabilistic data association (JPDA) method and multiple-hypothesis tracking based on the interacting multiple model are commonly used target tracking and localization methods. The document "Maneuvering target tracking method based on fusion of infrared and radar data" (Zhu Zhiyu, Laser and Infrared, February 2007, Vol. 37, No. 2, pp. 170-174) uses a parallel multi-sensor JPDA method to track and locate a target by fusing infrared and radar data. However, these fusion and positioning methods are mainly based on the assumption that the measurement errors of the sensors are independent white Gaussian noise, which is too idealized. In practice, radar and visual monitoring equipment are usually mounted on the same platform; errors in the platform's azimuth and elevation angles make the image and radar measurement noises correlated, the noise statistics cannot be known exactly, and various uncertain factors exist. If several surveillance systems are deployed on the ground, their estimates must be transformed to a common coordinate system before fusion, and when the distance between monitoring points is large this transformation also introduces correlation errors and uncertain errors. Practice shows that these correlation errors and uncertainties are usually the main factors affecting fusion estimation accuracy, yet the existing fusion and positioning methods cannot take the uncertain factors of a real system into account and therefore cannot describe the real system appropriately.
Summary of the invention
The object of the invention is to overcome the deficiencies of the prior art by providing a fusion and positioning method for multiple information sources under the influence of uncertain factors: the fusion estimate of the multi-source information is obtained by computing the coefficient weighting matrices under the influence of the uncertain factors, thereby realizing fusion location of the target.
The technical solution adopted by the present invention is an uncertain fusion and positioning method for multiple information sources, which specifically comprises the following steps:
1. Each of the m sensors, whose measurements contain correlated noise and an uncertainty term, provides a measurement of the same physical vector, and the fusion estimate at time kT is defined as

x̂(kT) = A_1(kT) z_1(kT) + A_2(kT) z_2(kT) + ... + A_m(kT) z_m(kT)

where x(kT) is the physical vector to be estimated; x̂(kT) is the fusion estimation result; z_i(kT) is the measurement of the i-th sensor at time kT; v_i(kT) is the corresponding measurement error, with zero mean and covariance matrix R_ij(kT); Δ_i(kT) denotes the uncertainty term of the i-th sensor measurement at time kT; A_i(kT) is the coefficient matrix; T is the sampling period; the same symbol definitions are used throughout the application;
2. According to the covariance matrices of the measuring instruments and the measurement errors given by error calibration, calculate the weighted covariance matrix R̄(kT), which is built from the following quantities: the sub-block R_ij(kT) denotes the covariance matrix E[v_i(kT) v_j(kT)^T] of the measurement errors, ε is the supremum of the uncertain error, E denotes mathematical expectation, and the real parameter λ is determined by the concrete system and by experiment;
3. From R̄(kT), determine the fusion estimation coefficient matrices A_1(kT), ..., A_m(kT), where I denotes the identity matrix;
4. Compute the fusion estimate x̂(kT) at time kT and the error covariance matrix P(kT) of the fusion estimate.
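Because the closed-form expressions of steps 2 to 4 are given only as figures in the source text, the exact formulas cannot be reproduced here. The Python sketch below therefore shows one standard way to realize a weighted fusion of this kind under stated assumptions: the joint measurement-error covariance is inflated by a bound on the uncertain error, and the coefficient matrices are chosen to minimize the fused error covariance subject to summing to the identity matrix. The function name fuse_uncertain, the inflation R_bar = R + lam * eps * I, and the parameter names lam (λ) and eps (ε) are illustrative assumptions, not the patent's formulas.

```python
import numpy as np

def fuse_uncertain(z_list, R, lam=1.0, eps=0.0):
    """Hedged sketch of an m-sensor weighted fusion with correlated noise.

    Assumes the uncertain error is absorbed by inflating the joint
    covariance, R_bar = R + lam * eps * I; this inflation is an
    illustrative choice, not the patent's exact formula.

    z_list : m measurement vectors of the same physical vector, length n each
    R      : (m*n, m*n) joint covariance of the stacked measurement errors,
             with (i, j) block E[v_i v_j^T] (cross-correlation allowed)
    lam    : real parameter chosen from the concrete system and experiments
    eps    : supremum (upper bound) of the uncertain error
    """
    m, n = len(z_list), z_list[0].size
    z = np.concatenate(z_list)               # stacked measurement vector
    H = np.tile(np.eye(n), (m, 1))           # stacks m copies of the identity
    R_bar = R + lam * eps * np.eye(m * n)    # inflated covariance (assumption)
    W = np.linalg.inv(R_bar)
    P = np.linalg.inv(H.T @ W @ H)           # fused error covariance matrix
    A = P @ H.T @ W                          # [A_1 ... A_m]; note A @ H = I
    return A @ z, P                          # fused estimate x_hat and P
```

Because A @ H = I, the fused estimate remains unbiased for any admissible R_bar; the choice of R_bar only changes how the weight is shared among the sensors and the reported covariance P.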
The beneficial effects of the invention are as follows: the present invention takes the influence of uncertain factors in the measurements of the multi-source monitoring system into account and is therefore closer to the real system. To reduce the influence of the uncertain factors, they are treated as measurement noise that is mutually correlated and carries uncertainty terms. The fusion algorithm needs only a single global approximation of the uncertain factors, avoiding the large amount of computation required when each sensor's measurements are filtered separately and the uncertainty term is approximated in every filter. This reduces the computational load and improves the accuracy of the target fusion location.
The present invention is described in detail below with reference to the drawings and the embodiment.
Description of drawings
Fig. 1 is the flow chart of the uncertain fusion estimation for multiple information sources; the symbols in the figure are consistent with those in the description.
Embodiment
With reference to Fig. 1.
The fusion and positioning method is illustrated below using the surveillance of a low-altitude aircraft as an example. Suppose the target is monitored jointly by radar and an imaging system, and the two surveillance devices are mounted on the same platform. Errors in the platform's azimuth and elevation angles cause the image and radar measurement noises to be correlated, and the statistical properties of the noise contain uncertain factors. To track and monitor the low-altitude aircraft using the radar and image measurements, the two measurement sources must be fused for localization. The concrete steps are as follows:
1. Transform the measurement data of the radar and image monitoring systems into the coordinate system of the reference observation point, and set up the model of the same physical vector containing correlated noise and uncertainty terms. The fusion estimate at time kT is defined as

x̂(kT) = A_1(kT) z_1(kT) + A_2(kT) z_2(kT)

where x(kT) is the position information of the target to be estimated, whose components are the slant range, azimuth angle, and elevation angle of the target; z_1(kT) and z_2(kT) are the measurements of the radar and image monitoring systems, whose components are the measured slant range, azimuth angle, and elevation angle of the target; v_i(kT) is the measurement error of z_i(kT), with zero mean and covariance matrix R_ij(kT); Δ_i(kT) is the uncertainty term of the radar and visual monitoring measurements; A_i(kT) is the fusion estimation coefficient matrix; x̂(kT) is the fusion estimation result, whose components are the fused slant range, azimuth angle, and elevation angle of the target; T is the sampling period;
2. According to the covariance matrices of the measuring instruments and the measurement errors given by error calibration, calculate the weighted covariance matrix R̄(kT), which is built from the following quantities: the sub-block R_ij(kT) is the covariance matrix of the measurement errors v_i(kT) and v_j(kT); ε is the supremum of the uncertain error; E denotes mathematical expectation; the real parameter λ is determined by the concrete system and by experiment;
3. From R̄(kT), determine the fusion estimation coefficient matrices A_1(kT) and A_2(kT), where I denotes the identity matrix;
4. Compute the fusion estimate x̂(kT) at time kT and the error covariance matrix P(kT) of the fusion estimate, which together give the fused position of the target and its accuracy.
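As a usage illustration for the two-sensor radar-and-image case of this embodiment, the snippet below reuses the fuse_uncertain sketch given after the summary steps (assumed to be defined in scope). All numerical values, including the measurements, covariance blocks, lam, and eps, are invented for illustration only; the off-diagonal block is an assumed cross-covariance standing in for the correlation induced by the shared platform's azimuth and elevation errors.

```python
import numpy as np

# Invented example values: slant range (m), azimuth (rad), elevation (rad).
z_radar = np.array([1200.0, 0.52, 0.10])
z_image = np.array([1185.0, 0.51, 0.11])

# Per-sensor error covariances from instrument calibration (illustrative),
# plus an assumed cross-covariance block for the shared-platform correlation.
R_radar = np.diag([25.0, 1e-4, 1e-4])
R_image = np.diag([100.0, 4e-5, 4e-5])
R_cross = np.diag([5.0, 2e-5, 2e-5])
R = np.block([[R_radar, R_cross],
              [R_cross.T, R_image]])

# fuse_uncertain is the sketch defined after the summary steps.
x_hat, P = fuse_uncertain([z_radar, z_image], R, lam=0.5, eps=1e-3)
print("fused slant range, azimuth, elevation:", x_hat)
print("fused error covariance diagonal:", np.diag(P))
```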

Claims (1)

1. An uncertain fusion and positioning method for multiple information sources, characterized by comprising the following steps:
(a) each of the m sensors, whose measurements contain correlated noise and an uncertainty term, provides a measurement of the same physical vector; the fusion estimate at time kT is defined as

x̂(kT) = A_1(kT) z_1(kT) + A_2(kT) z_2(kT) + ... + A_m(kT) z_m(kT)

where x(kT) is the physical vector to be estimated; x̂(kT) is the fusion estimation result; z_i(kT) is the measurement of the i-th sensor at time kT; v_i(kT) is the corresponding measurement error, with zero mean and covariance matrix R_ij(kT); Δ_i(kT) denotes the uncertainty term of the i-th sensor measurement at time kT; A_i(kT) is the coefficient matrix; T is the sampling period;
(b) according to the covariance matrices of the measuring instruments and the measurement errors given by error calibration, calculate the weighted covariance matrix R̄(kT), which is built from the following quantities: the sub-block R_ij(kT) denotes the covariance matrix of the measurement errors, ε is the supremum of the uncertain error, E denotes mathematical expectation, and the real parameter λ is determined by the noise statistics of the real system and by concrete experiments;
(c) from R̄(kT), determine the fusion estimation coefficient matrices A_1(kT), ..., A_m(kT), where I denotes the identity matrix;
(d) compute the fusion estimate x̂(kT) at time kT and the fusion estimation error covariance matrix P(kT).

Priority Applications (1)

Application Number: CN 201210134367
Priority Date / Filing Date: 2012-05-03
Granted Publication: CN102645646B (en)
Title: Uncertain fusion location method of multiple information sources


Publications (2)

Publication Number | Publication Date
CN102645646A (en) | 2012-08-22
CN102645646B | 2013-06-26

Family ID: 46658586




Legal Events

Code | Description
C06, PB01 | Publication
C10, SE01 | Entry into force of request for substantive examination
C14, GR01 | Patent grant
CF01 | Termination of patent right due to non-payment of annual fee

Granted publication date: 2013-06-26