WO2009066994A2 - Method for detecting unattended object and removal of static object - Google Patents
Method for detecting unattended object and removal of static object
- Publication number
- WO2009066994A2 (PCT/MY2008/000160)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unattended
- owner
- event
- image
- digital signal
- Prior art date
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
Definitions
- the present invention relates to video surveillance systems in general, and more particularly to a detection system that is fully automated to detect unattended objects or removal of stationary objects.
- Present surveillance systems are designed to assist humans, i.e. security officers.
- the system may consist of several surveillance cameras, such as CCTV cameras, and/or detecting devices such as motion detectors.
- the images provided by the cameras need to be constantly monitored by a human in order to identify any activity that is amiss.
- One aspect of the present invention is to apply image processing techniques and an image processing sequence to a surveillance camera system to detect unattended objects or removal of static objects. It is also able to detect objects that have been unintentionally left unattended by the owner, for instance an object that fell without the owner noticing.
- Another aspect of the present invention is to detect the presence of an unattended object with another human in close proximity to the object, but not its owner. This is a different approach from prior art, where detection of an unattended object is limited to a static object with no human presence in its close proximity.
- Yet another aspect of the present invention is to be able to track and detect the owner of the unattended object by employing one or a combination of known detection methods.
- the tracking of the owner is not limited to the field of view of a single camera, but is expanded to the field of view of all the cameras present in an area where the detection system described in the present invention is used.
- the present invention discloses a method to detect unattended objects using image processing techniques.
- the system involves the integration of several image processing techniques and an image processing sequence into a surveillance camera system.
- the surveillance system can automatically detect an unattended object that has been left behind by its owner beyond a predetermined distance and alert the security guard.
- the system uses image sources such as surveillance cameras in collaboration with the image processing sequence.
- the image data from the image source is used as input to the unattended object detection process.
- the type of image from the image source depends on the type of image source used; for example, a panoramic image is obtained when an omnidirectional lens camera is used, and visible light video data is received when a visible light camera is used. In an instance where a thermal camera is used, the image data would be a night vision image.
- FIG. 1 shows the relationship between the panoramic object space and the resultant image space mapping.
- Objects that fall under the panoramic field of view will be within the field of view of the camera.
- the field of view provided by the panoramic camera will be continuous and fully warped. Objects moving around within the field of view of the camera can be viewed from different parts of the image.
- the unattended object detection process is implemented using specifically written software.
- the detection process is incorporated into a surveillance system, which has one or more surveillance cameras.
- the operation and data flow of the present invention are described herein.
- Figure 2 describes the data flow in the present invention.
- Figure 3 shows the process to detect an unattended object. A preferred embodiment of the present invention will be described with reference to Figures 2 and 3.
- Video signals from the surveillance camera are the input for the unattended object detection process.
- the video signal can be analog video signal from standard cameras or digital video signal from IP based camera.
- When IP based cameras are used, the processing of the system can be applied to a web-based application, in which the images can be viewed from anywhere in the world.
- Analog or digital video signal is used as input to the unattended object detection process (A).
- the signal is captured by a special device, for example a frame grabber or DSP. These devices can be used to capture multiple video inputs from cameras and digitize and store these signals in digital data form (B). However, when an IP based camera is used, the data would already be in digital format, therefore this step would not be implemented.
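- A minimal capture sketch of step (B) is given below, assuming OpenCV's VideoCapture API; the device index and RTSP URL are placeholders, since the actual source depends on the frame grabber/DSP drivers or on the IP camera's streaming endpoint.

```python
import cv2

# Hypothetical sources: a frame-grabber card exposed as device 0 and an IP camera stream.
analog_source = cv2.VideoCapture(0)
ip_source = cv2.VideoCapture("rtsp://camera-host/stream")  # placeholder URL

ok, frame = analog_source.read()  # the returned frame is already digital data (a NumPy array)
if ok:
    print("captured frame of size", frame.shape)
```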
- Multiple video signals can be captured by multiple input capture devices in a simultaneous capture sequence (2).
- the captured digital image is required to be transformed in the correct manner before it can be registered into a panoramic image (3).
- the transformed image data needs to be enhanced in terms of the noise level and visual quality of the image data (4).
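- A brief illustration of the enhancement step (4) is sketched below using OpenCV; the denoising strengths and the use of histogram equalisation are illustrative choices, not taken from the patent.

```python
import cv2

def enhance(frame_bgr):
    """Reduce noise and improve the visual quality of a registered frame (step 4)."""
    # Non-local means denoising; the filter strengths (10, 10) are illustrative only.
    denoised = cv2.fastNlMeansDenoisingColored(frame_bgr, None, 10, 10, 7, 21)
    # Equalise the luminance channel to improve contrast.
    ycrcb = cv2.cvtColor(denoised, cv2.COLOR_BGR2YCrCb)
    ycrcb[:, :, 0] = cv2.equalizeHist(ycrcb[:, :, 0])
    return cv2.cvtColor(ycrcb, cv2.COLOR_YCrCb2BGR)
```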
- the transformed and registered panoramic image data (C) is fed both to the frame synchronizer (10) and to the Gaussian mixture based background subtraction method (8) before being input to morphological filters to fill voids in the foreground objects.
- This method records the pixel activity of each of the pixels in the image, and background subtraction is achieved by analyzing each pixel (8); this module is assisted by uneven illumination compensation (5), trailing reduction (6), and shadow reduction (7) to improve the quality of the extracted foreground.
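- A minimal sketch of this background subtraction stage, assuming OpenCV's Gaussian-mixture subtractor (MOG2), is shown below; its built-in shadow flag is only a crude stand-in for the patent's shadow reduction module, and all parameter values are illustrative.

```python
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16, detectShadows=True)
kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))

def foreground_mask(frame):
    mask = subtractor.apply(frame)             # per-pixel mixture model decides foreground/background
    mask[mask == 127] = 0                      # drop pixels labelled as shadow
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)   # remove speckle noise
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)  # fill small voids in the foreground blobs
    return mask
```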
- the moving foreground object information (D) is fed to track region and current region comparison (11) and then to region tracking (12).
- the currently detected region and the previously recorded track information are tracked in order to obtain the different stages of the object.
- the region tracking is assisted by motion, texture and color cues (13).
- the characteristics of the object movement can be determined after the previous processes (11 and 12).
- the moving object characteristics include a newly emerged object, an exiting object, object splitting, object merging, or currently existing objects (14). After getting the blob information, the blob is converted into an object for further processing (15).
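- A highly simplified sketch of the tracking and object-stage logic (11, 12, 14) is given below; it matches blobs to tracks by centroid distance only, whereas the described system also uses the motion, texture and colour cues (13) and handles splitting and merging explicitly.

```python
import numpy as np

def classify_blobs(prev_tracks, current_blobs, max_dist=50.0):
    """Label each current blob as new, existing or exited by nearest-centroid matching.
    prev_tracks and current_blobs map ids to (x, y) centroids; max_dist is illustrative."""
    events = {"new": [], "existing": [], "exited": []}
    unmatched_prev = set(prev_tracks)
    for blob_id, (x, y) in current_blobs.items():
        best, best_d = None, max_dist
        for tid, (px, py) in prev_tracks.items():
            d = np.hypot(x - px, y - py)
            if d < best_d:
                best, best_d = tid, d
        if best is None:
            events["new"].append(blob_id)        # newly emerged object
        else:
            events["existing"].append(blob_id)   # currently existing object
            unmatched_prev.discard(best)
    events["exited"] = list(unmatched_prev)      # tracks with no current match have left the scene
    return events
```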
- When the moving object characteristic is an object splitting into two or more, it triggers the unattended object event to start.
- the object characteristics information (E) is used as an input for feature extraction (16). In this process, the shape and motion information will be extracted.
- The object feature extractor extracts information related to each of the detected objects, including ellipse information and the angle and orientation of the major and minor axes.
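- The ellipse-based features can be obtained, for example, with OpenCV's ellipse fitting; the sketch below is illustrative and assumes each detected object is available as a contour.

```python
import cv2

def ellipse_features(contour):
    """Extract ellipse centre, major/minor axis lengths and orientation angle (step 16)."""
    if len(contour) < 5:          # cv2.fitEllipse requires at least 5 points
        return None
    (cx, cy), (axis_a, axis_b), angle = cv2.fitEllipse(contour)
    major, minor = max(axis_a, axis_b), min(axis_a, axis_b)
    return {"centre": (cx, cy),
            "major_axis": major,
            "minor_axis": minor,
            "angle_deg": angle,
            "aspect_ratio": major / max(minor, 1e-6)}
```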
- The object classifier must be trained with a large database (18) before it can be used to detect specific objects (17).
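- The patent does not name a specific classifier; as one plausible illustration, a support-vector machine could be trained over the extracted shape/motion features, as sketched below (the feature and label arrays are assumed to come from the training database (18)).

```python
from sklearn.svm import SVC

def train_object_classifier(features, labels):
    """Train an illustrative object classifier (steps 17-18).
    features: N x D array of shape/motion feature vectors; labels: N object classes."""
    clf = SVC(kernel="rbf", probability=True)
    clf.fit(features, labels)
    return clf

# clf.predict(new_features) would then give the object class information (F)
# consumed by the event recognizer (19).
```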
- the object class information (F) is used as an input to the event recognizer (19) to determine if the unattended object event has happened.
- the surrounding region of the unattended object is divided into multiple regions, and probability techniques can be used to calculate the probability of the owner leaving the object. Once the object is dropped and separated from the owner, the probability is used to determine the likelihood of the owner leaving the object by observing the movement; once the predetermined distance is exceeded, the event is detected.
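- One simple way to realise this decision rule is sketched below; the distance and dwell thresholds are illustrative, and the "probability" here is only a crude stand-in for the probability techniques described above.

```python
import numpy as np

def unattended_event(object_pos, owner_pos, distance_threshold_px=200.0,
                     dwell_frames=0, min_dwell=50):
    """Declare an unattended-object event once the owner has moved beyond a
    predetermined distance from the dropped object for a sustained number of frames."""
    dist = np.hypot(object_pos[0] - owner_pos[0], object_pos[1] - owner_pos[1])
    leaving_probability = min(dist / distance_threshold_px, 1.0)
    event_detected = dist > distance_threshold_px and dwell_frames > min_dwell
    return event_detected, leaving_probability
```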
- the result (G) will be managed by the message manager (21) and will be combined with the original panoramic video data by the overlay manager (22).
- the overlay image (H) is displayed through image display component (23).
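- A minimal sketch of the overlay step (21-23) is shown below, assuming the detection result is available as a bounding box; the colours and label text are illustrative.

```python
import cv2

def overlay_alert(panoramic_frame, bbox, label="UNATTENDED OBJECT"):
    """Draw the detection result on the original panoramic frame before display."""
    x, y, w, h = bbox
    annotated = panoramic_frame.copy()
    cv2.rectangle(annotated, (x, y), (x + w, y + h), (0, 0, 255), 2)
    cv2.putText(annotated, label, (x, max(y - 10, 0)),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 0, 255), 2)
    return annotated

# cv2.imshow("surveillance", overlay_alert(frame, (120, 80, 60, 90)))  # hypothetical bounding box
```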
- the system can also be calibrated so that a number of cameras are coordinated and their images are fed to the detection system. This enables the surveillance system to track the movement of the owner of the unattended object and determine if the object has been left unattended on purpose. This is especially vital in large areas such as airports, where many cameras are present and a human can disappear from the field of view of one camera and emerge in another.
- detection techniques such as face recognition, gait analysis, etc., are added to the system. Further, an image hand-over process is employed.
- the cameras are placed such that at least 10-15% of the field of view of each camera overlaps with that of an adjacent camera.
- Features of the human are recorded and compared between the adjacent cameras. This limits the search area or the number of cameras before the targeted human is locked down.
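- As a rough illustration of comparing features between adjacent cameras, colour histograms of the tagged person's image patch can be matched across the overlapping views; real deployments would add face, gait or other biometric cues, and the histogram bins below are illustrative.

```python
import cv2

def appearance_similarity(person_roi_a, person_roi_b):
    """Compare hue/saturation histograms of the same person's patch in two camera views."""
    hists = []
    for roi in (person_roi_a, person_roi_b):
        hsv = cv2.cvtColor(roi, cv2.COLOR_BGR2HSV)
        hist = cv2.calcHist([hsv], [0, 1], None, [30, 32], [0, 180, 0, 256])
        cv2.normalize(hist, hist)
        hists.append(hist)
    # A correlation score close to 1.0 suggests the same person appears in both views.
    return cv2.compareHist(hists[0], hists[1], cv2.HISTCMP_CORREL)
```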
- an image hand-over process is employed. This is done by means of tagging the person who has left the object unattended i.e. leaving the object in place and moving away from it.
- the human's body features such as face, clothes or gait are used as the identifier.
- Known detection methods such as face recognition or gait analysis are used to recognize the human by comparing the face and gait information against a known database.
- the alarm is triggered and the human, i.e. the owner of the object, is identified using the detection method that is employed.
- the detection system as described above can be employed to detect static objects that are moved from their predetermined locations.
- the system is able to learn the background of the static objects. Therefore, when the object is moved from its predetermined position, the system recognizes it as a void.
- the system can also be extended to identify the particular object that has been removed by recognizing the shape and location of the void. This will trigger the alarm system automatically.
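- A simple sketch of the void detection idea is given below: the current frame is differenced against the learned background, and any sufficiently large connected region is a candidate void whose shape and location can be matched to the known static object; the thresholds are illustrative.

```python
import cv2

def removed_object_voids(learned_background, current_frame, diff_thresh=30, min_area=500):
    """Return bounding boxes of candidate voids where a static object may have been removed."""
    diff = cv2.absdiff(cv2.cvtColor(learned_background, cv2.COLOR_BGR2GRAY),
                       cv2.cvtColor(current_frame, cv2.COLOR_BGR2GRAY))
    _, mask = cv2.threshold(diff, diff_thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > min_area]
```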
- a detection method such as facial recognition is used to determine the human that was within the proximity of the static object prior to its removal.
- Using the image hand-over technique, the location of the individual can be determined. This will enable the security guards to act immediately.
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Theoretical Computer Science (AREA)
- Alarm Systems (AREA)
- Image Analysis (AREA)
- Image Processing (AREA)
- Burglar Alarm Systems (AREA)
Abstract
A surveillance method for automatically detecting an unattended object or the removal of a stationary object through analysis of surveillance images. The method operates by analysing the image data received from surveillance cameras and then transmitting a set of alert messages in real time to the personnel in charge of security. The invention is useful for the detection of unattended objects or the removal of stationary objects.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
MYPI20072086 | 2007-11-23 | ||
MYPI20072086A MY143022A (en) | 2007-11-23 | 2007-11-23 | Method for detecting unattended object and removal of static object |
Publications (2)
Publication Number | Publication Date |
---|---|
WO2009066994A2 true WO2009066994A2 (fr) | 2009-05-28 |
WO2009066994A3 WO2009066994A3 (fr) | 2009-07-16 |
Family
ID=40668028
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/MY2008/000160 WO2009066994A2 (fr) | 2007-11-23 | 2008-11-24 | Procédé permettant de détecter un objet non attendu et de supprimer un objet statique |
Country Status (2)
Country | Link |
---|---|
MY (1) | MY143022A (fr) |
WO (1) | WO2009066994A2 (fr) |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6028626A (en) * | 1995-01-03 | 2000-02-22 | Arc Incorporated | Abnormality detection and surveillance system |
US7436887B2 (en) * | 2002-02-06 | 2008-10-14 | Playtex Products, Inc. | Method and apparatus for video frame sequence-based object tracking |
US7200266B2 (en) * | 2002-08-27 | 2007-04-03 | Princeton University | Method and apparatus for automated video activity analysis |
US6999600B2 (en) * | 2003-01-30 | 2006-02-14 | Objectvideo, Inc. | Video scene background maintenance using change detection and classification |
- 2007
  - 2007-11-23 MY MYPI20072086A patent/MY143022A/en unknown
- 2008
  - 2008-11-24 WO PCT/MY2008/000160 patent/WO2009066994A2/fr active Application Filing
Cited By (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8934670B2 (en) | 2008-03-25 | 2015-01-13 | International Business Machines Corporation | Real time processing of video frames for triggering an alert |
US9123136B2 (en) | 2008-03-25 | 2015-09-01 | International Business Machines Corporation | Real time processing of video frames |
US9129402B2 (en) | 2008-03-25 | 2015-09-08 | International Business Machines Corporation | Real time processing of video frames |
US9142033B2 (en) | 2008-03-25 | 2015-09-22 | International Business Machines Corporation | Real time processing of video frames |
US9418445B2 (en) | 2008-03-25 | 2016-08-16 | International Business Machines Corporation | Real time processing of video frames |
US9418444B2 (en) | 2008-03-25 | 2016-08-16 | International Business Machines Corporation | Real time processing of video frames |
US9424659B2 (en) | 2008-03-25 | 2016-08-23 | International Business Machines Corporation | Real time processing of video frames |
US8483481B2 (en) | 2010-07-27 | 2013-07-09 | International Business Machines Corporation | Foreground analysis based on tracking information |
US8934714B2 (en) | 2010-07-27 | 2015-01-13 | International Business Machines Corporation | Foreground analysis based on tracking information |
US9460361B2 (en) | 2010-07-27 | 2016-10-04 | International Business Machines Corporation | Foreground analysis based on tracking information |
CN112560655A (zh) * | 2020-12-10 | 2021-03-26 | 瓴盛科技有限公司 | 无主物品检测方法和系统 |
US20250086973A1 (en) * | 2022-11-15 | 2025-03-13 | Boe Technology Group Co., Ltd. | Management method, apparatus and system for epidemic detection, electronic device, and storage medium |
Also Published As
Publication number | Publication date |
---|---|
WO2009066994A3 (fr) | 2009-07-16 |
MY143022A (en) | 2011-02-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10664706B2 (en) | System and method for detecting, tracking, and classifying objects | |
US10614311B2 (en) | Automatic extraction of secondary video streams | |
Candamo et al. | Understanding transit scenes: A survey on human behavior-recognition algorithms | |
CN111161312B (zh) | 一种基于计算机视觉的物体轨迹追踪识别装置及系统 | |
KR101492473B1 (ko) | 사용자 기반 상황 인지형 씨씨티비 통합관제시스템 | |
CN110543868A (zh) | 一种基于人脸识别和头肩检测的监控方法及系统 | |
Din et al. | Abandoned object detection using frame differencing and background subtraction | |
CN111612815A (zh) | 一种红外热成像行为意图分析方法及系统 | |
Davies et al. | A progress review of intelligent CCTV surveillance systems | |
JP2001005974A (ja) | 物体認識方法及び物体認識装置 | |
WO2009066994A2 (fr) | Procédé permettant de détecter un objet non attendu et de supprimer un objet statique | |
JP2012212215A (ja) | 画像監視装置 | |
Czyzewski et al. | Moving object detection and tracking for the purpose of multimodal surveillance system in urban areas | |
Prabhakar et al. | An efficient approach for real time tracking of intruder and abandoned object in video surveillance system | |
KR101926510B1 (ko) | 광각카메라를 이용한 안면인식 기반의 광역 감시 시스템 | |
KR101929212B1 (ko) | 이동객체를 마스킹처리하는 장치 및 방법 | |
TWI476735B (zh) | 攝影機異常種類辨識方法及可偵測攝影異常的監視主機 | |
Velastin | CCTV video analytics: Recent advances and limitations | |
Joshi et al. | Suspicious object detection | |
Davies et al. | Progress in computational intelligence to support cctv surveillance systems | |
Chan | A robust target tracking algorithm for FLIR imagery | |
Draganjac et al. | Dual camera surveillance system for control and alarm generation in security applications | |
Darker et al. | Automation of the CCTV-mediated detection of individuals illegally carrying firearms: combining psychological and technological approaches | |
Kim et al. | Environment independent hybrid face recognition system using a fixed camera and a PTZ Camera | |
Appiah et al. | Autonomous real-time surveillance system with distributed ip cameras |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 08851958; Country of ref document: EP; Kind code of ref document: A2 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 08851958; Country of ref document: EP; Kind code of ref document: A2 |