WO2007015631A1 - Smart video monitoring system and method communicating with auto-tracking radar system - Google Patents
- Publication number
- WO2007015631A1 (PCT/KR2006/003067)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- data
- image
- radar
- target
- target object
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S7/00—Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
- G01S7/003—Transmission of data between radar, sonar or lidar systems and remote stations
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/66—Radar-tracking systems; Analogous systems
- G01S13/72—Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar
- G01S13/723—Radar-tracking systems; Analogous systems for two-dimensional tracking, e.g. combination of angle and range tracking, track-while-scan radar by using numerical data
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
Definitions
- the present invention relates to a radar system, and more particularly to a smart video monitoring system for communicating with an auto-tracking radar system, which adds a high-quality image captured by a camera to two-dimensional position data acquired by a radar system, allows a user or operator to easily recognize a target object displayed on a display, and tracks/monitors the target object located at a monitoring area at a remote site, and a method for controlling the smart video monitoring system.
- a conventional radar system includes an antenna, a transceiver, a radar signal processor, and a radar display.
- the antenna performs transmission or reception of radio or electric waves.
- the antenna rotates through 360° and simultaneously performs incidence or projection of the radio waves.
- the transceiver receives the radio waves from the antenna, or transmits the radio waves to a destination via the antenna.
- the radar signal processor analyzes radio signals reflected from a target object using a radio-wave projection algorithm, and processes the analyzed radio signals.
- the radar display displays the object's position processed by the radar signal processor in the form of dark dots.
- the conventional radar system is configured in the form of a console or a Peer-to-Peer (P2P) network system over a dedicated network (e.g., a telephone line), such that significant costs are required to upgrade or extend the radar system, and an engineer skilled in operating the radar system is also required. Therefore, in order to solve the above-mentioned problems and provide the user with precise monitoring data of the target object, a high-performance radar system must be developed and introduced to the market.
Disclosure of Invention
Technical Problem
- the present invention has been made in view of the above problems, and it is an object of the present invention to provide a smart video monitoring system for communicating with an auto-tracking radar system, which adds a high-quality image captured by a camera to two-dimensional position data acquired by a radar system, allows a user or operator to easily recognize a target object displayed on a display, and tracks/monitors the target object located at a monitoring area at a remote site, and a method for controlling the smart video monitoring system.
- a smart video monitoring system interacting with an auto-tracking radar system comprising: a transmitter for transmitting position data of a target object by the radar system, generating video data using a camera, and transmitting the generated video data, which includes: an antenna unit including: an antenna for rotating, transmitting a radar signal to the target object, and receiving incident waves reflected from the target object to detect the presence of the target object and its position data, a pedestal for physically controlling operations of the antenna according to a control signal, and a housing for forming a waveguide and an overall layout; an embedded radar data converter for processing the radar signal received via the antenna unit, finding information of the target object, transmitting the found target-object information, and driving the antenna unit according to a control signal; and a video monitoring unit for capturing an image of the target object detected by the antenna unit at high resolution; a receiver for receiving output data of the transmitter, storing the camera-captured image, performing signal processing of the stored
- a smart video monitoring system interacting with an auto-tracking radar system, including: a transmitter for transmitting position data of a target object by the radar system, generating video data using a camera, and transmitting the generated video data; a receiver for receiving output data of the transmitter, storing the camera-captured image, performing signal processing of the stored image, acquiring data of the target object, and controlling operations of the transmitter, such that a user is able to view the acquired target-object data in real time; and a signal processor for performing data communication between the transmitter and the receiver, the smart video monitoring system comprising: the receiver including: a management server for controlling operations of the camera and the antenna unit at a remote site, transmitting information acquired from the target object in response to control signals of the camera and the antenna unit, managing target information acquired by performing signal processing on radar data generated from the transmitter, and controlling the receiver to output image information corresponding to position information of the target object, an embedded video switcher for transmitting the video data captured by the camera to a
- a smart video monitoring system communicates with an auto-tracking radar system, adds a high-quality image captured by a camera to two-dimensional position data acquired by a radar system, allows a user or operator to easily recognize a target object displayed on a display, and tracks/monitors the target object located at a monitoring area at a remote site.
- the smart video monitoring system interacts with a camera to increase the accuracy of radar monitoring data, combines a local-area common-type radar system with an imaging system, reduces the production costs of the system, and uses an open IP (Internet Protocol) as a radar communication protocol, thereby easily extending a network area of the system.
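The patent specifies only that an open Internet Protocol is used as the radar communication protocol; no concrete wire format is disclosed. As a minimal sketch of the idea, the following encodes a single target report as a JSON datagram and pushes it over UDP. The field names, port number, and schema are all illustrative assumptions, not from the source.

```python
import json
import socket

def encode_target_message(target_id, range_m, bearing_deg, speed_mps):
    """Serialize one radar target report as a JSON payload (hypothetical schema)."""
    return json.dumps({
        "target_id": target_id,
        "range_m": range_m,          # distance from the antenna, metres
        "bearing_deg": bearing_deg,  # clockwise from north
        "speed_mps": speed_mps,
    }).encode("utf-8")

def send_target_message(payload, host="127.0.0.1", port=9000):
    """Push the encoded report to the receiver over UDP (assumed transport)."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(payload, (host, port))
```

Because the transport is plain IP, any networked terminal can subscribe to the same reports, which is what makes the monitoring network easy to extend.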
- the smart video monitoring system uses radar monitoring software (SAV) available for personal computers (PCs), resulting in greater user convenience and the implementation of monitoring automation.
- the smart video monitoring system is combined with a video monitoring system, such that it implements an upgradable or extensible system capable of being applied to a variety of monitoring information (e.g., forest fires or sea contamination).
- the smart video monitoring system records the monitoring data in a database (DB), and manages the recorded monitoring data, such that it can allow a user or operator to systemically monitor the position of a target object.
- FIG. 1 is a block diagram illustrating a smart video monitoring system interacting with an auto-tracking radar system according to a preferred embodiment of the present invention.
- FIG. 2 is a detailed block diagram illustrating a transmitter of FIG. 1 according to the present invention.
- FIG. 3 is a detailed block diagram illustrating a receiver of FIG. 1 according to the present invention.
- FIG. 4 is a conceptual diagram illustrating a combination implementation example of the smart video monitoring system of FIG. 1, the transmitter of FIG. 2, and the receiver of FIG. 3 according to the present invention.
- FIG. 5 is a detailed block diagram illustrating a management server of FIG. 3 according to the present invention.
- FIG. 6 is a conceptual diagram illustrating the flow of data or signals for use in the management server of FIG. 3 according to the present invention.
- FIG. 7 is a flow chart illustrating a smart video monitoring method interacting with an auto-tracking radar system according to a preferred embodiment of the present invention.
- FIG. 8 is a flow chart illustrating operations of a radar signal processor of FIG. 7 according to the present invention.
- FIG. 9 is a flow chart illustrating operations of a video processor of FIG. 7 according to the present invention.
- FIG. 10 is a flow chart illustrating operations of a target processor of FIG. 7 according to the present invention.
- FIG. 11 is a conceptual diagram illustrating mapping operations of a radar signal processor according to the present invention.
- FIG. 12 is a conceptual diagram illustrating radar signal conversion operations of a radar signal processor according to the present invention.
- FIG. 13 is a conceptual diagram illustrating display mapping operations of a radar signal processor according to the present invention.
- FIG. 14 is a conceptual diagram illustrating video mixing operations of a video processor according to the present invention.
- FIG. 15 is a conceptual diagram illustrating a process for generating a target image using a video processor according to the present invention.
- FIG. 16 is a conceptual diagram illustrating the target-processing result of a target processor according to the present invention.
- FIG. 17 is a conceptual diagram illustrating a process for establishing a pre-monitoring area of a receiver according to the present invention.
- FIG. 18 shows a plurality of exemplary images illustrating the simulation result of a radar-data processing according to the present invention.
- FIG. 19 shows an implementation example of a radar/video system monitoring program according to the present invention.
Mode for the Invention
- the smart video monitoring system communicates with an auto-tracking radar system, adds a high-quality image captured by a camera to two-dimensional position data acquired by a radar system, allows a user or operator to easily recognize a target object displayed on a display, and tracks/monitors the target object located at a monitoring area at a remote site.
- the smart video monitoring system interacts with a camera to increase the accuracy of radar monitoring data, combines a local-area common-type radar system with an imaging system, reduces the production costs of the system, and uses an open IP (Internet Protocol) as a radar communication protocol, thereby easily extending the network area of the system.
- the smart video monitoring system uses radar monitoring software (SAV) available for personal computers (PCs), resulting in greater convenience of the user and the implementation of monitoring automation.
- the smart video monitoring system is combined with a video monitoring system, such that it implements an upgradable or extensible system capable of being applied to a variety of monitoring information (e.g., forest fires or sea contamination).
- the smart video monitoring system records the monitoring data in a database (DB), and manages the recorded monitoring data, such that it can allow a user or operator to systemically monitor the position of a target object.
- the smart video monitoring system monitors a wide area using a common-type probing radar system, includes a combination of an infrared camera (also called a thermal vision camera) and an ultra-zoom camera in order to observe shaded areas, identify a target object, and track the target object's position, and manages the combination using an Internet Protocol (IP), such that it can implement an integrated, open, and extensible system in a monitoring/management area.
- conventionally, submarine cables are indirectly monitored by unofficial civilians or patrol ships. Since an administrator or manager has difficulty recognizing unexpected problems with the submarine cables in real time, the administrator or manager cannot rapidly solve such problems, and there is a limitation in extending the range of the monitoring area.
- the principal areas (e.g., danger areas or the Military Demarcation line) of individual countries are being monitored by a variety of systems (e.g., a border surveillance system, a fence monitoring system, and a video monitoring system, etc.).
- a target area having a broad range must be monitored by the above-mentioned systems 24 hours a day, and skilled administrators or managers must unavoidably be assigned to effectively manage the individual systems.
- the smart video monitoring system includes a local-area radar system capable of simultaneously monitoring the broad target area, and installs a camera at a desired shaded area to reduce system costs, resulting in the implementation of a high-performance monitoring system capable of monitoring the target area day and night.
- conventional system operation/monitoring tasks performed by a radar administrator or manager generally rely on a user-oriented management scheme, such that performance is greatly affected by the skill of the administrator or manager, the performance of the radar system, and the environment of the radar system.
- the smart video monitoring system correctly identifies a monitoring target using an ultra-zoom camera, and tracks the position of the target using the ultra-zoom camera, resulting in the increased performance of the radar monitoring system.
- a sensor-oriented passive monitoring system (e.g., a camera or fence monitoring system) must be developed or upgraded into an active radar monitoring system.
- the smart video monitoring system interacting with the auto-tracking radar system includes a transmitter, a receiver, and a signal processor.
- the transmitter generates position data indicating the target via an antenna, and generates image data interacting with the position data using a video monitoring unit.
- the receiver receives data from the transmitter, stores the received data, and controls operations of the transmitter, such that it allows a user to view the stored data in real time.
- the signal processor performs data communication between the transmitter and the receiver.
- FIG. 1 is a block diagram illustrating a smart video monitoring system interacting with an auto-tracking radar system according to a preferred embodiment of the present invention.
- the smart video monitoring system includes: a transmitter 100 for generating radar position data of a target object; and a receiver 300 for receiving an output signal of the transmitter and outputting a control signal to the transmitter in response to the received signal.
- the signal processor 200 is located between the transmitter 100 and the receiver 300, such that it performs data communication between the transmitter 100 and the receiver 300.
- FIG. 2 is a detailed block diagram illustrating a transmitter of FIG. 1 according to the present invention.
- FIG. 3 is a detailed block diagram illustrating a receiver of FIG. 1 according to the present invention.
- FIG. 4 is a conceptual diagram illustrating a combination implementation example of the smart video monitoring system of FIG. 1, the transmitter of FIG. 2, and the receiver of FIG. 3 according to the present invention.
- the transmitter 100 includes an antenna 110, an embedded radar data converter 120, and a video monitoring unit 130.
- the transmitter 100 is connected to the receiver 300 via the signal processor 200 acting as a data transmitter, transmits information of the target object to the receiver 300, and is driven by a control signal of the receiver 300.
- the receiver 300 includes a management server 320 equipped with a display 360, an embedded video switcher 330, a DVR (Digital Video Recorder) 310, a switching hub 350, and a plurality of user terminals 340.
- the antenna unit 110 includes an antenna, a pedestal, and a housing.
- the antenna rotates through 360° and simultaneously transmits a radar beam to a target monitoring area. If the target is found, the antenna receives incident waves reflected from the target object.
- the pedestal physically controls the driving of the antenna.
- the housing forms a waveguide and an overall layout.
- the antenna unit 110 is connected to a radar data converter 120 for processing a radar signal reflected from the target object to provide position data of the target object.
- the embedded radar data converter 120 outputs position data acquired via the antenna 110, outputs the acquired position data on its own screen, converts the reflected radar signal into a digital signal, and transmits the digital signal to the management server 320.
- the video monitoring unit 130 installed at the smart video monitoring system along with the antenna unit 110 adds video data to radar position data of the target object to acquire desired data, and includes a CCD camera 131, an infrared camera 132, and a receiver 133.
- the CCD camera 131 outputs high-quality video data (i.e., a high-quality image), and captures an image of the target object even at night.
- the video monitoring unit 130 is driven 24 hours a day according to a control signal, and captures an image of the target object by interacting with the target information received in the antenna unit 110.
- in the daytime or under ordinary climate conditions, the CCD camera 131 is operated. At night, or when thick fog, heavy rain, or snowfall occurs, the CCD camera 131 is unable to capture an image of the target object, so the infrared camera 132 starts its operation.
- the zoom lens is mounted to the camera, such that the camera can zoom in on the image of the target object located at a remote site using the zoom lens.
- the housing is located outside of the CCD camera 131 and the infrared camera 132, such that it can prevent the CCD camera 131 and the infrared camera 132 from being damaged by changing weather conditions.
- the housing includes a heater and a blower to constantly maintain an inner temperature, such that it can allow the camera to be operated at optimum conditions.
- the receiver 133 is capable of controlling the camera's driving and the camera's power-supply voltage in response to a control signal.
- the receiver 133 includes a variety of functions, for example, a function for tilting/panning the CCD camera 131 and the infrared camera 132, a zoom in/out function, a focusing function, and a preset function for establishing a pre-monitoring area.
- the signal processor 200 receives output data configured in the form of an optical signal from the transmitter 100, and transmits the output data of the transmitter 100 to the receiver 300.
- the signal processor 200 includes an optical transmitter 210 connected to the transmitter 110, and an optical receiver 220 connected to the receiver 300.
- technologies associated with optical signal transmission are classified into wired optical transmission technology and wireless optical transmission technology, either of which transmits desired information in the form of an optical signal via a transmission path.
- the wired optical transmission technology has been widely used to correctly transmit a large amount of information.
- the wireless optical transmission technology has been partially made available within a local area (e.g., a short-distance area); however, it has difficulty in transmitting a large amount of information over a long-distance area. Therefore, in order to effectively transmit a large amount of information to a desired destination regardless of the long-distance or short-distance area, the wired optical transmission technology employing an optical fiber path made of glass has been widely used.
- the optical transmitter 210 converts digital radar position data acquired from the target object into a form that can be transmitted to a destination located at a remote site, using an optical modulation process and an optical multiplexing process.
- the optical modulation process converts video data captured by the camera into an optical signal.
- the optical multiplexing process performs simultaneous communication of a plurality of signals.
- the optical receiver 220 receives data of the target object (hereinafter referred to as target data) from the optical transmitter 210 by wire or wirelessly, performs optical demodulation and optical distribution on the received target data, and transmits the resultant data to the receiver 300.
- the receiver 300 depicted in FIG. 3 includes the management server 320, the embedded video switcher 330, the DVR 310, and a plurality of user terminals 340.
- the management server 320 manages overall operations of the system, processes radar position data and video data captured by a camera having been received from the transmitter 100, and controls the driving of the transmitter 100 and its data transmission.
- the embedded video switcher 330 converts a video signal captured by the CCD camera 131 and the infrared camera 132 into digital signals, and transmits video data to the user terminals 340.
- the DVR 310 stores/manages analog video data received from the embedded video switcher 330.
- the embedded video switcher 330 receives analog video data including target's moving images captured by the cameras 131 and 132, and outputs the received analog video data to the DVR 310.
- the DVR 310 converts the received analog video data into digital data, and records the digital data according to an MPEG standard indicating a moving-image standard format, such that it manages the target's video data captured by the cameras 131 and 132 contained in the transmitter 100.
- the management server 320 performs data processing of the signals received from the transmitter 100 using a radar signal processor 321, a video processor 322, and a target processor 323, such that it tracks/records the position of the target object, and at the same time properly controls the receiver 133. Also, the management server transmits unified monitoring information to the display 360 and the user terminals 340 via a service unit 324.
- the radar signal processor 321 contained in the management server 320 converts a one-dimensional 3-bit-level antenna signal into a two-dimensional image with a resolution of 1280 x 1280 pixels, allowing the user's eyes to easily identify the target object.
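The patent does not disclose the conversion algorithm itself. As an illustrative sketch, one conventional way to turn one-dimensional sweep samples (bearing, range, 3-bit level) into a 1280 x 1280 plan-position image is a polar-to-Cartesian scan conversion centred on the antenna; the bearing convention and function names below are assumptions.

```python
import math

GRID = 1280  # output image is 1280 x 1280 pixels, as stated in the patent

def scan_convert(sweeps, max_range):
    """Map 1-D radar sweep samples onto a 2-D image centred on the antenna.

    sweeps: iterable of (bearing_deg, range_m, level) samples, level 0-7
            (a 3-bit intensity).
    max_range: radar range, in metres, mapped to the image edge.
    Returns a GRID x GRID list-of-lists of intensity levels.
    """
    image = [[0] * GRID for _ in range(GRID)]
    centre = GRID // 2
    scale = centre / max_range  # metres -> pixels
    for bearing_deg, range_m, level in sweeps:
        theta = math.radians(bearing_deg)
        # Navigation convention assumed: 0 deg = north (up), clockwise positive.
        x = centre + int(range_m * scale * math.sin(theta))
        y = centre - int(range_m * scale * math.cos(theta))
        if 0 <= x < GRID and 0 <= y < GRID:
            image[y][x] = max(image[y][x], level)
    return image
```

Each echo brightens one pixel, so a full antenna rotation paints the familiar circular radar picture that the video processor then cleans up.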
- the video processor 322 removes images of the remaining objects other than the target object from the two-dimensional image, and performs an image processing step capable of minimizing the number of unnecessary images.
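The patent does not specify the image processing step. As a rough illustration, fixed clutter (land, buildings) can be suppressed by frame differencing: a cell whose level does not change between consecutive scan images is zeroed. This is a simplified stand-in for whatever the video processor 322 actually does.

```python
def remove_static_objects(frames, threshold=1):
    """Suppress fixed clutter by keeping only cells whose intensity changes
    between the two most recent scan images (simple frame differencing).

    frames: list of equally sized 2-D grids (lists of lists of ints).
    Returns the last frame with static cells zeroed.
    """
    if len(frames) < 2:
        return frames[-1]
    prev, curr = frames[-2], frames[-1]
    out = []
    for row_prev, row_curr in zip(prev, curr):
        out.append([
            c if abs(c - p) >= threshold else 0
            for p, c in zip(row_prev, row_curr)
        ])
    return out
```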
- the target processor 323 detects speed, position, size, and a moving direction of the target object from the image acquired from the video processor 322, searches for the target object to be monitored, and starts tracking the target object. In order to manage data of the target object, the target processor 323 assigns a tracking number to the target object, and records a variety of information (e.g., a target tracking time, a moving path of the target object, and speed of the target object) of the target object in a predetermined DB (database).
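The patent names the recorded fields (tracking number, tracking time, moving path, speed) but not the database layout. The sketch below illustrates that bookkeeping with an assumed SQLite table; the class and column names are hypothetical.

```python
import sqlite3
import time

class TargetRecorder:
    """Assign tracking numbers and log a time-stamped target history,
    sketching the DB bookkeeping the target processor performs."""

    def __init__(self, path=":memory:"):
        self.conn = sqlite3.connect(path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS track ("
            "track_no INTEGER, ts REAL, x REAL, y REAL, "
            "speed REAL, heading REAL)"
        )
        self.next_no = 1

    def new_track(self):
        """Hand out the next free tracking number."""
        no = self.next_no
        self.next_no += 1
        return no

    def record(self, track_no, x, y, speed, heading):
        """Append one observation of the target to its history."""
        self.conn.execute(
            "INSERT INTO track VALUES (?, ?, ?, ?, ?, ?)",
            (track_no, time.time(), x, y, speed, heading),
        )
        self.conn.commit()

    def path_of(self, track_no):
        """Return the target's moving path in observation order."""
        cur = self.conn.execute(
            "SELECT x, y FROM track WHERE track_no=? ORDER BY ts, rowid",
            (track_no,),
        )
        return cur.fetchall()
```

Replaying `path_of` gives exactly the moving-path record that the display later renders as a trail behind the target.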
- the target processor 323 transmits position information of the target object to the transmitter 100 according to the movement of the target object, such that two cameras 131 and 132 can capture the image of the target object at optimum conditions.
- the receiver 133 located at a specific position approximating an actual position value of the target object searches for a preset area, and transmits information of the preset area to the transmitter 100.
- the service unit 324 of the management server 320 transmits the monitored information of the target object to the user terminals 340 via Web services.
- there are a variety of services supplied by the service unit 324, for example, a log-in service for system security, a monitoring-information service, and a remote-control service of the receiver 133 for controlling the two cameras 131 and 132.
- the monitoring-information service combines target information recorded in the DB with an electronic marine chart, and provides a user with the combined result, such that the user can easily read and recognize position data of the target object.
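At its simplest, combining target records with an electronic chart means projecting each target's latitude/longitude into the chart window's pixel coordinates. The sketch below assumes a plain equirectangular view; a production electronic-chart projection would be more involved, and the function name and view format are illustrative.

```python
def chart_pixel(lat, lon, view, width=1280, height=1024):
    """Map a target's latitude/longitude onto display pixels for a chart
    window (simple equirectangular projection, for illustration only).

    view: (lat_min, lat_max, lon_min, lon_max) bounds of the chart window.
    Returns (px, py) with the pixel origin at the top-left corner.
    """
    lat_min, lat_max, lon_min, lon_max = view
    px = (lon - lon_min) / (lon_max - lon_min) * (width - 1)
    py = (lat_max - lat) / (lat_max - lat_min) * (height - 1)  # y grows downward
    return int(round(px)), int(round(py))
```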
- the service unit 324 receives remote-control information of the cameras 131 and 132 from the user terminals 340, and transmits the cameras' remote-control information to the receiver 133 via the signal processor 200, such that the user can acquire detailed visual information.
- the display 360 has SXGA resolution composed of 1280 x 1024 pixels, and is connected to the management server 320, such that target position data processed by the management server 320 and the video data of the embedded video switch 330 are displayed on the display 360.
- the user terminals 340 are connected to the management server 320 via a switching hub 350. If a subscriber pre-registered as a member in the management server 320 gains access to the management server 320, the service unit 324 assigns specific authority to the subscriber, such that the subscriber can control overall operations of the system. Therefore, the user terminals 340 control data transmission from the transmitter 100 to the receiver 300 at a remote site, properly control motions of the antenna unit 110 and the cameras 131 and 132 of the transmitter 100, and can process radar position data stored in an auxiliary memory and video data corresponding to the radar position data, thereby displaying target data on the display 360.
- the antenna unit 110 of the transmitter 100 rotates through 360° and simultaneously transmits a radar beam to a target monitoring area. If the target is found, the antenna unit 110 receives incident waves reflected from the target object, and outputs the received incident waves to the embedded radar data converter 120. Upon receiving the incident waves from the antenna unit 110, the embedded radar data converter 120 displays the acquired position information on its screen, and converts the radar position information of the target object into digital signals, such that the position information of the target object can be transmitted to the management server 320 located at a remote site.
- the video monitoring unit 130 installed at the smart video monitoring system along with the antenna unit 110 controls the CCD camera 131 or the infrared camera 132 to capture an image of the target object detected by the antenna unit 110, acquires video data of the target object, and transmits the acquired video data to the optical transmitter 210.
- the video monitoring unit 130 precisely adjusts the cameras 131 and 132 upon receiving a control signal from the receiver 133 before capturing the image of the target object.
- the optical transmitter 210 performs optical modulation and optical multiplexing processes on data received from the antenna unit 110 and the video monitoring unit 130, and transmits the resultant data to the optical receiver 220.
- the optically transmitted camera video data is received by the embedded video switcher 330, and radar signals are received by the management server 320.
- the output of the embedded video switcher 330 is transmitted to the service unit 324 of the management server 320, which forwards the video data to the user terminals 340.
- the DVR 310 receives output data of the embedded video switcher 330, and stores the received data.
- radar signals received in the management server 320 undergo a variety of data processing steps for two-dimensional image conversion.
- the radar signals are processed by the radar signal processor 321, the video processor 322 for removing unnecessary images or fixed objects from the acquired image, and the target processor 323 for extracting the target information, and the resultant signals are recorded in the DB.
- the data recorded in the DB is output to the display 360 and the user terminals 340 via the service unit 324, which combines the target data with map data, such that the user can simultaneously recognize the position and image of the target object.
- the management server 320 controls a video processor 325 to output a control signal, such that the target information acquired from the target processor 323 is transmitted to the receiver 133. Therefore, the cameras 131 and 132 also move according to the target position data, such that the management server 320 receives the target position data varying with time and video data corresponding to the target position data in real time.
- the management server 320 assigns a tracking number to a desired target object, and indicates the position of the moving target object using residual images or lines, allowing the user to easily identify the moving path of the target object. Also, the speed, position, and size information of the target object is displayed on the display.
- FIG. 5 is a detailed block diagram illustrating the management server 320 of FIG. 3 according to the present invention.
- the management server 320 includes the radar signal processor 321, the video processor 322, the target processor 323, and the service unit 324.
- the management server 320 converts the radar data received from the radar signal processor 321 into a two-dimensional image, removes images of fixed objects from the two-dimensional image using a mask map, applies a linear-motion prediction algorithm to the resultant data, acquires/tracks an image of a desired target object, records the information of the target object in the DB, manages the information recorded in the DB, and provides a plurality of users with the information of the target object over a network.
- the radar signal processor 321 contained in the management server 320 converts a one-dimensional antenna signal with 3-bit echo levels into a two-dimensional image with a resolution of 1280 x 1280 pixels, allowing the user's eyes to easily identify the target object.
- the video processor 322 removes images of the remaining objects other than the target object from the two- dimensional image, and performs an image processing step capable of minimizing the number of unnecessary images.
- the target processor 323 detects speed, position, size, and a moving direction of the target object from the image acquired from the video processor 322, searches for the target object to be monitored, and starts tracking the target object. In order to manage data of the target object, the target processor 323 assigns a tracking number to the target object, and records a variety of information(e.g., a target tracking time, a moving path of the target object, and speed of the target object) of the target object in the DB.
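The speed and moving direction that the target processor derives can be computed from two successive radar fixes of the same target. The sketch below is an illustration only; the function name, coordinate convention (x east, y north, heading clockwise from north as on a radar display), and units are assumptions, not taken from the patent:

```python
import math

def kinematics(prev_pos, curr_pos, dt):
    """Estimate speed and heading of a target from two successive
    radar fixes. Positions are (x, y) in meters, dt in seconds;
    heading is measured clockwise from north, radar-display style."""
    dx = curr_pos[0] - prev_pos[0]
    dy = curr_pos[1] - prev_pos[1]
    speed = math.hypot(dx, dy) / dt              # meters per second
    heading = math.degrees(math.atan2(dx, dy)) % 360
    return speed, heading
```

A target that moved 100 m north in 10 s would report a speed of 10 m/s on heading 0°.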
- the service unit 324 provides a plurality of user terminals 340 with monitored information of the target object via Web services.
- FIG. 6 is a conceptual diagram illustrating the flow of data or signals for use in the management server of FIG. 3 according to the present invention.
- the management server 320 includes the radar signal processor 321, the video processor 322, the target processor 323, and the service unit 324.
- FIG. 7 is a flow chart illustrating a smart video monitoring method interacting with the auto-tracking radar system according to a preferred embodiment of the present invention.
- the smart video monitoring system for use in the smart video monitoring method receives one-dimensional radar signal data, and transmits the received one-dimensional radar signal data.
- the smart video monitoring system records the received data of a file format in a memory, and performs the radar signal processing of the recorded file-format data at step ST1.
- the smart video monitoring system creates a radar image using a monitoring-area map, a GIS map, and a user map, removes images of the remaining objects other than a desired target object from the acquired radar image, minimizes the number of unnecessary images contained in the radar image, and performs image processing on the resultant data at step ST2.
- the smart video monitoring system extracts at least one of speed, position, size, and moving direction of the target object from the processed radar image, searches for the target object to be monitored, tracks the position of the target object, assigns a tracking number of the target object, records specific information including at least one of the tracking time, moving path, and speed of the target object in the DB, and performs target processing of the recorded information at step ST3.
- FIG. 8 is a flow chart illustrating operations of the radar signal processor of FIG. 7 according to the present invention.
- when the radar signal processor 321 of the management server 320 receives line data from the embedded radar data converter 120 at step ST11, it determines whether the number of received lines is equal to the specific line number corresponding to one rotation at step ST12.
- if so, the radar signal processor 321 maps the one-dimensional data to two-dimensional coordinates at step ST13.
- the radar signal processor 321 generates a radar image, and forms a BMP file composed of 1280 x 1280 pixels.
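The one-rotation mapping of steps ST11-ST13 — accumulating one-dimensional range lines per azimuth and projecting them onto a 1280 x 1280 grid — can be sketched as follows. The array shapes, the 3-bit echo levels, and the center-origin layout are assumptions inferred from the description, not a reproduction of the patented implementation:

```python
import numpy as np

def lines_to_image(lines, size=1280):
    """Map one antenna rotation of 1-D range lines onto a 2-D image.

    lines: array of shape (n_azimuths, n_range_bins) holding 3-bit
    echo levels. Returns a size x size uint8 image centered on the
    radar position, i.e. a polar-to-Cartesian projection.
    """
    n_az, n_rng = lines.shape
    img = np.zeros((size, size), dtype=np.uint8)
    cx = cy = size // 2
    max_r = size // 2 - 1
    for i, line in enumerate(lines):
        theta = 2 * np.pi * i / n_az              # azimuth of this sweep line
        for j, level in enumerate(line):
            if level == 0:
                continue                          # skip empty range bins
            r = j * max_r / (n_rng - 1)           # scale range bin to pixels
            x = int(cx + r * np.cos(theta))
            y = int(cy - r * np.sin(theta))
            img[y, x] = max(img[y, x], level)     # keep strongest echo per pixel
    return img
```

The resulting array could then be written out as the BMP file mentioned above.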
- FIG. 9 is a flow chart illustrating operations of the video processor 322 of FIG. 7 according to the present invention.
- when the video processor 322 of the management server 320 receives the radar image at step ST21, it determines the presence or absence of fixed objects (i.e., GIS (Geographic Information System) data) in the radar image at step ST22.
- if the GIS data is detected, the video processor 322 performs masking of the detected GIS data at step ST23, resulting in a masked radar image. If the GIS data is not detected at step ST22, or once the mask processing is performed, the video processor 322 determines the presence or absence of a monitoring-area map (i.e., hazard data) at step ST24. If the hazard data is detected at step ST24, the video processor 322 performs the masking process on the received data at step ST25, such that a new image is formed as the sum of the GIS-processed data and the hazard data. The video processor then creates a radar image, and forms a BMP file composed of 1280 x 1280 pixels.
- FIG. 10 is a flow chart illustrating operations of the target processor 323 of FIG. 7 according to the present invention.
- the target processor 323 receives an object image at step ST31.
- the target processor 323 removes unnecessary images (e.g., interference, sea waves, and snow/rain) from the received object image at step ST32.
- the target processor 323 assigns an object number to a corresponding object using an object searching algorithm at step ST33.
- the target processor 323 calculates a variety of information (e.g., size, position, and center) of the object at step ST34.
- the target processor 323 determines whether consistency between a current object and a previous object is maintained using a linear-motion prediction algorithm at step ST35. If the consistency is maintained at step ST35, the target processor 323 assigns the same tracking number as that of the previous object to the current object at step ST36.
- otherwise, the target processor 323 assigns a new tracking number to the current object at step ST37. Thereafter, the target processor 323 records the target's tracking information (e.g., number and position of the target) in the DB at step ST38.
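The consistency check of steps ST35-ST37 can be sketched as a nearest-neighbor match against each previous object's linearly predicted position. The gate distance, the dictionary layout of the objects, and the way new numbers are issued are all assumptions made for illustration; the patent does not specify them:

```python
import math

def assign_tracking_numbers(prev_objects, curr_objects, gate=10.0, start_id=1):
    """Carry over a tracking number when a current object lies within
    `gate` pixels of a previous object's linearly predicted position;
    otherwise assign a fresh number (assumed to start at start_id).
    prev_objects entries need 'id', 'pos' (x, y) and 'vel' (vx, vy);
    curr_objects entries need 'pos'.
    """
    next_id = start_id
    used = set()
    for obj in curr_objects:
        best, best_d = None, gate
        for prev in prev_objects:
            if prev["id"] in used:
                continue                            # one match per track
            px = prev["pos"][0] + prev["vel"][0]    # linear-motion prediction
            py = prev["pos"][1] + prev["vel"][1]
            d = math.hypot(obj["pos"][0] - px, obj["pos"][1] - py)
            if d <= best_d:
                best, best_d = prev, d
        if best is not None:
            obj["id"] = best["id"]                  # consistent: keep number
            used.add(best["id"])
        else:
            obj["id"] = next_id                     # new target, new number
            next_id += 1
    return curr_objects
```

An object close to a track's predicted position inherits that track's number; everything else opens a new track.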
- the target processor 323 reads danger information (e.g., speed, position, and direction) established by a user at step ST39.
- the target processor 323 calculates danger levels of individual objects at step ST40, and determines the presence or absence of any danger object at step ST41. If the presence of the danger object is determined at step ST41, the target processor 323 searches for an object of the highest danger level at step ST42.
- the target processor 323 searches for a preset warning area approximating the position of the found object at step ST43.
- the target processor 323 transmits a camera control command to the receiver 133 at step ST44.
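Steps ST39-ST44 amount to scoring each tracked object against user-set limits and steering the camera toward the worst offender. The scoring scheme below (one point per violated limit) and the field names are assumptions for illustration only; the patent does not define how danger levels are weighted:

```python
def danger_level(obj, limits):
    """Score one tracked object against user-set limits.
    Assumed weighting: each violated limit adds one point."""
    level = 0
    if obj["speed"] > limits["max_speed"]:
        level += 1                      # moving faster than allowed
    if obj["dist"] < limits["min_dist"]:
        level += 1                      # closer than allowed
    if obj["heading_in"]:
        level += 1                      # moving toward the protected area
    return level

def select_camera_target(objects, limits):
    """Return the object with the highest danger level, or None if no
    object violates any limit (i.e., no camera command is needed)."""
    scored = [(danger_level(o, limits), o) for o in objects]
    scored = [s for s in scored if s[0] > 0]
    if not scored:
        return None
    return max(scored, key=lambda s: s[0])[1]
```

The selected object's position would then be translated into the pan/tilt command sent to the receiver 133.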
- FIG. 11 is a conceptual diagram illustrating mapping operations of the radar signal processor 321 according to the present invention.
- the radar signal processor 321 generates a two-dimensional radar image according to the number of directions for each scale and resolution for each dot.
- FIG. 12 is a conceptual diagram illustrating radar signal conversion operations of the radar signal processor according to the present invention.
- the radar signal processor 321 receives one-dimensional radar data, and converts the received one-dimensional radar data into a two-dimensional image.
- FIG. 13 is a conceptual diagram illustrating display mapping operations of the radar signal processor according to the present invention.
- FIG. 14 is a conceptual diagram illustrating video mixing operations of the video processor according to the present invention.
- the video processor 322 receives height (H) information (h1), width (W) information (w1), position (P) information (p1), and scale (S) information (s1) from the two-dimensional image acquired by conversion of the one-dimensional radar data.
- the video processor 322 receives h2-, w2-, p2-, and s2- data of the target image, h3-, w3-, p3-, and s3- data of a User Hazard Map, h4-, w4-, p4-, and s4- data of an Original Hazard Map, h5-, w5-, p5-, and s5- data of a User Drawing Map, and h6-, w6-, p6-, and s6- data of a Sea Map obtained by GIS, and processes the received data.
- the video processor 322 outputs h7-, w7-, p7-, and s7- data of a Service Map.
- FIG. 15 is a conceptual diagram illustrating a process for generating a target image using the video processor according to the present invention.
- the video processor 322 defines or limits a target tracking range, removes data of fixed objects (e.g., GIS objects and user-defined objects) from an acquired image, and converts data of only the desired target object into a radar image.
- the video processor 322 removes the colored parts of the "GIS Map & User Map" image denoted by "2" in FIG. 15 from the radar image.
- the above-mentioned radar image is a picture file formed by converting the one-dimensional radar data into two-dimensional data.
- the "GIS Map & User Map” image denoted by "2" indicates the principal hazard areas of the radar system, such that the user can determine the danger levels of individual objects.
- the GIS Map picture is formed by capturing an electronic marine chart on the basis of geographical information of submarine cables and the radar antenna direction according to a predetermined scale.
- the user can release the danger area by inserting the User Map picture into the radar image.
- the video processor 322 creates the GIS Map and the User Map as a single picture file, and processes the created picture file.
- Each of the remaining areas other than the hazard areas is filled with a pixel value of "0" (i.e., binary number "0"), and each of the hazard areas is filled with a binary number "1111", resulting in the implementation of logic AND operation between the resultant image and the radar image.
- the "Hazard Map & User Map" denoted by "3" in FIG. 15 indicates the setup of main radar hazard areas (i.e., main radar monitoring areas) to determine the presence or absence of danger in each area. If a danger area is established by the user, the User Map picture is inserted into the resultant image.
- the Hazard Map and the User Map are configured in the form of a single picture file, such that the single picture file including the Hazard Map and the User Map is processed by the video processor 322.
- Each area to be used as a hazard area is filled with a specific pixel value equal to a binary number "1111", and each of the remaining areas other than the above-mentioned areas is filled with a binary number "0", resulting in the implementation of a logic AND operation between the resultant image and the radar image.
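The AND-mask step described above can be sketched directly with bitwise operations: hazard pixels hold 0b1111 and pass echoes through, while zero pixels suppress them; the inverse mask removes fixed objects instead. The array layout and the 4-bit mask value follow the description, but the function names and the fixed-object variant are illustrative assumptions:

```python
import numpy as np

def apply_hazard_mask(radar_img, hazard_mask):
    """Keep radar echoes only inside hazard (monitoring) areas.

    hazard_mask: same shape as radar_img; hazard pixels hold 0b1111,
    all other pixels hold 0, so a bitwise AND passes echoes through
    only where monitoring is enabled.
    """
    return radar_img & hazard_mask

def remove_fixed_objects(radar_img, gis_mask):
    """Opposite use of the same idea: AND with the inverted GIS mask
    zeroes out echoes at fixed-object pixels and keeps the rest
    (the trailing & 0x0F preserves only the low echo-level bits)."""
    return radar_img & ~gis_mask & 0x0F
```

Both functions are single vectorized expressions, so they scale to the full 1280 x 1280 image without explicit loops.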
- the Target Image denoted by "4" in FIG. 15 is indicative of a final target-object picture acquired by the video mixing of the video processor 322.
- FIG. 16 is a conceptual diagram illustrating the target- processing result of the target processor according to the present invention.
- the target processor 323 extracts a target from the target image, analyzes the extracted target, and records the analyzed data in the DB.
- the information extracted from the target image includes an ID number, position, speed, etc.
- FIG. 17 is a conceptual diagram illustrating a process for establishing a pre- monitoring area of a receiver according to the present invention.
- target monitoring information (e.g., position, speed, and operating time) and the warning level are displayed in FIG. 17.
- FIG. 18 shows a plurality of exemplary images illustrating the simulation result of a radar-data processing according to the present invention.
- the radar signal processor for the radar-data processing receives an original radar image denoted by "1".
- the radar signal processor establishes a fixed object mask denoted by "2".
- the radar signal processor can extract only the image of the fixed object mask from among the original radar image, as shown in the fixed-object removing image denoted by "3".
- the radar signal processor establishes a submarine cable mask denoted by "4". Therefore, only the target object image denoted by "5", corresponding to the submarine cable mask denoted by "4", is extracted from the fixed-object removing image denoted by "3", such that the extracted image denoted by "5" remains.
- Target (i.e., object) information can be displayed as denoted by "6" in FIG. 18.
- FIG. 19 shows an implementation example of a radar/video control monitoring program according to the present invention.
- FIG. 19 shows an image illustrating the summarized result of the video monitoring information program interacting with the local-area radar system.
- the image of FIG. 19 can be found on the display 360 or the user terminal 340.
- the target information is configured in the form of a table.
- the electronic submarine chart is combined with the target information, such that the combined result is shown in the table.
- the images captured by the cameras 131 and 132 are overlaid with each other, such that the overlay result is displayed on a single screen.
- the images captured by the cameras 131 and 132 can move to another position, can be hidden, and can also be zoomed in on.
- the target information is represented by symbols of a variety of colors and sizes, such that the user can easily recognize the monitoring information.
- the moving path, speed, and tracking time of the target object are evaluated by the system, such that the system automatically generates a warning or alert sound, resulting in greater convenience for the user.
- the smart video monitoring system communicating with an auto-tracking radar system adds a high-quality image captured by a camera to two-dimensional position data acquired by a radar system, allows a user or operator to easily recognize a target object displayed on a display, and tracks/monitors the target object located at a monitoring area at a remote site.
Landscapes
- Engineering & Computer Science (AREA)
- Radar, Positioning & Navigation (AREA)
- Remote Sensing (AREA)
- Computer Networks & Wireless Communication (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Radar Systems Or Details Thereof (AREA)
- Studio Devices (AREA)
Abstract
Description
Claims
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN2006800343611A CN101268383B (en) | 2005-08-04 | 2006-08-04 | Smart video monitoring system and method communicating with auto-tracking radar system |
Applications Claiming Priority (4)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| KR1020050071447A KR100594513B1 (en) | 2005-08-04 | 2005-08-04 | Video Surveillance System Linked with Near Field Radar |
| KR10-2005-0071447 | 2005-08-04 | ||
| KR1020060073033A KR100720595B1 (en) | 2006-08-02 | 2006-08-02 | Intelligent video surveillance system and method for interworking with an automatic tracking radar system |
| KR10-2006-0073033 | 2006-08-02 |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| WO2007015631A1 true WO2007015631A1 (en) | 2007-02-08 |
Family
ID=37708892
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/KR2006/003067 WO2007015631A1 (en) | 2005-08-04 | 2006-08-04 | Smart video monitoring system and method communicating with auto-tracking radar system |
Country Status (1)
| Country | Link |
|---|---|
| WO (1) | WO2007015631A1 (en) |
Cited By (18)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| EP2267409A1 (en) * | 2009-06-08 | 2010-12-29 | Honeywell International Inc. | System and method for displaying information on a display element |
| CN104135644A (en) * | 2014-07-31 | 2014-11-05 | 天津市亚安科技股份有限公司 | Intelligent tracking cradle head having radar monitoring function and monitoring method |
| CN104280730A (en) * | 2013-07-09 | 2015-01-14 | 北京瑞达恩科技股份有限公司 | Low-altitude search radar |
| US9055201B2 (en) | 2011-10-14 | 2015-06-09 | Samsung Techwin Co., Ltd. | Apparatus and method of storing and searching for image |
| JP2016057185A (en) * | 2014-09-10 | 2016-04-21 | 日本無線株式会社 | Buried object exploration equipment |
| GB2508770B (en) * | 2011-09-09 | 2017-11-22 | Accipiter Radar Tech Inc | Device and method for 3D sampling with avian radar |
| CN108037501A (en) * | 2018-01-30 | 2018-05-15 | 长沙深之瞳信息科技有限公司 | It is a kind of to obtain area outlook radar system and method for the target pitch to angle |
| CN109188429A (en) * | 2018-08-30 | 2019-01-11 | 国网电力科学研究院武汉南瑞有限责任公司 | Electricity transmission line monitoring method and monitoring system based on radar and two waveband video camera |
| CN110046130A (en) * | 2019-05-17 | 2019-07-23 | 大连海事大学 | Radar signal file organization and management method |
| CN110297234A (en) * | 2018-03-22 | 2019-10-01 | 西安航通测控技术有限责任公司 | A kind of big region passive type air target intersection measuring method of networking and system |
| CN110413836A (en) * | 2019-07-18 | 2019-11-05 | 湖南宏动光电有限公司 | A kind of panorama search system |
| CN111402296A (en) * | 2020-03-12 | 2020-07-10 | 浙江大华技术股份有限公司 | Target tracking method based on camera and radar and related device |
| CN111695771A (en) * | 2020-05-07 | 2020-09-22 | 国网安徽省电力有限公司淮南供电公司 | Intelligent electric power material detection, management and control system and method based on Internet of things technology |
| US10937232B2 (en) | 2019-06-26 | 2021-03-02 | Honeywell International Inc. | Dense mapping using range sensor multi-scanning and multi-view geometry from successive image frames |
| CN113596325A (en) * | 2021-07-15 | 2021-11-02 | 盛景智能科技(嘉兴)有限公司 | Picture capturing method and device, electronic equipment and storage medium |
| CN113759366A (en) * | 2020-06-02 | 2021-12-07 | 杭州海康威视数字技术股份有限公司 | Target detection method and device |
| CN113790667A (en) * | 2021-11-18 | 2021-12-14 | 中大检测(湖南)股份有限公司 | Dam deformation detection method based on radar |
| CN115980739A (en) * | 2023-03-21 | 2023-04-18 | 安徽隼波科技有限公司 | Automatic defense deploying method for radar guided photoelectric tracking |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0815406A (en) * | 1994-06-28 | 1996-01-19 | Mitsubishi Electric Corp | Video display |
| US6011515A (en) * | 1996-10-08 | 2000-01-04 | The Johns Hopkins University | System for measuring average speed and traffic volume on a roadway |
| KR20000017043A (en) * | 1998-08-04 | 2000-03-25 | 요코미조 히로시 | Three-dimensional radar apparatus and method for displaying three-dimensional radar image |
| KR20010017644A (en) * | 1999-08-13 | 2001-03-05 | 김계호 | Complex Image Displaying Apparatus for Vessel |
-
2006
- 2006-08-04 WO PCT/KR2006/003067 patent/WO2007015631A1/en active Application Filing
Patent Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH0815406A (en) * | 1994-06-28 | 1996-01-19 | Mitsubishi Electric Corp | Video display |
| US6011515A (en) * | 1996-10-08 | 2000-01-04 | The Johns Hopkins University | System for measuring average speed and traffic volume on a roadway |
| KR20000017043A (en) * | 1998-08-04 | 2000-03-25 | 요코미조 히로시 | Three-dimensional radar apparatus and method for displaying three-dimensional radar image |
| KR20010017644A (en) * | 1999-08-13 | 2001-03-05 | 김계호 | Complex Image Displaying Apparatus for Vessel |
Cited By (27)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8314816B2 (en) | 2009-06-08 | 2012-11-20 | Honeywell International Inc. | System and method for displaying information on a display element |
| EP2267409A1 (en) * | 2009-06-08 | 2010-12-29 | Honeywell International Inc. | System and method for displaying information on a display element |
| GB2508770B (en) * | 2011-09-09 | 2017-11-22 | Accipiter Radar Tech Inc | Device and method for 3D sampling with avian radar |
| US9055201B2 (en) | 2011-10-14 | 2015-06-09 | Samsung Techwin Co., Ltd. | Apparatus and method of storing and searching for image |
| CN104280730A (en) * | 2013-07-09 | 2015-01-14 | 北京瑞达恩科技股份有限公司 | Low-altitude search radar |
| CN104135644A (en) * | 2014-07-31 | 2014-11-05 | 天津市亚安科技股份有限公司 | Intelligent tracking cradle head having radar monitoring function and monitoring method |
| JP2016057185A (en) * | 2014-09-10 | 2016-04-21 | 日本無線株式会社 | Buried object exploration equipment |
| CN108037501A (en) * | 2018-01-30 | 2018-05-15 | 长沙深之瞳信息科技有限公司 | It is a kind of to obtain area outlook radar system and method for the target pitch to angle |
| CN108037501B (en) * | 2018-01-30 | 2023-10-03 | 长沙深之瞳信息科技有限公司 | Regional warning radar system and method capable of acquiring pitching angle of target |
| CN110297234A (en) * | 2018-03-22 | 2019-10-01 | 西安航通测控技术有限责任公司 | A kind of big region passive type air target intersection measuring method of networking and system |
| CN110297234B (en) * | 2018-03-22 | 2023-03-14 | 西安航通测控技术有限责任公司 | Networked large-area passive air target intersection determination method and system |
| CN109188429A (en) * | 2018-08-30 | 2019-01-11 | 国网电力科学研究院武汉南瑞有限责任公司 | Electricity transmission line monitoring method and monitoring system based on radar and two waveband video camera |
| CN109188429B (en) * | 2018-08-30 | 2022-11-25 | 国网电力科学研究院武汉南瑞有限责任公司 | Power transmission line monitoring method and monitoring system based on radar and dual-band camera |
| CN110046130B (en) * | 2019-05-17 | 2022-10-18 | 大连海事大学 | Radar signal file organization and management method |
| CN110046130A (en) * | 2019-05-17 | 2019-07-23 | 大连海事大学 | Radar signal file organization and management method |
| US10937232B2 (en) | 2019-06-26 | 2021-03-02 | Honeywell International Inc. | Dense mapping using range sensor multi-scanning and multi-view geometry from successive image frames |
| CN110413836A (en) * | 2019-07-18 | 2019-11-05 | 湖南宏动光电有限公司 | A kind of panorama search system |
| CN111402296B (en) * | 2020-03-12 | 2023-09-01 | 浙江大华技术股份有限公司 | Target tracking method and related device based on camera and radar |
| CN111402296A (en) * | 2020-03-12 | 2020-07-10 | 浙江大华技术股份有限公司 | Target tracking method based on camera and radar and related device |
| CN111695771A (en) * | 2020-05-07 | 2020-09-22 | 国网安徽省电力有限公司淮南供电公司 | Intelligent electric power material detection, management and control system and method based on Internet of things technology |
| CN111695771B (en) * | 2020-05-07 | 2024-02-27 | 国网安徽省电力有限公司淮南供电公司 | Electric power material intelligent detection management and control system and method based on Internet of things technology |
| CN113759366A (en) * | 2020-06-02 | 2021-12-07 | 杭州海康威视数字技术股份有限公司 | Target detection method and device |
| CN113759366B (en) * | 2020-06-02 | 2024-06-04 | 杭州海康威视数字技术股份有限公司 | Target detection method and device |
| CN113596325B (en) * | 2021-07-15 | 2023-05-05 | 盛景智能科技(嘉兴)有限公司 | Method and device for capturing images, electronic equipment and storage medium |
| CN113596325A (en) * | 2021-07-15 | 2021-11-02 | 盛景智能科技(嘉兴)有限公司 | Picture capturing method and device, electronic equipment and storage medium |
| CN113790667A (en) * | 2021-11-18 | 2021-12-14 | 中大检测(湖南)股份有限公司 | Dam deformation detection method based on radar |
| CN115980739A (en) * | 2023-03-21 | 2023-04-18 | 安徽隼波科技有限公司 | Automatic defense deploying method for radar guided photoelectric tracking |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| WO2007015631A1 (en) | Smart video monitoring system and method communicating with auto-tracking radar system | |
| CN101268383B (en) | Smart video monitoring system and method communicating with auto-tracking radar system | |
| KR100720595B1 (en) | Intelligent video surveillance system and method for interworking with an automatic tracking radar system | |
| US7801331B2 (en) | Monitoring device | |
| KR101120131B1 (en) | Intelligent Panorama Camera, Circuit and Method for Controlling thereof, and Video Monitoring System | |
| US6529234B2 (en) | Camera control system, camera server, camera client, control method, and storage medium | |
| US9762864B2 (en) | System and method for monitoring at least one observation area | |
| EP1585332A1 (en) | Remote video display method, video acquisition device, method thereof, and program thereof | |
| US20110310219A1 (en) | Intelligent monitoring camera apparatus and image monitoring system implementing same | |
| EP2284814A1 (en) | Systems and methods for night time surveillance | |
| US20050036036A1 (en) | Camera control apparatus and method | |
| US10397474B2 (en) | System and method for remote monitoring at least one observation area | |
| KR101502448B1 (en) | Video Surveillance System and Method Having Field of Views of 360 Degrees Horizontally and Vertically | |
| KR101096157B1 (en) | Real-time monitoring device using dual camera | |
| KR101075874B1 (en) | Video transmission device | |
| KR20090015311A (en) | Video surveillance system | |
| KR100655899B1 (en) | Marine web camera control device and method | |
| KR101281687B1 (en) | How to monitor bad weather | |
| CN112949436A (en) | Picture active pushing system using layout analysis | |
| CN201114535Y (en) | Ultra-remote radar and video joint action control system | |
| KR102192240B1 (en) | Communication system for remote place by using 5G network | |
| KR102259637B1 (en) | Broadcasting Transmission System Based on Artificial Intelligence for Unmanned Broadcasting | |
| CN113572946B (en) | Image display method, device, system and storage medium | |
| JP2005033827A (en) | Object monitoring device and monitoring system | |
| Dumpert et al. | Networked thermal imaging and intelligent video technology for border security applications |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| WWE | Wipo information: entry into national phase |
Ref document number: 200680034361.1 Country of ref document: CN |
|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
| WWE | Wipo information: entry into national phase |
Ref document number: 12008500271 Country of ref document: PH |
|
| NENP | Non-entry into the national phase |
Ref country code: DE |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 577/KOLNP/2008 Country of ref document: IN |
|
| WWE | Wipo information: entry into national phase |
Ref document number: 1200800539 Country of ref document: VN |
|
| 122 | Ep: pct application non-entry in european phase |
Ref document number: 06783516 Country of ref document: EP Kind code of ref document: A1 |