
WO1998016801A1 - Multi-lane sensor for an intelligent vehicle highway system - Google Patents


Info

Publication number
WO1998016801A1
Authority
WO
WIPO (PCT)
Prior art keywords
beams
vehicle
sensor
providing
range
Prior art date
Application number
PCT/US1997/018628
Other languages
English (en)
Inventor
Richard J. Wangler
Robert L. Gustavson
Robert E. McConnell II
Keith L. Fowler
Kevin A. Kreeger
Original Assignee
Schwartz Electro-Optics, Inc.
Priority date
Filing date
Publication date
Priority claimed from US 08/730,732 (US 5,793,491 A)
Application filed by Schwartz Electro-Optics, Inc.
Priority to AU 48222/97 A (AU 4822297 A)
Publication of WO 1998016801 A1

Classifications

    • G01S 7/4817: Constructional features, e.g. arrangements of optical elements, relating to scanning
    • G01S 17/89: Lidar systems specially adapted for mapping or imaging
    • G01S 7/4811: Constructional features common to transmitter and receiver
    • G08G 1/015: Detecting movement of traffic to be counted or controlled, with provision for distinguishing between two or more types of vehicles, e.g. between motor-cars and cycles
    • G08G 1/04: Detecting movement of traffic to be counted or controlled using optical or ultrasonic detectors
    • G01S 17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G01S 7/4818: Constructional features using optical fibres

Definitions

  • the present invention relates generally to object sensors, and in particular to laser rangefinder sensors useful in detecting vehicle speed and shape for classification and input to Intelligent Vehicle Highway Systems (IVHS).
  • IVHS Intelligent Vehicle Highway Systems
  • a time-of-flight laser range-finder sensor is used to measure a distance to a road surface from a fixed position of the sensor, above the road surface, and to measure a distance to a vehicle or vehicles which pass or stop under the sensor.
  • Two laser beams are pulsed at a high pulse repetition rate and projected across the road surface at a fixed angle between them. Because of the high pulse repetition rate, the system is also able to determine vehicle speed with an accuracy within one mile per hour (mph) and, using a calculated speed, develop a longitudinal profile of the vehicle with consecutive range measurements collected as the vehicle travels under the sensor.
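To make the time-of-flight relation concrete, the following minimal Python sketch (illustrative only, not part of the disclosure) converts a measured round-trip pulse time to range and applies the range-drop presence test described above; the function names and the 0.3 m presence margin are assumptions made for the example.

```python
# Minimal sketch (not from the patent) of pulsed time-of-flight ranging:
# range is half the round-trip propagation time times the speed of light,
# and a vehicle is indicated when the measured range drops below the
# known range to the road surface.
C = 299_792_458.0  # speed of light, m/s

def range_from_round_trip(seconds: float) -> float:
    """Convert a round-trip pulse time to a one-way range in metres."""
    return C * seconds / 2.0

def vehicle_present(measured_range_m: float, road_range_m: float,
                    margin_m: float = 0.3) -> bool:
    """Presence is declared when the return comes from above the road surface."""
    return (road_range_m - measured_range_m) > margin_m

# Example: a pulse returning after 60 ns corresponds to roughly 9 m of range.
print(round(range_from_round_trip(60e-9), 2))  # ~8.99
```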
  • the active near-field object sensor of U.S. Patent No. 5,321,490 provides a sensor which is relatively low in cost, accurate, and useful in a wide variety of applications.
  • the sensor detects the presence of an object within an area located in a close range to the sensor, and includes a range finder having means for emitting a directional output of pulsed energy toward the fixed area.
  • emitting means comprises a laser diode capable of emitting pulses of coherent infrared radiation, which are used together with collimating optics and a beam splitter to provide two diverging output beams directed toward the near-field area under observation.
  • the sensor also includes means for receiving a portion of the energy reflected from either the area, or an object located within the area.
  • the returned pulse energy is then provided as an input to a receiver for determining a time of flight change for pulses between the emitting and receiving means, which may be caused by the presence of an object within the area.
  • the sensor is also provided with various features useful in providing outputs which indicate either the speed, census, size or shape of one or more objects in the area.
  • the sensor is provided with means for receiving an input from a time of flight determining means and for providing an output indicating whether the object meets one of a plurality of classification criteria (e.g., is the object an automobile, truck or motorcycle).
  • receiving means includes two detectors, with means for alternately selecting between the outputs of the two detectors for providing inputs to the time of flight determining means.
  • Measuring means are also provided for measuring the time interval between interceptions of the two diverging outputs by a given object, so as to calculate the speed of the object passing through the area.
  • Such a sensor is commercially referred to as Autosense I, by manufacturer Schwartz Electro-Optics, Inc. of Orlando, Florida.
  • As an improvement to the Autosense I sensor of U.S. Patent No. 5,321,490 to Olsen et al., a sensor commercially referred to as Autosense II is provided and described in U.S. Patent No. 5,278,423 to Wangler et al.
  • the sensor of Autosense II incorporates the technology and teachings of Autosense I and provides three dimensional images of objects by rotating or scanning a laser beam range-finder.
  • the scanned laser beam range-finder operates at a high pulse rate, in a plane where there is relative motion between the range-finder and the object to be sensed or imaged in a direction perpendicular to the laser beam plane of rotation. This operation causes the laser range-finder rotating beam, when passing across an object, to cover the object to be sensed with range-finder beam pulses, and thereby, obtain a three dimensional image of the object.
  • Scanning is provided using an optically reflective surface, a mirror, intercepting the beams and reflecting the beams at predetermined angles from a perpendicular to the roadway. Those beams reflected off of the vehicle and directed back toward the mirror are directed into corresponding apertures of the receivers.
  • Means are provided for rotatably moving the reflective surface across a reflective angle sufficient for reflecting the beams across a transverse portion of the vehicle, and signal means representative of the sensor angle within the beam plane are also provided.
  • the angle signals are delivered to processing means for providing range data at corresponding angles and the range and angle data in combination provide a transverse profile of the vehicle.
  • IVHS Intelligent Vehicle Highway Systems
  • DOT Contract Number DTFH 61-91-C-00034.
  • the purpose of the strategic plan is to guide development and deployment of IVHS in the United States.
  • the plan points out that there is no single answer to the set of complex problems confronting our highway systems, but the group of technologies known as IVHS can help tremendously in meeting the goals of the Intermodal Surface Transportation Efficiency Act of 1991 (ISTEA) .
  • the purpose of ISTEA is "...
  • AVC Automated Vehicle Classification
  • it is an object of the present invention to further expand on the capabilities and use of the Autosense II sensor and provide a sensor capable of detecting multiple vehicles traveling within multiple parallel lanes of traffic with sufficient accuracy for classifying the vehicles. Because the sensor of the present invention accurately (±1 mi/h, 1σ) measures vehicle speed as well as vehicle count, it provides the basic data from which other traffic parameters, such as flow rate and mean speed, can be derived.
  • a distinguishing feature of the Autosense sensors which sets the sensor apart from other vehicle detectors, is the ability to accurately measure vehicle height profiles. This unique capability is utilized to classify vehicles and to identify specific vehicles when matched with downstream sensors for the determination of travel time.
  • ETTM Electronic Toll and Traffic Management
  • a sensor comprising laser rangefinder means for determining a range from the sensor to points on a vehicle when the vehicle travels within a sensing zone and for providing range data outputs corresponding to sensor angles for ranges from the sensor to the points on the vehicle, means for scanning the laser means within a plane generally orthogonal to a direction of travel for the vehicle, the scanning means communicating with the laser rangefinder means for determining a range for a corresponding point on the vehicle within the transverse plane, the scanning means providing means for determining the range and a corresponding sensor angle for each point within the scanning plane, deflecting means cooperating with the scanning means for deflecting the scanned beam from a first longitudinal position to a second longitudinal position, the first and second positions defining a forward and backward beam for receiving the vehicle traveling in a direction between the beams, and means for processing the ranges, corresponding angles, and interception times for the vehicle receiving the first and second beams, the processing
  • the sensor in a preferred embodiment comprises a forward and a backward beam emitted by the laser means.
  • the forward and backward beams are separated by a predetermined angle and are emitted toward a fixed area through which the vehicle travels.
  • a time signal representative of a travel time for a point on the vehicle to travel between the beams is determined from time-of-flight data provided by the range data processing means.
  • a transmitter and receiver, along with a rotating polygon mirror are used for emitting a pair of laser beams, for directing the beams toward zones on a roadway traveled on by the vehicle, and for converting reflected laser beams from the vehicle to signal voltages representative of ranges between the receivers and defined points on the vehicle.
  • Scanning is provided using an optically reflective surface intercepting the beams and reflecting the beams at predetermined angles from a perpendicular to the roadway.
  • the beams reflected off of the vehicle are directed back toward the mirror into corresponding apertures of the receivers.
  • Means are provided for rotatably moving the reflective surface across a reflective angle sufficient for reflecting the beams across a transverse portion of the vehicle, and signal means representative of the sensor angle within the beam plane are also provided.
  • the angle signals are delivered to the processing means for providing range data at corresponding angles and the range and angle data in combination provide a transverse profile of the vehicle.
  • the scanning is provided using a mirror intercepting the beams emitted from the transmitter and reflecting the beams onto scanning planes.
  • the planes are set at opposing angles from a perpendicular to the roadway.
  • the reflected beams directed back toward the mirror are directed into corresponding apertures of the receiver.
  • a motor having a rotatable shaft is affixed to the mirror for continuously rotating the mirror about the axis, and an encoder is affixed to the motor shaft for identifying an angular position of the mirror relative to a reference angle.
  • Processing means comprises a microprocessor programmed to receive respective range and sensor angle data for storing and processing the data for a scanned cycle associated with a timing signal. The processed data results in a three dimensional shape profile for the vehicle.
  • the invention comprises an algorithm for comparing the vehicle shape profile with a multiplicity of predetermined vehicle shapes for classifying the vehicle.
  • FIG. 1 is a partial perspective view illustrating an IVHS sensor of the present invention in one sensing configuration
  • FIG. 2 is a partial elevation view of the sensor of the present invention operating with multiple traffic lanes
  • FIG. 2A is a partial perspective view of sensor geometry illustrating an alternate configuration of forward and backward scanning laser beams used in one preferred embodiment of the present invention
  • FIG. 3 is a schematic diagram of a preferred embodiment of the sensor of the present invention
  • FIG. 4 is a partial top plan view of a sensor packaging of FIG. 3;
  • FIG. 5 is a perspective view of the multi faceted mirror of FIG. 4;
  • FIGS. 6 and 7 are diagrammatic functional representations of a multi faceted mirror used in one preferred embodiment of the present invention.
  • FIG. 8 is a block diagram illustrating electronics and optics of one embodiment of the present invention.
  • FIG. 9 is a schematic diagram of the time-to-amplitude converter (TAC) and logic circuitry useful in the sensor of the present invention.
  • FIG. 10 is a perspective view illustrating a three dimensional vehicle profile provided by the present invention.
  • FIG. 11 is a pen and ink reproduction of a false color range image illustrating vehicle profiles for detected vehicles;
  • FIGS. 12A and 12B are block diagrams illustrating a functional flow of a processor used for the present invention
  • FIGS. 13A through 13J illustrate "American Truck Association Truck Types" by way of example, for use in toll road vehicle data collection and classification;
  • FIG. 13K is a perspective view illustrating a three dimensional truck profile provided by the present invention illustrated with a black and white ink tracing of a monitor screen;
  • FIG. 14 and 15 are perspective views illustrating operation of the active near-field object sensor;
  • FIG. 16 is a block diagram illustrating electronics and optics of an alternate embodiment of sensor illustrated in FIGS. 14 and 15;
  • FIG. 17 illustrates a scan geometry for providing high accuracy laser radar with a three inch range resolution for a sensor of the present invention;
  • FIG. 18 diagrammatically illustrates use of a rotating twelve sided polygon mirror to scan a beam with a dual-position nodding mirror deflecting the beam onto alternate rotating mirror facets for reflecting a beam into forward and backward scanned beams;
  • FIG. 19 is a schematic diagram of an embodiment of the present invention illustrating use of two transmitters and two receivers for forming the forward and backward scanned beams;
  • FIG. 20 is a functional block diagram illustrating direct and indirect sensor functions
  • FIGS. 21 through 27 are interrelated flow charts illustrating a preferred embodiment of software useful with the present invention.
  • FIG. 28 is a top plan view diagrammatically illustrating a test lab layout used for beam separation measurements ;
  • FIG. 29 is a side elevational view of the test lab layout of FIG. 28;
  • FIG. 30 illustrates beam scan traces on a test surface
  • FIG. 31 illustrates a coordinate system used for angular separation analysis in one preferred embodiment of the present invention
  • FIG. 32 illustrates a laser beam reflectance from a tilted polygon facet
  • FIG. 33 is a plot of a predicted and measured reflected ID beam tilt angle as a function of sensor scan angle.
  • a sensor 100 is affixed above a highway 102 for sensing a vehicle 104 passing below the sensor 100.
  • a first or forward scanned beam 106 intercepts the vehicle 104 as the vehicle 104 crosses the forward beam 106 and enters a sensing area 108 below the sensor 100 configured and described herein by way of example .
  • a second or backward directed and scanned beam 110 intercepts the vehicle, designated as numeral 104a, as the vehicle 104a leaves the sensing area 108.
  • the sensor 100 is mounted on cables or a mast arm 112 over the highway 102.
  • the improved sensor 100 herein described is mounted overhead and centered over multiple lanes, between three lanes 114 of traffic as herein described by way of example.
  • the sensor 100 is mounted over the center lane 115 of traffic with look-down angles of 10 degrees for the forward beam 106 and 0 degrees for the backward beam 110.
  • this mounting configuration provides for good spatial resolution and reduces shadowing caused by larger vehicles.
  • a total beam scan coverage 116 is 60 degrees, and when the sensor 100 is mounted at 30 feet above the highway 102, complete lane coverage is provided for three 12 ft. lanes 114 typically found on the highways 102 relying on IVHS sensors.
  • two beams 105, 109 comprise 904 nm radiation which are emitted by two similar transmitters 200.
  • Each of the beams 105, 109 are directed for scanning across 30 degree scan angles with the transmitters 200 positioned for scanning the total 60 degree coverage 116.
  • each beam 105, 109 is again split for having an angular forward and backward separation of 10 degrees, which beams 106, 110 are directed toward the highway 102 as earlier described with reference to FIGS. 1 and 2.
  • a small amount is diffusely reflected back to the sensor 100, as illustrated by numerals 106b, 110b, where it is detected by a pair of receivers 400.
  • the round-trip propagation time of a laser pulse making up the beams 106, 110 is proportional to a range 118 to the vehicle 104 or the highway 102 from which the radiation is reflected.
  • the presence of the vehicle 104 is indicated by a reduction in the range reading from the vehicle range 118 to a highway range 120.
  • Vehicle speed is computed from the measured time interval between the interceptions of the forward and backward beams 106, 110.
  • On-board microprocessors 500 within the sensor housing 122, are used for the determination of vehicle presence, speed, count, and classification, as will be described in further detail later in this section.
  • a real-time clock is used to time-tag the data collected to provide, by way of example, vehicle count and average speed for each hour of the day.
  • the sensor 100 employs a pair of InGaAs diode-laser transmitters 200 and silicon avalanche photodiode (APD) receivers 400 in a generally side-by-side configuration.
  • Each transmitter 200 consists of a laser diode 202, its laser driver 204, and a collimating lens 206.
  • Each optical receiver 400 is comprised of an objective lens 402, a narrow-band optical filter 404, detector/amplifier 406, and a threshold detector, described in above referenced applications and herein later described in further detail.
  • the laser diode 202 used in a preferred embodiment of the present invention includes an InGaAs injection laser diode having 12 W output at 10 Amps pulsed current drive.
  • the laser driver 204 produces a 10 Amp peak current pulse with a 3 ns rise time and an 8 ns pulse width.
  • a trigger pulse from a scanner controller as will be further described later in this section with discussions of the mirror system 300, triggers the laser transmitter 200 at preselected scan angles produced by the mirror system 300.
  • the 904 nm laser beam emission is at an ideal wavelength for the silicon APD receivers 400 used.
  • the sensor 100 of a preferred embodiment of the present invention includes a rotating polygon scanner 302 to line scan the laser beams 106, 110 across the three 12-foot-wide lanes 114 of a highway 102, as earlier described, by way of example, with reference to FIGS. 1 and 2.
  • the polygon scanner 302 rotates continuously in one direction 304 at a constant speed. As herein described, for coverage of the lanes 114, and as illustrated with reference to FIGS. 5 through 7, as the polygon scanner 302 rotates, the transmitted beams 105, 109, earlier described with reference to FIG.
  • each transmitter 200 and receiver 400 pair is scanned by the rotating polygon 300 to provide the 30 degree coverage for each, and when sequentially processed, the full 60 degree coverage 116 is achieved.
  • each adjacent facet 308a, 308b is set at ten degrees.
  • alternating adjacent facets 308a, 308b have angles 309a, 309b to a polygon base 310 which angles alternate between 87.5° and 92.5° for the adjacent facets 308a, 308b.
  • successive scans are made with the angular separation 306 of 10 degrees, for providing the separated forward beam 106 and backward beam 110 used in vehicle speed measurements.
  • the laser beam receiver 400 has a field-of-view that also scans, since the laser beam axis and receiver field-of-view are aligned, and therefore the returned reflected beam herein illustrated is collinear.
  • the range processor 502 described with reference to FIGS. 3 and 8 keeps track of the scanner 302 position using incremental readings from the shaft encoder 312 within mirror electronics 315 of the sensor 100. Therefore, the facets 308 and beam angle 138, as illustrated again with reference to FIG. 1, at which a range measurement is being taken, are known.
  • a representative signal 317 is provided to the range processor 502, as illustrated again with reference to FIG. 8.
  • the shaft encoder 312 triggers the laser driver 208 with a first set of consecutive pulses which provide the scanned beam 106 at a predefined angle and will be offset by another set of consecutive pulses resulting from the rotating scanner 302 and reflections from its facets 308 and the discontinuities between facets 308.
  • the sensor 100 as herein described, comprises dual transmitters 200 and receivers 400, as illustrated and described with reference to FIGS. 3 and 4.
  • alternate embodiments of the mirror system 300 include the use of nodding mirrors 318 and positioner 320 described with reference to FIG. 8 herein and in above referenced application .
  • to optimize the size of the sensor housing 122 as illustrated again with reference to FIGS.
  • folding mirrors 305 are used to redirect beams from each transmitter 200 and receiver 400 pair to the polygon scanner 302.
  • the nodding mirror 318 may serve a similar function for directing the beam through the housing window 620, described later with reference to FIG. 16.
  • the optical detection of the reflected beams 106b, 110b includes circuitry which converts optical radiation reflected from the vehicle 104 and highway 102, as earlier described with reference to FIG. 1, to first, an equivalent electrical analog of the input radiation and finally, a logic-level signal.
  • the output 410 of each receiver 400 is multiplexed by the microprocessor 500 and is connected to a peak detector 412 that measures the intensity of the reflected pulse.
  • Each receiver 400 also contains a threshold detector 414 which converts the analog return pulse signals to logic-level pulses.
  • the logic-level signals are processed within the range counter logic circuitry 416 to yield analog range data, which is read by the microprocessor 500.
  • the analog technique was chosen in earlier embodiments as well as a preferred embodiment of the present invention because of its better resolution, smaller size, simpler circuitry, lower power consumption and lower costs when compared with the digital technique.
  • the analog range measurement technique specifically used in the present invention is known as a "time-of-flight converter" and has an accuracy of about one percent of measured range and a resolution of about plus or minus five centimeters. As illustrated with reference to FIG.
  • the logic circuit 416 comprises range gate 418 and time-to-amplitude converter (TAC) circuit 420 which uses a constant current source 421 including transistor Ql to charge a ramp capacitor C38, identified with numeral 422, to obtain a linear voltage ramp whose instantaneous value is a measure of elapsed time.
  • the TAC circuit 420 is designed so that the voltage across the capacitor C38 422 begins ramping down from the positive power supply when the transmitter 200 fires to provide a start signal 422, illustrated again with reference to FIG. 8. The ramp is stopped when either a reflected pulse is received by the receiver 400 to provide a stop signal 423 or at the end of a measured period of time. A maximum range and thus a maximum measured time period is preselected as an initial value.
  • the output of the TAC circuit output 424 is then converted to a digital format by a ten bit analog-to-digital converter within the microprocessor 500.
  • the timing pulse start signal 422 for the TAC circuit 420 is generated by a shaft encoder 312 with a simultaneous pulse causing the laser transmitter 200 to fire.
  • Such pulsed time-of-flight range measurements using the TAC circuit 420 provide accurate (typically within 3 in.) transverse height profiles of the vehicle 104 on each scan.
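As an aside, the arithmetic that turns a TAC reading back into a range can be sketched as follows; this is illustrative only, and the 5 V supply, 400 ns ramp gate, and function name are assumed values for the example, not figures from the patent.

```python
# Illustrative sketch only: recovering elapsed time and range from a
# time-to-amplitude converter (TAC) reading. The ramp slope, supply voltage
# and 10-bit ADC scaling are assumed values chosen for the example.
C = 299_792_458.0          # speed of light, m/s
V_SUPPLY = 5.0             # volts; the ramp starts here when the laser fires (assumed)
RAMP_RATE = 5.0 / 400e-9   # volts per second: full swing over a 400 ns gate (assumed)

def tac_counts_to_range_m(adc_counts: int, adc_bits: int = 10) -> float:
    """Map a TAC ADC reading back to elapsed time, then to one-way range (m)."""
    stopped_volts = V_SUPPLY * adc_counts / (2 ** adc_bits - 1)  # ADC code -> ramp voltage
    elapsed = (V_SUPPLY - stopped_volts) / RAMP_RATE             # seconds between start and stop
    return C * elapsed / 2.0

# A ramp that fell about half way corresponds to ~200 ns, i.e. roughly 30 m of range.
print(round(tac_counts_to_range_m(512), 1))
```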
  • the vehicle speed determined from the time interval between the interceptions of the two laser beams 106, 110, as earlier described with reference to FIGS. 1, 6 and 7, by the vehicle 104, is used to space, with a scan separation distance 124, transverse profiles 126 appropriately to obtain a full three-dimensional vehicle profile 128, as illustrated with reference to FIG. 10.
  • An algorithm, as will be described in further detail later in this section, is applied to the three-dimensional profile 128 for vehicle-classification purposes.
  • One preferred embodiment of the present invention includes two microprocessors 500, a range processor 502 and a data processor 504, as illustrated again with reference to FIG. 3.
  • the range processor 502 triggers the transmitter 200 as earlier described, reads the real-time range and intensity data, provided by the peak detector 412 and TAC circuit 420, as earlier described, and runs continuous self test and calibration functions.
  • the data processor 504 runs all the algorithms for vehicle classification, calculates certain traffic parameters, and controls all communications ports, for example an RS232 output.
  • An example of three-dimensional profiling capability for multiple vehicles within multiple lanes is provided by the range images shown in FIG. 11. This range image of a van 130 traveling next to a small truck 132 traveling at a speed of about 45 mph demonstrates the ability of the sensor 100 to distinguish between vehicles 104.
  • the pixel spacing resulting from a 0.67 degree scan resolution is more than adequate for vehicle separation.
  • 60 rev/sec X 6 facets/rev = 360 scans/sec/scan line
  • at 100 mph, the separation distances 124 for consecutive profiles are less than 5.0 inches, and for 50 mph the separation distance 124 is less than 2.5 inches.
  • the three-dimensional images 128, 130, 132, with reference again to FIGS. 10 and 11, are constructed. At 100 mph, the scanner produces 72 scans across a 15-foot-long vehicle.
  • a vehicle length (l) 135 is calculated by measuring the vehicle speed (v) and multiplying it by the total number of scan lines (sl) or transverse profiles 126 detected on the vehicle 104 and the scan-to-scan time (st), using l = v × sl × st.
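A minimal sketch of this length relation (illustrative only; the function name is an assumption) reproduces the 72-scan, 15-foot example above when the 720 scans-per-second rate quoted later in the description is used.

```python
# Minimal sketch (illustrative, not the patent's code) of the length relation
# above: length = speed x number of scan lines x scan-to-scan time.
def vehicle_length_ft(speed_mph: float, scan_lines: int, scan_time_s: float) -> float:
    speed_fps = speed_mph * 5280.0 / 3600.0    # mph -> feet per second
    return speed_fps * scan_lines * scan_time_s

# Using the 720 scans/sec rate quoted later in the description, 72 scan lines
# at 100 mph give about 14.7 ft, consistent with the 15-foot example above.
print(round(vehicle_length_ft(100.0, 72, 1.0 / 720.0), 1))
```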
  • the sensor 100 is capable of classifying vehicles such as a motorcycle, automobile, pickup truck, bus, and commercial trucks. This list can also be expanded by breaking a vehicle class into subcategories. For instance, the sub-classification categories can be generated to separate pickup trucks from vans or sport utility vehicles.
  • Each sensor 100 provides both an RS-232 and an RS-422 serial interface 506 for connection to other equipment.
  • the RS-232 interface operates at data rates up to 19.2 kilobits per second and is primarily used to connect the sensor to spread-spectrum radio links or other types of data modems. Hard-wired installation of the sensors 100 is better served by the RS-422 interface. With this port, data rates of up to 1.0 megabits per second (during test mode) can be supported.
  • Features of the sensor 100 include automatic initialization of the vehicle detection process upon power up, automatically adjusting for varying conditions at the installation sites, including adjustments for slope, grade, road reflectivity variations, and the presence of barriers and guard rails. Such features and sensor performance are not compromised by vehicles passing through the sensing area 108 or field-of-view.
  • the continuous self-test capability of the sensor 100 provides instant fault isolation. Every major circuit in the sensor is continuously tested for proper operation. The moment any self-test fails, a Self-Test Message will be transmitted from the unit so that immediate action can be taken if necessary. Testing of the sensor 100 has determined a detection accuracy of 99.9%, speed accuracy of +/- 3.5 mph @ 60 mph, and a classification accuracy of 98% for 5 classes.
  • the increased lane coverage of the sensor 100 herein described with reference to FIG. 3 results in a reduction in the number of sensors needed for larger roadway configurations. Mounting variations could utilize the sensor 100 for coverage of additional lanes other than the three herein described, by way of example with reference to FIG. 2.
  • the microprocessor 500 receives range information through the TAC output signal 424 and return pulse intensity signal 426, as described earlier.
  • time walk corrections are performed to account for range measurement error and to provide a corrected range signal 508 used with a respective angle signal 428 provided by the shaft encoder 312, earlier described with reference to FIG. 8, for providing a cosine correction 510 in the scanning plane; this results in a range data set 512 representative of a sensed surface, such as the detection points 136 on the vehicle 104, as described earlier with reference to FIG. 1.
  • This range data set 512 is then processed in the data processor 504 for classification with known vehicles.
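The cosine correction described above can be sketched as follows; this is illustrative only, and the 30-foot mounting height and the function name are assumptions for the example.

```python
# Illustrative sketch (not from the patent) of the cosine correction described
# above: a slant range measured at scan angle alpha from vertical is resolved
# into a vertical distance and a cross-lane offset, and the point height follows
# from the known sensor mounting height.
import math

SENSOR_HEIGHT_FT = 30.0   # example mounting height over the roadway (assumed)

def corrected_point(slant_range_ft: float, alpha_deg: float):
    """Return (cross-lane offset, height above road) for one range sample."""
    a = math.radians(alpha_deg)
    vertical = slant_range_ft * math.cos(a)    # cosine-corrected vertical range
    offset = slant_range_ft * math.sin(a)      # transverse position in the scan plane
    return offset, SENSOR_HEIGHT_FT - vertical

# A 26 ft slant return at 20 degrees off vertical lies about 8.9 ft to the side
# and about 5.6 ft above the road surface.
print(tuple(round(v, 1) for v in corrected_point(26.0, 20.0)))
```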
  • the forward and backward beams 106, 110 are distinguished and corresponding forward scan 430 and backward scan 432 signals are input to the microprocessor 500 for use in time calculations to determine the vehicle speed.
  • the three dimensional vehicle profile illustrated in FIG. 10 is constructed with reference to the highway 102.
  • Profiles 128 are matched against database profiles in the data processor 504.
  • Predetermined rules for comparison are used that will include, by way of example, total vehicle surface area, vehicle height above the roadway, and other distinguishing database vehicle characteristics effective in classifying the vehicles.
  • general rule base algorithms are used in completing the classification.
  • the sensor 600 comprises a compact enclosure 612 of light-weight material, such as aluminum. Across one side of the enclosure 612 is a transmissive window 620, which is shielded from ambient weather by a hood 618.
  • the electrical-optical assembly 628 includes a transmitter section 630, a receiver section 632, a range/processor section 634, and a power supply 636, each of which is discussed in detail in U.S. Patent No. 5,321,490 and as highlighted below. As illustrated with reference again to FIG.
  • the transmitter section 630 includes an astable multivibrator 602 generating a laser trigger pulse at a nominal repetition frequency of 3 kilohertz to a laser driver 604 which, by way of example, produces a 20 ampere peak current pulse with a 4 nanosecond rise time, and a ten nanosecond pulse width.
  • the output of the laser driver controls a laser diode 606, which preferably comprises an indium gallium arsenide injection laser diode array having an output on the order of 180 watts, at the 20 ampere pulse current defined by the driver 604. This diode emits an output at 904 nanometers, which has been found to be an ideal wavelength for the silicon photodiode receiver, discussed below.
  • the array of the laser diode 606 has a junction characterized by dimensions of about 3.96 millimeters by 0.002 millimeters, in order to emit radiation in a 10 degree by 40 degree solid angle.
  • the output of the laser diode array 606 is collected by a fast (F/1.6) multi-element optical lens 608 which has an effective focal length of 24 millimeters and which is used to collimate the diode laser emission; the resulting collimated beam passes through a dual-wedge prism 610.
  • the two outputs of the dual-wedge prism 610 are referred to by reference numerals 609 and 611. Both outputs are passed through the heated transmissive window 620.
  • a 200 volt DC-DC converter 612 is provided in the transmitter section 630 and preferably is contained within the aluminum enclosure 612, earlier described with reference to FIGS. 14 and 15, for reducing electrical interference .
  • the transmitter section 630 further includes an optical fiber 614 coupled to receive a simultaneous output from the laser diode 606 with the emission into the lens 608. The output passing through the optical fiber 614 provides a significant aspect of the sensor 600, as is discussed in greater detail with reference to the range/processor section 634.
  • the receiver section 632 includes lens 622 for receiving reflected returning energy from the two pulsed output beams 609, 611 emitted by the transmitter section 630.
  • the energy passing through the lens 622 is passed through an optical filter 624, and the single input from the lens 622-filter 624 is fed into two photodetectors 626, 628 each of which provides an input to a respective amplifier 627, 629 both of which provide an input to an analog multiplexer 632.
  • the sensor 600 performs an optical multiplexing.
  • the optical energy received in the lens 622 is first converted into an equivalent electronic analog of the input radiation and second into a logic-level signal.
  • the outputs of the two photodetectors 626, 628 are time-multiplexed by the high-speed analog multiplexer 632, which is controlled by a logic-level control line 633 from the microprocessor 652 contained within the range/processor section 334.
  • the output of the multiplexer 632 is connected to a threshold detector 636 and an amplifier 634, both of which provide inputs to the range/processor section, as described below.
  • the two photodetectors 626, 628 are silicon photodiodes which operate as current sources, with the associated amplifiers 627, 629 converting the current pulses of the photodetectors 626, 628 into voltage pulses.
  • Each amplifier 627, 629 offers a transimpedance of 28 kilohms when operated in a differential mode .
  • the optical filter 624 preferably has a narrow-band (on the order of 40 nanometers) width, which limits the solar radiance and permits only the 904 nanometer radiation to reach the photodetectors 626, 628.
  • the transmission of the narrow-band filter 624 is on the order of about 75 percent at 904 nanometers.
  • the analog portion of the receiver section 632 is preferably contained within a Faraday shield (not shown), which permits the circuit to operate in a "field-free" region where the gain is achieved without additional noise reduction.
  • the range/processor section 634 includes a detector 642 optically coupled with the fiber 614, an amplifier 643 and a threshold detector 644, the output of which represents a "start" input to a range gate 646.
  • the “stop” input for the range gate 646 is provided as the output from the threshold detector 636 contained within the receiver section 632.
  • the specific forms of the range gate 646 and the time-to-amplitude converter (TAC) circuit 420 are shown and described in the co-pending applications and earlier herein with reference to FIG. 9.
  • a constant-current source including transistor Ql is used to charge a ramp capacitor C38 to obtain a linear voltage ramp whose instantaneous value is a measure of elapsed time.
  • the TAC circuit 420 is designed so that the voltage across the capacitor C38 begins ramping down from the positive power supply when the laser diode 606 fires. The ramp is stopped when either a reflected pulse is received at the detectors 626 or 628, or at the end of a measured period of time.
  • the output 649 of the TAC circuit 420 is then converted to a digital format by an 8 bit analog-to-digital converter inside the microprocessor 652.
  • the start timing pulse for the TAC circuit 420 is produced utilizing the optical detection of the transmitted laser pulse through the fiber 614, which provides an input to the detector 642 and thence to the amplifier 643.
  • the output of the amplifier 634 from the receiver section 632 is provided as an input to a peak detector 650 which in turn provides an input to the microprocessor 652.
  • the microprocessor 652 comprises an Intel 87C196KC into which the software described below is loaded. As noted in range/processor section 634, the microprocessor 652 provides various outputs to light emitting diode indicators 653, a presence relay 656 for indicating the presence of an object, an RS 232 computer interface 657 and to a heater relay 666 contained within the power supply 336. The microprocessor 652 receives additional inputs from a temperature sensor 651 and a real time clock 654. The range/processor section 634 preferably also includes a battery backup circuit 658.
  • the power supply section 636 includes an EMI/surge protection circuit 662 for a power supply 664 operated by 110 volt line current.
  • the power supply circuit includes a heater relay 666 controlled by the microprocessor 652, as discussed above, and receiving 110 volts line power.
  • the heater relay is coupled to the window 320, to maintain the temperature of the window 320 constant for varying ambient conditions .
  • the sensor 600 is at a height H above the highway 102, and is displaced at an angle Theta 627 so as to be pointed toward the sensing area 108 defined by the beam separation W and the beam length L, and which is located a range distance R between the sensor 600 and the area 108.
  • the sensor 600 transmits two separate beams 609 and 611 (described as forward beam 106 and backward beam 110 with earlier description of sensor 100 and FIG. 1) which fall upon the area 108 defined by the length L and the width W.
  • a portion 609A of the radiated energy in beam 609 will be scattered from the vehicle 104 and away from the sensor 600, while a portion 609B is reflected back toward the sensor 600 for detection by receiver section 632, as earlier described.
  • the microprocessor 652 using the software and the various inputs from the electrical-optical assembly first measures the range to the road; if the range falls below a predetermined threshold, the microprocessor signals that a vehicle 104 is present by closing the presence relay 656, earlier described with reference to FIG. 16.
  • the threshold is determined by calculating the minimum, maximum and average range to the highway 102 for 100 discrete measurements. The maximum error is then calculated by subtracting the average from the maximum range measurement and the minimum from the average range measurement. The threshold is then set to the maximum error.
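A sketch of that calibration step follows; it is illustrative only, and the function name and the five-sample list standing in for the 100 measurements are assumptions.

```python
# Illustrative sketch of the presence-threshold calibration described above:
# the road range is sampled (100 times in the text; 5 here for brevity) and
# the threshold is set to the maximum observed error about the average.
def presence_threshold(road_ranges):
    avg = sum(road_ranges) / len(road_ranges)
    return max(max(road_ranges) - avg, avg - min(road_ranges))

samples = [30.0, 30.2, 29.9, 30.1, 29.8]      # assumed example readings, in feet
print(round(presence_threshold(samples), 2))  # 0.2
```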
  • the microprocessor 652 utilizing the software, to a certain degree classifies the vehicle 104 detected (as, for example, an automobile, a truck or a motorcycle) by examining the amount of range change, it being understood that a truck produces a much larger range change than an automobile, and a motorcycle a much smaller range change.
  • the software keeps an accurate count of vehicles by classification for a predetermined period (for example, 24 hours) and in one example maintains a count of vehicle types for each hour of the day in order to provide a user flow rate.
  • the microprocessor 652 and the associated software also calculates the vehicle speed in the manner described above, by calculating the time each vehicle takes to pass between the two beams 609, 611. Specifically, the microprocessor 652 utilizes a microsecond time increment, and is reset to zero when the first beam 609 detects the presence of the vehicle 104, and is read when the vehicle 104 is detected by the second beam. The software then automatically calculates the distance between the two beams 609, 611 by applying the law of cosines to the triangle formed by the two beams and the distance between them at the level of the highway 102, as illustrated again with reference to FIG. 14. The speed is then calculated by taking the distance between the beams and dividing it by the time the vehicle takes to travel that distance .
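The speed computation just described can be sketched as follows; this is illustrative only, and the road ranges, 10 degree beam angle, 50 ms crossing time, and function names are assumed example values.

```python
# Illustrative sketch (not the patent's code) of the speed calculation described
# above: the ground separation of the two beams follows from the law of cosines
# applied to the two road ranges and the fixed angle between the beams, and the
# speed is that separation divided by the measured crossing time.
import math

def beam_separation_ft(r1_ft: float, r2_ft: float, beam_angle_deg: float) -> float:
    a = math.radians(beam_angle_deg)
    return math.sqrt(r1_ft ** 2 + r2_ft ** 2 - 2.0 * r1_ft * r2_ft * math.cos(a))

def speed_mph(r1_ft: float, r2_ft: float, beam_angle_deg: float,
              crossing_time_s: float) -> float:
    return beam_separation_ft(r1_ft, r2_ft, beam_angle_deg) / crossing_time_s * 3600.0 / 5280.0

# Assumed example values: ~25 ft road ranges, a 10 degree beam angle and a
# 50 ms crossing time give roughly 60 mph.
print(round(speed_mph(25.0, 25.4, 10.0, 0.050), 1))
```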
  • the sensor 600 can also be utilized to ascertain the existence of poor highway visibility conditions, which is useful in providing a warning to drivers to slow down because of dangerous visibility conditions.
  • the amplitude of the return signal received by the vehicle sensor is proportional to the atmospheric transmittance (visibility). Analysis has shown that the sensor can detect vehicles until heavy fog or rainfall reduces the visibility range to 18 m. Corresponding to the change in visibility from clear day to foggy conditions, the received signal power decreases by a factor of 22. Thus, a measurement of the return-signal amplitude can be used to ascertain the existence of poor highway visibility conditions.
  • if the microprocessor 652 senses a return-signal level from the roadway below a certain preselected threshold, then the software can initiate an output through the interface 657 to an appropriate visibility warning signal. It has been found that the sensor 100 achieved a detection percentage of 99.4%, and measured speed with an accuracy equal to or greater than that of conventional radar guns used for traffic enforcement purposes. The system also provided two dimensional vehicle range and intensity profiles. It was observed that the vehicles were accurately profiled, even in the area of the windshields where the intensity of the return signal was quite low, demonstrating the efficacy of the intensity-dependent range correction in mitigating the effect of timing walk on range measurements at low return-pulse amplitudes.
  • the present invention provides high resolution in both transverse axis (multiple forward cross scans 106 and multiple backward cross scans 110 of the lanes 114) and longitudinal axis (collection of a multiplicity of ranges within the scans 106, 110 along the vehicle 104, 104a passing in the lane 22) to provide the three dimensional profile 128, by way of example, of the vehicle 104.
  • This is true whether a single transmitter and receiver set or dual sets are used to create the beam scanned coverage 116, earlier described with reference to FIGS. 1 and 2.
  • one preferred embodiment comprises a first transmitter/receiver pair for coverage of a thirty degree beam with a second pair for a second thirty degree coverage, each positioned to provide the complete sixty degree coverage herein described, by way of example.
  • the sensor 100 is mounted above the highway 102 over the center lane 115, as earlier described, or over a lane of interest depending on the desired use.
  • the sensor 100 makes a measurement of the highway 102 for that particular angle alpha one, for example, alpha 142a.
  • when the beam 140 is pointed in the direction alpha two 142b, it makes the next measurement.
  • the distances or ranges to the points 136 on the surface of the vehicle 104 are measured. These ranges 144 or measured distances at the various scan angles 142 are then used in generating the vehicle profile 128 as illustrated in FIG. 10.
  • the profile 128 is formed by generating measured points 136 above the highway 102 by geometric transformation well known in the art.
  • one embodiment comprises the 12 sided mirror 303 rotating so as to provide a scan rate of 720 scan/sec. If the vehicle 104 is traveling at a rate of 100 mph or 146.7 feet/sec, the scan separation distance 124 would be equal to 146.7 ft/sec divided by 720 scans/sec or 2.4 inches. For a vehicle 104 traveling at 50 mph, the separation distance 124 is less than 1.25 inches. Such separation distances 124 provide detail sufficient to provide a three dimensional profile for accurately classifying the vehicle 104.
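A quick check of that spacing arithmetic (illustrative only; the function name is an assumption):

```python
# Quick check (illustrative only) of the profile-spacing arithmetic above:
# spacing = vehicle speed / scan rate.
def scan_spacing_in(speed_mph: float, scans_per_sec: float) -> float:
    return (speed_mph * 5280.0 / 3600.0) / scans_per_sec * 12.0

print(round(scan_spacing_in(100.0, 720.0), 2))  # ~2.44 in at 100 mph
print(round(scan_spacing_in(50.0, 720.0), 2))   # ~1.22 in at 50 mph
```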
  • the sensor 100 has the forward beam 106 tilted at 5 degrees toward oncoming traffic and the backward beam tilted at 5 degrees away from oncoming traffic traveling in the lane 22.
  • the laser beam transmitter 200 is triggered at each one degree (angle alpha 142) increment of the 30 degree scan portion of the complete scanned beams 106, 110.
  • a vehicle 104 will intercept the forward scanned beam 106 and then the vehicle 104a will intercept the backward scanned beam 110 and the time between interceptions is calculated.
  • the distance between the forward 106 and backward 110 beams on the highway 102 is equal to 2 X 25 X tan 5 degrees or 4.37 feet.
  • the maximum timing error possible is one scan period and does not exceed 5% at 100 mph and 2.5% at 50 mph.
  • the length measurement accuracy of the vehicle profile 128 is a function of speed and is therefore within 5% when the vehicle 104 is traveling at 100 mph and improves linearly as the speed decreases.
  • Yet another embodiment for providing the forward 106 and backward 110 scanned beams is illustrated in FIG. 18 and again with reference to FIG. 8, and comprises the use of the nodding mirror 318 which changes from a first position 322 to a second position 324 to reflect the transmitted laser beams 105, illustrated again with reference to FIGS. 3 and 8, as well as a corresponding returning reflected beam, off of facets 309 of the rotating polygon shaped mirror 303 having the facets 309 at the same inclination, unlike the angled mirror facets 308 described earlier.
  • the bi-stable positioner 320 directs the nodding mirror 318 into its first 322 and second 324 positions.
  • the processor 502 provides a signal 326 to the bi-stable positioner 320 which moves the nodding mirror 318 onto every other mirror facet 309.
  • the functional flow of the electronics generally follows that as herein described. It will be appreciated that the sensor 100 of present invention includes an optical/mechanical multiplexing with the use of the nodding mirror 318, by way of example, rather than the analog multiplexing described with reference to the earlier developed sensor 600.
  • forward 106 and backward 110 scanned beams are provided using two laser transmitters 200a, 200b as well as two receivers 400a, 400b as illustrated in FIG. 19.
  • the electronics of such an embodiment can be as earlier described with reference to sensor 100.
  • a planar mirror 328 is rotated by a motor 330 whose revolutions are monitored by an encoder 332 and counter 334 for providing angle data signals 336 to the processor 502.
  • the forward beam 106 and backward beam 110 are positioned at predetermined angles as described earlier by directing the transmitter/ receiver pairs at appropriate angles.
  • the rotating mirror 328 scans through a full cycle but only data applicable to the scanned beam positions of interest will be processed.
  • Vehicle classification 164 is accomplished by analyzing the range data recorded for a vehicle and matching the resultant profile to a defined set of rules. There are seven features that are used to classify a vehicle. The first feature that is calculated is the total length of the vehicle 166, which is derived from the length of each profile trend. Next, the height 168 of each plateau is compared to find the maximum plateau height. This feature will be used as the height of the vehicle. The average width 170 of the vehicle is only available if 3-dimensional data are being used. The ratio of plateau lengths to total length is used to determine how aerodynamic the vehicle is. The percentage of the vehicle above a 5 foot threshold and the height and length of the last plateau are features used for sub-classification.
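The kind of feature set described above can be sketched as follows; this is illustrative only, and the function name, the dictionary layout, and the assumption that plateau boundaries arrive as index pairs from an upstream segmentation step are not taken from the patent.

```python
# Illustrative sketch (not the patent's algorithm) of the seven-feature set
# described above, computed from a longitudinal height profile. The plateau
# boundaries are assumed to come from an upstream segmentation step as
# (start, end) index pairs; all names here are invented for the example.
def profile_features(heights_ft, widths_ft, spacing_ft, plateaus):
    total_len = len(heights_ft) * spacing_ft
    plateau_len = sum((end - start) * spacing_ft for start, end in plateaus)
    last_start, last_end = plateaus[-1]
    return {
        "length_ft": total_len,
        "height_ft": max(max(heights_ft[s:e]) for s, e in plateaus),
        "avg_width_ft": sum(widths_ft) / len(widths_ft),           # 3-D data only
        "plateau_ratio": plateau_len / total_len,                  # "aerodynamic" measure
        "pct_above_5ft": sum(h > 5.0 for h in heights_ft) / len(heights_ft),
        "last_plateau_height_ft": max(heights_ft[last_start:last_end]),
        "last_plateau_length_ft": (last_end - last_start) * spacing_ft,
    }

# Toy example: 8 scan lines 2 ft apart with two plateaus (cab and box).
heights = [2.0, 4.5, 4.6, 4.6, 3.0, 6.2, 6.2, 2.5]
widths = [6.5] * len(heights)
print(profile_features(heights, widths, 2.0, [(1, 4), (5, 7)]))
```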
  • a preferred embodiment of the software useful in connection with the sensor system and method of earlier patented invention is illustrated in flow charts and discussed in detail in the above reference patents. It will of course be understood that the software is loaded in an object code format into the microprocessor 500, and is designed to control the electrical-optical assembly of the sensor 100 in order to detect the presence of objects and to provide certain desirable outputs representative of the object, including for example, the speed with which the object moves through the area being sensed, the size and shape of the object, its classification and perhaps other characteristics.
  • the sensor 100 has utility as a vehicle sensor for mounting in an overhead configuration in order to detect the presence of vehicles passing through an area--such as a portion of a roadway at an intersection--to identify the presence of a vehicle, classify the vehicle as either an automobile, truck or motorcycle, count the number of vehicles passing through the intersection and calculate the speed of each vehicle and the flow rate of all of the vehicles.
  • the software was specifically configured to meet the needs of a particular application, herein described by way of example.
  • the software operates to find the range to the road.
  • the software then sets up the receiver 400 to detect the return beam 106b, 110b, and the range and return-signal intensity is read; the range and intensity reading is then toggled between the two beams 106, 110.
  • any necessary offset is added to the range based on the intensity to correct timing walk as discussed earlier.
  • the change in the range, i.e., the road distance minus the distance to any object detected, is calculated.
  • a vehicle pulse counter is tested to determine if there have been 16 consecutive pulses above the vehicle threshold; if the calculation is less than the vehicle threshold, then another sequence of steps is initiated to reset the vehicle pulse counter and thereby toggle between the beams 106, 110.
  • Various resets and adjustments are made including the calculation of the distance between the two beams, the calculation of the average range to the road, and the minimum/maximum range to the road.
  • the road pulse counter is reset, an inquiry is made as to whether the vehicle has already been detected; if the answer is affirmative, then an inquiry is made to determine if the change in range determined earlier is greater than the truck threshold in order to complete a truck-detection sequence. On the other hand, if the inquiry is negative, then the vehicle presence relay is set, a vehicle pulse counter is incremented, and a velocity timer is started for purposes of determining the speed of the vehicle passing through the area being sensed.
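The consecutive-pulse test described above can be sketched as follows; this is illustrative only, the class and parameter names are assumptions, and only the 16-pulse confirmation logic is shown.

```python
# Illustrative sketch (not the patent's code) of the consecutive-pulse test
# described above: a vehicle is only confirmed after 16 consecutive
# range-change readings exceed the vehicle threshold, filtering out
# single-pulse noise.
class PresenceFilter:
    def __init__(self, threshold_ft: float, required: int = 16):
        self.threshold_ft = threshold_ft
        self.required = required
        self.count = 0

    def update(self, range_change_ft: float) -> bool:
        """Feed one (road range - measured range) sample; True once a vehicle is confirmed."""
        if range_change_ft > self.threshold_ft:
            self.count += 1
        else:
            self.count = 0              # any sub-threshold pulse resets the counter
        return self.count >= self.required

f = PresenceFilter(threshold_ft=2.0)
readings = [0.1] * 3 + [4.0] * 20          # 3 road returns, then 20 returns from a vehicle
print(sum(f.update(r) for r in readings))  # 5: confirmed from the 16th high pulse onward
```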
  • One embodiment of the software useful in connection with the sensor 100 and method of the present invention is illustrated in flow chart form in FIGS. 22 through 28, with portions of the software depicted in each of those figures being arbitrarily designated by reference numerals.
  • the software is loaded in an object code format and is designed to control the sensor 100 electrical, optical and mechanical components as herein earlier described.
  • the sensor 100 has utility for determining the speed of a vehicle and determining its vehicle classification through comparison of its three dimensional profile with known vehicles established in a database.
  • the software modeling of FIGS. 21 through 27 has been specifically configured for these purposes.
  • the microcontroller software begins with a scan 720 in the forward scanned beam 106.
  • Fig. 22 further illustrates that this scan 720 is started 722 and the start time recorded 724.
  • a range and intensity are measured 726 as described earlier.
  • the intensity value is used to calculate an offset to be added to the range in order to correct for time walk 728.
  • the current scan angle 142 is determined from the motor encoder within the mirror electronics and the information used to calculate a cosine correction for the range 732 as earlier discussed. Ranges are accumulated 734 and recalculated at the various predetermined angle increments for the predetermined scan 736 and the end of the scan time is recorded 738.
  • a determination is made whether a vehicle has been detected 740 by comparing ranges measured with sample ranges for database vehicles 742 and determining how such ranges compare 744 (refer to FIG. 23). If a vehicle has previously been detected 746, data is sent to the microprocessor for classification 748; start times are recorded 750 and vehicle detection indicated 752 if a vehicle was not previously detected. Co-pending software uses these 750 and 752 steps and has further detail included in its specification for a reference. A range calibration is run 754 and then the process begins for the backward scanned beam 756. As illustrated in FIG. 24, the backward scan begins 758 and the start time recorded 760.
  • the process proceeds as described in steps 762 through 774, which correspond to the forward scan steps 722 through 738, and steps 776 through 784 correspond to the forward scan steps 742 through 754 (see FIG. 25).
  • a stop time is recorded 786 if a vehicle was not previously detected.
  • a speed is calculated using the time period determined and the known distance between the beams 106, 110.
  • a feature set for the classification is calculated 796 for comparing the features of the vehicle detected to the features of vehicles contained in a vehicle database library 798 and vehicle speed and classification is provided as an output 800.
  • each scan is assembled into an image forming a three dimensional profile of the vehicle (802 of FIG 27) as illustrated earlier with reference to FIG. 10.
  • Features used in the calculation are calculated 804 and compared as discussed 798 and an output provided 800.
  • The features compared include, but are not limited to, vehicle surface area 806, length of the vehicle 808, width of the vehicle 810, height of the vehicle 812, a ratio of cross-sectional surface area to total surface area 814, and intensity 792. A further improvement is now described with reference to the present invention, described herein for the sensor 100 by way of example.
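Purely as an illustrative sketch of feature-based matching against a vehicle library (the class names, feature values, and normalisation below are invented for illustration and are not the document's database):

```python
# Hedged sketch of nearest-match classification of a measured feature vector.
import math
from typing import Dict, Tuple

# Feature order: (surface area m^2, length m, width m, height m, area ratio)
LIBRARY: Dict[str, Tuple[float, float, float, float, float]] = {
    "passenger car":   (18.0, 4.5, 1.8, 1.4, 0.22),
    "pickup/SUV":      (24.0, 5.2, 1.9, 1.8, 0.25),
    "bus":             (70.0, 12.0, 2.5, 3.2, 0.18),
    "tractor-trailer": (110.0, 18.0, 2.6, 4.0, 0.15),
}
SCALE = (100.0, 20.0, 3.0, 4.5, 0.3)  # rough per-feature normalisation

def classify(features: Tuple[float, float, float, float, float]) -> str:
    def dist(a, b):
        return math.sqrt(sum(((x - y) / s) ** 2 for x, y, s in zip(a, b, SCALE)))
    return min(LIBRARY, key=lambda name: dist(features, LIBRARY[name]))

print(classify((23.0, 5.0, 1.9, 1.75, 0.24)))  # -> "pickup/SUV"
```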
  • The sensor 100 used in both the Autosense II and Autosense III commercial versions employs a rotating polygon mirror of the type described with reference to system 300 for scanning a pair of lines or laser beams 106, 110, having an angular separation of 10°, onto the flat surface of a roadway or highway 102, as earlier described with reference to FIG. 1, by way of example.
  • The geometry of the situation forces the projected lines or scanned beams 106, 110 to take the shape of a bow tie, as earlier described. This is of little consequence over the 30° of scan covered, by way of example, by a sensor as described for an embodiment of the Autosense II serving a small number of traffic lanes.
  • The effect is not negligible, however, over the 60° scan of the Autosense III, which is intended for use over a larger number of traffic lanes. Since the measurement of speed by the Autosense III depends upon beam separation, it was necessary to determine the variation in beam separation along the scan path to ensure improved accuracy of the collected data, such as speed data.
  • The sensor 100, referred to commercially and herein as Autosense III, includes a structure and geometry as earlier described.
  • The polygon mirror 302 rotates clockwise within a plane of rotation which, for descriptive purposes, is the plane of the paper.
  • The polygon facets 308 are tilted ±2.5° to cause the beams 106, 110, described with reference to FIG. 1, to scan at ±5° with respect to the plane of rotation.
  • The incident angle is the angle that the beam makes with the perpendicular to the polygon facet 308.
  • A first effect, dubbed the "smiley face," occurs when a scan line which is tilted out of the plane of rotation hits a flat surface. At the ends of the scan, the distance to the surface is greater than it is directly in front of the sensor. The beam therefore has a greater distance to travel and consequently ends up farther away from the plane of rotation.
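As a rough geometric illustration (an approximation added here, not a formula given in this document): for a beam tilted out of the plane of rotation by an angle T and a flat surface at perpendicular distance d, the out-of-plane offset of the trace at scan angle θ grows with the slant distance,

```latex
z(\theta) \;\approx\; \frac{d}{\cos\theta}\,\tan T ,
```

which is smallest at θ = 0 and increases toward the ends of the scan, curving the trace into the "smiley face" shape.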
  • The laser beams hit the polygon facets 308 at different incident angles. Since the polygon facets 308 are tilted with respect to the plane of rotation, the beam should reflect off the facet at twice the tilt angle. This happens when the incident angle in the plane of rotation is zero degrees (i.e., along the axis perpendicular to the facet). However, as the incident angle increases, the degree to which the beam tilts out of the plane of rotation lessens. In fact, as the incident angle approaches 90°, the out-of-plane beam angle approaches zero degrees.
  • Angular separation measurements were made by pointing a sensor at a wall and tracing the path the beams made as they hit the wall.
  • A helium-neon laser was utilized to provide a visible laser beam path through the system.
  • The geometry of the setup was measured, and the separation angle between the two beams was computed.
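As a hedged sketch of this computation (the measurement values below are invented examples; heights are taken relative to the traced table-plane line):

```python
# Hedged sketch: beam-separation angle from trace heights on a wall at a known distance.
import math

def beam_angle_deg(height_on_wall_m: float, distance_to_wall_m: float) -> float:
    return math.degrees(math.atan2(height_on_wall_m, distance_to_wall_m))

def separation_angle_deg(h_upper_m: float, h_lower_m: float,
                         distance_to_wall_m: float) -> float:
    return (beam_angle_deg(h_upper_m, distance_to_wall_m)
            - beam_angle_deg(h_lower_m, distance_to_wall_m))

# Example: traces 0.44 m above and 0.44 m below the table-plane line, wall 5 m away.
print(round(separation_angle_deg(0.44, -0.44, 5.0), 2))  # about 10.1 degrees
```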
  • A wall 900 was constructed so that it was perpendicular to an axis 902 of alignment holes 904 of an optical table 906, upon which the sensor sat, and perpendicular to the plane of the optical table. This was achieved by marking the spot on the wall, in the plane of the table, aligned down an axis of alignment holes (accomplished by simply aligning a laser with the alignment holes and marking its spot on the wall). Then, to orient the wall 900 perpendicular to an axis of alignment holes on the optical table, marks 908 were placed equidistant to the left and right of the center mark on the wall.
  • The distance 910 from each of these marks back to a point on the chosen axis of alignment holes must be the same for the wall to be perpendicular to that axis.
  • The same procedure was repeated in the vertical direction to orient the wall perpendicular to the plane of the optical table, as illustrated with reference again to FIGS. 28 and 29. This ensured that inaccuracies in the physical lab setup (e.g., floor not level, optical table not in the same plane as the floor) would not affect the measurements.
  • The sensor 100, an Autosense III unit (with only the polygon and mirrors mounted on it), was fixed to the table utilizing the alignment holes so that the central axis of the sensor was aligned with the axis of the optical table pointing towards the wall.
  • The base of the sensor was parallel to the surface of the table.
  • The helium-neon laser was attached to the optical table so that its laser beam was parallel to the plane of the table and aligned parallel to the axis of alignment holes down the table, pointed towards the center of the mirror that reflects the beam onto the polygon facets.
  • The reflecting mirrors were aligned to be within ±5 minutes of arc of their locations as designed for the sensor 100.
  • The polygon was spun up using a motor, and the scan lines projected onto the wall 900 for the +5° beam and the -5° beam were traced onto paper.
  • The test was repeated for the laser beam on the opposite side of the sensor. After all the beams were traced on the wall 900, the laser beam was directed straight at the wall for the entire range of scan angles. The line this beam made as it intersected the wall was also traced, as a reference of where the plane of the optical table intersected the wall. A depiction of how the traces appeared as drawn on the wall is illustrated with reference to FIG. 30. Along with the four polygon-scanned traces and the table-plane trace were the marks at the axis of the originating beams.
  • The dependence of the angular separation of the laser beams from the sensor 100 upon the angle of incidence in the x-y plane was determined by analyzing the geometry that results when the law of reflection is applied to a laser beam incident upon the tilted facet of the polygon.
  • The coordinate system shown in FIG. 31 was used, where:
  • (0,0,0) is the point 912 where the beam 914 hits the facet 308.
  • The Y-axis is in the plane of the facet.
  • The beam originates in the x-y plane and travels in the x-y plane towards the facet at point (0,0,0).
  • Tilt Angle 916 is the angle in the x-z plane between the +Z-axis and the plane 918 which contains the facet 308 as it rotates about the Y-axis (a tilt angle of 0° would indicate that the plane which contains the facet is the y-z plane).
  • Incident Angle 918 in the x-y plane is the angle from the +X-axis to the beam in the x-y plane (recall that the beam originates in the x-y plane).
  • The geometry consistent with the reflection of a laser beam from a tilted polygon facet 308 in the sensor 100 is illustrated with reference to FIG. 32, where line FA is perpendicular to the polygon facet, the plane defined by CAB is the plane of incidence, plane DAB is in the x-y plane, angle FAB is the angle of incidence (in the plane of incidence, not in the x-y plane), angle FAE is the facet tilt angle 916, and angle FAB equals angle CAF by the law of reflection.
  • The angle CAD, which is the reflected-beam tilt angle, and the angle DAE, which is the reflected-beam scan angle (in the x-y plane), are to be determined.
  • The reflected-beam tilt angle "T" can be approximated using the following formula:
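The expression is given here only as a hedged reconstruction, obtained by applying the law of reflection to the geometry defined above; it may differ in form from the original document's formula. With facet tilt angle τ (angle 916) and in-plane incident angle θᵢ:

```latex
T \;=\; \arcsin\!\bigl(\sin(2\tau)\,\cos\theta_i\bigr) \;\approx\; 2\tau\cos\theta_i ,
```

which gives twice the facet tilt (5° for τ = 2.5°) at zero in-plane incidence and tends to zero as θᵢ approaches 90°, consistent with the behaviour described above.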
  • The incident angle in the x-y plane for this configuration was computed for each scan angle in the operating range of the sensor 100. Utilizing the derived formulas for the reflected-beam scan angle and reflected-beam tilt angle, the measured values were plotted against the predicted values to gauge the accuracy of the prediction, as illustrated, by way of example, with reference to FIG. 33.
  • The lines 914 are the predicted values and the thin lines 916 are the measured values.
  • The measured values are well within ±0.1° of the predicted values. It has been determined that if the inter-beam angle is known to within ±0.1°, the speed-measurement accuracy of the sensor will be within ±1.36 mph at 60 mph. In fact, the actual error measured is within ±0.04°, which would produce a speed measurement accurate to ±0.5 mph at 60 mph.
  • The formulas for predicting the angle between the two beams at any point across the scan of an Autosense III sensor 100 were used to build a look-up table that is used by the speed-calculation algorithm, earlier described.
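A minimal sketch of how such a look-up table could be built from the reflection geometry above (the ±2.5° facet model, function names, and table resolution are assumptions for illustration; the actual table is derived from the document's own formulas and measurements):

```python
# Hedged sketch: inter-beam separation angle vs. in-plane incident angle, from the
# law of reflection off facets tilted +/-2.5 degrees about the Y-axis.
import math

def reflect(d, n):
    """Reflect direction vector d off a plane with unit normal n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

def reflected_dir(incident_deg: float, tilt_deg: float):
    """Direction of the beam after hitting a facet tilted about the Y-axis."""
    th = math.radians(incident_deg)
    tau = math.radians(tilt_deg)
    d = (-math.cos(th), -math.sin(th), 0.0)   # incident beam in the x-y plane
    n = (math.cos(tau), 0.0, -math.sin(tau))  # unit normal of the tilted facet
    return reflect(d, n)

def separation_deg(incident_deg: float, tilt_deg: float = 2.5) -> float:
    """Angle between the two reflected beams (facets tilted +/- tilt_deg)."""
    a = reflected_dir(incident_deg, +tilt_deg)
    b = reflected_dir(incident_deg, -tilt_deg)
    cos_ang = max(-1.0, min(1.0, sum(x * y for x, y in zip(a, b))))
    return math.degrees(math.acos(cos_ang))

# Look-up table of inter-beam angle vs. in-plane incident angle, in 1 degree steps.
LUT = {ang: round(separation_deg(ang), 3) for ang in range(0, 31)}
print(LUT[0], LUT[30])  # about 10 degrees at the centre of the scan, less at the edge
```

At the centre of the scan this model reproduces the nominal 10° separation, and the separation shrinks toward the scan edges, which is the variation the look-up table is meant to capture.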
  • The sensor 100 used in vehicle detection is also useful in determining and recording other highway conditions, such as visibility.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Traffic Control Systems (AREA)

Abstract

This sensor (100) for an intelligent vehicle highway system (IVHS), which provides accurate real-time information on traffic conditions, can be used for incident detection, the issuing of advisories to motorists, and traffic management by means of signals, traffic counters, and the like. The sensor (100), a laser-diode-based vehicle detector and classifier (VDAC), detects the presence of vehicles in the multi-lane area covered by its field of view, measures their speed, and captures their three-dimensional profiles (128). The sensor (100) uses a pulsed-laser rangefinder imaging technique adapted to determining the three-dimensional profile (128) of vehicles. A rotating polygon mirror system (300) is used to scan the pulsed laser rangefinder beam across the traffic lanes of a highway (102) in order to detect the presence, measure the speed, and capture the vertical profiles of vehicles (104) in all lanes simultaneously. A receiver (400) receives the reflections of the beams emitted by the sensor (100) and generates outputs from which the time of flight and the time interval between the interceptions of two different beams, a first beam (106) and a second beam (110), are determined for each vehicle. An encoder tracks the position of one (302) of the mirrors in order to provide angle data along with the range measurements. A high signal-to-noise ratio and good spatial resolution allow traffic parameters to be measured extremely accurately.
PCT/US1997/018628 1996-10-11 1997-10-09 Capteur multi-voies pour systeme autoroutier intelligent pour vehicules WO1998016801A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU48222/97A AU4822297A (en) 1996-10-11 1997-10-09 Intelligent vehicle highway multi-lane sensor

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US08/730,732 US5793491A (en) 1992-12-30 1996-10-11 Intelligent vehicle highway system multi-lane sensor and method
US08/730,732 1996-10-11
US3138396P 1996-11-20 1996-11-20
US60/031,383 1996-11-20

Publications (1)

Publication Number Publication Date
WO1998016801A1 true WO1998016801A1 (fr) 1998-04-23

Family

ID=26707170

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1997/018628 WO1998016801A1 (fr) 1996-10-11 1997-10-09 Capteur multi-voies pour systeme autoroutier intelligent pour vehicules

Country Status (2)

Country Link
AU (1) AU4822297A (fr)
WO (1) WO1998016801A1 (fr)

Cited By (33)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000033261A1 (fr) * 1998-11-27 2000-06-08 Footfall Limited Systeme de surveillance de pietons
WO2001031608A1 (fr) * 1999-10-27 2001-05-03 Sick Ag Dispositif pour reguler le flux de circulation a un carrefour, notamment pour reguler les feux
WO2001038900A1 (fr) * 1999-11-29 2001-05-31 Specialty Minerals Michigan Inc. Mesure de l'usure du revetement refractaire d'un recipient metallurgique
WO2004036245A3 (fr) * 2002-09-25 2004-09-10 Ibeo Automobile Sensor Gmbh Dispositif de detection optoelectronique
DE102004012220A1 (de) * 2004-03-12 2005-09-29 Sick Ag Optoelektronischer Sensor
US7405676B2 (en) 2004-09-10 2008-07-29 Gatsometer B.V. Method and system for detecting with laser the passage by a vehicle of a point for monitoring on a road
WO2012042043A1 (fr) * 2010-10-01 2012-04-05 Fastcom Technology Sa Système et procédé d'individualisation de personnes
EP2631667A1 (fr) * 2012-02-22 2013-08-28 Ricoh Company, Ltd. Dispositif de mesure de distance
WO2014072106A1 (fr) * 2012-11-08 2014-05-15 Valeo Schalter Und Sensoren Gmbh Dispositif de détection optoélectronique à tension de polarisation ajustable d'un photodétecteur à avalanche destiné à un véhicule automobile, véhicule automobile et procédé afférent
WO2014141115A2 (fr) 2013-03-15 2014-09-18 Primesense Ltd. Balayage en profondeur à l'aide de de multiples émetteurs
WO2015007506A1 (fr) * 2013-07-16 2015-01-22 Valeo Schalter Und Sensoren Gmbh Système de détection optoélectronique et procédé d'acquisition par balayage de l'environnement d'un véhicule automobile
EP2682781A3 (fr) * 2012-07-03 2015-03-18 Ricoh Company, Ltd. Dispositif radar laser
WO2015113892A1 (fr) * 2014-01-31 2015-08-06 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Dispositif de prise d'images de distance et d'intensité superposées
EP3091369A1 (fr) * 2015-05-05 2016-11-09 Sick Ag Lecteur laser
US10298913B2 (en) 2016-08-18 2019-05-21 Apple Inc. Standalone depth camera
US10324171B2 (en) 2015-12-20 2019-06-18 Apple Inc. Light detection and ranging sensor
KR20190083145A (ko) * 2018-01-03 2019-07-11 주식회사 라이드로 라이다 광학 시스템
EP3521858A3 (fr) * 2018-02-06 2019-12-18 Sick AG Capteur optoélectronique et procédé de détection d'objets dans une zone de surveillance
KR20200011313A (ko) * 2018-07-24 2020-02-03 문명일 라이다 광학 장치
WO2020094867A1 (fr) 2018-11-09 2020-05-14 Dubois Jean Claude Dispositif de capture de fréquentation miniaturisé
US10739460B2 (en) 2010-08-11 2020-08-11 Apple Inc. Time-of-flight detector with single-axis scan
WO2021019065A1 (fr) 2019-07-31 2021-02-04 Kiomda, Sas Capteur thermique stéréoscopique miniaturisé pour dispositif de comptage automatique
EP2781932B1 (fr) * 2011-12-22 2021-05-26 LG Electronics Inc. Appareil de mesure de distance
DE102004050682B4 (de) 2003-11-18 2022-03-24 Riegl Laser Measurement Systems Gmbh Einrichtung zur Aufnahme eines Objektraumes
JP2022541007A (ja) * 2020-03-05 2022-09-21 深▲せん▼市▲レイ▼神智能系統有限公司 プリズム及びマルチビームレーザーレーダー
WO2022225859A1 (fr) * 2021-04-22 2022-10-27 Innovusion, Inc. Conception lidar compacte à haute résolution et à champ de vision ultra-large
US11644543B2 (en) 2018-11-14 2023-05-09 Innovusion, Inc. LiDAR systems and methods that use a multi-facet mirror
US11662439B2 (en) 2021-04-22 2023-05-30 Innovusion, Inc. Compact LiDAR design with high resolution and ultra-wide field of view
US11675050B2 (en) 2018-01-09 2023-06-13 Innovusion, Inc. LiDAR detection systems and methods
US11675053B2 (en) 2018-06-15 2023-06-13 Innovusion, Inc. LiDAR systems and methods for focusing on ranges of interest
US11965980B2 (en) 2018-01-09 2024-04-23 Innovusion, Inc. Lidar detection systems and methods that use multi-plane mirrors
US11977185B1 (en) 2019-04-04 2024-05-07 Seyond, Inc. Variable angle polygon for use with a LiDAR system
EP4478083A1 (fr) * 2023-06-15 2024-12-18 Sick Ag Scanner multi-plan et procédé de détection d'objets

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4335962A (en) * 1979-07-20 1982-06-22 Robotic Vision Systems, Inc. Method and apparatus for determining spatial information
US5111056A (en) * 1990-04-14 1992-05-05 Matsushita Electric Works, Ltd. Optical measurement system determination of an object profile
US5528354A (en) * 1992-07-10 1996-06-18 Bodenseewerk Geratetechnik Gmbh Picture detecting sensor unit

Cited By (58)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2000033261A1 (fr) * 1998-11-27 2000-06-08 Footfall Limited Systeme de surveillance de pietons
WO2001031608A1 (fr) * 1999-10-27 2001-05-03 Sick Ag Dispositif pour reguler le flux de circulation a un carrefour, notamment pour reguler les feux
US6922251B1 (en) 1999-11-29 2005-07-26 Specialty Minerals (Michigan) Inc. Measurement of the wear of the fireproof lining of a metallurgical vessel
WO2001038900A1 (fr) * 1999-11-29 2001-05-31 Specialty Minerals Michigan Inc. Mesure de l'usure du revetement refractaire d'un recipient metallurgique
US7345271B2 (en) 2002-09-25 2008-03-18 Ibeo Automobile Sensor Gmbh Optoelectric sensing device with common deflection device
WO2004036245A3 (fr) * 2002-09-25 2004-09-10 Ibeo Automobile Sensor Gmbh Dispositif de detection optoelectronique
DE102004050682B4 (de) 2003-11-18 2022-03-24 Riegl Laser Measurement Systems Gmbh Einrichtung zur Aufnahme eines Objektraumes
DE102004012220A1 (de) * 2004-03-12 2005-09-29 Sick Ag Optoelektronischer Sensor
DE102004012220B4 (de) * 2004-03-12 2018-05-03 Sick Ag Optoelektronischer Sensor
US7405676B2 (en) 2004-09-10 2008-07-29 Gatsometer B.V. Method and system for detecting with laser the passage by a vehicle of a point for monitoring on a road
US10739460B2 (en) 2010-08-11 2020-08-11 Apple Inc. Time-of-flight detector with single-axis scan
WO2012042043A1 (fr) * 2010-10-01 2012-04-05 Fastcom Technology Sa Système et procédé d'individualisation de personnes
EP2781932B1 (fr) * 2011-12-22 2021-05-26 LG Electronics Inc. Appareil de mesure de distance
JP2013170962A (ja) * 2012-02-22 2013-09-02 Ricoh Co Ltd 距離測定装置
US8988664B2 (en) 2012-02-22 2015-03-24 Ricoh Company, Ltd. Distance measuring device
CN103293530A (zh) * 2012-02-22 2013-09-11 株式会社理光 距离测量装置
EP2631667A1 (fr) * 2012-02-22 2013-08-28 Ricoh Company, Ltd. Dispositif de mesure de distance
EP2682781A3 (fr) * 2012-07-03 2015-03-18 Ricoh Company, Ltd. Dispositif radar laser
US9188674B2 (en) 2012-07-03 2015-11-17 Ricoh Company, Ltd. Laser radar device
WO2014072106A1 (fr) * 2012-11-08 2014-05-15 Valeo Schalter Und Sensoren Gmbh Dispositif de détection optoélectronique à tension de polarisation ajustable d'un photodétecteur à avalanche destiné à un véhicule automobile, véhicule automobile et procédé afférent
EP2972081A4 (fr) * 2013-03-15 2016-11-09 Apple Inc Balayage en profondeur à l'aide de de multiples émetteurs
KR101762525B1 (ko) * 2013-03-15 2017-07-27 애플 인크. 다수의 이미터들을 이용한 깊이 주사를 위한 장치 및 방법
WO2014141115A2 (fr) 2013-03-15 2014-09-18 Primesense Ltd. Balayage en profondeur à l'aide de de multiples émetteurs
CN105393138B (zh) * 2013-07-16 2018-04-17 法雷奥开关和传感器有限责任公司 光‑电子检测装置和用于以扫描的方式检测机动车辆的周围环境的方法
CN105393138A (zh) * 2013-07-16 2016-03-09 法雷奥开关和传感器有限责任公司 光-电子检测装置和用于以扫描的方式检测机动车辆的周围环境的方法
US10048381B2 (en) 2013-07-16 2018-08-14 Valeo Schalter Und Sensoren Gmbh Opto-electronic detection device and method for sensing the surroundings of a motor vehicle by scanning
WO2015007506A1 (fr) * 2013-07-16 2015-01-22 Valeo Schalter Und Sensoren Gmbh Système de détection optoélectronique et procédé d'acquisition par balayage de l'environnement d'un véhicule automobile
US10261188B2 (en) 2014-01-31 2019-04-16 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Apparatus for capturing superimposed distance and intensity images
WO2015113892A1 (fr) * 2014-01-31 2015-08-06 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Dispositif de prise d'images de distance et d'intensité superposées
EP3091369A1 (fr) * 2015-05-05 2016-11-09 Sick Ag Lecteur laser
US10324171B2 (en) 2015-12-20 2019-06-18 Apple Inc. Light detection and ranging sensor
US10298913B2 (en) 2016-08-18 2019-05-21 Apple Inc. Standalone depth camera
KR102334432B1 (ko) * 2018-01-03 2021-12-03 주식회사 라이드로 라이다 광학 시스템
KR20190083145A (ko) * 2018-01-03 2019-07-11 주식회사 라이드로 라이다 광학 시스템
US11675050B2 (en) 2018-01-09 2023-06-13 Innovusion, Inc. LiDAR detection systems and methods
US12078755B2 (en) 2018-01-09 2024-09-03 Seyond, Inc. LiDAR detection systems and methods that use multi-plane mirrors
US11977184B2 (en) 2018-01-09 2024-05-07 Seyond, Inc. LiDAR detection systems and methods that use multi-plane mirrors
US11965980B2 (en) 2018-01-09 2024-04-23 Innovusion, Inc. Lidar detection systems and methods that use multi-plane mirrors
EP3521858A3 (fr) * 2018-02-06 2019-12-18 Sick AG Capteur optoélectronique et procédé de détection d'objets dans une zone de surveillance
US11480707B2 (en) 2018-02-06 2022-10-25 Sick Ag Optoelectronic sensor and method of detecting objects in a monitoring zone
US12276759B2 (en) 2018-06-15 2025-04-15 Seyond, Inc. LiDAR systems and methods for focusing on ranges of interest
US11860313B2 (en) 2018-06-15 2024-01-02 Innovusion, Inc. LiDAR systems and methods for focusing on ranges of interest
US11675053B2 (en) 2018-06-15 2023-06-13 Innovusion, Inc. LiDAR systems and methods for focusing on ranges of interest
KR102287071B1 (ko) * 2018-07-24 2021-08-10 주식회사 라이드로 라이다 광학 장치
KR20200011313A (ko) * 2018-07-24 2020-02-03 문명일 라이다 광학 장치
FR3155336A1 (fr) 2018-11-09 2025-05-16 Kiomda Dispositif de capture de fréquentation miniaturisé
FR3088460A1 (fr) 2018-11-09 2020-05-15 Jean-Claude Dubois Dispositif de capture de frequentation miniaturise
WO2020094867A1 (fr) 2018-11-09 2020-05-14 Dubois Jean Claude Dispositif de capture de fréquentation miniaturisé
US11644543B2 (en) 2018-11-14 2023-05-09 Innovusion, Inc. LiDAR systems and methods that use a multi-facet mirror
US11686824B2 (en) 2018-11-14 2023-06-27 Innovusion, Inc. LiDAR systems that use a multi-facet mirror
US11977185B1 (en) 2019-04-04 2024-05-07 Seyond, Inc. Variable angle polygon for use with a LiDAR system
FR3099591A1 (fr) 2019-07-31 2021-02-05 Jean-Claude Dubois Capteur thermique stéréoscopique miniaturisé pour dispositif de comptage automatique
WO2021019065A1 (fr) 2019-07-31 2021-02-04 Kiomda, Sas Capteur thermique stéréoscopique miniaturisé pour dispositif de comptage automatique
JP2022541007A (ja) * 2020-03-05 2022-09-21 深▲せん▼市▲レイ▼神智能系統有限公司 プリズム及びマルチビームレーザーレーダー
US12072447B2 (en) 2021-04-22 2024-08-27 Seyond, Inc. Compact LiDAR design with high resolution and ultra-wide field of view
WO2022225859A1 (fr) * 2021-04-22 2022-10-27 Innovusion, Inc. Conception lidar compacte à haute résolution et à champ de vision ultra-large
US11662439B2 (en) 2021-04-22 2023-05-30 Innovusion, Inc. Compact LiDAR design with high resolution and ultra-wide field of view
EP4478083A1 (fr) * 2023-06-15 2024-12-18 Sick Ag Scanner multi-plan et procédé de détection d'objets

Also Published As

Publication number Publication date
AU4822297A (en) 1998-05-11

Similar Documents

Publication Publication Date Title
US5793491A (en) Intelligent vehicle highway system multi-lane sensor and method
WO1998016801A1 (fr) Capteur multi-voies pour systeme autoroutier intelligent pour vehicules
US5757472A (en) Intelligent vehicle highway system sensor and method
US6304321B1 (en) Vehicle classification and axle counting sensor system and method
US5249157A (en) Collision avoidance system
JP6195833B2 (ja) 改良されたレーザ距離センサ
US20020140924A1 (en) Vehicle classification and axle counting sensor system and method
US5202742A (en) Laser radar for a vehicle lateral guidance system
US6404506B1 (en) Non-intrusive laser-based system for detecting objects moving across a planar surface
US5321490A (en) Active near-field object sensor and method employing object classification techniques
WO1996034252A9 (fr) Detecteur pour systeme intelligent de regulation de la circulation sur les autoroutes et procede d'utilisation
WO1996034252A1 (fr) Detecteur pour systeme intelligent de regulation de la circulation sur les autoroutes et procede d'utilisation
US9684064B2 (en) Apparatus and method for determining a vehicle feature
EP0918232A2 (fr) Télémètre
JP5518525B2 (ja) 通行物体管理システム
JPH11232587A (ja) 検出装置、車両計測装置、車軸検出装置および通過料金算出装置
JP2014167802A (ja) 車両判定システム
JP2011186525A (ja) 通行物体管理システム
Shinmoto Vehicle optical sensor
Schwartz et al. Wide-area traffic surveillance (WATS) system

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BB BG BR BY CA CH CN CZ DE DK EE ES FI GB GE HU IL IS JP KE KG KP KR KZ LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TR TT UA UG US UZ VN AM AZ BY KG KZ MD RU TJ TM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH KE LS MW SD SZ UG ZW AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: CA