
CN108007456A - A kind of indoor navigation method, apparatus and system - Google Patents

A kind of indoor navigation method, apparatus and system

Info

Publication number
CN108007456A
CN108007456A
Authority
CN
China
Prior art keywords
icon
indoor navigation
navigation device
indoor
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201711279091.7A
Other languages
Chinese (zh)
Inventor
刘振宇
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Focalcrest Inc
Original Assignee
Focalcrest Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Focalcrest Inc filed Critical Focalcrest Inc
Priority to CN201711279091.7A priority Critical patent/CN108007456A/en
Publication of CN108007456A publication Critical patent/CN108007456A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/18Stabilised platforms, e.g. by gyroscope
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/20Instruments for performing navigational calculations
    • G01C21/206Instruments for performing navigational calculations specially adapted for indoor navigation

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Navigation (AREA)

Abstract

The invention discloses an indoor navigation method, apparatus and system. The method comprises the steps of: determining the attitude angle, initial yaw angle and initial state information of an indoor navigation device; searching for, detecting, shooting and decoding, in real time through a shooting module, the icons arranged on the ceiling, and obtaining icon information; and calculating the real-time position information of the indoor navigation device according to the icon information. The invention performs global positioning using inertial navigation technology, estimates the state from accelerometer, gyroscope and magnetometer data, and uses multiple icons arranged on the ceiling to correct and eliminate the accumulated error of inertial navigation in time, greatly improving the precision of indoor navigation and providing accurate indoor positioning and navigation.

Description

Indoor navigation method, device and system
Technical Field
The invention relates to the technical field of indoor navigation and positioning, in particular to an indoor navigation method, device and system.
Background
With the development of science and technology, the demand for indoor navigation keeps growing. In complex indoor scenes that require navigation and positioning, such as large malls, exhibitions, parking lots, hospitals and airports, people's position information is very important.
The core of indoor positioning is global positioning of the system: by sensing spatial characteristic quantities, a corresponding algorithm calculates the physical or logical position information of the carrier. Inertial navigation technology, built on Newtonian classical mechanics, can autonomously calculate the instantaneous speed and position of a carrier platform and provide state information without depending on additionally deployed facilities, making it the mainstream scheme for "passive" indoor positioning. However, its accumulated error is serious, resulting in poor positioning accuracy over time; it cannot meet long-duration positioning requirements on its own, and the accumulated error must be corrected by external auxiliary measures.
Currently, the existing indoor positioning technologies include Bluetooth (Bluetooth) positioning technology, Ultra Wideband (Ultra wide band) technology, infrared positioning technology, Radio Frequency Identification (RFID) technology, wireless fidelity (WiFi) positioning technology, magnetic field positioning technology, and computer vision positioning technology, and they mainly use radio frequency, infrared distance measurement, fingerprint matching, image recognition, and other modes to achieve indoor positioning. However, these techniques have not been widely used in public due to high deployment costs, limited accuracy, susceptibility to interference, difficulty in commercial application, and the like.
A positioning scheme based on a single sensor can hardly solve the global positioning problem, and existing indoor positioning schemes have the following defects. (1) Weak anti-interference capability. Systems based on Bluetooth indoor positioning have poor stability and are easily disturbed by noise; WiFi-based indoor positioning can only be applied over a small range, cannot solve the global positioning problem, is easily disturbed by other signals, and consumes much energy; ultrasonic positioning is susceptible to multipath effects and non-line-of-sight propagation; infrared cannot pass through obstacles, so it works only in line of sight and is easily disturbed by indoor lighting. (2) High equipment cost. Bluetooth indoor positioning equipment is expensive; ultrasonic positioning requires heavy investment in underlying hardware facilities, so the cost is too high; ultra-wideband positioning and indoor GPS positioning require a large number of correlators and are also costly.
Disclosure of Invention
The invention aims to solve the technical problems of weak anti-interference capability and high positioning equipment cost in the prior art, and provides an indoor navigation method, device and system.
The technical scheme adopted by the invention for solving the technical problems is as follows: according to an aspect of the present invention, an indoor navigation method is provided, which specifically includes the steps of:
determining an attitude angle, an initial yaw angle and initial state information of the indoor navigation device;
searching for, detecting, shooting and decoding, in real time through a shooting module, the icons arranged on a ceiling, and obtaining icon information; the shooting module is arranged on the indoor navigation device;
and calculating real-time position information of the indoor navigation device according to the icon information.
Preferably, the step of searching, detecting, shooting, decoding and acquiring icon information in real time for the icon disposed on the ceiling through the shooting module specifically includes:
searching, detecting and shooting icon images of the icons in real time;
carrying out binarization on the icon image and determining the icon outline of the icon image;
screening the detected contours and determining as a target icon a contour whose outline is quadrilateral;
judging whether the target icon is a pre-stored icon or not, and if so, recording image coordinates of four vertexes of the target icon;
sub-pixel refining the image coordinate positions of the four vertexes of the target icon;
the pose of the icon with respect to the photographing module is calculated by P4P. Specifically, the calculation formula for calculating the pose of the target icon relative to the shooting module through P4P is as follows:
preferably, the calculation model adopted for calculating the real-time position information of the indoor navigation device according to the icon information is an IMU pre-integration model; the formula of the IMU pre-integration model is as follows:

R_WB(t + Δt) = R_WB(t) · Exp(ω_B(t) · Δt)
v_W(t + Δt) = v_W(t) + g_W · Δt + R_WB(t) · a_B(t) · Δt
p_W(t + Δt) = p_W(t) + v_W(t) · Δt + ½ · g_W · Δt² + ½ · R_WB(t) · a_B(t) · Δt²

wherein R_WB is the rotation matrix of the indoor navigation device, v_W is the velocity of the indoor navigation device, and p_W is the position information of the indoor navigation device.
Preferably, a_B is the acceleration measured by the accelerometer of the indoor navigation device and ω_B is the angular velocity measured by the gyroscope of the indoor navigation device; the calculation formula (2) for ω_B and a_B is:

a_B(t) = R_WB(t)ᵀ · (a_W(t) − g_W) + b_a(t) + η_a(t)
ω_B(t) = ω(t) + b_g(t) + η_g(t)

wherein b_a is the acceleration deviation (bias) affecting the acceleration measurement a_B, b_g is the angular velocity deviation (bias) affecting the angular velocity measurement ω_B, η_a is the noise affecting the measurement a_B, and η_g is the noise affecting the measurement ω_B.
Preferably, the number of the icons is multiple, and the icons are passive directional icons.
Preferably, the plurality of icons can be arranged at any position on the ceiling, and the relative positions of the icons are known clearly and accurately.
According to another aspect of the present invention, an indoor navigation device is provided, which includes a control module, a shooting module connected to the control module, and an IMU module; wherein the IMU module comprises an accelerometer, a gyroscope, and a magnetometer; the shooting module is a high-definition camera module with a lens facing the ceiling; the control module is used for executing the indoor navigation method.
According to another aspect of the present invention, there is provided an indoor navigation system, comprising a plurality of icons disposed on an indoor ceiling, and the indoor navigation device as described above.
The technical scheme for implementing the indoor navigation method, the device and the system has the following advantages or beneficial effects: the invention adopts the inertial navigation technology to carry out global positioning, carries out state estimation according to the data of the accelerometer, the gyroscope and the magnetometer, and adopts a plurality of icons arranged on the ceiling to correct and eliminate the accumulated error of the inertial navigation in time, thereby greatly improving the precision of the indoor navigation and providing the accurate indoor positioning navigation.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without inventive efforts, wherein:
FIG. 1 is a schematic flow chart diagram of an embodiment of an indoor navigation method of the present invention;
FIG. 2 is a schematic structural diagram of an indoor navigation system according to an embodiment of the present invention;
FIG. 3 is a schematic diagram of a method of an embodiment of an indoor navigation system of the present invention;
FIG. 4 is a schematic diagram of a position of an embodiment of an indoor navigation method of the present invention;
fig. 5 is a schematic view of a scene layout of an indoor navigation system according to an embodiment of the present invention.
Detailed Description
In order that the objects, aspects and advantages of the invention will become more apparent, the various embodiments described hereinafter refer to the accompanying drawings which form a part hereof, and in which are shown by way of illustration various embodiments in which the invention may be practiced. It is to be understood that other embodiments may be utilized and structural and functional modifications may be made to the embodiments set forth herein without departing from the scope and spirit of the present invention.
The first embodiment is as follows:
as shown in fig. 1 to 4, the present invention provides an embodiment of an indoor navigation method, for performing accurate positioning navigation on an indoor navigation device, which specifically includes the following steps:
s1, determining the attitude angle, the initial yaw angle and the initial state information of the indoor navigation device;
specifically, this step performs the initial alignment of the indoor navigation device: determining its attitude angle, determining the initial state information, and calibrating the initial yaw angle of the magnetometer. The shooting module is calibrated to obtain its internal parameters and distortion coefficients; the shooting module then detects an icon, and the calculation formula for the attitude angle is obtained from the relation between pixel coordinates and the image-plane coordinate system, the relation between the camera coordinate system and the world coordinate system, and the imaging projection relation.
More specifically, the step of determining the attitude angle, the initial yaw angle, and the initial state information of the indoor navigation device specifically includes:
s11, calibrating the shooting module, and acquiring internal parameters and distortion coefficients of the shooting module;
s12, detecting the icon arranged on the ceiling through a shooting module to obtain the pixel coordinate of the icon;
specifically, each icon is a passive directional icon containing unique ID information; the ID grid may be 3 × 3, 4 × 4, 5 × 5, and so on. Being passive, an icon neither sends nor receives any signal; it is directional, with the specific direction determined by its ID information. The icons can be arranged at arbitrary, even random, positions on the ceiling, but the positional relations among them must be clear and accurate, so the mutual distances between icons need to be measured precisely in advance. The required number of icons is determined jointly by the height of the ceiling above the floor and the wide-angle coverage of the shooting module's lens. For example, for a ceiling 3 m above the floor, the icons are arranged so that at least one icon can always be accurately identified by a camera module at floor level; to guarantee positioning accuracy, the distance between any 2 icons should not exceed 1.5 m, the icons must be pasted flat, and the ceiling must be essentially parallel to the floor, with any tilt angle no greater than 5 degrees.
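The geometric relationship between ceiling height, lens coverage and marker spacing can be sketched as follows. This is a rough bound, not a formula from the patent: the 90-degree field of view and the safety margin are illustrative assumptions; with those values the bound happens to reproduce the 1.5 m spacing quoted above for a 3 m ceiling.

```python
import math

def max_marker_spacing(ceiling_height_m, camera_fov_deg, margin=0.25):
    """Rough spacing bound so at least one ceiling marker stays in view.

    An upward-facing camera sees a circle of radius h*tan(fov/2) on the
    ceiling; keeping markers within a conservative fraction (`margin`) of
    that view diameter leaves slack for reliable identification near the
    image edge.  All parameter values here are illustrative assumptions.
    """
    radius = ceiling_height_m * math.tan(math.radians(camera_fov_deg) / 2.0)
    return margin * 2.0 * radius

# e.g. a 3 m ceiling and an assumed 90-degree lens -> 1.5 m spacing
spacing = max_marker_spacing(3.0, 90.0)
```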
And S13, calculating the attitude angle of the indoor navigation device.
S14, calibrating the initial yaw angle of the magnetometer;
s15, determining the initial state information, including setting the initial state position information of the indoor navigation device, such as setting the current position coordinate to (0,0,0) when the indoor navigation device is started.
S2, carrying out real-time searching, detecting, shooting and decoding on the icon arranged on the ceiling through the shooting module and obtaining the icon information; specifically, the shooting module is arranged on the indoor navigation device, and the lens of the shooting module faces the ceiling and can shoot the icons on the ceiling.
Specifically, the icon is searched for, detected, photographed and decoded by the shooting module; the decoding comprises a binarization thresholding algorithm and a contour-processing algorithm, and interference information is eliminated with the Otsu (OTSU) thresholding algorithm.
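The Otsu thresholding step mentioned above can be sketched in pure Python. This is a generic textbook implementation operating on a 256-bin grayscale histogram, not code from the patent: it picks the threshold that maximises the between-class variance.

```python
def otsu_threshold(histogram):
    """Otsu's method: choose the threshold t that maximises the
    between-class variance of a 256-bin grayscale histogram.
    Pixels <= t are treated as background."""
    total = sum(histogram)
    sum_all = sum(i * h for i, h in enumerate(histogram))
    sum_bg = 0.0
    w_bg = 0
    best_t, best_var = 0, -1.0
    for t in range(256):
        w_bg += histogram[t]          # background weight up to bin t
        if w_bg == 0:
            continue
        w_fg = total - w_bg           # foreground weight
        if w_fg == 0:
            break
        sum_bg += t * histogram[t]
        mean_bg = sum_bg / w_bg
        mean_fg = (sum_all - sum_bg) / w_fg
        var_between = w_bg * w_fg * (mean_bg - mean_fg) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t
```

On a cleanly bimodal histogram (e.g. dark marker cells against a bright ceiling) the returned threshold falls between the two modes.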
More specifically, the steps of searching, detecting, shooting, decoding and acquiring icon information in real time for the icon arranged on the ceiling through the shooting module specifically include:
s21, searching, detecting and shooting the icon image of the icon in real time;
s22, binarizing the icon image and determining the icon outline;
s23, screening the detected contours and determining as a target icon a contour whose outline is quadrilateral;
s24, judging whether the target icon is a pre-stored icon or not, and if so, recording image coordinates of four vertexes of the target icon;
s25, sub-pixel refinement of the image coordinate positions of the four vertexes of the target icon;
s26, calculating the poses of the target icons relative to the shooting module through P4P; specifically, the calculation formula 1 for calculating the pose of the target icon relative to the shooting module through P4P is as follows:
if the intrinsic parameter matrix is known, then knowing 4 or more points that are not co-planar and co-linear allows the pose of the camera to be calculated. The attitude of the indoor navigation device can be obtained by using 4 or more coplanar and non-collinear points, namely R and t are obtained through calculation, specifically, R is a rotation matrix, and t is a displacement vector.
The transformation from the world coordinate system to the camera coordinate system requires the matrix [R | t], where R is the rotation matrix and t is the displacement vector. If a point in the world coordinate system is X and its counterpart in the camera coordinate system is X', then X' = [R | t] · X. The transformation from the camera coordinate system to the ideal screen coordinate system requires the intrinsic parameter matrix C, so the ideal screen coordinate is L = C · [R | t] · X. To obtain [R | t]: the world coordinates X of several key points on a known template are known, and the screen coordinates L of the corresponding points are known from the frame captured by the camera; an initial value of [R | t] is obtained by solving the linear equation system, and the optimal transformation matrix [R | t] is then obtained by iterating with a nonlinear least-squares method.
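The projection chain L = C · [R | t] · X described above can be sketched with numpy. The intrinsic matrix values below are illustrative, not taken from the patent; the example projects a point on the optical axis and recovers the principal point.

```python
import numpy as np

def project(C, R, t, X_world):
    """Project a 3-D world point through L = C [R|t] X and dehomogenise.
    C is the 3x3 intrinsic matrix; R, t are the world-to-camera rotation
    and translation."""
    X_cam = R @ X_world + t           # world -> camera coordinates
    u, v, w = C @ X_cam               # camera -> homogeneous pixel coords
    return np.array([u / w, v / w])   # ideal screen coordinates

# illustrative intrinsics: focal length 800 px, principal point (320, 240)
C = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])
R = np.eye(3)                         # camera aligned with world axes
t = np.zeros(3)
# a point 2 m straight ahead lands at the principal point
pixel = project(C, R, t, np.array([0.0, 0.0, 2.0]))
```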
In more detail, the principle of the visual algorithm of the camera module is as follows:
(1) and carrying out binarization on the shot icon image.
(2) And finding the contour after binarization.
Specifically, topological analysis of the digital binary image determines the surrounding relationships of its borders, i.e. the outer borders, the hole borders and their hierarchical relationship. Since borders correspond one-to-one with regions of the original image (an outer border corresponds to a connected region of pixel value 1, a hole border to a region of pixel value 0), the original image can be represented by its borders. The input binary image contains only 0s and 1s, with pixel values denoted f(i, j). During each raster scan, a border starting point is detected in two cases: A. f(i, j−1) = 0 and f(i, j) = 1, in which case f(i, j) is the starting point of an outer border; B. f(i, j) ≥ 1 and f(i, j+1) = 0, in which case f(i, j) is the starting point of a hole border.
Then, starting from the starting point, the pixels on the border are marked. Each newly discovered border is assigned a unique identifier called NBD. Initially NBD is 1, and it is incremented by 1 each time a new border is found. During this process, when f(p, q) = 1 and f(p, q+1) = 0, f(p, q) is set to −NBD.
(3) And finding out the outline of the quadrangle as a candidate icon for icon detection.
After step (2), the binary image contains many polygonal contours, among which we must find the visual markers. The outermost ring of a visual marker is a black square outline, so all quadrilaterals in the image are taken as candidates (some of them contain visual markers, others do not).
(4) The side lengths of each candidate quadrilateral are checked, and if its smallest side is shorter than 1/2 of its largest side, the candidate icon is discarded. Not every quadrilateral contour contains a visual marker; since the outermost ring of a marker is square, and allowing for the influence of noise, a quadrilateral whose minimum side length is less than 1/2 of its maximum side length is considered not to be a square.
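The side-length screening of step (4) can be sketched as a small predicate. This is an illustrative re-statement of the rule above, not the patent's code; the two example quadrilaterals are made up.

```python
import math

def plausible_square(quad, ratio=0.5):
    """Reject a candidate quadrilateral whose shortest side is less than
    `ratio` times its longest side (step (4)).  `quad` is four (x, y)
    vertices listed in order around the contour."""
    sides = [math.dist(quad[i], quad[(i + 1) % 4]) for i in range(4)]
    return min(sides) >= ratio * max(sides)

# a unit square passes; a long thin sliver is discarded
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
sliver = [(0, 0), (4, 0), (4, 0.5), (0, 0.5)]
```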
(5) It is judged whether the border of each candidate quadrilateral is completely black, and if not, the quadrilateral is discarded.
Since the outermost ring (of the 5 × 5 grid) of a visual marker is entirely black, step (5) checks whether the quadrilateral detected in step (4) has a fully black border. First, the quadrilateral is perspective-transformed into a small 100 × 100-pixel image (this small image is the image of the visual marker), so each black cell in the outermost ring occupies 20 × 20 pixels. Each cell is then judged individually (a cell is considered black if more than half of its pixels are black). Only when all cells of the outer ring are black is the quadrilateral's border deemed completely black.
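Once each cell has been classified black or white, the border test of step (5) reduces to a ring check over the 5 × 5 grid. A minimal sketch (0 = black, 1 = white; the example marker is invented):

```python
def border_all_black(grid):
    """Step (5): keep a candidate only if every cell in the outermost
    ring of its 5x5 marker grid is black (0)."""
    n = len(grid)
    ring = [grid[i][j] for i in range(n) for j in range(n)
            if i in (0, n - 1) or j in (0, n - 1)]
    return all(cell == 0 for cell in ring)

# a marker with an all-black border and an arbitrary 3x3 payload inside
marker = [[0, 0, 0, 0, 0],
          [0, 1, 0, 1, 0],
          [0, 0, 1, 0, 0],
          [0, 1, 0, 1, 0],
          [0, 0, 0, 0, 0]]
```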
(6) It is detected whether the candidate icon is an existing icon ID.
If the check in step (5) succeeds, the quadrilateral is a target icon and its interior is decoded. The coding region of the designed icon is a 3 × 3 grid of alternating black and white cells, with a black cell representing 0 and a white cell representing 1. The 0/1 codes of the visual markers affixed to the ceiling are stored in the program in advance; the code decoded from inside the quadrilateral is matched against these stored codes, and if it corresponds to a marker on the ceiling, its ID is recorded.
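The 3 × 3 decode and table lookup of step (6) can be sketched as follows. The code table and IDs here are hypothetical placeholders, not the patent's actual marker dictionary.

```python
def decode_marker(cells, known_codes):
    """Decode the inner 3x3 grid row by row (black cell = 0, white = 1)
    into a bit string and look it up in the table of codes stored in
    advance.  Returns the marker ID, or None if the code is unknown."""
    bits = "".join("1" if cell else "0" for row in cells for cell in row)
    return known_codes.get(bits)

KNOWN = {"101010101": 1, "110011001": 2}   # illustrative IDs only
grid = [[1, 0, 1],
        [0, 1, 0],
        [1, 0, 1]]
marker_id = decode_marker(grid, KNOWN)
```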
(7) And if the detection is successful, recording pixel coordinates of four vertexes of the icon.
Once the visual marker is detected, the four vertices of the outermost circle of visual markers (the four vertices of the full black quadrilateral outline) in the image are recorded.
(8) Sub-pixel refinement of the four vertex image coordinates.
Since an error of one whole pixel is too large, the coordinates of these four vertices are refined to sub-pixel (floating-point) precision.
(9) The pose of the icon with respect to the photographing module is calculated by P4P.
Taking the visual marker as the world coordinate system, the world coordinates of the four vertexes of the marker's outermost ring (the all-black outline) are obtained by manual measurement, and the four corresponding points are detected in the image. The pose of the visual marker relative to the camera is then obtained through the projection equation (equation 1). Each point yields two equations, and with 8 equations the variables R and t can be solved.
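How the "two equations per point" system arises can be illustrated with a Direct Linear Transform (DLT) for the planar marker. This is a sketch, not the patent's exact P4P solver: it estimates the 3 × 3 homography from the marker plane to the image (the real pipeline would further decompose it with the intrinsics to recover R and t, then refine by nonlinear least squares). The corner coordinates below are invented.

```python
import numpy as np

def homography_dlt(src, dst):
    """DLT: each of the 4 point pairs contributes two rows (two
    equations), giving an 8x9 system whose null vector (via SVD) is the
    3x3 homography mapping the marker plane to the image."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1,  0,  0,  0, u * x, u * y, u])
        A.append([ 0,  0,  0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.array(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]                # fix the overall scale

# marker corners on its own plane (metres) and their image pixels (made up)
src = [(0, 0), (1, 0), (1, 1), (0, 1)]
dst = [(100, 100), (300, 100), (300, 300), (100, 300)]
H = homography_dlt(src, dst)
```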
(10) The coordinate system is transferred to the icon by coordinate conversion and then to the first frame of the IMU.
Since the calculated pose is in the camera coordinate system (the pose of the visual marker in the moving camera's coordinate system), it is first inverted into the marker coordinate system (the pose of the moving camera in the stationary marker's coordinate system) and then transformed into the camera's first-frame coordinate system (the pose of the moving camera relative to the camera's first frame).
And S3, calculating the real-time position information of the indoor navigation device according to the icon information.
Specifically, global positioning adopts inertial navigation technology with an Inertial Measurement Unit (IMU) pre-integration model algorithm, estimating the state of the indoor navigation device from the accelerometer, gyroscope and magnetometer data. Because long IMU integration accumulates error, after the IMU has integrated for a period of time the shooting module is used to eliminate the accumulated error of the IMU pre-integration.
Extrinsic parameter calibration, i.e. the pose estimation problem, estimates the 3D pose of an object from a set of 2D point correspondences. The minimum of three corresponding points needed to recover the pose gives the "perspective-three-point problem" (P3P); the extension to N points is called "PnP". Vision-based pose estimation is classified by the number of cameras into monocular and multi-camera approaches, and by algorithm into model-based and learning-based pose estimation.
More specifically, the calculation model adopted for calculating the real-time position information of the indoor navigation device according to the icon information is an IMU pre-integration model; equation 2 of the IMU pre-integration model is:

R_WB(t + Δt) = R_WB(t) · Exp(ω_B(t) · Δt)
v_W(t + Δt) = v_W(t) + g_W · Δt + R_WB(t) · a_B(t) · Δt
p_W(t + Δt) = p_W(t) + v_W(t) · Δt + ½ · g_W · Δt² + ½ · R_WB(t) · a_B(t) · Δt²

wherein R_WB is the rotation matrix of the indoor navigation device (IMU), v_W is the velocity of the indoor navigation device (IMU), and p_W is the position information of the indoor navigation device (IMU). Specifically, the IMU pre-integration model runs between two input image frames to calculate the pose of the indoor navigation device and output the positioning information between the two frames; because IMU pre-integration accumulates error, the accurate position solved by visual projection is used to eliminate the accumulated error of the IMU pre-integration model, achieving accurate positioning of the indoor navigation device. Specifically, with the IMU pre-integration model, the reference frame of the IMU is denoted B and the reference frame of the camera is denoted C; a_B is the acceleration measured by the accelerometer of the indoor navigation device and ω_B is the angular velocity measured by its gyroscope. The calculation formula 3 for ω_B and a_B is:

a_B(t) = R_WB(t)ᵀ · (a_W(t) − g_W) + b_a(t) + η_a(t)
ω_B(t) = ω(t) + b_g(t) + η_g(t)

wherein b_a is the acceleration deviation (bias) affecting the acceleration measurement a_B, b_g is the angular velocity deviation (bias) affecting the angular velocity measurement ω_B, η_a is the noise affecting the measurement a_B, and η_g is the noise affecting the measurement ω_B.
In this embodiment, the velocity of the indoor navigation device is obtained by integrating the accelerometer and the position by double integration, while the angle information is integrated from the gyroscope; both accumulate error. The position obtained by IMU pre-integration is corrected with the position information obtained from the icon detection to eliminate this error, and IMU pre-integration supplies the positioning information for the time between two image frames.
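The double-integration step can be sketched with numpy under simplifying assumptions (no rotation, known bias, noise-free input — a flat-world toy, not the patent's estimator). Symbols follow the text: a_B is the measured acceleration, b_a its bias, g_W gravity.

```python
import numpy as np

def integrate_imu(p, v, a_meas, b_a, g, dt):
    """One dead-reckoning step: remove the estimated accelerometer bias,
    add gravity back, then integrate twice (velocity, then position).
    Assumes the body frame stays aligned with the world frame."""
    a = a_meas - b_a + g                  # bias-corrected world acceleration
    v_new = v + a * dt                    # first integral  -> velocity
    p_new = p + v * dt + 0.5 * a * dt**2  # second integral -> position
    return p_new, v_new

g = np.array([0.0, 0.0, -9.81])           # gravity (world frame)
b_a = np.zeros(3)                         # assumed-known accelerometer bias
p, v = np.zeros(3), np.zeros(3)
# constant 1 m/s^2 forward acceleration; a stationary z-axis reads +9.81
a_meas = np.array([1.0, 0.0, 9.81])
for _ in range(200):                      # one second of 200 Hz samples
    p, v = integrate_imu(p, v, a_meas, b_a, g, dt=1.0 / 200.0)
```

After one second the device has moved 0.5 m forward at 1 m/s, matching p = ½at²; any unmodelled bias would grow quadratically in position, which is why the visual correction below is needed.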
In the present embodiment, by continuously repeating steps S2 and S3, the real-time position of the indoor navigation device (mobile carrier) is accurately determined and high-frequency positioning information is obtained.
To correct the accumulated error of inertial navigation, the invention uses a plurality of icons installed on a ceiling parallel to the ground. The shooting module performs marker detection on each input image frame, including binarization and contour extraction, and decodes the ID of each detected icon. Projecting the 4 known coplanar, non-collinear 3D points onto the two-dimensional image yields 8 equations; solving them gives the transformation between the icon and the shooting module (the indoor navigation device). The world coordinates are then transformed to the initial IMU frame by coordinate transformation, which yields the pose of the current IMU relative to the first-frame IMU, i.e. the corrected pose of the current IMU. The IMU publishes messages at 200 Hz, i.e. positioning information every 5 ms, while the pose computed by the shooting module's visual algorithm arrives at about 5 Hz–10 Hz; the IMU therefore accumulates roughly 20–40 position updates between visual fixes, and the shooting module corrects the IMU's accumulated error at each fix. With IMU pre-integral positioning as the main body and vision correcting its accumulated error, autonomous accurate positioning of the indoor navigation device is achieved.
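The timing relationship between the two rates can be sketched as bookkeeping: at 200 Hz IMU output and an assumed 5 Hz visual rate, exactly 40 pre-integration steps accumulate between visual corrections. This sketch only counts events; it is not the estimator itself.

```python
def correction_schedule(imu_hz=200, vision_hz=5, seconds=1):
    """Interleave IMU updates with visual fixes at the quoted rates
    (200 Hz IMU, ~5-10 Hz vision) and return the resulting event list;
    each 'vision_fix' is where the camera pose would reset IMU drift."""
    steps_per_fix = imu_hz // vision_hz
    events = []
    for step in range(imu_hz * seconds):
        events.append("imu")
        if (step + 1) % steps_per_fix == 0:
            events.append("vision_fix")
    return events

timeline = correction_schedule()
```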
The invention adopts the inertial navigation technology to carry out global positioning, carries out state estimation according to the data of the accelerometer, the gyroscope and the magnetometer, and adopts a plurality of icons arranged on the ceiling to correct and eliminate the accumulated error of the inertial navigation in time, thereby greatly improving the precision of the indoor navigation and providing the accurate indoor positioning navigation.
Example two:
as shown in fig. 2-5, the present invention further provides an embodiment of an indoor navigation apparatus, which includes a control module 11, a shooting module 12 connected to the control module 11, and an IMU module 13; in particular, the IMU module 13 includes an accelerometer, a gyroscope and a magnetometer (not shown). The shooting module 12 is a high-definition camera module whose lens faces upward toward the ceiling to shoot the plurality of icons disposed there. More specifically, the control module 11 may be a micro-controller or processor (such as a Raspberry Pi), which fuses the data acquired by each sensor with a Kalman filter algorithm to obtain the target state information. The control module 11 is configured to execute the indoor navigation method of the first embodiment; the specific method steps are not repeated here. It should be noted that the indoor navigation device 100 is disposed on a mobile carrier, which may be an indoor robot, a vehicle, or the like.
Example three:
as shown in figs. 2-5, the present invention further provides an embodiment of an indoor navigation system, which includes a plurality of icons 200 disposed on an indoor ceiling and the indoor navigation device 100 according to the second embodiment. The navigation method of the indoor navigation device is the indoor navigation method according to the first embodiment; the specific method steps are not repeated here. More specifically, the icons 200 are passive directional icons. They can be arranged at any position on the ceiling, provided their mutual positions are known clearly and accurately. The ceiling is required to be substantially parallel to the floor on which the indoor navigation device 100 operates; any inclination between the two should be within 10 degrees.
The indoor navigation system achieves real-time navigation on low-cost hardware. The MEMS (inertial) subsystem provides high-frequency positioning information, while the vision subsystem corrects, in real time, the error the MEMS subsystem accumulates between two vision frames. The system as a whole is observable, and the error does not grow with the accumulation of time and displacement; that is, the system error is convergent.
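Why the error stays bounded can be seen in a toy 1-D simulation (all numbers here are illustrative assumptions): a biased IMU integrated at 200 Hz drifts steadily, but a 5 Hz visual fix from the ceiling icons resets the accumulated error, so the worst-case error is capped at the drift accumulated over a single vision interval:

```python
import numpy as np

def simulate(seconds=10, imu_hz=200, vis_hz=5, bias=0.02):
    """Dead-reckon a 1-D position with a biased velocity at imu_hz and
    reset the estimate with an exact visual fix at vis_hz; return the
    worst position error observed over the whole run."""
    dt = 1.0 / imu_hz
    true_v = 0.5                       # true constant velocity, m/s
    est = pos = 0.0
    worst = 0.0
    for k in range(int(seconds * imu_hz)):
        pos += true_v * dt
        est += (true_v + bias) * dt    # biased IMU integration drifts
        if (k + 1) % (imu_hz // vis_hz) == 0:
            est = pos                  # visual fix from an icon resets error
        worst = max(worst, abs(est - pos))
    return worst
```

With these numbers the worst-case error stays around bias x 0.2 s, i.e. roughly 4 mm, no matter how long the system runs, whereas without the visual fixes the drift would grow linearly with time.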
As will be apparent to one skilled in the art from the foregoing description, the features described herein can be implemented as a method, a data processing system, or a computer program product. Accordingly, these features may be embodied entirely in hardware, entirely in software, or in a combination of hardware and software. Furthermore, they may take the form of a computer program product stored on one or more computer-readable storage media having computer-readable program code segments or instructions embodied therein. Any suitable computer-readable storage medium may be utilized, including hard disks, CD-ROMs, optical storage devices, magnetic storage devices, and/or any combination of the foregoing.
While the invention has been described with reference to a preferred embodiment, it will be understood by those skilled in the art that various changes may be made and equivalents may be substituted for elements thereof without departing from the spirit and scope of the invention. In addition, many modifications may be made to adapt a particular situation or material to the teachings of the invention without departing from the essential scope thereof. Therefore, it is intended that the invention not be limited to the particular embodiment disclosed, but that the invention will include all embodiments falling within the scope of the appended claims.
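As a companion to the embodiments above, the front end of the described pipeline (binarization followed by quadrilateral vertex extraction) can be sketched minimally. This toy version is an assumption-laden simplification: the function names are illustrative, and it handles only an axis-aligned square marker, whereas the embodiment traces contours and screens arbitrary quadrilaterals. It shows the data flow from a grayscale frame to the four vertex coordinates handed to the P4P solver:

```python
import numpy as np

def binarize(gray, thresh=128):
    """Global threshold, as in the embodiment's binarization step."""
    return (gray > thresh).astype(np.uint8)

def quad_corners(mask):
    """Simplified vertex extraction for an axis-aligned square marker;
    a real detector traces outlines and screens quadrilateral contours."""
    ys, xs = np.nonzero(mask)
    x0, x1 = xs.min(), xs.max()
    y0, y1 = ys.min(), ys.max()
    # vertex order: top-left, top-right, bottom-right, bottom-left
    return np.array([[x0, y0], [x1, y0], [x1, y1], [x0, y1]], float)
```

The four vertices produced here would then be refined to sub-pixel accuracy and passed to the P4P pose computation.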

Claims (10)

1. An indoor navigation method, characterized by comprising the following steps:
determining an attitude angle, an initial yaw angle and initial state information of the indoor navigation device;
searching for, detecting, photographing and decoding, in real time and through a shooting module, icons arranged on a ceiling, so as to obtain icon information, the shooting module being arranged on the indoor navigation device;
and calculating real-time position information of the indoor navigation device according to the icon information.
2. The indoor navigation method of claim 1, wherein the step of searching for, detecting, photographing and decoding the icons arranged on the ceiling in real time through the shooting module to obtain the icon information specifically comprises:
searching for, detecting and photographing an icon image of the icon in real time;
binarizing the icon image and determining the icon outline in the icon image;
screening the outlines and determining, as the target icon, an icon whose outline is quadrilateral;
judging whether the target icon is a pre-stored icon, and if so, recording the image coordinates of the four vertexes of the target icon;
refining the image coordinates of the four vertexes of the target icon to sub-pixel accuracy;
calculating the pose of the target icon with respect to the shooting module by P4P.
3. The indoor navigation method according to claim 2, wherein the pose of the target icon with respect to the shooting module is calculated through P4P according to the projection relation
$s\,[u\ \ v\ \ 1]^{T} = K\,[R \mid t]\,[X\ \ Y\ \ Z\ \ 1]^{T}$,
wherein $(u, v)$ are the image coordinates of a vertex, $(X, Y, Z)$ are its coordinates in the icon frame, $K$ is the camera intrinsic matrix, and $[R \mid t]$ is the sought pose.
4. The indoor navigation method of claim 1, wherein the calculation model for calculating the real-time position information of the indoor navigation device according to the icon information is an IMU pre-integration model, the IMU pre-integration model being:
$R_{k+1} = R_k\,\mathrm{Exp}\big((\hat{\omega}_k - b_g - \eta_g)\,\Delta t\big)$,
$v_{k+1} = v_k + g\,\Delta t + R_k(\hat{a}_k - b_a - \eta_a)\,\Delta t$,
$p_{k+1} = p_k + v_k\,\Delta t + \tfrac{1}{2}g\,\Delta t^{2} + \tfrac{1}{2}R_k(\hat{a}_k - b_a - \eta_a)\,\Delta t^{2}$,
wherein $R$ is the rotation matrix of the indoor navigation device, $v$ is the velocity of the indoor navigation device, and $p$ is the position information of the indoor navigation device.
5. The indoor navigation method according to claim 4, wherein $\hat{a}$ is the acceleration measured by an accelerometer of the indoor navigation device and $\hat{\omega}$ is the angular velocity measured by a gyroscope of the indoor navigation device, $\hat{a}$ and $\hat{\omega}$ being given by:
$\hat{a} = a + b_a + \eta_a$, $\hat{\omega} = \omega + b_g + \eta_g$,
wherein $b_a$ is the bias of the acceleration measurement, $b_g$ is the bias of the angular velocity measurement, $\eta_a$ is the noise of the acceleration measurement, and $\eta_g$ is the noise of the angular velocity measurement.
6. The indoor navigation method of claim 1, wherein there are a plurality of the icons, and the icons are passive directional icons.
7. The indoor navigation method of claim 6, wherein the plurality of icons can be arranged at any position on the ceiling, the icons having a clear and accurate positional relationship to one another.
8. An indoor navigation device is characterized by comprising a control module (11), a shooting module (12) connected with the control module (11), and an IMU module (13); the IMU module (13) comprises an accelerometer, a gyroscope and a magnetometer;
the control module (11) is adapted to perform an indoor navigation method according to any one of claims 1-7.
9. The indoor navigation device according to claim 8, wherein the shooting module (12) is a high-definition camera module with its lens facing the ceiling.
10. An indoor navigation system, comprising a plurality of icons (200) provided on an indoor ceiling, and an indoor navigation device (100) according to any one of claims 8 to 9.
CN201711279091.7A 2017-12-06 2017-12-06 A kind of indoor navigation method, apparatus and system Pending CN108007456A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711279091.7A CN108007456A (en) 2017-12-06 2017-12-06 A kind of indoor navigation method, apparatus and system


Publications (1)

Publication Number Publication Date
CN108007456A true CN108007456A (en) 2018-05-08

Family

ID=62057126

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711279091.7A Pending CN108007456A (en) 2017-12-06 2017-12-06 A kind of indoor navigation method, apparatus and system

Country Status (1)

Country Link
CN (1) CN108007456A (en)

Cited By (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109341691A (en) * 2018-09-30 2019-02-15 百色学院 An Intelligent Indoor Positioning System Based on Icon Recognition and Its Positioning Method
CN109506642A (en) * 2018-10-09 2019-03-22 浙江大学 A kind of robot polyphaser vision inertia real-time location method and device
CN109677217A (en) * 2018-12-27 2019-04-26 魔视智能科技(上海)有限公司 The detection method of tractor and trailer yaw angle
CN109743675A (en) * 2018-12-30 2019-05-10 广州小狗机器人技术有限公司 Indoor orientation method and device, storage medium and electronic equipment
CN110207692A (en) * 2019-05-13 2019-09-06 南京航空航天大学 A kind of inertia pre-integration pedestrian navigation method of map auxiliary
CN111006655A (en) * 2019-10-21 2020-04-14 南京理工大学 Multi-scene autonomous navigation positioning method for airport inspection robot
CN111197982A (en) * 2020-01-10 2020-05-26 北京航天众信科技有限公司 Heading machine pose deviation rectifying method, system and terminal based on vision and strapdown inertial navigation
CN111862219A (en) * 2020-07-29 2020-10-30 上海高仙自动化科技发展有限公司 Computer equipment positioning method and device, computer equipment and storage medium
CN113218394A (en) * 2021-04-20 2021-08-06 浙江大学 Indoor visual positioning method and system for flapping wing aircraft
US20210372798A1 (en) * 2020-05-29 2021-12-02 Peking University Visual navigation method and system for mobile devices based on qr code signposts

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104236540A (en) * 2014-06-24 2014-12-24 上海大学 Indoor passive navigation and positioning system and indoor passive navigation and positioning method
CN105486311A (en) * 2015-12-24 2016-04-13 青岛海通机器人系统有限公司 Indoor robot positioning navigation method and device
CN105957090A (en) * 2016-05-17 2016-09-21 中国地质大学(武汉) Monocular vision pose measurement method and system based on Davinci technology
CN107193279A (en) * 2017-05-09 2017-09-22 复旦大学 Robot localization and map structuring system based on monocular vision and IMU information


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Li Kaiqiang et al.: "Design of an Autonomous Underwater Vehicle Robot Based on Multi-thrusters and Attitude Fusion", Proceedings of the IEEE International Conference on Information and Automation (ICIA) *


Similar Documents

Publication Publication Date Title
CN108007456A (en) A kind of indoor navigation method, apparatus and system
JP7082545B2 (en) Information processing methods, information processing equipment and programs
JP5832341B2 (en) Movie processing apparatus, movie processing method, and movie processing program
Geiger et al. Automatic camera and range sensor calibration using a single shot
KR102627453B1 (en) Method and device to estimate position
JP5588812B2 (en) Image processing apparatus and imaging apparatus using the same
Tardif et al. Monocular visual odometry in urban environments using an omnidirectional camera
KR101725060B1 (en) Apparatus for recognizing location mobile robot using key point based on gradient and method thereof
KR101776622B1 (en) Apparatus for recognizing location mobile robot using edge based refinement and method thereof
CN105841687B (en) indoor positioning method and system
CN110345937A (en) Appearance localization method and system are determined in a kind of navigation based on two dimensional code
JP6499047B2 (en) Measuring device, method and program
CN104792312A (en) Indoor automatic transport vehicle positioning system with three fixed balls as visual marker
JP6758160B2 (en) Vehicle position detection device, vehicle position detection method and computer program for vehicle position detection
JP6479296B2 (en) Position / orientation estimation apparatus and position / orientation estimation method
KR20180067199A (en) Apparatus and method for recognizing object
JP2016200557A (en) Calibration apparatus, distance measuring apparatus, and calibration method
JP2017181476A (en) Vehicle position detection device, vehicle position detection method, and computer program for vehicle position detection
Khoshelham et al. Vehicle positioning in the absence of GNSS signals: Potential of visual-inertial odometry
JP2016148956A (en) Positioning device, positioning method and positioning computer program
KR102555269B1 (en) Posture estimation fusion method and system using omnidirectional image sensor and inertial measurement sensor
KR20220151572A (en) Method and System for change detection and automatic updating of road marking in HD map through IPM image and HD map fitting
US10859377B2 (en) Method for improving position information associated with a collection of images
JP6886136B2 (en) Alignment device, alignment method and computer program for alignment
TW201621273A (en) Mobile positioning apparatus and positioning method thereof

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication (application publication date: 20180508)