Disclosure of Invention
The present invention aims to solve at least one of the technical problems existing in the related art to a certain extent.
Therefore, an object of the embodiments of the present invention is to provide an unmanned aerial vehicle lap detection method, which can reduce the difficulty of lapping an unmanned aerial vehicle carrying a detection device onto a power transmission line, lower the operation skill required of operators, and improve the efficiency and effect of power transmission line detection.
Another object of the embodiment of the application is to provide an unmanned aerial vehicle overlap detection system.
In order to achieve the above technical purposes, the technical solutions adopted by the embodiments of the application are as follows:
In a first aspect, an embodiment of the present application provides an overlap detection method for an unmanned aerial vehicle, which is applied to the unmanned aerial vehicle, where the unmanned aerial vehicle is provided with a roller and a line detection device, the line detection device includes a radiation camera and an imaging background plate, and the overlap detection method includes:
acquiring a blank frame image and a current frame image from a vision camera of the unmanned aerial vehicle, and a roller linear coefficient of the roller, wherein the vision camera is used for acquiring a field-of-view image at the imaging background plate;
performing pixel difference processing on the current frame image according to the blank frame image to obtain a difference image;
performing coordinate comparison on the roller linear coefficient according to the difference image to obtain coefficient relative data;
and performing lap control on the roller according to the coefficient relative data, so that the unmanned aerial vehicle performs defect detection on the power transmission line through the line detection device.
In addition, the unmanned aerial vehicle lap joint detection method according to the embodiment of the application can also have the following additional technical characteristics:
Further, in one embodiment of the present application, the roller linear coefficient of the roller is obtained by:
acquiring a roller image;
extracting the roller center coordinates of the roller image to obtain roller coordinate points;
and obtaining the roller linear coefficient corresponding to the roller according to the roller coordinate point.
Further, in an embodiment of the present application, the performing pixel difference processing on the current frame image according to the blank frame image to obtain a difference image includes:
performing first matrix transformation on the blank frame image to obtain a blank frame matrix, and performing second matrix transformation on the current frame image to obtain a current frame matrix;
and performing matrix difference processing on the current frame matrix according to the blank frame matrix to obtain the difference image.
Further, in an embodiment of the present application, the performing matrix difference processing on the current frame matrix according to the blank frame matrix to obtain the difference image includes:
acquiring a preset first threshold;
calculating a matrix difference value of the blank frame matrix and the current frame matrix to obtain a difference matrix;
performing threshold comparison on the difference matrix according to the first threshold to obtain a difference threshold comparison result;
and if the difference threshold comparison result is that the difference matrix is larger than the first threshold, obtaining the difference image according to the difference matrix.
Further, in an embodiment of the present application, the performing coordinate comparison on the roller linear coefficient according to the difference image to obtain coefficient relative data includes:
performing image binarization on the difference image to obtain a binarized image;
according to a preset target pixel value, performing pixel screening on the binarized image to obtain a target pixel set;
performing pixel straight line fitting on the target pixel set to obtain a target pixel straight line;
and performing coefficient comparison on the roller linear coefficient according to the target pixel straight line to obtain the coefficient relative data.
Further, in an embodiment of the present application, the performing coefficient comparison on the roller linear coefficient according to the target pixel straight line to obtain the coefficient relative data includes:
acquiring a first pixel coefficient and a second pixel coefficient of the target pixel straight line, and a first straight line coefficient and a second straight line coefficient of the roller linear coefficients;
performing a first absolute difference calculation on the first straight line coefficient according to the first pixel coefficient to obtain a first relative coefficient, and performing a second absolute difference calculation on the second straight line coefficient according to the second pixel coefficient to obtain a second relative coefficient;
and integrating the first relative coefficient and the second relative coefficient to obtain the coefficient relative data.
Further, in an embodiment of the present application, the lap control of the roller according to the coefficient relative data includes:
acquiring a preset relative coefficient threshold group;
performing coincidence judgment on the first relative coefficient and the second relative coefficient according to the relative coefficient threshold group to obtain a coincidence judgment result, wherein the coincidence judgment result is used for indicating whether the roller straight line corresponding to the roller coincides with the target pixel straight line;
and if the coincidence judgment result is non-coincidence, controlling the unmanned aerial vehicle to move according to the coefficient relative data, and then returning to the step of acquiring the blank frame image and the current frame image from the vision camera of the unmanned aerial vehicle.
In a second aspect, an embodiment of the present application provides an unmanned aerial vehicle overlap detection system, applied to an unmanned aerial vehicle, where the unmanned aerial vehicle is provided with a roller and a line detection device, the line detection device includes a radiation camera and an imaging background plate, and the overlap detection system includes:
the acquisition module is used for acquiring a blank frame image and a current frame image from a vision camera of the unmanned aerial vehicle, and a roller linear coefficient of the roller, wherein the vision camera is used for acquiring a field-of-view image at the imaging background plate;
the processing module is used for performing pixel difference processing on the current frame image according to the blank frame image to obtain a difference image;
the comparison module is used for performing coordinate comparison on the roller linear coefficient according to the difference image to obtain coefficient relative data;
and the control module is used for performing lap control on the roller according to the coefficient relative data, so that the unmanned aerial vehicle performs defect detection on the power transmission line through the line detection device.
In a third aspect, an embodiment of the present application further provides an electronic device, including:
at least one processor;
at least one memory for storing at least one program;
The at least one program, when executed by the at least one processor, causes the at least one processor to implement the method of the first aspect described above.
In a fourth aspect, embodiments of the present application also provide a computer readable storage medium having stored therein a processor executable program which when executed by the processor is for implementing the method of the first aspect described above.
The advantages and benefits of the application will be set forth in part in the description which follows, and in part will be obvious from the description, or may be learned by practice of the application.
The unmanned aerial vehicle lap detection method, system, device and medium disclosed by the embodiments of the application are applied to an unmanned aerial vehicle provided with a roller and a line detection device, the line detection device comprising a radiographic camera and an imaging background plate. The method acquires a blank frame image and a current frame image from a vision camera of the unmanned aerial vehicle, together with a roller linear coefficient of the roller, the vision camera being used for acquiring a field-of-view image at the imaging background plate; performs pixel difference processing on the current frame image according to the blank frame image to obtain a difference image; performs coordinate comparison on the roller linear coefficient according to the difference image to obtain coefficient relative data; and performs lap control on the roller according to the coefficient relative data, so that the unmanned aerial vehicle performs defect detection on the power transmission line through the line detection device. Based on the pixel difference processing, whether the power transmission line is located in the field of view at the imaging background plate can be determined quickly and accurately, and the position of the roller relative to the power transmission line is obtained through the coordinate comparison, so that the movement of the unmanned aerial vehicle is better controlled and the roller is lapped onto the power transmission line. This effectively reduces the difficulty of lapping the unmanned aerial vehicle carrying the detection device onto the power transmission line, lowers the operation skill required of operators, and improves the efficiency and effect of power transmission line detection.
Detailed Description
Embodiments of the present application are described in detail below, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to like or similar elements or elements having like or similar functions throughout. The embodiments described below by referring to the drawings are illustrative only and are not to be construed as limiting the application. The step numbers in the following embodiments are set for convenience of illustration only, and the order between the steps is not limited in any way, and the execution order of the steps in the embodiments may be adaptively adjusted according to the understanding of those skilled in the art.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the application only and is not intended to be limiting of the application.
At present, in schemes that detect the strain clamp and the ground wire splicing sleeve of a power transmission line with an unmanned aerial vehicle carrying a detection device, an operator usually flies the unmanned aerial vehicle to a position near the strain clamp or the ground wire splicing sleeve and then performs the detection directly with the carried device. However, because of the carried weight and high-altitude environmental factors (such as wind speed and wind direction), keeping the unmanned aerial vehicle stable near the strain clamp and the ground wire splicing sleeve of the power transmission line is difficult, which places high demands on the operator's piloting skill. The operation is difficult, the quality of the captured detection images may be poor, and the detection effect for the strain clamp and the ground wire splicing sleeve tends to be unsatisfactory.
In view of the above, the embodiments of the invention provide an unmanned aerial vehicle lap detection method. Based on pixel difference processing, the method quickly and accurately determines whether the power transmission line is located in the field of view at the imaging background plate, and obtains the position of the roller relative to the power transmission line through coordinate comparison, so that the movement of the unmanned aerial vehicle is better controlled and the roller is lapped onto the power transmission line. This effectively reduces the difficulty of lapping the unmanned aerial vehicle carrying the detection device onto the power transmission line, lowers the operation skill required of operators, and improves the efficiency and effect of power transmission line detection.
Referring to fig. 1, in an embodiment of the present application, an overlap detection method of an unmanned aerial vehicle is applied to the unmanned aerial vehicle, the unmanned aerial vehicle is provided with a roller and a line detection device, the line detection device includes a radiation camera and an imaging background plate, and the overlap detection method includes:
Step 110, acquiring a blank frame image and a current frame image from a vision camera of the unmanned aerial vehicle, and a roller linear coefficient of the roller, wherein the vision camera is used for acquiring a field-of-view image at the imaging background plate;
In some embodiments, the roller linear coefficient in the step 110 is obtained by the following steps:
A1, acquiring a roller image;
a2, extracting the roller center coordinates of the roller image to obtain roller coordinate points;
a3, obtaining the roller linear coefficient corresponding to the roller according to the roller coordinate point.
In the embodiment of the present application, referring to fig. 2 and 3, the unmanned aerial vehicle may carry an imaging background plate 302 and rollers 303 on a bracket 301, and a vision camera 305 and a radiographic camera 306 on a flight platform 304 of the unmanned aerial vehicle. The number of rollers may be set according to the actual situation; the embodiment of the present application takes two rollers as an example. The radiographic camera is used for generating X-rays and forming an X-ray image at the imaging background plate, so as to capture detection images of the strain clamp and the ground wire splicing sleeve of the transmission line.
It is understood that the blank frame image may be a frame image of the imaging background plate captured after the vision camera is started, or a frame image captured by the vision camera when no transmission line blocks the imaging background plate. The current frame image may be the frame image currently captured by the vision camera.
It should be noted that the roller linear coefficient may be the two coefficients of the linear equation corresponding to the roller. Specifically, when there is one roller, the roller linear coefficient may be the two coefficients of the linear equation corresponding to the wheel groove of the roller; when there are two or more rollers, the roller linear coefficient may be the two coefficients of the linear equation corresponding to the coaxial rollers, which characterizes the straight line of the rollers used to lap onto the transmission line.
Specifically, the roller image in step A1 may also be captured by the vision camera, with the positions of the roller and of the imaging background plate in the image recorded. When there are two rollers, step A2 may be extracting the center coordinates of the two rollers in the roller image to obtain the center coordinates (x1, y1) of the first roller and the center coordinates (x2, y2) of the second roller, and taking the two center coordinates as the roller coordinate points. Step A3 may be substituting the center coordinates (x1, y1) and (x2, y2) into a linear equation and solving it, thereby calculating the roller linear coefficient (a1, b1) corresponding to the roller.
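As an illustrative sketch of steps A2 and A3 (the helper name `line_coefficients` is an assumption, and a non-vertical roller line with x1 ≠ x2 is assumed), the two coefficients can be solved from the two roller centers as follows:

```python
def line_coefficients(center1, center2):
    """Solve y = a*x + b through the two roller center points.

    Assumes the two centers have distinct x coordinates (non-vertical line).
    """
    (x1, y1), (x2, y2) = center1, center2
    a = (y2 - y1) / (x2 - x1)  # slope a1 of the roller straight line
    b = y1 - a * x1            # intercept b1
    return a, b
```

For example, centers (0, 1) and (2, 5) give the roller linear coefficient (2.0, 1.0).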
Step 120, performing pixel difference processing on the current frame image according to the blank frame image to obtain a difference image;
in some embodiments, the step 120 of performing pixel difference processing on the current frame image according to the blank frame image to obtain a difference image includes:
B1, performing first matrix transformation on the blank frame image to obtain a blank frame matrix, and performing second matrix transformation on the current frame image to obtain a current frame matrix;
In the embodiment of the present application, step 120 is used to obtain a difference image containing only the transmission line after the unmanned aerial vehicle has flown to the vicinity of the transmission line. Specifically, if the power transmission line has not entered the shooting field of view of the vision camera, the current frame image captured by the vision camera contains only the imaging background plate and/or the roller; if the power transmission line has entered the field of view, the current frame image also contains the power transmission line.
It can be understood that step B1 may be representing the pixel value or color channel value of each pixel point of the blank frame image in matrix form; the current frame image is handled in the same way by simple analogy.
And B2, performing matrix difference processing on the current frame matrix according to the blank frame matrix to obtain the difference image.
Further, the step B2 of performing matrix difference processing on the current frame matrix according to the blank frame matrix to obtain the difference image includes:
B21, acquiring a preset first threshold value;
B22, calculating a matrix difference value between the blank frame matrix and the current frame matrix to obtain a difference matrix;
B23, comparing the thresholds of the difference matrixes according to the first threshold to obtain a difference threshold comparison result;
And B24, if the difference threshold comparison result is that the difference matrix is larger than the first threshold, obtaining the difference image according to the difference matrix.
In the embodiment of the application, the specific value of the first threshold may be set according to the actual situation. The difference matrix in step B22 may be obtained by subtracting the blank frame matrix from the current frame matrix, and step B23 may be comparing each matrix element of the difference matrix against the preset first threshold. Specifically, when every matrix element of the difference matrix is smaller than or equal to the first threshold, that is, the difference threshold comparison result is that the difference matrix is smaller than or equal to the first threshold, the power transmission line has not entered the shooting field of view of the vision camera; at this time, the unmanned aerial vehicle may be controlled to move and the method returns to step 110. When a plurality of matrix elements of the difference matrix are larger than the first threshold, that is, the difference threshold comparison result is that the difference matrix is larger than the first threshold, the power transmission line has entered the shooting field of view of the vision camera and the unmanned aerial vehicle is near the power transmission line, and the difference image may be obtained directly from the difference matrix. It should be noted that the "plurality" here may be any integer of 1 or more, and the specific value may be set according to the actual situation.
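A minimal NumPy sketch of steps B21 to B24, under the assumption that both frames are grayscale matrices of equal shape (the function name and default threshold values are illustrative, not from the source):

```python
import numpy as np

def difference_image(blank, current, first_threshold=30, min_count=1):
    """Pixel difference between the blank frame matrix and current frame matrix.

    Returns the difference matrix and whether enough elements exceed the
    preset first threshold, i.e. whether the line has entered the view.
    """
    # cast to a signed type so the uint8 subtraction cannot wrap around
    diff = np.abs(current.astype(np.int16) - blank.astype(np.int16))
    in_view = np.count_nonzero(diff > first_threshold) >= min_count
    return diff, in_view
```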
Step 130, performing coordinate comparison on the roller linear coefficient according to the difference image to obtain coefficient relative data;
In some embodiments, the step 130 of comparing coordinates of the roller line coefficients according to the difference image to obtain coefficient relative data includes:
c1, performing image binarization on the difference image to obtain a binarized image;
c2, performing pixel screening on the binarized image according to a preset target pixel value to obtain a target pixel set;
C3, performing pixel straight line fitting on the target pixel set to obtain a target pixel straight line;
In the embodiment of the present application, step 130 is used to obtain the relative position (i.e., the coefficient relative data) between the straight line corresponding to the power transmission line in the difference image and the roller straight line. Step C1 may be obtaining a binary image (i.e., the binarized image) of the power transmission line in the difference image based on the maximum inter-class variance method, setting the pixel value of pixel points unrelated to the power transmission line to 0 and the pixel value of pixel points related to the power transmission line to 255.
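The maximum inter-class variance (Otsu) binarization of step C1 could be sketched in pure NumPy as follows; this is an illustrative implementation under the assumption of an 8-bit grayscale input, not the author's exact one:

```python
import numpy as np

def binarize_otsu(gray):
    """Binarize an 8-bit grayscale image with Otsu's method.

    Pixels above the chosen threshold become 255, the rest become 0.
    """
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    total = gray.size
    cum_w = np.cumsum(hist)                   # cumulative pixel counts
    cum_m = np.cumsum(hist * np.arange(256))  # cumulative intensity sums
    best_t, best_var = 0, -1.0
    for t in range(255):
        w0, w1 = cum_w[t], total - cum_w[t]
        if w0 == 0 or w1 == 0:
            continue
        m0 = cum_m[t] / w0                    # mean of the lower class
        m1 = (cum_m[-1] - cum_m[t]) / w1      # mean of the upper class
        var = w0 * w1 * (m0 - m1) ** 2        # between-class variance
        if var > best_var:
            best_var, best_t = var, t
    return np.where(gray > best_t, 255, 0).astype(np.uint8)
```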
It can be understood that step C2 may be extracting, in the image coordinate system, the pixel points with a pixel value of 255 in the binarized image together with their corresponding pixel coordinates, and integrating them to obtain the target pixel set. Step C3 may be fitting the pixel points with a pixel value of 255 based on the least squares method to obtain the target pixel straight line. Specifically, the minimum error function over all the pixel points with a pixel value of 255 in the target pixel set may be calculated, and its equivalent expression may be:
f = Σ_{i=1}^{n} [Yi − (a2·Xi + b2)]²
wherein f is the minimum error function over all pixel points of the target pixel set, i is the serial number of a pixel point with a pixel value of 255, n is the total number of pixel points with a pixel value of 255, Yi is the vertical-axis coordinate of pixel point i, Xi is the horizontal-axis coordinate of pixel point i, and a2 and b2 are the two coefficients of the linear equation corresponding to the target pixel straight line.
After the minimum error function is obtained, it is solved based on an optimization method, and the two coefficients a2 and b2 of the linear equation corresponding to the target pixel straight line can be obtained, thereby yielding the target pixel straight line corresponding to the target pixel set. The target pixel straight line is used for indicating the position of the power transmission line in the image captured by the vision camera.
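The least-squares fit of step C3, which minimizes the error function f above, can be sketched with NumPy's `polyfit`; the helper name `fit_target_line` is an assumption:

```python
import numpy as np

def fit_target_line(pixel_coords):
    """Fit y = a2*x + b2 through the (x, y) coordinates of 255-valued pixels."""
    xs = np.array([p[0] for p in pixel_coords], dtype=float)
    ys = np.array([p[1] for p in pixel_coords], dtype=float)
    a2, b2 = np.polyfit(xs, ys, 1)  # degree-1 least-squares fit
    return a2, b2
```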
And C4, performing coefficient comparison on the roller linear coefficient according to the target pixel straight line to obtain the coefficient relative data.
Further, the step C4 of performing coefficient comparison on the roller linear coefficient according to the target pixel straight line to obtain the coefficient relative data includes:
C41, acquiring the first pixel coefficient and the second pixel coefficient of the target pixel straight line, and the first straight line coefficient and the second straight line coefficient of the roller linear coefficients;
C42, performing a first absolute difference calculation on the first straight line coefficient according to the first pixel coefficient to obtain a first relative coefficient, and performing a second absolute difference calculation on the second straight line coefficient according to the second pixel coefficient to obtain a second relative coefficient;
and C43, integrating the first relative coefficient and the second relative coefficient to obtain the coefficient relative data.
In the embodiment of the present application, the first pixel coefficient of the target pixel straight line may be the aforementioned coefficient a2 and the second pixel coefficient may be the aforementioned coefficient b2, while the first straight line coefficient of the roller linear coefficients may be the aforementioned coefficient a1 and the second straight line coefficient may be the aforementioned coefficient b1. Step C42 may be calculating the absolute value of the difference between the corresponding coefficients, thereby obtaining the first relative coefficient and the second relative coefficient, which are taken together as the relative position (i.e., the coefficient relative data) between the power transmission line and the roller.
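Steps C42 and C43 reduce to two absolute differences between corresponding coefficients; a minimal illustrative sketch (the function name is an assumption):

```python
def coefficient_relative_data(a1, b1, a2, b2):
    """First and second relative coefficients between the roller straight
    line (a1, b1) and the target pixel straight line (a2, b2)."""
    return abs(a1 - a2), abs(b1 - b2)
```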
Step 140, performing lap control on the roller according to the coefficient relative data, so that the unmanned aerial vehicle performs defect detection on the power transmission line through the line detection device.
In some embodiments, step 140, performing lap control on the roller according to the coefficient relative data includes:
D1, acquiring a preset relative coefficient threshold value group;
d2, according to the relative coefficient threshold value group, performing coincidence judgment on the first relative coefficient and the second relative coefficient to obtain a coincidence judgment result, wherein the coincidence judgment result is used for indicating whether a roller straight line corresponding to the roller is coincident with the target pixel straight line or not;
And D3, if the coincidence judgment result is coincidence, controlling the unmanned aerial vehicle to move so that the roller is lapped onto the power transmission line; or, if the coincidence judgment result is non-coincidence, controlling the unmanned aerial vehicle to move through a PID control algorithm according to the coefficient relative data, and then returning to the step of acquiring the blank frame image and the current frame image from the vision camera of the unmanned aerial vehicle.
In the embodiment of the present application, the relative coefficient threshold set includes a first coefficient threshold and a second coefficient threshold, where specific values of the first coefficient threshold and the second coefficient threshold may be set according to practical situations, for example, any one of 0.05, 0.01, 0.2, and the like, and the present application is not limited herein. Step D2 may be comparing the magnitude relation between the first relative coefficient and the first coefficient threshold and comparing the magnitude relation between the second relative coefficient and the second coefficient threshold.
Specifically, when the first relative coefficient is smaller than the first coefficient threshold and the second relative coefficient is smaller than the second coefficient threshold, the target pixel straight line and the roller straight line where the roller is located can be considered coincident, and a coincidence judgment result representing coincidence is obtained; the unmanned aerial vehicle is then controlled to descend slowly so that its roller laps onto the power transmission line, and the line detection device stably performs defect detection on the strain clamp and the ground wire splicing sleeve of the power transmission line. Otherwise, when the first relative coefficient is greater than or equal to the first coefficient threshold, or the second relative coefficient is greater than or equal to the second coefficient threshold, the target pixel straight line can be considered not coincident with the roller straight line where the roller is located; the unmanned aerial vehicle may be controlled to move through a PID control algorithm according to the coefficient relative data, and the method then returns to step 110.
For example, an equivalent expression for controlling the unmanned aerial vehicle to move through the PID control algorithm in the embodiment of the application can be:
Δsj = Pj·[e(k) − e(k−1)] + Ij·e(k) + Dj·[e(k) − 2·e(k−1) + e(k−2)]
wherein Δsj is a movement speed of the unmanned aerial vehicle and j is a sequence number: when j = 1, Δsj is the up-and-down movement speed of the unmanned aerial vehicle; when j = 2, Δsj is the left-and-right movement speed; when j = 3, Δsj is the rotation speed around the center. Pj, Ij and Dj respectively represent the proportional, integral and differential factors of the PID control algorithm, and e(k), e(k−1) and e(k−2) respectively represent the errors at times k, k−1 and k−2.
It can be appreciated that the error e(k) at the k-th time can be calculated based on the coefficient relative data corresponding to the k-th time, and its equivalent expression may be:
e(k) = |a1 − a2| − A + |b1 − b2| − B
wherein a1 is the first straight line coefficient, b1 is the second straight line coefficient, a2 is the first pixel coefficient, b2 is the second pixel coefficient, A is the first coefficient threshold, and B is the second coefficient threshold.
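The incremental PID expression above, with the error built from the coefficient relative data, can be sketched as follows (the function names are illustrative assumptions):

```python
def pid_increment(e_k, e_k1, e_k2, P, I, D):
    """Incremental PID output Δs_j for one control axis.

    e_k, e_k1, e_k2 are the errors at times k, k-1 and k-2.
    """
    return P * (e_k - e_k1) + I * e_k + D * (e_k - 2 * e_k1 + e_k2)

def error_k(a1, b1, a2, b2, A, B):
    """Error e(k) = |a1 - a2| - A + |b1 - b2| - B from the relative data."""
    return abs(a1 - a2) - A + abs(b1 - b2) - B
```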
The following describes in detail a unmanned aerial vehicle overlap detection system according to an embodiment of the present application with reference to the accompanying drawings.
Referring to fig. 4, an overlap detection system of an unmanned aerial vehicle according to an embodiment of the present application is applied to an unmanned aerial vehicle, the unmanned aerial vehicle is provided with a roller and a line detection device, the line detection device includes a radiation camera and an imaging background plate, and the overlap detection system includes:
The acquisition module 101 is used for acquiring a blank frame image and a current frame image from a vision camera of the unmanned aerial vehicle, and a roller linear coefficient of the roller, wherein the vision camera is used for acquiring a field-of-view image at the imaging background plate;
the processing module 102 is configured to perform pixel difference processing on the current frame image according to the blank frame image to obtain a difference image;
the comparison module 103 is configured to perform coordinate comparison on the roller linear coefficient according to the difference image, so as to obtain coefficient relative data;
And the control module 104 is configured to perform lap joint control on the roller according to the coefficient relative data, so that the unmanned aerial vehicle performs defect detection on the power transmission line through the line detection device.
It can be understood that the content in the above method embodiment is applicable to the system embodiment, and the functions specifically implemented by the system embodiment are the same as those of the above method embodiment, and the achieved beneficial effects are the same as those of the above method embodiment.
Referring to fig. 5, an embodiment of the present application further provides an electronic device, including:
at least one processor 201;
At least one memory 202 for storing at least one program;
the at least one program, when executed by the at least one processor 201, causes the at least one processor 201 to implement the method embodiments described above.
Similarly, it can be understood that the content in the above method embodiment is applicable to the embodiment of the present apparatus, and the functions specifically implemented by the embodiment of the present apparatus are the same as those of the embodiment of the foregoing method, and the achieved beneficial effects are the same as those achieved by the embodiment of the foregoing method.
An embodiment of the present application further provides a computer-readable storage medium storing a program executable by the processor 201, where the program, when executed by the processor 201, implements the method described in the above method embodiment.
Similarly, the content of the above method embodiment is applicable to this computer-readable storage medium embodiment; the functions implemented by the storage medium embodiment and the beneficial effects achieved are the same as those of the above method embodiment.
In some alternative embodiments, the functions/acts noted in the block diagrams may occur out of the order noted in the operational illustrations. For example, two blocks shown in succession may in fact be executed substantially concurrently or the blocks may sometimes be executed in the reverse order, depending upon the functionality/acts involved. Furthermore, the embodiments presented and described in the flowcharts of the present application are provided by way of example in order to provide a more thorough understanding of the technology. The disclosed methods are not limited to the operations and logic flows presented herein. Alternative embodiments are contemplated in which the order of various operations is changed, and in which sub-operations described as part of a larger operation are performed independently.
Furthermore, while the application is described in the context of functional modules, it should be appreciated that, unless otherwise indicated, one or more of the functions and/or features may be integrated in a single physical device and/or software module or may be implemented in separate physical devices or software modules. It will also be appreciated that a detailed discussion of the actual implementation of each module is not necessary to an understanding of the present application. Rather, the actual implementation of the various functional modules in the apparatus disclosed herein will be apparent to those skilled in the art from consideration of their attributes, functions and internal relationships. Accordingly, one of ordinary skill in the art can implement the application as set forth in the claims without undue experimentation. It is also to be understood that the specific concepts disclosed are merely illustrative and are not intended to be limiting upon the scope of the application, which is to be defined in the appended claims and their full scope of equivalents.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on this understanding, the technical solution of the present application, in essence, or the part thereof contributing to the prior art, or a part of the technical solution, may be embodied in the form of a software product stored in a storage medium, including several instructions for causing a computer device (which may be a personal computer, a server, a network device, etc.) to perform all or part of the steps of the method of the embodiments of the present application. The storage medium includes a USB flash drive, a removable hard disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, an optical disc, or other various media capable of storing program code.
Logic and/or steps represented in the flowcharts or otherwise described herein, e.g., an ordered listing of executable instructions for implementing logical functions, can be embodied in any computer-readable medium for use by or in connection with an instruction execution system, apparatus, or device, such as a computer-based system, processor-containing system, or other system that can fetch the instructions from the instruction execution system, apparatus, or device and execute the instructions. For the purposes of this description, a "computer-readable medium" can be any means that can contain, store, communicate, propagate, or transport the program for use by or in connection with the instruction execution system, apparatus, or device.
More specific examples (a non-exhaustive list) of the computer-readable medium would include an electrical connection (an electronic device) having one or more wires, a portable computer diskette (a magnetic device), a Random Access Memory (RAM), a Read-Only Memory (ROM), an Erasable Programmable Read-Only Memory (EPROM or flash memory), an optical fiber device, and a portable Compact Disc Read-Only Memory (CD-ROM). Additionally, the computer-readable medium may even be paper or another suitable medium upon which the program is printed, as the program may be electronically captured, via, for instance, optical scanning of the paper or other medium, then compiled, interpreted, or otherwise processed in a suitable manner, if necessary, and then stored in a computer memory.
It is to be understood that portions of the present application may be implemented in hardware, software, firmware, or a combination thereof. In the above-described embodiments, the various steps or methods may be implemented in software or firmware stored in a memory and executed by a suitable instruction execution system. For example, if implemented in hardware, as in another embodiment, they may be implemented using any one or a combination of the following techniques known in the art: discrete logic circuits with logic gates for implementing logic functions on data signals, application-specific integrated circuits with appropriate combinational logic gates, Programmable Gate Arrays (PGAs), Field-Programmable Gate Arrays (FPGAs), and the like.
In the foregoing description of the present specification, reference to the terms "one embodiment/example", "another embodiment/example", "certain embodiments/examples", and the like means that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the application. In this specification, schematic representations of the above terms do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
Although embodiments of the present application have been shown and described, it will be understood by those skilled in the art that various changes, modifications, substitutions and alterations can be made therein without departing from the spirit and scope of the application as defined by the appended claims and their equivalents.
While the preferred embodiments of the present application have been described in detail, the present application is not limited to these embodiments, and those skilled in the art can make various equivalent modifications or substitutions without departing from the spirit of the present application; such equivalent modifications or substitutions are intended to be included within the scope of the present application as defined in the appended claims.