
CN114124769B - Base station testing method and device, electronic equipment and storage medium


Info

Publication number: CN114124769B (application published as CN114124769A)
Application number: CN202010797450.3A
Authority: CN (China)
Prior art keywords: test; task; test task; testing; base station
Legal status: Active (granted)
Inventor: 王岩 (Wang Yan)
Assignee (original and current): Datang Mobile Communications Equipment Co Ltd
Other languages: Chinese (zh)
Priority and filing date: 2020-08-10
Publication dates: CN114124769A, 2022-03-01; CN114124769B (grant), 2023-08-11

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04L: TRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L43/00: Arrangements for monitoring or testing data switching networks
    • H04L43/50: Testing arrangements

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Mobile Radio Communication Systems (AREA)

Abstract

The invention provides a base station testing method, a base station testing device, an electronic device and a storage medium, and relates to the technical field of communications. The method comprises the following steps: determining a test task; if the test task is a newly added test task, acquiring description information of the test task, wherein the description information comprises an importance level and a test specification of the test task; in the case that the importance level meets a preset condition, analyzing a test case of the test task according to the test specification, wherein the test case comprises a test step of the test task and the test step comprises an automation identifier; if the value of the automation identifier is a target value, acquiring a keyword corresponding to the test step and configuration parameters of the keyword; and calling the corresponding keyword function to perform the test according to the configuration parameters of the keyword corresponding to the test step. The invention meets the function expansion requirement of the base station, improves test efficiency and standardizes the test flow.

Description

Base station testing method and device, electronic equipment and storage medium
Technical Field
The present invention relates to the field of communications technologies, and in particular to a base station testing method and device, an electronic device, and a storage medium.
Background
In a communication network, the base station is the bridge between user terminals and the core network, and factors such as the working performance of the base station determine the communication quality of the whole network. Testing the base station to keep its working performance stable is therefore very important for guaranteeing communication quality.
Existing operation and maintenance (OM) tests are performed entirely by base station maintenance personnel, who must manually test all base stations in the communication network through the OM center.
However, as base station versions are continuously upgraded and base station functions keep expanding, OM testing of the base station becomes increasingly complicated. Manual testing can hardly meet the function expansion requirement of the base station, its efficiency is low, and the test process is not standardized.
Disclosure of Invention
The invention provides a base station testing method, a base station testing device, an electronic device and a storage medium, which are used to solve the problems in the prior art that manual testing can hardly meet the function expansion requirement of the base station, that test efficiency is low and that testing is not standardized.
According to a first aspect of the present invention, there is provided a base station testing method, the method comprising:
determining a test task;
if the test task is a newly added test task, acquiring description information of the test task, wherein the description information comprises an importance level and a test specification of the test task;
under the condition that the importance level meets the preset condition, analyzing a test case of the test task according to the test specification, wherein the test case comprises a test step of the test task, and the test step comprises an automatic identification;
if the value of the automatic identification is the target value, acquiring a keyword corresponding to the testing step and configuration parameters of the keyword;
and calling a corresponding keyword function to test according to the configuration parameters of the keywords corresponding to the testing step.
According to a second aspect of the present invention, there is provided a base station testing apparatus, the apparatus comprising:
the determining module is used for determining a test task;
the first acquisition module is used for acquiring description information of the test task if the test task is a newly added test task, wherein the description information comprises an importance level and a test specification of the test task;
the analysis module is used for analyzing the test case of the test task according to the test specification under the condition that the importance level meets the preset condition, wherein the test case comprises the test step of the test task, and the test step comprises an automatic identifier;
the second acquisition module is used for acquiring the keywords corresponding to the testing step and the configuration parameters of the keywords if the value of the automatic identifier is the target value;
and the first test module is used for calling the corresponding keyword function to test according to the configuration parameters of the keywords corresponding to the test step.
According to a third aspect of the present invention, there is provided an electronic device comprising:
a processor, a memory and a computer program stored on the memory and executable on the processor, the processor implementing the aforementioned method when executing the program.
According to a fourth aspect of the invention, there is provided a readable storage medium, wherein instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the aforementioned method.
The invention provides a base station testing method, a base station testing device, an electronic device and a storage medium. During function development of the base station, test cases and keywords corresponding to newly added test tasks can be supplemented automatically, and testing is performed automatically by calling keyword functions, which meets the function expansion requirement of the base station, improves test efficiency and standardizes the test flow.
The foregoing is only an overview of the technical solution of the present invention. In order that the technical means of the invention can be understood more clearly and implemented according to the contents of the specification, and in order to make the above and other objects, features and advantages of the invention more apparent, specific embodiments of the invention are set forth below.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and a person skilled in the art can obtain other drawings from them without inventive effort.
Fig. 1 is a flowchart illustrating specific steps of a base station testing method according to a first embodiment of the present invention;
fig. 2 is a flowchart of specific steps of a base station testing method according to a second embodiment of the present invention;
fig. 3 is a modularized structure diagram of a Robot Framework provided in the second embodiment of the present invention;
FIG. 4 is an organizational hierarchy diagram of a Robot Framework test library according to a second embodiment of the present invention;
fig. 5 is a block diagram of a base station testing apparatus according to a third embodiment of the present invention;
fig. 6 is a block diagram of a base station testing apparatus according to a fourth embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. Obviously, the described embodiments are only some, rather than all, of the embodiments of the present invention. All other embodiments obtained by those skilled in the art based on the embodiments of the present invention without inventive effort fall within the protection scope of the present invention.
Example 1
Referring to fig. 1, a flowchart illustrating specific steps of a base station testing method according to a first embodiment of the present invention is shown.
Step 101, determining a test task.
In the embodiment of the invention, the test task indicates the function of the base station to be tested. The functions of the base station include base station connection, management information base (MIB) loading, Simple Network Management Protocol (SNMP) addition and deletion checking, event verification, alarm verification, log uploading, base station function calls, Outside Plant (OSP) connection and other functions, as well as functions newly added during function development of the base station.
Optionally, the tester is prompted to input a test instruction, and a test task is determined according to the test instruction.
Step 102, if the test task is a new test task, acquiring description information of the test task, wherein the description information comprises an importance level and a test specification of the test task.
And if the test task is a newly added test task, indicating that the function to be tested by the test task is a newly added function.
Optionally, the test task includes a task identifier, and if the task identifier is a preset identifier, the test task is determined to be a newly added test task. Specifically, when a tester inputs a test instruction, the test instruction may carry the task identifier of the test task, so that the computer can judge whether the test task is newly added according to the task identifier. For example, two task identifiers "N" and "O" are defined for test tasks, and a test task whose identifier is "N" is determined to be a newly added test task. In a specific application, a person skilled in the art can set the task identifier and the preset identifier according to actual requirements, and the invention is not specifically limited in this respect.
Optionally, the test task includes a task number, and a test case corresponding to the test task is searched for in a test library according to the task number, wherein the test library stores test cases corresponding to test tasks and each test case has a correspondence with a task number. If at least one test case corresponding to the task number exists in the test library, the test task is a non-newly added task; if the test library contains no test case corresponding to the task number, the test task is a newly added task.
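Purely as an illustrative Python sketch of the two judgment methods above (the names TestTask, TestLibrary and the identifier value "N" are assumptions for illustration, not part of the claimed method), the decision could be expressed as follows:

    from dataclasses import dataclass, field

    NEW_TASK_IDENTIFIER = "N"   # assumed preset identifier marking a newly added task

    @dataclass
    class TestTask:
        task_identifier: str    # e.g. "N" for new, "O" for old
        task_number: str        # number used to look up test cases in the test library

    @dataclass
    class TestLibrary:
        cases_by_number: dict = field(default_factory=dict)  # task number -> list of test cases

        def find_cases(self, task_number):
            return self.cases_by_number.get(task_number, [])

    def is_newly_added(task, library):
        # Method 1: the task identifier equals the preset identifier.
        if task.task_identifier == NEW_TASK_IDENTIFIER:
            return True
        # Method 2: the test library contains no test case for this task number.
        return len(library.find_cases(task.task_number)) == 0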
In the embodiment of the invention, if the test task is a newly added test task, the tester is prompted to submit description information of the test task, where the description information includes the importance level and the test specification of the test task. The importance level of the test task is determined according to the function to be tested; if the function to be tested is a mainstream function of the base station, the importance level of the test task is high. In practical applications, a tester can determine the importance level of a newly added function according to the service requirement corresponding to that function, and thereby determine the importance level of the test task. The test specification of a test task indicates the specific implementation of the test task.
Step 103, analyzing the test cases of the test tasks according to the test specifications under the condition that the importance level meets the preset condition, wherein the test cases comprise test steps of the test tasks, and the test steps comprise automatic identifications.
In the embodiment of the invention, the importance level meeting the preset condition specifically includes the following. In one scheme, the importance level of the test task is divided into a plurality of levels, each level corresponds to a numerical value, and a larger value means a higher importance level; if the value corresponding to the importance level is larger than a preset threshold, the importance level meets the preset condition. In another scheme, an importance level label is set for the test task and the importance level is distinguished by the color of the label; if the color of the importance level label is a preset color, the importance level meets the preset condition. For example, with red set as the preset color, a test task whose importance level label is green is indicated as not important, and its importance level does not meet the preset condition.
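As a minimal Python sketch of the two schemes for the preset condition (the threshold value and the colors are assumptions for illustration only):

    IMPORTANCE_THRESHOLD = 3    # assumed preset threshold for the numeric scheme
    PRESET_COLOR = "red"        # assumed preset color for the label scheme

    def meets_preset_condition(importance_level):
        # Scheme 1: numeric level, a larger value means a higher importance level.
        if isinstance(importance_level, int):
            return importance_level > IMPORTANCE_THRESHOLD
        # Scheme 2: colored label; only the preset color satisfies the condition,
        # so a green label, for example, does not.
        return importance_level == PRESET_COLOR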
In the embodiment of the invention, a test case is a concrete embodiment of the test task, and the test case includes the test steps, input parameters and expected results involved in executing the test task. One test task corresponds to at least one test case, one test case includes at least one test step, at least one input parameter and at least one expected result, and the test task can be executed according to its test cases.
Step 104, if the value of the automatic identifier is the target value, acquiring the keywords corresponding to the testing step and the configuration parameters of the keywords.
The automation identifier is used to indicate whether the test step can be executed by a machine. If the value of the automation identifier is the target value, the corresponding test step can be executed by a machine; the keyword corresponding to the test step and the configuration parameters of the keyword are then acquired, and the computer can call the corresponding keyword function according to the configuration parameters of the keyword to perform the test automatically. Specifically, a preset parameter may be selected as the automation identifier, the preset parameter having at least two values, one of which serves as the target value; if the value of the preset parameter is the target value, the test step corresponding to the preset parameter can be executed by a machine. For example, the automation identifier is set as a parameter named Automate with two values, YES and NO, and YES is taken as the target value; if the value of Automate is YES, the corresponding test step can be executed by a machine. Alternatively, a color label is used as the automation identifier and one color is selected as the target value; for example, with green set as the target value, any test step whose automation label is green can be executed by a machine. The automation identifier and its target value may be set by those skilled in the art according to the actual application, and the invention is not specifically limited in this respect.
In the embodiment of the invention, one test step corresponds to one keyword, and the keyword is usually a descriptive phrase for the test step. Keywords include keywords in the test library and user-defined keywords. A keyword may have zero or more configuration parameters; usually a keyword has at least one configuration parameter, and the configuration parameters include the input parameters and the expected output parameters of the keyword.
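A hedged sketch of how a parsed test step might carry the automation identifier, the keyword and the configuration parameters described in steps 103 and 104 (the field names and the target value "YES" are illustrative assumptions):

    from dataclasses import dataclass, field
    from typing import List

    TARGET_VALUE = "YES"   # assumed target value of the automation identifier

    @dataclass
    class TestStep:
        description: str
        automate: str                                            # automation identifier, e.g. "YES" or "NO"
        keyword: str = ""                                        # descriptive keyword of the step
        config_params: List[str] = field(default_factory=list)   # inputs and expected outputs

    def collect_executable_steps(steps):
        """Return (keyword, configuration parameters) for every machine-executable step."""
        executable = []
        for step in steps:
            if step.automate == TARGET_VALUE:   # value of the automation identifier is the target value
                executable.append((step.keyword, step.config_params))
        return executable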
Optionally, if the testing step includes at least two sub-steps, the step of obtaining the keywords corresponding to the testing step includes:
obtaining a basic keyword corresponding to the substep;
combining the basic keywords of at least two sub-steps into keywords corresponding to the testing steps.
In the embodiment of the invention, one test step can correspond to only one keyword. Therefore, if a test step includes at least two sub-steps, the keyword of the test step is not the basic keyword of any single sub-step; instead, the basic keywords of the at least two sub-steps are combined into a new keyword, and the new keyword is used as the keyword of the test step. For example, test step A includes sub-step B and sub-step C, where the basic keyword of sub-step B is "receive registration message" and the basic keyword of sub-step C is "UE authentication"; the keyword of test step A may then be "user registration". In general, a keyword is a descriptive phrase for a test step and indicates the test function of that step, so when a test step includes multiple sub-steps, its keyword is usually a new description that summarizes the test functions of the sub-steps rather than a simple concatenation of their basic keywords. The basic keywords of the sub-steps can be keywords in the test library or user-defined keywords, and the keyword corresponding to a test step that includes multiple sub-steps is usually user-defined.
Step 105, calling a corresponding keyword function to test according to the configuration parameters of the keywords corresponding to the testing step.
In the embodiment of the invention, the Robot Framework is adopted to automate the test task. In the actual test process, it is therefore only necessary to define the configuration parameters of the keywords; the test task can then be tested by calling the keyword functions through the test template.
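Purely for illustration (this is not the library of the patent), a Robot Framework keyword library can be an ordinary Python class whose methods become keywords; the composite keyword "user registration" from the example above can then reuse the basic keywords of its sub-steps, and a dispatcher can call the keyword function chosen for a test step with that keyword's configuration parameters:

    class BaseStationKeywords:
        """Hypothetical keyword library; method names map to keywords."""

        def receive_registration_message(self, ue_id):
            # basic keyword of sub-step B
            print(f"received registration message from {ue_id}")

        def ue_authentication(self, ue_id, auth_key):
            # basic keyword of sub-step C
            print(f"authenticating {ue_id} with {auth_key}")

        def user_registration(self, ue_id, auth_key):
            # composite keyword of test step A, combining the two basic keywords
            self.receive_registration_message(ue_id)
            self.ue_authentication(ue_id, auth_key)

    def run_step(library, keyword, config_params):
        # call the keyword function matching the keyword of the test step,
        # passing the configuration parameters defined for that keyword
        func = getattr(library, keyword.lower().replace(" ", "_"))
        return func(*config_params)

    # usage: run_step(BaseStationKeywords(), "User Registration", ["ue-001", "key-123"])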
The invention provides a base station test method, which analyzes the test cases of a test task according to the description information of the test task, acquires keywords corresponding to each test step in the test cases, and calls a keyword function to automatically test the test task, so that the test cases and keywords corresponding to the newly added test task can be automatically supplemented in the function development process of the base station, and the automatic test is performed by calling the keyword function, thereby meeting the function expansion requirement of the base station, improving the test efficiency and standardizing the test flow.
Example 2
Referring to fig. 2, a flowchart of specific steps of a base station testing method according to a second embodiment of the present invention is shown.
In step 201, a test task is determined.
In the embodiment of the invention, the test task indicates the function of the base station to be tested. The functions of the base station include base station connection, management information base (MIB) loading, Simple Network Management Protocol (SNMP) addition and deletion checking, event verification, alarm verification, log uploading, base station function calls, Outside Plant (OSP) connection and other functions, as well as functions newly added during function development of the base station.
Optionally, the tester is prompted to input a test instruction, and a test task is determined according to the test instruction.
The embodiment of the invention builds the base station test system on the Robot Framework architecture, thereby realizing automated testing of the base station system. Referring to fig. 3, a modularized structure diagram of the Robot Framework in the embodiment of the present invention is shown. Test Data stores the test data, Robot Framework is the core framework, Test Libraries are the test libraries, Test Tools include various test tools, and System Under Test is the system being tested; in the embodiment of the invention the system under test is the base station system. The Robot Framework interacts with the system under test through system interfaces, the test libraries interact with the core framework through the test library application program interface (test library API), and the core framework calls the test data to run tests based on the test data syntax. When the Robot Framework is started, the test data is loaded, the test cases are executed, and logs and reports are generated. The core framework knows no details about the base station system: interaction is performed by the test library, and the core framework obtains information related to the base station system, such as the test tasks, by reading the test library.
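For reference, Robot Framework exposes a public Python entry point, robot.run, which parses the test data, executes the test cases and produces the log and report mentioned above; the test data path and the options below are only an example configuration, not that of the patent:

    from robot import run

    # execute all test cases under the "om_tests" test data directory;
    # the log and report are written to the given output directory
    run(
        "om_tests",
        outputdir="results",
        loglevel="INFO",
    )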
Step 202, if the test task is a new test task, acquiring description information of the test task, wherein the description information comprises an importance level and a test specification of the test task.
And if the test task is a newly added test task, indicating that the function to be tested by the test task is a newly added function.
Optionally, the test task includes a task identifier, and if the task identifier is a preset identifier, the test task is determined to be a newly added test task. Specifically, when a tester inputs a test instruction, the test instruction may carry the task identifier of the test task, so that the computer can judge whether the test task is newly added according to the task identifier. For example, two task identifiers "N" and "O" are defined for test tasks, and a test task whose identifier is "N" is determined to be a newly added test task. In a specific application, a person skilled in the art can set the task identifier and the preset identifier according to actual requirements, and the invention is not specifically limited in this respect. Optionally, the test task includes a task number, and a test case corresponding to the test task is searched for in a test library according to the task number, wherein the test library stores test cases corresponding to test tasks and each test case has a correspondence with a task number. If at least one test case corresponding to the task number exists in the test library, the test task is a non-newly added task; if the test library contains no test case corresponding to the task number, the test task is a newly added task.
Referring to fig. 4, an organizational hierarchy chart of the Robot Framework test library in the embodiment of the present invention is shown. For the base station system, each sub-module constructs test cases in a test case file (Test Case File) based on the module's functions, and each sub-module corresponds to at least one test case. The test cases of one sub-module form the test suite of that sub-module (Module Test Suites), and the test suites of the sub-modules together form the test suite of the higher-level module (Om Test Suites). The module resource file (Om Resources) contains the keywords of the module and the configuration parameters of the keywords, and is used for calling each test case in the module's test suite. The common resource file (Common Resources) contains higher-level user-defined keywords, built on the configuration parameters of the keywords, which the test cases of different base station subsystems can call. The implementation of the keywords in the common resource file depends on lower-layer standard and extended libraries, such as the Python library, the SNMP library and the Test Case Lib library, where the Python library contains Python files storing the specific test methods, the SNMP library is related to signaling interaction, and the Test Case Lib library contains OSP-related operations.
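A small Python sketch of this layering, in which a higher-level user-defined keyword from the common resources delegates to lower-layer helpers (every name below is a hypothetical placeholder standing in for the Python, SNMP and Test Case Lib layers):

    def snmp_get(oid):
        """Placeholder for an SNMP read used in signaling interaction."""
        return f"value-of-{oid}"

    def osp_connect(address):
        """Placeholder for an OSP-related operation from the Test Case Lib layer."""
        return True

    def check_board_state(address, oid):
        """Higher-level user-defined keyword built on the lower-layer helpers."""
        if not osp_connect(address):
            return False
        return snmp_get(oid) != ""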
In the embodiment of the invention, the test task corresponds to a function of the base station. Because the test cases are constructed according to the functions of the modules in the base station system, there is a correspondence between test tasks and test cases, so whether a test case corresponding to the test task exists in the test library can be queried according to the task number of the test task, and it can thereby be judged whether the test task is a newly added test task.
The initial test library includes the admission test cases of the base station system, which correspond to the common functions of each module in the base station system, such as activating and deactivating a cell in the cell module, loading a processor and a board in the equipment module, and upgrading a package in the software module. A newly added test task corresponds to a function newly added during development of the base station system, or to backtracking of a newly added BUG.
Optionally, newly added functions and/or newly added BUGs of the base station system are counted according to a preset period; newly added test tasks are determined according to the newly added functions and/or newly added BUGs; and the description information of those test tasks is acquired. By counting the newly added functions and newly added BUGs of the base station system per preset period, the embodiment of the invention determines the newly added test tasks, realizes automatic updating of the test tasks, and can thus meet the function expansion requirement of the base station.
In the embodiment of the invention, if the test task is a newly added test task, the tester is prompted to submit description information of the test task, where the description information includes the importance level and the test specification of the test task. The importance level of the test task is determined according to the function to be tested; if the function to be tested is a mainstream function of the base station, the importance level of the test task is high. In practical applications, a tester can determine the importance level of a newly added function according to the service requirement corresponding to that function, and thereby determine the importance level of the test task. The test specification of a test task indicates the specific implementation of the test task.
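A hedged Python sketch of this periodic statistics step (the numbering scheme and the field names are assumptions for illustration):

    def collect_new_test_tasks(new_functions, new_bugs, start_number=1):
        """Turn the new functions and new BUGs counted in one period into test tasks."""
        tasks = []
        for offset, item in enumerate(list(new_functions) + list(new_bugs)):
            tasks.append({
                "task_number": f"T{start_number + offset:04d}",  # assumed numbering scheme
                "task_identifier": "N",                          # marked as a newly added task
                "target": item,                                  # the new function or BUG to cover
            })
        return tasks

    # the returned tasks are then shown to the tester, who supplies the importance
    # level and the test specification as the description information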
Step 203, analyzing the test cases of the test tasks according to the test specifications under the condition that the importance level meets the preset condition, wherein the test cases comprise test steps of the test tasks, and the test steps comprise automatic identifications.
In the embodiment of the invention, the importance level meeting the preset condition specifically includes the following. In one scheme, the importance level of the test task is divided into a plurality of levels, each level corresponds to a numerical value, and a larger value means a higher importance level; if the value corresponding to the importance level is larger than a preset threshold, the importance level meets the preset condition. In another scheme, an importance level label is set for the test task and the importance level is distinguished by the color of the label; if the color of the importance level label is a preset color, the importance level meets the preset condition. For example, with red set as the preset color, a test task whose importance level label is green is indicated as not important, and its importance level does not meet the preset condition.
In the embodiment of the invention, a test case is a concrete embodiment of the test task, and the test case includes the test steps, input parameters and expected results involved in executing the test task. One test task corresponds to at least one test case, one test case includes at least one test step, at least one input parameter and at least one expected result, and the test task can be executed according to its test cases.
Step 204, if the value of the automation identification is the target value, generating a test step configuration page.
The automation identifier is used to indicate whether the test step can be executed by a machine. If the value of the automation identifier is the target value, the corresponding test step can be executed by a machine; the test step configuration page is then generated so that the user can input, on that page, the keyword corresponding to the test step and the configuration parameters of the keyword, and the computer calls the corresponding keyword function according to the configuration parameters of the keyword to realize automated testing. Specifically, a preset parameter may be selected as the automation identifier, the preset parameter having at least two values, one of which serves as the target value; if the value of the preset parameter is the target value, the test step corresponding to the preset parameter can be executed by a machine. For example, the automation identifier is set as a parameter named Automate with two values, YES and NO, and YES is taken as the target value; if the value of Automate is YES, the corresponding test step can be executed by a machine. Alternatively, a color label is used as the automation identifier and one color is selected as the target value; for example, with green set as the target value, any test step whose automation label is green can be executed by a machine. The automation identifier and its target value may be set by those skilled in the art according to the actual application, and the invention is not specifically limited in this respect.
Step 205, obtaining keywords corresponding to the testing steps and configuration parameters of the keywords, which are input by a tester on the testing step configuration page.
Step 206, calling a corresponding keyword function to test according to the configuration parameters of the keywords corresponding to the testing step.
If the value of the automation identifier is the target value, the test step corresponding to the automation identifier can be executed by a machine. The Robot Framework automated testing framework includes test templates, and a keyword-driven test case can be converted into a data-driven test case through a test template.
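Conceptually, a test template applies one keyword to many rows of configuration parameters; a minimal Python sketch of that data-driven pattern (the keyword function and the parameter rows are illustrative only):

    def run_with_template(keyword_func, parameter_rows):
        """Apply the same keyword function to every row of configuration parameters."""
        return [keyword_func(*row) for row in parameter_rows]

    def load_mib(file_name, expected_status):
        # illustrative keyword function: load a MIB file and compare with the expectation
        status = "ok" if file_name.endswith(".mib") else "error"
        return status == expected_status

    # data-driven execution of the "load MIB" keyword, one parameter row per case
    run_with_template(load_mib, [("base.mib", "ok"), ("broken.txt", "error")])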
Optionally, if the test task is a non-newly added test task, acquiring identification information of the non-newly added test task; searching the test cases of the non-newly added test tasks in a preset file according to the identification information; and executing the test cases of the non-newly added test tasks for testing.
As can be seen from the foregoing step 202, in the embodiment of the present invention, the test cases of each module of the base station system are stored in the test library, and if the test task is a non-newly added task, the test cases corresponding to the test task can be directly obtained from the test library, and then the obtained test cases are executed to perform the test.
Whether the test task is a non-newly added test task or not is determined by the method of determining whether the test task is a newly added test task in the step 202, which is not described herein.
Step 207, if the test task corresponds to at least two test cases and at least one test case fails, continuing to execute the next test case.
Each test case contains one or more test steps, and the test case passes only when all of its test steps pass. Each test step has a corresponding preset input and expected output; if the actual test result obtained by executing a test step is inconsistent with the expected output, the test step fails.
In general, if a test case fails, the whole test task stops and test efficiency is reduced. Therefore, in the embodiment of the invention, if a certain test case fails, the test task continues and the next test case is tested until all test cases have been executed; the test results of the test cases corresponding to the test task are then exported in batches, which improves test efficiency.
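A sketch of this continue-on-failure behaviour in Python (the test case structure and the export format are assumptions for illustration):

    import json

    def run_all_cases(test_cases, report_path="results.json"):
        """Execute every test case even if some fail, then export the results in one batch."""
        results = []
        for case in test_cases:
            try:
                case["run"]()                      # execute the test steps of the case
                results.append({"case": case["name"], "status": "pass"})
            except AssertionError as exc:          # actual result differs from the expected output
                results.append({"case": case["name"], "status": "fail", "reason": str(exc)})
        with open(report_path, "w", encoding="utf-8") as fh:
            json.dump(results, fh, ensure_ascii=False, indent=2)   # batch export of all results
        return results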
The invention provides a base station test method, which analyzes the test cases of a test task according to the description information of the test task, acquires keywords corresponding to each test step in the test cases, and calls a keyword function to automatically test the test task, so that the test cases and keywords corresponding to the newly added test task can be automatically supplemented in the function development process of the base station, and the automatic test is performed by calling the keyword function, thereby meeting the function expansion requirement of the base station, improving the test efficiency and standardizing the test flow. In addition, in the invention, if the execution of a certain test case fails, the next test case is continuously executed until all the test cases corresponding to the test task are executed, and then the test result is output, thereby improving the test efficiency.
Example 3
Referring to fig. 5, a block diagram of a base station testing apparatus according to a third embodiment of the present invention is shown, which is specifically as follows:
a determining module 301, configured to determine a test task;
the first obtaining module 302 is configured to obtain description information of the test task if the test task is a newly added test task, where the description information includes an importance level and a test specification of the test task;
the analyzing module 303 is configured to analyze, according to the test specification, a test case of the test task, where the test case includes a test step of the test task, and the test step includes an automation identifier, where the importance level meets a preset condition;
a second obtaining module 304, configured to obtain a keyword corresponding to the testing step and a configuration parameter of the keyword if the value of the automation identifier is a target value;
and the first test module 305 is configured to call a corresponding keyword function to perform a test according to the configuration parameter of the keyword corresponding to the test step.
The invention provides a base station testing device, which analyzes the test cases of a test task according to the description information of the test task, acquires keywords corresponding to each test step in the test cases, and calls a keyword function to automatically test the test task, so that the test cases and keywords corresponding to the newly added test task can be automatically supplemented in the function development process of the base station, and the automatic test is performed by calling the keyword function, thereby meeting the function expansion requirement of the base station, improving the testing efficiency and standardizing the testing flow.
In the third embodiment, the device embodiment corresponding to the first method embodiment may refer to the detailed description of the first embodiment, and will not be described herein.
Example 4
Referring to fig. 6, a block diagram of a base station testing apparatus according to a fourth embodiment of the present invention is shown, specifically as follows:
a determining module 401, configured to determine a test task.
The first obtaining module 402 is configured to obtain description information of the test task if the test task is a new test task, where the description information includes an importance level and a test specification of the test task.
Optionally, the first obtaining module 402 includes:
the statistics sub-module is used for counting the newly added functions and/or the newly added BUG of the base station system according to a preset period;
the task determination submodule is used for determining a new test task according to the new function and/or the new BUG;
and the descriptive information acquisition sub-module is used for acquiring descriptive information of the test task.
The analyzing module 403 is configured to analyze, according to the test specification, a test case of the test task, where the test case includes a test step of the test task, and the test step includes an automation identifier, where the importance level meets a preset condition.
And a second obtaining module 404, configured to obtain the keyword corresponding to the testing step and the configuration parameter of the keyword if the value of the automation identifier is the target value.
The second obtaining module 404 includes:
a page generation submodule 4041 for generating a test step configuration page;
the first obtaining submodule 4042 is configured to obtain a keyword corresponding to the test step and a configuration parameter of the keyword, which are input by a tester on the test step configuration page.
Optionally, if the testing step includes at least two sub-steps, the second obtaining module 404 includes:
the second acquisition sub-module is used for acquiring the basic keywords corresponding to the sub-steps and the configuration parameters of the basic keywords;
and the combination sub-module is used for combining the basic keywords of at least two sub-steps into keywords corresponding to the testing steps.
And the testing module 405 is configured to call the corresponding keyword function to perform testing according to the configuration parameter of the keyword corresponding to the testing step.
Optionally, the apparatus further comprises:
the third acquisition module is used for acquiring the identification information of the non-newly added test task if the test task is the non-newly added test task;
the searching module is used for searching the test cases of the non-newly added test tasks in a preset file according to the identification information;
and the non-newly-added test task test module is used for executing the test cases of the non-newly-added test tasks to test.
And the test control module 406 is configured to continue executing the next test case if the test task corresponds to at least two test cases and there is at least one test case test failure.
The invention provides a base station testing device, which analyzes the test cases of a test task according to the description information of the test task, acquires keywords corresponding to each test step in the test cases, and calls a keyword function to automatically test the test task, so that the test cases and keywords corresponding to the newly added test task can be automatically supplemented in the function development process of the base station, and the automatic test is performed by calling the keyword function, thereby meeting the function expansion requirement of the base station, improving the testing efficiency and standardizing the testing flow. In addition, in the invention, if the execution of a certain test case fails, the next test case is continuously executed until all the test cases corresponding to the test task are executed, and then the test result is output, thereby improving the test efficiency.
In the fourth embodiment, the device embodiment corresponding to the second method embodiment may refer to the detailed description of the second embodiment, and will not be described herein.
The embodiment of the invention also provides electronic equipment, which comprises: a processor, a memory and a computer program stored on the memory and executable on the processor, the processor implementing the aforementioned method when executing the program.
The embodiment of the invention also provides a readable storage medium, which when the instructions in the storage medium are executed by a processor of an electronic device, enables the electronic device to execute the method.
For the device embodiments, since they are substantially similar to the method embodiments, the description is relatively simple, and reference is made to the description of the method embodiments for relevant points.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising one … …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The embodiments of the present invention have been described above with reference to the accompanying drawings, but the present invention is not limited to the above-described embodiments, which are merely illustrative rather than restrictive. Under the teaching of the present invention, those of ordinary skill in the art can make many other forms without departing from the spirit of the present invention and the scope of the claims, and these all fall within the protection of the present invention.

Claims (14)

1. A method for testing a base station, the method comprising:
determining a test task;
if the test task is a newly added test task, acquiring description information of the test task, wherein the description information comprises an importance level and a test specification of the test task, the importance level of the test task is determined according to a function to be tested, and if the function to be tested is a main stream function of a base station, the importance level of the test task is high, and the test specification of the test task is used for indicating a specific implementation mode of the test task;
under the condition that the importance level meets the preset condition, analyzing a test case of the test task according to the test specification, wherein the test case comprises a test step of the test task, and the test step comprises an automatic identification;
if the value of the automatic identification is the target value, acquiring a keyword corresponding to the testing step and configuration parameters of the keyword;
and calling a corresponding keyword function to test according to the configuration parameters of the keywords corresponding to the testing step.
2. The method of claim 1, wherein the step of obtaining the keyword corresponding to the testing step and the configuration parameter of the keyword includes:
generating a test step configuration page;
and acquiring keywords corresponding to the testing steps and configuration parameters of the keywords, which are input by a tester on the testing step configuration page.
3. The method according to claim 1, wherein if the testing step includes at least two sub-steps, the step of obtaining the keywords corresponding to the testing step includes:
obtaining a basic keyword corresponding to the substep;
combining the basic keywords of at least two sub-steps into keywords corresponding to the testing steps.
4. The method according to claim 1, wherein the step of acquiring the description information of the test task if the test task is a newly added test task includes:
counting newly-added functions and/or newly-added BUGs of the base station system according to a preset period;
determining a new test task according to the new function and/or the new BUG;
and acquiring the description information of the test task.
5. The method according to claim 1, wherein the method further comprises:
if the test task is a non-newly added test task, acquiring identification information of the non-newly added test task;
searching the test cases of the non-newly added test tasks in a preset file according to the identification information;
and executing the test cases of the non-newly added test tasks for testing.
6. The method of any one of claims 1 to 5, wherein if the test task corresponds to at least two test cases, the method further comprises:
if at least one test case fails, continuing to execute the next test case.
7. A base station testing apparatus, the apparatus comprising:
the determining module is used for determining a test task;
the first acquisition module is used for acquiring description information of the test task if the test task is a newly added test task, wherein the description information comprises an importance level and a test specification of the test task, the importance level of the test task is determined according to a function to be tested, and the importance level of the test task is high if the function to be tested is a main stream function of a base station, and the test specification of the test task is used for indicating a specific implementation mode of the test task;
the analysis module is used for analyzing the test case of the test task according to the test specification under the condition that the importance level meets the preset condition, wherein the test case comprises the test step of the test task, and the test step comprises an automatic identifier;
the second acquisition module is used for acquiring the keywords corresponding to the testing step and the configuration parameters of the keywords if the value of the automatic identifier is the target value;
and the testing module is used for calling the corresponding keyword function to test according to the configuration parameters of the keywords corresponding to the testing step.
8. The apparatus of claim 7, wherein the second acquisition module comprises:
the page generation sub-module is used for generating a test step configuration page;
the first acquisition sub-module is used for acquiring keywords corresponding to the testing steps and configuration parameters of the keywords, which are input by a tester on the testing step configuration page.
9. The apparatus of claim 7, wherein if the testing step comprises at least two sub-steps, the second obtaining module comprises:
the second acquisition sub-module is used for acquiring the basic keywords corresponding to the sub-steps;
and the combination sub-module is used for combining the basic keywords of at least two sub-steps into keywords corresponding to the testing steps.
10. The apparatus of claim 7, wherein the first acquisition module comprises:
the statistics sub-module is used for counting the newly added functions and/or the newly added BUG of the base station system according to a preset period;
the task determination submodule is used for determining a new test task according to the new function and/or the new BUG;
and the descriptive information acquisition sub-module is used for acquiring descriptive information of the test task.
11. The apparatus of claim 7, wherein the apparatus further comprises:
the third acquisition module is used for acquiring the identification information of the non-newly added test task if the test task is the non-newly added test task;
the searching module is used for searching the test cases of the non-newly added test tasks in a preset file according to the identification information;
and the non-newly-added test task test module is used for executing the test cases of the non-newly-added test tasks to test.
12. The apparatus according to any one of claims 7 to 11, further comprising, if the test task corresponds to at least two test cases:
and the test control module is used for continuously executing the next test case if at least one test case fails to be tested.
13. An electronic device, comprising:
a processor, a memory and a computer program stored on the memory and executable on the processor, characterized in that the processor implements the method according to any one of claims 1 to 6 when executing the program.
14. A readable storage medium, characterized in that instructions in the storage medium, when executed by a processor of an electronic device, enable the electronic device to perform the method of any one of claims 1 to 6.
CN202010797450.3A 2020-08-10 2020-08-10 Base station testing method and device, electronic equipment and storage medium Active CN114124769B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010797450.3A CN114124769B (en) 2020-08-10 2020-08-10 Base station testing method and device, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010797450.3A CN114124769B (en) 2020-08-10 2020-08-10 Base station testing method and device, electronic equipment and storage medium

Publications (2)

Publication Number Publication Date
CN114124769A 2022-03-01
CN114124769B 2023-08-11

Family

ID=80373506

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010797450.3A Active CN114124769B (en) 2020-08-10 2020-08-10 Base station testing method and device, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN114124769B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114625654B (en) * 2022-03-21 2025-09-05 北京有竹居网络技术有限公司 A testing method and related equipment
CN116204446B (en) * 2023-05-06 2023-08-18 云账户技术(天津)有限公司 Automatic test flow management method and device based on JIRA platform

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9753842B2 (en) * 2014-05-09 2017-09-05 Wipro Limited System and method for creating universal test script for testing variants of software application
CN106937303B (en) * 2015-12-30 2020-07-07 中国移动通信集团河南有限公司 A base station testing method and system, terminal and cloud server
CN110471839A (en) * 2019-07-11 2019-11-19 平安普惠企业管理有限公司 Fixed time test task control method, device, computer equipment and storage medium
CN110888818A (en) * 2019-12-22 2020-03-17 普信恒业科技发展(北京)有限公司 Test case configuration system and method, automatic test system and method

Also Published As

Publication number Publication date
CN114124769A (en) 2022-03-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant