CN114428725A - Method and device for automatically generating performance test script - Google Patents
- Publication number: CN114428725A (application CN202011186544.3A)
- Authority
- CN
- China
- Prior art keywords
- script
- performance test
- configuration file
- jdbc
- service
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- G—PHYSICS; G06—COMPUTING OR CALCULATING; COUNTING; G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F11/00—Error detection; Error correction; Monitoring
- G06F11/36—Prevention of errors by analysis, debugging or testing of software
- G06F11/3668—Testing of software
- G06F11/3672—Test management
- G06F11/3692—Test management for test results analysis
- G06F11/3688—Test management for test execution, e.g. scheduling of test suites
Abstract
The application relates to the technical field of performance testing, and provides a method and a device for automatically generating a performance test script. The performance test script is used for testing the performance of a system to be tested, and the method comprises the following steps: acquiring a tracking file, wherein the tracking file comprises a plurality of service requests sent by a preset application program to a database of the system to be tested; analyzing each service request into the contents of a plurality of preset fields to obtain an analysis result; acquiring a preset script configuration file, wherein the script configuration file comprises configuration information for each node of the framework structure that constitutes the performance test script; and generating a performance test script according to the script configuration file and the analysis result. Because the script file is generated automatically from the preset script configuration file and the traced actual service requests, labor cost is greatly reduced and script generation efficiency is improved.
Description
Technical Field
The invention relates to the technical field of performance testing, in particular to a method and a device for automatically generating a performance testing script.
Background
At present, performance testing of systems based on the C/S architecture is done by using a performance test tool to simulate the interaction between the foreground application program and the system's background database. The advantages of this approach are that it does not require many resources and that the concurrency level is controllable.
In a typical implementation, the background SQL requests corresponding to the business operations of the foreground application program are first captured with a tracking tool. The Jmeter tool is then opened, a test plan node, a thread group node and a controller are created manually, and simulated requests are added under the controller: the content of each tracked SQL request is pasted into a simulated request one by one, after which the database parameters corresponding to the SQL requests are configured manually. Creation of the Jmeter script is then complete.
Because SQL requests are numerous and the service types are varied, manual creation of the Jmeter script is time-consuming and inefficient.
Disclosure of Invention
The embodiments of the application aim to provide a method and a device for automatically generating a performance test script. Because the performance test script is created automatically, script generation efficiency is improved, and performance test efficiency is improved in turn.
In order to achieve the above purpose, the present application provides the following technical solutions:
in a first aspect, an embodiment of the present application provides an automatic generation method for a performance test script, where the performance test script is used to test performance of a system to be tested, and the method includes: acquiring a tracking file, wherein the tracking file comprises a plurality of service requests sent to a database of the system to be tested by a preset application program; analyzing each service request into the contents of a plurality of preset fields to obtain an analysis result; acquiring a preset script configuration file, wherein the script configuration file comprises configuration information of each node of a frame structure forming a performance test script; and generating a performance test script according to the script configuration file and the analysis result.
The performance test script automatic generation method provided by the embodiment of the application can automatically generate the performance test script according to the preset script configuration file and the tracked actual service request, greatly reduce the labor cost and improve the script generation efficiency.
In an optional embodiment, before obtaining the preset script configuration file, the method further includes: a framework structure defining a performance test script, the framework structure comprising: a performance test plan node and a thread group node under the performance test plan node; and acquiring configuration information of each node in the defined framework structure, and generating the script configuration file according to the configuration information.
The framework structure of the performance test script and the configuration information of each node in the framework structure are predefined, and a script configuration file is generated to be called when the performance test script is automatically created.
In an optional implementation manner, generating a performance test script according to the script configuration file and the analysis result includes: forming the framework structure of the performance test script according to the script configuration file; constructing, under the thread group node in the framework structure, a corresponding JDBC (Java Database Connectivity) request according to the analysis result of each service request to obtain a plurality of JDBC requests, wherein the contents of the plurality of preset fields in the analysis result each correspond to one parameter of the JDBC request; and generating a performance test script according to the configuration information of each node of the framework structure in the script configuration file and the constructed JDBC requests.
In the process of generating the performance test script, a JDBC request is constructed according to the analysis result of the actual service request and is used for simulating the interaction between an actual user and a background database during performance test.
In an optional implementation manner, after parsing each service request into contents of a plurality of preset fields to obtain a parsing result, the method further includes: carrying out data cleaning on the analysis result according to a preset cleaning rule to obtain a cleaned analysis result; the JDBC requests are constructed according to the cleaned parsing result.
The plurality of service requests in the tracking file may include useless service requests; by setting a cleaning rule, the analysis results of the useless service requests are removed from the obtained analysis results, so that only usable content is retained.
In an optional implementation manner, the analysis result includes a sending timestamp of the corresponding service request, and generating a performance test script according to the configuration information of each node of the framework structure in the script configuration file and the constructed multiple JDBC requests includes: sorting the JDBC requests belonging to the same service among the plurality of JDBC requests according to the sending timestamps in the corresponding analysis results; and generating a performance test script according to the configuration information of each node of the framework structure in the script configuration file and the sorted JDBC requests.
Because the JDBC requests are sorted by their corresponding service and sending timestamp, they closely match the actual services, and the fidelity of the performance test is high.
In an optional implementation manner, after the JDBC requests belonging to the same service in the multiple JDBC requests are sorted according to the sending timestamps in the corresponding parsing results, and before a performance test script is generated according to the configuration information of each node of the framework structure in the script configuration file and the sorted multiple JDBC requests, the method further includes: respectively initializing a count value for JDBC requests of different services, and performing accumulation counting on the sequence of the count value according to the corresponding sending timestamp; and naming the JDBC requests belonging to the same service according to the name of the corresponding service and the count value in the service.
Because the JDBC requests belonging to the same service are named after the corresponding service and the count value within the service, the performance test results of different services can be distinguished at a glance after the performance test script has run, and the number of requests of each service is also reflected.
In an optional implementation manner, the parsing result includes a sending timestamp of the corresponding service request, and after obtaining the cleaned parsing result, the method further includes: classifying the cleaned analysis results according to the corresponding services, sequencing the analysis results belonging to the same service in the cleaned analysis results according to the sending time stamps, and sequentially constructing corresponding JDBC requests for the analysis results of the same service according to the sequenced sequence.
In a second aspect, an embodiment of the present application provides an automatic generation apparatus for a performance test script, where the performance test script is used to test performance of a system to be tested, and the apparatus includes: the request tracking module is used for acquiring a tracking file, wherein the tracking file comprises a plurality of service requests sent to a database of the system to be tested by a preset application program; the data analysis module is used for analyzing each service request into the contents of a plurality of preset fields to obtain an analysis result; the configuration module is used for acquiring a preset script configuration file, and the script configuration file comprises configuration information of each node of a frame structure forming the performance test script; and the script construction module is used for generating a performance test script according to the script configuration file and the analysis result.
In a third aspect, an embodiment of the present application provides a storage medium, where a computer program is stored on the storage medium, and when the computer program is executed by a processor, the method according to any one of the first aspect and the optional implementation manner of the first aspect is performed.
In a fourth aspect, an embodiment of the present application provides an electronic device, including: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating via the bus when the electronic device is running, the machine-readable instructions when executed by the processor performing the method according to any one of the first aspect, the optional implementation of the first aspect.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings that are required to be used in the embodiments of the present application will be briefly described below, it should be understood that the following drawings only illustrate some embodiments of the present application and therefore should not be considered as limiting the scope, and that those skilled in the art can also obtain other related drawings based on the drawings without inventive efforts.
Fig. 1 is a flowchart of a method for automatically generating a performance test script according to an embodiment of the present application;
FIG. 2 is a flowchart showing a specific example of step 150 of the automatic generation method of the performance test script;
fig. 3 is a schematic diagram of an apparatus for automatically generating a performance test script according to an embodiment of the present application;
fig. 4 is a schematic view of an electronic device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application will be described below with reference to the drawings in the embodiments of the present application. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, it need not be further defined and explained in subsequent figures. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other identical elements in a process, method, article, or apparatus that comprises the element. The terms "first," "second," "third," and the like are used solely to distinguish one from another and are not to be construed as indicating or implying relative importance.
The embodiment of the application provides a method for automatically generating a performance test script, so that the automatically generated script can be used to test the performance of a system to be tested. Because the performance test script is generated automatically, few manual steps are involved, which improves script generation efficiency and, in turn, performance test efficiency. Jmeter is an open-source project of the Apache organization and a tool for performance testing. The performance test script in the embodiments of the present application may be a Jmeter script, but may also be a script in another form, such as a LoadRunner script, a WebLOAD script, a Loadster script, and the like.
Fig. 1 is a flowchart illustrating an automatic generation method of a performance test script provided in this embodiment, please refer to fig. 1, where the method includes the following steps:
step 110: a script configuration file is created.
The script configuration file includes configuration information for each node that makes up the framework structure of the performance test script. Specifically, the framework structure of the performance test script is defined in the script configuration file, and its nodes include: a test plan node (the root node) and a thread group node (a child node under the root node); other child nodes may also be included under the test plan node. The test plan describes the execution process and steps of the Jmeter script during the performance test, and a complete test plan can include one or more thread groups, logic controllers, samplers, listeners, timers, configuration elements, and so on. Therefore, in the script configuration file, the defined nodes forming the framework structure of the performance test script can include, in addition to the test plan node and the thread group node, a logic controller node, a listener node, and the like, and the framework structure in the script configuration file can be flexibly defined as required.
The basis for performance testing is thread groups, one thread group representing a simulated set of users that are used to simulate sending requests to the database of the system under test.
It can be understood that the present embodiment tests the system performance by simulating the way that the client user sends the constructed test request to the system database.
In this embodiment, the script configuration file is saved in a predefined format.
In a specific embodiment, a plurality of generated performance test scripts are obtained in advance, the configuration in the performance test scripts is analyzed, the overall framework structure and common parameters and variables of universality are extracted and written into a script configuration file for calling when the performance test scripts are automatically created.
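The text leaves the on-disk format of the script configuration file unspecified. As a hedged sketch, assuming a JSON layout (all key names below are illustrative assumptions, not from the source), the extracted framework structure and common parameters might be stored and reloaded like this:

```python
import json

# Hypothetical script configuration: a test-plan root node with one
# thread group child (1 thread, 1 loop by default, as described in the
# text) and a JDBC connection configuration node. Key names are assumed.
script_config = {
    "test_plan": {
        "name": "AutoGeneratedTestPlan",
        "children": [
            {
                "type": "thread_group",
                "name": "default",       # may be overridden by the service name
                "threads": 1,            # 1 concurrent user by default
                "loops": 1,              # 1 iteration by default
            },
            {
                "type": "jdbc_connection_config",
                "datasource": "pool_1",  # placeholder data-source name
            },
        ],
    }
}

def save_config(path: str) -> None:
    """Write the configuration so it can be loaded during script creation."""
    with open(path, "w", encoding="utf-8") as f:
        json.dump(script_config, f, indent=2)

def load_config(path: str) -> dict:
    """Reload the configuration when the performance test script is built."""
    with open(path, encoding="utf-8") as f:
        return json.load(f)
```
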
Step 120: and acquiring a tracking file, wherein the tracking file comprises a plurality of service requests sent to a database of the system to be tested by a preset application program.
A foreground application program is deployed on a client device, and the service requests corresponding to its business operations are captured in advance by a tracking tool: when a business operation is executed, the foreground application program sends the corresponding service request to the background database, where one service may send a single service request or a group of service requests. For example, a tracking tool such as SQL Profiler can be used; the traced content includes the time and the text content of each request, and all service requests exchanged between the foreground application program and the background database within a period of time are captured and saved by the tracking tool to form a tracking file. Each service request in the tracking file is named after the corresponding service.
The tracking file can be saved in a variety of formats; in this embodiment, the tracking file format includes, but is not limited to, XML.
When the simple service system only has one data source, namely, one foreground application program only interacts with one background database, a tracking file can be obtained. When a complex business system has a plurality of data sources, namely, one foreground application program needs to interact with a plurality of background databases to complete the business, a plurality of trace files can be obtained.
The acquisition of the trace file includes, but is not limited to, the following ways:
1. and acquiring the file name of the tracking file, and acquiring the tracking file according to the file name.
2. And acquiring a storage path of the tracking file, inquiring the file under the storage path, and acquiring the tracking file.
3. The trace file is retrieved through a read interface provided by the storage medium carrier.
Step 130: and analyzing each service request into the contents of a plurality of preset fields to obtain an analysis result.
After one or more tracking files are obtained, each service request in the tracking files is analyzed into the contents of a plurality of preset fields, so that each service request yields one analysis result. The contents of the preset fields include: the request text, the database information, the information of the foreground application program, and the time information, where the time information is converted into a timestamp — the sending timestamp of the corresponding service request.
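A minimal sketch of this parsing step, assuming a SQL Profiler-style XML schema (the element names `TextData`, `DatabaseName`, `ApplicationName` and `StartTime` are assumptions for illustration; a real tracking file has its own schema):

```python
import xml.etree.ElementTree as ET
from dataclasses import dataclass
from datetime import datetime

@dataclass
class ParsedRequest:
    text: str         # request text (the SQL statement)
    database: str     # database information
    app: str          # foreground application information
    timestamp: float  # sending timestamp, seconds since epoch

def parse_event(xml_fragment: str) -> ParsedRequest:
    """Parse one traced service request into the preset fields."""
    node = ET.fromstring(xml_fragment)
    ts = datetime.strptime(node.findtext("StartTime"), "%Y-%m-%d %H:%M:%S")
    return ParsedRequest(
        text=node.findtext("TextData").strip(),
        database=node.findtext("DatabaseName"),
        app=node.findtext("ApplicationName"),
        timestamp=ts.timestamp(),  # time information converted to a timestamp
    )

sample = """<Event>
  <TextData>SELECT * FROM orders WHERE id = 1</TextData>
  <DatabaseName>erp</DatabaseName>
  <ApplicationName>client_app</ApplicationName>
  <StartTime>2020-10-30 10:00:00</StartTime>
</Event>"""
```
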
Step 140: a pre-created script configuration file is obtained.
Step 150: and generating a performance test script according to the script configuration file and the analysis result.
And after the analysis result of the service request is obtained, automatically generating a performance test script according to the analysis result and a script configuration file established in advance.
Specifically, referring to fig. 2, step 150 includes:
step 151: and forming a framework structure of the performance test script according to the script configuration file.
The script is constructed in XML format. First, a test plan node is constructed according to the script configuration file; then a thread group node and a JDBC Connection Configuration node are added under the test plan node. The name of the thread group node may be generated from the service name by default. The thread settings of the thread group are configured in the script configuration file and default to 1 concurrent thread executing 1 loop, with no duration configured.
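Step 151 can be sketched by emitting the XML tree directly. The element names below follow JMeter's .jmx conventions, but the attributes and nesting are heavily simplified assumptions — a real .jmx file carries many more properties:

```python
import xml.etree.ElementTree as ET

def build_skeleton(service_name: str, threads: int = 1, loops: int = 1) -> ET.Element:
    """Build a simplified .jmx-style skeleton: a test plan node with one
    thread group node and one JDBC Connection Configuration node under it."""
    root = ET.Element("jmeterTestPlan", version="1.2")
    tree = ET.SubElement(root, "hashTree")
    ET.SubElement(tree, "TestPlan", testname="AutoGeneratedTestPlan")
    plan_tree = ET.SubElement(tree, "hashTree")
    # Thread group: name defaults to the service name, 1 thread x 1 loop.
    tg = ET.SubElement(plan_tree, "ThreadGroup", testname=service_name)
    ET.SubElement(tg, "stringProp", name="ThreadGroup.num_threads").text = str(threads)
    loop = ET.SubElement(tg, "elementProp", name="ThreadGroup.main_controller")
    ET.SubElement(loop, "stringProp", name="LoopController.loops").text = str(loops)
    # JDBC Connection Configuration node (attributes simplified).
    ET.SubElement(plan_tree, "JDBCDataSource",
                  testname="JDBC Connection Configuration")
    return root
```
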
Step 152: and constructing corresponding JDBC requests according to the analysis result of each service request under the thread group nodes in the frame structure to obtain a plurality of JDBC requests.
And constructing a corresponding JDBC request under the thread group node according to the analysis result of each service request, wherein the analysis result of each service request comprises the contents of a plurality of fields, such as a request text, database information, information of a foreground application program and a sending timestamp, and the content of each field corresponds to one parameter in the JDBC request.
In step 152, JDBC requests may be constructed from all the analysis results obtained in step 130, or from only some of them; for example, the analysis results of step 130 are cleaned first, and the JDBC requests are then constructed from the cleaned analysis results. Each service request is analyzed into one analysis result, and one JDBC request is constructed from each analysis result.
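Continuing the sketch, one analysis result can be mapped onto one JDBC request element, with each preset field becoming one parameter of the request as the text describes. The property names below approximate JMeter's JDBC sampler and should be treated as illustrative assumptions:

```python
import xml.etree.ElementTree as ET

def build_jdbc_sampler(name: str, query: str, pool: str = "pool_1") -> ET.Element:
    """Map one analysis result onto a JDBC request element: the request
    text becomes the query parameter, and the data-source name links the
    request to the JDBC Connection Configuration node."""
    sampler = ET.Element("JDBCSampler", testname=name)
    ET.SubElement(sampler, "stringProp", name="dataSource").text = pool
    ET.SubElement(sampler, "stringProp", name="query").text = query
    return sampler
```
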
Step 153: and generating a performance test script according to the configuration information of each node of the framework structure in the script configuration file and the constructed multiple JDBC requests.
And after constructing a corresponding JDBC request under the thread group node according to the analysis result of the service request, automatically generating a performance test script according to the constructed multiple JDBC requests and the configuration information in the script configuration file.
In an embodiment, after the JDBC requests are constructed in step 152, JDBC requests belonging to the same service in the JDBC requests are sorted according to the sending timestamps in the corresponding parsing results, and then a performance testing script is generated according to the configuration information of each node of the framework structure in the script configuration file and the sorted JDBC requests. Because each JDBC request is sorted according to the corresponding service and the sending time stamp, the JDBC requests are matched with the actual service.
After sequencing, initializing a count value for JDBC requests of different services respectively, performing accumulation counting on the sequence of the count values according to the corresponding sending timestamps, and splicing the service name and the count value to form the name of the JDBC request. And naming the JDBC requests belonging to the same service according to the name of the corresponding service and the count value in the service.
For example, if service A has 100 service requests, 100 JDBC requests are constructed. A count value is initialized for service A; assuming the initial count value is 0, the count is incremented in the order of the sending timestamps of the 100 service requests, running from 0 to 99, and the names of the 100 JDBC requests are: service A_000, service A_001, …, service A_099.
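The per-service counting and naming described above can be sketched as follows (the zero-based, zero-padded counter format is an illustrative choice):

```python
from collections import defaultdict

def name_requests(parsed):
    """Sort the requests of each service by sending timestamp and name
    them '<service>_<count>', where the count value is initialized per
    service and incremented in timestamp order.

    `parsed` is a list of (service, timestamp) pairs."""
    counters = defaultdict(int)  # one count value per service
    names = []
    for service, _ts in sorted(parsed, key=lambda p: (p[0], p[1])):
        names.append(f"{service}_{counters[service]:03d}")
        counters[service] += 1
    return names
```
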
For JDBC requests with special attributes, the corresponding keywords are extracted, the extracted keywords are initialized as characteristic-value variables of the request, and the request is named after the service name, the characteristic value and the count value; for example, if the characteristic value is 002, the request is named service A_002_001.
And traversing all the analysis results, constructing the JDBC request according to all the analysis results according to a certain naming rule, and adding the JDBC request into the XML file of the framework structure.
After the JDBC requests have been constructed, the content of the whole performance test script is complete, and a performance test script with the suffix .jmx is generated from the obtained framework-structure configuration information and the constructed JDBC requests.
The performance test script may be named as follows:
1. Using a predetermined fixed file name, such as test.jmx;
2. naming according to the name of the realized service;
3. automatically named according to a preset rule.
If the script storage path is configured, after the performance test script is generated, saving the file of the performance test script under the corresponding path, and if the path is not configured, saving the file in a default folder.
The performance test script automatically generated in the embodiment can support subsequent flexible modification.
Running the performance test script sends the constructed JDBC requests to the database through the thread group node, and the performance test data of the system are monitored and summarized in real time. After the performance test script finishes running, a performance test report is generated, which includes the various performance test indexes of the performance test on the system to be tested.
In a particular embodiment, a JDBC request is constructed from the scrubbed parsing results. After step 130, the method further comprises: and carrying out data cleaning on the obtained analysis result according to a preset cleaning rule to obtain a cleaned analysis result. And the plurality of service requests in the tracking file comprise useless service requests, and the analysis result of the useless service requests in the obtained analysis result is removed by setting a cleaning rule, so that only available contents are reserved.
After the data cleaning is completed, the cleaned analysis results are stored according to a predefined format and field names; in this embodiment, they are stored with pandas.
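A hedged sketch of the cleaning-and-storage step with pandas; the cleaning patterns and column names below are illustrative assumptions, not rules from the source:

```python
import pandas as pd

# Illustrative cleaning rules: drop requests whose text matches any of
# these patterns (the actual rule set is configurable in practice).
USELESS_PATTERNS = ["SET NOCOUNT", "sp_reset_connection"]

def clean_results(df: pd.DataFrame) -> pd.DataFrame:
    """Remove useless service requests from the analysis results so that
    only usable content is retained, and return the cleaned DataFrame."""
    mask = ~df["text"].str.contains("|".join(USELESS_PATTERNS), regex=True)
    return df[mask].reset_index(drop=True)
```
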
The saved analysis result can be output to an external carrier in a form of printing, dumping, writing and the like, and the external carrier comprises a database, a text file, a storage medium and the like and is used for business analysis.
After the cleaned analytical results are saved, the cleaned analytical results can be subjected to subsequent processing.
For the analysis results of the same data source, the cleaned analysis results are classified according to the services, and the analysis results belonging to the same service are sorted according to the sending timestamps; for example, if service A has 10 service requests, the analysis results of those 10 service requests are sorted according to their sending timestamps.
It can be understood that, for a plurality of service requests of a single data source and a single service, if the analysis results of the service requests are already ordered chronologically, the classification and sorting steps may not be required.
For the analysis results of a plurality of data sources, the analysis results of the plurality of cleaned data sources are merged together, the merged analysis results are classified according to the services, and the analysis results belonging to the same service in the plurality of data sources are named according to a uniform service name, for example, for the analysis results of the data source 1, the data source 2 and the data source 3, before merging, the naming for the same service a may be different, and after merging, the naming is uniformly named as the service a. Then, the multiple analysis results of the same service are sorted according to the sending time stamp, for example, the multiple analysis results of the service a are sorted according to the time stamp, and the multiple analysis results of the service B are sorted according to the time stamp. After the analysis results of the multiple data sources are combined, classified and sequenced, the analysis results are matched with the actual service.
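The merge–unify–sort processing for multiple data sources might be sketched with pandas as follows (the name-mapping table and column names are assumptions for illustration):

```python
import pandas as pd

# Hypothetical mapping that unifies per-source service names after merging,
# e.g. the same service A named differently in data sources 1 and 2.
NAME_MAP = {"svcA_src1": "service_A", "svcA_src2": "service_A"}

def merge_sources(frames):
    """Merge the cleaned analysis results of several data sources, unify
    the service names, then sort the analysis results of each service by
    sending timestamp."""
    merged = pd.concat(frames, ignore_index=True)
    merged["service"] = merged["service"].replace(NAME_MAP)
    return merged.sort_values(["service", "timestamp"]).reset_index(drop=True)
```
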
In constructing the performance testing script, a JDBC request is constructed according to the cleaned parsing result. When the JDBC request is constructed, all analysis results of the corresponding services are sequentially searched from the stored analysis results according to the service names, and the corresponding JDBC request is sequentially constructed for the analysis results of the same service according to the sorted sequence.
In this embodiment, when a JDBC request is constructed, a check rule is used to verify that the request content in the analysis result matches the service name, and only when the request content matches the service name is a JDBC request constructed from the analysis result.
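A minimal sketch of such a check rule, assuming a hypothetical mapping from service names to expected request-text patterns (the service name and pattern below are invented for illustration):

```python
import re

# Illustrative check rules: each service name maps to a pattern that its
# request text is expected to match before a JDBC request is built.
CHECK_RULES = {
    "order_query": re.compile(r"\bFROM\s+orders\b", re.IGNORECASE),
}

def matches_service(service: str, request_text: str) -> bool:
    """Verify that the request content matches the service name; only
    matching analysis results are turned into JDBC requests."""
    rule = CHECK_RULES.get(service)
    return bool(rule and rule.search(request_text))
```
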
In summary, the performance test script automatic generation method provided by the embodiment of the present application can automatically generate the script file according to the preset script configuration file and the tracked actual service request, so that the labor cost can be greatly reduced, and the script generation efficiency can be improved.
Based on the same inventive concept, an embodiment of the present application provides an apparatus for automatically generating a performance test script, where the performance test script is used to test the performance of a system to be tested, and please refer to fig. 3, the apparatus includes: a request tracking module 210, a data parsing module 220, a configuration module 230, and a script building module 240.
The request tracking module 210 is configured to obtain a tracking file, where the tracking file includes a plurality of service requests sent by a preset application program to a database of the system to be tested;
the data analysis module 220 is configured to analyze each service request into contents of a plurality of preset fields to obtain an analysis result;
a configuration module 230, configured to obtain a preset script configuration file, where the script configuration file includes configuration information of each node of a framework structure that constitutes a performance test script;
and the script building module 240 is configured to generate a performance test script according to the script configuration file and the parsing result.
Optionally, the apparatus further comprises: a configuration file building module for defining a framework structure of a performance test script, the framework structure comprising: a performance test plan node and a thread group node under the performance test plan node; and the script configuration file is used for acquiring the configuration information of each node in the defined framework structure and generating the script configuration file according to the configuration information.
Optionally, the script building module 240 includes: a framework structure parsing module, configured to form the framework structure of the performance test script according to the script configuration file; a JDBC request construction module, configured to construct, under the thread group node in the framework structure, a corresponding JDBC request according to the parsing result of each service request, so as to obtain a plurality of JDBC requests, where the contents of the plurality of preset fields in the parsing result each correspond to one parameter of the JDBC request; and a script file generation module, configured to generate the performance test script according to the configuration information of each node of the framework structure in the script configuration file and the constructed JDBC requests.
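The field-to-parameter mapping described above could be sketched as follows, where each preset field of a parsed service request supplies one parameter of the generated JDBC request (the field and parameter names are illustrative assumptions):

```python
def build_jdbc_request(parsed):
    """Map each preset field of one parsing result to one parameter of a
    JDBC request; parameter names here are illustrative assumptions."""
    return {
        "type": "jdbc_request",
        "query": parsed["sql"],            # SQL text field -> query parameter
        "service": parsed["service"],      # service field  -> grouping parameter
        "timestamp": parsed["timestamp"],  # send time      -> ordering parameter
    }

def build_jdbc_requests(parsing_results):
    """Construct one JDBC request per parsed service request."""
    return [build_jdbc_request(p) for p in parsing_results]
```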
Optionally, the apparatus further comprises a data cleaning module, configured to perform data cleaning on the parsing result according to a preset cleaning rule to obtain a cleaned parsing result; the JDBC request construction module is then specifically configured to construct the plurality of JDBC requests according to the cleaned parsing result.
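One possible preset cleaning rule could be sketched as below — dropping connection noise and incomplete records. The specific rules are assumptions; in practice they would be configured per project.

```python
# Illustrative noise prefixes (keep-alives, session setup statements);
# an assumption, not the patent's actual cleaning rule.
NOISE_SQL_PREFIXES = ("SET ", "SHOW ", "SELECT 1")

def clean_parsing_results(parsing_results):
    """Apply the preset cleaning rule: drop records that miss a required
    field or whose SQL matches a known noise prefix."""
    upper_prefixes = tuple(p.upper() for p in NOISE_SQL_PREFIXES)
    cleaned = []
    for rec in parsing_results:
        sql = rec.get("sql", "").strip()
        if not sql or not rec.get("service"):
            continue  # incomplete record
        if sql.upper().startswith(upper_prefixes):
            continue  # connection/session noise, not business traffic
        cleaned.append(rec)
    return cleaned
```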
Optionally, the script file generation module is specifically configured to: sort the JDBC requests belonging to the same service among the plurality of JDBC requests according to the sending timestamps in the corresponding parsing results; and generate the performance test script according to the configuration information of each node of the framework structure in the script configuration file and the sorted JDBC requests.
Optionally, the apparatus further comprises a naming module, configured to initialize a count value for the JDBC requests of each service, increment the count value in the order of the corresponding sending timestamps, and name the JDBC requests belonging to the same service according to the name of the corresponding service and the count value within that service.
Optionally, the apparatus further comprises a processing module, configured to classify the cleaned parsing results according to the corresponding services, sort the cleaned parsing results belonging to the same service according to the sending timestamps, and construct the corresponding JDBC requests for the parsing results of the same service in the sorted order.
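The grouping, sorting, and naming behavior of these modules could be sketched together as follows, assuming a numeric sending-timestamp field and a "service_N" naming scheme (both assumptions, since the patent only specifies that names combine the service name and an in-service count):

```python
from collections import defaultdict

def order_and_name_requests(parsing_results):
    """Group parsing results by service, sort each group by sending
    timestamp, and name each JDBC request '<service>_<count>' with a
    per-service counter that follows timestamp order."""
    by_service = defaultdict(list)
    for rec in parsing_results:
        by_service[rec["service"]].append(rec)

    requests = []
    for service, recs in by_service.items():
        recs.sort(key=lambda r: float(r["timestamp"]))
        for count, rec in enumerate(recs, start=1):
            requests.append({"name": "%s_%d" % (service, count),
                             "query": rec["sql"]})
    return requests
```

Sorting within each service preserves the real request order of that business flow, so replaying the script exercises the database in the same sequence that was traced.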
The implementation principle and technical effects of the apparatus for automatically generating a performance test script provided by the embodiments of the present application have been introduced in the foregoing method embodiments. For brevity, for parts of the apparatus embodiment that are not described here, reference may be made to the corresponding contents in the method embodiments.
Fig. 4 shows a possible structure of an electronic device 300 provided in an embodiment of the present application. Referring to fig. 4, the electronic device 300 includes: a processor 310, a memory 320, and a communication interface 330, which are interconnected and in communication with each other via a communication bus 340 and/or other form of connection mechanism (not shown).
The memory 320 includes one or more memories (only one is shown in the figure), which may be, but are not limited to, a Random Access Memory (RAM), a Read-Only Memory (ROM), a Programmable Read-Only Memory (PROM), an Erasable Programmable Read-Only Memory (EPROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), and the like. The processor 310, and possibly other components, may access, read, and/or write data to the memory 320.
The processor 310 includes one or more processors (only one is shown), which may be an integrated circuit chip having signal processing capabilities. The processor 310 may be a general-purpose processor, including a Central Processing Unit (CPU), a Micro Control Unit (MCU), a Network Processor (NP), or other conventional processors; it may also be a special-purpose processor, including a Graphics Processing Unit (GPU), a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field Programmable Gate Array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. Moreover, when there are a plurality of processors 310, some of them may be general-purpose processors and the others special-purpose processors.
The communication interface 330 includes one or more interfaces (only one is shown) that can be used to communicate, directly or indirectly, with other devices for data interaction, for example to communicate with an external storage medium to retrieve the tracking file. The communication interface 330 may include wired and/or wireless interfaces.
One or more computer program instructions may be stored in the memory 320 and read and executed by the processor 310 to implement the automatic generation method of the performance test script provided by the embodiments of the present application and other desired functions.
It will be appreciated that the configuration shown in fig. 4 is merely illustrative and that electronic device 300 may include more or fewer components than shown in fig. 4 or have a different configuration than shown in fig. 4. The components shown in fig. 4 may be implemented in hardware, software, or a combination thereof. The electronic device 300 may be a physical device, such as a PC, a laptop, a server, an embedded device, etc., or may be a virtual device, such as a virtual machine, a virtualized container, etc. The electronic device 300 is not limited to a single device, and may be a combination of a plurality of devices or a cluster including a large number of devices. The electronic device 300 provided in this embodiment may be the same device as the client device with the foreground application deployed in the foregoing, or may be a different device.
The embodiment of the present application further provides a computer-readable storage medium, where computer program instructions are stored on the computer-readable storage medium, and when the computer program instructions are read and run by a processor of a computer, the method for automatically generating a performance test script provided in the embodiment of the present application is executed. The computer-readable storage medium may be implemented as, for example, memory 320 in electronic device 300 in fig. 4.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus and method may be implemented in other ways. The apparatus embodiments described above are merely illustrative; for example, the division into units is only a logical division, and other divisions are possible in practice. Furthermore, the functional modules in the embodiments of the present application may be integrated together to form an independent part, each module may exist separately, or two or more modules may be integrated to form an independent part.
The above description is only an example of the present application and is not intended to limit the scope of the present application, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, improvement and the like made within the spirit and principle of the present application shall be included in the protection scope of the present application.
Claims (10)
1. A method for automatically generating a performance test script, characterized in that the performance test script is used for testing the performance of a system to be tested, and the method comprises the following steps:
acquiring a tracking file, wherein the tracking file comprises a plurality of service requests sent to a database of the system to be tested by a preset application program;
parsing each service request into the contents of a plurality of preset fields to obtain a parsing result;
acquiring a preset script configuration file, wherein the script configuration file comprises configuration information of each node of a framework structure constituting the performance test script;
and generating the performance test script according to the script configuration file and the parsing result.
2. The method of claim 1, wherein prior to obtaining the pre-set script configuration file, the method further comprises:
a framework structure defining a performance test script, the framework structure comprising: a performance test plan node and a thread group node under the performance test plan node;
and acquiring configuration information of each node in the defined framework structure, and generating the script configuration file according to the configuration information.
3. The method of claim 2, wherein generating the performance test script according to the script configuration file and the parsing result comprises:
forming the framework structure of the performance test script according to the script configuration file;
constructing, under the thread group node in the framework structure, a corresponding JDBC request according to the parsing result of each service request to obtain a plurality of JDBC requests, wherein the contents of the plurality of preset fields in the parsing result each correspond to one parameter of the JDBC request;
and generating the performance test script according to the configuration information of each node of the framework structure in the script configuration file and the constructed JDBC requests.
4. The method of claim 3, wherein after parsing each service request into the contents of the plurality of preset fields, the method further comprises:
performing data cleaning on the parsing result according to a preset cleaning rule to obtain a cleaned parsing result; the plurality of JDBC requests are constructed according to the cleaned parsing result.
5. The method according to claim 3, wherein the parsing result includes a sending timestamp of the corresponding service request, and generating the performance test script according to the configuration information of each node of the framework structure in the script configuration file and the constructed JDBC requests comprises:
sorting the JDBC requests belonging to the same service among the plurality of JDBC requests according to the sending timestamps in the corresponding parsing results;
and generating the performance test script according to the configuration information of each node of the framework structure in the script configuration file and the sorted JDBC requests.
6. The method according to claim 5, wherein after sorting the JDBC requests belonging to the same service according to the sending timestamps in the corresponding parsing results, and before generating the performance test script according to the configuration information of each node of the framework structure in the script configuration file and the sorted JDBC requests, the method further comprises:
initializing a count value for the JDBC requests of each service, and incrementing the count value in the order of the corresponding sending timestamps;
and naming the JDBC requests belonging to the same service according to the name of the corresponding service and the count value within that service.
7. The method of claim 4, wherein the parsing result includes a sending timestamp of the corresponding service request, and after obtaining the cleaned parsing result, the method further comprises:
classifying the cleaned parsing results according to the corresponding services, sorting the cleaned parsing results belonging to the same service according to the sending timestamps, and constructing the corresponding JDBC requests for the parsing results of the same service in the sorted order.
8. An apparatus for automatically generating a performance test script, characterized in that the performance test script is used for testing the performance of a system to be tested, and the apparatus comprises:
a request tracking module, configured to obtain a tracking file, wherein the tracking file comprises a plurality of service requests sent to a database of the system to be tested by a preset application program;
a data parsing module, configured to parse each service request into the contents of a plurality of preset fields to obtain a parsing result;
a configuration module, configured to obtain a preset script configuration file, wherein the script configuration file comprises configuration information of each node of a framework structure constituting the performance test script;
and a script building module, configured to generate the performance test script according to the script configuration file and the parsing result.
9. A storage medium, having stored thereon a computer program which, when executed by a processor, performs the method according to any one of claims 1-7.
10. An electronic device, comprising: a processor, a memory and a bus, the memory storing machine-readable instructions executable by the processor, the processor and the memory communicating over the bus when the electronic device is operating, the machine-readable instructions when executed by the processor performing the method of any of claims 1-7.
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202011186544.3A CN114428725A (en) | 2020-10-29 | 2020-10-29 | Method and device for automatically generating performance test script |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN114428725A true CN114428725A (en) | 2022-05-03 |
Family
ID=81309045
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202011186544.3A Pending CN114428725A (en) | 2020-10-29 | 2020-10-29 | Method and device for automatically generating performance test script |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN114428725A (en) |
Citations (4)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US6167534A (en) * | 1995-11-24 | 2000-12-26 | Rational Software Corporation | Load test system and method |
| CN107133161A (en) * | 2016-02-26 | 2017-09-05 | 中国移动(深圳)有限公司 | One kind generation client performance test script method and device |
| CN108009087A (en) * | 2017-11-29 | 2018-05-08 | 广州品唯软件有限公司 | Data library test method, device and computer-readable recording medium |
| US20190188116A1 (en) * | 2017-12-20 | 2019-06-20 | 10546658 Canada Inc. | Automated software testing method and system |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |