CN115017283B - Natural language processing model, method, electronic device and computer storage medium - Google Patents
- Publication number
- CN115017283B (application CN202210606848.3A)
- Authority
- CN
- China
- Prior art keywords
- statement
- sql
- sql statement
- parameters
- natural language
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/30—Information retrieval; Database structures therefor; File system structures therefor of unstructured textual data
- G06F16/33—Querying
- G06F16/332—Query formulation
- G06F16/3329—Natural language query formulation
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/24—Querying
- G06F16/242—Query formulation
- G06F16/2433—Query languages
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/08—Learning methods
Landscapes
- Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- Theoretical Computer Science (AREA)
- Mathematical Physics (AREA)
- General Engineering & Computer Science (AREA)
- Computational Linguistics (AREA)
- Data Mining & Analysis (AREA)
- General Physics & Mathematics (AREA)
- Databases & Information Systems (AREA)
- Artificial Intelligence (AREA)
- Biomedical Technology (AREA)
- Molecular Biology (AREA)
- Computing Systems (AREA)
- General Health & Medical Sciences (AREA)
- Evolutionary Computation (AREA)
- Biophysics (AREA)
- Software Systems (AREA)
- Life Sciences & Earth Sciences (AREA)
- Health & Medical Sciences (AREA)
- Human Computer Interaction (AREA)
- Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
Abstract
The embodiment of the application provides a natural language processing model, a natural language processing method, electronic equipment and a computer storage medium. The natural language processing model is used for converting natural language question statements in multi-round dialogues into SQL statements and comprises a pooling layer, an encoding layer and an output layer. The pooling layer is used for compressing column information of a data table to be retrieved into encoded data of a set length; the encoding layer is used for constructing input data from the natural language question statement and the encoded data containing the column information, and for predicting the parameters of the current round's SQL statement and the intent of the question statement based on the input data; the output layer is used for generating the SQL statement corresponding to the question statement from the predicted intent and parameters of the current round's SQL statement. The model can improve inference performance.
Description
Technical Field
The embodiment of the application relates to the technical field of intelligent dialogue, in particular to a natural language processing model, a natural language processing method, electronic equipment and a computer storage medium.
Background
Tables are a widely used means of storing and displaying structured data. Because of their clear structure, ease of maintenance and timeliness, tables are often an important answer source for search engines and intelligent dialogue systems. In conventional interactions with a database (such as online shopping, ticket booking, meal ordering and the like), most operations are performed through SQL templates packaged in the background, so the operations are limited to the preset SQL templates, lack flexibility, and have difficulty meeting the requirements of an intelligent dialogue system.
To solve this problem, the technical solution of NL2SQL (natural language to SQL statement) may be used: the user only needs to express the intent in natural language, and NL2SQL converts it into a structured query statement in SQL, which can greatly shorten the distance between the user and the database.
TableQA is an effective application of this technology. TableQA is a type of intelligent dialogue product, a question-answering robot that automatically converts the user's natural language into SQL statements and automatically replies to the query. In practical application scenarios, because data are generally stored in a database, complex cases such as multiple tables and tables associated through foreign keys must be handled in order to represent complex table relationships, and situations such as multiple tables and multiple rounds of table question answering must be considered according to the context information of the natural language. As a result, the input data of the NL2SQL model become very long, which seriously affects the overall performance of the NL2SQL model.
Disclosure of Invention
Accordingly, embodiments of the present application provide a natural language processing scheme to at least partially solve the above-mentioned problems.
According to a first aspect of an embodiment of the present application, there is provided a natural language processing model for converting a natural language question statement in a multi-round dialogue into an SQL statement. The model includes a pooling layer for compressing column information of a data table to be retrieved into encoded data of a set length; an encoding layer for constructing input data from the natural language question statement and the encoded data containing the column information, and for predicting the parameters of the current round's SQL statement and the intent of the question statement based on the input data; and an output layer for generating the SQL statement corresponding to the question statement from the predicted intent and parameters of the current round's SQL statement.
According to a second aspect of the embodiment of the application, there is provided a natural language processing method. The method includes: compressing column information of a data table to be retrieved into encoded data of a set length; constructing input data at least from the encoded data containing the column information and the question statement of the current round of dialogue; processing the input data with a natural language processing model to predict the parameters of the SQL statement corresponding to the question statement and the intent of the question statement; and generating the SQL statement corresponding to the question statement from the predicted parameters of the SQL statement and the intent of the question statement.
According to a third aspect of the embodiment of the application, there is provided an electronic device, including a processor, a memory, a communication interface and a communication bus, where the processor, the memory and the communication interface complete communication with each other through the communication bus, and the memory is configured to store at least one executable instruction, where the executable instruction causes the processor to perform an operation corresponding to a method as described above.
According to a fourth aspect of embodiments of the present application, there is provided a computer storage medium having stored thereon a computer program which, when executed by a processor, implements a method as described above.
According to a fifth aspect of embodiments of the present application, there is provided a computer program product comprising computer instructions for instructing a computing device to perform operations corresponding to the method as described above.
In the method provided by the embodiment of the application, when the natural language processing model processes a question statement to obtain an SQL statement, the column information (such as column names) of the data table to be retrieved must be provided as input. Because the column information of the data table is long, the input data fed to the natural language processing model would also be long, which increases the amount of computation of the model, reduces its efficiency and affects its performance. In the embodiment of the application, a pooling layer is therefore provided before the encoding layer to compress the column information into encoded data of a set length, so that the length of the input data is reduced while the column information is retained. The encoding layer can predict the intent of the question statement and the parameters of the corresponding SQL statement by processing the input data, and the output layer can then generate the corresponding SQL statement from the intent and the parameters of the SQL statement.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings required in the description of the embodiments or the prior art are briefly described below. It is apparent that the drawings in the following description are only some of the embodiments described in the present application, and other drawings may be obtained from these drawings by a person of ordinary skill in the art.
FIG. 1 is a schematic diagram of a natural language processing model according to a first embodiment of the present application;
FIG. 2 is a flowchart illustrating a natural language processing method according to a second embodiment of the present application;
FIG. 3 is a block diagram illustrating a natural language processing device according to a third embodiment of the present application;
Fig. 4 is a schematic structural diagram of an electronic device according to a fourth embodiment of the present application.
Detailed Description
In order to better understand the technical solutions in the embodiments of the present application, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. It is apparent that the described embodiments are only some, not all, of the embodiments of the present application. All other embodiments derived by a person skilled in the art based on the embodiments of the present application shall fall within the scope of protection of the embodiments of the present application.
The implementation of the embodiments of the present application will be further described below with reference to the accompanying drawings.
Example I
For ease of understanding, one usage scenario of the method of the present application is described by way of example before the implementation of the method is described in detail; the method of the present application is not, however, limited to this scenario and may be applied to other suitable scenarios.
In the embodiment of the application, a TableQA-based intelligent dialogue robot is taken as an example. The intelligent dialogue robot converses with a user based on the data in a data table (table). During the dialogue, the robot understands the question statement input by the user in natural language, converts it into an SQL statement that the database can understand, retrieves the required information from the data table in the database by executing the SQL statement, converts that information into a reply statement expressed in natural language, and returns it to the user. For example, the user asks how long the investment period of product A is; the intelligent dialogue robot converts the user's question statement into an SQL statement, queries the data table with the SQL statement to obtain the investment period of product A (for example, 5 days), and generates a natural language reply statement from this information to feed back to the user, so that the user and the robot can interact in a manner similar to a person-to-person conversation.
In this process, a question statement in natural language must be accurately converted into a corresponding SQL statement recognizable by the database. To achieve this, the present embodiment provides a natural language processing model for converting natural language question statements in a multi-round dialogue into SQL statements, which includes a pooling layer, an encoding layer and an output layer. The pooling layer is used to compress the column information of the data table to be retrieved into encoded data of a set length; the encoding layer is used to construct input data from the natural language question statement and the encoded data containing the column information, and to predict the parameters of the current round's SQL statement and the intent of the question statement based on the input data; the output layer is used to generate the SQL statement corresponding to the question statement from the predicted intent and parameters of the current round's SQL statement.
In this embodiment, a pooling layer is added before the encoding layer of the natural language processing model. The pooling layer compresses the column information, so that column information of considerable length is compressed into encoded data of a set length while the column information is still preserved in the encoded data. When the question statement and the other inputs are then assembled into input data, the length of the model's input data is reduced, the amount of computation in the encoding layer is reduced, and performance is improved; because the column information is preserved in the encoded data, the accuracy of the subsequent SQL statement prediction can still be ensured. The encoding layer predicts the intent of the question statement and the parameters of the corresponding SQL statement by processing the input data, and the output layer then generates the corresponding SQL statement from the intent and the parameters of the SQL statement.
The following describes the natural language processing model in detail with reference to fig. 1:
As shown in fig. 1, the natural language processing model includes a pooling layer (pooling), an encoding layer (encoding), and an output layer (output).
The pooling layer may be a max-pooling layer or an avg-pooling (average pooling) layer, without limitation. Unlike a conventional pooling layer, the pooling layer in the present application is placed before the encoder; it separately encodes the column information (schema) of the associated data table and outputs encoded data of a set length, which may be chosen as needed, without limitation.
The set length is generally less than or equal to the length that the column information of any single column would occupy after direct splicing. Column information includes, but is not limited to, column names (e.g., yield rate, product name, investment period, etc.), column value types (e.g., text, numerical values, etc.), column units (e.g., days, %, ten thousand, etc.), and the like.
Conventionally, if the column information includes only the column name and each character of the column name is converted into an N-dimensional vector (also called a token), then taking a three-character column name such as "yield rate" as an example, the column name is represented by a 3×N-dimensional vector. In the present application, the column name is processed by the pooling layer, so the compressed encoded data is a single N-dimensional vector; when the data table contains many columns, processing the column information through the pooling layer greatly reduces the length of the constructed input data. Of course, in other examples, the column information may also include the column name (e.g., yield rate), the column unit (e.g., percent) and the column value type (e.g., number), without limitation. Experiments show that the richer the column information, the better the accuracy of the subsequent prediction.
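By way of illustration only, the following minimal Python sketch shows how the per-character token vectors of one column's information could be pooled into a single fixed-length vector. The dimension N=8, the random embeddings and the function name are assumptions for illustration and not the patented implementation.

```python
import numpy as np

def pool_column_info(token_vectors: np.ndarray, mode: str = "max") -> np.ndarray:
    """Compress the (num_tokens, N) token vectors of one column's
    information into a single N-dimensional encoded vector."""
    if mode == "max":
        return token_vectors.max(axis=0)   # max-pooling over the token axis
    if mode == "avg":
        return token_vectors.mean(axis=0)  # avg-pooling over the token axis
    raise ValueError(f"unsupported pooling mode: {mode}")

# Illustration: a three-character column name (e.g. "yield rate"), each character
# embedded as an N=8 vector, is compressed from a 3x8 representation to one 8-dim vector.
np.random.seed(0)
column_name_tokens = np.random.randn(3, 8)
encoded = pool_column_info(column_name_tokens, mode="avg")
print(encoded.shape)  # (8,)
```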
The encoded data output by the pooling layer, together with the question statement (which may be represented as vectors) and other items, constitute the input data. For example, when constructing input data from the natural language question statement and the encoded data, the encoding layer constructs the input data from the question statement, the history information of the multi-round dialogue (the historical SQL statement shown in FIG. 1), the encoded data, and the initialized SQL structure data (the SQL sketch data).
In this example, the input data include not only the encoded data and the question statement but also the history information of the multi-round dialogue and the initialized SQL structure data. Because the history information of the multi-round dialogue is included, the natural language processing model is compatible with multi-round dialogue scenarios and can combine the context information of the historical dialogue, thereby improving prediction accuracy.
In this example, the history information of the multi-round dialogue includes the SQL statement corresponding to the question statement of the previous round of dialogue. Because the input data of each round of the natural language processing model contain this history information, the SQL statement of the previous round effectively fuses the context of all historical rounds; using it therefore preserves all of the history while keeping the input data short, which helps improve performance.
In addition to the history information of the multi-round dialogue, the input data also include initialized SQL structure data, which contain the initialized parameters of the SQL statement (s-num, agg and the like shown in FIG. 1) and the initialized intent (the action shown in FIG. 1). The encoding layer can therefore directly predict the parameters of the current round's SQL statement and the intent expressed by the user's question statement, and the subsequent output layer does not need to perform complex processing on the data output by the encoding layer, which simplifies the output layer and improves performance.
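As a rough illustration of how such input data might be assembled, the following Python sketch concatenates the question tokens, the previous round's SQL statement, the pooled column encodings and the initialized SQL sketch slots. The marker names ([CLS], [SEP], the slot names) and the ordering are assumptions for illustration, not the patented format.

```python
# Hypothetical token sequences; markers and ordering are illustrative assumptions.
question_tokens    = ["what", "is", "the", "yield", "rate"]            # current-round question
history_sql_tokens = ["select", "investment_time", "where",
                      "product_name", "=", "A"]                        # previous-round SQL
schema_encoded     = ["[COL_1]", "[COL_2]", "[COL_3]"]                 # one pooled vector per column
sql_sketch_slots   = ["[S-NUM]", "[AGG]", "[W-NUM]", "[ACTION]"]       # initialized SQL structure data

def build_input(question, history_sql, schema, sketch):
    """Concatenate question, history SQL, pooled column encodings and the
    initialized SQL sketch into one input sequence for the encoding layer."""
    sep = ["[SEP]"]
    return ["[CLS]"] + question + sep + history_sql + sep + schema + sep + sketch

input_sequence = build_input(question_tokens, history_sql_tokens,
                             schema_encoded, sql_sketch_slots)
print(len(input_sequence))
```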
The encoding layer may adopt a BERT neural network model, which may be initialized from bert-base weights during training. Of course, in other embodiments, the encoding layer may adopt other models, as long as the accuracy of prediction can be guaranteed.
The parameters of the predicted SQL statement may include, among other things, query parameters (e.g., select-related parameters) for indicating the query target and filter parameters for indicating the filter criteria.
The query parameters include, but are not limited to, a query number parameter, a query column parameter, and an aggregate function parameter.
The query number parameter (select number, s-num shown in FIG. 1) is used to indicate the number of query operations contained in the current round's SQL statement, i.e., how many queries the SQL statement includes.
The query column parameter (select column) is used to indicate the column targeted by each query operation, such as the "product name" column.
The aggregate function parameter (Select-Aggregation) is used to indicate an aggregate function used by the query operation, such as a max function, a min function, and so on.
The filter parameters include a filter condition number parameter, a filter condition target column parameter, a filter condition operator and a filter condition value; a filter connector may additionally be included.
The filter condition number parameter (where number) is used to indicate the number of filter conditions contained in the current round's SQL statement.
The filter condition target column parameter (where column) is used to indicate the column filtered by the condition, such as "yield rate".
The filter condition operator (where operator) is used to indicate which operator is used, e.g., greater than, equal to, less than or not equal to.
The filter condition value (where value) is used to indicate the value to filter against.
In one example, for the SQL statement select max(col_1) where col_2 > 2, max() is the aggregate function, col_1 is the value of the query column parameter, col_2 is the value of the filter condition target column parameter, > is the filter condition operator, and 2 is the filter condition value.
When there are several filter conditions, the filter connector is used to indicate the relationship between them, such as "and" or "or".
When there is only one filter condition or none, there is no filter connector.
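The predicted parameters can be pictured as a small structured record. The following Python dataclass is a hypothetical sketch of such a record, populated with the example statement above; the field names are illustrative assumptions and are not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SQLSketch:
    """Illustrative container for the predicted parameters of one SQL statement."""
    select_num: int = 0                                        # number of query operations (s-num)
    select_columns: List[str] = field(default_factory=list)    # select column(s)
    aggregations: List[str] = field(default_factory=list)      # select-aggregation, e.g. "max"
    where_num: int = 0                                         # number of filter conditions
    where_columns: List[str] = field(default_factory=list)     # filter condition target columns
    where_operators: List[str] = field(default_factory=list)   # >, =, <, != ...
    where_values: List[str] = field(default_factory=list)      # filter condition values
    connector: Optional[str] = None                            # "and" / "or"; None if <= 1 condition

# The example "select max(col_1) where col_2 > 2" corresponds to:
sketch = SQLSketch(select_num=1, select_columns=["col_1"], aggregations=["max"],
                   where_num=1, where_columns=["col_2"],
                   where_operators=[">"], where_values=["2"])
```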
In addition to these parameters of the SQL statement, the encoding layer also predicts the intent of the question statement. The intent may indicate that the SQL statement of the previous round of dialogue should be adjusted to obtain the current round's SQL statement, that a new SQL statement should be started, or that the question statement cannot be answered using the data table.
The output layer (output) generates the SQL statement corresponding to the question statement of the current round of dialogue from the intent and the SQL statement parameters predicted by the encoding layer; the SQL statement is then executed to query the data table and obtain the required information.
For example, intents such as adding a condition, modifying a condition, deleting a condition, modifying a query, modifying the aggregate function or deleting a query indicate that the SQL statement of the previous round of dialogue should be adjusted to obtain the current round's SQL statement. In these cases, the output layer adjusts the previous round's SQL statement using the predicted parameters to generate the SQL statement corresponding to the question statement.
Specifically, when the intent is to add a condition, the parameters of the SQL statement include, for example, a filter condition number parameter (e.g., 2), a filter condition target column parameter (e.g., col_5), a filter condition operator (e.g., <), a filter condition value (e.g., value1) and a filter connector (e.g., and). A new filter condition col_5 < value1 is then added to the previous round's SQL statement, the relationship between the new condition and the existing condition of the original SQL statement is "and", and the number of filter conditions in the generated current-round SQL statement is 2.
When the intent is to modify a condition, the parameters of the SQL statement may include a target column parameter (for example, col_2); parameters that do not need to change are not repeated here. The target column of the filter condition in the previous round's SQL statement is then modified to col_2, generating the current round's SQL statement. When there are several filter conditions, the target column of each condition may be modified in the order of the output target column parameters. For example, with three filter conditions whose target columns in the previous round's SQL statement are col_3, col_5 and col_6, if the target column parameters output by the encoding layer are col_2, col_5 and col_6, this indicates that col_3 should be modified to col_2.
When the intent is to delete a condition, the parameters of the SQL statement may either identify the filter conditions that need to be deleted or identify the filter conditions that need to be retained; the output layer then only needs to delete the unneeded filter conditions from the previous round's SQL statement according to the intent.
When the intent is to modify a query, the parameters of the SQL statement contain the column of the query to be modified. For example, if the queried column is changed from "product name" to "yield rate", the output layer modifies the previous round's SQL statement based on these parameters to obtain the current round's SQL statement.
When the intent is to modify the aggregate function, the parameters of the SQL statement include the replacement aggregate function, for example modifying max() to min(); the modification process is similar to the above and is not repeated here.
When the intent is to delete a query, the parameters of the SQL statement include the parameters of the query to be deleted; the modification process is similar to the above and is not repeated here.
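A hypothetical sketch of how an output layer might apply these "adjust" intents to the previous round's SQL statement is shown below, reusing the illustrative SQLSketch structure from the earlier example. The intent labels, field names and the "retain" convention for deletion are assumptions, not the patented logic.

```python
import copy

def adjust_previous_sql(prev: SQLSketch, intent: str, params: SQLSketch) -> SQLSketch:
    """Apply an 'adjust' intent to the previous round's SQL sketch (illustrative)."""
    cur = copy.deepcopy(prev)
    if intent == "add_condition":
        cur.where_columns   += params.where_columns
        cur.where_operators += params.where_operators
        cur.where_values    += params.where_values
        cur.where_num = len(cur.where_columns)
        cur.connector = params.connector
    elif intent == "modify_condition":
        cur.where_columns = params.where_columns           # replaced in output order
    elif intent == "delete_condition":
        keep = set(params.where_columns)                   # conditions to retain (one possible convention)
        kept = [i for i, c in enumerate(cur.where_columns) if c in keep]
        cur.where_columns   = [cur.where_columns[i] for i in kept]
        cur.where_operators = [cur.where_operators[i] for i in kept]
        cur.where_values    = [cur.where_values[i] for i in kept]
        cur.where_num = len(kept)
    elif intent == "modify_query":
        cur.select_columns = params.select_columns
    elif intent == "modify_aggregation":
        cur.aggregations = params.aggregations
    elif intent == "delete_query":
        drop = set(params.select_columns)
        cur.select_columns = [c for c in cur.select_columns if c not in drop]
        cur.select_num = len(cur.select_columns)
    return cur
```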
It should be noted that one prediction may output one or more of the above intents and their corresponding parameters, which is not limited here.
If the intent in the prediction result indicates starting a new SQL statement, the SQL statement corresponding to the question statement is generated directly from the predicted parameters of the SQL statement.
An intent of starting a new SQL statement indicates that the previous dialogue has concluded and a new topic is being opened, so a new SQL statement can be generated from the parameters of the SQL statement.
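Continuing the same illustrative sketch, a new SQL statement could be assembled from the predicted parameters roughly as follows. The rendering logic is an assumption and deliberately omits details such as table names and multi-table joins.

```python
def render_sql(sketch: SQLSketch) -> str:
    """Assemble an SQL string from the predicted parameters (illustrative)."""
    select_parts = []
    for i, col in enumerate(sketch.select_columns):
        agg = sketch.aggregations[i] if i < len(sketch.aggregations) else ""
        select_parts.append(f"{agg}({col})" if agg else col)
    sql = "select " + ", ".join(select_parts)
    if sketch.where_num > 0:
        conn = f" {sketch.connector or 'and'} "
        conds = [f"{c} {op} {v}" for c, op, v in
                 zip(sketch.where_columns, sketch.where_operators, sketch.where_values)]
        sql += " where " + conn.join(conds)
    return sql

print(render_sql(sketch))  # select max(col_1) where col_2 > 2
```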
If the intent in the prediction result indicates that the data table to be retrieved should not be used to answer the question statement, the intent is a rejection: the data table to be retrieved is not sufficiently relevant to the question statement, so the question statement cannot be answered from it. For example, if the question statement asks for the yield rate of a certain product while the retrieved data table stores information such as the release years of films, the data table and the question statement are unrelated and the intent is a rejection.
The natural language processing model encodes the column information (schema) in the data table to obtain encoded data (schema embedding), and because the column information is encoded, the model is applicable to a multi-table, multi-round TableQA question-answering system. It avoids the problem that splicing the table schema onto the question statement for unified analysis, as is done conventionally, makes the model input very long and seriously affects overall performance. Relative to the conventional model, the number of column names supported by the model in this embodiment increases from 60 columns to more than 300 columns, a 5-fold increase in column-name capacity, and the inference performance of the model increases by 2.3 times, so the model is better suited to multi-table associated query scenarios.
Example II
Referring to fig. 2, a schematic flow chart of steps of a natural language processing method according to a second embodiment of the present application is shown.
The method may use the natural language processing model of the previous embodiment, the method comprising the steps of:
Step S202, compressing the column information of the data table to be retrieved into encoded data of a set length.
For example, if the data table to be retrieved includes 10 columns, the column information of each column may be compressed to obtain encoded data of a set length. The column information includes at least the column name, but is not limited to it and may also include the column unit and the like. In this way, the column information can serve as input to the encoding layer of the natural language processing model while keeping the length of the input data as short as possible, thereby improving inference performance.
Step S204, constructing input data at least from the encoded data containing the column information and the question statement of the current round of dialogue.
In one possible implementation, step S204 may be implemented by constructing the input data from the encoded data, the question statement of the current round of dialogue, the SQL statement of the previous round of dialogue, and the SQL structure data. By including the previous round's SQL statement, the input data carry the context of the historical dialogue in the multi-round dialogue, which improves adaptability to multi-round dialogue scenarios.
By including the SQL structure data (SQL sketch) in the input data, which contain the intent of the question statement and the parameters of the SQL statement, the subsequent encoding layer can directly predict the intent and the SQL statement parameters, and the output layer can obtain them without complex processing of the encoding layer's output, reducing the data processing burden.
Step S206, processing the input data with the natural language processing model to predict the parameters of the SQL statement corresponding to the question statement and the intent of the question statement.
The input data are fed into the trained encoding layer and processed by it, yielding the intent and the SQL statement parameters output by the encoding layer.
Step S208, generating the SQL statement corresponding to the question statement from the predicted parameters of that SQL statement and the intent of the question statement.
The intent of the question statement may include adding a condition, modifying a condition, deleting a condition, modifying a query, modifying the aggregate function, deleting a query, starting a new SQL statement, rejection, and so on. Adding a condition, modifying a condition, deleting a condition, modifying a query, modifying the aggregate function, deleting a query and the like indicate that the corresponding operation should be executed on the SQL statement of the previous round of dialogue to generate the SQL statement corresponding to the current round's question statement.
For example, the SQL statement of the previous round of dialogue is select investment time where product name = A, and the current round's question statement is "what is the yield rate". The corresponding intent is to modify the query, the parameters of the SQL statement include the query column parameter "yield rate", and the adjusted SQL statement of the current round of dialogue is select yield rate where product name = A.
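This adjustment can be traced with the illustrative helpers sketched earlier (SQLSketch, adjust_previous_sql and render_sql). The underscored column identifiers are assumptions standing in for the natural language column names.

```python
# Previous round: "select investment_time where product_name = A"
prev = SQLSketch(select_num=1, select_columns=["investment_time"],
                 where_num=1, where_columns=["product_name"],
                 where_operators=["="], where_values=["A"])

# Hypothetical prediction for the question "what is the yield rate":
intent = "modify_query"
params = SQLSketch(select_columns=["yield_rate"])

current = adjust_previous_sql(prev, intent, params)
print(render_sql(current))  # select yield_rate where product_name = A
```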
With this method, the user only needs to express an intent through a natural language question statement; NL2SQL (Natural Language to SQL) converts the question statement into a structured query statement in SQL, achieving an intelligent response. Because the column information of each column in the data table is compressed into encoded data used as input, the length of the input data is reduced, the inference performance of the natural language processing model is improved, and data tables with more columns can be accommodated.
The method was evaluated on a general industry data set; its overall effect is essentially consistent with a sketch-based baseline model while its inference performance is 2.3 times higher. Specifically, the test-set accuracy of the sketch-based model is 86.6%; with max-pooling of the column names it is 85.4%; with avg-pooling of the column names it is 85.8%; and two configurations combining avg-pooling of the column names with column value type encoding reach 86.1% and 86.3%, respectively.
The method can support multi-table, multi-round database question-answering scenarios. It avoids the problem that the sketch-based model, which splices the column information of the data table onto the question statement for unified analysis, produces very long input data that seriously affect overall performance. By compressing the column information into encoded data, the length of the input data is effectively reduced; relative to the sketch-based model, the number of supported columns increases from 60 to more than 300, a 5-fold increase in column-name capacity, and the inference performance increases by 2.3 times, so the method is better suited to multi-table associated query scenarios.
The method of the present embodiment may be performed by any suitable electronic device having data processing capabilities, including, but not limited to, servers, mobile terminals (e.g., cell phones, PADs, etc.), PCs, and the like.
Example III
Referring to fig. 3, a block diagram of a natural language processing device according to a third embodiment of the present application is shown.
In this embodiment, the apparatus includes:
a compression module 302, configured to compress column information of a data table to be retrieved into encoded data with a set length;
a construction module 304, configured to construct input data at least according to the encoded data containing the column information and a question sentence of a current round of dialogue;
A prediction module 306, configured to process the input data using a natural language processing model, so as to predict parameters of an SQL statement corresponding to the problem statement and an intention of the problem statement;
the generating module 308 is configured to generate an SQL statement corresponding to the question statement according to the predicted parameters of the SQL statement corresponding to the question statement and the intent of the question statement.
Optionally, the building module 304 is configured to build the input data according to the encoded data, the question statement of the current dialog, the SQL statement of the previous dialog, and the SQL structure data.
The device of the present embodiment is configured to implement the corresponding method in the foregoing multiple method embodiments, and has the beneficial effects of the corresponding method embodiments, which are not described herein again. In addition, the functional implementation of each module in the apparatus of this embodiment may refer to the description of the corresponding portion in the foregoing method embodiment, which is not repeated herein.
Example IV
Referring to fig. 4, a schematic structural diagram of an electronic device according to a fourth embodiment of the present application is shown, and the specific embodiment of the present application is not limited to the specific implementation of the electronic device.
As shown in FIG. 4, the electronic device may include a processor 402, a communication interface (Communications Interface) 404, a memory 406, and a communication bus 408.
Wherein:
Processor 402, communication interface 404, and memory 406 communicate with each other via communication bus 408.
A communication interface 404 for communicating with other electronic devices or servers.
Processor 402, for executing program 410, may specifically perform the relevant steps in the method embodiments described above.
In particular, program 410 may include program code including computer-operating instructions.
Processor 402 may be a central processing unit (CPU), an application-specific integrated circuit (ASIC), or one or more integrated circuits configured to implement embodiments of the present application. The one or more processors included in the smart device may be processors of the same type, such as one or more CPUs, or processors of different types, such as one or more CPUs and one or more ASICs.
Memory 406 is used to store the program 410. Memory 406 may comprise high-speed RAM memory and may also include non-volatile memory, such as at least one disk memory.
Program 410 may be specifically configured to cause processor 402 to perform operations corresponding to the methods described above.
The specific implementation of each step in the procedure 410 may refer to the corresponding step and corresponding description in the unit in the above method embodiment, which is not repeated herein. It will be clear to those skilled in the art that, for convenience and brevity of description, specific working procedures of the apparatus and modules described above may refer to corresponding procedure descriptions in the foregoing method embodiments, which are not repeated herein.
Embodiments of the present application also provide a computer program product comprising computer instructions that instruct a computing device to perform operations corresponding to any one of the above-described method embodiments.
It should be noted that, according to implementation requirements, each component/step described in the embodiments of the present application may be split into more components/steps, or two or more components/steps or part of operations of the components/steps may be combined into new components/steps, so as to achieve the objects of the embodiments of the present application.
The above-described methods according to embodiments of the present application may be implemented in hardware or firmware, or as software or computer code that can be stored in a recording medium such as a CD-ROM, RAM, floppy disk, hard disk or magneto-optical disk, or as computer code originally stored in a remote recording medium or a non-transitory machine-readable medium and downloaded over a network to be stored in a local recording medium, so that the methods described herein can be processed by such software stored on a recording medium using a general-purpose computer, a special-purpose processor, or programmable or special-purpose hardware such as an ASIC or FPGA. It will be understood that a computer, processor, microprocessor controller or programmable hardware includes a storage component (e.g., RAM, ROM, flash memory, etc.) that can store or receive software or computer code that, when accessed and executed by the computer, processor or hardware, implements the methods described herein. Furthermore, when a general-purpose computer accesses code for implementing the methods shown herein, execution of the code converts the general-purpose computer into a special-purpose computer for performing the methods shown herein.
Those of ordinary skill in the art will appreciate that the elements and method steps of the examples described in connection with the embodiments disclosed herein can be implemented as electronic hardware, or as a combination of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the embodiments of the present application.
The above embodiments are only for illustrating the embodiments of the present application, but not for limiting the embodiments of the present application, and various changes and modifications may be made by one skilled in the relevant art without departing from the spirit and scope of the embodiments of the present application, so that all equivalent technical solutions also fall within the scope of the embodiments of the present application, and the scope of the embodiments of the present application should be defined by the claims.
Claims (11)
Priority Applications (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202210606848.3A CN115017283B (en) | 2022-05-31 | 2022-05-31 | Natural language processing model, method, electronic device and computer storage medium |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202210606848.3A CN115017283B (en) | 2022-05-31 | 2022-05-31 | Natural language processing model, method, electronic device and computer storage medium |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN115017283A CN115017283A (en) | 2022-09-06 |
| CN115017283B (en) | 2025-01-28 |
Family
ID=83071405
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202210606848.3A Active CN115017283B (en) | 2022-05-31 | 2022-05-31 | Natural language processing model, method, electronic device and computer storage medium |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN115017283B (en) |
Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101334778A (en) * | 2007-06-29 | 2008-12-31 | 国际商业机器公司 | Management database connecting method and system |
| CN107943995A (en) * | 2017-09-22 | 2018-04-20 | 国网重庆市电力公司电力科学研究院 | A kind of SQL query result row name and coding automatic switching method |
Family Cites Families (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN109413028B (en) * | 2018-08-29 | 2021-11-30 | 集美大学 | SQL injection detection method based on convolutional neural network algorithm |
| CN111159223B (en) * | 2019-12-31 | 2021-09-03 | 武汉大学 | Interactive code searching method and device based on structured embedding |
| CN113609158A (en) * | 2021-08-12 | 2021-11-05 | 国家电网有限公司大数据中心 | SQL statement generation method, device, equipment and medium |
- 2022-05-31: CN application CN202210606848.3A filed; granted as CN115017283B (Active)
Patent Citations (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN101334778A (en) * | 2007-06-29 | 2008-12-31 | 国际商业机器公司 | Management database connecting method and system |
| CN107943995A (en) * | 2017-09-22 | 2018-04-20 | 国网重庆市电力公司电力科学研究院 | A kind of SQL query result row name and coding automatic switching method |
Also Published As
| Publication number | Publication date |
|---|---|
| CN115017283A (en) | 2022-09-06 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| CN112819153B (en) | Model transformation method and device | |
| EP4152224A1 (en) | Machine learning application method, device, electronic apparatus, and storage medium | |
| CN110727659B (en) | Method, device, equipment and medium for generating decision tree model based on SQL statement | |
| CN117591659A (en) | Information processing methods, devices, equipment and media based on ChatGLM operation and maintenance scenarios | |
| US20250061886A1 (en) | Method and apparatus for training speech synthesis model, device, storage medium and program product | |
| WO2025055528A1 (en) | Speech processing method and apparatus, and device and storage medium | |
| CN110597847A (en) | SQL statement automatic generation method, device, equipment and readable storage medium | |
| CN117132168A (en) | Index rule generation method and device, electronic equipment and storage medium | |
| JP2024175030A (en) | Information processing method, device, electronic device, and agent based on artificial intelligence | |
| CN117170837A (en) | Task processing method, task processing system and reasoning method | |
| CN119759481A (en) | Performance tuning method, device, cloud server, storage medium, and program product | |
| CN115017283B (en) | Natural language processing model, method, electronic device and computer storage medium | |
| CN117079651B (en) | Speech cross real-time enhancement implementation method based on large-scale language model | |
| CN119646238A (en) | Document generation method, device, electronic device, storage medium and program | |
| CN113449067B (en) | Data query method, device, equipment and medium | |
| CN118506131A (en) | Image generation method, device, electronic device and storage medium | |
| CN117972038A (en) | Intelligent question-answering method, device and computer readable medium | |
| WO2025081585A1 (en) | Gpt model-based intelligent device interaction methods, apparatuses and system | |
| CN120045646A (en) | Text processing method, text processing device, computer equipment, storage medium and program product | |
| CN115062050B (en) | Database query result generation method, device, equipment and storage medium | |
| CN114064125A (en) | Instruction analysis method and device and electronic equipment | |
| CN118627556B (en) | A large language model quantization method and system based on FP8 | |
| CN114444658B (en) | Deep learning model reasoning method, system, equipment and computer medium | |
| CN112562679B (en) | Offline voice interaction method, device and medium | |
| CN118413714A (en) | Data processing method, device, electronic equipment and storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | ||
| SE01 | Entry into force of request for substantive examination | ||
| GR01 | Patent grant | ||