
CN109885710B - User image depicting method based on differential evolution algorithm and server - Google Patents


Info

Publication number
CN109885710B
CN109885710B (granted publication); application CN201910031695.2A
Authority
CN
China
Prior art keywords
user
degree
combination
label
feedback information
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201910031695.2A
Other languages
Chinese (zh)
Other versions
CN109885710A (en)
Inventor
姜翔 (Jiang Xiang)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Ping An Technology Shenzhen Co Ltd
Original Assignee
Ping An Technology Shenzhen Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Ping An Technology Shenzhen Co Ltd filed Critical Ping An Technology Shenzhen Co Ltd
Priority to CN201910031695.2A priority Critical patent/CN109885710B/en
Publication of CN109885710A publication Critical patent/CN109885710A/en
Application granted granted Critical
Publication of CN109885710B publication Critical patent/CN109885710B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Landscapes

  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)
  • Management, Administration, Business Operations System, And Electronic Commerce (AREA)

Abstract

The invention provides a user portrait depicting method comprising the following steps: assigning original label combinations to users who share the same behavior; obtaining the users' feedback information on the original label combinations; calculating the conformity degree of each original label combination; generating new label combinations from the original label combinations with a differential evolution algorithm; obtaining the users' feedback information on the new label combinations; calculating the conformity degree of each new label combination; and tagging the users with a new label combination when its conformity degree is higher than those of the preset label combinations and higher than a preset value. The invention can automatically tag users with the most suitable label combination, improves the accuracy of user labels, and helps enterprises quickly identify accurate user groups and user requirements.

Description

User image depicting method based on differential evolution algorithm and server
Technical Field
The embodiment of the invention relates to the field of big data, in particular to a user image depicting method based on a differential evolution algorithm, a server and a computer readable storage medium.
Background
With the development of internet big data, every behavior of consumers seems to become "visible" to enterprises. The focus of enterprises has accordingly shifted to how to use big data to serve users with precision. This is how the "user portrait" came about.
A user portrait is the tagging of user information: after collecting and analyzing data on a user's social attributes, living habits, behaviors, and other key information, a complete abstract picture of the user is formed. User portraits provide enterprises with a sufficient information foundation and can help them quickly identify accurate user groups and user requirements.
An existing differential evolution algorithm can make a population evolve continuously, retaining good individuals and eliminating poor ones, so that the search is guided toward an approximation of the optimal solution. As differential evolution technology has matured, its range of application has grown ever wider.
Accordingly, the problem the invention aims to solve is how to depict user portraits by means of a differential evolution algorithm.
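As background for the retain-good, eliminate-poor evolution described above, the following is a minimal sketch of the classic DE/rand/1/bin scheme; the objective function, bounds, and parameter values are illustrative only and are not taken from the patent:

```python
import random

def differential_evolution(fitness, dim, bounds, pop_size=20, f=0.5, cr=0.9, gens=100):
    """Classic DE/rand/1/bin: mutate with a scaled difference vector,
    apply binomial crossover, then keep the better of trial and target."""
    lo, hi = bounds
    pop = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(gens):
        for i in range(pop_size):
            # Three distinct individuals other than the target:
            a, b, c = random.sample([p for j, p in enumerate(pop) if j != i], 3)
            j_rand = random.randrange(dim)  # force at least one mutated gene
            trial = [min(hi, max(lo, a[k] + f * (b[k] - c[k])))
                     if (random.random() < cr or k == j_rand) else pop[i][k]
                     for k in range(dim)]
            if fitness(trial) <= fitness(pop[i]):  # greedy selection
                pop[i] = trial
    return min(pop, key=fitness)

# Minimize the sphere function as a toy objective:
best = differential_evolution(lambda x: sum(v * v for v in x), dim=3, bounds=(-5, 5))
```

The greedy one-to-one selection is what "retains good individuals and eliminates poor individuals": a trial vector only replaces its target when it is at least as fit.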
Disclosure of Invention
In view of the above, there is a need to provide a user image depicting method based on a differential evolution algorithm, a server, a computer device and a computer readable storage medium, which can automatically draw a most suitable label combination for a user, improve the accuracy of a user label, and further help an enterprise to quickly find an accurate user group and user requirements.
In order to achieve the above object, an embodiment of the present invention provides a user image depicting method based on a differential evolution algorithm, where the method includes:
acquiring a plurality of behaviors of a plurality of users;
identifying the plurality of behaviors;
according to the recognition result, a first user and a second user with a first behavior in the behaviors are respectively marked with a preset first label combination and a preset second label combination;
sending the first tag combination to the first user and sending the second tag combination to the second user;
acquiring first feedback information of the first user on the first label combination and second feedback information of the second user on the second label combination;
respectively calculating a first coincidence degree of the first label combination and a second coincidence degree of the second label combination according to the first feedback information and the second feedback information;
generating a third label combination and a fourth label combination by using a differential evolution algorithm according to the first label combination and the second label combination;
acquiring third feedback information of the first user on the third label combination and fourth feedback information of the second user on the fourth label combination;
respectively calculating a third coincidence degree of the third label combination and a fourth coincidence degree of the fourth label combination according to the third feedback information and the fourth feedback information;
comparing the first conformity degree, the second conformity degree, the third conformity degree and the fourth conformity degree; and
and when the fourth conformity degree is higher than the first conformity degree, the second conformity degree and the third conformity degree, and is higher than a preset value stored in a database, tagging the first user and the second user with the fourth label combination.
Further, before the step of printing a preset first tag combination and a preset second tag combination on a first user and a second user having a first behavior in the behaviors according to the recognition result, the method further includes: and judging whether the same behavior exists in the plurality of behaviors according to the identification result.
Further, the step of calculating a first coincidence degree of the first label combination and a second coincidence degree of the second label combination according to the first feedback information and the second feedback information respectively further includes:
counting the first feedback information and the second feedback information;
calculating a first coincidence quantity and a first user quantity of the first user to the first label combination according to the first feedback information;
and calculating a second coincidence quantity and a second user quantity of the second user to the second label combination according to the second feedback information.
Further, the calculation formulas of the first conformity degree, the second conformity degree, the third conformity degree and the fourth conformity degree are as follows:
K_i = N_i / U
where K_i denotes the i-th conformity degree, N_i denotes the i-th conformity quantity, and U denotes the total number of first users and second users.
Further, the step of generating a third tag combination and a fourth tag combination by using a differential evolution algorithm according to the first tag combination and the second tag combination further includes:
recombining the first tag combination and the second tag combination to generate a first tag combination group; and
and screening the third label combination and the fourth label combination with the highest conformity degree from the first label combination group according to the first conformity degree and the second conformity degree.
Further, after the step of comparing the first compliance degree, the second compliance degree, the third compliance degree and the fourth compliance degree, the method further includes:
when the fourth conformity degree is higher than the second conformity degree, the second conformity degree is higher than the first conformity degree and the third conformity degree, and the fourth conformity degree is lower than the preset value, generating a second tag combination group according to the second tag combination and the fourth tag combination;
and screening a fifth label combination and a sixth label combination with the highest conformity degree from the second label combination group according to the second conformity degree and the fourth conformity degree.
Further, the first feedback information includes a conforming state of the first user to the first tag combination, the second feedback information includes a conforming state of the second user to the second tag combination, and the conforming state includes conforming and nonconforming.
In order to achieve the above object, an embodiment of the present invention further provides a server, including:
the acquisition module is used for acquiring a plurality of behaviors of a plurality of users;
an identification module to identify the plurality of behaviors;
the tag module is used for respectively printing a preset first tag combination and a preset second tag combination on a first user and a second user with a first behavior in the behaviors according to the identification result;
the sending module is used for sending the first label combination to the first user and sending the second label combination to the second user;
the obtaining module is configured to obtain first feedback information of the first user on the first tag combination and second feedback information of the second user on the second tag combination;
the calculation module is used for calculating a first coincidence degree of the first label combination and a second coincidence degree of the second label combination according to the first feedback information and the second feedback information;
the generating module is used for generating a third label combination and a fourth label combination by using a differential evolution algorithm according to the first label combination and the second label combination;
the obtaining module is configured to obtain third feedback information of the first user on the third tag combination and fourth feedback information of the second user on the fourth tag combination;
the calculation module is configured to calculate a third coincidence degree of the third label combination and a fourth coincidence degree of the fourth label combination according to the third feedback information and the fourth feedback information, respectively;
the comparison module is used for comparing the first coincidence degree, the second coincidence degree, the third coincidence degree and the fourth coincidence degree; and
the label module is configured to put the fourth label combination on the first user and the second user when the fourth compliance degree is higher than the first compliance degree, the second compliance degree, and the third compliance degree, and the fourth compliance degree is higher than a preset value stored in a database.
Further, the identification module is further configured to determine whether the same behavior exists in the plurality of behaviors according to the identification result.
Further, the calculating module is further configured to count the first feedback information and the second feedback information, calculate a first number of matches of the first user to the first tag combination and a first number of users according to the first feedback information, and calculate a second number of matches of the second user to the second tag combination and a second number of users according to the second feedback information.
Further, the calculation formula of the first, second, third and fourth conformity degrees is as follows:
K_i = N_i / U
where K_i denotes the i-th conformity degree, N_i denotes the i-th conformity quantity, and U denotes the total number of first users and second users.
Further, the generating module is further configured to recombine the first tag combination and the second tag combination to generate a first tag combination group, and screen out the third tag combination and the fourth tag combination with the highest matching degree from the first tag combination group according to the first matching degree and the second matching degree.
Further, the comparison module is further configured to, when the fourth compliance degree is higher than the second compliance degree, the second compliance degree is higher than the first compliance degree and the third compliance degree, and the fourth compliance degree is lower than the preset value, generate a second tag combination group according to the second tag combination and the fourth tag combination, and screen out a fifth tag combination and a sixth tag combination with the highest compliance degree from the second tag combination group according to the second compliance degree and the fourth compliance degree.
Further, the first feedback information includes a conforming state of the first user to the first tag combination, the second feedback information includes a conforming state of the second user to the second tag combination, and the conforming state includes conforming and nonconforming.
To achieve the above object, an embodiment of the present invention further provides a computer device comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, where the computer program, when executed by the processor, implements the steps of the user portrait depicting method described above.
To achieve the above object, an embodiment of the present invention further provides a computer-readable storage medium, in which a computer program is stored, where the computer program is executable by at least one processor to cause the at least one processor to execute the steps of the user representation depicting method.
According to the user portrait depicting method, the server, the computer device and the computer-readable storage medium, users with similar behaviors are tagged with label combinations, and the conformity degree of each combination is calculated from the users' feedback on it. New label combinations are then generated from the original ones with a differential evolution algorithm, their conformity degrees are calculated from the users' feedback, and the combination with the highest conformity degree among the original and new combinations is screened out. In this way, users can automatically be tagged with the most suitable label combination, the accuracy of user labels is improved, and enterprises are helped to quickly identify accurate user groups and user requirements.
Drawings
FIG. 1 is a flowchart illustrating a user image depicting method according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of a hardware architecture of a server according to a second embodiment of the present invention.
FIG. 3 is a block diagram of a third embodiment of a system for depicting a user image.
Reference numerals:
Server 2
Memory 21
Processor 22
Network interface 23
User portrait depicting system 20
Acquisition module 201
Identification module 202
Label module 203
Sending module 204
Calculation module 205
Generation module 206
Comparison module 207
The implementation, functional features and advantages of the objects of the present invention will be further explained with reference to the accompanying drawings.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the descriptions involving "first", "second", and so on in the present invention are for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include at least one such feature. In addition, the technical solutions of the various embodiments may be combined with one another, provided that a person of ordinary skill in the art can realize the combination; when technical solutions are contradictory or cannot be realized, the combination should be considered not to exist and falls outside the protection scope of the present invention.
Example one
Referring to FIG. 1, a flowchart illustrating steps of a method for depicting a user portrait according to a first embodiment of the invention is shown. It is to be understood that the flow charts in the embodiments of the present method are not intended to limit the order in which the steps are performed. It should be noted that the present embodiment is exemplarily described with the server 2 as an execution subject. The method comprises the following specific steps:
in step S100, a plurality of behaviors of a plurality of users are acquired.
Step S102, identifying the plurality of behaviors.
In a preferred embodiment, the server 2 obtains information of the user, such as click records, browsing records, and search records, through a plurality of user terminals to obtain behaviors of the plurality of users, identifies a plurality of behaviors of the plurality of users, and determines whether the same behavior exists in the plurality of behaviors according to the identification result to classify the behaviors of the users.
Specifically, when the user uses an online shopping platform, the server 2 obtains the user's click, browse, search, and purchase records, identifies the user's consumption behavior from them, and classifies the user's behavior accordingly. For example, if a user frequently clicks on, browses, and purchases history books, the server 2 identifies the user's consumption behavior as history-related and classifies the user's behavior under "history books".
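The classification step above can be sketched as follows; the record format, category names, and frequency threshold are hypothetical illustrations, not details from the patent:

```python
from collections import Counter

def classify_behavior(records, min_count=3):
    """Classify a user's dominant consumption behavior from activity records.

    `records` is a list of (action, category) pairs, e.g. ("browse", "history books").
    A category is taken as the user's behavior class once it appears at least
    `min_count` times across clicks, browses, searches, and purchases.
    """
    counts = Counter(category for _action, category in records)
    category, count = counts.most_common(1)[0]
    return category if count >= min_count else None

records = [("click", "history books"), ("browse", "history books"),
           ("purchase", "history books"), ("search", "novels")]
dominant = classify_behavior(records)  # "history books"
```

Users whose records yield the same dominant category are then treated as sharing the same behavior in the steps that follow.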
And step S104, according to the identification result, respectively printing a preset first label combination and a preset second label combination on a first user and a second user with a first behavior in the behaviors.
In a preferred embodiment, after the server 2 classifies the users' behaviors, it randomly assigns various label combinations to user behaviors within the same category. For example, user behaviors in the "history books" category may be tagged with labels such as "history", "books", "comics", "geography", "programming", and "clothing".
Step S106, the first label combination is sent to the first user, and the second label combination is sent to the second user.
Step S108, obtaining first feedback information of the first user on the first label combination and second feedback information of the second user on the second label combination.
Step S110, respectively calculating a first coincidence degree of the first label combination and a second coincidence degree of the second label combination according to the first feedback information and the second feedback information. The calculation formula of the first conformity degree is as follows:
K_1 = N_1 / U
The calculation formula of the second conformity degree is as follows:
K_2 = N_2 / U
where K_1 denotes the first conformity degree, N_1 denotes the number of first users who confirm the first label combination, K_2 denotes the second conformity degree, N_2 denotes the second conformity quantity, and U denotes the total number of first users and second users.
In a preferred embodiment, since the label combination is randomly assigned to the user by the server 2, it does not necessarily fit the user. The server 2 therefore sends the label combination to the user's terminal and obtains from the terminal whether the user considers the combination a fit: if so, conforming feedback information is returned to the server 2; otherwise, non-conforming feedback information is returned. The server 2 then counts the users' feedback information and calculates the conformity degree of the label combination for those users.
In another preferred embodiment, after the first feedback information and the second feedback information are obtained, the first feedback information and the second feedback information are counted, a first number of matches and a first number of users of the first user to the first tag combination are calculated according to the first feedback information, and a second number of matches and a second number of users of the second user to the second tag combination are calculated according to the second feedback information. Then, the server 2 calculates the first conformity degree and the second conformity degree according to the statistical result.
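The conformity-degree calculation K_i = N_i / U can be sketched as follows; the feedback values are hypothetical stand-ins for the counted user responses:

```python
def conformity_degree(feedback, total_users):
    """Compute K_i = N_i / U for one label combination.

    `feedback` is one boolean per responding user, where True means the user
    confirmed the label combination fits them (N_i is the count of True
    values); `total_users` is U, the combined number of first and second users.
    """
    n_conform = sum(feedback)
    return n_conform / total_users

# Hypothetical counts: 6 of 10 users confirm the first label combination,
# 4 of 10 confirm the second.
k1 = conformity_degree([True] * 6 + [False] * 4, total_users=10)  # 0.6
k2 = conformity_degree([True] * 4 + [False] * 6, total_users=10)  # 0.4
```

Dividing by the total U rather than by each combination's respondent count keeps the degrees comparable across label combinations, which is what the later comparison step relies on.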
And step S112, generating a third label combination and a fourth label combination by using a differential evolution algorithm according to the first label combination and the second label combination.
In one embodiment, the first tag combination and the second tag combination are recombined to generate a first tag combination group, and then a third tag combination and a fourth tag combination with the highest degree of conformity are selected from the first tag combination group according to the first degree of conformity and the second degree of conformity.
For example, if there are originally 1000 labels, recombination by the differential evolution algorithm can yield 2^1000 label combinations; if there are originally 2 labels, it yields 2^2 label combinations. For instance, suppose the original label combinations for user behaviors in the "history books" category are "history + comics" and "geography + programming". The label combinations recombined by the differential evolution algorithm are "history + geography", "history + programming", "geography + comics" and "comics + programming". If statistics show that the users' conformity degree with the original combination "history + comics" is higher than with the original combination "geography + programming", then the combinations with the highest conformity degree, "history + geography" and "history + programming", are screened out from the recombined combinations; these are the generated third label combination and fourth label combination.
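The recombination-and-screening step for the "history"/"comics" and "geography"/"programming" example can be sketched as follows; the conformity scores are hypothetical stand-ins for measured user feedback:

```python
from itertools import product

def recombine(combo_a, combo_b):
    """Pair each label of one combination with each label of the other,
    producing the recombined label combination group."""
    return [frozenset(pair) for pair in product(combo_a, combo_b)]

def screen_best(group, score, top_n=2):
    """Keep the top_n combinations with the highest conformity degree."""
    return sorted(group, key=score, reverse=True)[:top_n]

group = recombine(("history", "comics"), ("geography", "programming"))
scores = {frozenset({"history", "geography"}): 0.8,    # hypothetical K values
          frozenset({"history", "programming"}): 0.7,
          frozenset({"geography", "comics"}): 0.4,
          frozenset({"comics", "programming"}): 0.3}
third, fourth = screen_best(group, score=lambda c: scores[c])
```

With these scores, `third` is the "history + geography" combination and `fourth` is "history + programming", matching the example's screening result.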
Step S114, obtain third feedback information of the first user on the third tag combination and fourth feedback information of the second user on the fourth tag combination.
Step S116, respectively calculating a third coincidence degree of the third label combination and a fourth coincidence degree of the fourth label combination according to the third feedback information and the fourth feedback information. The calculation formula of the third conformity degree is as follows:
K_3 = N_3 / U
The calculation formula of the fourth conformity degree is as follows:
K_4 = N_4 / U
where K_3 denotes the third conformity degree, N_3 denotes the number of users who confirm the third label combination, K_4 denotes the fourth conformity degree, and N_4 denotes the fourth conformity quantity.
Step S118, comparing the first compliance level, the second compliance level, the third compliance level and the fourth compliance level.
Step S120, when the fourth compliance degree is higher than the first compliance degree, the second compliance degree, and the third compliance degree, and the fourth compliance degree is higher than a preset value stored in a database, the first user and the second user are marked with the fourth label combination.
In a preferred embodiment, the server 2 compares the first label combination and the second label combination of the original label combinations and the generated coincidence degrees of the third label combination and the fourth label combination, and utilizes a differential evolution algorithm to screen out a target label combination which has the highest coincidence degree and is greater than a preset value from the first label combination, the second label combination, the third label combination and the fourth label combination.
In another preferred embodiment, if the fourth conformity degree is higher than the second conformity degree, the second conformity degree is higher than the first and third conformity degrees, and the fourth conformity degree is lower than the preset value, a second label combination group is generated from the second label combination and the fourth label combination according to the differential evolution algorithm, and the label combinations with the highest conformity degree (for example, a fifth label combination and a sixth label combination) are then screened out of the second label combination group according to the second conformity degree of the second label combination and the fourth conformity degree of the fourth label combination.
It should be noted that if any one of the first, second, third and fourth conformity degrees is higher than the preset value, the first user and the second user are tagged with the label combination having the highest conformity degree. If all of them are lower than the preset value, the two label combinations with the highest conformity degrees are selected from among the four, a new label combination group is generated from those two combinations by the differential evolution algorithm, and the combination with the highest conformity degree is then screened out of the new group according to the conformity degrees of the two selected combinations. In the present embodiment, only the case in which the fourth conformity degree is the highest is described as an example.
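The overall selection loop can be sketched as follows; the conformity values come from a hypothetical lookup standing in for real user feedback, and the recombination rule is a deliberately simplified stand-in for the differential evolution step:

```python
def evolve_labels(combos, conformity, threshold, recombine, max_rounds=10):
    """If the best combination's conformity degree reaches the preset
    threshold, return it for tagging users; otherwise recombine the top
    two combinations into a new group and re-evaluate."""
    best = max(combos, key=conformity)
    for _ in range(max_rounds):
        if conformity(best) >= threshold:
            break
        top_two = sorted(combos, key=conformity, reverse=True)[:2]
        combos = recombine(top_two[0], top_two[1])
        best = max(combos, key=conformity)
    return best

# Simplified recombination: all label pairs drawn from the union of two combinations.
def pair_up(a, b):
    labels = sorted(set(a) | set(b))
    return [frozenset({x, y}) for i, x in enumerate(labels) for y in labels[i + 1:]]

scores = {frozenset({"history", "comics"}): 0.5,         # hypothetical K values
          frozenset({"geography", "programming"}): 0.3,
          frozenset({"history", "geography"}): 0.9}
k = lambda c: scores.get(frozenset(c), 0.0)
best = evolve_labels([("history", "comics"), ("geography", "programming")],
                     conformity=k, threshold=0.8, recombine=pair_up)
```

Starting from the two original combinations (0.5 and 0.3, both below the 0.8 threshold), one recombination round surfaces "history + geography" at 0.9, which is then returned as the combination to tag users with.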
The user image depicting method based on the differential evolution algorithm can automatically draw the most appropriate label combination for the user, improves the accuracy of the user label, and further helps enterprises to quickly find accurate user groups and user requirements.
Example two
Referring to fig. 2, a hardware architecture diagram of a server according to a second embodiment of the present invention is shown. The server 2 includes, but is not limited to, a memory 21, a processor 22, and a network interface 23 communicatively coupled to one another via a system bus. FIG. 2 shows only the server 2 with components 21-23; it should be understood that not all of the illustrated components are required, and that more or fewer components may be implemented instead.
The memory 21 includes at least one type of readable storage medium, including a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, and the like. In some embodiments, the memory 21 may be an internal storage unit of the server 2, such as a hard disk or memory of the server 2. In other embodiments, the memory 21 may also be an external storage device of the server 2, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) card, or a Flash Card provided on the server 2. Of course, the memory 21 may also include both an internal storage unit of the server 2 and an external storage device thereof. In this embodiment, the memory 21 is generally used to store the operating system installed on the server 2 and various application software, such as the program code of the user portrait depicting system 20. Furthermore, the memory 21 may also be used to temporarily store various types of data that have been output or are to be output.
The processor 22 may, in some embodiments, be a Central Processing Unit (CPU), a controller, a microcontroller, a microprocessor, or another data processing chip. The processor 22 is typically used to control the overall operation of the server 2. In this embodiment, the processor 22 is configured to run program code stored in the memory 21 or to process data, for example, to run the user portrait depicting system 20.
The network interface 23 may comprise a wireless network interface or a wired network interface and is generally used to establish communication connections between the server 2 and other electronic devices. For example, the network interface 23 is used to connect the server 2 to an external terminal via a network and to establish a data transmission channel and a communication connection between the server 2 and the external terminal. The network may be a wireless or wired network such as an intranet, the Internet, a Global System for Mobile Communications (GSM) network, Wideband Code Division Multiple Access (WCDMA), a 4G network, a 5G network, Bluetooth, or Wi-Fi.
EXAMPLE III
Referring to FIG. 3, a schematic diagram of the program modules of a user portrait depicting system according to a third embodiment of the invention is shown. In this embodiment, the user portrait depicting system 20 may include or be divided into one or more program modules, which are stored in a storage medium and executed by one or more processors, to complete the present invention and implement the user portrait depicting method described above. The program modules referred to in the embodiments of the present invention are a series of computer program instruction segments capable of performing specific functions, and are better suited than the program itself to describing the execution of the user portrait depicting system 20 in the storage medium. The following description specifically introduces the functions of the program modules of the present embodiment:
the acquiring module 201 is configured to acquire a plurality of behaviors of a plurality of users.
An identifying module 202 configured to identify the plurality of behaviors.
In a preferred embodiment, the obtaining module 201 obtains user information such as click records, browsing records, and search records from a plurality of user terminals so as to obtain the behaviors of the plurality of users. The identifying module 202 then identifies the plurality of behaviors and, according to the identification result, determines whether the same behavior exists among them so as to classify the users' behaviors.
Specifically, when a user uses an online shopping platform such as Taobao, the obtaining module 201 obtains the user's click, browsing, search, and purchase records so as to identify the user's consumption behavior, and classifies the user's behavior accordingly. For example, if a user frequently clicks on, browses, and purchases history books, the identifying module 202 identifies the user's consumption behavior as relating to history books and classifies the user's behavior into the "history books" category.
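As a rough sketch (not part of the patent, which does not specify a concrete classification rule), the behavior-classification step above could assign each user to the category that dominates their records; the function and field names here are hypothetical:

```python
from collections import Counter

def classify_behavior(records):
    # Assign the user to the behavior category that appears most often
    # in their click/browse/search/purchase records.
    categories = Counter(r["category"] for r in records)
    return categories.most_common(1)[0][0]

records = [
    {"action": "click",    "category": "history books"},
    {"action": "browse",   "category": "history books"},
    {"action": "purchase", "category": "history books"},
    {"action": "click",    "category": "clothing"},
]
classify_behavior(records)  # -> "history books"
```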
The label module 203 is configured to tag a first user and a second user having a first behavior among the behaviors with a preset first label combination and a preset second label combination, respectively, according to the identification result.
In a preferred embodiment, after the identifying module 202 classifies the users' behaviors, the label module 203 randomly tags users within the same behavior category; for example, users in the "history books" category may be tagged with label combinations drawn from "history", "books", "comics", "geography", "programming", and "clothing".
A sending module 204, configured to send the first tag combination to the first user, and send the second tag combination to the second user.
The obtaining module 201 is configured to obtain first feedback information of the first user on the first tag combination and second feedback information of the second user on the second tag combination.
A calculating module 205, configured to calculate a first coincidence degree of the first label combination and a second coincidence degree of the second label combination according to the first feedback information and the second feedback information, respectively. The calculation formula of the first conformity degree is as follows:
K1 = N1 / U
the calculation formula of the second conformity degree is as follows:
K2 = N2 / U
wherein K1 represents the first degree of conformity, N1 represents the first conformity quantity (the number of first users whose feedback on the first label combination is "conforms"), K2 represents the second degree of conformity, N2 represents the second conformity quantity, and U represents the total number of the first users and the second users.
In a preferred embodiment, since a label combination is randomly assigned to the user by the server, it does not necessarily conform to the user. The sending module 204 sends the label combination to the user's terminal, and the obtaining module 201 obtains from the terminal whether the user considers the label combination to conform: if so, the terminal sends "conforms" feedback information to the user portrait depicting system 20; otherwise, it sends "does not conform" feedback information. The calculation module 205 then counts the users' feedback information and calculates the degree to which the label combination conforms to the users.
In another preferred embodiment, after the obtaining module 201 obtains the first feedback information and the second feedback information, the calculation module 205 counts them: from the first feedback information it calculates the first conformity quantity (the number of first users confirming the first label combination) and the number of first users, and from the second feedback information it calculates the second conformity quantity and the number of second users. The user portrait depicting system 20 then calculates the first degree of conformity and the second degree of conformity from these statistics.
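The conformity-degree computation just described reduces to a single ratio; a minimal sketch in Python (names are illustrative, not from the patent):

```python
def conformity_degree(n_conform, total_users):
    # K_i = N_i / U: the number of "conforms" responses for label
    # combination i, divided by the total number of first and second users.
    return n_conform / total_users

# Example: 10 surveyed users in total; 6 confirm the first label
# combination and 3 confirm the second.
K1 = conformity_degree(6, 10)  # 0.6
K2 = conformity_degree(3, 10)  # 0.3
```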
A generating module 206, configured to generate a third tag combination and a fourth tag combination by using a differential evolution algorithm according to the first tag combination and the second tag combination.
In an embodiment, the generating module 206 recombines the first label combination and the second label combination to generate a first label combination group, and then selects from the first label combination group, according to the first degree of conformity and the second degree of conformity, the third label combination and the fourth label combination with the highest degrees of conformity.
For example, if there are originally 1000 label combinations, the generating module 206 can recombine them into 2^1000 candidate label combinations through the differential evolution algorithm; if there are originally 2 label combinations, it recombines them into 2^2 candidate label combinations. For instance, suppose the original label combinations for the "history books" behavior category are "history + comics" and "geography + programming", and the recombined label combinations are "history + geography", "history + programming", "comics + geography", and "comics + programming". If the degree of conformity of "history + comics" is higher than that of "geography + programming", the label combinations with the highest degrees of conformity are selected from the recombined set, so that "history + geography" and "history + programming" become the generated third label combination and fourth label combination.
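The recombination step in the example above can be sketched as a simple cross of the two parent combinations; this is a simplified stand-in for the patent's differential-evolution operator, which is not fully specified here, and all names are illustrative:

```python
from itertools import product

def recombine(combo_a, combo_b):
    # Pair each label of one parent with each label of the other,
    # producing the four candidate offspring from the example.
    return [frozenset({a, b}) for a, b in product(combo_a, combo_b)]

offspring = recombine(("history", "comics"), ("geography", "programming"))
# offspring: {history, geography}, {history, programming},
#            {comics, geography}, {comics, programming}
```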
The obtaining module 201 is configured to obtain third feedback information of the first user on the third tag combination and fourth feedback information of the second user on the fourth tag combination.
The calculating module 205 is configured to calculate a third coincidence degree of the third label combination and a fourth coincidence degree of the fourth label combination according to the third feedback information and the fourth feedback information, respectively. The calculation formula of the third conformity degree is as follows:
K3 = N3 / U
the calculation formula of the fourth conformity degree is as follows:
K4 = N4 / U
wherein K3 represents the third degree of conformity, N3 represents the third conformity quantity (the number of users confirming the third label combination), K4 represents the fourth degree of conformity, and N4 represents the fourth conformity quantity.
A comparing module 207, configured to compare the first compliance degree, the second compliance degree, the third compliance degree, and the fourth compliance degree.
The label module 203 is configured to apply the fourth label combination to the first user and the second user when the fourth degree of conformity is higher than the first, second, and third degrees of conformity and is also higher than a preset value stored in a database.
In a preferred embodiment, the comparison module 207 compares the degrees of conformity of the original first and second label combinations with those of the generated third and fourth label combinations, and uses the differential evolution algorithm to screen out, from among the four, a target label combination whose degree of conformity is both the highest and greater than the preset value.
In another preferred embodiment, if the fourth degree of conformity is higher than the second, the second is higher than the first and the third, but the fourth is still lower than the preset value, then the second and fourth label combinations continue to evolve: a second label combination group is generated from them according to the differential evolution algorithm, and the label combinations with the highest degrees of conformity (for example, a fifth label combination and a sixth label combination) are selected from that group according to the second and fourth degrees of conformity.
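The iterate-until-threshold behavior described in this embodiment can be sketched as a loop that keeps the two best-conforming combinations each round and recombines them; here `conformity` stands in for a scoring function backed by real user feedback, and all names are illustrative rather than taken from the patent:

```python
from itertools import product

def recombine(a, b):
    # Simplified cross of two parent label combinations.
    return [frozenset({x, y}) for x, y in product(a, b)]

def evolve_labels(combos, conformity, threshold, max_rounds=10):
    # Each round: rank candidates by degree of conformity, stop if the
    # best one clears the threshold, otherwise recombine the top two.
    best = max(combos, key=conformity)
    for _ in range(max_rounds):
        if conformity(best) > threshold:
            break
        ranked = sorted(set(combos), key=conformity, reverse=True)
        combos = ranked[:2] + recombine(ranked[0], ranked[1])
        best = max(combos, key=conformity)
    return best

# Toy scoring: users strongly confirm combinations containing "history".
score = lambda c: 0.9 if "history" in c else 0.2
best = evolve_labels(
    [frozenset({"geography", "comics"}), frozenset({"history", "programming"})],
    score, threshold=0.5)  # -> frozenset({'history', 'programming'})
```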
The user portrait depicting system provided by the invention can automatically assign the most appropriate label combination to each user, improving the accuracy of user labels and thereby helping enterprises quickly identify precise user groups and user requirements.
The present invention also provides a computer device capable of executing programs, such as a smart phone, a tablet computer, a notebook computer, a desktop computer, a rack server, a blade server, a tower server, or a cabinet server (including an independent server or a server cluster composed of a plurality of servers). The computer device of this embodiment at least includes, but is not limited to, a memory and a processor communicatively coupled to each other via a system bus.
The present embodiment also provides a computer-readable storage medium, such as a flash memory, a hard disk, a multimedia card, a card-type memory (e.g., SD or DX memory), a Random Access Memory (RAM), a Static Random Access Memory (SRAM), a Read-Only Memory (ROM), an Electrically Erasable Programmable Read-Only Memory (EEPROM), a Programmable Read-Only Memory (PROM), a magnetic memory, a magnetic disk, an optical disk, a server, an App application mall, or the like, on which a computer program is stored that implements the corresponding functions when executed by a processor. The computer-readable storage medium of this embodiment is used for storing the user portrait depicting system 20, which, when executed by a processor, implements the user portrait depicting method of the first embodiment.
The above-mentioned serial numbers of the embodiments of the present invention are merely for description and do not represent the merits of the embodiments.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner.
The above description is only a preferred embodiment of the present invention, and not intended to limit the scope of the present invention, and all modifications of equivalent structures and equivalent processes, which are made by using the contents of the present specification and the accompanying drawings, or directly or indirectly applied to other related technical fields, are included in the scope of the present invention.

Claims (10)

1. A user image depicting method based on a differential evolution algorithm is characterized by comprising the following steps:
acquiring a plurality of behaviors of a plurality of users;
identifying the plurality of behaviors;
according to the recognition result, a first user and a second user with a first behavior in the behaviors are respectively marked with a preset first label combination and a preset second label combination;
sending the first tag combination to the first user and sending the second tag combination to the second user;
acquiring first feedback information of the first user on the first label combination and second feedback information of the second user on the second label combination;
respectively calculating a first coincidence degree of the first label combination and a second coincidence degree of the second label combination according to the first feedback information and the second feedback information;
generating a third label combination and a fourth label combination by using a differential evolution algorithm according to the first label combination and the second label combination;
acquiring third feedback information of the first user on the third label combination and fourth feedback information of the second user on the fourth label combination;
respectively calculating a third coincidence degree of the third label combination and a fourth coincidence degree of the fourth label combination according to the third feedback information and the fourth feedback information;
comparing the first conformity degree, the second conformity degree, the third conformity degree and the fourth conformity degree; and
and when the fourth conformity degree is higher than the first conformity degree, the second conformity degree and the third conformity degree and is higher than a preset value stored in a database, the fourth label combination is printed on the first user and the second user.
2. The method as claimed in claim 1, wherein the step of printing a first label combination and a second label combination preset respectively for a first user and a second user having a first behavior among the behaviors according to the recognition result further comprises:
and judging whether the same behavior exists in the plurality of behaviors according to the identification result.
3. The method as claimed in claim 1, wherein the step of calculating a first matching degree of the first tag assembly and a second matching degree of the second tag assembly according to the first feedback information and the second feedback information, respectively, further comprises:
counting the first feedback information and the second feedback information;
calculating a first coincidence quantity and a first user quantity of the first user to the first label combination according to the first feedback information;
and calculating a second coincidence quantity and a second user quantity of the second user to the second label combination according to the second feedback information.
4. The method of claim 3, wherein the first, second, third and fourth degrees of conformity are calculated by:
Ki = Ni / U
wherein Ki represents the ith coincidence degree, Ni represents the ith coincidence quantity, and U represents the total number of the first users and the second users.
5. The method of claim 1, wherein the step of generating a third tag combination and a fourth tag combination using a differential evolution algorithm based on the first tag combination and the second tag combination further comprises:
recombining the first tag combination and the second tag combination to generate a first tag combination group; and
and screening the third label combination and the fourth label combination with the highest conformity degree from the first label combination group according to the first conformity degree and the second conformity degree.
6. The method of claim 1, wherein comparing the first, second, third and fourth degrees of conformity further comprises:
when the fourth conformity degree is higher than the second conformity degree, the second conformity degree is higher than the first conformity degree and the third conformity degree, and the fourth conformity degree is lower than the preset value, generating a second tag combination group according to the second tag combination and the fourth tag combination;
and screening a fifth label combination and a sixth label combination with the highest conformity degree from the second label combination group according to the second conformity degree and the fourth conformity degree.
7. A method as claimed in claim 3, wherein said first feedback information includes a compliance status of said first user with said first tag combination, and said second feedback information includes a compliance status of said second user with said second tag combination, said compliance status including compliance and non-compliance.
8. A server, comprising:
the acquisition module is used for acquiring a plurality of behaviors of a plurality of users;
an identification module to identify the plurality of behaviors;
the tag module is used for respectively printing a preset first tag combination and a preset second tag combination on a first user and a second user with a first behavior in the behaviors according to the identification result;
the sending module is used for sending the first label combination to the first user and sending the second label combination to the second user;
the obtaining module is configured to obtain first feedback information of the first user on the first tag combination and second feedback information of the second user on the second tag combination;
the calculation module is used for calculating a first coincidence degree of the first label combination and a second coincidence degree of the second label combination according to the first feedback information and the second feedback information;
the generating module is used for generating a third label combination and a fourth label combination by using a differential evolution algorithm according to the first label combination and the second label combination;
the obtaining module is configured to obtain third feedback information of the first user on the third tag combination and fourth feedback information of the second user on the fourth tag combination;
the calculation module is configured to calculate a third coincidence degree of the third label combination and a fourth coincidence degree of the fourth label combination according to the third feedback information and the fourth feedback information, respectively;
the comparison module is used for comparing the first coincidence degree, the second coincidence degree, the third coincidence degree and the fourth coincidence degree; and
the label module is configured to put the fourth label combination on the first user and the second user when the fourth compliance degree is higher than the first compliance degree, the second compliance degree, and the third compliance degree, and the fourth compliance degree is higher than a preset value stored in a database.
9. A computer device, characterized by comprising a memory, a processor, and a computer program stored on the memory and executable on the processor, wherein the computer program, when executed by the processor, implements the steps of the user image depicting method according to any one of claims 1-7.
10. A computer-readable storage medium, having stored thereon a computer program executable by at least one processor to cause the at least one processor to perform the steps of the user image depicting method according to any one of claims 1-7.
CN201910031695.2A 2019-01-14 2019-01-14 User image depicting method based on differential evolution algorithm and server Active CN109885710B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910031695.2A CN109885710B (en) 2019-01-14 2019-01-14 User image depicting method based on differential evolution algorithm and server

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910031695.2A CN109885710B (en) 2019-01-14 2019-01-14 User image depicting method based on differential evolution algorithm and server

Publications (2)

Publication Number Publication Date
CN109885710A CN109885710A (en) 2019-06-14
CN109885710B true CN109885710B (en) 2022-03-18

Family

ID=66925939

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910031695.2A Active CN109885710B (en) 2019-01-14 2019-01-14 User image depicting method based on differential evolution algorithm and server

Country Status (1)

Country Link
CN (1) CN109885710B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110263126A (en) * 2019-06-20 2019-09-20 维沃移动通信有限公司 A kind of generation method and mobile terminal of user's portrait

Citations (3)

Publication number Priority date Publication date Assignee Title
WO2014195782A2 (en) * 2013-06-03 2014-12-11 Tata Consultancy Services Limited Differential evolution-based feature selection
CN106540448A (en) * 2016-09-30 2017-03-29 浙江大学 The visual analysis method affected on its consuming behavior is exchanged between a kind of game player
CN108512674A (en) * 2017-02-24 2018-09-07 百度在线网络技术(北京)有限公司 Method, apparatus and equipment for output information

Family Cites Families (2)

Publication number Priority date Publication date Assignee Title
US7716240B2 (en) * 2005-12-29 2010-05-11 Nextlabs, Inc. Techniques and system to deploy policies intelligently
US8355940B2 (en) * 2009-03-25 2013-01-15 International Business Machines Corporation Capability and maturity-based SOA governance

Patent Citations (3)

Publication number Priority date Publication date Assignee Title
WO2014195782A2 (en) * 2013-06-03 2014-12-11 Tata Consultancy Services Limited Differential evolution-based feature selection
CN106540448A (en) * 2016-09-30 2017-03-29 浙江大学 The visual analysis method affected on its consuming behavior is exchanged between a kind of game player
CN108512674A (en) * 2017-02-24 2018-09-07 百度在线网络技术(北京)有限公司 Method, apparatus and equipment for output information

Non-Patent Citations (1)

Title
Research on Automatic Optimization of Label Weights in Web Text Classification; Zhong Xudong et al.; Journal of Chinese Computer Systems (《小型微型计算机系统》); 2016-05-31; Vol. 37, No. 05; pp. 890-894 *

Also Published As

Publication number Publication date
CN109885710A (en) 2019-06-14

Similar Documents

Publication Publication Date Title
WO2019024496A1 (en) Enterprise recommendation method and application server
CN107844634A (en) Polynary universal model platform modeling method, electronic equipment and computer-readable recording medium
CN112613917A (en) Information pushing method, device and equipment based on user portrait and storage medium
CN111090807B (en) Knowledge graph-based user identification method and device
CN113032668A (en) Product recommendation method, device and equipment based on user portrait and storage medium
WO2019061664A1 (en) Electronic device, user's internet surfing data-based product recommendation method, and storage medium
CN113254672A (en) Abnormal account identification method, system, equipment and readable storage medium
CN111159183B (en) Report generation method, electronic device and computer readable storage medium
CN113706249B (en) Data recommendation method and device, electronic equipment and storage medium
CN114513578A (en) Outbound method, device, computer equipment and storage medium
CN107944931A (en) Seed user expanding method, electronic equipment and computer-readable recording medium
CN111625567A (en) Data model matching method, device, computer system and readable storage medium
CN114219664A (en) Product recommendation method and device, computer equipment and storage medium
WO2019056496A1 (en) Method for generating picture review probability interval and method for picture review determination
CN111047336A (en) User label pushing method, user label display method, device and computer equipment
CN109885710B (en) User image depicting method based on differential evolution algorithm and server
CN113064984A (en) Intention recognition method and device, electronic equipment and readable storage medium
CN110765118B (en) Data revision method, revision device and readable storage medium
CN113111078A (en) Resource data processing method and device, computer equipment and storage medium
CN113536788A (en) Information processing method, device, storage medium and equipment
CN111414395A (en) Data processing method, system and computer equipment
CN111309993B (en) Enterprise asset data portrayal generation method and system
CN112905191B (en) Data processing method, device, computer readable storage medium and computer equipment
CN111523011B (en) Cold and hot wallet intelligent label system based on block chain technology distributed graph calculation engine
CN119004181B (en) User data portrait generation method and system based on flow

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant