WO2007069361A1 - Information Processing Terminal - Google Patents
- Publication number
- WO2007069361A1 (PCT/JP2006/314521, application JP2006314521W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- call
- emotion
- processing terminal
- information processing
- information
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Ceased
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04M—TELEPHONIC COMMUNICATION
- H04M2250/00—Details of telephonic subscriber devices
- H04M2250/68—Details of telephonic subscriber devices with means for recording information, e.g. telephone number during a conversation
Definitions
- The present invention relates to an information processing terminal capable of identifying the emotion of an e-mail's author from the character data written in the e-mail, and the emotion of a speaker from the voice during a call.
- Recent mobile phones, PDAs (Personal Digital Assistants), and other mobile devices can display a call history, consisting of an incoming call history (items arranged in chronological order by arrival time, each containing at least the time the call arrived and, if the caller's telephone number and identity can be determined, the caller's name) and an outgoing call history (items arranged in chronological order by origination time, each containing at least the time the call was placed and, if the destination telephone number and party can be determined, the destination's name), as well as a mail history, consisting of a mail reception history (items arranged in chronological order by reception time, each containing at least the time the mail was received and, if the sender's mail address and identity can be determined, the sender's name and the mail body) and a mail transmission history (items arranged in chronological order by transmission time, each containing at least the transmission time and, if the recipient's mail address and identity can be determined, the destination's name and the mail body) (for example, Patent Document 1).
- Patent Document 1: Japanese Patent Laid-Open No. 11-275209
- The present invention has been made in view of the above circumstances, and an object thereof is to provide an information processing terminal that can help the user recall the contents of past calls and of past e-mail messages without forcing the user to spend time and effort.
- An information processing terminal of the present invention includes input means for inputting emotion-specifying information consisting of at least character data or voice, emotion-specifying means for identifying an emotion based on the emotion-specifying information input by the input means, and display means for displaying information related to the emotion identified by the emotion-specifying means.
- The information processing terminal of the present invention also includes mail transmission/reception means for transmitting or receiving electronic mail, wherein the input means inputs the character data written in the electronic mail transmitted or received by the mail transmission/reception means, the emotion-specifying means identifies an emotion based on the character data input by the input means, and the display means displays information about the emotion, identified by the emotion-specifying means, that corresponds to the electronic mail transmitted or received by the mail transmission/reception means.
- The information processing terminal also includes one in which the display means displays, for each e-mail in the order in which e-mails were transmitted or received by the mail transmission/reception means, at least the transmission or reception time, the destination or sender, and information about the emotion.
- The information processing terminal of the present invention also includes one in which the display means displays, for each date, information about the emotion corresponding to the e-mail transmitted or received on that date by the mail transmission/reception means.
- The information processing terminal also includes one in which the display means displays, for each person registered in the phone book function, information about the emotion corresponding to e-mail transmitted to or received from that person by the mail transmission/reception means.
- With this configuration, by notifying the mobile phone user of the user's own emotions and of the emotions of the other people with whom the user has communicated, an environment can be provided in which the mobile phone user can more easily recall past communications.
- The information processing terminal also includes one in which the display means displays at least one piece of the information about the emotion identified for each of a plurality of e-mails transmitted or received by the mail transmission/reception means.
- With this configuration, the terminal user can confirm at a glance the emotion identified for each of a plurality of e-mails.
- The information processing terminal of the present invention also includes call means for making a call, wherein the input means inputs the call voice from the call means, the emotion-specifying means identifies an emotion based on the call voice input by the input means, and the display means displays information about the emotion, identified by the emotion-specifying means, of the called party, the caller, or both parties of a call made by the call means.
- The information processing terminal also includes one in which the display means displays, for each call in the order in which calls were made or received, information about the other party and the emotion.
- The information processing terminal also includes one in which the display means displays, for each date, information about the emotion corresponding to calls made or received by the call means on that date.
- The information processing terminal also includes one in which the display means displays, for each person registered in the phone book function, information about the emotion of that person identified from calls made by the call means.
- With this configuration, by notifying the mobile phone user of the user's own emotions and of the emotions of the other people with whom the user has communicated by mobile phone, an environment can be provided in which the mobile phone user can more easily recall past communications.
- The information processing terminal also includes one in which the display means displays at least one piece of the information about the emotion identified for each of a plurality of calls made or received by the call means.
- With this configuration, the terminal user can confirm at a glance the emotions identified for each of a plurality of calls.
- An information processing terminal of the present invention also includes call storage means for storing the call voice from the call means, and call playback means for playing back the call voice stored in the call storage means.
- The information processing terminal also includes one in which the call playback means plays back, among the call voices stored in the call storage means, the call voice from which the emotion information displayed by the display means was identified.
- The information processing terminal also includes one in which the call storage means stores the portion of the call voice from the call means in which the emotion identified by the emotion-specifying means is reflected, and the call playback means plays back at least that portion of the call stored in the call storage means.
- The information processing terminal also includes one in which the call playback means begins playback at a point a predetermined time before the start of that portion.
- FIG. 1 is a configuration diagram of a mobile phone according to a first embodiment of the present invention.
- FIG. 3 shows information stored in the emotion identification device in the mobile phone according to the first embodiment of the present invention.
- FIG. 4 is a processing flow for displaying a call history by the mobile phone according to the first embodiment of the present invention.
- FIG. 5 is a display example of a call history by the mobile phone according to the first embodiment of the present invention.
- Figure 5 (a) shows the outgoing call history.
- Figure 5 (b) shows the incoming call history.
- Fig. 5 (c) shows examples of the marks representing emotions that are displayed in the outgoing or incoming call history.
- FIG. 6 is a display example of a schedule book by the mobile phone according to the first embodiment of the present invention.
- FIG. 7 shows an example of a telephone directory display by the mobile phone according to the first embodiment of the present invention.
- FIG. 8 is an example of sorting by emotion by the mobile phone according to the first embodiment of the present invention.
- FIG. 9 is another configuration diagram of the mobile phone according to the first embodiment of the present invention.
- FIG. 10 shows another example of the information list stored in the emotion identification device in the mobile phone according to the first embodiment of the present invention.
- FIG. 11 is a processing flow for displaying the mail reception history by the mobile phone of the first embodiment of the present invention.
- FIG. 13 shows information stored in the emotion identification device in the mobile phone according to the second embodiment of the present invention.
- FIG. 14 is a processing flow for displaying the call history of the mobile phone according to the second embodiment of the present invention.
- FIG. 15 is a display example of call history by the mobile phone according to the second embodiment of the present invention.
- FIG. 16 is a display example of a schedule book by the mobile phone according to the second embodiment of the present invention.
- FIG. 17 is a display example of a telephone directory by the mobile phone according to the second embodiment of the present invention.
- FIG. 18 is a processing flow for recording voice of a call by the mobile phone according to the second embodiment of the present invention (Example 1).
- FIG. 19 is a processing flow for recording a call voice by the mobile phone according to the second embodiment of the present invention (Example 2).
- FIG. 1 shows a configuration diagram of the mobile phone according to the first embodiment of the present invention.
- The mobile phone according to the first embodiment of the present invention includes a call device 10, an emotion identification device 20, a PIM (Personal Information Manager) application group 30, a display control unit 40, and a display unit 50.
- In addition, a configuration further having the various functions of a recent mobile communication terminal such as a mobile phone or PDA (Personal Digital Assistant) may be used.
- The call device 10 includes a communication unit 101, an audio signal output unit 102, a call speaker 103, and a call microphone 104.
- The communication unit 101 performs mobile wireless communication with a mobile phone base station, and realizes a voice call by transmitting and receiving voice signals between the user of the mobile phone of the present invention and another telephone's user.
- The audio signal output unit 102 outputs to the emotion identification device 20 the voice signal received from the other telephone via the communication unit 101, the voice signal of the mobile phone user picked up by the call microphone 104, or both. The audio signal output unit 102 also outputs the voice received from the other telephone to the call speaker 103, and outputs the mobile phone user's call voice picked up by the call microphone 104 to the communication unit 101.
- The emotion identification device 20 includes an emotion estimation unit 201, an emotion accumulation unit 202, an emotion identification unit 203, and an emotion information storage unit 204.
- The emotion estimation unit 201 estimates the emotions of the other telephone's user and of the mobile phone user who uttered the voice, based on the volume, waveform, and pitch of the voice, or the phonemes, contained as information in the received and transmitted voice signals input from the call device 10 (such an emotion estimation method is proposed in, for example, International Publication No. WO00/62279). For each factor of the emotion information shown in FIG. 2, composed of affection, joy, anger, sadness, and neutral (normal), the emotion estimation unit 201 expresses the degree of emotion as a value of 0, 1, or 2, and outputs each estimate to the emotion accumulation unit 202.
- Note that the emotion estimation unit 201 does not necessarily have to estimate both the emotion of the other telephone's user and that of the mobile phone user. If it is set in advance to estimate the emotion of only one of them, only the corresponding signal, either the received voice signal or the transmitted voice signal, need be input from the audio signal output unit 102 of the call device 10.
- The emotion accumulation unit 202 accumulates the numerical values for each factor input from the emotion estimation unit 201, in correspondence with their input time or order, separately for the received voice signal and the transmitted voice signal.
- When the emotion accumulation unit 202 has received a complete series of per-factor values from the emotion estimation unit 201 (the series here runs from the start of the numerical input from the emotion estimation unit 201 to the emotion accumulation unit 202 to the end of that input), it stores the values for each factor in the series as one set of data (hereinafter, such a set is referred to as data for one call). If the emotion accumulation unit 202 accumulates the values input from the emotion estimation unit 201 separately for the received voice signal and the transmitted voice signal, one set of data for one call is produced for each signal.
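- To make this concrete, the following sketch (not part of the patent text; the record layout and all identifiers are assumptions made for illustration) shows per-factor values on the 0/1/2 scale being accumulated, one estimate per time interval, into a set of data for one call:

```python
from dataclasses import dataclass, field
from typing import Dict, List

FACTORS = ["affection", "joy", "anger", "sadness", "neutral"]

@dataclass
class OneCallData:
    """One set of per-factor values for a single call ("data for one call")."""
    signal_source: str  # "received voice" or "transmitted voice"
    # values[factor] holds the 0/1/2 degrees in the order they were input.
    values: Dict[str, List[int]] = field(
        default_factory=lambda: {f: [] for f in FACTORS})

    def accumulate(self, estimate: Dict[str, int]) -> None:
        # Append the estimate produced at one predetermined time interval.
        for factor in FACTORS:
            self.values[factor].append(estimate.get(factor, 0))

# Two sampling intervals of a received-voice signal.
call = OneCallData(signal_source="received voice")
call.accumulate({"joy": 2, "neutral": 1})
call.accumulate({"joy": 1, "neutral": 1})
print(call.values["joy"])  # [2, 1]
```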
- The emotion identification unit 203 reads the data for one call from the emotion accumulation unit 202, analyzes the numerical values for each factor, identifies the single characteristic factor from the read values, and outputs the emotion represented by that one factor to the emotion information storage unit 204.
- For example, by taking the factor with the largest numerical value in the data for one call as the characteristic emotion, the emotion identification unit 203 can identify the emotion with emphasis on the content that made the strongest impression during the call. Another method is to identify the emotion with emphasis on the content of the entire call, by taking as the characteristic emotion the factor with the largest sum of values from the start of the call to its end in the data for one call. It is also possible to identify the emotion with emphasis on the lingering impression of the conversation, by taking as the characteristic emotion the factor with the largest value immediately before the end of the call in the data for one call.
- When the emotion identification unit 203 identifies one characteristic factor from each of the sets of data for one call for the received voice signal and the transmitted voice signal, it also outputs, for each factor, whether it was identified from the received voice signal or the transmitted voice signal. The three identification strategies are sketched below.
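- A minimal sketch of the three strategies just described (an illustrative sketch only; function and variable names are not from the patent):

```python
from typing import Dict, List

def by_peak(values: Dict[str, List[int]]) -> str:
    # Emphasis on the most strongly impressed content: the factor whose
    # single largest value is greatest.
    return max(values, key=lambda f: max(values[f]))

def by_total(values: Dict[str, List[int]]) -> str:
    # Emphasis on the call as a whole: the factor with the largest sum
    # from the start of the call to its end.
    return max(values, key=lambda f: sum(values[f]))

def by_ending(values: Dict[str, List[int]]) -> str:
    # Emphasis on the lingering impression: the factor with the largest
    # value immediately before the end of the call.
    return max(values, key=lambda f: values[f][-1])

one_call = {"affection": [0, 0, 1], "joy": [2, 1, 0],
            "anger": [0, 0, 0], "sadness": [0, 1, 2], "neutral": [1, 1, 1]}
print(by_peak(one_call), by_total(one_call), by_ending(one_call))
# -> joy joy sadness (ties are broken by insertion order)
```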
- The emotion information storage unit 204 receives the emotion represented by one factor from the emotion identification unit 203, and further receives from the communication unit 101 information about the call made by the communication unit 101 of the call device 10: whether the call was outgoing or incoming, the call start time and call end time, and identification information for the other telephone (for example, the other party's telephone number). The emotion information storage unit 204 stores the emotion input from the emotion identification unit 203 and the various information input from the communication unit 101 in association with each other (when the emotion identification unit 203 identifies one characteristic factor from each of the sets of data for one call for the received and transmitted voice signals, which signal, the received voice signal or the transmitted voice signal, each factor came from is also stored).
- That is, the emotion information storage unit 204 stores both the emotion identified from the received voice signal (i.e., the other party's emotion) and the emotion identified from the transmitted voice signal (i.e., the emotion of the user of the mobile phone of the present invention). When it has been set in advance to estimate the emotion of only one of the other telephone's user and the mobile phone user (data for one call c), the other emotion is not stored (in Fig. 3, this is indicated by a "-" meaning not stored).
- The PIM application group 30 consists of multiple applications for managing personal information and making that information available to the mobile phone user. Examples of PIM applications include a call history application 301 for displaying the call history, a scheduler application 302 for supporting the mobile phone user's schedule management, and a phone book application 303 for registering various personal information.
- The display control unit 40 activates and executes one of the applications in the PIM application group, extracts the data required for the processing of the executed application from the emotion information storage unit 204, and displays various information on the display unit 50. Display examples of the display unit 50 when the call history application, scheduler application, or phone book application is started and executed are described later.
- First, the emotion identification device 20 accepts an input operation from the mobile phone user via operation keys (not shown) provided on the mobile phone of the present invention, and determines whose emotion should be identified: (1) the other party, (2) the mobile phone user himself or herself, or both (step S401).
- When the call device 10 starts a call (step S402, YES), it outputs the received voice signal and the transmitted voice signal to the emotion identification device 20 (step S403).
- The emotion identification device 20 continuously estimates, at predetermined time intervals, the degree of emotion for each factor of the emotion information composed of affection, joy, anger, sadness, and neutral (normal), for the target voice signal among the voice signals input from the call device 10 (step S404).
- When the call ends, the emotion identification device 20 analyzes the numerical values for each factor estimated during the call, identifies one characteristic factor from those values, and stores the emotion represented by that one factor as the emotion of the other party or of the mobile phone user for this call (step S406; at this time, as shown in Fig. 3, information about the call is stored together with the identified emotion).
- When the display control unit 40 accepts an input operation from the mobile phone user via the operation keys provided on the mobile phone of the present invention, it starts the call history application 301 (step S407, YES). Following the program code of the call history application 301, the display control unit 40 reads the emotion identified for each call and the information related to the call stored in the emotion identification device 20, and displays that information on the display unit 50 in a predetermined display format (step S408).
- FIG. 5 shows a display example of a call history by the mobile phone according to the first embodiment of the present invention.
- Fig. 5 (a) is the outgoing call history generated based on the information stored in the emotion identification device 20 shown in Fig. 3, and Fig. 5 (b) is the incoming call history generated based on the same information.
- Fig. 5 (c) is an example of a mark indicating emotion displayed in the outgoing call history or incoming call history.
- In order to display the outgoing call history shown in Fig. 5 (a), the display control unit 40 extracts from the information list stored in the emotion identification device 20 shown in Fig. 3 the data for one call whose item "outgoing/incoming" is "outgoing" (data for one call a applies here) and whose item "signal source" corresponds to the party whose emotion was set to be identified in step S401, that is, "received voice". From that data for one call, it extracts the data of the items "call start time", "callee telephone number", and "emotion", and displays them at the corresponding locations of the call history items "date and time", "name", and "emotion" in Fig. 5 (a).
- Here the other party's emotion is displayed, but the mobile phone user's own emotion may be displayed instead, or both the other party's emotion and the mobile phone user's own emotion may be displayed at the same time. In the latter case, the emotion of the data whose item "signal source" is "transmitted voice" is also extracted from the information list stored in the emotion identification device 20 shown in Fig. 3 and displayed.
- Likewise, in order to display the incoming call history shown in Fig. 5 (b), the display control unit 40 extracts from the information list stored in the emotion identification device 20 shown in Fig. 3 the data for one call whose item "outgoing/incoming" is "incoming" (data for one call b and c apply here) and whose item "signal source" corresponds to the party whose emotion was set to be identified in step S401, that is, "received voice". From that data, it extracts the data of the items "call start time", "callee telephone number", and "emotion", and displays them at the corresponding locations of the incoming call history items "date and time of incoming call", "name", and "emotion" in Fig. 5 (b). Note that the name "Jiro Matsushita" is displayed in the "name" field of the incoming call history in Fig. 5 (b) instead of the telephone number "09000001111" for the same reason as described above. This extraction amounts to filtering the stored list and projecting a few columns, as sketched below.
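- A minimal sketch of that extraction, under assumed record and field names (the values echo the examples in the text):

```python
from typing import Dict, List

# Records mirroring the information list of FIG. 3 (field names assumed).
info_list: List[Dict[str, str]] = [
    {"out_in": "outgoing", "signal_source": "received voice",
     "call_start": "2005-09-02 10:05", "callee_number": "09000000000",
     "emotion": "sorrow"},
    {"out_in": "incoming", "signal_source": "received voice",
     "call_start": "2005-09-09 18:30", "callee_number": "09000001111",
     "emotion": "affection"},
]
phone_book = {"09000000000": "Taro Matsushita",
              "09000001111": "Jiro Matsushita"}

def history_rows(direction: str) -> List[Dict[str, str]]:
    """Keep the one-call data matching the direction and the party chosen
    in step S401 (here the other party, i.e. "received voice"), and show a
    registered name in place of the raw telephone number."""
    rows = []
    for rec in info_list:
        if rec["out_in"] == direction and rec["signal_source"] == "received voice":
            rows.append({"date": rec["call_start"],
                         "name": phone_book.get(rec["callee_number"],
                                                rec["callee_number"]),
                         "emotion": rec["emotion"]})
    return rows

print(history_rows("incoming"))
# [{'date': '2005-09-09 18:30', 'name': 'Jiro Matsushita', 'emotion': 'affection'}]
```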
- The call history is displayed on the mobile phone according to the first embodiment of the present invention as described above.
- FIG. 6 shows a display example of the schedule book by the mobile phone according to the first embodiment of the present invention.
- Fig. 6 (a) is the calendar displayed by the schedule book, Fig. 6 (b) is a display example showing an emotion for each date in the calendar, and Fig. 6 (c) is a display example showing the transition of emotions on a specific date.
- When the display control unit 40 accepts an input operation from the mobile phone user via the operation keys provided on the mobile phone of the present invention, it starts the scheduler application 302 (step S407, YES) and first displays the calendar shown in Fig. 6 (a). Furthermore, when the display control unit 40 receives from the mobile phone user an input operation with the operation keys instructing it to display the mobile phone user's own emotion for each date, it extracts from the information list stored in the emotion identification device 20 shown in Fig. 3 the data for one call for the mobile phone user whose emotion is to be identified, that is, the data whose item "signal source" is "transmitted voice". In the information list stored in the emotion identification device 20 shown in Fig. 3, the emotion of the mobile phone user identified from the call on September 9, 2005 is "joy", and the mobile phone user's own emotion identified from the call on September 10, 2005 is "normal"; the display control unit 40 therefore displays a mark representing joy in the date column for September 9, 2005, and a mark representing a normal feeling for September 10, 2005. The same processing is performed for the other dates, and the emotion for each date is displayed in the calendar as shown in Fig. 6 (b).
- In the information list of Fig. 3, the mobile phone user's emotion was identified from only one call each on September 9, 2005 and September 10, 2005, so a mark representing that single emotion can simply be displayed on each date. However, multiple different emotions may be identified from the calls on a single date in the information list. In such a case, the display control unit 40 displays on the display unit 50, as the overall emotion for that date, the emotion identified from the call with the most recent date and time among the calls on that date, the emotion identified from the call with the longest call time, or the emotion identified the most times. Although a configuration that displays the overall emotion for a single day has been described here, the overall emotion for a predetermined period, such as a week or a month, may also be displayed.
- Alternatively, the emotion identified for each call may be weighted so as to set a degree for that emotion, and the overall emotion may be identified based on the degrees of the emotions.
- For example, the display control unit 40 weights the emotion identified for each call so that the newer the call date and time, or the longer the call time, the larger the numerical value representing the degree of that emotion, and displays as the overall emotion the emotion whose degree is the largest value exceeding a threshold. The threshold is calculated by the formula shown in Equation 1.
- In Equation 1, the values of "joy weight", "anger weight", "sadness weight", and "affection weight" are made larger for emotions that are rarely expressed in daily life (such as "affection") and, conversely, smaller for emotions that are readily expressed, while the value of "threshold weight" is the initial value of the threshold. For example, by increasing the "affection weight", the threshold calculated when "affection" is included among the emotions identified from multiple calls becomes larger; as a result, the degree of "affection" easily exceeds this threshold while the degrees of emotions other than "affection" do not, so "affection" is readily identified as the overall emotion. By setting the threshold in this way, even for a partner for whom joy or anger is identified call after call, having affection identified in a few calls allows the overall emotion to be expressed as affection, which reduces emotional bias.
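- Equation 1 itself is not reproduced in this text, so the sketch below assumes one plausible form consistent with the description: the threshold starts from an initial value ("threshold weight") and grows with the weights of the emotions actually identified, while each emotion's degree is boosted by its own weight. All constants and names are illustrative assumptions:

```python
# Weights are large for emotions rarely expressed in daily life.
WEIGHTS = {"joy": 1.0, "anger": 1.0, "sadness": 1.0, "affection": 3.0}
THRESHOLD_WEIGHT = 4.0  # initial value of the threshold (assumed)

def overall_emotion(degrees: dict) -> str:
    """degrees: per-emotion degree summed over the calls of one day/period."""
    present = [e for e, d in degrees.items() if d > 0]
    # Assumed stand-in for Equation 1: base value plus the weights of the
    # emotions that actually appeared among the identified calls.
    threshold = THRESHOLD_WEIGHT + sum(WEIGHTS[e] for e in present)
    best, best_score = "normal", threshold
    for emotion, degree in degrees.items():
        score = degree * WEIGHTS[emotion]  # weighted degree (assumed)
        if score > best_score:
            best, best_score = emotion, score
    return best  # "normal" when nothing exceeds the threshold

# Joy dominates call by call, yet a little affection still wins overall.
print(overall_emotion({"joy": 5.0, "anger": 1.0, "sadness": 0.0,
                       "affection": 4.0}))
# -> "affection": its boosted degree (12.0) clears the threshold (9.0)
```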
- When the display control unit 40 receives from the mobile phone user an input operation with the operation keys selecting a specific date (in Fig. 6 (b), September 2 is selected) and instructing it to display the mobile phone user's emotion on that date, it extracts from the information list stored in the emotion identification device 20 shown in Fig. 3 the data for one call whose item "call start time" falls on that specific date and which belongs to the mobile phone user whose emotion is to be identified, that is, whose item "signal source" is "transmitted voice".
- The display control unit 40 refers to the items "call start time" and "emotion" in the extracted data for one call and displays, for example, the mobile phone user's own emotions on September 2 in chronological order, as shown in Fig. 6 (c).
- In the above description, the display control unit 40 displays marks representing the mobile phone user's own emotion as shown in Figs. 6 (b) and 6 (c), but another person's emotion may be displayed instead. In that case, when the display control unit 40 receives from the mobile phone user an input operation with the operation keys instructing it to display another person's emotion for each date (the other person being specified by a telephone number entered at the time of the input operation, or by a telephone number registered in the phone book if the person whose emotion is to be identified is already registered there), it extracts from the information list stored in the emotion identification device 20 the other person's emotions, that is, the data for one call whose item "signal source" is "received voice" and whose item "callee telephone number" matches the entered telephone number.
- FIG. 7 shows a display example of the telephone directory by the mobile phone according to the first embodiment of the present invention.
- Fig. 7 (a) is a display example of the name and telephone number of each person registered in the phone book, Fig. 7 (b) is a display example of the emotion for each individual registered in the phone book, and Fig. 7 (c) is a display example of an individual's emotional call status.
- When the display control unit 40 accepts an input operation from the mobile phone user via the operation keys provided on the mobile phone of the present invention, it starts the phone book application 303 (step S407, YES) and displays, for example, as shown in Fig. 7 (a), the names of the registered persons beginning with the 'a' row of the syllabary, together with each individual's name and telephone number.
- When the display control unit 40 receives from the mobile phone user an input operation with the operation keys instructing it to display the emotions of the individuals being displayed, it extracts from the information list stored in the emotion identification device 20 shown in Fig. 3 the data for one call for the persons whose emotions are to be identified, that is, the data whose item "signal source" is "received voice" and whose item "callee telephone number" matches a telephone number registered for one of those individuals. In the information list stored in the emotion identification device 20 shown in Fig. 3, the telephone number "09000000000" matches the telephone number of the displayed name "Taro Matsushita" and the telephone number "09000001111" matches that of the displayed name "Jiro Matsushita", so the corresponding data for one call are extracted.
- The display control unit 40 then displays the marks representing the extracted emotions "sorrow" and "affection" in the corresponding "Taro Matsushita" and "Jiro Matsushita" fields.
- The same processing is performed for every individual registered in the phone book, and, as shown in Fig. 7 (b), the emotion obtained from past calls with each individual is displayed.
- Note that, among the emotions of a given individual, the display control unit 40 can also display on the display unit 50, as the overall emotion of that individual, the emotion identified from the call with the most recent date and time, the emotion identified from the call with the longest call time, or the emotion identified the most times.
- The display control unit 40 may also display the frequency of appearance of the emotions identified from multiple calls with an individual, as in the display example of an individual's emotional call status in Fig. 7 (c). In Fig. 7 (c), a specific individual displayed in the phone book is selected with the operation keys ("Taro Matsushita" is selected in Fig. 7 (b)), and the display shows how many times each emotion was identified among the 25 calls with the selected individual (and the percentage of all calls).
- Alternatively, the emotion identified for each call with the individual may be weighted so as to set a degree for that emotion, and the overall emotion may be identified from the degrees of the emotions.
- For example, the display control unit 40 weights the emotion identified for each call with the individual so that the newer the call date and time, or the longer the call time, the larger the numerical value representing the degree of that emotion (for example, using as the degree a value in seconds derived from the difference between the current date and time and the call date and time, or the call time in seconds). Among the numerically expressed degrees of emotion (the values of the emotions identified for the individual calls may also be summed per emotion), the emotion whose degree is the largest value exceeding a certain threshold is displayed; if no weighted emotion exceeds the predetermined threshold, "normal" is displayed as the overall emotion of that individual. With this configuration, the emotion with a particularly high degree among those identified across all calls with a given individual is displayed as that individual's overall emotion, which can improve the accuracy of identifying the overall emotion. A sketch of this weighting follows.
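- A sketch of that weighting under assumed record fields: newer and longer calls contribute a larger degree, degrees are summed per emotion, and "normal" is returned when no sum clears the threshold (the decay formula is an assumption, not the patent's Equation 1):

```python
from datetime import datetime
from typing import Dict, List

def overall_for_individual(calls: List[dict], now: datetime,
                           threshold: float) -> str:
    """calls: one-call data for a single phone-book entry, each with an
    identified 'emotion', a 'start' datetime and a 'duration' in seconds."""
    degree: Dict[str, float] = {}
    for c in calls:
        age_days = (now - c["start"]).total_seconds() / 86400.0
        # Assumed weighting: long calls count more, old calls decay.
        w = c["duration"] / (1.0 + age_days)
        degree[c["emotion"]] = degree.get(c["emotion"], 0.0) + w
    best = max(degree, key=degree.get, default="normal")
    return best if degree.get(best, 0.0) > threshold else "normal"

calls = [
    {"emotion": "joy", "start": datetime(2005, 9, 9, 18, 30), "duration": 120},
    {"emotion": "sorrow", "start": datetime(2005, 9, 2, 10, 5), "duration": 600},
]
print(overall_for_individual(calls, datetime(2005, 9, 10), threshold=50.0))
# -> "joy": recent short call outweighs the older, longer "sorrow" call
```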
- As before, the threshold value is calculated by the formula shown in Equation 1.
- With this configuration, the mobile phone user can judge, just by looking at the phone book, what kinds of calls he or she has had with a specific person in the past, which helps the user remember past call partners without spending time and effort.
- FIG. 8 shows an example of sorting for each emotion by the mobile phone according to the first embodiment of the present invention.
- When the display control unit 40 receives from the mobile phone user, via the operation keys provided on the mobile phone of the present invention, an input operation instructing it to display a list of call partners whose emotion was "sorrow" in past calls, it extracts from the information list the data for one call whose item "signal source" is "received voice" and whose item "emotion" is "sorrow".
- When a callee telephone number that has already been extracted is extracted again, the display control unit 40 counts the number of times that number has been extracted. When the display control unit 40 has extracted all the matching telephone numbers in the information list, it displays, as shown in Fig. 8, the personal names corresponding to the telephone numbers in descending order of the number of extractions (a personal name is identified by its telephone number when the callee telephone number is already registered in the phone book application 303), together with the number of extractions.
- In Fig. 8, the total number of calls with each extracted callee telephone number and the ratio of the number of extractions to the total number of calls are also displayed. By displaying such information, the mobile phone user can get a rough idea of what the other person has felt in past calls with that specific person. A counting sketch follows.
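- A sketch of that counting and sorting, with assumed field names:

```python
from collections import Counter
from typing import Dict, List, Tuple

def sad_partners(info_list: List[Dict[str, str]],
                 phone_book: Dict[str, str]) -> List[Tuple[str, int, int, float]]:
    """Call partners whose emotion was identified as "sorrow", most
    frequent first: (name or number, sorrow count, total calls, ratio)."""
    received = [r for r in info_list if r["signal_source"] == "received voice"]
    sad = Counter(r["callee_number"] for r in received
                  if r["emotion"] == "sorrow")
    totals = Counter(r["callee_number"] for r in received)
    return [(phone_book.get(num, num), n, totals[num], n / totals[num])
            for num, n in sad.most_common()]

calls = [{"signal_source": "received voice",
          "callee_number": "09000000000", "emotion": "sorrow"},
         {"signal_source": "received voice",
          "callee_number": "09000000000", "emotion": "joy"}]
print(sad_partners(calls, {"09000000000": "Taro Matsushita"}))
# [('Taro Matsushita', 1, 2, 0.5)]
```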
- FIG. 9 shows another configuration diagram of the mobile phone according to the first embodiment of the present invention.
- As shown in Fig. 9, this configuration of the mobile phone according to the first embodiment of the present invention includes a PIM (Personal Information Manager) application group 30, a display unit 50, a communication device 60, an emotion identification device 70, and a display control unit 80. The components with the same reference numerals as in Fig. 1 are as described above, and their description is omitted.
- The communication device 60 includes a mail transmission/reception unit 601 and a character information output unit 602.
- The mail transmission/reception unit 601 performs mobile wireless communication with a mobile radio base station, receives e-mail addressed to the mail address assigned to the mobile phone of the present invention, and transmits e-mail from the mobile phone of the present invention to arbitrary mail addresses.
- The character information output unit 602 outputs to the emotion identification device 70 at least part of the character data written in an e-mail received via the mail transmission/reception unit 601 or in an e-mail to be transmitted (the character data of the character strings written in the e-mail's subject or body).
- The character information output unit 602 also outputs the data of received e-mail to the display control unit 80 so that the e-mail can be displayed, and receives from the display control unit 80 the data of outgoing e-mail created on the display unit 50 by operating the operation keys (not shown) provided on the mobile phone of the present invention.
- The emotion identification device 70 includes an emotion estimation unit 701, an emotion accumulation unit 702, an emotion identification unit 703, and an emotion information storage unit 704.
- The emotion estimation unit 701 estimates the emotion of the mail author who wrote the character strings, from the character data input from the communication device 60.
- For example, for each factor of the emotion information composed of affection, joy, anger, sadness, and neutral (normal), the emotion estimation unit 701 expresses the degree of emotion as a value of 0, 1, or 2 (0: no emotion, 1: weak emotion, 2: strong emotion), estimating it for each sentence or phrase of the input character data (character strings made up of one or more characters, and marks expressing an image, called pictograms) from the beginning of the text, and sequentially outputs each estimated value to the emotion accumulation unit 702.
- The emotion accumulation unit 702 accumulates the numerical values for each factor input from the emotion estimation unit 701 in association with their input order. When the emotion accumulation unit 702 has received a complete series of per-factor values from the emotion estimation unit 701 (the series here runs from the start of the numerical input from the emotion estimation unit 701 to the emotion accumulation unit 702 to the end of that input), it stores the values for each factor in the series as one set of data (hereinafter, such a set is referred to as data for one mail).
- The emotion identification unit 703 reads the data for one mail from the emotion accumulation unit 702, analyzes the numerical values for each factor, identifies the single characteristic factor from the read values, and outputs the emotion represented by that one factor to the emotion information storage unit 704.
- For example, by taking the factor with the largest numerical value in the data for one mail as the characteristic emotion, the emotion identification unit 703 can identify the emotion with emphasis on the content that makes the strongest impression in the mail. Another method is to identify the emotion with emphasis on the content of the entire mail, by taking as the characteristic emotion the factor with the largest sum of values from the beginning of the mail to its end in the data for one mail. It is also possible to identify the emotion with emphasis on the lingering impression of the mail's content, by taking as the characteristic emotion the factor with the largest value at the end of the mail text in the data for one mail.
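- The patent does not disclose how character data maps to factor values, so the sketch below substitutes a toy keyword/pictogram lookup purely to show the per-sentence flow into data for one mail and the whole-mail identification; every cue and name is an assumption:

```python
from typing import Dict, List

FACTORS = ["affection", "joy", "anger", "sadness", "neutral"]

# Toy lookup standing in for the undisclosed estimator: each keyword or
# pictogram contributes a degree of 2 to one factor.
CUES = {"thanks": "joy", "sorry": "sadness", "!!": "anger", "<3": "affection"}

def estimate_sentence(sentence: str) -> Dict[str, int]:
    est = {f: 0 for f in FACTORS}
    for cue, factor in CUES.items():
        if cue in sentence.lower():
            est[factor] = 2
    if not any(est.values()):
        est["neutral"] = 1  # no cue found: weak neutral
    return est

def mail_emotion(body: str) -> str:
    """Estimate per sentence from the head of the text, accumulate the
    series as data for one mail, then take the factor with the largest
    whole-mail sum as the characteristic emotion."""
    one_mail: Dict[str, List[int]] = {f: [] for f in FACTORS}
    for sentence in body.split("."):
        for f, v in estimate_sentence(sentence).items():
            one_mail[f].append(v)
    return max(FACTORS, key=lambda f: sum(one_mail[f]))

print(mail_emotion("Thanks for yesterday. Sorry about the noise."))
# -> "joy" (ties are broken by factor order)
```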
- The emotion information storage unit 704 receives the emotion represented by one factor from the emotion identification unit 703, and further receives from the mail transmission/reception unit 601 information about the mail transmission or reception performed by the mail transmission/reception unit 601 of the communication device 60: whether the mail was transmitted or received, the transmission time or reception time, and the destination or sender mail address.
- The emotion information storage unit 704 stores the emotions input from the emotion identification unit 703 and the various information input from the mail transmission/reception unit 601 in association with each other.
- FIG. 10 shows another example of the information list stored in the emotion identification device in the mobile phone according to the first embodiment of the present invention. As shown in Fig. 10, the emotion information storage unit 704 stores, for each set of data for one mail, the emotion identified from that mail (that is, for an outgoing mail, the emotion of the mobile phone user of the present invention who wrote it; for a received mail, the emotion of the mail partner who wrote it).
- The display control unit 80 activates and executes one of the applications in the PIM application group, extracts the data required for the processing of the executed application from the emotion information storage unit 704, and displays various information on the display unit 50. Display examples of the display unit 50 when an application is started and executed are described later.
- When the communication device 60 starts mail transmission or mail reception (step S1101, YES), it outputs the character data written in the received e-mail or the e-mail to be transmitted to the emotion identification device 70 (step S1102). The emotion identification device 70 estimates the degree of emotion for each factor of the emotion information composed of affection, joy, anger, sadness, and neutral (normal), for each sentence or phrase from the beginning of the character data input from the communication device 60 (step S1103).
- the emotion identification device 70 When the emotion identification device 70 finishes inputting the character data of the communication device 60 (step S 1104, YES), it analyzes the numerical value for each factor estimated from the beginning to the end of the series of character data, One characteristic factor is identified, and the emotion represented by the one factor is stored as the mail creator's emotion (step S 1105. At this time, as shown in FIG. Memorize with specific emotions).
- When the display control unit 80 accepts an input operation from the mobile phone user via the operation keys provided on the mobile phone of the present invention, it starts the mail history application 304 (step S1106, YES). Following the program code of the mail history application 304, it reads the emotion identified for each mail and the information related to the mail stored in the emotion identification device 70, and displays that information on the display unit 50 in a predetermined display format (step S1107).
- When multiple e-mails are received on a single date, the display control unit 80 displays on the display unit 50, as the overall emotion for that date or individual, the emotion identified from the mail with the most recent reception date and time among the received mails, the emotion identified from the mail with the largest number of characters in the body, or the emotion identified the most times.
- Alternatively, the display control unit 80 may weight the emotion identified for each e-mail so that the newer the reception date and time, or the larger the number of characters in the mail body, the larger the numerical value representing the degree of that emotion (for example, using as the degree a value in seconds derived from the difference between the current date and time and the reception date and time, or the amount of data in the mail body), and display the overall emotion based on those degrees. A sketch of the selection rules follows.
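- A sketch of the three selection rules for a date with several received mails (most recent, longest body, most frequent), with assumed field names:

```python
from collections import Counter
from datetime import datetime
from typing import List

mails: List[dict] = [
    {"received": datetime(2005, 9, 9, 9, 0), "body_chars": 120, "emotion": "joy"},
    {"received": datetime(2005, 9, 9, 12, 0), "body_chars": 300, "emotion": "joy"},
    {"received": datetime(2005, 9, 9, 16, 0), "body_chars": 40, "emotion": "anger"},
]

newest = max(mails, key=lambda m: m["received"])["emotion"]      # "anger"
longest = max(mails, key=lambda m: m["body_chars"])["emotion"]   # "joy"
most_frequent = Counter(m["emotion"] for m in mails).most_common(1)[0][0]

# Any one of these rules may be used as the overall emotion for the date.
print(newest, longest, most_frequent)  # anger joy joy
```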
- As described above, the mobile phone of the first embodiment of the present invention can help the mobile phone user recall the contents of past calls and of past e-mail messages without forcing the user to spend time and effort.
- That is, by notifying mobile phone users of their own emotions and of the emotions of the other parties with whom they have communicated by mobile phone, it can provide an environment in which users can easily recall past communications.
- FIG. 12 shows a configuration diagram of the mobile phone according to the second embodiment of the present invention.
- The mobile phone according to the second embodiment of the present invention is composed of a call device 10, an emotion identification device 20, a PIM (Personal Information Manager) application group 30, a display control unit 40, a display unit 50, a playback control unit 90, and a speaker 100.
- The configuration of the mobile phone according to the second embodiment of the present invention is that of the mobile phone according to the first embodiment shown in Fig. 1 with the addition of the voice storage unit 105 in the call device 10, the playback control unit 90, and the speaker 100.
- The audio signal output unit 102 outputs to the emotion identification device 20 the voice signal received from the other telephone via the communication unit 101, the voice signal of the mobile phone user picked up by the call microphone 104, or both. It also outputs the voice received from the other telephone to the call speaker 103, and outputs the mobile phone user's call voice picked up by the call microphone 104 to the communication unit 101. In addition, the audio signal output unit 102 outputs to the voice storage unit 105 the voice signal received from the other telephone via the communication unit 101 and the voice signal of the mobile phone user picked up by the call microphone 104.
- When the voice storage unit 105 of the call device 10 is notified by the communication unit 101 that a call with another telephone has started and receives a voice signal from the audio signal output unit 102, it starts recording the voice signal. When the input of the voice signal from the audio signal output unit 102 stops, the voice storage unit 105 ends the recording of the voice signal and notifies the communication unit 101 that the recording is complete.
- When notifying the voice storage unit 105 that a call with another telephone has started, the communication unit 101 also passes identification information for identifying the call (the time at which the call started or ended is used to identify the call), and the voice storage unit 105 records the identification information and the recorded voice signal in association with each other (the voice storage unit 105 identifies each voice signal by this identification information). A keyed-store sketch follows.
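- The association between the identification information (the call start time) and the recorded signal can be sketched as a simple keyed store; the types and names are illustrative:

```python
from datetime import datetime
from typing import Dict, Optional

class VoiceStore:
    """Minimal stand-in for the voice storage unit 105: recordings are
    stored and looked up by the call start time used as identification
    information."""
    def __init__(self) -> None:
        self._recordings: Dict[datetime, bytes] = {}

    def save(self, call_start: datetime, pcm: bytes) -> None:
        self._recordings[call_start] = pcm

    def load(self, call_start: datetime) -> Optional[bytes]:
        # The playback control unit resolves the selected history row to a
        # call start time and fetches the matching recording, if any.
        return self._recordings.get(call_start)

store = VoiceStore()
store.save(datetime(2005, 9, 9, 18, 30), b"...pcm samples...")
print(store.load(datetime(2005, 9, 9, 18, 30)) is not None)  # True
```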
- The emotion information storage unit 204 receives the emotion represented by one factor from the emotion identification unit 203, and further receives from the communication unit 101 information about the call made by the communication unit 101 of the call device 10 (whether the call was outgoing or incoming, the call start time and call end time, and identification information for the other telephone, for example the other party's telephone number), together with the presence or absence of voice data in which the call device 10 recorded the call. The emotion information storage unit 204 stores the emotion input from the emotion identification unit 203, the various information input from the communication unit 101, and the presence or absence of recorded voice data in association with each other (when the emotion identification unit 203 identifies one characteristic factor from each of the sets of data for one call for the received and transmitted voice signals, which signal each factor came from is also stored).
- FIG. 13 shows a list of information stored in the emotion identifying device in the mobile phone according to the second embodiment of the present invention.
- The information list stored in the emotion identification device in the mobile phone according to the second embodiment of the present invention is the information list of Fig. 3 with the item "recorded data presence/absence" added.
- The playback control unit 90 reads, from among the voice signals stored in the voice storage unit 105, the voice signal specified by identification information (the identification information is entered by an input operation using the operation keys provided on the mobile phone) and outputs it to the speaker 100, which outputs the sound.
- When the call device 10 starts a call (step S402, YES), it outputs the received voice signal and the transmitted voice signal to the emotion identification device 20, and further starts recording the voice signals (step S1401).
- The emotion identification device 20 continuously estimates, at predetermined time intervals, the degree of emotion for each factor of the emotion information composed of affection, joy, anger, sadness, and neutral (normal), for the target voice signal among the voice signals input from the call device 10 (step S404).
- When the call ends, the emotion identification device 20 analyzes the numerical values for each factor estimated during the call, identifies one characteristic factor, and stores the emotion represented by that one factor as the emotion of the other party or of the mobile phone user for this call (step S1402; at this time, as shown in Fig. 13, the information about the call and the presence or absence of recorded voice data are stored in association with the identified emotion). Further, when the input of the voice signal is completed, the call device 10 ends the recording of the voice signal, and records the recorded data and the identification information for identifying the recorded data in association with each other (step S1403).
- When the display control unit 40 accepts an input operation from the mobile phone user via the operation keys provided on the mobile phone of the present invention, it starts the call history application 301 (step S407, YES). Following the program code of the call history application 301, it reads the emotion identified for each call, the information related to the call, and the presence or absence of recorded voice data stored in the emotion identification device 20, and displays that information on the display unit 50 in a predetermined display format (step S1404).
- FIG. 15 shows a display example of a call history by the mobile phone according to the second embodiment of the present invention.
- Fig. 15 (a) is the outgoing call history generated based on the information stored in the emotion identification device 20 shown in Fig. 13, and Fig. 15 (b) is the incoming call history generated based on the same information.
- In order to display the outgoing call history shown in Fig. 15 (a), the display control unit 40 extracts from the information list stored in the emotion identification device 20 shown in Fig. 13 the data for one call whose item "outgoing/incoming" is "outgoing" (data for one call a applies here) and whose item "signal source" corresponds to the party whose emotion was set to be identified in step S401, that is, "received voice". From that data, it extracts the data of the items "call start time", "callee telephone number", "emotion", and "recorded data presence/absence", and displays them at the corresponding locations of the outgoing call history items "date and time of origination", "name", "emotion", and "recorded voice" in Fig. 15 (a).
- Likewise, in order to display the incoming call history shown in Fig. 15 (b), the display control unit 40 extracts from the information list stored in the emotion identification device 20 shown in Fig. 13 the data for one call whose item "outgoing/incoming" is "incoming" (data for one call b and c apply here) and whose item "signal source" corresponds to the party whose emotion was set to be identified in step S401, that is, "received voice". From that data, it extracts the data of the items "call start time", "callee telephone number", "emotion", and "recorded data presence/absence", and displays them at the corresponding locations of the incoming call history items "date and time of incoming call", "name", "emotion", and "recorded voice" in Fig. 15 (b).
- FIG. 16 shows a display example of the schedule book by the mobile phone according to the second embodiment of the present invention.
- Fig. 16 (a) is the calendar displayed by the schedule book, Fig. 16 (b) is a display example showing an emotion for each date in the calendar, and Fig. 16 (c) is a display example showing the transition of emotions on a specific date.
- Figs. 16 (a) and 16 (b) are the same as Figs. 6 (a) and 6 (b) of the first embodiment, and the processing by the display control unit 40 to display them is also the same as in the first embodiment, so the description is omitted.
- When the display control unit 40 receives an input operation with the operation keys selecting a specific date (in Fig. 16 (b), September 2 is selected) and instructing it to display the mobile phone user's emotion on that date, it extracts from the information list stored in the emotion identification device 20 shown in Fig. 13 the data for one call whose item "call start time" falls on that specific date and which belongs to the mobile phone user whose emotion is to be identified, that is, whose item "signal source" is "transmitted voice".
- The display control unit 40 refers to the items "call start time", "emotion", and "recorded data presence/absence" in the extracted data for one call and displays, for example, as shown in Fig. 16 (c), the mobile phone user's own emotions on September 2 in chronological order, together with an indication of "recorded voice present".
- FIG. 17 shows a display example of the phone book by the mobile phone according to the second embodiment of the present invention.
- Fig. 17 (a) is a display example of the name and telephone number of each person registered in the phone book, and Fig. 17 (b) is a display example of the emotion for each individual registered in the phone book.
- When the display control unit 40 accepts an input operation from the mobile phone user via the operation keys provided on the mobile phone of the present invention, it starts the phone book application 303 (step S407, YES) and displays, for example, as shown in Fig. 17 (a), the names of the registered persons beginning with the 'a' row of the syllabary, together with each individual's name and telephone number. Further, when the display control unit 40 receives from the mobile phone user an input operation with the operation keys instructing it to display the emotions of the individuals being displayed, it extracts from the information list stored in the emotion identification device 20 shown in Fig. 13 the data for one call whose item "signal source" is "received voice" and whose item "callee telephone number" matches a telephone number registered for one of those individuals. In the information list stored in the emotion identification device 20 shown in Fig. 13, the telephone number "09000000000" matches the telephone number of the displayed name "Taro Matsushita" and the telephone number "09000001111" matches that of the displayed name "Jiro Matsushita", so the corresponding data for one call are extracted.
- The display control unit 40 displays the marks representing the extracted emotions "sorrow" and "affection" in the corresponding "Taro Matsushita" and "Jiro Matsushita" fields.
- Further, when recorded voice data exists for a call with an individual, the display control unit 40 displays "recorded voice present" in the corresponding column, as shown in Fig. 17 (b).
- The following describes the process by which the mobile phone according to the second embodiment of the present invention plays back recorded voice after displaying the presence or absence of recorded voice as described in [display format using the call history], [display format using the schedule book], and [display format using the phone book].
- When a display location indicating recorded voice is selected by an operation of the operation keys, the playback control unit 90 determines, from the information displayed, which data for one call in the emotion information storage unit 204 the selected display location is based on, and reads the voice signal to be played back from among the voice signals stored in the voice storage unit 105, using the item "call start time" in that data for one call (in the voice storage unit 105, the call start time serving as identification information and the recorded voice signal are stored in association with each other). The playback control unit 90 then outputs the read voice signal to the speaker 100, so that the mobile phone outputs sound up to the end of the recorded voice.
- As described above, with the mobile phone of the second embodiment of the present invention, when the mobile phone user wants to remember the content of a past call, the previously identified emotion symbolizing that content is displayed; in addition, if the call voice was recorded, the presence or absence of the recorded data is displayed, and the recorded data can be played back so that the user can confirm the contents and remember the call.
- Note that the mobile phone according to the second embodiment of the present invention does not need to record every call; it may be configured so that the mobile phone user starts the recording of the voice signal during a call by performing a predetermined operation.
- In this case, when the emotion identified for each call, the information about the call, and the presence or absence of recorded voice data are displayed in a predetermined display format, "no recording" is displayed at the corresponding location if there is no recorded voice data, as shown, for example, in Fig. 15 (b) and Fig. 17 (b).
- FIG. 18 shows the processing flow (Example 1) for recording call voice by the mobile phone according to the second embodiment of the present invention.
- When the call device 10 starts a call (step S402, YES), it outputs the received voice signal and the transmitted voice signal to the emotion identification device 20 (step S403).
- The emotion identification device 20 continuously estimates, at predetermined time intervals, the degree of emotion for each factor of the emotion information composed of affection, joy, anger, sadness, and neutral (normal), for the voice signal whose emotion should be identified among the voice signals input from the call device 10 (step S404). If, in the processing of step S404, at least one of the estimated emotion information factors has a value exceeding the threshold (step S1801, YES), the call device 10 records the received voice signal and the transmitted voice signal (step S1802; the emotion estimation unit 201, which determines whether any estimated factor has a value exceeding the threshold, outputs to the voice storage unit 105 a control signal that starts the recording of the received and transmitted voice signals).
- the emotion identification device 20 repeats these processes until the call by the call device 10 is completed (step S405, YES).
- Suppose, for example, that the emotion identification device 20 has continuously estimated, from the start of the call to its end, the degree of each emotion information factor consisting of affection, joy, anger, sadness, and neutral (normal) for the received voice signal, as shown in FIG. 2.
- In this case, the joy factor is estimated to be 2 in the interval from 10 to 15 seconds after the start of the call, so the call device 10 starts recording the call voice from this interval until the end of the call (first recorded voice); the anger factor is estimated to be 2 in the interval from 45 to 50 seconds after the start of the call, so recording from this interval until the end of the call is started (second recorded voice); and the sadness factor is estimated to be 2 in the interval from 50 to 55 seconds after the start of the call, so recording from this interval until the end of the call is likewise started (third recorded voice).
- When the call ends, the emotion identification device 20 analyzes the numerical values estimated for each factor during the call, specifies one characteristic factor from those values, and stores the emotion expressed by that factor as the emotion of the other party or of the mobile phone user in this call (step S1402; at this time, as shown in FIG. 13, the information about the call and the recorded voice data are stored in association with the identified emotion).
- The call device 10 then ends the recording of each call voice and stores, in association with identification information for identifying the recorded data, the recorded voice whose recording started when the factor specified in step S1402 was estimated to be 2 (for example, the first recorded voice if the factor specified in step S1402 is joy) (step S1803; the call device 10 may delete the second and third recorded voices after step S1803).
- Alternatively, the call device 10 may keep the second and third recorded voices as second and third candidates representing the emotion of the call and, in the process of step S1803, store them as well in association with identification information for identifying the recorded data (in this case, when the presence or absence of recorded voice is displayed as described in [Display format using call history], [Display format using schedule book], and [Display format using phone book], the recorded voices may be listed in order of priority, or only the number of recorded voices may be displayed). A sketch of this recording logic follows.
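The sketch below summarizes the Example 1 recording logic. The threshold value of 2, the choice of the highest-peaking factor as the characteristic one, and the data layout are assumptions made for illustration.

```python
# Sketch of Example 1: start a new recording whenever an emotion factor
# first reaches the threshold, then keep the recording associated with
# the single characteristic factor of the call.
FACTORS = ("affection", "joy", "anger", "sadness", "neutral")
THRESHOLD = 2  # assumed threshold; the patent does not fix a value

def record_call(factor_series):
    """factor_series: list of (time_sec, {factor: level}) estimates made
    at predetermined intervals during one call."""
    recordings = {}  # factor -> time its recording started
    for t, levels in factor_series:
        for factor in FACTORS:
            if levels.get(factor, 0) >= THRESHOLD and factor not in recordings:
                recordings[factor] = t  # record from here to end of call

    # After the call, pick the characteristic factor (assumed here to be
    # the highest-peaking one) and keep its recording; the others may be
    # deleted or kept as second/third candidates, as described above.
    def peak(factor):
        return max(levels.get(factor, 0) for _, levels in factor_series)

    characteristic = max(recordings, key=peak)
    return characteristic, recordings[characteristic]

# The example from FIG. 2: joy at 10-15 s, anger at 45-50 s, sadness at 50-55 s.
series = [(10, {"joy": 2}), (45, {"anger": 2}), (50, {"sadness": 2})]
print(record_call(series))  # ('joy', 10): the first recorded voice is kept
```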
- FIG. 19 shows a processing flow (Example 2) for recording a call voice by the mobile phone according to the second embodiment of the present invention.
- When the call device 10 starts a call (step S402, YES), it outputs the received voice signal and the transmitted voice signal to the emotion identification device 20 and also starts recording the voice signals (step S403).
- For the target speech signal among the audio signals input from the call device 10, the emotion identification device 20 continuously estimates, at predetermined time intervals, the degree of emotion for each factor of the emotion information, which consists of affection, joy, anger, sadness, and neutral (normal) (step S404). If, in the process of step S404, at least one of the emotion information factors has a numerical value exceeding the threshold (step S1901, YES), the emotion identification device 20 stores information on that factor and the time at which it exceeded the threshold (step S1902; when the emotion estimation unit 201 estimates that a factor has a numerical value exceeding the threshold, the emotion accumulation unit 202 stores that factor and the time at which it exceeded the threshold).
- the emotion identification device 20 repeats these processes until the call by the call device 10 is completed (step S405, YES).
- Suppose, for example, that the emotion identification device 20 has continuously estimated, from the start of the call to its end, the degree of each emotion information factor consisting of affection, joy, anger, sadness, and neutral (normal) for the received voice signal, as shown in FIG. 2.
- In this case, the joy factor is estimated to be 2 in the interval from 10 to 15 seconds after the start of the call, so the joy factor and the time 10 seconds after the start of the call are stored (first tag information); the anger factor is estimated to be 2 in the interval from 45 to 50 seconds after the start of the call, so the anger factor and the time 45 seconds after the start of the call are stored (second tag information); and the sadness factor is estimated to be 2 in the interval from 50 to 55 seconds after the start of the call, so the sadness factor and the time 50 seconds after the start of the call are stored (third tag information).
- When the call ends, the emotion identification device 20 analyzes the numerical values estimated for each factor during the call, specifies one characteristic factor from those values, stores the emotion expressed by that factor as the emotion of the other party or of the mobile phone user in this call, and also stores the tag information indicating the time at which the specified factor exceeded the threshold (step S1903; at this time, in addition to the association, shown in FIG. 13, between the information about the call, the presence or absence of recorded voice data, and the identified emotion, the tag information is stored in association with them). Further, when the input of the audio signal is completed, the call device 10 ends the recording of the audio signal and stores the recorded data in association with identification information for identifying it (step S1403).
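A minimal sketch of this tag-information variant (Example 2); as in the previous sketch, the threshold, the characteristic-factor rule, and the data layout are assumptions.

```python
# Sketch of Example 2: record the whole call and log (factor, time) tag
# information whenever a factor first reaches the threshold.
THRESHOLD = 2  # assumed threshold

def record_call_with_tags(factor_series):
    """factor_series: list of (time_sec, {factor: level}) estimates.
    Returns all tag information plus the tag of the characteristic factor."""
    tags = []    # (factor, time at which it crossed the threshold)
    seen = set()
    for t, levels in factor_series:
        for factor, level in levels.items():
            if level >= THRESHOLD and factor not in seen:
                seen.add(factor)
                tags.append((factor, t))

    # Characteristic factor: assumed here to be the highest-peaking one.
    def peak(factor):
        return max(levels.get(factor, 0) for _, levels in factor_series)

    characteristic = max((f for f, _ in tags), key=peak)
    char_tag = next(tag for tag in tags if tag[0] == characteristic)
    return tags, char_tag

series = [(10, {"joy": 2}), (45, {"anger": 2}), (50, {"sadness": 2})]
tags, char_tag = record_call_with_tags(series)
print(tags)      # [('joy', 10), ('anger', 45), ('sadness', 50)]
print(char_tag)  # ('joy', 10): playback can start near 10 s into the call
```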
- Playback can then be started from the audio location identified from the tag information. Moreover, by starting playback of the recorded voice a predetermined time before that location, it is possible to avoid missing the passage that was the main basis for identifying the emotion of the recorded speech.
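As a rough worked example of this lead-time playback (the 5-second lead time and the function name are assumptions):

```python
# Sketch: start playback a fixed lead time before the tagged location so
# the emotion-bearing passage is not missed.
LEAD_TIME_SEC = 5.0  # the "predetermined time"; the exact value is assumed

def playback_start(tag_time_sec: float) -> float:
    """Return the playback start position for a tagged audio location,
    clamped so it never falls before the start of the recording."""
    return max(0.0, tag_time_sec - LEAD_TIME_SEC)

print(playback_start(10.0))  # 5.0 -> start 5 s before the joy tag at 10 s
print(playback_start(3.0))   # 0.0 -> clamped to the start of the recording
```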
- With the information processing terminal of the present invention, it is possible to support recalling the contents of past calls and of e-mail texts without forcing the user to spend time and effort. The invention is therefore useful in the field of information processing terminals that can identify the emotion of an e-mail's creator and the emotion of a speaker during a call from the character data described in the e-mail and from the voice during the call.
Abstract
An information processing terminal capable of supporting the user in recalling the content of a past conversation or message without spending time and effort. The terminal comprises a call device (10) for inputting emotion identification information consisting of at least a voice, an emotion identification device (20) for identifying an emotion based on the emotion identification information input to the call device (10), and a display (50) for displaying information relating to the emotion identified by the emotion identification device (20).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005363837 | 2005-12-16 | ||
JP2005-363837 | 2005-12-16 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2007069361A1 (fr) | 2007-06-21 |
Family
ID=38162679
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/314521 Ceased WO2007069361A1 (fr) | 2005-12-16 | 2006-07-21 | Terminal de traitement d'informations |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2007069361A1 (fr) |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005135169A (ja) * | 2003-10-30 | 2005-05-26 | Nec Corp | Portable terminal and data processing method
JP2005152054A (ja) * | 2003-11-20 | 2005-06-16 | Sony Corp | Emotion calculation apparatus, emotion calculation method, and portable communication apparatus
JP2005192024A (ja) * | 2003-12-26 | 2005-07-14 | Fujitsu I-Network Systems Ltd | Call voice data management system in a call center and operator terminal used therefor
JP2005345496A (ja) * | 2004-05-31 | 2005-12-15 | Nippon Telegr & Teleph Corp <Ntt> | Speech processing apparatus, speech processing method, and program therefor
- 2006-07-21: WO PCT/JP2006/314521 patent/WO2007069361A1 (not active, ceased)
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8299897B2 (en) | 2008-01-24 | 2012-10-30 | Sony Corporation | Information processing apparatus, method, and program |
JP2015096867A (ja) * | 2008-10-22 | 2015-05-21 | Google Inc. | Geocoding personal information |
US10055862B2 (en) | 2008-10-22 | 2018-08-21 | Google Llc | Geocoding personal information |
US10867419B2 (en) | 2008-10-22 | 2020-12-15 | Google Llc | Geocoding personal information |
US11704847B2 (en) | 2008-10-22 | 2023-07-18 | Google Llc | Geocoding personal information |
US12249010B2 (en) | 2008-10-22 | 2025-03-11 | Google Llc | Geocoding personal information |
JP2011141651A (ja) * | 2010-01-06 | 2011-07-21 | Nec System Technologies Ltd | Electronic mail system, electronic mail receiving apparatus, and display method |
JP2013206389A (ja) * | 2012-03-29 | 2013-10-07 | Fujitsu Ltd | Intimacy calculation method, intimacy calculation program, and intimacy calculation apparatus |
JP2022020659A (ja) * | 2017-08-08 | 2022-02-01 | Line Corporation | Method and system for recognizing emotion during a call and utilizing the recognized emotion |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
KR100689396B1 (ko) | Apparatus and method for managing call history using voice recognition | |
CN1988705B (zh) | Apparatus and method for providing incoming and outgoing call information in a mobile communication terminal | |
JP5292732B2 (ja) | Communication device | |
JP3738383B2 (ja) | Communication device | |
JP4226055B2 (ja) | Communication terminal device and program | |
WO2006028223A1 (fr) | Information processing terminal | |
WO2004070567A2 (fr) | Method for populating caller information in an address book managed by the central system | |
CN101459713A (zh) | Method for presenting personal information on an incoming call and mobile communication device therefor | |
WO2007069361A1 (fr) | Information processing terminal | |
WO2006031685A2 (fr) | Audible caller identification for mobile phone headsets | |
JP2007274500A (ja) | Mobile phone, browsing method, and browsing program | |
JP5233287B2 (ja) | Mobile communication terminal | |
WO2018061824A1 (fr) | Information processing device, information processing method, and program recording medium | |
JP2005244815A (ja) | Portable terminal device | |
JP2004129174A (ja) | Information communication device, information communication program, and recording medium | |
JP3877724B2 (ja) | Telephone | |
JP5218376B2 (ja) | Telephone device with an easily searchable call recording function | |
JP4490943B2 (ja) | Mobile phone | |
JP4606816B2 (ja) | Telephone | |
JP2007257238A (ja) | Telephone | |
KR100738417B1 (ko) | Mobile communication terminal providing buddy management function using an address book | |
KR100645765B1 (ko) | Method for automatically updating the call list in a wireless communication terminal | |
JP4565959B2 (ja) | Output device and output method | |
JP2007019600A (ja) | Telephone device and incoming call notification method | |
JPH10304050A (ja) | Automatic answering device for incoming telephone calls |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 06781443; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: JP |