WO2018107031A1 - Personal emergency data capture and alerting - Google Patents
- Publication number
- WO2018107031A1 (PCT/US2017/065316)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- mobile device
- contacts
- event
- user
- data
- Prior art date
Classifications
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B25/00—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems
- G08B25/01—Alarm systems in which the location of the alarm condition is signalled to a central station, e.g. fire or police telegraphic systems characterised by the transmission medium
- G08B25/016—Personal emergency signalling and security systems
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B13/00—Burglar, theft or intruder alarms
- G08B13/18—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
- G08B13/189—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
- G08B13/194—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
- G08B13/196—Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
- G08B13/19695—Arrangements wherein non-video detectors start video recording or forwarding but do not generate an alarm themselves
-
- G—PHYSICS
- G08—SIGNALLING
- G08B—SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
- G08B27/00—Alarm systems in which the alarm condition is signalled from a central station to a plurality of substations
- G08B27/006—Alarm systems in which the alarm condition is signalled from a central station to a plurality of substations with transmission via telephone network
Definitions
- Embodiments described herein generally relate to emergency event recording and, in some embodiments, more specifically to an automated emergency event recording with location and relationship based event notification.
- Personal emergency response systems often rely on proximity to a base station or the existence of a call center. They interact with a single responder and do not warn others nearby of the emergency. Nor is their use generally covert, an important option when the sender of a distress signal does not want an attacker, or others in the situation, to detect that the signal was sent. Further, real-time health data of the sender is not streamed to receivers of the notification.
- FIG. 1 illustrates an example of an environment in which the emergency alert system may be used, in accordance with some embodiments.
- FIG. 2 illustrates the emergency alert system engine in accordance with some embodiments.
- FIG. 3 illustrates an example process when solo recording mode is activated, as according to some embodiments.
- FIG. 4 illustrates an example process when broadcast mode is activated, as according to some embodiments.
- FIG. 5 illustrates an example process for determining the contacts for an alert notification when broadcast mode is activated, as according to some embodiments.
- FIG. 6 illustrates an example user interface for solo recording mode on a mobile device in accordance with some embodiments.
- FIG. 7 illustrates an example user interface for activated solo recording mode on a mobile device in accordance with some embodiments.
- FIG. 8 illustrates an example user interface for the activated broadcast mode on a mobile device in accordance with some embodiments.
- FIG. 9 illustrates an example user interface for the activated broadcast mode on a mobile device in accordance with some embodiments.
- FIG. 10A illustrates an example user interface for the alert system on a smartwatch in accordance with some embodiments.
- FIG. 10B illustrates an example user interface for the alert system on a smartwatch in accordance with some embodiments.
- FIG. 10C illustrates an example user interface for the alert system on a smartwatch in accordance with some embodiments.
- FIG. 11 illustrates an example user interface for a received alert notification on a mobile device in accordance with some embodiments.
- FIG. 12 illustrates a flow chart showing a technique for executing a solo recording mode for an emergency alert system, in accordance with some embodiments.
- FIG. 13 illustrates generally an example of a block diagram of a machine upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform in accordance with some embodiments.
- the alert system described herein provides a personal emergency response system (PERS) that may record and store information such as audio, health data, and location for later review by the sender and their recipients.
- the system described may broadcast a message or notification to a select group of people based on the event and include the recorded information.
- the alert system may have two modes: a solo recording mode and a broadcast mode.
- the solo recording mode may be executed on any mobile electronic device, wherein the device does not need to be connected to a network.
- the solo recording mode may record and store information such as audio, video, health, and location data.
- the broadcast mode may be executed on a connected device, such as one connected to a network, to broadcast a notification to other devices.
- the broadcast notification may include data collected from the solo recording mode.
- FIG. 1 illustrates an example of an environment 100 in which the emergency alert system may be used, in accordance with some embodiments.
- a person, or user 105, may find themselves in a situation where their safety is a concern. This may include self-inflicted situations where a person has accidentally hurt themselves or is lost. This may include situations where no actual event has occurred, but the person feels uneasy or threatened. For example, the user 105 is being followed by two threatening individuals 120. This may include situations where a person is actually being assaulted or attacked, such as instances of robbery or domestic violence.
- the attacker may be an authority, such as a police officer that is exceeding justifiable use of force.
- the user 105 may have a mobile device 110 that may execute the alert system engine 200 of the alert system.
- the alert system engine 200 may be executed on a mobile device 110, for example, a cellular phone, a smart phone, smart watch, etc.
- a user 105 may find themselves in a situation they feel may be threatening or have the potential to escalate, such as being followed on a dark street by two threatening individuals 120 or being stopped by a police officer.
- the user 105 may wish to activate the solo recording mode of the alert system engine 200.
- the user 105 may wish to activate the broadcast mode when danger feels imminent or when an attack or violence is occurring.
- the solo recording mode or broadcast mode of the alert system engine 200 may be activated by multiple methods and combinations of those methods.
- the alert system engine 200 may determine if the solo recording mode alone, or the solo recording mode and broadcast mode together should be activated based on the input received. For example, the alert system engine 200 may have a single input for activating solo recording mode, but if the alert system engine 200 receives an input from a sensor at the time of activation indicating an escalated or emergency situation, the alert system engine 200 may automatically enter broadcast mode as well.
- the user 105 may desire to activate broadcast mode or the alert system engine 200 may determine that broadcast mode should be activated. For example, if the threatening individuals 120 begin to attack the user 105, such as try to rob or assault the user 105, then the user may activate broadcast mode.
- the alert system engine 200 may transmit, by way of the mobile device 110, an alert message to contacts 125 of the user 105.
- the contacts 125 have mobile devices configured for receiving alert notifications from the alert system engine 200 of the user 105.
- the mobile devices of the contacts 125 may be executing the alert system engine as well, to receive alert notifications from other alert system engines and respond to the alerts.
- the alert system may be activated by pushing a button or making a sequence of button pushes on the device.
- the sequence of button pushes may be similar to Morse code, wherein the user may press a button on the device in a pattern of press-and-release and hold-and-release to activate the alert system.
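A minimal sketch of the Morse-like press-pattern matching described above, assuming press durations are already measured in seconds; the dot/dash alphabet, the 0.4 s hold threshold, and the default pattern are illustrative, not defined by the patent:

```python
HOLD_THRESHOLD_S = 0.4  # presses at least this long count as hold-and-release

def classify_presses(durations):
    """Convert a list of press durations (seconds) into a dot/dash string."""
    return "".join("-" if d >= HOLD_THRESHOLD_S else "." for d in durations)

def matches_activation(durations, pattern=".-.."):
    """True when the observed press sequence matches the user's stored pattern."""
    return classify_presses(durations) == pattern
```

For example, a press sequence of 0.1 s, 0.6 s, 0.1 s, 0.1 s would match the assumed ".-.." pattern and activate the alert system.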
- the alert system may be activated by tapping or tapping and holding a virtual button on the screen of a smartphone or smartwatch.
- the alert system may be activated by voice commands.
- the voice command activation may include a combination of words, tone, and volume of the speech of the user to determine when an activation should occur, what mode should be activated, and the type of event that is occurring.
- the words spoken may be a specific key phrase predefined by the user for activation.
- the alert system may also be trained with the user's voice, such that the alert system may only proceed with activation if the key phrase is spoken by the user's voice.
- the user may also train the alert system with their voice at different tones, such that the alert system may recognize when the user is speaking at a normal tone and when the user may be screaming or agitated.
- the user may speak in a calm, normal volume, a command such as "activate recording mode" for the alert system to start the solo recording mode.
- the user may instead, in a loud, screaming voice, say “broadcast emergency” for the alert system to activate broadcast mode and broadcast an emergency notification.
- the alert system may activate solo recording mode or broadcast mode based on voice and tone factors, without a specific phrase being said.
- the user may train the alert system with a baseline voice profile.
- the alert system may determine when the user's voice indicates they are in an agitated or fearful state and automatically activate at least solo recording mode.
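One way to sketch the baseline-profile comparison: treat RMS signal energy as a stand-in for tone (a real implementation would also use pitch and spectral features), and flag speech markedly louder than the user's calm baseline; the 2x ratio is an assumed threshold:

```python
import math

def rms(samples):
    """Root-mean-square energy of an audio sample window."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def is_agitated(live_samples, baseline_rms, ratio=2.0):
    """Flag speech whose energy far exceeds the stored calm baseline."""
    return rms(live_samples) >= baseline_rms * ratio
```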
- the alert system may activate solo recording mode or broadcast mode based on the tone, words, or volume of the voice of a person near the user.
- the user may suddenly be verbally assaulted by a person threatening them, thus the person's voice may be agitated, aggressive, and yelling.
- the alert system may automatically activate broadcast mode based on determining an agitated yelling voice directed at the user, and thus the user is in immediate danger.
- the alert system may utilize sensors, such as a heart rate sensor, to determine in part when the alert system should be activated. For example, a person may say “activate alert” while their heart rate is in a normal range, which may activate the solo recording mode. A person may say “activate alert” while their heart rate is in an elevated range, which may activate both the solo recording mode and the broadcast mode, as the person may be in immediate danger based on their elevated heart rate.
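The heart-rate-conditioned mode selection above can be sketched as follows; the 100 bpm cutoff and the command string are illustrative assumptions:

```python
NORMAL_HR_MAX_BPM = 100  # assumed upper bound of the normal heart rate range

def select_modes(command, heart_rate_bpm):
    """Map the activation command plus current heart rate to alert modes."""
    if command != "activate alert":
        return set()
    if heart_rate_bpm > NORMAL_HR_MAX_BPM:
        # elevated heart rate: the person may be in immediate danger
        return {"solo_recording", "broadcast"}
    return {"solo_recording"}
```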
- the alert system may receive data from a sensor, such as a gyroscope or accelerometer that indicates the user is running.
- the information that the person is running at the time the alert system is activated may indicate the person is in potential danger and trying to escape it, but is not immediately under attack, thus the alert system may activate solo recording mode.
- the alert system may receive an indication that the user is not moving, or the mobile device is not moving. This may indicate the user could be hiding, hurt and not able to move, or has been separated from the mobile device. Thus, the user may require immediate help and the alert system enters broadcast mode.
- the alert system may receive data indicating that the mobile device is being moved in dramatic or violent motions. This may indicate the user is under attack, and broadcast mode is activated.
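The motion heuristics above can be sketched as a classifier over accelerometer magnitude samples: near-zero spread suggests the user is hiding, hurt, or separated from the device; moderate spread suggests running; large spread suggests a struggle. The thresholds (in m/s²) are illustrative assumptions:

```python
import statistics

def classify_motion(accel_magnitudes):
    """Rough motion class from accelerometer magnitude samples (m/s^2)."""
    spread = statistics.pstdev(accel_magnitudes)
    if spread < 0.2:
        return "not_moving"       # hiding, injured, or device separated
    if spread < 3.0:
        return "running"          # fleeing potential danger
    return "violent_motion"       # dramatic movement: possible attack

def mode_for_motion(motion):
    """Running maps to solo recording; the other cases escalate."""
    return "solo_recording" if motion == "running" else "broadcast"
```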
- the alert system may receive sensor data indicating a police officer is nearby.
- This may be data such as the detection of flashing lights by the camera of the mobile device or frequent use of the word "Officer" detected in the audio stream.
- the user may feel threatened by police and in fear of excessive force by the police officer.
- the alert system may activate broadcast mode and avoid broadcasting to any police or emergency services.
- the alert system may turn the screen and notifications off for the mobile device and put the mobile device into a user specified lock mode.
- the only person, presumably, that may turn off solo recording mode or broadcast mode may be the user of the mobile device who has the capability of unlocking the mobile device. This may prevent solo recording mode or broadcast mode from being deactivated by a ne'er-do-well, such as the person who may have attacked or harmed the user.
- FIG. 2 illustrates the emergency alert system engine 200 in accordance with some embodiments.
- the alert system engine 200 may include multiple components for receiving an indication of an event, receiving sensor data, and executing solo recording mode or broadcast mode.
- the alert system engine 200 may include an event detector 210.
- the event detector 210 receives input for the activation of either solo recording mode or broadcast mode.
- the event detector 210 may receive input such as the pressing of a virtual button, a sequence of button presses, or a voice command.
- the event detector 210 may analyze the tone of the user's voice, as compared to a known baseline of the user's voice, determine that the user is yelling or agitated, and thus activate solo recording mode.
- the event detector 210 may determine if the alert system engine 200 should activate solo recording mode or both solo recording mode and broadcast mode. The determination may be based on the input received. The determination may be based on the type of event. The event detector may determine the type of event based on data received from the sensor array 208.
- the alert system engine 200 may include a sensor array 208 for receiving and managing data received from sensors.
- the sensors may be part of the device executing the alert system engine 200, or a sensor may be part of another device communicating the sensor data to the sensor array 208 by way of the transceiver 214.
- the sensor array 208 may receive data from sensors such as a microphone 202, a GPS 204, and a heart rate sensor 206.
- the microphone 202 may be activated to capture audio for the audio stream when solo recording mode is activated.
- the heart rate sensor may be a sensor on a smartwatch. The smartwatch communicates the heart rate sensor data to the sensor array 208 of the alert system engine 200 by way of the transceiver 214.
- the GPS 204 may obtain geolocation data for the device executing the alert system engine 200 and determine the location of the device.
- the heart rate sensor 206 is an example of a sensor providing sensor data to the alert system engine 200 that is utilized by the event detector 210 to determine the type of event.
- the alert system engine 200 may be connected to a storage device 220.
- the alert system engine 200 may store data on the storage device 220, including the audio stream and video stream, the sensor data, and the geolocation data.
- the alert system engine 200 may retrieve the data stored on the storage device 220, such as if the audio stream is requested to provide information to an EMT about what the user may have experienced.
- the alert system engine 200 may include an output manager 216.
- the output manager 216 receives information from the event detector 210, the sensor array 208, and the broadcast manager 212.
- the output manager 216 may collect data and generate information to be displayed, such as on a display screen of the device.
- Display information generated by the output manager 216 may include a map indicating the current location of the device based on the geolocation data from the GPS 204 and chat messages received from contacts who have received an alert notification.
- the output manager 216 may generate alert messages to be transmitted when broadcast mode is activated.
- the alert system engine 200 may include a contacts manager 218.
- the broadcast manager 212 may interface with the contacts manager 218 to retrieve a set of contacts for transmitting an alert notification to.
- the contacts manager 218 may store multiple sets of contacts the user has designated as the contacts the user wishes to be notified, depending on the event.
- the contacts manager 218 may determine the set of contacts by collecting data such as the type of event from the event detector 210 and sensor data from the sensor array 208. For example, the event may indicate that only contacts outside a distance radius should be contacted.
- the event detector 210 may indicate this type of event to the contacts manager 218.
- the contacts manager 218 may get geolocation data from the sensor array 208.
- the contacts manager 218 may determine the set of contacts outside the distance radius for the broadcast manager 212 to utilize for transmitting an alert notification.
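The distance-radius selection might look like the following sketch, using great-circle (haversine) distance; the contact record shape (a dict with latitude and longitude) is an assumption:

```python
import math

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two latitude/longitude points."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2)
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def contacts_outside_radius(user_loc, contacts, radius_km):
    """Return only the contacts farther than radius_km from the user."""
    lat, lon = user_loc
    return [c for c in contacts
            if haversine_km(lat, lon, c["lat"], c["lon"]) > radius_km]
```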
- the alert system engine 200 may include a broadcast manager 212.
- the broadcast manager 212 receives the activation.
- the broadcast manager 212 may interface with the contacts manager 218 to determine the contacts to transmit an alert notification to.
- the broadcast manager 212 may interface with the output manager 216 for a message to include with the alert notification.
- the broadcast manager 212 may send the alert notification to the transceiver 214 for transmission to the determined contacts through a network 226. From the network 226, the transceiver 214 may receive acknowledgement responses from the contacts that received the alert notification. The responses may be sent to the broadcast manager 212 to track the contacts that have responded.
- the broadcast manager 212 may provide information about the contacts that have responded to the output manager 216.
- the output manager 216 may generate a display including the information about the contacts who have responded.
- the transceiver 214 may receive additional information from the contact, such as the current geolocation of the contact.
- the information is managed by the broadcast manager 212 for each contact and provided to the output manager 216 to display, such as by generating a map with the location of the contacts.
- FIG. 3 illustrates an example process 300 when solo recording mode is activated, as according to some embodiments.
- the alert system may perform an operation 305 wherein the alert system receives an input from the user to activate solo recording mode.
- the alert system may perform an operation 310 to activate one or more sensors of the mobile device.
- a microphone of the device may be activated to capture audio into an audio stream.
- the audio may include the voices of the user and of people around the user, as well as the environmental sounds around the user.
- the device may activate a camera to capture video into a video stream.
- the camera may be a camera on the mobile device or a camera communicatively connected to the mobile device.
- the camera may be a camera on a device in the possession of a contact of the user.
- the camera may also be a network connected camera such as a security camera at a store or a residence.
- the alert system may connect to external security systems, such as sirens or other security features found in cars and buildings, that may be activated in addition to the audio recording.
- a wearable camera on the alert sender's body may be activated to capture still shots or video of an emergency interaction.
- the alert system may perform an operation 315 to utilize a global positioning system (GPS) to collect geolocation data contemporaneously with the audio stream.
- the sensors may capture data contemporaneously with the audio and/or video stream.
- Sensors may include a heart rate sensor, an accelerometer, a magnetometer, a gyroscope, an optical sensor, an ultrasonic sensor, an inertial measurement unit, a multi-axis sensor, or a contact pressure sensor, etc.
- a sensor may be configured to detect physiological indications and may be adapted to detect a person's heart rate, skin temperature, brain wave activities, alertness (e.g., camera-based eye tracking), activity levels, or other physiological or biological data.
- a device may include sensors such as a gyroscope or accelerometer and the device may utilize these sensors to determine the user is in a physical altercation or has been injured and is not able to move.
- the alert system may perform a decision 320 to utilize the sensors of the device to determine the type of event.
- the data from the sensors and the trigger indicating an event for activation may be combined to determine the type of event.
- the device may include sensors to capture physiological data which is synced with the audio stream recording. When the audio recording is played back, such as by an emergency medical technician (EMT), the EMT may observe the physiological changes of the user during the audio recording and utilize the information to diagnose and treat the user.
- the decision 320 may determine the type of event, and based on the type of event determine if the alert system should be in solo recording mode or activate broadcast mode. If the alert system determines to stay in solo recording mode, the alert system may perform operation 325 for the mobile device to display an option to enter broadcast mode, the status of the audio stream recording, and a map with the current location of the mobile device as determined by the geolocation data. Alternatively, the decision 320 may determine the type of event and determine the alert system should activate broadcast mode at operation 330. The activation of broadcast mode is detailed in FIG. 4.
- the alert system may perform an operation 335 to store the audio stream, geolocation data, and sensor data.
- the audio stream and geolocation data may be synced and stored on the device, such that when the audio recording is played, the geolocation data may be displayed on a map to show the location of the device during the progression of the audio recording.
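Syncing could be sketched by storing each geolocation fix with its offset from the recording start, so playback at any position can look up the device's location at that moment; the data layout is an assumption:

```python
import bisect

class SyncedTrack:
    """Geolocation fixes keyed by offset into the audio recording."""

    def __init__(self):
        self._offsets = []  # seconds since recording start, ascending
        self._fixes = []    # (lat, lon) at each offset

    def add_fix(self, offset_s, lat, lon):
        self._offsets.append(offset_s)
        self._fixes.append((lat, lon))

    def location_at(self, playback_s):
        """Most recent fix at or before the playback position, if any."""
        i = bisect.bisect_right(self._offsets, playback_s) - 1
        return self._fixes[i] if i >= 0 else None
```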
- the audio stream, geolocation data, and sensor data may be transmitted and stored on a cloud service.
- when solo recording mode is activated, only the user may cancel solo recording mode.
- the user may set a predetermined timer for solo recording mode, wherein the countdown timer begins when solo recording mode is activated. If the user does not cancel solo recording mode before the countdown timer expires, the alert system may automatically contact an emergency service or a contact as specified by the user.
- FIG. 4 illustrates an example process 400 when broadcast mode is activated, as according to some embodiments.
- the option to enter broadcast mode is available to the user. This may be presented as a button on the display screen of the device or by activation through commands and motions as described above.
- the user may optionally press and/or hold a button, wherein the button may be a physical button on the device or a virtual button on the screen.
- the alert system may perform an operation 410 to receive an input from the user to activate broadcast mode.
- the alert system may perform an operation 415 to collect and analyze data from sensors of the mobile device and determine the type of event. Based on the type of event, operation 415 may determine what type of broadcast event to perform and what information should be included in the alert notification.
- broadcast mode may be automatically activated from solo recording mode. Both at activation of solo recording mode and while solo recording mode is active, sensor data may be collected and analyzed to determine if broadcast mode should be activated.
- the alert system may perform an operation 405 to receive an activation command to activate broadcast mode automatically from a determination during solo recording mode.
- the alert system may monitor the sensors of the mobile device while solo recording mode is active.
- the alert system may receive sensor data that the situation the user is experiencing has escalated and thus broadcast mode is automatically activated to alert others that the user is in need of assistance. For example, the user may have activated solo recording mode when a stranger begins speaking to them. The stranger may suddenly attack the user. Based on data such as a sharp spike in heart rate for the user and erratic movement of the mobile device, the alert system may determine the situation has escalated and the user is in need of assistance, thus automatically activating broadcast mode.
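That escalation check, a sharp heart-rate spike coinciding with erratic device movement, might be sketched as follows; the 1.3x spike factor and the motion-variance threshold are assumed values:

```python
import statistics

def should_escalate(hr_samples_bpm, accel_magnitudes,
                    spike_factor=1.3, motion_stdev=3.0):
    """True when a heart-rate spike coincides with erratic movement."""
    if len(hr_samples_bpm) < 2:
        return False
    spiked = hr_samples_bpm[-1] >= hr_samples_bpm[0] * spike_factor
    erratic = statistics.pstdev(accel_magnitudes) >= motion_stdev
    return spiked and erratic
```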
- an event notification is broadcast over a network to contacts of the user.
- a user may have a set of contacts stored in their mobile device.
- a contact is a person the user has an association with and has provided identifying information for communicating with, such as a phone number or email address.
- the user may set the alert system to utilize the set of contacts already stored on the mobile device.
- the user may manually select or input the set of contacts the user wishes for the alert system to utilize for sending alert notifications.
- when a user adds a contact to the set, the alert system may send a message to the contact alerting them to the addition.
- if a user removes a contact from the set of contacts who may receive alerts, that removed contact may also receive a message.
- Broadcast mode may transmit an alert notification to people who are not a contact of the user. This may include bystanders or people who share a common characteristic with the user.
- the message may contain information for the contact to install the alert system on the contact's devices.
- the alert system may perform an operation 420 to select a set of contacts to send an alert notification.
- Operation 420 may determine the set of contacts based on the geolocation of the user and the type of event, either determined or indicated by the user.
- the contacts to which the alert notification is sent may be a predetermined group of people the user has selected for receiving notifications, such as an emergency contact list.
- the set of contacts may be determined by location, such as only contacting people who have a profile location in the same state as the user.
- the set of contacts may be a group of people sharing a common characteristic or attribute. For example, a user of a certain race or religion may fear being unfairly stereotyped in certain situations, and thus they may have the alert system configured to send an alert notification to others of a similar race or religion.
- the alert system may perform an operation 425 to broadcast an alert notification to the members of the determined set of contacts.
- the alert notification may include information about the event the user is experiencing, a message about what the user wishes for the contact to do, the geolocation of the user, and the audio stream.
- a contact may acknowledge the notification. This may include an acceptance of the alert notification so that the contact may monitor the situation that the user is involved in. This may include responding with a message, such as informing the user that the contact is coming to the user's assistance.
- the contact may also perform an action such as contacting an emergency service or contacting additional people, for which a message is sent to the user informing them that the contact has performed such an action.
- the alert system may perform an operation 430 to receive acknowledgment responses from at least one contact.
- the response may include the geolocation data for the contact.
- the response may include a message from the contact.
- the alert system may perform an operation 435 to display the location of the contacts which have responded.
- the display screen of the user's mobile device may display a map.
- the alert system may use the geolocation data obtained from the GPS to determine the location of the user and display the user's current location on the map.
- the alert system may use the geolocation data received in the contact responses to determine the location of the contact and display the location of the contact on the map.
- the alert system may continually obtain geolocation data from the GPS of the mobile device and the contacts to update the map with the current location of the user and the contacts.
- the alert system may perform an operation 440 to transmit the audio stream and updates about the user to the contacts which have responded.
- the updated data may include a message from the user, changes to the situation as indicated by sensors of the mobile device, an update to the current location of the user, or a message indicating the user is no longer in danger and assistance is no longer needed.
- the alert system may perform a decision 445 to determine if the number of acknowledged contacts has met a predetermined threshold for the number of contacts that should respond for a situation.
- the threshold may be predetermined by the user, such as requiring at least five contacts to respond for any situation.
- the threshold may be determined by the type of situation. For example, if the user is not feeling well, a threshold of at least two people may be set; if the user is being attacked, a threshold of at least ten people may be set.
- the alert system may have a timer limiting how long contacts are allowed to respond and meet the threshold before additional actions are taken. Additional actions may be taken if certain criteria are met.
- a timer allowing time for contacts to respond may be provided. If the number of responses is not met, then additional contacts may be notified. As the timer progresses, the alert system may receive sensor data indicating a change in the situation, where the alert system may determine an alert notification should be sent to a larger number of contacts or to attempt to send alert notifications to recipients outside the user's known contacts. Similarly, this may be based on a user setting or the type of situation. For example, the user may want to give their contacts three minutes to respond. In immediate situations, such as being attacked, there may not be time to wait very long for a set of contacts to respond, thus after a minute, if the threshold has not been met, the alert system may notify additional contacts.
- the alert system may perform an operation 450 to notify additional contacts if the threshold for contact responses has not been met within the predetermined time.
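The decision 445 / operation 450 escalation logic could be sketched as a simple per-event policy check. The threshold and timer values below are illustrative assumptions drawn from the examples above, not values specified by the patent:

```python
# Hypothetical sketch of decision 445 / operation 450: per-event response
# thresholds and timers, escalating to additional contacts when too few
# contacts respond before the timer expires.

EVENT_POLICY = {
    "not_feeling_well": {"threshold": 2, "timer_s": 180},
    "being_followed":   {"threshold": 5, "timer_s": 120},
    "attack":           {"threshold": 10, "timer_s": 60},
}

def should_escalate(event_type, acknowledged, elapsed_s):
    """True when the timer has expired without enough acknowledgements,
    meaning additional contacts should be notified (operation 450)."""
    policy = EVENT_POLICY[event_type]
    timer_expired = elapsed_s >= policy["timer_s"]
    threshold_met = acknowledged >= policy["threshold"]
    return timer_expired and not threshold_met
```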
- the alert system, utilized on both the user's mobile device and the contact's mobile device, may be configured to allow a notification to be sent to a set of contacts of a contact of the user.
- the set of contacts of a contact may include members not known to the user.
- Operation 450 may send a notification to the set of contacts of the contact as a means to reach more people to assist the user.
- FIG. 5 illustrates an example process 500 for determining the contacts for an alert notification when broadcast mode is activated, according to some embodiments.
- Process 500 may illustrate the process performed by the alert system to determine which contacts to broadcast an alert notification to and at what time to broadcast.
- the alert system may determine the type of event being experienced by the user of the alert system. If the alert system determines a potential event may be occurring, such as the user feels that someone may be following them, then the alert system may perform operation 510 and send an alert notification to all the members of the user's contact set.
- the alert notification may include a message that the user is not in immediate danger, but wishes to have the contacts monitor their status.
- the decision 505 may determine the user is experiencing an emergency event.
- the alert system may perform operation 515 to remove any contact within a determined distance radius.
- a user may be in an emergency situation with a member of their contacts. The person may either be experiencing the same situation as the user, and thus sending them an alert notification would be wasted, or the person may be the one causing harm to the user, and thus it would not be helpful for them to receive an alert notification.
- the alert system may be configured such that alert notifications are broadcast to contacts who have a current location outside a predetermined radius distance from the user's current location. Broadcasting the notification to people outside the radius distance may help ensure the notification reaches people who have the ability to assist. The radius distance may be adjusted based on the location, type of event, and circumstances.
- the radius distance may be relatively small so as to only avoid people in the immediate vicinity of the user.
- the distance may be larger when the user is outside and there is less concentration of people, such as when the user is walking on a street alone at night.
- the alert system on the mobile device of the user may first broadcast a location request to all of the user's contacts which utilize the alert system.
- the alert system on the devices of the contacts may collect current geolocation data for the device and transmit the geolocation data to the user's mobile device.
- the alert system on the user's mobile device may determine which devices may be in the immediate vicinity of the user and which contacts may be within a predetermined distance radius of the user.
- the alert system may remove these contacts from the set of contacts to broadcast the alert notification to.
- the alert system may prioritize sending the alert notification to contacts which have reported a location closer to the user. As time progresses, the alert system may send the alert notification to contacts at further distances.
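The radius exclusion (operation 515) and distance prioritization described above could be sketched as below. Planar distance is used for brevity; a real implementation would use a geodesic distance such as the haversine formula on latitude/longitude. The data shapes are assumptions for illustration:

```python
# Hypothetical sketch of operation 515: drop contacts inside the exclusion
# radius (possibly involved in, or causing, the situation), then order the
# remaining contacts nearest-first for prioritized notification.
import math

def prioritize_contacts(user_loc, contact_locs, exclusion_radius):
    """user_loc: (x, y); contact_locs: {name: (x, y)}.
    Returns contact names outside the radius, nearest first."""
    def dist(item):
        name, (x, y) = item
        return math.hypot(x - user_loc[0], y - user_loc[1])

    outside = [item for item in contact_locs.items()
               if dist(item) > exclusion_radius]
    return [name for name, _ in sorted(outside, key=dist)]
```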
- the alert system may be configured to broadcast an alert notification to bystanders.
- a bystander may be a person within a relative vicinity of the user.
- a bystander may be a person not previously known to the user.
- a bystander may be a person sharing a similar characteristic or trait with the user as determined by a comparison of the profiles for the user and the bystander.
- a user of the alert system may wish to have help reach them as quickly as possible, such as if they are being attacked by someone.
- the alert system may broadcast an alert notification to mobile devices executing the alert system within a relative distance of the user. These may be bystanders and people passing near the event.
- the bystanders are alerted that a person is in need of assistance and the bystander may choose to assist, such as by going to the person's aid or contacting an emergency service.
- a bystander who has opted into the alert system as willing to assist others may have their physiological data or audio data collected. This data may be combined with the data collected by the alert system of the user's device to provide more information about the situation. The data may be used to determine if the distance radius for alert notification broadcast is large enough. For example, the alert system may determine the bystander is in an agitated state and thus may also be involved in the situation. Thus, using the geolocation data of the bystander, the alert system may determine the broadcast alert notification should be sent to those at a farther distance.
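The radius adjustment described in the preceding paragraph could be sketched as follows. The agitation heuristic (heart rate above a fixed threshold) and the growth factor are assumptions for illustration only:

```python
# Hypothetical sketch: grow the broadcast radius when opted-in bystanders
# near the user appear agitated, since agitated bystanders may themselves
# be involved in the situation.

def adjust_radius(radius, bystander_heart_rates, agitation_bpm=110):
    """Return a possibly enlarged broadcast radius."""
    agitated = sum(1 for bpm in bystander_heart_rates if bpm >= agitation_bpm)
    if agitated:
        # Expand proportionally to the number of agitated bystanders.
        return radius * (1 + 0.5 * agitated)
    return radius
```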
- the alert system may perform operation 520 to broadcast an alert notification to the determined set of contacts.
- the alert system may then perform operation 525 to receive responses from the contacts for the alert notification.
- the alert system may perform operation 555 to add the contact to a list of active responders.
- the alert system continues to add the contacts to an active group.
- the alert system may then perform an operation 560 to provide status updates of the user and any changes to the location of the user.
- a timer may be running to determine if a wider broadcast needs to be made because enough responses have not been received. Depending on the event, the timer may be shorter for more dire situations.
- the alert system may perform decision 530 to determine if the threshold timer has expired. If the threshold timer has not expired, the alert system may continue to perform operation 525 and wait to receive responses.
- when the decision 530 determines the timer has expired, the alert system may perform decision 535 to determine if the response threshold was met. Similar to the threshold timer, the response threshold may depend on the type of event. The more serious the situation, the higher the threshold value may be for the number of people that the user wishes to respond. The user may also customize the threshold for the event.
- for a minor event, the response threshold may be one or two people. If the user is being attacked, the user may wish for ten or more people to respond. If decision 535 determines the response threshold was met, the alert system may perform operation 545 and end the broadcast of alert notifications.
- the alert system may perform operation 540 to broadcast the alert notification to a wider group of contacts.
- the alert system may perform operation 550 to broadcast to contacts of the user's contacts.
- a contact of a contact may respond and assist the user.
- Both operations 540 and 550 may broadcast the alert notification and then return to operation 525 to receive responses.
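The process 500 loop (operations 520-550) can be sketched as a tiered broadcast: notify the current group, collect responses until the timer expires, and either end the broadcast or widen to the next group. The tier structure and callback signature are assumptions for illustration:

```python
# Hypothetical sketch of the process 500 broadcast loop: tiers are contact
# groups ordered narrowest-first (e.g., nearby contacts, all contacts,
# contacts-of-contacts per operation 550). collect_responses(group, timer_s)
# returns the responders gathered while the threshold timer runs.

def broadcast_loop(tiers, collect_responses, threshold, timer_s):
    responders = []
    for group in tiers:
        # operations 520/525: broadcast and wait for responses.
        responders.extend(collect_responses(group, timer_s))
        # decision 535: enough responders -> operation 545, end broadcast.
        if len(responders) >= threshold:
            return responders
        # otherwise fall through to a wider group (operations 540/550).
    return responders
```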
- FIG. 6 illustrates an example user interface for solo recording mode on a mobile device 600 in accordance with some embodiments.
- Mobile device 600 may be a device such as a smartphone, cellular phone, personal digital assistant, fitness tracker, or portable music player.
- the mobile device 600 may display on a touch display interface 605 a graphical user interface (GUI) for the alert system.
- the alert system may be designed such that it is simple for a user to activate solo recording mode without having to perform a sequence of steps.
- the GUI may display a single virtual button 610 with instructions 615 to activate the solo recording mode.
- the GUI may have a virtual button 610 labeled "HELP", with instructions 615 for the user to "PRESS + HOLD" the virtual button 610 to activate the solo recording mode.
- FIG. 7 illustrates an example user interface for activated solo recording mode on a mobile device 700 in accordance with some embodiments.
- the mobile device 700 may display on a touch display interface 705 a GUI for the activated solo recording mode of the alert system.
- the GUI may include several virtual buttons for the user to interface with while the alert system is in solo recording mode.
- the GUI may include an "END" button 710 for the user to interface with when the user wishes to end the solo recording mode.
- the GUI may include a phone button 715 for the user to place a phone call while in solo recording mode.
- the phone button 715 may be customized in different ways to the user's preferences, such as immediately placing a 911 call, immediately placing a call to the first person listed as an emergency contact, or opening up the user's list of contacts so they may decide who to call.
- the GUI may include a broadcast button 720 to enter broadcast mode. If the user feels the situation has escalated and they need to alert people of the situation, the user may activate broadcast mode by tapping the broadcast button 720 while in solo recording mode.
- the GUI may include a map 725 displaying the current location of the mobile device from the geolocation data obtained from the GPS.
- FIG. 8 illustrates an example user interface for the activated broadcast mode on a mobile device 800 in accordance with some embodiments.
- the mobile device 800 may display on a touch display interface 805 a GUI for the activated broadcast mode of the alert system.
- the GUI may include several virtual buttons for the user to interface with while the alert system is in broadcast mode.
- the GUI may include an "END" button 810 for the user to interface with when the user wishes to end the broadcast mode.
- the GUI may include a phone button 820 for the user to place a phone call while in broadcast mode.
- the phone button may operate similarly to the phone button during solo recording mode.
- the GUI may include a mute button 815 to mute or un-mute the contacts which have acknowledged the alert notification.
- the contacts that have responded to the alert notification may have the capability to speak with the user and one another.
- the GUI may include a map 825 to display the current location of the mobile device and the location of any contacts that may have received or acknowledged the broadcast alert notification.
- FIG. 9 illustrates an example user interface for the activated broadcast mode on a mobile device 900 in accordance with some embodiments.
- the mobile device 900 may display on a touch display interface 905 a GUI for the activated broadcast mode of the alert system.
- the GUI may include an "END" button 910 for the user to interface with when the user wishes to end the broadcast mode.
- the GUI may include a call button 915, such as "CALL 911", for the user to place an emergency call.
- the call button may be customized to place a call to any number the user chooses.
- the GUI may include a broadcast bar 920 containing names or images of the contacts which have received and acknowledged the broadcast alert notification.
- the user may click on an image or name in the broadcast bar 920 to start an interaction with the contact associated with the image or name.
- the interaction may be a text message, a chat window, or a phone call.
- the GUI may include a map 925 to display the current location of the mobile device and the location of any contacts that may have received or acknowledged the broadcast alert notification.
- the map 925 may display the location of an acknowledged contact 930 such that the user may track the location of the acknowledged contact 930.
- the GUI may include a chat button 935 to start a group chat session with all the contacts which have received and acknowledged the broadcast alert notification.
- the GUI may include a video window 940 to display a captured video stream of the situation.
- the video stream displayed in the video window 940 may be captured from a camera on the mobile device or a camera communicatively connected to the mobile device.
- FIG. 10A illustrates an example user interface for the alert system on a smartwatch 1000 in accordance with some embodiments.
- the smartwatch 1000 may execute a smartwatch specific application for the alert system.
- the smartwatch 1000 may include a microphone allowing for the smartwatch to capture the audio stream when solo recording mode is activated.
- the microphone of smartwatch 1000 may receive commands, such as a voice command to activate solo recording mode or broadcast mode.
- the smartwatch 1000 may include a GPS device, allowing the smartwatch 1000 to obtain geolocation data when solo recording mode is activated.
- the smartwatch 1000 may have capabilities to connect to a network, such as a cellular network, allowing the smartwatch to send a broadcast alert notification.
- the smartwatch 1000 may not have network connectivity capabilities and thus transmits the commands to a connected mobile device to execute the broadcast of an alert notification.
- the smartwatch 1000 may display on a touch display interface 1005 a GUI for a user of the alert system to activate solo recording mode or broadcast mode.
- the smartwatch 1000 GUI may include a virtual button 1010 to activate solo recording mode or broadcast mode. Similar to virtual button 610, the user may tap or tap-and-hold the virtual button 1010 to activate.
- the smartwatch 1000 may include a physical button 1015.
- the physical button 1015 may receive a sequence pattern of button pushes to activate either solo recording mode or broadcast mode.
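The sequence-pattern activation via physical button 1015 could be sketched as matching short/long presses against configured patterns. The specific patterns and the long-press duration are illustrative assumptions:

```python
# Hypothetical sketch: classify a sequence of button press durations into an
# activation mode by matching against configured short/long patterns.

PATTERNS = {
    ("short", "short", "long"): "solo_recording",
    ("long", "long", "long"): "broadcast",
}

def classify_presses(durations_s, long_press_s=0.8):
    """Map press durations (seconds) to a mode, or None if no match."""
    sequence = tuple("long" if d >= long_press_s else "short"
                     for d in durations_s)
    return PATTERNS.get(sequence)
```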
- the user wearing a communicatively connected device such as a smartwatch 1000, may receive alert notification acknowledgements on the smartwatch 1000.
- the smartwatch 1000 may display information indicating a contact has responded to the alert notification.
- the smartwatch 1000 may provide haptic feedback (e.g., a vibration or physical pulse) to the user when a contact has responded to the alert notification. For example, when the user is experiencing an emergency situation, they may not be able to look at the display screen of their mobile device.
- the smartwatch 1000, the mobile device, and another communicatively connected device may provide a unique haptic feedback (e.g., a unique pattern of vibrations) to the user when a contact has acknowledged the alert notification, thus informing, and reassuring, the user that help is responding even though the user cannot view their mobile device.
- FIG. 10B illustrates an example user interface for the alert system on a smartwatch 1030 in accordance with some embodiments.
- Smartwatch 1030 is an example of a smartwatch worn by a contact of the user running an application for the alert system.
- the smartwatch 1030 has received an alert notification from a user that the user needs assistance.
- the smartwatch 1030 may have a touch sensitive display screen 1035.
- the display screen 1035 may display the alert system application GUI.
- the GUI may display the name of the user 1040 as the person requesting assistance.
- the smartwatch 1030 may play the audio stream received from the user.
- the GUI may display the information 1045 about the audio stream, such as if the stream is a live broadcast and how long the audio stream has been going.
- the information 1045 may display a timer based on when the alert notification was sent, such that the contact is aware of how much time has transpired since the user indicated they needed assistance.
- the GUI may include a virtual button 1050 for the contact to accept the notification and send acknowledgment to the user that the notification has been received.
- the GUI may present additional options to the contact, such as contacting an emergency service or sending a message to the user that the contact is coming to assist.
- the GUI and actions described related to FIG. 10B are not limited to a smartwatch and may be performed on any network connected electronic device.
- FIG. 10C illustrates an example user interface for the alert system on a smartwatch 1060 in accordance with some embodiments.
- Smartwatch 1060 is an example of a smartwatch worn by a contact of the user running an application for the alert system.
- the smartwatch 1060 has received an alert notification from a user that the user needs assistance.
- the contact has accepted the alert notification, such as by tapping a virtual button 1050 in FIG. 10B.
- the smartwatch 1060 may have a touch sensitive display screen 1065 displaying a GUI for the alert system.
- the GUI may display a map 1070 on the display screen 1065 of the smartwatch 1060.
- the map 1070 may display the current location of the user requesting assistance.
- the map 1070 may display the current location of the contact relative to the user requesting assistance.
- the map 1070 may display directions to assist the contact in reaching the user requesting assistance.
- FIG. 11 illustrates an example user interface for a received alert notification on a mobile device 1100 in accordance with some embodiments.
- a contact of the user requesting assistance (the requester) may execute the alert system on their mobile device 1100.
- the mobile device 1100 may display on a touch display interface 1105 a GUI for a receiver of an alert notification of the alert system.
- the receiver may choose to enter chat mode or the alert system may enter chat mode automatically.
- the GUI for the alert system of a receiver may have a static status portion 1110 providing the status of the requester.
- the status portion 1110 may include the name 1115 of the requester, an image 1120 of the requester, and the address 1125 of the requester.
- the receiver may select the name or picture of the requester to obtain additional information about the requester, such as social group identification, medical condition, medication requirements, vitals, car make and model, and license to carry a weapon.
- the status portion 1110 may include a display of information 1130 about the audio stream, such as if the stream is a live broadcast and how long the audio stream has been going.
- the audio and video stream may be stored on the receiver's mobile device.
- the receiver may playback the audio and video stream at a later time, such as to provide information to an EMT or evidence to police.
- the status portion 1110 may provide a virtual button 1135 for the receiver to contact additional people to be alerted and assist the requester. Additional people may include additional contacts, contacts of a contact, or people unknown to the user, such as bystanders, that the alert system has determined may assist. The alert system may select people unknown to the user based on analyzing characteristics of those people, such as shared personal characteristics (e.g., race or religion), location proximity, history (e.g., if the person has previously responded to alert notifications), and physiological state.
- the status portion 1110 may include a virtual button 1140 to contact an emergency service or other emergency contact.
- the GUI of the alert system may include a chat window 1145 to provide status updates for the requester and messages exchanged between the requester and other receivers. The chat window 1145 may provide status information 1150 as the alert system receives it, such as the request and location of the requester.
- the chat window 1145 may include messages 1155 from the requester.
- the chat window 1145 may include messages 1160 from other receivers, or contacts of the requester which have acknowledged the alert notification.
- the members of the chat window 1145 may exchange messages, pictures, and video.
- the chat window 1145 may include status information for the requester, such as heart rate, or environmental data, such as other alerts triggered near the requester.
- the chat window 1145 may be changed to a map window by selecting the virtual map button 1165.
- the map window may provide the location of the requester, the receiver, and other receivers.
- the map window may include directions to the location of the requester.
- Data collected during an event where either solo recording mode or broadcast mode are activated may be stored at a network server and storage device, such as a cloud service.
- the data may include the geolocation data of all participants, the audio stream, the video stream, the chat exchanges amongst the participants, the voice chat amongst the receivers, and any sensor data collected.
- the alert system may be linked to a third-party service or application.
- the alert system may transmit the recorded data to a service over the network.
- Services may include a health monitoring service, a translation service, an attorney, or a community leader.
- the alert system may be connected to similar applications within the mobile device or an application that stores the recorded data to a cloud service to preserve the recorded data should the mobile device be damaged or lost.
- a communication channel may open allowing for real-time two-way communication with the service. The real-time two-way communication may include audio and video communication.
- a user who does not speak English may initiate the alert system to connect to a translation service.
- a two-way communication channel is opened between the user and the translation service.
- the user may then speak in their native language and the translation service may translate to English for someone the user is conversing with, such as a police officer.
- a user of the alert system may carry with them an instrument for self-defense. This may include a baton, a knife, a Taser, pepper spray, or a handgun.
- the instrument may be equipped with sensors, such as an accelerometer or gyroscope, and a communication device, such as a Bluetooth transmitter.
- when the user draws or points the instrument, the motion sensors on the instrument may detect this movement and prompt the communication device to send a message.
- the alert system on the mobile device, being communicatively connected to the instrument's communication device, may receive this indication and automatically activate solo recording mode or broadcast mode, as warranted by the situation.
- for example, the user may carry a Taser. The user feels threatened by someone approaching or following them. When the user motions such that the Taser is being pointed, the motion sensors of the Taser may detect this motion and signal to the communication device, which then transmits to the mobile device. The alert system, receiving the indication that the Taser is being pointed, activates solo recording mode.
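The instrument-side trigger in the Taser example could be sketched as a threshold check on orientation and motion that sends an activation message over the paired link. The sensor features, thresholds, and message format are assumptions made for illustration:

```python
# Hypothetical sketch: detect that a self-defense instrument has been raised
# and pointed, then send an activation message to the paired mobile device
# (e.g., over Bluetooth) so the alert system can start solo recording mode.

def instrument_pointed(pitch_deg, accel_magnitude,
                       pitch_threshold=60.0, accel_threshold=8.0):
    """True when orientation and motion suggest the instrument is drawn."""
    return pitch_deg >= pitch_threshold and accel_magnitude >= accel_threshold

def on_sensor_sample(pitch_deg, accel_magnitude, send_to_phone):
    """Forward an activation message when the instrument appears pointed."""
    if instrument_pointed(pitch_deg, accel_magnitude):
        send_to_phone({"event": "instrument_pointed",
                       "mode": "solo_recording"})
```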
- FIG. 12 illustrates a flow chart showing a technique 1200 for executing a solo recording mode for an emergency alert system, in accordance with some embodiments.
- the technique 1200 includes an operation 1202 to receive an activation trigger for an indication of an event on a mobile device.
- An activation trigger may include a selection of a virtual button on a touch screen, a sequence of button presses, an eye gaze or eye movement, a movement of the mobile device, a movement or gesture with a worn device, or a voice command.
- the user input to trigger the event may be a voice command, with the trigger activated based on the tone of the voice used in the voice command or based on the words used in the voice command.
- the trigger is a movement of the mobile device.
- the trigger is a combination of a movement of the mobile device and a tone of the user's voice.
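The trigger evaluation described above (voice tone, trigger words, device movement, or a combination) could be sketched as a combined decision. The normalized signal scales, trigger words, and thresholds are illustrative assumptions:

```python
# Hypothetical sketch of the operation 1202 activation trigger: activate on
# an explicit trigger word, a stressed voice tone, a hard device movement,
# or a moderate combination of movement and tone. Signals are assumed to be
# normalized to the range 0..1 by upstream processing.

TRIGGER_WORDS = {"help", "emergency"}

def evaluate_trigger(words, voice_stress, shake_intensity,
                     stress_threshold=0.7, shake_threshold=0.8):
    if TRIGGER_WORDS & set(w.lower() for w in words):
        return True
    if voice_stress >= stress_threshold or shake_intensity >= shake_threshold:
        return True
    # Combined signal: each alone is below threshold, together they trigger.
    return (voice_stress + shake_intensity) >= 1.2
```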
- the technique 1200 includes an operation 1204 to obtain geolocation data from a global positioning device of the mobile device.
- a GPS may provide geolocation for the current location of the mobile device.
- the technique 1200 includes an operation 1206 to obtain sensor data from one or more sensors of the mobile device.
- the sensor data may include data obtained from an accelerometer, a gyroscope, or a camera.
- the sensor data may be obtained from a sensor of a device communicatively connected to the mobile device.
- the user may wear a smartwatch which has a heart rate sensor.
- the alert system, through the mobile device, may receive the heart rate sensor data from the communicatively connected device.
- the technique 1200 includes an operation 1208 to determine the location of the mobile device based on the obtained geolocation data.
- the technique 1200 includes an operation 1210 to activate a microphone on the mobile device to capture an audio stream.
- the audio stream may include voices and sounds around the mobile device.
- the technique 1200 includes an operation 1212 to determine a type of event based on the sensor data and the activation trigger for indicating the event.
- the technique 1200 includes an operation 1214 to store the audio stream, the sensor data, and the geolocation data in a storage device of the mobile device.
- the technique 1200 includes an operation 1216 to display, on a display device of the mobile device, a message based on the type of event.
- the display may include a message about an emergency service or contacts to notify based on the determined type of event.
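Operations 1202-1216 can be stitched together as a single pipeline. Each step below is a stub callback; a real implementation would wrap platform APIs for GPS, the microphone, and local storage, none of which are specified here:

```python
# Hypothetical sketch of technique 1200's solo recording flow, with each
# operation delegated to an injected callback for illustration.

def solo_recording_mode(trigger, get_geolocation, get_sensor_data,
                        start_audio_capture, classify_event, store, display):
    location = get_geolocation()                        # operations 1204/1208
    sensor_data = get_sensor_data()                     # operation 1206
    audio = start_audio_capture()                       # operation 1210
    event_type = classify_event(sensor_data, trigger)   # operation 1212
    store({"audio": audio, "sensors": sensor_data,
           "location": location})                       # operation 1214
    display(f"Event detected: {event_type}")            # operation 1216
    return event_type
```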
- the technique may include an operation to transmit a notification of the event to a set of contacts, wherein the notification includes the geolocation data, the location of the mobile device, and wherein the set of contacts are contacts predetermined by the user for receiving event notifications based on the type of event.
- the technique may include an operation to automatically transmit the notification based on the type of event.
- the technique may include operations to receive, from at least one member of the set of contacts, an acknowledgement of the notification and display, on the display device of the mobile device, the contacts from which an acknowledgement has been received.
- the notification of the event may include the audio stream and a first map generated using the geolocation data.
- the technique may include operations to display, on a display device of the mobile device, a second map including a location indicated by the geolocation data of each contact of the set of contacts.
- the set of contacts may be limited to contacts with a location outside a predetermined radius distance of the location of the mobile device, wherein the radius distance is based on the type of event.
- the technique may include operations to obtain, contemporaneously with the audio stream, physiological data about the user from a sensor of the mobile device and store the physiological data in a storage device of the mobile device.
- the technique may include operations to obtain sensor data from a sensor of a device communicatively connected to the mobile device.
- the device communicatively connected to the mobile device is a smartwatch.
- FIG. 13 illustrates generally an example of a block diagram of a machine 1300 upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform, in accordance with some embodiments.
- the machine 1300 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 1300 may operate in the capacity of a server machine, a client machine, or both in server-client network environments. In an example, the machine 1300 may act as a peer machine in peer-to-peer (P2P) (or other distributed) network environment.
- the machine 1300 may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine.
- the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), or other computer cluster configurations.
- Examples, as described herein, may include, or may operate on, logic or a number of components, modules, or mechanisms.
- Modules are tangible entities (e.g., hardware) capable of performing specified operations when operating.
- a module includes hardware.
- the hardware may be specifically configured to carry out a specific operation (e.g., hardwired).
- the hardware may include configurable execution units (e.g., transistors, circuits, etc.) and a computer readable medium containing instructions, where the instructions configure the execution units to carry out a specific operation when in operation.
- the configuring may occur under the direction of the execution units or a loading mechanism.
- the execution units are communicatively coupled to the computer readable medium when the device is operating.
- the execution units may be a member of more than one module.
- the execution units may be configured by a first set of instructions to implement a first module at one point in time and reconfigured by a second set of instructions to implement a second module.
- Machine 1300 may include a hardware processor 1302 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 1304 and a static memory 1306, some or all of which may communicate with each other via an interlink (e.g., bus) 1308.
- the machine 1300 may further include a display unit 1310, an alphanumeric input device 1312 (e.g., a keyboard), and a user interface (UI) navigation device 1314 (e.g., a mouse).
- the display unit 1310, alphanumeric input device 1312 and UI navigation device 1314 may be a touch screen display.
- the machine 1300 may additionally include a storage device (e.g., drive unit) 1316, a signal generation device 1318 (e.g., a speaker), a network interface device 1320, and one or more sensors 1321, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor.
- the machine 1300 may include an output controller 1328, such as a serial (e.g., universal serial bus (USB)), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate or control one or more peripheral devices (e.g., a printer, card reader, etc.).
- the storage device 1316 may include a non-transitory machine readable medium 1322 on which is stored one or more sets of data structures or instructions 1324 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein.
- the instructions 1324 may also reside, completely or at least partially, within the main memory 1304, within static memory 1306, or within the hardware processor 1302 during execution thereof by the machine 1300.
- one or any combination of the hardware processor 1302, the main memory 1304, the static memory 1306, or the storage device 1316 may constitute machine readable media.
- While the machine readable medium 1322 is illustrated as a single medium, the term "machine readable medium" may include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) configured to store the one or more instructions 1324.
- the term "machine readable medium" may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 1300 and that cause the machine 1300 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding, or carrying data structures used by or associated with such instructions.
- Non-limiting machine readable medium examples may include solid-state memories, and optical and magnetic media.
- Specific examples of machine readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks; magneto-optical disks; and CD-ROM and DVD-ROM disks.
- the instructions 1324 may further be transmitted or received over a communications network 1326 using a transmission medium via the network interface device 1320 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.).
- Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone Service (POTS) networks, and wireless data networks (e.g., Institute of Electrical and Electronics Engineers (IEEE) 802.11 family of standards known as Wi-Fi®, IEEE 802.16 family of standards known as WiMax®), IEEE 802.15.4 family of standards, peer-to-peer (P2P) networks, among others.
- the network interface device 1320 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 1326.
- the network interface device 1320 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques.
- the term "transmission medium" shall be taken to include any intangible medium that is capable of storing, encoding, or carrying instructions for execution by the machine 1300, and includes digital or analog communications signals or other intangible media to facilitate communication of such software.
- Example 1 is a system for personal emergency data capture and alerting, the system comprising: a processing complex, the processing complex including one or more processors in a mobile device of a user; and at least one machine readable medium in the mobile device, the at least one machine readable medium including instructions that, when executed, cause the processing complex to control electronic hardware of the mobile device to: receive a trigger for an indication of an event on the mobile device; obtain geolocation data from a global positioning device of the mobile device; obtain sensor data from one or more sensors of the mobile device; determine a location of the mobile device based on the geolocation data; determine a type of event based on the sensor data and the trigger used to indicate the event; store the sensor data and the geolocation data in a storage device of the mobile device; and output an activation indication to the mobile device based on the type of event.
- Example 2 the subject matter of Example 1 includes, instructions to: transmit a notification of the event to a set of contacts, wherein the notification includes the geolocation data, the location of the mobile device, and wherein the set of contacts are contacts predetermined by the user for receiving event notifications based on the type of event.
- Example 3 the subject matter of Example 2 includes, instructions to: receive, from at least one member of the set of contacts, an acknowledgement of the notification; and display, on a display device of the mobile device, the contacts from which an acknowledgement has been received.
- Example 4 the subject matter of Examples 2-3 includes, wherein the notification of the event includes at least one of an audio stream and a video stream, and a first map generated using the geolocation data.
- Example 5 the subject matter of Examples 1-4 includes, instructions to: receive, from at least one member of the set of contacts, geolocation data for the at least one member of the set of contacts; and receive, from at least one member outside the set of contacts, geolocation data for the at least one member outside the set of contacts.
- Example 6 the subject matter of Examples 4-5 includes, instructions to: display, on a display device of the mobile device, a second map including a location indicated by the geolocation data of each contact of the set of contacts.
- Example 7 the subject matter of Examples 1-6 includes, wherein the set of contacts is limited to contacts with a location outside a predetermined radius distance of the location of the mobile device, wherein the radius distance is based on the type of event.
- Example 8 the subject matter of Examples 1-7 includes, instructions to: obtain, contemporaneously with the sensor data, physiological data about the user from a sensor of the mobile device; store the physiological data in a storage device of the mobile device.
- Example 9 the subject matter of Examples 1-8 includes, wherein the user input to trigger the event is a voice command and the trigger is activated based on the tone of the voice used in the voice command.
- Example 10 the subject matter of Examples 1-9 includes, instructions to: obtain sensor data from a sensor of a device communicatively connected to the mobile device.
- Example 11 the subject matter of Examples 3-10 includes, wherein at least one of the mobile device or a device communicatively connected to the mobile device provides haptic feedback when an acknowledgement of the notification is received.
- Example 12 the subject matter of Examples 1-11 includes, wherein the user input to trigger the event is a voice command and the trigger is activated based on the words used in the voice command.
- Example 13 the subject matter of Examples 2-12 includes, instructions to automatically transmit the notification based on the type of event.
- Example 14 the subject matter of Examples 1-13 includes, wherein the trigger is a movement of at least one of the mobile device or a device communicatively connected to the mobile device.
- Example 15 the subject matter of Examples 1-14 includes, wherein the trigger is a combination of a movement of the mobile device and a tone of the user's voice.
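Expressed as code, the event-handling flow recited in Example 1 might look like the following sketch. The function names, sensor keys, and rule thresholds here are hypothetical illustrations chosen for clarity, not part of the claims.

```python
from dataclasses import dataclass

@dataclass
class EmergencyEvent:
    trigger: str        # e.g. "button", "voice", "motion"
    location: tuple     # (latitude, longitude) from the global positioning device
    sensor_data: dict   # readings from the device sensors
    event_type: str     # classification derived below

def classify_event(trigger, sensor_data):
    """Hypothetical rule set: combine the trigger with sensor readings
    to determine a type of event, per Example 1."""
    if trigger == "motion" and sensor_data.get("acceleration_g", 0) > 3:
        return "violent-motion"
    if trigger == "voice" and sensor_data.get("volume_db", 0) > 80:
        return "distress-call"
    return "solo-recording"

def handle_trigger(trigger, gps_fix, sensor_data, storage):
    location = gps_fix                     # determine location from geolocation data
    event_type = classify_event(trigger, sensor_data)
    # store the sensor data and the geolocation data on the device
    storage.append(EmergencyEvent(trigger, location, sensor_data, event_type))
    return f"activate:{event_type}"        # activation indication to the device

storage = []
print(handle_trigger("voice", (47.6, -122.3), {"volume_db": 95}, storage))
# activate:distress-call
```

A real implementation would replace the rule set with whatever trigger and sensor fusion the device supports; the pipeline shape (trigger in, stored event and activation indication out) is the point.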
- Example 16 is at least one non-transitory computer readable medium including instructions for personal emergency data capture and alerting that, when executed by at least one processor, cause the at least one processor to: receive a trigger for an indication of an event on the mobile device; obtain geolocation data from a global positioning device of the mobile device; obtain sensor data from one or more sensors of the mobile device; determine a location of the mobile device based on the geolocation data; determine a type of event based on the sensor data and the trigger used to indicate the event; store the sensor data and the geolocation data in a storage device of the mobile device; and output an activation indication to the mobile device based on the type of event.
- Example 17 the subject matter of Example 16 includes, instructions to: transmit a notification of the event to a set of contacts, wherein the notification includes the geolocation data, the location of the mobile device, and wherein the set of contacts are contacts predetermined by the user for receiving event notifications based on the type of event.
- Example 18 the subject matter of Example 17 includes, instructions to: receive, from at least one member of the set of contacts, an acknowledgement of the notification; and display, on a display device of the mobile device, the contacts from which an acknowledgement has been received.
- Example 19 the subject matter of Examples 17-18 includes, wherein the notification of the event includes at least one of an audio stream and a video stream, and a first map generated using the geolocation data.
- Example 20 the subject matter of Examples 16-19 includes, instructions to: receive, from at least one member of the set of contacts, geolocation data for the at least one member of the set of contacts; and receive, from at least one member outside the set of contacts, geolocation data for the at least one member outside the set of contacts.
- Example 21 the subject matter of Examples 19-20 includes, instructions to: display, on a display device of the mobile device, a second map including a location indicated by the geolocation data of each contact of the set of contacts.
- Example 22 the subject matter of Examples 16-21 includes, wherein the set of contacts is limited to contacts with a location outside a predetermined radius distance of the location of the mobile device, wherein the radius distance is based on the type of event.
- Example 23 the subject matter of Examples 16-22 includes, instructions to: obtain, contemporaneously with the sensor data, physiological data about the user from a sensor of the mobile device; store the physiological data in a storage device of the mobile device.
- Example 24 the subject matter of Examples 16-23 includes, wherein the user input to trigger the event is a voice command and the trigger is activated based on the tone of the voice used in the voice command.
- Example 25 the subject matter of Examples 16-24 includes, instructions to: obtain sensor data from a sensor of a device communicatively connected to the mobile device.
- Example 26 the subject matter of Examples 18-25 includes, wherein at least one of the mobile device or a device communicatively connected to the mobile device provides haptic feedback when an acknowledgement of the notification is received.
- Example 27 the subject matter of Examples 16-26 includes, wherein the user input to trigger the event is a voice command and the trigger is activated based on the words used in the voice command.
- Example 28 the subject matter of Examples 17-27 includes, instructions to automatically transmit the notification based on the type of event.
- Example 29 the subject matter of Examples 16-28 includes, wherein the trigger is a movement of at least one of the mobile device or a device communicatively connected to the mobile device.
- Example 30 the subject matter of Examples 16-29 includes, wherein the trigger is a combination of a movement of the mobile device and a tone of the user's voice.
- Example 31 is a method for personal emergency data capture and alerting, the method comprising: receiving a trigger for an indication of an event on the mobile device; obtaining geolocation data from a global positioning device of the mobile device; obtaining sensor data from one or more sensors of the mobile device; determining a location of the mobile device based on the geolocation data; determining a type of event based on the sensor data and the trigger used to indicate the event; storing the sensor data and the geolocation data in a storage device of the mobile device; and outputting an activation indication to the mobile device based on the type of event.
- Example 32 the subject matter of Example 31 includes, transmitting a notification of the event to a set of contacts, wherein the notification includes the geolocation data, the location of the mobile device, and wherein the set of contacts are contacts predetermined by the user for receiving event notifications based on the type of event.
- Example 33 the subject matter of Example 32 includes, receiving, from at least one member of the set of contacts, an acknowledgement of the notification; and displaying, on a display device of the mobile device, the contacts from which an acknowledgement has been received.
- Example 34 the subject matter of Examples 32-33 includes, wherein the notification of the event includes at least one of an audio stream and a video stream, and a first map generated using the geolocation data.
- Example 35 the subject matter of Examples 32-34 includes, receiving, from at least one member of the set of contacts, geolocation data for the at least one member of the set of contacts; and receiving, from at least one member outside the set of contacts, geolocation data for the at least one member outside the set of contacts.
- Example 36 the subject matter of Examples 34-35 includes, displaying, on a display device of the mobile device, a second map including a location indicated by the geolocation data of each contact of the set of contacts.
- Example 37 the subject matter of Examples 31-36 includes, wherein the set of contacts is limited to contacts with a location outside a predetermined radius distance of the location of the mobile device, wherein the radius distance is based on the type of event.
- Example 38 the subject matter of Examples 31-37 includes, obtaining, contemporaneously with the sensor data, physiological data about the user from a sensor of the mobile device; storing the physiological data in a storage device of the mobile device.
- Example 39 the subject matter of Examples 31-38 includes, wherein the user input to trigger the event is a voice command and the trigger is activated based on the tone of the voice used in the voice command.
- Example 40 the subject matter of Examples 31-39 includes, obtaining sensor data from a sensor of a device communicatively connected to the mobile device.
- Example 41 the subject matter of Examples 33-40 includes, wherein at least one of the mobile device or a device communicatively connected to the mobile device provides haptic feedback when an acknowledgement of the notification is received.
- Example 42 the subject matter of Examples 31-41 includes, wherein the user input to trigger the event is a voice command and the trigger is activated based on the words used in the voice command.
- Example 43 the subject matter of Examples 32-42 includes, automatically transmitting the notification based on the type of event.
- Example 44 the subject matter of Examples 31-43 includes, wherein the trigger is a movement of at least one of the mobile device or a device communicatively connected to the mobile device.
- Example 45 the subject matter of Examples 31-44 includes, wherein the trigger is a combination of a movement of the mobile device and a tone of the user's voice.
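The radius-based contact filtering of Examples 7, 22, and 37 can be sketched with a great-circle distance check. The per-event radii, the contact data shape, and the function names below are assumptions for illustration; the claims leave these details unspecified.

```python
import math

def haversine_km(a, b):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    dlat, dlon = lat2 - lat1, lon2 - lon1
    h = math.sin(dlat / 2) ** 2 + math.cos(lat1) * math.cos(lat2) * math.sin(dlon / 2) ** 2
    return 2 * 6371 * math.asin(math.sqrt(h))

# Hypothetical per-event-type radii in km (the claims say only that the
# radius distance is based on the type of event).
RADIUS_BY_EVENT = {"assault": 0.5, "lost": 5.0}

def eligible_contacts(contacts, device_location, event_type):
    """Per Examples 7/22/37: limit the set of contacts to those with a
    location outside the event-type-dependent radius around the device."""
    radius = RADIUS_BY_EVENT.get(event_type, 1.0)
    return [c for c in contacts
            if haversine_km(c["location"], device_location) > radius]

contacts = [
    {"name": "near", "location": (47.6001, -122.3001)},  # ~13 m away
    {"name": "far", "location": (47.7000, -122.4000)},   # ~13 km away
]
print([c["name"] for c in eligible_contacts(contacts, (47.6, -122.3), "assault")])
# ['far']
```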
- Example 46 is at least one machine-readable medium including instructions that, when executed by processing circuitry, cause the processing circuitry to perform operations to implement any of Examples 1-45.
- Example 47 is an apparatus comprising means to implement any of Examples 1-45.
- Example 48 is a system to implement any of Examples 1-45.
- Example 49 is a method to implement any of Examples 1-45.
- Method examples described herein may be machine or computer- implemented at least in part. Some examples may include a computer-readable medium or machine-readable medium encoded with instructions operable to configure an electronic device to perform methods as described in the above examples.
- An implementation of such methods may include code, such as microcode, assembly language code, a higher-level language code, or the like. Such code may include computer readable instructions for performing various methods. The code may form portions of computer program products. Further, in an example, the code may be tangibly stored on one or more volatile, non-transitory, or non-volatile tangible computer-readable media, such as during execution or at other times.
- Examples of these tangible computer-readable media may include, but are not limited to, hard disks, removable magnetic disks, removable optical disks (e.g., compact disks and digital video disks), magnetic cassettes, memory cards or sticks, random access memories (RAMs), read only memories (ROMs), and the like.
Abstract
Systems and techniques for personal emergency data capture and alerting are described herein. The personal emergency system may receive a trigger for an indication of an event on a mobile device. The system may obtain geolocation data from a global positioning device of the mobile device and sensor data from sensors of the mobile device. The system may determine a location of the mobile device based on the geolocation data and determine a type of event based on the sensor data and the trigger used to indicate the event. The system may store the sensor data and the geolocation data in a storage device of the mobile device and output an activation indication to the mobile device based on the type of event.
Description
PERSONAL EMERGENCY DATA CAPTURE AND ALERTING
RELATED APPLICATIONS AND CLAIM PRIORITY
[0001] This patent application claims the benefit of priority, under 35 U.S.C. Section 119(e), to Abdurrahman et al., U.S. Provisional Patent Application Serial Number 62/431,753, entitled "Method and Apparatus for Emergency Alert" filed on December 8, 2016 (Attorney Docket No. 4712.001PRV), which is hereby incorporated by reference herein in its entirety.
TECHNICAL FIELD
[0002] Embodiments described herein generally relate to emergency event recording and, in some embodiments, more specifically to an automated emergency event recording with location and relationship based event notification.
BACKGROUND
[0003] Personal emergency response systems (PERS) often rely on proximity to a base station or the existence of a call center. They interact with a single responder and do not warn others nearby of the emergency. Nor is their use generally covert, an important option for senders of distress signals in emergency situations with attackers or other scenarios in which one does not want the sending of the distress signal to be detected. Further, streaming of the real-time health data of the sender of an emergency signal is not available to receivers of the notification.
- [0004] Additionally, there may be situations in which a person may not feel at ease, but still may not want to alert others until they are sure that escalation is necessary. Also, proving that an emergency situation occurred at all is sometimes required to get the attention and consideration of others. Many systems do not provide a way to demonstrate the actual occurrence of an incident such as an assault, hate crime, or other potentially dangerous situation.
BRIEF DESCRIPTION OF THE DRAWINGS
[0005] In the drawings, which are not necessarily drawn to scale, like numerals may describe similar components in different views. Like numerals having different letter suffixes may represent different instances of similar components. The drawings illustrate generally, by way of example, but not by way of limitation, various embodiments discussed in the present document.
[0006] FIG. 1 illustrates an example of an environment in which the emergency alert system may be used, in accordance with some embodiments.
[0007] FIG. 2 illustrates the emergency alert system engine in accordance with some embodiments.
- [0008] FIG. 3 illustrates an example process when solo recording mode is activated, in accordance with some embodiments.
- [0009] FIG. 4 illustrates an example process when broadcast mode is activated, in accordance with some embodiments.
- [0010] FIG. 5 illustrates an example process for determining the contacts for an alert notification when broadcast mode is activated, in accordance with some embodiments.
[0011] FIG. 6 illustrates an example user interface for solo recording mode on a mobile device in accordance with some embodiments.
[0012] FIG. 7 illustrates an example user interface for activated solo recording mode on a mobile device in accordance with some embodiments.
[0013] FIG. 8 illustrates an example user interface for the activated broadcast mode on a mobile device in accordance with some embodiments.
[0014] FIG. 9 illustrates an example user interface for the activated broadcast mode on a mobile device in accordance with some embodiments.
[0015] FIG. 10A illustrates an example user interface for the alert system on a smartwatch in accordance with some embodiments.
[0016] FIG. 10B illustrates an example user interface for the alert system on a smartwatch in accordance with some embodiments.
- [0017] FIG. 10C illustrates an example user interface for the alert system on a smartwatch in accordance with some embodiments.
[0018] FIG. 11 illustrates an example user interface for a received alert notification on a mobile device in accordance with some embodiments.
[0019] FIG. 12 illustrates a flow chart showing a technique for executing a solo recording mode for an emergency alert system, in accordance with some embodiments.
[0020] FIG. 13 illustrates generally an example of a block diagram of a machine upon which any one or more of the techniques (e.g., methodologies) discussed herein may perform in accordance with some embodiments.
DETAILED DESCRIPTION
[0021] The alert system described herein provides a PERS that may record and store information such as audio, health data, and location for the sender and their recipients for later review. The system described may broadcast a message or notification to a select group of people based on the event and include the recorded information.
[0022] Mobile electronic devices are widely utilized and carried by people across the globe. When people are confronted with an event, especially an emergency type event, they may desire a means to activate an alert and record the event on their mobile device. The alert system may have two modes: a solo recording mode and a broadcast mode. The solo recording mode may be executed on any mobile electronic device, wherein the device does not need to be connected to a network. The solo recording mode may record and store information such as audio, video, health, and location data. A connected device, such as one connected to a network to broadcast a notification to other devices, may execute broadcast mode. The broadcast mode may be executed
concurrently with the solo recording mode. The broadcast notification may include data collected from the solo recording mode.
- [0023] FIG. 1 illustrates an example of an environment 100 in which the emergency alert system may be used, in accordance with some embodiments. A person, or user 105, may find themselves in a situation where their safety is a concern. This may include self-inflicted situations where a person has accidentally hurt themselves or is lost. This may include situations where no actual event has occurred, but the person feels uneasy or threatened. For example, the user 105 is being followed by two threatening individuals 120. This
may include situations where a person is actually being assaulted or attacked, such as instances of robbery or domestic violence. In some examples, the attacker may be an authority, such as a police officer that is exceeding justifiable use of force.
- [0024] The user 105 may have a mobile device 110 that may execute the alert system engine 200 of the alert system. The alert system engine 200 may be executed on a mobile device 110, for example, a cellular phone, a smart phone, smart watch, etc. A user 105 may find themselves in a situation they feel may be threatening or have the potential to escalate, such as being followed on a dark street by two threatening individuals 120 or being stopped by a police officer. The user 105 may wish to activate the solo recording mode of the alert system engine 200. The user 105 may wish to activate the broadcast mode when danger feels imminent or when an attack or violence is occurring. As detailed below, the solo recording mode or broadcast mode of the alert system engine 200 may be activated by multiple methods and combinations of those methods. The alert system engine 200 may determine whether the solo recording mode alone, or the solo recording mode and broadcast mode together, should be activated based on the input received. For example, the alert system engine 200 may have a single input for activating solo recording mode, but if the alert system engine 200 receives an input from a sensor at the time of activation indicating an escalated or emergency situation, the alert system engine 200 may automatically enter broadcast mode as well.
[0025] The user 105 may desire to activate broadcast mode or the alert system engine 200 may determine that broadcast mode should be activated. For example, if the threatening individuals 120 begin to attack the user 105, such as try to rob or assault the user 105, then the user may activate broadcast mode. When broadcast mode is activated, the alert system engine 200 may transmit, by way of the mobile device 110, an alert message to contacts 125 of the user 105. The contacts 125 have mobile devices configured for receiving alert notifications from the alert system engine 200 of the user 105. The mobile devices of the contacts 125 may be executing the alert system engine as well to receive alert notifications from other alert system engines and respond to the alert
notifications.
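A minimal sketch of the broadcast-and-acknowledge flow described above. The `send` transport callback and the payload shape are hypothetical; the system does not prescribe any particular transport or API.

```python
def broadcast_alert(event, contacts, send):
    """Send the alert notification (with location data) to each configured
    contact and collect acknowledgements as they arrive. `send` is a
    hypothetical transport callback that returns True when the contact's
    device acknowledges the notification."""
    notification = {
        "event_type": event["type"],
        "location": event["location"],  # geolocation data included per Example 2
    }
    acknowledged = [c for c in contacts if send(c, notification)]
    return acknowledged  # shown on the sender's display, per Example 3

# Usage: only "alice" acknowledges in this simulated transport.
acks = broadcast_alert({"type": "assault", "location": (47.6, -122.3)},
                       ["alice", "bob"], send=lambda c, n: c == "alice")
print(acks)  # ['alice']
```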
[0026] The alert system may be activated by pushing a button or making a sequence of button pushes on the device. The sequence of button pushes may be similar to Morse code, wherein the user may press a button on the device in a pattern of press-and-release and hold-and-release to activate the alert system. The alert system may be activated by tapping or tapping and holding a virtual button on the screen of a smartphone or smartwatch.
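The press-and-release / hold-and-release sequence described above could be decoded like this. The duration threshold and the specific patterns are assumptions for illustration; the patent leaves them user-configurable.

```python
# Hypothetical encoding: '.' = press-and-release, '-' = hold-and-release.
ACTIVATION_PATTERNS = {
    "..-": "solo_recording",
    "---": "broadcast",
}

def match_button_sequence(press_durations_ms, hold_threshold_ms=400):
    """Translate raw button-press durations (milliseconds) into a
    Morse-like dot/dash pattern and look up the activation mode."""
    pattern = "".join("-" if d >= hold_threshold_ms else "."
                      for d in press_durations_ms)
    return ACTIVATION_PATTERNS.get(pattern)  # None if no pattern matches

print(match_button_sequence([120, 95, 600]))   # short, short, hold -> solo_recording
print(match_button_sequence([700, 650, 800]))  # three holds -> broadcast
```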
[0027] The alert system may be activated by voice commands. The voice command activation may include a combination of words, tone, and volume of the speech of the user to determine when an activation should occur, what mode should be activated, and the type of event that is occurring. The words spoken may be a specific key phrase predefined by the user for activation. The alert system may also be trained with the user's voice, such that the alert system may only proceed with activation if the key phrase is spoken by the user's voice. The user may also train the alert system with their voice at different tones, such that the alert system may recognize when the user is speaking at a normal tone and when the user may be screaming or agitated. For example, the user may speak in a calm, normal volume, a command such as "activate recording mode" for the alert system to start the solo recording mode. The user may instead, in a loud, screaming voice, say "broadcast emergency" for the alert system to activate broadcast mode and broadcast an emergency notification. The alert system may activate solo recording mode or broadcast mode based on voice and tone factors, without a specific phrase being said. The user may train the alert system with a baseline voice profile. The alert system may determine when the user's voice indicates they are in an agitated or fearful state and automatically activate at least solo recording mode. The alert system may activate solo recording mode or broadcast mode based on the tone, words, or volume of the voice of a person near the user. For example, the user may suddenly be verbally assaulted by a person threatening them, thus the person's voice may be agitated, aggressive, and yelling. The alert system may automatically activate broadcast mode based on determining an agitated yelling voice directed at the user, and thus the user is in immediate danger.
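A rule-of-thumb sketch of the voice activation decision, combining the three cues named above: words, tone, and volume. The key phrases, thresholds, and agitation score are hypothetical; in practice the score would come from a voice model trained on the user's baseline profile.

```python
def voice_trigger(transcript, volume_db, agitation_score):
    """Decide a mode from a speech transcript, its measured volume (dB),
    and a 0-1 agitation score from a hypothetical tone model.
    Thresholds are illustrative assumptions."""
    text = transcript.lower()
    # Loud, agitated speech or an explicit emergency phrase escalates.
    if "broadcast emergency" in text or (volume_db > 85 and agitation_score > 0.8):
        return "broadcast"
    # A calm key phrase, or moderate agitation alone, starts recording.
    if "activate recording mode" in text or agitation_score > 0.5:
        return "solo_recording"
    return None

print(voice_trigger("activate recording mode", 55, 0.1))  # solo_recording
print(voice_trigger("broadcast emergency", 95, 0.9))      # broadcast
```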
[0028] The alert system may utilize sensors, such as a heart rate sensor, to determine in part when the alert system should be activated. For example, a person may say "activate alert" while their heart rate is in a normal range, which
may activate the solo recording mode. A person may say "activate alert" while their heart rate is in an elevated range, which may activate both the solo recording mode and the broadcast mode, as the person may be in immediate danger based on their elevated heart rate. The following examples apply both when the alert system is initially activated and at any time while the solo recording mode is already active. For example, the alert system may receive data from a sensor, such as a gyroscope or accelerometer, that indicates the user is running. The information that the person is running at the time the alert system is activated may indicate the person is in potential danger and trying to escape it, but is not immediately under attack, thus the alert system may activate solo recording mode. The alert system may receive an indication the user is not moving, or the mobile device is not moving. This may indicate the user could be hiding, hurt and not able to move, or has been separated from the mobile device. Thus the user may require immediate help, and the alert system enters broadcast mode. The alert system may receive data indicating that the mobile device is being moved in dramatic or violent motions. This may indicate the user is under attack, and broadcast mode is activated. The alert system may receive sensor data indicating a police officer is nearby. This may be data such as the detection of flashing lights by the camera of the mobile device or frequent use of the word "Officer" detected in the audio stream. The user may feel threatened by police and in fear of excessive force by the police officer. The alert system may activate broadcast mode and avoid broadcasting to any police or emergency services.
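The sensor-driven determination described above can be sketched as a function combining a heart rate reading with a motion classification. The motion labels, the 120 bpm threshold, and the assumption that gyroscope/accelerometer processing yields such a label are all illustrative, not drawn from the specification:

```python
def mode_from_sensors(heart_rate_bpm: int, motion: str) -> str:
    """Combine heart rate and a motion classification into an alert mode.

    `motion` is an illustrative label ("running", "still", "violent",
    "normal") assumed to come from gyroscope/accelerometer processing.
    """
    ELEVATED_BPM = 120  # assumed threshold for an elevated heart rate
    if motion == "violent":
        return "broadcast"   # dramatic motions may indicate an attack
    if motion == "still":
        return "broadcast"   # user may be hiding, hurt, or separated from device
    if motion == "running":
        return "solo"        # fleeing potential danger, not immediately under attack
    # No distinctive motion: fall back to the heart rate reading.
    return "broadcast" if heart_rate_bpm >= ELEVATED_BPM else "solo"
```

As in the passage, a running user triggers solo recording mode, while a motionless device or violent motion escalates directly to broadcast mode.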
[0029] When either solo recording mode or broadcast mode is activated for the alert system, the alert system may turn the screen and notifications off for the mobile device and put the mobile device into a user specified lock mode. The only person, presumably, that may turn off solo recording mode or broadcast mode, may be the user of the mobile device who has the capability of unlocking the mobile device. This may prevent solo recording mode or broadcast mode from being deactivated by a ne'er-do-well, such as the person who may have attacked or harmed the user.
[0030] FIG. 2 illustrates the emergency alert system engine 200 in accordance with some embodiments. In an embodiment, the alert system engine 200 may include multiple components for receiving an indication of an event,
receiving sensor data, and executing solo recording mode or broadcast mode. The alert system engine 200 may include an event detector 210. The event detector 210 receives input for the activation of either solo recording mode or broadcast mode. The event detector 210 may receive input such as the pressing of a virtual button, a sequence of button presses, or a voice command. For example, the event detector 210 may analyze the tone of the user's voice, as compared to a known baseline of the user's voice, and determine the voice is yelling or aggravated, and thus activate solo recording mode. The event detector 210 may determine if the alert system engine 200 should activate solo recording mode or both solo recording mode and broadcast mode. The determination may be based on the input received. The determination may be based on the type of event. The event detector may determine the type of event based on data received from the sensor array 208.
[0031] The alert system engine 200 may include a sensor array 208 for receiving and managing data received from sensors. The sensors may be part of the device executing the alert system engine 200, or a sensor may be part of another device communicating the sensor data to the sensor array 208 by way of the transceiver 214. The sensor array 208 may receive data from sensors such as a microphone 202, a GPS 204, and a heart rate sensor 206. The microphone 202 may be activated to capture audio for the audio stream when solo recording mode is activated. For example, the heart rate sensor may be a sensor on a smartwatch. The smartwatch communicates the heart rate sensor data to the sensor array 208 of the alert system engine 200 by way of the transceiver 214. The GPS 204 may obtain geolocation data for the device executing the alert system engine 200 and determine the location of the device. The heart rate sensor 206 is an example of a sensor providing sensor data to the alert system engine 200 that is utilized by the event detector 210 to determine the type of event.
[0032] The alert system engine 200 may be connected to a storage device 220. The alert system engine 200 may store data on the storage device 220, including the audio stream and video stream, the sensor data, and the geolocation data. The alert system engine 200 may retrieve the data stored on the storage device 220, such as if the audio stream is requested to provide information to an EMT about what the user may have experienced.
[0033] The alert system engine 200 may include an output manager 216. The output manager 216 receives information from the event detector 210, the sensor array 208, and the broadcast manager 212. The output manager 216 may collect data and generate information to be displayed, such as on a display screen of the device. Display information generated by the output manager 216 may include a map indicating the current location of the device based on the geolocation data from the GPS 204 and chat messages received from contacts who have received an alert notification. The output manager 216 may generate alert messages to be transmitted when broadcast mode is activated.
[0034] The alert system engine 200 may include a contacts manager 218. When broadcast mode is activated, the broadcast manager 212 may interface with the contacts manager 218 to retrieve a set of contacts for transmitting an alert notification to. The contacts manager 218 may store multiple sets of contacts the user has designated as the contacts the user wishes to be notified, depending on the event. The contacts manager 218 may determine the set of contacts by collecting data such as the type of event from the event detector 210 and sensor data from the sensor array 208. For example, the event may indicate that only contacts outside a distance radius should be contacted. The event detector 210 may indicate this type of event to the contacts manager 218. The contacts manager 218 may get geolocation data from the sensor array 208.
Based on the geolocation data, the contacts manager 218 may determine the set of contacts outside the distance radius for the broadcast manager 212 to utilize for transmitting an alert notification.
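The radius filtering performed by the contacts manager can be sketched as follows. The great-circle distance calculation is standard; the contact data structure and the use of (latitude, longitude) tuples are illustrative assumptions:

```python
from math import radians, sin, cos, asin, sqrt

def distance_km(a, b):
    """Great-circle (haversine) distance between two (lat, lon) points, in km."""
    lat1, lon1, lat2, lon2 = map(radians, (*a, *b))
    h = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(h))  # 6371 km: mean Earth radius

def contacts_outside_radius(user_loc, contacts, radius_km):
    """Select contacts whose reported location is outside the distance radius,
    excluding anyone in the immediate vicinity of the event."""
    return [name for name, loc in contacts.items()
            if distance_km(user_loc, loc) > radius_km]
```

The resulting set is what the broadcast manager would then use as the target list for the alert notification.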
[0035] The alert system engine 200 may include a broadcast manager 212. When the event detector 210 receives an input to activate broadcast mode or determines broadcast mode should be activated, the broadcast manager 212 receives the activation. The broadcast manager 212 may interface with the contacts manager 218 to determine the contacts to transmit an alert notification to. The broadcast manager 212 may interface with the output manager 216 for a message to include with the alert notification. The broadcast manager 212 may send the alert notification to the transceiver 214 for transmission to the determined contacts through a network 226. From the network 226, the transceiver 214 may receive acknowledgement responses from the contacts that received the alert notification. The responses may be sent to the broadcast
manager 212 to track the contacts that have responded. The broadcast manager 212 may provide information about the contacts that have responded to the output manager 216. The output manager 216 may generate a display including the information about the contacts who have responded. The transceiver 214 may receive additional information from the contact, such as the current geolocation of the contact. The information is managed by the broadcast manager 212 for each contact and provided to the output manager 216 to display, such as by generating a map with the location of the contacts.
[0036] FIG. 3 illustrates an example process 300 when solo recording mode is activated, as according to some embodiments. As described above, the alert system may perform an operation 305 wherein the alert system receives an input from the user to activate solo recording mode. When the solo recording mode of the alert system is activated, the alert system may perform an operation 310 to activate one or more sensors of the mobile device. For example, a microphone of the device may be activated to capture audio into an audio stream. The audio may include audio such as the voices of the user and those people around the user, and the environmental sounds around the user. The device may activate a camera to capture video into a video stream. The camera may be a camera on the mobile device or a camera communicatively connected to the mobile device. The camera may be a camera on a device in the possession of a contact of the user. The camera may also be a network connected camera such as a security camera at a store or a residence. The alert system may connect to external security systems to capture data such as sirens, or other security features found in cars and buildings that may be activated in addition to the audio recording. In an example, a wearable camera on the alert sender's body may be activated to capture still shots or video of an emergency interaction. The alert system may perform an operation 315 to utilize a global positioning system (GPS) to collect geolocation data contemporaneously with the audio stream. The sensors may capture data contemporaneously with the audio and/or video stream. Sensors may include a heart rate sensor, an accelerometer, a magnetometer, a gyroscope, an optical sensor, an ultrasonic sensor, an inertial measurement unit, a multi-axis sensor, or a contact pressure sensor, etc. 
A sensor may be configured to detect physiological indications and may be adapted to detect a person's heart rate, skin temperature, brain wave activities, alertness (e.g., camera-based eye tracking),
activity levels, or other physiological or biological data. For example, a device may include sensors such as a gyroscope or accelerometer and the device may utilize these sensors to determine the user is in a physical altercation or has been injured and is not able to move.
[0037] The alert system may perform a decision 320 to utilize the sensors of the device to determine the type of event. The data from the sensors and the trigger indicating an event for activation may be combined to determine the type of event. The device may include sensors to capture physiological data which is synced with the audio stream recording. When the audio recording is played back, such as by an emergency medical technician (EMT), the EMT may observe the physiological changes of the user during the audio recording and utilize the information to diagnose and treat the user.
[0038] The decision 320 may determine the type of event, and based on the type of event determine if the alert system should be in solo recording mode or activate broadcast mode. If the alert system determines to stay in solo recording mode, the alert system may perform operation 325 for the mobile device to display an option to enter broadcast mode, the status of the audio stream recording, and a map with the current location of the mobile device as determined by the geolocation data. Alternatively, the decision 320 may determine the type of event and determine the alert system should activate broadcast mode at operation 330. The activation of broadcast mode is detailed in FIG. 4.
[0039] The alert system may perform an operation 335 to store the audio stream, geolocation data, and sensor data. The audio stream and geolocation data may be synched and stored on the device, such that when the audio recording is played, the geolocation data may be displayed on a map to show the location of the device during the progression of the audio recording. The audio stream, geolocation data, and sensor data may be transmitted and stored on a cloud service.
[0040] When solo recording mode is activated, only the user may cancel solo recording mode. The user may set a predetermined timer for solo recording mode, wherein the countdown timer begins when solo recording mode is activated. If the user does not cancel solo recording mode before the countdown
timer expires, the alert system may automatically contact an emergency service or a contact as specified by the user.
[0041] FIG. 4 illustrates an example process 400 when broadcast mode is activated, as according to some embodiments. While the device is in solo recording mode, the option to enter broadcast mode is available to the user. This may be presented as a button on the display screen of the device or by activation through commands and motions as described above. To escalate to broadcast mode from solo recording mode, the user may optionally press and/or hold a button, wherein the button may be a physical button on the device or a virtual button on the screen. The alert system may perform an operation 410 to receive an input from the user to activate broadcast mode. When the user activates broadcast mode, the alert system may perform an operation 415 to collect and analyze data from sensors of the mobile device and determine the type of event. Based on the type of event, operation 415 may determine what type of broadcast event to perform and what information should be included in the alert notification.
[0042] As previously discussed, broadcast mode may be automatically activated from solo recording mode. Both at activation of solo recording mode and while solo recording mode is active, sensor data may be collected and analyzed to determine if broadcast mode should be activated. The alert system may perform an operation 405 to receive an activation command to activate broadcast mode automatically from a determination during solo recording mode. The alert system may monitor the sensors of the mobile device while solo recording mode is active. The alert system may receive sensor data that the situation the user is experiencing has escalated and thus broadcast mode is automatically activated to alert others that the user is in need of assistance. For example, the user may have activated solo recording mode when a stranger begins speaking to them. The stranger may suddenly attack the user. Based on data such as a sharp spike in heart rate for the user and erratic movement of the mobile device, the alert system may determine the situation has escalated and the user is in need of assistance, thus automatically activating broadcast mode.
[0043] When broadcast mode is activated, an event notification is broadcast over a network to contacts of the user. A user may have a set of contacts stored in their mobile device. A contact is a person the user has an association with and
has provided identifying information for communicating with, such as a phone number or email address. The user may set the alert system to utilize the set of contacts already stored on the mobile device. The user may manually select or input the set of contacts the user wishes for the alert system to utilize for sending alert notifications. When a user adds a contact to their set of contacts in the alert system, the alert system may send a message to the contact alerting them to the addition. When a user removes a contact from the set of contacts who may receive their alerts, that removed contact may also receive a message.
Broadcast mode may transmit an alert notification to people who are not a contact of the user. This may include bystanders, people that share a
characteristic with the user (e.g. similar race, religion, group membership, etc.), and people who have indicated to the alert system they are willing to help others that request assistance. The message may contain information for the contact to install the alert system on the contact's devices. The alert system may perform an operation 420 to select a set of contacts to send an alert notification.
Operation 420 may determine the set of contacts based on the geolocation of the user and the type of event, either determined or indicated by the user. The contacts to which the alert notification is sent may be a predetermined group of people the user has selected for receiving notifications, such as an emergency contact list. The set of contacts may be determined by location, such as only contacting people who have a profile location in the same state as the user. The set of contacts may be a group of people sharing a common characteristic or attribute. For example, a user of a certain race or religion may fear being unfairly stereotyped in certain situations, and thus they may have the alert system configured to send an alert notification to others of a similar race or religion.
[0044] The alert system may perform an operation 425 to broadcast an alert notification to the members of the determined set of contacts. The alert notification may include information about the event the user is experiencing, a message about what the user wishes for the contact to do, the geolocation of the user, and the audio stream. When a contact receives an alert notification they may acknowledge the notification. This may include an acceptance of the alert notification so that the contact may monitor the situation that the user is involved in. This may include responding with a message, such as informing the user that
the contact is coming to the user's assistance. The contact may also perform an action such as contacting an emergency service or contacting additional people, for which a message is sent to the user informing them that the contact has performed such an action. If a contact is unable to assist, for example if they are out of town, they may decline the alert notification; in that case either an acknowledgment is not sent to the user, or a message is sent to the user so the user is not under the assumption the contact may still respond. The alert system may perform an operation 430 to receive acknowledgment responses from at least one contact. The response may include the geolocation data for the contact. The response may include a message from the contact.
[0045] As the alert system receives acknowledgement responses from the set of contacts, the alert system may perform an operation 435 to display the location of the contacts which have responded. The display screen of the user's mobile device may display a map. The alert system may use the geolocation data obtained from the GPS to determine the location of the user and display the user's current location on the map. The alert system may use the geolocation data received in the contact responses to determine the location of the contact and display the location of the contact on the map. The alert system may continually obtain geolocation data from the GPS of the mobile device and the contacts to update the map with the current location of the user and the contacts. The alert system may perform an operation 440 to transmit the audio stream and updates about the user to the contacts which have responded. The updated data may include a message from the user, changes to the situation as indicated from sensors of the mobile device, an update to the current location of the user, and a message indicating the user is no longer in danger and assistance is no longer needed.
[0046] As the alert system receives acknowledgment responses, the alert system may perform a decision 445 to determine if the number of acknowledged contacts has met a predetermined threshold for the number of contacts that should respond for a situation. The threshold may be predetermined by the user, such as for any situation, the user wishes for at least five contacts to respond. The threshold may be determined by the type of situation. For example, the user may not be feeling well and thus a threshold of at least two people is set. But, in an example, if the user is being attacked, a threshold of at least ten people may
be set. The alert system may have a timer for how long contacts are allowed to respond and meet the threshold before additional actions are taken. Additional actions may be taken if certain criteria are met. For example, a timer allowing time for contacts to respond may be provided. If the required number of responses is not met, then additional contacts may be notified. As the timer progresses, the alert system may receive sensor data indicating a change in the situation, where the alert system may determine an alert notification should be sent to a larger number of contacts or to attempt to send alert notifications to recipients outside the user's known contacts. Similarly, this may be based on a user setting or the type of situation. For example, the user may want to give their contacts three minutes to respond. In immediate situations, such as being attacked, there may not be time to wait very long for a set of contacts to respond, thus after a minute, if the threshold has not been met, the alert system may notify additional contacts.
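The threshold-and-timer logic above can be sketched as a single decision function evaluated as responses arrive. The function and return labels are illustrative assumptions; the thresholds and timeouts would be user- or event-specific as described:

```python
def escalation_action(responses: int, threshold: int,
                      elapsed_s: float, timeout_s: float) -> str:
    """Decide what the broadcast loop should do at a given moment.

    `threshold` is the required number of responders for the event type
    (e.g. two for a minor incident, ten for an attack); `timeout_s` is how
    long contacts are given to respond before the broadcast is widened.
    """
    if responses >= threshold:
        return "stop_broadcast"    # enough responders have acknowledged
    if elapsed_s < timeout_s:
        return "wait"              # timer still running; keep receiving responses
    return "widen_broadcast"       # timer expired without enough responses
```

Under this sketch, an attack event might be configured with `threshold=10` and `timeout_s=60`, while a minor incident might use `threshold=2` and `timeout_s=180`.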
[0047] The alert system may perform an operation 450 to notify additional contacts if the threshold for contact responses has not been met within the predetermined time. The alert system, utilized on both the user's mobile device and the contact's mobile device, may be configured to allow for a notification to be sent to a set of contacts of a contact of the user. The set of contacts of a contact may include members not known to the user. Operation 450 may send a notification to the set of contacts of the contact as a means to reach more people to assist the user.
[0048] FIG. 5 illustrates an example process 500 for determining the contacts for an alert notification when broadcast mode is activated, as according to some embodiments. Process 500 may illustrate the process performed by the alert system to determine which contacts to broadcast an alert notification to and at what time to broadcast. At decision 505, the alert system may determine the type of event being experienced by the user of the alert system. If the alert system determines a potential event may be occurring, such as the user feels that someone may be following them, then the alert system may perform operation 510 and send an alert notification to all the members of the user's contact set. The alert notification may include a message that the user is not in immediate danger, but wishes to have the contacts monitor their status.
[0049] The decision 505 may determine the user is experiencing an emergency event. Based on the emergency event designation, the alert system may perform operation 515 to remove any contact within a determined distance radius. A user may be in an emergency situation with a member of their contacts. The person may either be experiencing the same situation as the user, and thus sending them an alert notification would be wasted, or the person may be the one causing harm to the user, and thus it would not be helpful for them to receive an alert notification. The alert system may be configured such that alert notifications are broadcast to contacts who have a current location outside a predetermined radius distance from the user's current location. Broadcasting the notification to people outside a radius distance may help ensure the notification reaches people who have the ability to assist. The radius distance may be adjusted based on the location, type of event, and circumstances. For example, if the user is at a concert or a mall, where there are many people around, the radius distance may be relatively small so as to only avoid people in the immediate vicinity of the user. The distance may be larger when the user is outside and there is less concentration of people, such as when the user is walking on a street alone at night.
[0050] At operation 510, the alert system on the mobile device of the user may first broadcast a location request to all of the user's contacts which utilize the alert system. The alert system on the devices of the contacts may collect current geolocation data for the device and transmit the geolocation data to the user's mobile device. Based on the geolocation data received from the devices, the alert system on the user's mobile device may determine which devices may be in the immediate vicinity of the user and which contacts may be within a predetermined distance radius of the user. The alert system may remove these contacts from the set of contacts to broadcast the alert notification to. The alert system may prioritize sending the alert notification to contacts which have reported a location closer to the user. As time progresses, the alert system may send the alert notification to contacts at further distances.
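The nearest-first prioritization described above can be sketched as a sort over the contacts' reported locations. The distance measure here is a deliberately crude flat-distance approximation on (latitude, longitude) degree tuples, adequate for ordering nearby contacts; the data structure is an illustrative assumption:

```python
def broadcast_order(user_loc, contact_locs):
    """Order contacts nearest-first so closer responders are alerted sooner;
    farther contacts can then be alerted later as time progresses.

    `user_loc` is a (lat, lon) tuple; `contact_locs` maps contact name to
    a (lat, lon) tuple reported in response to the location request.
    """
    def deg_dist(a, b):
        # Flat approximation in degrees; fine for ranking short distances.
        return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5
    return sorted(contact_locs, key=lambda n: deg_dist(user_loc, contact_locs[n]))
```

Walking this list front-to-back with a delay between batches reproduces the behavior in the passage: closer contacts are notified first, and notifications reach contacts at further distances as time progresses.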
[0051] The alert system may be configured to broadcast an alert notification to bystanders. A bystander may be a person within a relative vicinity of the user. A bystander may be a person not previously known to the user. A bystander may be a person sharing a similar characteristic or trait with the user as
determined by a comparison of the profiles for the user and the bystander. A user of the alert system may wish to have help reach them as quickly as possible, such as if they are being attacked by someone. The alert system may broadcast an alert notification to mobile devices executing the alert system within a relative distance of the user. These may be bystanders and people passing near the event. The bystanders are alerted that a person is in need of assistance and the bystander may choose to assist, such as by going to the person's aid or contacting an emergency service.
[0052] A bystander who has opted into the alert system as willing to assist others may have their physiological data or audio data collected. This data may be combined with the data collected by the alert system of the user's device to provide more information about the situation. The data may be used to determine if the distance radius for alert notification broadcast is large enough. For example, the alert system may determine the bystander is in an agitated state and thus may also be involved in the situation. Thus, using the geolocation data of the bystander, the alert system may determine the broadcast alert notification should be sent to those at a farther distance.
[0053] Having determined the set of contacts to broadcast to, the alert system may perform operation 520 to broadcast an alert notification to the determined set of contacts. The alert system may then perform operation 525 to receive responses from the contacts for the alert notification. For each contact that responds and acknowledges receiving the alert notification, the alert system may perform operation 555 to add the contact to a list of active responders. As responses are received, the alert system continues to add the contacts to an active group. The alert system may then perform an operation 560 to provide status updates of the user and any changes to the location of the user.
[0054] As the alert system waits to receive responses, a timer may be running to determine if a wider broadcast needs to be made because enough responses have not been received. Depending on the event, the timer may be shorter for more dire situations. The alert system may perform decision 530 to determine if the threshold timer has expired. If the threshold timer has not expired, the alert system may continue to perform operation 525 and wait to receive responses.
[0055] When the decision 530 determines the timer has expired, the alert system may perform decision 535 to determine if the response threshold was met. Similar to the threshold timer, the response threshold may depend on the type of event. The more serious the situation, the higher the threshold value may be for the number of people that the user wishes to respond. The user may also customize the threshold for the event. For example, if the user has tripped and needs help, the response threshold may be one or two people. If the user is being attacked, the user may wish for ten or more people to respond. If decision 535 determines the response threshold was met, the alert system may perform operation 545 and end the broadcast of alert notifications.
[0056] If the decision 535 determines the response threshold was not met, the alert system may perform operation 540 to broadcast the alert notification to a wider group of contacts. Depending on the type of relationship the user has established with their set of contacts, the alert system may perform operation 550 to broadcast to contacts of the user's contacts. By association, a contact of a contact may respond and assist the user. Both operation 540 and 550 may broadcast the alert notification and then return to operation 525 to receive responses.
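The widening step at operations 540 and 550 can be sketched as building the union of the user's own contacts with each contact's contacts. The data structures are illustrative assumptions:

```python
def widened_contact_set(contacts, contacts_of):
    """Widen the broadcast set to include contacts of the user's contacts.

    `contacts` is the user's own contact set; `contacts_of` maps each
    contact to that contact's own contacts (members possibly unknown to
    the user). Returns the union, with duplicates removed.
    """
    widened = set(contacts)
    for c in contacts:
        widened.update(contacts_of.get(c, ()))
    return widened
```

Broadcasting to this widened set and then returning to the response-collection step mirrors the loop from operations 540/550 back to operation 525.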
[0057] FIG. 6 illustrates an example user interface for solo recording mode on a mobile device 600 in accordance with some embodiments. Mobile device 600 may be a device such as a smartphone, cellular phone, personal digital assistant, fitness tracker, or portable music player. The mobile device 600 may display on a touch display interface 605 a graphical user interface (GUI) for the alert system. The alert system may be designed such that it is simple for a user to activate solo recording mode without having to perform a sequence of steps. The GUI may display a single virtual button 610 with instructions 615 to activate the solo recording mode. For example, as illustrated, the GUI may have a virtual button 610 labeled "HELP", with instructions 615 for the user to "PRESS + HOLD" the virtual button 610 to activate the solo recording mode.
[0058] FIG. 7 illustrates an example user interface for activated solo recording mode on a mobile device 700 in accordance with some embodiments. The mobile device 700 may display on a touch display interface 705 a GUI for the activated solo recording mode of the alert system. The GUI may include several virtual buttons for the user to interface with while the alert system is in
solo recording mode. The GUI may include an "END" button 710 for the user to interface with when the user wishes to end the solo recording mode. The GUI may include a phone button 715 for the user to place a phone call while in solo recording mode. The phone button 715 may be customized in different ways to the user's preferences, such as immediately placing a 911 call, immediately placing a call to the first person listed as an emergency contact, or opening up the user's list of contacts so they may decide who to call. The GUI may include a broadcast button 720 to enter broadcast mode. If the user feels the situation has escalated and they need to alert people of the situation, the user may activate broadcast mode by tapping the broadcast button 720 while in solo recording mode. The GUI may include a map 725 displaying the current location of the mobile device from the geolocation data obtained from the GPS.
[0059] FIG. 8 illustrates an example user interface for the activated broadcast mode on a mobile device 800 in accordance with some embodiments. The mobile device 800 may display on a touch display interface 805 a GUI for the activated broadcast mode of the alert system. The GUI may include several virtual buttons for the user to interface with while the alert system is in broadcast mode. The GUI may include an "END" button 810 for the user to interface with when the user wishes to end the broadcast mode. The GUI may include a phone button 820 for the user to place a phone call while in broadcast mode. The phone button may operate similarly to the phone button during solo recording mode. The GUI may include a mute button 815 to mute or un-mute the contacts which have acknowledged the alert notification. The contacts that have responded to the alert notification may have the capability to speak with the user and one another. The GUI may include a map 825 to display the current location of the mobile device and the location of any contacts that may have received or acknowledged the broadcast alert notification.
[0060] FIG. 9 illustrates an example user interface for the activated broadcast mode on a mobile device 900 in accordance with some embodiments. The mobile device 900 may display on a touch display interface 905 a GUI for the activated broadcast mode of the alert system. The GUI may include an "END" button 910 for the user to interface with when the user wishes to end the broadcast mode. The GUI may include a call button 915, such as "CALL 911", for the user to place an emergency call. The call button may be customized to
place a call to any number the user chooses. The GUI may include a broadcast bar 920 containing names or images of the contacts which have received and acknowledged the broadcast alert notification. The user may click on an image or name in the broadcast bar 920 to start an interaction with the contact associated with the image or name. The interaction may be a text message, a chat window, or a phone call. The GUI may include a map 925 to display the current location of the mobile device and the location of any contacts that may have received or acknowledged the broadcast alert notification. The map 925 may display the location of an acknowledged contact 930 such that the user may track the location of the acknowledged contact 930. The GUI may include a chat button 935 to start a group chat session with all the contacts which have received and acknowledged the broadcast alert notification. The GUI may include a video window 940 to display a captured video stream of the situation. The video stream displayed in the video window 940 may be captured from a camera on the mobile device or a camera communicatively connected to the mobile device.
[0061] FIG. 10A illustrates an example user interface for the alert system on a smartwatch 1000 in accordance with some embodiments. The smartwatch 1000 may execute a smartwatch specific application for the alert system. The smartwatch 1000 may include a microphone, allowing the smartwatch to capture the audio stream when solo recording mode is activated. The microphone of smartwatch 1000 may receive commands, such as a voice command to activate solo recording mode or broadcast mode. The smartwatch 1000 may include a GPS device, allowing the smartwatch 1000 to obtain geolocation data when solo recording mode is activated. The smartwatch 1000 may have capabilities to connect to a network, such as a cellular network, allowing the smartwatch to send a broadcast alert notification. If the smartwatch 1000 does not have network connectivity capabilities, it may instead transmit the commands to a connected mobile device, which executes the broadcast of the alert notification. The smartwatch 1000 may display on a touch display interface 1005 a GUI for a user of the alert system to activate solo recording mode or broadcast mode. The smartwatch 1000 GUI may include a virtual button 1010 to activate solo recording mode or broadcast mode. Similar to virtual button 610, the user may tap or tap-and-hold the virtual button 1010 to activate. The
smartwatch 1000 may include a physical button 1015. The physical button 1015 may receive a sequence pattern of button pushes to activate either solo recording mode or broadcast mode.
[0062] The user, wearing a communicatively connected device such as a smartwatch 1000, may receive alert notification acknowledgements on the smartwatch 1000. The smartwatch 1000 may display information indicating a contact has responded to the alert notification. The smartwatch 1000 may provide haptic feedback (e.g., a vibration or physical pulse) to the user when a contact has responded to the alert notification. For example, when the user is experiencing an emergency situation, they may not be able to look at the display screen of their mobile device. The smartwatch 1000, the mobile device, and another communicatively connected device may provide a unique haptic feedback (e.g., a unique pattern of vibrations) to the user when a contact has acknowledged the alert notification, thus informing, and reassuring, the user that help is responding even though the user cannot view their mobile device.
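The event-specific haptic acknowledgement described above could be sketched as a small lookup from alert-system events to vibration patterns; the event names and millisecond values below are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch: each alert-system event maps to its own vibrate/pause
# pattern so the user can recognize an acknowledgement without looking at the
# display. Pattern values (milliseconds, alternating vibrate and pause) are
# assumptions for illustration only.

HAPTIC_PATTERNS = {
    "contact_acknowledged": [100, 50, 100, 50, 400],  # short-short-long pulse
    "alert_sent": [200],                              # single short pulse
}

def pattern_for_event(event):
    """Return the vibrate/pause pattern for an alert-system event."""
    return HAPTIC_PATTERNS.get(event, [])
```

On a real device the returned pattern would be handed to the platform vibration API; an unrecognized event yields an empty pattern, producing no feedback.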
[0063] FIG. 10B illustrates an example user interface for the alert system on a smartwatch 1030 in accordance with some embodiments. Smartwatch 1030 is an example of a smartwatch worn by a contact of the user running an application for the alert system. The smartwatch 1030 has received an alert notification from a user that the user needs assistance. The smartwatch 1030 may have a touch sensitive display screen 1035. The display screen 1035 may display the alert system application GUI. When the contact receives an alert notification, the GUI may display the name of the user 1040 as the person requesting assistance. The smartwatch 1030 may play the audio stream received from the user. The GUI may display the information 1045 about the audio stream, such as if the stream is a live broadcast and how long the audio stream has been going. If an audio stream is not included with the alert notification, the information 1045 may display a timer based on when the alert notification was sent, such that the contact is aware of how much time has transpired since the user indicated they needed assistance. The GUI may include a virtual button 1050 for the contact to accept the notification and send acknowledgment to the user that the notification has been received. Upon acceptance, the GUI may present additional options to the contact, such as contacting an emergency service or sending a message to the user that the contact is coming to assist. The
GUI and actions described related to FIG. 10B are not limited to a smartwatch and may be performed on any network connected electronic device.
[0064] FIG. 10C illustrates an example user interface for the alert system on a smartwatch 1060 in accordance with some embodiments. Smartwatch 1060 is an example of a smartwatch worn by a contact of the user running an application for the alert system. The smartwatch 1060 has received an alert notification from a user that the user needs assistance. The contact has accepted the alert notification, such as by tapping a virtual button 1050 in FIG. 10B. The smartwatch 1060 may have a touch sensitive display screen 1065 displaying a GUI for the alert system. When a contact has accepted and acknowledged an alert notification from a user, the GUI may display a map 1070 on the display screen 1065 of the smartwatch 1060. The map 1070 may display the current location of the user requesting assistance. The map 1070 may display the current location of the contact relative to the user requesting assistance. The map 1070 may display directions to assist the contact in reaching the user requesting assistance.
[0065] FIG. 11 illustrates an example user interface for a received alert notification on a mobile device 1100 in accordance with some embodiments. A contact of the user requesting assistance (the requester) may execute the alert system on their mobile device 1100. The mobile device 1100 may display on a touch display interface 1105 a GUI for a receiver of an alert notification of the alert system. When a contact, or receiver, accepts and acknowledges the alert notification received from a user requesting assistance, the receiver may choose to enter chat mode or the alert system may enter chat mode automatically. The GUI for the alert system of a receiver may have a static status portion 1110 providing the status of the requester. The status portion 1110 may include the name 1115 of the requester, an image 1120 of the requester, and the address 1125 of the requester. The receiver may select the name or picture of the requester to obtain additional information about the requester, such as social group identification, medical condition, medication requirements, vitals, car make and model, and license to carry a weapon. The status portion 1110 may include a display of information 1130 about the audio stream, such as if the stream is a live broadcast and how long the audio stream has been going. The audio and video stream may be stored on the receiver's mobile device. The
receiver may play back the audio and video stream at a later time, such as to provide information to an EMT or evidence to police. The status portion 1110 may provide a virtual button 1135 for the receiver to contact additional people to be alerted and assist the requester. Additional people may include additional contacts, contacts of a contact, or people unknown to the user, such as bystanders, that the alert system has determined may assist. The alert system may select people unknown to the user based on analyzing characteristics of those people, such as shared personal characteristics (e.g., race or religion), location proximity, history (e.g., if the person has previously responded to alert notifications), and physiological state. The status portion 1110 may include a virtual button 1140 to contact an emergency service or other emergency contact. The GUI of the alert system may include a chat window 1145 to provide status updates for the requester and messages exchanged between the requester and other receivers. The chat window 1145 may provide status information 1150 as the alert system receives it, such as the request and location of the requester.
The chat window 1145 may include messages 1155 from the requester. The chat window 1145 may include messages 1160 from other receivers, or contacts of the requester which have acknowledged the alert notification. The members of the chat window 1145 may exchange messages, pictures, and video. The chat window 1145 may include status information for the requester, such as heart rate, or environmental data, such as other alerts triggered near the requester. The chat window 1145 may be changed to a map window by selecting the virtual map button 1165. The map window may provide the location of the requester, the receiver, and other receivers. The map window may include directions to the location of the requester.
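The selection of additional responders, including bystanders unknown to the user, could be approximated by a scoring function over candidate characteristics. The fields, weights, and distance cutoff below are assumptions for illustration; the disclosure does not specify a particular algorithm.

```python
# Hypothetical sketch of ranking potential responders for an alert, using the
# characteristics named in paragraph [0065]: location proximity and history of
# responding to prior alert notifications. Weights are illustrative only.

def score_responder(candidate, max_distance_km=2.0):
    """Return a relevance score for a potential responder; higher is better."""
    distance = candidate["distance_km"]
    if distance > max_distance_km:
        return 0.0  # too far away to assist in time
    score = (max_distance_km - distance) / max_distance_km  # proximity weight
    if candidate.get("responded_before"):
        score += 0.5  # history of acknowledging past alert notifications
    return score

def select_responders(candidates, limit=5):
    """Rank candidates by score and keep the best `limit` with a non-zero score."""
    ranked = sorted(candidates, key=score_responder, reverse=True)
    return [c for c in ranked if score_responder(c) > 0][:limit]
```

A production system would fold in the other named factors (shared personal characteristics, physiological state) as further score terms.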
[0066] Data collected during an event where either solo recording mode or broadcast mode is activated may be stored at a network server and storage device, such as a cloud service. The data may include the geolocation data of all participants, the audio stream, the video stream, the chat exchanges amongst the participants, the voice chat amongst the receivers, and any sensor data collected.
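The event record uploaded to the network storage service might be bundled as follows; the schema and field names are a hypothetical sketch, not defined by the disclosure.

```python
# Hypothetical sketch of the event record described in paragraph [0066]: all
# data captured during a solo-recording or broadcast event, serialized for
# upload to a cloud storage service. Field names are assumptions.
import json
import time

def build_event_record(event_type, geolocations, chat_log, sensor_data,
                       audio_uri=None, video_uri=None):
    """Assemble a JSON-serializable record of a captured event."""
    return {
        "event_type": event_type,
        "captured_at": int(time.time()),
        "participant_geolocations": geolocations,  # per-participant lat/lon fixes
        "chat_log": chat_log,                      # messages exchanged during the event
        "sensor_data": sensor_data,                # e.g., accelerometer or heart rate samples
        "audio_stream_uri": audio_uri,             # reference to the stored audio stream
        "video_stream_uri": video_uri,             # reference to the stored video stream
    }

record = build_event_record("broadcast", {"user": [47.6, -122.3]}, [], {})
payload = json.dumps(record)  # body of the hypothetical upload request
```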
[0067] The alert system may be linked to a third party service or application. For example, the alert system may transmit the recorded data to a service over the network. Services may include a health monitoring service, a translation service, an attorney, or a community leader. The alert system may be connected to
similar applications within the mobile device or an application that stores the recorded data to a cloud service to preserve the recorded data should the mobile device be damaged or lost. When the alert system connects to a third party service or application, a communication channel may open allowing for real time two-way communication with the service. The real time two-way
communication may include audio and video communication. For example, a user who does not speak English may initiate the alert system to connect to a translation service. A two-way communication channel is opened between the user and the translation service. The user may then speak in their native language and the translation service may translate to English for someone the user is conversing with, such as a police officer.
[0068] A user of the alert system may carry with them an instrument for self-defense. This may include a baton, a knife, a Taser, pepper spray, or a handgun. The instrument may be equipped with sensors, such as an accelerometer or gyroscope, and a communication device, such as a Bluetooth transmitter. When a situation occurs where the user feels in danger and motions defensively with the instrument, the motion sensors on the instrument may detect this movement and prompt the communication device to send a message. The alert system, being communicatively connected to the instrument through the mobile device, may receive this indication and automatically activate solo recording mode or broadcast mode, as warranted by the situation. For example, the user may carry a Taser. The user feels threatened by someone approaching or following them. When the user motions such that the Taser is being pointed, the motion sensors of the Taser may detect this motion and signal to the
communication device, which then transmits to the mobile device. The alert system, receiving the indication that the Taser is being pointed, activates solo recording mode.
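The defensive-motion detection above could be approximated with simple thresholds on the instrument's accelerometer and orientation readings; the threshold values and the gesture heuristic are illustrative assumptions, not part of the disclosure.

```python
# Hypothetical sketch of the instrument trigger in paragraph [0068]: when the
# motion sensors indicate the instrument was raised rapidly and is now held
# level (as when aimed), signal the alert system to activate solo recording
# mode. Thresholds are assumptions for this example.

POINTING_PITCH_RANGE = (-15.0, 15.0)   # degrees; roughly level, as when aimed forward
MIN_RAISE_ACCEL = 2.0                  # m/s^2 spike while raising the instrument

def is_pointing_gesture(pitch_deg, accel_spike):
    """Heuristic: instrument held level after a rapid raising motion."""
    lo, hi = POINTING_PITCH_RANGE
    return lo <= pitch_deg <= hi and accel_spike >= MIN_RAISE_ACCEL

def on_sensor_sample(pitch_deg, accel_spike, activate_recording):
    """Invoke the alert-system callback when a pointing gesture is detected."""
    if is_pointing_gesture(pitch_deg, accel_spike):
        activate_recording("solo")
        return True
    return False
```

In the Taser example, the instrument would run this check on each sensor sample and send the Bluetooth message only when the gesture fires.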
[0069] FIG. 12 illustrates a flow chart showing a technique 1200 for executing a solo recording mode for an emergency alert system, in accordance with some embodiments. The technique 1200 includes an operation 1202 to receive an activation trigger for an indication of an event on a mobile device. An activation trigger may include a selection of a virtual button on a touch screen, a sequence of button presses, an eye gaze or eye movement, a movement of the mobile device, a movement or gesture with a worn device, or a voice
command. For example, the user input to trigger the event may be a voice command, and the trigger may be activated based on the tone of the voice used in the voice command or based on the words used in the voice command. As other examples, the trigger may be a movement of the mobile device, or a combination of a movement of the mobile device and a tone of the user's voice. The technique 1200 includes an operation 1204 to obtain geolocation data from a global positioning device of the mobile device. For example, a GPS may provide geolocation for the current location of the mobile device. The technique 1200 includes an operation 1206 to obtain sensor data from one or more sensors of the mobile device. The sensor data may include data obtained from an accelerometer, a gyroscope, or a camera. The sensor data may be obtained from a sensor of a device communicatively connected to the mobile device. For example, the user may wear a smartwatch which has a heart rate sensor. The alert system, through the mobile device, may receive the heart rate sensor data from the communicatively connected device.
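Deciding whether a user input constitutes an activation trigger, including the combined movement-plus-voice-tone trigger, might look like the following sketch; the distress keyword list and stress threshold are assumptions for illustration only.

```python
# Sketch of the activation-trigger decision in operation 1202: a voice command
# may trigger on its words or its tone, and a trigger may combine device
# movement with voice tone. Keyword list and threshold are assumptions.

DISTRESS_KEYWORDS = {"help", "emergency", "911"}

def voice_trigger(words, tone_stress_level, stress_threshold=0.7):
    """Trigger on distress words, or on a stressed tone of voice."""
    if DISTRESS_KEYWORDS & {w.lower() for w in words}:
        return True
    return tone_stress_level >= stress_threshold

def combined_trigger(device_shaken, words, tone_stress_level):
    """Trigger on device movement combined with a stressed or distressed voice."""
    return device_shaken and voice_trigger(words, tone_stress_level)
```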
[0070] The technique 1200 includes an operation 1208 to determine the location of the mobile device based on the obtained geolocation data. The technique 1200 includes an operation 1210 to activate a microphone on the mobile device to capture an audio stream. The audio stream may include voices and sounds around the mobile device. The technique 1200 includes an operation 1212 to determine a type of event based on the sensor data and the activation trigger for indicating the event. The technique 1200 includes an operation 1214 to store the audio stream, the sensor data, and the geolocation data in a storage device of the mobile device. The technique 1200 includes an operation 1216 to display, on a display device of the mobile device, a message based on the type of event. The display may include a message about an emergency service or contacts to notify based on the determined type of event.
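The sequence of operations 1202 through 1216 can be sketched as a single function; the `device` object and the event-classification rule below are hypothetical stand-ins for the platform APIs and device-specific logic the disclosure describes.

```python
# Sketch of technique 1200 (FIG. 12). The `device` methods and the
# classification rule are assumptions made for this example.

def classify_event(sensors, trigger):
    """Operation 1212 stand-in: derive an event type from sensor data and trigger."""
    if trigger == "voice" and sensors.get("noise_level", 0) > 0.8:
        return "distress"
    return "unspecified"

def run_solo_recording(device, trigger):
    """Execute operations 1202-1216 of technique 1200 in order."""
    geo = device.get_geolocation()         # 1204: obtain geolocation data
    sensors = device.read_sensors()        # 1206: obtain sensor data
    location = geo                         # 1208: location from geolocation data
    audio = device.start_microphone()      # 1210: begin capturing the audio stream
    event_type = classify_event(sensors, trigger)  # 1212: determine event type
    device.store({"audio": audio, "sensors": sensors, "geo": location})  # 1214
    device.display(f"Event detected: {event_type}")  # 1216: message for the user
    return event_type
```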
[0071] The technique may include an operation to transmit a notification of the event to a set of contacts, wherein the notification includes the geolocation data, the location of the mobile device, and wherein the set of contacts are contacts predetermined by the user for receiving event notifications based on the type of event. The technique may include an operation to automatically transmit the notification based on the type of event. The technique may include operations to receive, from at least one member of the set of contacts, an
acknowledgement of the notification and display, on the display device of the mobile device, the contacts from which an acknowledgement has been received. For example, the notification of the event may include the audio stream and a first map generated using the geolocation data. The technique may include operations to display, on a display device of the mobile device, a second map including a location indicated by the geolocation data of each contact of the set of contacts. For example, the set of contacts may be limited to contacts with a location outside a predetermined radius distance of the location of the mobile device, wherein the radius distance is based on the type of event.
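The radius limit on the notified contact set can be implemented with a great-circle distance check against a per-event-type radius; the radii below are illustrative assumptions, not values from the disclosure.

```python
# Sketch of the contact filter described above: notify only contacts located
# outside a radius of the mobile device, where the radius depends on the type
# of event. The per-event radii are assumptions for this example.
import math

EVENT_RADIUS_KM = {"assault": 0.5, "medical": 0.0}  # hypothetical per-event radii

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    r = 6371.0  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def contacts_to_notify(contacts, device_lat, device_lon, event_type):
    """Keep contacts located outside the event-type radius of the device."""
    radius = EVENT_RADIUS_KM.get(event_type, 0.0)
    return [c for c in contacts
            if haversine_km(c["lat"], c["lon"], device_lat, device_lon) > radius]
```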
[0072] The technique may include operations to obtain, contemporaneously with the audio stream, physiological data about the user from a sensor of the mobile device and store the physiological data in a storage device of the mobile device. The technique may include operations to obtain sensor data from a sensor of a device communicatively connected to the mobile device. For example, the device communicatively connected to the mobile device is a smartwatch.
[0073] FIG. 13 illustrates generally an example of a block diagram of a machine 1300 upon which any one or more of the techniques (e.g.,
methodologies) discussed herein may perform in accordance with some embodiments. In alternative embodiments, the machine 1300 may operate as a standalone device or may be connected (e.g., networked) to other machines. In a networked deployment, the machine 1300 may operate in the capacity of a server machine, a client machine, or both in server-client network environments. In an example, the machine 1300 may act as a peer machine in a peer-to-peer (P2P) (or other distributed) network environment. The machine 1300 may be a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile telephone, a web appliance, a network router, switch or bridge, or any machine capable of executing instructions (sequential or otherwise) that specify actions to be taken by that machine. Further, while only a single machine is illustrated, the term "machine" shall also be taken to include any collection of machines that individually or jointly execute a set (or multiple sets) of instructions to perform any one or more of the methodologies discussed herein, such as cloud computing, software as a service (SaaS), or other computer cluster configurations.
[0074] Examples, as described herein, may include, or may operate on, logic or a number of components, modules, or mechanisms. Modules are tangible entities (e.g., hardware) capable of performing specified operations when operating. A module includes hardware. In an example, the hardware may be specifically configured to carry out a specific operation (e.g., hardwired). In an example, the hardware may include configurable execution units (e.g., transistors, circuits, etc.) and a computer readable medium containing instructions, where the instructions configure the execution units to carry out a specific operation when in operation. The configuring may occur under the direction of the execution units or a loading mechanism. Accordingly, the execution units are communicatively coupled to the computer readable medium when the device is operating. In this example, the execution units may be a member of more than one module. For example, under operation, the execution units may be configured by a first set of instructions to implement a first module at one point in time and reconfigured by a second set of instructions to implement a second module.
[0075] Machine (e.g., computer system) 1300 may include a hardware processor 1302 (e.g., a central processing unit (CPU), a graphics processing unit (GPU), a hardware processor core, or any combination thereof), a main memory 1304 and a static memory 1306, some or all of which may communicate with each other via an interlink (e.g., bus) 1308. The machine 1300 may further include a display unit 1310, an alphanumeric input device 1312 (e.g., a keyboard), and a user interface (UI) navigation device 1314 (e.g., a mouse). In an example, the display unit 1310, alphanumeric input device 1312 and UI navigation device 1314 may be a touch screen display. The machine 1300 may additionally include a storage device (e.g., drive unit) 1316, a signal generation device 1318 (e.g., a speaker), a network interface device 1320, and one or more sensors 1321, such as a global positioning system (GPS) sensor, compass, accelerometer, or other sensor. The machine 1300 may include an output controller 1328, such as a serial (e.g., universal serial bus (USB)), parallel, or other wired or wireless (e.g., infrared (IR), near field communication (NFC), etc.) connection to communicate or control one or more peripheral devices (e.g., a printer, card reader, etc.).
[0076] The storage device 1316 may include a non-transitory machine readable medium 1322 on which is stored one or more sets of data structures or instructions 1324 (e.g., software) embodying or utilized by any one or more of the techniques or functions described herein. The instructions 1324 may also reside, completely or at least partially, within the main memory 1304, within static memory 1306, or within the hardware processor 1302 during execution thereof by the machine 1300. In an example, one or any combination of the hardware processor 1302, the main memory 1304, the static memory 1306, or the storage device 1316 may constitute machine readable media.
[0077] While the machine readable medium 1322 is illustrated as a single medium, the term "machine readable medium" may include a single medium or multiple media (e.g., a centralized or distributed database, or associated caches and servers) configured to store the one or more instructions 1324.
[0078] The term "machine readable medium" may include any medium that is capable of storing, encoding, or carrying instructions for execution by the machine 1300 and that cause the machine 1300 to perform any one or more of the techniques of the present disclosure, or that is capable of storing, encoding or carrying data structures used by or associated with such instructions. Non-limiting machine readable medium examples may include solid-state memories, and optical and magnetic media. Specific examples of machine readable media may include: non-volatile memory, such as semiconductor memory devices (e.g., Electrically Programmable Read-Only Memory (EPROM), Electrically Erasable Programmable Read-Only Memory (EEPROM)) and flash memory devices; magnetic disks, such as internal hard disks and removable disks;
magneto-optical disks; and CD-ROM and DVD-ROM disks.
[0079] The instructions 1324 may further be transmitted or received over a communications network 1326 using a transmission medium via the network interface device 1320 utilizing any one of a number of transfer protocols (e.g., frame relay, internet protocol (IP), transmission control protocol (TCP), user datagram protocol (UDP), hypertext transfer protocol (HTTP), etc.). Example communication networks may include a local area network (LAN), a wide area network (WAN), a packet data network (e.g., the Internet), mobile telephone networks (e.g., cellular networks), Plain Old Telephone (POTS) networks, and wireless data networks (e.g., Institute of Electrical and Electronics Engineers
(IEEE) 802.11 family of standards known as Wi-Fi®, IEEE 802.16 family of standards known as WiMax®), IEEE 802.15.4 family of standards, peer-to-peer (P2P) networks, among others. In an example, the network interface device 1320 may include one or more physical jacks (e.g., Ethernet, coaxial, or phone jacks) or one or more antennas to connect to the communications network 1326. In an example, the network interface device 1320 may include a plurality of antennas to wirelessly communicate using at least one of single-input multiple-output (SIMO), multiple-input multiple-output (MIMO), or multiple-input single-output (MISO) techniques. The term "transmission medium" shall be taken to include any intangible medium that is capable of storing, encoding or carrying instructions for execution by the machine 1300, and includes digital or analog communications signals or other intangible medium to facilitate communication of such software.

Various Notes & Examples
[0080] Each of these non-limiting examples may stand on its own, or may be combined in various permutations or combinations with one or more of the other examples.
[0081] Example 1 is a system for personal emergency data capture and alerting, the system comprising: a processing complex, the processing complex including one or more processors in a mobile device of a user; and at least one machine readable medium in the mobile device, the at least one machine readable medium including instructions that, when executed, cause the processing complex to control electronic hardware of the mobile device to: receive a trigger for an indication of an event on the mobile device; obtain geolocation data from a global positioning device of the mobile device; obtain sensor data from one or more sensors of the mobile device; determine a location of the mobile device based on the geolocation data; determine a type of event based on the sensor data and the trigger used to indicate the event; store the sensor data and the geolocation data in a storage device of the mobile device; and output an activation indication to the mobile device based on the type of event.
[0082] In Example 2, the subject matter of Example 1 includes, instructions to: transmit a notification of the event to a set of contacts, wherein the notification includes the geolocation data, the location of the mobile device, and
wherein the set of contacts are contacts predetermined by the user for receiving event notifications based on the type of event.
[0083] In Example 3, the subject matter of Example 2 includes, instructions to: receive, from at least one member of the set of contacts, an acknowledgement of the notification; and display, on a display device of the mobile device, the contacts from which an acknowledgement has been received.
[0084] In Example 4, the subject matter of Examples 2-3 includes, wherein the notification of the event includes at least one of an audio stream and a video stream, and a first map generated using the geolocation data.
[0085] In Example 5, the subject matter of Examples 1-4 includes, instructions to: receive, from at least one member of the set of contacts, geolocation data for the at least one member of the set of contacts; and receive, from at least one member outside the set of contacts, geolocation data for the at least one member outside the set of contacts.
[0086] In Example 6, the subject matter of Examples 4-5 includes, instructions to: display, on a display device of the mobile device, a second map including a location indicated by the geolocation data of each contact of the set of contacts.
[0087] In Example 7, the subject matter of Examples 1-6 includes, wherein the set of contacts is limited to contacts with a location outside a predetermined radius distance of the location of the mobile device, wherein the radius distance is based on the type of event.
[0088] In Example 8, the subject matter of Examples 1-7 includes, instructions to: obtain, contemporaneously with the sensor data, physiological data about the user from a sensor of the mobile device; and store the physiological data in a storage device of the mobile device.
[0089] In Example 9, the subject matter of Examples 1-8 includes, wherein the user input to trigger the event is a voice command and the trigger is activated based on the tone of the voice used in the voice command.
[0090] In Example 10, the subject matter of Examples 1-9 includes, instructions to: obtain sensor data from a sensor of a device communicatively connected to the mobile device.
[0091] In Example 11, the subject matter of Examples 3-10 includes, wherein at least one of the mobile device or a device communicatively
connected to the mobile device provides haptic feedback when an
acknowledgement has been received.
[0092] In Example 12, the subject matter of Examples 1-11 includes, wherein the user input to trigger the event is a voice command and the trigger is activated based on the words used in the voice command.
[0093] In Example 13, the subject matter of Examples 2-12 includes, instructions to automatically transmit the notification based on the type of event.
[0094] In Example 14, the subject matter of Examples 1-13 includes, wherein the trigger is a movement of at least one of the mobile device or a device communicatively connected to the mobile device.
[0095] In Example 15, the subject matter of Examples 1-14 includes, wherein the trigger is a combination of a movement of the mobile device and a tone of the user's voice.
[0096] Example 16 is at least one non-transitory computer readable medium including instructions for personal emergency data capture and alerting that, when executed by at least one processor, cause the at least one processor to: receive a trigger for an indication of an event on a mobile device; obtain geolocation data from a global positioning device of the mobile device; obtain sensor data from one or more sensors of the mobile device; determine a location of the mobile device based on the geolocation data; determine a type of event based on the sensor data and the trigger used to indicate the event; store the sensor data and the geolocation data in a storage device of the mobile device; and output an activation indication to the mobile device based on the type of event.
[0097] In Example 17, the subject matter of Example 16 includes, instructions to: transmit a notification of the event to a set of contacts, wherein the notification includes the geolocation data, the location of the mobile device, and wherein the set of contacts are contacts predetermined by the user for receiving event notifications based on the type of event.
[0098] In Example 18, the subject matter of Example 17 includes, instructions to: receive, from at least one member of the set of contacts, an acknowledgement of the notification; and display, on a display device of the mobile device, the contacts from which an acknowledgement has been received.
[0099] In Example 19, the subject matter of Examples 17-18 includes, wherein the notification of the event includes at least one of an audio stream and a video stream, and a first map generated using the geolocation data.
[00100] In Example 20, the subject matter of Examples 16-19 includes, instructions to: receive, from at least one member of the set of contacts, geolocation data for the at least one member of the set of contacts; and receive, from at least one member outside the set of contacts, geolocation data for the at least one member outside the set of contacts.
[00101] In Example 21, the subject matter of Examples 19-20 includes, instructions to: display, on a display device of the mobile device, a second map including a location indicated by the geolocation data of each contact of the set of contacts.
[00102] In Example 22, the subject matter of Examples 16-21 includes, wherein the set of contacts is limited to contacts with a location outside a predetermined radius distance of the location of the mobile device, wherein the radius distance is based on the type of event.
[00103] In Example 23, the subject matter of Examples 16-22 includes, instructions to: obtain, contemporaneously with the sensor data, physiological data about the user from a sensor of the mobile device; and store the physiological data in a storage device of the mobile device.
[00104] In Example 24, the subject matter of Examples 16-23 includes, wherein the user input to trigger the event is a voice command and the trigger is activated based on the tone of the voice used in the voice command.
[00105] In Example 25, the subject matter of Examples 16-24 includes, instructions to: obtain sensor data from a sensor of a device communicatively connected to the mobile device.
[00106] In Example 26, the subject matter of Examples 18-25 includes, wherein at least one of the mobile device or a device communicatively connected to the mobile device provides haptic feedback when an acknowledgement has been received.
[00107] In Example 27, the subject matter of Examples 16-26 includes, wherein the user input to trigger the event is a voice command and the trigger is activated based on the words used in the voice command.
[00108] In Example 28, the subject matter of Examples 17-27 includes, instructions to automatically transmit the notification based on the type of event.
[00109] In Example 29, the subject matter of Examples 16-28 includes, wherein the trigger is a movement of at least one of the mobile device or a device communicatively connected to the mobile device.
[00110] In Example 30, the subject matter of Examples 16-29 includes, wherein the trigger is a combination of a movement of the mobile device and a tone of the user's voice.
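The combined trigger of Example 30 can be sketched as a conjunction of a movement condition (accelerometer magnitude spike, e.g. a fall or a blow) and a voice-tone condition. Both thresholds below are assumptions for illustration; the claims do not fix them:

```python
import math

def accel_magnitude(ax, ay, az):
    """Magnitude of an accelerometer reading, in units of g."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def combined_trigger(accel, voice_rms, accel_threshold=2.5, rms_threshold=0.5):
    """Fire only when a sharp movement AND a raised voice tone coincide.

    Threshold values are illustrative placeholders.
    """
    movement = accel_magnitude(*accel) > accel_threshold
    tone = voice_rms > rms_threshold
    return movement and tone

print(combined_trigger((0.0, 0.0, 1.0), 0.1))  # resting device, calm voice
print(combined_trigger((3.0, 1.0, 2.0), 0.8))  # jolt plus raised voice
```

Requiring both conditions reduces false activations compared with either signal alone, which is the apparent point of combining them.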
[00111] Example 31 is a method for personal emergency data capture and alerting, the method comprising: receiving a trigger for an indication of an event on a mobile device; obtaining geolocation data from a global positioning device of the mobile device; obtaining sensor data from one or more sensors of the mobile device; determining a location of the mobile device based on the geolocation data; determining a type of event based on the sensor data and the trigger used to indicate the event; storing the sensor data and the geolocation data in a storage device of the mobile device; and outputting an activation indication to the mobile device based on the type of event.
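The method steps of Example 31 can be summarized as a single handler: gather geolocation and sensor data, classify the event from the trigger and the readings, store the captured data, and emit an activation indication. The sketch below is a simplified illustration; all function names, event labels, and the classification rules are hypothetical:

```python
import time

def handle_trigger(trigger, gps, sensors, storage):
    """Sketch of the claimed method: gather data, classify, store, activate.

    `gps` and `sensors` are callables standing in for device APIs;
    the event taxonomy below is invented for illustration.
    """
    geolocation = gps()            # obtain geolocation data
    sensor_data = sensors()       # obtain sensor data
    location = geolocation        # location determined from the geolocation data
    # Determine the type of event from the trigger plus sensor readings.
    if trigger == "voice" and sensor_data.get("loud"):
        event_type = "distress"
    elif trigger == "movement":
        event_type = "fall"
    else:
        event_type = "manual"
    # Store the sensor data and the geolocation data.
    storage.append({"time": time.time(), "sensors": sensor_data,
                    "geolocation": geolocation})
    # Output an activation indication based on the type of event.
    return {"event_type": event_type, "location": location,
            "activation": event_type != "manual"}

store = []
result = handle_trigger("movement", lambda: (37.0, -122.0),
                        lambda: {"accel": 3.2}, store)
print(result["event_type"], result["activation"], len(store))
```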
[00112] In Example 32, the subject matter of Example 31 includes, transmitting a notification of the event to a set of contacts, wherein the notification includes the geolocation data, the location of the mobile device, and wherein the set of contacts are contacts predetermined by the user for receiving event notifications based on the type of event.
[00113] In Example 33, the subject matter of Example 32 includes, receiving, from at least one member of the set of contacts, an acknowledgement of the notification; and displaying, on a display device of the mobile device, the contacts from which an acknowledgement has been received.
[00114] In Example 34, the subject matter of Examples 32-33 includes, wherein the notification of the event includes at least one of an audio stream and a video stream, and a first map generated using the geolocation data.
[00115] In Example 35, the subject matter of Examples 32-34 includes, receiving, from at least one member of the set of contacts, geolocation data for the at least one member of the set of contacts; and receiving, from at least one member outside the set of contacts, geolocation data for the at least one member outside the set of contacts.
[00116] In Example 36, the subject matter of Examples 34-35 includes, displaying, on a display device of the mobile device, a second map including a location indicated by the geolocation data of each contact of the set of contacts.
[00117] In Example 37, the subject matter of Examples 31-36 includes, wherein the set of contacts is limited to contacts with a location outside a predetermined radius distance of the location of the mobile device, wherein the radius distance is based on the type of event.
[00118] In Example 38, the subject matter of Examples 31-37 includes, obtaining, contemporaneously with the sensor data, physiological data about the user from a sensor of the mobile device; storing the physiological data in a storage device of the mobile device.
[00119] In Example 39, the subject matter of Examples 31-38 includes, wherein the user input to trigger the event is a voice command and the trigger is activated based on the tone of the voice used in the voice command.
[00120] In Example 40, the subject matter of Examples 31-39 includes, obtaining sensor data from a sensor of a device communicatively connected to the mobile device.
[00121] In Example 41, the subject matter of Examples 33-40 includes, wherein at least one of the mobile device or a device communicatively connected to the mobile device provides haptic feedback when an acknowledgement has been received.
[00122] In Example 42, the subject matter of Examples 31-41 includes, wherein the user input to trigger the event is a voice command and the trigger is activated based on the words used in the voice command.
[00123] In Example 43, the subject matter of Examples 32-42 includes, transmitting the notification based on the type of event.
[00124] In Example 44, the subject matter of Examples 31-43 includes, wherein the trigger is a movement of at least one of the mobile device or a device communicatively connected to the mobile device.
[00125] In Example 45, the subject matter of Examples 31-44 includes, wherein the trigger is a combination of a movement of the mobile device and a tone of the user's voice.
[00126] Example 46 is at least one machine-readable medium including instructions that, when executed by processing circuitry, cause the processing circuitry to perform operations to implement any of Examples 1-45.
[00127] Example 47 is an apparatus comprising means to implement any of Examples 1-45.
[00128] Example 48 is a system to implement any of Examples 1-45.
[00129] Example 49 is a method to implement any of Examples 1-45.
[00130] Method examples described herein may be machine or computer-implemented at least in part. Some examples may include a computer-readable medium or machine-readable medium encoded with instructions operable to configure an electronic device to perform methods as described in the above examples. An implementation of such methods may include code, such as microcode, assembly language code, a higher-level language code, or the like. Such code may include computer readable instructions for performing various methods. The code may form portions of computer program products. Further, in an example, the code may be tangibly stored on one or more volatile, non-transitory, or non-volatile tangible computer-readable media, such as during execution or at other times. Examples of these tangible computer-readable media may include, but are not limited to, hard disks, removable magnetic disks, removable optical disks (e.g., compact disks and digital video disks), magnetic cassettes, memory cards or sticks, random access memories (RAMs), read only memories (ROMs), and the like.
Claims
1. A system for personal emergency data capture and alerting, the system comprising:
a processing complex, the processing complex including one or more processors in a mobile device of a user; and
at least one machine readable medium in the mobile device, the at least one machine readable medium including instructions that, when executed, cause the processing complex to control electronic hardware of the mobile device to:
receive a trigger for an indication of an event on the mobile device;
obtain geolocation data from a global positioning device of the mobile device;
obtain sensor data from one or more sensors of the mobile device;
determine a location of the mobile device based on the geolocation data;
determine a type of event based on the sensor data and the trigger used to indicate the event;
store the sensor data and the geolocation data in a storage device of the mobile device; and
output an activation indication to the mobile device based on the type of event.
2. The system of claim 1, further including instructions to:
transmit a notification of the event to a set of contacts, wherein the notification includes the geolocation data, the location of the mobile device, and wherein the set of contacts are contacts predetermined by the user for receiving event notifications based on the type of event.
3. The system of claim 2, further including instructions to:
receive, from at least one member of the set of contacts, an
acknowledgement of the notification; and
display, on a display device of the mobile device, the contacts from which an acknowledgement has been received.
4. The system of claim 2, wherein the notification of the event includes at least one of an audio stream and a video stream, and a first map generated using the geolocation data.
5. The system of claim 1, further including instructions to:
receive, from at least one member of the set of contacts, geolocation data for the at least one member of the set of contacts; and receive, from at least one member outside the set of contacts, geolocation data for the at least one member outside the set of contacts.
6. The system of claim 4, further including instructions to:
display, on a display device of the mobile device, a second map including a location indicated by the geolocation data of each contact of the set of contacts.
7. The system of claim 1, wherein the set of contacts is limited to contacts with a location outside a predetermined radius distance of the location of the mobile device, wherein the radius distance is based on the type of event.
8. The system of claim 1, further including instructions to:
obtain, contemporaneously with the sensor data, physiological data about the user from a sensor of the mobile device;
store the physiological data in a storage device of the mobile device.
9. The system of claim 1, wherein the user input to trigger the event is a voice command and the trigger is activated based on the tone of the voice used in the voice command.
10. The system of claim 1, further including instructions to:
obtain sensor data from a sensor of a device communicatively connected to the mobile device.
11. The system of claim 3, wherein at least one of the mobile device or a device communicatively connected to the mobile device provides haptic feedback when an acknowledgement has been received.
12. The system of claim 1, wherein the user input to trigger the event is a voice command and the trigger is activated based on the words used in the voice command.
13. The system of claim 2, further including instructions to automatically transmit the notification based on the type of event.
14. The system of claim 1, wherein the trigger is a movement of at least one of the mobile device or a device communicatively connected to the mobile device.
15. The system of claim 1, wherein the trigger is a combination of a movement of the mobile device and a tone of the user's voice.
16. At least one non-transitory computer readable medium including instructions for personal emergency data capture and alerting that, when executed by at least one processor, cause the at least one processor to:
receive a trigger for an indication of an event on a mobile device;
obtain geolocation data from a global positioning device of the mobile device;
obtain sensor data from one or more sensors of the mobile device;
determine a location of the mobile device based on the geolocation data;
determine a type of event based on the sensor data and the trigger used to indicate the event;
store the sensor data and the geolocation data in a storage device of the mobile device; and
output an activation indication to the mobile device based on the type of event.
17. The at least one computer readable medium of claim 16, further comprising instructions to: transmit a notification of the event to a set of contacts, wherein the notification includes the geolocation data, the location of the mobile device, and
wherein the set of contacts are contacts predetermined by the user for receiving event notifications based on the type of event.
18. The at least one computer readable medium of claim 17, further comprising instructions to: receive, from at least one member of the set of contacts, an
acknowledgement of the notification; and display, on a display device of the mobile device, the contacts from which an acknowledgement has been received.
19. The at least one computer readable medium of claim 17, wherein the notification of the event includes at least one of an audio stream and a video stream, and a first map generated using the geolocation data.
20. The at least one computer readable medium of claim 16, further comprising instructions to: receive, from at least one member of the set of contacts, geolocation data for the at least one member of the set of contacts; and receive, from at least one member outside the set of contacts, geolocation data for the at least one member outside the set of contacts.
21. A method for personal emergency data capture and alerting, the method comprising:
receiving a trigger for an indication of an event on a mobile device;
obtaining geolocation data from a global positioning device of the mobile device;
obtaining sensor data from one or more sensors of the mobile device;
determining a location of the mobile device based on the geolocation data;
determining a type of event based on the sensor data and the trigger used to indicate the event;
storing the sensor data and the geolocation data in a storage device of the mobile device; and
outputting an activation indication to the mobile device based on the type of event.
22. The method of claim 21, further comprising: transmitting a notification of the event to a set of contacts, wherein the notification includes the geolocation data, the location of the mobile device, and wherein the set of contacts are contacts predetermined by the user for receiving event notifications based on the type of event.
23. The method of claim 22, further comprising: receiving, from at least one member of the set of contacts, an
acknowledgement of the notification; and displaying, on a display device of the mobile device, the contacts from which an acknowledgement has been received.
24. The method of claim 22, wherein the notification of the event includes at least one of an audio stream and a video stream, and a first map generated using the geolocation data.
25. The method of claim 22, further comprising: receiving, from at least one member of the set of contacts, geolocation data for the at least one member of the set of contacts; and
receiving, from at least one member outside the set of contacts, geolocation data for the at least one member outside the set of contacts.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201662431753P | 2016-12-08 | 2016-12-08 | |
US62/431,753 | 2016-12-08 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2018107031A1 true WO2018107031A1 (en) | 2018-06-14 |
Family
ID=62491383
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/US2017/065316 WO2018107031A1 (en) | 2016-12-08 | 2017-12-08 | Personal emergency data capture and alerting |
Country Status (1)
Country | Link |
---|---|
WO (1) | WO2018107031A1 (en) |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2020074978A3 (en) * | 2018-09-05 | 2020-05-22 | Mobile Software As | System and method for alerting, recording and tracking |
CN113920665A (en) * | 2021-09-30 | 2022-01-11 | 中国工商银行股份有限公司 | Security management method and system comprising security equipment |
TWI773141B (en) * | 2021-02-19 | 2022-08-01 | 杜昱璋 | Hazard Prediction and Response Device and System |
GB2605381A (en) * | 2021-03-29 | 2022-10-05 | Tended Ltd | Wearable device and a screen for a wearable device |
CN119904964A (en) * | 2023-10-26 | 2025-04-29 | 北京小米移动软件有限公司 | Method, device and medium for calling for help based on fall detection |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110130112A1 (en) * | 2008-01-28 | 2011-06-02 | Michael Saigh | Personal Safety Mobile Notification System |
KR20140030848A (en) * | 2012-09-04 | 2014-03-12 | 이용인 | Method for responding to emergency situations by cooperation between users using smart device |
US20150348389A1 (en) * | 2014-05-27 | 2015-12-03 | Lg Electronics Inc. | Smart band and emergency state monitoring method using the same |
WO2016071006A1 (en) * | 2014-11-06 | 2016-05-12 | Rudolf King | Personal emergency response system and method of operation |
KR20160104332A (en) * | 2015-02-26 | 2016-09-05 | 엘지이노텍 주식회사 | A system for providing notification service for emergency using imaging apparatus and a method for the same |
- 2017-12-08: WO PCT/US2017/065316 patent/WO2018107031A1/en — active Application Filing
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10667313B2 (en) | Wireless communication system and method for monitoring the pairing status between two devices | |
US11323659B2 (en) | Video communication device, video communication method, and video communication mediating method | |
US11381650B2 (en) | System and server for analyzing and integrating data collected by an electronic device | |
WO2018107031A1 (en) | Personal emergency data capture and alerting | |
CA3075301C (en) | Method and device for responding to an audio inquiry | |
US9704377B2 (en) | Systems and methods for managing an emergency situation | |
US10701315B2 (en) | Video communication device and video communication method | |
US20170124834A1 (en) | Systems and methods for secure collection of surveillance data | |
DE112018003225B4 (en) | Method and system for delivering an event-based voice message with coded meaning | |
WO2018008224A1 (en) | Robot, robot system, and recording medium | |
US9064392B2 (en) | Method and system for awareness detection | |
KR20140088836A (en) | Methods and systems for searching utilizing acoustical context | |
US12125490B2 (en) | System and method for digital assistant receiving intent input from a secondary user | |
CA3065096C (en) | Adaptation of the auditory output of an electronic digital assistant in accordance with an indication of the acoustic environment | |
AU2018422609A1 (en) | System, device, and method for an electronic digital assistant recognizing and responding to an audio inquiry by gathering information distributed amongst users in real-time and providing a calculated result | |
US10181253B2 (en) | System and method for emergency situation broadcasting and location detection | |
US12125480B2 (en) | System and method for encouraging group discussion participation | |
US20240288929A1 (en) | Methods and systems for determining user interest relevant to co-located users | |
US10887552B1 (en) | Door-knocking for teleconferencing | |
WO2023049358A1 (en) | Systems and methods for providing assistance in an emergency | |
US11509986B1 (en) | Headphones restricted to use with a particular controlled-environment facility resident communication and/or media device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 17879485; Country of ref document: EP; Kind code of ref document: A1 |
| NENP | Non-entry into the national phase | Ref country code: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 17879485; Country of ref document: EP; Kind code of ref document: A1 |