CN117333341B - Accident handling method and smart helmet
- Publication number
- CN117333341B (application CN202311116745.XA)
- Authority
- CN
- China
- Prior art keywords
- data
- accident
- helmet
- rescue
- terminal
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G06Q50/265—Personal security, identity or safety (G—PHYSICS; G06Q—Information and communication technology [ICT] specially adapted for administrative, commercial, financial, managerial or supervisory purposes; G06Q50/26—Government or public services)
- A42B3/0406—Accessories for helmets (A—HUMAN NECESSITIES; A42B—Hats; head coverings; A42B3/00—Helmets; helmet covers; other protective head coverings; A42B3/04—Parts, details or accessories of helmets)
- A42B3/0433—Detecting, signalling or lighting devices (under A42B3/0406)
- A42B3/046—Means for detecting hazards or accidents (under A42B3/0433)
- H04L67/12—Protocols specially adapted for proprietary or special-purpose networking environments, e.g. medical networks, sensor networks, networks in vehicles or remote metering networks (H—ELECTRICITY; H04L—Transmission of digital information; H04L67/00—Network arrangements or protocols for supporting network services or applications)
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast, for receiving images from a plurality of remote sources (H04N—Pictorial communication, e.g. television; H04N7/18—Closed-circuit television systems)
Abstract
The application provides an accident handling method and a smart helmet, and belongs to the technical field of computers. The method is suitable for a helmet worn by a delivery capacity (i.e., a delivery rider), where the helmet has established a communication connection with a terminal and comprises a data acquisition module and a processor. The data acquisition module collects motion data of the helmet while the helmet is worn, and the processor performs accident identification based on the collected motion data. When an accident is identified, the helmet sends an accident notification for triggering a rescue service to a server and sends a data acquisition instruction to the terminal. The data acquisition instruction instructs the terminal communicatively connected with the helmet to collect at least one of environmental data for assisting the rescue service and status data of the delivery capacity, and to send the collected data to the server. The scheme ensures that the delivery capacity can be rescued in a timely manner when an accident occurs, and enriches the functions of the helmet.
Description
Technical Field
The application relates to the technical field of computers, in particular to an accident handling method and an intelligent helmet.
Background
Riding tools are a common means of travel, and many people use them to get around. To improve riding safety, riders are required to wear a helmet when using a riding tool. However, most current helmets only protect the rider's head when a traffic accident occurs, and their effect is limited.
Disclosure of Invention
The embodiments of the present application provide an accident handling method and a smart helmet, which ensure that the delivery capacity can be rescued in a timely manner when an accident occurs and enrich the functions of the helmet. The technical solutions are as follows:
in one aspect, there is provided an accident handling method suitable for a helmet worn by a delivery capacity, the helmet having established a communication connection with a terminal, the helmet comprising a data acquisition module and a processor, the data acquisition module being electrically connected to the processor, the method comprising:
the data acquisition module acquires the movement data of the helmet and sends the movement data to the processor when the helmet is worn;
The processor performs accident identification based on the motion data;
under the condition that an accident is identified, the helmet sends an accident notification for triggering a rescue service to the server and sends a data acquisition instruction to the terminal, the data acquisition instruction instructing the terminal communicatively connected with the helmet to collect at least one of environmental data for assisting the rescue service and status data of the delivery capacity, and to send the collected data to the server.
In one possible implementation, the method further includes:
the processor, in response to an operation of a first key on the helmet by the delivery capacity, sends a rescue confirmation notification to the server, wherein the rescue confirmation notification is used for indicating that the delivery capacity needs to be rescued.
In one possible implementation, the data acquisition module includes an acceleration sensor, the athletic data includes an acceleration of the helmet, and the processor performs accident identification based on the athletic data, including:
The processor is used for identifying at least one of helmet weightlessness, collision, rolling and posture change based on the acceleration of the helmet acquired by the acceleration sensor;
the processor determines that an accident has occurred when at least one of weight loss, impact, roll, and attitude change of the helmet is identified.
In one possible implementation, the method further includes:
the helmet sends the motion data corresponding to the accident to the server, the collected data are used for carrying out accident judgment and verification together with the motion data corresponding to the accident in the server to obtain an accident verification result, and the accident verification result is used for triggering rescue operation in the rescue service.
On the other hand, an accident handling method is provided, and is suitable for a terminal, and the method comprises the following steps:
the terminal establishes a communication connection with the helmet of the delivery capacity, and a binding relationship between the helmet and the personal identity information of the delivery capacity is determined;
Receiving a data acquisition instruction sent by the helmet, wherein the data acquisition instruction and an accident notification sent to a server for triggering rescue service are sent under the condition that the helmet recognizes that an accident occurs in delivery capacity through motion data perceived by a data acquisition module;
Responsive to the data acquisition instruction, acquiring at least one of environmental data of the terminal and status data of the distribution capacity;
and sending the acquired data to the server, wherein the acquired data is used for assisting the rescue service.
In one possible implementation, the collecting, in response to the data acquisition instruction, at least one of environmental data of the terminal and status data of the delivery capacity includes at least one of the following steps:
Responding to the data acquisition instruction, controlling a front camera and/or a rear camera of the terminal of the distribution capacity, shooting image data of the environment where the terminal is located and/or recording video data of the environment where the terminal is located;
Responding to the data acquisition instruction, and controlling a recording device of the terminal of the distribution capacity to record audio data of the environment where the terminal is positioned;
And responding to the data acquisition instruction, acquiring positioning data and/or order state data in a first time period before the accident occurs and a second time period after the accident occurs, wherein the order state data is used for indicating whether the delivery capacity has an order to be delivered or not.
In one possible implementation, in response to the data acquisition instruction, acquiring the environmental data of the terminal includes:
Responding to the data acquisition instruction, and calling the front camera and the rear camera to respectively shoot so as to obtain image data shot by the front camera and image data shot by the rear camera;
determining a camera which is not blocked based on the image data shot by the front camera and the image data shot by the rear camera;
and calling the non-occluded camera to record video data.
In one possible implementation manner, after the sending the collected data to the server, the method further includes:
receiving a rescue confirmation notification sent by the server;
responding to the rescue confirmation notification, and displaying a rescue confirmation interface;
responding to a confirmation operation on the rescue confirmation interface, the display duration of the rescue confirmation interface exceeding a third duration, or receipt of a first rescue confirmation notification sent by the helmet, by sending a second rescue confirmation notification to the server, wherein the second rescue confirmation notification and the first rescue confirmation notification are used for indicating that the delivery capacity needs to be rescued, the helmet is provided with a first key, and the first rescue confirmation notification is sent to the terminal by the helmet in response to a confirmation operation on the first key.
In one possible implementation manner, the motion data corresponding to the accident sent to the server is sent when the helmet recognizes that the accident occurs in the delivery capacity through the motion data perceived by the data acquisition module, the acquired data is used for performing accident judgment and verification together with the motion data corresponding to the accident in the server to obtain an accident verification result, and the accident verification result is used for triggering rescue operation in the rescue service.
In another aspect, there is provided an accident handling method, performed by a server, the method comprising:
Receiving an accident notification sent by the helmet and used for triggering a rescue service, wherein the accident notification is sent under the condition that the helmet recognizes an accident, and receiving movement data corresponding to the accident, and the movement data corresponding to the accident is movement data of the helmet when the accident happens;
Acquiring data which is acquired by a terminal in communication connection with the helmet and is used for assisting the rescue service, wherein the data acquired by the terminal comprises at least one of environment data and distribution capacity state data, the acquired data are acquired after the terminal receives a data acquisition instruction sent by the helmet, and the data acquisition instruction is sent under the condition that the helmet recognizes an accident;
and executing the rescue service, wherein the rescue service comprises the step of executing rescue operation after accident judgment and verification are carried out based on the motion data corresponding to the accident and the data acquired by the terminal.
In one possible implementation, the method further includes:
Sending a rescue confirmation notice to the terminal, wherein the rescue confirmation notice is used for displaying a rescue confirmation interface in the terminal;
Receiving a second rescue confirmation notification sent by the terminal, and executing a rescue operation in the rescue service based on the second rescue confirmation notification, wherein the second rescue confirmation notification is used for being sent when the terminal detects the confirmation operation of the rescue interface, the display duration of the rescue interface exceeds a third duration or the first rescue confirmation notification sent by the helmet is received, the helmet is provided with a first key, and the first rescue confirmation notification is sent to the terminal by the helmet in response to the confirmation operation of the first key.
In a possible implementation manner, the performing a rescue operation in the rescue service includes at least one of the following:
Sending a rescue notice to emergency contacts of the delivery capacity;
And sending a rescue notice to the emergency rescue mechanism.
In another aspect, an accident handling method is provided, suitable for a helmet worn by a delivery capacity, the helmet having established a communication connection with a server, the helmet comprising a data acquisition module and a processor, the method comprising:
a data acquisition module of the helmet acquires movement data of the helmet when the helmet is worn;
The processor of the helmet performs accident recognition based on the motion data;
In the event of an accident being identified, the helmet sends an accident notification to the server for triggering a rescue service.
In one possible implementation, the method further includes:
the helmet sends the motion data corresponding to the accident to the server, and the motion data corresponding to the accident is used for assisting the rescue service in the server.
In one possible implementation, the method further includes:
and responding to the first key operation of the delivery capacity on the helmet, sending a rescue confirmation notice to the server, wherein the rescue confirmation notice is used for indicating that the delivery capacity needs to be rescued.
In another aspect, a helmet adapted for distribution capacity wear is provided, the helmet having a communication connection established with a server, the helmet comprising a data acquisition module and a processor;
The data acquisition module is used for acquiring the motion data of the helmet when the helmet is worn;
the processor is used for carrying out accident recognition based on the motion data;
The helmet is used for sending an accident notification for triggering a rescue service to the server under the condition that the accident is identified.
In one possible implementation, the helmet is further configured to send motion data corresponding to the accident to the server, where the motion data corresponding to the accident is used to assist the rescue service in the server.
In one possible implementation, the helmet is further configured to send a rescue confirmation notification to the server in response to a first key operation of the helmet by the distribution capacity, the rescue confirmation notification being configured to indicate that the distribution capacity requires rescue.
In another aspect, there is provided an intelligent helmet adapted for distribution capacity wear, the helmet comprising a data acquisition module and a processor, the data acquisition module being electrically connected to the processor;
The data acquisition module is used for acquiring the movement data of the helmet and sending the movement data to the processor under the condition that the helmet is worn;
the processor is used for carrying out accident recognition based on the motion data;
The processor is further configured to, in case an accident is identified, send an accident notification for triggering a rescue service to the server and send a data acquisition instruction to the terminal, where the data acquisition instruction instructs the terminal communicatively connected with the helmet to collect at least one of environmental data for assisting the rescue service and status data of the delivery capacity, and to send the collected data to the server.
In one possible implementation, the processor is further configured to send a rescue confirmation notification to the server in response to a first key operation of the helmet by the distribution capacity, the rescue confirmation notification indicating that the distribution capacity requires rescue.
In one possible implementation, the data acquisition module comprises an acceleration sensor, and the motion data comprises an acceleration of the helmet;
The processor is used for identifying at least one of helmet weightlessness, collision, rolling and posture change based on the acceleration of the helmet acquired by the acceleration sensor;
the processor is used for determining that an accident occurs when at least one of the weightlessness, the collision, the rolling and the posture change of the helmet is identified.
In a possible implementation manner, the helmet is configured to send motion data corresponding to the accident to the server, where the collected data is used to perform accident judgment and verification together with the motion data corresponding to the accident in the server, so as to obtain an accident verification result, and the accident verification result is used to trigger a rescue operation in the rescue service.
In another aspect, a terminal is provided, the terminal including a processor and a memory, the memory storing at least one program code, the at least one program code loaded and executed by the processor to implement the incident processing method according to any one of the implementations described above.
In another aspect, a server is provided, the server including a processor and a memory, the memory storing at least one program code, the at least one program code loaded and executed by the processor to implement the incident processing method of any of the implementations described above.
In another aspect, a computer readable storage medium is provided, in which at least one program code is stored, the at least one program code being loaded and executed by a processor to implement an incident handling method as described in any of the above implementations.
In another aspect, a computer program product is provided, the computer program product comprising at least one piece of program code that is loaded and executed by a processor to implement an incident handling method as described in any one of the implementations above.
The technical solutions provided by the embodiments of the present application have at least the following beneficial effects:
The embodiments of the present application provide an accident handling method in which accident identification is performed by the helmet and a rescue service is triggered when an accident is identified, ensuring that the delivery capacity can be rescued in a timely manner and enriching the functions of the helmet. The helmet also sends a data acquisition instruction to the terminal, so that the terminal can collect other data related to the accident to assist the rescue service, improving the accuracy of the rescue service.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present application, the drawings required for the description of the embodiments will be briefly described below, and it is apparent that the drawings in the following description are only some embodiments of the present application, and other drawings may be obtained according to these drawings without inventive effort for a person skilled in the art.
FIG. 1 is a schematic diagram of an accident handling system according to an embodiment of the present application;
FIG. 2 is a schematic diagram of an accident handling system according to an embodiment of the present application;
FIG. 3 is a schematic diagram of an accident handling system according to an embodiment of the present application;
FIG. 4 is a flow chart of an accident handling method according to an embodiment of the present application;
FIG. 5 is a flow chart of an accident handling method provided by an embodiment of the present application;
FIG. 6 is a flow chart of an accident handling method provided by an embodiment of the present application;
FIG. 7 is a flow chart of an accident handling method according to an embodiment of the present application;
FIG. 8 is a flow chart of an accident handling method provided by an embodiment of the present application;
Fig. 9 is a schematic structural view of a helmet according to an embodiment of the present application;
fig. 10 is a schematic structural diagram of a terminal according to an embodiment of the present application;
fig. 11 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
For the purpose of making the objects, technical solutions and advantages of the present application more apparent, the embodiments of the present application will be described in further detail with reference to the accompanying drawings.
The terms "first," "second," "third," and "fourth" and the like in the description and in the claims and drawings are used for distinguishing between different objects and not necessarily for describing a particular sequential or chronological order. Furthermore, the terms "comprising," "including," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
It should be noted that, all actions of acquiring signals, information or data in the present application are performed under the condition of conforming to the corresponding data protection rule policy of the country of the location and obtaining the authorization given by the owner of the corresponding device.
The accident handling method provided by the embodiment of the application is executed by an accident handling system. In some embodiments, the accident handling system includes a helmet and a server. Fig. 1 is a schematic diagram of an accident handling system according to an embodiment of the present application. The accident management system includes a helmet 101 and a server 102. Helmet 101 establishes a communication connection with server 102.
Alternatively, after the helmet 101 is worn, motion data of itself may be collected, and after an accident is identified based on the motion data, an accident notification is sent to the server 102 to trigger a rescue procedure in the server 102 so that the delivery capacity may be timely rescued.
In some embodiments, as shown in fig. 2, the incident processing system further includes a terminal 103. Terminal 103 establishes communication connections with helmet 101 and server 102, respectively. Alternatively, the terminal 103 may be provided with an application associated with the helmet 101, which may be served by the server 102. Thus, terminal 103 is communicatively coupled to helmet 101 and server 102 via the installed application. Alternatively, terminal 103 may establish a binding relationship of the helmet with the personal identification information of the shipping capacity through a communication connection established with the helmet 101. Optionally, after the accident is identified based on the movement data, the helmet 101 may also send a data acquisition command to the terminal 103, where the terminal 103 acquires environmental data, status data of delivery capacity, etc. in response to the data acquisition command, and sends the acquired data to the server 102, so as to assist in a rescue process in the server 102, so that the delivery capacity can be more accurately rescued.
In some embodiments, helmet 101 is a helmet worn by the delivery capacity and terminal 103 is a terminal held by the delivery capacity. The delivery capacity wears helmet 101 during delivery. If an accident occurs during the delivery process, helmet 101 can detect it, send an accident notification for triggering a rescue service to server 102, instruct terminal 103 held by the delivery capacity to collect at least one of environmental data of the environment in which the delivery capacity is located for assisting the rescue service and status data of the delivery capacity, and have the collected data sent to server 102.
In some embodiments, as shown in fig. 3, the server 102 includes a helmet server 1021, a security center 1022, and a visual intelligent center 1023. Optionally, the helmet 101 establishes a communication connection with the helmet server 1021, the terminal 103 establishes a communication connection with the security center 1022 and the visual intelligent center 1023, and the helmet server 1021 establishes a communication connection with the security center 1022.
Optionally, the helmet 101 transmits the accident notification, the movement data corresponding to the accident, etc. to the helmet server 1021, the helmet server 1021 forwards the accident notification, the movement data corresponding to the accident, etc. to the security center 1022 (optionally, the helmet server 1021 may also store the movement data corresponding to the accident), the terminal 103 transmits non-image data in the acquired data to the security center 1022, and transmits image data to the visual intelligent center 1023.
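The data routing just described can be illustrated with a small sketch. The endpoint objects, method names, and data names below are assumptions for illustration only and are not defined by the patent.

```python
# Illustrative sketch of the routing described above: the terminal sends
# non-image data to the security center and image data to the visual
# intelligence center. Endpoint objects and data names are assumptions.
def route_collected_data(collected, security_center, visual_intelligence_center):
    """collected: dict mapping a data name to its payload."""
    for name, payload in collected.items():
        if name in ("environment_image", "environment_video"):
            visual_intelligence_center.upload(name, payload)
        else:
            security_center.upload(name, payload)
```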
Next, the structure of the helmet will be exemplarily described in the embodiments of the present application:
In some embodiments, the helmet includes a data acquisition module and a processor, the data acquisition module and the processor are electrically connected, the data acquisition module and the processor may be located at any position in the helmet, and the positions of the data acquisition module and the processor are not limited in the embodiments of the present application. Optionally, the data acquisition module comprises an acceleration sensor, the helmet acquires the acceleration of the helmet through the acceleration sensor, and whether an accident occurs is determined based on the acceleration of the helmet. Of course, the embodiment of the present application is only exemplified by the data acquisition module including the acceleration sensor, and in another embodiment, the data acquisition module includes a gyroscope sensor, etc., and the embodiment of the present application is not limited to the data acquisition module.
In some embodiments, the helmet further comprises a helmet wearing detection module for detecting whether the helmet is worn. Optionally, the helmet wearing detection module includes an infrared sensor located at the top of the inner side of the helmet; the infrared sensor can both emit and receive infrared rays. If the infrared sensor emits infrared rays and then receives the reflected infrared rays, it is determined that the helmet is worn; if the infrared sensor does not receive the reflected infrared rays after emitting them, it is determined that the helmet is not worn. For example, if the infrared sensor receives reflected infrared light, it sends a signal 1 to the processor, with signal 1 indicating that the helmet is worn; if the infrared sensor does not receive reflected infrared light, it sends a signal 0 to the processor, with signal 0 indicating that the helmet is not worn.
Of course, the embodiment of the present application is only exemplified by taking the example that the helmet wearing detection module includes the infrared sensor, and in another embodiment, the helmet wearing detection module includes the distance sensor, etc., and the embodiment of the present application does not limit the helmet wearing detection module.
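As a rough illustration of the infrared-based wear-detection variant above, the sketch below polls an infrared sensor and reports the 1/0 wear signal to the processor. The object interfaces and the polling interval are illustrative assumptions, not details given in the patent.

```python
# Minimal sketch of the infrared wear-detection logic described above.
# `ir_sensor` and `processor` are assumed objects; the 100 ms poll interval
# is an illustrative value.
import time

WORN_SIGNAL = 1      # signal 1: helmet is worn
NOT_WORN_SIGNAL = 0  # signal 0: helmet is not worn

def poll_wear_state(ir_sensor, processor, interval_s=0.1):
    """Emit IR pulses and report a wear signal to the processor."""
    while True:
        ir_sensor.emit()
        # If the emitted infrared light is reflected back (a head is present
        # under the inner top of the helmet), the helmet is worn.
        signal = WORN_SIGNAL if ir_sensor.reflection_received() else NOT_WORN_SIGNAL
        processor.on_wear_signal(signal)
        time.sleep(interval_s)
```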
In some embodiments, the helmet is further provided with a first key. Optionally, the first key may be disposed on the outer side of the helmet, which helps reduce accidental touches of the first key and makes it convenient for the delivery capacity to operate. Optionally, the first key is used to send a rescue confirmation notification to the server.
Fig. 4 is a flowchart of an accident handling method provided by an embodiment of the present application, where an execution body is exemplified as a helmet, and the helmet may be a helmet worn by any user, and an embodiment of the present application is exemplified as a helmet worn by a distribution capacity, and the helmet includes a data acquisition module and a processor, where the data acquisition module is electrically connected to the processor. Referring to fig. 4, the method includes:
401. in the case of a helmet being worn, the data acquisition module acquires movement data of the helmet and sends the movement data to the processor.
402. The processor performs accident recognition based on the motion data.
403. Under the condition that the accident is identified, the helmet sends an accident notification for triggering the rescue service to the server, and sends a data acquisition instruction to the terminal, wherein the data acquisition instruction instructs the terminal connected with the helmet through communication to acquire at least one of environment data and distribution capacity state data for assisting the rescue service, and the acquired data is sent to the server.
According to the accident handling method provided by the embodiment of the application, the accident can be identified through the helmet, and under the condition of identifying the accident, the rescue service is triggered, so that the distribution capacity can be timely rescued, the functions of the helmet are enriched, and the helmet can also send a data acquisition instruction to the terminal, so that the terminal can acquire other data related to the accident, the rescue service is assisted, and the accuracy of the rescue service is improved.
In one possible implementation, the method further includes:
The processor is responsive to a first key operation of the helmet by the dispatch capacity to send a rescue confirmation notification to the server indicating that the dispatch capacity requires rescue.
In one possible implementation, the data acquisition module includes an acceleration sensor, the athletic data includes an acceleration of the helmet, and the processor performs accident identification based on the athletic data, including:
the processor is used for identifying at least one of helmet weightlessness, collision, rolling and posture change based on the acceleration of the helmet acquired by the acceleration sensor;
The processor determines that an accident has occurred when at least one of the helmet weightlessness, impact, roll, and change in attitude is identified.
In one possible implementation, the processor identifies at least one of helmet weight loss, impact, roll, and attitude change based on acceleration of the helmet acquired by the acceleration sensor, including any one of:
The processor determines the total acceleration of the helmet based on the acceleration of the helmet in the three-axis direction, and determines that the weightlessness occurs if the total acceleration of the helmet is smaller than a first threshold;
The processor determines that an impact has occurred when it is recognized that the acceleration of the helmet in either axial direction is greater than a second threshold;
Determining the Euler angle of the helmet based on the acceleration of the helmet in the three-axis direction by the processor, and determining that the helmet is overturned if the change amplitude of the Euler angle of the helmet exceeds a third threshold value and the change duration does not exceed a second duration;
And if the change amplitude of the Euler angle of the helmet exceeds a third threshold value and the change duration exceeds a second duration, determining that the posture of the helmet is changed.
In one possible implementation, the method further includes:
the helmet sends the movement data corresponding to the accident to the server, the collected data are used for carrying out accident judgment and verification together with the movement data corresponding to the accident in the server, an accident verification result is obtained, and the accident verification result is used for triggering rescue operation in a rescue service.
Any combination of the above optional solutions may be adopted to form an optional embodiment of the present application, which is not described herein.
Fig. 5 is a flowchart of an accident handling method according to an embodiment of the present application, where an execution body is taken as a terminal for example to perform an exemplary description. Referring to fig. 5, the method includes:
501. The terminal establishes communication connection with the helmets of the distribution capacity, and the binding relation between the helmets and the personal identity information of the distribution capacity is determined.
502. The terminal receives a data acquisition instruction sent by the helmet, and the data acquisition instruction and an accident notification sent to the server for triggering the rescue service are sent under the condition that the helmet recognizes that an accident occurs in the delivery capacity through the motion data perceived by the data acquisition module.
503. The terminal is responsive to the data acquisition instructions to acquire at least one of environmental data and status data of the distribution capacity of the terminal.
504. The terminal sends the collected data to the server, and the collected data is used for assisting the rescue service.
According to the accident handling method provided by the embodiment of the application, the accident can be identified through the helmet, and under the condition of identifying the accident, the rescue service is triggered, so that the distribution capacity can be timely rescued, the functions of the helmet are enriched, and the helmet can also send a data acquisition instruction to the terminal, so that the terminal can acquire other data related to the accident, the rescue service is assisted, and the accuracy of the rescue service is improved.
In one possible implementation, in response to the data acquisition instruction, at least one of environmental data and status data of delivery capacity of the acquisition terminal includes at least one of:
Responding to a data acquisition instruction, controlling a front camera and/or a rear camera of a terminal of distribution capacity, shooting image data of an environment where the terminal is positioned and/or recording video data of the environment where the terminal is positioned;
responding to a data acquisition instruction, and controlling a recording device of the terminal of the delivery capacity to record audio data of the environment where the terminal is located;
and responding to the data acquisition instruction, acquiring positioning data and/or order state data in a first time period before the accident occurs and a second time period after the accident occurs, wherein the order state data is used for indicating whether the delivery capacity has an order to be delivered.
In one possible implementation, in response to a data acquisition instruction, acquiring environmental data of a terminal includes:
Responding to a data acquisition instruction, and calling the front camera and the rear camera to respectively shoot so as to obtain image data shot by the front camera and image data shot by the rear camera;
Determining a camera which is not blocked based on the image data shot by the front camera and the image data shot by the rear camera;
and calling the non-occluded camera to record video data.
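The following sketch illustrates one way the unoccluded camera could be selected from the two test shots described above. The mean-brightness/variance heuristic, thresholds, and camera interface are assumptions for illustration, not details specified in the patent.

```python
# Sketch: pick an unoccluded camera before recording video.
# Camera interface (capture_frame/record_video) and thresholds are assumed.
import numpy as np

BRIGHTNESS_THRESHOLD = 20.0  # a nearly black test frame suggests a blocked lens
VARIANCE_THRESHOLD = 5.0     # a flat, low-variance frame also suggests occlusion

def frame_is_occluded(frame: np.ndarray) -> bool:
    return frame.mean() < BRIGHTNESS_THRESHOLD or frame.std() < VARIANCE_THRESHOLD

def choose_camera_and_record(front_camera, rear_camera):
    """Shoot one frame with each camera, then record with an unoccluded one."""
    for camera in (front_camera, rear_camera):
        test_frame = camera.capture_frame()
        if not frame_is_occluded(test_frame):
            return camera.record_video()
    return None  # both cameras occluded; fall back to audio/positioning data only
```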
In one possible implementation, after the collected data is sent to the server, the method further includes:
Responding to the rescue confirmation notice, and displaying a rescue confirmation interface;
Responding to the confirmation operation of the rescue confirmation interface, wherein the display time of the rescue confirmation interface exceeds the third time or receives a first rescue confirmation notification sent by the helmet, and sending a second rescue confirmation notification to the server, wherein the second rescue confirmation notification and the first rescue confirmation notification are used for indicating that the delivery capacity needs to be helped, the helmet is provided with a first key, and the first rescue confirmation notification is sent to the terminal by the helmet in response to the confirmation operation of the first key.
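A minimal sketch of this terminal-side confirmation flow follows. The UI, server, and helmet-event interfaces, as well as the value used for the third duration, are illustrative assumptions.

```python
# Sketch of the terminal-side rescue-confirmation flow described above.
# Interface names and the 30 s timeout are assumptions for illustration.
import time

THIRD_DURATION_S = 30  # display timeout after which rescue is confirmed by default

def handle_rescue_confirmation(ui, server, helmet_events, poll_interval_s=0.5):
    """Show the confirmation interface; confirm on user tap, timeout, or helmet key."""
    ui.show_rescue_confirmation_interface()
    shown_at = time.monotonic()
    while True:
        if ui.confirmation_tapped():                            # rider confirmed on screen
            reason = "terminal_confirm"
        elif time.monotonic() - shown_at > THIRD_DURATION_S:    # no response in time
            reason = "display_timeout"
        elif helmet_events.first_rescue_confirmation_received():  # helmet first key pressed
            reason = "helmet_first_key"
        else:
            time.sleep(poll_interval_s)
            continue
        server.send_second_rescue_confirmation(reason=reason)
        return reason
```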
In one possible implementation manner, the motion data corresponding to the accident sent to the server is sent when the helmet recognizes that the accident occurs in the delivery capacity through the motion data perceived by the data acquisition module, the acquired data is used for carrying out accident judgment and verification together with the motion data corresponding to the accident in the server, so as to obtain an accident verification result, and the accident verification result is used for triggering rescue operation in the rescue service.
Any combination of the above optional solutions may be adopted to form an optional embodiment of the present application, which is not described herein.
Fig. 6 is a flowchart of an accident handling method according to an embodiment of the present application, where an execution body is a server, and the server establishes a communication connection with a helmet. Referring to fig. 6, the method includes:
601. The server receives an accident notification sent by the helmet and used for triggering a rescue service, the accident notification is sent under the condition that the helmet recognizes an accident, and receives movement data corresponding to the accident, wherein the movement data corresponding to the accident is movement data of the helmet when the accident happens.
602. The method comprises the steps that a server acquires data which are acquired by a terminal and are used for assisting rescue services and are in communication connection with a helmet, the data acquired by the terminal comprise at least one of environment data and state data of delivery capacity, the acquired data are acquired after the terminal receives a data acquisition instruction sent by the helmet, and the data acquisition instruction is sent under the condition that the helmet recognizes an accident.
603. The server executes the rescue service, and the rescue operation in the rescue service is executed after accident judgment and verification is carried out based on the motion data corresponding to the accident and the data acquired by the terminal.
According to the accident handling method provided by the embodiment of the application, the accident can be identified through the helmet, and under the condition of identifying the accident, the rescue service is triggered, so that the distribution capacity can be timely rescued, the functions of the helmet are enriched, and the helmet can also send a data acquisition instruction to the terminal, so that the terminal can acquire other data related to the accident, the rescue service is assisted, and the accuracy of the rescue service is improved.
In one possible implementation, the method further includes:
sending a rescue confirmation notice to the terminal, wherein the rescue confirmation notice is used for displaying a rescue confirmation interface in the terminal;
And receiving a second rescue confirmation notice sent by the terminal, triggering a rescue operation in the rescue service based on the second rescue confirmation notice, wherein the second rescue confirmation notice is sent when the terminal detects the confirmation operation of the rescue interface, the display duration of the rescue interface exceeds a third duration or the first rescue confirmation notice sent by the helmet is received, the helmet is provided with a first key, and the first rescue confirmation notice is sent to the terminal by the helmet in response to the confirmation operation of the first key.
In one possible implementation, triggering a rescue operation in a rescue service includes at least one of:
Sending a rescue notice to emergency contacts of the delivery capacity;
And sending a rescue notice to the emergency rescue mechanism.
Any combination of the above optional solutions may be adopted to form an optional embodiment of the present application, which is not described herein.
Fig. 7 is a flowchart of an accident handling method provided by an embodiment of the present application, where an execution subject is an accident handling system, and the accident handling system includes a helmet, a terminal, and a server, where the helmet establishes communication connection with the terminal and the server, respectively, and the terminal establishes communication connection with the server. Referring to fig. 7, the method includes:
701. In the case that the helmet is worn, the helmet collects movement data of the helmet, and accident recognition is performed based on the movement data.
The movement data of the helmet is data for representing a movement state of the helmet, for example, a speed of the helmet, a posture of the helmet, an acceleration of the helmet, a movement direction of the helmet, etc., and the movement data is not limited in the embodiment of the present application. When the user uses the riding tool to go out, the helmet is usually worn for safety, so in the embodiment of the application, the user can be considered to be going out by adopting the riding tool under the condition that the helmet is worn. Because the helmet is worn on the head of the user, the motion data of the helmet can represent the motion state of the user to a certain extent, so that the motion data of the helmet can be identified by collecting the motion data of the helmet, and whether the user has an accident or not can be determined.
In one possible implementation, a helmet includes a data acquisition module and a processor, the data acquisition module being electrically connected to the processor. The helmet collects motion data through the data collection module, and accident identification is carried out through the processor. In some embodiments, the helmet collects movement data with the helmet worn and accident identification is performed based on the movement data, including the data collection module collecting movement data with the helmet worn and sending the movement data to a processor that receives the movement data and performs accident identification based on the movement data.
The data acquisition module is a module with a data acquisition function. In some embodiments, the data acquisition module may include a sensor by which the helmet acquires data. Optionally, the data acquisition module comprises an acceleration sensor and the motion data comprises an acceleration of the helmet. When the user falls or collides, the acceleration of the helmet varies significantly and regularly, and thus, whether the user falls or collides, that is, whether the user has an accident, can be determined based on the acceleration of the helmet.
It has been found through experimentation that at least one of weight loss, impact, roll, and change in posture may occur when a user falls or collides. Thus, at least one of weight loss, impact, roll, and change in posture of the helmet may be identified. In some embodiments, the processor performs accident identification based on the motion data, including the processor identifying at least one of a weight loss, an impact, a roll, and a change in posture of the helmet based on the acceleration of the helmet acquired by the acceleration sensor, and the processor determining that an accident has occurred when at least one of the weight loss, the impact, the roll, and the change in posture of the helmet is identified.
Optionally, the processor identifies at least one of weight loss, impact, roll and attitude change of the helmet based on the acceleration of the helmet acquired by the acceleration sensor, including any one of:
(1) The processor determines a resultant acceleration of the helmet and determines that weight loss has occurred if the resultant acceleration of the helmet is less than a first threshold.
In some embodiments, the acceleration sensor is used to obtain the acceleration of the helmet on three axes, namely the horizontal axis, the longitudinal axis, and the vertical axis. The resultant acceleration of the helmet is the vector sum of the accelerations of the helmet on the three axes.
For example, if the acceleration of the helmet on the horizontal axis is denoted as Ax, the acceleration on the longitudinal axis as Ay, and the acceleration on the vertical axis as Az, then the resultant acceleration of the helmet is A = √(Ax² + Ay² + Az²).
During free fall, the resultant acceleration of an object decreases to approximately 0 g, where g is the gravitational acceleration. When the delivery capacity has an accident, the delivery capacity generally falls, and weightlessness occurs during the fall. This weightlessness is similar to free fall but less pronounced: the resultant acceleration decreases, though not all the way to 0 g. Experiments show that in this case the resultant acceleration rapidly drops below 1 g, and the condition generally lasts more than 0.5 seconds.
Because the resultant acceleration decreases rapidly and remains small for a certain period of time during weightlessness, the embodiments of the present application set a first threshold and determine that weightlessness has occurred when the resultant acceleration of the helmet is smaller than the first threshold. The first threshold may be any value less than 1 g, which is not limited in the embodiments of the present application. Optionally, the first threshold is an empirical value.
Of course, the embodiment of the application only uses whether the combined acceleration is smaller than the first threshold value as the judgment standard of the weightlessness for illustrative explanation, and the embodiment of the application does not limit the judgment standard of the weightlessness. In another embodiment, the decreasing duration may also reach the target duration as the criterion of weight loss according to whether the combined acceleration is continuously decreasing. For example, the processor determines a resultant acceleration of the helmet and determines that a weight loss has occurred if the resultant acceleration of the helmet continues to decrease and the decreasing duration reaches a target duration. The target duration may be any duration, for example, 0.5 seconds, etc., which is not limited in the embodiment of the present application. Optionally, the target duration is an empirical value.
(2) The processor determines that an impact has occurred upon identifying that the acceleration of the helmet is greater than a second threshold.
When a human body collides with the ground or other objects, a large impact is generated in the acceleration curve of the helmet. That is, the acceleration of the helmet suddenly becomes large. Therefore, by setting the second threshold, the embodiment of the application determines that the impact occurs when the acceleration of the helmet is greater than the second threshold. The second threshold may be any value, which is not limited in the embodiment of the present application. Optionally, the second threshold is an empirical value.
In some embodiments, the acceleration sensor is used to obtain the acceleration of the helmet on three axes, namely the horizontal axis, the longitudinal axis, and the vertical axis. When the helmet is impacted from a given direction, the acceleration on the corresponding axis varies the most. Optionally, the processor determines that an impact has occurred upon identifying that the acceleration of the helmet on any one of the axes is greater than the second threshold.
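The weightlessness and impact checks described above can be sketched as follows. The first and second threshold values are placeholders chosen for illustration, since the patent only describes them as empirical values.

```python
# Sketch of the weightlessness and impact checks described above.
# Threshold values are illustrative placeholders, not values from the patent.
import math

G = 9.81                    # gravitational acceleration, m/s^2
FIRST_THRESHOLD = 0.5 * G   # weightlessness: resultant acceleration below this
SECOND_THRESHOLD = 4.0 * G  # impact: any single-axis acceleration above this

def resultant_acceleration(ax: float, ay: float, az: float) -> float:
    """Vector sum of the three-axis accelerations, A = sqrt(Ax^2 + Ay^2 + Az^2)."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def weightlessness_detected(ax: float, ay: float, az: float) -> bool:
    return resultant_acceleration(ax, ay, az) < FIRST_THRESHOLD

def impact_detected(ax: float, ay: float, az: float) -> bool:
    return any(abs(a) > SECOND_THRESHOLD for a in (ax, ay, az))
```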
(3) And if the change amplitude of the Euler angle of the helmet exceeds a third threshold value and the change duration does not exceed a second duration, determining that the helmet is overturned.
Wherein, the Euler angle includes pitch angle, roll angle and yaw angle. The processor can determine the pitch angle and the roll angle of the helmet when determining the Euler angle of the helmet based on the acceleration values of the helmet in the three-axis directions.
When the delivery capacity falls down or impacts, the human body may roll over, and the euler angle may change drastically for a short time during the rolling process. Thus, when the magnitude of the change in the euler angle of the helmet exceeds the third threshold and the period of time of the change does not exceed the second period of time, it may be determined that the helmet has flipped.
The third threshold may be any value, which is not limited in the embodiment of the present application. Optionally, the third threshold is an empirical value. The second duration is any duration, and the embodiment of the application does not limit the second duration. Optionally, the second time period is an empirical value.
It should be noted that the euler angle includes a pitch angle and a roll angle, and therefore, the change amplitude of the euler angle of the helmet exceeds the third threshold value and the change duration does not exceed the second duration means that the change amplitude of the pitch angle and/or the roll angle of the helmet exceeds the third threshold value and the change duration does not exceed the second duration.
(4) If the change amplitude of the Euler angle of the helmet exceeds the third threshold, the change duration exceeds the second duration, and the helmet is in a stable state before and after the change, it is determined that the posture of the delivery capacity has changed.
After a fall or impact in delivery capacity, the posture of the person often changes significantly, for example, from sitting to lying, etc. The change of the human body posture and the rolling can cause larger change of the Euler angle, but under the rolling condition, the duration of the change of the Euler angle is shorter, and the change is more severe. When the posture of the human body is changed, the change duration of the Euler angle is slightly longer, the change is not so severe, and the posture is in a stable state before and after the change. Therefore, when the change amplitude of the Euler angle of the helmet exceeds the third threshold, the change duration exceeds the second duration and the helmet is in a stable state before and after the change, the posture of the helmet is determined to be changed.
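The distinction between a roll and a posture change described above can be sketched as a simple classification of one Euler-angle excursion. The threshold and duration values below are illustrative placeholders, as the patent treats them as empirical values.

```python
# Sketch of the roll vs. posture-change distinction described above.
# THIRD_THRESHOLD_DEG and SECOND_DURATION_S are illustrative placeholders.
THIRD_THRESHOLD_DEG = 60.0   # minimum Euler-angle change amplitude
SECOND_DURATION_S = 1.0      # dividing line between a roll and a posture change

def classify_euler_change(change_amplitude_deg: float,
                          change_duration_s: float,
                          stable_before_and_after: bool) -> str:
    """Return 'roll', 'posture_change', or 'none' for one Euler-angle excursion."""
    if change_amplitude_deg <= THIRD_THRESHOLD_DEG:
        return "none"
    if change_duration_s <= SECOND_DURATION_S:
        # Short, violent Euler-angle change: the helmet has rolled over.
        return "roll"
    if stable_before_and_after:
        # Longer, gentler change settling into a new stable attitude:
        # the wearer's posture has changed (e.g., from sitting to lying).
        return "posture_change"
    return "none"
```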
In other embodiments, it has been found through experimentation that when a user falls or collides, the helmet successively recognizes weight loss, impact, and roll, or successively recognizes weight loss, impact, and posture change, within a short period of time. When other situations occur to the user, the helmet may recognize individual events such as weight loss, impact, roll, or posture change, but does not successively recognize weight loss, impact, and roll, or weight loss, impact, and posture change, within a short period of time. Therefore, by recognizing weight loss, impact, roll, and posture change, the helmet can accurately identify accidents.
In some embodiments, the processor performs accident identification based on the motion data, including the processor performing weight loss, impact, roll and attitude change identification based on acceleration of the helmet, and determining that an accident has occurred if the processor identifies weight loss, impact and roll for a first period of time, or if the processor identifies weight loss, impact and attitude change for a first period of time.
The first duration may be any duration, such as 2 seconds, 3 seconds, and the like, which is not limited in the embodiment of the present application. Optionally, the first duration is an empirical value.
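A minimal sketch of this combined decision rule follows, using a sliding window whose length stands in for the first duration. The event names and window length are illustrative assumptions.

```python
# Sketch of the combined rule described above: an accident is declared when
# weight loss, impact, and roll (or weight loss, impact, and posture change)
# are all recognized within the first duration. Names and values are assumed.
from collections import deque

FIRST_DURATION_S = 2.0

class AccidentDetector:
    def __init__(self, window_s: float = FIRST_DURATION_S):
        self.window_s = window_s
        self.events = deque()  # (timestamp, event_name)

    def add_event(self, timestamp: float, event: str) -> bool:
        """Record a recognized event and report whether an accident is identified."""
        self.events.append((timestamp, event))
        # Drop events that fall outside the sliding first-duration window.
        while self.events and timestamp - self.events[0][0] > self.window_s:
            self.events.popleft()
        recent = {name for _, name in self.events}
        return ({"weight_loss", "impact", "roll"} <= recent or
                {"weight_loss", "impact", "posture_change"} <= recent)
```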
It should be noted that, in the embodiment of the present application, the data acquisition module is only illustrated by taking the example that the data acquisition module includes the acceleration sensor, and the data acquisition module is not limited, and the data acquisition module may include other hardware capable of acquiring motion data, which is not limited in the embodiment of the present application.
It should be noted that step 701 is performed with the helmet in a worn condition, and in one possible implementation, whether the helmet is worn or not is determined by the helmet itself. In some embodiments, the helmet includes a processor and a headgear detection module electrically connected to the processor. Optionally, the headgear-wearing detection module includes an infrared sensor located on top of the inside of the helmet. The method further includes the steps that the infrared sensor emits infrared rays, if the infrared sensor receives the reflected infrared rays within a third duration, the infrared sensor sends a first signal to the processor, the first signal is used for indicating that the helmet is worn, and if the infrared sensor does not receive the reflected infrared rays, the infrared sensor sends a second signal to the processor, and the second signal is used for indicating that the helmet is not worn. For example, the first signal is 1 and the second signal is 0. It should be noted that, in the embodiment of the present application, whether the helmet is worn is determined by the infrared sensor only, and in another embodiment, the helmet is provided with other sensors, and whether the helmet is worn may also be determined according to the other sensors. For example, the helmet is further provided with a distance sensor, from which the helmet can also determine whether the helmet is worn.
In another possible implementation, whether the helmet is in a worn condition is set by the user. In some embodiments, the helmet is provided with a switch, which can be turned on after the helmet is worn by the user, in which case the helmet performs step 701 described above. After the user removes the helmet, the switch may be turned off, after which the helmet stops performing step 701 as described above.
It should be noted that, the helmet provided in the embodiment of the present application is a helmet suitable for being worn by a delivery capacity, and therefore, the embodiment of the present application is exemplified by taking a wearer of the helmet as a delivery capacity.
702. And when the helmet recognizes the accident, sending an accident notification to the server and sending a data acquisition instruction to the terminal.
The accident notification is used for indicating that the distribution capacity has an accident, and the accident notification is used for triggering a rescue service, so that after the server receives the accident notification, the rescue service can be triggered, and the distribution capacity can be timely rescued. In some embodiments, the accident notification may be any event that can indicate that an accident has occurred in the delivery capacity, such as a collision event or a suspected collision event reported by the helmet.
In some embodiments, the server triggers the rescue service and directly performs a rescue operation. In other embodiments, in order to rescue the delivery capacity more accurately, the server first judges and verifies the accident and performs the rescue operation only after the verification passes. To enable the server to judge and verify the accident, the helmet may also send the motion data corresponding to the accident to the server, so that the server can judge and verify the accident based on that motion data. The motion data corresponding to the accident is the motion data of the helmet around the time the accident occurred, and may include the motion data acquired by the data acquisition module from N seconds before the accident was identified to M seconds after the accident was identified. N and M may be any positive numbers, which is not limited in the embodiments of the present application.
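For illustration only, the following sketch shows one way the motion data from N seconds before to M seconds after the identified accident could be assembled on the helmet side; the sample rate and the values of N and M are assumptions.

```python
from collections import deque

SAMPLE_HZ = 50
N_BEFORE_S = 5
M_AFTER_S = 5


class AccidentClipBuffer:
    def __init__(self):
        self.pre = deque(maxlen=N_BEFORE_S * SAMPLE_HZ)  # rolling pre-accident samples
        self.post = []
        self.post_remaining = 0

    def mark_accident(self):
        """Called when the processor identifies an accident; start the post window."""
        self.post = []
        self.post_remaining = M_AFTER_S * SAMPLE_HZ

    def feed(self, sample):
        """Feed one motion sample; return the full clip once the post window is complete."""
        if self.post_remaining > 0:
            self.post.append(sample)
            self.post_remaining -= 1
            if self.post_remaining == 0:
                return list(self.pre) + self.post  # ready to send to the server
        else:
            self.pre.append(sample)
        return None
```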
It should be noted that the embodiments of the present application only take the helmet sending the accident notification and the motion data corresponding to the accident to the server as an example; of course, the helmet may also send other information to the server. For example, the helmet may also send a timestamp indicating the time at which the helmet identified the accident. In addition, the accident notification, the motion data corresponding to the accident and any other information sent by the helmet to the server may be sent together or separately, which is not limited in the embodiments of the present application.
In some embodiments, the helmet may also send the accident identification result to the server. For example, if the helmet identifies weightlessness, impact, weightlessness, rolling, impact and rolling during accident identification, the helmet may send accident identification result 1, "weightlessness, impact, weightlessness, rolling, impact and rolling", to the server, or send accident identification result 2, "weightlessness, impact and rolling", to the server, or send both result 1 and result 2, where accident identification result 2 is the abbreviated form of accident identification result 1.
Another point to be noted is that if the helmet identifies two accidents within a short time, the helmet treats them as the same accident by default and reports only one accident notification. For example, when the interval between the two identifications does not exceed 10 seconds, the two identifications are treated as the same accident by default.
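A minimal sketch of this deduplication rule is given below; the class name is an assumption chosen for illustration.

```python
DEDUP_WINDOW_S = 10.0


class AccidentNotifier:
    def __init__(self):
        self.last_notified_at = None

    def should_notify(self, now_s: float) -> bool:
        """Return True if a new accident notification should be sent at time now_s."""
        if self.last_notified_at is not None and now_s - self.last_notified_at <= DEDUP_WINDOW_S:
            return False  # treated as the same accident; do not report again
        self.last_notified_at = now_s
        return True
```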
In the embodiments of the present application, the data acquisition instruction instructs the terminal that is in communication connection with the helmet to acquire at least one of environment data and state data of the delivery capacity and to send the acquired data to the server, where the acquired data is used to assist the rescue service. For example, the acquired data may be used to indicate which rescue operation the server performs, or whether the server performs a rescue operation at all.
In some embodiments, the helmet transmits the accident notification to the server through NB (Narrowband) wireless communication technology. Of course, the helmet may also transmit the accident notification to the server through other wireless communication technologies; the data transmission manner between the helmet and the server is not limited in the embodiments of the present application.
In some embodiments, the helmet transmits the data acquisition instruction to the terminal via a short-range communication technology. The short-range communication technology may be a Bluetooth communication technology (for example, BLE, Bluetooth Low Energy), WiFi (Wireless Fidelity), ZigBee (a low-rate short-range wireless network protocol) or the like; the short-range communication technology is not limited in the embodiments of the present application.
In some embodiments, the helmet includes a processor, and the helmet sends the accident notification to the server and the data acquisition instruction to the terminal through the processor. Of course, the helmet may further include other modules and send the accident notification to the server and the data acquisition instruction to the terminal through those modules, which is not limited in the embodiments of the present application.
703. The terminal receives the data acquisition instruction sent by the helmet, acquires at least one of environment data and state data of the delivery capacity in response to the data acquisition instruction, and sends the acquired data to the server.
The environment data describes the environment around the delivery capacity; according to the environment around the delivery capacity, whether an accident has occurred can be determined more accurately. In one possible implementation, the environment data includes at least one of image data shot by a front camera of the terminal, video data recorded by the front camera of the terminal, image data shot by a rear camera of the terminal, video data recorded by the rear camera of the terminal, and audio data of the environment in which the terminal is located. The terminal is the terminal held by the helmet wearer; since the embodiments of the present application take the helmet wearer being the delivery capacity as an example, the terminal is the terminal held by the delivery capacity.
In some embodiments, the terminal acquiring the environment data in response to the data acquisition instruction includes at least one of: in response to the data acquisition instruction, controlling the front camera and/or the rear camera of the terminal of the delivery capacity to capture image data and/or record video data of the environment in which the terminal is located; and, in response to the data acquisition instruction, controlling a recording device of the terminal to record audio data of the environment in which the terminal is located.
In some embodiments, the terminal may take pictures with the front camera and the rear camera, determine the non-occluded camera based on the images taken by the two cameras, and preferentially record the video data with the non-occluded camera. Optionally, the terminal acquiring the environment data in response to the data acquisition instruction includes: in response to the data acquisition instruction, invoking the front camera and the rear camera to shoot respectively, obtaining image data shot by the front camera and image data shot by the rear camera; determining the non-occluded camera based on the two sets of image data; and invoking the non-occluded camera to record video data.
The embodiments of the present application only take preferentially recording video data with the non-occluded camera as an example. In another embodiment, if both the front camera and the rear camera are occluded, video data may first be recorded with the front camera and then with the rear camera, or the front camera and the rear camera may be invoked to shoot image data at preset intervals until a non-occluded camera is determined, and the non-occluded camera is then invoked to record video data. In another embodiment, if neither the front camera nor the rear camera is occluded, both cameras may be invoked to record video data at the same time; if they cannot be invoked at the same time, the front camera may record first and then the rear camera, or the rear camera may record first and then the front camera.
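The embodiments do not specify how occlusion is detected, so the following sketch uses an assumed brightness/variance heuristic over one still image from each camera; it is for illustration only.

```python
import numpy as np

MIN_MEAN_BRIGHTNESS = 20.0  # assumed: a very dark frame suggests a covered lens
MIN_VARIANCE = 25.0         # assumed: a nearly uniform frame suggests a covered lens


def looks_occluded(gray: np.ndarray) -> bool:
    return float(gray.mean()) < MIN_MEAN_BRIGHTNESS or float(gray.var()) < MIN_VARIANCE


def pick_camera(front_gray: np.ndarray, rear_gray: np.ndarray) -> str:
    front_blocked = looks_occluded(front_gray)
    rear_blocked = looks_occluded(rear_gray)
    if not front_blocked and not rear_blocked:
        return "both"   # record with both cameras if the platform supports it
    if not front_blocked:
        return "front"
    if not rear_blocked:
        return "rear"
    return "retry"      # both occluded: shoot again after a preset interval
```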
When recording video data and audio, the front camera, the rear camera and the recording device record for a certain duration, which may be preset and may be adjustable. For example, the front camera and the rear camera record video data for 10 seconds by default, and the recording duration of the cameras is configurable within a range of 1 to 30 seconds. The recording device records 10 seconds of audio data by default, and its recording duration is configurable within a range of 1 to 60 seconds.
The state data describes the state of the delivery capacity; according to the state of the delivery capacity, the current situation of the delivery capacity can be determined more accurately, so as to determine whether an accident has occurred and, if so, how the delivery capacity should be rescued. In one possible implementation, the state data includes positioning data and/or order status data from a first time period before the accident occurred to a second time period after the accident occurred. The order status data indicates whether the delivery capacity has an order to be delivered. For example, the positioning data covers 3 minutes before to 3 minutes after the accident, and the duration of the positioning data is configurable within a range of 0 to 15 minutes.
In some embodiments, the terminal acquiring the state data of the delivery capacity in response to the data acquisition instruction includes: in response to the data acquisition instruction, acquiring positioning data and/or order status data from the first time period before the accident occurred to the second time period after the accident occurred.
In some embodiments, the delivery of orders may be affected after the delivery capacity has an accident, and acquiring the order status data makes it possible to subsequently arrange the delivery of undelivered orders. For example, the order status data may indicate that the delivery capacity is not working, is working with no pending order, or is working with a pending order. The embodiments of the present application do not limit the order status data.
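For illustration only, one possible shape of the state data uploaded by the terminal is sketched below; the field names and example values are assumptions, not the structures used by the embodiments.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class StateData:
    # Positioning samples from the first time period before the accident to the
    # second time period after it, as (unix_time_s, latitude, longitude) tuples.
    positions: List[Tuple[float, float, float]] = field(default_factory=list)
    # Order status, e.g. "not_working", "working_no_order" or "working_with_order".
    order_status: str = "not_working"
```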
It should be noted that step 703 is performed only after the user has granted the terminal's data collection permission. In some embodiments, the terminal is provided with an application program related to the helmet. The application program can display a setting interface of the helmet; after entering the setting interface, the application program can be bound to the user's helmet, so that the helmet can send data acquisition instructions to the terminal through the application program. After the application program is bound to the user's helmet, a permission application interface for data collection can be displayed. The permission application interface can display an explanation of why data collection is necessary, for example: "If audio, image and video data cannot be collected, the system cannot effectively judge whether an accident has occurred in the delivery capacity, and thus cannot trigger the rescue service." After obtaining the user's authorization based on the permission application interface, the terminal may perform step 703 described above.
In some embodiments, the user may also bind the helmet to the delivery capacity in the application program, so that when an accident is identified based on the helmet, it is known who needs to be rescued. Optionally, the terminal establishes a communication connection with the helmet and determines a binding relationship between the helmet and the personal identity information of the delivery capacity. The personal identity information of the delivery capacity may be entered by the user through the terminal, or may be the personal identity information of the account currently logged in on the terminal, which is not limited in the embodiments of the present application.
For example, the terminal is provided with an application program related to the helmet; a setting interface of the helmet can be displayed through the application program, the personal identity information of the delivery capacity can be entered after entering the setting interface, and the helmet is then bound to the personal identity information of the delivery capacity.
Another point to be noted is that, in some embodiments, the server includes a helmet server, a security center and a visual intelligence center. The helmet sends the motion data of the accident to the helmet server; the helmet server records the motion data and forwards it to the security center; the terminal uploads the collected data other than the image data and the video data to the security center, and uploads the image data and the video data to the visual intelligence center.
704. The server receives the accident notification, which is sent by the helmet and used to trigger the rescue service, acquires the data collected by the terminal in communication connection with the helmet and used to assist the rescue service, and executes the rescue service.
Since the accident notification is used to trigger the rescue service, the server can execute the rescue service after receiving the accident notification. In the embodiments of the present application, the server may execute the rescue service either by directly performing a rescue operation, or by first judging and verifying the accident and performing the rescue operation only after determining that an accident has actually occurred. For example, after receiving the accident notification (which may represent a suspected accident), the server judges and verifies the accident based on the motion data corresponding to the accident and the data collected by the terminal, and performs the rescue operation only after a serious accident is confirmed.
In a possible implementation, when the server judges and verifies the accident based on the motion data corresponding to the accident and the data collected by the terminal, a preset algorithm or model may be used for the judgment. The classification result of the algorithm or model includes "no accident" and "accident", or includes "no accident", "suspected accident" and "accident", or includes "no accident" and "suspected accident". The embodiments of the present application do not limit the classification result.
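As a hedged illustration of this step, the sketch below is a simple rule-based stand-in for such a preset algorithm or model; the input features and thresholds are assumptions, and an actual implementation could equally be a trained model.

```python
from enum import Enum


class Verdict(Enum):
    NO_ACCIDENT = "no accident"
    SUSPECTED = "suspected accident"
    ACCIDENT = "accident"


def verify_accident(peak_acceleration_g: float,
                    stationary_after_s: float,
                    gps_speed_drop_kmh: float) -> Verdict:
    """Combine helmet motion features with terminal-derived features into a coarse verdict."""
    strong = peak_acceleration_g > 8.0 and gps_speed_drop_kmh > 15.0
    weak = peak_acceleration_g > 4.0 or gps_speed_drop_kmh > 10.0
    if strong and stationary_after_s > 30.0:
        return Verdict.ACCIDENT
    if weak:
        return Verdict.SUSPECTED
    return Verdict.NO_ACCIDENT
```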
It should be noted that when the result is "no accident", the flow may end without further processing. When the result is "accident", a rescue operation may be performed; the rescue operation may refer to step 7044 described below and is not detailed here. When the result is "suspected accident", the delivery capacity may be asked to confirm whether rescue is needed, and whether an accident has occurred is determined based on the feedback of the delivery capacity. Accordingly, executing the rescue service in step 704 described above includes the following steps 7041 to 7044:
7041. The server sends a rescue confirmation notification to the terminal, the rescue confirmation notification being used to confirm whether the delivery capacity needs rescue.
In some embodiments, the rescue confirmation notification is sent when the verification result is not "no accident". In some embodiments, the rescue confirmation notification is sent when the server, based on the motion data corresponding to the accident and the data collected by the terminal, verifies that an accident is suspected. For example, as shown in fig. 8, after receiving the motion data corresponding to the accident and the data collected by the terminal, the server determines whether an accident is suspected based on that data; if no accident has occurred, the flow ends; if an accident is suspected, the server sends a rescue confirmation notification to the terminal and triggers the terminal to display a rescue confirmation interface.
Another point to be noted is that the embodiments of the present application only take the following as an example: both the helmet and the terminal collect data, all of the collected data is sent to the server, and the server judges and verifies the accident by combining the data collected by the helmet and the terminal. In another embodiment, the terminal does not need to collect data; after the helmet collects the motion data, a simple identification is performed locally, and after an accident is identified the motion data is sent to the server for fine identification.
7042. The terminal receives the rescue confirmation notification sent by the server and displays a rescue confirmation interface in response to the rescue confirmation notification.
In some embodiments, the terminal may pop up the rescue confirmation interface, so that even if the terminal is currently displaying an interface other than the application program related to the helmet, the rescue confirmation interface can still be displayed and the delivery capacity can perform rescue confirmation based on it.
In some embodiments, the rescue confirmation interface may include a confirmation option used by the delivery capacity to confirm that rescue is needed. The embodiments of the present application do not limit the rescue confirmation interface.
7043. The terminal sends a second rescue confirmation notification to the server in response to a confirmation operation on the rescue confirmation interface, in response to the display duration of the rescue confirmation interface exceeding a third duration, or in response to receiving a first rescue confirmation notification sent by the helmet.
The confirmation operation is an operation performed by the delivery capacity to confirm that rescue is needed; the confirmation operation may be any trigger operation, which is not limited in the embodiments of the present application. After the delivery capacity performs the confirmation operation on the rescue confirmation interface, the terminal sends the second rescue confirmation notification to the server, the second rescue confirmation notification indicating that the delivery capacity needs rescue. Therefore, after receiving the second rescue confirmation notification, the server may perform a rescue operation so that the delivery capacity is rescued in time.
However, if an accident occurs, the terminal may not be within reach of the delivery capacity, or the delivery capacity may be unconscious and unable to perform the confirmation operation on the rescue confirmation interface. Therefore, the embodiments of the present application also provide a method in which the terminal sends the second rescue confirmation notification to the server in response to the display duration of the rescue confirmation interface exceeding the third duration. Accordingly, if the delivery capacity exits the rescue confirmation interface within the third duration, it is determined that the user does not need rescue and no rescue operation needs to be performed.
In one possible implementation, the helmet is provided with a first key, and the helmet sends the first rescue confirmation notification to the terminal in response to a confirmation operation on the first key. Therefore, when the terminal is not at the delivery capacity's side, the delivery capacity can confirm the need for rescue via the first key and be rescued more promptly, without waiting for the third duration to elapse for automatic confirmation.
The first rescue confirmation notification and the second rescue confirmation notification may be the same or different, which is not limited in the embodiment of the present application.
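For illustration only, the following sketch combines the three triggers of step 7043 into a single decision; the value of the third duration is an assumption.

```python
THIRD_DURATION_S = 60.0  # assumed value of the third duration


def should_send_second_confirmation(confirmed_on_interface: bool,
                                    helmet_key_confirmation_received: bool,
                                    interface_dismissed: bool,
                                    interface_shown_s: float) -> bool:
    if confirmed_on_interface or helmet_key_confirmation_received:
        return True
    if interface_dismissed:
        return False  # exiting within the third duration means no rescue is needed
    return interface_shown_s > THIRD_DURATION_S
```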
7044. The server receives the second rescue confirmation notification sent by the terminal and performs a rescue operation in the rescue service based on the second rescue confirmation notification.
In one possible implementation, performing a rescue operation in a rescue service includes at least one of:
(1) Sending a rescue notification to an emergency contact of the delivery capacity.
In some embodiments, a user may configure an emergency contact on the terminal, the terminal sends the emergency contact configured by the user to the server, and the server may subsequently send a rescue notification to the emergency contact when the delivery capacity needs rescue. The rescue notification may be sent by calling the emergency contact, sending a short message, or the like; the embodiments of the present application do not limit the notification form of the rescue notification.
(2) Sending a rescue notification to an emergency rescue institution.
The emergency rescue institution may be any organization; for example, it may be an institution handling traffic accidents, an institution treating injuries, or a rescue institution of the company to which the delivery capacity belongs. The embodiments of the present application do not limit the emergency rescue institution.
The rescue notification may be sent to the emergency rescue institution by calling the institution, sending a short message, sending a message in a group of the institution, or the like; the embodiments of the present application do not limit the notification form of the rescue notification.
In one possible implementation, relevant personnel of the emergency rescue institution may read the relevant data from the server in order to further confirm whether rescue is needed or to further confirm the manner of rescue. The method further includes: in response to a data reading operation for the accident, displaying the motion data corresponding to the accident and the collected data.
It should be noted that the accident handling method provided by the embodiments of the present application is described only by taking the example that the delivery capacity is rescued after an accident occurs. In another embodiment, a rescue function setting interface may be provided: after the delivery capacity enables the rescue function through this interface, the accident handling method provided by the embodiments of the present application is executed; if the delivery capacity does not enable the rescue function, the accident handling method is not executed. In some embodiments, the terminal displays the function setting interface and, in response to a function enabling operation in the interface, sends a function enabling notification carrying a user identifier to the server; the server sets the state corresponding to the user identifier to a rescue-enabled state, and after subsequently receiving an accident notification sent by the helmet, executes the rescue service. If the server does not receive a function enabling notification carrying the user identifier, the state corresponding to the user identifier on the server is set to a rescue-disabled state, and the server does not trigger the rescue service after receiving an accident notification sent by the helmet.
In addition, in the embodiments of the present application, the delivery capacity may set an emergency contact on the terminal, and the terminal sends the emergency contact set by the delivery capacity to the server, so that the server can send a rescue notification to the emergency contact when the delivery capacity needs rescue.
According to the accident handling method provided by the embodiments of the present application, an accident can be identified through the helmet, and the rescue service is triggered when an accident is identified, so that the delivery capacity can be rescued in time and the functions of the helmet are enriched. The helmet can also send a data acquisition instruction to the terminal, so that the terminal collects other data related to the accident to assist the rescue service, which improves the accuracy of the rescue service.
In addition, in the accident handling method provided by the embodiments of the present application, the server performs the rescue operation only after obtaining confirmation from the delivery capacity, which further ensures the accuracy of the rescue. The method also allows rescuers to view the motion data collected by the helmet and the data collected by the terminal, so that they can provide more accurate rescue for the delivery capacity, improving the accident handling effect.
The embodiments shown in fig. 7 and fig. 8 describe the accident handling method by taking an accident handling system including a helmet, a terminal and a server as an example. In yet another embodiment, the accident handling system may include only a helmet and a server, without a terminal. The following embodiment describes the accident handling method of an accident handling system that includes only a helmet and a server:
In one possible implementation, the accident handling system includes a helmet and a server, where the helmet has established a communication connection with the server and includes a data acquisition module and a processor. The accident handling method includes: while the helmet is worn, the data acquisition module of the helmet acquires the motion data of the helmet and sends the motion data to the processor; the processor receives the motion data and performs accident identification based on it; and, when an accident is identified, the helmet sends an accident notification for triggering the rescue service to the server. Thus, after receiving the accident notification, the server can execute the rescue service.
Optionally, the helmet is a helmet suitable for being worn by a delivery capacity. Therefore, if an accident occurs during delivery, the helmet can identify it in time and trigger the rescue service, so that the delivery capacity is rescued promptly and its safety is ensured.
Optionally, the accident handling method further includes: the helmet sends the motion data corresponding to the accident to the server, where the motion data corresponding to the accident is used to assist the rescue service on the server. It should be noted that the embodiments shown in fig. 7 and fig. 8 provide a method of assisting the rescue service based on the motion data corresponding to the accident together with the data collected by the terminal; the assisting method in this embodiment is the same, except that the data collected by the terminal is no longer processed. Reference may therefore be made to the embodiments shown in fig. 7 and fig. 8, and details are not repeated here.
It should be noted that when an accident occurs, the delivery capacity is not necessarily injured, or not necessarily seriously injured, and therefore rescue is not necessarily required after every accident. In some embodiments, in order not to waste rescue resources, a first key is provided on the helmet, and the first key may be operated when the delivery capacity needs rescue. Optionally, the accident handling method further includes: in response to the delivery capacity operating the first key of the helmet, the helmet sends a rescue confirmation notification to the server, the rescue confirmation notification indicating that the delivery capacity needs rescue.
In some embodiments, whenever the delivery capacity operates the first key of the helmet, the helmet sends the rescue confirmation notification to the server. In some embodiments, after the helmet sends the accident notification to the server, the server returns a rescue confirmation notification to the helmet, and after receiving it the helmet sends the rescue confirmation notification to the server in response to the delivery capacity operating the first key. Optionally, in order to enable the delivery capacity to operate the first key in time, the helmet may voice-broadcast the rescue confirmation notification after receiving it.
The embodiment of the application also provides an intelligent helmet, and fig. 9 is a schematic structural diagram of the helmet provided by the embodiment of the application. The helmet 900 is respectively in communication connection with the terminal and the server, the helmet 900 comprises a data acquisition module 901 and a processor 902, and the data acquisition module is electrically connected with the processor;
in some embodiments, the data acquisition module 901 is configured to acquire movement data of the helmet when the helmet is worn;
a processor 902 for accident recognition based on the motion data;
Helmet 900 for sending an accident notification to the server for triggering a rescue service in the event of an identified accident.
In one possible implementation, the helmet 900 is further configured to send accident-related movement data to a server, where the accident-related movement data is used to assist in a rescue service.
In one possible implementation, the helmet 900 is further configured to send a rescue confirmation notification to the server in response to the first key 903 of the helmet being operated by the delivery capacity, the rescue confirmation notification indicating that the delivery capacity needs rescue.
In other embodiments, the data acquisition module 901 is configured to acquire movement data of the helmet 900 when the helmet 900 is worn, and send the movement data to the processor 902;
a processor 902 for accident recognition based on the motion data;
The helmet 900 is further configured to send, when an accident is identified, an accident notification for triggering the rescue service to the server, and to send a data acquisition instruction to the terminal, where the data acquisition instruction instructs the terminal in communication connection with the helmet to acquire at least one of environment data and state data of the delivery capacity and to send the acquired data to the server, the acquired data being used to assist the rescue service.
In one possible implementation, the processor 902 is further configured to send a rescue confirmation notification to the server in response to the first key 903 of the helmet being operated by the delivery capacity, the rescue confirmation notification indicating that the delivery capacity needs rescue.
In one possible implementation, the data acquisition module 901 includes an acceleration sensor and the motion data includes an acceleration of the helmet;
A processor 902, configured to identify at least one of weightlessness, impact, rolling and attitude change of the helmet based on the acceleration of the helmet acquired by the acceleration sensor;
the processor 902 is configured to determine that an accident has occurred when at least one of weightlessness, impact, rolling and attitude change of the helmet is identified.
In one possible implementation, the helmet 900 is further configured to send the motion data corresponding to the accident to the server, where the collected data is used, together with the motion data corresponding to the accident, to judge and verify the accident on the server and obtain an accident verification result, and the accident verification result is used to trigger the rescue operation in the rescue service.
Fig. 10 is a block diagram of a terminal 1000 according to an embodiment of the present application. Terminal 1000 can include a processor 1001 and a memory 1002.
The processor 1001 may include one or more processing cores, such as a 4-core processor or an 8-core processor. The processor 1001 may be implemented in at least one hardware form of a DSP (Digital Signal Processor), an FPGA (Field-Programmable Gate Array) and a PLA (Programmable Logic Array). The processor 1001 may also include a main processor and a coprocessor; the main processor is a processor for processing data in the awake state, also called a CPU (Central Processing Unit), and the coprocessor is a low-power processor for processing data in the standby state. In some embodiments, the processor 1001 may integrate a GPU (Graphics Processing Unit) for rendering and drawing the content that the display screen needs to display. In some embodiments, the processor 1001 may also include an AI (Artificial Intelligence) processor for processing computing operations related to machine learning.
The memory 1002 may include one or more computer-readable storage media, which may be non-transitory. The memory 1002 may also include high-speed random access memory and non-volatile memory, such as one or more magnetic disk storage devices or flash memory storage devices. In some embodiments, a non-transitory computer-readable storage medium in the memory 1002 is used to store at least one program code, which is executed by the processor 1001 to implement the accident handling method provided by the method embodiments of the present application.
In some embodiments, the terminal 1000 may also optionally include a peripheral interface 1003 and at least one peripheral device. The processor 1001, the memory 1002 and the peripheral interface 1003 may be connected by a bus or signal lines. Each peripheral device may be connected to the peripheral interface 1003 via a bus, a signal line or a circuit board. Specifically, the peripheral devices include at least one of a radio frequency circuit 1004, a display screen 1005, a camera assembly 1006, an audio circuit 1007, a positioning assembly 1008 and a power supply 1009.
The peripheral interface 1003 may be used to connect at least one I/O (Input/Output) related peripheral device to the processor 1001 and the memory 1002. In some embodiments, the processor 1001, the memory 1002 and the peripheral interface 1003 are integrated on the same chip or circuit board; in some other embodiments, any one or two of the processor 1001, the memory 1002 and the peripheral interface 1003 may be implemented on a separate chip or circuit board, which is not limited in this embodiment.
The display screen 1005 is used to display a UI (User Interface). The UI may include graphics, text, icons, video and any combination thereof. When the display screen 1005 is a touch screen, the display screen 1005 also has the ability to capture touch signals on or above its surface. The touch signals may be input to the processor 1001 as control signals for processing. At this time, the display screen 1005 may also be used to provide virtual buttons and/or a virtual keyboard, also referred to as soft buttons and/or a soft keyboard. In some embodiments, there may be one display screen 1005 provided on the front panel of the terminal 1000; in other embodiments, there may be at least two display screens 1005, respectively provided on different surfaces of the terminal 1000 or adopting a folded design; in still other embodiments, the display screen 1005 may be a flexible display screen provided on a curved or folded surface of the terminal 1000. The display screen 1005 may even be arranged in a non-rectangular irregular pattern, i.e., a shaped screen. The display screen 1005 may be made of materials such as an LCD (Liquid Crystal Display) or an OLED (Organic Light-Emitting Diode).
Power supply 1009 is used to power the various components in terminal 1000. The power source 1009 may be alternating current, direct current, disposable battery or rechargeable battery. When the power source 1009 includes a rechargeable battery, the rechargeable battery may support wired or wireless charging. The rechargeable battery may also be used to support fast charge technology.
Those skilled in the art will appreciate that the structure shown in fig. 10 is not limiting and that terminal 1000 can include more or fewer components than shown, or certain components can be combined, or a different arrangement of components can be employed.
Fig. 11 is a schematic structural diagram of a server according to an embodiment of the present application. The server 1100 may vary considerably depending on configuration or performance, and may include one or more processors (Central Processing Units, CPUs) 1101 and one or more memories 1102, where at least one program code is stored in the memory 1102 and is loaded and executed by the processor 1101 to implement the methods provided by the above method embodiments. Of course, the server may also have components such as a wired or wireless network interface, a keyboard and an input/output interface for implementing the functions of the device, which are not described in detail here.
The server 1100 is configured to perform the steps performed by the server in the method embodiments described above.
Embodiments of the present application also provide a computer-readable storage medium having stored therein at least one program code, which is loaded and executed by a processor to implement the accident handling method of any of the above implementations.
Embodiments of the present application also provide a computer program product including at least one program code, which is loaded and executed by a processor to implement the accident handling method of any of the above implementations.
In some embodiments, a computer program according to an embodiment of the present application may be deployed to be executed on one computer device or on multiple computer devices located at one site or on multiple computer devices distributed across multiple sites and interconnected by a communication network, where the multiple computer devices distributed across multiple sites and interconnected by a communication network may constitute a blockchain system.
The foregoing is illustrative of the present application and is not to be construed as limiting thereof, but rather as various modifications, equivalent arrangements, improvements, etc., which fall within the spirit and principles of the present application.
Claims (12)
Priority Applications (2)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202311116745.XA CN117333341B (en) | 2023-08-31 | 2023-08-31 | Accident handling method and smart helmet |
| CN202510155605.6A CN120013732A (en) | 2023-08-31 | 2023-08-31 | Accident handling method and smart helmet |
Applications Claiming Priority (1)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| CN202311116745.XA CN117333341B (en) | 2023-08-31 | 2023-08-31 | Accident handling method and smart helmet |
Related Child Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202510155605.6A Division CN120013732A (en) | 2023-08-31 | 2023-08-31 | Accident handling method and smart helmet |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN117333341A CN117333341A (en) | 2024-01-02 |
| CN117333341B true CN117333341B (en) | 2025-01-24 |
Family
ID=89278064
Family Applications (2)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202510155605.6A Pending CN120013732A (en) | 2023-08-31 | 2023-08-31 | Accident handling method and smart helmet |
| CN202311116745.XA Active CN117333341B (en) | 2023-08-31 | 2023-08-31 | Accident handling method and smart helmet |
Family Applications Before (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202510155605.6A Pending CN120013732A (en) | 2023-08-31 | 2023-08-31 | Accident handling method and smart helmet |
Country Status (1)
| Country | Link |
|---|---|
| CN (2) | CN120013732A (en) |
Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114693018A (en) * | 2020-12-28 | 2022-07-01 | 北京三快在线科技有限公司 | Distribution state monitoring method, device, medium, terminal and wearable device |
| CN115005548A (en) * | 2022-05-31 | 2022-09-06 | 梅州市明眸电子科技有限公司 | An intelligent riding helmet and a helmet-wearing detection method based on the helmet |
| CN116114956A (en) * | 2022-11-03 | 2023-05-16 | 拉扎斯网络科技(上海)有限公司 | Helmet, information interaction method, device and system |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| KR102221508B1 (en) * | 2019-12-10 | 2021-03-02 | (주)지와이네트웍스 | Smart helmet management server for personal mobility and method for operating the server |
| CN113616191A (en) * | 2021-06-30 | 2021-11-09 | 展讯半导体(南京)有限公司 | Vital sign monitoring method and system based on intelligent helmet, helmet and medium |
| CN114359805A (en) * | 2022-01-04 | 2022-04-15 | 济南昊影电子科技有限公司 | Riding state acquisition and accident analysis processing method and system |
| CN116665433A (en) * | 2022-02-18 | 2023-08-29 | 上海博泰悦臻网络技术服务有限公司 | Traffic accident handling method and device |
| CN115299666B (en) * | 2022-08-05 | 2024-07-09 | 西安建筑科技大学 | Embedded intelligent helmet for Internet of things and intelligent system working method |
2023
- 2023-08-31: CN application CN202510155605.6A, published as CN120013732A (en), status Pending
- 2023-08-31: CN application CN202311116745.XA, published as CN117333341B (en), status Active
Patent Citations (3)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN114693018A (en) * | 2020-12-28 | 2022-07-01 | 北京三快在线科技有限公司 | Distribution state monitoring method, device, medium, terminal and wearable device |
| CN115005548A (en) * | 2022-05-31 | 2022-09-06 | 梅州市明眸电子科技有限公司 | An intelligent riding helmet and a helmet-wearing detection method based on the helmet |
| CN116114956A (en) * | 2022-11-03 | 2023-05-16 | 拉扎斯网络科技(上海)有限公司 | Helmet, information interaction method, device and system |
Non-Patent Citations (1)
| Title |
|---|
| Arduino-based collision alarm and monitoring smart helmet; Chen Chuting; Technology Innovation and Application; 2021-01-07; pp. 53-55 * |
Also Published As
| Publication number | Publication date |
|---|---|
| CN120013732A (en) | 2025-05-16 |
| CN117333341A (en) | 2024-01-02 |
Similar Documents
| Publication | Title | |
|---|---|---|
| US20180007994A1 (en) | Wearable integration with helmet | |
| US9833031B2 (en) | Safety accessory with situational awareness and data retention | |
| CN110126783B (en) | Vehicle control method and device | |
| US10591930B2 (en) | Automatic traveling control apparatus and automatic traveling control system | |
| EP3806066A1 (en) | Method and apparatus for controlling automated guided vehicle, and storage medium | |
| US11263885B2 (en) | Information processing device, information processing method, information processing program, terminal device, and method for controlling terminal device | |
| WO2018046015A1 (en) | Alarm method, device and terminal for vehicle | |
| CN105632101A (en) | Human body anti-tumbling early warning method and system | |
| CN109003425A (en) | A kind of method for early warning and relevant device | |
| CN113160520B (en) | Data processing method and device, storage medium and intelligent glasses | |
| CN107406081A (en) | Use the driver of the startup improved response time | |
| CN108391885A (en) | A kind of device and method that detectable jockey goes on a journey safely | |
| KR20190104704A (en) | a terminal for detecting fall | |
| CN108550210A (en) | Control vehicle unlocks the method and apparatus of state | |
| CN111292493A (en) | Vibration reminding method, device, electronic equipment and system | |
| CN110290059A (en) | The method and apparatus for sending social content | |
| KR101569068B1 (en) | Method and apparatus for remote monitoring management for safe operation of group bus | |
| KR20230005640A (en) | Electronic device and method for determining type of falling event | |
| CN108307072A (en) | Electronic device, fall reminding method and related product | |
| CN117333341B (en) | Accident handling method and smart helmet | |
| CN110177240B (en) | A wearable device video call method and wearable device | |
| CN108040182A (en) | A kind of alarm method and mobile terminal | |
| CN113205069A (en) | False license plate detection method and device and computer storage medium | |
| CN118025201B (en) | Method and device for processing data of automatic driving system | |
| CN108521514A (en) | Terminal anti-theft based reminding method, mobile terminal and computer readable storage medium |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |
| | GR01 | Patent grant | |