
CN109828660B - Method and device for controlling application operation based on augmented reality - Google Patents

Method and device for controlling application operation based on augmented reality

Info

Publication number: CN109828660B
Application number: CN201811645324.5A
Authority: CN (China)
Prior art keywords: user, gesture information, information, application, initial
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Other languages: Chinese (zh)
Other versions: CN109828660A (application publication)
Inventor: 曾科凡
Current Assignee: Shenzhen Intellifusion Technologies Co Ltd (the listed assignees may be inaccurate; Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list)
Original Assignee: Shenzhen Intellifusion Technologies Co Ltd
Application filed by Shenzhen Intellifusion Technologies Co Ltd; priority to CN201811645324.5A
Publication of application CN109828660A; application granted; publication of grant CN109828660B


Landscapes

  • User Interface Of Digital Computer (AREA)

Abstract

The embodiments of the present application provide a method for controlling application operation based on augmented reality, comprising the following steps: receiving a request sent by a user, wherein the request carries initial gesture information of the user and is used to indicate that the initial gesture information of the user is to be acquired; parsing the initial gesture information of the user to obtain the gesture information of the user; and performing operation control on a first application according to the gesture information of the user. By acquiring and parsing the user's initial gesture information, the device can control the corresponding application according to the gesture information obtained from the parsing. With this scheme, the user can operate an application without pressing a screen or keyboard, and remote operation control of the application can be achieved, which is very convenient, engaging, easier on the user's eyes, and provides a good user experience.

Description

Method and device for controlling application operation based on augmented reality
Technical Field
The present application relates to the field of Augmented Reality (AR) technologies, and in particular, to a method and an apparatus for controlling application operations based on augmented reality.
Background
With the development and progress of artificial intelligence technology, gesture recognition has begun to enter social life, greatly facilitating people's work and adding to the pleasures of life. However, existing applications generally require the user to trigger system interaction by clicking a screen or a keyboard, which is not only a tedious process but also does some harm to the human eyes. A technology that can control applications with gestures made at a distance from the screen is therefore needed.
Disclosure of Invention
The embodiments of the present application provide a method and a device for controlling application operation based on augmented reality, which allow application operation without the user pressing a screen or a keyboard, and are very convenient.
A first aspect of the embodiments of the present application provides a method for controlling application operation based on augmented reality, comprising:
receiving a request sent by a user, wherein the request carries initial gesture information of the user and is used to indicate that the initial gesture information of the user is to be acquired;
parsing the initial gesture information of the user to obtain the gesture information of the user;
and performing operation control on a first application according to the gesture information of the user.
A second aspect of the embodiments of the present application provides an apparatus for controlling application operation based on augmented reality, comprising:
a gesture obtaining module, configured to receive a request sent by a user, wherein the request carries initial gesture information of the user and is used to indicate that the initial gesture information of the user is to be acquired;
a gesture parsing module, configured to parse the initial gesture information of the user to obtain the gesture information of the user;
and an application control module, configured to perform operation control on a first application according to the gesture information of the user.
A third aspect of embodiments of the present application provides a computer-readable storage medium storing a computer program for execution by a processor to implement the method.
The embodiment of the application has at least the following beneficial effects:
according to the method and the device, the initial gesture information of the user is obtained and analyzed, so that the device can control the corresponding application according to the gesture information obtained through analysis. Through this scheme, need not user point and press screen or keyboard, can carry out the application operation, can realize remote operation control to using, very convenient, and interesting strong, be of value to user's eyes, user experience is good.
Drawings
To describe the technical solutions in the embodiments of the present application or in the prior art more clearly, the drawings needed for the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present application, and those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 is an interaction diagram of a method for controlling application operation based on augmented reality according to an embodiment of the present invention;
fig. 2 is a schematic flowchart of a method for controlling application operation based on augmented reality according to an embodiment of the present invention;
fig. 3 is a schematic flowchart of a method for controlling application operation based on augmented reality according to an embodiment of the present invention;
fig. 4 is a schematic flowchart of a method for controlling application operation based on augmented reality according to an embodiment of the present invention;
fig. 5 is a schematic flowchart of a method for controlling application operation based on augmented reality according to an embodiment of the present invention;
fig. 6 is a schematic structural diagram of a terminal according to an embodiment of the present invention;
fig. 7 is a schematic structural diagram of an apparatus for controlling application operation based on augmented reality according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application, and it is obvious that the described embodiments are only a part of the embodiments of the present application, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The terms "first," "second," and the like in the description and claims of the present application and in the above-described drawings are used for distinguishing between different objects and not for describing a particular order. Furthermore, the terms "include" and "have," as well as any variations thereof, are intended to cover non-exclusive inclusions. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those steps or elements listed, but may alternatively include other steps or elements not listed, or inherent to such process, method, article, or apparatus.
Reference in the specification to "an embodiment" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. The appearances of the phrase in various places in the specification are not necessarily all referring to the same embodiment, nor are separate or alternative embodiments mutually exclusive of other embodiments. It is explicitly and implicitly understood by one skilled in the art that the embodiments described herein can be combined with other embodiments.
The embodiments of this scheme may be implemented on a device, a terminal, or the like. The user's initial gesture information is detected and then parsed to obtain gesture information that can be used to control an application on the device or terminal; that is, the user's initial gesture information is compared with preset gesture information to confirm the operation the user wants to perform on the application, and the mouse can then be controlled, or the application controlled directly, to perform the corresponding operation.
Specifically, referring to fig. 1, fig. 1 is an interaction diagram of a method for controlling application operation based on augmented reality according to an embodiment of the present invention. As shown in fig. 1, the interaction involves a user 101 and a terminal 102, as follows:
the terminal 102 receives a request sent by the user 101, wherein the request carries initial gesture information of the user 101 and is used to indicate that the initial gesture information of the user 101 is to be acquired; the terminal 102 parses the initial gesture information of the user 101 to obtain the gesture information of the user 101; and the terminal 102 performs operation control on the first application according to the gesture information of the user 101.
Preferably, before the terminal 102 acquires the initial gesture information of the user 101, the method further includes:
the terminal 102 receives a face recognition request sent by the user 101, the face recognition request being used to indicate that the face information of the user 101 is to be acquired; the terminal 102 compares the acquired face information of the user 101 with the face information in a preset level database to confirm the level of the user 101; and the terminal 102 displays a first application matching the user's level.
In the embodiments of the present application, by acquiring and parsing the user's initial gesture information, the device can control the corresponding application according to the gesture information obtained from the parsing. With this scheme, the user can operate an application without pressing a screen or keyboard, and remote operation control of the application can be achieved, which is very convenient, engaging, easier on the user's eyes, and provides a good user experience.
Referring to fig. 2, fig. 2 is a schematic flowchart of a method for controlling application operation based on augmented reality according to an embodiment of the present invention. As shown in fig. 2, the method includes steps 201 to 203, as follows:
201. receiving a request sent by a user, wherein the request carries initial gesture information of the user and is used to indicate that the initial gesture information of the user is to be acquired;
A device or terminal receives the request sent by the user; if the user's initial gesture is detected within a specific area, the device identifies and acquires the user's initial gesture information, where the initial gesture information may be any of various gesture actions.
202. parsing the initial gesture information of the user to obtain the gesture information of the user;
After the device or terminal acquires the user's initial gesture information, it parses that information to obtain the user's gesture information.
Preferably, the user's initial gesture information may be compared with the gesture information stored in a preset gesture information base to confirm the user's gesture information.
203. and performing operation control on the first application according to the gesture information of the user.
The user's initial gesture information is parsed into the gesture information as it is to be rendered on a screen or the like, so that the device or terminal can conveniently control the application accordingly, as sketched below.
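As a concrete illustration of steps 201 to 203, the following is a minimal sketch in Python, assuming the initial gesture has already been parsed into a string label and that the application object exposes hypothetical click, double_click, and exit methods; the patent itself does not prescribe these names.

```python
# Minimal sketch of step 203: dispatch a parsed gesture label to an
# application operation. The method names on `app` are assumptions.
GESTURE_ACTIONS = {
    "click": lambda app: app.click(),
    "double_click": lambda app: app.double_click(),
    "exit": lambda app: app.exit(),
}

def control_application(app, gesture_label):
    """Apply the operation mapped to the parsed gesture to the first application."""
    action = GESTURE_ACTIONS.get(gesture_label)
    if action is None:
        return False  # unrecognized gesture: no operation is performed
    action(app)
    return True
```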
In the embodiments of the present application, by acquiring and parsing the user's initial gesture information, the device can control the corresponding application according to the gesture information obtained from the parsing. With this scheme, the user can operate an application without pressing a screen or keyboard, and remote operation control of the application can be achieved, which is very convenient, engaging, easier on the user's eyes, and provides a good user experience.
Referring to fig. 3, fig. 3 is a schematic flowchart of a method for controlling application operation based on augmented reality according to an embodiment of the present invention. As shown in fig. 3, the method includes steps 301-306 as follows:
301. receiving a face recognition request sent by the user, the face recognition request being used to indicate that the face information of the user is to be acquired;
If the device detects that the user is within a preset face acquisition area, it receives the face recognition request sent by the user and triggers a face information acquisition module, such as a camera, to acquire the user's face information.
302. comparing the acquired face information of the user with the face information in a preset level database to confirm the level of the user;
Preferably, if the screen is located in a shopping mall, the preset level database may be built from the user's recorded consumption amount, or from the user's estimated age.
Preferably, step 302 may include S3021 to S3025, as follows:
S3021. confirming the gender, face shape information, and facial feature information of the user according to the face information of the user;
The device is trained, on the basis of deep learning, to recognize face information of different genders, face shapes, and facial features, and then acquires and identifies the user's gender, face shape information, and facial feature information.
S3022. obtaining, from the preset level database, the face information k1 consistent with the gender of the user;
After the user's gender is confirmed, for example as female, the device acquires the face information k1 of all females in the preset level database.
S3023. acquiring, from the face information k1, the face information k2 consistent with the face shape information of the user;
Then, from the female face information k1, the device searches for and acquires the face information k2 whose face shape is consistent with the user's; for example, if the user's face is round, the round-face entries are selected from k1.
S3024. acquiring, from the face information k2, the face information k consistent with the facial feature information of the user;
Further, the face information whose facial features are consistent with the user's is obtained from k2, the facial features including the ears, nose, eyes, mouth, eyebrows, and so on.
S3025. confirming that the level corresponding to the face information k is the level of the user.
That is, the face information k is the face information matching the user; a sketch of the whole matching cascade follows.
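The following Python sketch illustrates S3021 to S3025 under stated assumptions: each database entry is assumed to carry the gender, face shape, facial-feature descriptor, and level recorded at enrollment, and the record set is narrowed in three passes before the surviving entry's level is returned. The field names and the feature-distance threshold are assumptions, not part of the patent.

```python
# Sketch of the S3021-S3025 matching cascade: gender -> face shape ->
# facial features -> level. The 0.5 distance threshold is an assumption.
from dataclasses import dataclass

@dataclass
class FaceRecord:
    gender: str          # e.g. "female"
    face_shape: str      # e.g. "round", "oval"
    features: tuple      # numeric facial-feature descriptor
    level: int           # user level stored in the preset level database

def match_user_level(db, gender, face_shape, features, max_dist=0.5):
    k1 = [r for r in db if r.gender == gender]          # S3022
    k2 = [r for r in k1 if r.face_shape == face_shape]  # S3023

    def dist(r):  # Euclidean distance between feature descriptors
        return sum((a - b) ** 2 for a, b in zip(r.features, features)) ** 0.5

    candidates = [r for r in k2 if dist(r) <= max_dist]  # S3024
    if not candidates:
        return None  # no consistent face information found
    return min(candidates, key=dist).level               # S3025
```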
303. displaying a first application matching the user's level;
Alternatively, a plurality of applications matching the user's level may be displayed for the user to choose from.
304. receiving a request sent by the user, wherein the request carries initial gesture information of the user and is used to indicate that the initial gesture information of the user is to be acquired;
305. parsing the initial gesture information of the user to obtain the gesture information of the user;
Preferably, the user's initial gesture information may be compared with the gesture information stored in a preset gesture information base to confirm the user's gesture information.
For example, the preset gesture information base may specify that when the user's initial gesture information is a hand clap, the corresponding gesture information is a double-click on the application, as in the sketch below.
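A minimal sketch of the preset gesture information base from step 305, assuming the raw gestures have already been recognized as string labels; the specific label names are assumptions for illustration.

```python
# Sketch of the preset gesture information base: the initial gesture is
# looked up to confirm the on-screen gesture information it stands for.
PRESET_GESTURE_BASE = {
    "clap": "double_click",   # clapping hands -> double-click the application
    "fist_held": "click",     # fist held for the preset time -> click
    "scissors": "exit",       # scissors shape -> exit
}

def parse_gesture(initial_gesture):
    """Return the confirmed gesture information, or None if there is no match."""
    return PRESET_GESTURE_BASE.get(initial_gesture)
```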
306. and performing operation control on the first application according to the gesture information of the user.
In the embodiments of the present application, before the user's initial gesture information is acquired, the user's face information is acquired and the application corresponding to the user's level is displayed; the initial gesture information is then acquired and parsed, so that the device can control the corresponding application according to the gesture information obtained from the parsing. With this scheme, the user can operate an application without pressing a screen or keyboard, and remote operation control of the application can be achieved, which is very convenient, engaging, easier on the user's eyes, and provides a good user experience.
Referring to fig. 4, fig. 4 is a schematic flowchart of a method for controlling application operation based on augmented reality according to an embodiment of the present invention. As shown in fig. 4, the method includes steps 401 to 406, as follows:
401. receiving a face recognition request sent by the user, the face recognition request being used to indicate that the face information of the user is to be acquired;
402. comparing the acquired face information of the user with the face information in a preset level database to confirm the level of the user;
403. displaying a first application matching the user's level;
The first application is displayed on a display screen of the device.
404. receiving a request sent by the user, wherein the request carries initial gesture information of the user and is used to indicate that the initial gesture information of the user is to be acquired;
405. when the initial gesture information of the user is a first gesture held continuously for a preset time, confirming that the gesture information of the user is a click;
For example, the first gesture may be a clenched fist: if the user holds the fist for 5 s, the gesture is judged to be a click on the first application.
Or, when the initial gesture information of the user is a second gesture, confirming that the gesture information of the user is an exit;
For example, the second gesture may be a scissors shape: when the user's initial gesture information is the scissors shape, the user's gesture information is confirmed as an exit.
Or, when the initial gesture information of the user points to a first position in a first preset area, acquiring the coordinates (x1, y1) of the first position; calling a preset mapping algorithm to process the coordinates of the first position to obtain the coordinates (x2, y2) of a second position corresponding to the coordinates of the first position; and confirming that the gesture information of the user points to the second position in a second preset area.
Preferably, the first preset area is an area at a specific position in front of the user; the position coordinates of the user's initial gesture in the first preset area are acquired and then mapped into the second preset area to obtain the corresponding on-screen coordinates.
The preset mapping algorithm may include:
confirming that the plane of the first preset area is parallel to the plane of the second preset area;
acquiring the distance L from the center point of the first preset area to the center point of the second preset area;
acquiring the included angle θ between the horizontal line and the line connecting the center point of the first preset area to the center point of the second preset area.
With the plane of the first preset area parallel to the plane of the second preset area, the distance between the center points of the two planes is obtained; the coordinates can then be converted according to the angle θ.
The coordinates of the second position may be expressed as:
x2 = x1 + L·cos θ, y2 = y1 + L·sin θ.
A sketch of this mapping follows.
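The mapping can be transcribed directly into code. The sketch below assumes the two planes are parallel, as the algorithm requires, with L the distance between the center points of the two areas and θ the angle between their connecting line and the horizontal.

```python
# Sketch of the preset mapping algorithm: map a point (x1, y1) in the
# first preset area to (x2, y2) in the second preset area.
import math

def map_to_screen(x1, y1, L, theta):
    """theta is in radians; L is the distance between the area center points."""
    x2 = x1 + L * math.cos(theta)
    y2 = y1 + L * math.sin(theta)
    return x2, y2

# A gesture at (10, 20) with L = 100 and theta = 30 degrees maps to
# approximately (96.6, 70.0).
print(map_to_screen(10, 20, 100, math.radians(30)))
```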
406. and performing operation control on the first application according to the gesture information of the user.
In the embodiments of the present application, before the user's initial gesture information is acquired, the user's face information is acquired and the application corresponding to the user's level is displayed; the initial gesture information is then acquired and parsed, so that the device can control the corresponding application according to the gesture information obtained from the parsing. Furthermore, with this scheme, the user can operate an application without pressing a screen or keyboard, and remote operation control of the application can be achieved, which is very convenient, engaging, easier on the user's eyes, and provides a good user experience.
Referring to fig. 5, fig. 5 is a schematic flowchart of a method for controlling application operation based on augmented reality according to an embodiment of the present application. As shown in fig. 5, the method may include steps 501 to 507, as follows:
501. receiving a request sent by a user, wherein the request carries initial gesture information of the user and is used to indicate that the initial gesture information of the user is to be acquired;
502. confirming whether the initial gesture information of the user is successfully acquired;
503. if yes, parsing the initial gesture information of the user to obtain the gesture information of the user, and executing step 507; otherwise, executing step 504;
504. prompting the user to present the initial gesture again, so as to re-acquire the initial gesture information of the user;
The user may be reminded to present the initial gesture again by a voice prompt, an on-screen message, or the like.
505. if the second acquisition also fails, acquiring eye information of the user;
The eye information may include opened eyes, closed eyes, blinking, moving eyes, still eyes, and so on.
506. performing operation control on the first application according to the eye information;
For example, if the user blinks twice within the preset time, it is judged that the user clicks to enter the application; if the user blinks four times within the preset time, it is determined that the application is to be exited; and so on.
Further, the user's head information may alternatively be obtained, such as shaking, nodding, or turning the head, to achieve different specific controls; the fallback flow is sketched below.
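A sketch of the fallback flow in steps 501 to 507, with hypothetical helper callables (acquire_gesture, prompt_retry, acquire_eye_info) standing in for the acquisition hardware: gesture acquisition is attempted twice, after which control falls back to eye information, e.g. two blinks within the window to enter and four to exit.

```python
# Sketch of steps 501-507: two gesture acquisition attempts, then an
# eye-information fallback. Blink-count meanings follow the example above.
EYE_ACTIONS = {2: "click_enter", 4: "exit"}

def control_with_fallback(acquire_gesture, prompt_retry, acquire_eye_info):
    gesture = acquire_gesture()              # step 502: first attempt
    if gesture is None:
        prompt_retry()                       # step 504: voice or on-screen prompt
        gesture = acquire_gesture()          # second attempt
    if gesture is not None:
        return ("gesture", gesture)          # steps 503/507: control by gesture
    blinks = acquire_eye_info()              # step 505: e.g. blink count
    return ("eye", EYE_ACTIONS.get(blinks))  # step 506: control by eye info
```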
507. and performing operation control on the first application according to the gesture information of the user.
Specifically, if the user's initial gesture information is two hands clasped tightly, it is parsed as a double-click on the first application; if the user's hands are crossed, it is parsed as an exit operation on the first application; if the user's palm is open, it is parsed as a one-handed operation on the first application; and so on.
For example, if the first application is a video playback application, the volume of the video playback application is turned up when the user's initial gesture information is acquired as an upward movement, and/or the video in the video playback application is fast-forwarded when the user's initial gesture information is acquired as a horizontal waving movement.
When the first application is a game-themed application, acquiring the user's initial gesture information as a fist clicks to open the game-themed application, and/or acquiring the user's initial gesture information as crossed hands switches the game-themed application into a two-player operation mode. A per-application sketch of such mappings follows.
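The per-application mappings just described can be tabulated; in this Python sketch the application types and gesture labels are illustrative assumptions.

```python
# Sketch of application-specific gesture mappings: the same initial
# gesture can mean different operations in different applications.
APP_GESTURE_MAPS = {
    "video_player": {
        "move_up": "volume_up",             # upward movement -> raise volume
        "horizontal_wave": "fast_forward",  # horizontal waving -> fast-forward
    },
    "game": {
        "fist": "click_open",               # fist -> click to open the game
        "hands_crossed": "two_player_mode", # crossed hands -> two-player mode
    },
}

def control_by_app_type(app_type, initial_gesture):
    """Return the operation for this application type, or None if unmapped."""
    return APP_GESTURE_MAPS.get(app_type, {}).get(initial_gesture)
```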
In the embodiments of the present application, by acquiring and parsing the user's initial gesture information, the device can control the corresponding application according to the gesture information obtained from the parsing. When acquisition of the user's initial gesture information fails, the gesture information can be re-acquired, or the user's eye information can be acquired, so that the application can still be controlled. With this scheme, the user can operate an application without pressing a screen or keyboard, and remote operation control of the application can be achieved, which is very convenient, engaging, easier on the user's eyes, and provides a good user experience.
In accordance with the foregoing embodiments, please refer to fig. 6, which is a schematic structural diagram of a terminal according to an embodiment of the present application. As shown in the figure, the terminal includes a processor, an input device, an output device, and a memory, which are connected to one another. The memory stores a computer program comprising program instructions, and the processor is configured to call the program instructions, including instructions for performing the following steps:
receiving a request sent by a user, wherein the request carries initial gesture information of the user and is used to indicate that the initial gesture information of the user is to be acquired;
parsing the initial gesture information of the user to obtain the gesture information of the user;
and performing operation control on the first application according to the gesture information of the user.
In the embodiments of the present application, by acquiring and parsing the user's initial gesture information, the device can control the corresponding application according to the gesture information obtained from the parsing. With this scheme, the user can operate an application without pressing a screen or keyboard, and remote operation control of the application can be achieved, which is very convenient, engaging, easier on the user's eyes, and provides a good user experience.
The above description has presented the solutions of the embodiments of the present application mainly from the perspective of the method-side implementation. It is understood that, to implement the above functions, the terminal includes corresponding hardware structures and/or software modules for performing each function. Those of skill in the art will readily appreciate that the various illustrative units and algorithm steps described in connection with the embodiments provided herein can be implemented as hardware or as a combination of hardware and computer software. Whether a function is performed by hardware or by computer software driving hardware depends upon the particular application and the design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiment of the present application, the terminal may be divided into the functional units according to the above method example, for example, each functional unit may be divided corresponding to each function, or two or more functions may be integrated into one processing unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit. It should be noted that the division of the unit in the embodiment of the present application is schematic, and is only a logic function division, and there may be another division manner in actual implementation.
In accordance with the above, please refer to fig. 7, which is a schematic structural diagram of an apparatus for controlling application operation based on augmented reality according to an embodiment of the present application; the apparatus includes a gesture obtaining module 701, a gesture parsing module 702, and an application control module 703, as follows:
a gesture obtaining module 701, configured to receive a request sent by a user, wherein the request carries initial gesture information of the user and is used to indicate that the initial gesture information of the user is to be acquired;
a gesture parsing module 702, configured to parse the initial gesture information of the user to obtain the gesture information of the user;
and an application control module 703, configured to perform operation control on the first application according to the gesture information of the user.
Preferably, the apparatus further comprises a face recognition module, configured to:
receive a face recognition request sent by the user, wherein the face recognition request carries face information of the user and is used to indicate that the face information of the user is to be acquired; compare the acquired face information of the user with the face information in a preset level database to confirm the level of the user; and display a first application matching the user's level.
Preferably, the gesture parsing module is further configured to:
when the initial gesture information of the user is a first gesture held continuously for a preset time, confirm that the gesture information of the user is a click; or, when the initial gesture information of the user is a second gesture, confirm that the gesture information of the user is an exit; or, when the initial gesture information of the user points to a first position in a first preset area, acquire the coordinates (x1, y1) of the first position, call a preset mapping algorithm to process the coordinates of the first position to obtain the coordinates (x2, y2) of a second position corresponding to the coordinates of the first position, and confirm that the gesture information of the user points to the second position in a second preset area.
Preferably, the gesture parsing module is further configured to:
when the initial gesture information of the user points to the first position in the first preset area, confirm that the plane of the first preset area is parallel to the plane of the second preset area; acquire the distance L from the center point of the first preset area to the center point of the second preset area; acquire the included angle θ between the horizontal line and the line connecting the center point of the first preset area to the center point of the second preset area; the coordinates of the second position are: x2 = x1 + L·cos θ and y2 = y1 + L·sin θ.
Therefore, in the embodiments of the present application, by acquiring and parsing the user's initial gesture information, the device can control the corresponding application according to the gesture information obtained from the parsing. With this scheme, the user can operate an application without pressing a screen or keyboard, and remote operation control of the application can be achieved, which is very convenient, engaging, easier on the user's eyes, and provides a good user experience.
Embodiments of the present application also provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program enables a computer to execute part or all of the steps of any one of the methods for controlling application operation based on augmented reality as described in the above method embodiments.
Embodiments of the present application also provide a computer program product, which includes a non-transitory computer-readable storage medium storing a computer program, and the computer program causes a computer to execute some or all of the steps of any one of the methods for controlling application operation based on augmented reality as described in the above method embodiments.
It should be noted that for simplicity of description, the above-mentioned embodiments of the method are described as a series of acts, but those skilled in the art should understand that the present application is not limited by the described order of acts, as some steps may be performed in other orders or simultaneously according to the present application. Further, those skilled in the art should also appreciate that the embodiments described in the specification are preferred embodiments and that the acts and modules referred to are not necessarily required in this application.
In the foregoing embodiments, the descriptions of the respective embodiments have respective emphasis, and for parts that are not described in detail in a certain embodiment, reference may be made to related descriptions of other embodiments.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus may be implemented in other manners. For example, the above-described apparatus embodiments are merely illustrative; for instance, the division of units is only a division of logical functions, and other divisions are possible in actual implementation: multiple units or components may be combined or integrated into another system, or some features may be omitted or not implemented. In addition, the mutual couplings, direct couplings, or communication connections shown or discussed may be indirect couplings or communication connections through some interfaces, devices, or units, and may be in electrical or other forms.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one position, or may be distributed on multiple network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit may be implemented in the form of hardware, or may be implemented in the form of a software program module.
The integrated units, if implemented in the form of software program modules and sold or used as stand-alone products, may be stored in a computer-readable memory. Based on such understanding, the technical solution of the present application, in essence, or the part of it contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product stored in a memory and including several instructions for causing a computer device (which may be a personal computer, a server, or a network device) to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned memory includes various media capable of storing program code, such as a USB flash disk, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, and magnetic or optical disks.
Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable memory, which may include: flash memory disks, read-only memory, random access memory, magnetic or optical disks, and the like.
The foregoing detailed description of the embodiments of the present application illustrates the principles and implementations of the present application; the above description of the embodiments is provided only to help understand the method and its core concept. Meanwhile, a person skilled in the art may, following the idea of the present application, vary the specific embodiments and the application scope. In summary, the content of this specification should not be construed as limiting the present application.

Claims (8)

1. A method for controlling application operation based on augmented reality, comprising:
receiving a request sent by a user, wherein the request carries initial gesture information of the user and is used to indicate that the initial gesture information of the user is to be acquired;
parsing the initial gesture information of the user to obtain the gesture information of the user, wherein parsing the initial gesture information of the user to obtain the gesture information of the user includes: when the initial gesture information of the user points to a first position in a first preset area, acquiring the coordinates of the first position, the distance between the center point of the first preset area and the center point of a second preset area pointed to by the user, and the included angle between the horizontal line and the line connecting the center point of the first preset area to the center point of the second preset area; and calculating the coordinates of the second position in the second preset area pointed to by the user from the coordinates of the first position, the distance between the center points of the two areas, and the included angle, wherein the plane of the first preset area is parallel to the plane of the second preset area;
and performing operation control on a first application according to the gesture information of the user;
the method further comprising:
if acquisition of the initial gesture information of the user fails, prompting the user to present the initial gesture again, so as to re-acquire the initial gesture information of the user; if the second acquisition also fails, acquiring eye information of the user; and performing operation control on the first application according to the eye information.
2. The method of claim 1, wherein prior to receiving the request sent by the user, the method further comprises:
receiving a face recognition request sent by the user, wherein the face recognition request carries face information of the user and is used to indicate that the face information of the user is to be acquired;
comparing the acquired face information of the user with the face information in a preset level database to confirm the level of the user;
and displaying a first application matching the user's level.
3. The method of claim 2, wherein the parsing the initial gesture information of the user to obtain the gesture information of the user further comprises:
when the initial gesture information of the user is a first gesture held continuously for a preset time, confirming that the gesture information of the user is a click;
or, when the initial gesture information of the user is a second gesture, confirming that the gesture information of the user is an exit.
4. The method according to any one of claims 1 to 3, wherein the performing operation control on the first application according to the gesture information of the user comprises:
when the first application is a video playback application, turning up the volume of the video playback application if the initial gesture information of the user is acquired as an upward movement, and/or fast-forwarding the video in the video playback application if the initial gesture information of the user is acquired as a horizontal waving movement;
and when the first application is a game-themed application, clicking to open the game-themed application if the initial gesture information of the user is acquired as a fist, and/or switching the game-themed application into a two-player operation mode if the initial gesture information of the user is acquired as crossed hands.
5. An apparatus for controlling operation of an application based on augmented reality, comprising:
a gesture obtaining module, configured to receive a request sent by a user, wherein the request carries initial gesture information of the user and is used to indicate that the initial gesture information of the user is to be acquired;
a gesture parsing module, configured to parse the initial gesture information of the user to obtain the gesture information of the user; the gesture parsing module being further configured to: when the initial gesture information of the user points to a first position in a first preset area, acquire the coordinates of the first position, the distance between the center point of the first preset area and the center point of a second preset area pointed to by the user, and the included angle between the horizontal line and the line connecting the center point of the first preset area to the center point of the second preset area; and calculate the coordinates of the second position in the second preset area pointed to by the user from the coordinates of the first position, the distance between the center points of the two areas, and the included angle, wherein the plane of the first preset area is parallel to the plane of the second preset area;
and an application control module, configured to perform operation control on the first application according to the gesture information of the user;
wherein the gesture obtaining module is further configured to: if acquisition of the initial gesture information of the user fails, prompt the user to present the initial gesture again, so as to re-acquire the initial gesture information of the user; and if the second acquisition also fails, acquire eye information of the user; and the application control module is further configured to perform operation control on the first application according to the eye information.
6. The apparatus of claim 5, further comprising a face recognition module to:
receive a face recognition request sent by the user, wherein the face recognition request carries face information of the user and is used to indicate that the face information of the user is to be acquired; compare the acquired face information of the user with the face information in a preset level database to confirm the level of the user; and display a first application matching the user's level.
7. The apparatus of claim 6, wherein the gesture parsing module is further configured to:
when the initial gesture information of the user is a first gesture held continuously for a preset time, confirm that the gesture information of the user is a click; or, when the initial gesture information of the user is a second gesture, confirm that the gesture information of the user is an exit.
8. A computer-readable storage medium storing a computer program for execution by a processor to implement the method of any one of claims 1 to 4.
CN201811645324.5A, filed 2018-12-29 (priority date 2018-12-29): Method and device for controlling application operation based on augmented reality. Status: Active. Granted as CN109828660B (en).

Priority Applications (1)

CN201811645324.5A (CN109828660B), priority date 2018-12-29, filing date 2018-12-29: Method and device for controlling application operation based on augmented reality

Applications Claiming Priority (1)

CN201811645324.5A (CN109828660B), priority date 2018-12-29, filing date 2018-12-29: Method and device for controlling application operation based on augmented reality

Publications (2)

CN109828660A, published 2019-05-31 (application publication)
CN109828660B, published 2022-05-17 (granted patent)

Family

ID=66861505

Family Applications (1)

CN201811645324.5A: Method and device for controlling application operation based on augmented reality (Active; granted as CN109828660B)

Country Status (1)

CN: CN109828660B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110727345B (en) * 2019-09-19 2023-12-26 北京耐德佳显示技术有限公司 Method and system for realizing man-machine interaction through finger intersection movement
CN111580652B (en) * 2020-05-06 2024-01-16 Oppo广东移动通信有限公司 Video playback control method, device, augmented reality device and storage medium
CN114967484A (en) * 2022-04-20 2022-08-30 海尔(深圳)研发有限责任公司 Method and device for controlling home appliance, home appliance, and storage medium
CN115277143B (en) * 2022-07-19 2023-10-20 中天动力科技(深圳)有限公司 Data security transmission method, device, equipment and storage medium

Family Cites Families (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2672880B1 (en) * 2011-02-09 2019-05-22 Apple Inc. Gaze detection in a 3d mapping environment
CN103019559B (en) * 2012-11-27 2016-03-23 海信集团有限公司 Gesture controls projection display equipment and control method thereof
TWI547626B (en) * 2013-05-31 2016-09-01 原相科技股份有限公司 Apparatus having the gesture sensor
CN104914985A (en) * 2014-03-13 2015-09-16 扬智科技股份有限公司 Gesture control method and system and video stream processing device
CN204347750U (en) * 2014-10-17 2015-05-20 李妍 head-mounted display apparatus
CN104656903A (en) * 2015-03-04 2015-05-27 联想(北京)有限公司 Processing method for display image and electronic equipment
CN105204743A (en) * 2015-09-28 2015-12-30 百度在线网络技术(北京)有限公司 Interaction control method and device for speech and video communication
CN106705837B (en) * 2015-11-17 2019-12-06 华为技术有限公司 Object measuring method and device based on gestures
CN106200916B (en) * 2016-06-28 2019-07-02 Oppo广东移动通信有限公司 Augmented reality image control method, device and terminal device
CN106502424A (en) * 2016-11-29 2017-03-15 上海小持智能科技有限公司 Based on the interactive augmented reality system of speech gestures and limb action
CN107272890A (en) * 2017-05-26 2017-10-20 歌尔科技有限公司 A kind of man-machine interaction method and device based on gesture identification
CN107679860A (en) * 2017-08-09 2018-02-09 百度在线网络技术(北京)有限公司 A kind of method, apparatus of user authentication, equipment and computer-readable storage medium
CN108595005A (en) * 2018-04-20 2018-09-28 深圳市天轨年华文化科技有限公司 Exchange method, device based on augmented reality and computer readable storage medium
CN109086590B (en) * 2018-08-13 2020-09-04 广东小天才科技有限公司 Interface display method of electronic equipment and electronic equipment

Also Published As

CN109828660A (en), published 2019-05-31


Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination
GR01: Patent grant