CN105589552A - Projection interaction method and projection interaction device based on gestures - Google Patents

Info

Publication number
CN105589552A
CN105589552A (Application CN201410601367.9A)
Authority
CN
China
Prior art keywords
projection
gesture
infrared
image acquisition
camera
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201410601367.9A
Other languages
Chinese (zh)
Other versions
CN105589552B (en)
Inventor
宋金龙
马军
王琳
李翔
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lenovo Beijing Ltd
Original Assignee
Lenovo Beijing Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lenovo Beijing Ltd
Priority to CN201410601367.9A
Publication of CN105589552A
Application granted
Publication of CN105589552B
Legal status: Active

Landscapes

  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)
  • Projection Apparatus (AREA)

Abstract

The invention provides a gesture-based projection interaction method and projection interaction device. The method is applied to the projection interaction device, which comprises a projection unit, a first camera, and a second camera; the overlapping area of the image acquisition areas of the two cameras covers the projection desktop of the projection unit. The method comprises the following steps: sequentially acquiring a first infrared image frame and a first color image frame of the projection desktop through the first camera within a predetermined time period; sequentially acquiring a second infrared image frame and a second color image frame of the projection desktop through the second camera within the predetermined time period; recovering the finger depth from the first and second infrared image frames; recovering the projection desktop depth from the first and second color image frames; determining a gesture and the coordinate information of the gesture from the finger depth and the projection desktop depth; determining the operation represented by the gesture from the gesture and its coordinate information; and performing the corresponding operation on the projection unit.

Description

Gesture-based projection interaction method and projection interaction equipment
Technical Field
The invention relates to the field of image processing, in particular to a projection interaction method and projection interaction equipment based on gestures.
Background
As technology develops, projection is widely used in many settings, such as teaching and meetings. Performing gesture recognition with a projector and cameras enables better human-machine interaction and makes projection more convenient to use.
In the prior art, the camera used is typically either an infrared camera or a color camera. Recovering the finger depth with a color camera may be interfered with by the projector's output, while an infrared camera cannot recover the projection desktop depth. The relationship between the finger depth and the projection desktop depth is therefore difficult to determine accurately, and the gesture recognition effect is poor.
Disclosure of Invention
The invention provides a gesture-based projection interaction method and projection interaction device that can accurately determine the relationship between the finger depth and the projection desktop depth, thereby determine the operation represented by a gesture, and achieve a better gesture recognition effect.
In a first aspect, a gesture-based projection interaction method is provided, and is applied to a projection interaction device, where the projection interaction device includes a projection unit, a first camera, and a second camera, where both the first camera and the second camera can alternately acquire an infrared image and a color image, and an overlapping area of an image acquisition area of the first camera and an image acquisition area of the second camera covers a projection desktop of the projection unit, and the method includes: acquiring a first infrared image frame and a first color image frame of the projection desktop sequentially through the first camera within a preset time period, and acquiring a second infrared image frame and a second color image frame of the projection desktop sequentially through the second camera within the preset time period; restoring the finger depth according to the first infrared image frame and the second infrared image frame, and restoring the projection desktop depth according to the first color image and the second color image; determining a gesture and coordinate information of the gesture according to the finger depth and the projection desktop depth; and determining the operation represented by the gesture according to the gesture and the coordinate information of the gesture and performing corresponding operation on the projection unit.
In a second aspect, there is provided a projection interaction device, comprising: a projection unit, a first image acquisition unit, a second image acquisition unit, and a control unit. The projection unit is used for projecting display data of the projection interaction device onto a projection desktop. The first image acquisition unit and the second image acquisition unit are used for alternately acquiring infrared images and color images, and the overlapping area of the image acquisition area of the first image acquisition unit and that of the second image acquisition unit covers the projection desktop of the projection unit. The device may further include an infrared fill light for performing infrared supplementation of the infrared images of the first and second image acquisition units. The control unit comprises an image acquisition subunit, a depth recovery subunit, a coordinate determination subunit, and a gesture operation subunit. The image acquisition subunit is used for acquiring a first infrared image frame and a first color image frame of the projection desktop at the current moment through the first image acquisition unit, and acquiring a second infrared image frame and a second color image frame of the projection desktop at the current moment through the second image acquisition unit. The depth recovery subunit is used for recovering the finger depth from the first and second infrared image frames and recovering the projection desktop depth from the first and second color image frames. The coordinate determination subunit is used for determining the gesture and the coordinate information of the gesture from the finger depth and the projection desktop depth. The gesture operation subunit is used for determining the operation represented by the gesture from the gesture and its coordinate information, and performing the corresponding operation on the projection unit.
According to the gesture-based projection interaction method and projection interaction device, two cameras that can each acquire both infrared and color images sequentially acquire an infrared image and a color image. The finger depth is recovered from the infrared images acquired by the two cameras, and the projection desktop depth is recovered from the color images acquired by the two cameras, so that the relationship between the finger depth and the projection desktop depth can be accurately determined, the operation represented by the gesture can then be determined, and a good gesture recognition effect is achieved.
Drawings
FIG. 1 is a flowchart of a gesture-based projection interaction method according to an embodiment of the present invention.
Fig. 2 is a schematic diagram of an external structure of a projection interaction device according to an embodiment of the present invention.
Fig. 3 is a schematic structural diagram of a projection interaction device according to an embodiment of the present invention.
Fig. 4 is another structural schematic diagram of a projection interaction device according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
FIG. 1 is a flowchart of a gesture-based projection interaction method according to an embodiment of the present invention. The method is applied to the projection interaction equipment and is executed by the control device of the projection interaction equipment. The projection interaction device comprises a projection unit, a first camera and a second camera, wherein both the first camera and the second camera can alternately acquire infrared images and color images, and the overlapping area of the image acquisition area of the first camera and the image acquisition area of the second camera covers the projection desktop of the projection unit. The method comprises the following steps:
101, acquiring a first infrared image frame and a first color image frame of the projection desktop sequentially by the first camera within a predetermined time period, and acquiring a second infrared image frame and a second color image frame of the projection desktop sequentially by the second camera within the predetermined time period.
It should be understood that a projection desktop refers to the area where the projection unit projects to form an image, and may be a wall, a projection screen, and so on.
It should be understood that the predetermined time period is a short period within which the projection desktop can be considered unchanged. In other words, the first infrared image frame, the first color image frame, the second infrared image frame, and the second color image frame acquired within the predetermined time period may be regarded as acquisitions of the same projection desktop from different angles and in different manners. For example, the predetermined time period may be specified as the time occupied by 5 frames, during which the projection desktop does not substantially change. Of course, the predetermined time period may also be specified as a longer or shorter period.
In addition, the first camera and the second camera can alternately acquire the infrared image and the color image, and thus the infrared image frame and the color image frame can be sequentially acquired by the first camera and the second camera, respectively. The first camera may acquire the first infrared image frame first and then acquire the first color image frame, or acquire the first color image frame first and then acquire the first infrared image frame. Similarly, the second camera may acquire the second infrared image frame before acquiring the second color image frame, or acquire the second color image frame before acquiring the second infrared image frame.
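The alternation described above can be sketched as a capture schedule. This is an illustrative sketch only, not part of the patent; the camera names and frame labels are hypothetical.

```python
# Illustrative sketch only (not part of the patent): within each
# predetermined time period, each camera contributes one infrared frame
# and one color frame; the per-camera order (IR-then-color or
# color-then-IR) is free, as the description above notes.

def capture_schedule(num_periods, ir_first=True):
    """Return the (camera, frame_type) capture order for each period."""
    first, second = ("ir", "color") if ir_first else ("color", "ir")
    return [
        # All four frames of a period can be treated as views of one
        # effectively static projection desktop.
        [("cam1", first), ("cam1", second),
         ("cam2", first), ("cam2", second)]
        for _ in range(num_periods)
    ]

one_period = capture_schedule(1)[0]
```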
And 102, restoring the finger depth according to the first infrared image frame and the second infrared image frame, and restoring the projection desktop depth according to the first color image and the second color image.
It should be understood that the methods for recovering the finger depth from the infrared image frames and the projection desktop depth from the color image frames with a binocular algorithm may refer to the prior art, and are not described herein again.
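As a hedged illustration of what the prior-art binocular recovery computes: for a rectified stereo pair, depth follows Z = f * B / d from the pixel disparity d, the focal length f (in pixels), and the camera baseline B. The function name and all numeric values below are hypothetical.

```python
import numpy as np

# Illustrative sketch of the prior-art computation the patent defers to:
# for a rectified stereo pair, depth Z = f * B / d, where d is the pixel
# disparity between the two views. All values below are hypothetical.

def depth_from_disparity(disparity, focal_px, baseline_m):
    """Rectified-stereo depth; unmatched pixels (d <= 0) map to infinity."""
    d = np.asarray(disparity, dtype=float)
    return np.where(d > 0, focal_px * baseline_m / np.maximum(d, 1e-9), np.inf)

# Hypothetical rig: 600 px focal length, 8 cm baseline, so a disparity
# of 48 px corresponds to a depth of 1 meter.
z = depth_from_disparity([48.0, 24.0, 0.0], focal_px=600.0, baseline_m=0.08)
```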
103, determining the gesture and the coordinate information of the gesture according to the finger depth and the projection desktop depth.
104, determining the operation represented by the gesture according to the gesture and the coordinate information of the gesture, and performing corresponding operation on the projection unit.
In the embodiment of the invention, two cameras that can each acquire both infrared and color images sequentially acquire an infrared image and a color image. The finger depth is recovered from the infrared images acquired by the two cameras, and the projection desktop depth is recovered from the color images acquired by the two cameras, so that the relationship between the finger depth and the projection desktop depth can be accurately determined, the operation represented by the gesture can then be determined, and a better gesture recognition effect is achieved.
Optionally, as an embodiment, in step 104, determining, according to the gesture and the coordinate information of the gesture, the operation represented by the gesture includes: determining operation corresponding to the gesture according to the plurality of continuous gestures and the corresponding coordinate information; or determining the operation corresponding to the gesture according to the single gesture and the corresponding coordinate information.
For example, when determining the operation of dragging, the operation may be determined by a plurality of consecutive gestures and corresponding coordinate information. For another example, when determining the operation of clicking, the operation may be determined by a single gesture and corresponding coordinate information, and so on. Of course, other operations may also be determined by multiple continuous gestures and corresponding coordinate information, or other operations may be determined by a single gesture and corresponding coordinate information, which may refer to the prior art for specific implementation, and details of embodiments of the present invention are not described herein.
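The two dispatch paths above (a single gesture yielding a click, a run of consecutive gestures yielding a drag) can be sketched as follows. All names and the operation vocabulary are illustrative assumptions, not the patent's.

```python
# Illustrative dispatcher (names and operation vocabulary are assumptions,
# not the patent's): a single gesture sample maps to a click at its
# coordinate; several consecutive samples map to a drag trajectory.

def operation_from_gestures(samples):
    """samples: list of (gesture, (x, y)) tuples in desktop coordinates."""
    if not samples:
        return None
    if len(samples) == 1:
        _, coord = samples[0]
        return ("click", coord)
    (_, start), (_, end) = samples[0], samples[-1]
    return ("drag", start, end)

click = operation_from_gestures([("tap", (120, 80))])
drag = operation_from_gestures(
    [("press", (10, 10)), ("press", (30, 10)), ("press", (60, 12))])
```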
Optionally, as an embodiment, the projection interaction device further includes an infrared fill light, which is configured to perform infrared supplementation of the infrared images of the first camera and the second camera. With infrared fill light, the captured infrared images are clearer, which in turn ensures the accuracy of the recovered finger depth.
Optionally, the method may further comprise: determining a pixel corresponding relation between an image acquisition area of the first camera and the projection desktop; wherein, step 103 can be specifically realized as: acquiring a first coordinate of the gesture in an image acquisition area of the first camera according to the finger depth and the projection desktop depth; and determining a second coordinate of the gesture in the projection desktop according to the first coordinate and the pixel corresponding relation, wherein the second coordinate is an operation coordinate of the gesture. Of course, the pixel correspondence between the image capture area of the second camera and the projection desktop may also be determined, and the embodiment of the present invention is not described herein again.
The method of the embodiments of the present invention will be further described with reference to specific examples.
Fig. 2 is a schematic external structural diagram of a projection interaction device according to an embodiment of the present invention. As shown in fig. 2, the projection interaction device may include: an infrared color camera 1, an infrared color camera 2, a projector, and an infrared fill light. The projector is used for generating a projection desktop. Both the infrared color camera 1 and the infrared color camera 2 can alternately acquire an infrared image and a color image, and the overlapping area of the image capturing area of the infrared color camera 1 and that of the infrared color camera 2 covers the projection desktop of the projection unit. That is, both cameras can capture an infrared image and a color image of the projection desktop area. The infrared fill light supplements the infrared images collected by the two cameras so as to obtain clearer infrared images. The specific operation steps are as follows:
firstly, the infrared color camera 1 and the infrared color camera 2 each alternately acquire an infrared image frame and a color image frame.
The time interval within which the two cameras collect the infrared and color image frames is short, so the projection desktop can be regarded as unchanged.
And secondly, restoring the finger depth by using the infrared images collected by the infrared color camera 1 and the infrared color camera 2 according to a binocular algorithm.
It should be understood that, in the embodiment of the present invention, the infrared fill light can supplement the infrared images collected by the infrared color camera 1 and the infrared color camera 2, producing clearer infrared images so that the finger depth can be recovered more accurately. Of course, the finger depth can still be recovered without infrared fill light, though possibly with a larger error; this does not affect the implementation of the method shown in fig. 1 in the embodiment of the present invention.
In addition, the specific implementation of recovering the finger depth from the infrared images with a binocular algorithm may refer to the prior art, and is not repeated herein.
And thirdly, recovering the depth of the projection desktop by using the color images collected by the infrared color camera 1 and the infrared color camera 2 according to a binocular algorithm.
Similarly, the specific implementation of recovering the projection desktop depth from the color images with a binocular algorithm may refer to the prior art, and is not described herein again.
In addition, it should be understood that the second and third steps are not restricted to any particular chronological order.
And fourthly, obtaining the pixel corresponding relation between the projector image and the camera image through calibration.
The pixel correspondence between the projector image and the camera image may be calibrated in a variety of ways.
In one embodiment of the present invention, let n define the plane of the surface onto which the projector projects (for example, a wall), K_p the projector intrinsic matrix, R_p the rotation matrix from the projector to the camera, t_p the translation vector from the projector to the camera, K_c the camera intrinsic matrix, X_c a point coordinate in the camera image, and X_p the mapped projector coordinate. The pixel correspondence between the projector image and the camera image is then:

X_p = K_p(R_p - t_p n^T) P = K_p(R_p - t_p n^T) K_c^-1 X_c

wherein the plane satisfies the plane equation n^T P + 1 = 0.

That is, the relationship between the projector image and the camera image can be expressed by the following equation:

X_p = H_pc X_c

wherein H_pc = K_p(R_p - t_p n^T) K_c^-1.

When R_p and t_p are given, the mapping H_pc depends only on the plane n.
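Assuming the patent's convention that the plane satisfies n^T P + 1 = 0, the homography H_pc = K_p(R_p - t_p n^T) K_c^-1 can be computed and applied numerically as below. The intrinsics and extrinsics are hypothetical placeholders chosen so the result is easy to check.

```python
import numpy as np

def plane_homography(K_p, R_p, t_p, n, K_c):
    """H_pc = K_p (R_p - t_p n^T) K_c^-1 for a plane with n^T P + 1 = 0."""
    return K_p @ (R_p - np.outer(t_p, n)) @ np.linalg.inv(K_c)

def camera_to_projector(H_pc, x_c):
    """Map a camera pixel (u, v) to a projector pixel via homogeneous coords."""
    p = H_pc @ np.array([x_c[0], x_c[1], 1.0])
    return p[:2] / p[2]

# Hypothetical calibration chosen for an easy sanity check: identical
# intrinsics, identity rotation, zero translation -- H_pc then reduces
# to the identity, and camera pixels map to themselves.
K = np.array([[600.0, 0.0, 320.0],
              [0.0, 600.0, 240.0],
              [0.0, 0.0, 1.0]])
H = plane_homography(K, np.eye(3), np.zeros(3), np.array([0.0, 0.0, -1.0]), K)
mapped = camera_to_projector(H, (100.0, 50.0))
```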
Of course, it should be understood that there may be other ways to calibrate the pixel correspondence between the projector image and the camera image, and the specific implementation may refer to the prior art, and the embodiments of the present invention are not limited herein.
And fifthly, recognizing the gesture by using a gesture recognition algorithm, and mapping the operation of the projector according to the coordinate mapping relation in the fourth step.
After the finger depth is recovered, the motion of the finger, that is, the gesture, can be determined from changes in the finger's height and in its movement in the up, down, left, and right directions.
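A hedged sketch of the touch-and-motion logic implied above: a fingertip is taken to touch the desktop when its recovered depth is within a small threshold of the desktop depth at the same pixel, and its frame-to-frame displacement gives the up/down/left/right motion. The threshold and all coordinates are illustrative assumptions, not values from the patent.

```python
# Hedged sketch of the touch-and-motion logic described above. The 1 cm
# threshold and all coordinates are illustrative assumptions.

TOUCH_EPS_M = 0.01  # hypothetical touch threshold, meters

def is_touching(finger_depth, desktop_depth, eps=TOUCH_EPS_M):
    """Fingertip touches the desktop when the two depths nearly coincide."""
    return abs(finger_depth - desktop_depth) < eps

def classify_motion(prev_xy, cur_xy, min_move_px=5.0):
    """Crude up/down/left/right classification of fingertip motion."""
    dx, dy = cur_xy[0] - prev_xy[0], cur_xy[1] - prev_xy[1]
    if max(abs(dx), abs(dy)) < min_move_px:
        return "still"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

touch = is_touching(finger_depth=1.005, desktop_depth=1.000)
motion = classify_motion((100, 100), (130, 104))
```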
The specific gesture recognition algorithm may refer to the prior art, and is not described herein again.
After the gesture is recognized, the operation corresponding to the gesture can be determined according to a plurality of continuous gestures and corresponding coordinate information; or determining the operation corresponding to the gesture according to the single gesture and the corresponding coordinate information.
Then, according to the coordinates of the gesture and the pixel correspondence between the projector image and the camera image, the coordinates in the projector image are determined, and the operation is mapped onto the projector.
Fig. 3 is a schematic structural diagram of a projection interaction device 300 according to an embodiment of the present invention. As shown in fig. 3, the projection interaction device 300 may include: a projection unit 320, a first image acquisition unit 330, a second image acquisition unit 340 and a control unit 310. Wherein,
the projection unit 320 is used for projecting the display data of the projection interaction device 300 onto a projection desktop;
the first image acquisition unit 330 and the second image acquisition unit 340 are used for alternately acquiring an infrared image and a color image, and an overlapping area of an image acquisition area of the first image acquisition unit 330 and an image acquisition area of the second image acquisition unit 340 covers a projection desktop of the projection unit 320;
the control unit 310 includes an image acquisition sub-unit 311, a depth restoration sub-unit 312, a coordinate determination sub-unit 313, and a gesture operation sub-unit 314, wherein,
the image acquisition subunit 311 is configured to acquire a first infrared image frame and a first color image frame of the projection desktop at the current time through the first image acquisition unit 330, and acquire a second infrared image frame and a second color image frame of the projection desktop at the current time through the second image acquisition unit 340;
the depth recovery subunit 312 is configured to recover the finger depth according to the first infrared image frame and the second infrared image frame, and recover the projection desktop depth according to the first color image and the second color image;
the coordinate determination subunit 313 is configured to determine a gesture and coordinate information of the gesture according to the finger depth and the projection desktop depth;
the gesture operation subunit 314 is configured to determine an operation represented by the gesture according to the gesture and the coordinate information of the gesture, and perform a corresponding operation on the projection unit 320.
In the embodiment of the present invention, the projection interaction device 300 sequentially acquires the infrared image and the color image through two cameras that can each acquire both infrared and color images. The finger depth is recovered from the infrared images acquired by the two cameras, and the projection desktop depth is recovered from the color images acquired by the two cameras, so that the relationship between the finger depth and the projection desktop depth can be accurately determined, the operation represented by the gesture can then be determined, and a better gesture recognition effect is achieved.
Optionally, as an embodiment, in the process of determining the operation represented by the gesture according to the gesture and the coordinate information of the gesture, the gesture operation subunit 314 is specifically configured to: determining operation corresponding to the gesture according to the plurality of continuous gestures and the corresponding coordinate information; or determining the operation corresponding to the gesture according to the single gesture and the corresponding coordinate information.
Optionally, as shown in fig. 4, the projection interaction device 300 may further include an infrared fill light unit 350. The infrared light supplement unit 350 is configured to perform infrared light supplement on the infrared images of the first image acquisition unit 330 and the second image acquisition unit 340.
Optionally, the coordinate determination subunit 313 is further configured to determine a pixel correspondence between the image capturing area of the first image capturing unit 330 and the projection desktop; in the process of determining the gesture and the coordinate information of the gesture according to the finger depth and the projected desktop depth, the coordinate determination subunit 313 is specifically configured to: acquiring a first coordinate of the gesture in an image acquisition area of the first image acquisition unit 330 according to the finger depth and the projection desktop depth; and determining a second coordinate of the gesture in the projection desktop according to the first coordinate and the pixel corresponding relation, wherein the second coordinate is an operation coordinate of the gesture.
It should be appreciated that in practical applications, the projection unit 320 may be a projector, or other projection device; the first image collecting unit 330 and the second image collecting unit 340 may be cameras with infrared image and color image collecting functions, or other image collecting devices with infrared image and color image collecting functions; the infrared fill-in light unit 350 may be an infrared fill-in light, or other infrared fill-in light devices, and so on. The control unit 310 may be a controller, a control chip or other control device in the projection interaction device 300, or may be composed of a plurality of apparatuses, and the embodiment of the present invention is not limited thereto.
In addition, the projection interaction device 300 may also perform the method shown in fig. 1 and have the functions of the projection interaction device in the embodiments shown in fig. 1 and fig. 2, which are not described herein again in the embodiments of the present invention.
In the present invention, when it is described that a specific member is located between a first member and a second member, there may or may not be an intervening member between the specific member and the first member or the second member; when it is described that a specific component is connected to other components, the specific component may be directly connected to the other components without an intervening component, or may be indirectly connected to the other components through an intervening component.

Claims (9)

1. A gesture-based projection interaction method is applied to a projection interaction device, the projection interaction device comprises a projection unit, a first camera and a second camera, both of the first camera and the second camera can alternately acquire an infrared image and a color image, an overlapping area of an image acquisition area of the first camera and an image acquisition area of the second camera covers a projection desktop of the projection unit, and the method comprises the following steps:
acquiring a first infrared image frame and a first color image frame of the projection desktop sequentially through the first camera within a preset time period, and acquiring a second infrared image frame and a second color image frame of the projection desktop sequentially through the second camera within the preset time period;
restoring finger depth according to the first infrared image frame and the second infrared image frame, and restoring projection desktop depth according to the first color image and the second color image;
determining a gesture and coordinate information of the gesture according to the finger depth and the projection desktop depth;
and determining the operation represented by the gesture according to the gesture and the coordinate information of the gesture, and performing corresponding operation on the projection unit.
2. The method of claim 1,
the determining the operation represented by the gesture according to the gesture and the coordinate information of the gesture comprises:
determining operation corresponding to the gesture according to the plurality of continuous gestures and the corresponding coordinate information; or
And determining the operation corresponding to the gesture according to the single gesture and the corresponding coordinate information.
3. The method of claim 1 or 2, wherein the projection interaction device further comprises an infrared fill-in lamp, and the infrared fill-in lamp is used for performing infrared fill-in on the infrared images of the first camera and the second camera.
4. The method of any of claims 1 to 3, further comprising: determining a pixel correspondence between an image acquisition area of the first camera and the projection desktop; wherein,
the determining the gesture and the coordinate information of the gesture according to the finger depth and the projection desktop depth comprises:
acquiring a first coordinate of the gesture in an image acquisition area of the first camera according to the finger depth and the projection desktop depth;
and determining a second coordinate of the gesture in the projection desktop according to the first coordinate and the pixel corresponding relation, wherein the second coordinate is an operation coordinate of the gesture.
5. A projection interaction device, comprising: a projection unit, a first image acquisition unit, a second image acquisition unit and a control unit, wherein,
the projection unit is used for projecting the display data of the projection interaction equipment to a projection desktop;
the first image acquisition unit and the second image acquisition unit are used for alternately acquiring infrared images and color images, and the overlapping area of the image acquisition area of the first image acquisition unit and the image acquisition area of the second image acquisition unit covers the projection desktop of the projection unit;
the control unit comprises an image acquisition subunit, a depth recovery subunit, a coordinate determination subunit and a gesture operation subunit, wherein,
the image acquisition subunit is used for acquiring a first infrared image frame and a first color image frame of the projection desktop at the current moment through the first image acquisition unit, and acquiring a second infrared image frame and a second color image frame of the projection desktop at the current moment through the second image acquisition unit;
the depth recovery subunit is used for recovering the finger depth according to the first infrared image frame and the second infrared image frame and recovering the projection desktop depth according to the first color image frame and the second color image frame;
the coordinate determination subunit is used for determining the gesture and the coordinate information of the gesture according to the finger depth and the projection desktop depth;
the gesture operation subunit is used for determining the operation represented by the gesture according to the gesture and the coordinate information of the gesture and performing corresponding operation on the projection unit.
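Claim 5's depth recovery subunit derives depth from a pair of frames taken by the two spaced cameras. A minimal triangulation sketch, assuming rectified cameras with a known focal length and baseline (parameters the claim leaves unspecified; the values below are hypothetical):

```python
def depth_from_disparity(x_left, x_right, focal_px, baseline_mm):
    """Triangulate the depth (in mm) of a feature seen by two rectified
    cameras from its horizontal disparity: Z = f * B / d."""
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("feature must have positive disparity")
    return focal_px * baseline_mm / disparity

# A fingertip at column 320 in the left frame and 300 in the right frame,
# with a 600 px focal length and a 60 mm camera baseline:
print(depth_from_disparity(320, 300, 600, 60))  # → 1800.0 (mm)
```

The same computation applies to both modalities: infrared frame pairs for the finger and color frame pairs for the desktop surface; comparing the two depths tells the device whether the finger is touching the desktop.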
6. The projection interaction device of claim 5, wherein, in the process of determining the operation represented by the gesture according to the gesture and the coordinate information of the gesture, the gesture operation subunit is specifically configured to:
determining the operation corresponding to the gesture according to a plurality of continuous gestures and the corresponding coordinate information; or
determining the operation corresponding to the gesture according to a single gesture and the corresponding coordinate information.
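Claim 6 distinguishes operations derived from a single gesture and from a plurality of continuous gestures. A toy classifier illustrating that distinction; the gesture labels and the movement threshold are assumptions for the example, not terms from the patent:

```python
def interpret_gesture(samples, move_threshold=15.0):
    """Classify a time-ordered sequence of (x, y) fingertip coordinates
    on the desktop: one sample is a tap; a track whose endpoints are
    farther apart than move_threshold is a drag; a longer dwell in
    place is treated as a long press."""
    if len(samples) == 1:
        return ("tap", samples[0])
    x0, y0 = samples[0]
    x1, y1 = samples[-1]
    dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
    if dist > move_threshold:
        return ("drag", (samples[0], samples[-1]))
    return ("long_press", samples[0])

print(interpret_gesture([(100, 100)]))              # single gesture → tap
print(interpret_gesture([(100, 100), (180, 160)]))  # continuous gestures → drag
```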
7. The projection interaction device of claim 5 or 6, further comprising an infrared fill light unit, wherein the infrared fill light unit is configured to provide supplemental infrared illumination for the infrared images captured by the first image acquisition unit and the second image acquisition unit.
8. The projection interaction device of any of claims 5 to 7, wherein the coordinate determination subunit is further configured to determine a pixel correspondence between the image acquisition region of the first image acquisition unit and the projection desktop;
in the process of determining the gesture and the coordinate information of the gesture according to the finger depth and the projection desktop depth, the coordinate determination subunit is specifically configured to: acquiring a first coordinate of the gesture in an image acquisition area of the first image acquisition unit according to the finger depth and the projection desktop depth; and determining a second coordinate of the gesture in the projection desktop according to the first coordinate and the pixel corresponding relation, wherein the second coordinate is an operation coordinate of the gesture.
8. The projection interaction device of any of claims 5 to 7, wherein the first image acquisition unit and the second image acquisition unit are cameras capable of capturing both infrared images and color images.
CN201410601367.9A 2014-10-30 2014-10-30 Projection interactive method based on gesture and projection interactive device Active CN105589552B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201410601367.9A CN105589552B (en) 2014-10-30 2014-10-30 Projection interactive method based on gesture and projection interactive device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201410601367.9A CN105589552B (en) 2014-10-30 2014-10-30 Projection interactive method based on gesture and projection interactive device

Publications (2)

Publication Number Publication Date
CN105589552A true CN105589552A (en) 2016-05-18
CN105589552B CN105589552B (en) 2018-10-12

Family

ID=55929197

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201410601367.9A Active CN105589552B (en) 2014-10-30 2014-10-30 Projection interactive method based on gesture and projection interactive device

Country Status (1)

Country Link
CN (1) CN105589552B (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120139914A1 (en) * 2010-12-06 2012-06-07 Samsung Electronics Co., Ltd Method and apparatus for controlling virtual monitor
CN102945079A (en) * 2012-11-16 2013-02-27 武汉大学 Intelligent recognition and control-based stereographic projection system and method
CN102959616A (en) * 2010-07-20 2013-03-06 普莱姆森斯有限公司 Interaction Reality Enhancement for Natural Interactions
CN103093471A (en) * 2013-01-24 2013-05-08 青岛智目科技有限公司 Foreground extraction method under complicated background
CN104019761A (en) * 2014-04-15 2014-09-03 北京农业信息技术研究中心 Three-dimensional configuration obtaining device and method of corn plant

Cited By (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107426556B (en) * 2016-05-24 2019-05-17 仁宝电脑工业股份有限公司 Projection device
CN107426554B (en) * 2016-05-24 2019-09-24 仁宝电脑工业股份有限公司 Projection device
CN107426886A (en) * 2016-05-24 2017-12-01 仁宝电脑工业股份有限公司 Intelligent lighting device
CN107426556A (en) * 2016-05-24 2017-12-01 仁宝电脑工业股份有限公司 Projection device
CN107426555A (en) * 2016-05-24 2017-12-01 仁宝电脑工业股份有限公司 Projection device
CN107426554A (en) * 2016-05-24 2017-12-01 仁宝电脑工业股份有限公司 Projection device
CN106371594A (en) * 2016-08-31 2017-02-01 李姣昂 Binocular infrared vision portable gesture-controlled projection system and method
CN106705003A (en) * 2017-01-16 2017-05-24 歌尔科技有限公司 Projection lamp
CN107506133A (en) * 2017-08-24 2017-12-22 歌尔股份有限公司 Project the operation trace response method and system of touch-control system
CN107506133B (en) * 2017-08-24 2020-09-18 歌尔股份有限公司 Method and system for operating trajectory response of projection touch system
CN108227923A (en) * 2018-01-02 2018-06-29 南京华捷艾米软件科技有限公司 A kind of virtual touch-control system and method based on body-sensing technology
CN111258410A (en) * 2020-05-06 2020-06-09 北京深光科技有限公司 Man-machine interaction equipment
CN111258410B (en) * 2020-05-06 2020-08-04 北京深光科技有限公司 A human-computer interaction device
CN114374776A (en) * 2020-10-15 2022-04-19 华为技术有限公司 Camera and camera control method
CN114374776B (en) * 2020-10-15 2023-06-23 华为技术有限公司 Camera and camera control method
CN116974369A (en) * 2023-06-21 2023-10-31 广东工业大学 Method, system, equipment and storage medium for operating medical image in operation
CN116974369B (en) * 2023-06-21 2024-05-17 广东工业大学 Intraoperative medical imaging operation method, system, device and storage medium

Also Published As

Publication number Publication date
CN105589552B (en) 2018-10-12

Similar Documents

Publication Publication Date Title
CN105589552B (en) Projection interactive method based on gesture and projection interactive device
CN108764024B (en) Device and method for generating face recognition model and computer readable storage medium
US10186087B2 (en) Occluding augmented reality objects
JP6417702B2 (en) Image processing apparatus, image processing method, and image processing program
WO2019242262A1 (en) Augmented reality-based remote guidance method and device, terminal, and storage medium
US20190354799A1 (en) Method of Determining a Similarity Transformation Between First and Second Coordinates of 3D Features
JP6201379B2 (en) Position calculation system, position calculation program, and position calculation method
CN103713738B (en) A kind of view-based access control model follows the tracks of the man-machine interaction method with gesture identification
CN106688031A (en) Apparatus and method for providing content-aware photo filters
WO2015000286A1 (en) Three-dimensional interactive learning system and method based on augmented reality
US20130069876A1 (en) Three-dimensional human-computer interaction system that supports mouse operations through the motion of a finger and an operation method thereof
CN105335950A (en) Image processing method and image processing apparatus
CN104978077B (en) interaction method and system
CN109839827B (en) Gesture recognition intelligent household control system based on full-space position information
WO2021097600A1 (en) Inter-air interaction method and apparatus, and device
CN110427849B (en) Face pose determination method and device, storage medium and electronic equipment
TW201905757A (en) Electronic device and gesture recognition method applied therein
CN103399629A (en) Method and device for capturing gesture displaying coordinates
JP2014029656A (en) Image processor and image processing method
US8866921B2 (en) Devices and methods involving enhanced resolution image capture
WO2018006481A1 (en) Motion-sensing operation method and device for mobile terminal
CN118747039A (en) Method, device, electronic device and storage medium for moving virtual objects
KR20160022832A (en) Method and device for character input
CN107506133B (en) Method and system for operating trajectory response of projection touch system
KR20160031968A (en) A method for shadow removal in front projection systems

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant