
WO2018154714A1 - Operation input system, method, and program - Google Patents


Info

Publication number
WO2018154714A1
WO2018154714A1 (application PCT/JP2017/007123)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
wave
display screen
operation input
reflected wave
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2017/007123
Other languages
English (en)
Japanese (ja)
Inventor
洋海 澤田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to PCT/JP2017/007123
Publication of WO2018154714A1
Anticipated expiration legal-status Critical
Ceased legal-status Critical Current

Classifications

    • G — PHYSICS
    • G06 — COMPUTING OR CALCULATING; COUNTING
    • G06F — ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 — Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 — Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 — Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 — Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0346 — Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors

Definitions

  • The present invention relates to an operation input system, an operation input method, and an operation input program for inputting a coordinate position on the display screen of an information processing apparatus and an operation command related to that coordinate position.
  • Conventionally, a coordinate position on a display screen is designated with a device such as a mouse, a touch pad, or a touch panel.
  • Methods of inputting an operation command at that coordinate position, such as a button click, a tap, or a swipe, have become widespread.
  • In one known approach, the operator's hand is imaged by a camera provided in the information processing terminal, an operation area is set in the air near the finger position in correspondence with the screen of the terminal, the movement of the finger within the operation area is detected by image analysis, and from that movement the cursor on the screen is moved or an icon is highlighted and designated.
  • Such an operation input device is disclosed in Patent Document 1. According to it, an operation input can be determined with little burden on the operator, without requiring direct finger contact with a device such as a mouse, a touch pad, or a touch panel.
  • As disclosed in Patent Document 2, there is also an information input device that detects the minimum point in a distance image, which is a distribution of distances from the camera to the object surface,
  • obtains the position of the operator's fingertip from it, and enables pointing on the screen from its movement.
  • An object of the present invention is to provide an operation input system, an operation input method, and an operation input program that can accurately acquire an operation input.
  • To this end, the present invention is an operation input system for inputting a coordinate position on the display screen of an information processing apparatus and an operation command related to that coordinate position, comprising:
  • a transmission unit that transmits electromagnetic waves or sound waves as a transmitted wave into a monitoring space region including part or all of the visual field range in which the display screen is visible;
  • a reception unit that receives, as a reflected wave, the transmitted wave reflected by an object;
  • a contact/separation detection unit that detects the movement of the object in the direction approaching or leaving the transmission unit or the reception unit, and its speed; and
  • a signal processing unit that detects, based on the transmitted wave and the received reflected wave, position information on the position and displacement in three-dimensional space of the object that generated the reflected wave, generates an input signal related to a coordinate position on the display screen based on that position information and the detection result of the contact/separation detection unit, and provides the input signal to the information processing apparatus.
  • The present invention is also an operation input method for inputting a coordinate position on the display screen of an information processing apparatus and an operation command related to that coordinate position, in which electromagnetic waves or sound waves are transmitted as a transmitted wave from a transmission unit into a monitoring space region including part or all of the visual field range in which the display screen is visible, while the transmitted wave reflected by an object is received as a reflected wave.
  • The position of the object in three-dimensional space and its displacement are detected based on the transmitted wave and the reflected wave, and input signals relating to cursor movement and basic command operations are provided to the information processing apparatus.
  • As a result, a pointing operation by an operator can be input without requiring a directly operated device such as a mouse, a touch pad, or a touch panel.
  • The approach or separation of the object relative to the transmission unit or the reception unit can be detected without heavy processing such as image analysis, and operation inputs that place little burden on the operator, such as fine, quick movements of a fingertip, can be accurately acquired.
  • Filtering unit: In the above invention, it is preferable to further include a filtering unit that extracts in advance only the minimum data necessary for subsequent processing of the received reflected wave, that is, only the data from the first-arriving wave of the reflected wave up to a predetermined time, and sends it to the subsequent processing flow.
  • In that case, the signal processing unit generates the input signal based on the reflected wave extracted by the filtering unit.
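By way of illustration only (the publication does not give an implementation), such a filtering unit can be modeled as truncating each received echo to a short window starting at the first-arriving sample; the threshold and window length below are invented for the sketch.

```python
def filter_echo(samples, threshold, window):
    """Keep only the samples from the first-arriving wave (the first sample
    whose amplitude exceeds `threshold`) up to `window` samples later;
    everything after that window is irrelevant to the monitoring target."""
    for i, s in enumerate(samples):
        if abs(s) >= threshold:
            return samples[i:i + window]
    return []  # no echo above the noise floor

# Echo buried in noise: the first strong sample is at index 3.
echo = [0.01, 0.02, 0.01, 0.9, 0.7, 0.3, 0.05, 0.6, 0.2]
print(filter_echo(echo, threshold=0.5, window=3))  # [0.9, 0.7, 0.3]
```

Later reflections (such as re-reflections from the keyboard) fall outside the window and are discarded before any heavier processing runs.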
  • In the above invention, it is preferable that the signal processing unit includes an operation surface setting unit that sets, in the monitoring space region, a set of virtually continuous, innumerable operation surfaces (a "3D touch panel"): each surface is parallel to the display screen or forms a predetermined angle with it, has the same or a similar shape as the display screen, and changes in size according to its distance from the display screen.
  • The pointing input signal is then generated according to the coordinate position of the monitoring target point on a virtual operation surface of this 3D touch panel.
  • Subcommand key: In the above invention, since the Doppler commands are limited to two types, button down and button up, and combinations thereof, only the basic commands executable with current mice, touch pads, and touch panels can be realized. It is therefore preferable to further combine a single subcommand key with the pointing gesture to diversify the input commands beyond the Doppler commands. This subcommand key can also be replaced by other methods such as voice input.
  • The shape of the finger at the time of the pointing gesture appears schematically as a columnar shape in the distance image. It is also possible to adopt an operation method in which a pointer is displayed at the coordinate position where a vector along the axis of this columnar shape intersects the display screen, treated as the monitoring target point. In this case as well, it is preferable that the contact/separation detection unit monitors the Doppler effect using the reflected waves used for the distance image.
  • The above-described apparatus and method according to the present invention can be realized by executing the operation input program of the present invention, written in a predetermined language, on a computer. That is, by installing the program of the present invention in an IC chip or memory device of a mobile terminal device, smartphone, wearable terminal, mobile personal computer, other information processing terminal, or a general-purpose computer such as a personal computer or server computer, and executing it on the CPU, a system having each of the functions described above can be constructed.
  • The program of the present invention is an operation input program for inputting a coordinate position on the display screen of an information processing apparatus and an operation command related to that coordinate position, which causes a computer to function as:
  • a transmission unit that transmits electromagnetic waves or sound waves as a transmitted wave into a monitoring space region including part or all of the visual field range in which the display screen is visible;
  • a reception unit that receives, as a reflected wave, the transmitted wave reflected by an object;
  • a signal processing unit that detects, based on the transmitted wave and the received reflected wave, position information on the three-dimensional position and displacement of the object that generated the reflected wave, generates an input signal related to a coordinate position on the display screen based on that position information, and provides it to the information processing apparatus; and
  • a contact/separation detection unit that compares the wavelength of the transmitted wave with the wavelength of the reflected wave, and detects the movement and speed of the object in the direction approaching or leaving the transmission unit or the reception unit.
  • The operation input program of the present invention can be distributed, for example, through a communication line, or as a packaged application that runs on a stand-alone computer by being recorded on a computer-readable recording medium.
  • The program can be recorded on various recording media, such as magnetic recording media (e.g. flexible disks and cassette tapes), optical disks (e.g. CD-ROM and DVD-ROM), and RAM cards.
  • With a computer-readable recording medium on which the program is recorded, the above-described system and method can easily be implemented on a general-purpose or dedicated computer, and the program can easily be stored, transported, and installed.
  • Conventionally, on the display screen of an information processing terminal device such as a desktop or notebook personal computer, a tablet personal computer, or a smartphone, pointing and command input are performed by directly touching and operating a device such as a mouse, a touch pad, or a touch panel.
  • The drawings include a block diagram of the functional modules constructed on the CPU of the operation input system according to the first embodiment, an explanatory diagram showing the detection process of the fingertip monitoring target point of that system, and an explanatory diagram showing an outline of the operation input system according to the first embodiment.
  • FIG. 1 is a conceptual diagram showing the external configuration of an information processing terminal device for realizing the operation input system according to the present embodiment.
  • FIG. 2 is a block diagram showing the hardware configuration of the information processing terminal device for realizing the operation input system according to the present embodiment.
  • The "module" used in the description refers to a functional unit configured by hardware such as an apparatus or a device, by software having the corresponding function, or by a combination of these, which achieves a predetermined operation.
  • the information processing terminal device can be realized by a general-purpose computer or a dedicated device.
  • The notebook personal computer 1 is an information processing terminal device that includes a display screen 5b such as a monitor, integrated with a main body having a keyboard, pointing device, and the like so as to fold in half; the transmission unit 5c and the reception units 6a0 to 6a2 are arranged below the display screen 5b, facing the operator.
  • However, the present invention is not limited to this: any information processing terminal that can be equipped or connected with the transmission unit 5c and the reception units 6a0 to 6a2 can be used.
  • It can be realized by a general-purpose computer such as a desktop personal computer or by a dedicated, function-specific device, and a tablet personal computer, smartphone, mobile phone, wearable terminal device, and the like can also be adopted.
  • The hardware configuration of the notebook personal computer 1 includes a CPU 11, a memory 12, an input interface 16, a storage device 14, an output interface 15, and a communication interface 13.
  • These devices are connected via the CPU bus 10 and can exchange data with one another.
  • The input interface 16 is a module that receives operation signals from operation devices such as the keyboard 6b, a pointing device, a touch panel, or buttons; the received operation signals are transmitted to the CPU 11 and enable operations on the OS and each application. Input devices such as a CCD camera 6d and a microphone 6c can also be connected to the input interface 16, and the reception units 6a0 to 6a2 of the present invention are likewise connected to it.
  • The reception units 6a0 to 6a2 are receiving devices with a directivity that blocks re-reflections from nearby surfaces and selectively receives only the direct reflected wave from the monitoring target in the air. More specifically, the transmitted wave irradiates, with a predetermined directivity, a space W containing nothing other than the monitoring target and the operator; of the waves reflected by the target object, the direct reflected wave reaching the reception units, whose directivity prevents re-reflections from the direction of the keyboard from being received, is received as the reflected wave W2. The received reflected wave W2 is input to the signal processing unit 112 and the contact/separation detection unit 113 through the synchronization processing unit 111 shown in FIG. 3.
  • In the present embodiment, the reception units 6a0 to 6a2 are a plurality of physically separated infrared sensors arranged adjacent to one another in a triangle at the lower right of the display screen 5b. Provided there is sufficient resolution, various sensors corresponding to the types of electromagnetic waves and sound waves transmitted by the transmission unit 5c, such as non-infrared electromagnetic sensors and ultrasonic sensors, can be employed.
  • the output interface 15 is a module that transmits video signals and audio signals in order to output video and audio from output devices such as the display screen 5b and the speaker 5a.
  • a transmitter 5c that transmits electromagnetic waves or sound waves as a transmitted wave is also connected to the output interface 15.
  • The transmission unit 5c is a transmitting device that transmits electromagnetic waves or sound waves as a transmitted wave, with a predetermined directivity, toward the monitoring space region R3 including part or all of the visual field range R2 in which the display screen 5b is visible,
  • and transmits the transmitted wave under the control of the synchronization processing unit 111 in FIG. 3.
  • In the present embodiment, the transmission unit 5c is an infrared emitter that irradiates infrared laser pulses, and is arranged at the center of the reception units 6a0 to 6a2, which are arranged in a triangle at the lower right of the display screen 5b.
  • Various transmitters can be employed according to the types of electromagnetic waves and sound waves detectable by the reception units 6a0 to 6a2, such as ultrasonic transmitters and non-infrared radio transmitters.
  • the communication interface 13 is a module that transmits and receives data to and from other communication devices.
  • As a communication method, a public line such as a telephone line, an ISDN line, an ADSL line, or an optical line, a dedicated line, or a wireless communication network such as WCDMA (registered trademark), 3G (3rd generation), 4G (4th generation), 5G (5th generation), WiFi (registered trademark), or Bluetooth (registered trademark) can be used.
  • the storage device 14 is a device that accumulates data in a recording medium and reads out the accumulated data in response to a request from each device.
  • a hard disk drive (HDD), a solid state drive (SSD), a memory card, and the like can be configured.
  • the CPU 11 is a device that performs various arithmetic processes necessary for controlling each unit, and virtually constructs various modules on the CPU 11 by executing various programs.
  • An OS (Operating System) and various applications can be executed on the CPU 11: by executing the OS program, the CPU 11 manages and controls the basic functions of the notebook personal computer 1, and by executing application programs, various functional modules are virtually constructed on the CPU.
  • FIG. 3 is a block diagram showing functional modules constructed on the CPU of the operation input system according to the first embodiment.
  • three modules of the synchronization processing unit 111, the signal processing unit 112, and the contact / separation detection unit 113 are constructed on the CPU 11.
  • the synchronization processing unit 111 synchronizes the transmission process of the transmitted wave W1 in the transmission unit 5c and the reception process of the reflected wave W2 in the reception unit 6a, and transmits the transmitted wave W1 and the reflected wave W2. It is a module that associates with.
  • In the present embodiment, the transmission unit 5c transmits intermittent pulse waves as the transmitted wave W1 and the reflected waves corresponding to the transmitted pulses are received; based on the order and time of transmission and reception and the time length from transmission to reception, each transmitted wave signal is matched with its corresponding reflected wave signal, the two are associated as a set, and the set is sent to the signal processing unit 112.
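The association of transmitted and reflected pulses described above can be sketched as a simple time-window matching; the timestamps and the round-trip limit below are illustrative assumptions, not values from the publication.

```python
def pair_pulses(tx_times, rx_times, max_round_trip):
    """Associate each transmitted pulse with the first unconsumed echo
    arriving within `max_round_trip` seconds; returns (tx, rx) pairs."""
    pairs = []
    rx_iter = iter(sorted(rx_times))
    rx = next(rx_iter, None)
    for tx in sorted(tx_times):
        # Discard echoes that arrived before this pulse was even sent.
        while rx is not None and rx < tx:
            rx = next(rx_iter, None)
        if rx is not None and rx - tx <= max_round_trip:
            pairs.append((tx, rx))
            rx = next(rx_iter, None)
    return pairs

tx = [0.0, 0.01, 0.02]            # pulses sent every 10 ms
rx = [0.0002, 0.0101, 0.0203]     # their echoes
print(pair_pulses(tx, rx, max_round_trip=0.001))
# [(0.0, 0.0002), (0.01, 0.0101), (0.02, 0.0203)]
```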
  • the signal processing unit 112 detects the spatial position information of the object that generated the reflected wave W2 and its displacement based on the transmitted wave W1 transmitted by the transmitting unit 5c and the reflected wave W2 received by the receiving unit 6a.
  • This is a module that generates an input signal related to a coordinate position on the display screen 5b and provides it to an OS or an application running on the information processing apparatus.
  • The pointing gesture recognition unit 112b determines the presence or absence of the pointing gesture as shown in FIGS. 1, 4 to 9, 13, 18, and 19.
  • The reflected wave has a strong amplitude at the fingertip B1, which is perpendicular to the traveling direction of the transmitted wave, and a weak or absent amplitude at the belly of the finger, which is parallel to the traveling direction of the transmitted wave or meets it at a predetermined (for example, obtuse) angle. The reflected wave therefore forms an amplitude-time profile like the one seen in region A1 of FIG. 10. Conversely, if such a characteristic amplitude-time profile is observed in the reflected wave, it can be presumed that a pointing gesture was present.
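A minimal heuristic in the spirit of this profile test (all thresholds invented for illustration) would compare the peak of the leading samples against the amplitude of the tail:

```python
def looks_like_pointing(profile, lead=3, peak_min=0.5, tail_max=0.2):
    """Heuristic: a pointing gesture shows a strong reflection from the
    fingertip (the first few samples) followed by little or no reflection
    from the finger body, as in region A1 of FIG. 10."""
    if len(profile) <= lead:
        return False
    head, tail = profile[:lead], profile[lead:]
    return max(head) >= peak_min and max(tail) <= tail_max

print(looks_like_pointing([0.9, 0.6, 0.3, 0.1, 0.05]))  # True: spike, then quiet
print(looks_like_pointing([0.3, 0.3, 0.3, 0.3, 0.3]))   # False: flat profile
```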
  • the three distance values calculated in this way are input to the position information calculation unit 112d.
  • Spatial position information is obtained by three-point ranging from the three reception units arranged in a triangle.
  • By devising this arrangement, the accuracy of the three-dimensional position information of the target point is improved.
  • In this embodiment the number of transmission units is one, but it can be increased, up to the same number as the reception units, by synchronizing a transmission unit with each reception unit.
  • The combination of the number and arrangement of transmission and reception units can be optimized according to the required accuracy of the monitoring target point position information.
  • Here the most basic configuration of one transmission unit and three reception units is adopted, but as shown in FIG. 18, three transmission units may be used, each paired with one of three corresponding reception units.
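Three-point ranging as described above is the classic trilateration problem: find the point whose distances to three receivers of known position match the measured values. A self-contained sketch, with receiver coordinates and target position invented for the example (a real device would also need a time-of-flight correction for the transmitter-to-target leg):

```python
import math

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Classic three-sphere trilateration: returns the candidate point on
    the positive-z side (i.e. in front of the receiver plane)."""
    # Local frame: ex along p1->p2, ey in the receiver plane, ez normal.
    ex = [p2[i] - p1[i] for i in range(3)]
    d = math.sqrt(sum(c * c for c in ex)); ex = [c / d for c in ex]
    p31 = [p3[i] - p1[i] for i in range(3)]
    i_ = sum(ex[k] * p31[k] for k in range(3))
    ey = [p31[k] - i_ * ex[k] for k in range(3)]
    ny = math.sqrt(sum(c * c for c in ey)); ey = [c / ny for c in ey]
    ez = [ex[1] * ey[2] - ex[2] * ey[1],
          ex[2] * ey[0] - ex[0] * ey[2],
          ex[0] * ey[1] - ex[1] * ey[0]]
    j_ = sum(ey[k] * p31[k] for k in range(3))
    # Intersection of the three spheres in the local frame.
    x = (r1 * r1 - r2 * r2 + d * d) / (2 * d)
    y = (r1 * r1 - r3 * r3 + i_ * i_ + j_ * j_ - 2 * i_ * x) / (2 * j_)
    z = math.sqrt(max(r1 * r1 - x * x - y * y, 0.0))
    return tuple(p1[k] + x * ex[k] + y * ey[k] + z * ez[k] for k in range(3))

# Receivers in a triangle below the screen (metres); fingertip at t.
rx = [(0.0, 0.0, 0.0), (0.06, 0.0, 0.0), (0.03, 0.05, 0.0)]
t = (0.1, 0.2, 0.3)
dist = [math.dist(r, t) for r in rx]
print(trilaterate(*rx, *dist))  # ≈ (0.1, 0.2, 0.3)
```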
  • The operation surface setting unit 112e is a module that sets, in the monitoring space region R3, a set of virtually continuous, innumerable virtual operation screens VP (a "3D touch panel"), each parallel to or at a predetermined angle with the display screen 5b, having the same or a similar shape as the display screen 5b, and changing in size according to the distance to the target point.
  • Depending on the distance D1 or D2, the virtual operation screen VP or VP2, forming the bottom surface of a quadrangular pyramid, may be set perpendicular to the normal orthogonal to the display screen 5b, perpendicular to (or at a predetermined angle with) an extension line passing through the transmission unit or the reception unit, or parallel to the display screen 5b.
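Geometrically, these nested virtual screens are cross-sections of a pyramid whose base is the display, so each surface's size follows from similar triangles. A rough sketch under that assumption (all dimensions invented):

```python
def virtual_screen_size(screen_w, screen_h, apex_dist, finger_dist):
    """Scale the virtual operation surface by similar triangles: a surface
    at `finger_dist` from the pyramid apex, where the display plane itself
    lies at `apex_dist`, keeps the screen's aspect ratio."""
    s = finger_dist / apex_dist
    return screen_w * s, screen_h * s

def to_screen_coords(x, y, vw, vh, screen_w, screen_h):
    """Map a point on the virtual surface (origin at its centre) back to
    display coordinates, so every nested surface addresses the full screen."""
    return x * screen_w / vw, y * screen_h / vh

# A 30 cm x 20 cm display; finger halfway between apex and display.
vw, vh = virtual_screen_size(0.30, 0.20, apex_dist=0.60, finger_dist=0.30)
print(vw, vh)                                              # 0.15 0.1
print(to_screen_coords(0.075, 0.05, vw, vh, 0.30, 0.20))   # ≈ (0.15, 0.1)
```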
  • the framework 3D grid of the 3D touch panel set by the operation surface setting unit 112e is input to the pointing signal generation unit 112f.
  • The pointing signal generation unit 112f is a module that generates, as a pointing signal according to coordinates on the screen, a GUI (Graphical User Interface) element such as a cursor, pointer, or scroll bar displayed on the screen.
  • The generated signal is input to the input signal generation unit 112g. This concludes the description of the signal processing unit.
  • The contact/separation detection unit 113 is a module that detects the Doppler effect by comparing the wavelength of the transmitted wave with the wavelength of the reflected wave, thereby detecting the movement of the object in the direction approaching or leaving, and its speed.
  • the contact / separation detection unit 113 includes a Doppler effect detection unit 113a and a Doppler command signal generation unit 113b.
  • the average of the maximum speeds is input to the Doppler command signal generator 113b.
  • Doppler commands: In the present embodiment, a blue shift, in which the wavelength of the reflected wave becomes shorter, is interpreted as a button-down operation, and a red shift, in which the wavelength becomes longer, as a button-up operation. For example, a rapid button-down is determined to be a mouse click, and two rapid button-downs in succession a double click. A rapid button-up is determined to start a drag operation, and a rapid button-down after selecting the drag range to complete it.
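The mapping from wavelength shift to button events, and a toy interpreter for click versus double click, can be sketched as follows (the wavelengths and the double-click window are illustrative assumptions):

```python
def doppler_event(tx_wavelength, rx_wavelength):
    """Blue shift (shorter reflected wavelength) = approach = button down;
    red shift (longer) = recede = button up; otherwise no event."""
    if rx_wavelength < tx_wavelength:
        return "button_down"
    if rx_wavelength > tx_wavelength:
        return "button_up"
    return None

def interpret(events, double_click_window=2):
    """Tiny command interpreter: two rapid button-downs -> double click,
    a single rapid button-down -> click."""
    downs = [i for i, e in enumerate(events) if e == "button_down"]
    if len(downs) >= 2 and downs[1] - downs[0] <= double_click_window:
        return "double_click"
    if downs:
        return "click"
    return "none"

print(doppler_event(940e-9, 939e-9))                        # button_down
print(interpret(["button_down", "button_up", "button_down"]))  # double_click
```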
  • the Doppler command generated here is input to the input signal generation unit.
  • the input signal generation unit 112g integrates the pointer coordinates (x, y) on the screen obtained by the pointing signal generation unit 112f and the command signal obtained by the Doppler command signal generation unit 113b, and performs comprehensive operations on the GUI. An input signal translated as a command is generated.
  • While the Doppler command signal is being input to the input signal generation unit 112g, the input of the pointing signal to the input signal generation unit 112g is interrupted, and the pointer coordinates (x, y) remain for a certain time at the position where the command is executed.
  • the input signal generated by the input signal generation unit 112g is provided to the notebook personal computer 1 through the OS executed on the CPU 11 and other applications.
  • The input signal generation unit 112g may also generate various commands by incorporating operation signals from other input devices in addition to the Doppler command. For example, with the pointer displayed on the display screen by a pointing gesture, briefly pressing the subcommand key 1a at the left front corner of the information processing terminal shown in FIG. 1 once can display the menu screen; moving the pointer back and forth while holding the same key down can zoom in and out; and moving it parallel to the screen can scroll the screen. In this way, by adding only one subcommand key, almost all commands of current mice and touch panels can be realized together with the Doppler commands. It is more convenient still if the GUI mode can be tuned to the operator's preferences as needed, for example by adding assistance through voice input to the microphone 6c, such as "click", "drag", or "zoom in".
  • At that time, a character string such as "click" or "double click" may be displayed as a message for about one second, and "drag", "zoom", and "scroll" may be displayed continuously during the operation.
  • FIG. 11 is a flowchart showing the procedure of the operation input method according to the present embodiment.
  • In step S101, infrared rays or other electromagnetic waves or sound waves are transmitted as a directional transmitted wave from the transmission unit 5c toward the monitoring space region R3 shown in FIG. 4, and the reflected waves reflected by the object are received by the reception units 6a0 to 6a2 in such a way that re-reflections are not mixed in.
  • In step S102, the transmission process of the transmitted wave W1 in the transmission unit shown in FIG. 6 and the reception process of the reflected wave W2 in the reception unit are synchronized, and the transmitted wave W1 and the reflected wave W2 are associated with each other.
  • In step S103, in order to remove unnecessary information in advance and reduce the processing load, the filtering unit 112a performs filtering that selects only the reflected wave signal in the range corresponding to a depth of 3 centimeters from the first-arriving wave, and sends it to the next step.
  • In step S104, the presence or absence of a pointing gesture is determined for the filtered signal. When the characteristic reflected wave pattern shown in area A1 of FIG. 10 is detected simultaneously from the three reception units for a predetermined number or more of pulse waves, it is determined that a pointing gesture is present.
  • If it is determined in step S105 that there is a pointing gesture ("Y" in S105), the process proceeds to the next step S106, and the reflected wave data of the three reception units shown in area A1 of FIG. 10 is sent on. If the pointing gesture is not recognized ("N" in S105), the previous steps are repeated until it is. In step S106, the first-arriving wave of the reflected wave data in area A1 of FIG. 10 is interpreted as the fingertip B1, and the distance to the fingertip is calculated from the time taken for the transmitted wave to return to each of the three reception units after irradiation.
  • For the fingertip, the monitoring target point detection unit 112c compares the wavelength of the initial reflected wave data of the three reception units with the wavelength of the pulse transmitted wave and checks for the presence of the Doppler effect, thereby detecting back-and-forth movement of the monitoring target point.
  • If the Doppler effect is detected, a Doppler command signal is generated in step S111.
  • While this happens, the input of the pointing signal to the input signal generation unit 112g is interrupted, and the pointer coordinates (x, y) remain for a certain time at the position where the command is executed.
  • If the Doppler effect is not detected ("N" in step S108), the normal pointing operation is continued (S109).
  • In step S110, the pointing position information input from the pointing signal generation unit in step S109 and the Doppler command signal from step S111 are combined, and an input signal representing a Doppler command at a specific pointer position (x, y) is output and provided to the notebook personal computer 1 through the OS and other applications running on it.
  • As described above, according to the present embodiment, an operator's pointing operation can be input without using a device operated by direct contact, such as a mouse, a touch pad, or a touch panel.
  • In particular, since the so-called Doppler effect is monitored by comparing changes in the wavelengths of the transmitted wave and its reflected wave, the approach and separation of the target object and its moving speed can be detected easily and accurately without heavy processing such as image analysis, and minute, quick movements of a fingertip or the like can be accurately detected and converted into commands.
  • In the present embodiment, the most basic setting of one transmission unit and three reception units is adopted.
  • The transmission unit irradiates the transmitted wave into a space in which there is nothing other than the monitoring target and the operator nearby, and the reception units are configured from the outset to receive only reflected waves from that space, so unnecessary signal noise is suppressed.
  • Further, the combination of the 3D touch panel and the Doppler commands allows the operator to perform natural, free operations by making full use of the space in front of the display screen of the information processing terminal.
  • Operability can be improved by combining this with a Doppler command that produces the effect of an actual touch through a touching gesture.
  • FIG. 12 is a block diagram illustrating functional modules constructed on the CPU of the operation input system according to the second embodiment.
  • the same components as those in the first embodiment described above are denoted by the same reference numerals, and the functions and the like are the same unless otherwise specified, and the description thereof is omitted.
  • the signal processing unit 112 includes a columnar detection unit 112i and a vector determination unit 112h as characteristic components.
  • one receiving unit 6a provided at the lower right of the display screen 5b is a so-called range image camera.
  • a distance image camera is a semiconductor chip such as a CCD that captures an image, and measures and images the arrival distance of reflected waves received by a number of sensor elements to generate a so-called distance image.
  • the role of the pointing gesture recognition unit 112b is the same as in the first embodiment, but in this embodiment the pointing gesture is extracted from the shape obtained from the image information. As shown in FIG. 14, a finger directed at the display screen can be approximated by a long, narrow cylinder, so when a cylindrical shape of a predetermined diameter facing the display screen is recognized in the filtered image data, it is recognized as a pointing gesture.
  • when the pointing gesture is viewed in the distance image, a shadow forms on the side opposite the direction of transmitted-wave irradiation, so in practice the gesture is recognized on the basis of an elongated rectangle projected onto the xz plane, that is, the plane of the keyboard and the horizontal.
  • distance image data in which a pointing gesture has been recognized in this manner is input to the columnar detection unit 112i.
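The "elongated rectangle projected onto the xz plane" criterion can be sketched as an aspect-ratio test on the projected depth pixels. This is an illustrative sketch only: the function name, the finger-width and elongation thresholds, and the SVD-based oriented bounding box are assumptions, not the patent's actual algorithm:

```python
import numpy as np

def is_pointing_gesture(points_xz: np.ndarray, min_aspect: float = 3.0,
                        max_width: float = 0.025) -> bool:
    """Test whether depth pixels projected onto the xz plane form the
    elongated rectangle of a pointing finger.

    points_xz: N x 2 array of (x, z) coordinates in metres.
    min_aspect: minimum length/width ratio of the oriented bounding box.
    max_width: maximum believable finger width in metres (assumed value).
    """
    centered = points_xz - points_xz.mean(axis=0)
    # Rotate into the principal axes so the bounding box is axis-aligned.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    aligned = centered @ vt.T
    length, width = aligned.max(axis=0) - aligned.min(axis=0)
    return width <= max_width and length / width >= min_aspect

rng = np.random.default_rng(0)
finger = np.column_stack([rng.uniform(0, 0.09, 300),    # 9 cm long
                          rng.uniform(0, 0.012, 300)])  # 1.2 cm wide
palm = rng.uniform(0, 0.08, (300, 2))                   # roughly square blob
print(is_pointing_gesture(finger), is_pointing_gesture(palm))  # True False
```

The width ceiling rejects whole-hand blobs even when they happen to be somewhat elongated, which matches the text's "predetermined diameter" condition.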
  • the vector determination unit 112h creates an approximate expression of the axis vector V1 (FIGS. 13 to 15) of the columnar shape Ob (FIGS. 14 and 15B) detected by the columnar detection unit 112i.
  • the operation surface setting unit 112e sets, in the monitored space region, a 3D grid with the display screen 5b as the XY plane and the direction orthogonal to it as the Z axis.
  • the 3D grid set by the operation surface setting unit 112e is input as a framework to the position information calculation unit 112d.
  • the position information calculation unit 112d calculates the coordinate position (x, y) at which the approximate straight line of the vector V1, expressed in the 3D grid, intersects the display screen 5b, and delivers it to the pointing signal generation unit 112f as the pointer position. Based on this, the pointing signal generation unit 112f generates an input signal that specifies the position and movement of the pointer.
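With the display screen as the plane z = 0 of the 3D grid, the intersection computed by the position information calculation unit 112d is a standard line–plane intersection along the axis line p + t·V1. A hedged sketch (function name and tolerance are illustrative):

```python
def pointer_position(p, d):
    """Intersect the line p + t*d with the display plane z = 0.

    p: a point on the finger axis, in grid coordinates (metres).
    d: the axis direction V1; its z component should point at the screen.
    Returns (x, y), or None if the axis is parallel to the screen or
    points away from it.
    """
    if abs(d[2]) < 1e-9:
        return None
    t = -p[2] / d[2]
    if t < 0:  # axis points away from the screen
        return None
    return (p[0] + t * d[0], p[1] + t * d[1])

# Fingertip 0.3 m in front of the screen, axis tilted slightly upward:
pos = pointer_position([0.10, 0.20, 0.30], [0.0, 0.1, -0.3])
print(tuple(round(c, 6) for c in pos))  # (0.1, 0.3)
```

In practice (x, y) would then be clamped to the screen bounds and converted to pixel coordinates before being passed on as the pointer position.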
  • if a second pointing gesture is recognized during the above process, it is captured as a subcommand signal and the content of the input signal is changed. For example, as shown in FIG. 16, when a gesture such as raising the little finger is made while pointing with the index finger, it is determined to be a subcommand signal and a subcommand dialog is displayed on the screen.
  • this expands the range of selection operations available to the operator, and corresponds to the subcommand key (1a in FIG. 1) of the first embodiment.
  • in this embodiment, that key input is replaced by a gesture, making use of the abundant information contained in the distance image.
  • FIG. 17 is a flowchart showing the procedure of the operation input method according to this embodiment.
  • the same contents as in the first embodiment are omitted, and only the steps unique to this embodiment are described.
  • in step S202, the object is monitored with the distance image camera, and in step S204 the presence or absence of a pointing gesture is determined from the distance image data.
  • in step S205, if it was determined in the previous step that there is a pointing gesture, the columnar detection unit 112i geometrically approximates the columnar shape Ob shown in FIGS. 14 and 15 from the distance image data of the pointing gesture.
  • in step S206, the vector determination unit 112h calculates the direction of the axis of the columnar shape Ob detected by the columnar detection unit 112i, that is, an approximate expression for the vector V1, as shown in FIGS. 13 to 15.
  • in step S207, the position information calculation unit 112d calculates the coordinate position at which the extension of the vector V1 intersects the display screen 5b, and transfers it to the pointing signal generation unit 112f. Based on this information, the pointing signal generation unit 112f generates a pointing signal designating the pointer position and movement (S209) and inputs it to the input signal generation unit 112g.
  • Step S208 and Step S211 are basically the same as those in the first embodiment.
  • whereas the first embodiment uses the subcommand key, this embodiment generates the subcommand signal from the second pointing gesture, exploiting the abundant information contained in the distance image.
  • when a subcommand signal is detected, a process for displaying the subcommand menu is executed, and a pointing signal for moving the pointer over the subcommand menu is then generated.
  • the signal processing unit 112 generates and outputs an input signal related to the coordinate position on the display screen 5b.
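Steps S202 through S209 can be tied together in one compact sketch: detect a columnar cluster, fit its axis V1 by principal-component analysis, and extend it to the screen plane z = 0. All names, thresholds, and the synthetic test data below are assumptions for illustration, not the patent's implementation:

```python
import numpy as np

def pointing_signal(points, min_elongation=4.0):
    """S204-S207 in one pass: decide whether 3D points from the distance
    image form a pointing finger, fit its axis V1, and return the (x, y)
    screen coordinate it points at (screen plane z = 0), else None."""
    pts = np.asarray(points, float)
    mean = pts.mean(axis=0)
    eigvals, eigvecs = np.linalg.eigh(np.cov((pts - mean).T))
    # S204: elongation test on the principal-axis spreads (assumed threshold).
    if (eigvals[-1] / max(eigvals[-2], 1e-12)) ** 0.5 < min_elongation:
        return None
    d = eigvecs[:, -1]            # S206: axis vector V1 (unit length)
    if d[2] > 0:
        d = -d                    # orient the axis toward the screen
    if abs(d[2]) < 1e-9:
        return None
    t = -mean[2] / d[2]           # S207: extend V1 to the plane z = 0
    x, y, _ = mean + t * d
    return float(x), float(y)

rng = np.random.default_rng(1)
# Synthetic finger pointing along -z, base-to-tip z = 0.40 m to 0.30 m,
# centred at (x, y) = (0.1, 0.2):
finger = np.column_stack([
    rng.normal(0.10, 0.004, 400),
    rng.normal(0.20, 0.004, 400),
    rng.uniform(0.30, 0.40, 400),
])
print(pointing_signal(finger))  # close to (0.1, 0.2)
```

A round cluster such as a fist fails the elongation test and yields no pointing signal, which corresponds to the "no pointing gesture" branch of the flowchart.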
  • in the embodiments above, a notebook personal computer is used as the information processing terminal device, but in this modification the operation input program of the present invention is installed in a smartphone 1'.
  • the transmitting/receiving units are arranged in a triangle, facing the operator, at the upper left of the display screen 5b.
  • this modification assumes that the operator holds the smartphone in the left hand and performs a pointing gesture with the right index finger, or holds it in the right hand and performs a pointing gesture with the right thumb.
  • a smartphone has few keys that can be used freely, so a subcommand button 1b is newly provided at the lower left.
  • the function of this button 1b is the same as that of the subcommand key of the first embodiment shown at 1a in FIG. 1; by combining operation of this button with pointing gesture operations, various commands besides the Doppler command can be executed.
  • 112e ... Operation surface setting unit, 112f ... Pointing signal generation unit, 112g ... Input signal generation unit, 112h ... Vector determination unit, 112i ... Columnar detection unit, 113 ... Contact/separation detection unit, 113a ... Doppler effect detection unit, 113b ... Doppler command generation unit

Abstract

The invention addresses the problem of reliably capturing an operation input made by a quick, fine movement of a fingertip or the like, without requiring a direct-manipulation device such as a mouse or a touch panel, thereby reducing the burden placed on the operator. The solution according to the present invention is an operation input system comprising: a transmitting unit 5c that emits electromagnetic waves or sound waves, serving as transmitted waves, toward a monitored space region R3 comprising all or part of a viewing range R2 in which a display screen 5b is visible; receiving units 6a0-2 that receive the waves reflected from a target object irradiated by the transmitted waves; a signal processing unit 112 that, on the basis of the transmitted and reflected waves, detects position information relating to the position in three-dimensional space of the wave-reflecting target object and to variations in that position, generates an input signal associated with a coordinate position on the display screen 5b, and supplies the generated input signal to an information processing terminal device; and a contact/separation detection unit 113 that compares the wavelengths of the reflected waves with those of the transmitted waves and thereby detects the movement and speed of the target object in the direction in which it approaches or moves away from the display screen 5b.
PCT/JP2017/007123 2017-02-24 2017-02-24 Operation input system, method, and program Ceased WO2018154714A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/007123 WO2018154714A1 (fr) Operation input system, method, and program

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2017/007123 WO2018154714A1 (fr) Operation input system, method, and program

Publications (1)

Publication Number Publication Date
WO2018154714A1 true WO2018154714A1 (fr) 2018-08-30

Family

ID=63252556

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2017/007123 Ceased WO2018154714A1 (fr) 2017-02-24 2017-02-24 Système, procédé et programme d'entrée d'opération

Country Status (1)

Country Link
WO (1) WO2018154714A1 (fr)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005280396A (ja) * 2004-03-29 2005-10-13 Alpine Electronics Inc 操作指示装置
JP2013536493A (ja) * 2010-06-29 2013-09-19 クゥアルコム・インコーポレイテッド 持続波超音波信号を使用したタッチレス感知およびジェスチャー認識

Similar Documents

Publication Publication Date Title
  • KR102230630B1 Rapid gesture re-engagement
  • US9020194B2 Systems and methods for performing a device action based on a detected gesture
  • US8593398B2 Apparatus and method for proximity based input
  • EP2908215B1 Method and apparatus for gesture detection and display control
  • US20140237401A1 Interpretation of a gesture on a touch sensing device
  • US20140189579A1 System and method for controlling zooming and/or scrolling
  • US20160274732A1 Touchless user interfaces for electronic devices
  • US20140267142A1 Extending interactive inputs via sensor fusion
  • KR101194883B1 Non-contact screen control system and non-contact screen control method in the system
  • WO2019033957A1 Interaction position determination method and system, storage medium, and smart terminal
  • US20140380249A1 Visual recognition of gestures
  • KR20140114913A Method and apparatus for operating sensors of a user device
  • CN104969148A Depth-based user interface gesture control
  • CN109069108B Ultrasonic medical detection device, transmission control method, imaging system, and terminal
  • JP2014219938A Input support device, input support method, and program
  • US10346992B2 Information processing apparatus, information processing method, and program
  • CN104731317A Navigation device and image display system
  • TWI486815B Display device and control system and method thereof
  • US8749488B2 Apparatus and method for providing contactless graphic user interface
  • KR20200120467A HMD device and method of operating the same
  • KR101019255B1 Depth-sensor-based spatial touch wireless terminal, data processing method thereof, and screen device
  • TW201351977A Image capture method for image recognition and system thereof
  • KR101394604B1 Method for implementing user interface through motion detection and apparatus therefor
  • CN109069105B Ultrasonic medical detection device, imaging control method, imaging system, and controller
  • WO2018154714A1 Operation input system, method, and program

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17898016

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: 1205A 20.01.2020

122 Ep: pct application non-entry in european phase

Ref document number: 17898016

Country of ref document: EP

Kind code of ref document: A1