WO2006036069A1 - Systeme et procede de traitement d'informations - Google Patents
Systeme et procede de traitement d'informations (Information processing system and method)
- Publication number
- WO2006036069A1 (PCT/NO2005/000360)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- optoelectronic device
- processing system
- motion
- input
- information processing
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1626—Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/16—Constructional details or arrangements
- G06F1/1613—Constructional details or arrangements for portable computers
- G06F1/1633—Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
- G06F1/1684—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
- G06F1/1686—Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being an integrated camera
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0233—Character input methods
- G06F3/0236—Character input methods using selection techniques to select from displayed items
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0425—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/0485—Scrolling or panning
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2200/00—Indexing scheme relating to G06F1/04 - G06F1/32
- G06F2200/16—Indexing scheme relating to G06F1/16 - G06F1/18
- G06F2200/163—Indexing scheme relating to constructional details of the computer
- G06F2200/1637—Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04806—Zoom, i.e. interaction techniques or interactors for controlling the zooming operation
Definitions
- the present invention concerns an information processing system comprising an optoelectronic device, associated display means and display control electronics that either is physically a part of said optoelectronic device or physically separate therefrom, but connected therewith via a data link, wherein the optoelectronic device comprises an internal memory, a data processing system for operating the device, including input- and/or output processes, and an optical image recorder for capturing information from the environment surrounding said optoelectronic device.
- the present invention also concerns a method for operating an information processing system of this kind.
- the miniaturization of buttons and other means of input has led to a situation where user control of the device is seriously slowed down, e.g. during text input and menu item selection.
- the use of miniaturized multiple-choice buttons typically leads to increased operator fatigue and a higher incidence of erroneous data entries.
- a common way to implement menu item selection is to employ a four-way button which has to be repeatedly operated in order to step through menu items that are typically arranged in a 1-dimensional list or a 2-dimensional array of symbols. As the lists and menus tend to increase in size with the increased functionality of mobile devices, the number of button operations required will increase in a highly undesired manner.
- the small area available for a display screen implies that displayed features must either be very small and correspondingly difficult to read, or they must be so few as to severely limit the information content in the display.
- US patent application 2004/0141085 discloses a system that uses an orientation detecting mechanism (an accelerometer) to determine the orientation of an imaging device, e.g. a digital camera.
- the acquired orientation information is used to reconfigure the displayed image for increased user viewing convenience.
- the reconfiguration is explicitly effected on the captured image only. The general problem of using small displays for viewing and performing operations within large and content-rich images, diagrams, etc. is not addressed.
- a solution to the problem of providing input to a small portable device with limited space for a keyboard is promoted by the firm Memsic Inc. of North Andover, MA, USA.
- the portable device is equipped with a small accelerometer which senses tilting of the device. By tilting right/left and/or up/down and clicking on a button on the device, navigation is possible on a screen on the device, i.e. scrolling through lists, menu selection, map scrolling, gaming, etc.
- An obvious drawback of this approach is the need for an accelerometer in the portable device, where space and cost typically are at an extreme premium. Beyond this, input is sensitive to motion of the device involving changes in speed or direction, and presupposes a given attitude of the portable device relative to the direction of the gravity vector.
- a possible solution to the text input problem, described in US Pat. No. 5,818,437, is to assign multiple letters to each of the numerical buttons on the device and to use built-in dictionaries to find words that match the button sequence keyed by the user.
- Drawbacks of this solution include the need to scroll through several valid alternatives to find the correct word, and having to explicitly spell words, names and abbreviations that are not included in the dictionary.
- the optoelectronic device comprises hardware and software adapted for analyzing information captured by said optical image recorder with respect to temporal variations in the signals from the optical image recorder, said variations being caused by translational and/or rotational motion of said optoelectronic device relative to real or virtual objects, textures or scenery within the field of view of said optical image recorder, that said processing hardware and software are adapted for providing an output signal in scalar or vector form on the basis of the analyzed information, said output signal being indicative of the motion characteristics of said optoelectronic device relative to the real or virtual objects, textures or scenery within the field of view of said optical image recorder, that the display control electronics is connected with said optoelectronic device for receiving said output signal, and that said display control electronics is adapted for modifying or changing a displayed message or image in response to the characteristics of said output signal.
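- For illustration only, the following sketch (not part of the patent text; `camera`, `display_controller` and `estimate_shift` are hypothetical placeholder interfaces) shows how the analysing hardware/software could turn successive frames into an output signal in vector form and hand it to the display control electronics; a concrete motion estimator is sketched further below in connection with block matching.

```python
from dataclasses import dataclass

@dataclass
class MotionOutput:
    """Output signal of the motion analysis, in vector form (dx, dy);
    a derived scalar speed is exposed for threshold-based logic."""
    dx: float
    dy: float

    @property
    def speed(self) -> float:
        return (self.dx ** 2 + self.dy ** 2) ** 0.5


def run_motion_analyser(camera, display_controller, estimate_shift):
    """Compare each new frame with the previous one, turn the temporal
    variation into a motion output signal, and pass it to the display
    control electronics, which modifies the displayed message or image.

    Placeholder interfaces: grab_frame() returns a grayscale frame,
    estimate_shift() returns a (dx, dy) shift (e.g. the block-matching
    sketch further below), apply_motion() consumes the output signal."""
    prev = camera.grab_frame()
    while True:
        curr = camera.grab_frame()
        dx, dy = estimate_shift(prev, curr)
        display_controller.apply_motion(MotionOutput(float(dx), float(dy)))
        prev = curr
```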
- a method according to the invention which is characterized by deriving at least one parameter from translational and/or rotational motion of said optoelectronic device or auxiliary equipment linked to same, relative to real or virtual objects, textures or scenery within the field of view of said optical image recorder, and applying said at least one parameter as input to control electronics in said optoelectronic device to effect navigation and/or input commands in said display means, partly or completely in response to said at least one parameter.
- figure 1 shows the main components in an information processing system according to the present invention
- figures 2a and 2b a first preferred embodiment where the display screen on a mobile phone constitutes part of a virtual display where the viewing field is enhanced by scrolling a window across a larger page
- figure 3 a second preferred embodiment involving a virtual keyboard
- figure 4 a third preferred embodiment involving menu item selection
- figures 5a and 5b an example of a variant of the third preferred embodiment
- figure 6 a fourth preferred embodiment where an optoelectronic device according to the invention functions as a computer mouse
- figures 7a and 7b show examples of variants of the fourth embodiment.
- the main components of an optoelectronic device as used in the information processing system of the invention are shown in fig. 1.
- the optoelectronic device comprises an optical image recorder, an image motion analyser linked with the former and used for analyzing recorded images and generating a continuously updated image output to a connected display.
- the optoelectronic device is implemented in a mobile telephone or a PDA equipped with a digital camera and a display screen.
- the optoelectronic device is termed a "phone" in these preferred embodiments. This is not meant to preclude that the optoelectronic device in question may be another type of mobile platform equipped with a digital camera and a display screen, e.g. a helmet-mounted or head-up device or system.
- A first preferred embodiment is shown in figure 2.
- This embodiment provides a virtual display wherein large pages of text, images, web content, etc. that would otherwise defy normal criteria for readability on the phone display screen, are accessed in selected segments that are scrolled across the larger page simply by moving the phone in three dimensions.
- This scrolling is rendered intuitive by the correlation between the virtual motion across the page being viewed and the real motion of the phone.
- the latter is typically a right/left and up/down motion of the phone above or in front of a surface or object or scene that exhibits adequate structure or texture for the camera to register movement.
- the degree of zoom can be used to control the speed of scrolling for a given motion input, e.g. the more zoom, the slower the scrolling, and vice versa.
- Combined with the lateral motion of the phone, there may optionally be an in/out motion relative to the surface or object, which effects a zoom in/zoom out function on the displayed image of the large page.
- One may combine scrolling with the zoom function, such that fast scrolling reduces the zoom factor and vice versa.
- the zoom level then applied may be maintained until the zoom function is activated again, e.g. by the relative in/out motion of the phone.
- the absolute zoom level shall be user controllable
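- As a minimal sketch of the zoom-coupled scrolling described in the preceding paragraphs (illustrative assumption only; function names, gains and limits are not taken from the patent), the scroll gain can be divided by the current zoom factor so that a zoomed-in page scrolls more slowly for the same phone motion, and fast scrolling can in turn be made to reduce the zoom factor:

```python
def update_viewport(view_x: float, view_y: float, zoom: float,
                    dx: float, dy: float, base_gain: float = 1.0):
    """Scroll the visible window across the large virtual page.

    (dx, dy) is the per-frame motion signal; the effective gain is divided
    by the zoom factor, so the more the page is zoomed in, the slower the
    scrolling for a given phone motion, and vice versa."""
    gain = base_gain / max(zoom, 1e-6)
    return view_x + gain * dx, view_y + gain * dy


def adapt_zoom(zoom: float, scroll_speed: float,
               min_zoom: float = 1.0, max_zoom: float = 8.0,
               k: float = 0.02, recovery: float = 0.01) -> float:
    """Optional coupling: fast scrolling reduces the zoom factor, and the
    zoom creeps back up when the motion slows down, within fixed limits."""
    zoom = zoom - k * scroll_speed + recovery
    return min(max_zoom, max(min_zoom, zoom))
```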
- An alternative variant of phone motion, possible in cases where no suitable surface or object is available in reasonably close proximity to the phone, is to rotate the phone right/left and up/down, i.e. panning and tilting the camera to cause the observed scene to move across the camera chip surface.
- this can be perceived intuitively as being related to tilting of the observer's head to view different parts of the page.
- motion of the camera along the line of sight to distant objects will cause the image to change very little, and other means of providing zooming commands must then be used.
- zooming in/out in the virtual image can be achieved in several ways, namely a) by using the zoom function built into the camera, if such exists; b) by pressing or toggling a mechanical input key, switch or pressure-sensitive field on the phone; c) by moving the phone in a prescribed pattern, e.g. rapid sideways or up/down rotation for zoom in/zoom out; or d) by acoustic input, e.g. voice command.
- to avoid unintended input, a threshold may be applied below which recorded motion is ignored. This threshold shall typically apply to the recorded speed of motion (translational or rotational), and the magnitude of the threshold may be selected or adjusted according to the type of usage and environment that is encountered in each application.
- any predictable motion pattern that shall not serve as input cue to the optoelectronic device may be compensated for in the signal processing, e.g. a constant linear motion due to steady translational motion of the optoelectronic device such as walking.
- the "dead band” would be centered around a certain speed value corresponding to the constant motion.
- the algorithm for making the system insensitive to motion that is not intended by the operator to provide meaningful motion input to the optical image recorder may also be set to reject very rapid accelerations or motions, e.g. by setting an upper limit to the legitimate sensitive range.
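- A possible form of such filtering, shown only as an assumed sketch (the bias, dead-band and upper-limit values are illustrative), subtracts any predictable background motion, ignores speeds inside the dead band, and rejects very rapid motions above an upper limit:

```python
import math

def filter_motion(dx: float, dy: float,
                  bias: tuple = (0.0, 0.0),
                  dead_band: float = 2.0,
                  upper_limit: float = 40.0):
    """Suppress motion that should not act as an input cue.

    `bias` is a predictable background motion (e.g. steady walking) that
    is subtracted first; shifts whose speed then falls inside the dead
    band, or above the upper limit (jolts, very rapid accelerations),
    are ignored by returning a zero vector."""
    mx, my = dx - bias[0], dy - bias[1]
    speed = math.hypot(mx, my)
    if speed < dead_band or speed > upper_limit:
        return 0.0, 0.0
    return mx, my
```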
- the processing can easily be set to pick up a specific motion as an input cue, e.g. cyclic motion within a defined frequency band such as would occur in response to rapid wagging of the optoelectronic device sideways or up and down.
- the phone movement can also be used to scroll a virtual keyboard for text input, for instance to message systems like SMS.
- a single function key is pressed to select a letter, and the range of characters can be extended or various special functions implemented by using movement in the third dimension.
- writing via a virtual keyboard is performed without the need for a set of physical keys representing the alphabet or other symbols on the face of the phone. Instead, letters or symbols are selected on a virtual keyboard shown in the display, by a two-step process: First, moving the phone in a lateral or rotational motion causes a visual indicator function to move across the virtual keyboard.
- a "Select" command is given to the phone, by one of the following modalities, namely a) by pressing or toggling a mechanical input key, switch or pressure- sensitive field on the phone; b) by moving the phone in a prescribed pattern, i.e. laterally up/down or rotationally (pan/tilt) or along the optical axis of the phone's camera (in/out); or c) by acoustic input, e.g. voice command.
- the virtual keyboard may be larger than the display, causing only part of it to be visible within the field defined by the phone's display screen. This situation is depicted in figure 3, and in this case the virtual keyboard scrolls across the display, one keyboard letter or symbol being highlighted or framed at a time upon passing a selector field in e.g. the middle of the screen.
- the whole virtual keyboard is shown inside the display, and a cursor or highlighting field moves across the virtual keyboard when the phone is moved.
- Selection of upper or lower case characters or special symbols or functions is achieved by entering a command into the phone by one of the modalities a) to c) described earlier in this paragraph.
- the command may be of the "Caps shift" or "Caps lock" type, depending on the situation.
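- One way the two-step selection described above could be realized, sketched here under the assumption that the accumulated motion signal drives a cursor position in display coordinates (the key layout and key dimensions are illustrative, not taken from the patent), is to map the cursor position to the key it currently lies over and commit that key when the "Select" modality is triggered:

```python
# Illustrative key layout; a real virtual keyboard would also carry digits,
# punctuation and a case-shift key.
KEYS = [list("qwertyuiop"), list("asdfghjkl"), list("zxcvbnm")]

def highlighted_key(cursor_x: float, cursor_y: float,
                    key_w: float = 40.0, key_h: float = 40.0) -> str:
    """Map the motion-driven cursor position (display coordinates) to the
    virtual key it currently lies over; issuing the "Select" modality then
    commits this character to the text buffer."""
    row = min(len(KEYS) - 1, max(0, int(cursor_y // key_h)))
    col = min(len(KEYS[row]) - 1, max(0, int(cursor_x // key_w)))
    return KEYS[row][col]
```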
- a sweep across a virtual background field where the menu icons are laid out can be performed quickly and simply.
- Motion patterns and selection modes can be similar to those described for the virtual keyboard above.
- a typical phone has many menus. The user could pre-set which menus should be subject to motion controlled selection, and then the camera would automatically be activated when these menus are selected. Menus could be nested hierarchically, and motion control could be used to navigate into and within sub-menus.
- the phone camera is automatically activated to detect phone movement.
- the phone calls the presently highlighted contact.
- If the contact has more than one phone number, the first dial-up or select command produces a pop-up window with all the phone numbers. Phone movement is used to highlight the desired number and the dial-up button is used for placing the call.
- This strategy enables fast selection of the desired contact in one sweeping phone movement. It enables more contacts on the screen than the usual text list layout, and is based on human recognition of images/graphical symbols rather than text, which is normally a faster search method.
- the present invention could also in a general fashion implement a so-called relative pointer device, for instance for controlling the cursor on a computer screen. However, the present invention shall allow the extension of relative pointer technology far beyond that offered by e.g. a conventional computer mouse.
- This opens up a fourth preferred embodiment, the principle of which is shown in fig. 6, which basically amounts to using the optoelectronic device as a three-dimensional computer mouse or similar input device to a mobile or stationary information processing system. Cursor movement on the computer screen is controlled by moving the optoelectronic device in one or more of the translational, rotational or combined translational/rotational modes described in the other preferred embodiments above, while mouse clicks can be effected in several ways, e.g. by one of the select modalities described above.
- the input from the optoelectronic device to the information processing system is the operator signature, e.g. in legal or economic transactions or in access control.
- the operator in this case moves the optoelectronic device as he would a pen to create the pattern, which may be his name signature or some other pre-defined identifying pattern.
- the motion need not take place against any supporting surface, but could be performed in the air in front of the operator ("writing in the air").
- One example of such processing would be to scale the size of the recorded signature to a predefined norm.
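- A minimal sketch of such normalisation, assuming the signature is recorded as a sequence of (x, y) points derived from the motion signal (the function name and the norm size are illustrative assumptions):

```python
def normalize_signature(points, norm_size: float = 100.0):
    """Scale a recorded signature trace (a list of (x, y) points derived
    from the motion signal) to a predefined norm: the trace is shifted to
    the origin and scaled so its larger extent equals `norm_size`."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    min_x, min_y = min(xs), min(ys)
    span = max(max(xs) - min_x, max(ys) - min_y) or 1.0
    s = norm_size / span
    return [((x - min_x) * s, (y - min_y) * s) for x, y in points]
```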
- an alternative to moving the phone in the open air shall be to equip the phone with additional physical hardware, e.g. a rounded, ball-like object, which may be attached to the phone at the side where the camera is located.
- the phone may be rested on the desktop or any other object, while the three-dimensional motions are achieved by tilting the phone in the various directions, as allowed by the thickness of the attached, rounded object.
- the translational movements of the mouse can now be replaced by a rotational movement around any of the three axes and over a fairly small angle.
- Communication with the computer could be via a wire or by one of several wireless means.
- Bluetooth communication would enable using the mobile electronic device for controlling presentations, including pointer control and the possibility for the user to stand up and move around.
- An important advantage of this embodiment relative to the conventional computer mouse is that contact with or close proximity to a surface is not needed.
- it is possible to provide input commands linked to a motion in the third dimension either by increasing or reducing the distance from the camera of the mobile electronic device to an object in front of it or by employing the camera zoom function. This opens up opportunities for 3-D cursor control, e.g. in CAD software or in the display/manipulation of 3 dimensional virtual or real-space objects.
- the optoelectronic device can be equipped with a microphone, such as is the case when the optoelectronic device is a mobile telephone with camera. This makes it possible to insert audio files into the computer. Likewise, by means of a speaker or earphone attached to the optoelectronic device, it becomes possible to listen to audio files that have been located in the computer.
- in a further variant, the relative motion is created while the optoelectronic device itself is kept stationary or nearly so.
- the relative motion is then created by moving an object, e.g. the operator's hand (or finger), in front of the optical image recorder of the optoelectronic device.
- This variant can be implemented in various ways, as shown in figs 7a and 7b.
- Figure 7a shows a person carrying an optoelectronic device according to the invention with a camera viewing out in front of the person. By moving his hands in front of the camera, the person causes a cursor to move on the screen of a computer as shown in fig. 7a, or on a screen in the optoelectronic device itself.
- Mouse clicks can be done in several ways, e.g. by one of the select modalities described above.
- the person holds a contrast-enhancing or recognition aiding implement in his hand, e.g. a pen or pencil with a marker spot of high optical contrast on it, optionally in a color selected in an optimized spectral region, to simplify processing of the image.
- Fingers or hands of the operator can be marked with a spot or symbol of high optical contrast, e.g. to show the positions of the fingertips.
- variants of this fourth embodiment exemplified in figures 7a and 7b can be used to effect handwritten input into the computer, the optoelectronic device or both, including name or other types of signatures for legal and transaction purposes or access control.
- Image analysis provides further possibilities for more complex and hence more compact or efficient input coding: For example, two fingers in contact with each other side by side may carry a different message from the same two fingers splayed away from each other.
- a simple example of this is writing on virtual keyboards (cf. below), in which case lower-case symbols can be written with a single finger and upper-case symbols with two fingers close together.
- detection of more than one motion vector in the field of view of the optical sensing device of the optoelectronic device can be employed as input.
- An example of this is the use of two hands or fingers that are moved independently, but in coordinated fashion.
- the principle scheme in figure 7b can be used to provide keyboard input via a "virtual keyboard":
- a keyboard may be printed on the table surface or on a flat sheet lying on the table, and the camera records the position of the operator's fingers on the keyboard.
- Image analysis software is used to determine which key is indicated, e.g. by analyzing the obscuration of the keyboard by the hands and fingers. In cases where fingertips are tagged for enhanced visibility, the tag location relative to the keyboard is analyzed.
- the actual tapping to indicate selection of an indicated key may be achieved by one of the modalities mentioned above in connection with mouse clicking, or by providing a motion cue to the imaging software. An example of the latter would be to set an activation threshold related to the dwell time on a given key, i.e. the finger must rest a time exceeding a predetermined minimum for a tapping to be registered.
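- The dwell-time activation could, for example, be implemented as sketched below (an assumption for illustration; the 0.6 s dwell time is arbitrary): the indicated key is registered as a tap only after it has remained unchanged for the minimum dwell time, and only once per dwell.

```python
class DwellSelector:
    """Register a key tap only after the indicated key has remained the
    same for a minimum dwell time, and only once per dwell."""

    def __init__(self, dwell_time_s: float = 0.6):
        self.dwell_time_s = dwell_time_s
        self._key = None
        self._since = 0.0
        self._fired = False

    def update(self, key, now_s: float):
        """Feed the currently indicated key every frame together with a
        timestamp; returns the key once when the dwell time is exceeded,
        otherwise None."""
        if key != self._key:
            self._key, self._since, self._fired = key, now_s, False
            return None
        if not self._fired and now_s - self._since >= self.dwell_time_s:
            self._fired = True
            return key
        return None
```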
- a variant of a virtual keyboard would be to project a keyboard as an image onto the writing surface. Fingertip tags with high contrast could in this case be rendered especially useful by making them fluoresce in response to the light used in projecting the keyboard or another light source.
- Another type of virtual keyboard can be established under a wider concept which may be termed a "virtual touch screen":
- the optoelectronic device is equipped with a camera which is directed towards the screen display of a computer.
- the operator places a pointer or wand, or one or more fingers or hand(s) against the screen, thereby obscuring parts of the displayed image.
- Image analysis software determines which of the displayed image parts or regions are indicated, e.g. by analyzing the contours of the object held in front of the screen. This principle thus provides the features of a touch-sensitive screen, providing input opportunities for menu selection, etc., but without the need for technically complex physical sensors to record the touching action.
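- As an assumed sketch of this indication step (it presupposes that the camera view has already been registered to display coordinates and reduced to a boolean obscuration mask; the region representation and the coverage threshold are illustrative), the indicated GUI region can be taken as the one with the largest obscured fraction:

```python
import numpy as np

def indicated_region(obscured: np.ndarray, regions: dict):
    """Given a boolean mask of display pixels obscured by the finger, hand
    or wand (already registered to display coordinates) and named GUI
    regions as (x, y, w, h) rectangles, return the name of the region with
    the largest obscured fraction, or None if nothing is covered enough."""
    best, best_frac = None, 0.05  # require at least 5 % coverage
    for name, (x, y, w, h) in regions.items():
        if w <= 0 or h <= 0:
            continue
        frac = float(obscured[y:y + h, x:x + w].mean())
        if frac > best_frac:
            best, best_frac = name, frac
    return best
```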
- writing may be performed in a manner similar to the case described above where writing was performed on a printed or projected keyboard.
- the virtual touch screen provides opportunities for more sophisticated input operations than simple two-dimensional coordinate point indication, by making the image analysis software able to discriminate between different motion- and shape patterns for the operator's fingers and hands in front of the screen.
- An example of this is to use the flat palm of the hand to mark or wipe larger parts of a document or scene, to effect copying, deletion or transfer.
- One vital part of any of the indicated embodiments is the processing of camera image sequences to obtain the output signal which indicates movement of the mobile electronic device relative to imaged objects or scenes.
- this processing relies on motion estimation, a well-established research field whose present major application is the compression of digital video sequences.
- the MPEG video compression standards (see e.g. http://www.chiariglione.org/mpeg/) rely heavily on motion estimation.
- the present invention has an advantage over conventional video compression, in that only the camera motion need be estimated, which means characterization of the general movement of the background of the scene. The motion of smaller objects across the background can be disregarded, which lowers the demand for processing power compared to video compression.
- the camera image needs to be split into macroblocks, typically of the size 16×16 pixels.
- the motion of each macroblock from one frame to another is then computed, resulting in a motion vector (MV).
- the complete set of macroblock motion vectors constitutes the motion vector field (MVF).
- Moving objects, mismatched macroblocks due to insufficient texture, etc. will show up as outliers in the MVF and can be disregarded through proper filtering.
- the implementation of MVF calculation in the present invention will be chosen in each particular embodiment according to the properties of the camera and other phone hardware, available processing power, etc.
- Block matching is probably the most common way to calculate the MVF (cf., e.g.: A. M. Tekalp, Digital Video Processing, Prentice Hall, 1995; J. Watkinson, The Engineer's Guide to Motion Compensation, Snell & Wilcox, 1994).
- each macroblock in the new frame is compared with shifted regions of the same size from the previous frame, and the shift which results in the minimum error is selected as the best MV for that macroblock.
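- A compact sketch of such block matching, combined with the outlier-suppressing filtering mentioned above (illustrative only; the block size, search range and the use of the median are common choices, not prescriptions from the patent), could look like this:

```python
import numpy as np

def block_motion_vector(prev, curr, y, x, block=16, search=8):
    """Best shift (dx, dy) for one macroblock of `curr`, found by exhaustive
    search in `prev` with the sum of absolute differences as error measure."""
    ref = curr[y:y + block, x:x + block].astype(np.int16)
    best, best_sad = (0, 0), float("inf")
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            yy, xx = y + dy, x + dx
            if yy < 0 or xx < 0 or yy + block > prev.shape[0] or xx + block > prev.shape[1]:
                continue
            sad = int(np.abs(prev[yy:yy + block, xx:xx + block].astype(np.int16) - ref).sum())
            if sad < best_sad:
                best, best_sad = (dx, dy), sad
    return best

def global_camera_motion(prev, curr, block=16):
    """Median over the motion vector field; the median suppresses outliers
    from small moving objects or poorly textured macroblocks, leaving the
    general background (camera) motion."""
    h, w = curr.shape
    mvs = [block_motion_vector(prev, curr, y, x, block)
           for y in range(0, h - block + 1, block)
           for x in range(0, w - block + 1, block)]
    dxs, dys = zip(*mvs)
    return float(np.median(dxs)), float(np.median(dys))
```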
- Other methods of camera motion estimation include phase correlation (cf., e.g.: J. Watkinson, The Engineer's Guide to Motion Compensation, Snell & Wilcox, 1994; Y. Liang, Phase-Correlation Motion Estimation, Final project, Stanford University) and the utilization of MPEG motion vectors (cf., e.g.: M. Pilu, On Using Raw MPEG Motion Vectors To Determine Global Camera Motion, Hewlett-Packard Company, 1997).
- Power consumption is a potential concern in embodiments of the present invention where an imaging camera and/or a display screen are operated on a battery.
- the camera part of the system draws little power compared to an illuminated display screen.
- several power conservation strategies are possible, since imaging requirements are different from and generally less demanding than those in regular photography.
- the total number of available camera pixels need not be active in the primary photo-event or in subsequent signal processing steps.
- Frame rates may also be reduced, depending on the updating rate and response speed that is required in applications based on the present invention.
- several strategies are available to conserve power.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Hardware Design (AREA)
- Multimedia (AREA)
- Position Input By Displaying (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
NO20044073A NO20044073D0 (no) | 2004-09-27 | 2004-09-27 | Informasjonsbehandlingssystem og fremgangsmate |
NO20044073 | 2004-09-27 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006036069A1 true WO2006036069A1 (fr) | 2006-04-06 |
Family
ID=35057647
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/NO2005/000360 WO2006036069A1 (fr) | 2004-09-27 | 2005-09-27 | Systeme et procede de traitement d'informations |
Country Status (2)
Country | Link |
---|---|
NO (1) | NO20044073D0 (fr) |
WO (1) | WO2006036069A1 (fr) |
Cited By (64)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007042189A1 (fr) * | 2005-10-11 | 2007-04-19 | Sony Ericsson Mobile Communication Ab | Terminaux de communications cellulaires et procedes de detection de deplacement de terminaux pour la commande de curseur |
WO2007068791A1 (fr) * | 2005-12-13 | 2007-06-21 | Elcoteq Se | Methode et disposition pour gerer une interface utilisateur graphique et un dispositif portable pourvu d'une interface utilisateur graphique |
EP1884864A1 (fr) * | 2006-08-02 | 2008-02-06 | Research In Motion Limited | Système et méthode pour adapter la présentation de texte et d'images sur un appareil électronique relativement à l'orientation de l'appareil |
WO2008029180A1 (fr) * | 2006-09-06 | 2008-03-13 | Santosh Sharan | Appareil et procédé pour l'agrandissement d'écran lié à une position |
WO2008098331A1 (fr) * | 2007-02-15 | 2008-08-21 | Edson Roberto Minatel | Dispositif optoélectronique pour faciliter et commander des procédés industriels |
WO2009001240A1 (fr) * | 2007-06-27 | 2008-12-31 | Nokia Corporation | Procédé, appareil et progiciel permettant la fourniture d'un mécanisme de défilement pour des dispositifs à écran tactile |
WO2009052848A1 (fr) * | 2007-10-25 | 2009-04-30 | Nokia Corporation | Controle de presentation d'informations par un appareil |
WO2009071234A1 (fr) * | 2007-12-08 | 2009-06-11 | T-Mobile International Ag | Clavier virtuel d'un appareil terminal mobile |
US7616186B2 (en) | 2005-12-09 | 2009-11-10 | Sony Ericsson Mobile Communications Ab | Acceleration reference devices, cellular communication terminal systems, and methods that sense terminal movement for cursor control |
WO2009150522A1 (fr) * | 2008-06-11 | 2009-12-17 | Nokia Corporation | Mouvements d’appareil photo pour commande d’interface utilisateur |
WO2009032638A3 (fr) * | 2007-09-04 | 2010-01-21 | Apple Inc. | Interface utilisateur pour menu d'application |
US20100125816A1 (en) * | 2008-11-20 | 2010-05-20 | Bezos Jeffrey P | Movement recognition as input mechanism |
WO2010068312A1 (fr) * | 2008-12-10 | 2010-06-17 | Sony Ericsson Mobile Communications Ab | Système et procédé de modification d’une pluralité de régions d'entrée de touche sur la base d'une inclinaison et/ou d'un taux d'inclinaison détectés d'un dispositif électronique |
US20100149100A1 (en) * | 2008-12-15 | 2010-06-17 | Sony Ericsson Mobile Communications Ab | Electronic Devices, Systems, Methods and Computer Program Products for Detecting a User Input Device Having an Optical Marker Thereon |
FR2940690A1 (fr) * | 2008-12-31 | 2010-07-02 | Cy Play | Un procede et dispositif de navigation d'utilisateur d'un terminal mobile sur une application s'executant sur un serveur distant |
FR2940703A1 (fr) * | 2008-12-31 | 2010-07-02 | Cy Play | Procede et dispositif de modelisation d'un affichage |
FR2940689A1 (fr) * | 2008-12-31 | 2010-07-02 | Cy Play | Procede de navigation d'utilisateur d'un terminal mobile sur une application s'executant sur un serveur distant |
WO2010080166A1 (fr) * | 2009-01-06 | 2010-07-15 | Qualcomm Incorporated | Interface utilisateur pour des dispositifs mobiles |
WO2010076436A3 (fr) * | 2008-12-31 | 2010-11-25 | Cy Play | Procédé de modélisation par macroblocs de l'affichage d'un terminal distant a l'aide de calques caractérisés par un vecteur de mouvement et des données de transparence |
EP2296076A1 (fr) * | 2009-09-15 | 2011-03-16 | Palo Alto Research Center Incorporated | Système d'interaction avec des objets dans un environnement virtuel |
EP2341412A1 (fr) * | 2009-12-31 | 2011-07-06 | Sony Computer Entertainment Europe Limited | Dispositif électronique portable et procédé de contrôle d'un dispositif électronique portable |
EP2382527A2 (fr) * | 2008-12-30 | 2011-11-02 | France Telecom | Interface utilisateur pour permettre une commande renforcée d'un programme d'application |
US8139026B2 (en) | 2006-08-02 | 2012-03-20 | Research In Motion Limited | System and method for adjusting presentation of text and images on an electronic device according to an orientation of the device |
EP2370889A4 (fr) | 2008-12-18 | 2012-08-08 | Appareil, méthode, programme informatique et interface utilisateur présentant la saisie de données |
CN102750077A (zh) * | 2011-01-31 | 2012-10-24 | 手持产品公司 | 可操作用于显示电子记录的终端 |
EP2290931A3 (fr) * | 2009-08-24 | 2013-05-22 | Pantech Co., Ltd. | Appareil et procédé pour exécution une fonction à touche programmable d'un terminal mobile |
KR101281058B1 (ko) | 2011-12-06 | 2013-07-15 | (주)나노티에스 | 터치키보드 장치 및 이의 터치위치 검출방법 |
US8493323B2 (en) | 2006-08-02 | 2013-07-23 | Research In Motion Limited | System and method for adjusting presentation of moving images on an electronic device according to an orientation of the device |
EP2438506A4 (fr) * | 2009-06-04 | 2013-10-02 | Mellmo Inc | Affichage de donnees multi-dimensionnelles au moyen d'un objet rotatif |
EP2597590A3 (fr) * | 2011-11-28 | 2013-11-27 | Samsung Electronics Co., Ltd | Procédé d'authentification de mot de passe et dispositif portable correspondant |
CN101196795B (zh) * | 2006-08-02 | 2014-04-09 | 黑莓有限公司 | 运动图像管理系统、运动图像呈现方法和便携式电子设备 |
US8878773B1 (en) | 2010-05-24 | 2014-11-04 | Amazon Technologies, Inc. | Determining relative motion as input |
US8884928B1 (en) | 2012-01-26 | 2014-11-11 | Amazon Technologies, Inc. | Correcting for parallax in electronic displays |
US8947351B1 (en) | 2011-09-27 | 2015-02-03 | Amazon Technologies, Inc. | Point of view determinations for finger tracking |
US9035874B1 (en) | 2013-03-08 | 2015-05-19 | Amazon Technologies, Inc. | Providing user input to a computing device with an eye closure |
US9041734B2 (en) | 2011-07-12 | 2015-05-26 | Amazon Technologies, Inc. | Simulating three-dimensional features |
JP2015102924A (ja) * | 2013-11-22 | 2015-06-04 | シャープ株式会社 | 表示装置、スクロール表示方法、および、スクロール表示プログラム |
US9063574B1 (en) | 2012-03-14 | 2015-06-23 | Amazon Technologies, Inc. | Motion detection systems for electronic devices |
US9122917B2 (en) | 2011-08-04 | 2015-09-01 | Amazon Technologies, Inc. | Recognizing gestures captured by video |
US9123272B1 (en) | 2011-05-13 | 2015-09-01 | Amazon Technologies, Inc. | Realistic image lighting and shading |
US9223415B1 (en) | 2012-01-17 | 2015-12-29 | Amazon Technologies, Inc. | Managing resource usage for task performance |
US9269012B2 (en) | 2013-08-22 | 2016-02-23 | Amazon Technologies, Inc. | Multi-tracker object tracking |
US9285895B1 (en) | 2012-03-28 | 2016-03-15 | Amazon Technologies, Inc. | Integrated near field sensor for display devices |
US9335924B2 (en) | 2006-09-06 | 2016-05-10 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
GB2532010A (en) * | 2014-11-04 | 2016-05-11 | Samsung Electronics Co Ltd | Display method and device |
US9367203B1 (en) | 2013-10-04 | 2016-06-14 | Amazon Technologies, Inc. | User interface techniques for simulating three-dimensional depth |
US9367232B2 (en) | 2007-01-07 | 2016-06-14 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
US9423929B2 (en) | 2009-06-04 | 2016-08-23 | Sap Se | Predictive scrolling |
US9619143B2 (en) | 2008-01-06 | 2017-04-11 | Apple Inc. | Device, method, and graphical user interface for viewing application launch icons |
US9733812B2 (en) | 2010-01-06 | 2017-08-15 | Apple Inc. | Device, method, and graphical user interface with content display modes and display rotation heuristics |
US9772751B2 (en) | 2007-06-29 | 2017-09-26 | Apple Inc. | Using gestures to slide between user interfaces |
US9933937B2 (en) | 2007-06-20 | 2018-04-03 | Apple Inc. | Portable multifunction device, method, and graphical user interface for playing online videos |
US9933913B2 (en) | 2005-12-30 | 2018-04-03 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
US10055013B2 (en) | 2013-09-17 | 2018-08-21 | Amazon Technologies, Inc. | Dynamic object tracking for user interfaces |
FR3063821A1 (fr) * | 2017-03-10 | 2018-09-14 | Institut Mines Telecom | Interface homme machine |
US10088924B1 (en) | 2011-08-04 | 2018-10-02 | Amazon Technologies, Inc. | Overcoming motion effects in gesture recognition |
US10272294B2 (en) | 2016-06-11 | 2019-04-30 | Apple Inc. | Activity and workout updates |
CN109804341A (zh) * | 2017-07-31 | 2019-05-24 | 腾讯科技(深圳)有限公司 | 与显示在用户界面上的三维互联网内容的交互 |
US10313505B2 (en) | 2006-09-06 | 2019-06-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
US10613732B2 (en) | 2015-06-07 | 2020-04-07 | Apple Inc. | Selecting content items in a user interface display |
US10620780B2 (en) | 2007-09-04 | 2020-04-14 | Apple Inc. | Editing interface |
US11153472B2 (en) | 2005-10-17 | 2021-10-19 | Cutting Edge Vision, LLC | Automatic upload of pictures from a camera |
US11199906B1 (en) | 2013-09-04 | 2021-12-14 | Amazon Technologies, Inc. | Global user input management |
US11216119B2 (en) | 2016-06-12 | 2022-01-04 | Apple Inc. | Displaying a predetermined view of an application |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6288704B1 (en) * | 1999-06-08 | 2001-09-11 | Vega, Vista, Inc. | Motion detection and tracking system to control navigation and display of object viewers |
US6433793B1 (en) * | 1998-04-24 | 2002-08-13 | Nec Corporation | Scrolling system of a display image |
US20020130839A1 (en) * | 2001-03-16 | 2002-09-19 | Hugh Wallace | Optical screen pointing device with inertial properties |
WO2003003185A1 (fr) * | 2001-06-21 | 2003-01-09 | Ismo Rakkolainen | Systeme permettant la mise en oeuvre d'une interface utilisateur |
GB2387755A (en) * | 2002-03-28 | 2003-10-22 | Nec Corp | Portable apparatus including improved pointing device using image shift |
-
2004
- 2004-09-27 NO NO20044073A patent/NO20044073D0/no unknown
-
2005
- 2005-09-27 WO PCT/NO2005/000360 patent/WO2006036069A1/fr active Application Filing
Patent Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6433793B1 (en) * | 1998-04-24 | 2002-08-13 | Nec Corporation | Scrolling system of a display image |
US6288704B1 (en) * | 1999-06-08 | 2001-09-11 | Vega, Vista, Inc. | Motion detection and tracking system to control navigation and display of object viewers |
US20020130839A1 (en) * | 2001-03-16 | 2002-09-19 | Hugh Wallace | Optical screen pointing device with inertial properties |
WO2003003185A1 (fr) * | 2001-06-21 | 2003-01-09 | Ismo Rakkolainen | Systeme permettant la mise en oeuvre d'une interface utilisateur |
GB2387755A (en) * | 2002-03-28 | 2003-10-22 | Nec Corp | Portable apparatus including improved pointing device using image shift |
US20040204067A1 (en) * | 2002-03-28 | 2004-10-14 | Nec Corporation | Portable apparatus including improved pointing device |
Cited By (128)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2007042189A1 (fr) * | 2005-10-11 | 2007-04-19 | Sony Ericsson Mobile Communication Ab | Terminaux de communications cellulaires et procedes de detection de deplacement de terminaux pour la commande de curseur |
US7643850B2 (en) | 2005-10-11 | 2010-01-05 | Sony Ericsson Mobile Communications Ab | Cellular communication terminals and methods that sense terminal movement for cursor control |
US11818458B2 (en) | 2005-10-17 | 2023-11-14 | Cutting Edge Vision, LLC | Camera touchpad |
US11153472B2 (en) | 2005-10-17 | 2021-10-19 | Cutting Edge Vision, LLC | Automatic upload of pictures from a camera |
US7616186B2 (en) | 2005-12-09 | 2009-11-10 | Sony Ericsson Mobile Communications Ab | Acceleration reference devices, cellular communication terminal systems, and methods that sense terminal movement for cursor control |
WO2007068791A1 (fr) * | 2005-12-13 | 2007-06-21 | Elcoteq Se | Methode et disposition pour gerer une interface utilisateur graphique et un dispositif portable pourvu d'une interface utilisateur graphique |
US10884579B2 (en) | 2005-12-30 | 2021-01-05 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
US9933913B2 (en) | 2005-12-30 | 2018-04-03 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
US11449194B2 (en) | 2005-12-30 | 2022-09-20 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
US10359907B2 (en) | 2005-12-30 | 2019-07-23 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
US11650713B2 (en) | 2005-12-30 | 2023-05-16 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
US12026352B2 (en) | 2005-12-30 | 2024-07-02 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
US10915224B2 (en) | 2005-12-30 | 2021-02-09 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
US8493323B2 (en) | 2006-08-02 | 2013-07-23 | Research In Motion Limited | System and method for adjusting presentation of moving images on an electronic device according to an orientation of the device |
CN101196795B (zh) * | 2006-08-02 | 2014-04-09 | 黑莓有限公司 | 运动图像管理系统、运动图像呈现方法和便携式电子设备 |
US9367097B2 (en) | 2006-08-02 | 2016-06-14 | Blackberry Limited | System and method for adjusting presentation of text and images on an electronic device according to an orientation of the device |
US8139026B2 (en) | 2006-08-02 | 2012-03-20 | Research In Motion Limited | System and method for adjusting presentation of text and images on an electronic device according to an orientation of the device |
US9110499B2 (en) | 2006-08-02 | 2015-08-18 | Blackberry Limited | System and method for adjusting presentation of moving images on an electronic device according to an orientation of the device |
EP2090975A1 (fr) * | 2006-08-02 | 2009-08-19 | Research In Motion Limited | Système et méthode pour adapter la présentation de texte et d'images sur un appareil électronique relativement à l'orientation de l'appareil |
EP2259163A3 (fr) * | 2006-08-02 | 2011-03-16 | Research In Motion Limited | Système et méthode pour adapter la présentation de texte et d'images sur un appareil électronique relativement à l'orientation de l'appareil |
EP1884864A1 (fr) * | 2006-08-02 | 2008-02-06 | Research In Motion Limited | Système et méthode pour adapter la présentation de texte et d'images sur un appareil électronique relativement à l'orientation de l'appareil |
US12236080B2 (en) | 2006-09-06 | 2025-02-25 | Apple Inc. | Device, method, and medium for sharing images |
US10313505B2 (en) | 2006-09-06 | 2019-06-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
US9335924B2 (en) | 2006-09-06 | 2016-05-10 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
WO2008029180A1 (fr) * | 2006-09-06 | 2008-03-13 | Santosh Sharan | Appareil et procédé pour l'agrandissement d'écran lié à une position |
US9952759B2 (en) | 2006-09-06 | 2018-04-24 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
US11736602B2 (en) | 2006-09-06 | 2023-08-22 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
US10778828B2 (en) | 2006-09-06 | 2020-09-15 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
US11029838B2 (en) | 2006-09-06 | 2021-06-08 | Apple Inc. | Touch screen device, method, and graphical user interface for customizing display of content category icons |
US11240362B2 (en) | 2006-09-06 | 2022-02-01 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
US12028473B2 (en) | 2006-09-06 | 2024-07-02 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
US11586348B2 (en) | 2007-01-07 | 2023-02-21 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
US9367232B2 (en) | 2007-01-07 | 2016-06-14 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
US10732821B2 (en) | 2007-01-07 | 2020-08-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
US10254949B2 (en) | 2007-01-07 | 2019-04-09 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
US11169691B2 (en) | 2007-01-07 | 2021-11-09 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
US20100026812A1 (en) * | 2007-02-15 | 2010-02-04 | Edson Roberto Minatel | Optoeletronic Device for Helping and Controlling Industrial Processes |
WO2008098331A1 (fr) * | 2007-02-15 | 2008-08-21 | Edson Roberto Minatel | Dispositif optoélectronique pour faciliter et commander des procédés industriels |
US9933937B2 (en) | 2007-06-20 | 2018-04-03 | Apple Inc. | Portable multifunction device, method, and graphical user interface for playing online videos |
WO2009001240A1 (fr) * | 2007-06-27 | 2008-12-31 | Nokia Corporation | Procédé, appareil et progiciel permettant la fourniture d'un mécanisme de défilement pour des dispositifs à écran tactile |
US10761691B2 (en) | 2007-06-29 | 2020-09-01 | Apple Inc. | Portable multifunction device with animated user interface transitions |
US9772751B2 (en) | 2007-06-29 | 2017-09-26 | Apple Inc. | Using gestures to slide between user interfaces |
US12131007B2 (en) | 2007-06-29 | 2024-10-29 | Apple Inc. | Portable multifunction device with animated user interface transitions |
US11507255B2 (en) | 2007-06-29 | 2022-11-22 | Apple Inc. | Portable multifunction device with animated sliding user interface transitions |
US11861138B2 (en) | 2007-09-04 | 2024-01-02 | Apple Inc. | Application menu user interface |
US11010017B2 (en) | 2007-09-04 | 2021-05-18 | Apple Inc. | Editing interface |
US11126321B2 (en) | 2007-09-04 | 2021-09-21 | Apple Inc. | Application menu user interface |
US11604559B2 (en) | 2007-09-04 | 2023-03-14 | Apple Inc. | Editing interface |
US10620780B2 (en) | 2007-09-04 | 2020-04-14 | Apple Inc. | Editing interface |
WO2009032638A3 (fr) * | 2007-09-04 | 2010-01-21 | Apple Inc. | Interface utilisateur pour menu d'application |
WO2009052848A1 (fr) * | 2007-10-25 | 2009-04-30 | Nokia Corporation | Controle de presentation d'informations par un appareil |
RU2494442C2 (ru) * | 2007-12-08 | 2013-09-27 | Т-Мобиле Интернациональ Аг | Виртуальная клавиатура мобильного оконечного устройства |
CN101889256B (zh) * | 2007-12-08 | 2013-04-17 | T-移动国际股份公司 | 移动终端的虚拟键盘 |
US20100313160A1 (en) * | 2007-12-08 | 2010-12-09 | T-Mobile International Ag | Virtual keyboard of a mobile terminal |
JP2011507058A (ja) * | 2007-12-08 | 2011-03-03 | ティー−モバイル インターナツィオナール アーゲー | 携帯端末の仮想キーボード |
US8527895B2 (en) | 2007-12-08 | 2013-09-03 | T-Mobile International, AG | Virtual keyboard of a mobile terminal |
WO2009071234A1 (fr) * | 2007-12-08 | 2009-06-11 | T-Mobile International Ag | Clavier virtuel d'un appareil terminal mobile |
KR101352462B1 (ko) | 2007-12-08 | 2014-01-17 | 티-모바일 인터내셔널 아게 | 이동 단말의 가상 키보드 |
US10628028B2 (en) | 2008-01-06 | 2020-04-21 | Apple Inc. | Replacing display of icons in response to a gesture |
US9619143B2 (en) | 2008-01-06 | 2017-04-11 | Apple Inc. | Device, method, and graphical user interface for viewing application launch icons |
WO2009150522A1 (fr) * | 2008-06-11 | 2009-12-17 | Nokia Corporation | Mouvements d’appareil photo pour commande d’interface utilisateur |
US8269842B2 (en) | 2008-06-11 | 2012-09-18 | Nokia Corporation | Camera gestures for user interface control |
CN102089738A (zh) * | 2008-06-11 | 2011-06-08 | 诺基亚公司 | 用于用户界面控制的相机姿态 |
US20100125816A1 (en) * | 2008-11-20 | 2010-05-20 | Bezos Jeffrey P | Movement recognition as input mechanism |
US8788977B2 (en) * | 2008-11-20 | 2014-07-22 | Amazon Technologies, Inc. | Movement recognition as input mechanism |
US9304583B2 (en) | 2008-11-20 | 2016-04-05 | Amazon Technologies, Inc. | Movement recognition as input mechanism |
JP2012511867A (ja) * | 2008-12-10 | 2012-05-24 | ソニー エリクソン モバイル コミュニケーションズ, エービー | 電子装置の検出された傾斜と傾斜レートの少なくともいずれかに基づいて複数のキー入力領域を修正するためのシステム及び方法 |
CN102216882A (zh) * | 2008-12-10 | 2011-10-12 | 索尼爱立信移动通讯有限公司 | 基于所检测出的电子装置的倾斜和/或倾斜速率来改变多个按键输入区域的系统和方法 |
WO2010068312A1 (fr) * | 2008-12-10 | 2010-06-17 | Sony Ericsson Mobile Communications Ab | Système et procédé de modification d’une pluralité de régions d'entrée de touche sur la base d'une inclinaison et/ou d'un taux d'inclinaison détectés d'un dispositif électronique |
US20100149100A1 (en) * | 2008-12-15 | 2010-06-17 | Sony Ericsson Mobile Communications Ab | Electronic Devices, Systems, Methods and Computer Program Products for Detecting a User Input Device Having an Optical Marker Thereon |
WO2010070460A1 (fr) * | 2008-12-15 | 2010-06-24 | Sony Ericsson Mobile Communications | Electronic devices, systems, methods and computer program products for detecting a user input device having an optical marker thereon |
EP2370889A4 (fr) * | 2008-12-18 | 2012-08-08 | Nokia Corp | Apparatus, method, computer program and user interface providing data entry |
KR101471708B1 (ko) * | 2008-12-18 | 2014-12-10 | 노키아 코포레이션 | Apparatus, method, computer-readable storage medium and user interface for enabling user input |
EP2382527A2 (fr) * | 2008-12-30 | 2011-11-02 | France Telecom | User interface for enabling enhanced control of an application program |
FR2940703A1 (fr) * | 2008-12-31 | 2010-07-02 | Cy Play | Method and device for modeling a display |
FR2940689A1 (fr) * | 2008-12-31 | 2010-07-02 | Cy Play | Method for user navigation from a mobile terminal in an application running on a remote server |
US9185159B2 (en) | 2008-12-31 | 2015-11-10 | Cy-Play | Communication between a server and a terminal |
WO2010076436A3 (fr) * | 2008-12-31 | 2010-11-25 | Cy Play | Method for macroblock-based modeling of the display of a remote terminal using layers characterized by a motion vector and transparency data |
FR2940690A1 (fr) * | 2008-12-31 | 2010-07-02 | Cy Play | Method and device for user navigation from a mobile terminal in an application running on a remote server |
WO2010080166A1 (fr) * | 2009-01-06 | 2010-07-15 | Qualcomm Incorporated | User interface for mobile devices |
US8441441B2 (en) | 2009-01-06 | 2013-05-14 | Qualcomm Incorporated | User interface for mobile devices |
EP2438506A4 (fr) * | 2009-06-04 | 2013-10-02 | Mellmo Inc | Displaying multi-dimensional data by means of a rotatable object |
US9423929B2 (en) | 2009-06-04 | 2016-08-23 | Sap Se | Predictive scrolling |
EP2290931A3 (fr) * | 2009-08-24 | 2013-05-22 | Pantech Co., Ltd. | Apparatus and method for executing a hot key function of a mobile terminal |
US8599153B2 (en) | 2009-08-24 | 2013-12-03 | Pantech Co., Ltd. | Apparatus and method for executing hot key function of mobile terminal |
EP2296076A1 (fr) * | 2009-09-15 | 2011-03-16 | Palo Alto Research Center Incorporated | System for interacting with objects in a virtual environment |
US9542010B2 (en) | 2009-09-15 | 2017-01-10 | Palo Alto Research Center Incorporated | System for interacting with objects in a virtual environment |
EP2341412A1 (fr) * | 2009-12-31 | 2011-07-06 | Sony Computer Entertainment Europe Limited | Portable electronic device and method of controlling a portable electronic device |
US9733812B2 (en) | 2010-01-06 | 2017-08-15 | Apple Inc. | Device, method, and graphical user interface with content display modes and display rotation heuristics |
US9557811B1 (en) | 2010-05-24 | 2017-01-31 | Amazon Technologies, Inc. | Determining relative motion as input |
US8878773B1 (en) | 2010-05-24 | 2014-11-04 | Amazon Technologies, Inc. | Determining relative motion as input |
EP2482170A3 (fr) * | 2011-01-31 | 2015-01-21 | Hand Held Products, Inc. | Terminal operative for displaying an electronic record |
CN102750077A (zh) * | 2011-01-31 | 2012-10-24 | 手持产品公司 | Terminal operable to display an electronic record |
US9123272B1 (en) | 2011-05-13 | 2015-09-01 | Amazon Technologies, Inc. | Realistic image lighting and shading |
US9041734B2 (en) | 2011-07-12 | 2015-05-26 | Amazon Technologies, Inc. | Simulating three-dimensional features |
US9122917B2 (en) | 2011-08-04 | 2015-09-01 | Amazon Technologies, Inc. | Recognizing gestures captured by video |
US10088924B1 (en) | 2011-08-04 | 2018-10-02 | Amazon Technologies, Inc. | Overcoming motion effects in gesture recognition |
US8947351B1 (en) | 2011-09-27 | 2015-02-03 | Amazon Technologies, Inc. | Point of view determinations for finger tracking |
EP2597590A3 (fr) * | 2011-11-28 | 2013-11-27 | Samsung Electronics Co., Ltd | Method of authenticating a password and portable device thereof |
US9165132B2 (en) | 2011-11-28 | 2015-10-20 | Samsung Electronics Co., Ltd. | Method of authenticating password and portable device thereof |
KR101281058B1 (ko) | 2011-12-06 | 2013-07-15 | (주)나노티에스 | Touch keyboard device and touch position detection method thereof |
US9223415B1 (en) | 2012-01-17 | 2015-12-29 | Amazon Technologies, Inc. | Managing resource usage for task performance |
US8884928B1 (en) | 2012-01-26 | 2014-11-11 | Amazon Technologies, Inc. | Correcting for parallax in electronic displays |
US10019107B2 (en) | 2012-01-26 | 2018-07-10 | Amazon Technologies, Inc. | Correcting for parallax in electronic displays |
US9471153B1 (en) | 2012-03-14 | 2016-10-18 | Amazon Technologies, Inc. | Motion detection systems for electronic devices |
US9063574B1 (en) | 2012-03-14 | 2015-06-23 | Amazon Technologies, Inc. | Motion detection systems for electronic devices |
US9285895B1 (en) | 2012-03-28 | 2016-03-15 | Amazon Technologies, Inc. | Integrated near field sensor for display devices |
US9652083B2 (en) | 2012-03-28 | 2017-05-16 | Amazon Technologies, Inc. | Integrated near field sensor for display devices |
US9483113B1 (en) | 2013-03-08 | 2016-11-01 | Amazon Technologies, Inc. | Providing user input to a computing device with an eye closure |
US9035874B1 (en) | 2013-03-08 | 2015-05-19 | Amazon Technologies, Inc. | Providing user input to a computing device with an eye closure |
US9269012B2 (en) | 2013-08-22 | 2016-02-23 | Amazon Technologies, Inc. | Multi-tracker object tracking |
US11199906B1 (en) | 2013-09-04 | 2021-12-14 | Amazon Technologies, Inc. | Global user input management |
US10055013B2 (en) | 2013-09-17 | 2018-08-21 | Amazon Technologies, Inc. | Dynamic object tracking for user interfaces |
US9367203B1 (en) | 2013-10-04 | 2016-06-14 | Amazon Technologies, Inc. | User interface techniques for simulating three-dimensional depth |
JP2015102924A (ja) * | 2013-11-22 | 2015-06-04 | シャープ株式会社 | Display device, scroll display method, and scroll display program |
GB2532010A (en) * | 2014-11-04 | 2016-05-11 | Samsung Electronics Co Ltd | Display method and device |
US10613732B2 (en) | 2015-06-07 | 2020-04-07 | Apple Inc. | Selecting content items in a user interface display |
US11660503B2 (en) | 2016-06-11 | 2023-05-30 | Apple Inc. | Activity and workout updates |
US11148007B2 (en) | 2016-06-11 | 2021-10-19 | Apple Inc. | Activity and workout updates |
US10272294B2 (en) | 2016-06-11 | 2019-04-30 | Apple Inc. | Activity and workout updates |
US11918857B2 (en) | 2016-06-11 | 2024-03-05 | Apple Inc. | Activity and workout updates |
US11161010B2 (en) | 2016-06-11 | 2021-11-02 | Apple Inc. | Activity and workout updates |
US12274918B2 (en) | 2016-06-11 | 2025-04-15 | Apple Inc. | Activity and workout updates |
US11216119B2 (en) | 2016-06-12 | 2022-01-04 | Apple Inc. | Displaying a predetermined view of an application |
EP3373118A3 (fr) * | 2017-03-10 | 2018-12-12 | Institut Mines-Telecom | Human-machine interface comprising a camera and a marker |
FR3063821A1 (fr) * | 2017-03-10 | 2018-09-14 | Institut Mines Telecom | Human-machine interface |
US10895953B2 (en) | 2017-07-31 | 2021-01-19 | Tencent Technology (Shenzhen) Company Limited | Interaction with a three-dimensional internet content displayed on a user interface |
CN109804341A (zh) * | 2017-07-31 | 2019-05-24 | 腾讯科技(深圳)有限公司 | Interaction with three-dimensional internet content displayed on a user interface |
Also Published As
Publication number | Publication date |
---|---|
NO20044073D0 (no) | 2004-09-27 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2006036069A1 (fr) | System and method for information processing | |
US12118201B2 (en) | Devices, methods, and graphical user interfaces for a unified annotation layer for annotating content displayed on a device | |
Lee et al. | Interaction methods for smart glasses: A survey | |
US20250123698A1 (en) | Devices, methods, and graphical user interfaces for an electronic device interacting with a stylus | |
Kratz et al. | HoverFlow: expanding the design space of around-device interaction | |
US11468625B2 (en) | User interfaces for simulated depth effects | |
US7366540B2 (en) | Hand-held communication device as pointing device | |
US10209877B2 (en) | Touch screen device, method, and graphical user interface for moving on-screen objects without using a cursor | |
EP2263134B1 (fr) | Communication terminals with a superimposed user interface | |
CN103262008B (zh) | Intelligent wireless mouse | |
US9740297B2 (en) | Motion-based character selection | |
EP2045694B1 (fr) | Portable electronic device with mouse-like capabilities | |
EP2068235A2 (fr) | Input device, display device, input method, display method, and program | |
US20100275122A1 (en) | Click-through controller for mobile interaction | |
US20090164930A1 (en) | Electronic device capable of transferring object between two display units and controlling method thereof | |
JP2018519583A (ja) | Device and method for manipulating a user interface with a stylus | |
CN113253908B (zh) | Key function execution method, apparatus, device, and storage medium | |
CN102314301A (zh) | Virtual touch sensing system and method | |
US20030234766A1 (en) | Virtual image display with virtual keyboard | |
WO2005124528A2 (fr) | Optical joystick for a hand-held communication device | |
JP5173001B2 (ja) | Information processing apparatus, screen display method, control program, and recording medium | |
CN119225538B (zh) | Data interaction method and system for a handheld tablet computer | |
Ballagas et al. | Mobile Phones as Pointing Devices. | |
Sasaki et al. | Hit-wear: A menu system superimposing on a human hand for wearable computers | |
KR20110066545A (ko) | Method and terminal for displaying an image using a touch screen | |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AK | Designated states |
Kind code of ref document: A1 Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW |
|
AL | Designated countries for regional patents |
Kind code of ref document: A1 Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG |
|
DPE2 | Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101) | ||
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
DPE2 | Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101) | ||
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the addressee cannot be established |
Free format text: COMMUNICATION UNDER RULE 69 EPC (EPO FORM 1205A DATED 12/09/07) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 05798944 Country of ref document: EP Kind code of ref document: A1 |