CN112083798B - Temperature regulation feedback system responsive to user input - Google Patents
- Publication number
- CN112083798B (application CN202010451233.9A)
- Authority
- CN
- China
- Prior art keywords
- virtual object
- user input
- temperature adjustment
- user
- haptic feedback
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/016—Input arrangements with force or tactile feedback as computer generated output to the user
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06Q—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
- G06Q30/00—Commerce
- G06Q30/06—Buying, selling or leasing transactions
- G06Q30/0601—Electronic shopping [e-shopping]
- G06Q30/0641—Electronic shopping [e-shopping] utilising user interfaces specially adapted for shopping
- G06Q30/0643—Electronic shopping [e-shopping] utilising user interfaces specially adapted for shopping graphically representing goods, e.g. 3D product representation
Landscapes
- Engineering & Computer Science (AREA)
- Business, Economics & Management (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Accounting & Taxation (AREA)
- Finance (AREA)
- Physics & Mathematics (AREA)
- Development Economics (AREA)
- General Business, Economics & Management (AREA)
- Strategic Management (AREA)
- Marketing (AREA)
- Economics (AREA)
- Human Computer Interaction (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
A temperature adjustment feedback system responsive to user input is disclosed, comprising: receiving user input regarding a virtual object; updating the virtual object based at least in part on the user input; determining a temperature adjustment based at least in part on the user input; and outputting the temperature adjustment.
Description
Cross Reference to Related Applications
The present application claims priority to U.S. provisional patent application No. 62/860,687 (attorney docket number ALIBP +) filed on June 12, 2019, entitled MID-AIR THERMAL FEEDBACK SYSTEM FOR 3D INTERACTION, which is incorporated herein by reference for all purposes.
Background
Typically, in order to experience a product in person, a person must go to a physical store to test the item. However, it is not always convenient or practical to test products in a brick-and-mortar store. For example, one may interact with a product remotely by browsing photos on a website, but that experience is neither immersive nor close to actually using the product.
Drawings
Various embodiments of the present invention are disclosed in the following detailed description and the accompanying drawings.
FIG. 1A is a diagram illustrating an embodiment of a system for providing a mid-air temperature adjustment output in response to user input.
FIG. 1B is a functional diagram illustrating an embodiment of a host computer for providing temperature regulation in response to user input.
Fig. 2 is a diagram showing an example of a mid-air temperature adjustment device.
Fig. 3 is a diagram illustrating an example of a system for providing a mid-air temperature adjustment output in response to user input.
FIG. 4 is a flow chart illustrating an embodiment of a process for providing a mid-air temperature adjustment output in response to user input.
FIG. 5 is a flow chart illustrating an embodiment of a process for providing a mid-air temperature adjustment output in response to user input.
Fig. 6 is a flowchart showing an example of a process for providing a mid-air temperature adjustment output in response to user input.
Fig. 7A and 7B depict examples of 3D virtual objects displayed by a system for providing a mid-air temperature adjustment output responsive to user input.
Detailed Description
The invention can be implemented in numerous ways, including as a process, an apparatus, a system, a composition of matter, a computer program product embodied on a computer readable storage medium, and/or a processor, e.g., a processor configured to execute instructions stored in and/or provided by a memory coupled to the processor. In this specification, these implementations, or any other form that the invention may take, may be referred to as techniques. In general, the order of the steps of disclosed processes may be altered within the scope of the invention. Unless otherwise indicated, a component (e.g., a processor or memory) described as being configured to perform a task may be implemented as a general-purpose component that is temporarily configured to perform the task at a given time, or a specific component that is manufactured to perform the task. The term "processor" as used herein refers to one or more devices, circuits, and/or processing cores configured to process data (e.g., computer program instructions).
The following provides a detailed description of one or more embodiments of the invention with reference to the accompanying drawings that illustrate the principles of the invention. The invention is described in connection with such embodiments, but the invention is not limited to any embodiment. The scope of the invention is limited only by the claims and the invention encompasses numerous alternatives, modifications and equivalents. Numerous specific details are set forth in the following description in order to provide a thorough understanding of the invention. These details are provided for the purpose of example and the invention may be practiced according to the claims without some or all of these specific details. For the sake of clarity, technical material that is known in the technical fields related to the invention has not been described in detail so that the invention is not unnecessarily obscured.
Embodiments are described herein that provide a mid-air temperature adjustment output responsive to user input. User input is received regarding a virtual object. In some embodiments, the virtual object is a three-dimensional (3D) object that is rendered and presented by an autostereoscopic display such that the virtual object appears as a 3D hologram. In some embodiments, the user input does not require physical contact by the user with a physical object and is detected using a sensor device. For example, the user input may be hand movement, foot movement, head movement, and/or eye movement. In some embodiments, the user input is determined to affect the virtual object as a result of a detected collision between the user input and the virtual object in 3D space. A mid-air temperature adjustment is determined based at least in part on the user input. In various embodiments, "mid-air temperature adjustment" includes temperature adjustment (e.g., heat, cold, and/or wind) that can be perceived (e.g., by a user) without direct/physical contact with a physical object (e.g., the source of the temperature adjustment). In some embodiments, the temperature adjustment is determined based at least in part on a measurement determined based on the position of the user input relative to at least a portion of the virtual object in 3D space. The mid-air temperature adjustment is then output. For example, the mid-air temperature adjustment feedback includes heat blown from at least one mid-air temperature adjustment device in the direction of the user's hand. In some embodiments, the appearance of the virtual object is also updated in response to the user input. In some embodiments, feedback other than mid-air temperature adjustment, e.g., mid-air haptic feedback, is also provided in response to the user input.
FIG. 1A is a diagram illustrating an embodiment of a system for providing a mid-air temperature adjustment output in response to user input. In this example, the system 100 includes a host computer 112, a display device 102, a user input detection device 104, a mid-air temperature adjustment device 106, a mid-air haptic feedback device 108, and an additional feedback device 110. As shown in FIG. 1A, each of the display device 102, the mid-air haptic feedback device 108, the mid-air temperature adjustment device 106, and the additional feedback device 110 is connected to the host computer 112. Also, as shown in FIG. 1A, the user input detection device 104 is connected to the display device 102. In other embodiments, the user input detection device 104 may be directly connected to the host computer 112 instead of the display device 102. In some embodiments, one or more of the display device 102, the user input detection device 104, the mid-air temperature adjustment device 106, the mid-air haptic feedback device 108, and the additional feedback device 110 includes a corresponding driver that interfaces between it and another device with which it communicates.
Host computer 112 is configured to execute a software environment (e.g., a game engine) configured to render 3D virtual objects in 3D space. For example, the software environment is the Unity game engine. Files/information for rendering the 3D virtual objects may be stored locally on the host computer 112 or retrieved from a remote server (not shown) via a network (not shown). In some embodiments, the 3D virtual object rendered by host computer 112 includes a product. In some embodiments, the 3D virtual objects rendered by host computer 112 are at least partially animated. For example, the 3D virtual object may include a showerhead that emits an animation of water flow. Host computer 112 is configured to output at least one 3D virtual object to the display device 102.
The display device 102 is configured to present 3D virtual objects. In some embodiments, the display device 102 is configured to present the 3D virtual object as a 3D hologram. In various embodiments, the display device 102 is an autostereoscopic display device (e.g., a device manufactured by SeeFront™ or Dimenco™), a virtual reality display device, or a device that displays images that can be perceived with 3D depth without requiring the viewer to use special headwear or glasses.
The user input detection device 104 includes one or more sensors that track user input/motion. For example, the user input detection device 104 is configured to track eye movement, hand movement, leg/foot movement, and/or head movement. In response to each detected motion, the user input detection device 104 is configured to send a corresponding message to the display device 102, which the display device 102 may in turn send to the host computer 112. Based on the message describing the detected user input, the software environment executing on the host computer 112 is configured to determine whether the user input collides with at least a portion of the 3D virtual object presented on the display device 102, and if so, update the appearance of the 3D virtual object and/or cause the mid-air temperature adjustment device 106, the mid-air haptic feedback device 108, and/or the additional feedback device 110 to output mid-air feedback accordingly. In some embodiments, when a point in 3D space detected for the user input overlaps a volume in 3D space defined for the 3D virtual object, the user input is determined to "collide" with at least a portion of the 3D virtual object. In other words, the user gesture interactions sensed by the user input detection device 104 may cause the host computer 112 to generate an updated appearance of the 3D virtual display presented by the display device 102 and/or cause output of mid-air feedback by at least one peripheral feedback device (e.g., the mid-air temperature adjustment device 106, the mid-air haptic feedback device 108, and/or the additional feedback device 110).
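A minimal sketch of the overlap test described above, assuming the collision volume of a 3D virtual object is represented as an axis-aligned bounding box (the class and field names are illustrative, not the actual implementation of this disclosure):

```python
from dataclasses import dataclass

@dataclass
class CollisionZone:
    """Axis-aligned bounding volume defined for (part of) a 3D virtual object."""
    min_corner: tuple  # (x, y, z) in the display's 3D space
    max_corner: tuple  # (x, y, z)

    def contains(self, point):
        """True if a tracked user-input point overlaps this volume (a 'collision')."""
        return all(lo <= p <= hi
                   for p, lo, hi in zip(point, self.min_corner, self.max_corner))

# Example: a hand position reported by the user input detection device.
showerhead_zone = CollisionZone(min_corner=(-0.1, 0.0, -0.1), max_corner=(0.1, 0.3, 0.1))
hand_position = (0.02, 0.15, 0.0)
if showerhead_zone.contains(hand_position):
    # Here the host computer would update the object's appearance and/or
    # instruct the peripheral devices to output mid-air feedback.
    print("collision detected: user input affects the virtual object")
```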
For example, host computer 112 may render a plurality of 3D virtual objects, including products downloaded from a remote server associated with an online shopping platform. Host computer 112 is configured to send the rendered plurality of 3D virtual objects to the display device 102, where the 3D virtual products are presented in the form of 3D holograms. The user input detection device 104 tracks user gestures with respect to the 3D virtual products. A particular user gesture detected by the user input detection device 104 is determined by the host computer 112 to match a predetermined user gesture associated with selecting a particular 3D virtual product. In response to the detected predetermined selection user gesture, the software environment executing on the host computer 112 is configured to cause the selected 3D virtual product to be presented on the display device 102 and interacted with via user input. In a particular example, the initially presented 3D virtual products include 3D showerheads and/or faucets, and the particular item selected is a particular model of showerhead. Once the showerhead is selected, the user may continue to interact with the selected showerhead via user input, thereby obtaining an immersive mid-air sensory experience, as described below.
The mid-air temperature adjustment device 106 is configured to receive instructions from the host computer 112 to output temperature adjustment feedback. In various embodiments, the mid-air temperature adjustment device 106 includes a heating component and a fan. In some embodiments, the mid-air temperature adjustment device 106 includes a heating component, a cooling component, and a fan. In some embodiments, the mid-air temperature adjustment device 106 is instructed by the host computer 112 to activate the fan to generate a cool breeze, thereby reducing the temperature near the system 100 and/or causing a user to perceive the wind (e.g., on the user's hand). In some embodiments, the mid-air temperature adjustment device 106 is configured to activate a heating component (e.g., a positive temperature coefficient (PTC) heater) to generate heat, thereby raising the temperature in the vicinity of the system 100. In some embodiments, the mid-air temperature adjustment device 106 is configured to activate a cooling component (e.g., a thermoelectric cooler) to generate cold, thereby reducing the temperature in the vicinity of the system 100. In some embodiments, the mid-air temperature adjustment device 106 is configured to activate the fan together with at least one of the heating component and the cooling component to distribute heat and/or cold in a particular direction away from the mid-air temperature adjustment device 106 and toward the user. In some embodiments, the degree (e.g., temperature) of the temperature adjustment (e.g., heat or cold) produced by the mid-air temperature adjustment device 106 is determined and indicated by the host computer 112. In some embodiments, the intensity and/or speed of the wind generated by the mid-air temperature adjustment device 106 is determined and indicated by the host computer 112.
When the user perceives the mid-air temperature adjustment feedback, the user may feel warm or cool as if they were close to, or even in direct contact with, a temperature source. In some embodiments, the mid-air temperature adjustment device 106 is configured to output temperature adjustment feedback in response to user interaction/input with respect to the 3D virtual object presented by the display device 102. For example, if a user interacts with a presented 3D virtual object in a manner that allows the user to experience heat emanating from the 3D virtual object, the mid-air temperature adjustment device 106 is configured to activate its heating component and its fan to transfer heat to the user as if the heat emanated from the 3D virtual object.
Although only one instance of the mid-air temperature adjustment device 106 is shown in the system 100, multiple instances of the mid-air temperature adjustment device 106 may be connected to the host computer 112. For example, each instance of the mid-air temperature adjustment device 106 may be disposed in a different location (e.g., relative to another peripheral device of the host computer 112, such as the display device 102 and the mid-air haptic feedback device 108) to provide temperature adjustment feedback from a different orientation/location. For example, multiple instances of the mid-air temperature adjustment device 106 may each provide different temperature feedback from their respective orientations/positions. In some embodiments, the host computer 112 may select all instances of the mid-air temperature adjustment device 106, or only a subset of the instances, to output temperature adjustment feedback based on factors such as the position of the user (e.g., the position of the user's palm, which is more sensitive in detecting wind and/or temperature changes than the back of the hand) and/or the orientation of the selected 3D virtual object with which the user is currently interacting.
The mid-air haptic feedback device 108 is configured to receive instructions from the host computer 112 to output haptic feedback. Haptics is the science and engineering of applying the sense of touch in computer systems. In some embodiments, mid-air haptics includes non-contact haptic feedback that the user perceives in mid-air. Several types of mid-air haptic feedback devices exist, including, for example, ultrasonic vibration using a two-dimensional (2D) array of ultrasonic transducers, laser-based haptic feedback, and air vortices using a driven flexible nozzle with a subwoofer. In various embodiments, the mid-air haptic feedback device 108 is configured to emit ultrasonic waves to generate haptic sensations at one or more focal points. For example, a focal point of the ultrasound in 3D space corresponds to the detected position of the user's hand, causing the user to perceive haptic feedback. In some embodiments, the degree (e.g., pressure) and/or location of the haptic feedback generated by the mid-air haptic feedback device 108 is determined and indicated by the host computer 112.
When the user perceives the mid-air haptic feedback, they may feel pressure as if they were in direct contact with a physical object. In some embodiments, the mid-air haptic feedback device 108 is configured to output haptic feedback in response to user interaction/input with respect to the 3D virtual object presented by the display device 102. In one example, as a user browses multiple 3D virtual objects or 3D environments/scenes (e.g., menus), the mid-air haptic feedback device 108 is configured to output haptic feedback that allows the user to perceive that they are physically moving items or making selections (e.g., of particular 3D virtual objects). In another example, if the user interacts with the presented 3D virtual object in a manner that allows the user to touch the 3D virtual object, the mid-air haptic feedback device 108 is configured to output haptic feedback that causes the user to feel that his or her hand is contacting the object.
Although only one instance of the mid-air haptic feedback device 108 is shown in the system 100, multiple instances of the mid-air haptic feedback device 108 may be connected to the host computer 112. For example, each instance of the mid-air haptic feedback device 108 may be disposed in a different location (e.g., relative to another peripheral device of the host computer 112, such as the display device 102 and the mid-air temperature adjustment device 106) to provide haptic feedback from a different orientation/location. In some embodiments, the host computer 112 may select all instances of the mid-air haptic feedback device 108, or only a subset of the instances, to output haptic feedback based on factors such as the position of the user (e.g., the position of the user's palm, which is more sensitive to detecting wind and/or temperature changes than the back of the hand) and/or the orientation of the selected 3D virtual object with which the user is currently interacting.
In an embodiment, the host computer 112 is configured to cause the mid-air temperature adjustment device 106 and the mid-air haptic feedback device 108 to simultaneously output feedback to cause the user to perceive that he or she is touching or contacting a tangible item that provides cooling or warmth, without ever contacting a physical item or a heat/cold source. Returning to the example above, where the user selects a 3D virtual object that includes a showerhead to further interact with, the display device 102 is configured to zoom in on the presentation of the 3D virtual showerhead or otherwise make it more noticeable so that the user can more easily try it. For example, a user may use his or her hands to cause water to spray from a 3D hologram of a showerhead, and the 3D hologram of the showerhead will be updated to show water spraying from the 3D virtual showerhead in a manner that approximates the physical version of the same showerhead. For example, the user's hand movements will be tracked by the user input detection device 104 and ultimately input to the host computer 112. If the software environment executing on the host computer 112 determines that the user input matches a predetermined user gesture associated with causing the 3D virtual showerhead to spray (e.g., turning the palm in a certain direction), the software environment is configured to update the appearance of the 3D virtual showerhead by causing the display device 102 to additionally present an animation of water being sprayed from the showerhead. For example, a user may interact with the 3D virtual showerhead and the animated water flow to change the shape of the water flow and trigger temperature adjustment feedback and haptic feedback output, so that the user's hand may feel pressure and warmth from the water spraying from the 3D virtual showerhead. In another example, if the 3D virtual object includes both a 3D virtual showerhead and a 3D virtual bathtub faucet, the 3D virtual showerhead and the 3D virtual bathtub faucet may spray water simultaneously to interact with the user. For example, the user's hand movements will be tracked by the user input detection device 104 and ultimately input to the host computer 112. If the software environment executing on the host computer 112 determines that the user input collides with the water spray animation of the 3D virtual showerhead and/or the 3D virtual bathtub faucet in 3D space, the software environment is configured to update the appearance of the 3D virtual showerhead and/or the 3D virtual bathtub faucet by causing the display device 102 to display an animation of a change in the spray from the showerhead and/or the bathtub faucet, wherein the water flow is distorted by the collision with the user's hand. Further, if the software environment executing on the host computer 112 determines that the user input collides with the water spray animation of the 3D virtual showerhead and/or the 3D virtual bathtub faucet in 3D space, the host computer 112 is configured to send instructions to the mid-air temperature adjustment device 106 to cause it to generate heat and activate its fan to distribute the heat, and is configured to send instructions to the mid-air haptic feedback device 108 to cause it to generate a haptic sensation at a focal point corresponding to the position of the user's hand in 3D space.
As a result of the temperature adjustment and haptic feedback, the user can experience an immersive simulated trial of water spraying from the selected showerhead without directly contacting or even touching a physical object. If multiple instances of the mid-air temperature adjustment device 106 and the mid-air haptic feedback device 108 are used, the host computer 112 may send a different signal to each instance of a device to cause that instance to emit different feedback. For example, if a 3D virtual object can produce hot and cold water from different nozzles simultaneously, host computer 112 can send one set of signals to cause one instance of the mid-air temperature adjustment device 106 to emit hot air, and host computer 112 can send another set of signals to cause another instance of the mid-air temperature adjustment device 106 to emit cold air.
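A sketch of the per-instance signaling just described, under the assumption that each device instance exposes the same (hypothetical) command interface:

```python
# Hypothetical command sets: one device instance near the hot-water nozzle,
# another near the cold-water nozzle, each receiving a different signal set.
commands = [
    {"device_id": "temp_device_hot",  "heat": 0.9, "cool": 0.0, "fan": 0.8},
    {"device_id": "temp_device_cold", "heat": 0.0, "cool": 0.7, "fan": 0.8},
]

def dispatch(devices, commands):
    """Send each mid-air temperature adjustment device instance its own signals."""
    for cmd in commands:
        device = devices[cmd["device_id"]]
        device.apply(heat=cmd["heat"], cool=cmd["cool"], fan=cmd["fan"])
```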
The additional feedback device 110 is configured to provide feedback other than haptic- or temperature-based feedback. In one example, the additional feedback device 110 includes a speaker configured to provide audio-based feedback. In another example, the additional feedback device 110 is configured to provide scent-based feedback. In some embodiments, in response to user input tracked by the user input detection device 104 and forwarded to the host computer 112, the host computer 112 is configured to determine whether the user input and/or the 3D virtual object currently presented by the display device 102 will trigger audio- and/or scent-based feedback.
FIG. 1B is a functional diagram illustrating an embodiment of a host computer for providing temperature adjustment in response to user input. It should be apparent that other computer system architectures and configurations may be used to provide temperature adjustment in response to user input. In some embodiments, the host computer 112 of the system 100 of FIG. 1A may be implemented using a computer system 150. Computer system 150, which includes various subsystems as described below, includes at least one microprocessor subsystem (also referred to as a processor or central processing unit (CPU)) 152. For example, the processor 152 may be implemented by a single-chip processor or by multiple processors. In some embodiments, the processor 152 is a general-purpose digital processor that controls the operation of the computer system 150. Using instructions retrieved from memory 160, the processor 152 controls the reception and manipulation of input data, as well as the output and display of data on an output device (e.g., display 168).
The processor 152 is bi-directionally coupled to the memory 160, which may include a first main memory area, typically random access memory (RAM), and a second main memory area, typically read-only memory (ROM). As is well known in the art, main memory may be used as a general-purpose memory area and as temporary (scratch-pad) memory, as well as to store input data and processed data. Main memory may also store programming instructions and data in the form of data objects and text objects, as well as other data and instructions for processes running on the processor 152. As is also well known in the art, main memory generally includes basic operating instructions, program code, data, and objects used by the processor 152 to perform its functions (e.g., programmed instructions). For example, memory 160 may include any suitable computer-readable storage medium, as described below, depending on, for example, whether data access needs to be bi-directional or uni-directional. For example, the processor 152 may also directly and very quickly retrieve and store frequently needed data in a cache (not shown).
Removable mass storage device 162 provides additional data storage capacity for computer system 150 and is bi-directionally coupled (read/write) or uni-directionally coupled (read only) to processor 152. For example, storage 162 may also include computer readable media such as magnetic tape, flash memory, PC cards, portable mass storage devices, holographic storage devices, and other storage devices. For example, fixed mass storage 170 may also provide additional data storage capacity. The most common example of fixed mass storage 170 is a hard disk drive. The mass storage 162, 170 typically stores additional programming instructions, data, etc., which are not normally actively used by the processor 152. It should be appreciated that the information retained in mass storage 162 and 170 may, in standard fashion, be used as part of memory 160 (e.g. RAM) as virtual memory if desired.
In addition to providing processor 152 with access to a memory subsystem, bus 164 may also be used to provide access to other subsystems and devices. As shown, these may include a display 168, a network interface 166, a keyboard 154, and a pointing device 158, as well as auxiliary Input/Output (I/O) device interfaces, sound cards, speakers, and other subsystems as desired. For example, the pointing device 158 may be a mouse, stylus, trackball, or tablet, and is useful for interacting with a graphical user interface.
The network interface 166 allows the processor 152 to be coupled to another computer, computer network, or telecommunications network using a network connection as shown. For example, through the network interface 166, the processor 152 may receive information (e.g., data objects or program instructions) from another network or output information to another network during the execution of method/process steps. Information, often represented as a sequence of instructions to be executed on a processor, may be received from and output to another network. An interface card or similar device and appropriate software implemented (e.g., executed) by the processor 152 may be used to connect the computer system 150 to an external network and transfer data according to standard protocols. For example, the various process embodiments disclosed herein may be executed on the processor 152 or may be performed across a network, such as the Internet, an intranet, or a local area network, in conjunction with a remote processor that shares a portion of the processing. Additional mass storage devices (not shown) may also be connected to the processor 152 through the network interface 166.
An auxiliary I/O device interface (not shown) may be used in conjunction with computer system 150. The auxiliary I/O device interfaces may include general and custom interfaces that allow the processor 152 to send and typically receive data from other devices, such as microphones, touch-sensitive displays, sensor readers, magnetic tape readers, voice or handwriting recognizers, biometric readers, cameras, portable mass storage devices, and other computers.
Fig. 2 is a diagram showing an example of a mid-air temperature adjustment device. In some embodiments, the mid-air temperature adjustment device 106 of the system 100 of fig. 1A is implemented using the mid-air temperature adjustment device 206 of fig. 2.
As shown in fig. 2, the mid-air temperature adjustment device 206 includes a fan 202 and a heating component 204 (e.g., a positive temperature coefficient (PTC) heater), each of which is connected to a drive circuit 208 with a microcontroller. The drive circuit 208 with the microcontroller is in turn connected to a host computer (e.g., host computer 112 of system 100 of fig. 1A). The drive circuit 208 with the microcontroller is configured to interface with the host computer. For example, instructions from the host computer are received by the drive circuit 208 with the microcontroller, and the drive circuit 208 with the microcontroller is configured to send corresponding instructions to one or both of the fan 202 and the heating component 204. In some embodiments, the fan 202 and the heating component 204 may be instructed to be activated at different times. For example, the heating component 204 may be instructed to be activated (e.g., turned on) when a user makes a particular selection via a user input detection device connected to the host computer. For example, the heating component 204 may be activated in advance of an expected subsequent output of temperature adjustment feedback, because the heating component 204 is not capable of instantaneously generating heat. Then, at some point after the heating component 204 is activated, the drive circuit 208 with the microcontroller receives instructions from the host computer to activate the fan 202. Since the heating component 204 was activated in advance and has already generated heat, the subsequent activation of the fan 202 causes the heat to be distributed/transferred away from the mid-air temperature adjustment device 206.
The following is an example process by which the mid-air temperature adjustment device 206 is configured to generate temperature adjustment: when the drive circuit 208 with the microcontroller receives an activation signal from the host computer, it may activate the heating component 204 and/or the fan 202, simultaneously or at a later time, depending on the type of activation signal. When the drive circuit 208 with the microcontroller receives a deactivation signal from the host computer, it may deactivate the heating component 204 and/or the fan 202, depending on the type of deactivation signal. The host computer may send an activation signal for only the fan 202 in the case where unheated (cool) air is needed. In some embodiments, the host computer includes a timer to track the time that the heating component 204 is on and sends a deactivation signal after the timer expires (e.g., to prevent overheating of the heating component 204). For example, the timer is configured (e.g., by a user) to start at a certain value (e.g., 10) and to begin counting down when the heating component 204 begins to generate heat. Once the timer reaches zero, the host computer sends a deactivation signal to the drive circuit 208 with the microcontroller to cause the heating component 204 and the fan 202 to deactivate.
In some embodiments, the drive circuit 208 with the microcontroller is configured to receive instructions regarding the degree and/or intensity at which one or both of the fan 202 and the heating component 204 will operate. The drive circuit 208 with the microcontroller is configured to convert a particular degree and/or intensity into corresponding computer instructions configured to cause the fan 202 and/or the heating component 204 to operate accordingly, and to send those instructions to the fan 202 and/or the heating component 204.
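The activation sequence and overheat timer described above might look like the following host-side sketch; the device method names (activate_heater, activate_fan, deactivate_all) are hypothetical stand-ins for the signals sent to drive circuit 208, and the 10-second countdown mirrors the example value above:

```python
import threading

class MidAirTempController:
    """Host-side sketch of the activation/deactivation sequence (illustrative)."""

    HEAT_TIMEOUT_S = 10.0  # countdown start value from the example above

    def __init__(self, device):
        self.device = device  # proxy for drive circuit 208 (hypothetical API)
        self._timer = None

    def on_object_selected(self):
        # Pre-activate the heating component: it cannot generate heat
        # instantaneously, so it warms up before feedback is actually needed.
        self.device.activate_heater()
        self._timer = threading.Timer(self.HEAT_TIMEOUT_S, self.deactivate)
        self._timer.start()

    def on_collision(self):
        # Heat already exists; activating the fan now distributes/transfers
        # the heat toward the user's hand.
        self.device.activate_fan()

    def deactivate(self):
        # Sent when the countdown expires, to prevent the heater from overheating.
        if self._timer is not None:
            self._timer.cancel()
            self._timer = None
        self.device.deactivate_all()
```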
Fig. 3 is a diagram illustrating an example of a system for providing a mid-air temperature adjustment output in response to user input. In the example of fig. 3, system 300 includes a display device 310, a motion detection device 308, a mid-air haptic feedback device 306a, a mid-air haptic feedback device 306b, a mid-air temperature adjustment device 304, and a host computer (not shown). Although not shown in fig. 3, the host computer is connected to, and receives instructions from and transmits instructions to, each of the display device 310, the motion detection device 308, the mid-air haptic feedback device 306a, the mid-air haptic feedback device 306b, and the mid-air temperature adjustment device 304.
In this example, display device 310 includes an autostereoscopic display that presents a 3D hologram of an object to user 302 without requiring the user to wear dedicated glasses to experience the 3D depth of the hologram. In other embodiments, a display that requires special glasses, goggles, or other equipment to view the virtual object may be used. The virtual objects presented by the display device 310 are rendered by a software environment (e.g., a game engine) executing on the host computer. For example, the virtual object comprises two stereoscopic images, which may be displayed on an autostereoscopic 3D display. The motion detection device 308 is built into the display device 310 and is configured to track the motion of the user, including, for example, eye and/or hand movements. The motion detection device 308 is configured to track the position and pose of the hand of the user 302. The mid-air haptic feedback device 306a is placed in front of and above the display device 310, and the mid-air haptic feedback device 306b is placed in front of and below the display device 310, to provide haptic feedback to the user 302 from the top and bottom, respectively, to match the binocular illusion (e.g., a 3D virtual object) presented in mid-air. In fig. 3, each of the mid-air haptic feedback devices 306a and 306b comprises an ultrasonic haptic device that includes a 2D array of multiple transducers producing focused ultrasonic vibrations, so that the user 302 can feel a touch sensation when a hand is placed in its vicinity. As shown in fig. 3, the mid-air temperature adjustment device 304 is placed beside the mid-air haptic feedback device 306a to provide heated air in combination with (or independently of) the haptic feedback. For example, the angle of the mid-air temperature adjustment device 304 is between 30 degrees and 45 degrees, facing the focal point of the mid-air haptic feedback device 306a. Although not shown in fig. 3, a corresponding mid-air temperature adjustment device may also be placed beside the mid-air haptic feedback device 306b. In some embodiments, when either of the mid-air haptic feedback devices 306a or 306b is activated to produce focused haptic feedback, heated air is also produced by the respective mid-air temperature adjustment device to provide temperature adjustment feedback that may be perceived by the palm of the user, e.g., user 302 interacting with a 3D object presented by display device 310. For example, haptic and/or temperature adjustment feedback may be output either from the mid-air haptic feedback device 306a and the mid-air temperature adjustment device 304, or from the mid-air haptic feedback device 306b and its corresponding mid-air temperature adjustment device, depending on the predicted current position of the palm of the user 302 and/or the orientation of the 3D virtual object presented by the display device 310.
FIG. 4 is a flow chart illustrating an embodiment of a process for providing a mid-air temperature adjustment output in response to user input. In some embodiments, process 400 may be implemented on system 100 of fig. 1A.
At 402, information related to a plurality of virtual objects is obtained from an online platform server. In some embodiments, information corresponding to a virtual object corresponding to a new product is obtained from a server associated with an online shopping platform. For example, the information associated with the virtual object includes an image that may be used to render a 3D hologram of a corresponding new physical product sold on the online shopping platform. The information related to the virtual object also includes a description of the characteristics and/or specifications of the new product to which the virtual object corresponds.
At 404, a plurality of virtual objects are caused to be rendered. The virtual objects are presented in the form of 3D holograms on a display device, which a user can browse via user input (e.g., hand motion). For example, user interaction with the presented 3D virtual objects may allow a cursor to highlight any one of the 3D virtual objects.
At 406, a selection related to a presented virtual object is received. In some embodiments, a user selection gesture (e.g., an action of a user pushing with his or her hand) for selecting a particular presented 3D virtual object is determined. If a predetermined user selection gesture is detected while the cursor is highlighting a 3D virtual object, that 3D virtual object is selected for further interaction. In some embodiments, if a user selection gesture is not detected within a predetermined time after the 3D virtual object is highlighted by the cursor, the 3D virtual object is automatically selected (e.g., on the inference that the user is potentially interested in the 3D virtual object and/or that the system may have failed to detect a predetermined user selection gesture that the user attempted to perform).
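A sketch of this selection logic, assuming the tracker yields a stream of classified gesture labels; the gesture label and dwell-time value are illustrative assumptions:

```python
import time

SELECT_GESTURE = "palm_push"   # predetermined selection gesture (assumed label)
AUTO_SELECT_AFTER_S = 5.0      # predetermined dwell timeout (assumed value)

def pick_highlighted_object(highlighted_object, gesture_stream):
    """Select the highlighted 3D virtual object on an explicit selection gesture,
    or automatically after a dwell timeout in case the user's attempted
    gesture goes undetected."""
    highlight_started = time.monotonic()
    for gesture in gesture_stream:  # e.g., labels emitted by the input tracker
        if gesture == SELECT_GESTURE:
            return highlighted_object  # explicit selection
        if time.monotonic() - highlight_started >= AUTO_SELECT_AFTER_S:
            return highlighted_object  # dwell-based automatic selection
    return None
```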
One example application of a mid-air temperature adjustment output in response to user input is a holographic signage kiosk setup for displaying virtual products. Users can interact with virtual products using mid-air gestures and motions, as well as eye movements, and their hands may experience simulated warmth or coolness when they collide with the 3D virtual objects in different ways.
FIG. 5 is a flow chart illustrating an embodiment of a process for providing a mid-air temperature adjustment output in response to user input. In some embodiments, process 500 may be implemented on system 100 of fig. 1A.
At 502, user input regarding a virtual object is received. The user input includes user motion, for example, eye motion, hand motion, leg motion, foot motion, and/or head motion. In some embodiments, the virtual object is presented on a display. For example, the virtual object is an object selected by the user in process 400 of FIG. 4. For example, the display includes an autostereoscopic display. In various embodiments, if it is determined that the location of the user input collides with the virtual object in 3D space (e.g., is located within a collision zone), the user input is determined to be associated with the virtual object.
At 504, the virtual object is updated based at least in part on the user input. In some embodiments, the appearance of the virtual object is updated based on the user input. For example, the shape and/or associated animation of the virtual object is updated according to the location of the user input relative to the location (e.g., collision zone) of the virtual object in 3D space, and the updated virtual object is presented. For example, the user input may have deformed at least a portion of the virtual object because the virtual object is configured to be compressible (e.g., the virtual object is a sofa cushion). In another example, the user input may have changed the associated animation of the virtual object (e.g., the virtual object includes a showerhead with an animation of water spraying from the showerhead).
At 506, a mid-air temperature adjustment feedback is determined based at least in part on the user input. A respective mid-air temperature adjustment feedback, including at least one of wind and warmth or cold, is determined based on the user input. For example, the respective mid-air temperature adjustment feedback is generated according to the location of the user input relative to the location of at least part of the virtual object in 3D space.
At 508, the mid-air temperature adjustment feedback is output. The respective mid-air temperature adjustment feedback is output in the direction of the user input and/or the current location of the user. In some embodiments, the respective mid-air temperature adjustment feedback is output together with other mid-air feedback (e.g., mid-air haptic feedback) that is also determined based on the user input.
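Steps 502-508 can be read as one per-frame pipeline; the following sketch shows that control flow, with all object/device methods as assumed stand-ins rather than an actual API:

```python
def handle_user_input(hand_point, virtual_object, renderer, temp_device, haptic_device):
    """One pass of process 500 for a tracked hand position (illustrative)."""
    # 502: the input is associated with the object only if it collides in 3D space.
    if not virtual_object.collides_with(hand_point):
        return
    # 504: update the object's shape/animation and present it.
    virtual_object.apply_interaction(hand_point)
    renderer.present(virtual_object)
    # 506: determine temperature adjustment from the input's relative location.
    distance = virtual_object.distance_to_reference(hand_point)
    heat, wind = virtual_object.temperature_profile(distance)
    # 508: output mid-air temperature adjustment, with other mid-air feedback.
    temp_device.output(heat=heat, wind=wind, toward=hand_point)
    haptic_device.focus(at=hand_point)
```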
An example application of providing temperature adjustment feedback in response to user input is in a shopping setting, where a user may wish to interact with a virtual product before purchasing the physical version of the product. The user can view not only the product details but also the functions that the product can provide. For example, a user may wish to try and feel the pressure of a newly released water jet feature of a luxury jetted bathtub. In another example, the user may want to try and feel a different type of warm waterfall setting of a new shower product. By applying a temperature adjustment feedback system according to some embodiments described herein, a user can interact with the features of a 3D virtual product and receive multi-sensory feedback to experience a more realistic, intuitive simulation of product interactions.
Fig. 6 is a flowchart showing an example of a process for providing a mid-air temperature adjustment output in response to user input. In some embodiments, process 600 may be implemented on system 100 of fig. 1A. In some embodiments, process 400 of fig. 4 may be implemented at least in part by process 600. In some embodiments, process 500 of fig. 5 may be implemented at least in part by process 600.
At 602, a generated 3D virtual environment is presented. The 3D virtual environment includes a 3D virtual scene. In some embodiments, one or more 3D virtual objects are presented in the 3D virtual environment. The 3D virtual environment and/or 3D virtual objects are presented by a display device (e.g., an autostereoscopic display). In some embodiments, the 3D virtual environment and/or 3D virtual objects are created using a game engine (e.g., Unity). For example, the 3D virtual scene includes a bathroom, and the 3D virtual objects in the scene are different bathroom fixtures (e.g., bathtubs, showerheads, and/or toilets).
At 604, a user selection of a 3D virtual object is detected in the 3D virtual environment. The user selects a particular 3D virtual object based on, for example, detected user input matching a predetermined selection gesture. The selected 3D virtual object may be enlarged in the presentation via the display device, thereby making its features more visible to the user.
In some embodiments, the 3D virtual object comprises an animation. For example, a 3D virtual object is a combination of a bathroom accessory and a liquid (e.g., water) released from the accessory. In particular, the 3D virtual object includes a water stream flowing from a faucet, shower head, or nozzle. In some embodiments, the 3D virtual object is animated. For example, if the 3D virtual object is a water stream, the animation is that water is continually ejected from a source (e.g., faucet, shower head, or nozzle).
At 606, user input is detected. The motion detection device detects motion/input of a user (e.g., hand, eye, head, foot, and/or leg). In some embodiments, one or more points in 3D space that are related to user input are determined. For example, coordinates related to the position of the user's hand in 3D space are determined.
At 608, it is determined whether the user input collides with the 3D virtual object. If the user input collides with the 3D virtual object, control will transfer to 610. Otherwise, if the user input does not collide with the 3D virtual object, control will return to 606 to await user input.
In some embodiments, to determine whether the user input collides with the 3D virtual object, a point in 3D space associated with the user input is compared with a collision region of the 3D virtual object. For example, the collision region of the 3D virtual object may be defined as a portion of the 3D virtual object or the entire volume of the 3D virtual object. For example, if coordinates related to the position of the user's hand in 3D space overlap any defined collision region of the 3D virtual object, it is determined that there is a collision between the user input and the 3D virtual object. However, if the coordinates related to the position of the user's hand in 3D space do not overlap any defined collision region of the 3D virtual object, it is determined that there is no collision between the user input and the 3D virtual object, and the process waits for the next user input to be detected at step 606.
At 610, a measurement is determined based at least in part on the user input and the reference point. In some embodiments, the reference point is defined as part of a 3D virtual object. In some embodiments, the reference point is defined as the location of the motion detection device. For example, a distance between a location of the user input and a reference point in 3D space may be calculated.
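For example, the measurement of step 610 may simply be the Euclidean distance between the user-input point and the chosen reference point, as in this sketch:

```python
import math

# Illustrative step 610: distance between the tracked hand position and a
# reference point (e.g., part of the 3D virtual object, or the motion
# detection device's location). The coordinate values are made up.
hand_position = (0.02, 0.15, 0.00)
reference_point = (0.00, 0.30, 0.00)  # e.g., center of the showerhead face
measurement = math.dist(hand_position, reference_point)
print(f"measured distance: {measurement:.3f}")
```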
At 612, the appearance of the 3D virtual object is updated based at least in part on the measurements. In some embodiments, the 3D virtual object may be updated by changing the shape and/or animation of the 3D virtual object. For example, if the detected user's hand is close to a reference point of a 3D virtual object comprising a water flow source, the distance between the water flow source and the user's hand will decrease and the 3D virtual object may be displayed as a shorter water flow that appears to be compressed. In another example, depending on the position of the user's hand, the water flow may be shown to both collide with and flow around the object (the user's hand).
At 614, temperature adjustment feedback is generated based at least in part on the measurement. In some embodiments, the temperature adjustment feedback may be updated by turning on the heating component, changing the temperature of the heating component, turning on the fan, and/or combinations thereof. Because the heating component of the temperature adjustment feedback device cannot instantaneously generate heat (e.g., generating heat may take two to three seconds), it may be turned on before a collision between the user input and the 3D virtual object is detected, in anticipation of such a collision. For example, the heating component of the temperature adjustment feedback device may be turned on when a user selects to interact with a 3D virtual object that provides thermal feedback, or even when the 3D virtual environment is first presented (e.g., at step 604 or 602, respectively). Later, when a collision between the user's hand and the 3D virtual object is detected, the fan portion of the temperature adjustment feedback device may be turned on to direct the generated heat to the location of the user's hand (e.g., palm) to simulate the warmth (or other temperature adjustment) feedback provided by the 3D virtual object with which the user is engaged.
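One plausible way to derive heat and fan levels from the step 610 measurement is to attenuate intensity with distance from the reference point; the linear falloff and numeric ranges below are assumptions for illustration, not values from this disclosure:

```python
def temperature_feedback_from_distance(distance_m, max_distance_m=0.5):
    """Map the measured hand-to-reference distance to normalized heat and
    fan levels in [0.0, 1.0]; a closer hand receives stronger simulated warmth."""
    closeness = max(0.0, min(1.0, 1.0 - distance_m / max_distance_m))
    heat_level = closeness              # warmer as the hand nears the water source
    fan_speed = 0.3 + 0.7 * closeness   # keep some airflow to carry the heat mid-air
    return heat_level, fan_speed

# Example: hand 10 cm from the 3D virtual showerhead's reference point.
print(temperature_feedback_from_distance(0.10))  # -> approximately (0.8, 0.86)
```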
At 616, temperature regulation feedback is output. If multiple temperature adjustment feedback devices are used, temperature adjustment feedback may be provided by a combination of temperature adjustment feedback devices depending on the position of the user's hand (e.g., palm).
At 618, non-temperature dependent feedback is generated based at least in part on the measurements.
At 620, non-temperature dependent feedback is output.
In some embodiments, the mid-air haptic feedback may be updated by changing the intensity/pressure and/or the area of the mid-air focal position. For example, if the user's hand is closer to a 3D virtual object that includes a source of water flow, the mid-air haptic feedback may provide greater pressure to the user's hand (e.g., palm) than if the user's hand were farther from the source of the water flow. Further, for example, if the user's hand is closer to a reference point of a 3D virtual object that includes a source of water flow, the mid-air haptic feedback may provide pressure to a smaller contact area of the user's hand (e.g., palm) than if the user's hand were farther from the source of the water flow. If multiple mid-air haptic feedback systems are used, haptic feedback may be provided by a combination of haptic feedback systems, depending on the position of the user's hand (e.g., palm).
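The inverse relationships just described (closer hand: higher pressure, smaller focal area) might be sketched as follows, with assumed units and ranges:

```python
def haptic_feedback_from_distance(distance_m, max_distance_m=0.5):
    """Closer to the water flow source: more pressure over a smaller focal area."""
    closeness = max(0.0, min(1.0, 1.0 - distance_m / max_distance_m))
    pressure = closeness                              # normalized ultrasonic intensity
    focal_radius_m = 0.01 + 0.04 * (1.0 - closeness)  # 1 cm (near) to 5 cm (far)
    return pressure, focal_radius_m
```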
In some embodiments, audio feedback may be updated by playing different recorded audio data and/or playing at different volumes, depending on the position of the user's hand.
In some embodiments, scent-based feedback may be updated by emitting different scents according to the detected user input and/or the type of product associated with the selected 3D virtual object.
In some embodiments, the non-temperature-dependent feedback is coordinated with the temperature adjustment feedback so that the user perceives the multiple types of feedback together. For example, the mid-air haptic feedback may be output from the mid-air haptic feedback device in the same direction as, and at least partially synchronized with, the temperature adjustment feedback output from the mid-air temperature adjustment device. In a particular example, if a user performs user input on a water-spraying 3D showerhead object, mid-air heat and mid-air haptic feedback (in addition to audio feedback) may be output toward the position of the user's palm so that the user experiences a simulated sensation of warm water touching his or her palm.
In some embodiments, the type of update to the 3D virtual object and/or the type of feedback generated by the one or more feedback devices is determined based on a predetermined mapping between the current position of the user's hand and corresponding update and/or feedback rules. In some embodiments, the type of update to the 3D virtual object and/or the type of feedback generated by the one or more feedback devices is dynamically determined based on a physics-based simulation. For example, the type of update to the 3D virtual object includes how the shape and animation of the 3D virtual object change under different user inputs. For example, the type of feedback generated by the one or more feedback devices includes the duration, temperature, intensity, volume, and/or output direction of each feedback device (e.g., a mid-air temperature adjustment feedback device, a mid-air haptic feedback device).
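The "predetermined mapping" alternative could be as simple as a lookup from a coarse hand-position category to update and feedback rules (in contrast to the physics-based simulation); the table contents here are invented for illustration:

```python
# Hypothetical rule table: coarse hand zone -> (object update, feedback parameters).
FEEDBACK_RULES = {
    "under_showerhead": {"animation": "deflected_stream",
                         "heat": 0.8, "fan": 0.9, "haptic_pressure": 0.7},
    "edge_of_stream":   {"animation": "split_stream",
                         "heat": 0.4, "fan": 0.5, "haptic_pressure": 0.3},
    "outside_stream":   {"animation": "uninterrupted_stream",
                         "heat": 0.0, "fan": 0.0, "haptic_pressure": 0.0},
}

def rules_for(hand_zone):
    """Look up the predetermined update/feedback rules for a hand position zone."""
    return FEEDBACK_RULES.get(hand_zone, FEEDBACK_RULES["outside_stream"])
```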
At 622, a determination is made as to whether to continue providing the mid-air temperature adjustment output. If the mid-air temperature adjustment output is to continue being provided, control returns to 606. Otherwise, if the mid-air temperature adjustment output is not to continue being provided, process 600 ends.
For example, the mid-air temperature adjustment output is no longer provided if the system implementing process 600 is shut down and/or powered off.
Fig. 7A and 7B depict examples of 3D virtual objects displayed by a system for providing a mid-air temperature adjustment output responsive to user input. In the example of fig. 7A and 7B, system 700 includes a display device 710, an eye movement detection device 708, a hand detection device 706, a mid-air haptic feedback device 702a, a mid-air haptic feedback device 702b, a mid-air temperature adjustment device 704a, a mid-air temperature adjustment device 704b, and a host computer (not shown). In the example of system 700, the eye movement detection device 708 is embedded in the display device 710, but the hand detection device 706 is not embedded in the display device 710 and is instead disposed adjacent to the mid-air haptic feedback device 702b. As shown in fig. 7A and 7B, the mid-air haptic feedback device 702a and the mid-air temperature adjustment device 704a are disposed above the display device 710, and the mid-air haptic feedback device 702b and the mid-air temperature adjustment device 704b are disposed below the display device 710. Feedback may be provided by different combinations of the mid-air haptic feedback device 702a, the mid-air haptic feedback device 702b, the mid-air temperature adjustment device 704a, and the mid-air temperature adjustment device 704b, depending on the detected position of the user input (e.g., as detected by the eye movement detection device 708 and the hand detection device 706) and/or the orientation of the 3D virtual showerhead 712. Although not shown in fig. 7A and 7B, the host computer is connected to, and receives instructions from and transmits instructions to, the display device 710, the eye movement detection device 708, the hand detection device 706, the mid-air haptic feedback device 702a, the mid-air haptic feedback device 702b, the mid-air temperature adjustment device 704a, and the mid-air temperature adjustment device 704b.
In the example of fig. 7A, the display device 710, which is an autostereoscopic display, is displaying a 3D virtual showerhead 712. For example, the 3D virtual showerhead 712 is selected for display by a user from a menu previously displayed on the display device 710. In a particular example, a virtual bathroom scene was previously presented on the display device 710, and the user selected the 3D virtual showerhead 712 to interact with the features of that showerhead model. The 3D virtual showerhead 712 includes an animation of a water flow 716 from the showerhead. For example, the animation of the water flow 716 of the 3D virtual showerhead 712 may be modeled based on how water is actually ejected from the physical showerhead on which the 3D virtual showerhead 712 is based. As shown in fig. 7A, without any user interaction with the 3D virtual showerhead 712, the animation of the water flow 716 flows uninterrupted because it does not collide with any objects. As described below with reference to fig. 7B, user inputs (e.g., gestures and/or eye movements) detected by the system 700 may cause the 3D virtual showerhead 712 to be presented differently and/or cause the system 700 to output feedback via at least one feedback device (e.g., the mid-air haptic feedback device 702a, the mid-air haptic feedback device 702b, the mid-air temperature adjustment device 704a, and the mid-air temperature adjustment device 704b).
In the example of fig. 7B, a user hand 714 is placed "under" the 3D virtual showerhead 712 to "touch" the animated water flowing from the 3D virtual showerhead 712. The pose and position of the user hand 714 (in the 3D space in which the 3D virtual showerhead 712 is presented) are detected by hand detection device 706 (and, in some embodiments, the user's eye movement is detected by eye movement detection device 708). For example, the current position of the user hand 714 is detected to collide with a defined collision zone of the 3D virtual showerhead 712 and its accompanying water animation. The host computer then measures the length/distance between the current position of the user hand 714 and the position of hand detection device 706 (or a reference point on the 3D virtual showerhead 712, e.g., the center of the face of the 3D virtual showerhead 712). The measured length/distance is used by the host computer to update the water animation exiting the 3D virtual showerhead 712: water flow animation 720 now appears to collide with the detected object (user hand 714), deforming and flowing around it rather than flowing uninterrupted as in fig. 7A.
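A minimal sketch of the collision check and length/distance measurement described above, assuming a spherical collision zone for simplicity (the disclosure does not specify the zone's geometry); all function names are illustrative.

```python
# Hypothetical collision/distance helpers; the spherical collision zone
# is a simplifying assumption.

import math

def distance(a, b):
    """Euclidean distance between two 3D points (x, y, z)."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))

def hand_collides(hand_pos, zone_center, zone_radius):
    """True if the detected hand position falls inside the collision zone
    defined around the showerhead and its water animation."""
    return distance(hand_pos, zone_center) <= zone_radius

def measure_hand_offset(hand_pos, reference_point):
    """Length/distance between the detected hand position and a reference
    point (e.g., hand detection device 706 or the showerhead face center)."""
    return distance(hand_pos, reference_point)
```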
In addition, the measured length/distance is used by the host computer to cause mid-air haptic feedback device 702a to provide haptic feedback 722 downward to the user hand 714 and mid-air temperature adjustment device 704a to provide heated air feedback 718 downward to the user hand 714. The focal position of the (e.g., ultrasonic) haptic feedback 722 is determined by the host computer to be near or at the current position of the user hand 714 (e.g., based on the measured length/distance) and/or based on a known water pressure associated with the physical version of the 3D virtual showerhead 712. The heat and/or airflow intensity of heated air feedback 718 is likewise determined by the host computer based on the current position of the user hand 714 (e.g., based on the measured length/distance). The combination of haptic feedback 722 and heated air feedback 718 on the user hand 714 simulates the feel of warm water spraying from the 3D virtual showerhead 712 and contacting the palm of the user hand 714. For example, the feedback is caused to be output by mid-air haptic feedback device 702a and mid-air temperature adjustment device 704a, both of which are directed downward, because the 3D virtual showerhead 712 points downward, so the user's palm is expected to face upward to contact warm water flowing down from the 3D virtual showerhead 712. In a different example, water is ejected upward from a 3D virtual object (e.g., a spout in a hot tub), causing feedback to be output by mid-air haptic feedback device 702b and mid-air temperature adjustment device 704b, both of which are directed upward, because the user's palm is expected to face downward to contact the warm water flowing up from the spout. In addition to haptic feedback 722 and heated air feedback 718, additional feedback, such as, for example, water sounds and/or fragrances, may be output by system 700 from corresponding feedback devices (not shown) for an immersive simulated experience.
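The mapping from measured distance to device commands might look like the following sketch; the device APIs (set_focal_point, activate_heater, activate_fan), the temperature set point, the warm-up delay, and the intensity scaling are all assumptions for illustration. Warming the air before spinning up the fan mirrors the staged activation recited in claim 4 below.

```python
# Hypothetical conversion of hand position/distance into feedback
# commands; device methods and constants are assumed.

import time

def drive_feedback(haptic_dev, temp_dev, hand_pos, dist, water_pressure):
    # Focus the ultrasonic haptic output at or near the hand position,
    # with pressure derived from the physical showerhead's known pressure.
    haptic_dev.set_focal_point(hand_pos)
    haptic_dev.set_intensity(water_pressure / (1.0 + dist))

    # Heat first, then blow, so the first airflow felt is already warm.
    temp_dev.activate_heater(target_temp_c=38.0)  # assumed set point
    time.sleep(0.5)                               # assumed warm-up lag
    temp_dev.activate_fan(speed=1.0 / (1.0 + dist))
```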
As the user hand 714 moves, the focal position and/or pressure of haptic feedback 722 and the heat and/or airflow intensity of heated air feedback 718 may be updated accordingly by the host computer based on the current position of the user hand 714, to better simulate the varying degrees of water pressure and warmth the user would experience when using the physical version of the 3D virtual showerhead 712.
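Putting the pieces together, a tracking loop that refreshes the feedback as the hand moves might look like this sketch, reusing the hypothetical helpers above; frame pacing, attribute names, and error handling are omitted or assumed.

```python
# Hypothetical tracking loop; reuses the assumed helpers sketched above.

def tracking_loop(system, reference_point, water_pressure):
    while system.is_running():
        hand_pos = system.hand_detector.current_position()
        if hand_pos is None:
            continue  # no hand in the interaction volume
        if hand_collides(hand_pos, system.zone_center, system.zone_radius):
            dist = measure_hand_offset(hand_pos, reference_point)
            system.deform_water_animation(around=hand_pos)
            drive_feedback(system.haptic_dev, system.temp_dev,
                           hand_pos, dist, water_pressure)
```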
Embodiments are disclosed that provide a mid-air temperature adjustment output responsive to user input. User input is detected with respect to a virtual object. Mid-air temperature adjustment feedback is provided (sometimes in addition to mid-air haptic feedback) in response to the detected user input, giving the user a temperature adjustment (e.g., including cold, heat, and/or airflow) that simulates what would be emitted by/from the virtual object, so that the user can have an immersive virtual experience without direct contact with a physical object.
Although the foregoing embodiments have been described in some detail for purposes of clarity of understanding, the invention is not limited to the details provided. There are many alternative ways of implementing the invention. The disclosed embodiments are illustrative and not restrictive.
Claims (20)
1. A system, comprising:
a motion detection device configured to detect user input regarding a virtual object;
a display device configured to present the virtual object;
a temperature adjustment device; and
a processor coupled to the motion detection device, the display device, and the temperature adjustment device, the processor configured to:
obtain, from a server of an online shopping platform, information on the virtual object corresponding to a product sold on the online shopping platform, wherein the information comprises characteristics of the product corresponding to the virtual object and an image usable to render the virtual object;
control display of the virtual object on the display device according to the image;
determine a temperature adjustment based at least in part on the characteristics of the product corresponding to the virtual object and the user input regarding the virtual object; and
cause the temperature adjustment device to output the temperature adjustment, so that a user may interact with the virtual object of the product prior to purchasing a physical version of the product.
2. The system of claim 1, wherein the determining the temperature adjustment based at least in part on the user input regarding the virtual object comprises:
determining a measurement based at least in part on the user input and a reference point associated with the motion detection device;
determining, using the measurement, instructions related to generating the temperature adjustment; and
sending the instructions related to generating the temperature adjustment to the temperature adjustment device.
3. The system of claim 1, wherein the temperature adjustment device comprises one or more of a fan, a heating assembly, and a cooling assembly.
4. The system of claim 3, wherein the causing the temperature adjustment device to output the temperature adjustment comprises:
sending, at a first time, a first instruction to the temperature adjustment device to activate the heating assembly; and
sending, at a second time, a second instruction to the temperature adjustment device to activate the fan, wherein the second time lags the first time.
5. The system of claim 1, further comprising:
a haptic feedback device configured to generate haptic feedback;
wherein the processor is coupled to the haptic feedback device and is further configured to control the haptic feedback device in response to the user input.
6. The system of claim 5, wherein the haptic feedback device comprises an ultrasonic haptic display configured to emit ultrasonic waves to generate haptic sensations at one or more focal points using a plurality of transducers.
7. The system of claim 5, wherein the controlling the haptic feedback device in response to the user input comprises:
determining a measurement based at least in part on the user input and a reference point associated with the motion detection device;
determining, using the measurement, instructions related to generating the haptic feedback; and
sending the instructions related to generating the haptic feedback to the haptic feedback device.
8. The system of claim 5, wherein the processor is configured to control the temperature adjustment device to output the temperature adjustment at least partially in synchronization with the haptic feedback device outputting the haptic feedback.
9. The system of claim 1, wherein:
the processor is further configured to determine whether the user input collides with the virtual object; and
the processor is configured to determine the temperature adjustment based at least in part on the user input regarding the virtual object and based at least in part on whether the user input collides with the virtual object.
10. The system of claim 9, wherein the display device comprises an autostereoscopic display device.
11. The system of claim 1, wherein the user input is obtained without requiring physical contact between the user and a physical object.
12. A method, comprising:
obtaining, from a server of an online shopping platform, information on a virtual object corresponding to a product sold on the online shopping platform, wherein the information comprises characteristics of the product corresponding to the virtual object and an image usable to render the virtual object;
displaying the virtual object according to the image;
updating the virtual object based at least in part on user input;
determining a temperature adjustment based at least in part on the characteristics of the product corresponding to the virtual object and the user input; and
outputting the temperature adjustment, so that a user can interact with the virtual object of the product before purchasing a physical version of the product.
13. The method of claim 12, wherein the updating the virtual object comprises updating at least one of a shape and an animation associated with the virtual object.
14. The method of claim 12, wherein the determining the temperature adjustment based at least in part on the user input comprises:
determining a measurement based at least in part on the user input and a reference point; and
determining the temperature adjustment using the measurement.
15. The method of claim 14, wherein the reference point is associated with a portion of the virtual object.
16. The method of claim 14, wherein the reference point is related to a location of a motion detection device that detects the user input.
17. The method of claim 12, wherein information on a plurality of virtual objects is obtained from the server of the online shopping platform, the method further comprising:
causing the plurality of virtual objects to be presented; and
receiving a selection of a presented virtual object.
18. The method of claim 12, further comprising determining that the user input collides with a collision zone associated with the virtual object.
19. The method of claim 12, wherein the temperature adjustment comprises airflow and at least one of heat adjustment and cold adjustment.
20. The method of claim 12, further comprising:
determining haptic feedback based at least in part on the user input; and
outputting the haptic feedback.
Applications Claiming Priority (6)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| US201962860687P | 2019-06-12 | 2019-06-12 | |
| US62860687 | 2019-06-12 | ||
| US62/860,687 | 2019-06-12 | ||
| US16/681,629 US20200393156A1 (en) | 2019-06-12 | 2019-11-12 | Temperature adjustment feedback system in response to user input |
| US16/681,629 | 2019-11-12 | ||
| US16681629 | 2019-11-12 |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| CN112083798A (en) | 2020-12-15 |
| CN112083798B (en) | 2025-01-07 |
Family
ID=73735864
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202010451233.9A (Active) | Temperature regulation feedback system responsive to user input | 2019-06-12 | 2020-05-25 |
Country Status (1)
| Country | Link |
|---|---|
| CN (1) | CN112083798B (en) |
Family Cites Families (13)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US8009022B2 (en) * | 2009-05-29 | 2011-08-30 | Microsoft Corporation | Systems and methods for immersive interaction with virtual objects |
| US9098873B2 (en) * | 2010-04-01 | 2015-08-04 | Microsoft Technology Licensing, Llc | Motion-based interactive shopping environment |
| JP2011227672A (en) * | 2010-04-19 | 2011-11-10 | Sharp Corp | Input device, information processor and information processing system |
| US9092953B1 (en) * | 2012-01-24 | 2015-07-28 | Bruce J. P. Mortimer | System and method for providing a remote haptic stimulus |
| US9367136B2 (en) * | 2013-04-12 | 2016-06-14 | Microsoft Technology Licensing, Llc | Holographic object feedback |
| US11341566B2 (en) * | 2014-10-13 | 2022-05-24 | Kimberly-Clark Worldwide, Inc. | Systems and methods for providing a 3-D shopping experience to online shopping environments |
| US9658693B2 (en) * | 2014-12-19 | 2017-05-23 | Immersion Corporation | Systems and methods for haptically-enabled interactions with objects |
| US20190079480A1 (en) * | 2015-03-17 | 2019-03-14 | Whirlwind VR, Inc. | System and Method for Delivery of Variable Flow Haptics in an Immersive Environment with Latency Control |
| GB2570073A (en) * | 2016-10-27 | 2019-07-10 | Walmart Apollo Llc | Systems and methods for conserving computing resources during an online or virtual shopping session |
| CN106598215B (en) * | 2016-11-02 | 2019-11-08 | Tcl移动通信科技(宁波)有限公司 | The implementation method and virtual reality device of virtual reality system |
| CN107307970A (en) * | 2017-08-10 | 2017-11-03 | 湖州健凯康复产品有限公司 | A kind of body feeling interaction massage armchair |
| CN108073285B (en) * | 2018-01-02 | 2021-05-18 | 联想(北京)有限公司 | Electronic equipment and control method |
| US20200201437A1 (en) * | 2018-12-21 | 2020-06-25 | Immersion Corporation | Haptically-enabled media |
Also Published As
| Publication number | Publication date |
|---|---|
| CN112083798A (en) | 2020-12-15 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US20200393156A1 (en) | Temperature adjustment feedback system in response to user input | |
| JP6906580B6 (en) | Viewport-based augmented reality tactile effects systems, methods and non-transitory computer-readable media | |
| US12154234B2 (en) | Devices, methods, and graphical user interfaces for interacting with three-dimensional environments | |
| Araujo et al. | Snake charmer: Physically enabling virtual objects | |
| US11366512B2 (en) | Systems and methods for operating an input device in an augmented/virtual reality environment | |
| JP6453448B2 (en) | 3D Contextual Feedback | |
| TW201816554A (en) | Interaction method and device based on virtual reality | |
| KR20170081225A (en) | Sensory feedback systems and methods for guiding users in virtual reality environments | |
| JP2013025789A (en) | System, method and program for generating interactive hot spot of gesture base in real world environment | |
| US12039858B2 (en) | Haptic feedback generation | |
| US20220043518A1 (en) | Touch Enabling Process, Haptic Accessory, and Core Haptic Engine to Enable Creation and Delivery of Tactile-Enabled Experiences with Virtual Objects | |
| US20200201437A1 (en) | Haptically-enabled media | |
| JP2019519856A (en) | Multimodal haptic effect | |
| CN111752389B (en) | Interactive system, interactive method and machine-readable storage medium | |
| CN108509043B (en) | Interaction control method and system | |
| CN112083798B (en) | Temperature regulation feedback system responsive to user input | |
| US8799825B2 (en) | 3D interface apparatus and interfacing method using the same | |
| Hosoi et al. | Vwind: Virtual wind sensation to the ear by cross-modal effects of audio-visual, thermal, and vibrotactile stimuli | |
| TWI754899B (en) | Floating image display apparatus, interactive method and system for the same | |
| De Pra et al. | Infrared vs. ultrasonic finger detection on a virtual piano keyboard | |
| US20170083952A1 (en) | System and method of markerless injection of 3d ads in ar and user interaction | |
| US12141904B1 (en) | Controlling joints using learned torques | |
| TW202046251A (en) | Method and system for displaying online content using stereoscopic images | |
| CN119987226A (en) | Smart home control method, device, control equipment and storage medium | |
| WO2025186962A1 (en) | Information processing device, method, and program |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| PB01 | Publication | | |
| SE01 | Entry into force of request for substantive examination | | |
| GR01 | Patent grant | | |