
HK1180126A - Wireless communication techniques - Google Patents


Info

Publication number
HK1180126A
HK1180126A HK13107032.0A
Authority
HK
Hong Kong
Prior art keywords
display
devices
wireless
frame
communication
Application number
HK13107032.0A
Other languages
Chinese (zh)
Inventor
R. G. Fleck
J. A. Perry
P. S. Hoang
Original Assignee
Microsoft Technology Licensing, LLC
Application filed by Microsoft Technology Licensing, LLC
Publication of HK1180126A

Description

Wireless communication technology
RELATED APPLICATIONS
This application claims priority under 35 U.S.C. § 119(e) to United States Provisional Patent Application No. 61/430,639 (Attorney Docket No. 331722.01), filed January 7, 2011, and United States Provisional Patent Application No. 61/431,312 (Attorney Docket No. 332027.01), filed January 10, 2011, the entire disclosure of each of which is incorporated herein by reference.
Technical Field
The present invention relates to wireless communication technology.
Background
Wireless communications continue to grow in popularity. Initially, wireless communication technologies were used by computing devices such as traditional desktop and laptop computers to communicate locally with each other and remotely via the internet. The use of these techniques was then extended to a wide variety of other devices, such as game consoles, input devices (e.g., keyboards and mice), printers, and so forth.
However, as such use expands, conventional techniques for performing wireless communication face various difficulties. For example, the widespread popularity of these techniques may result in interference between the devices that use them, limiting their usefulness to each of those devices. Moreover, these techniques may consume relatively large amounts of power to overcome such interference, which may limit their usefulness for battery-powered mobile devices and cause further interference.
Disclosure of Invention
Wireless communication techniques are described. In one or more implementations, techniques are described that involve active power control such that a device may bypass the use of a power amplifier to communicate wirelessly. In one or more additional implementations, wireless communication techniques are described in which a transmitting device utilizes one or more buffers and/or streams on a receiving device. In one or more further implementations, a receiving device is configured to adjust a display based on wireless communications received from a plurality of devices.
Further, wireless communication techniques are described that may utilize multiple bands to provide wireless communication. Additionally, in one or more implementations, wireless communication techniques are described in which a transmitting device may employ codec adaptation. Still further, in one or more implementations, wireless communication techniques are described that may be used to change characteristics of a channel used to communicate data.
This summary is provided to introduce a selection of concepts in a simplified form that are further described below in the detailed description. This summary is not intended to identify key features or essential features of the claimed subject matter, nor is it intended to be used as an aid in determining the scope of the claimed subject matter.
Drawings
The detailed description describes embodiments with reference to the drawings. In the drawings, the left-most digit(s) of a reference number identifies the drawing in which the reference number first appears. The use of the same reference numbers in different instances in the description and the figures may indicate similar or identical items.
Fig. 1 is an illustration of an environment in an example implementation that is operable to employ short-range and medium-range wireless communication techniques.
Fig. 2 depicts a system in an example implementation in which the computing device of fig. 1 is configured to bypass and disable a power amplifier for short-range wireless communication.
Fig. 3 is a flow diagram depicting a procedure in an example implementation of a wireless communication technique in connection with controlling a power amplifier to transmit data over different ranges.
Fig. 4 depicts a system in an example implementation in which a transmitting device leverages the presence of buffers and/or use of data streams on a receiving device.
Fig. 5 is a flow diagram depicting a procedure in an example implementation of a wireless communication technique in relation to a wireless buffering and streaming technique.
Fig. 6 depicts a system in an example implementation in which display technology is employed in a wireless environment.
Fig. 7 is a flow diagram depicting a procedure in an example implementation of a wireless communication technology in relation to a wireless display technology for content received from multiple devices.
Fig. 8 illustrates an example system in which dual-band functionality of a wireless device is utilized to provide wireless communication utilizing two bands.
Fig. 9 is a flow diagram depicting a procedure in an example implementation of a wireless communication technology relating to a communication function utilizing multiple bands.
Fig. 10 depicts a system in an example implementation in which wireless encoding and decoding techniques are employed.
Fig. 11 is a flow diagram depicting a procedure in an example implementation of a wireless communication technology in relation to wireless encoding and decoding technologies.
Detailed Description
Overview
Devices employing wireless communication are increasing in popularity. Accordingly, conventional techniques for providing wireless communication may face an ever-increasing amount of interference between these devices, which may limit the usefulness of the communication techniques.
Wireless communication techniques are described. In one or more implementations, techniques are described that involve active power control such that a device may bypass and disable a power amplifier to communicate wirelessly. These techniques may be employed for short-range and medium-range direct and indirect communications, such as Bluetooth, Wi-Fi (e.g., IEEE 802.11), WiMAX, and so on, but are not so limited. In this way, the power consumption of the device may be reduced in situations where the use of a power amplifier may be avoided, further discussion of which may be found in relation to fig. 2 and 3.
In one or more implementations, wireless communication techniques are described in which a transmitting device utilizes one or more frame buffers and/or streams on a receiving device. For example, a transmitting device may determine that the next frame matches a frame that has already been transmitted to a receiving device, and may then place the portion of the device used to transmit frames into a sleep mode until a "new" frame is to be transmitted. In this manner, power usage and network interference of the transmitting device may be reduced, further discussion of which may be found in relation to fig. 4 and 5.
In one or more implementations, a receiving device is configured to adjust a display based on wireless communications received from a plurality of devices. For example, the receiving device may be configured as a television set. A television set may receive wireless communications from a number of different devices (e.g., mobile phones). The television may then divide the display to display video from each of these devices. In addition, the video transmitted by these devices may be configured according to how it is to be displayed by the display device, such as by adjusting the resolution and/or aspect ratio to match the portion of the video to be displayed. Further discussion of these techniques may be found in relation to fig. 6 and 7.
In one or more implementations, wireless communication techniques are described that may provide wireless communication using multiple bands. For example, wireless communication may support both the 2.4GHz band and the 5.0GHz band. The device may be configured to communicate with other devices using two bands, such as simultaneously using the 2.4GHz band to communicate control information and the 5.0GHz band to communicate data payloads, further discussion of which may be found in relation to fig. 8 and 9.
In one or more implementations, depending on the content type of a particular frame, the codec type may be changed to a type that is more suitable for handling the current information type. This may be performed by utilizing a number of processing techniques, such as frequency profiling, frequency gradients, temporal variations, edge change detection, and other video and image processing algorithms. Further, this may be performed as part of a decision tree to select an appropriate compression (e.g., codec) for use with the video frames, further discussion of which may be found in relation to fig. 10 and 11.
In one or more implementations, wireless communication techniques are also described that may be used to change characteristics of a channel used to communicate data. For example, the transmitting device may detect noise and renegotiate a new channel with the receiving device, thereby saving power due to increased cleanliness of the new channel and less data retransmission. Other techniques are also contemplated such as dynamically adjusting compression ratios, delta amounts, changes from one codec to another, beamforming, FEC (forward error correction), etc., further discussion of which may be found in relation to fig. 10 and 11.
In one or more implementations, wireless communication techniques are described in which a transmitting device employs codec adaptation. For example, the sending device may determine whether the receiving device supports video in the current format. If so, the sending device may transmit the video without decoding it. If not, the sending device may transcode the video. In this manner, the transmitting device may reserve resources that would otherwise be used to unnecessarily decode the video, further discussion of which may be found in relation to fig. 10 and 11.
In the discussion that follows, an example environment is described that is operable to perform the techniques described herein. Example processes are also described that may be performed in the example environment or elsewhere. Accordingly, the example environment is not limited to execution of the example processes, but rather, the example processes are not limited to execution in the example environment.
Example Wireless Environment
Fig. 1 is an illustration of an environment 100 in an example implementation that is operable to employ wireless communication techniques described herein. The illustrated environment 100 includes an access point 102, a computing device 104, and another computing device 106 communicatively coupled via a wireless network 108.
The computing devices 104, 106 may be configured in various ways. For example, the computing devices 104, 106 may be configured as computers capable of communicating over the wireless network 108, such as desktop computers, mobile workstations, entertainment devices, tablet computers, set-top boxes communicatively coupled to display devices, wireless telephones, game consoles, digital televisions, and so forth. Thus, the computing devices 104, 106 may range from full resource devices with substantial memory and processor resources (e.g., personal computers, game consoles) to low-resource devices with limited memory and/or processing resources (e.g., traditional set-top boxes, hand-held game consoles, "dumb" digital televisions with limited functionality).
The computing devices 104, 106 may also include entities (e.g., software) that cause hardware of the computing devices 104, 106 to perform operations, such as processors, functional blocks, and so forth. For example, the computing devices 104, 106 may include computer-readable media that may be configured to hold instructions that cause the respective computing devices, and in particular the hardware of the computing devices 104, 106, to perform operations. Thus, the instructions are used to configure hardware to perform operations and in this way cause hardware transformations to perform functions. The instructions may be provided by the computer-readable medium to the computing devices 104, 106 through a variety of different configurations.
One such computer-readable medium configuration is a signal bearing medium and thus is configured to transmit instructions (e.g., as a carrier wave) to the hardware of the computing device, e.g., over network 108. The computer-readable medium may also be configured as a computer-readable storage medium and thus is not a signal bearing medium. Examples of computer readable storage media include Random Access Memory (RAM), Read Only Memory (ROM), optical disks, flash memory, hard disk memory, and other storage devices that may use magnetic, optical, and other technologies for storing instructions and other data.
Although a single wireless network 108 is shown, the network may assume a variety of configurations and may be configured to include multiple networks, e.g., to support direct and/or indirect communication, to comply with different standards, and so forth. For example, the wireless network 108 may be configured for short range communications, such as those typically employed at distances of 10 meters. For example, short-range communications may be configured to support direct and/or indirect communications within a room or between adjacent rooms of a structure, such as a typical user's house.
The wireless network 108 may also be configured for mid-range communications, such as according to Wi-Fi (e.g., IEEE 802.11) for distances up to about 300 meters, WiMAX (e.g., IEEE 802.16) for distances up to about 1 kilometer, and so forth. These standards allow a variety of different computing devices (e.g., laptops, phones, gaming consoles, and consumer electronics devices) to connect to the access point 102 and/or directly with each other to allow mobile transfer of various content, such as web content, media content, email, messaging, and various other data types. For example, most medium and high-end mobile communication devices may utilize Wi-Fi to enable rich browsing, increased application functionality, and data-oriented communication. Thus, in each of these short-range and medium-range communication examples, the wireless network 108 is not a wireless telephone network (e.g., a cellular network) typically used for telephonic communications, although such implementations are also contemplated.
Access point 102 and computing devices 104, 106 are each shown to include a respective communication module 110, 112, 114. The communication modules 110, 112, 114 represent functionality for the respective devices to communicate over the wireless network 108. For example, the communication modules 110, 112, 114 may represent functionality operable to encode data for transmission and decode data received by the device in accordance with one or more of the standards described above. The functionality may also relate to techniques that may be used to manage communications (such as to negotiate channels, resolve conflicts, etc.).
As described above, various devices employing wireless communication technologies are increasing, such as laptop computers, digital televisions, smart phone platforms, compact disc players, and so forth. Some of these devices may also employ a set of standards (e.g., from the Digital Living Network Alliance) to allow device discovery and connectivity, media file browsing, and the exchange of digital media such as photos, music, and videos.
Accordingly, a variety of different techniques may be used for communicating via the wireless network 108. For example, access point 102 may be used to communicate such that computing device 104 communicates data through access point 102 for receipt by computing device 106. Direct communication between the computing devices 104, 106 that does not involve using the access point 102 or other devices as an intermediary may also be supported.
For example, direct communication (e.g., Wi-Fi direct) may be used to avoid expensive dual-path connections (e.g., up to access point 102 and down from access point 102) for devices, where a predominantly point-to-point connection may be employed, for example, when computing devices 104, 106 are within range of each other. This allows the data type (e.g., video) to be transmitted directly from a transmitting device (e.g., computing device 104, shown as a smartphone) to a receiving device, e.g., to computing device 106, shown as a digital television. Thus, wireless network 108 may also represent communications that do not involve access point 102. In one implementation, computing device 104 may also communicate with access point 102 for web-based content, while data from a telephone to a digital television is not transmitted to access point 102.
Other standards related to Wi-Fi display may be employed in the environment 100. For example, information such as video that may be displayed by a display device, such as the digital television shown, may be transmitted in compliance with an uncompressed standard (e.g., WiGig) or a compressed standard (e.g., 802.11n). Wi-Fi display opens up a multitude of opportunities beyond traditional media types, such as web pages, games, messaging, and so on. It may also be used to allow a source device (e.g., computing device 104) to control a target display (e.g., computing device 106), allowing for a predictable and consistent user experience. For example, the source device (and possibly multiple devices) may be used to drive each pixel on the target device, some of which may be assigned to the source device, and so on as further described in the discussion below.
In the following sections, various techniques for improving power consumption and overall wireless network quality are discussed. Examples of such techniques include: "backing off" transmit power to the minimum level that can successfully drive a wireless display device, avoiding the use of common frequencies for wireless display and access point transactions, dynamically changing codec types and/or parameters, and avoiding transmission of redundant data between devices. Further discussion of these and other techniques may be found in relation to the following sections.
Generally, any of the functions described herein can be implemented using software, firmware, hardware (e.g., fixed logic circuitry), manual processing, or a combination of these implementations. The terms "module" and "functionality" as used herein generally represent hardware, software, firmware, or a combination thereof. In the case of a software implementation, the module, functionality, or logic represents instructions and hardware that performs operations specified by the hardware, e.g., one or more processors, functional blocks, and/or application specific integrated circuits.
Power amplifier technology
Fig. 2 depicts a system 200 in an example implementation in which the computing device 104 of fig. 1 is configured to bypass and disable a power amplifier for wireless communication. The communication module 112 of the computing device 104 is shown to include a communication manager module 202, a power amplifier 204, an antenna 206, a power supply 208, and a switch 210.
In the illustrated example, Wi-Fi communication between the computing device 104 and the access point 102 may occur over a wide range because the access point 102 may be located in a different room, on a different floor of a house, and so on. Thus, to drive high-bandwidth communications in such situations, the output power from the access device may be relatively high. However, other situations may be encountered, such as when the communicating devices (e.g., the computing device 104 and the other computing device 106) are located relatively close to each other, as shown. For example, in a wireless display scenario involving direct communication between devices over relatively short distances (e.g., less than five meters), much lower power may be used than when operating across relatively long distances, while still operating at high bandwidth, e.g., in a high Quadrature Amplitude Modulation (QAM) configuration. Thus, reducing the RF coverage area of a particular wireless communication system allows for greater frequency and/or channel reuse densities.
Accordingly, in this example, the communication module 112 may adapt to changes in range by measuring the wireless link quality and adjusting the power output control on a periodic basis (such as on a packet-by-packet basis, at predefined intervals, and so on). For example, the communication module 112 may use pilot data sent during the Wi-Fi handshake of each packet data transmission, RSSI signal strength information, packet error status used to adjust rates, and so on. In one or more implementations, one or more of these data items may be used to reduce the output power by 10 dB or more when the destination device is at relatively close range (e.g., less than five meters).
In addition to power control, power consumption may be further reduced by bypassing the power amplifier 204. Although the power amplifier 204 is shown as built in to the device, it may be configured as an external amplifier, and thus the switch 210 may be deployed internally or externally. For example, in some cases the power amplifier 204 employed by the computing device 104 may consume high static power even with low output power requirements, such as when a class AB amplifier is used. Accordingly, the communication manager module 202 may employ the switch 210 to allow the power amplifier 204 to be used in situations involving a relatively large range, such as when the access point 102 is at a large distance. The communication manager module 202 may drive the antenna 206 directly for Wi-Fi Direct situations (e.g., where the computing device 106 is located relatively close) and use the switch 210 to bypass the power amplifier 204 to save power, which may be particularly useful in mobile applications but may also be useful in other applications. For example, for power amplifiers that consume a base load current, these techniques may avoid substantial power consumption (even at low power levels), such as by employing a switch 210 that may disconnect the power amplifier from the power supply 208, e.g., by "turning off" the power supply rail to the power amplifier 204. Thus, the RF transmit power can be reduced to a minimum power level sufficient to support the required link quality, which consequently reduces the RF coverage area of the communication system.
Moreover, using relatively low power may also result in a reduction of noise in the wireless network 108, further saving power for both the device and other devices. For example, in offices and other high-density environments, disruption to other devices within range of devices engaged in wireless communication may be reduced by having each device reduce the amount of output power used. Thus, multiple devices in the 2.4GHz/5GHz unlicensed bands may share a single frequency if the reduced RF power footprint is small enough to create a low noise floor for other devices in an office environment. For example, the wireless display function may generate significant in-band noise due to bandwidth utilization, and thus reducing the amount of power will reduce the Radio Frequency (RF) coverage area and further minimize disruption to devices on the same frequency. The smaller the RF coverage area achieved, the more efficient the non-overlapping channel reuse, and the greater the density of wireless devices that can be achieved on the same frequency.
Fig. 3 depicts a procedure 300 in an example implementation of a wireless communication technique relating to controlling a power amplifier to transmit data. The following discussion describes various techniques that may be implemented utilizing the above-described systems and devices. Aspects of each of the procedures may be implemented using hardware, firmware, or software, or a combination thereof. The process is illustrated as a set of blocks that specify operations performed by one or more devices and is not necessarily limited to the orders shown for performing the operations by the blocks. In portions of the following discussion, reference will be made to the environment 100 of FIG. 1 and the system 200 of FIG. 2.
The transmitting device detects whether the communication between the receiving device and the transmitting device meets a predefined link quality (block 302). For example, the detection may be based at least in part on an error rate or scanning of one or more wireless channels during one or more non-transmission periods. Further, the detection may be performed on a per packet basis, at predefined intervals (e.g., based on the number of packets sent, the passage of a predefined amount of time), and so forth.
In response to determining that the communication meets the predefined link quality, a power amplifier of the transmitting device is bypassed to transmit the wireless communication to be received by the receiving device (block 304). For example, the determination may be based on the detection described above. If the receiving device is determined to be within range based on the predefined link quality, the communication manager module 202 may use the switch 210 or another technique to bypass the power amplifier 204 and disable the power amplifier 204 such that the power supply rail from the power supply 208 (e.g., battery, "plugged-in" source, etc.) to the power amplifier 204 is disabled. The communication manager module 202 may then communicate directly with the antenna 206 to transmit the wireless communication (e.g., one or more packets) without the assistance of the power amplifier 204. In this way, the transmitting device can reduce the amount of power used to perform wireless communications, reduce the amount of interference caused by wireless communications with other wireless devices that would otherwise be within interference range, and so forth.
However, in response to determining that the communication does not conform to the predefined link quality, a wireless communication to be received by the receiving device is transmitted using a power amplifier of the transmitting device (block 306). Thus, in this case, the transmitting device may use the additional operating range afforded by using the power amplifier 204, such as to comply with the operating range of the IEEE standard.
Accordingly, the receiving device may receive, from the transmitting device, a wireless communication transmitted without using a power amplifier of the transmitting device when the communication meets the predefined link quality (block 308). Thus, the power used by the transmitting device may be conserved and interference caused by the communication reduced. Additionally, the receiving device may receive, from the transmitting device, a wireless communication transmitted using a power amplifier of the transmitting device when the communication does not conform to the predefined link quality (block 310). In this way, the operating range of the computing device may be extended if the device is not within the predefined range.
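As an illustration only, the following minimal Python sketch models the bypass decision of blocks 302-306; the link-quality thresholds, the RSSI/packet-error inputs, and the PowerAmplifierPath interface standing in for the switch 210 are assumptions made for this sketch and are not part of the described implementation.

```python
# Hypothetical sketch of the power-amplifier bypass decision (figs. 2 and 3).
# Threshold values and interfaces are illustrative assumptions only.

from dataclasses import dataclass


@dataclass
class LinkQuality:
    rssi_dbm: float           # received signal strength reported for the link
    packet_error_rate: float  # fraction of packets in error over the last interval


class PowerAmplifierPath:
    """Stand-in for the switch 210 that routes the transmit chain around the amplifier."""

    def __init__(self):
        self.amplifier_enabled = True

    def bypass_and_disable(self):
        # Disable the supply rail to the amplifier and drive the antenna directly.
        self.amplifier_enabled = False

    def enable(self):
        self.amplifier_enabled = True


def select_transmit_path(path: PowerAmplifierPath, quality: LinkQuality,
                         rssi_threshold_dbm: float = -50.0,
                         per_threshold: float = 0.01) -> None:
    """Bypass the power amplifier when the link already meets a predefined quality."""
    meets_quality = (quality.rssi_dbm >= rssi_threshold_dbm and
                     quality.packet_error_rate <= per_threshold)
    if meets_quality:
        path.bypass_and_disable()   # block 304: transmit without the amplifier
    else:
        path.enable()               # block 306: use the amplifier for extra range


if __name__ == "__main__":
    # Example: re-evaluate on a per-packet basis or at predefined intervals.
    path = PowerAmplifierPath()
    select_transmit_path(path, LinkQuality(rssi_dbm=-42.0, packet_error_rate=0.001))
    print("amplifier enabled:", path.amplifier_enabled)
```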
Wireless buffering and streaming techniques
Fig. 4 depicts a system 400 in an example implementation in which a sending device utilizes the presence of buffers on a receiving device.
In this illustrated example, the computing device 104 sends data (e.g., streams the data) to the other computing devices 106 via the wireless connection. The communication modules 112, 114 of the respective computing devices 104, 106 are shown in more detail as including respective communication manager modules 402, 404 and antennas 406, 408. The communication manager module 404 is further illustrated as including a frame buffer 410, the frame buffer 410 being operable to cache frames to be rendered by the computing device 106.
Depending on the configuration, there may be two or more frame buffers and/or streams that may be used to support the multiple wireless sources shown in fig. 6. For example, at a wireless display, there are buffers and/or streams associated with each of the wireless A/V sources received by the display. Thus, frame buffer 410 may represent multiple different frame buffers available for multiple different streams. This data may then be processed (e.g., scaled) and incorporated into the main frame buffer and/or stream that the display uses to generate the final image on the display. There are various other techniques that may be used to handle the management of multiple wireless video sources onto a single screen, as further described in connection with fig. 6.
Various conventional wireless display schemes involve power consumption above a desired target for mobile applications (i.e., for battery-powered computing devices). Fortunately, many scenarios involve significant periods of time when updates to the wireless display are not necessary, e.g., for web browsing, Instant Messaging (IM), music, presentations that do not involve active animations, etc. However, conventional schemes continue to transmit frames 412 with equivalent content, thereby wasting power of both the transmitting and receiving devices, and also resulting in the introduction of additional and unnecessary noise that would otherwise be avoidable to the environment.
In this example, additional control signaling between the sending computing device 104 and the receiving computing device 106, involving the frame buffer 410, is utilized such that frames with new content are sent from the source device while redundant frames (e.g., frames with matching content) are not sent.
In this case, the additional control signals include control data that causes the wireless display to repeat a particular frame until a new frame is provided. The source transmit portion of module 112 may then enter a sleep mode until a new frame is to be transmitted to the wireless display. In response, the internal frame buffer 410 may be used to drive the target display until a new frame is sent. In addition, the frame rate may be reduced from the conventional frame rate (60Hz/50Hz) to a lower frame rate that is sent to the target display along with the associated control signals.
For example, a presentation may involve a relatively static display, e.g., a slide that does not change until input is received. Accordingly, the system may send a frame, then determine that the information has not changed and thus send a "repeat" command to the wireless display. The wireless display may then display the frame (e.g., continuously) until the source device transmits a new frame. During this period, little or no wireless video information may be transmitted, and thus the power consumed by the source device and the noise that would otherwise occur in the environment from transmitting the video information is reduced.
Additionally, a control signal may also be used to send only the portion 414 of the frame 412 that is being updated. This may be effective for a variety of different content, such as a web page with active portions (e.g., advertisements) where a substantial portion of the frame 412 is not active. Furthermore, an extender concept can also be utilized, in which video and audio streams are sent independently of the UI stream. Thus, when the overlaid UI frame is static, only a video update for the sub-portion of the display in which the video is presented is sent. In additional embodiments, the control data may also be used for animation and object manipulation.
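The redundant-frame suppression described above can be pictured with the following hypothetical sketch; the SHA-256 frame comparison, the "repeat_last_frame" control message, and the StubRadio interface are illustrative assumptions rather than part of the described system.

```python
# Hypothetical sketch of skipping redundant frames (figs. 4 and 5).
# The comparison method and control message format are assumptions.

import hashlib


class StubRadio:
    """Stand-in for the transmit portion of communication module 112."""

    def send_control(self, msg):
        print("control:", msg)

    def send_frame(self, data):
        print("frame sent:", len(data), "bytes")

    def sleep_until_new_frame(self):
        print("transmitter sleeping until a new frame arrives")


class WirelessSource:
    def __init__(self, radio):
        self.radio = radio
        self._last_digest = None

    def submit_frame(self, frame_bytes: bytes) -> None:
        digest = hashlib.sha256(frame_bytes).digest()
        if digest == self._last_digest:
            # Frame matches what the sink already holds in its frame buffer 410:
            # tell the sink to keep repeating it, then let the transmitter sleep.
            self.radio.send_control({"cmd": "repeat_last_frame"})
            self.radio.sleep_until_new_frame()
        else:
            self.radio.send_frame(frame_bytes)
            self._last_digest = digest


if __name__ == "__main__":
    source = WirelessSource(StubRadio())
    source.submit_frame(b"slide 1 pixels")
    source.submit_frame(b"slide 1 pixels")  # identical content: repeat command, then sleep
```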
Fig. 5 depicts a procedure 500 in an example implementation of a wireless communication technology related to wireless buffering and streaming. The following discussion describes various techniques that may be implemented utilizing the above-described systems and devices. Aspects of each of the procedures may be implemented using hardware, firmware, or software, or a combination thereof. The process is illustrated as a set of blocks that specify operations performed by one or more devices and is not necessarily limited to the orders shown for performing the operations by the blocks. In portions of the following discussion, reference will be made to the environment 100 of FIG. 1 and the system 400 of FIG. 4.
The sending device determines that the second frame to be sent by the sending device to the receiving device includes at least a portion that is a repetition of a corresponding portion of the first frame that has already been sent by the sending device to the receiving device (block 502). For example, the communication manager module 402 may determine that a frame to be wirelessly transmitted from the computing device 104 to another computing device 106 is repetitive. For example, the frame may be part of a presentation or part of other content including a relatively static display of the content.
In response to the determination, at least a portion of the first frame that matches the portion of the second frame is caused to be repeatedly displayed by the receiving device without the transmitting device transmitting the portion of the second frame (block 504). Continuing with the previous example, the communication manager module 402 may develop a control signal to cause the communication manager module 404 of the computing device 106 to repeat at least a portion of the frames stored in the frame buffer 410. In one implementation, the portion may cover a large amount of the display area of the frame. In another example, the transmitting device does not transmit a control signal or frame to the receiving device. The receiving device may then detect such a lack of data (e.g., frames and/or control signals) and thus repeat frames that have been received by the receiving device. In this way, the receiving device may act to repeat the frame without receiving data from the transmitting device.
Also in response to the determination, one or more hardware devices or subsystems of the transmitting device may enter a sleep mode until an update to the first frame is to be transmitted to the receiving device that includes content not included in the first frame (block 506). For example, the communication manager module 402 may cause one or more hardware components of the computing device 104 involved in wireless transmissions to enter a sleep mode to reduce power consumption of the computing device 104. This may include reducing the power supplied to the hardware components but still maintaining a baseline level of available power so that these components may be quickly awakened. This may also include shutting down power rails to one or more components to reduce power consumption completely or nearly completely. Various other examples are also contemplated.
Further in response to the determination, a transmission frame rate of the transmitting device to the receiving device may be decreased (block 508). For example, the communication manager module 402 may determine that the static display is to continue. Accordingly, the communication manager module 402 may reduce the frame rate, thereby conserving power and reducing interference in the wireless environment with other devices participating in wireless communication.
The sending device may also employ one or more extender concepts to separate the streams to be sent to the receiving device (block 510). For example, the extender concept may be configured to cause video, audio, and/or user interface streams to be sent separately. In this way, updates to each of these streams may be transmitted without involving the other streams.
Accordingly, the receiving device may receive one or more control signals and then repeat at least a portion of the frames stored in the frame buffer (block 512) and employ other of the foregoing techniques for wireless communication as described in the blocks above. Although examples of buffering and streaming techniques are described, various other techniques may also be employed without departing from the spirit and scope thereof, examples of which may be found in the following subsections.
Wireless display technology
FIG. 6 depicts a system in an example implementation in which wireless display technology is shown. In this example, two mobile computing devices 602, 604 communicate wirelessly with another computing device (shown as display device 606). However, as described in connection with FIG. 1, these computing devices may assume a variety of other configurations.
The display device 606 is shown receiving wireless data from the mobile computing devices 602, 604. In response, the display device 606 divides the available display area, in this case dividing the display area down the middle, although other examples are also contemplated, such as to employ picture-in-picture techniques with portions of various sizes that may be adjustable by a user of the display device 606.
For example, for three-dimensional display, the entire display area of the display device may be used, while the display area is configured to display specific content to particular users by utilizing glasses that are typically worn by those users (e.g., LCD shutter glasses). Through synchronization between the wireless display and the glasses, each user can see different content and the display appears to the user to be simultaneous.
Further, the transmitting devices may be configured to take advantage of this division. For example, the mobile computing devices 602, 604 may be configured to reformat data being sent to the display device 606 to have an aspect ratio, resolution, or the like configured to approximate or even match the respective portions. In this way, the device may transmit less data than would otherwise be transmitted if the data (e.g., video) were formatted for the full display area. Various other implementations are also contemplated without departing from the spirit and scope thereof.
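One way to picture the division of the display area and the reformatting hint returned to each source is the following hypothetical sketch; the simple side-by-side split and the hint format are illustrative assumptions (picture-in-picture or user-adjustable layouts could be handled similarly).

```python
# Hypothetical sketch of dividing a display among multiple wireless sources (figs. 6 and 7).
# The equal vertical split and the hint dictionary are assumptions for illustration.

from typing import List, Tuple

Region = Tuple[int, int, int, int]  # x, y, width, height


def divide_display(width: int, height: int, num_sources: int) -> List[Region]:
    """Split the display area into equal side-by-side portions, one per source."""
    portion = width // num_sources
    return [(i * portion, 0, portion, height) for i in range(num_sources)]


def format_hint(region: Region) -> dict:
    """Hint sent back to each source so it can reformat (resolution/aspect ratio)
    to match the portion it will occupy, reducing the data it has to transmit."""
    _, _, w, h = region
    return {"width": w, "height": h, "aspect_ratio": round(w / h, 3)}


if __name__ == "__main__":
    regions = divide_display(1920, 1080, num_sources=2)
    for region in regions:
        print(region, format_hint(region))
```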
Fig. 7 depicts a procedure 700 in an example implementation of a wireless communication technology relating to displaying content from multiple devices. The following discussion describes various techniques that may be implemented utilizing the above-described systems and devices. Aspects of each of the procedures may be implemented using hardware, firmware, or software, or a combination thereof. The process is illustrated as a set of blocks that specify operations performed by one or more devices and is not necessarily limited to the orders shown for performing the operations by the blocks. In portions of the following discussion, reference will be made to the environment 100 of FIG. 1 and the system 600 of FIG. 6.
Two or more streams are wirelessly received at a display device from respective two or more computing devices (block 702). As shown in fig. 6, for example, a wireless display device 606 may receive a stream of content from first and second mobile computing devices 602, 604.
The display device may then automatically divide the display area of the display device to enable content from the two or more streams to be displayed by the display device simultaneously (block 704). Continuing with the previous example, the display device 606 divides the display area into halves as shown to enable content from the multiple mobile computing devices 602, 604 to be concurrently displayed. Further, picture-in-picture techniques may be employed to allow a user to change the size of the portions, reposition the portions, and so forth. Various other partitioning techniques are also contemplated.
For example, the display area may be divided such that a first stream is viewable using a first pair of three-dimensional viewing glasses and not viewable using a second pair of three-dimensional viewing glasses, while content from a second stream is viewable using the second pair of three-dimensional viewing glasses and not viewable using the first pair of three-dimensional viewing glasses (block 706). In this example, the display device 606 may be configured to perform three-dimensional display by communicating with three-dimensional viewing glasses (e.g., LCD shutter glasses). In this example, larger portions may be displayed (e.g., even overlapping to the point of consuming nearly the entire display area) and may appear simultaneous to two or more users, even though the users may view different content from different streams.
The data may also be reformatted by respective ones of the two or more computing devices according to the respective portions of the display in which that data is to be displayed by the display device (block 708). For example, the display device 606 may communicate with the mobile computing devices 602, 604 to provide details regarding the available resolutions, aspect ratios, and so on to be used for displaying portions of content from those devices. The mobile computing devices 602, 604 may then format the content accordingly such that this reformatting is "offloaded" from the display device 606. Other implementations are also contemplated, such as reformatting performed by the display device, or using predefined portions so that reformatting can be performed automatically without user intervention at the mobile computing device, and so forth.
Dual band communication
Fig. 8 illustrates an example system 800 in which functionality of a wireless device involving multiple bands is utilized to provide wireless communication employing two or more of the bands. In the system 800 shown here, the communication module 112 is shown in greater detail as employing a communication manager module 802, a 2.4GHz band module 804, a 5.0GHz band module 806, and respective antennas 808, 810. Other implementations are also contemplated, such as using a dual-band antenna.
Techniques are described below that may be used to support multiple bands (e.g., the 5GHz and 2.4GHz bands) simultaneously using the independent hardware available on many conventional devices, without significant reconfiguration of that hardware. Traditionally, the radio frequency (non-baseband) hardware for the separate bands is not shared between the bands, although in some cases a common Phase Locked Loop (PLL) is used, which can be duplicated by the vendor. Accordingly, in one or more implementations, wireless communication techniques may be employed to utilize two or more bands, such as to communicate control information 812 over the 2.4GHz band and a data payload 814 over the 5.0GHz band.
In another example, a wireless channel within a band (e.g., 2.4 or 5.0GHz) may be used for audio/visual and associated control information while another wireless channel within the band may be used for general wireless networking traffic.
Adjacent channels, e.g., non-overlapping channels, may also be used, with one channel being used primarily for access point internet data and a second channel being used for wireless display. This may be performed in both the 5GHz and 2.4GHz bands, for example, to further increase bandwidth.
It is also contemplated to use a beacon signal to avoid loss of data sent to the device during Wi-Fi display operations. For example, the returned data may be checked for beacons and used to interleave transmissions for the wireless display.
Furthermore, because wireless display traffic is largely in one direction, a packet-specific ACK technique can be utilized at the receiving end (e.g., the wireless display) to reduce "listening" on the wireless display channel.
In one implementation, many baseband portions of the system may be designed to handle larger bandwidths (e.g., in excess of 20MHz) with a dual-channel (e.g., 40MHz) or multi-channel (802.11ac, 80MHz and beyond) scheme. Accordingly, this hardware can be multiplexed from its current dual-plus-channel single Orthogonal Frequency Division Multiplexing (OFDM) stream to also handle dual independent OFDM streams, even though these are at very different frequencies, since the associated baseband processing may be the same. In other words, the Fast Fourier Transform (FFT) engine, Viterbi decoder, and bit processing engines of the communication module 112 may process frames on two separate streams (e.g., internet-based traffic for the access point 102 and wireless display traffic). Typically, the antennas for 2.4GHz and 5GHz may also be sufficiently different that separate antennas are used, even if they are packaged within one component (as shown).
In additional implementations, the system may employ aspects of this functionality that support or do not support concurrent 2.4GHz and 5GHz operation. For example, time division multiplexing may be performed between the bands. In another example, two channels in one of the 2.4GHz or 5GHz bands may be used. In an example implementation of the second example, two independent 20MHz streams may be used. Likewise, one may operate at high power to communicate with access point 102, while a second may use low power to reach only relatively close computing devices (e.g., devices within a predefined range so as not to use a power amplifier), as described in connection with fig. 2 and 3.
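A hypothetical sketch of the concurrent use of two bands, with control information routed to a 2.4GHz module and the data payload to a 5.0GHz module, might look as follows; the BandModule interface, the queueing, and the threading model are illustrative assumptions and not the described hardware design.

```python
# Hypothetical sketch of concurrent dual-band transmission (figs. 8 and 9):
# control information over one band module, data payload over the other.

import queue
import threading


class BandModule:
    """Stand-in for the 2.4GHz band module 804 or the 5.0GHz band module 806."""

    def __init__(self, name: str):
        self.name = name

    def transmit(self, item) -> None:
        print(f"{self.name}: {item}")


class DualBandManager:
    """Simplified stand-in for the communication manager module 802."""

    def __init__(self):
        self.control_band = BandModule("2.4GHz")
        self.data_band = BandModule("5.0GHz")
        self._control_q = queue.Queue()
        self._data_q = queue.Queue()

    def send_control(self, info) -> None:
        self._control_q.put(info)

    def send_payload(self, data) -> None:
        self._data_q.put(data)

    def run_once(self) -> None:
        # Drain both queues concurrently on separate worker threads.
        workers = [
            threading.Thread(target=self._drain, args=(self._control_q, self.control_band)),
            threading.Thread(target=self._drain, args=(self._data_q, self.data_band)),
        ]
        for worker in workers:
            worker.start()
        for worker in workers:
            worker.join()

    @staticmethod
    def _drain(q: queue.Queue, band: BandModule) -> None:
        while not q.empty():
            band.transmit(q.get())


if __name__ == "__main__":
    manager = DualBandManager()
    manager.send_control({"type": "channel_select", "channel": 36})
    manager.send_payload(b"A/V frame payload")
    manager.run_once()
```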
Fig. 9 depicts a procedure 900 in an example implementation of a wireless communication technology relating to wireless communication utilizing multiple bands. The following discussion describes various techniques that may be implemented utilizing the above-described systems and devices. Aspects of each of the procedures may be implemented using hardware, firmware, or software, or a combination thereof. The process is illustrated as a set of blocks that specify operations performed by one or more devices and is not necessarily limited to the orders shown for performing the operations by the blocks. In portions of the following discussion, reference will be made to the environment 100 of FIG. 1 and the system 800 of FIG. 8.
Data for transmission from a sending device to a receiving device is obtained (block 902). For example, the data may be obtained by executing one or more applications, received from another device, located in local or remote storage, and so forth. Thus, the data may originate from a variety of different sources.
The data is transmitted to the receiving device over the first and second bands simultaneously using the first and second modules of the transmitting device (block 904), such as over the 2.4GHz and 5.0GHz bands. For example, each of these bands may utilize one or more channels to communicate with the same device, different devices, and so on.
The communication manager module 802 may also perform this communication using various other techniques. For example, the communication manager module 802 may communicate control information using the first module and corresponding first band and data using the second module and corresponding second band (block 906). In another example, the communication manager module 802 may use the beacon signal to interleave transmissions for wireless display by the wireless display device (block 908), as previously described. Further, the communication manager module may process a plurality of independent Orthogonal Frequency Division Multiplexing (OFDM) streams using the first and second modules (block 910). For example, the Fast Fourier Transform (FFT) engine, Viterbi decoder, and bit processing engines of the communication module 112 may process frames on two separate streams (e.g., internet-based traffic for the access point 102 and wireless display traffic).
The communication manager module 802 may also employ time division multiplexing between the first and second bands (block 912). This time multiplexing may be performed by channels within a band, by different bands, and so on. The communication manager module 802 may also employ techniques described in connection with the other sections. For example, the communication manager module may use the first or second module to vary the amount of power used by the first and second modules based on whether the receiving device is within a predefined range (block 914). Various other examples may also be envisaged without departing from the spirit and scope thereof.
Wireless decoding techniques
Fig. 10 depicts a system 1000 in an example implementation in which wireless decoding techniques are employed. The computing device 104 and the other computing device 106 are shown as participating in wireless communications. The communication module 112 of the computing device 104 is shown in greater detail as employing a communication manager module 1002, a decoding module 1004, and an antenna 1006.
In this example, the communication module 112 is configured to determine whether a receiving device (e.g., the computing device 106) is capable of decoding the source content format. If so, the encoded data may be transmitted by the communication module 112 without being decoded by the decoding module 1004. This also allows the target display to further improve image quality (if appropriate).
In the event that the type of item being displayed can be identified, the communication manager module 1002 can employ different codecs and/or codec rates to reduce traffic on the wireless communication link. Because wireless communications may be performed via packetized networks (e.g., the 802.11 standards), the reduction in traffic may also have power and noise floor advantages. For example, for games, the use of an H.264 encoder may be appropriate. For UIs and solutions that support object-like manipulation, wireless communication may be significantly reduced by sending objects and animation controls instead of sending data for each frame. For situations like internet browsing, motion JPEG may be substituted to preserve quality while reducing data traffic. This may be performed by selecting an appropriate codec (e.g., compression algorithm) as part of a decision tree for encoding the video frame, using a variety of processing techniques such as frequency profiles, frequency gradients, temporal variations, edge change detection, and other video and image processing algorithms.
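The codec-selection decision tree described above might be sketched, purely for illustration, as follows; the content measurements (temporal change, edge density), the thresholds, and the specific codec assignments are assumptions used only to show the shape of such a decision tree.

```python
# Hypothetical sketch of a content-type-driven codec decision tree (figs. 10 and 11).
# Heuristics and thresholds are assumptions for illustration.

from enum import Enum, auto


class Codec(Enum):
    H264 = auto()             # e.g., suited to game or high-motion content
    MOTION_JPEG = auto()      # e.g., suited to browsing-style, mostly static content
    OBJECT_COMMANDS = auto()  # send objects/animation controls instead of pixel data


def classify_frame(temporal_change: float, edge_density: float,
                   has_ui_objects: bool) -> Codec:
    """Pick a codec for the current frame based on simple content measurements."""
    if has_ui_objects and temporal_change < 0.05:
        return Codec.OBJECT_COMMANDS   # UI supporting object-like manipulation
    if temporal_change > 0.3:
        return Codec.H264              # high motion, e.g., games or video
    if edge_density > 0.5:
        return Codec.MOTION_JPEG       # detailed but mostly static content
    return Codec.H264


if __name__ == "__main__":
    # A mostly static, detail-heavy frame (e.g., a web page) maps to motion JPEG.
    print(classify_frame(temporal_change=0.02, edge_density=0.7, has_ui_objects=False))
```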
Further, to minimize transmit RF power, the communication manager module 1002 may use different audio/visual (A/V) compression schemes for a given type of media content. For example, the compression type and ratio may be adjusted on a per-frame or per-subframe basis to minimize the Radio Frequency (RF) power used to maintain a reliable RF link.
In an implementation, before starting the link the devices (e.g., the computing devices 104, 106) perform a scan to locate a 20MHz channel in the target frequency band that is as "idle" as possible compared to other channels. This further allows for spatial diversity among multiple wireless display users. Such scanning may be performed if a high error condition occurs in a band being used by a device, when a clean channel is detected, in response to a request to change to a new contention-free channel being sent to the wireless display, and so forth. Furthermore, beamforming can be used to reduce power requirements and minimize RF channel coverage for a given segment or spatial region.
In one or more implementations, a receiving device, such as a wireless display, may provide a request to a source device to change to a different channel. In response, the source may inform the receiving device (e.g., sink) to which channel the device is to move. In this way, the source may drive multiple receiving devices in a contention-free manner, although other implementations using receiving devices to manage communications are also contemplated. For example, where multiple sources are driving to a single sink, they may be flipped, or a master source may be defined between the sources that specifies which channel to use for wireless communications, e.g., A/V transactions. In another example, the receiving device may be made aware of which data is lost (e.g., media frames) and provide a recommendation to the sending device to change to a new channel once a given threshold is exceeded.
Fig. 11 depicts a procedure 1100 in an example implementation of a wireless communication technology related to transmission of a frame. The following discussion describes various techniques that may be implemented utilizing the above-described systems and devices. Aspects of each of the procedures may be implemented using hardware, firmware, or software, or a combination thereof. The process is illustrated as a set of blocks that specify operations performed by one or more devices and is not necessarily limited to the orders shown for performing the operations by the blocks. In portions of the following discussion, reference will be made to the environment 100 of FIG. 1 and the system 1000 of FIG. 10.
A transmitting device obtains one or more frames to be wirelessly transmitted to a receiving device (block 1102). For example, the frames may be generated by an application executed locally by the computing device 104, stored locally on the computing device 104, obtained remotely via a network, and so forth.
A type of content is determined for the one or more frames to be wirelessly transmitted by the transmitting device to the receiving device (block 1104). For example, the frames may relate to streaming video, a portion of a presentation, a scene from a video game, content obtained via a browser, a portion of an executing application (e.g., a user interface), output from a video camera, and so forth. Accordingly, each of these types may have particular characteristics that may be used to reduce the power consumption, noise, interference, and so on that may be involved in the wireless transmission of the frames. The determination may be performed in various ways, such as based on a frequency profile, a frequency gradient, temporal variation, edge change detection, and so forth.
A codec to be used for encoding the one or more frames is identified based at least in part on the determined type (block 1106). In response to determining that the one or more frames are not encoded using the identified codec, the one or more frames are encoded using the identified codec (block 1108). Continuing with the previous example, certain types of codecs may be particularly suited to encoding a particular type of frame. For example, for games, an H.264 codec may be used to encode the frames. For user interfaces that support object-like manipulation, wireless traffic may be reduced by sending objects and animation controls instead of sending data for each frame. For situations like internet browsing, motion JPEG may be substituted to preserve quality while reducing data traffic. Various other examples are also contemplated.
The one or more frames may also be compressed based at least in part on the determination of the type of content (block 1110). For example, a codec or compression algorithm may also be selected based on characteristics of the wireless channel used to transmit the data. Further, this selection may be performed in various time frames, such as on a per frame or sub-frame basis.
The sending device may also change to a different channel selected by the sending device in response to a request received from the receiving device (block 1112). For example, a receiving device (e.g., a wireless display device) may determine that a significant amount of noise is present on the current channel and therefore send a communication to the transmitting device (e.g., a mobile device) requesting a change of the channel used to communicate with it. The transmitting device may then select a new channel and transmit this information back to the receiving device. Thus, in this example, the sending device manages the wireless communications, although other examples are also contemplated.
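A hypothetical sketch of that channel-change negotiation, in which the receiving device requests a change and the sending device scans, picks the cleanest channel, and responds (block 1112), might look as follows; the noise-floor figures, the 3 dB improvement margin, and the message format are illustrative assumptions.

```python
# Hypothetical sketch of source-side channel renegotiation after a sink request.
# Channel list, noise values (dBm), and message format are assumptions.

from typing import Dict


def select_cleanest_channel(noise_by_channel: Dict[int, float], current: int) -> int:
    """Return the channel with the lowest measured noise, changing only if it is
    meaningfully better (here, by more than an assumed 3 dB) than the current one."""
    best = min(noise_by_channel, key=noise_by_channel.get)  # most negative = least noise
    improvement = noise_by_channel[current] - noise_by_channel[best]
    return best if improvement > 3.0 else current


def handle_channel_request(scan_results: Dict[int, float], current: int) -> dict:
    """Source-side handling of a sink's 'change channel' request."""
    new_channel = select_cleanest_channel(scan_results, current)
    return {"cmd": "move_to_channel", "channel": new_channel}


if __name__ == "__main__":
    scan = {36: -92.0, 40: -85.5, 44: -95.0, 48: -90.0}  # measured noise floor in dBm
    print(handle_channel_request(scan, current=40))       # picks channel 44
```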
Various other wireless communication techniques are also contemplated, such as dynamically adjusting compression ratios, delta amounts, changing from one codec to another, beamforming, FEC (forward error correction), and so forth.
Mobile device wirelessly sharing screen with other devices
Mobile communication devices have become increasingly powerful, highly connected, and able to act as relatively large storage devices while handling complex tasks ranging from gaming to photo editing. However, even though the displays employed by mobile communication devices have grown from an average diagonal size of less than 3 inches to approximately 4 inches, panning and zooming are still involved in reading typical web pages, emails, and the like. These devices are also typically limited in their ability to enter content or control applications because of their limited on-screen keyboards or keypads. Accordingly, techniques are described that may be used to enhance the input and output of a mobile communication device in situations where a second display and/or input device is available.
In one or more implementations, a mobile communication device can "remote" its display and input mechanisms (e.g., touch, buttons, etc.) to a simple display device via a wireless display link for screen content and a control back channel. This can be used to support a variety of different modes of operation:
1) Simple remote use, where the user simply sees a larger device similar to a tablet device. In this case, however, there is no separate processor, memory, WAN communication, and so on, allowing for lower cost, increased mobility, and the ability to remain synchronized. The experience may also be enhanced by allowing higher resolution displays to be supported on the remote device via zooming or direct rendering to a larger size.
2) The remote screen may be used as the display device while the phone shows a copy (clone) of the display, so that control can be performed from the phone.
3) The remote screen may act as an auxiliary display while the phone displays different content, contextual content, a keypad, and the like. Control may come from both the remote display and the phone.
In the first case, the phone may remain in a person's pocket. In cases 2) and 3), the remote screen may be placed on a surface, seated in a dock, physically connected to the phone, and so forth.
These techniques may support a variety of different functions, such as the ability to remote the phone's interface via an optimized wireless display mechanism (e.g., optimized link, remote UI, animation, and display compression), where an application may run entirely on the phone yet present its interface to the user on the remote display. Further, compressed video may be embedded so that it is decoded locally by the remote display, UI animation may be performed remotely, and decoding may be adjusted based on the content at the remote end.
Further, touches (e.g., multi-finger gestures) and button clicks may be embedded in the reverse channel to the mobile communication device, and these commands may be replayed as if they had been performed locally on the mobile communication device. A translation may also be performed on the remote device to convert touch points from remote coordinates to native coordinates on the mobile communication device. Further, these techniques may utilize an integrated Wi-Fi/decoder scheme that provides communication, content decoding, an embedded frame buffer scheme, and a cost-effective controller.
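The coordinate translation and replay of remote touch events can be pictured with the following sketch; the display sizes, the translate_touch and replay_event helpers, and the event dictionary format are hypothetical and used only for illustration, not as the described devices' actual interfaces.

def translate_touch(remote_x, remote_y, remote_size, phone_size):
    # Map a touch point from remote-display pixels to phone-native pixels.
    rw, rh = remote_size
    pw, ph = phone_size
    return int(remote_x * pw / rw), int(remote_y * ph / rh)

def replay_event(event, remote_size, phone_size):
    # Replay a remote touch as if it had occurred on the phone's own panel;
    # a real device would hand the translated point to its touch-input driver.
    x, y = translate_touch(event["x"], event["y"], remote_size, phone_size)
    return {"type": event["type"], "x": x, "y": y}

# A tap at (640, 360) on a 1280x720 remote display maps to (240, 400) on a
# hypothetical 480x800 phone screen.
print(replay_event({"type": "tap", "x": 640, "y": 360},
                   remote_size=(1280, 720), phone_size=(480, 800)))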
Traditionally, customers would do everything, from web browsing to gaming to reading email, using either their phone or their laptop computer. More recently, however, a third device, commonly referred to as a "tablet computer," has become popular, having a screen size between the phone and the laptop, a longer battery life than both, a smartphone-like touch interface and applications, and a smartphone-like thickness. These devices offer users a third option, but may carry significant costs (e.g., both device and operator costs), synchronization issues, and in some cases significant user interface differences. The proposed solution gives the customer the option of an internet tablet that stays synchronized with his or her smartphone and provides a common user interface, at a much lower cost.
The scheme may utilize techniques for remote identification, connection, encoding, transmission, and decoding/display. Standards such as 802.11, Wi-Fi Direct, UPnP, H.264, Motion JPEG, and the like may be utilized.
To build a remote tablet computer, a "thin" device can be constructed using only a small portion of the components of a typical internet tablet computer. For example, the tablet may be manufactured without an application processor, large flash memory or DRAM, a WAN modem, or the like. Further, the remote tablet may employ a relatively small battery, include Wi-Fi and a similar display, and utilize a decoder and a relatively small controller, while still providing most typical internet tablet functionality. When a phone is used as the source device, the remote display may be able to provide this experience, albeit on a larger display. Excluding the display, these costs may represent between twenty and fifty percent of the bill of materials of a typical internet tablet computer.
In addition, the return channel may be utilized for packet setup and acknowledgement. When events occur from the touch controls or buttons, these events go to the small controller to be interpreted and encoded. They can then be sent to the mobile communication device in a way that avoids latency between the displayed object and the touch event. Once received, the mobile communication device can decode these events as if they had been received from its own touch controller. For the dual-screen case, the touch events may be received as if from a second touch controller.
A Wi-Fi beacon signal or Bluetooth (BT) may also be used to allow the mobile communication device to be woken from the remote device. Both sides may be powered down after a period of operation that is controllable by the user.
Further, the graphics processing unit of the mobile communication device may be used to render not only at the local display size but also at a greater resolution, allowing for better viewing of applications, web content, and so forth.
Negotiation between the mobile communication device and the remote tablet computer may be used to identify which compression types are permitted. For example, after completing a frame, applications may have their displayed content encoded in H.264 or Motion JPEG. In this example, the application is not made aware that the frame was transmitted for remote viewing. For the case where a media player is used (or an embedded media player is invoked, as in a web page), the encoded media stream may be captured before being decoded on the mobile communication device. The stream is then encapsulated and sent to the remote device to be decoded within the graphics frame, decoded or merged, or simply decoded full screen. Other types may also be encoded/decoded more efficiently based on the data type, but this may require the remote device to have greater knowledge of the application and may add cost to the remote device. Audio streams may also be embedded in either direction to support applications like conference calls, media playback, and voice commands.
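One possible shape of this negotiation and media pass-through is sketched below; the negotiate_codecs and forward_media helpers, the container dictionary, and the mode names are assumptions made only for the example and do not represent the actual protocol.

SUPPORTED_BY_REMOTE = {"H.264", "Motion JPEG"}

def negotiate_codecs(phone_codecs, remote_codecs):
    # Keep only the compression types both ends permit.
    return sorted(set(phone_codecs) & set(remote_codecs))

def forward_media(payload, codec, negotiated, mode="full_screen"):
    # Encapsulate an already-encoded media stream instead of re-encoding it.
    # The dict stands in for whatever container the wireless link uses; the
    # remote device decodes within a graphics frame, merges, or goes full screen.
    if codec not in negotiated:
        raise ValueError(codec + " was not negotiated")
    return {"codec": codec, "mode": mode, "payload": payload}

negotiated = negotiate_codecs({"H.264", "Motion JPEG", "VP8"}, SUPPORTED_BY_REMOTE)
packet = forward_media(b"encoded-bitstream", "H.264", negotiated, mode="merge")
print(negotiated, packet["codec"], packet["mode"])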
Mobile device broadcast to multiple wireless displays
Displaying content on multiple displays using conventional techniques involves the use of splitters and cables routed to the various displays. This creates setup difficulties, which the present solution is intended to address. Conventional wireless solutions (e.g., over short to medium distances) do not support broadcasting to multiple displays. The present solution allows a mobile communication device to broadcast its content wirelessly to multiple devices.
A mobile device (also referred to as a source device) using the techniques described herein is able to broadcast its content (whether audio/video, pictures, data, a screen display, or otherwise) to multiple wireless displays (referred to herein as sink devices) simultaneously. In this way, simultaneous broadcast of content from a source device to multiple sink devices may be supported.
For example, a user may initiate a broadcast feature on the source device and select which sink devices to broadcast the content to. The user may select a plurality of sink devices within range of the source device. Once the links between the source device and the sink devices are established, the user is able to select the content to be broadcast on the source device. For example, the user may choose to broadcast the screen content of the source device to the sink devices. In this case, if the user is playing audio/video content on the source device, that content is also broadcast to the sink devices. Upon receiving the content, the sink devices may display the content of the source device. The link between the source device and each sink device may be bi-directional to allow for handling of packet errors, link control, data transmission, service establishment, and the like.
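A minimal sketch of this broadcast flow, under the assumption of hypothetical SourceDevice and SinkDevice classes (the real devices, discovery, and transport are not shown), might look as follows.

class SinkDevice:
    def __init__(self, name):
        self.name = name

    def receive(self, content):
        # A real sink would decode and display; here we simply acknowledge
        # over the bidirectional link.
        return {"sink": self.name, "ack": True, "bytes": len(content)}

class SourceDevice:
    def __init__(self):
        self.sinks = []

    def select_sinks(self, discovered, chosen_names):
        # The user picks which in-range sink devices to broadcast to.
        self.sinks = [s for s in discovered if s.name in chosen_names]

    def broadcast(self, content):
        # Send the same content to every selected sink and collect the
        # acknowledgements returned over the back channel.
        return [sink.receive(content) for sink in self.sinks]

discovered = [SinkDevice("living-room-tv"), SinkDevice("meeting-room-display")]
source = SourceDevice()
source.select_sinks(discovered, {"living-room-tv", "meeting-room-display"})
print(source.broadcast(b"screen frame"))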
Wireless screen sharing of mobile devices with other devices
The present techniques allow a user to share the screen content of a device (whether pictures, audio/video, data, etc.) with multiple devices once a link is established between the devices. This allows the other devices to display the shared content as well as their own local content.
For example, a mobile device (referred to hereafter as the source device) can wirelessly share its screen content (whether audio/video, pictures, data, a screen display, or otherwise) with other devices (referred to hereafter as sink devices), and vice versa. This allows a sink device to view both its local content and the shared content. The shared content screen size may be adjusted on the sink device. Thus, these techniques may support a variety of different functions:
Wirelessly sharing screen content from a source device to multiple sink devices when the devices are within wireless range of one another.
Allowing the sink device to view the local content and the shared content.
Shared content screen size can be adjusted on the sink device.
These devices may include, but are not limited to, laptop computers, desktop computers, wireless displays, tablet computers (slates), and mobile devices. A source device may be defined as the device that provides the content to be shared. A sink device may be defined as a device that receives the shared content. During a screen sharing session, multiple sink devices may be allowed, but a single device is designated as the source.
In this context, a user initiates a screen sharing session on each of the devices. One of the devices is designated as the source device, while the other devices are configured as sink devices. As soon as the screen sharing session is established between the devices, the sink devices are able to display the shared content on their screens.
The shared content screen size on a sink device may be user-configurable as a full screen (maximized size) or as a restored screen (adjustable size). During the screen sharing session, any one of the sink devices may become the source device by requesting a role change. As soon as the role change is negotiated, the devices are reconfigured accordingly and sharing of the new content begins.
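The role-change negotiation might be modeled as follows; the ScreenSharingSession class and its request_role_change method are hypothetical names used only to illustrate that a single source is maintained while sinks may swap into that role.

class ScreenSharingSession:
    def __init__(self, source, sinks):
        self.source = source      # exactly one source at any time
        self.sinks = set(sinks)   # any number of sinks

    def request_role_change(self, requester):
        # A sink asks to become the source; once negotiated, the roles swap.
        if requester not in self.sinks:
            raise ValueError("only a current sink may request the source role")
        self.sinks.discard(requester)
        self.sinks.add(self.source)
        self.source = requester
        return self.source, sorted(self.sinks)

session = ScreenSharingSession(source="laptop", sinks={"tablet", "wireless-display"})
print(session.request_role_change("tablet"))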
Audio synchronization of mobile devices with wireless displays
When a device provides a video stream to a wireless display while the end user listens to the audio on the device itself, the audio and video may not be synchronized. This may be due to compression, transmission, and decompression latencies that vary depending on the RF environment and on the video processing.
To enhance audio and video (A/V) synchronization when the source device provides content to the wireless display while the audio content plays locally, a mechanism that dynamically synchronizes the A/V content may be employed. For example, an audio buffer and/or stream that dynamically adjusts the playback point and/or playback rate may be used to account for system latency, ensuring that the audio at the source is synchronized with the video at the remote endpoint.
There are a variety of different mechanisms that may be utilized. For example, if the source device supports a microphone, the audio buffer control system may compare the test tone received locally to the tone originating from the display. The tone may be implemented, for instance, as a relatively short burst and/or a short-duration tone that is imperceptible, e.g., outside the human voice range. The control system can then measure the latency and adjust the audio buffer playback state and/or playback rate appropriately to align the A/V.
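For illustration, and assuming the latency is taken as the gap between the moment the source emits the test tone and the moment its microphone hears the display play that tone back (the function names and example timestamps are hypothetical), the adjustment might reduce to the following.

def measure_tone_latency(emitted_at_s, heard_at_s):
    # Latency between the source emitting the test tone and its microphone
    # hearing the same tone played back by the display.
    return heard_at_s - emitted_at_s

def adjust_playback_point(local_playback_point_s, measured_latency_s):
    # Delay local audio by the measured latency so it lines up with the video
    # rendered at the remote display.
    return local_playback_point_s + measured_latency_s

latency = measure_tone_latency(emitted_at_s=10.000, heard_at_s=10.180)
print("measured latency: %.0f ms" % (latency * 1000))
print("adjusted playback point: %.3f s" % adjust_playback_point(5.000, latency))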
In another example, the source device may send RF timing packets to an endpoint at the display. The endpoint may then respond, and the source device may measure the RF delay. The round-trip time, coupled with known or estimated encoding and decoding latencies, may be summed to provide an overall measurement of system latency. Various other examples are also contemplated.
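A worked sketch of that summation follows, with illustrative numbers and a simulated packet exchange standing in for the actual radio primitives; the function names are hypothetical and not part of the described implementations.

import time

def measure_rf_round_trip_ms(send_timing_packet, await_response):
    # send_timing_packet and await_response are stand-ins for the radio
    # primitives that exchange the RF timing packet with the display endpoint.
    start = time.monotonic()
    send_timing_packet()
    await_response()
    return (time.monotonic() - start) * 1000.0

def estimate_system_latency_ms(rf_round_trip_ms, encode_ms, decode_ms):
    # As described above, the measured round trip is summed with known or
    # estimated encoding and decoding latencies (all values illustrative).
    return rf_round_trip_ms + encode_ms + decode_ms

# Simulate roughly 4 ms of air time in each direction.
rtt = measure_rf_round_trip_ms(lambda: time.sleep(0.004), lambda: time.sleep(0.004))
print("estimated system latency: %.1f ms" % estimate_system_latency_ms(rtt, 25.0, 20.0))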
Conclusion
Although the invention has been described in language specific to structural features and/or methodological acts, it is to be understood that the invention defined in the appended claims is not necessarily limited to the specific features or acts described. Rather, the specific features and acts are disclosed as example forms of implementing the claimed invention.

Claims (10)

1. A method, comprising:
the transmitting device detecting whether a communication between a receiving device and the transmitting device complies with a predefined link quality (302);
in response to determining that the communication conforms to the predefined link quality, transmitting (304) a wireless communication to be received by the receiving device while bypassing a power amplifier of the transmitting device; and
in response to determining that the communication between the receiving device and the transmitting device does not conform to the predefined link quality, transmitting, using the power amplifier of the transmitting device, a wireless communication to be received by the receiving device (306).
2. The method of claim 1, wherein the detecting is based at least in part on an error rate or scanning of a channel during one or more non-transmission periods.
3. The method of claim 1, wherein the bypassing causes the power amplifier to be disabled.
4. The method of claim 1, wherein the bypassing is performed to shut off a supply rail to the power amplifier.
5. The method of claim 1, wherein the bypassing uses a Radio Frequency (RF) switch to route signals to an antenna and bypasses the power amplifier.
6. A method, comprising:
the sending device determining that a second frame to be sent by the sending device to a receiving device comprises at least a portion that is a repetition of a corresponding portion of a first frame that has been sent by the sending device to the receiving device (502); and
in response to the determination, causing the at least a portion of the first frame at the receiving device that matches the portion of the second frame to be displayed again by the receiving device without the sending device transmitting the portion of the second frame (504).
7. The method of claim 6, wherein the receiving device employs a buffer that stores a copy of the portion of the first frame to be repeated.
8. The method of claim 6, wherein the causing is performed without the sending device sending a control signal or frame to the receiving device.
9. A method, comprising:
wirelessly receiving two or more streams from respective two or more computing devices at a display device (702); and
automatically partitioning a display area of the display device to enable content from the two or more streams to be concurrently displayed by the display device (704).
10. The method of claim 9, wherein the display device is configured to function as a three-dimensional display, and wherein content from a first of the streams is viewable using a first pair of three-dimensional viewing glasses and not viewable using a second pair of three-dimensional viewing glasses, and content from a second of the streams is viewable using the second pair of three-dimensional viewing glasses and not viewable using the first pair of three-dimensional viewing glasses.
HK13107032.0A 2011-01-07 2013-06-14 Wireless communication techniques HK1180126A (en)

Applications Claiming Priority (3)

US61/430,639, priority date 2011-01-07
US61/431,312, priority date 2011-01-10
US13/088,986, priority date 2011-04-18

Publications (1)

HK1180126A, published 2013-10-11
