Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples are not representative of all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with some aspects of the present disclosure as detailed in the accompanying claims.
The terminology used in the present disclosure is for the purpose of describing particular embodiments only and is not intended to be limiting of the disclosure. As used in this disclosure and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise. It should also be understood that the term "and/or" as used herein refers to and encompasses any or all possible combinations of one or more of the associated listed items.
It should be understood that although the terms first, second, third, etc. may be used in this disclosure to describe various information, this information should not be limited by these terms. These terms are only used to distinguish one type of information from another. For example, first information may also be referred to as second information, and similarly, second information may also be referred to as first information, without departing from the scope of the present disclosure. The term "if" as used herein may be interpreted as "when," "upon," or "in response to a determination," depending on the context.
Fig. 1 is a schematic flow chart of a photographing method, which may be performed by a terminal, according to an embodiment of the present disclosure. The terminal may be a communication device such as a mobile phone, a tablet computer, a wearable device, a sensor, or an internet-of-things device, and is provided with a plurality of photographing lenses.
As shown in fig. 1, the photographing method includes:
In step S101, a first focusing distance between a target object and a first photographing lens currently in use is determined;
In step S102, a second photographing lens is selected according to the first focusing distance, and the target object is photographed by the second photographing lens to generate a first image, wherein the photographing definition of the second photographing lens on the target object is better than that of the first photographing lens;
In step S103, the region of the target object in the first image replaces the target object at its position in a second image captured by the first photographing lens.
In some embodiments, the terminal is equipped with a plurality of photographing lenses, which may include a main wide-angle lens (wide), an ultra wide-angle lens (ultrawide), and a tele lens (tele).
The main wide-angle lens is usually the most commonly used lens of the terminal; it is suitable for most application scenes and can meet most of a user's shooting requirements. The ultra wide-angle lens has a larger field angle than the main wide-angle lens but a smaller minimum focusing distance, while the tele lens has a smaller field angle than the main wide-angle lens but a larger focusing distance.
In some embodiments, in the default photographing mode, a magnification of 0.5 to 1 times corresponds to the ultra-wide-angle focal segment, a magnification of 1 to 3 times to the wide-angle focal segment, and a magnification of more than 3 times to the tele focal segment; alternatively, a magnification of 0.5 to 1 times corresponds to the ultra-wide-angle focal segment, a magnification of 1 to 5 times to the wide-angle focal segment, and a magnification of more than 5 times to the tele focal segment.
It should be noted that the above embodiment is merely an illustrative example and does not limit the disclosure; in practice, the focal segment corresponding to each magnification may be flexibly set as required.
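The magnification-to-focal-segment mapping above can be sketched as a small helper; the function name and the configurable threshold are assumptions for illustration, with the 0.5-1x / 1-3x / >3x (or >5x) boundaries taken from the embodiment above.

```python
def focal_segment(magnification: float, tele_threshold: float = 3.0) -> str:
    """Map a zoom magnification to the focal segment used by default.

    tele_threshold is 3.0 in the first example embodiment and 5.0 in
    the alternative one; both are illustrative values, not fixed limits.
    """
    if magnification < 1.0:
        return "ultrawide"   # 0.5x to 1x: super-wide-angle focal segment
    if magnification <= tele_threshold:
        return "wide"        # 1x up to the threshold: wide-angle focal segment
    return "tele"            # above the threshold: tele focal segment
```

As the disclosure notes, a terminal vendor would tune these boundaries per device rather than hard-code them.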
In some embodiments, the terminal may determine the first focusing distance according to an image distance of an image of the target object through the first photographing lens.
The first focusing distance refers to the distance between the target object and the first photographing lens.
When the target object is within the focusing range of a photographing lens, that lens can focus on the target object, and the captured image has higher definition.
In some embodiments, the target object may comprise a human face.
In general, a user may wish the face in a captured image to be clearer, so the target object may include a face. During shooting, the method provided by the disclosure is executed when the terminal recognizes a face, so that the face in the captured image is clearer, bringing a better experience to the user.
In some embodiments, the target object may also include an object within a range area selected by the user.
The user may select a range area on the preview image by clicking, framing, or the like, and the terminal identifies the range area, or the objects within it, as the target object.
The user may select the content of interest to make the content of interest in the captured image clearer.
In some embodiments, the terminal may determine whether the first focusing distance of the target object is within the focusing range of the first photographing lens. If it is, the first photographing lens can focus on the target object, the sharpness of the target object in the captured image is high, and a second photographing lens need not be selected. If the first focusing distance is outside the focusing range of the first photographing lens, the first photographing lens cannot focus on the target object, the sharpness of the target object in the captured image is low, and if the user enlarges the target object area, the image quality of the target object degrades further.
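The in-range check described above amounts to a simple interval test; the function name and the example 20-80 cm range below are assumptions for illustration, not values from the disclosure.

```python
def needs_second_lens(d1: float, focus_range: tuple) -> bool:
    """Return True when the first focusing distance d1 falls outside
    the first lens's focusing range, i.e. a second lens should be
    selected to capture a sharper image of the target object."""
    lo, hi = focus_range
    return not (lo <= d1 <= hi)
```

For instance, with an assumed focusing range of (20, 80) cm, a target at 10 cm would trigger selection of a second lens, while one at 50 cm would not.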
In some embodiments, in the case that it is determined that the first focusing distance of the target object is out of the focusing range of the first photographing lens, a second photographing lens is selected according to the first focusing distance, and the target object is photographed by the second photographing lens to generate a first image, wherein the photographing definition of the second photographing lens on the target object is better than that of the first photographing lens.
In some embodiments, the first focus distance of the target object is within a focus range of the second photographing lens.
In this case, the second photographing lens can focus on the target object, and the image of the target object captured by the second photographing lens is necessarily clearer than that captured by the first photographing lens, whose focusing range the target object lies outside.
In some embodiments, the first focus distance of the target object is outside the focus ranges of the first and second photographing lenses, but is closer to the focus range of the second photographing lens.
In this case, although neither the first nor the second photographing lens can focus on the target object, the target object is closer to the focusing range of the second photographing lens, so the image of the target object captured by the second photographing lens is still clearer than that captured by the first photographing lens.
In some embodiments, the region of the target object in the first image replaces the target object at its position in a second image captured by the first photographing lens.
Because the image of the target object captured by the second photographing lens is clearer, the region of the target object in the first image can be composited into the corresponding position in the image captured by the first photographing lens, so that the user obtains a clearer image of the target object when shooting with the first photographing lens.
According to the method and the device, the target object is captured by the second photographing lens, and after shooting, the clear target-object region it captures is composited into the corresponding position of the image captured by the first photographing lens used by the user, so that the target object has better definition when the user shoots with the first photographing lens, improving the user experience.
In some embodiments, the minimum focusing distance of the lens may be used to determine whether the target object can be focused by the photographing lens to determine a second photographing lens with higher photographing definition.
In general, the focusing ranges of different photographing lenses differ and may not overlap. Therefore, when the first focusing distance corresponding to the target object is greater than the minimum focusing distance of a photographing lens, that lens can capture a clearer image, and the closer its minimum focusing distance is to the first focusing distance, the clearer the image.
In some embodiments, selecting the second photographing lens according to the first focusing distance includes: if the minimum focusing distance of the first photographing lens is larger than the first focusing distance, selecting as the second photographing lens a lens whose minimum focusing distance is smaller than the first focusing distance and whose field of view contains the target object; and if the minimum focusing distance of the first photographing lens is smaller than the first focusing distance, selecting as the second photographing lens a lens whose minimum focusing distance is smaller than the first focusing distance but larger than that of the first photographing lens and whose field of view contains the target object.
How to select the second photographing lens will be described in detail with reference to the embodiments of fig. 2 and 3.
Fig. 2 is a schematic diagram illustrating selecting a second photographing lens according to a first focusing distance according to an embodiment of the present disclosure.
As shown in fig. 2, the terminal 210 has three photographing lenses: lens 1, lens 2, and lens 3. Their minimum focusing distances increase from near to far in the order lens 1, lens 2, lens 3; that is, the minimum focusing distance of lens 1 is smaller than that of lens 2, which in turn is smaller than that of lens 3.
In fig. 2, three possible positions of the target object are shown: position 1 lies between the minimum focusing distances of lens 1 and lens 2, position 2 lies between the minimum focusing distances of lens 2 and lens 3, and the first focusing distance at position 3 is larger than the minimum focusing distance of lens 3.
When the first photographing lens selected by the user is lens 3 and the target object is at position 2, the minimum focusing distance of the first photographing lens is larger than the first focusing distance. Ignoring the angle of view, the second photographing lens selected according to the first focusing distance can be lens 1 or lens 2, whose minimum focusing distances are smaller than position 2. Since the minimum focusing distance of lens 2 is closer to position 2 than that of lens 1, an image of the target object captured with lens 2 is clearer than one captured with lens 1.
When the first photographing lens selected by the user is lens 1 and the target object is at position 3, the minimum focusing distance of the first photographing lens is smaller than the first focusing distance. Ignoring the angle of view, the second photographing lens selected according to the first focusing distance can be lens 2 or lens 3, whose minimum focusing distances are smaller than the first focusing distance but larger than that of lens 1. Since the minimum focusing distance of lens 3 is closer to position 3 than that of lens 2, an image of the target object captured with lens 3 is clearer than one captured with lens 2.
Of course, the above embodiment is only a specific example; a suitable second photographing lens may be selected in various ways according to the relationship between the minimum focusing distance of the first photographing lens and the first focusing distance, which are not enumerated here.
In some embodiments, in addition to being determined according to the minimum focusing distances of the plurality of photographing lenses and the first focusing distance, the second photographing lens must satisfy the condition that the target object is within its angle of view.
Fig. 3 is a schematic view of a field angle of a photographing lens according to an embodiment of the present disclosure.
As shown in fig. 3, the three lenses of the terminal 210 have the minimum focusing distances of lens 1, lens 2, and lens 3 shown in fig. 2. Their view angles are shown in fig. 3: lens 1, with the smallest minimum focusing distance, has the largest view angle, and lens 3, with the largest minimum focusing distance, has the smallest.
The field angle of the selected second photographing lens must contain the target object. If it does not, then at the terminal's current position the target object falls outside the image captured by that lens; that is, the target object cannot be photographed.
For example, when the first photographing lens is lens 1 and the target object is at position 4, both lens 2 and lens 3 qualify as the second photographing lens based on minimum focusing distance, with lens 3 giving the clearer result. However, since position 4 is outside the field angle of lens 3, the target object would be absent from an image captured with lens 3; therefore only lens 2, whose field angle contains position 4, can be selected as the second photographing lens.
In summary, when selecting the second photographing lens, the lens with higher photographing definition can be determined from the relationship between the minimum focusing distance and the first focusing distance, and whether the target object will appear in that lens's image can be determined from its angle of view and the position of the target object.
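The combined selection rule summarized above can be sketched as follows; the lens table is an assumed structure mapping each lens name to its minimum focusing distance and a flag for whether its field of view contains the target object, and the distances echo the fig. 2 scenario rather than real calibration data.

```python
def select_second_lens(d1: float, first: str, lenses: dict):
    """Pick a second lens per the two cases above.

    lenses: name -> (min_focus_distance, target_in_field_of_view).
    Returns the lens whose minimum focusing distance is below d1 (and,
    when the first lens can already reach d1, also above the first
    lens's minimum), is closest to d1, and whose field of view
    contains the target; None if no lens qualifies.
    """
    first_min = lenses[first][0]
    candidates = {}
    for name, (min_fd, in_fov) in lenses.items():
        if name == first or not in_fov or min_fd >= d1:
            continue
        if first_min < d1 and min_fd <= first_min:
            continue  # second case: must also exceed the first lens's minimum
        candidates[name] = min_fd
    # the closer the minimum focusing distance is to d1, the clearer the image
    return max(candidates, key=candidates.get) if candidates else None
```

With lens minimums of 10, 40, and 90 (the fig. 2 ordering), a target at distance 60 shot on lens 3 yields lens 2, while a target at 120 shot on lens 1 yields lens 3 unless the field-of-view flag excludes it.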
In some embodiments, the terminal further comprises an optical image stabilizer (OIS), and the target object being within the field angle of the second photographing lens includes being within the field angle of the second photographing lens after compensation by the OIS.
The terminal can compensate the angle of view through OIS to expand the angle of view range of the lens, so that the target object can be located in the angle of view of more shooting lenses.
Take the tele lens as an example: its field angle covers only a middle portion of the wide lens's field angle, for example one third or one fifth. Therefore, when the first photographing lens is the wide lens, a target object detected by the wide lens has a high probability of falling outside the tele view angle, so the tele lens cannot be selected as the second photographing lens because of its angle of view, and the target object in the final image cannot reach ideal definition. To reduce the cases where a lens cannot be selected as the second photographing lens for this reason, the OIS device may move the tele lens to provide a small amount of field-angle compensation.
For example, a moving-axis OIS motor may be offset by plus or minus 0.5°, with a compensation amplitude exceeding one fiftieth of the tele field angle. A translational closed-loop OIS motor can compensate larger jitter, with a maximum compensation angle of plus or minus 1.5° and an amplitude of about one tenth of the tele field angle.
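The figures above can be sanity-checked with simple arithmetic; the 15° tele field angle below is an assumed value chosen only to make the fractions concrete, since the disclosure does not state the tele field angle.

```python
def compensation_fraction(ois_offset_deg: float, tele_fov_deg: float) -> float:
    """Fraction of the tele field angle covered by an OIS offset."""
    return ois_offset_deg / tele_fov_deg

# Assumed 15-degree tele field angle:
moving_axis = compensation_fraction(0.5, 15.0)     # exceeds 1/50 of the field angle
translational = compensation_fraction(1.5, 15.0)   # about 1/10 of the field angle
```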
By compensating the angle of view through OIS, the probability that the target object is within the angle of view of the photographing lens can be improved, the probability of using the photographing method proposed by the present disclosure can be increased, and the selection range of the selected second photographing lens can also be increased.
In some embodiments, the plurality of photographing lenses includes a wide-angle lens, a main lens, and a tele lens, wherein a minimum focusing distance of the wide-angle lens is smaller than a minimum focusing distance of the main lens, and a minimum focusing distance of the main lens is smaller than a minimum focusing distance of the tele lens.
The plurality of photographing lenses of the terminal include an ultra wide-angle lens (ultrawide), a main lens (wide), and a tele lens (tele). In some embodiments, the minimum focusing distance of the main lens may be 15-25 cm, that of the tele lens may be 80-120 cm, and that of the ultra-wide lens may be less than 15 cm.
Fig. 4A, 4B, 4C are schematic diagrams illustrating a selection of a second photographing lens according to an embodiment of the present disclosure.
Some possible choices of the second photographing lens are listed below with reference to figs. 4A, 4B, and 4C.
The minimum focusing distances of the three lenses, from small to large, are ultrawide < wide < tele. Their angles of view, from large to small, are ultrawide > wide > tele.
As shown in fig. 4A, when the first photographing lens is ultrawide: if the first focusing distance is greater than the minimum focusing distance of tele and the target object is within the tele angle of view, the second photographing lens may be tele. If the first focusing distance is greater than the minimum focusing distance of tele but the target object is not within the tele angle of view, the wide lens may be considered instead, since the first focusing distance is then necessarily greater than the minimum focusing distance of wide; it is still necessary to determine whether the target object is within the wide angle of view. If the first focusing distance is less than the minimum focusing distance of tele but greater than that of wide, the second photographing lens may be wide, again subject to determining whether the target object is within the wide angle of view. If the first focusing distance is not greater than the minimum focusing distance of wide, the clearest photographing lens is the first photographing lens ultrawide itself, and no second photographing lens is needed.
As shown in fig. 4B, when the first photographing lens is wide, it is first determined whether the first focusing distance is greater than the minimum focusing distance of tele. If so, and the target object is within the tele angle of view, the second photographing lens is tele. Otherwise, there are two cases: when the first focusing distance is between the minimum focusing distances of wide and tele, the clearest lens is the first photographing lens wide itself; when the first focusing distance is between the minimum focusing distances of ultrawide and wide, the clearest lens is ultrawide. Because the angle of view of ultrawide is greater than that of wide, it necessarily contains the target object, so the angle-of-view determination may be skipped and ultrawide selected directly.
As shown in fig. 4C, when the first photographing lens is tele, the tele angle of view is the smallest, so the target object is necessarily within the angles of view of the other lenses. If the first focusing distance is larger than the minimum focusing distance of tele, the clearest lens is the first photographing lens tele itself; when the first focusing distance is between the minimum focusing distances of wide and tele, the second photographing lens is wide; and when it is between the minimum focusing distances of ultrawide and wide, the second photographing lens is ultrawide.
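The fig. 4C case reduces to a threshold cascade, since no field-angle check is needed when the first lens is tele; the function name and the example minimum focusing distances are assumptions, with the distances chosen to match the 15 cm / 20 cm / 100 cm ballpark given earlier.

```python
def second_lens_for_tele(d1: float, mins: dict) -> str:
    """First lens is tele: every other lens's field angle contains the
    target, so the choice depends only on minimum focusing distances.

    mins: {"ultrawide": ..., "wide": ..., "tele": ...} in consistent units.
    """
    if d1 > mins["tele"]:
        return "tele"       # the first lens itself focuses best; no replacement
    if d1 > mins["wide"]:
        return "wide"       # between wide and tele minimums
    return "ultrawide"      # between ultrawide and wide minimums
```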
In some embodiments, the method further comprises: determining a first coordinate position of the target object in the corresponding image under the first photographing lens; determining a second coordinate position of the target object under the second photographing lens according to the first coordinate position and a coordinate mapping relationship between the first and second photographing lenses; and photographing the target object through the second photographing lens to generate the first image, which includes focusing on the target object through the second photographing lens according to the second coordinate position and generating the first image.
Whether or not the user has taken a shot under the first photographing lens, once the target object is recognized, the first photographing lens determines the first coordinate position of the target object in the previewed image.
Because the installation positions and pixels of different shooting lenses are different, the center points of the shooting lenses may have offset, and the coordinate systems are different in size, after the first coordinate position is determined, the second coordinate position of the target object under the second shooting lens can be determined according to the coordinate mapping relation.
Through the determined second coordinate position, when the second shooting lens is used for shooting, the focusing of the second shooting lens on the target object can be completed more quickly.
In some embodiments, a second coordinate position of the target object under the second photographing lens is determined according to the first coordinate position and a coordinate mapping relationship of the first photographing lens and the second photographing lens.
For example, when the target object is a face, the first coordinate position of the face may be multiplied by the coordinate system ratio of the first photographing lens and the second photographing lens, and then the center point offset value under the two lenses is added to determine the second coordinate position.
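The mapping just described can be sketched as below; the scale-then-offset form follows the example above, while the function name and the ratio and offset values are assumed stand-ins for per-device calibration data.

```python
def map_coordinates(first_pos: tuple, ratio: float, center_offset: tuple) -> tuple:
    """Map a coordinate under the first lens to the second lens:
    scale by the coordinate-system ratio of the two lenses, then add
    the center-point offset between them."""
    x, y = first_pos
    dx, dy = center_offset
    return (x * ratio + dx, y * ratio + dy)
```

For example, a face center at (100, 200) under the first lens, with an assumed ratio of 0.5 and center offset of (10, -5), maps to (60.0, 95.0) under the second lens, which the second lens can then use to focus more quickly.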
In some embodiments, the method further comprises: before determining the first focusing distance corresponding to the target object, capturing and generating the second image through the first photographing lens in response to a received shooting instruction; or, after photographing the target object through the second photographing lens to generate the first image, capturing and generating the second image through the first photographing lens in response to a received shooting instruction.
That is, the photographing method of the present disclosure may have two execution stages.
In some embodiments, after the user issues the shooting instruction, the first photographing lens captures and generates the second image; the method is then performed to acquire the first image of the target object from the second photographing lens, and the final output image is obtained through image synthesis. In this way, the delay in outputting the image after a shooting instruction is issued is higher, but the power consumption of the terminal is lower.
In other embodiments, the capturing of the first image with the second capture lens and the image composition may be performed in real time as the user selects the first capture lens and views the area containing the target object. That is, before the user issues the shooting instruction, the terminal performs image synthesis in real time according to the view finding preview image of the first shooting lens, and after the shooting instruction is issued, the terminal can acquire the final image at a higher speed, but because the terminal performs image synthesis in real time, the power consumption is higher.
In some embodiments, the target object comprises a human face and the determining the first focus distance of the target object from the currently used first photographing lens comprises using a human face detection algorithm under the first photographing lens to determine the target object and determine the first focus distance.
In some embodiments, after the user issues the shooting instruction, the image of the target object shot by the second shooting lens may be fused with the second image shot by the first shooting lens using a Local Fusion (Local Fusion) algorithm. Wherein the region of the target object in the first image is replaced to the position of the target object in the second image.
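As a minimal stand-in for that local-fusion step (the disclosure names a Local Fusion algorithm but does not specify it), the sketch below simply copies the target-object region from the first image into the second image at the same position; images are plain nested lists here, and perfect alignment between the two lenses' images is an assumed simplification.

```python
def replace_region(second_image, first_image, top, left, height, width):
    """Paste the target-object region (top, left, height, width) of
    first_image (captured by the second lens, sharper) into a copy of
    second_image (captured by the first lens) at the same position."""
    fused = [row[:] for row in second_image]  # leave the original untouched
    for r in range(height):
        for c in range(width):
            fused[top + r][left + c] = first_image[top + r][left + c]
    return fused
```

A real implementation would additionally align the two images via the coordinate mapping between lenses and blend the region boundary rather than hard-copy pixels.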
Corresponding to the embodiment of the photographing method of the disclosure, the disclosure also provides an embodiment of a corresponding photographing apparatus.
Referring to fig. 5, fig. 5 is a block diagram of a photographing apparatus in one embodiment of the present disclosure. As shown in fig. 5, the photographing apparatus is disposed in a terminal equipped with a plurality of photographing lenses, the apparatus comprising:
A determining unit 510 configured to determine a first focusing distance of the target object from a first photographing lens currently used;
A photographing unit 520 configured to select a second photographing lens according to the first focusing distance, and photograph the target object through the second photographing lens to generate a first image, wherein a photographing definition of the second photographing lens on the target object is better than that of the first photographing lens;
A replacing unit 530 configured to replace an area of the target object in the first image with a position of the target object in a second image photographed according to the first photographing lens.
In some embodiments, selecting the second photographing lens according to the first focusing distance includes: if the minimum focusing distance of the first photographing lens is larger than the first focusing distance, selecting as the second photographing lens a lens whose minimum focusing distance is smaller than the first focusing distance and whose field of view contains the target object; and if the minimum focusing distance of the first photographing lens is smaller than the first focusing distance, selecting as the second photographing lens a lens whose minimum focusing distance is smaller than the first focusing distance but larger than that of the first photographing lens and whose field of view contains the target object.
In some embodiments, the terminal further comprises an optical image stabilizer (OIS), and the target object being within the field angle of the second photographing lens includes being within the field angle of the second photographing lens after compensation by the OIS.
In some embodiments, the plurality of photographing lenses includes a wide-angle lens, a main lens, and a tele lens, wherein a minimum focusing distance of the wide-angle lens is smaller than a minimum focusing distance of the main lens, and a minimum focusing distance of the main lens is smaller than a minimum focusing distance of the tele lens.
In some embodiments, the apparatus is further configured to: determine a first coordinate position of the target object in the corresponding image under the first photographing lens; determine a second coordinate position of the target object under the second photographing lens according to the first coordinate position and a coordinate mapping relationship between the first and second photographing lenses; and photograph the target object through the second photographing lens to generate the first image, wherein the second photographing lens focuses on the target object according to the second coordinate position and generates the first image.
In some embodiments, the apparatus is further configured to, before determining the first focus distance corresponding to the target object, perform shooting by the first shooting lens and generate the second image in response to the received shooting instruction, or, after shooting the target object by the second shooting lens to generate the first image, perform shooting by the first shooting lens and generate the second image in response to the received shooting instruction.
In some embodiments, the target object comprises a human face and the determining the first focus distance of the target object from the currently used first photographing lens comprises using a human face detection algorithm under the first photographing lens to determine the target object and determine the first focus distance.
The implementation of the functions and roles of each unit in the above apparatus is described in the implementation of the corresponding steps in the above method and will not be repeated here.
The embodiments of the disclosure also provide an electronic device comprising a processor and a memory, wherein the memory is configured to store a computer program, and the processor is configured to execute the photographing method according to any of the above embodiments by calling the computer program.
Embodiments of the present disclosure also provide a computer-readable storage medium having a computer program stored thereon, wherein the program, when executed by a processor, implements the photographing method described in any of the above embodiments.
Fig. 6 is a schematic block diagram illustrating an apparatus 600 for photographing according to an embodiment of the present disclosure. For example, apparatus 600 may be a mobile phone, computer, digital broadcast terminal, messaging device, game console, tablet device, medical device, fitness device, personal digital assistant, or the like.
Referring to FIG. 6, the apparatus 600 may include one or more of a processing component 602, a memory 604, a power component 606, a multimedia component 608, an audio component 610, an input/output (I/O) interface 612, a sensor component 614, and a communication component 616.
The processing component 602 generally controls overall operation of the apparatus 600, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing component 602 may include one or more processors 620 to execute instructions to perform all or part of the steps of the photographing method described above. Further, the processing component 602 can include one or more modules that facilitate interaction between the processing component 602 and other components. For example, the processing component 602 may include a multimedia module to facilitate interaction between the multimedia component 608 and the processing component 602.
The memory 604 is configured to store various types of data to support operations at the apparatus 600. Examples of such data include instructions for any application or method operating on the apparatus 600, contact data, phonebook data, messages, pictures, video, and the like. The memory 604 may be implemented by any type or combination of volatile or nonvolatile memory devices, such as Static Random Access Memory (SRAM), Electrically Erasable Programmable Read-Only Memory (EEPROM), Erasable Programmable Read-Only Memory (EPROM), Programmable Read-Only Memory (PROM), Read-Only Memory (ROM), magnetic memory, flash memory, or magnetic or optical disk.
The power component 606 provides power to the various components of the apparatus 600. The power component 606 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 600.
The multimedia component 608 includes a screen providing an output interface between the apparatus 600 and the user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive input signals from the user. The touch panel includes one or more touch sensors to sense touches, swipes, and gestures on the touch panel. The touch sensors may sense not only the boundary of a touch or swipe action, but also the duration and pressure associated with the touch or swipe action. In some embodiments, the multimedia component 608 includes a front camera and/or a rear camera. The front camera and/or the rear camera may receive external multimedia data when the apparatus 600 is in an operation mode, such as a photographing mode or a video mode. Each of the front camera and the rear camera may be a fixed optical lens system or have focus and optical zoom capability.
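The touch-sensing behavior described above can be sketched as a simple event model. The following illustrative sketch is not part of the disclosed apparatus; the field names and thresholds are assumptions chosen only to show how the sensed attributes (position, duration, pressure) could distinguish touches from swipes.

```python
from dataclasses import dataclass


@dataclass
class TouchEvent:
    # Attributes reported by the touch sensors, as described above:
    # the start and end positions of the action, its duration, and
    # the pressure associated with it.
    x0: float
    y0: float
    x1: float
    y1: float
    duration_ms: float
    pressure: float  # normalized to the range 0..1


def classify(event: TouchEvent,
             move_threshold: float = 20.0,
             long_press_ms: float = 500.0) -> str:
    """Rough gesture classification with illustrative thresholds."""
    dx = event.x1 - event.x0
    dy = event.y1 - event.y0
    distance = (dx * dx + dy * dy) ** 0.5
    if distance >= move_threshold:
        return "swipe"
    return "long-press" if event.duration_ms >= long_press_ms else "tap"
```

In this sketch, a large displacement is read as a swipe regardless of duration, while short movements are split into taps and long presses by duration alone; a real touch controller would also use pressure and sampling of intermediate points.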
The audio component 610 is configured to output and/or input audio signals. For example, the audio component 610 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 600 is in an operation mode, such as a call mode, a recording mode, or a voice recognition mode. The received audio signals may be further stored in the memory 604 or transmitted via the communication component 616. In some embodiments, the audio component 610 further includes a speaker for outputting audio signals.
The I/O interface 612 provides an interface between the processing component 602 and peripheral interface modules, such as a keyboard, a click wheel, or buttons. These buttons may include, but are not limited to, a home button, a volume button, a start button, and a lock button.
The sensor component 614 includes one or more sensors for providing status assessments of various aspects of the apparatus 600. For example, the sensor component 614 may detect an on/off state of the apparatus 600 and the relative positioning of components, such as the display and keypad of the apparatus 600. The sensor component 614 may also detect a change in position of the apparatus 600 or a component thereof, the presence or absence of user contact with the apparatus 600, the orientation or acceleration/deceleration of the apparatus 600, and a change in temperature of the apparatus 600. The sensor component 614 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor component 614 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor component 614 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
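As an illustration of how readings from such sensors could be combined into a status assessment, the following sketch infers whether the device is held against the user's ear during a call. The function name, units, and thresholds are hypothetical and serve only to show the combination of a proximity reading with an orientation reading.

```python
def assess_in_call_posture(proximity_cm: float,
                           tilt_deg: float,
                           near_threshold_cm: float = 5.0) -> bool:
    """Combine proximity and orientation readings (illustrative logic).

    The device is judged to be at the user's ear when an object is
    close to the proximity sensor AND the device is tilted roughly
    upright (tilt measured in degrees from horizontal).
    """
    is_near = proximity_cm < near_threshold_cm
    is_upright = 30.0 <= tilt_deg <= 150.0
    return is_near and is_upright
```

Such a combined assessment is what allows, for example, turning the screen off during a call: neither reading alone is conclusive, but together they indicate the posture with reasonable confidence.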
The communication component 616 is configured to facilitate communication between the apparatus 600 and other devices in a wired or wireless manner. The apparatus 600 may access a wireless network based on a communication standard, such as WiFi, 2G, 3G, 4G LTE, 5G NR, or a combination thereof. In one exemplary embodiment, the communication component 616 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the communication component 616 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, Infrared Data Association (IrDA) technology, Ultra-Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 600 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, microcontrollers, microprocessors, or other electronic elements, for performing the photographing method described above.
In an exemplary embodiment, a non-transitory computer-readable storage medium is also provided, such as the memory 604 including instructions executable by the processor 620 of the apparatus 600 to perform the photographing method described above. For example, the non-transitory computer-readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, or the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure following the general principles thereof and including such departures from the present disclosure as come within known or customary practice in the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.
It is noted that relational terms such as first and second, and the like, are used solely to distinguish one entity or action from another entity or action, without necessarily requiring or implying any actual such relationship or order between such entities or actions. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a" does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
While the methods and apparatus provided by the embodiments of the present disclosure have been described in detail with specific examples, the foregoing description of the embodiments is provided merely to facilitate understanding of the methods and core ideas of the disclosure. Since those skilled in the art may make modifications in light of the concepts of the disclosure, the disclosure should not be construed as being limited to the specific embodiments and applications described above.